This paper develops a quasi-maximum likelihood (QML) procedure for estimating the parameters of multi-dimensional stochastic differential equations. The transitional density is taken to be a time-varying multivariate Gaussian whose first two moments approximate the true moments of the unknown transitional density. For affine drift and diffusion functions, the moments are shown to be exactly those of the true transitional density, and for nonlinear drift and diffusion functions the approximation is extremely good. The estimation procedure generalizes easily to models with latent factors, such as the stochastic volatility class of models. The QML method is as effective as alternative methods when proxy variables are used for unobserved states. A conditioning estimation procedure is also developed that allows parameter estimation in the absence of proxies.
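In the affine case described above, the Gaussian transitional moments are available in closed form. A minimal sketch of such a QML fit, assuming an Ornstein-Uhlenbeck specification with illustrative parameter names (not the paper's notation):

```python
# Sketch: QML estimation of an Ornstein-Uhlenbeck process
#   dX_t = kappa*(theta - X_t) dt + sigma dW_t,
# an affine case where the Gaussian transitional moments are exact.
# Parameter names and values are illustrative, not the paper's.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
kappa, theta, sigma, dt, n = 1.0, 0.5, 0.3, 0.1, 20000

# Simulate via the exact discretization of the OU transition
x = np.empty(n)
x[0] = theta
a = np.exp(-kappa * dt)
v = sigma**2 * (1 - a**2) / (2 * kappa)
for t in range(1, n):
    x[t] = theta + a * (x[t - 1] - theta) + np.sqrt(v) * rng.standard_normal()

def neg_loglik(p):
    k, th, s = p
    if k <= 0 or s <= 0:
        return np.inf
    m = np.exp(-k * dt)
    mean = th + m * (x[:-1] - th)        # exact conditional mean
    var = s**2 * (1 - m**2) / (2 * k)    # exact conditional variance
    r = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + r**2 / var)

est = minimize(neg_loglik, x0=[0.5, 0.3, 0.2], method="Nelder-Mead").x
```

Because drift and diffusion are affine here, the Gaussian likelihood is the true likelihood and the estimates converge to the true parameters.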
This paper considers the estimation of discrete-time duration models. We highlight the enhanced identification opportunities embedded in multiple-spell data, which allow the effects of duration dependence and of time-invariant individual unobserved heterogeneity to be separately identified. We consider two types of models: (i) random effects models specifying a mass-point distribution for the unobserved heterogeneity; and (ii) fixed effects models in which the distribution of the effects is left unrestricted. The availability of multiple-spell data allows us to consider the latter type of model, in the spirit of fixed effects discrete choice panel data models. We study the finite sample properties of different estimators for these models by means of Monte Carlo simulations. Finally, as an empirical illustration, we estimate unemployment duration models using Spanish administrative data with information on the entire labor history of the individuals.
This paper presents an alternative method to derive the limiting distribution of residual-based statistics. Our method does not impose an explicit assumption of (asymptotic) smoothness of the statistic of interest with respect to the model’s parameters and is thus especially useful in cases where such smoothness is difficult to establish. Instead, we use a locally uniform convergence in distribution condition, which is automatically satisfied by residual-based specification test statistics. To illustrate, we derive the limiting distribution of a new functional form specification test for discrete choice models, as well as of a runs-based test for conditional symmetry in dynamic volatility models.
Le Cam’s third lemma, Local Asymptotic Normality (LAN)
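As a generic illustration of a runs-based statistic applied to residual signs (the classical Wald-Wolfowitz construction, not the paper's exact test), a minimal sketch:

```python
# Wald-Wolfowitz runs test on residual signs -- a generic sketch of
# runs-based testing, not the paper's conditional-symmetry statistic.
import numpy as np

def runs_test(resid):
    s = np.sign(resid)
    s = s[s != 0]                        # drop exact zeros
    n_pos = np.sum(s > 0)
    n_neg = np.sum(s < 0)
    n = n_pos + n_neg
    runs = 1 + np.sum(s[1:] != s[:-1])   # number of sign runs
    mean = 1 + 2 * n_pos * n_neg / n
    var = (2 * n_pos * n_neg * (2 * n_pos * n_neg - n)
           / (n**2 * (n - 1)))
    return (runs - mean) / np.sqrt(var)  # approx. N(0,1) under iid signs

rng = np.random.default_rng(1)
z = runs_test(rng.standard_normal(5000))
```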
Multiple hypothesis testing and clustering with mixtures of non-central t-distributions applied in microarray data analysis
Multiple testing analysis based on clustering methodologies is usually applied in microarray data analysis to comparisons between pairs of groups. In this paper, we generalize this methodology to deal with multiple comparisons among more than two groups obtained from microarray expressions of genes. Assuming normal data, we define a statistic that depends on sample means and sample variances and is distributed as a non-central t-distribution. As we consider multiple comparisons among groups, a mixture of non-central t-distributions is derived. The components of the mixture are estimated via a Bayesian approach, and the model is applied to a multiple comparison problem from a microarray experiment on gorilla, bonobo and human cultured fibroblasts.
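The distributional claim can be checked by simulation: under a mean shift, the pooled-variance two-group statistic follows a non-central t, and mixing over comparisons yields the mixture above. A sketch with illustrative settings:

```python
# Simulation check: under a mean shift delta, the pooled-variance
# two-group t-statistic follows a non-central t with non-centrality
# delta / sqrt(1/n1 + 1/n2). Settings are illustrative.
import numpy as np
from scipy.stats import nct

rng = np.random.default_rng(5)
n1 = n2 = 10
delta = 1.0          # mean shift in group 2
reps = 4000
tvals = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal(n1)
    y = rng.standard_normal(n2) + delta
    sp = np.sqrt(((n1 - 1) * x.var(ddof=1) + (n2 - 1) * y.var(ddof=1))
                 / (n1 + n2 - 2))      # pooled standard deviation
    tvals[r] = (y.mean() - x.mean()) / (sp * np.sqrt(1 / n1 + 1 / n2))

df = n1 + n2 - 2
ncp = delta / np.sqrt(1 / n1 + 1 / n2)   # non-centrality parameter
theory_mean = nct.mean(df, ncp)          # mean of the non-central t
```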
A new methodology is presented for approximating the moments of least squares coefficient estimators in situations where endogeneity and dynamics are present. The OLS estimator is the focus here, but the method, which is valid under a simple set of smoothness and moment conditions, can be applied to related estimators. An O(T⁻¹) approximation is presented for the bias in OLS estimation of a general ARX(p) model.
moment approximation; bias; finite sample
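A concrete instance of such an O(T⁻¹) bias approximation is the classical Kendall formula for the AR(1) with intercept, E[rho_hat - rho] ≈ -(1+3·rho)/T. This is a textbook special case, not the paper's general ARX(p) result; a Monte Carlo check:

```python
# Monte Carlo check of the classical O(1/T) bias approximation for
# OLS in an AR(1) with intercept: E[rho_hat - rho] ~ -(1+3*rho)/T.
# This is the textbook Kendall formula, a special case used for
# illustration, not the paper's general ARX(p) expansion.
import numpy as np

rng = np.random.default_rng(42)
rho, T, reps = 0.5, 50, 20000
biases = np.empty(reps)
for r in range(reps):
    y = np.zeros(T + 50)
    for t in range(1, T + 50):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    y = y[50:]                     # drop burn-in: approx. stationary start
    x, z = y[:-1], y[1:]
    # OLS slope with intercept = cov(x, z) / var(x)
    rho_hat = np.cov(x, z)[0, 1] / np.var(x, ddof=1)
    biases[r] = rho_hat - rho

approx = -(1 + 3 * rho) / T        # Kendall approximation: -0.05 here
mc_bias = biases.mean()            # simulated finite-sample bias
```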
Identification and Estimation of Auction Model with Two-Dimensional Unobserved Heterogeneity
Elena Krasnokutskaya (Department of Economics, University of Pennsylvania)
This paper investigates the empirical importance of allowing for multi-dimensional sources of unobserved heterogeneity in auction models with private information. It then develops an estimation procedure that recovers the distribution of private information in the presence of two distinct sources of unobserved heterogeneity. It is shown that this procedure identifies the components of the model and produces uniformly consistent estimators of them. The procedure is applied to data from highway procurement. The results indicate that allowing for two-dimensional unobserved heterogeneity may significantly affect the estimation results as well as policy-relevant instruments derived from the estimated distributions of bidders’ costs.
We generalize the results for statistical functionals given by [Fernholz, 1983] and [Serfling, 1980] to M-estimates for samples drawn from an ergodic and stationary martingale sequence. In a first step, we take advantage of recent results on the uniform convergence of the empirical distribution given by [Adams & Nobel, 2010] to prove consistency of M-estimators; we then assume Hadamard differentiability of our estimators to prove their asymptotic normality. Further, we apply the results to the LAD estimator of [Peng & Yao, 2003] and to the maximum-likelihood estimator for GARCH processes to show the wide field of possible applications of this method.
This paper presents empirical studies of SV models with a generalized hyperbolic (GH) skew Student’s t-error distribution, which embeds both asymmetric heavy-tailedness and leverage effects for financial time series. An efficient Markov chain Monte Carlo estimation method is described and the model is fit to daily S&P500 stock returns. The practical importance of the proposed model is highlighted through model comparison based on the marginal likelihood, Value at Risk (VaR) and expected shortfall. The empirical results show that incorporating leverage and asymmetric heavy-tailedness improves the model fit and the prediction of the expected shortfall.
We argue that identification problems bedevil most applied spatial research. Spatial econometrics solves these problems by deriving estimators assuming that functional forms are known and by using model comparison techniques to let the data choose between competing specifications. We argue that in most situations of interest this, at best, achieves only very weak identification. Worse, in most cases, such an approach will simply be uninformative about the economic processes at work, rendering much applied spatial econometric research ‘pointless’, unless the main aim is simply description of the data. We advocate an alternative approach based on the ‘experimental paradigm’ which puts issues of identification and causality at centre stage.
statistical methods, spatial, modeling
Sign Restrictions in Structural Vector Autoregressions: A Critical Review
The paper provides a review of the estimation of structural VARs with sign restrictions. It is shown how sign restrictions solve the parametric identification problem present in structural systems but leave the model identification problem unresolved. A market and a macro model are used to illustrate these points. Suggestions on how to find a unique model are reviewed, along with some of the difficulties that can arise in using the impulse responses found with sign restrictions.
Structural Vector Autoregressions, New Keynesian Model, Sign Restrictions
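The parametric identification step can be sketched with the standard rejection algorithm: draw Haar-distributed rotations of a Cholesky factor of the reduced-form error covariance and keep draws whose impact responses satisfy the sign restrictions. The bivariate model and signs below are illustrative, not the paper's market or macro examples:

```python
# Standard sign-restriction rejection algorithm (sketch): rotate a
# Cholesky factor by random orthogonal Q and keep candidates whose
# impact responses satisfy the restrictions. Covariance and signs
# here are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)
Sigma = np.array([[1.0, 0.4], [0.4, 0.8]])  # reduced-form error covariance
P = np.linalg.cholesky(Sigma)

accepted = []
for _ in range(2000):
    # Haar-distributed orthogonal matrix via QR of a Gaussian draw
    Q, R = np.linalg.qr(rng.standard_normal((2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))    # normalization for uniformity
    B0inv = P @ Q                           # candidate impact matrix
    # restriction: shock 1 moves both variables up on impact
    if B0inv[0, 0] > 0 and B0inv[1, 0] > 0:
        accepted.append(B0inv)
```

Every accepted candidate reproduces the same reduced-form covariance, which is precisely why sign restrictions deliver a set of models rather than a unique one.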
Censored Gamma Regression Models for Limited Dependent Variables with an Application to Loss Given Default
Regression models for limited continuous dependent variables having a non-negligible probability of attaining exactly their limits are presented. The models differ in the number of parameters and in their flexibility. It is shown how to fit these models and they are applied to a Loss Given Default dataset from insurance to which they provide a good fit.
Testing the Invariance of Expectations Models of Inflation
Jennifer L. Castle
Jurgen A. Doornik
David F. Hendry
The new-Keynesian Phillips curve (NKPC) includes expected future inflation as a major feed-forward variable to explain current inflation. Models of this type are regularly estimated by replacing the expected value with the actual future outcome, then using Instrumental Variables or Generalized Method of Moments to estimate the parameters. However, the underlying theory does not allow for various forms of non-stationarity in the data, despite the fact that crises, breaks and regime shifts are relatively common. We investigate the consequences for NKPC estimation of breaks in data processes using the new technique of impulse-indicator saturation, and apply the resulting methods to salient published studies to check their viability.
New Keynesian Phillips curve, inflation expectations, structural breaks, impulse-indicator saturation
Efficient estimation of Markov regime-switching models: An application to electricity wholesale market prices
In this paper we discuss the calibration issues of models built on mean-reverting processes combined with Markov switching. Due to the unobservable switching mechanism, estimation of Markov regime-switching (MRS) models requires inferring not only the model parameters but also the state process values at the same time. The situation becomes more complicated when the individual regimes are independent of each other and at least one of them exhibits temporal dependence (like mean reversion in electricity spot prices). Then the temporal latency of the dynamics in the regimes has to be taken into account. In this paper we propose a method that greatly reduces the computational burden induced by the introduction of independent regimes in MRS models. We perform a simulation study to test the efficiency of the proposed method and apply it to a sample series of wholesale electricity spot prices from the German EEX market. The proposed 3-regime MRS model fits this data well and also contains unique features that allow for useful interpretations of the price dynamics.
Markov regime-switching; heteroskedasticity; EM algorithm; independent regimes; electricity spot price
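The filtering step for a dependent-regime MRS model is usually carried out with the Hamilton filter; a minimal two-regime sketch with known, illustrative parameters (a simplification, not the paper's independent-regime method):

```python
# Minimal Hamilton filter for a 2-regime Gaussian model with
# regime-dependent means -- a simplified sketch of MRS filtering
# with known parameters; not the paper's independent-regime method.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
P = np.array([[0.95, 0.05], [0.10, 0.90]])  # transition matrix
n = 500

# simulate a persistent 2-state chain and observations
s = np.zeros(n, dtype=int)
for t in range(1, n):
    stay = rng.random() < P[s[t - 1], s[t - 1]]
    s[t] = s[t - 1] if stay else 1 - s[t - 1]
y = np.where(s == 1, 3.0, 0.0) + rng.standard_normal(n)

mu = np.array([0.0, 3.0])        # regime means (known here)
xi = np.array([0.5, 0.5])        # filtered regime probabilities
loglik = 0.0
filtered = np.empty((n, 2))
for t in range(n):
    pred = P.T @ xi                       # one-step-ahead regime probs
    lik = pred * norm.pdf(y[t], loc=mu)   # joint density by regime
    loglik += np.log(lik.sum())
    xi = lik / lik.sum()                  # Bayes update
    filtered[t] = xi
```

In an EM setting, the same filter (plus a smoothing pass) supplies the regime probabilities used to update the parameters at each iteration.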
The new methodology for studying the impact of corporate events on bonds comprises a sampling technique and a regression model. The method differs from standard approaches, motivated by the belief that event impact should be reflected in the level of the yield premium. The regression tests for a change in the average bond price after an event; statistical inference is based on the estimate of a dummy variable. A new sampling method is described to accommodate the irregular spacing of bond trades in time.
Event Study; Bonds; TRACE; ANOVA
A causal interpretation of extensive and intensive margin effects in generalized Tobit models
Kevin E. Staub (Socioeconomic Institute, University of Zurich)
The usual decomposition of effects in corner solution models into extensive and intensive margins is generally incompatible with a causal interpretation. This paper proposes a decomposition based on the joint distribution of potential outcomes which is meaningful in a causal sense. The difference between the decompositions can be substantial and can yield diametrically opposed results, as shown in a standard Tobit model example. In a generalized Tobit application exploring the effect of reducing firm entry regulation on bilateral trade flows between countries, estimates suggest that using the usual decomposition would overstate the contribution of the extensive margin by around 15%.
Limited dependent variables, potential outcomes, causality, conditional-on-positives effect, Tobit, two-part model, country margins of trade
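In the standard Tobit case, the "usual" decomposition is the McDonald-Moffitt split of the total marginal effect into extensive and intensive pieces; a numerical check with illustrative parameter values:

```python
# McDonald-Moffitt decomposition for a standard Tobit y = max(0, x*b + u),
# u ~ N(0, sigma^2):
#   dE[y]/dx = P(y>0) * dE[y|y>0]/dx  +  E[y|y>0] * dP(y>0)/dx
# Checked numerically at one point; values are illustrative.
import numpy as np
from scipy.stats import norm

beta, sigma = 0.8, 1.0

def moments(x):
    z = beta * x / sigma
    p = norm.cdf(z)                                      # P(y > 0)
    cond = beta * x + sigma * norm.pdf(z) / norm.cdf(z)  # E[y | y > 0]
    return p, cond, p * cond                             # ..., E[y]

x0, h = 1.0, 1e-6
p, cond, ey = moments(x0)
# central numerical derivatives of each piece
dp = (moments(x0 + h)[0] - moments(x0 - h)[0]) / (2 * h)
dcond = (moments(x0 + h)[1] - moments(x0 - h)[1]) / (2 * h)
dey = (moments(x0 + h)[2] - moments(x0 - h)[2]) / (2 * h)
total = p * dcond + cond * dp   # intensive + extensive contributions
```

The check also recovers the textbook result dE[y]/dx = beta * Phi(x*beta/sigma).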
Within and Between Panel Cointegration in the German Regional Output-Trade-FDI Nexus
For spatial data with a sufficiently long time dimension, the concept of global cointegration has been recently included in the econometrics research agenda. Global cointegration arises when non-stationary time series are cointegrated both within and between spatial units. In this paper, we analyze the role of globally cointegrated variable relationships using German regional data (NUTS 1 level) for GDP, trade, and FDI activity during the period 1976–2005. Applying various homogeneous and heterogeneous panel data estimators to a Spatial Panel Error Correction Model (SpECM) for regional output growth allows us to analyze the short- and long-run impacts of internationalization activities. For the long-run cointegration equation, the empirical results support the hypothesis of export- and FDI-led growth. We also show that for export and outward FDI activity, positive cross-regional effects are at work. Likewise, in the short-run SpECM specification, direct and indirect spatial externalities are found to be present. As a sensitivity analysis, we use a spatial weighting matrix based on interregional goods transport flows rather than geographical distances. This scheme thus allows us to address more soundly the role of positive and negative effects of trade/FDI on output activity for a system of interconnected regions.