[ECM] Econometrics: working papers (RePEc, 24/07/2010)

Source: NEP (New Economics Papers) | RePEc

  • Noncausal Vector Autoregression
Date: 2010-04
By: Lanne, Markku
Saikkonen, Pentti
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:23717&r=ecm
In this paper, we propose a new noncausal vector autoregressive (VAR) model for non-Gaussian time series. The assumption of non-Gaussianity is needed for reasons of identifiability. Assuming that the error distribution belongs to a fairly general class of elliptical distributions, we develop an asymptotic theory of maximum likelihood estimation and statistical inference. We argue that allowing for noncausality is of particular importance in economic applications which currently use only conventional causal VAR models. Indeed, if noncausality is incorrectly ignored, the use of a causal VAR model may yield suboptimal forecasts and misleading economic interpretations. Therefore, we propose a procedure for discriminating between causality and noncausality. The methods are illustrated with an application to interest rate data.
Keywords: Vector autoregression; noncausal time series; non-Gaussian time series
JEL: C32
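The noncausal idea above can be made concrete with a small simulation. This is an illustrative sketch, not code from the paper: a purely noncausal AR(1), x_t = phi * x_{t+1} + eps_t, generated with t-distributed errors; the parameter values and the backward-recursion scheme are my own assumptions.

```python
import numpy as np

# Illustrative sketch: a purely noncausal AR(1), x_t = phi * x_{t+1} + eps_t,
# with t-distributed (non-Gaussian) errors. The process depends on *future*
# shocks, so it is generated by recursing backward from a terminal condition.
rng = np.random.default_rng(0)
phi, n, burn = 0.7, 20_000, 500
eps = rng.standard_t(df=5, size=n + burn)
x = np.zeros(n + burn)
for t in range(n + burn - 2, -1, -1):   # backward recursion in time
    x[t] = phi * x[t + 1] + eps[t]
x = x[:n]                               # discard values near the terminal condition

# The autocovariances match those of a *causal* AR(1) with the same phi
# (lag-1 autocorrelation close to phi), which is why second-moment/Gaussian
# methods cannot identify the direction and non-Gaussian errors are needed.
acf1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```

The sample lag-1 autocorrelation comes out near 0.7, indistinguishable from a causal AR(1) with the same coefficient, which illustrates the identifiability point in the abstract.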
  • GMM Estimation with Noncausal Instruments
Date: 2009-09
By: Lanne, Markku
Saikkonen, Pentti
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:23649&r=ecm
Lagged variables are often used as instruments when the generalized method of moments (GMM) is applied to time series data. We show that if these variables follow noncausal autoregressive processes, their lags are not valid instruments and the GMM estimator is inconsistent. Moreover, in this case, endogeneity of the instruments may not be revealed by the J-test of overidentifying restrictions that may be inconsistent and, as shown by simulations, its finite-sample power is, in general, low. Although our explicit results pertain to a simple linear regression, they can be easily generalized. Our empirical results indicate that noncausality is quite common among economic variables, making these problems highly relevant.
Keywords: Noncausal autoregression; instrumental variables; test of overidentifying restrictions
JEL: C51
  • Estimation methods comparison of SVAR model with the mixture of two normal distributions – Monte Carlo analysis
Date: 2010
By: Katarzyna Maciejowska
URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/27&r=ecm
This paper addresses the issue of obtaining maximum likelihood estimates of parameters for structural VAR models with a mixture of distributions. Since the problem does not have a closed-form solution, numerical optimization procedures need to be used. A Monte Carlo experiment is designed to compare the performance of four maximization algorithms and two estimation strategies. It is shown that the EM algorithm outperforms general maximization algorithms such as BFGS, NEWTON and BHHH. Moreover, the simplification of the problem introduced in the two-step quasi-ML method does not worsen the small-sample properties of the estimators and may therefore be recommended for empirical analysis.
Keywords: Structural vector autoregression, Error correction models, Mixed normal, Monte Carlo
JEL: C32
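The reason EM tends to beat generic optimizers in this setting is that its M-step has closed-form updates. A minimal sketch, on a two-component zero-mean normal scale mixture (the building block of the mixed-normal likelihood; the structural VAR layer of the paper is omitted, and all values are illustrative):

```python
import numpy as np

# Minimal EM sketch for a two-component zero-mean normal scale mixture.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 600), rng.normal(0, 3, 400)])

pi, s1, s2 = 0.5, 0.5, 2.0                # starting values
for _ in range(200):
    # E-step: posterior probability of component 1 for each observation
    d1 = pi * np.exp(-0.5 * (y / s1) ** 2) / s1
    d2 = (1 - pi) * np.exp(-0.5 * (y / s2) ** 2) / s2
    w = d1 / (d1 + d2)
    # M-step: closed-form updates -- no line search or Hessian needed,
    # one reason EM outperforms generic BFGS/Newton-type routines here
    pi = w.mean()
    s1 = np.sqrt((w * y ** 2).sum() / w.sum())
    s2 = np.sqrt(((1 - w) * y ** 2).sum() / (1 - w).sum())
```

On this simulated sample the updates converge to roughly (pi, s1, s2) near the true (0.6, 1, 3), with monotonically increasing likelihood by construction.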
  • Optimal Forecasting of Noncausal Autoregressive Time Series
Date: 2010-02
By: Lanne, Markku
Luoto, Jani
Saikkonen, Pentti
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:23648&r=ecm
In this paper, we propose a simulation-based method for computing point and density forecasts for univariate noncausal and non-Gaussian autoregressive processes. Numerical methods are needed to forecast such time series because the prediction problem is generally nonlinear and no analytic solution is therefore available. According to a limited simulation experiment, the use of a correct noncausal model can lead to substantial gains in forecast accuracy over the corresponding causal model. An empirical application to U.S. inflation demonstrates the importance of allowing for noncausality in improving point and density forecasts.
Keywords: Noncausal autoregression; density forecast; inflation
JEL: C53
  • HEGY Tests in the Presence of Moving Averages
Date: 2010
By: Tomás del Barrio Castro (Universitat de les Illes Balears)
Denise R. Osborn (University of Manchester)
URL: http://d.repec.org/n?u=RePEc:ubi:deawps:42&r=ecm
We analyze the asymptotic distributions associated with the seasonal unit root tests of the Hylleberg et al. (1990) procedure for quarterly data when the innovations follow a moving average process. Although both the t- and F-type tests suffer from scale and shift effects compared with the presumed null distributions when a fixed order of autoregressive augmentation is applied, these effects disappear when the order of augmentation is sufficiently large. However, as found by Burridge and Taylor (2001) for the autoregressive case, individual t-ratio tests at the semi-annual frequency are not pivotal even with high orders of augmentation, although the corresponding joint F-type statistic is pivotal. Monte Carlo simulations verify the importance of the order of augmentation for finite samples generated by seasonally integrated moving average processes.
Keywords: Seasonal integration, HEGY tests, unit root tests, moving averages
JEL: C12
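For readers unfamiliar with the mechanics, the HEGY procedure filters a quarterly series into components tied to the zero, semi-annual (Nyquist) and annual frequencies and regresses the fourth difference on their lags plus augmentation lags. A sketch on a toy random walk (the DGP and the fixed augmentation order p = 4 are my own illustrative choices; the abstract's point is precisely that with MA errors p must be large):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
y = np.cumsum(rng.normal(size=n))   # toy series with a zero-frequency unit root

def lag(x, k):
    """x lagged k >= 1 periods, NaN-padded at the start."""
    out = np.full_like(x, np.nan)
    out[k:] = x[:-k]
    return out

y1 = y + lag(y, 1) + lag(y, 2) + lag(y, 3)     # zero-frequency component
y2 = -(y - lag(y, 1) + lag(y, 2) - lag(y, 3))  # semi-annual (Nyquist) component
y3 = -(y - lag(y, 2))                          # annual-frequency pair
d4 = y - lag(y, 4)                             # fourth (seasonal) difference

p = 4                                          # AR augmentation order
cols = [lag(y1, 1), lag(y2, 1), lag(y3, 1), lag(y3, 2)]
cols += [lag(d4, j) for j in range(1, p + 1)]
X = np.column_stack(cols)
keep = ~np.isnan(X).any(axis=1)
X, dep = X[keep], d4[keep]
beta, *_ = np.linalg.lstsq(X, dep, rcond=None)
resid = dep - X @ beta
s2 = resid @ resid / (len(dep) - X.shape[1])
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta[:4] / se[:4]   # t-ratios for the unit roots at each frequency
```

The t-ratio on the first regressor tests the zero-frequency root; the statistics on the two annual-frequency regressors are usually combined into an F-test.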
  • Bayesian Model Selection and Forecasting in Noncausal Autoregressive Models
Date: 2009-09
By: Lanne, Markku
Luoma, Arto
Luoto, Jani
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:23646&r=ecm
In this paper, we propose a Bayesian estimation and prediction procedure for noncausal autoregressive (AR) models. Specifically, we derive the joint posterior density of the past and future errors and the parameters, which gives posterior predictive densities as a byproduct. We show that the posterior model probability provides a convenient model selection criterion and yields information on the probabilities of the alternative causal and noncausal specifications. This is particularly useful in assessing economic theories that imply either causal or purely noncausal dynamics. As an empirical application, we consider U.S. inflation dynamics. A purely noncausal AR model gets the strongest support, but there is also substantial evidence in favor of other noncausal AR models allowing for dependence on past inflation. Thus, although U.S. inflation dynamics seem to be dominated by expectations, the backward-looking component is not completely missing. Finally, the noncausal specifications seem to yield inflation forecasts which are superior to those from alternative models especially at longer forecast horizons.
Keywords: Noncausality; Autoregression; Bayesian model selection; Forecasting
JEL: C52
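A crude sketch of the model-probability idea, using the BIC approximation to marginal likelihoods for causal AR(p) models on simulated AR(1) data. This is only the selection logic; the paper's actual procedure derives the exact joint posterior (including future errors, so noncausal specifications can be compared), which BIC does not capture.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
y = np.empty(n); y[0] = 0.0
for t in range(1, n):                      # simulate a causal AR(1), phi = 0.6
    y[t] = 0.6 * y[t - 1] + rng.normal()

def bic_ar(y, p, P=3):
    """BIC of a causal AR(p), conditioning on the first P observations."""
    m = len(y) - P
    dep = y[P:]
    X = np.column_stack([np.ones(m)] + [y[P - j : len(y) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, dep, rcond=None)
    resid = dep - X @ beta
    sigma2 = resid @ resid / m
    return m * np.log(sigma2) + (p + 2) * np.log(m)   # p + 2 free parameters

bics = np.array([bic_ar(y, p) for p in range(4)])
probs = np.exp(-0.5 * (bics - bics.min()))
probs /= probs.sum()          # approximate posterior model probabilities
```

On this sample the AR(1) specification receives nearly all of the approximate posterior mass, mirroring how the paper's exact posterior model probabilities are used to weigh causal against noncausal candidates.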
  • Threshold Bipower Variation and the Impact of Jumps on Volatility Forecasting
Date: 2010-07-06
By: Fulvio Corsi
Davide Pirino
Roberto Renò
URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2010/11&r=ecm
This study reconsiders the role of jumps for volatility forecasting by showing that jumps have a positive and mostly significant impact on future volatility. This result becomes apparent once volatility is separated into its continuous and discontinuous components using estimators which are not only consistent, but also scarcely plagued by small-sample bias. To this purpose, we introduce the concept of threshold bipower variation, which is based on the joint use of bipower variation and threshold estimation. We show that its generalization (threshold multipower variation) admits a feasible central limit theorem in the presence of jumps and provides less biased estimates, with respect to the standard multipower variation, of the continuous quadratic variation in finite samples. We further provide a new test for jump detection which has substantially more power than tests based on multipower variation. Empirical analysis (on the S&P500 index, individual stocks and US bond yields) shows that the proposed techniques significantly improve the accuracy of volatility forecasts, especially in periods following the occurrence of a jump.
Keywords: volatility estimation, jump detection, volatility forecasting, threshold estimation, financial markets
JEL: G1
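The three estimators being compared can be sketched on simulated intraday returns with one injected jump. The constant threshold below (a multiple of the sample standard deviation) is my simplification of the paper's iterated, time-varying threshold; sizes and the 5-minute grid are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 288, 0.01                        # 5-min grid, daily volatility 1%
r = rng.normal(0.0, sigma / np.sqrt(n), n)  # continuous (diffusive) returns
r[100] += 15 * sigma / np.sqrt(n)           # one large jump

rv = np.sum(r ** 2)                         # realized variance: absorbs the full squared jump
bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # bipower variation
theta = (4.0 * r.std()) ** 2                # crude constant threshold (4 sample sds)
keep = (r ** 2 <= theta).astype(float)      # the jump return is discarded
tbv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]) * keep[1:] * keep[:-1])
```

By construction TBV can only remove (nonnegative) terms from BV, and on this sample RV exceeds both, since only RV picks up the squared jump; the jump component would then be estimated as max(RV - TBV, 0).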
  • A robust test for error cross-section correlation in panel models
Date: 2010-07
By: L Godfrey
T Yamagata
URL: http://d.repec.org/n?u=RePEc:yor:yorken:10/16&r=ecm
A wild bootstrap test of the null hypothesis that the errors of a panel data model are not correlated over cross-section units is proposed. The new test is more generally applicable than others that use the restrictive assumptions of normality and homoskedasticity. Monte Carlo results indicate that the new test is reliable.
Keywords: Cross-section correlation; Wild bootstrap; Robust test
JEL: C12
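The wild-bootstrap idea can be sketched with a Breusch-Pagan-type LM statistic (sum of squared pairwise residual correlations). This is a toy version of the procedure, with my own illustrative dimensions and DGP: Rademacher draws preserve each unit's possibly heteroskedastic, non-normal marginal distribution while destroying any cross-section dependence, which is what makes the test robust.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T, B = 8, 60, 499
e = rng.standard_t(df=4, size=(T, N))   # H0 true here: independent, fat-tailed units

def lm_stat(u):
    R = np.corrcoef(u, rowvar=False)    # N x N residual correlation matrix
    iu = np.triu_indices(N, k=1)
    return T * np.sum(R[iu] ** 2)       # sum over distinct unit pairs

stat = lm_stat(e)
# Wild bootstrap: multiply residuals by iid Rademacher weights, recompute
boot = np.array([lm_stat(e * rng.choice([-1.0, 1.0], size=(T, N)))
                 for _ in range(B)])
p_value = (1 + np.sum(boot >= stat)) / (B + 1)
```

Under cross-section correlation the observed statistic would sit far in the right tail of the bootstrap distribution, giving a small p-value.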
  • A Computationally Practical Simulation Estimation Algorithm for Dynamic Panel Data Models with Unobserved Endogenous State Variables
Date: 2010-07-05
By: Michael P. Keane (University of Technology Sydney and Arizona State University)
Robert M. Sauer (University of Bristol)
URL: http://d.repec.org/n?u=RePEc:jgu:wpaper:1008&r=ecm
This paper develops a simulation estimation algorithm that is particularly useful for estimating dynamic panel data models with unobserved endogenous state variables. The new approach can easily deal with the commonly encountered and widely discussed “initial conditions problem,” as well as the more general problem of missing state variables during the sample period. Repeated sampling experiments on dynamic probit models with serially correlated errors indicate that the estimator has good small sample properties. We apply the estimator to a model of married women’s labor force participation decisions. The results show that the rarely used Polya model, which is very difficult to estimate given missing data problems, fits the data substantially better than the popular Markov model. The Polya model implies far less state dependence in employment status than the Markov model. It also implies that observed heterogeneity in education, young children and husband income are much more important determinants of participation, while race is much less important.
Keywords: Initial Conditions, Missing Data, Simulation, Female Labor Force Participation Decisions
JEL: C15
  • Common factors in nonstationary panel data with a deterministic trend – estimation and distribution theory
Date: 2010
By: Katarzyna Maciejowska
URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/28&r=ecm
The paper studies large-dimensional factor models with nonstationary factors and allows for deterministic trends and factors integrated of order higher than one. We follow the model specification of Bai (2004) and derive the convergence rates and the limiting distributions of the estimated factors, factor loadings and common components. We discuss in detail a model with a linear time trend. We illustrate the theory with an empirical example that studies the fluctuations of real activity in the U.S. economy. We show that these fluctuations can be explained by two nonstationary factors and a small number of stationary factors. We test the economic interpretation of the nonstationary factors.
Keywords: Common-stochastic trends; Dynamic factors; Generalized dynamic factor models; Principal components; Nonstationary panel data
JEL: C13
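The principal-components estimator at the heart of this literature can be sketched on a panel with one I(1) common factor and a linear trend, in the spirit of the Bai (2004) setting the paper builds on. Dimensions, loadings and the simple OLS detrending step are my own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
T, N = 200, 50
f = np.cumsum(rng.normal(size=T))        # I(1) common factor
lam = rng.normal(size=N)                 # factor loadings
trend = 0.05 * np.arange(T)              # common deterministic trend
X = np.outer(f, lam) + trend[:, None] + rng.normal(size=(T, N))

# Remove the deterministic component, then extract the factor by PCA:
Z = np.column_stack([np.ones(T), np.arange(T)])
M = np.eye(T) - Z @ np.linalg.solve(Z.T @ Z, Z.T)   # detrending projector
U, s, Vt = np.linalg.svd(M @ X, full_matrices=False)
f_hat = np.sqrt(T) * U[:, 0]             # estimated factor (up to sign and scale)
fit = abs(np.corrcoef(f_hat, M @ f)[0, 1])
```

The estimated factor tracks the detrended true factor almost perfectly here, reflecting the fast convergence rates for nonstationary factors that the paper derives formally.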
  • The Estimation of Meta-Frontiers by Constrained Maximum Likelihood
Date: 2010
By: Alexandre Repkine (Department of Economics, Korea University, Seoul, Republic of Korea)
URL: http://d.repec.org/n?u=RePEc:iek:wpaper:1011&r=ecm
Existing approaches to meta-frontier estimation are largely based on the linear programming technique, which does not rest on any statistical underpinnings. We suggest estimating meta-frontiers by constrained maximum likelihood, subject to constraints that specify the way in which the estimated meta-frontier overarches the individual group frontiers. We present a methodology that allows one either to estimate meta-frontiers using the conventional set of constraints that guarantees overarching at the observed combinations of production inputs, or to specify a range of inputs within which such overarching will hold. In either case the estimated meta-frontier coefficients allow for statistical inference that is not straightforward in the case of linear programming estimation. We apply our methodology to the world's FAO agricultural data and find similar estimates of the meta-frontier parameters under the same set of constraints. In contrast, the parameter estimates differ considerably across different sets of constraints.
Keywords: technical efficiency, meta-frontiers, constrained maximum likelihood
JEL: O40
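The constrained-estimation idea can be sketched on simulated data: choose meta-frontier coefficients b minimizing a statistical criterion subject to X @ b >= g, so the meta-frontier overarches the group frontiers at every observed input mix. A least-squares objective stands in for the paper's likelihood, and all data below are simulated assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n = 120
X = np.column_stack([np.ones(n), rng.uniform(0.0, 2.0, size=(n, 2))])
b_true = np.array([1.0, 0.5, 0.4])
g = X @ b_true + rng.normal(0.0, 0.2, n)   # group-frontier values at each observation

obj = lambda b: np.sum((X @ b - g) ** 2)               # stand-in for the likelihood
cons = {"type": "ineq", "fun": lambda b: X @ b - g}    # overarching constraints
res = minimize(obj, x0=np.array([2.0, 1.0, 1.0]), method="SLSQP", constraints=[cons])
b_meta = res.x
slack = X @ b_meta - g    # nonnegative everywhere: the meta-frontier overarches
```

Unlike the LP formulation, an explicitly statistical objective like this yields standard errors for b_meta, which is the inference advantage the abstract emphasizes.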
  • Valuation Ratios and Stock Price Predictability in South Africa: Is it there?
Date: 2010-07
By: Rangan Gupta (Department of Economics, University of Pretoria)
Mampho P. Modise (Department of Economics, University of Pretoria and South African Treasury, Pretoria, South Africa)
URL: http://d.repec.org/n?u=RePEc:pre:wpaper:201016&r=ecm
Using monthly South African data for 1990:01-2009:10, this paper, to the best of our knowledge, is the first to examine the predictability of real stock prices based on valuation ratios, namely price-dividend and price-earnings ratios. We cannot detect either short-horizon or long-horizon predictability; that is, the hypothesis that the current value of a valuation ratio is uncorrelated with future stock price changes cannot be rejected at either short or long horizons based on bootstrapped critical values constructed from linear representations of the data. We find, via Monte Carlo simulations, that the power to detect predictability in finite samples tends to decrease at long horizons in a linear framework. Although Monte Carlo simulations applied to exponential smooth-transition autoregressive (ESTAR) models of the price-dividend and price-earnings ratios show increased power, the ability of the nonlinear framework to explain the pattern of stock price predictability in the data does not show promise at either short or long horizons, just as in the linear predictive regressions.
Keywords: Predictive regression, Monte Carlo simulation, Nonlinear mean-reversion
JEL: C22
  • An Area Wide Real Time Data Base for the Euro Area
Date: 2010-07
By: Domenico Giannone
Jérôme Henry
Magdalena Lalik
Michèle Modugno
URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/59649&r=ecm
This paper describes how we constructed a real-time database for the euro area covering more than 200 series regularly published in the European Central Bank Monthly Bulletin, as made available ahead of publication to the Governing Council members before their first meeting of the month. We describe the database in detail and study the properties of the euro-area real-time data flow and data revisions, also providing comparisons with the United States and Japan. Finally, we illustrate how such revisions can contribute to the uncertainty surrounding key macroeconomic ratios and the NAIRU.
Keywords: real-time; euro area; revisions; database
JEL: C01
  • A Hedonic Metric Approach to Estimating the Demand for Differentiated Products: An Application to Retail Milk Demand
Date: 2010-03-29
By: Gulseven, Osman
Wohlgenant, Michael
URL: http://d.repec.org/n?u=RePEc:ags:aesc10:91675&r=ecm
This article introduces the Hedonic Metric (HM) approach as an original method to model the demand for differentiated products. Using this approach, we initially create an n-dimensional hedonic space based on the characteristic information available to consumers. Next, we allocate products into this space and estimate the elasticities using distances. What distinguishes our model from traditional demand models such as the Almost Ideal Demand System (AIDS) and the Rotterdam Model is the way we link elasticities with product characteristics. Moreover, our model significantly reduces the number of parameters to be estimated, thereby making it possible to estimate a large number of differentiated products in a single demand system. We applied our model to estimate the retail demand for fluid milk products. We also compared our results with the Distance Metric (DM) approach of Rojas and Peterson (2008), using the estimation results from traditional models as a benchmark. Our approach is shown to give superior results and better approximations to the original models.
Keywords: Hedonic Metrics, Distance Metrics, Rotterdam Model, Almost Ideal Demand System, Differentiated Products, Milk Demand, Food Security and Poverty
JEL: C30, C80, Q11, Q13, Q18
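The core of the hedonic-metric idea — placing products in a characteristics space and letting cross-price effects shrink with distance — can be sketched in a few lines. The products, characteristics and inverse-distance weighting below are my illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

chars = np.array([        # rows: products; cols: fat content (%), organic flag
    [3.25, 0.0],          # whole milk
    [2.00, 0.0],          # 2% milk
    [0.10, 0.0],          # skim milk
    [3.25, 1.0],          # organic whole milk
])
z = (chars - chars.mean(0)) / chars.std(0)    # standardize each characteristic
# Pairwise Euclidean distances in the hedonic space:
d = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
# Inverse-distance weights (zero on the diagonal), one candidate way to tie
# cross-price elasticities to proximity in characteristics space:
w = np.where(d > 0, 1.0 / np.where(d > 0, d, 1.0), 0.0)
w /= w.sum(axis=1, keepdims=True)             # each row of weights sums to one
```

Closer products (whole vs 2% milk) receive larger weights than distant ones (whole vs skim), so a single distance parameterization replaces the full matrix of cross-effects — the dimension reduction the abstract highlights.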
  • QR-GARCH-M Model for Risk-Return Tradeoff in U.S. Stock Returns and Business Cycles
Date: 2010-04
By: Nyberg, Henri
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:23724&r=ecm
In the empirical finance literature, findings on the risk-return tradeoff in excess stock market returns are ambiguous. In this study, we develop a new QR-GARCH-M model combining a probit model for a binary business cycle indicator and a regime-switching GARCH-in-mean model for the excess stock market return, with the business cycle indicator defining the regime. Estimation results show that there is statistically significant variation in U.S. excess stock returns over the business cycle. However, consistent with the conditional ICAPM, there is a positive risk-return relationship between volatility and expected return independent of the state of the economy.
Keywords: Regime switching GARCH model; GARCH-in-mean model; probit model; stock return; risk-return tradeoff; business cycle
JEL: C32
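The GARCH-in-mean mechanism at the core of this model can be sketched by simulation: the conditional variance h_t enters the return equation, so expected excess return rises with volatility. The regime-switching/probit layer of the QR-GARCH-M model is omitted here, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
omega, alpha, beta, lam = 0.05, 0.08, 0.90, 0.5   # lam: price of risk
h = np.empty(n)
r = np.empty(n)
h[0] = omega / (1 - alpha - beta)                 # start at unconditional variance
for t in range(n):
    # GARCH-in-mean: conditional mean lam * h_t rises with conditional variance
    r[t] = lam * h[t] + np.sqrt(h[t]) * rng.normal()
    if t + 1 < n:
        shock = r[t] - lam * h[t]                 # demeaned return innovation
        h[t + 1] = omega + alpha * shock ** 2 + beta * h[t]
```

Averaged over the sample, returns line up with lam times the conditional variance, which is the positive risk-return relationship the estimation results support.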
