# [ECM] Econometrics: working papers (RePEc, 21/09/2010)

Source: NEP (New Economics Papers) | RePEc

• Consistent Density Deconvolution under Partially Known Error Distribution
 Date: 2009-10-06 By: Schwarz, Maik Van Bellegem, Sébastien URL: http://d.repec.org/n?u=RePEc:ide:wpaper:23156&r=ecm We estimate the distribution of a real-valued random variable from contaminated observations. The additive error is supposed to be normally distributed, but with unknown variance. The distribution is identifiable from the observations if we restrict the class of considered distributions by a simple condition in the time domain. A minimum distance estimator is shown to be consistent under an assumption only slightly stronger than the identification condition. Keywords: deconvolution, measurement error, density estimation
• Iterative Regularization in Nonparametric Instrumental Regression
 Date: 2010-07 By: Johannes, Jan Van Bellegem, Sébastien Vanhems, Anne URL: http://d.repec.org/n?u=RePEc:ide:wpaper:23149&r=ecm We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We suppose the existence of instrumental variables that are considered in this model for the identification and the estimation of the regression function. Nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber-Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte-Carlo exercise shows the impact of some parameters on the estimator and confirms the reasonable finite-sample performance of the new estimator. Keywords: Nonparametric estimation; Instrumental variable; Ill-posed inverse problem JEL: C14
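The Landweber-Fridman scheme named in this abstract is a generic iterative regularizer for linear ill-posed problems. A minimal sketch on a hypothetical toy smoothing operator `T` (an illustrative assumption, not the paper's estimated instrumental-regression operator), in which the stopping index plays the role of the regularization parameter:

```python
import numpy as np

# Landweber-Fridman iteration for a linear ill-posed problem T x = y:
#   x_{k+1} = x_k + c T' (y - T x_k),  with step size c < 2 / ||T||^2.
rng = np.random.default_rng(0)
n = 50
# Hypothetical mildly ill-conditioned smoothing operator (toy example).
T = np.array([[np.exp(-abs(i - j) / 5.0) for j in range(n)] for i in range(n)]) / n
x_true = np.sin(np.linspace(0.0, np.pi, n))
y = T @ x_true + 1e-4 * rng.standard_normal(n)   # noisy observations

c = 1.0 / np.linalg.norm(T, 2) ** 2              # satisfies c < 2 / ||T||^2
x = np.zeros(n)
for k in range(200):                             # early stopping regularizes
    x = x + c * (T.T @ (y - T @ x))

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The residual norm is non-increasing along the iterations, and stopping early prevents the small singular values of `T` from amplifying the noise.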
• Nonparametric Frontier Estimation from Noisy Data
 Date: 2010-05 By: Florens, Jean-Pierre Schwarz, Maik Van Bellegem, Sébastien URL: http://d.repec.org/n?u=RePEc:ide:wpaper:22801&r=ecm A new nonparametric estimator of a production frontier is defined and studied when the data set of production units is contaminated by measurement error. The measurement error is assumed to be an additive normal random variable on the input variable, but its variance is unknown. The estimator is a modification of the m-frontier, which necessitates the computation of a consistent estimator of the conditional survival function of the input variable given the output variable. In this paper, the identification and the consistency of a new estimator of the survival function are proved in the presence of additive noise with unknown variance. The performance of the estimator is also studied through simulated data.
• Modelling Conditional Heteroscedasticity in Nonstationary Series
 Date: 2010 By: Cizek, P. (Tilburg University, Center for Economic Research) URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:201084&r=ecm To accommodate the inhomogeneous character of financial time series over longer time periods, standard parametric models can be extended by allowing their coefficients to vary over time. Focusing on conditional heteroscedasticity models, we discuss various strategies to identify and estimate varying-coefficient models and compare all methods by means of a real-data application. Keywords: adaptive estimation; conditional heteroscedasticity; varying-coefficient models; time series JEL: C14
• Testing for Structural Breaks at Unknown Time: A Steeplechase
 Date: 2010-09 By: Makram El-Shagi Sebastian Giesen URL: http://d.repec.org/n?u=RePEc:iwh:dispap:19-10&r=ecm This paper analyzes the role of common data problems when identifying structural breaks in small samples. Most notably, we survey small sample properties of the most commonly applied endogenous break tests developed by Brown, Durbin, and Evans (1975) and Zeileis (2004), Nyblom (1989) and Hansen (1992), and Andrews, Lee, and Ploberger (1996). Power and size properties are derived using Monte Carlo simulations. Results emphasize that mostly the CUSUM type tests are affected by the presence of heteroscedasticity, whereas the individual parameter Nyblom test and AvgLM test prove to be highly robust. However, each test is significantly affected by leptokurtosis. Unlike for other tests, where skewness is far more problematic than kurtosis, skewness has no additional effect on any of the endogenous break tests we analyze. Concerning overall robustness the Nyblom test performs best, while being almost on par with more recently developed tests in terms of power.
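For intuition, an OLS-residual-based CUSUM statistic of the general kind surveyed here can be sketched in a few lines. The critical value 1.358 is the asymptotic 5% quantile of the supremum of a Brownian bridge; the series with a mean shift is an illustrative assumption, not the paper's simulation design:

```python
import math
import random

# Sketch of an OLS-based CUSUM test for a break in the mean:
# cumulate the demeaned series, scale by sigma_hat * sqrt(n), and
# compare the sup-norm of the path with the Brownian-bridge critical value.
random.seed(1)
n = 400
# Hypothetical series with a one-standard-deviation mean shift at n/2.
x = [random.gauss(0.0, 1.0) + (1.0 if t >= n // 2 else 0.0) for t in range(n)]

mean = sum(x) / n
resid = [v - mean for v in x]
sigma = math.sqrt(sum(e * e for e in resid) / n)

s, stat = 0.0, 0.0
for e in resid:
    s += e
    stat = max(stat, abs(s) / (sigma * math.sqrt(n)))

break_detected = stat > 1.358   # reject "no break" at the 5% level
```

With a shift of this size the path excursion at mid-sample dwarfs the critical value, so the test rejects.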
• A Simple and Efficient (Parametric Conditional) Test for the Pareto Law
 Date: 2010-02-01 By: Goerlich Gisbert Francisco J. (Ivie) URL: http://d.repec.org/n?u=RePEc:fbb:wpaper:20101&r=ecm This working paper presents a simple and locally optimal test statistic for the Pareto law. The test is based on the Lagrange multiplier (LM) principle and can be computed easily once the maximum likelihood estimator of the scale parameter of the Pareto density has been obtained. A Monte Carlo exercise shows the good small sample properties of the test under the null hypothesis of the Pareto law and also its power against some sensible alternatives. Finally, a simple application to urban economics is performed. An appendix presents derivations and proofs. Keywords: LM test, Pareto law, statistical distributions
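As background, the maximum likelihood step the LM statistic builds on is elementary. A minimal sketch assuming a known scale parameter `x_m` (the paper's exact LM formula is not reproduced here; the parameter values are illustrative):

```python
import math
import random

# Maximum likelihood estimation of the Pareto shape parameter alpha
# with known scale x_m:  alpha_hat = n / sum(log(x_i / x_m)).
random.seed(42)
x_m, alpha_true, n = 1.0, 2.5, 10_000

# Inverse-transform sampling from Pareto(x_m, alpha):
# if U ~ Uniform(0, 1], then x_m * U ** (-1/alpha) is Pareto distributed.
sample = [x_m * (1.0 - random.random()) ** (-1.0 / alpha_true) for _ in range(n)]

alpha_hat = n / sum(math.log(x / x_m) for x in sample)
```

The estimator is consistent, with standard deviation roughly `alpha / sqrt(n)`, so with 10,000 draws it lands very close to the true shape.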
• A Cholesky-MIDAS model for predicting stock portfolio volatility
 Date: 2010 By: Ralf Becker Adam Clements Robert O’Neill URL: http://d.repec.org/n?u=RePEc:man:cgbcrp:149&r=ecm This paper presents a simple forecasting technique for variance covariance matrices. It relies significantly on the contribution of Chiriac and Voev (2010) who propose to forecast elements of the Cholesky decomposition which recombine to form a positive definite forecast for the variance covariance matrix. The method proposed here combines this methodology with advances made in the MIDAS literature to produce a forecasting methodology that is flexible, scales easily with the size of the portfolio and produces superior forecasts in simulation experiments and an empirical application.
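The Chiriac-Voev building block the paper relies on is easy to illustrate: forecasting the elements of the Cholesky factor and recombining them guarantees a positive semi-definite covariance forecast by construction. A minimal sketch with a hypothetical stand-in "forecast" of the factor, not the Cholesky-MIDAS model itself:

```python
import numpy as np

# Forecast Cholesky elements, then recombine as L L' so the implied
# covariance forecast is positive semi-definite whatever the element
# forecasts are.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = A @ A.T + 4.0 * np.eye(4)      # today's (positive definite) covariance
L = np.linalg.cholesky(S)          # lower-triangular factor

# Stand-in forecast of the factor's elements (illustrative perturbation).
L_fc = L + 0.05 * np.tril(rng.standard_normal((4, 4)))
S_fc = L_fc @ L_fc.T               # recombined forecast: PSD by construction

eigvals = np.linalg.eigvalsh(S_fc)
```

Forecasting covariance elements directly offers no such guarantee; the Cholesky parameterization sidesteps the problem entirely.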
• Analysis of coexplosive processes
 Date: 2010 By: Nielsen, Bent URL: http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14854/&r=ecm A vector autoregressive model allowing for unit roots as well as an explosive characteristic root is developed. The Granger-Johansen representation shows that this results in processes with two common features: a random walk and an explosively growing process. Cointegrating and coexplosive vectors can be found that eliminate these common factors. The likelihood ratio test for a simple hypothesis on the coexplosive vectors is analyzed. The method is illustrated using data from the extreme Yugoslavian hyperinflation of the 1990s.
• Using Dynamic Copulae for Modeling Dependency in Currency Denominations of a Diversified World Stock Index
 Date: 2010-09-01 By: Katja Ignatieva (School of Finance and Economics, University of Technology, Sydney) Eckhard Platen (School of Finance and Economics, University of Technology, Sydney) Renata Rendek (School of Finance and Economics, University of Technology, Sydney) URL: http://d.repec.org/n?u=RePEc:uts:rpaper:284&r=ecm The aim of this paper is to model the dependency among log-returns when security account prices are expressed in units of a well diversified world stock index. The paper uses the equi-weighted index EWI104s, calculated as the average of 104 world industry sector indices. The log-returns of its denominations in different currencies appear to be Student-t distributed with about four degrees of freedom. Motivated by these findings, the dependency in log-returns of currency denominations of the EWI104s is modeled using time-varying copulae, aiming to identify the best fitting copula family. The Student-t copula generally turns out to be superior to, e.g., the Gaussian copula, whose dependence structure relates to the multivariate normal distribution. It is shown that merely changing the distributional assumption for the log-returns of the marginals from normal to Student-t leads to a significantly better fit. Furthermore, the Student-t copula with Student-t marginals is able to better capture dependent extreme values than the other models considered. Finally, the paper applies copulae to the estimation of the Value-at-Risk and the expected shortfall of a portfolio constructed of savings accounts of different currencies. The proposed copula-based approach allows one to split market risk into general and specific market risk, as defined in regulatory documents. The paper demonstrates that the approach performs clearly better than the RiskMetrics approach. Keywords: diversified world stock index; Student-t distribution; time-varying copula; Value-at-Risk; expected shortfall
• Archimedean Copulas and Temporal Dependence
 Date: 2010-09-09 By: Beare, Brendan K. URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:1549539&r=ecm We study the dependence properties of stationary Markov chains generated by Archimedean copulas. Under some simple regularity conditions, we show that regular variation of the Archimedean generator at zero and one implies geometric ergodicity of the associated Markov chain. We verify our assumptions for a range of Archimedean copulas used in applications. Keywords: Archimedean copula, geometric ergodicity, Markov chain, mixing, regular variation, tail dependence
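A stationary Markov chain generated by an Archimedean copula, the object studied here, can be simulated by conditional inversion. A minimal sketch for the Clayton family (the dependence parameter is an illustrative assumption):

```python
import random

# Simulate a stationary Markov chain with Uniform(0, 1) marginals whose
# one-step dependence is the Clayton copula
#   C(u, v) = (u^{-theta} + v^{-theta} - 1)^{-1/theta},  theta > 0.
# Each step inverts the conditional distribution C(v | u_prev) = w.
random.seed(7)
theta = 2.0
n = 5000

u = [random.random()]
for _ in range(n - 1):
    w = random.random()
    u_prev = u[-1]
    # Closed-form inverse of the Clayton conditional copula.
    v = ((w ** (-theta / (theta + 1.0)) - 1.0) * u_prev ** (-theta) + 1.0) ** (-1.0 / theta)
    u.append(v)
```

By construction every draw stays in (0, 1) and the marginal law is uniform; it is the mixing rate of such chains that the paper links to regular variation of the generator.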
• Looking behind Granger causality
 Date: 2010-09 By: Chen, Pu Hsiao, Chih-Ying URL: http://d.repec.org/n?u=RePEc:pra:mprapa:24859&r=ecm Granger causality, a popular concept in time series analysis, is widely applied in empirical research. The interpretation of Granger causality tests in a cause-effect context is, however, often unclear or even controversial, so that the causality label has faded away. Textbooks carefully warn that Granger causality does not imply true causality and prefer to present the Granger causality test as a forecasting technique. Applying the theory of inferred causation, we develop in this paper a method to uncover causal structures behind Granger causality. In this way we re-substantialize the causal attribution in Granger causality by providing a causal explanation for the conditional dependence manifested in Granger causality. Keywords: Granger Causality; Time Series Causal Model; Graphical Model JEL: C1
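For reference, the textbook Granger causality test the abstract starts from compares nested lag regressions. A minimal one-lag sketch on simulated data (the data-generating process is a hypothetical example):

```python
import numpy as np

# Test whether lagged x helps predict y beyond y's own lag:
# compare restricted (own lag) and unrestricted (add lag of x)
# least-squares fits via an F statistic.
rng = np.random.default_rng(3)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

Y = y[1:]
X_r = np.column_stack([np.ones(n - 1), y[:-1]])           # restricted model
X_u = np.column_stack([np.ones(n - 1), y[:-1], x[:-1]])   # unrestricted model

rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
rss_u = np.sum((Y - X_u @ np.linalg.lstsq(X_u, Y, rcond=None)[0]) ** 2)
F = (rss_r - rss_u) / (rss_u / (n - 1 - 3))               # 1 restriction
```

A large F rejects "x does not Granger-cause y"; as the abstract stresses, this is a statement about predictive content, not by itself about cause and effect.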
• Hidden Regular Variation: Detection and Estimation
 Date: 2010-01 By: Abhimanyu Mitra Sidney I. Resnick URL: http://d.repec.org/n?u=RePEc:arx:papers:1001.5058&r=ecm Hidden regular variation defines a subfamily of distributions satisfying multivariate regular variation on $\mathbb{E} = [0, \infty]^d \backslash \{(0,0, …, 0) \}$ and models another regular variation on the sub-cone $\mathbb{E}^{(2)} = \mathbb{E} \backslash \cup_{i=1}^d \mathbb{L}_i$, where $\mathbb{L}_i$ is the $i$-th axis. We extend the concept of hidden regular variation to sub-cones of $\mathbb{E}^{(2)}$ as well. We suggest a procedure for detecting the presence of hidden regular variation, and if it exists, propose a method of estimating the limit measure exploiting its semi-parametric structure. We exhibit examples where hidden regular variation yields better estimates of probabilities of risk sets.
• Parametric estimation of risk neutral density functions
 Date: 2010-09 By: Maria Grith Volker Krätschmer URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010-045&r=ecm This chapter deals with the estimation of risk neutral distributions for pricing index options resulting from the hypothesis of the risk neutral valuation principle. After justifying this hypothesis, we shall focus on parametric estimation methods for the risk neutral density functions determining the risk neutral distributions. We shall differentiate between the direct and the indirect way. Following the direct way, parameter vectors are estimated which characterize the distributions from selected statistical families used to model the risk neutral distributions. The idea of the indirect approach is to calibrate characteristic parameter vectors for stochastic models of the asset price processes, and then to extract the risk neutral density function via Fourier methods. For each of the reviewed methods the calculation of option prices under hypothetically true risk neutral distributions is a building block. We shall give explicit formulas for call and put prices w.r.t. the reviewed parametric statistical families used for direct estimation. Additionally, we shall introduce the Fast Fourier Transform method of call option pricing developed in [6]. It is intended to compare the reviewed estimation methods empirically. Keywords: Risk neutral valuation principle, risk neutral distribution, log-price risk neutral distribution, risk neutral density function, Black Scholes formula, Fast Fourier Transform method, log-normal distributions, mixtures of log-normal distributions, generalized gamma distributions, model calibration, Merton's jump diffusion model, Heston's volatility model JEL: C13
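The explicit call and put formulas mentioned as building blocks reduce, in the simplest log-normal case, to Black-Scholes. A minimal sketch (parameter values are illustrative):

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes price of a European call: S N(d1) - K e^{-rT} N(d2)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_put(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Put price via put-call parity: P = C - S + K e^{-rT}."""
    return bs_call(S, K, r, sigma, T) - S + K * math.exp(-r * T)

# At-the-money one-year call, 2% rate, 20% volatility.
price = bs_call(100.0, 100.0, 0.02, 0.2, 1.0)
```

Under the log-normal risk neutral density, call prices across strikes are exactly these values; richer families (mixtures, generalized gamma) reviewed in the chapter replace this closed form with their own pricing integrals.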
• Modelling income processes with lots of heterogeneity
 Date: 2010-10 By: Browning, Martin Ejrnæs, Mette Alvarez, Javier URL: http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14853/&r=ecm All empirical models of earnings processes in the literature assume a good deal of homogeneity. In contrast to this we model earnings processes allowing for lots of heterogeneity between agents. We also introduce an extension to the linear ARMA model that allows the initial convergence to the long run to differ from that implied by the conventional ARMA model. This is particularly important for unit root tests, which are actually tests of a composite of two independent hypotheses. We fit our models to a variety of statistics including most of those considered by previous investigators. We use a sample drawn from the PSID, and focus on white males with a high school degree. Despite this observable homogeneity we find much greater latent heterogeneity than previous investigators. JEL: J30
• A time series causal model
 Date: 2010-09 By: Chen, Pu URL: http://d.repec.org/n?u=RePEc:pra:mprapa:24841&r=ecm Cause-effect relations are central in economic analysis. Uncovering empirical cause-effect relations is one of the main research activities of empirical economics. In this paper we develop a time series casual model to explore casual relations among economic time series. The time series causal model is grounded on the theory of inferred causation that is a probabilistic and graph-theoretic approach to causality featured with automated learning algorithms. Applying our model we are able to infer cause-effect relations that are implied by the observed time series data. The empirically inferred causal relations can then be used to test economic theoretical hypotheses, to provide evidence for formulation of theoretical hypotheses, and to carry out policy analysis. Time series causal models are closely related to the popular vector autoregressive (VAR) models in time series analysis. They can be viewed as restricted structural VAR models identified by the inferred causal relations. Keywords: Inferred Causation; Automated Learning; VAR; Granger Causality; Wage-Price Spiral JEL: E31
• Using "Shares" vs. "Log of Shares" in Fixed-Effect Estimations
 Date: 2010-09 By: Gerdes, Christer (SOFI, Stockholm University) URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5171&r=ecm This paper looks at potential implications emerging from including "shares" as a control variable in fixed effect estimations. By shares I refer to the ratio of one sum of units over another, such as the share of immigrants in a city or school. As will be shown in this paper, a logarithmic transformation of shares has some methodological merits as compared to the use of shares defined as mere ratios. In certain empirical settings the use of the latter might result in coefficient estimates that, spuriously, are statistically significant more often than they should be. Keywords: consistency, Törnqvist index, symmetry, spurious significance JEL: C23