[ECM] Econometrics: working papers (RePEc, 30/08/2010)

Source: NEP (New Economics Papers) | RePEc

  • « Bayesian Estimation and Particle Filter for Max-Stable Processes »
Date: 2010-08
By: Tsuyoshi Kunihama (Graduate School of Economics, University of Tokyo)
Yasuhiro Omori (Faculty of Economics, University of Tokyo)
Zhengjun Zhang (Department of Statistics, University of Wisconsin Madison)
URL: http://d.repec.org/n?u=RePEc:tky:fseres:2010cf757&r=ecm
Extreme values are often correlated over time, for example in financial time series, and these values carry various risks. Max-stable processes such as maxima of moving maxima (M3) processes have recently been considered in the literature to describe time-dependent dynamics, which have been difficult to estimate. This paper first proposes a feasible and efficient Bayesian estimation method for nonlinear and non-Gaussian state space models based on these processes and describes a Markov chain Monte Carlo algorithm in which the sampling efficiency is improved by the normal mixture sampler. Furthermore, a unique particle filter that adapts to extreme observations is proposed and shown to be highly accurate in comparison with other well-known filters. Our proposed algorithms were applied to daily minima of high-frequency stock return data, and a model comparison was conducted using marginal likelihoods to investigate the time-dependent dynamics in extreme stock returns for financial risk management.
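The paper's filter is built specifically for max-stable (M3) dynamics; purely to fix the mechanics of particle filtering, here is a minimal bootstrap particle filter for a toy Gaussian AR(1)-plus-noise state space model. Every model choice below (the 0.9 autoregression, the noise scales, the Gaussian observation density) is an illustrative assumption, not the authors' specification:

```python
import math
import random

random.seed(0)
T, N = 50, 1000

# Toy state space model (NOT the paper's M3 model):
#   state:       x_t = 0.9 x_{t-1} + v_t,  v_t ~ N(0, 0.5^2)
#   observation: y_t = x_t + w_t,          w_t ~ N(0, 0.3^2)
xs, ys = [0.0], []
for _ in range(T):
    xs.append(0.9 * xs[-1] + random.gauss(0, 0.5))
    ys.append(xs[-1] + random.gauss(0, 0.3))

def bootstrap_filter(ys, n_particles=N):
    """Generic bootstrap particle filter: propagate, weight, resample."""
    parts = [random.gauss(0, 1) for _ in range(n_particles)]
    means = []
    for y in ys:
        # propagate each particle through the state equation
        parts = [0.9 * p + random.gauss(0, 0.5) for p in parts]
        # weight by the observation density N(y | x_t, 0.3^2)
        w = [math.exp(-0.5 * ((y - p) / 0.3) ** 2) for p in parts]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * pi for wi, pi in zip(w, parts)))
        # multinomial resampling
        parts = random.choices(parts, weights=w, k=n_particles)
    return means

est = bootstrap_filter(ys)
rmse = math.sqrt(sum((e - x) ** 2 for e, x in zip(est, xs[1:])) / T)
```

A filter adapted to extremes, as in the paper, would replace the Gaussian observation density with one matched to the max-stable structure.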
  • Asymptotically efficient estimation of the conditional expected shortfall
Date: 2010
By: Samantha Leorato (Tor Vergata University)
Franco Peracchi (Tor Vergata University, EIEF)
Andrei V. Tanase (Tor Vergata University)
URL: http://d.repec.org/n?u=RePEc:eie:wpaper:1014&r=ecm
We propose a procedure for efficient estimation of the trimmed mean of a random variable Y conditional on a set of covariates X. For concreteness, we focus on a financial application where the trimmed mean of interest corresponds to a coherent measure of risk, namely the conditional expected shortfall. Our estimator is based on the representation of the estimand as an integral of the conditional quantile function. We extend the class of estimators originally proposed by Peracchi and Tanase (2008) by introducing a weighting function that gives different weights to different conditional quantiles. Our approach allows for either parametric or nonparametric modeling of the conditional quantiles and the weights, but is essentially nonparametric in spirit. We prove consistency and asymptotic normality of the resulting estimator. Optimizing over the weighting function, we obtain asymptotic efficiency gains with respect to the unweighted estimators. The gains are especially noticeable in the case of fat-tailed distributions.
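As a baseline for the estimand, a minimal sketch of the unweighted, unconditional sample analogue of the expected shortfall: the average of the lowest α-fraction of observations, i.e. the discretized integral of the quantile function over (0, α]. This assumes i.i.d. draws and is not the authors' weighted, covariate-conditional estimator:

```python
import math
import random

def expected_shortfall(sample, alpha):
    """Left-tail ES_alpha = (1/alpha) * integral_0^alpha Q(u) du,
    approximated by averaging the lowest alpha-fraction of the sample."""
    xs = sorted(sample)
    k = max(1, int(math.floor(alpha * len(xs))))
    return sum(xs[:k]) / k

random.seed(0)
returns = [random.gauss(0.0, 1.0) for _ in range(100_000)]
es5 = expected_shortfall(returns, 0.05)
# For N(0,1) the 5% left-tail ES is about -2.06, so es5 should be close.
```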
  • Testing for Bivariate Spherical Symmetry
Date: 2010
By: Einmahl, J.H.J.
Gantner, M. (Tilburg University, Center for Economic Research)
URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:201071&r=ecm
An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The test statistic thus obtained is distribution-free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the asymptotic ones, are presented. In a simulation study, the good performance of the test is demonstrated. Furthermore, a real data example is presented.
Keywords: Asymptotic distribution;distribution-free;empirical likelihood;hypothesis test;spherical symmetry.
JEL: C12
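The paper's localized empirical likelihood test is considerably more refined than anything sketched here; purely to fix intuition, under bivariate spherical symmetry the angle atan2(y, x) is uniform on (-π, π], which even a crude Pearson χ² goodness-of-fit check can exploit. The bin count and simulated samples below are arbitrary illustrative choices:

```python
import math
import random

def angle_chi2(points, bins=8):
    """Pearson chi-square statistic for uniformity of the angles
    atan2(y, x); under spherical symmetry the angles are Uniform(-pi, pi]."""
    counts = [0] * bins
    for x, y in points:
        theta = math.atan2(y, x)  # in (-pi, pi]
        b = min(bins - 1, int((theta + math.pi) / (2 * math.pi) * bins))
        counts[b] += 1
    expected = len(points) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(1)
# Spherically symmetric sample vs. one shifted off the origin
sym = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2000)]
asym = [(random.gauss(1.5, 1), random.gauss(0, 1)) for _ in range(2000)]
stat_sym, stat_asym = angle_chi2(sym), angle_chi2(asym)
# With 8 bins the null reference is chi-square with 7 degrees of freedom.
```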
  • The asymptotic variance of semi-parametric estimators with generated regressors
Date: 2010-08
By: Jinyong Hahn
Geert Ridder
URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:23/10&r=ecm
We study the asymptotic distribution of three-step estimators of a finite dimensional parameter vector where the second step consists of one or more nonparametric regressions on a regressor that is estimated in the first step. The first step estimator is either parametric or non-parametric. Using Newey’s (1994) path-derivative method we derive the contribution of the first step estimator to the influence function. In this derivation it is important to account for the dual role that the first step estimator plays in the second step non-parametric regression, i.e., that of conditioning variable and that of argument. We consider three examples in more detail: the partial linear regression model estimator with a generated regressor, the Heckman, Ichimura and Todd (1998) estimator of the Average Treatment Effect and a semi-parametric control variable estimator.
  • Nonparametric Estimation of An Instrumental Regression: A Quasi-Bayesian Approach Based on Regularized Posterior
Date: 2010-03
By: Florens, Jean-Pierre
Simoni, Anna
URL: http://d.repec.org/n?u=RePEc:ide:wpaper:22800&r=ecm
We propose a Quasi-Bayesian nonparametric approach to estimating the structural relationship φ among endogenous variables when instruments are available. We show that the posterior distribution of φ is inconsistent in the frequentist sense. We interpret this fact as the ill-posedness of the Bayesian inverse problem defined by the relation that characterizes the structural function φ. To solve this problem, we construct a regularized posterior distribution, based on a Tikhonov regularization of the inverse of the marginal variance of the sample, which is justified by a penalized projection argument. This regularized posterior distribution is consistent in the frequentist sense and its mean can be interpreted as the mean of the exact posterior distribution resulting from a Gaussian prior distribution with a shrinking covariance operator.
JEL: C11
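The regularization in the paper acts on an operator in function space; as a finite-dimensional caricature only, assuming the inverse problem were a plain linear system K φ = y with nearly collinear K, Tikhonov (ridge) stabilization looks like this:

```python
def tikhonov(K, y, alpha):
    """Solve (K'K + alpha*I) x = K'y by Gaussian elimination with
    partial pivoting; alpha > 0 is the regularization level."""
    m, n = len(K), len(K[0])
    A = [[sum(K[r][i] * K[r][j] for r in range(m)) + (alpha if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(K[r][i] * y[r] for r in range(m)) for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

# Nearly collinear design: unregularized least squares is unstable here,
# but a small alpha keeps the solution near the underlying (1, 1).
K = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
y = [2.0, 2.0002, 1.9999]
x_reg = tikhonov(K, y, 1e-6)
```

The paper's construction replaces this matrix inverse with a regularized inverse of the marginal variance operator, and the penalty enters the posterior rather than a point estimate.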
  • Testing for spatial heterogeneity in functional MRI using the multivariate general linear model
Date: 2010
By: Leech, Robert (The Computational, Cognitive and Clinical Neuroimaging Laboratory, The Division of Experimental Medicine, Imperial College London)
Leech, Dennis (Department of Economics, University of Warwick)
URL: http://d.repec.org/n?u=RePEc:wrk:warwec:938&r=ecm
Much current research in functional MRI employs multivariate machine learning approaches (e.g., support vector machines) to detect fine-scale spatial patterns from the temporal fluctuations of the neural signal. The aim of many studies is not classification, however, but investigation of multivariate spatial patterns, which pattern classifiers detect only indirectly. Here we propose a direct statistical measure for the existence of fine-scale spatial patterns (or spatial heterogeneity) applicable for fMRI datasets. We extend the univariate general linear model (typically used in fMRI analysis) to a multivariate case. We demonstrate that contrasting maximum likelihood estimations of different restrictions on this multivariate model can be used to estimate the extent of spatial heterogeneity in fMRI data. Under asymptotic assumptions inference can be made with reference to the χ² distribution. The test statistic is then assessed using simulated time courses derived from real fMRI data. This demonstrates the utility of the proposed measure of heterogeneity as well as considerations in its application. Measuring spatial heterogeneity in fMRI has important theoretical implications in its own right and has potential uses for better characterising neurological conditions such as stroke and Alzheimer’s disease.
Keywords: Neuroimaging ; Multivariate pattern analysis ; Maximum likelihood estimation ; Seemingly unrelated regression
  • On the Design of Data Sets for Forecasting with Dynamic Factor Models
Date: 2010-07-13
By: Gerhard Rünstler (WIFO)
URL: http://d.repec.org/n?u=RePEc:wfo:wpaper:y:2010:i:376&r=ecm
Forecasts from dynamic factor models potentially benefit from refining the data set by eliminating uninformative series. The paper proposes to use forecast weights as provided by the factor model itself for this purpose. Monte Carlo simulations and an empirical application to forecasting euro area, German, and French GDP growth from unbalanced monthly data suggest that both forecast weights and least angle regressions result in improved forecasts. Overall, forecast weights provide yet more robust results.
  • The heterogeneous thresholds ordered response model: Identification and inference
Date: 2010
By: Franco Peracchi (Tor Vergata University, EIEF)
Claudio Rossetti (LUISS, ISFOL)
URL: http://d.repec.org/n?u=RePEc:eie:wpaper:1013&r=ecm
Although many surveys ask respondents to evaluate their own condition or to report their degree of satisfaction with various aspects of life, there is a persistent concern about interpersonal comparability of these self-assessments. Statistically, the problem is one of identification in ordered response models where the observed responses are derived from latent continuous random variables discretized through a set of heterogeneous thresholds or cutoff points. As a solution to the identification problem, King et al. (2004) propose the use of anchoring vignettes, namely brief descriptions of hypothetical people or situations that survey respondents are asked to evaluate on the same scale they used to rate their own situation. While vignettes have been introduced in several social surveys and are increasingly used in a variety of fields, reliability of this approach hinges crucially on the validity of the assumptions of response consistency and vignette equivalence. This paper proposes a joint test of these key assumptions based on the fact that the underlying statistical model is overidentified if the two assumptions hold. We apply our test to self-assessment on various components or domains of health using data from Release 2 of the first wave of the Survey of Health, Ageing and Retirement in Europe (SHARE). We find that, in most cases, the test rejects the overidentifying restrictions imposed by the assumptions of response consistency and vignette equivalence. Thus, our results cast doubts on the usefulness of anchoring vignettes for identifying and correcting interpersonal incomparability of answers to subjective survey questions.
  • A Class of Simple Distribution-Free Rank-Based Unit Root Tests (Revision of DP 2009-02)
Date: 2010
By: Hallin, M.
Akker, R. van den
Werker, B.J.M. (Tilburg University, Center for Economic Research)
URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:201072&r=ecm
AMS 1980 subject classification : 62G10 and 62G20.
Keywords: Unit root;Dickey-Fuller test;Local Asymptotic Normality;Rank test
JEL: C12
  • Identification of Local Treatment Effects Using a Proxy for an Instrument
Date: 2010-05-01
By: Karim Chalak (Boston College)
URL: http://d.repec.org/n?u=RePEc:boc:bocoec:738&r=ecm
The method of indirect least squares (ILS) using a proxy for a discrete instrument is shown to identify a weighted average of local treatment effects. The weights are nonnegative if and only if the proxy is intensity preserving for the instrument. A similar result holds for instrumental variables (IV) methods such as two stage least squares. Thus, one should carefully interpret estimates for causal effects obtained via ILS or IV using an error-laden proxy of an instrument, a proxy for an instrument with missing or imputed observations, or a binary proxy for a multivalued instrument. Favorably, the proxy need not satisfy all the assumptions required for the instrument. Specifically, an individual’s proxy can depend on others’ instrument and the proxy need not affect the treatment nor be exogenous. In special cases such as with binary instrument, ILS using any suitable proxy for an instrument identifies local average treatment effects.
Keywords: causality, compliance, indirect least squares, instrumental variables, local average treatment effect, measurement error, proxy, quadrant dependence, two stage least squares.
JEL: C21
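The ILS ratio with a scalar instrument is just the Wald estimator cov(y, z)/cov(d, z). A minimal simulation sketch, with an invented homogeneous effect of 2.0, a binary instrument, and a nondifferentially misclassified proxy (all illustrative assumptions; the paper's point concerns the weights on heterogeneous local effects, which a homogeneous-effect simulation cannot show):

```python
import random

def ils(y, d, z):
    """Indirect least squares with a scalar instrument (or proxy):
    cov(y, z) / cov(d, z) -- the Wald/ILS ratio."""
    n = len(z)
    my, md, mz = sum(y) / n, sum(d) / n, sum(z) / n
    cyz = sum((yi - my) * (zi - mz) for yi, zi in zip(y, z))
    cdz = sum((di - md) * (zi - mz) for di, zi in zip(d, z))
    return cyz / cdz

random.seed(2)
n = 50_000
z = [1.0 if random.random() < 0.5 else 0.0 for _ in range(n)]  # instrument
u = [random.gauss(0, 1) for _ in range(n)]                     # confounder
d = [zi + 0.5 * ui + random.gauss(0, 1) for zi, ui in zip(z, u)]
y = [2.0 * di + ui + random.gauss(0, 1) for di, ui in zip(d, u)]

beta = ils(y, d, z)            # recovers the effect 2.0
# Misclassify 10% of the instrument: an error-laden proxy for z.
w = [zi if random.random() < 0.9 else 1.0 - zi for zi in z]
beta_proxy = ils(y, d, w)      # still near 2.0 with a homogeneous effect
```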
  • Betit: A Family that Nests Probit and Logit
Date: 2010
By: Wim P.M. Vijverberg
URL: http://d.repec.org/n?u=RePEc:ess:wpaper:id:2768&r=ecm
This paper proposes a dichotomous choice model that is based on a transformed beta (or “z”) distribution. This model, called betit, nests both logit and probit and allows for various skewed and peaked disturbance densities. Because the shape of this density affects the estimated relation between the dichotomous choice variable and its determinants, the greater flexibility of the transformed beta distribution is useful in generating more accurate representations of this relationship. The paper considers asymptotic biases of the logit and probit models under conditions where betit should have been used. It also investigates small sample power and provides two examples of applications that are illustrative of the capability of the betit model. [IZA Discussion Paper No. 222]
Keywords: Dichotomous choice model, beta distribution, logit, probit
  • A control function approach to estimating dynamic probit models with endogenous regressors, with an application to the study of poverty persistence in China
Date: 2010-08-01
By: Giles, John
Murtazashvili, Irina
URL: http://d.repec.org/n?u=RePEc:wbk:wbrwps:5400&r=ecm
This paper proposes a parametric approach to estimating a dynamic binary response panel data model that allows for endogenous contemporaneous regressors. This approach is of particular value for settings in which one wants to estimate the effects of an endogenous treatment on a binary outcome. The model is next used to examine the impact of rural-urban migration on the likelihood that households in rural China fall below the poverty line. In this application, it is shown that migration is important for reducing the likelihood that poor households remain in poverty and that non-poor households fall into poverty. Furthermore, it is demonstrated that failure to control for unobserved heterogeneity would lead the researcher to underestimate the impact of migrant labor markets on reducing the probability of falling into poverty.
Keywords: Rural Poverty Reduction,Population Policies,Achieving Shared Growth,Debt Markets,Regional Economic Development
  • Expected Improvement in Efficient Global Optimization Through Bootstrapped Kriging
Date: 2010
By: Kleijnen, Jack P.C.
Beers, W.C.M. van
Nieuwenhuyse, I. van (Tilburg University, Center for Economic Research)
URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:201062&r=ecm
This paper uses a sequentialized experimental design to select simulation input combinations for global optimization, based on Kriging (also called Gaussian process or spatial correlation modeling); this Kriging is used to analyze the input/output data of the simulation model (computer code). This paper adapts the classic “expected improvement” (EI) in “efficient global optimization” (EGO) through the introduction of an unbiased estimator of the Kriging predictor variance; this estimator uses parametric bootstrapping. Classic EI and bootstrapped EI are compared through four popular test functions, including the six-hump camel-back and two Hartmann functions. These empirical results demonstrate that in some applications bootstrapped EI finds the global optimum faster than classic EI does; in general, however, the classic EI may be considered to be a robust global optimizer.
Keywords: Simulation;Optimization;Kriging;Bootstrap
JEL: C0
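The classic EI criterion the paper starts from has a closed form in the Kriging predictor mean and standard deviation; a minimal sketch for minimization (the paper's variant replaces sigma with a parametric-bootstrap estimate of the predictor standard deviation, which is not implemented here):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Classic EI for minimization at a candidate input whose predictor
    has mean mu and standard deviation sigma:
        EI = (f_min - mu) * Phi(z) + sigma * phi(z),  z = (f_min - mu) / sigma
    """
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # its density
    return (f_min - mu) * Phi + sigma * phi

# More predictor uncertainty means more expected improvement when the
# predicted mean sits above the current best observed value.
ei_small = expected_improvement(mu=1.0, sigma=0.1, f_min=0.9)
ei_large = expected_improvement(mu=1.0, sigma=1.0, f_min=0.9)
```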
  • Running and Jumping Variables in RD Designs: Evidence Based on Race, Socioeconomic Status, and Birth Weights
Date: 2010-08
By: Barreca, Alan (Tulane University)
Guldi, Melanie (Mount Holyoke College)
Lindo, Jason M. (University of Oregon)
Waddell, Glen R. (University of Oregon)
URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5106&r=ecm
Throughout the years spanned by the U.S. Vital Statistics Linked Birth and Infant Death Data (1983-2002), birth weights are measured most precisely for children of white and highly educated mothers. As a result, less healthy children, who are more likely to be of low socioeconomic status, are disproportionately represented at multiples of round numbers. This has crucial implications for any study using a regression discontinuity design in which birth weights are used as the running variable. For example, estimates will be biased in a manner that leads one to conclude that it is “good” to be strictly to the left of any 100-gram cutoff. As such, prior estimates of the effects of very low birth weight classification (Almond, Doyle, Kowalski, and Williams 2010) have been overstated and appear to be zero. This analysis highlights a more general problem that can afflict regression discontinuity designs. In cases where attributes related to the outcomes of interest predict heaping in the running variable, estimated effects are likely to be biased. We discuss approaches to diagnosing and correcting for this type of problem.
Keywords: regression discontinuity, donut RD, birth weight, infant mortality
JEL: C21
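The heaping mechanism is easy to reproduce on invented data: rounding a subset of birth weights to the nearest 100 g creates visible heaps at exactly the cutoffs an RD design would use (all numbers below are illustrative, not from the Vital Statistics data):

```python
import random

def heap_share(weights, grid=100):
    """Fraction of observations lying exactly on a multiple of `grid`."""
    return sum(1 for w in weights if w % grid == 0) / len(weights)

random.seed(3)
# Hypothetical precisely measured weights, in grams
precise = [int(random.gauss(3300, 500)) for _ in range(10_000)]
# Suppose 20% of records are only measured to the nearest 100 g:
rounded = [round(w / 100) * 100 for w in precise[:2_000]] + precise[2_000:]
share_precise = heap_share(precise)   # about 1%, by chance alone
share_rounded = heap_share(rounded)   # roughly 21%: a visible heap
```

A diagnostic like this motivates the “donut RD” mentioned in the keywords, which drops observations at the heaped values.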
  • Sibuya copulas
Date: 2010-08
By: Marius Hofert
Frederic Vrins
URL: http://d.repec.org/n?u=RePEc:arx:papers:1008.2292&r=ecm
The standard intensity-based approach for modeling defaults is generalized by making the deterministic term structure of the survival probability stochastic via a common jump process. The survival copula of the vector of default times is derived and it is shown to be explicit and of the functional form as dealt with in the work of Sibuya. Besides the parameters of the jump process, the marginal survival functions of the default times appear in the copula. Sibuya copulas therefore allow for functional parameters and asymmetries. Due to the jump process in the construction, they allow for a singular component. Depending on the parameters, they may also be extreme-value copulas or Lévy-frailty copulas. Further, Sibuya copulas are easy to sample in any dimension. Properties of Sibuya copulas including positive lower orthant dependence, tail dependence, and extremal dependence are investigated. An application to pricing first-to-default contracts is outlined and further generalizations of this copula class are addressed.
  • Linking Granger Causality and the Pearl Causal Model with Settable Systems
Date: 2010-08-01
By: Halbert White (University of California-San Diego)
Karim Chalak (Boston College)
Xun Lu (Hong Kong University of Science and Technology)
URL: http://d.repec.org/n?u=RePEc:boc:bocoec:744&r=ecm
The causal notions embodied in the concept of Granger causality have been argued to belong to a different category than those of Judea Pearl’s Causal Model, and so far their relation has remained obscure. Here, we demonstrate that these concepts are in fact closely linked by showing how each relates to straightforward notions of direct causality embodied in settable systems, an extension and refinement of the Pearl Causal Model designed to accommodate optimization, equilibrium, and learning. We then provide straightforward practical methods to test for direct causality using tests for Granger causality.
Keywords: Causal Models, Conditional Exogeneity, Conditional Independence, Granger Non-causality
JEL: C12
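Granger non-causality itself is testable with a plain F-test on lag exclusion; a minimal single-lag sketch on simulated data (the settable-systems notion of direct structural causality discussed in the paper is not captured by this predictive test):

```python
import random

def ols_rss(X, y):
    """OLS residual sum of squares via the normal equations (Gaussian
    elimination with partial pivoting; fine for these tiny systems)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((yi - sum(bi * xi for bi, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def granger_F(cause, effect):
    """F statistic for excluding one lag of `cause` from an AR(1)
    regression of `effect`; ~ F(1, T-3) under Granger non-causality."""
    T = len(effect)
    Xr = [[1.0, effect[t - 1]] for t in range(1, T)]
    Xu = [[1.0, effect[t - 1], cause[t - 1]] for t in range(1, T)]
    yy = effect[1:]
    rss_r, rss_u = ols_rss(Xr, yy), ols_rss(Xu, yy)
    return (rss_r - rss_u) / (rss_u / (len(yy) - 3))

random.seed(4)
x, y = [random.gauss(0, 1)], [random.gauss(0, 1)]
for t in range(1, 500):               # x Granger-causes y, not vice versa
    x.append(0.5 * x[t - 1] + random.gauss(0, 1))
    y.append(0.3 * y[t - 1] + 0.8 * x[t - 1] + random.gauss(0, 1))

F_xy = granger_F(x, y)   # large: lagged x helps predict y
F_yx = granger_F(y, x)   # small: lagged y does not help predict x
```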
  • The “Meteorological” and the “Engineering” Type of Econometric Inference: a 1943 Exchange between Trygve Haavelmo and Jakob Marschak
Date: 2010-08-11
By: Bjerkholt, Olav (Dept. of Economics, University of Oslo)
URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2010_007&r=ecm
The article presents an exchange of letters between Jakob Marschak and Trygve Haavelmo in May-July 1943. Marschak had from the beginning of 1943 become the research director of the Cowles Commission at the University of Chicago. Trygve Haavelmo, who at the time worked for the Norwegian Shipping and Trade Mission in New York, had just published the article on the statistical implications of simultaneous equations, which would become his most quoted work. The content and the implications of the article were at the centre of the letter exchange. The introduction provides some background for the exchange.
Keywords: history; econometrics
JEL: B23
