Econometrics: working papers (RePEc, 30/11/2010)

Source: NEP (New Economics Papers) | RePEc

  • Semiparametric Quantile Regression Estimation in Dynamic Models with Partially Varying Coefficients
Date: 2010-11-22
By: Zongwu Cai (Department of Mathematics & Statistics, University of North Carolina at Charlotte)
Zhijie Xiao (Boston College)
URL: http://d.repec.org/n?u=RePEc:boc:bocoec:761&r=ecm
We study quantile regression estimation for dynamic models with partially varying coefficients, so that the values of some coefficients may be functions of informative covariates. Estimators of both the parametric and the nonparametric functional coefficients are proposed. In particular, we propose a three-stage semiparametric procedure. Both consistency and asymptotic normality of the proposed estimators are derived. We demonstrate that the parametric estimators are root-n consistent and that the estimation of the functional coefficients enjoys the oracle property. In addition, the efficiency of parameter estimation is discussed and a simple efficient estimator is proposed. A simple and easily implemented test for the hypothesis of varying coefficients is also proposed. A Monte Carlo experiment is conducted to evaluate the performance of the proposed estimators.
Keywords: Efficiency; nonlinear time series; partially linear; partially varying coefficients; quantile regression; semiparametric
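A quick illustration of the building block behind all quantile regression methods, including this one: minimizing the "check" (pinball) loss recovers a quantile. The sketch below is plain Python and illustrative only, not the authors' three-stage semiparametric procedure; it finds a sample quantile by scanning candidate values.

```python
# The "check" (pinball) loss at the core of quantile regression:
# rho_tau(u) = u * (tau - 1{u < 0}). Minimizing it over a constant
# recovers the sample tau-quantile.

def pinball_loss(u, tau):
    """Check function rho_tau(u)."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def empirical_quantile(ys, tau):
    """Return the value in ys minimizing the total pinball loss."""
    return min(ys, key=lambda q: sum(pinball_loss(y - q, tau) for y in ys))

data = [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(empirical_quantile(data, 0.5))   # the median
print(empirical_quantile(data, 0.9))   # an upper quantile
```

In the regression setting the constant is replaced by a function of covariates (here, one with partially varying coefficients), but the loss being minimized is the same.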
  • Z-Estimators and Auxiliary Information under Weak Dependence
Date: 2010
By: Federico Crudu
URL: http://d.repec.org/n?u=RePEc:cns:cnscwp:201022&r=ecm
In this paper we introduce a weighted Z-estimator for moment condition models in the presence of auxiliary information on the unknown distribution of the data under the assumption of weak dependence. The resulting weighted estimator is shown to be consistent and asymptotically normal. Its small sample properties are checked via Monte Carlo experiments.
Keywords: Z-estimators; M-estimators; GMM; Generalized Empirical Likelihood; blocking techniques; α-mixing.
JEL: C12
  • A Smoothed-Distribution Form of Nadaraya-Watson Estimation
Date: 2010-11
By: Ralph W Bailey
John T Addison
URL: http://d.repec.org/n?u=RePEc:bir:birmec:10-30&r=ecm
Keywords: nonparametric regression; Nadaraya-Watson; kernel density; conditional expectation estimator; conditional variance estimator; local polynomial estimator
JEL: C14
  • Specification Analysis of Structural Quantile Regression Models
Date: 2010-11-19
By: Juan Carlos Escanciano
Chuan Goh
URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-415&r=ecm
This paper introduces a broad family of tests for the hypothesis of linearity in parameters of functions that are identified by conditional quantile restrictions involving instrumental variables. These tests are tantamount to assessments of lack of fit for quantile regression models involving endogenous conditioning variables, and may be applied to assess the validity of post-estimation inferences regarding the counterfactual effect of endogenous treatments on the distribution of outcomes. We show that the use of an orthogonal projection on the tangent space of nuisance parameters at each quantile index improves power performance and facilitates the simulation of critical values via the application of simple multiplier-type bootstrap procedures. Monte Carlo evidence is included, along with an application to an empirical analysis of the structure of demand for a particular subsegment of the market for anti-bacterial drugs in India.
Keywords: Quantile regression, instrumental variables, structural models
JEL: C12
  • Nonparametric Estimation of the Fractional Derivative of a Distribution Function
Date: 2010
By: Andreea Borla (GREQAM – Groupement de Recherche en Économie Quantitative d’Aix-Marseille – Université de la Méditerranée – Aix-Marseille II – Université Paul Cézanne – Aix-Marseille III – Ecole des Hautes Etudes en Sciences Sociales (EHESS) – CNRS : UMR6579)
Costin Protopopescu (GREQAM – Groupement de Recherche en Économie Quantitative d’Aix-Marseille – Université de la Méditerranée – Aix-Marseille II – Université Paul Cézanne – Aix-Marseille III – Ecole des Hautes Etudes en Sciences Sociales (EHESS) – CNRS : UMR6579)
URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00536979_v1&r=ecm
We propose an estimator for the fractional derivative of a distribution function. Our estimator, based on finite differences of the empirical distribution function, generalizes the estimator proposed by Maltz for the nonnegative real case. The asymptotic bias, the variance and the consistency of the estimator are studied. Finally, the optimal choice of the "smoothing parameter" shows that, even in the fractional case, Stone's rate of convergence is achieved.
Keywords: fractional derivative; nonparametric estimation; distribution function; generalized differences
  • Asymptotic Distributions of the Least Squares Estimator for Diffusion Processes
Date: 2010-01
By: Qiankun Zhou (School of Economics, Singapore Management University)
Jun Yu (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:20-2010&r=ecm
The asymptotic distributions of the least squares estimator of the mean reversion parameter (κ) are developed in a general class of diffusion models under three sampling schemes, namely, long-span, in-fill, and the combination of long-span and in-fill. The models have an affine structure in the drift function, but allow for nonlinearity in the diffusion function. The limiting distributions are quite different under the alternative sampling schemes. In particular, the in-fill limiting distribution is non-standard and depends on the initial condition and the time span, whereas the other two are Gaussian. Moreover, while the other two distributions are discontinuous at κ = 0, the in-fill distribution is continuous in κ. This property provides an answer to the Bayesian criticism of the unit root asymptotics. Monte Carlo simulations suggest that the in-fill asymptotic distribution provides a more accurate approximation to the finite sample distribution than the other two distributions in empirically realistic settings. The empirical application using the U.S. federal funds rate highlights the difference in statistical inference based on the alternative asymptotic distributions and suggests strong evidence of a unit root in the data.
Keywords: Vasicek Model, One-factor Model, Mean Reversion, In-fill Asymptotics, Long-span Asymptotics, Unit Root Test
JEL: C12
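For intuition, a minimal sketch of the least squares estimator of κ in the simplest member of this model class, a Vasicek/Ornstein-Uhlenbeck process simulated with its exact discretization. The parameter values and step size below are illustrative only; the paper's asymptotic results cover far more general diffusions.

```python
# LS estimation of the mean-reversion parameter kappa in
# dX = kappa*(mu - X)dt + sigma*dW, via the exact AR(1) discretisation
# X_{t+1} = mu + (X_t - mu)*exp(-kappa*h) + noise.
import math
import random

random.seed(42)

kappa, mu, sigma, h = 0.5, 0.0, 0.2, 0.1
a = math.exp(-kappa * h)                          # implied AR(1) coefficient
sd = sigma * math.sqrt((1 - a ** 2) / (2 * kappa))  # exact transition st. dev.

n = 50_000
x = [0.0]
for _ in range(n):
    x.append(mu + (x[-1] - mu) * a + random.gauss(0, sd))

# LS slope of X_{t+1} on X_t, then invert kappa_hat = -log(slope)/h
xbar = sum(x[:-1]) / n
ybar = sum(x[1:]) / n
num = sum((x[t] - xbar) * (x[t + 1] - ybar) for t in range(n))
den = sum((x[t] - xbar) ** 2 for t in range(n))
kappa_hat = -math.log(num / den) / h
print(round(kappa_hat, 3))
```

With a long time span (T = nh = 5000 here) the estimate sits close to the true κ = 0.5, the long-span regime of the paper; the interesting in-fill distortions appear when T is short and h shrinks instead.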
  • Bias-Corrected Estimation for Spatial Autocorrelation
Date: 2010-10
By: Zhenlin Yang (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:12-2010&r=ecm
The bias arising from maximum likelihood estimation of the spatial autoregressive (SAR) model is further investigated under a broader set-up than that in Bao and Ullah (2007a). A major difficulty in analytically evaluating the expectations of ratios of quadratic forms is overcome by a simple bootstrap procedure. With that, corrections to the bias and variance of the spatial estimator can easily be made up to third order, and once this is done, the estimators of the other model parameters become nearly unbiased. Compared with the analytical approach, the new approach is much simpler and can easily be extended to other models of a similar structure. Extensive Monte Carlo results show that the new approach performs excellently in general.
Keywords: Third-order bias; Third-order variance; Bootstrap; Concentrated estimating equation; Monte Carlo; Quasi-MLE; Spatial layout.
JEL: C10
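The bootstrap bias-correction idea, stripped of the spatial context: estimate an estimator's bias by the gap between the average of bootstrap replicates and the original estimate, then subtract it. A toy sketch with the biased n-denominator variance estimator, not the paper's SAR setting:

```python
# Generic bootstrap bias correction on a deliberately biased estimator.
import random

random.seed(1)

def var_mle(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)   # biased by (n-1)/n

sample = [random.gauss(0, 1) for _ in range(30)]
est = var_mle(sample)

B = 1000
boot = []
for _ in range(B):
    resample = [random.choice(sample) for _ in sample]
    boot.append(var_mle(resample))

bias_hat = sum(boot) / B - est    # bootstrap estimate of the bias
corrected = est - bias_hat        # bias-corrected estimator
print(round(est, 3), round(corrected, 3))
```

The corrected value moves upward, as it should for an estimator that understates the variance; the paper applies the same logic to the much harder ratios of quadratic forms in the SAR likelihood, and iterates to third order.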
  • Capturing the Zero: A New Class of Zero-Augmented Distributions and Multiplicative Error Processes
Date: 2010-11
By: Nikolaus Hautsch
Peter Malec
Melanie Schienle
URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010-055&r=ecm
We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed on high frequencies, such as cumulated trading volumes or the time between potentially simultaneously occurring market events. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and of strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of liquid NYSE stocks, we show that the model captures both the dynamic and distribution properties of the data very well and is able to correctly predict future distributions.
Keywords: high-frequency data, point-mass mixture, multiplicative error model, excess zeros, semiparametric specification test, market microstructure
JEL: C22
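The distributional building block is easy to sketch: a point-mass mixture puts probability π on exactly zero and spreads the rest over a continuous positive distribution. The snippet below uses an exponential positive part purely for illustration; the paper's dynamic MEM and binary-choice components are not modeled here.

```python
# A minimal zero-augmented ("point-mass mixture") distribution.
import random

random.seed(7)

def draw_zero_augmented(pi, scale):
    if random.random() < pi:
        return 0.0                        # point mass at zero
    return random.expovariate(1 / scale)  # strictly positive component

n = 20_000
draws = [draw_zero_augmented(pi=0.3, scale=2.0) for _ in range(n)]
zero_share = sum(d == 0.0 for d in draws) / n
mean_pos = sum(d for d in draws if d > 0) / sum(d > 0 for d in draws)
print(round(zero_share, 3), round(mean_pos, 2))
```

The empirical zero share recovers π and the positive part keeps its own mean, which is exactly the separation the zero-augmented MEM exploits when it gives the two components separate dynamics.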
  • Simulation-based Estimation Methods for Financial Time Series Models
Date: 2010-10
By: Jun Yu (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:19-2010&r=ecm
This chapter overviews some recent advances in simulation-based methods for estimating financial time series models that are widely used in financial economics. The simulation-based methods have proven to be particularly useful when the likelihood function and moments do not have tractable forms, and hence the maximum likelihood (ML) method and the generalized method of moments (GMM) are difficult to use. They are also capable of improving the finite sample performance of the traditional methods. Both frequentist and Bayesian simulation-based methods are reviewed. Frequentist simulation-based methods cover various forms of simulated maximum likelihood (SML), the simulated generalized method of moments (SGMM), the efficient method of moments (EMM), and the indirect inference (II) method. Bayesian simulation-based methods cover various MCMC algorithms. Each simulation-based method is discussed in the context of a specific financial time series model as a motivating example. Empirical applications, based on real exchange rates, interest rates and equity data, illustrate how the simulation-based methods are implemented. In particular, SML is applied to a discrete time stochastic volatility model, EMM to a continuous time stochastic volatility model, MCMC to a credit risk model, and the II method to a term structure model.
Keywords: Generalized method of moments, Maximum likelihood, MCMC, Indirect inference, Credit risk, Stock price, Exchange rate, Interest rate.
  • A New Bayesian Unit Root Test in Stochastic Volatility Models
Date: 2010-10
By: Yong Li (Business School, Sun Yat-Sen University)
Jun Yu (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:21-2010&r=ecm
A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business & Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of diverging “size” in the marginal likelihood approach. Second, to improve the “power” of the unit root test, a mixed prior specification with random weights is employed. It is shown that the posterior odds ratio is a by-product of Bayesian estimation and can be easily computed by MCMC methods. A simulation study examines the “size” and “power” performance of the new method. An empirical study, based on time series data covering the subprime crisis, reveals some interesting results.
Keywords: Bayes factor; Mixed Prior; Markov Chain Monte Carlo; Posterior odds ratio; Stochastic volatility models; Unit root testing.
  • Standardized LM Tests for Spatial Error Dependence in Linear or Panel Regressions
Date: 2010-09
By: Badi H. Baltagi (Department of Economics and Center for Policy Research, Syracuse University)
Zhenlin Yang (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:11-2010&r=ecm
The robustness of the LM tests for spatial error dependence of Burridge (1980) for the linear regression model and Anselin (1988) for the panel regression model is examined. While both tests are asymptotically robust against distributional misspecification, their finite sample behavior can be sensitive to the spatial layout. To overcome this shortcoming, standardized LM tests are suggested. Monte Carlo results show that the new tests possess good finite sample properties. An important observation made throughout this study is that the LM tests for spatial dependence need to be both mean- and variance-adjusted for good finite sample performance to be achieved; the former adjustment is, however, often neglected in the literature.
Keywords: Distributional misspecification; Group interaction; LM test; Moran’s I Test; Robustness; Spatial panel models.
JEL: C23
  • Nonparametric modeling and forecasting electricity demand: an empirical study
Date: 2010-10-18
By: Han Lin Shang
URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2010-19&r=ecm
This paper uses half-hourly electricity demand data in South Australia as an empirical study of nonparametric modeling and forecasting methods for prediction from half an hour ahead to one year ahead. A notable feature of the univariate time series of electricity demand is the presence of both intraweek and intraday seasonalities. An intraday seasonal cycle is apparent from the similarity of the demand from one day to the next, and an intraweek seasonal cycle is evident from comparing the demand on the corresponding day of adjacent weeks. There is a strong appeal in using forecasting methods that are able to capture both seasonalities. In this paper, the forecasting methods slice a seasonal univariate time series into a time series of curves. The forecasting methods reduce the dimensionality by applying functional principal component analysis to the observed data, and then utilize a univariate time series forecasting method and functional principal component regression techniques. When data points in the most recent curve are sequentially observed, updating methods can improve the point and interval forecast accuracy. We also revisit a nonparametric approach to constructing prediction intervals of updated forecasts, and evaluate the interval forecast accuracy.
Keywords: Functional principal component analysis; functional time series; multivariate time series; ordinary least squares; penalized least squares; ridge regression; seasonal time series
JEL: C88
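The slicing step is simple to illustrate. Below, a half-hourly toy series is cut into 48-point daily curves and forecast with a seasonal-naive functional rule (same weekday, one week earlier), which captures both seasonalities on a perfectly periodic series; the paper replaces this naive step with functional principal components and a univariate forecaster per score.

```python
# Slice a half-hourly series into daily curves; forecast curves one week back.
import math

P = 48                                     # half-hours per day
days = 21                                  # three weeks of data
series = [math.sin(2 * math.pi * t / P) + (1.0 if (t // P) % 7 >= 5 else 0.0)
          for t in range(P * days)]        # intraday cycle + weekend level shift

curves = [series[d * P:(d + 1) * P] for d in range(days)]  # one curve per day

def seasonal_naive(history, horizon_days=1):
    """Forecast each future day by the curve of the same weekday one week earlier."""
    return [history[len(history) - 7 + h] for h in range(horizon_days)]

forecast = seasonal_naive(curves[:14], horizon_days=7)     # predict days 14..20
actual = curves[14:21]
err = max(abs(f - a) for fc, ac in zip(forecast, actual) for f, a in zip(fc, ac))
print(err < 1e-6)   # exact (up to rounding) on a perfectly weekly-periodic series
```

Real demand is not exactly periodic, of course, which is why the paper projects the curves onto principal components and forecasts the scores instead of repeating last week verbatim.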
  • Untested Assumptions and Data Slicing: A Critical Review of Firm-Level Production Function Estimators
Date: 2010
By: Markus Eberhardt
Christian Helmers
URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:513&r=ecm
This paper surveys the most popular parametric and semi-parametric estimators for Cobb-Douglas production functions arising from the econometric literature of the past two decades. We focus on the different approaches dealing with ‘transmission bias’ in firm-level studies, which arises from firms’ reaction to unobservable productivity realisations when making input choices. The contribution of the paper is threefold: we provide applied economists with (i) an in-depth discussion of the estimation problem and the solutions suggested in the literature; (ii) a detailed empirical example using FAME data for UK high-tech firms, emphasising analytical tools to investigate data properties and the robustness of the empirical results; (iii) a powerful illustration of the impact of estimator choice on TFP estimates, using matched data on patents in ‘TFP regressions’. Our discussion concludes that while from a theoretical point of view the different estimators are conceptually very similar, in practice, the choice of the preferred estimator is far from arbitrary and instead requires in-depth analysis of the data properties rather than blind belief in asymptotic consistency.
Keywords: Productivity, production function, UK firms, panel data estimates
JEL: D21
  • Approach to Analysis of Self-Selected Interval Data
Date: 2010-02-15
By: Belyaev, Yuri (Centre of Biostochastics, SLU-Umeå)
Kriström, Bengt (CERE, SLU-Umeå and Umeå University)
URL: http://d.repec.org/n?u=RePEc:hhs:slucer:2010_002&r=ecm
We analyze an approach to quantitative information elicitation in surveys that includes many currently popular variants as special cases. Rather than asking the individual to state a point estimate or select between given brackets, the individual can self-select any interval of choice. We propose a new estimator for such interval censored data. It can be viewed as an extension of Turnbull’s estimator (Turnbull, 1976) for interval censored data. A detailed empirical example is provided, using a survey on the valuation of a public good. We estimate survival functions based on a Weibull and a mixed Weibull/exponential distribution and prove that a consistent maximum likelihood estimator exists and that its accuracy can be consistently estimated by re-sampling methods in these two families of distributions.
Keywords: Interval data; Maximum Likelihood; Turnbull estimator; willingness-to-pay; quantitative elicitation
JEL: C25
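The likelihood mechanics are easy to show in miniature: when a respondent reports only an interval [a, b], the contribution is F(b) - F(a). The sketch below fits a one-parameter exponential model by grid search on simulated unit-width intervals; the paper works with Weibull and mixed Weibull/exponential families and a Turnbull-type nonparametric estimator.

```python
# Interval-censored maximum likelihood with an exponential survival model.
import math
import random

random.seed(3)

theta_true = 2.0
# Each value is reported only as the unit-width interval [floor(x), floor(x) + 1)
intervals = []
for _ in range(3000):
    x = random.expovariate(1 / theta_true)
    intervals.append((math.floor(x), math.floor(x) + 1))

def loglik(theta):
    # Exponential CDF F(t) = 1 - exp(-t/theta); each interval contributes
    # log(F(b) - F(a)) = log(exp(-a/theta) - exp(-b/theta))
    return sum(math.log(math.exp(-a / theta) - math.exp(-b / theta))
               for a, b in intervals)

grid = [1.0 + 0.01 * k for k in range(201)]   # theta in [1.0, 3.0]
theta_hat = max(grid, key=loglik)
print(theta_hat)
```

Nothing here uses the exact values, only the brackets, yet the scale parameter is recovered; self-selected intervals simply make the pairs (a, b) respondent-specific rather than fixed by the survey designer.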
  • Estimating the GARCH Diffusion: Simulated Maximum Likelihood in Continuous Time
Date: 2010-01
By: Tore Selland Kleppe (Department of Mathematics, University of Bergen)
Jun Yu (School of Economics, Singapore Management University)
Hans J. Skaug (Department of Mathematics, University of Bergen)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:13-2010&r=ecm
A new algorithm is developed to provide a simulated maximum likelihood estimation of the GARCH diffusion model of Nelson (1990) based on return data only. The method combines two accurate approximation procedures, namely, the polynomial expansion of Aït-Sahalia (2008) to approximate the transition probability density of return and volatility, and the Efficient Importance Sampler (EIS) of Richard and Zhang (2007) to integrate out the volatility. The first and second order terms in the polynomial expansion are used to generate a base-line importance density for an EIS algorithm. The higher order terms are included when evaluating the importance weights. Monte Carlo experiments show that the new method works well and the discretization error is well controlled by the polynomial expansion. In the empirical application, we fit the GARCH diffusion to equity data, perform diagnostics on the model fit, and test the finiteness of the importance weights.
Keywords: Efficient importance sampling; GARCH diffusion model; Simulated maximum likelihood; Stochastic volatility
JEL: C11
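The EIS component rests on plain importance sampling: draw from a proposal concentrated where the integrand matters and reweight by the density ratio. A minimal example of that mechanic, unrelated to volatility: estimating the normal tail probability P(Z > 3) with a proposal shifted into the tail.

```python
# Importance sampling for a normal tail probability.
import math
import random

random.seed(11)

def phi(x, m=0.0):
    """Normal density with mean m and unit variance."""
    return math.exp(-(x - m) ** 2 / 2) / math.sqrt(2 * math.pi)

n = 50_000
total = 0.0
for _ in range(n):
    x = random.gauss(3.0, 1.0)            # proposal centred in the tail
    if x > 3.0:
        total += phi(x) / phi(x, 3.0)     # importance weight = target/proposal
p_hat = total / n
print(round(p_hat, 5))                    # true value: 1 - Phi(3), about 0.00135
```

A naive Monte Carlo estimate at this sample size would be dominated by noise; the EIS algorithm in the paper goes further by fitting the proposal itself, per observation, to the integrand of the likelihood.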
  • Scale-dependence of the Negative Binomial Pseudo-Maximum Likelihood Estimator
Date: 2010
By: Clément Bosquet (GREQAM – Groupement de Recherche en Économie Quantitative d’Aix-Marseille – Université de la Méditerranée – Aix-Marseille II – Université Paul Cézanne – Aix-Marseille III – Ecole des Hautes Etudes en Sciences Sociales (EHESS) – CNRS : UMR6579)
Hervé Boulhol (CES – Centre d’économie de la Sorbonne – CNRS : UMR8174 – Université Panthéon-Sorbonne – Paris I)
URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00535594_v1&r=ecm
Following Santos Silva and Tenreyro (2006), various studies have used the Poisson Pseudo-Maximum Likelihood to estimate gravity specifications of trade flows and non-count data models more generally. Some papers also report results based on the Negative Binomial estimator, which is more general and encompasses the Poisson assumption as a special case. This note shows that the Negative Binomial estimator is inappropriate when applied to a continuous dependent variable whose unit of measurement is arbitrary, because the estimates artificially depend on that choice of unit.
Keywords: pseudo-maximum likelihood methods; negative binomial estimator; Poisson regression; gamma PML
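The invariance half of the note's point can be checked directly: the Poisson PML first-order conditions imply that multiplying y by a constant c shifts only the intercept, by log c, leaving slopes unchanged. A sketch on simulated data, with the Newton/IRLS iterations written out by hand rather than taken from an econometrics package:

```python
# Poisson PML for a continuous dependent variable: slope invariant to units.
import math
import random

random.seed(5)

xs = [random.uniform(0, 2) for _ in range(400)]
ys = [math.exp(0.3 + 0.7 * xi) * random.uniform(0.5, 1.5) for xi in xs]

def poisson_pml(xs, ys, iters=50):
    """Fit E[y|x] = exp(a + b*x) by Poisson pseudo-maximum likelihood (Newton)."""
    a, b = math.log(sum(ys) / len(ys)), 0.0
    for _ in range(iters):
        g0 = g1 = s0 = s1 = s2 = 0.0
        for xi, yi in zip(xs, ys):
            mu = math.exp(a + b * xi)
            g0 += yi - mu                  # score w.r.t. intercept
            g1 += (yi - mu) * xi           # score w.r.t. slope
            s0 += mu; s1 += mu * xi; s2 += mu * xi * xi   # Fisher information
        det = s0 * s2 - s1 * s1
        a += (s2 * g0 - s1 * g1) / det     # Newton update, 2x2 solve
        b += (s0 * g1 - s1 * g0) / det
    return a, b

a1, b1 = poisson_pml(xs, ys)
a2, b2 = poisson_pml(xs, [1000.0 * yi for yi in ys])  # change the unit of y
print(round(b2 - b1, 6), round(a2 - a1, 3))  # slope unchanged; intercept shifts by log(1000)
```

Repeating the exercise with a Negative Binomial pseudo-likelihood would break this invariance, which is the note's warning.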
  • Can We Trust Cluster-Corrected Standard Errors? An Application of Spatial Autocorrelation with Exact Locations Known
Date: 2011-08-18
By: John Gibson (University of Waikato)
Bonggeun Kim (Seoul National University)
Susan Olivia (Monash University)
URL: http://d.repec.org/n?u=RePEc:wai:econwp:10/07&r=ecm
Standard error corrections for clustered samples impose untested restrictions on spatial correlations. Our example shows that these restrictions are too conservative compared with a spatial error model that exploits information on the exact locations of observations, causing inference errors when cluster corrections are used.
Keywords: clustered samples; GPS; spatial correlation
JEL: C31
  • Algebraic theory of identification in parametric models
Date: 2010-05
By: Kociecki, Andrzej
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:26820&r=ecm
The article presents the problem of identification in parametric models from an algebraic point of view. We argue that this is not just another perspective but the proper one. That is, using our approach we can see the very nature of the identification problem, which is slightly different from that suggested in the literature. In practice, it means that in many models we can unambiguously estimate parameters that have been thought to be unidentifiable. This is illustrated in the case of the Simultaneous Equations Model (SEM), where our analysis leads to the conclusion that the existing identification conditions, although correct, are based on an inappropriate premise: that only the structural parameters that are in one-to-one correspondence with the reduced form parameters are identified. We show that this is not true. In fact, there are other structural parameters which are identified but cannot be uniquely recovered from the reduced form parameters. Although we apply our theory only to the SEM, it can be used in many standard econometric models.
Keywords: identification; group theory
JEL: C01
  • A Non-parametric Approach to Incorporating Incomplete Workouts Into Loss Given Default Estimates
Date: 2010-11-16
By: Rapisarda, Grazia
Echeverry, David
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:26797&r=ecm
When estimating Loss Given Default (LGD) parameters using a workout approach, i.e. discounting cash flows over the workout period, the problem arises of how to take into account partial recoveries from incomplete workouts. The simplest approach would base LGD on complete recovery profiles only. Whilst simple, this approach may lead to data selection bias, which may be at the basis of regulatory guidance requiring an assessment of the relevance of incomplete workouts to LGD estimation. Despite its importance, few academic contributions have covered this topic. We enhance this literature by developing a non-parametric estimator that, under certain distributional assumptions on the recovery profiles, aggregates complete and incomplete workout data to produce unbiased and more efficient estimates of mean LGD than those obtained from the estimator based on resolved cases only. Our estimator is appropriate for LGD estimation in wholesale portfolios, where the exposure-weighted LGD estimators available in the literature would not be applicable under Basel II regulatory guidance.
Keywords: Credit risk; bank loans; loss-given-default; LGD; incomplete observations; mortality curves
JEL: C14
  • Sharp IV bounds on average treatment effects under endogeneity and noncompliance
Date: 2010-11
By: Martin Huber
Giovanni Mellace
URL: http://d.repec.org/n?u=RePEc:usg:dp2010:2010-31&r=ecm
In the presence of an endogenous treatment and a valid instrument, causal effects are (nonparametrically) point identified only for the subpopulation of compliers, given that the treatment is monotone in the instrument. Further populations of likely policy interest have been widely ignored in econometrics. Therefore, we use treatment monotonicity and/or stochastic dominance assumptions to derive sharp bounds on the average treatment effects of the treated population, the entire population, the compliers, the always takers, and the never takers. We also provide an application to labor market data and briefly discuss testable implications of the instrumental exclusion restriction and stochastic dominance.
Keywords: Instrument, noncompliance, principal stratification, nonparametric bounds
JEL: C14
  • Spectral Analysis of Non-Stationary Time Series
Date: 2010
By: D M NACHANE
URL: http://d.repec.org/n?u=RePEc:ess:wpaper:id:3191&r=ecm
The aim of this paper is to take stock of the important recent contributions to spectral analysis, especially as they apply to non-stationary processes. Non-stationary processes are particularly relevant in the empirical sciences, where most phenomena exhibit pronounced departures from stationarity.
Keywords: spectral analysis, non-stationary processes, empirical sciences, time series
  • Estimation of the Semiparametric Factor Model: Application to Modelling Time Series of Electricity Spot Prices.
Date: 2010
By: Liebl, Dominik
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:26800&r=ecm
Classical univariate and multivariate time series models have difficulty dealing with the high variability of hourly electricity spot prices. We propose instead to model the daily mean electricity supply functions using a dynamic factor model, and subsequently to derive the hourly electricity spot prices by evaluating the estimated supply functions at the corresponding hourly values of demand for electricity. Supply functions are price (EUR/MWh) functions that increase monotonically with demand for electricity (MW). Apart from this new conceptual approach, which allows us to represent the auction design of energy exchanges in a most natural way, our main contribution is an extraordinarily simple algorithm to estimate the factor structure of the dynamic factor model. We decompose the time series into a functional spherical component and a univariate scaling component. The elements of the spherical component are all standardized to unit size, so that we can robustly estimate the factor structure. This algorithm is much simpler than the procedures suggested in the literature. In order to use a parsimonious labeling, we refer to the daily mean supply curves simply as price curves.
Keywords: Factor Analysis; functional time series data; sparse data; electricity spot market prices; European Electricity Exchange (EEX)
JEL: C14
  • Should macroeconomic forecasters use daily financial data and how?
Date: 2010-11
By: Elena Andreou
Eric Ghysels
Andros Kourtellos
URL: http://d.repec.org/n?u=RePEc:ucy:cypeua:9-2010&r=ecm
We introduce easy-to-implement, regression-based methods for predicting quarterly real economic activity that use daily financial data and rely on forecast combinations of MIDAS regressions. Our analysis is designed to elucidate the value of daily information and provide real-time forecast updates of the current (nowcasting) and future quarters. Our findings show that while on average the predictive ability of all models worsens substantially following the financial crisis, the models we propose suffer relatively smaller losses than the traditional ones. Moreover, these predictive gains are primarily driven by the classes of government securities, equities, and especially corporate risk.
Keywords: MIDAS, macro forecasting, leads, daily financial information, daily factors.
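A sketch of the MIDAS ingredient the abstract relies on: a low-dimensional weight function that aggregates roughly a quarter's worth of daily observations into a single quarterly regressor. The two-parameter exponential Almon form below is a common choice; the parameter values and the daily series are placeholders only.

```python
# Exponential Almon lag weights for aggregating daily data in a MIDAS regression.
import math

def exp_almon_weights(n_lags, theta1, theta2):
    """w_k proportional to exp(theta1*k + theta2*k^2), normalised to sum to 1."""
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(1, n_lags + 1)]
    s = sum(raw)
    return [r / s for r in raw]

# Aggregate ~one quarter (66 trading days) into a single regressor
daily = [0.1 * d for d in range(66)]            # placeholder daily series
w = exp_almon_weights(66, theta1=0.05, theta2=-0.01)
x_quarterly = sum(wi * di for wi, di in zip(w, daily))
print(round(sum(w), 6), round(x_quarterly, 3))
```

In estimation, (theta1, theta2) are fitted jointly with the regression slope, so a single pair of parameters governs the whole daily lag profile; the paper then combines forecasts across many such regressions.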
  • Forecasting Compositional Time Series with Exponential Smoothing Methods
Date: 2010-11
By: Anne B. Koehler
Ralph D. Snyder
J. Keith Ord
Adrian Beaumont
URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2010-20&r=ecm
Compositional time series are formed from measurements of proportions that sum to one in each period of time. We might be interested in forecasting the proportion of home loans that have adjustable rates, the proportion of nonagricultural jobs in manufacturing, the proportion of a rock’s geochemical composition that is a specific oxide, or the proportion of an election betting market choosing a particular candidate. A problem may involve many related time series of proportions. There could be several categories of nonagricultural jobs or several oxides in the geochemical composition of a rock that are of interest. In this paper we provide a statistical framework for forecasting these special kinds of time series. We build on the innovations state space framework underpinning the widely used methods of exponential smoothing. We couple this with a generalized logistic transformation to convert the measurements from the unit interval to the entire real line. The approach is illustrated with two applications: the proportion of new home loans in the U.S. that have adjustable rates; and four probabilities for specified candidates winning the 2008 Democratic presidential nomination.
Keywords: compositional time series, innovations state space models, exponential smoothing, forecasting proportions
JEL: C22
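The transformation step can be sketched compactly: map proportions to the real line with a log-ratio, smooth there, and map back so forecasts are automatically valid proportions. The snippet uses an additive log-ratio with simple exponential smoothing as a stand-in for the paper's generalized logistic transform inside a full innovations state space model.

```python
# Forecast proportions that sum to one via a log-ratio transform.
import math

def to_logratio(p):
    """Additive log-ratio transform w.r.t. the last category."""
    return [math.log(pi / p[-1]) for pi in p[:-1]]

def from_logratio(z):
    """Inverse transform (softmax with the last category as baseline)."""
    e = [math.exp(zi) for zi in z] + [1.0]
    s = sum(e)
    return [ei / s for ei in e]

def ses_forecast(history, alpha=0.3):
    """Simple exponential smoothing; returns the one-step-ahead level."""
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Three shares per period, each row summing to one
shares = [[0.5, 0.3, 0.2], [0.55, 0.28, 0.17], [0.6, 0.25, 0.15]]
zs = [to_logratio(p) for p in shares]
z_fc = [ses_forecast([z[i] for z in zs]) for i in range(2)]
forecast = from_logratio(z_fc)
print([round(f, 3) for f in forecast])
```

Whatever the smoothing does on the transformed scale, the back-transformed forecast is a proper composition: non-negative components that sum to one.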
  • Corrigendum to “A Gaussian Approach for Continuous Time Models of the Short Term Interest Rate”
Date: 2010-10
By: Peter C.B. Phillips (Yale University)
Jun Yu (School of Economics, Singapore Management University)
URL: http://d.repec.org/n?u=RePEc:siu:wpaper:18-2010&r=ecm
An error is corrected in Yu and Phillips (2001) (Econometrics Journal, 4, 210-224), where a time transformation was used to induce Gaussian disturbances in the discrete time equivalent model. It is shown that the error process in this model is not a martingale and the Dambis, Dubins-Schwarz (DDS) theorem is not directly applicable. However, a detrended error process is a martingale, the DDS theorem is applicable, and the corresponding stopping time correctly induces Gaussianity. We show that the two stopping time sequences differ by O(a²), where a is the pre-specified normalized timing constant.
Keywords: Nonlinear Diffusion, Normalizing Transformation, Level Effect, DDS Theorem.
  • Posterior Predictive Analysis for Evaluating DSGE Models
Date: 2010-10-30
By: Faust, Jon
Gupta, Abhishek
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:26721&r=ecm
In this paper, we develop and apply tools to evaluate the strengths and weaknesses of dynamic stochastic general equilibrium (DSGE) models. In particular, the paper makes three contributions: first, it argues the need for such tools to evaluate the usefulness of these models; second, it defines these tools, which take the form of prior and particularly posterior predictive analysis, and provides illustrations; and third, it justifies the use of these tools in the DSGE context against the standard criticisms of them.
Keywords: Prior and posterior predictive analysis; DSGE Model Evaluation; Monetary Policy.
JEL: C52
  • How useful is the carry-over effect for short-term economic forecasting?
Date: 2010
By: Tödter, Karl-Heinz
URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:201021&r=ecm
The carry-over effect is the advance contribution of the old year to growth in the new year. Among practitioners, the informative content of the carry-over effect for short-term forecasting is undisputed, and it is used routinely in economic forecasting. In this paper, the carry-over effect is analysed statistically, and it is shown how it reduces the uncertainty of short-term economic forecasts. This is followed by an empirical analysis of the carry-over effect using simple forecast models as well as Bundesbank and Consensus projections.
Keywords: forecast uncertainty, growth rates, carry-over effect, variance contribution, Chebyshev density
JEL: C53
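The carry-over effect itself is one line of arithmetic: the annual growth rate the new year would record if activity simply stayed flat at the old year's final-quarter level.

```python
# Carry-over (statistical overhang) from one year's quarterly levels.
def carry_over(quarterly_levels_old_year):
    q = quarterly_levels_old_year
    return q[-1] / (sum(q) / len(q)) - 1   # Q4 level vs. the old year's average

# A year with steady within-year growth already "carries" growth into the next
levels = [100.0, 101.0, 102.0, 103.0]
print(round(carry_over(levels), 4))
```

Because this contribution is known at the turn of the year, it is a forecast component with zero forecast error, which is the source of the uncertainty reduction the paper quantifies.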
