Quantile Regression: Estimation and Simulation
Marilena Furno, Domenico Vistocco (Wiley Series in Probability and Statistics, Wiley, 2018).
Volume two of Quantile Regression offers an important guide for applied researchers, drawing on the same example-based approach adopted for the first volume. The text explores topics including robustness, expectiles, M-quantiles, decomposition, time series, elemental sets and linear programming.
Graphical representations are widely used to introduce the various issues visually and to illustrate each method. All topics are treated both theoretically and through real-data examples. Designed as a practical resource, the book is thorough without getting too technical about the statistical background.
The authors cover a wide range of QR models useful in several fields. The software commands in R and Stata are available in the appendixes and featured on the accompanying website.
Written for researchers and students in the fields of statistics, economics, econometrics, and the social and environmental sciences, this text offers a guide to the theory and application of quantile regression models.
About this book
- Provides an overview of several technical topics such as robustness of quantile regressions, bootstrap and elemental sets, treatment effect estimators
- Compares quantile regression with alternative estimators like expectiles, M-estimators and M-quantiles
- Offers a general introduction to linear programming, focusing on the simplex method as the solution method for the quantile regression problem
- Considers time-series issues like non-stationarity, spurious regressions, cointegration, conditional heteroskedasticity via quantile regression
- Offers an analysis that is both theoretical and practical
- Presents real data examples and graphical representations to explain the technical issues
Table of contents
Chapter 1: Robust regression
This chapter focuses on the shortcomings of the OLS estimator in the linear regression model by analyzing an artificial data set, discussed by Anscombe, where OLS yields the same estimated coefficients in four different models, but the result is correct only in one of them. It considers the behavior of the median regression in the Anscombe data and in two additional models analyzing real data, in order to show the robustness of the quantile regression approach. The chapter also considers the influence function, a statistic that measures the impact of outliers on an estimator, together with its empirical approximation, the sample influence function. Examples with real and artificial data sets compare the influence functions of the OLS and median regression estimators in samples characterized by different kinds of outliers. The chapter concludes with some diagnostic measures designed to detect outliers.
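The book's own examples use R and Stata; as a language-agnostic sketch (not the authors' code), the following pure-Python fragment illustrates the robustness contrast the chapter describes: a single gross outlier shifts the OLS slope, while the median (L1) regression is unaffected. The data and function names here are illustrative; the L1 fit is found by enumerating two-point elemental sets, anticipating the approach of Chapter 3.

```python
from itertools import combinations

def ols_slope(x, y):
    """Closed-form OLS slope for a simple regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def l1_fit(x, y):
    """Median regression by enumerating all two-point elemental sets:
    in the bivariate case an optimal L1 line passes through two points."""
    best = None
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])  # candidate slope
        a = y[i] - b * x[i]                # candidate intercept
        sae = sum(abs(yi - a - b * xi) for xi, yi in zip(x, y))
        if best is None or sae < best[0]:
            best = (sae, a, b)
    return best[1], best[2]                # (intercept, slope)

# Five points exactly on y = 1 + 2x, then one gross outlier appended.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1 + 2 * xi for xi in x]
x_out, y_out = x + [6.0], y + [40.0]

print(ols_slope(x, y), ols_slope(x_out, y_out))  # OLS slope jumps
print(l1_fit(x, y)[1], l1_fit(x_out, y_out)[1])  # L1 slope stays at 2.0
```

The outlier pulls the OLS slope well above the true value of 2, while the sum of absolute residuals is still minimized by a line through two of the clean points.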
Chapter 2: Quantile regression and related methods
The detailed analysis of a regression model at various points of the conditional distribution, allowed by quantile regressions, can be imported into the least squares framework. This implies investigating the dependence among variables not only at the conditional mean but also in the tails, just as occurs in the quantile regression framework. Analogously to quantile regressions, expectiles make it possible to estimate a regression model away from the conditional mean. M-quantiles merge the M-estimators and the expectiles approaches: the M-quantile estimator combines the outlier-curbing weighting system of the M-estimators with the asymmetric weights that define the location of the expectiles. The purpose is to robustly compute the regression coefficients at different points of the conditional distribution of the dependent variable. M-quantile estimators are frequently used in small-area estimation. Examples analyzing real and artificial data sets point out the characteristics of the above estimators.
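As a minimal sketch (again not the book's R/Stata code), the tau-expectile of a sample can be computed from the asymmetric weighting that defines it: observations above the current estimate receive weight tau, those below receive 1 - tau, and the expectile is the fixed point of the resulting weighted mean. The function name is illustrative.

```python
def expectile(y, tau, tol=1e-10, max_iter=1000):
    """tau-expectile of a sample: the fixed point of an asymmetrically
    weighted mean, with weight tau above the estimate and 1 - tau below."""
    e = sum(y) / len(y)  # start at the mean, which is the 0.5-expectile
    for _ in range(max_iter):
        w = [tau if yi > e else 1 - tau for yi in y]
        e_new = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        if abs(e_new - e) < tol:
            return e_new
        e = e_new
    return e

data = [float(v) for v in range(1, 10)]
print(expectile(data, 0.5))                        # 5.0, the sample mean
print(expectile(data, 0.1), expectile(data, 0.9))  # below / above the mean
```

Replacing the single location by a linear predictor, and adding an M-estimator-style bound on the weights, gives the M-quantile idea described above.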
Chapter 3: Resampling, subsampling, and quantile regression
This chapter focuses on subsampling and resampling methods to further analyze the characteristics of the quantile regression estimator. It considers the elemental sets approach, which provides a very intuitive tool to compare OLS and quantile regressions. The chapter discusses alternative interpretations of the quantile regression estimators, based respectively on the p-dimensional subsets and on the use of the bootstrap when the former approach is infeasible due to the large size of the sample. For comparison, it briefly reviews the asymptotic distribution of the non-extreme quantile regression estimator. The asymptotics of intermediate- and central-order quantile regression estimators could be obtained by modifying the extreme-value approach. The chapter analyzes an additional quantile regression approach relying on the bootstrap, the quantile treatment effect (QTE) estimator, which is implemented to evaluate the impact of a treatment or a policy.
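The pairs bootstrap the chapter relies on can be sketched in a few lines of pure Python (illustrative, not the book's code): resample (x, y) pairs with replacement, refit the median regression on each resample, and take the standard deviation of the resampled slopes as the standard error. The elemental-set fit here is the bivariate shortcut from Chapter 3; data and names are assumptions for the example.

```python
import random
from itertools import combinations

def l1_fit(x, y):
    """Bivariate median regression via two-point elemental sets;
    returns (intercept, slope), or None if all x values coincide."""
    best = None
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])
        a = y[i] - b * x[i]
        sae = sum(abs(yi - a - b * xi) for xi, yi in zip(x, y))
        if best is None or sae < best[0]:
            best = (sae, a, b)
    return None if best is None else (best[1], best[2])

def xy_bootstrap_se(x, y, reps=200, seed=0):
    """Pairs (x, y) bootstrap standard error of the L1 slope."""
    rng = random.Random(seed)
    n, slopes = len(x), []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]  # resample with replacement
        fit = l1_fit([x[i] for i in idx], [y[i] for i in idx])
        if fit is not None:                         # skip degenerate resamples
            slopes.append(fit[1])
    m = sum(slopes) / len(slopes)
    return (sum((s - m) ** 2 for s in slopes) / (len(slopes) - 1)) ** 0.5

noise = [0.2, -0.1, 0.3, -0.2, 0.1, 0.0, -0.3, 0.2, -0.1, 0.1]
x = [float(i) for i in range(10)]
y = [1.0 + 2.0 * xi + ni for xi, ni in zip(x, noise)]
print(xy_bootstrap_se(x, y))  # bootstrap SE of the slope estimate
```

The same resampling scheme extends directly to any quantile and to the QTE estimator, where the statistic recomputed on each resample is the treatment effect itself.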
Chapter 4: A not so short introduction to linear programming
Quantile regression is a statistical method suitable to model the whole conditional distribution of a response variable in terms of a set of explanatory variables. Its wide dissemination began with its formulation as a linear programming problem. This chapter proposes a (not so short) journey into the world of linear programming. Mathematics is softened in order to focus on geometric intuition and teaching examples. The general simplex algorithm and its main variants are presented and then applied to generic problems. The two-phase method resorts to a twofold application of the simplex algorithm: the first to detect an initial basis, if it exists, and the second to look for the optimal solution. The chapter aims to enable the reader to manage the linear programming machinery and to apply it to the quantile regression framework.
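The geometric intuition the chapter builds on can be sketched as follows (a toy illustration, not the simplex algorithm itself): the optimum of a linear program lies at a vertex of the feasible polyhedron, so for a two-variable problem one can simply intersect every pair of constraint boundaries, keep the feasible intersections, and evaluate the objective at each. The simplex method walks between adjacent vertices instead of enumerating them all. The problem data below are illustrative.

```python
from itertools import combinations

# Constraints a1*x + a2*y <= b (including the sign constraints x, y >= 0)
cons = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]
obj = (3, 2)  # maximize 3x + 2y

def vertices(cons):
    """Intersect every pair of constraint boundaries; keep feasible points."""
    pts = []
    for (a1, a2, b1), (c1, c2, b2) in combinations(cons, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < 1e-12:
            continue                      # parallel boundaries, no vertex
        px = (b1 * c2 - a2 * b2) / det    # Cramer's rule for the 2x2 system
        py = (a1 * b2 - b1 * c1) / det
        if all(a * px + c * py <= b + 1e-9 for a, c, b in cons):
            pts.append((px, py))
    return pts

best = max(vertices(cons), key=lambda p: obj[0] * p[0] + obj[1] * p[1])
print(best, obj[0] * best[0] + obj[1] * best[1])  # optimum at a vertex
```

Vertex enumeration is exponential in general, which is exactly why the simplex method's guided walk, and the two-phase trick for finding a starting vertex, matter in practice.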
Chapter 5: Linear programming for quantile regression
This chapter outlines the algorithm for solving the quantile regression problem using a small data set, detailing the various steps of the procedure. The machinery behind the use of linear programming for solving regression problems is first presented for the case of median regression and then extended to the more general quantile regression. The chapter also introduces the median regression problem in the case of a univariate regression. A geometric interpretation of the minimization problem characterizing quantile regression is also outlined using the point/line duality introduced by Edgeworth in 1888 in his pioneering work on median methods for linear regression. Edgeworth proposed a general formulation of median regression involving a multivariate vector of explanatory variables, naming it the plural median. An alternative formulation of the L1 problem in terms of LP directly provides the signed residuals.
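A minimal sketch of the residual decomposition behind that LP formulation (not the book's code; the data are illustrative): each residual is split as r_i = u_i - v_i with u_i, v_i >= 0, and minimizing the LP objective sum(u_i + v_i) equals minimizing the sum of absolute residuals, so the optimal u_i and v_i are the signed residuals themselves. Here the L1 optimum is recovered via elemental sets rather than a full LP solver.

```python
from itertools import combinations

# Data: y = 1 + 2x with two of the five points perturbed.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.0, 5.5, 7.0, 8.5, 11.0]

def l1_fit(x, y):
    """Median regression via elemental sets; the LP optimum coincides
    with a line through two observations. Returns (SAE, a, b)."""
    best = None
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])
        a = y[i] - b * x[i]
        sae = sum(abs(yi - a - b * xi) for xi, yi in zip(x, y))
        if best is None or sae < best[0]:
            best = (sae, a, b)
    return best

sae, a, b = l1_fit(x, y)
# LP decomposition: r_i = u_i - v_i with u_i, v_i >= 0, and at the
# optimum u_i * v_i = 0, so sum(u_i + v_i) equals the sum of |r_i|.
r = [yi - a - b * xi for xi, yi in zip(x, y)]
u = [max(ri, 0.0) for ri in r]   # positive parts of the residuals
v = [max(-ri, 0.0) for ri in r]  # negative parts of the residuals
print(sum(ui + vi for ui, vi in zip(u, v)), sae)  # the two objectives agree
```

The complementarity u_i * v_i = 0 is what makes the LP solution deliver the signed residuals directly, as the chapter notes.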
Chapter 6: Correlation
This chapter considers estimation and inference in the case of stationary and non-stationary autoregressive processes as estimated by quantile regressions. It reports the autocorrelations of the residuals of the estimated AR(1) model, computed at the conditional mean. Tests of stationarity are implemented together with other closely related tests, although the latter are not specifically defined for the quantile regression model. The presence of a unit root, besides causing a nonstandard distribution of the t test, has an additional relevant implication in a regression model. The cases of spurious regressions and of cointegrated variables are discussed in simulated data sets and for the consumption function. The test for cointegration leads to the analysis of changing-coefficient models and to the test functions defined to detect them. The quantile regression conditionally heteroskedastic model concludes the chapter by further analyzing the inflation rate series.
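The spurious-regression phenomenon mentioned above can be reproduced in a few lines (a pure-Python sketch, not the book's R/Stata code; seed and sample size are arbitrary): regressing one random walk on another, independently generated one typically produces an R^2 far from zero, even though the two series are unrelated.

```python
import random

rng = random.Random(42)
n = 500

def random_walk(n, rng):
    """Unit-root process: y_t = y_{t-1} + e_t with standard normal shocks."""
    series, level = [], 0.0
    for _ in range(n):
        level += rng.gauss(0.0, 1.0)
        series.append(level)
    return series

x = random_walk(n, rng)
y = random_walk(n, rng)  # independent of x by construction

# OLS of y on x; with two unrelated random walks the R^2 is often
# substantial -- the spurious-regression phenomenon.
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
ss_res = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - my) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(r2)
```

Differencing the series, or testing for cointegration before regressing in levels, is the standard remedy the chapter develops, here in the quantile regression setting.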