### Predicting Economic Recessions Using Machine Learning Algorithms

Abstract: Even at the beginning of 2008, the economic recession of 2008/09 was not being predicted. The failure to predict recessions is a persistent theme in economic forecasting. The Survey of Professional Forecasters (SPF) provides data on predictions made for the growth of total output, GDP, in the United States for one, two, three and four quarters ahead since the end of the 1960s. Over a three quarters ahead horizon, the mean prediction made for GDP growth has never been negative over this period. The correlation between the mean SPF three quarters ahead forecast and the data is very low, and over the most recent 25 years is not significantly different from zero.

Here, we show that the machine learning technique of random forests has the potential to give early warning of recessions. We use a small set of explanatory variables from financial markets which would have been available to a forecaster at the time of making the forecast. We train the algorithm over the 1970Q2-1990Q1 period, and make predictions one, three and six quarters ahead. We then re-train over 1970Q2-1990Q2 and make a further set of predictions, and so on. We did not attempt any optimisation of predictions, using only the default input parameters of the algorithm in the R package we downloaded.

We compare the predictions made from 1990 to the present with the actual data. One quarter ahead, the algorithm is not able to improve on the SPF predictions. Three and six quarters ahead, the correlations between actual and predicted are low, but they are very significantly different from zero. Although the timing is slightly wrong, a serious downturn in the first half of 2009 could have been predicted six quarters ahead in late 2007. The algorithm never predicts a recession when one did not occur.
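
A minimal sketch of the expanding-window retraining scheme described above, here with scikit-learn in Python rather than the R package mentioned in the abstract; the financial predictors and the recession indicator are synthetic stand-ins, not the paper's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_quarters = 120
X = rng.normal(size=(n_quarters, 4))   # stand-ins for financial market predictors
# synthetic recession flag (hypothetical data-generating process)
y = (X[:, 0] + 0.5 * rng.normal(size=n_quarters) < -1.0).astype(int)

horizon = 3          # predict three quarters ahead
start = 60           # initial training window, analogous to 1970Q2-1990Q1
preds = []
for t in range(start, n_quarters - horizon):
    # re-train on everything available up to quarter t, default hyperparameters
    clf = RandomForestClassifier(random_state=0)
    clf.fit(X[:t], y[horizon:t + horizon])   # pair X at quarter i with outcome at i+horizon
    preds.append(clf.predict(X[t:t + 1])[0])
preds = np.array(preds)
```

Each forecast uses only information available at the forecast date, mirroring the paper's real-time design; no performance claim is intended for this toy data.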

We obtain even stronger results with random forest machine learning techniques in the case of the United Kingdom.

### ETF Arbitrage Under Liquidity Mismatch

Abstract: A natural liquidity mismatch emerges when liquid exchange traded funds (ETFs) hold relatively illiquid assets. We provide a theory and empirical evidence showing that this liquidity mismatch can reduce market efficiency and increase the fragility of these ETFs. We focus on corporate bond ETFs and examine the role of authorized participants (APs) in ETF arbitrage. In addition to their role as dealers in the underlying bond market, APs also play a unique role in arbitrage between the bond and ETF markets since they are the only market participants that can trade directly with ETF issuers. Using novel and granular AP-level data, we identify a conflict between APs’ dual roles as bond dealers and as ETF arbitrageurs. When this conflict is small, liquidity mismatch reduces the arbitrage capacity of ETFs; as the conflict increases, an inventory management motive arises that may even distort ETF arbitrage, leading to large relative mispricing. These findings suggest an important risk in ETF arbitrage.

### Predicting Bankruptcy with Support Vector Machines

Abstract: The purpose of this work is to introduce one of the most promising among recently developed statistical techniques – the support vector machine (SVM) – to corporate bankruptcy analysis. An SVM is implemented for analysing such predictors as financial ratios. A method of adapting it to default probability estimation is proposed. A survey of practically applied methods is given. This work shows that support vector machines are capable of extracting useful information from financial data, although extensive data sets are required in order to fully utilize their classification power.
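
As a hedged illustration, an SVM can be fit on financial-ratio features and its decision values mapped to probabilities via Platt scaling (scikit-learn's `probability=True`); the ratios and default labels below are simulated, and this generic probability mapping is only a stand-in for the adaptation proposed in the paper:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 600
ratios = rng.normal(size=(n, 3))   # hypothetical leverage, profitability, liquidity ratios
default = (ratios[:, 0] - ratios[:, 1]
           + rng.normal(scale=0.5, size=n) > 1.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(ratios, default, random_state=0)
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)
pd_hat = svm.predict_proba(X_te)[:, 1]   # Platt-scaled default probability estimates
accuracy = svm.score(X_te, y_te)
```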

### Variance Risk Premia on Stocks and Bonds

Abstract: We study equity (EVRP) and Treasury variance risk premia (TVRP) jointly and document a number of novel facts: First, relative to their volatility, TVRP are comparable in magnitude to EVRP. Second, while there is mild positive co-movement between EVRP and TVRP unconditionally, time series estimates of correlation display distinct spikes in both directions and have been notably more volatile since the financial crisis. Third, (i) short maturity TVRP predict excess returns on short maturity bonds; (ii) long maturity TVRP and EVRP predict excess returns on long maturity bonds; and (iii) while EVRP predict equity returns for horizons up to 6 months, long maturity TVRP contain robust information for equity returns at longer horizons. Finally, exploiting the dynamics of real and nominal Treasuries, we present evidence that expected inflation is a powerful determinant of the joint dynamics of EVRP and TVRP and of their co-movement. We argue this result is consistent with inflation playing a signalling role in a deflationary economic environment.

### Diversify and Purify Factor Premiums in Equity Markets

Abstract: In this paper we consider the question of how to improve the efficacy of strategies designed to capture factor premiums in equity markets, in particular from the value, quality, low risk and momentum factors. We consider a number of portfolio construction approaches designed to capture factor premiums with appropriate levels of risk control aimed at increasing information ratios. We show that information ratios can be increased by targeting constant volatility over time, hedging market beta and hedging exposures to the size factor, i.e. neutralizing biases in the market capitalization of stocks used in factor strategies. With regard to the neutralization of sector exposures, we find this to be important in particular for the value and low risk factors. Next, we look at the added value of shorting stocks in factor strategies. We find that, with few exceptions, the contributions to performance from the short leg are inferior to those from the long leg. Thus, long-only strategies can be efficient alternatives for capturing these factor premiums. Finally, we find that factor premiums tend to have fatter tails than would be expected from a Gaussian distribution of returns, but that skewness is not significantly negative in most cases.

### FFT Based Option Pricing

Abstract: The Black-Scholes formula, one of the major breakthroughs of modern finance, allows for an easy and fast computation of option prices. But some of its assumptions, like constant volatility or log-normal distribution of asset prices, do not find justification in the markets. More complex models, which take into account the empirical facts, often lead to more computations, and this time burden can become a severe problem when computation of many option prices is required, e.g. in calibration of the implied volatility surface. To overcome this problem, Carr and Madan (1999) developed a fast method to compute option prices for a whole range of strikes. This method and its application are the theme of this chapter. In Section 1.3, we briefly discuss the Merton, Heston and Bates models, concentrating on aspects relevant for the option pricing method. In the following section, we present the method of Carr and Madan, which is based on the fast Fourier transform (FFT) and can be applied to a variety of models. We also consider briefly some further developments and give a short introduction to the FFT algorithm. In the last section, we apply the method to the three analyzed models, check the results by Monte Carlo simulations and comment on some numerical issues.
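
A compact sketch of the Carr-Madan method under Black-Scholes dynamics, where the closed-form price provides a cross-check; for the Merton, Heston or Bates models one would substitute the corresponding characteristic function. The damping parameter `alpha` and grid sizes are illustrative choices:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S0, K, r, sigma, T):
    # Black-Scholes closed form, used only as a cross-check
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def carr_madan_calls(S0, r, sigma, T, alpha=1.5, N=4096, eta=0.25):
    # characteristic function of log S_T under Black-Scholes;
    # swap in a Heston/Merton/Bates CF to price under those models
    def phi(u):
        mu = np.log(S0) + (r - 0.5 * sigma**2) * T
        return np.exp(1j * u * mu - 0.5 * sigma**2 * T * u**2)

    v = eta * np.arange(N)
    psi = np.exp(-r * T) * phi(v - 1j * (alpha + 1)) / (
        alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
    lam = 2 * np.pi / (N * eta)          # log-strike spacing
    b = 0.5 * N * lam                    # log-strike grid runs over [-b, b)
    w = (3.0 + (-1.0) ** np.arange(1, N + 1)) / 3.0   # Simpson weights
    w[0] = 1.0 / 3.0
    x = np.exp(1j * v * b) * psi * eta * w
    k = -b + lam * np.arange(N)
    prices = np.exp(-alpha * k) / np.pi * np.real(np.fft.fft(x))
    return np.exp(k), prices             # strikes and call prices for all strikes

strikes, prices = carr_madan_calls(S0=100.0, r=0.05, sigma=0.2, T=1.0)
atm_fft = np.interp(np.log(100.0), np.log(strikes), prices)
atm_bs = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

One FFT yields prices on the whole strike grid at once, which is exactly the speed advantage the abstract describes for implied volatility calibration.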

### Quantile Regression in Risk Calibration

Abstract: Financial risk control has always been challenging, and has now become an even harder problem as joint extreme events occur more frequently. For decision makers and government regulators, it is therefore important to obtain accurate information on the interdependency of risk factors. Given a stressful situation for one market participant, one would like to measure how this stress affects other factors. The CoVaR (Conditional VaR) framework has been developed for this purpose. The basic technical elements of CoVaR estimation are two levels of quantile regression: one on market risk factors; the other on an individual risk factor. Tests on the functional form of the two-level quantile regression reject linearity. A flexible semiparametric modeling framework for CoVaR is proposed. A partial linear model (PLM) is analyzed. In applying the technology to stock data covering the crisis period, the PLM outperforms during the crisis time, as justified by backtesting procedures. Moreover, using data on global stock market indices, the analysis of the marginal contribution of risk (MCR), defined as the local first order derivative of the quantile curve, sheds some light on the source of global market risk.

### TVICA - Time Varying Independent Component Analysis and Its Application to Financial Data

Abstract: Source extraction and dimensionality reduction are important in analyzing high dimensional and complex financial time series that are neither Gaussian distributed nor stationary. The independent component analysis (ICA) method can be used to factorize the data into a linear combination of independent components, so that the high dimensional problem is converted to a set of univariate ones. However, conventional ICA methods implicitly assume stationarity or stochastic homogeneity of the analyzed time series, which leads to low estimation accuracy when the stochastic structure changes. A time varying ICA (TVICA) is proposed here. The key idea is to allow the ICA filter to change over time, and to estimate it in so-called local homogeneous intervals. The question of how to identify these intervals is solved by the LCP (local change point) method. Compared to a static ICA, the dynamic TVICA provides good performance in both simulation and real data analysis. The data example is concerned with independent signal processing and deals with a portfolio of highly traded stocks.
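
A crude stand-in for the local-homogeneity idea: re-estimate the ICA demixing on fixed local windows of a series whose mixing matrix breaks midway. The paper instead selects the intervals adaptively via the LCP method, which is not reproduced here:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
T = 600
# two non-Gaussian sources (binary and Laplace)
s = np.column_stack([np.sign(rng.normal(size=T)), rng.laplace(size=T)])
A1 = np.array([[1.0, 0.5], [0.2, 1.0]])    # mixing matrix before the break
A2 = np.array([[1.0, -0.6], [0.8, 1.0]])   # mixing matrix after the break
X = np.vstack([s[:300] @ A1.T, s[300:] @ A2.T])

window = 300
unmixed = []
for start in range(0, T, window):
    # one ICA fit per local window, so the demixing filter can change over time
    ica = FastICA(n_components=2, random_state=0)
    unmixed.append(ica.fit_transform(X[start:start + window]))
recovered = np.vstack(unmixed)
```

A single static ICA fit over the full sample would average the two mixing regimes, which is precisely the failure mode the abstract describes.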

### Identifying Price Bubble Periods in the Energy Sector

Abstract: In this paper we test for the existence of single and multiple bubble periods in three energy sector indexes and five energy sector spot prices using the Supremum Augmented Dickey-Fuller (SADF) test and the Generalized SADF (GSADF). These techniques allow us to estimate the beginning and the end of bubble periods. Our results provide strong statistically significant evidence of speculative bubbles in all energy sector price series tested, with the exception of the natural gas price series, where we observe only weak explosive episodes. Our results are valuable to energy analysts, investors, and oil importing and exporting countries.

### Empirical Pricing Kernels and Investor Preferences

Abstract: This paper analyzes empirical market utility functions and pricing kernels derived from the DAX and DAX option data for three market regimes. A consistent parametric framework of stochastic volatility is used. All empirical market utility functions show a region of risk proclivity that is reproduced by adopting the hypothesis of heterogeneous individual investors whose utility functions have a switching point between bullish and bearish attitudes. The inverse problem of finding the distribution of individual switching points is formulated in the space of stock returns by discretization as a quadratic optimization problem. The resulting distributions vary over time and correspond to different market regimes.

### Forecast Based Pricing of Weather Derivatives

Abstract: Forecast based pricing of Weather Derivatives (WDs) is a new approach to the valuation of contingent claims on nontradable underlyings. Standard techniques are based on historical weather data. Forward-looking information such as meteorological forecasts or the implied market price of risk (MPR) is often not incorporated. We adopt a risk neutral approach (for each location) that allows the incorporation of meteorological forecasts in the framework of WD pricing. We study weather Risk Premiums (RPs) implied from either the MPR or the meteorological forecasts. The size of RPs is interesting for investors and issuers of weather contracts seeking to take advantage of geographic diversification, hedging effects and price determination. By conducting an empirical analysis of London and Rome WD data traded at the Chicago Mercantile Exchange (CME), we find that incorporating either the MPR or the forecasts outperforms the standard pricing techniques.

### Common Functional Implied Volatility Analysis

Abstract: Trading, hedging and risk analysis of complex option portfolios depend on accurate pricing models. The modelling of implied volatilities (IV) plays an important role, since volatility is the crucial parameter in the Black-Scholes (BS) pricing formula. It is well known from empirical studies that the volatilities implied by observed market prices exhibit patterns known as volatility smiles or smirks that contradict the assumption of constant volatility in the BS pricing model. On the other hand, the IV is a function of two parameters: the strike price and the time to maturity, and it is desirable in practice to reduce the dimension of this object and characterize the IV surface through a small number of factors. Clearly, a dimension-reduced pricing model that should reflect the dynamics of the IV surface needs to contain factors and factor loadings that characterize the IV surface itself and their movements across time.

### Speculation and Power Law

Abstract: It is now well established empirically that financial price changes are distributed according to a power law, with cubic exponent. This is a fascinating regularity, as it holds for various classes of securities, on various markets, and on various time scales. The universality of this law suggests that there must be some basic, general and stable mechanism behind it. The standard (neoclassical) paradigm implies no such mechanism. Agent-based models of financial markets, on the other hand, exhibit realistic price changes, but they involve relatively complicated, and often mathematically intractable, mechanisms. This paper identifies a simple principle behind the power law: the feedback intrinsic to the very idea of speculation, namely buying when one expects a price rise (and selling when one expects a price fall). Through this feedback, price changes follow a random coefficient autoregressive process, and therefore they have a power law by Kesten's theorem.
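
The mechanism can be illustrated directly: a random coefficient autoregression x_t = a_t x_{t-1} + b_t with light-tailed coefficients generates power law tails by Kesten's theorem. With the scale chosen below, the tail exponent is roughly 3, in line with the cubic law; the specific distributions are illustrative choices, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 100_000
# light-tailed random coefficients; scale 0.85 gives E[log|a|] < 0 (stationarity)
# while E[|a|^kappa] = 1 near kappa = 3, i.e. an approximately cubic tail exponent
a = rng.normal(scale=0.85, size=T)
b = rng.normal(size=T)

x = np.zeros(T)
for t in range(1, T):
    x[t] = a[t] * x[t - 1] + b[t]   # random coefficient AR(1): a Kesten process

# under power law tails the sample kurtosis blows up far past the Gaussian value of 3
kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2
```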

### The Interaction of Skewness and Analysts' Forecast Dispersion in Asset Pricing

Abstract: I develop a new asset pricing theory that bridges two seemingly unrelated pricing effects from separate literatures: (1) the negative relationship between ex-ante return skewness and expected returns and (2) the negative relationship between dispersion in financial analysts' earnings forecasts and expected returns. I show that both effects arise intrinsically from market clearing of stochastic demand in a standard noisy rational expectations economy that incorporates skewed assets followed by financial analysts. Positive correlation between forecast dispersion and investor heterogeneity arises endogenously. The theory generates several novel testable predictions regarding the interaction of ex-ante skewness and forecast dispersion on asset prices.

### Copula-Based Factor Model for Credit Risk Analysis

Abstract: A standard quantitative method to assess credit risk employs a factor model based on joint multivariate normal distribution properties. By extending a one-factor Gaussian copula model to make a more accurate default forecast, this paper proposes to incorporate a state-dependent recovery rate into the conditional factor loading, and to model them by sharing a unique common factor. The common factor governs the default rate and recovery rate simultaneously and creates their association implicitly. In accordance with Basel III, this paper shows that the tendency to default is governed more by systematic risk than by idiosyncratic risk during a hectic period. Among the models considered, the one with a random factor loading and a state-dependent recovery rate turns out to be superior in default prediction.
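
A minimal one-factor Gaussian copula simulation, with a deliberately crude state-dependent recovery (recoveries shrink when the common factor is low); the paper's exact specification, with the recovery entering the conditional factor loading, is not reproduced:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
n_obligors, n_scenarios = 100, 20_000
pd_uncond = 0.02                 # unconditional default probability per obligor
rho = 0.3                        # loading on the common factor
threshold = norm.ppf(pd_uncond)  # default barrier for the latent variable

Z = rng.normal(size=n_scenarios)                  # common (systematic) factor
eps = rng.normal(size=(n_scenarios, n_obligors))  # idiosyncratic shocks
latent = rho * Z[:, None] + np.sqrt(1.0 - rho**2) * eps
defaults = latent < threshold

# crude stand-in for state dependence: recoveries shrink when the factor is low
recovery = np.clip(0.4 + 0.1 * Z, 0.0, 1.0)
losses = defaults.mean(axis=1) * (1.0 - recovery)   # portfolio loss per scenario
```

Because defaults and recoveries share the factor Z, bad states produce both more defaults and lower recoveries, which is the association the abstract says the common factor creates implicitly.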

### A Dynamic Semiparametric Factor Model for Implied Volatility String Dynamics

Abstract: A primary goal in modelling the implied volatility surface (IVS) for pricing and hedging is to reduce complexity. For this purpose one fits the IVS each day and applies a principal component analysis using a functional norm. This approach, however, neglects the degenerated string structure of the implied volatility data and may result in a modelling bias. We propose a dynamic semiparametric factor model (DSFM), which approximates the IVS in a finite dimensional function space. The key feature is that we only fit in the local neighborhood of the design points. Our approach is a combination of methods from functional principal component analysis and backfitting techniques for additive models. The model is found to have approximately 10% better performance than a sticky moneyness model. Finally, based on the DSFM, we devise a generalized vega-hedging strategy for exotic options that are priced in the local volatility framework. The generalized vega-hedging extends the usual approaches employed in the local volatility framework.

### CDO and HAC

Abstract: Modelling portfolio credit risk is one of the crucial challenges faced by the financial services industry in the last few years. We propose a valuation model of collateralized debt obligations (CDOs) based on copula functions with up to three parameters, with default intensities estimated from market data and with a random loss given default that is correlated with default times. The methods presented are used to reproduce the spreads of the iTraxx Europe tranches. We apply hierarchical Archimedean copulae (HAC) whose construction allows for the fact that the risky assets of the CDO pool are chosen from six different industry sectors. The dependence among assets from the same group is specified with a higher value of the copula parameter; a lower value is ascribed otherwise. The copula with two and three parameters models the relation between the loss given default and the default times. Our approach describes the market prices better than the standard pricing procedure based on the Gaussian distribution.

### Forecasting the Term Structure of Variance Swaps

Abstract: Recently, Diebold and Li (2003) obtained good forecasting results for yield curves in a reparametrized Nelson-Siegel framework. We analyze similar modeling approaches for price curves of variance swaps that serve nowadays as hedging instruments for options on realized variance.

We consider the popular Heston model, reparametrize its variance swap price formula, and model the entire variance swap curve by two exponential factors whose loadings evolve dynamically on a weekly basis. Generalizing this approach, we consider a reparametrization of the three-dimensional Nelson-Siegel factor model. We show that these factors can be interpreted as level, slope and curvature, and how they can be estimated directly from characteristic points of the curves. Moreover, we analyze a semiparametric factor model.

Estimating autoregressive models for the factor loadings, we obtain term structure forecasts that we compare with the random walk and with the static Heston model that is often used in industry. In contrast to the results of Diebold and Li (2003) on yield curves, no model produces better forecasts of variance swap curves than the random walk, but forecasting the Heston model improves on the popular static Heston model. Moreover, the Heston model performs better than the flexible semiparametric approach, which in turn outperforms the Nelson-Siegel model.
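
For a fixed decay parameter, the Nelson-Siegel loadings are linear in the three factors, so each date's curve fit reduces to OLS, as in the Diebold-Li approach referenced above; a sketch on a synthetic curve (the maturities, decay parameter and factor values are illustrative, not variance swap data):

```python
import numpy as np

maturities = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0])   # years
lam = 0.7                                # fixed Nelson-Siegel decay parameter
x = lam * maturities
# loadings for the level, slope and curvature factors
L = np.column_stack([
    np.ones_like(maturities),
    (1.0 - np.exp(-x)) / x,
    (1.0 - np.exp(-x)) / x - np.exp(-x),
])

true_beta = np.array([0.05, -0.02, 0.01])    # illustrative level, slope, curvature
noise = 1e-4 * np.random.default_rng(7).normal(size=maturities.size)
curve = L @ true_beta + noise                # one date's synthetic curve
beta_hat, *_ = np.linalg.lstsq(L, curve, rcond=None)   # per-date OLS factor fit
```

Repeating this fit date by date produces factor loading time series, to which the autoregressive forecasting models described above can then be applied.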

### Functional Analytic (Ir-)Regularity Properties of SABR-type Processes

Abstract: The SABR model is a benchmark stochastic volatility model in interest rate markets, which has received much attention in the past decade. Its popularity arose from a tractable asymptotic expansion for implied volatility, derived by heat kernel methods. As markets moved to historically low rates, this expansion appeared to yield inconsistent prices. Since the model is deeply embedded in market practice, alternative pricing methods for SABR have been addressed in numerous approaches in recent years. All standard option pricing methods make certain regularity assumptions on the underlying model, but for SABR these are rarely satisfied. We examine here regularity properties of the model from this perspective, with a view to a number of (asymptotic and numerical) option pricing methods. In particular, we highlight delicate degeneracies of the SABR model (and related processes) at the origin, which render the currently popular heat kernel methods, and all related methods from (sub-)Riemannian geometry, ill-suited for SABR-type processes when interest rates are near zero. We describe a more general semigroup framework, which permits deriving a suitable geometry for SABR-type processes (in certain parameter regimes) via symmetric Dirichlet forms. Furthermore, we derive regularity properties (Feller properties and strong continuity properties) necessary for the applicability of popular numerical schemes to SABR semigroups, and identify suitable Banach and Hilbert spaces for these. Finally, we comment on the short time and large time asymptotic behaviour of SABR-type processes beyond the heat kernel framework.

### Testing Monotonicity of Pricing Kernels

Abstract: The behaviour of market agents has always been extensively covered in the literature. Risk averse behaviour, described by von Neumann and Morgenstern (1944) via a concave utility function, is considered to be a cornerstone of classical economics. Agents prefer a fixed profit over an uncertain choice with the same expected value; lately, however, there has been much discussion about the reliability of this approach. Some authors have shown that there is a reference point where market utility functions are convex. In this paper we construct a test of the concavity of agents’ utility functions by testing the monotonicity of empirical pricing kernels (EPKs). A monotonically decreasing EPK corresponds to a concave utility function, while a non-monotonically decreasing EPK indicates non-risk-averse behaviour on one or more intervals of the utility function. We investigated the EPK for German DAX data for the years 2000, 2002 and 2004 and found evidence of non-concave utility functions: the H0 hypothesis of a monotonically decreasing pricing kernel was rejected at the 5% and 10% significance levels in 2002 and at the 10% significance level in 2000.
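
A crude analogue of the monotonicity idea: project noisy pricing kernel values onto the set of decreasing functions (isotonic regression with `increasing=False`) and measure the residual gap; a large gap flags non-monotonicity. This illustrates the concept on a synthetic kernel with a locally increasing bump, and is not the paper's actual test statistic:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(8)
returns = np.linspace(-0.2, 0.2, 100)
# synthetic pricing kernel: decreasing overall, with a locally increasing bump
epk = np.exp(-2.0 * returns) + 0.3 * np.exp(-((returns - 0.02) / 0.03) ** 2)
epk_obs = epk + 0.02 * rng.normal(size=returns.size)

# project onto decreasing functions; the residual measures non-monotonicity
iso = IsotonicRegression(increasing=False).fit(returns, epk_obs)
fitted = iso.predict(returns)
gap = np.mean((epk_obs - fitted) ** 2)
```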


### Other great sites:

Ernie Chan Jonathan Kinlay Quantocracy Quantpedia Robot Wealth Quantifiable Edges MathFinance Collector Dekalog Quanttech Scott's Investments Return and Risk Investment Idiocy The R Trader Talaikis Algo Tr8dr Mintegration Volatility Futures & Options Intelligent Trading Tech Quantum Financier Kyle Balkissoon Shifting Sands Predictive Alpha Mechanical Markets QuantStrat TradeR MKTSTK CSS Analytics