Trending academic research:

See also: Most popular SSRN papers

"
  • Stock Trading via Feedback Control: Stochastic Model Predictive or Genetic? abstract This work underlies the poster presented at the XVIII Workshop on Quantitative Finance (QFW2017) in Milano on January 25-27, 2017.
    We seek to determine the most suitable feedback control structure for stock trading under proportional transaction costs, where suitability refers to robustness and performance capability. Both are tested by considering different one-step-ahead prediction qualities, including the ideal case, correct prediction of the direction of change in daily stock prices, and the worst case. Feedback control structures are partitioned into two general classes: stochastic model predictive control (SMPC) and genetic. For the former class three controllers are discussed, distinguishing between two Markowitz-inspired and one dynamic hedging-inspired SMPC formulation. For the latter class five genetic trading algorithms are discussed, distinguishing between two moving average (MA) based, two trading range (TR) based, and one strategy based on historical optimal (HistOpt) trajectories. This paper also gives a preliminary discussion of how modified dynamic hedging-inspired SMPC formulations may serve as alternatives to Markowitz portfolio optimization. The combinations of all eight controllers with five different one-step-ahead prediction methods are backtested for daily trading of the 30 components of the German stock market index DAX over the period November 27, 2015 to November 25, 2016.
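    A minimal sketch of one ingredient discussed above, an MA-based trading rule under proportional transaction costs. This is a generic long/flat crossover backtest, not the authors' SMPC or genetic controllers, and all parameters (window lengths, cost level, the synthetic price path) are illustrative:

      import numpy as np

      def ma_crossover_backtest(prices, fast=5, slow=20, cost=0.001):
          """Long/flat moving-average crossover rule with proportional costs."""
          prices = np.asarray(prices, dtype=float)
          fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")[slow - fast:]
          slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")
          pos = (fast_ma > slow_ma).astype(float)[:-1]  # signal at close, held next day
          rets = np.diff(prices[slow - 1:]) / prices[slow - 1:-1]
          trades = np.abs(np.diff(np.concatenate(([0.0], pos))))  # position changes
          return np.cumsum(pos * rets - cost * trades)            # net cumulative P&L

      # toy usage on a random-walk price path
      rng = np.random.default_rng(42)
      prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(250)))
      print(ma_crossover_backtest(prices)[-1])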
  • Trends and Risk Premia: Update and Additional Plots abstract Recently, our group has published two papers that have received some attention in the finance community. One is about the profitability of trend following strategies over 200 years, the second is about the correlation between the profitability of "Risk Premia" and their skewness. In this short note, we present two additional plots that fully corroborate our findings on new data.
  • Dynamic correlations at different time-scales with Empirical Mode Decomposition abstract The Empirical Mode Decomposition (EMD) provides a tool to characterize a time series in terms of its implicit components oscillating at different time-scales. We apply this decomposition to intraday time series of the following three financial indices: the S&P 500 (USA), the IPC (Mexico) and the VIX (volatility index, USA), obtaining time-varying multidimensional cross-correlations at different time-scales. The correlations computed over a rolling window are compared across the three indices, across the components at different time-scales, at different lags and over time. We uncover a rich heterogeneity of interactions which depends on the time-scale and exhibits important lead-lag relations that can have practical use for portfolio management, risk estimation and investments.
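    A minimal sketch of the pipeline, assuming the third-party PyEMD package ("EMD-signal" on PyPI) is installed; the series, window size and choice of mode are illustrative stand-ins, not the paper's data:

      import numpy as np
      from PyEMD import EMD

      rng = np.random.default_rng(0)
      x = np.cumsum(rng.standard_normal(2000))            # stand-in intraday index
      y = 0.5 * x + np.cumsum(rng.standard_normal(2000))  # correlated second series

      imfs_x = EMD().emd(x)   # intrinsic mode functions, fastest to slowest
      imfs_y = EMD().emd(y)

      def rolling_corr(a, b, window=200):
          """Pearson correlation of two series over a rolling window."""
          out = np.full(len(a), np.nan)
          for t in range(window, len(a) + 1):
              out[t - 1] = np.corrcoef(a[t - window:t], b[t - window:t])[0, 1]
          return out

      # time-varying correlation at one comparable time-scale (slowest IMF pair)
      k = min(len(imfs_x), len(imfs_y)) - 1
      corr = rolling_corr(imfs_x[k], imfs_y[k])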
  • qBitcoin abstract A decentralized online quantum cash system, called qBitcoin, is presented. The system has great benefits from quantization in the following sense. First, quantum teleportation technology is used for coin transactions, which prevents the owner of a coin from keeping the original coin data after sending the coin to another party. This was a main problem in systems using classical information, and the blockchain was introduced to solve it. In qBitcoin the double-spending problem never happens, and its security is guaranteed theoretically by virtue of quantum information theory. Making a block is time-consuming, and qBitcoin is based on a quantum chain instead of blocks, so a payment can be completed much faster than in Bitcoin. Moreover, we employ quantum digital signatures, so the system naturally inherits the properties of a peer-to-peer (P2P) cash system as originally proposed in Bitcoin.
  • Portfolio Optimization with Entropic Value-at-Risk abstract The entropic value-at-risk (EVaR) is a new coherent risk measure, which is an upper bound for both the value-at-risk (VaR) and conditional value-at-risk (CVaR). Among its important properties, the EVaR is strongly monotone over its domain and strictly monotone over a broad sub-domain including all continuous distributions, while well-known monotone risk measures, such as VaR and CVaR, lack these properties. A key feature of a risk measure, besides its financial properties, is its applicability in large-scale sample-based portfolio optimization. If the negative return of an investment portfolio is a differentiable convex function, portfolio optimization with the EVaR results in a differentiable convex program whose number of variables and constraints is independent of the sample size, which is not the case for the VaR and CVaR. This enables us to design an efficient algorithm using differentiable convex optimization. Our extensive numerical study shows the high efficiency of the algorithm at large scale, compared to existing convex optimization software packages. The computational efficiency of the EVaR portfolio optimization approach is also compared with that of CVaR-based portfolio optimization; this comparison shows that the EVaR approach generally performs similarly and outperforms as the sample size increases. Moreover, a comparison of the portfolios obtained for a real case by the EVaR and CVaR approaches shows that the EVaR approach can find portfolios with better expectations and VaR values at high confidence levels.
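    A minimal sketch of the sample-based EVaR itself (not the paper's portfolio optimizer), using the standard dual representation EVaR = inf over z > 0 of (1/z)(log E[exp(zL)] - log(alpha)) with the empirical moment generating function; all parameters are illustrative:

      import numpy as np
      from scipy.optimize import minimize_scalar

      def evar(losses, alpha=0.05):
          """Sample-based entropic value-at-risk at confidence level 1 - alpha."""
          losses = np.asarray(losses, dtype=float)

          def objective(log_z):
              z = np.exp(log_z)                  # parametrize to keep z > 0
              m = np.max(z * losses)             # stable log-mean-exp
              lme = m + np.log(np.mean(np.exp(z * losses - m)))
              return (lme - np.log(alpha)) / z

          res = minimize_scalar(objective, bounds=(-10.0, 5.0), method="bounded")
          return objective(res.x)

      rng = np.random.default_rng(1)
      sample = rng.normal(0.0, 0.02, 10_000)          # toy daily loss sample
      print(evar(sample), np.quantile(sample, 0.95))  # EVaR dominates VaR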
  • Measurement of Common Risk Factors: A Panel Quantile Regression Model for Returns abstract This paper investigates how to measure common market risk factors using a newly proposed Panel Quantile Regression Model for Returns. By exploiting the fact that volatility crosses all quantiles of the return distribution, and using a penalized fixed-effects estimator, we are able to control for otherwise unobserved heterogeneity among financial assets. Direct benefits of the proposed approach are revealed in a portfolio Value-at-Risk forecasting application, where our modeling strategy performs significantly better than several benchmark models according to both statistical and economic comparisons. In particular, the Panel Quantile Regression Model for Returns consistently outperforms all the competitors at the 5% and 10% quantiles. Sound statistical performance translates directly into economic gains, as demonstrated in the Global Minimum Value-at-Risk Portfolio and a Markowitz-like comparison. Overall, the results of our research are important for the correct identification of the sources of systemic risk, and are particularly attractive for high-dimensional applications.
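    A minimal sketch of the core building block, a quantile regression of returns on a volatility regressor, assuming statsmodels is installed; the panel and penalized fixed-effects machinery of the paper is beyond this illustration, and the data are synthetic:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 1500
      vol = np.exp(rng.normal(-4.5, 0.5, n))   # stand-in realized volatility
      rets = vol * rng.standard_normal(n)      # returns scaled by volatility

      X = sm.add_constant(vol[:-1])            # yesterday's volatility as regressor
      y = rets[1:]
      res = sm.QuantReg(y, X).fit(q=0.05)      # model the 5% return quantile
      print(res.params)                        # VaR line: const + beta * vol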
  • How many paths to simulate correlated Brownian motions? abstract We provide an explicit formula giving the optimal number of paths needed to simulate two correlated Brownian motions.
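    The paper's explicit formula is not reproduced here; as context, a minimal sketch of the simulation in question, two correlated Brownian motions generated via the Cholesky trick W2 = rho W1 + sqrt(1 - rho^2) W_perp, with illustrative parameters:

      import numpy as np

      def correlated_bm(n_paths, n_steps, rho, T=1.0, seed=0):
          """Simulate two correlated Brownian motions on [0, T]."""
          rng = np.random.default_rng(seed)
          dt = T / n_steps
          z1 = rng.standard_normal((n_paths, n_steps))
          z2 = rng.standard_normal((n_paths, n_steps))
          w1 = np.cumsum(np.sqrt(dt) * z1, axis=1)
          w2 = np.cumsum(np.sqrt(dt) * (rho * z1 + np.sqrt(1 - rho**2) * z2), axis=1)
          return w1, w2

      w1, w2 = correlated_bm(10_000, 252, rho=0.7)
      print(np.corrcoef(w1[:, -1], w2[:, -1])[0, 1])  # close to 0.7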
  • Econophysics of Business Cycles: Aggregate Economic Fluctuations, Mean Risks and Mean Square Risks abstract This paper presents hydrodynamic-like model of business cycles aggregate fluctuations of economic and financial variables. We model macroeconomics as ensemble of economic agents on economic space and agent's risk ratings play role of their coordinates. Sum of economic variables of agents with coordinate x define macroeconomic variables as functions of time and coordinates x. We describe evolution and interactions between macro variables on economic space by hydrodynamic-like equations. Integral of macro variables over economic space defines aggregate economic or financial variables as functions of time t only. Hydrodynamic-like equations define fluctuations of aggregate variables. Motion of agents from low risk to high risk area and back define the origin for repeated fluctuations of aggregate variables. Economic or financial variables on economic space may define statistical moments like mean risk, mean square risk and higher. Fluctuations of statistical moments describe phases of financial and economic cycles. As example we present a simple model relations between Assets and Revenue-on-Assets and derive hydrodynamic-like equations that describe evolution and interaction between these variables. Hydrodynamic-like equations permit derive systems of ordinary differential equations that describe fluctuations of aggregate Assets, Assets mean risks and Assets mean square risks. Our approach allows describe business cycle aggregate fluctuations induced by interactions between any number of economic or financial variables.
  • Forecasting day-ahead electricity prices in Europe: the importance of considering market integration abstract Motivated by the increasing integration among electricity markets, in this paper we propose three different methods to incorporate market integration into electricity price forecasting and to improve predictive performance. First, we propose a deep neural network that considers features from connected markets to improve predictive accuracy in a local market. To measure the importance of these features, we propose a novel feature selection algorithm that, by using Bayesian optimization and functional analysis of variance, analyzes the effect of the features on algorithm performance. In addition, using market integration, we propose a second model that, by simultaneously predicting prices from two markets, improves forecasting accuracy even further. Finally, we present a third model to predict the probability of price spikes, which we then use as an input to the other two forecasters to detect spikes. As a case study, we consider the electricity market in Belgium and the improvements in forecasting accuracy when using various French electricity features. In particular, we show that the three proposed models lead to improvements that are statistically significant: due to market integration, predictive accuracy improves from 15.7% to 12.5% sMAPE (symmetric mean absolute percentage error). In addition, we show that the proposed feature selection algorithm is able to perform a correct assessment, i.e. to discard the irrelevant features.
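    For reference, the sMAPE metric quoted above, as a small self-contained function:

      import numpy as np

      def smape(actual, forecast):
          """Symmetric mean absolute percentage error, in percent."""
          actual = np.asarray(actual, dtype=float)
          forecast = np.asarray(forecast, dtype=float)
          return 100 * np.mean(2 * np.abs(forecast - actual)
                               / (np.abs(actual) + np.abs(forecast)))

      # toy check: a 10% one-sided error gives roughly 9.5% sMAPE
      print(smape([100.0], [110.0]))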
  • Feedback effect between Volatility of capital flows and financial stability: evidence from Democratic Republic of Congo abstract The financial system is where capital flows meet (the equality between saving and investment), so volatility of capital flows can destroy the robustness and good working of the financial system, that is, subvert financial stability. Likewise, a weak financial system, poorly regulated and badly managed, can exacerbate the volatility of capital flows and ultimately undermine financial stability. The present study provides evidence on the feedback effect between the volatility of capital flows and financial stability in the Democratic Republic of Congo (DRC), and estimates the contribution of macroeconomic and macroprudential policies to attenuating the effects of capital-flow volatility on financial stability and to preventing financial instability. Estimating a dynamic regression model a la Feldstein-Horioka, we show that the financial system is largely supplied and financed by international capital flows. This implies that the Congolese economy is financially mobile, which can be dangerous for financial stability. A dynamic econometric study of the financial system's absolute size indicates that the financial system has a systemic weight on the real economy, hence a shock to the financial system could have devastating effects on the Congolese economy. We estimate a vector autoregressive (VAR) model to establish the bilateral causality and the impacts of macroeconomic and macroprudential policies. The results show, on the one hand, that there is a feedback effect between the volatility of capital flows and financial stability and, on the other hand, that macroeconomic and macroprudential policies cannot attenuate the volatility of capital flows or prevent financial instability, although the macroprudential approach gives better results than monetary policy. The implementation of a macroprudential framework by the Central Bank of Congo would be beneficial for achieving financial stability and attenuating the volatility of capital flows.
    Keywords: volatility of capital flows, financial stability, macroeconomic and macroprudential policies
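    A minimal sketch of the VAR-with-causality-test step, assuming statsmodels; the two series below are synthetic stand-ins for the study's capital-flow-volatility and financial-stability measures:

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(3)
      n = 300
      flows = rng.standard_normal(n)       # capital-flow volatility proxy
      stab = np.zeros(n)                   # financial-stability proxy
      for t in range(1, n):
          stab[t] = 0.5 * stab[t - 1] - 0.3 * flows[t - 1] + rng.standard_normal()

      data = pd.DataFrame({"flows": flows, "stability": stab})
      res = VAR(data).fit(maxlags=2)
      # Granger-causality test: do flows help predict stability?
      print(res.test_causality("stability", ["flows"]).summary())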
  • DGM: A deep learning algorithm for solving partial differential equations abstract High-dimensional PDEs have been a longstanding computational challenge. We propose a deep learning algorithm similar in spirit to Galerkin methods, using a deep neural network instead of linear combinations of basis functions. The PDE is approximated with a deep neural network, which is trained on random batches of spatial points to satisfy the differential operator and boundary conditions. The algorithm is mesh-less, which is key since meshes become infeasible in higher dimensions. Instead of forming a mesh, sequences of spatial points are randomly sampled. We implement the approach for American options (a type of free-boundary PDE which is widely used in finance) in up to 100 dimensions. We call the algorithm a "Deep Galerkin Method (DGM)".
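    A minimal PyTorch sketch of the mesh-free idea on the 1-D heat equation u_t = u_xx (not the authors' code; the architecture, sampling and training budget are illustrative, and boundary terms are omitted for brevity):

      import math
      import torch

      # small fully-connected network u(t, x)
      net = torch.nn.Sequential(
          torch.nn.Linear(2, 64), torch.nn.Tanh(),
          torch.nn.Linear(64, 64), torch.nn.Tanh(),
          torch.nn.Linear(64, 1),
      )
      opt = torch.optim.Adam(net.parameters(), lr=1e-3)

      for step in range(2000):
          # random interior points (t, x) in [0, 1]^2 instead of a mesh
          tx = torch.rand(256, 2, requires_grad=True)
          u = net(tx)
          du = torch.autograd.grad(u.sum(), tx, create_graph=True)[0]
          u_t, u_x = du[:, 0], du[:, 1]
          u_xx = torch.autograd.grad(u_x.sum(), tx, create_graph=True)[0][:, 1]
          pde_loss = ((u_t - u_xx) ** 2).mean()   # residual of u_t = u_xx

          # initial condition u(0, x) = sin(pi x)
          x0 = torch.rand(256, 1)
          u0 = net(torch.cat([torch.zeros_like(x0), x0], dim=1)).squeeze()
          ic_loss = ((u0 - torch.sin(math.pi * x0.squeeze())) ** 2).mean()

          loss = pde_loss + ic_loss
          opt.zero_grad()
          loss.backward()
          opt.step()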
  • Haircutting Non-cash Collateral abstract Haircutting non-cash collateral has become a key element of the post-crisis reform of the shadow banking system and OTC derivatives markets. This article develops a parametric haircut model by expanding haircut definitions beyond the traditional value-at-risk measure and employing a double-exponential jump-diffusion model for collateral market risk. Haircuts are solved to target credit risk measurements, including probability of default, expected loss, or unexpected loss criteria. Compared to data-driven approaches typically run on proxy data series, the model enables sensitivity analysis and stress testing, captures market liquidity risk, allows idiosyncratic risk adjustments, and incorporates relevant market information. Computational results for main equities, securitizations, and corporate bonds show potential for use in collateral agreements, e.g. CSAs, and for regulatory capital calculations.
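    A minimal Monte Carlo sketch of the flavor of calculation involved: a double-exponential (Kou) jump-diffusion log-return over a liquidation horizon, with the haircut solved for a probability-of-default target. All parameters are illustrative, and the paper's analytic machinery is not reproduced:

      import numpy as np

      rng = np.random.default_rng(7)
      n_paths = 200_000
      T = 10 / 252                     # liquidation horizon: 10 trading days
      sigma = 0.3                      # diffusion volatility
      lam, p_up, eta1, eta2 = 50.0, 0.4, 25.0, 20.0  # jump intensity and sizes

      # diffusion part of the log-return
      diff = -0.5 * sigma**2 * T + sigma * np.sqrt(T) * rng.standard_normal(n_paths)
      # compound Poisson double-exponential jumps
      counts = rng.poisson(lam * T, n_paths)
      jumps = np.zeros(n_paths)
      for i in np.nonzero(counts)[0]:
          k = counts[i]
          up = rng.random(k) < p_up
          jumps[i] = np.where(up, rng.exponential(1 / eta1, k),
                              -rng.exponential(1 / eta2, k)).sum()
      ret = diff + jumps

      # haircut h such that P(collateral value < 1 - h) <= alpha (PD criterion)
      alpha = 0.01
      h = 1.0 - np.exp(np.quantile(ret, alpha))
      print(f"haircut at {alpha:.0%} PD target: {h:.2%}")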
  • BSDEs with weak reflections and partial hedging of American options abstract We introduce a new class of \textit{Backward Stochastic Differential Equations with weak reflections} whose solution $(Y,Z)$ satisfies the weak constraint $\textbf{E}[\Psi(\theta,Y_\theta)] \geq m$ for all stopping times $\theta$ taking values between $0$ and a terminal time $T$, where $\Psi$ is a random non-decreasing map and $m$ a given threshold. We study the well-posedness of such equations and show that the family of minimal time-$t$ values $Y_t$ can be aggregated by a right-continuous process. We give a nonlinear Mertens-type decomposition for lower reflected $g$-submartingales, which, to the best of our knowledge, represents a new result in the literature. Using this decomposition, we obtain a representation of the minimal time-$t$ values process. We also show that the minimal supersolution of such an equation can be written as a \textit{stochastic control/optimal stopping game}, which is shown to admit, under appropriate assumptions, a value and saddle points. From a financial point of view, this problem is related to the approximate hedging of American options.
  • Economic Design of Memory-Type Control Charts: The Fallacy of the Formula Proposed by Lorenzen and Vance (1986) abstract The memory-type control charts, such as EWMA and CUSUM, are powerful tools for detecting small quality changes in univariate and multivariate processes. Many papers on the economic design of these control charts use the formula proposed by Lorenzen and Vance (1986) [Lorenzen, T. J., & Vance, L. C. (1986). The economic design of control charts: A unified approach. Technometrics, 28(1), 3-10, DOI: 10.2307/1269598]. This paper shows that this formula is not correct for memory-type control charts and that its values can deviate significantly from the true values even if the ARL values used in the formula are accurately computed. Consequently, the use of this formula can result in charts that are not economically optimal. The formula is corrected for memory-type control charts, but unfortunately the modified formula is not a helpful tool from a computational perspective. We show that simulation-based optimization is a possible alternative method.
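    A minimal sketch of one building block of such simulation-based optimization: estimating the average run length (ARL) of a two-sided EWMA chart by Monte Carlo. The smoothing constant and control-limit width are illustrative, not the paper's design:

      import numpy as np

      def ewma_run_length(lam=0.1, L=2.7, shift=0.0, n_max=100_000, rng=None):
          """Simulated run length of a two-sided EWMA chart on N(shift, 1) data.
          Control limits use the asymptotic variance lam / (2 - lam)."""
          if rng is None:
              rng = np.random.default_rng()
          limit = L * np.sqrt(lam / (2 - lam))
          z = 0.0
          for t in range(1, n_max + 1):
              z = lam * (shift + rng.standard_normal()) + (1 - lam) * z
              if abs(z) > limit:
                  return t
          return n_max

      rng = np.random.default_rng(4)
      arl0 = np.mean([ewma_run_length(rng=rng) for _ in range(2000)])           # in control
      arl1 = np.mean([ewma_run_length(shift=1.0, rng=rng) for _ in range(2000)])  # shifted
      print(f"ARL0 ~ {arl0:.0f}, ARL1 ~ {arl1:.1f}")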
  • Dynamic Asset Price Jumps and the Performance of High Frequency Tests and Measures abstract This paper provides an extensive evaluation of high frequency jump tests and measures, in the context of dynamic models for asset price jumps. Specifically, we investigate: i) the power of alternative tests to detect individual price jumps, including in the presence of volatility jumps; ii) the frequency with which sequences of dynamic jumps are identified; iii) the accuracy with which the magnitude and sign of sequential jumps are estimated; and iv) the robustness of inference about dynamic jumps to test and measure design. Substantial differences are discerned in the performance of alternative methods in certain dimensions, with inference being sensitive to these differences in some cases. Accounting for measurement error when using measures constructed from high frequency data to conduct inference on dynamic jump models would appear to be advisable.
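    As one concrete example of the kind of high-frequency jump measure being evaluated, a sketch of the well-known realized-variance-minus-bipower-variation proxy of Barndorff-Nielsen and Shephard (not necessarily the specific measures studied in the paper); the data are synthetic:

      import numpy as np

      def jump_component(returns):
          """Nonparametric jump proxy: realized variance minus bipower variation."""
          r = np.asarray(returns, dtype=float)
          rv = np.sum(r**2)
          bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
          return max(rv - bv, 0.0), rv, bv

      rng = np.random.default_rng(5)
      r = 0.001 * rng.standard_normal(390)  # toy 1-minute returns, no jumps
      r[200] += 0.01                        # inject one price jump
      print(jump_component(r))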
  • Unemployment: Study of Causes and Possible Solutions abstract The following measures against unemployment are proposed: in the short term, promoting greater income for the poorest sectors. It is shown that this can be paid for with the resulting increase in production, without loss of income to the other economic agents. In the mid term, the creation of ad-hoc companies for investment in projects that are profitable but long-lasting. And in the long run, the abandonment of competitive models. As these proposals go against current ideas (liberalisation, labour market flexibility, free markets, etc.), the statements are rigorously demonstrated, even at the risk of making the reading harder.
    Part 1 explores the problem and uses a simple model and other heuristic arguments to build familiarity with macroeconomic models. Part 2 is a simplified summary of a macroeconomic theory textbook. It serves as a review for readers whose knowledge of economics is out of date, or as a first approximation to the topic for those without such background. In the light of the theory, economic policies are evaluated for the Argentine case in the 1990s. The work accepts the Keynesian explanation of unemployment (insufficient demand), but we disagree with its solution (public expenditure). Finally, in Part 3 we elaborate and justify the proposals.
  • Multilayer Aggregation of Investor Trading Networks abstract Investor trading networks are gaining rapid interest in financial market studies. In this paper, we propose three improvements for investor trading network analyses: investor categorization, transaction bootstrapping and information aggregation. Each of these components can be used individually or in combination. We introduce a tractable multilayer aggregation procedure to summarize security-wise and time-wise information integration of investor category trading networks. As an application, we analyze the unique dataset of Finnish shareholders throughout 2004-2009. We find that households play a central role in investor networks, having the most synchronized trading. Furthermore, we observe that the window size used for averaging has a substantial effect on the number of inferred relationships. However, the relative node centrality in the networks is rather stable. We would like to note that the use of our proposed aggregation framework is not limited to the field of investor trading networks. It can be used for different non-financial applications, with both observable and inferred relationships, that span over a number of different information layers.
  • Value-at-Risk and Expected Shortfall for the major digital currencies abstract Digital currencies and cryptocurrencies have hesitantly started to penetrate investors' portfolios, and the next step will be a regulatory risk-management framework. We examine the Value-at-Risk and Expected Shortfall properties of the major digital currencies: Bitcoin, Ethereum, Litecoin, and Ripple. The methodology used is GARCH modelling followed by Filtered Historical Simulation. We find that digital currencies are subject to higher risk and therefore require a larger buffer and more risk capital to cover potential losses.
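    A minimal sketch of GARCH(1,1) plus Filtered Historical Simulation, assuming the third-party "arch" package; the return series is synthetic and in percent units, and the model settings are illustrative:

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(6)
      rets = 1.5 * rng.standard_normal(1500)   # toy daily returns, in percent

      res = arch_model(rets, vol="Garch", p=1, q=1).fit(disp="off")
      std_resid = res.resid / res.conditional_volatility  # filtered shocks

      # one-day-ahead volatility forecast rescales the empirical shock distribution
      sigma_next = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
      sim = sigma_next * std_resid                        # FHS return scenarios

      alpha = 0.01
      var = -np.quantile(sim, alpha)
      es = -sim[sim <= np.quantile(sim, alpha)].mean()
      print(f"99% VaR: {var:.2f}%, 99% ES: {es:.2f}%")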
  • Dynamic trading under integer constraints abstract In this paper we investigate discrete time trading under integer constraints, that is, we assume that the offered goods or shares are traded in integer quantities instead of the usual real quantity assumption. For finite probability spaces and rational asset prices this has little effect on the core of the theory of no-arbitrage pricing. For price processes not restricted to the rational numbers, a novel theory of integer arbitrage free pricing and hedging emerges. We establish an FTAP, involving a set of absolutely continuous martingale measures satisfying an additional property. The set of prices of a contingent claim is no longer an interval, but is either empty or dense in an interval. We also discuss superhedging with integral portfolios.
  • The Keynesian Model in the General Theory: A Tutorial abstract This short overview of the General Theory is the kind of summary I would have liked to have read before embarking on a comprehensive study of the General Theory when I was a student. As shown here, the main ideas are quite simple and easy to visualize. Unfortunately, numerous introductions to Keynesian theory are not actually based on Keynes's magnum opus, but on obscure neoclassical reinterpretations. This is completely pointless since Keynes's book is so readable.