New thread, Inshallah ->
What is the difference between the value at risk (VaR) and the conditional value at risk (CVaR)?
Value-at-Risk is the purported worst-case loss under normal market conditions, developed using a computational technique and further specific modelling assumptions tailored to the risk-reporting requirements of a trading/investment desk.
But I tend to disagree with the term WCL - Worst-Case Loss, as used by some authors in the field of FRM - Financial Risk Management.
In my opinion, the Market #VaR of a given financial security/exposure is the maximum model loss adjusted for a MOE - Margin of Error (the return standard deviation multiplied by a Z-critical value) that the trader/risk manager would be willing to accept,
at a given probability over a specified risk horizon/time interval, "in terms of an amount", on normal days in normal markets.
(This definition applies only to the #Parametric / VCV VaR model, which cannot forecast beyond roughly three standard deviations due to its #Gaussian (Normal) distribution modelling constraint.)
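As a minimal sketch of the Parametric/VCV calculation just described (the function name, the $1m position, and the 2% daily volatility are all illustrative assumptions, not figures from any real desk):

```python
from statistics import NormalDist

def parametric_var(position_value, mu, sigma, confidence=0.99):
    """Parametric (VCV) VaR: the return standard deviation is scaled
    by the Z-critical value of the assumed Normal distribution,
    less the expected return, and expressed as an amount."""
    z = NormalDist().inv_cdf(confidence)      # e.g. ~2.326 at 99%
    return position_value * (z * sigma - mu)

# A hypothetical $1m position with zero mean daily return and 2% daily volatility
var_99 = parametric_var(1_000_000, mu=0.0, sigma=0.02)
```

So the desk would report roughly a $46,500 one-day 99% VaR on this toy position: the model's "normal days in normal markets" loss, not a guarantee.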
In the case of #Portfolio #VaR, the worst-case loss is suffered when loosely correlated assets, having a true diversification edge, move together under tight positive #correlations (under asymmetric market conditions), which can emerge during periods
of extreme #volatility and herd-driven trading; e.g., as usually happens in a stock market crash and its subsequent downturns.
There might be days when the actual recorded loss exceeds the maximum (so-called WCL) #VaR model loss beyond a given threshold, due to market abnormalities accompanied by extreme bouts of #volatility jumps and/or return persistence phenomena.
You may refer to the Financial Market Risk literature on how data visualization tools such as the Q-Q plot are used to investigate heavy tails and volatility clustering patterns.
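The Q-Q idea can be checked numerically as well as graphically. A minimal sketch, using an entirely hypothetical Normal-mixture return series to stand in for real market data: the empirical tail quantile of heavy-tailed returns sits further out than the quantile of a Normal fitted to the same mean and standard deviation, which is exactly the departure from the diagonal a Q-Q plot would show.

```python
import random
from statistics import NormalDist

random.seed(42)
# Hypothetical returns: mostly calm days, occasionally turbulent days.
# This Normal mixture produces the heavy-tail signature of market data.
returns = sorted(
    random.gauss(0, 0.01) if random.random() < 0.95 else random.gauss(0, 0.04)
    for _ in range(100_000)
)

mu = sum(returns) / len(returns)
sd = (sum((r - mu) ** 2 for r in returns) / len(returns)) ** 0.5

# Q-Q check at the 0.1% quantile: the empirical tail sits further out
# than the quantile of a Normal fitted to the same mean and sd.
p = 0.001
empirical_q = returns[int(p * len(returns))]
normal_q = NormalDist(mu, sd).inv_cdf(p)
heavy_left_tail = empirical_q < normal_q
```

A library Q-Q plot does the same comparison across all quantiles at once; this just isolates one tail point.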
#VaR can be computed using various methods, but the four most commonly utilized in the industry are, in the following order =>
1. VCV - Variance-Covariance Method.
2. HS - Historical Simulation Method.
3. Bootstrapping Method.
4. Monte Carlo Simulation Method.
Kindly note further: the above are not types of Value at Risk!
They are only competing and contrasting computational techniques!!
Many in the risk domain incorrectly refer to the above computational techniques as types of #VaR, when in fact each technique serves a different modelling purpose.
Yes, all VaR methods have their own =>
1. Competing computational philosophies that rely on heavy, light, or no distributional assumptions (HS - Historical Simulation and Bootstrapping are distribution-free; MCS requires an assumed risk-factor process),
2. Which, accordingly, treat returns as i.i.d. or non-i.i.d.,
3. Varying financial product applications (options/vanilla bonds/equities). E.g., the VCV/Parametric VaR method cannot accurately compute the probabilistic forward-looking loss measures of options and other derivative securities that have a non-linear pay-off,
4. A specific understanding and explanation of volatility clustering in the context of market conditions, and of observed and modelled heavy tails and tail-dependence behaviours,
5. Forecast estimation risks under different volatility regimes; contrasting EDA - Exploratory Data Analysis methods and other dashboard visualization tools may be used by the risk modeller for each #VaR method,
6. A specific asset/portfolio market liquidation assumption over the holding period in certain cases.
For e.g., a 30-day Value at Risk forecast assumes that the investor will not liquidate the inventory of assets for the next month! The portfolio allocation stays intact.
7. Risk factor mapping methodology,
8. Volatility updating/smoothing/scaling techniques (possible only in the Parametric VaR model, due to its specific distributional assumptions),
9. Back-testing methods,
10. Sample iteration requirements,
11. Auto-correlation / serial dependence effects on the risk factors,
12. Risk factor aggregation and correlation assumptions,
13. Adjusting for the higher moments of the distribution (Cornish-Fisher modified VaR) to capture skew and kurtosis,
14. Reporting the portfolio summary metric, with varying applications across different exposures/portfolios trading in different markets at different institutions,
15. Contrasts in terms of how #Computationally / Machine Learning Model-intensive they all might be,
16. Etc.
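The Cornish-Fisher modified VaR mentioned in the list can be sketched briefly. This is the standard fourth-order expansion, with toy inputs (the skew and excess-kurtosis figures are illustrative, not estimated from any real series):

```python
from statistics import NormalDist

def cornish_fisher_var(mu, sigma, skew, ex_kurt, confidence=0.99):
    """Modified VaR: adjusts the Normal quantile for skewness and
    excess kurtosis via the Cornish-Fisher expansion."""
    z = NormalDist().inv_cdf(1 - confidence)   # left-tail quantile, e.g. ~ -2.326
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * ex_kurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + z_cf * sigma)                # reported as a positive loss

# Negative skew and fat tails widen the loss estimate vs the plain Normal
normal_var   = cornish_fisher_var(0.0, 0.02, skew=0.0,  ex_kurt=0.0)
modified_var = cornish_fisher_var(0.0, 0.02, skew=-0.8, ex_kurt=4.0)
```

With zero skew and zero excess kurtosis the adjustment vanishes and the plain Parametric VaR is recovered, which is a useful sanity check on the implementation.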
Normally the VCV - Variance-Covariance variant of the Value-at-Risk model (based on Gaussian return distributional assumptions) underestimates risk under extreme market volatility conditions: for example, the market crash seen during the Global Credit Crisis
of 2007-09, the Greek default crisis, the ERM crisis back in the 1990s, the complete failure of the Normal/Parametric VaR model during the LTCM crisis, the Swiss Franc (pegged exchange rate) devaluation crisis, and so on.
HS and MCS methods might better capture tail risks than the Parametric/Delta-Normal VaR. But again, this varies across products, markets, volatility patterns, white-noise assumption violations and risk horizons.
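For contrast with the Parametric sketch earlier, Historical Simulation in its plainest form is just an empirical quantile of past P&L. A minimal sketch (the P&L history below is an invented toy series, including one deliberate crash day):

```python
def historical_var(pnl, confidence=0.99):
    """Historical-simulation VaR: the empirical loss quantile of past
    P&L, with no distributional assumption about returns."""
    losses = sorted(-x for x in pnl)          # losses as positive numbers, ascending
    idx = int(confidence * len(losses))       # simple, non-interpolated quantile
    return losses[min(idx, len(losses) - 1)]

# Toy history: 99 small alternating P&L days plus one crash day of -25,000
pnl = [(-1) ** d * (d % 7) * 100 for d in range(99)] + [-25_000]
hs_var_99 = historical_var(pnl)
```

Because the method replays history verbatim, the crash day dominates the 99% quantile here; that data-driven tail sensitivity is exactly why HS can out-perform the Gaussian VCV model in turbulent samples, and why it is blind to tail events absent from its window.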
CVaR / ES - Expected Shortfall / ETL are all measures that compute and report losses in excess of the VaR probabilistic forecast metric, as per a selected computational technique and confidence level! CVaR is also referred to as the Average Value at Risk model,
as it averages all market-driven financial losses that exceed the predicted model loss beyond the chosen confidence level.
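That "average of the losses beyond VaR" definition translates almost directly into code. A minimal historical-simulation sketch (the 1,000-day P&L series is a made-up illustration):

```python
def historical_es(pnl, confidence=0.99):
    """Expected Shortfall / CVaR: the average of the losses that lie
    beyond the VaR cut-off at the chosen confidence level."""
    losses = sorted((-x for x in pnl), reverse=True)     # worst losses first
    n_tail = max(1, int(len(losses) * (1 - confidence)))
    return sum(losses[:n_tail]) / n_tail

# Toy history of 1,000 daily P&L figures: ES(99%) averages the 10 worst losses
pnl = [-d for d in range(1000)]
es_99 = historical_es(pnl)
```

By construction ES is always at least as large as the VaR at the same confidence level, since it averages only the losses at or beyond that cut-off.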
In Risk Management Theory, the ETL - Expected Tail Loss metrics are fast replacing conventional Value at Risk computational techniques/models (which formed the backbone of the IMA - Internal Models Approach used to compute and report the Trading Book's MRC - Market Risk Capital requirements, and of the Banking Book's credit risk capital requirements driven by the FIRB/AIRB guidelines issued by @BIS_org).
The upcoming #Basel IV / #FRTB reforms are said to rely more on #ES - Expected Shortfall, or #CVaR - Conditional Value at Risk, models, which are theoretically classified as coherent risk measures, to assess the impact of financial hits on a
bank's economic/risk-based capital and on the capital adequacy standards that shall come into force to meet post-Global-Credit-Crisis banking industry needs. #CVaR, unlike #VaR, is a coherent measure of #marketrisk.
Kindly do note: a VaR model is not a coherent risk measure, because it is not sub-additive!
Hence, data aggregation across risk factors and P&L accounts will not be possible by merely adding or subtracting risk numbers across business lines/product types/exposures.
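The sub-additivity failure is easy to demonstrate with a textbook-style toy example (the two loans and their 4% default probability are hypothetical, chosen purely to break the inequality):

```python
# Two independent loans, each losing 100 with probability 4% (else 0).
# Stand-alone 95% VaR of each loan is 0, since its default chance (4%)
# falls below the 5% tail cut-off.
p_default = 0.04
standalone_var = 0 if p_default <= 0.05 else 100

# The two-loan portfolio suffers at least one default with probability
# 1 - 0.96**2 ~= 7.8%, which exceeds 5%, so its 95% VaR jumps to 100.
p_any_default = 1 - (1 - p_default) ** 2
portfolio_var = 100 if p_any_default > 0.05 else 0

# VaR(A + B) = 100 > VaR(A) + VaR(B) = 0 + 0: sub-additivity fails,
# so summing stand-alone VaRs understates the combined risk.
```

ES does not suffer this defect, which is precisely why the coherence argument favours it for aggregation across desks and books.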
In the language of statistics, we can sum up our treatise on this forum by saying that
VaR is a risk metric constrained by the alpha level chosen for modelling, or in other words the confidence level (the cut-off point in the curve).
The risk modeller innocently assumes that:
The probability that the loss shall not exceed the VaR forecast is the confidence level (α);
The probability that the loss shall exceed the model forecast is (1 − α), the Type I error (exception) rate.
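In back-testing terms, that (1 − α) exception probability translates directly into an expected number of breaches per year (figures below are the conventional 250-day illustration, not a regulatory rule):

```python
# A 99% daily VaR model is built to be breached about 1% of the time,
# i.e. roughly 2.5 exceptions over a 250-day trading year. Materially
# more breaches than this suggests the model understates risk.
confidence = 0.99
trading_days = 250
expected_exceptions = (1 - confidence) * trading_days
```

So a handful of annual exceptions is not model failure but model design; it is persistent excesses over this count that back-testing frameworks penalize.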
Hope this thread provides educational guidance to risk modelling experts engaged in portfolio and investment risk management, capital computation, stress testing, pricing, hedging, and ERM assignments across financial and non-financial institutions.
In other threads, I shall cover Operational VaR and Credit VaR and ETL Measures in more detail.
Should basic human needs be catered to as services within the framework of a market order, where the forces of demand and supply determine the price of each interaction (aka transaction) and the social value created for the consumer via production, exchange and consumption?
For e.g., should the recipients of services funded by tax money and welfare benefits earmarked by a government, in areas such as health and education, be treated like consumers in any other industry?
Should we have the option to choose the most efficient hospitals and schools?
Should each public service institution be corporatized?
Should each public service institution have a Profit and Loss A/c and a balance sheet?
Should each public service institution treat recipients as buyers of products, and their interactions as akin to transactions?
Is this Unanticipated or Anticipated Inflation Risk?
This is what economists need to explain to us!
We knew it was coming due to the reflationary policy stance of central banks in the first world, and elsewhere.
And the elevated rate was not unexpected.
But this is not the 1970s, when oil price shocks and other supply-side macroeconomic and microeconomic distortions of the postwar years raised the inflation rate unexpectedly to astronomical heights.
I don't see any massive stagflation developing, given the gains from technology.
Yes, the "PHILLIPS CURVE" is officially deceased.
It won't return!
Monetarists and some other schools of thought have raised a hue and cry about the return of this economic phantom, but no, it won't happen.
Prices and #Unemployment have been disentangled.
As part of the Financial Literacy Programs, all individuals must learn how to manage their retirement investment proceeds and personal wealth.
Many people mistakenly believe that such planning is unimportant and should be left solely to either the employer or the govt!
How wrong!
In most developing countries, where old-age financial benefits are not sponsored or guaranteed by the government via social safety nets, vulnerable people are left at the mercy of the market, the extended family network, or philanthropists.
Even private sector firms that provide access to #provident and #pension fund retirement investment planning schemes are no guarantee of a safe and smooth exit from the workforce, due to #Systemic Risk, which can destabilize the economy or society in the long run.
The late Lee Kuan Yew was right when he said that new businesses require new skills, which in turn require more vocational and academic training for the workforce.
Any country aspiring to follow the Asian model of economic development must invest in human capital formation.
Hong Kong under the British, the UAE, and Singapore are three perfect examples of how certain nations can arbitrage the inefficiencies that exist in their neighbourhoods.
All three serve as a binding case study in #Geospatial #Economics and complex #Agglomeration benefits.
Look at the #City, in the UK!
Are the English born bankers?
How did they manage to develop a major global financial centre over the last two hundred years?
Let's see how Brexit will affect the mystique that surrounds London as a major financial hub.
However, its applications in the Social Sciences and sub-fields such as Economics and Business Studies are growing all the time.
Finance and Risk are broadly categorized as subfields of Microeconomics.
That's my opinion.
The two subjects (Risk and Finance, which I would like to jointly refer to as Risk Finance) share a lot in common: Pricing Theory, Utility Theory, Portfolio Theory, the Risk Pooling, Risk Financing and Risk Sharing theories, and the Moral Hazard Problem.
Every Concept has a Form, as expounded by #Plato.
But that does not mean a complex concept cannot have simple, explainable forms.
It can!
All computational algorithms that train mathematical or statistical models are representations of some theory, which originated as a concept through logical methods of enquiry, having empirical or rational forms.
Model #Parsimony could be best understood in the light of #Occam's Razor.