Profit Margins – Are they Predicting a Crash?

Jeremy Grantham: A Bullish Bear

Is Jeremy Grantham, co-founder and CIO of GMO, bullish or bearish these days?  According to Myles Udland at Business Insider, he’s both.  Udland quotes Grantham:

“I think the global economy and the U.S. in particular will do better than the bears believe it will because they appear to underestimate the slow-burning but huge positive of much-reduced resource prices in the U.S. and the availability of capacity both in labor and machinery.”


Udland continues:

“On top of all this is the decline in profit margins, which Grantham has called the “most mean-reverting series in finance,” implying that the long period of elevated margins we’ve seen from American corporations is most certainly going to come to an end. And soon.”

[Figure: FRED chart of US corporate profit margins]

Corporate Profit Margins as a Leading Indicator

The claim is an interesting one.  It certainly looks as if corporate profit margins are mean-reverting and, possibly, predictive of recessionary periods. And there is an economic argument why this should be so, articulated by Grantham as quoted in an earlier Business Insider article by Sam Ro:

“Profit margins are probably the most mean-reverting series in finance, and if profit margins do not mean-revert, then something has gone badly wrong with capitalism.

If high profits do not attract competition, there is something wrong with the system and it is not functioning properly.”

Thomson Research / Barclays Research’s take on the same theme echoes Grantham:

“The link between profit margins and recessions is strong,” Barclays’ Jonathan Glionna writes in a new note to clients. “We analyze the link between profit margins and recessions for the last seven business cycles, dating back to 1973. The results are not encouraging for the economy or the market. In every period except one, a 0.6% decline in margins in 12 months coincided with a recession.”

[Figure: Barclays chart of profit margins and recessions]

Buffett Weighs in

Even Warren Buffett gets in on the act (from 1999):

“In my opinion, you have to be wildly optimistic to believe that corporate profits as a percent of GDP can, for any sustained period, hold much above 6%.”


With the Illuminati chorusing as one on the perils of elevated rates of corporate profits, one would be foolish to take a contrarian view, perhaps.  And yet, that claim of Grantham’s (“probably the most mean-reverting series in finance”) poses a challenge worthy of some analysis.  Let’s take a look.

The Predictive Value of Corporate Profit Margins

First, let’s reproduce the St Louis Fed chart:

[Figure: Corporate Profit Margins]

A plot of the series autocorrelations strongly suggests that the series is not mean-reverting at all, but non-stationary, integrated of order 1:

[Figure: Autocorrelations]

Next, we conduct an exhaustive evaluation of a wide range of time series models, including seasonal and non-seasonal ARIMA and GARCH:

[Figures: model fit and results]

The best-fitting model (using the AIC criterion) is a simple ARIMA(0,1,0) model, i.e. a random walk, as anticipated.  The series is apparently difference-stationary, with no mean-reversion characteristics at all.  Diagnostic tests indicate no significant patterning in the model residuals:

[Figure: Residual Autocorrelations]

[Figure: Ljung-Box Test Probabilities]
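For readers who want to replicate the exercise, here is a minimal sketch in Python of the model-selection and diagnostic steps. It assumes the margin series can be proxied by the FRED series CP (corporate profits) divided by GDP, and it searches only non-seasonal ARIMA specifications; the full evaluation described above also covered seasonal ARIMA and GARCH models.

```python
import pandas_datareader.data as web
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_ljungbox

# Quarterly corporate profits (CP) and GDP from FRED; the ratio proxies
# the profit margin series charted above
data = web.DataReader(["CP", "GDP"], "fred", start="1947-01-01")
margin = (data["CP"] / data["GDP"]).dropna()

# Search non-seasonal ARIMA(p, d, q) specifications and rank them by AIC
aic = {}
for d in (0, 1):
    for p in range(3):
        for q in range(3):
            try:
                aic[(p, d, q)] = sm.tsa.ARIMA(margin, order=(p, d, q)).fit().aic
            except Exception:
                pass  # skip specifications that fail to converge

best_order = min(aic, key=aic.get)
print("Best model by AIC:", best_order)  # expect (0, 1, 0), per the text

# Residual diagnostics: Ljung-Box test p-values for the winning model
best_fit = sm.tsa.ARIMA(margin, order=best_order).fit()
print(acorr_ljungbox(best_fit.resid, lags=12))
```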

Using the model to forecast a range of possible values of the Corporate Profit to GDP ratio over the next 8 quarters suggests a very wide range, from as low as 6% to as high as 13%!
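Continuing from the snippet above, the forecast fan is a one-liner; for a random walk the confidence interval widens rapidly with the horizon, which is exactly why the forecast range is so wide.

```python
# 8-quarter forecast fan from the fitted ARIMA(0,1,0) model
fc = best_fit.get_forecast(steps=8)
print(fc.predicted_mean)
print(fc.conf_int(alpha=0.05))  # 95% band spans a wide range of outcomes
```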

[Figure: 8-quarter forecast of the Corporate Profit to GDP ratio]

CONCLUSION

The opinions of investment celebrities like Grantham and Buffett notwithstanding, there really isn’t any evidence in the data to support the suggestion that corporate profit margins are mean-reverting, even though common-sense economics suggests they should be.

The best-available econometric model produces a very wide range of forecasts of corporate profit rates over the next two years, some even higher than they are today.

If a recession is just around the corner,  corporate profit margins aren’t going to call it for us.

Posted in ARMA, Corporate Profit Margins, Forecasting, Fundamentals

Alpha Extraction and Trading Under Different Market Regimes

Market Noise and Alpha Signals

One of the perennial problems in designing trading systems is noise in the data, which can often drown out an alpha signal.  This in turn creates difficulties for a trading system that relies on reading the signal, resulting in greater uncertainty about the trading outcome (i.e. greater volatility in system performance).  According to academic research, a great deal of market noise is caused by trading itself.  There is apparently not much that can be done about that problem: sure, you can trade after hours or overnight, but the benefit of lower signal contamination from noise traders is offset by the disadvantage of poor liquidity.  Hence most of the analysis in this area is directed at amplifying the signal, often using techniques borrowed from signal processing and related engineering disciplines.

There is, however, one trick that I wanted to share with readers that is worth considering.  It allows you to trade during normal market hours, when liquidity is greatest, but at the same time limits the impact of market noise.

Quantifying Market Noise

How do you measure market noise?  One simple approach is to start by measuring market volatility, making the not-unreasonable assumption that higher levels of volatility are associated with greater amounts of random movement (i.e. noise). Conversely, when markets are relatively calm, a greater proportion of the variation is caused by alpha factors.  During the latter periods there is greater information content in market data – the signal:noise ratio is larger and hence the alpha signal can be quantified and captured more accurately.

For a market like the E-Mini futures, the variation in daily volatility is considerable, as illustrated in the chart below.  The median daily volatility is 1.2%, while the maximum value (in 2008) was 14.7%!

[Fig 1: daily volatility of the E-mini futures]

The extremely long tail of the distribution stands out clearly in the following histogram plot.

[Fig 2: histogram of daily volatility]

Obviously there are times when the noise in the process is going to drown out almost any alpha signal. What if we could avoid such periods?

Noise Reduction and Model Fitting

Let’s divide our data into two subsets of equal size, comprising days on which volatility was lower, or higher, than the median value.  Then let’s go ahead and use our alpha signal(s) to fit a trading model, using only data drawn from the lower volatility segment.
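To illustrate the mechanics of the split, here is a minimal sketch in Python. The OHLC data are synthetic placeholders (substitute your own E-mini data), and the daily high-low range is used as a crude volatility proxy; any realized volatility estimator could be substituted.

```python
import numpy as np
import pandas as pd

# Synthetic daily OHLC bars as a placeholder; substitute real E-mini data
rng = np.random.default_rng(0)
idx = pd.bdate_range("2004-01-01", "2015-12-31")
close = 1000 * np.exp(np.cumsum(rng.normal(0, 0.012, len(idx))))
bars = pd.DataFrame({
    "High": close * (1 + np.abs(rng.normal(0, 0.006, len(idx)))),
    "Low": close * (1 - np.abs(rng.normal(0, 0.006, len(idx)))),
    "Close": close,
}, index=idx)

# Daily high-low range as a fraction of the close: a crude volatility proxy
vol = (bars["High"] - bars["Low"]) / bars["Close"]

# Split the sample into quiet and noisy regimes around the median
quiet_days = vol.index[vol <= vol.median()]  # fit the trading model on these
noisy_days = vol.index[vol > vol.median()]
```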

This is actually a little tricky to achieve in practice: most software packages for time series analysis or charting are geared towards data occurring at equally spaced points in time.  One useful trick here is to replace the actual date and time values of the observations with sequential date and time values, in order to fool the software into accepting the data, since there are no longer any gaps in the timestamps.  Of course, the dates on our time series plot or chart will then be incorrect, but that doesn’t matter, as long as we keep a record of the true timestamps.
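A sketch of the timestamp substitution, continuing from the previous snippet:

```python
# Keep only the quiet-day bars, then overwrite the (gappy) index with an
# artificial gap-free sequence that charting/TS software will accept
quiet = bars.loc[bars.index.isin(quiet_days)]
fake_idx = pd.date_range("2000-01-01", periods=len(quiet), freq="D")
real_dates = pd.Series(quiet.index, index=fake_idx)  # map back to true dates
quiet = quiet.set_axis(fake_idx)                     # software sees no gaps
```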

An example of such a system is illustrated below.  The model was fitted to 3-minute bar data in E-mini futures, but only on days with market volatility below the median value, in the period from 2004 to 2015.  The strategy equity curve is exceptionally smooth, as might be expected, and the performance characteristics of the strategy are highly attractive, with a 27% annual rate of return, a profit factor of 1.58 and a Sharpe Ratio approaching double digits.

[Fig 3 and Fig 4: equity curve and performance summary, low-volatility model]

Dealing with the Noisy Trading Days

Let’s say you have developed a trading system that works well on quiet days.  What next?  There are a couple of ways to go:

(i) Deploy the model only on quiet trading days; stay out of the market on volatile days; or

(ii) Develop a separate trading system to handle volatile market conditions.

Which approach is better?  It is likely that the system you develop for trading quiet days will outperform any system you manage to develop for volatile market conditions.  So, arguably, you should simply trade your best model when volatility is muted and avoid trading at other times.  Any other solution may reduce the overall risk-adjusted return.  But that isn’t guaranteed to be the case – and, in fact, I will give an example of systems that, when combined, will in practice yield a higher information ratio than any of the component systems.

Deploying the Trading Systems

The astute reader is likely to have noticed that I have “cheated” by using forward information in the model development process.  In building a trading system based only on data drawn from low-volatility days, I have assumed that I can somehow know in advance whether the market is going to be volatile or not, on any given day.  Of course, I don’t know for sure whether the upcoming session is going to be volatile and hence whether to deploy my trading system, or stand aside.  So is this just a purely theoretical exercise?  No, it’s not, for the following reasons.

The first reason is that, unlike the underlying asset market, the market volatility process is, by comparison, highly predictable.  This is due to a phenomenon known as “long memory”, i.e. very slow decay in the serial autocorrelations of the volatility process.  What that means is that the history of the volatility process contains useful information about its likely future behavior.  [There are several posts on this topic in this blog – just search for “long memory”].  So, in principle, one can develop an effective system to forecast market volatility in advance and hence make an informed decision about whether or not to deploy a specific model.
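The post doesn’t specify a particular forecasting model, but one standard choice that captures long memory parsimoniously is a HAR-type regression (Corsi), sketched below using the vol series from the earlier snippet:

```python
import pandas as pd
import statsmodels.api as sm

# HAR: regress next-day volatility on daily, weekly and monthly averages
X = pd.DataFrame({
    "d": vol.shift(1),                     # yesterday's vol
    "w": vol.rolling(5).mean().shift(1),   # past week
    "m": vol.rolling(21).mean().shift(1),  # past month
}).dropna()
y = vol.loc[X.index]
har = sm.OLS(y, sm.add_constant(X)).fit()

# One-step-ahead forecast from the latest observations; deploy the
# quiet-day model only if forecast vol is below the historical median
latest = pd.DataFrame({"const": [1.0], "d": [vol.iloc[-1]],
                       "w": [vol.iloc[-5:].mean()],
                       "m": [vol.iloc[-21:].mean()]})
next_vol = float(har.predict(latest)[0])
deploy_quiet_model = next_vol <= vol.median()
```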

But let’s say you are unpersuaded by this argument and take the view that market volatility is intrinsically unpredictable.  Does that make this approach impractical?  Not at all.  You have a couple of options:

You can test the model built for quiet days on all the market data, including volatile days.  It may perform acceptably well across both market regimes.

For example, here are the results of a backtest of the model described above on all the market data, including volatile and quiet periods, from 2004-2015.  While the performance characteristics are not quite as good, overall the strategy remains very attractive.

[Fig 5 and Fig 6: equity curve and performance summary, all market data]

Another approach is to develop a second model for volatile days and deploy both the low- and high-volatility regime models simultaneously.  The trading systems will interact (if you allow them to) in a highly nonlinear and unpredictable way.  It might turn out badly – but on the other hand, it might not!  Here, for instance, is the result of running the low- and high-volatility models in parallel for the E-mini futures.  The result is an improvement (relative to the low-volatility model alone), not only in the annual rate of return (21% vs 17.8%), but also in the risk-adjusted performance, profit factor and average trade.

[Fig 7 and Fig 8: combined low- and high-volatility models, equity curve and performance summary]

CONCLUSION

Separating the data into multiple subsets representing different market regimes allows the system developer to improve the signal:noise ratio, increasing the effectiveness of the alpha factors. Potentially, this allows important features of the underlying market dynamics to be captured in the model more easily, which can lead to improved trading performance.

Models developed for different market regimes can be tested across all market conditions and deployed on an everyday basis if shown to be sufficiently robust.  Alternatively, a meta-strategy can be developed to forecast the market regime and select the appropriate trading system accordingly.

Finally, it is possible to achieve acceptable, or even very good results, by deploying several different models simultaneously and allowing them to interact, as the market moves from regime to regime.

 

Posted in Alpha, Forecasting, Natural Gas Futures, Regime Shifts, Signal Processing, Systematic Strategies, Volatility Modeling

How to Make Money in a Down Market

The popular VIX blog Vix and More evaluates the performance of the VIX ETFs (actually ETNs) and concludes that all of them lost money in 2015.  Yes, both long volatility and short volatility products lost money!

[Figure: VIX ETP performance in 2015.  Source: Vix and More]

By contrast, our Volatility ETF strategy had an exceptional year in 2015, making money in every month but one:

[Figure: monthly percentage returns of the Volatility ETF strategy]

How to Profit in a Down Market

How do you make money when every product you are trading loses money?  Obviously you have to short one or more of them.  But that can be a very dangerous thing to do, especially in a product like the VIX ETNs.  Volatility itself is very volatile – it has an annual volatility (the volatility of volatility, or VVIX) that averages around 100% and which reached a record high of 212% in August 2015.

[Figure: The CBOE VVIX Index]

Selling products based on such a volatile instrument can be extremely hazardous – even in a downtrend: the counter-trends are often extremely violent, making a short position challenging to maintain.

Relative value trading is a more conservative approach to the problem.  Here, rather than trading a single product you trade a pair, or a basket of them.  Your bet is that the ETFs (or stocks) you are long will outperform the ETFs you are short.  Even if your favored ETFs decline, you can still make money if the ETFs you short decline even more.

This is the basis for the original concept of hedge funds, as envisaged by Alfred Jones in the 1940s, and underpins the most popular hedge fund strategy, equity long-short.  But what works successfully in equities can equally be applied to other markets, including volatility.  In fact, I have argued elsewhere that the relative value (long/short) concept works even better in volatility markets, chiefly because the correlations between volatility processes tend to be higher than the correlations between the underlying asset processes (see The Case for Volatility as an Asset Class).

 

Posted in ETFs, Hedge Funds, Relative Value, VIX Index, Volatility ETF Strategy, Volatility Modeling

Volatility Strategies in 2015

2015 proved to be an extremely difficult year for volatility strategies generally. The reasons are not difficult to fathom: the sea-change in equity markets resulting from the Fed’s cessation of quantitative easing (for now) produced a no-less dramatic shift in the volatility term structure.  During the summer months spot and front-month volatility surged, producing an inverted term structure in VIX futures and causing havoc for the great number of volatility carry strategies that depend on the usual upward-sloping (contango) shape of the futures curve.

Performance results for many volatility strategies over the course of the year reflect the difficulties of managing these market gyrations.   The blog site Volatility Made Simple, which charts the progress of 24 volatility strategies, reported the year-end results as follows:

[Figure: year-end results for 24 volatility strategies.  Source: Volatility Made Simple]

While these strategies are hardly the “best in class”, the fact that all but a handful reported substantial losses for the year speaks volumes about the challenges faced by volatility strategies during periods of market turbulence.  Simplistic approaches, such as volatility carry strategies, tend to blow up when volatility surges and the curve inverts; the losses incurred during such episodes will often undo most or all of the gains accrued in prior months, or even years.

Although our own volatility ETF portfolio on any given day might bear a passing resemblance to some of these strategies, in fact the logic behind it is considerably more sophisticated. We take an options-theoretic approach to pricing leveraged ETFs, which allows us to exploit the potential for selling expensive Theta against cheap Gamma, while at the same time affording opportunities to take advantage of the convexity of levered ETF products (for a more detailed explanation, see Investing in Leveraged ETFs – Theory and Practice).  We also mix together multiple models using a wide range of data frequencies, in both time and trade space, and apply a model management system to optimize the result in real time (the reader is referred to my post on Meta-Strategies for more on this topic).

So much for theory – how did all this work out in practice in 2015?  The following are some summary results for our volatility ETF strategy, which we operate for several managed accounts.
[Figure: volatility ETF strategy performance summary, December 2015]

The substantial increase in annual returns during 2015 is largely a reflection of the surge in volatility during the summer months (especially July), although it is interesting to note, too, that performance also improved on a risk-adjusted basis during the year: the strategy’s Sharpe ratio currently stands at around 3.60.  It is perhaps unlikely that the strategy will continue performing at these elevated levels in 2016, although volatility shows no sign of moderating yet.  Our aim is to produce returns of 30% to 40% during a normal year, although 2016 could prove to be above-average, especially if the equity market corrects.

 

Posted in Meta-Strategy, VIX Index, Volatility ETF Strategy, Volatility Modeling

Overnight Trading in the E-Mini S&P 500 Futures

Jeff Swanson’s Trading System Success web site is often worth a visit for those looking for new trading ideas.

A recent post, Seasonality S&P Market Session, caught my eye, as I have investigated several ideas for overnight trading in the E-minis myself.  Seasonal effects are of course widely recognized and traded in commodities markets, but they can also apply to financial products such as the E-mini.  Jeff’s point about session times is well-made: it is often worthwhile to look at the behavior of an asset, not only in different time frames, but also during different periods of the trading day, day of the week, or month of the year.

Jeff breaks the E-mini trading session into several basic sub-sessions:

  1. “Pre-Market”: between 5:30 and 8:30
  2. “Open”: between 8:30 and 9:00
  3. “Morning”: between 9:00 and 11:30
  4. “Lunch”: between 11:30 and 13:15
  5. “Afternoon”: between 13:15 and 14:00
  6. “Close”: between 14:00 and 15:15
  7. “Post-Market”: between 15:15 and 18:00
  8. “Night”: between 18:00 and 5:30

In his analysis, Jeff’s strategy is simply to buy at the open of each session and close the trade at the end of the session. This mirrors the traditional seasonality study, in which a trade is opened at the beginning of the season and closed several months later when the season comes to an end.

Evaluating Overnight Session and Seasonal Effects

The analysis evaluates the performance of this basic strategy during the “bullish season”, from Nov-May, when the equity markets traditionally make the majority of their annual gains, compared to the outcome during the “bearish season” from Jun-Oct.

None of the outcomes of these tests is especially noteworthy, save one:  the performance during the overnight session in the bullish season:

[Fig 1: overnight session performance during the bullish season]

The tendency of the overnight session in the E-mini to produce clearer trends and trading signals has been well documented.  Plausible explanations for this phenomenon are that:

(a) The returns process in the overnight session is less contaminated with noise, which primarily results from trading activity; and/or

(b) The relatively poor liquidity of the overnight session allows participants to push the market in one direction more easily.

Either way, there is no denying that this study and several other, similar studies appear to demonstrate interesting trading opportunities in the overnight market.

That is, until trading costs are considered.  Results for the trading strategy from Nov 1997-Nov 2015 show a gain of $54,575, but an average trade of only just over $20:

Gross PL     # Trades     Av Trade
$54,575      2701         $20.21

Assuming that we enter and exit aggressively, buying at the market at the start of the session and selling MOC at the close, we will pay the bid-offer spread and commissions amounting to around $30, producing a net loss of $10 per trade.
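The $30 estimate can be unpacked as follows; the tick value is the standard $12.50 for the ES contract, while the commission rate is an assumption:

```python
# Back-of-envelope round-turn cost of aggressive execution in the ES
tick_value = 12.50                 # one tick on the E-mini S&P 500
commission = 2.50                  # per side (assumed broker rate)
cost = 2 * tick_value + 2 * commission   # pay the spread twice: ~$30
net_per_trade = 20.21 - cost             # ~$20 average trade => ~-$10 net
```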

The situation can be improved by omitting January from the “bullish season”, but the slightly higher average trade is still insufficient to overcome trading costs:

Gross PL     # Trades     Av Trade
$54,550      2327         $23.44

Designing a Seasonal Trading Strategy for the Overnight Session

At this point an academic research paper might conclude that the apparently anomalous trading profits are subsumed within the bid-offer spread.  But for a trading system designer this is not the end of the story.

If the profits are insufficient to overcome trading frictions when we cross the spread on entry and exit, what about a trading strategy that permits market orders on only the exit leg of the trade, while using limit orders to enter?  Total trading costs will be reduced to something closer to $17.50 per round turn, leaving a net profit of almost $6 per trade.

Of course, there is no guarantee that we will successfully enter every trade – our limit orders may not be filled at the bid price and, indeed, we are likely to suffer adverse selection – i.e. getting filled on every losing trade, while missing a proportion of the winning trades.

On the other hand, we are hardly obliged to hold a position for the entire overnight session.  Nor are we obliged to exit every trade MOC – we might find opportunities to exit prior to the end of the session, using limit orders to achieve a profit target or cap a trading loss.  In such a system, some proportion of the trades will use limit orders on both entry and exit, reducing trading costs for those trades to around $5 per round turn.

The key point is that we can use the seasonal effects detected in the overnight session as a starting point for the development of a more sophisticated trading system that uses a variety of entry and exit criteria, and order types.

The following shows the performance results for a trading system designed to trade 30-minute bars in the E-mini futures overnight session during the months of Nov to May.  The strategy enters trades using limit prices and exits using a combination of profit targets, stop loss targets, and MOC orders.

Data from 1997 to 2010 were used to design the system, which was tested on out-of-sample data from 2011 to 2013.  Unseen data from Jan 2014 to Nov 2015 were used to provide a further (double blind) evaluation period for the strategy.

[Fig 2]

 

 

  

                                     ALL TRADES     LONG           SHORT
Closed Trade Net Profit              $83,080        $61,493        $21,588
  Gross Profit                       $158,193       $132,573       $25,620
  Gross Loss                         -$75,113       -$71,080       -$4,033
Profit Factor                        2.11           1.87           6.35
Ratio L/S Net Profit                 2.85
Total Net Profit                     $83,080        $61,493        $21,588
Trading Period                       11/13/97 2:30:00 AM to 12/31/13 6:30:00 AM (16 years 48 days)
Number of Trading Days               2767
Starting Account Equity              $100,000
Highest Equity                       $183,080
Lowest Equity                        $97,550
Final Closed Trade Equity            $183,080
Return on Starting Equity            83.08%
Number of Closed Trades              849            789            60
  Number of Winning Trades           564            528            36
  Number of Losing Trades            285            261            24
  Trades Not Taken                   0              0              0
Percent Profitable                   66.43%         66.92%         60.00%
Trades Per Year                      52.63          48.91          3.72
Trades Per Month                     4.39           4.08           0.31
Max Position Size                    1              1              1
Average Trade (Expectation)          $97.86         $77.94         $359.79
Average Trade (%)                    0.07%          0.06%          0.33%
Trade Standard Deviation             $641.97        $552.56        $1,330.60
Trade Standard Deviation (%)         0.48%          0.44%          1.20%
Average Bars in Trades               15.2           14.53          24.1
Average MAE                          $190.34        $181.83        $302.29
Average MAE (%)                      0.14%          0.15%          0.27%
Maximum MAE                          $3,237         $2,850         $3,237
Maximum MAE (%)                      2.77%          2.52%          3.10%
Win/Loss Ratio                       1.06           0.92           4.24
Win/Loss Ratio (%)                   2.10           1.83           7.04
Return/Drawdown Ratio                15.36          14.82          5.86
Sharpe Ratio                         0.43           0.46           0.52
Sortino Ratio                        1.61           1.69           6.40
MAR Ratio                            0.71           0.73           0.33
Correlation Coefficient              0.95           0.96           0.719
Statistical Significance             100%           100%           97.78%
Average Risk                         $1,099         $1,182         $0.00
Average Risk (%)                     0.78%          0.95%          0.00%
Average R-Multiple (Expectancy)      0.0615         0.0662         0
R-Multiple Standard Deviation        0.4357         0.4357         0
Average Leverage                     0.399          0.451          0.463
Maximum Leverage                     0.685          0.694          0.714
Risk of Ruin                         0.00%          0.00%          0.00%
Kelly f                              34.89%         31.04%         50.56%
Average Annual Profit/Loss           $5,150         $3,811         $1,338
Ave Annual Compounded Return         3.82%          3.02%          1.22%
Average Monthly Profit/Loss          $429.17        $317.66        $111.52
Ave Monthly Compounded Return        0.31%          0.25%          0.10%
Average Weekly Profit/Loss           $98.70         $73.05         $25.65
Ave Weekly Compounded Return         0.07%          0.06%          0.02%
Average Daily Profit/Loss            $30.03         $22.22         $7.80
Ave Daily Compounded Return          0.02%          0.02%          0.01%

INTRA-BAR EQUITY DRAWDOWNS           ALL TRADES     LONG           SHORT
Number of Drawdowns                  445            422            79
Average Drawdown                     $282.88        $269.15        $441.23
Average Drawdown (%)                 0.21%          0.20%          0.33%
Average Length of Drawdowns          10 days 19 hrs 10 days 20 hrs 66 days 1 hr
Average Trades in Drawdowns          3              3              1
Worst Case Drawdown                  $6,502         $4,987         $4,350
Date at Trough                       12/13/00 1:30  5/24/00 4:30   12/13/00 1:30

Posted in eMini Futures, Futures, Overnight Trading

Improving A Hedge Fund Investment – Cantab Capital’s Quantitative Aristarchus Fund


In this post I am going to take a look at what an investor can do to improve a hedge fund investment through the use of dynamic capital allocation. For the purposes of illustration I am going to use Cantab Capital’s Aristarchus program – a quantitative fund which has grown to over $3.5Bn in assets under management since it opened in 2007 with $30M, under co-founders Dr. Ewan Kirk and Erich Schlaikjer.

I chose this product because, firstly, it is one of the most successful quantitative funds in existence and, secondly, because as a CTA its performance record is publicly available.

Cantab’s Aristarchus Fund

Cantab’s stated investment philosophy is that algorithmic trading can help to overcome cognitive biases inherent in human-based trading decisions, by exploiting persistent statistical relationships between markets. Taking a multi-asset, multi-model approach, the majority of Cantab’s traded instruments are liquid futures and forwards, across currencies, fixed income, equity indices and commodities.

Let’s take a look at how that has worked out in practice:

[Fig 1 and Fig 2: Aristarchus fund performance]

Whatever the fund’s attractions may be, we can at least agree that alpha is not amongst them.  A Sharpe ratio of < 0.5 (which I calculate to be nearer 0.41) is hardly in Renaissance territory, so one imagines that the chief benefit of the product must lie in its liquidity and low market correlation.  Uncorrelated it may be, but an investor in the fund must have extremely deep pockets – and a very strong stomach – to handle the 34% drawdown that the fund suffered in 2013.

Improving the Aristarchus Fund Performance

If we make the assumption that an investment in this product is warranted in the first place, what can be done to improve its performance characteristics?  We’ll look at that question from two different perspectives – the investor’s and the manager’s.

Firstly, from the investor’s perspective, there are relatively few options available to enhance the fund’s contribution, other than through diversification.  One other possibility available to the investor, however, is to develop a program for dynamic capital allocation.  This requires the manager to be open to allowing significant changes in the amount of capital to be allocated from month to month, or quarter to quarter, but in a liquid product like Aristarchus some measure of flexibility ought to be feasible.

An analysis of the fund’s performance indicates the presence of a strong dependency in the returns process.  This is not at all unusual.  Often investment strategies have a tendency to mean-revert: a negative dependency in which periods of poor performance tend to be followed by positive performance, and vice versa.  CTA strategies such as Aristarchus tend to be trend-following, and this can induce positive dependency in the strategy returns process, in which positive months tend to follow earlier positive months, while losing months tend to be followed by further losses.  This is the pattern we find here.

Consequently, rather than maintaining a constant capital allocation, an investor would do better to allocate capital dynamically, increasing the amount of capital after a positive period, while decreasing the allocation after a period of losses.  Let’s consider a variation of this allocation plan, in which the amount of allocated capital is increased by 70% when the last monthly equity value exceeds the quarterly moving average, while the allocation is reduced to zero when the last month’s equity falls below the average.  A dynamic capital allocation plan as simple as this appears to produce a significant improvement in the overall performance of the investment:

[Fig 4: static vs. dynamic capital allocation]
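A minimal sketch of this allocation rule, assuming the fund’s track record is available as a pd.Series of monthly returns (hypothetical input):

```python
import numpy as np
import pandas as pd

def dynamic_weights(monthly_returns: pd.Series, boost: float = 0.70) -> pd.Series:
    """Allocate (1 + boost) x base capital while last month's equity sits
    above its 3-month moving average, and zero while it sits below. The
    signal is lagged one month, so each decision uses only information
    available at the time."""
    equity = (1 + monthly_returns).cumprod()
    above = (equity > equity.rolling(3).mean()).shift(1).fillna(False).astype(bool)
    return pd.Series(np.where(above, 1.0 + boost, 0.0),
                     index=monthly_returns.index)

# Usage (fund_returns is a hypothetical monthly return series):
# dyn_curve = (1 + dynamic_weights(fund_returns) * fund_returns).cumprod()
```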

The slight increase in annual volatility in the returns produced by the dynamic capital allocation model is more than offset by the 412bp improvement in the CAGR. Consequently, the Sharpe Ratio improves from 0.41 to 0.60.

Nor is this by any means the entire story: the dynamic model produces lower average drawdowns (7.93% vs. 8.52%) and, more importantly, reduces the maximum drawdown over the life of the fund from a painful 34.87% to a more palatable 23.92%.

The much-improved risk profile of the dynamic allocation scheme is reflected in the Return/Drawdown Ratio, which rises from 2.44 to 6.52.

Note, too, that the average level of capital allocated in the dynamic scheme is very slightly less than the original static allocation.  In other words, the dynamic allocation technique results in a more efficient use of capital, while at the same time producing a higher rate of risk-adjusted return and enhancing the overall risk characteristics of the strategy.

Improving Fund Performance Using a Meta-Strategy

So much for the investor.  What could the manager do to improve the strategy performance?  Of course, there is nothing in principle to prevent the manager from also adopting a dynamic approach to capital allocation, although his investment mandate may require him to be fully invested at all times.

Assuming for the moment that this approach is not available to the manager, he can instead look into the possibilities for developing a meta-strategy.    As I explained in my earlier post on the topic:

A meta-strategy is a trading system that trades trading systems.  The idea is to develop a strategy that will make sensible decisions about when to trade a specific system, in a way that yields superior performance compared to simply following the underlying trading system.

It turns out to be quite straightforward to develop such a meta-strategy, using a combination of stop-loss limits and profit targets to decide when to turn the strategy on or off.  In so doing, the manager is able to avoid some periods of negative performance, producing a significant uplift in the overall risk-adjusted return:

[Fig 5: meta-strategy performance]
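The precise stop-loss and profit-target rules are not spelled out here, so the sketch below is one illustrative implementation, with arbitrary thresholds, operating on the underlying system’s per-trade PnL (hypothetical input):

```python
import pandas as pd

def meta_on_off(trade_pnl: pd.Series, stop: float = 2000.0,
                target: float = 5000.0) -> pd.Series:
    """1 = underlying system on, 0 = off. Switch off after losing 'stop'
    since the last switch point; switch back on once the paper equity has
    recovered by 'target' from the switch-off level."""
    flags, on, anchor, equity = [], True, 0.0, 0.0
    for pnl in trade_pnl:
        flags.append(int(on))          # decision precedes the trade's PnL
        equity += pnl                  # paper equity always accrues
        if on and equity - anchor <= -stop:
            on, anchor = False, equity     # stand aside
        elif not on and equity - anchor >= target:
            on, anchor = True, equity      # re-engage
    return pd.Series(flags, index=trade_pnl.index)

# Realized meta-strategy PnL: meta_on_off(trade_pnl) * trade_pnl
```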

Conclusion

Meta-strategies and dynamic capital allocation schemes can enable the investor and the investment manager to improve the performance characteristics of their investment and investment strategy, by increasing returns, reducing volatility and the propensity of the strategy to produce substantial drawdowns.

We have demonstrated how these approaches can be applied successfully to Cantab’s Aristarchus quantitative fund, producing substantial gains in risk adjusted performance and reductions in the average and maximum drawdowns produced over the life of the fund.

Posted in CTA, Dynamic Capital Model, Futures, Mean Reversion, Meta-Strategy, Momentum

Careers at Systematic Strategies

[Job postings: Quant Trader, Prop Trader]

Posted in Proprietary Traders, Quant/Traders, Systematic Strategies

A Meta-Strategy in Euro Futures

Several readers responded to my recent invitation to send me details of their trading strategies, to see if I could develop a meta-strategy with superior overall performance characteristics (see original post here).

One reader sent me the following strategy in EUR futures, with a promising-looking equity curve over the period from 2009-2014.

[Figure: equity curve of the original EUR futures strategy, 2009-2014]

I have no information about the underlying architecture of the strategy, but a performance analysis shows that it trades approximately once per day, with a win rate of 49%, a PNL per trade of $4.79 and an IR estimated to be 2.6.

Designing the Meta-Strategy

My task was to see if I could design a meta-strategy that would “trade” the underlying strategy, i.e. produce signals to turn the underlying strategy on or off.  Here we are designing a long-only strategy, where a “buy” trade represents the signal to turn the underlying strategy on, while an exit trade from the meta-strategy turns the underlying strategy off.

The meta-strategy is built in trade time rather than calendar time – we don’t want the meta-strategy trying to turn the underlying trading strategy on or off while it is in the middle of a trade.  The data we use in the design exercise is the trade-by-trade equity curve, including the date and timestamp and the open, high, low and close values of the equity curve for each trade.

No allowance for trading costs is necessary since all of the transaction costs are baked into the PNL of the underlying strategy – there are no additional costs entailed in turning the strategy on or off, as long as we do that in a period when there is no open position.

In designing the meta-strategy I chose simply to try to improve the overall net PNL.  This is a good starting point, but one would typically go on to consider a variety of other possible criteria, including, for example, Net Profit / Av. Max Drawdown, Net Profit / Flat Time, MAR Ratio, Sharpe Ratio, Kelly Criterion, or a combination of them.
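For illustration, a few of these candidate objective functions might be computed from the per-trade PnL series as follows (a sketch; trade_pnl is a hypothetical input):

```python
import pandas as pd

def objectives(trade_pnl: pd.Series) -> dict:
    """A few of the candidate criteria listed above, computed from a
    per-trade PnL series."""
    equity = trade_pnl.cumsum()
    drawdowns = equity.cummax() - equity
    return {
        "net_pnl": equity.iloc[-1],
        "net_pnl_over_max_dd": equity.iloc[-1] / max(drawdowns.max(), 1e-9),
        "ir_per_trade": trade_pnl.mean() / trade_pnl.std(),  # unannualized
        "win_rate": (trade_pnl > 0).mean(),
    }
```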

I used 80% of the trade data to design and test the strategy and reserved 20% of the data to test the performance of the meta-strategy out-of-sample.

Results

The analysis summarized below shows a clear improvement in the overall performance of the meta-strategy, compared to the underlying strategy.  Net PNL and Average Trade are increased by 40%, while the trade standard deviation is noticeably reduced, leading to a higher IR of 5.27 vs 3.10.  The win rate increases from around 2/3 to over 90%.

Although not as marked, the overall improvement in strategy performance metrics during the out-of-sample test period is highly significant, both economically and statistically.

Note that the Meta-strategy is a long-only strategy in which each “trade” is a period in which the system trades the underlying EUR futures strategy.  So in fact, in the Meta-strategy, each trade represents a number of successive underlying, real trades (which of course may be long or short).

Put another way, the Meta-Strategy turns the underlying trading strategy on and off 276 times in total.

[Figures: meta-strategy performance summaries]

 

Conclusion

It is feasible to design a meta-strategy that improves the overall performance characteristics of an underlying trading strategy, by identifying the higher-value trades and turning the strategy on or off based on forecasts of its future performance.

No knowledge is required of the mechanics of the underlying trading strategy in order to design a profitable Meta-strategy.

Meta-strategies have been successfully applied to problems of capital allocation, where decisions are made on a regular basis about how much capital to allocate to multiple trading strategies, or traders.

 

 

Posted in F/X, Meta-Strategy

Portfolio Improvement for the Equity Investor


Equity investors and long-only portfolio managers are constantly on the lookout for ways to improve their portfolios, either by yield enhancement, or risk reduction.  In the case of yield enhancement, the principal focus is on adding alpha to the portfolio through stock selection and active management, while risk reduction tends to be accomplished through diversification.

Another approach is to seek improvement by adding investments outside the chosen universe of stocks, while remaining within the scope of the investment mandate (which, for instance, may include equity-related products, but not futures or options).  The advent of volatility products in the mid-2000s offered new opportunities for risk reduction, but this benefit was typically achieved at the cost of several hundred basis points in yield.  Over the last decade, however, a significant evolution has taken place in volatility strategies, such that they can now not only provide insurance for the equity portfolio, but, in addition, serve as an orthogonal source of alpha to enhance portfolio yields.

An example of one such product is our volatility strategy, a quantitative approach to trading VIX-related ETF products traded on ARCA. A summary of the performance of the strategy is given below.

[Figure: volatility strategy performance summary, September 2015]

The mechanics of the strategy are unlikely to be of great interest to the typical equity investor and so need not detain us here.  Rather, I want to focus on how an investor can use such products to enhance their equity portfolio.

Performance of the Equity Market and Individual Sectors

The last five years have been extremely benign for the equity market, not only for the broad market, as evidenced by the performance of the SPDR S&P 500 Trust ETF (SPY), but also for almost every individual sector, with the notable exception of energy.

[Figure: Sector ETF Performance 2012-2015]

The risk-adjusted returns have been exceptional over this period, with information ratios reaching 1.4 or higher for several of the sectors, including Financials, Consumer Staples, Healthcare and Consumer Discretionary.  If the equity investor has been in a position to diversify his portfolio as fully as the SPY ETF, it might reasonably be assumed that he has accomplished the maximum possible level of risk reduction; at the same time, no-one is going to argue with a CAGR of 16.35%.  Yet, even here, portfolio improvement is possible.

Yield Enhancement

The key to improving the portfolio yield lies in the superior risk-adjusted performance of the volatility portfolio compared to the equity portfolio, and also in the fact that, while the correlation between the two is significant (at 0.44), it is considerably lower than 1.  Hence there is potential for generating higher rates of return on a risk-adjusted basis by combining the pair of portfolios in some proportion.

To illustrate this we assume, firstly, that the investor is comfortable with the current level of risk in his broadly diversified equity portfolio, as measured by the annual standard deviation of returns, currently 10.65%.   Holding this level of risk constant, we now introduce an overlay strategy, namely the volatility portfolio, to which we seek to allocate some proportion of the available investment capital.  With this constraint it turns out that we can achieve a substantial improvement in the overall yield by reducing our holding in the equity portfolio to just over 2/3 of the current level (67.2%) and allocating 32.8% of the capital to the volatility portfolio.  Over the period from 2012, the combined equity and volatility portfolio produced a CAGR of 26.83%, but with the same annual standard deviation – a yield enhancement of 10.48% annually.  The portfolio Information Ratio improves from 1.53 to 2.52, reflecting the much higher returns produced by the combined portfolio, for the same level of risk as before.

[Figure: combined equity and volatility portfolio performance]
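The mechanics of the exercise can be sketched as follows. The equity portfolio’s CAGR, standard deviation and the 0.44 correlation are as quoted above; the overlay’s CAGR and standard deviation are not quoted, so the values used here are purely illustrative assumptions:

```python
import numpy as np

mu_e, sd_e = 0.1635, 0.1065   # equity portfolio CAGR and annual sd (quoted)
mu_v, sd_v = 0.40, 0.25       # overlay CAGR and sd: ILLUSTRATIVE assumptions
rho = 0.44                    # quoted correlation between the two

def blend_sd(w):  # w = weight in equities, (1 - w) in the overlay
    return np.sqrt((w * sd_e) ** 2 + ((1 - w) * sd_v) ** 2
                   + 2 * w * (1 - w) * rho * sd_e * sd_v)

# Highest-return mix whose risk does not exceed the original 10.65%
ws = np.linspace(0, 1, 1001)
feasible = [w for w in ws if blend_sd(w) <= sd_e]
best = max(feasible, key=lambda w: w * mu_e + (1 - w) * mu_v)
print(f"equity {best:.1%}, overlay {1 - best:.1%}, sd {blend_sd(best):.2%}")
```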

Risk Reduction

The given example may appear impressive, but it isn’t really a practical proposition.  Firstly, no equity investor or portfolio manager is likely to want to allocate 1/3 of their total capital to a strategy operated by a third party, no matter how impressive the returns. Secondly, the capacity in the volatility strategy is, realistically, of the order of $100 million.  A 32.8% allocation of capital from a sizeable equity portfolio would absorb a large proportion of the available capacity in the volatility ETF strategy, or even all of it.

A much more realistic approach would be to cap the allocation to the volatility component at a reasonable level – say, 5%.  Then the allocation from a $100M capital budget would be $5M, well within the capacity constraints of the volatility product.  In fact, operating at this capped allocation percentage, the volatility strategy provides capacity for equity portfolios of up to $2Bn in total capital.

Let’s look at an example of what can be achieved under a 5% allocation constraint.  In this scenario I am going to move along the second axis of portfolio improvement – risk reduction.  Here, we assume that we wish to maintain the current level of performance of the equity portfolio (CAGR 16.35%), while reducing the risk as much as possible.

A legitimate question at this stage is how it might be possible to reduce risk by introducing a new investment that has a higher annual standard deviation than the existing portfolio.  The answer is simply that we move some of our existing investment into cash (or, rather, Treasury securities).  In fact, by allocating the maximum allowed to the volatility portfolio (5%) and reducing our holding in the equity portfolio to 85.8% of the original level (with the remaining 9.2% in cash), we are able to create a portfolio with the same CAGR but with an annual volatility in single digits: 9.53%, a reduction in risk of 112 basis points annually.  At the same time, the risk-adjusted performance of the portfolio improves from 1.53 to 1.71 over the period from 2012.

Of course, the level of portfolio improvement is highly dependent on the performance characteristics of both the equity portfolio and overlay strategy, as well as the correlation between them. To take a further example, if we consider an equity portfolio mirroring the characteristics of the Materials Select Sector SPDR ETF (XLB), we can achieve a reduction of as much as 3.31% in the annual standard deviation, without any loss in expected yield, through an allocation of 5% to the volatility overlay strategy and a much higher allocation of 18% to cash.

Other Considerations

Investors and money managers being what they are, it goes against the grain to consider allocating money to a third party – after all, a professional money manager earns his living from his own investment expertise, rather than relying on others.  Yet no investor can reasonably expect to achieve the same level of success in every field of investment.  If you have built your reputation on your abilities as a fundamental analyst and stock picker, it is unreasonable to expect that you will be able to accomplish as much in the arena of quantitative investment strategies.  Secondly, by capping the allocation to an external manager at the level of 5% to 10%, your primary investment approach remains unaltered – you are maintaining the fidelity of your principal investment thesis and investment mandate.  Thirdly, there is no reason why overlay strategies such as the one discussed here should not provide easy liquidity terms – after all, the underlying investments are liquid, exchange traded products. Finally, if you allocate capital in the form of a managed account you can maintain control over the allocated capital and make adjustments rapidly, as your investment needs change.

Conclusion

Quantitative strategies have a useful role to play for equity investors and portfolio managers as a means to improve existing portfolios, whether by yield enhancement, risk reduction, or a combination of the two.  While the level of improvement is highly dependent on the performance characteristics of the equity portfolio and the overlay strategy, the indications are that yield enhancement, or risk reduction, of the order of hundreds of basis points may be achievable even through very modest allocations of capital.

Posted in Portfolio Management, Risk Management, Volatility ETF Strategy, Volatility Modeling

Daytrading Volatility ETFs

As we have discussed before, there is no standard definition of high frequency trading.  For some, trading more than once or twice a day constitutes high frequency, while others regard anything less than several hundred times a session as low, or medium frequency trading.  Hence in this post I have referred to “daytrading”, since we can at least agree on that description for a strategy that exits all positions by the close of the session.

HFT in ETFs – Challenges and Opportunities

High frequency trading in equities and ETFs offers its own opportunities and challenges compared to futures. Amongst the opportunities we might list:

  • Arbitrage between destinations (exchanges, dark pools) where the stock is traded
  • Earning rebates from the exchanges willing to pay for order flow
  • Arbitraging news flows amongst pairs or baskets of equities

When it comes to ETFs, unfortunately, the set of possibilities is more restricted than for single names and one is often obliged to dig deeply into the basket/replication/cointegration type of approach, which can be very challenging in a high frequency context.  The risk of one leg of a multi-asset trade being left unfilled is such that one has to be willing to cross the spread to get the trade on.  Depending on the trading platform and the quality of the execution algorithms, this can make trading the strategy prohibitively expensive.

In that case you have a number of possibilities to consider.  You can simplify the trade, limit the number of stocks in the basket and hope that there is enough alpha left in the reduced strategy. You can focus on managing the trade execution sufficiently well that aggressive trading becomes necessary on relatively few occasions and you look to minimize the costs of paying the spread when they arise.  You can design strategies with higher profit factors that are able to withstand the performance drag entailed in trading aggressively.  Or you can design slower versions of the strategy where latency, fill rates and execution costs are not such critical factors.

Developing high frequency strategies in the volatility ETFs presents special challenges.  Being fairly new, the products have limited histories, which makes modeling more of a challenge.  One way to address this is to create synthetic series priced from the VIX futures, using the published methodology for constructing the ETFs.  Be warned, though, that these synthetic series are likely to inflate your backtest results since they aren’t traded instruments.
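A simplified sketch of the construction, in the spirit of the S&P 500 VIX Short-Term Futures index methodology: hold a blend of the first two futures contracts, rolling linearly from front to second month. The price and days-to-expiry series are hypothetical inputs, and the sketch ignores the Treasury-bill return and ETN fees:

```python
import pandas as pd

def synthetic_st_index(f1: pd.Series, f2: pd.Series, dte1: pd.Series,
                       roll_days: int = 21) -> pd.Series:
    """Excess-return index from a blend of first- and second-month VIX
    futures, rolled linearly so the blend's maturity stays near 30 days.
    f1, f2: settlement prices; dte1: front contract's days to the roll."""
    w1 = (dte1 / roll_days).clip(0, 1).shift(1)     # yesterday's weights
    num = w1 * f1.diff() + (1 - w1) * f2.diff()     # P&L of the held blend
    den = w1 * f1.shift(1) + (1 - w1) * f2.shift(1)
    ret = (num / den).fillna(0.0)
    return (1 + ret).cumprod()
```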

Another practical problem that crops up regularly in products like UVXY and VXX is that the broker has difficulty locating stock for short selling.  So you are limited to taking the strategy offline when that occurs, designing strategies that trade long only, or as we do, switching to other products when the ETF is unavailable to short.

Then there is the capacity issue. Despite their fast-growing popularity, volatility ETF funds are in many cases quite small, totaling perhaps a few hundred millions of dollars in AUM. You are never going to be able to construct a strategy capable of absorbing billions of dollars of investment in the ETF products alone.

Volatility and Alpha

For these reasons, volatility ETFs are not a natural choice for many investment strategists.  But they do have one great advantage compared to other products: volatility.  Volatility implies uncertainty about the true value of a security, which means that market participants can have very different views about what it is worth at any moment in time.  So the prospects for achieving competitive advantage through superior analytical methods are much greater than for a stock that hardly moves at all and on whose value everyone concurs.  Furthermore, volatility creates regular opportunities for hitting stops, and creating mini crashes or short squeezes, in which the security is temporarily under- or over-valued.  If ever there was a security offering the potential for generating alpha, it is the volatility ETF.

The volatility of the VIX ETFs is enormous, by the standards of regular stocks.  A typical stock might have an annual volatility of 30% to 60%.  The lowest level ever seen in the VVIX index series so far is 70%. To give you an idea of how extreme it can become, during the latest market swoon in August the VVIX, the volatility-of-volatility for the S&P500 index, reached over 200% a year.

A Daytrading Strategy in the VXX

So, despite the challenges and difficulties, there are very good reasons to make the attempt to develop strategies for the volatility ETF products.  My firm, Systematic Strategies, has developed several such algorithms that are combined to create a strategy that trades the volatility ETFs very successfully.  Until recently, however,  all of the sub-strategies we employ were longer term in nature, and entailed holding positions overnight.  We wanted to develop higher frequency algorithms that could react more quickly to changes in the volatility landscape.  We had to dig pretty deep into the arsenal of trading ideas to get there, but eventually we succeeded.  After six months of live trading we were ready to release the new VXX daytrading algorithm into production for our volatility ETF strategy investors.  Here’s how it looks (results are for a $100,000 account).

[Fig 1, Fig 2, Fig 3: VXX daytrading strategy performance]

As you can see, the strategy trades up to around 10 times a day with a reasonable profit factor (1.53) and win rate of just under 60%. By itself, the strategy has a Sharpe Ratio of around 6, so it is well worth trading on its own.  But its real value (for us) emerges when it is combined in appropriate proportion with the other, lower frequency algorithms in the volatility strategy.  The additional alpha from the VXX strategy reduces the size of the loss in August and produces a substantial gain in September, taking the YTD return to just under 50%.  Returns for Oct MTD are already at 16%.

[Figure: volatility strategy performance summary, September 2015]

 

 

Posted in Algorithmic Trading, High Frequency Trading, VIX Index, Volatility ETF Strategy, Volatility Modeling