Friday, October 14, 2011

Trading Mean Reversion with Augen Spikes

One of the more interesting things I have come across is the idea of looking at price changes in terms of recent standard deviation, a concept put forward by Jeff Augen. The gist is to express a close-to-close return as a multiple of the standard deviation of recent price moves. If returns were normally distributed, you would expect moves of less than 1 standard deviation approximately 68% of the time. It's probably no surprise that actual returns don't appear to be normally distributed, so larger spikes occur more often than the normal distribution predicts.
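As a quick illustration of the idea (my own sketch on simulated data, not Augen's code), the following expresses each day's price change in units of the trailing 20 day standard deviation of log returns, then counts how often the move stays inside 1 standard deviation:

```r
# Minimal self-contained sketch: price changes in units of trailing sd.
set.seed(42)
price <- cumprod(c(100, exp(rnorm(499, sd=0.01))))  # simulated random walk
logret <- c(NA, diff(log(price)))

# trailing 20 day standard deviation of log returns
trailsd <- rep(NA, length(price))
for (i in 21:length(price)) {
    trailsd[i] <- sd(logret[(i-19):i])
}

# lag one day so the deviation is known before the move occurs
stdpr <- c(NA, (price * trailsd)[-length(price)])
spike <- c(NA, diff(price)) / stdpr

# for normally distributed returns this is close to 0.68;
# on real market data it tends to come out higher (fat tails)
mean(abs(spike) <= 1, na.rm=TRUE)
```

On simulated normal returns the fraction lands near the theoretical 68%; running the same calculation on real closes is what reveals the excess of large spikes.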

The plan

Augen uses it to gain an edge in detecting where option volatility is mispriced, but I believe the concept is useful in other areas too. I originally thought it might work as a volatility filter by highlighting "abnormal" spikes in the VIX, but I've yet to have much success with that. Another idea was that it might be useful for swing trading in these mean-reverting times, which turned out to be more fruitful.

The basic signal is to go short when an upwards spike greater than 1 standard deviation occurs, and to go long when a downwards spike greater than 1 standard deviation occurs (i.e. spike values greater than 1 or less than -1 respectively). For reference I compared it to trading daily mean reversion (going short after an up day, going long after a down day), and to fading RSI(2) extremes of 90/10.

How'd it go?



Overall it performed well, with the caveat that it was only effective in markets where mean reversion strategies in general are effective. Interestingly, the default parameters of a 20 day standard deviation with a 1 day lookback have outperformed both daily mean reversion and RSI(2) over the last few years, both of which have fallen a bit flat during the same period.

Scaling into positions based on the size of the spike also seemed to be effective in producing good returns in absolute terms.

Final thoughts

Like most mean reversion strategies it has an abysmal backtest if you start from the 1970s. I am still fascinated by the regime change from follow through to mean reversion that took place around 1999/2000, which Marketsci has covered in some depth. Maybe one day I will figure it out.
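One simple way to visualise that regime change (an illustrative sketch of mine, not Marketsci's method) is a rolling lag-1 autocorrelation of daily returns, reusing the slideapply function defined below: sustained positive readings favour follow through, sustained negative readings favour mean reversion.

```r
# Rolling one-year lag-1 autocorrelation of daily SPY returns.
# Requires the slideapply helper defined later in this post.
require(quantmod)

getSymbols("SPY", from="1994-01-01")
r <- as.vector(na.omit(ROC(Cl(SPY))))
ac <- slideapply(r, 252, function(x) cor(x[-1], x[-length(x)]))

plot(ac, type="l", main="Rolling 1yr lag-1 autocorrelation, SPY")
abline(h=0, lty=2)
```

The flip from mostly positive to mostly negative readings around the turn of the century is visible on this chart.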

A comparison of the equity curves is below, along with the code. I think this is a very useful way of looking at price changes, and will continue to investigate ways it can be put to good use.


require(quantmod)
require(PerformanceAnalytics)

# Apply FUN over a trailing window of n observations; leading values are NA.
slideapply <- function(x, n, FUN=sd) {
    v <- rep(NA, length(x))
    for (i in n:length(x)) {
        v[i] <- FUN(x[(i-n+1):i])
    }
    return(v)
}

# Express the k-day price change as a multiple of the trailing n-day
# standard deviation of k-day log returns (the "Augen spike").
augenSpike <- function(x, n=20, k=1) {
    prchg <- c(rep(NA, k), diff(x, k))
    lgchg <- c(rep(NA, k), diff(log(x), k))
    stdevlgchg <- slideapply(lgchg, n, sd)
    stdpr <- x * stdevlgchg
    # shift forward one day so the deviation is known before the move
    stdpr <- c(NA, stdpr[-length(stdpr)])
    spike <- prchg / stdpr
    return(spike)
}

perf <- function(x) { print(maxDrawdown(x)); table.AnnualizedReturns(x) }
retsum <- function(x) { return(exp(cumsum(na.omit(x))))}

getSymbols("SPY", from="2000-01-01")
spy <- SPY["2000/2011"]
spy$up_day <- ifelse(ROC(Cl(spy)) >=0, 1, 0 )
sig <- Lag(ifelse(spy$up_day == 1, -1, 1))
ret <- ROC(Cl(spy)) * sig
dmr_eq <- retsum(ret)
plot(dmr_eq)
perf(ret)
#[1] 0.3217873
#                          SPY.Close
#Annualized Return            0.1131
#Annualized Std Dev           0.2203
#Annualized Sharpe (Rf=0%)    0.5137

spy$rsi2 <- RSI(Cl(spy), 2)
sig <- Lag(ifelse(spy$rsi2 < 10, 1, ifelse(spy$rsi2 > 90, -1, 0)))
ret <- ROC(Cl(spy)) * sig
rsi_eq <- retsum(ret)
plot(rsi_eq)
perf(ret)
#[1] 0.1819428
#                          SPY.Close
#Annualized Return            0.0963
#Annualized Std Dev           0.1198
#Annualized Sharpe (Rf=0%)    0.8038

aus <- augenSpike(as.vector(Cl(spy)), k=1)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, -1, ifelse(spy$spike < -1, 1, 0)))

ret <- ROC(Cl(spy)) * sig
k1_eq <- retsum(ret)
plot(k1_eq)
perf(ret)
#[1] 0.1379868
#                          SPY.Close
#Annualized Return            0.1066
#Annualized Std Dev           0.1152
#Annualized Sharpe (Rf=0%)    0.9256

aus <- augenSpike(as.vector(Cl(spy)), k=2)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, -1, ifelse(spy$spike < -1, 1, 0)))

ret <- ROC(Cl(spy)) * sig
k2_eq <- retsum(ret)
plot(k2_eq)
perf(ret)
#[1] 0.2134826
#                          SPY.Close
#Annualized Return            0.1091
#Annualized Std Dev           0.1433
#Annualized Sharpe (Rf=0%)    0.7608

aus <- augenSpike(as.vector(Cl(spy)), k=1)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, (-1 * spy$spike), ifelse(spy$spike < -1, abs(spy$spike), 0)))

ret <- ROC(Cl(spy)) * sig
k1_scaled_eq <- retsum(ret)
plot(k1_scaled_eq)
perf(ret)

Sunday, October 2, 2011

Jeff Augen Volatility Spike Code in R

[Update: I have updated this so the number of days used for standard deviation can be passed as a parameter, you can find the code at Trading Mean Reversion with Augen Spikes ]

Jeff Augen has written many excellent books on options trading, including The Volatility Edge in Options Trading, in which he presents a novel way of looking at a security's price movement as a function of its recent standard deviation.


I believe it's a very useful way of looking at price moves, so I implemented the following, which I believe matches the description in the book.


slideapply <- function(x, n, FUN=sd) {
    v <- rep(NA, length(x))
    for (i in n:length(x)) {
        v[i] <- FUN(x[(i-n+1):i])
    }
    return(v)
}

augenSpike <- function(x, n=20) {
    prchg <- c(NA, diff(x))
    lgchg <- c(NA, diff(log(x)))
    stdevlgchg <- slideapply(lgchg, n, sd)
    stdpr <- x * stdevlgchg
    # shift forward one day so the deviation is known before the move
    stdpr <- c(NA, stdpr[-length(stdpr)])
    spike <- prchg / stdpr
    return(spike)
}

An example of how to use it with quantmod:


getSymbols('SPY')

sp <- SPY['2010/2011']
asp <- augenSpike(as.vector(Cl(sp)))
sp$spike <- asp
barplot(sp['2011']$spike, main="Augen Price Spike SPY 2011", xlab="Time Daily", ylab="Price Spike in Std Dev")


Which gives the following chart.

If you want to verify it has been implemented correctly (and I won't hold it against you), I used the following, which is based on the example data he gave in the book. You will need the slideapply function from above, which applies a function to a vector along a sliding window.



aub <- data.frame(Close=c(47.58, 47.78, 48.09, 47.52, 48.47, 48.38, 49.30, 49.61, 50.03, 51.65, 51.65, 51.57, 50.60, 50.45, 50.83, 51.08, 51.26, 50.89, 50.51, 51.42, 52.09, 55.83, 55.79, 56.20))
aub$PriceChg <- c(NA, diff(aub$Close))
aub$LnChg <- ROC(aub$Close)  # ROC comes from TTR, loaded with quantmod
aub$StDevLgChg <- slideapply(aub$LnChg, 20, sd)
aub$StdDevPr <- aub$Close * aub$StDevLgChg

# shift forward one day, as in augenSpike
pr <- c(NA, aub$StdDevPr[-length(aub$StdDevPr)])
aub$Spike <- aub$PriceChg / pr
aub



Which for me at least gives the same data as printed. Let me know if you find it useful or find any errors. 




Adding a volatility filter with VIX

We saw in the basic system how we could add a factor, namely the 200 day moving average, to improve the overall performance of our system. You could spend a lot of time playing with different moving averages, and different combinations of crossovers if you are so inclined, but it's fairly easy to see they only work well in strongly trending markets.


Instead of looking for further optimisation through price, what other factors might be of use in improving risk adjusted returns? And, more importantly for now, how can we represent them in R?


For this example I will use the VIX as a proxy for overall market volatility. When the VIX is high (for some definition of high), uncertainty reigns, and for a long only system it's probably better to wait it out. We will quantify this as follows: when the VIX is under its 50 day moving average, volatility is low enough to risk our equity in the hope of gains.


Implementing this in R is quite straightforward: we just generate a second lagged signal vector and take its product with the 200 day moving average signal vector.


The results are better. The extra factor improves risk adjusted return, though on the whole the system still isn't something I would put my own money into. At the very least it clearly gives a better result than buy & hold. Hopefully you can see the benefit of researching orthogonal factors as inputs.


As an aside, you could think of the work by Mebane Faber as introducing additional factors through the use of different asset classes and the relative performance of each. A relative performance filter, plus a price based filter like the 200 day moving average, provides very solid overall performance. Looking for different factors you can model and use is probably going to be more fruitful than testing, say, the 250 SMA against the 200 SMA. There is only so much any one factor can give.
require(quantmod)
require(PerformanceAnalytics)

getSymbols(c('SPY', '^VIX'), from='1999-01-01')
SPY$ma200 <- SMA(Cl(SPY), 200)
VIX$ma50 <- SMA(Cl(VIX), 50)
spy <- SPY['2000/2011']
vix <- VIX['2000/2011']

# long only when SPY is above its 200 day MA...
sig <- Lag(ifelse(Cl(spy) > spy$ma200, 1, 0))
# ...and the VIX is below its 50 day MA
vix_sig <- Lag(ifelse(Cl(vix) < vix$ma50, 1, 0))
vf_sig <- sig * vix_sig
vf_ret <- ROC(Cl(spy)) * vf_sig
vf_eq <- exp(cumsum(na.omit(vf_ret)))

maxDrawdown(vf_ret)
#[1] 0.1532796
table.AnnualizedReturns(vf_ret)
#                          SPY.Close
#Annualized Return            0.0084
#Annualized Std Dev           0.0757
#Annualized Sharpe (Rf=0%)    0.1110
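
To make the Faber-style aside above concrete, here is a hedged sketch of a relative performance filter combined with the 200 day moving average. The choice of IEF as the bond leg and the 126 day (six month) lookback are my illustrative assumptions, not Faber's actual rules.

```r
# Hold SPY only when it both outperforms bonds over the trailing six
# months and sits above its 200 day moving average; otherwise stay in cash.
require(quantmod)
require(PerformanceAnalytics)

getSymbols(c('SPY', 'IEF'), from='2003-01-01')
SPY$ma200 <- SMA(Cl(SPY), 200)

# trailing 6 month (126 day) relative performance of stocks vs bonds
rel <- ROC(Cl(SPY), 126) - ROC(Cl(IEF), 126)

# relative performance filter AND price filter, lagged to avoid lookahead
sig <- Lag(ifelse(rel > 0 & Cl(SPY) > SPY$ma200, 1, 0))
ret <- ROC(Cl(SPY)) * sig

maxDrawdown(na.omit(ret))
table.AnnualizedReturns(na.omit(ret))
```

A fuller treatment would rotate into the bond leg rather than cash, but even this simple version shows how the two factors compose as signal vectors.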