Archives

All posts for the month March, 2015

Electrocomponents has a sales release on Thursday. Short interest is at extreme highs, its earnings trend is diverging from the price action, and the dividend does not appear to be covered by FCF. With the stock looking expensive in terms of valuation (EV/EBITDA, P/E etc.), this name may be a risky one to have in your long book.

I found ECM LN in the OTAS earnings positioning view for the FTSE 350, by looking for companies with an earnings/sales release this week, together with high days to cover. Mitie Group (MTO LN) was at the top (for those of you who got my Insights note on Mitie Group, note it is down 670 bps at pixel time). Electrocomponents was second. For any questions on how to do this please call or email.

[Chart]

The YTD price action and the 12m fwd EPS trendline appear somewhat at odds with each other, with the stock rallying despite earnings appearing flat. The stock currently trades at a 21% premium to consensus analyst price targets. Have all the analysts got this wrong?

[Chart]

8.0% of free float shares on loan equates to nearly 20 days to cover in terms of average daily volume. This is obviously a significant short squeeze risk, but have all the hedge funds got it wrong, with the short base growing alongside the share price all the way up so far this year?
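As a rough sketch of the arithmetic (the implied daily turnover below is inferred from the two quoted figures, not taken directly from the data):

\[
\text{days to cover} \;=\; \frac{\text{shares on loan}}{\text{average daily volume}} \;\approx\; \frac{8.0\%\ \text{of free float}}{\approx 0.4\%\ \text{of free float traded per day}} \;\approx\; 20\ \text{days}
\]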

[Chart]

Considering the consensus 12m fwd dividend yield estimate of 4.76%, have you noticed analyst predictions for FCF dividend cover in FY1 and FY2? Yes, that’s right, they are less than 1…
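For reference, FCF dividend cover is simply free cash flow per share divided by the dividend per share, so a reading below 1 means the dividend is not fully funded by free cash flow:

\[
\text{FCF dividend cover} \;=\; \frac{\text{free cash flow per share}}{\text{dividend per share}} \;<\; 1
\]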

[Chart]

And finally, despite the Chairman of the Board, Peter Johnson, recently buying stock, it is particularly interesting to observe that the Level A directors (i.e. those most senior at the company) backtest in the red over the medium term.

[Chart]

 

Systematic trading is a great way to practice financial statistics. However, it is not an easy way to make money.

The idea is alluring because it seems to depend only on being able to program and having a good idea, neither of which is especially hard. Unlike manual trading, a computer strategy is not affected by poor judgment or emotion, and it can process far more data than a human can, at least in terms of the sheer volume of numbers it can look at. Most importantly, its historical performance can be examined before any money is committed, which should at least indicate the performance it will realise once the strategy is live.

Although systematic trading has these advantages, it is best to start any research program with a realistic impression of the difficulties that will haunt it, which is why this post goes through a few of the problems you are likely to hit.

Orthonormal basis sets

Orthonormal basis sets are a sophisticated way to decompose market returns into trading signals. However, they’re not new and they’re not likely to work for you.

Before fast computers with good internet connections were commonplace, the people who had the capability to trade systematically really did find it comparatively easy to make money. Back then there were lots of correlations that were reasonably predictable and which could be monetized. The problem was that as more and more time and money went into the industry, they saturated the easier and then the not-so-easy strategies, and the opportunities went away. It is a general rule that if there are people who make money from a strategy, and they have enough capital, they’ll saturate the strategy until they can’t make more money by putting more capital into it. Then if someone else comes in, they obviously won’t make money off the strategy unless they’re better at exploiting it.

This leads to two inescapable conclusions: To make money from a particular strategy, you need to be either better than the other people looking at it, or the only person trying it. “Better” here means that you have better, faster data, more robust IT systems, better risk models and risk controls, better cost models and controls, and more experienced people helping you. In that case, you are playing a different game, and this article will not be as relevant to you.

For everyone else, your best hope is to pick a strategy that nobody else is using. This means either using new data that nobody else has, or using commonly available data in a new way.

New Data

Using new data is actually a good recommendation. There's as much data about as there is stuff in the world, and new data could be anything from the results of phone surveys to weather forecasts. You can even buy and use data on which private jet landed where, and put that into a strategy. The problem is that some data isn't that useful, some data is expensive, and some data is time consuming and fiddly to use. Some data would appear to be new, but on close examination might turn out to be surprisingly well used in other forms: news sentiment is popular at the moment, but it's wise to remember that it's giving you less data, not more, than the original news feeds, which are actively used for trading anyway. Following the holiday movements of CEOs is all very well, but there will be portfolio managers who know the CEOs personally, and they'll have far more information than a timeseries of airports.

But there is genuine potential in new data, if the data bears up to scrutiny.

Old Data

A well known approach is to look at price graphs to try to spot patterns that will help you guess where the price is going. But prices, as well as volumes, debt, earnings, profits, and many more data sets make up a common body of data that a very large number of quant houses have access to, and use as much as they can. For this kind of data, you need to do something that both (a) nobody else is doing with the data, and (b) has an economic reason for why it works (otherwise it'll just be a spurious correlation, and it'll die away).

(a) You should have reason to think that nobody is running the same strategy with the same data. The strategy should be truly original, and not just a really complicated way of doing what's been done before. A more complicated strategy might have lots of little catches, for instance behaving differently if the volume is high and the price is high, unless the company is heavily indebted. Or the complication might be mathematical sophistication. You could convolve hourly returns with vectors taken from the hydrogenic wave function, and regress the resulting series against the forward return. You might be right that few people have tried exactly that, but you shouldn't assume that the space of linear kernels applied to prices hasn't been thoroughly explored, because it has. Either way, just taking an obvious strategy and adding complexity to it probably won't make it all that different, yet it will make it harder to exclude spurious statistical signals: your search space will be larger, so the number of spurious false positives will be larger.
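To make that example concrete, the construction described is, roughly, a convolution of the return series with a fixed kernel, followed by a regression of the forward return on the resulting signal (the notation below is illustrative only, not a recipe from any particular strategy):

\[
s_t \;=\; \sum_{j=0}^{n-1} k_j\, r_{t-j}, \qquad r_{t \to t+h} \;=\; \alpha + \beta\, s_t + \varepsilon_t
\]

where \(r_t\) are the hourly returns, \(k\) is the chosen kernel and \(h\) is the forward horizon.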

Combining several data sets is another way to come up with something new, but the same effect that means that there are explosively more strategies out there using multiple data sets also makes it explosively harder to be sure that your combination isn’t spurious. And the economically obvious combinations will all be extensively looked at already, as well as a fair few which aren’t economically obvious.

(b) The strategy has to be economically rational. If there is no reason behind it, it won't work. Prices move because of information and people trading; there is no secret black box somewhere containing a weird set of rules waiting to be uncovered. If prime-numbered days of the year always seem to have positive returns, that is only a coincidence. If you could get complete statistical certainty that an economically unusual strategy works, then there would be reason to believe it works; but if that happens, it is probably because the statistics are messed up, because it is very difficult to get statistical certainty in finance. Achieving statistical confidence in finance will be the topic of another blog post, and it matters a great deal; in some ways it is the main reason that it is hard to find new working strategies on old data. It might be argued that a given set of data contains only a small number of orthogonal strategies that can be shown to work with statistical significance, and as you open up your search space, you will find rehashes of simpler strategies along with more and more spurious ones.

It’s not all bad though:

This has described the initial difficulties in finding a data set on which to base a good strategy. There are factors which make it slightly less bleak than it could be:

1) There are lots of potential strategies that won’t be fully exploited if they’re too small for the big fish to care about. A strategy that only makes a few tens of thousands of pounds a year won’t be interesting to many quant houses.

2) The turnover is high: People come and go. Hedge funds come and go. People will stop looking at a given strategy for more reasons than just that it doesn’t make money any more.

3) Quants don’t talk to each other as much as people do in other industries. And that guardedness means that there isn’t a vast body of knowledge on which a new hedge fund can build. This is good news for small fish, since it means that they’re not starting the race with much of a handicap, at least intellectually.

Conclusion

In conclusion, systematic strategies have to be based on either new data or genuinely new methods, both of which pose their own problems. This is subtly different from good old human decision-making. People have access to a wealth of information not available to a computer: they have seen the new smartphone with the same eyes that will choose whether or not to buy it, and so determine whether it will be a success. People also understand the underlying factors that make a company successful far better than any current computer system. It is very likely that most good human stock pickers have a genuine economic insight that, when informed by the right data and analytics, is both rational and reasonably distinctive. The crucial difference is that they use a much more diverse set of data for each decision, and they do not combine that data on the basis of statistical inferences between the data and previous price movements, but rather by understanding the significance of each data point in its own right.

Hugo Martay

It’s hard to find luxury at a bargain, even more so in a QE environment, but perhaps Switzerland’s Richemont is showing early signs of sector-relative value in a sector that already seems too expensive…

[Chart]

  • Sector-relative valuation flagging as inexpensive; historically this has offered a good inflexion point in subsequent price action
  • EPS downward revisions have been getting less and less negative over the previous 3 months; price now falling back in line with the longer-term earnings trend
  • Dividend looking attractive vs. where the yield usually lies relative to the sector (also decent in absolute terms at 2%)
  • Signals are supportive, with two technical signals having fired recently, both with over 67% historic reliability

EPS Momentum

The trend in 12m fwd EPS since 2010 shows that the stock price tends to trade around its earnings estimates. The stock is currently a consensus buy, at an 11% discount to consensus analyst price targets.

[Chart]

Stacked Graph

Observe the subsequent price action every time the sector-relative P/E has touched its lower bounds.

[Chart]

Technical Signals

Stock is looking oversold on all Stochastic measures.

[Chart]

Two reliable technical signals have fired recently, a Fast Stochastic (2.8% average return over 20 trading days, 68% reliability) and a Bollinger Band (4.0% average return over 20 trading days, 68% reliability).

[Chart]

Implied Volatility

At 77% the put ratio is at its statistical highs, indicating people still want downside protection.

[Chart]

Out-of-the-money calls look cheap at the moment, which is interesting if you don't want to own the stock outright.

[Chart]

 

This stock came out of earnings in rather bad shape, and has since diverged notably from the price performance of European retailers, so it is interesting to note that short interest has come down significantly in the last 5 days and the stock is flagging up as a potential sector-relative value opportunity…

[Chart]

Risk sentiment indicators

Although short interest is low in absolute terms, it is interesting to note that it has come down significantly in the last 5 days and over the last 1 month (making this notably divergent from the 1 month negative price action).

[Chart]

Valuation Page

Home Retail Group is arguably starting to look attractive in terms of valuation, being at its recent lows in absolute terms, vs. the FTSE All Share and vs. the retailing space.

[Chart]

Stacked Graph page

Note how such lows in sector-relative valuation have historically offered an inflexion point in terms of price action (in both cases the stock was below its 50D and 200D moving averages).

[Chart]

EPS Momentum Page

Recent price action looks overly severe given the downward revisions in 12m fwd EPS…does the market expect more downgrades to follow? The stock currently trades at a 12% discount to consensus sell side analyst price targets.

[Chart]

Performance Page

We see clearly from the spread chart, on a 2 year view, that such underperformance versus the Stoxx Retail sector has historically offered an inflexion point.

[Chart]

Dividend Page

However, very interesting that the FCF dividend cover is forecast to be so low in FY1 and FY2…could this be a potential risk?

[Chart]

Nike was a strong outperformer vs. its industry group on Friday following good earnings, in which sales were strong despite concerns about the stronger dollar. However, OTAS is highlighting that this current rally may not have legs.

EPS Momentum has been negative over the past month and past week, at -3.01% and -2.8% respectively. The stock has recently re-rated vs. the 12M Forward EPS:

[Charts: NKE 12M forward EPS]

Additionally, the 12M Forward P/E of 26.4x is stretched and is at a 41% premium to the industry average, more than 2 standard deviations above normal levels:

[Charts: NKE valuation stamp, absolute valuation and valuation vs. industry]

Although a technical signal fired on Friday, this standard overbought signal has worked only 61% of the time, returning a positive 1.13% over the next 20 days. However, as we can see, the stock has been in overbought territory for some time:

[Charts: NKE signal, signal backtest and Fast Stochastic]

The options market is beginning to price in negatives as the Downside Skew increased sharply on Friday and call option activity suggests limited upside to the stock:

[Charts: NKE implied volatility, downside skew, put ratio and top ten volume]

Beat a man with the strength of your argument, not with the strength of your arm.

– Oliver Wendell Holmes

The much-quoted Merrill Lynch survey shows that pro-European sentiment is at a high compared with the US, which is something that the OTAS Core relative value chart shows clearly.

[Chart]

The STOXX 600 may trade at a lower PE than the S&P 500, but comparing the relative rating to history shows that it does not get more expensive than this. Of course, that history predates the ECB's commitment to a more expansionary monetary policy than the Fed and the point at which large swathes of the European bond market began offering negative yields. However, make no mistake, this is poised to be a monumental arm wrestle between those buying the principle of Central Bank omnipotence and those who are going along for the ride, while watching and waiting for the signs of a reversal.

The option market tells us that this arm wrestle is going to be a drawn out affair. Implied volatility on the STOXX 600, which is a measure of how much the index is expected to have moved three months hence, is almost exactly the same as for the S&P 500, at +/-8½%.

[Chart]
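As a sanity check on that figure (the annualised number below is inferred from the quoted ±8½%, not read off the chart): implied volatility is normally quoted on an annualised basis, and under the usual square-root-of-time scaling a three-month move is about half the annualised figure:

\[
\sigma_{3\text{m}} \;\approx\; \sigma_{1\text{y}} \sqrt{\tfrac{3}{12}} \;\approx\; 17\% \times 0.5 \;=\; 8.5\%
\]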

Volatility is independent of direction, so in principle everyone exposed to Europe through the option market might think that the STOXX 600 components are going up, while all those exposed to S&P 500 stocks believe that they will fall. But I doubt it.

The stock lending market provides another potential guide. Short interest in European large caps is half the level of that in the US. The largest impact on the borrow market is the European dividend paying season and the avoidance of withholding tax facilitated by stock lending.

[Chart]

Those with access to OTAS can fire up this analysis and hover over the chart to see how relative short interest in March 2015 compares with March in each of the past several years. I won't spoil the surprise.

The final comparison is between the average CDS in Europe and North America. The cost of insuring credit in Europe has declined rapidly relative to the US for the past 1¾ years and is now almost at parity.

[Chart]

European corporate debt has been cheaper to insure than US in the past, but not by much and not for long. One wonders how negative European bond yields will have to go to change the perception of companies in sclerotic Europe relative to dynamic US. Or of course, how inflexible the US will have to become.

The Power of Argument

OTAS is a decision support tool and the investment decision that we are supporting today is whether to continue with the free ride in European equities at the expense of the US. The relative valuation suggests caution, while the option market indicates that you have some time to decide, allowing you to continue with your favourite stock positions through Q2.

The stock borrowing market suggests that you should pay attention to shares that have very low short interest, which may indicate an inflection point. However, watch out for that pesky dividend paying season in Europe, which is conveniently highlighted in OTAS stock analysis, informing you of when you might disregard stock borrowing data.

Finally, the credit market tells you that if lower relative interest rates are going to support continued outperformance in Europe, then we are entering unprecedented times when European companies are better credits than US counterparts. The Fed may influence this decision by removing certain wording from its statements, but for how long? In the end you have to choose whether you have a better chance of being repaid by the likes of Cablevision or ArcelorMittal, two companies chosen at random from either side of the Atlantic that just happen to have very similar costs of credit.

[Charts]

US retail sales have been down for three straight months and are falling at the fastest pace since Lehman collapsed. This is being reflected in declining EPS Momentum across the sector and in an expensive valuation for retailing stocks.

[Chart]

EPS Momentum in the sector is -0.9% and the rate of decline is accelerating. This is in contrast to the market as a whole, as EPS Momentum across the S&P 500 is -0.13% and has improved in the past month. This is in keeping with previous blogs about the effect of the price of oil.
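For readers new to the metric, EPS Momentum is, broadly, the percentage change in the consensus 12 month forward EPS estimate over the stated period (this is a working description; the precise OTAS definition may differ in detail):

\[
\text{EPS Momentum}_{1\text{m}} \;\approx\; \frac{\text{consensus 12m fwd EPS}_{t}}{\text{consensus 12m fwd EPS}_{t-1\text{m}}} \;-\; 1
\]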

At 17.7x 12 months forward earnings, the retail sector is above the long run average valuation of 14.7x and hovering around 2007 and 2010 peak levels. The Divergence chart for the sector, showing price moves and recent estimate changes at component stocks, illustrates how few companies are experiencing upgrades, with about the same number having downgrades of at least 25%.

[Chart]

The problem with retail is that a lot of businesses have an implicit reliance on inflation to protect margins. These businesses push through annual price increases and hope to have to pay suppliers no more extra than this. In a deflationary environment, these firms are going to come unstuck very quickly.

There are of course plenty of retailers that are used to battling declining product prices, or have a business model built on delivering sustainable efficiency gains. In the last month there have been double digit upgrades for stocks such as Wayfair and American Eagle Outfitters, while even Amazon has broken a long run of downgrades to record an improved outlook in the last five days. That does not do much for its excessive valuation, but nonetheless it is a welcome boost at a company that is used to living with often self-imposed pricing pain.

Five stocks across the sector flag as having a relative valuation at extreme lows compared with peers, the largest of which is Tiffany. The stock is trading at lows only exceeded when the world was reflecting upon Lehman’s collapse. The Stacked Graph below shows that low relative valuation for Tiffany has in the past coincided with absolute share price lows.

[Chart]

Remember you can generate your own overnight report on Retailing, and any other sector or market, using OTAS Lingo.

A debate is raging, as NASDAQ reaches a 15 year high, as to whether stocks are expensive, because prices have dislocated from the trend in earnings, or cheap, because the last time NASDAQ was at this level the valuation was in triple digits. One way to make NASDAQ look cheap is to compare its constituents with European stocks.

The 12 months forward PE of 21x is as high as it has been since July 2007.

[Chart]

However, compared with the S&P 500, NASDAQ’s valuation is in the middle of the historic range and recovering from depressed levels as recently as October 2014.

[Chart]

The S&P is at its highest valuation in years and is at extreme levels compared with the average of the past decade. One reason for this is depressed earnings, which we discussed here as being a function of low oil prices, which have since bottomed out. The EPS Momentum of the North American Energy sector is -18%, but it is improving from where it was last month. At the same time EPS Momentum across the S&P has also improved, although remains negative at -0.5%.

Over in Europe, which is on the cusp of unleashing QE, valuation is also at extremes, while EPS Momentum is -0.25%.

[Chart]

Compared with the US market, the relative valuation of 0.93x is at high, but not extreme levels, having performed a dive and subsequent recovery between September 2014 and January 2015.

[Chart]

Comparing NASDAQ to Europe, the former looks attractively valued, as a relative PE of 1.26x is below the long term average of 1.34x and well below the exceptionally elevated levels of 1.44x and above.

[Chart]
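Putting the two quoted numbers together (the European multiple below is implied, not quoted directly): with NASDAQ on roughly 21x forward earnings, a relative PE of 1.26x puts the European comparator on about

\[
\frac{21\text{x}}{1.26} \;\approx\; 16.7\text{x}
\]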

So while NASDAQ looks fairly expensive on an absolute basis, on relative measures this is not the case (by using median values we are not skewing averages with Apple's overwhelming market cap). Seven stocks among the index's 100 constituents are flagging as exceptionally lowly valued relative to their respective sectors right now, and among these SBA Communications has positive EPS Momentum and had a backtested technical signal fire on last night's close. When this signal has fired in the past, the stock has risen an average 7% over the next 20 trading days 68% of the time.

[Chart]

This post discusses my experiences as a software engineer at OTAS Technologies, developing almost exclusively in Haskell for the past six months. Prior to this I’d learnt about and used functional languages solely in academia, in the context of type theory, with little development experience. Here are my views on some pretty fundamental issues I deal with day-to-day.

Tooling

The Haskell toolchain is not simple. Certainly, it took me several weeks before I adopted the system I currently use, although happily that system works well. The Haskell Platform ships with GHC and Cabal. Cabal is the Haskell build system and dependency manager, and it is at once powerful and fragile. Usage of Cabal is well documented and its operation is not complicated, and yet many resources and tutorials do not stress the importance of sandboxes in Haskell development.

Sandboxes provide a mechanism to isolate the build process of individual Haskell projects. By default Cabal will use a shared (global) package database for storing compiled packages, the dependencies of your library or program. However, sharing compiled packages leads to breakages when their dependencies are updated, an affliction known as `Cabal hell'. Sandboxes solve this issue by isolating (sandboxing) compiled dependencies per project, at the expense of vastly longer setup times, as all dependencies need to be re-downloaded and built. However, their use is crucial in Haskell development and I'm unsure why this isn't enforced by Cabal.

Isolated build environments with sandboxes are half the story of reliable Haskell development. It is also necessary to future-proof your packages, ensuring upgrades to dependencies break neither your package nor another dependency. `Stackage' is a Hackage alternative that provides restrictions on packages, ensuring they are mutually compatible. By restricting your projects to the package dependencies specified by a Stackage snapshot you guarantee your package remains buildable. Certainly I make use of these snapshots in all my Haskell projects. I have not yet needed to upgrade a snapshot to use a newer version of a dependency, although I anticipate the process will be simple. Certainly Stackage enables me to upgrade dependencies in a controlled fashion.

I currently advise Haskellers to install the Haskell toolchain local to their user, and not through an OS-level package manager. The reason for this is twofold. Firstly, package managers frequently have an outdated version of GHC. Secondly, package managers also provide packages of popular Haskell libraries, which are then installed to the global package database. This interferes with the sandbox approach described above, and the two are not compatible.

Note that I have not explored packaging solutions such as Nix, as I find my current approach satisfactory, although I will be interested in seeing how the project develops.

Libraries

The Haskell ecosystem includes some fantastic software, available for instant use as libraries served by Hackage. Spend enough time using Hackage and you become familiar with many of the authors! A key skill when developing Haskell projects is identifying libraries that match your needs, the architecture of your program, and your other dependencies. Many people would expect the functionality provided by some of these libraries to be shipped with the compiler (`random’ and `time’ come to mind..). However, I see it as an advantage that core functionality is spread across many independent developers, as features are developed and pushed faster. The ecosystem certainly moves fast!

Certain libraries provide types and combinators that seemingly alter the very language. The prototypical example is Edward Kmett’s `lens’ library, which provides an extremely general, elegant and powerful framework for navigating data structures. Usage of the lens library is straightforward, but the underlying mathematical concepts are smart, and very interesting to learn about.
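To give a flavour of what this looks like in practice, here is a minimal sketch (the Person and Address types and their field names are invented for illustration, not taken from any OTAS code):

```haskell
{-# LANGUAGE TemplateHaskell #-}

import Control.Lens

-- A small nested structure that would be tedious to update by hand.
data Address = Address { _city :: String, _postcode :: String } deriving Show
data Person  = Person  { _name :: String, _address :: Address } deriving Show

-- Template Haskell generates a lens for each field prefixed with '_'.
makeLenses ''Address
makeLenses ''Person

alice :: Person
alice = Person "Alice" (Address "London" "EC1A 1BB")

main :: IO ()
main = do
  -- Read a nested field by composing lenses.
  putStrLn (alice ^. address . city)
  -- Return an updated copy, leaving everything else untouched.
  print (alice & address . postcode .~ "N1 9GU")
```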

It takes experience to understand the gravity of your choice of library. There are libraries which provide much the same functionality (pipes and conduit, warp and happstack etc.) but are architected in vastly different ways, such that converting between the two is far from trivial. For example, the lens library forms the sole interface to my OTAS Base wrapper, and is so heavily used that upgrading the dependency version requires careful consideration.

Using Haskell day-to-day

Haskell development is, frankly, awesome. When I describe my job, I frequently say I’m `living the programming dream’ (having the autonomy to choose my software tools and having full control over my projects are two reasons for my happiness, too). Programming in a purely functional language allows you to produce smarter, more correct code, period. The type system forces you to request and record the behavior of your function in its type. If your function needs access to a read-only environment, wrap its type in the Reader transformer. Need to log? Use the Writer monad. If you need to characterise failure, incorporate an Error type into your stack.
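As a rough illustration of the style (the Config, AppError and checkSignal names are made up for this sketch, which uses the standard mtl classes rather than any particular in-house stack):

```haskell
import Control.Monad.Reader (ReaderT, runReaderT, asks)
import Control.Monad.Writer (WriterT, runWriterT, tell)
import Control.Monad.Except (ExceptT, runExceptT, throwError)

data Config   = Config { threshold :: Double }
data AppError = BadInput String deriving Show

-- The type spells out what the computation may do: read a Config,
-- log Strings, fail with an AppError, and perform IO underneath.
type App a = ReaderT Config (WriterT [String] (ExceptT AppError IO)) a

checkSignal :: Double -> App Bool
checkSignal x = do
  limit <- asks threshold
  tell ["comparing " ++ show x ++ " with " ++ show limit]
  if x < 0
    then throwError (BadInput "negative signal")
    else return (x > limit)

main :: IO ()
main = do
  result <- runExceptT (runWriterT (runReaderT (checkSignal 1.5) (Config 1.0)))
  print result  -- Right (True,["comparing 1.5 with 1.0"])
```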

Because this correctness and clarity stems from restricting the behaviour of your functions, you do need to think carefully about what exactly they need to do. In particular, adding a feature to the software can often require a fundamental change in the monad transformer stack at the heart of your program. Therefore it is imperative you carefully consider the direction the project will take even before you begin development, and always consider the implications of the types you choose.

One source of difficulty for me is knowing to what extent I should generalise my types. This is often the case with monadic stacks based on IO. Annoyingly, core IO actions provided by the Haskell Prelude are of a concrete IO type (e.g. putStrLn “Hello, world!” :: IO ()), and one must explicitly import and use `liftIO’ from the `MonadIO’ typeclass when performing IO actions deeper in the stack. Handling exceptions requires similar machinery. Ideally, all IO actions would utilise generalised typeclasses such as MonadIO, `MonadBase’ and `MonadBaseControl’ as fully general frameworks for exception handling and IO actions. However, prematurely generalising your types complicates development, and furthermore many libraries do not currently support the most general interface possible. In practice fully general types lead to much cleaner code, although reasoning about the types and debugging becomes more difficult.
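A small sketch of the difference (logLine is an invented helper, not a real library function):

```haskell
import Control.Monad.IO.Class (MonadIO, liftIO)

-- Concrete: this version can only be called directly in IO.
logLineIO :: String -> IO ()
logLineIO msg = putStrLn ("log: " ++ msg)

-- Generalised: this version can be used unchanged at any depth of a
-- transformer stack whose base monad is IO.
logLine :: MonadIO m => String -> m ()
logLine msg = liftIO (putStrLn ("log: " ++ msg))

main :: IO ()
main = logLine "hello"  -- IO itself is an instance of MonadIO
```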

Finally, I’m beginning to notice limitations in the language whose removal would greatly improve some aspects of development, although it seems unlikely they will see a solution in the near future. Foremost among these is Haskell’s record system. I would like to see a lens-based anonymous record system implemented in GHC, which would solve the overloaded records problem and enable `first class’ record manipulations. Alas, this would be a new dialect of Haskell! Additionally, the Prelude supports many outdated and inefficient concepts deemed beginner-friendly (the String type, for example), which are only now being improved upon (the applicative-monad proposal and the foldable-traversable proposal are two key imminent changes).

Certainly though, my experience with Haskell has been extremely positive. I’d be skeptical of anyone disregarding Haskell for any problem domain due to performance (except perhaps embedded or OS-level software) or through lack of libraries. Certainly at OTAS, barring exceptional circumstances I’ll be encouraging all future projects to be implemented in Haskell.

On the back of the NASDAQ breaching the 5,000 level today, we take a closer look at valuations and Director Dealings.

The market has enjoyed a rally over the past 2 months, which has pushed P/E levels higher. The 12M Fwd P/E is well above the 2 standard deviation line, showing that the current level of 21.2x forward earnings is stretched on a 2 year view.
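For clarity, the 2 standard deviation line referred to here is the upper band of the trailing valuation history over the two year window (a working description of the band, not an OTAS formula):

\[
\text{upper band} \;=\; \overline{P/E}_{2\text{y}} \;+\; 2\,\sigma_{P/E,\,2\text{y}}
\]

so a 21.2x forward multiple sitting above that band is more than two standard deviations above its recent average.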

[Charts]

Maxing out to 2006, we see these levels were last seen heading into the financial crisis:

[Chart]

What makes this analysis more interesting is that there was record insider buying this past February. In fact, it is only the second spike of this size since 2006:

[Chart]

Sorting the Director dealings page for buys over the past month, AMGN, FAST, GILD, GMCR, MDLZ, and YHOO are brought to the top of the list:

[Chart]

Highest February on a 2 year time frame:

[Chart]

Maxing out to 2006 shows that the only other notable spike in insider buys was in July 2012:

[Chart]