
Why Interest Rates Will Asymptotically Approach Zero Forever (Part 2 – Specifics)


by Srikant Krishna (sri@srikantkrishna.com)

In the previous post we set the stage following the largest credit (read: debt) expansion bubble in the history of the world. Affluence truly seemed both eternal and contagious, changing lives and standards of living around the globe. Central banks mistakenly believed that they had firm control of interest rates, and through them the price of time, and, by extension, a general mechanism for manipulating economic growth.

But in reality, they were absurdly incorrect. The global financial crisis has many causes at its roots, but the culpability can be assigned to two very basic human characteristics: incompetence and greed. While politicians all over the world were making promises that simply could not be funded, primary dealers of bonds were more than happy to oblige by financing deficits across many developed nations. And central bankers, ostensibly a component of the incestuous revolving-door apparatus that conjoins politics, academia, policy institutions, central banking, and investment banking, suffered from exceptionally pathological delusional incompetence.

As individuals, corporations, and finally governments attained the economic analog of the Malthusian carrying capacity in terms of debt and the ability to service it, the global credit and financing systems began to collapse, and with them a massive, indiscriminate deleveraging across the board produced what is known as the Global Financial Crisis. After all, a debt-based monetary system requires both the principal and the interest to be repaid, or in other words, a perpetually exponential growth in the issuance of credit. If, for whatever reason, endogenous (war, depression) or exogenous (natural disaster), the credit issuance mechanism were to be impeded, the funds required to pay the interest component of debt service would not be available (and perhaps neither would the principal), and the entire system deflates back to reality through an endless series of defaults.

During this period, U.S. Treasurys attained extremely low yields, and remained that way for the subsequent years. Investors felt the safest instruments to park their assets were U.S. Treasurys, backed by the full faith and credit of thousands of nuclear warheads, hundreds of foreign bases, and a very aggressive and cogent policy, driving the yields down on these instruments. Additionally, coordinated central bank action across the globe resulted in massive backstopping of otherwise insolvent financial institutions through a combination of mechanisms including asset purchases, repos, and lowering of both the EFF and discount window rates in the United States and their equivalents abroad.

Europe, however, was beginning to fall apart at the periphery (even though the core itself had been rotting for quite some time). Some modicum of recovery finally seemed to occur in the summer of 2009, but it was short lived, ending with precisely the Greek chaos that ensued in the spring and summer of 2010. The situation continued into 2011, with the collateral damage including the ablation of MF Global and “rumors” of pending calamity at Jefferies. It was the ascendancy of Mario Draghi to the throne as head of the ECB, and an extraordinarily determined European bureaucracy, that finally compressed PIGS yield spreads and staved off an otherwise inevitable fiscal collapse. But I hear that a substantial portion of the populace is burning freshly cut forest wood for warmth this winter…

The purpose of this pair of articles is to explain why interest rates will remain low forever, asymptotically approaching zero (with a concomitant negative real yield). There are three factors that would elicit, sustain, and promote this phenomenon indefinitely in the absence of popular and social abrogation: the mathematics of bond yields, the unsustainability of fiscal provisions, and the intervention of central banks. Let us now explore in detail each of these requirements and contributing factors to the zero-interest-rate environment that has become the new normal.

From a basic mathematical perspective, with the intuition and practice attained in grade school or high school, we can understand that the yield on a bond instrument can be represented as k/x, where k represents the fixed interest payment and x is the price of the bond. Visually, as the price of the bond increases, a plot of yield versus price becomes extraordinarily flat. Mathematically, as the price of the bond (x) increases to very high levels, the derivative, or rate of change, of this function (-k/x^2) approaches zero. The rate of change of yield with respect to price is largest when the price of the bond is very low; this is one of the reasons why, in the summer of 2011, yields in peripheral European countries were exploding. However, once the price of the bond is forced sufficiently high, whether by endogenous market participants (funds) or exogenous factors (central bank intervention), it becomes exceptionally difficult to influence the yield, given the simple relationship above. Therefore, the ECB, BoJ, and the Fed need only push bond prices to a certain level beyond which it becomes absurdly facile to maintain low rates.
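The flattening is easy to see numerically. A minimal sketch (in Python, with an illustrative annual coupon of 5; the figures are examples, not market data):

```python
def bond_yield(k, x):
    """Yield of a fixed-coupon instrument: annual payment k over price x."""
    return k / x

def yield_sensitivity(k, x):
    """Derivative dy/dx = -k/x**2: the change in yield per unit change in price."""
    return -k / x ** 2

k = 5.0
# At a distressed price, the yield is hypersensitive to price changes;
# at an elevated price, the same coupon gives an almost flat yield curve.
print(yield_sensitivity(k, 20.0))   # -0.0125   (distressed price)
print(yield_sensitivity(k, 200.0))  # -0.000125 (elevated price)
```

A tenfold increase in price cuts the sensitivity by a factor of one hundred, which is precisely why a sufficiently elevated bond price is so easy to pin in place.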

Secondly, if bond yields were to increase, many so-called “developed” economies would become insolvent overnight: the cost of servicing the debt would far exceed the income generated by taxation for these sovereigns. Personally, I believe that the central banking system frankly does not condone this sort of fiscal subterfuge and immorality, but is forced to participate in the suppression of rates for purely political purposes. Let us be honest and state that central banks today are far less “independent” than they insist, and are increasingly branches of fiscally insolvent (read: inept) governments. After all, when presented with the scenario of a standing army of soldiers and domestic police demanding promised benefits, versus a small group of highly educated gentry in suits, my expectations of the superiority of enforced subjugation certainly lie with the former group. Yields cannot increase without corresponding widespread social chaos, and therefore will be suppressed by an increasingly politicized banking system.
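The arithmetic of that insolvency is straightforward. With entirely hypothetical round numbers (the debt and revenue figures below are illustrative, not actual sovereign data):

```python
debt = 16e12      # hypothetical outstanding sovereign debt, in dollars
revenue = 2.5e12  # hypothetical annual tax revenue

# Interest expense alone, at a few different prevailing yields.
for rate in (0.02, 0.06, 0.10):
    interest = debt * rate
    print(f"at {rate:.0%}: interest = ${interest / 1e12:.2f}T, "
          f"{interest / revenue:.0%} of revenue")
```

A mere tripling of rates from 2% to 6% swallows well over a third of revenue in interest alone, before a single promised benefit is paid.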

The final reason that bond rates will, in perpetuity, remain low for “certain developed nations” has to do with the practical basis underpinning fiat currency. This topic has been well discussed both in the real world and on the ethereal internet, so I will profess neither any particular bias nor explanation. But in the context of an examination of the factual evidence, given the previous two reasons, all that is required for central bankers (or primary dealers and their equivalents) to restrain or suppress interest rates is simply to act as marginal buyers. It is purely this simple. Given the mathematical relationship between yields and prices, and the political incentive, nay charter, that is extended to certain financial institutions, all that is required is a slight excess of demand (buyers) over supply (sellers) to maintain the low-interest-rate requirement forever. But unlike hedge funds, retail investors, sovereign wealth funds, and even investment banks, those “privileged” nations with a central banking system have the singular capacity to conjure bids from the void, with digital funds transmitted within femtoseconds, immediately leaping into the commodities and “risky” asset markets. Good luck with your USD $6 chai latte.

As the real economies across the globe continue to decline, and the indulgent promises afforded during the extravagant years come due, it is the confluence of these three powerful factors that will ensure that yields remain low. Until, of course, social or natural phenomena disrupt the ingrained charter that has been not sapiently decided, but rather forced, upon swaths of the world.

Copyright © 2013, Srikant Krishna

Srikant Krishna is a financial technologist and quantitative trader. He has a background in biophysics, software development, and the capital markets.

You can follow him on Twitter @SrikantKrishna, and on LinkedIn at http://www.linkedin.com/in/srikantkrishna/, or e-mail him at sri@srikantkrishna.com.

You can visit his blog as well.


Capturing, Storing and Backtesting CTS/CQS Tick Data in C#/.NET

Tick data is the lifeblood of the capital markets. Unlike order book data, which in the majority of cases can be stuffed, stale, and away from the inside market, tick data represents actionable quotes and transpired trades that can be regarded as the “principal components” of capital market data. Within tick data, one can measure volume, quoting frequency, spreads, VWAPs, moving averages, volatility, and so forth. This article therefore emphasizes the capture and analysis of tick data as opposed to order book information, which can be loosely regarded as orthogonal in certain respects.
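As an illustration of how directly such measures fall out of trade ticks, a VWAP is just a volume-weighted sum. A minimal sketch in Python (the (price, size) tuples are a hypothetical format for illustration, not the actual CTS field layout):

```python
def vwap(trades):
    """Volume-weighted average price over a sequence of (price, size) trades."""
    notional = sum(price * size for price, size in trades)
    volume = sum(size for _, size in trades)
    return notional / volume

# Three trades: 100 @ 10.00, 300 @ 10.05, 100 @ 9.95
print(vwap([(10.00, 100), (10.05, 300), (9.95, 100)]))  # ~10.02
```

Quoting frequency, spreads, and rolling volatility are computed in exactly the same single-pass, streaming fashion over the tape.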

There was once a time when even attempting to capture and record tick data, specifically the CTS/CQS “tape” from the U.S. equity markets, was a sophisticated process involving a team of individuals. Even more rarefied was the replay, analysis, and backtesting of that tick data, which was often conducted only in the realm of investment banks or hedge funds.

Here I want to briefly describe how I store, record, analyze, and backtest the “tape” easily and efficiently each day as part of my model construction and trading-strategy deployment.

On average, CTS/CQS produces about 30GB of information per day, plus or minus a few GB depending on precisely which fields are stored. I attempt to store everything (condition codes and such), so my files tend to be a bit larger. I receive the tick data over multicast UDP, and immediately fire an event that strips it off of the network buffer and throws it onto a separate in-memory queue. This is so as not to lose data during periods of intense volume (the open, the close, Fed announcements, and so forth). Once it is in my in-memory queue, I then write out each tick, represented as either a trade or a quote. I use a common class to represent both trades and quotes, as there are many characteristics that are shared and useful between the two.
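The receive path is a classic producer/consumer handoff. The sketch below is in Python to show the pattern (the production system described here is C#/.NET, but the structure is the same: the network callback does nothing except enqueue, and a separate thread persists):

```python
import queue
import threading

# In-memory buffer sitting between the network receive loop and the disk writer.
tick_queue = queue.Queue()

def on_datagram(payload: bytes) -> None:
    """Called from the UDP receive loop: do nothing but enqueue, so the
    socket buffer keeps draining even during bursts at the open or close."""
    tick_queue.put(payload)

def writer_loop(out) -> None:
    """Consumer thread: drains the queue and persists each tick record."""
    while True:
        tick = tick_queue.get()
        if tick is None:  # sentinel: end of session
            break
        out.write(tick)
```

Keeping the receive callback this thin is what prevents dropped multicast packets during the volume spikes the article mentions.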

I begin recording at 09:00 each day (for the possibility of algorithmic “pre-market” analysis), and stop at 16:20. The roughly 20-30GB files are then compressed into .gz format using standard software such as 7-zip. The original files are then discarded, and the compressed files are transferred to my Microsoft Azure cloud storage account. I can invariably compress the files to about 10% of the original size, or roughly ~2.5GB to ~3.5GB.
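The end-of-day compression step amounts to a single streaming copy. A sketch using Python's standard library rather than 7-zip (the output is the same .gz format):

```python
import gzip
import os
import shutil

def compress_capture(path: str) -> str:
    """Stream the raw capture file into a .gz archive, then discard the original."""
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst, length=1 << 20)  # copy in 1 MiB chunks
    os.remove(path)
    return gz_path
```

Streaming in fixed-size chunks keeps memory flat even on a 30GB capture, which matters on the modestly provisioned machines described later in the article.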

I then download recent updates on a periodic (weekly) basis and distribute them across all my backtesting/analysis servers. I replay the tick data using the built-in C#/.NET decompression reader. Keep in mind that as each tick is uncompressed, it is placed on a queue and an event is fired that processes the tick throughout my backtesting system and strategies. As a result, I usually have six cores operational on a dual eight-core Xeon server at any given point. Backtesting a single day requires only a few minutes (depending, of course, on the complexity of the strategy), and then the entire set of trades and messages over the backtesting period is serialized and stored as a “Model” object.
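The replay side streams ticks straight out of the compressed file, never fully inflating it on disk, and dispatches each tick to every subscribed strategy just as during live trading. Again a Python sketch of the pattern (one tick per line is an assumed record layout for illustration):

```python
import gzip

def replay(gz_path, handlers):
    """Stream one tick record per line out of the compressed capture and
    fire every subscribed handler, mirroring the live event pipeline."""
    with gzip.open(gz_path, "rt") as f:
        for line in f:
            tick = line.rstrip("\n")
            for handle in handlers:
                handle(tick)
```

Because decompression and strategy evaluation are pipelined per tick, multiple days can be replayed in parallel, one per core, which is how the six-of-sixteen-cores utilization mentioned above arises.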

I have created a WPF viewer for the Model that displays the market data and various transformations (differencing, moving averages, volume, cumulative volume, quote frequency, and so forth). I use the Visiblox package to greatly facilitate this, and I include annotations showing where I’ve placed my trades so that I have a visual sense of the strategy. Additionally, because I have the full Model characteristics, I can compute various performance measures against the backtest (Sharpe ratio, annualized return, and so forth).
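Those performance measures fall directly out of the Model's daily return series. A sketch of two of them (the 252-trading-day year and zero risk-free rate are conventional assumptions, not figures from the article):

```python
import math
import statistics

def annualized_sharpe(daily_returns, periods_per_year=252):
    """Annualized Sharpe ratio from a backtest's daily returns
    (risk-free rate assumed to be zero)."""
    mu = statistics.mean(daily_returns)
    sigma = statistics.stdev(daily_returns)  # sample standard deviation
    return (mu / sigma) * math.sqrt(periods_per_year)

def annualized_return(daily_returns, periods_per_year=252):
    """Geometric annualized return compounded from daily returns."""
    growth = math.prod(1.0 + r for r in daily_returns)
    return growth ** (periods_per_year / len(daily_returns)) - 1.0
```

Both are single passes over the serialized trade history, so they add essentially nothing to the backtest runtime.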

Now, the entire process I described is necessary because I am using machines with only 12GB of memory. Each day’s worth of compressed CTS/CQS data is approximately 3GB. If I had access to a 64GB or 128GB machine, the backtesting procedure would be far quicker, as I could load an entire month or two worth of data into memory and never have to touch secondary storage (be it an HDD or SSD).

My current project is to move the entire backtesting apparatus onto the Microsoft Azure platform, so that I can fully avail myself of the “utility computing” model and backtest day and night with practically unlimited resources. As trading volumes have decreased, backtesting with home-grown software has actually become more tractable. That is another reason why I develop fully on the Microsoft stack: everything just “works” together, without headaches over which version of Linux I’m using. But that’s just a personal aside.

The gold standard for these sorts of systems is, in the final analysis, of course KDB+, which is incredibly fast and powerful. It is an in-memory database with an exceptionally brilliant design, and it comes with its own extremely concise language (q). But since I’ve been a freelancer, I’ve had to develop my own techniques for managing large amounts of tick data.

I hope this article is useful to other financial technologists who regularly record and analyze capital market data.

Copyright © 2013, Srikant Krishna


In Defense of Automated Trading


In the past several years, an enormous amount of invective has been hurled at “high-frequency trading”, “algorithmic trading”, and their synonymous brethren. The acerbic treatment afforded to this style of trading has been particularly exacerbated by the conflagration that is the global financial crisis, with no end in sight. A large part of the criticism originates from non-practitioners, not only of automated trading but of the capital markets in general. Even specialized financial media sources such as ZeroHedge interminably condemn both the electronic trading apparatus and the market regulators who face a daily supervisory penance.

As is the case with most aspects of the social world that are touched by technology, trading in the capital markets has experienced a thorough revolution, nay series of revolutions, over the course of the preceding decades. Should we be so bewildered that trading systems incorporate technological developments such as many-core GPU processing, reconfigurable hardware, and in-memory databases? Is it so mystifying that the ubiquitous (and relatively obese) financial services industry avails itself of services such as co-location, data mining, satellite communication, and microwave transmission? After all, a visit to the theater to watch the latest animated film, or to your local healthcare facility to procure an MRI, will each demonstrate the same pervasive technological transformation. Why must institutions trusted with the vital task of growing investors’ capital be precluded from availing themselves of cutting-edge methodology and devices?

It has become routine for the landscape of computerized trading systems to be lumped into an opaque, monolithic entity that few on the planet seemingly comprehend. This perspective, however, is simply wrong, and masks the tremendous diversity of specific roles that the landscape actually comprises. For example, “high-frequency trading” systems, which usually operate in the context of a proprietary broker-dealer or market-making platform with direct connections to the exchanges, are very different beasts from an algorithmic trading system operating in an agency capacity at a broker-dealer, even though the latter also has direct exchange connections and harbors a similar assortment of supporting tools and technology. Each of these, in turn, is different from the extensive use of computers in quantitative modeling and trading, often performed by a hedge fund. This simple, non-exhaustive list of automated trading examples reveals the diversity in purpose, timescale, and methodology employed by the firms operating these systems.

Let us first distinguish between what facets and consequences of automated trading should be important to the non-participants, and what should not. Firstly, let us immediately disregard the magnitude of (potential) profits experienced by these firms. In a world of “expert networks”, calamitous multi-billion dollar CDS bets, and nefarious manipulation of global interest-rate benchmarks, this avenue of criticism simply cannot be maintained with any modicum of integrity. The two most important aspects of the market from a participant’s point of view should be transaction cost, and price discovery.

From the perspective of transaction cost, it has never been cheaper, faster, or more convenient to execute trades in the U.S. and various other global capital markets. Whether from a web browser or a mobile device, placing a trade and receiving the execution can transpire in a matter of seconds for humans. Compare this with having a discussion over the telephone with a stockbroker, who would then have to contact their trading desk or floor brokers; after a chaotic exchange through an outcry market (or even an electronic matching system controlled by a specialist or market maker), fills and executions would finally be disseminated back to the client, conveyed, of course, by phone call, voicemail, or snail mail. Contrast this with the millisecond response on a simple handheld device, together with instant and powerful accounting features. This is possible because of technology: the same technology that is now being castigated as an evil to be eliminated.

Another component of transaction cost is the actual commissions, whether direct or indirect in the form of markups and spreads paid to dealers. There was once a time when being a NYSE specialist or a Nasdaq market-maker at a leading firm was a highly coveted role, often bringing in bonuses on the order of several million dollars per annum to those lucky few. In fact, war stories of completely unethical rip-offs of clients were copiously disbursed within trading circles over evening drinks or dinner. It is because of widespread unethical practices such as frontrunning, principal trading by specialists, fading quotes, widening spreads, and so forth that the order handling rules had to be implemented, and decimalization was welcomed by the buy-side. Perhaps the culmination of the buy-side’s (and other market participants’) angst toward this process was the lawsuit filed against the NYSE by CalPERS in the early 2000s. But the writing was already on the wall for the high-flying specialists and market-makers, as sell-side algorithmic and program trading began to seriously eat into their monopolized businesses. At this point, I must interject and state rather openly that I find it amusing that the most vocal opponents of “machines” and “algo trading” were often these very same individuals, who had maintained exceptionally comfortable livelihoods to the detriment of mutual funds, pensioners, and small retail investors. To reiterate: never in the history of the global capital markets has it been more efficient and cheaper for any institution or individual to transact in the public markets, at least in liquid instruments. If a firm is engaged in procuring highly exotic and complex forwards or derivatives, let it be said that they are at the mercy of their counterparty or dealer, and they may simply be digging a new grave at a different casino table.

The second fundamental aspect from a market participant’s perspective is the notion of price discovery: whether the market price of a particular instrument represents the true sentiment of all the participants as a whole. Counterexamples are the virtually daily millisecond “flash crashes” which occur in various stocks, often independent of characteristics such as liquidity or volatility. These occur because the speed at which algorithms operate, and the inability of various machines to distinguish very atypical market conditions, create positive feedback loops that result in small-time-interval “catastrophes”. Three important points pertain to these phenomena. Firstly, the time intervals are so small as to be rather irrelevant to human traders; if a flash crash occurs on the order of half a second, I’d be rather impressed if a large number of human traders were affected during the physical process of placing a trade. Secondly, the exchanges have been very accommodative in busting the executions that occur during those particular intervals. Thirdly, the “flash crash” (or inverse price movement) is temporary, and reverts to a non-absurd price relatively quickly; there is a dearth of permanent price impact produced by the cascade of erratic machine behavior and stop triggers. In fact, there are strategies that seek to profit precisely from these sorts of crashes by quickly buying into extreme dislocations in price. As was the case for transaction costs, if buy-side firms or market participants attempt to “defect” in a prisoner’s dilemma by using dark pools or electronic liquidity providers (ELPs), they run the risk of being buried in a different grave, away from the quoted markets in which there is a semblance of discipline and transparency. This is not to say that “trading in the dark” is necessarily a dangerous notion; but just as with the complex-derivative example, the market participant must understand the advantages and risks involved in doing so.

It has never been easier to engage in the capital markets. As I’ve traveled through the developing economies of Latin America and Asia, I have seen a tremendous difference from my prior visits in the market knowledge pursued by young professionals who are earning money to invest. Wall Street has always been rife with bubbles and busts, without exception, throughout its history. This, in fact, is true of any global marketplace. We have observed unethical and nefarious behavior at very high levels, involving complex instruments, illegal information dissemination, very large notional values, and outright fraud. But these evils employ centuries-old implements of deception and accounting gimmicks. Trading volumes have greatly decreased, and the residual volume is often machines trading against each other. Exchanges, in fact, would probably go out of business were it not for these automated trading systems attempting to capture tiny profits at a very high frequency. To suddenly shift the blame onto technological advances that are, undoubtedly, double-edged swords, and onto the talented people who attempt to learn and profit from them, is an argument that simply cannot be maintained vis-à-vis the broader picture.

Copyright © 2013, Srikant Krishna
