Quant development: The Latency Challenge
Only 13% of latency is attributed to the network, while 65% is attributed to the application.
Strategies for using latency as a competitive advantage.
As more players jump onto the HFT bandwagon, exchange co-location, low-latency switches, and hardware acceleration cards will no longer give you the edge. The edge is, and always will be, the application and how effectively it can use operating system resources. This is the tough stuff, folks. To accomplish this, low-latency development teams should have embedded performance engineers or systems administrators with low-level internals skills.
It's not even microseconds anymore. Most of the big guys think in nanoseconds! They'd be sitting with their machines right in the Exchange's server room if they could. What's next? No latency at all? Future, here we come.
Huh… how many computers do you know of that can accurately track anything below microseconds?
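For reference, most commodity machines do expose a monotonic clock with nanosecond *units*, though the usable resolution depends on the underlying hardware timer. A minimal sketch (the sample count is arbitrary) that estimates the effective tick granularity by taking back-to-back readings:

```python
import time

def min_observed_tick(samples: int = 100_000) -> int:
    """Smallest nonzero delta between consecutive monotonic clock reads, in ns.

    A rough proxy for the clock's effective resolution; whether that
    resolution is *accurate* to UTC is a separate question entirely.
    """
    best = None
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        delta = now - prev
        if delta > 0 and (best is None or delta < best):
            best = delta
        prev = now
    return best

if __name__ == "__main__":
    print(f"smallest observed tick: {min_observed_tick()} ns")
```

On typical x86 hardware this reports tens of nanoseconds, which is why resolution alone says nothing about traceability to a time standard.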
Some of our machines do in fact sit in the Exchange’s server rooms.
I asked how many computers can track anything into the microseconds as a straw-man type of response.
Almost all systems can, provided they have a reliable source of time data and a sound evidence practice. The key issue is not to rely on the local time-of-day (ToD) service to accurately track anything, but rather to log to a controlled system whose time is managed properly, making the actual ToD in the production framework almost irrelevant.
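As a minimal sketch of pulling time from an external controlled source rather than trusting the local ToD clock, here is a bare-bones SNTP (RFC 4330) client. The server name and timeout are illustrative placeholders; a production evidence framework would use an authenticated, audited service rather than plain unauthenticated NTP over UDP.

```python
import socket
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_EPOCH_OFFSET = 2_208_988_800

def sntp_time(server: str = "time.nist.gov", timeout: float = 2.0) -> int:
    """Query an SNTP server and return its UTC time as a Unix timestamp."""
    # First byte 0x1B = leap indicator 0, version 3, mode 3 (client);
    # the remaining 47 bytes of the request are zero.
    packet = b"\x1b" + 47 * b"\x00"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, 123))
        data, _ = sock.recvfrom(512)
    # Transmit timestamp: big-endian 32-bit seconds field at byte offset 40.
    ntp_seconds = struct.unpack("!I", data[40:44])[0]
    return ntp_seconds - NTP_EPOCH_OFFSET
```

Logging the delta between this reference and the local clock on every run is one simple way to build the after-the-fact evidence trail discussed here.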
We at Certichron happen to provide access to reference clock services which NIST itself operates. This is much better than a GPS service and provides 50 ps or coarser access to the actual UTC standard itself. It also suffers none of the spoofability or other attacks GPS does, and, well, it's about evidence of time.
How we do this: we cloned the US Timebase in Boulder and operate a fractional instance of the timebase in concert with NIST Time and Frequency. It's part of a technology we had them design a decade ago to facilitate placing an official NIST UTC instance in Tokyo (in the Nomura Trading Building in Shinjuku).
So that's how we deal with putting sub-microsecond time service into the trading framework. The first of these master timing centers, together with the regional access points in NYC and Bridgewater, NJ, make this a complete service solution.
As to why GPS is not good enough: most people fail to understand, or even allow a review of, the evidence model GPS produces, and it's horrible. The L1 GPS system can be spoofed and jammed so easily that it is 100% unreliable for all securities applications.
Why we think NIST UTC is so important: aside from the fact that GPS is not the legal US Timebase per 15 USC 271 and 15 USC 272, it is a horrible source of evidence of anything. It's also important to notice that spoofing the L1 system is so easy that the Joint Chiefs of Staff issued a quiet memorandum banning the use of unencrypted L1 services for all military applications in 1998.
That DoD order alone disqualified it from all use as a trust anchor for commercial transaction processing, and it's something the GPS industry as a whole has fought to scrub from public awareness, since it would put at risk a large portion of the $1.5B-per-year marketspace reported on the WWW.GPS.GOV website. If you need more proof, try Google searches for GPS spoofing or GPS jamming.
That said, a reliable source of time which can be proven after the fact is key to all market data operations and reporting!
The evidence is so compelling. It's just funny that most people around today take it pretty hard when told their evidence models were designed to meet the wrong set of standards. They also take it pretty hard that they relied on their vendors and were lied to about their time management practices, but hey... it is what it is.
NIST UTC rules!
NOTE: I now post my TRADING ALERTS to my personal FACEBOOK account and TWITTER. Don't worry, I don't post stupid cat videos or what I eat!