As mentioned in my video, follow the INSTALL readme that ships with the downloaded package; you will need to build Redis from source. It isn't that bad.
For the doRedis R package, see the video demo and instructions on how to use it within R:
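doRedis works by pushing tasks from a foreach loop onto a Redis list, where any number of workers pop and execute them. As a conceptual sketch of that work-queue pattern (hedged: this uses Python's thread-safe `queue.Queue` as a stand-in for the Redis list, not doRedis or Redis itself):

```python
# Work-queue pattern sketch: a queue of tasks, multiple workers popping
# from it, results collected on a second queue. doRedis does the same
# thing with Redis lists shared across machines.
import queue
import threading

tasks = queue.Queue()    # stands in for the Redis task list
results = queue.Queue()  # stands in for the Redis result list

def worker():
    # Each worker pops tasks until it sees the sentinel, much like a
    # doRedis worker blocking on the Redis queue.
    while True:
        item = tasks.get()
        if item is None:
            break
        results.put(item * item)  # the "work": square the input

# Enqueue work (the foreach loop body, in doRedis terms).
for i in range(5):
    tasks.put(i)

workers = [threading.Thread(target=worker) for _ in range(2)]
for t in workers:
    t.start()

# Signal shutdown: one sentinel per worker.
for _ in workers:
    tasks.put(None)
for t in workers:
    t.join()

print(sorted(results.queue))  # squares of 0..4
```

The appeal of the Redis-backed version is that workers can join or leave at any time, on any machine that can reach the Redis server.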
Using a scalable stress testing platform for model verification?
Someone sent this from the Trading Show:
As the landscape for systematic trading becomes increasingly saturated and competitive, quants must equip themselves with the latest tools and knowledge for developing and implementing trading models.
On June 4, at The Trading Show Chicago, Sri Krishnamurthy, CEO & Founder, Quant University, will be presenting on “Model risk management for trading models – using an infinitely scalable stress testing platform for effective model verification and validation.”
With a free visitor pass, you’ll have access to Sri’s talk, as well as a number of other seminar sessions spanning connectivity, predictive analytics, latency management and quantitative strategies for algorithmic trading.
Furthermore, your complimentary visitor pass provides you with access to 60+ exhibitors showcasing the very latest technologies and solutions for the algorithmic trading community.
This is for classic HFT needs, but I was told not to go with commercial FPGA solutions; it is wiser to roll your own system. I have talked to numerous industry experts and many agree. It seems this is a very complex process, with threading issues and a limited selection of programming languages. It might be worth seeing how MATLAB performs in this area.
Hypertable for quant development: an open source, high-performance, scalable database
Hypertable has been under development for five years and is an open source, scalable database modeled after Google's proprietary Bigtable database. Hypertable has been deployed as part of a financial trading system, holding half a trillion records of historical financial trade data on a relatively small cluster (16 machines). It might be a good fit for your application.
Hypertable is an open source project based on published best practices and our own experience in solving large-scale data-intensive tasks. Our goal is nothing less than that Hypertable become the world's most massively parallel…
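The data model Hypertable inherits from the published Bigtable design addresses each cell by (row key, column family:qualifier, timestamp) and keeps rows sorted by key, so range scans are cheap. A minimal sketch of that model (assumption: this illustrates the Bigtable paper's design, not Hypertable's actual API; the tickers and prices are made up):

```python
# Toy Bigtable-style table: (row key, column, timestamp) -> value,
# with row keys kept sorted so range scans over keys are efficient --
# the access pattern that suits time-ordered trade data.
import bisect
from collections import defaultdict

class MiniTable:
    def __init__(self):
        self._row_keys = []              # sorted list of row keys
        self._cells = defaultdict(dict)  # (row, col) -> {timestamp: value}

    def put(self, row, col, ts, value):
        # Insert the row key in sorted order if it is new.
        i = bisect.bisect_left(self._row_keys, row)
        if i == len(self._row_keys) or self._row_keys[i] != row:
            self._row_keys.insert(i, row)
        self._cells[(row, col)][ts] = value

    def get(self, row, col):
        # Return the most recent version of the cell, Bigtable-style.
        versions = self._cells.get((row, col), {})
        return versions[max(versions)] if versions else None

    def scan(self, start_row, end_row):
        # Range scan over the sorted row keys [start_row, end_row).
        lo = bisect.bisect_left(self._row_keys, start_row)
        hi = bisect.bisect_left(self._row_keys, end_row)
        return self._row_keys[lo:hi]

t = MiniTable()
t.put("AAPL:20110601", "trade:price", 1, 345.51)
t.put("AAPL:20110601", "trade:price", 2, 345.60)  # newer version
t.put("IBM:20110601", "trade:price", 1, 165.20)
print(t.get("AAPL:20110601", "trade:price"))  # latest version: 345.60
print(t.scan("AAPL", "AAPM"))                 # all AAPL rows
```

Encoding ticker and date into the row key, as above, is what lets a query like "all AAPL trades for a date range" run as a single contiguous scan rather than a full-table pass.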