I am repeating this same message from last week, but this is the ABSOLUTE LAST CHANCE:
Unfortunately, this chance will NEVER come around again. This will be the CHEAPEST you will EVER be able to join my QuantLabs.net Premium Membership. Forever. Amen. Think hard and carefully if you are considering it.
Here is a recent member's testimonial, in case you missed it:
..somehow they should be paying more… just my 2 cents…..BTW… know that there are those of us out here that think what you are doing is awesome.
The next day, you will wake up to a 50 percent increase in the rate. I don't discount or offer FREE trials, as the software provided is already quite valuable with the source code included.
The 4-day countdown has begun for the absolute LOWEST rates you will get on my QuantLabs.net Premium Membership. It goes up 50% come this Tues, Jan 21. No ifs, no buts, end of story. Remember, I tried to be a nice guy by allowing an extra week for fence sitters.
I have done some serious research over the last day on what I would consider trade secrets in rapidly building trading models and strategies. This leverages the power of Matlab's environment. I am also looking into potentially starting a fund through some sort of partnership arrangement. That is how serious I am about what I am learning.
1. YouTube videos on TRADE secrets for hedge funds and HFT are starting to be posted in the quant membership area.
It is getting serious now, as I have some highly qualified people helping out.
You have 24 hours until the NEW QuantLabs.net Membership rates kick in. This takes effect sometime on Tues morning, Jan 15, so be forewarned! I also posted a great supportive member testimonial to show the true value of this. So get your LAST CHANCE discount now. This will be the cheapest rate EVER for the membership going forward.
Remember: The rate does go up 50%. This is not a typo!
1. Amazing testimonial from QuantLabs.net Premium Members posted today. Big thanks to them and their support!!!
…If you are creating C++ modules from Matlab, it should be quite easy to develop .NET “appliances” or “bots” to perform various advanced functions (add-ons which traders want very badly but cannot get or cannot afford) useful for enhancing the capabilities of any data/broker platforms which support .NET. HUGE market for your memberships…
I went to my Meetup last night, where I always learn lots of things. From what I see, everything is being streamed into trading platforms these days. Also, screw APIs, as they become latency bottlenecks; you need to learn how to program sockets. Interactive Brokers comes to mind, with their Trader Workstation API versus their FIX gateway options.
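To make the socket idea concrete, here is a minimal sketch in Java of reading a tick straight off a raw TCP socket, with no vendor API in between. The "feed" thread and the `TICK,EURUSD,...` wire format are made up purely for illustration; a real broker gateway has its own protocol.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class RawSocketDemo {
    // Round-trip one tick through a raw TCP socket on localhost.
    // The "feed" side stands in for a broker gateway (hypothetical format).
    public static String fetchTick() throws Exception {
        ServerSocket server = new ServerSocket(0);   // ephemeral local port
        Thread feed = new Thread(() -> {
            try (Socket s = server.accept();
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println("TICK,EURUSD,1.3672");   // made-up wire format
            } catch (IOException ignored) { }
        });
        feed.start();
        try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(s.getInputStream()))) {
            return in.readLine();                    // read straight off the wire
        } finally {
            feed.join();
            server.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fetchTick());
    }
}
```

The point is that nothing sits between your code and the bytes; any parsing, buffering, or threading policy is yours to control, which is exactly what the latency crowd wants.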
Not only that, many people have told me in the past that using R with Rcpp/RInside could be a bottleneck. That potential issue leads me to investigate event-rule-based programming using a Complex Event Processing (CEP) engine like Esper. Unfortunately, there is no C++ version, as Esper is for C# or Java. Too bad, but there is an option to stream data in, which I posted about yesterday. There are some options I posted at:
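To illustrate what an event rule does, here is a hand-rolled Java sketch of one simple CEP-style rule: fire when the max-min spread over the last N ticks exceeds a threshold. In Esper you would express roughly the same thing declaratively in EPL (something like a length window with a `having max(price) - min(price) > t` clause) rather than coding the window by hand; this standalone version just shows the mechanics.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class WindowRule {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int length;
    private final double threshold;

    public WindowRule(int length, double threshold) {
        this.length = length;
        this.threshold = threshold;
    }

    // Feed one tick; return true when the max-min spread over the
    // last `length` ticks exceeds `threshold` (the rule "fires").
    public boolean onTick(double price) {
        window.addLast(price);
        if (window.size() > length) window.removeFirst();
        double max = Double.NEGATIVE_INFINITY;
        double min = Double.POSITIVE_INFINITY;
        for (double p : window) {
            max = Math.max(max, p);
            min = Math.min(min, p);
        }
        return (max - min) > threshold;
    }

    public static void main(String[] args) {
        WindowRule rule = new WindowRule(3, 0.5);
        System.out.println(rule.onTick(100.0)); // false (spread 0.0)
        System.out.println(rule.onTick(100.2)); // false (spread 0.2)
        System.out.println(rule.onTick(100.7)); // true  (spread 0.7 > 0.5)
    }
}
```

The appeal of a real CEP engine is that it handles many such rules over many streams concurrently, with the windows and aggregations managed for you.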
I also have my backups in place if R does indeed become the bottleneck; I can hint that R is the most popular part of QuantLabs.net. It also involves a very big expense, but you gotta do what you gotta do to get things working. Ahh, the joyous challenges of working on these potential HFT platforms!
Another important element is FIX engines, given how wonky QuickFIX can be. I documented my experience with one solution called FIX8.
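Whatever engine you pick, the FIX wire format itself is simple: SOH-delimited `tag=value` fields ending with tag 10 (CheckSum), which is the byte sum of everything before it, modulo 256, zero-padded to three digits. A small Java sketch of appending that checksum (the message fragment here is made up and incomplete; a real message needs a correct BodyLength in tag 9, among other things):

```java
import java.nio.charset.StandardCharsets;

public class FixChecksum {
    // FIX uses '\u0001' (SOH) as the field delimiter. Tag 10 (CheckSum)
    // is the byte sum of everything before it, mod 256, padded to 3 digits.
    public static String withChecksum(String body) {
        int sum = 0;
        for (byte b : body.getBytes(StandardCharsets.US_ASCII)) {
            sum += b & 0xFF;
        }
        return body + String.format("10=%03d\u0001", sum % 256);
    }

    public static void main(String[] args) {
        // Hypothetical fragment; field values are for illustration only.
        String msg = withChecksum("8=FIX.4.2\u00019=12\u000135=0\u000134=2\u0001");
        System.out.println(msg.replace('\u0001', '|'));
    }
}
```

Engines like QuickFIX or FIX8 do this framing for you, plus session management (logon, heartbeats, sequence-number recovery), which is where the real work is.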
Lowest-latency data ingestion into Hadoop for quant analytics?
I have a customer looking for low-latency data ingestion into Hadoop. The customer wants to ingest 1 million records per second. Can someone suggest which tools or technologies can be used for this kind of ingestion into Hadoop?
There are a number of solutions for loading data into HDFS: Flume, Scribe, and Chukwa. Some teams load data into HBase as fast storage. If the data is loaded from a relational database, there is Sqoop.
As usual, the devil is in the details. What is the size of a record? What are the latency requirements (milliseconds, seconds, minutes)? How many sources of data are there? Is it a continuous data stream or a batch load?
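For a sense of what the streaming path looks like, here is a minimal sketch of a Flume 1.x agent configuration wiring an Avro source through a memory channel into an HDFS sink. The agent name, port, capacity, and HDFS path are all placeholder values; a real deployment would size the channel and batch settings against the actual record rate.

```
# Hypothetical Flume agent "a1": Avro source -> memory channel -> HDFS sink
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

a1.sources.r1.type     = avro
a1.sources.r1.bind     = 0.0.0.0
a1.sources.r1.port     = 4141
a1.sources.r1.channels = c1

a1.channels.c1.type     = memory
a1.channels.c1.capacity = 100000

a1.sinks.k1.type          = hdfs
a1.sinks.k1.hdfs.path     = hdfs://namenode/flume/events/%Y-%m-%d
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.channel       = c1
```

At a sustained 1 million records per second, a single agent like this would likely not keep up; you would fan out across multiple agents, or put a fast intermediate store (the HBase option above) in front of HDFS.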