Big Data Sets & Hadoop – BackTesting and/or Model Development
Any out there use or using Hadoop as part of their backtesting repository? Any good lessons or experiences in using Hadoop?
That is what we are going to use. We have done some research in that area, including consultations with search engine developers who use Hadoop, and found that it fully suits our needs for storage and cluster computation. The project is still in development, so I can't yet say how Hadoop performs for us in practice, but it was designed for heavy computation.
Did you look at any in-memory databases like VoltDB? or did Hadoop have more in the way of calculations?
Hadoop should only be used if you need to spread calculations across multiple nodes; that is where the speed comes from. You can easily connect as many nodes as you need, and they can be virtual machines, dedicated servers, or workstations. It can also store a significant amount of data with built-in replication for redundancy.
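To make the "spread calculations across nodes" point concrete, here is a minimal sketch of how a backtest aggregation might look as a Hadoop Streaming job in Python. The input format (`symbol,date,daily_return` CSV lines), the per-symbol mean-return metric, and all function names are my own illustrative assumptions, not anything from a real project:

```python
import sys
from itertools import groupby

def mapper(lines):
    # Assumed input lines like "AAPL,2013-05-01,0.0042" (symbol,date,daily_return).
    # Emit one tab-separated (symbol, return) pair per line.
    for line in lines:
        symbol, _date, ret = line.strip().split(",")
        yield f"{symbol}\t{ret}"

def reducer(lines):
    # Hadoop sorts mapper output by key before the reduce phase,
    # so consecutive lines with the same symbol can be grouped.
    pairs = (line.strip().split("\t") for line in lines)
    for symbol, group in groupby(pairs, key=lambda kv: kv[0]):
        rets = [float(v) for _, v in group]
        yield f"{symbol}\t{sum(rets) / len(rets):.6f}"  # mean daily return

if __name__ == "__main__":
    # Hadoop Streaming runs the map and reduce steps as separate
    # processes reading stdin and writing stdout, e.g.:
    #   hadoop jar hadoop-streaming.jar \
    #     -mapper "job.py map" -reducer "job.py reduce" \
    #     -input /returns -output /mean_returns
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    step = mapper if mode == "map" else reducer
    for out in step(sys.stdin):
        print(out)
```

The appeal for backtesting is that the same script scales from one workstation to a whole cluster: Hadoop handles splitting the input, shuffling by key, and rerunning failed tasks, so the per-symbol logic stays simple.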