Solid State Drive Arrays For Large Data Storage Requirements
I am wondering if any of you have used arrays of SSDs for fast retrieval of large chunks of data? I have a client whose spec requires 120 TB of data to be accessed as quickly as possible, and he is specifically looking at SSDs as a candidate medium. The data will be written once and read many times, so it should not be too harsh on the SSDs' longevity.
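For a write-once/read-many workload like this, a quick back-of-envelope calculation helps frame the hardware cost. Here's a rough sizing sketch; the per-drive capacity, RAID overhead, and throughput figures below are assumptions for illustration only, not vendor specs:

```python
import math

# Back-of-envelope sizing for the 120 TB requirement.
# All drive figures are assumptions, not quotes from any vendor.
TOTAL_TB = 120
DRIVE_TB = 2.0              # assumed raw capacity per SSD
RAID_OVERHEAD = 0.25        # assumed ~25% lost to parity + hot spares
READ_MBPS_PER_DRIVE = 400   # assumed sequential read per SATA SSD

usable_per_drive = DRIVE_TB * (1 - RAID_OVERHEAD)
drives = math.ceil(TOTAL_TB / usable_per_drive)
aggregate_gbps = drives * READ_MBPS_PER_DRIVE / 1000

print(f"drives needed: {drives}")                       # 80
print(f"aggregate read: ~{aggregate_gbps:.0f} GB/s")    # ~32 GB/s
```

Under those assumptions you end up around 80 drives, and the aggregate read bandwidth quickly exceeds what a single controller or network link can deliver, which is why the access-pattern questions below matter.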
Will the 120TB be accessed by a large cluster or by end-users on workstations?
Will the 120TB be accessed all at once, or can it be subdivided?
Take a look at the Gridiron Systems TurboCharger. A 2.5 TB SSD-based system can accelerate up to 50 TB of back-end storage, and it is designed specifically for read-intensive environments. Here's their product page: