Is a Big Data platform suitable for file sizes from 15 KB to 2 GB? I'm confident about the big files, but not sure about files around 15 KB.
Your question prompts several follow-up questions:
1. Which “Big Data platform”?
2. What kind of files, and what are you planning to do with them?
3. How many files?
HDFS is not intended for small files: there's a fair amount of overhead per file, and you wouldn't reap many of its benefits that way. However, it might still work for you. For example, if you've got a relatively small number of little files alongside lots of huge ones, it might be simplest to just put the small files in HDFS and accept the wasted space.
Could you concatenate multiple files together? Or combine them some other way (e.g. if they’re log files, you might merge many of them together, sorted by timestamp). Or would it make sense to treat the filenames as keys, and the files as values, in one of the many Big Data key-value stores?
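To make the "filenames as keys, files as values" idea concrete, here is a minimal sketch using only Python's standard library (a tar archive as the container). In Hadoop itself you would typically use a SequenceFile or HAR archive for this; the `pack_files`/`read_file` helper names below are hypothetical, not part of any Hadoop API.

```python
import io
import tarfile

def pack_files(file_map, out_path):
    """Bundle many small files into one archive, keyed by filename.

    file_map: dict mapping filename (the 'key') -> bytes (the 'value').
    """
    with tarfile.open(out_path, "w") as tar:
        for name, data in file_map.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))

def read_file(archive_path, name):
    """Look up one small file in the bundle by its name."""
    with tarfile.open(archive_path, "r") as tar:
        return tar.extractfile(name).read()
```

The point is that one large container file amortizes the per-file overhead: HDFS (or any block-oriented store) sees a single big object, while your application still addresses the small files individually by name.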
1. Hadoop + HBase
2. Any file type: txt, logs, database backups, or media files
3. Millions of files, read and written simultaneously, at any size from KB to GB
4. Accessed via a Java API or PHP