100 TB of data on Mongo DB? Possible?
What kind of architecture is needed to store 100 TB of data and query it with aggregation? How many nodes? What disk size per node? What would best practice be?

Every day 240 GB will be written, but the total size will stay the same because the same amount of data will be deleted.

Or do you have any different thoughts on storing the data and running fast group queries?
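For the "fast group queries" part, the usual MongoDB approach is an aggregation pipeline that filters with `$match` before it groups, so the match stage can use an index and, on a sharded cluster, prune shards before `$group` runs. A minimal sketch (the collection and field names `day`, `region`, `amount` are hypothetical, not from the question):

```python
# Sketch of an aggregation pipeline for a daily "group by" query.
# With pymongo you would run it as: db.events.aggregate(pipeline)
# (collection name and fields are illustrative assumptions).
pipeline = [
    # Filter first: this stage can use an index on `day` and lets a
    # sharded cluster skip shards that hold no matching documents.
    {"$match": {"day": {"$gte": "2013-01-01"}}},
    # Then group the surviving documents by region and sum amounts.
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    # Sort the (small) grouped result, not the raw documents.
    {"$sort": {"total": -1}},
]

print(len(pipeline))  # 3 stages
```

Putting `$sort` after `$group` means it sorts only the grouped output rather than the raw 100 TB of documents.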

Anisaanise answered 22/1, 2013 at 7:48 Comment(2)
Yes, it is; the related question is out of date, since bigger scenarios have appeared on the user group since then. – Arresting
You tagged this with the vertica tag. Do you want some sort of information about that as well? – Imhoff
Kindly refer to the related question:

MongoDB limit storage size?

Quoting from the top answer:

The "production deployments" page on MongoDB's site may be of interest to you. Lots of presentations listed with infrastructure information. For example:

http://blog.wordnik.com/12-months-with-mongodb says they're storing 3 TB per node.
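Taking that ~3 TB-per-node figure at face value, a rough back-of-envelope translation to 100 TB (the per-node figure and the 3-member replica set are assumptions for illustration, not a tested configuration):

```python
import math

# Assumptions: ~3 TB of data per node (from the Wordnik post) and a
# standard 3-member replica set per shard for redundancy.
TOTAL_TB = 100
TB_PER_NODE = 3
REPLICAS_PER_SHARD = 3

shards = math.ceil(TOTAL_TB / TB_PER_NODE)          # 34 shards
mongod_nodes = shards * REPLICAS_PER_SHARD          # 102 data-bearing nodes

# The stated churn of 240 GB/day is a modest sustained write rate:
write_mb_per_s = 240_000 / 86_400                   # ~2.8 MB/s cluster-wide

print(shards, mongod_nodes, round(write_mb_per_s, 1))
```

So at that density you would be looking at on the order of a hundred data-bearing `mongod` processes, plus config servers and `mongos` routers; the write load itself is small, and the hard part is disk capacity and query fan-out.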

Musky answered 22/1, 2013 at 7:52 Comment(0)
I highly recommend HBase.

Facebook uses it for its Messages service, which in November 2010 was handling 15 billion messages a day.

We tested MongoDB for a large data set but ended up going with HBase and have been happily using it for months now.

Peekaboo answered 24/1, 2013 at 22:36 Comment(2)
How did you handle infrastructure management? We're a small startup and don't have the resources yet to do it at 100%. – Gottlieb
Sorry, maybe I'm not understanding: what do you mean by infrastructure management? Managing the Hadoop/HBase cluster? We used Amazon Elastic MapReduce. – Peekaboo

© 2022 - 2024 — McMap. All rights reserved.