Information is at our fingertips today, all because of Google. Google can search almost the entire public internet in a fraction of a second, serving about 40,000 search queries per second on average. To do this, Google maintains a copy of every page it indexes (in technical terms, cached content), so it can serve search results faster.
Google has never revealed any numbers about the data stored in its data centers. But we all know it holds a whole lot of data. Without Google giving out the details, it is not possible to find the exact figure.
Still, a few folks on the internet have made an educated rough estimate. The calculation is based on the number of servers in each data center Google owns, its capital expenditure, electricity consumption, and so on. Calculations apart, the estimate is that Google might hold 10-15 exabytes of data. I believe this estimate covers only Google Search; it does not include Google Drive, Photos, and the other services Google offers.
I know most of us have never heard of a unit larger than the terabyte (TB), which consists of 1,024 GB. 1,024 TB equals 1 petabyte (PB), and 1,024 PB form 1 exabyte (EB). One exabyte roughly translates to 1 billion gigabytes (GB).
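To see where that "1 billion GB" figure comes from, here is a quick sketch of the conversion chain described above (using binary units, where each step is a factor of 1,024):

```python
# Each unit is 1,024 of the one below it (binary prefixes).
GB_PER_TB = 1024
TB_PER_PB = 1024
PB_PER_EB = 1024

# GB in one exabyte: 1024 * 1024 * 1024
gb_per_eb = GB_PER_TB * TB_PER_PB * PB_PER_EB
print(f"1 EB = {gb_per_eb:,} GB")  # 1 EB = 1,073,741,824 GB, roughly 1 billion

# The estimated 10-15 EB for Google Search, expressed in GB:
for eb in (10, 15):
    print(f"{eb} EB = {eb * gb_per_eb:,} GB")
```

So the 10-15 EB estimate works out to roughly 10-16 billion GB.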
That is 1 billion (or 100 crore) GB of data. And don't forget that it grows every day, so no one knows what the figure is right now.
Google offers unlimited storage for photo backups in Google Photos, for Google Drive files (only in Google's own formats), for BlogSpot, and for many other services that require lots and lots of storage. Don't forget that YouTube is also owned by Google. The storage estimate given here does not include any of these services.