What the Google Books Project wrought

Five million English-language books.
Those books are only about four percent of all the books ever published.
Writing in the august journal Science in 2011, data analyst and eventual Whitney artist Jean-Baptiste Michel, along with a host of colleagues, told us about the more than fifteen million books and five hundred billion words, in various languages, that Google had digitized.
Even if you read only in English, you can’t read all those books. You can’t even read all of those published in the last twenty years. And if you fancy yourself a wordsmith, you do not know all of the roughly one million words of the English vocabulary.
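A quick back-of-envelope calculation makes the point concrete. This is a sketch with loudly assumed numbers: the reading pace and reading lifespan below are illustrative guesses, not figures from the paper; only the five-million-book count comes from Michel et al.

```python
# Back-of-envelope check: how many books can one person read in a lifetime,
# and what share of the digitized English-language books is that?

BOOKS_DIGITIZED_ENGLISH = 5_000_000  # from Michel et al. (2011), as cited in the Notes
BOOKS_PER_WEEK = 1                   # assumption: a fast, very steady reader
READING_YEARS = 70                   # assumption: a long reading life

lifetime_books = BOOKS_PER_WEEK * 52 * READING_YEARS
share = lifetime_books / BOOKS_DIGITIZED_ENGLISH

print(f"Books readable in a lifetime: {lifetime_books:,}")  # 3,640
print(f"Share of the digitized English-language books: {share:.2%}")  # 0.07%
```

Even at that improbable pace, a lifetime of reading covers well under one tenth of one percent of the digitized English-language books alone.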
Quite simply: if someone tells you that you haven’t read enough, just say:
There are more books than you can read and more words than you can remember.
Notes
This was written by Josh Dubrow and is based on Jean-Baptiste Michel, Yuan Kui Shen, Aviva Presser Aiden, Adrian Veres, Matthew K. Gray, Joseph P. Pickett, Dale Hoiberg, et al., “Quantitative analysis of culture using millions of digitized books,” Science 331, no. 6014 (2011): 176–182. The et al. authors include the “Google Books Team.”
And most of those books came from just 40 university libraries.
In 2014, the Whitney Museum in New York City bought Jean-Baptiste Michel’s artwork, “I wish I could be exactly what you’re looking for.” As of this writing, Jean-Baptiste Michel is a “core member” of Patch Biosciences, a company that produces “Machine-designed DNA for gene therapy.” https://web.archive.org/web/20210215194729/https://patch.bio/
“…we estimated the number of words in the English lexicon as 544,000 in 1900, 597,000 in 1950, and 1,022,000 in 2000.” p. 176.
Copyright Occam’s Press 2021. Licensed under Creative Commons Attribution-NonCommercial-NoDerivs (CC BY-NC-ND).