This will be part 1 in a longer series on learning. Stay tuned!
Some writers confuse authenticity, which they ought always to aim at, with originality, which they should never bother about. — W.H. Auden
Not so long ago, knowledge getting lost was a purely practical problem. Books were burned, storytellers died, manuscripts got lost, and that was the end of it.1 Although not necessarily easy to solve, it was a straightforward problem. Today, with pretty much all of the world’s information freely floating around on the internet, can we still lose our knowledge?
Knowledge comes in many forms. Blogs, essays, opinion pieces, theses, papers, videos… all filled with a tiny sliver of the world’s collective brain. And pretty much all of it can be found online. Quintillions of bytes of data are added to the internet every single day, which is, well, a lot.2 If knowledge is buried by the sheer amount of data that’s added to the pile, does it even matter that it’s technically still around? If all information is available, should we be worried about its destruction, or about its discoverability?
There are clear moments in history when access to the world’s information became more democratised: the invention of the mechanical movable-type printing press, the spread of libraries, and eventually radio and television. But it was the dawn of the internet that made it especially apparent that the mere availability of information doesn’t bring us much. There is a good piece of evidence for this:
I think the most depressing fact about humanity is that during the 2000s most of the world was handed essentially free access to the entirety of knowledge and that didn't trigger a golden age. — Erik Hoel
Granted, the value of knowledge always came from its discoverability. There’s no point in having a book in a library if no one ever reads it. But I’d wager that wasn’t a very common occurrence. At least in the golden age of libraries, things could still be discovered. There was a finite number of books in every category and, at least in my experience, I read what was on the shelves. Even the most obscure books must’ve had some rotation. The fact that you could come across a book you’d never heard of and unexpectedly learn something new was the best thing about the whole ritual. Simple availability was enough because clutter was limited. And the finite nature of the model meant that most everything got read eventually. This was also the initial beauty of surfing when the internet first came along; you could discover things you didn’t even know you were looking for.3 Today, it’s becoming increasingly hard to do so. With everything up for grabs, not just availability but exposure becomes all the more important. What floats to the top?4
What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture. (...) As Huxley remarked in Brave New World: the civil libertarians who are ever on the alert to oppose tyranny, they fail to take into account men’s almost infinite appetite for distractions. (...) Orwell feared that what we fear will ruin us. Huxley feared that our desire will ruin us. — Neil Postman
This is why discoverability, more than destruction, is how knowledge gets lost today. When the companies that build the algorithms are financially incentivised to push forward only those things that already perfectly grab your attention (and let’s be honest, it’s not research papers), you never discover anything new. The whole model actively pushes down useful ideas in favour of trivialities. Everything that’s a bit of a stretch, that’s not instantly digestible or instantly likeable, that sits outside the mainstream, or that the algorithm simply has no idea what to do with gets pushed down into the shame pit of the internet, along with all the other things deemed unworthy of human praise and attention.5 In the long run you end up with a great majority of people who think the same, look the same, feel the same, buy the same things, go to the same places, incapable of forming an identity if their lives depended on it. All of it is a bit of a blackpill, because there is no operator at the switch actively doing this. It’s a harm we are inflicting upon ourselves, and the further we fall into the trap of our own desires, the more the algorithm capitalises on it. I know what you’re thinking: the algorithm was built with exactly that intention in the first place. And that’s true, of course. But what it does is tilt the floor of human behaviour; it’s harder to walk uphill, but it’s still possible. You simply have to choose to do so.
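To make the incentive concrete, here’s a deliberately crude sketch in Python. It is a toy, not any platform’s actual system; every name and number in it is invented for illustration. It ranks a feed purely by predicted engagement, scored as overlap with what you’ve already clicked on, which is roughly the floor-tilting mechanism described above.

```python
# Toy model of an engagement-ranked feed. Everything here is
# illustrative; real recommender systems are far more sophisticated,
# but the incentive structure is the same.

# What the user has engaged with before, as simple topic counts.
user_history = {"celebrity gossip": 40, "memes": 35, "sports": 20, "physics": 1}

feed_candidates = [
    {"title": "Top 10 celebrity feuds", "topics": ["celebrity gossip", "memes"]},
    {"title": "Highlights of last night's game", "topics": ["sports"]},
    {"title": "A gentle intro to quantum entanglement", "topics": ["physics"]},
    {"title": "An obscure 1970s essay on libraries", "topics": ["archives"]},
]

def predicted_engagement(item):
    """Score an item by how much it resembles what the user already
    clicks on. Topics the user has never touched contribute nothing."""
    return sum(user_history.get(topic, 0) for topic in item["topics"])

# Rank purely by predicted engagement: the familiar floats to the top
# and the unfamiliar sinks, however valuable it might be.
for item in sorted(feed_candidates, key=predicted_engagement, reverse=True):
    print(f"{predicted_engagement(item):>3}  {item['title']}")
```

The essay on libraries scores zero not because it is worthless, but because nothing in this user’s history connects to it. Serve the top of that ranking, and the history only grows more lopsided with every session.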
Why the preservation of original ideas is important
Original discoveries are rare. I think we as humans have a tendency to overestimate the chance of one-off events recurring. An apple dropping on Newton’s head, Archimedes taking a bath, Einstein being bored at the patent office.6 These things happened once, so surely they must happen again. It’s only a matter of time before we’d discover gravity, buoyancy, and the theory of general relativity all over again. They are, after all, physical truths of the universe we live in. No?
In some cases it’s much easier to check an answer than to invent it. It took Newton to invent calculus, but some high schoolers are able to use calculus, and anyone who uses calculus can confirm that it correctly solves calculus problems. — Scott Alexander
The trap of genius discoveries is precisely that they seem obvious in hindsight. In some cases they’re even easily replicable. But we shouldn’t fall into the fallacy of thinking that the apparent obviousness of an idea is at all correlated with the ease of discovering it. There are many discoveries in science that we regard as obvious but that took thousands of years to be made. Common sense is always common after the fact. And the loss of knowledge is precarious precisely because we don’t know how long it might take before someone rediscovers it. It took a genius to invent calculus; it will take another one to rediscover it. In a world that doesn’t produce all that many geniuses anymore, that might realistically take a while. It could mean the difference between the Dark Ages and the Enlightenment. Even in the present day, there are many accounts from scientists who don’t believe they would’ve made their discoveries in today’s world. Like this one, and this one, and this one. If you read the articles you’ll notice we’re not talking about arbitrary discoveries in a field you’ve never heard of. It is Nobel Prize-winning research that laid the foundational groundwork for our modern understanding of physics, biology, chemistry, and medicine. This is why we shouldn’t take it for granted.
The value of unoriginality
Much of humanity’s knowledge, though, finds itself in densely packed research studies, niche books, and eccentric manuscripts filled with jargon that’s understandable only to those who drank its blood. Older manuscripts especially were subject to a sort of ‘evolutionary obfuscation’.7
Manuscripts were not easy to read, by later typographic standards, and what readers found in manuscripts they tended to commit at least somewhat to memory. Relocating material in a manuscript was not always easy. — Walter Ong
One of the biggest flaws of academic institutions, the knowledge institutions that they are, is that they do a terrible job of dispersing their discovered knowledge to the outside world. Anyone who’s ever written a thesis or a dissertation knows exactly what I’m talking about. These things are not meant to be consumed outside of the walls they were produced in. Even published works seldom leave the confines of university halls and tenured offices. Academia spawns such specialised work in niche fields that it becomes increasingly hard to understand its contents even as a ‘partial insider’. I have a good friend whose dissertation was so far ahead of the curve that the defence jury had trouble fully comprehending it. If seasoned academics, hardened in a specific field, can’t string it all together, then I have some bad news for you. Let’s paint this with an example:
Anyone who has survived the torments of tertiary education will have had the experience of getting a broad look at a field in a 101 class, then drilling deeper into specific subfields in more advanced classes, and then into yet more specific sub-subfields in yet more advanced classes, until eventually you're stuck at home on a Saturday night reading an article in an obscure Belgian journal titled "Iron Content in Antwerp Horseshoes, 1791-1794: Trade and Equestrian Culture Under the Habsburgs", and the list of references carries the threatening implication of an entire literature on the equestrian metallurgy of the Low Countries, with academics split into factions justifying or expostulating the irreconcilable implications of rival theories. And then you realize that there's an equally obscure literature about every single subfield-of-a-subfield-of-a-subfield. You realize that you will never be a polymath and that simply catching up with the state of the art in one tiny corner of knowledge is a daunting proposition. The thought of exiting this ridiculous sham we call life flashes in your mind, but you dismiss it and heroically persist in your quest to understand those horseshoes instead.
It’s a painfully comical example of a problem that all too many people are familiar with. If knowledge is specialised to the point that it obstructs understanding, then we’re no better off than we were with oral traditions. Written records become nothing more than data, 1’s and 0’s that mean as much to you as they do to me. At some point fields become so specialised that when the people who bore the ideas die, the knowledge dies with them. So what do we do about it?
If you think about it, a lot of the best research that’s being done in the world is being done in academic institutions. But the nature of academia being what it is, it’s kind of a little silo where the experts speak to each other, they write for each other in a language of their own for journals that only they read. {…} So I’m all in favor of taking that big well of material and turning it into stories. Again, when I say stories, I don’t mean made-up; I mean factual, empirical, research fact-checked and so on. — Stephen Dubner
Dubner has a good point, but it’s a problem that falls outside the scope of academia. The nature of pioneering work, of course, is that it will be hard to understand for the uninitiated.8 It is literally the first time someone commits a specific idea to paper. Imagine being a genius scientist discovering something no one in the world ever had knowledge of before, and then having to use words in a way that makes sense to the plebs. I think you see my point. It might be too much to ask for the ‘Quantum Entanglement for Dummies’ version while we’re still busy breaking the boundaries of the field.
While we desperately need original work, we also need reproductions to democratise the knowledge we already possess. True originality is rare, but there is merit in rephrasing existing ideas in new ways. I’m convinced that understanding hinges more on the elegance of the message than on the complexity of the idea. The art form matters. Ideas are only as valuable as the vehicles they are carried in. As we’ve established, the mere availability of information doesn’t induce knowledge, so we need a more elegant way to go about it. Novel dispersion of existing ideas is paramount if we want to keep them alive, and understanding lies in repetition. The more ways in which we express and combine ideas, the wider the training set for our collective mind to fully comprehend them. We need reworked versions of everything that’s already out there to bring the knowledge we have back from the shame pit into the light for everyone to see.
The misconception of originality
I’m not an original thinker. Like everyone else, I’m a product of the things I’ve read, the projects I’ve worked on, the people I’ve met. It doesn’t mean I plagiarise, but my writing is a product of ideas that are already floating out there. My style consists of observing relatively obvious things in the world, reading existing research, and making connections simply by looking closer. It’s nothing earth-shattering, but I think there is value in it. I read the things I read because I find them interesting. And the conclusions I draw from the different sources that influence me are probably different from the next person’s. So I write them down on the off chance that someone might take something away from them, add it to their own library of knowledge, and make a unique connection until, eventually, a new insight is born.
This is how most ideas and insights come to be. Yet, we have this undying belief that originality is all around us, and that it takes a sole genius to break through the shackles of contemporary thinking for innovation to take place. But apart from some exceptions, the reality is more boring. In his excellent piece on innovation, Steven Johnson eloquently breaks this down:
We have a natural tendency to romanticise breakthrough innovations, imagining momentous ideas transcending their surroundings, a gifted mind somehow seeing over the detritus of old ideas and ossified tradition. But ideas are works of bricolage. They are, almost inevitably, networks of other ideas. We take the ideas we've inherited or stumbled across, and we jigger them together into some new shape.
I don’t think the bulk of our knowledge will ever get lost, per se, but I’m worried that if fewer and fewer people are able to discover ideas, the value of each one will keep decreasing until there’s little point left. The value of an idea is not absolute. It occupies a place in a web, and its value comes from what its threads are connected to. If we keep cutting threads, eventually there will be places we’re no longer able to reach. Interconnectivity is everything. Johnson again:
The premise that innovation prospers when ideas can serendipitously connect and recombine with other ideas may seem logical enough, but the strange fact is that a great deal of the past two centuries of legal and folk wisdom about innovation has pursued the exact opposite argument, building walls between ideas. {...} The problem with these closed environments is that they make it more difficult to explore the adjacent possible, because they reduce the overall network of minds that can potentially engage with a problem, and they reduce the unplanned collisions between ideas originating in different fields.
The initial promise of the internet was bright. It would connect us on a global level, and the ability to transfer information to the other side of the world with the click of a button would make this the most intelligent and innovative generation the world had ever seen. Only, that didn’t happen. We’re still sitting on that untapped well of potential. And while the world might have changed, the technology hasn’t. It’s all still possible, even more so than ever before. What a valuable thing it is to be able to think out loud on the scale of the world. And what a valuable thing that everyone can think along with you. What a terrible tragedy it would be to lose that. Let’s not squander it. Whatever is possible in this world doesn’t come from inventing the future. We simply need to reinvent the present. So let’s.
1. The reality is slightly more complex of course, but for the sake of this essay let’s not dive too much into that.
2. Supposedly, one quintillion is the width of the Milky Way galaxy in kilometres, which I guess is equally meaningless but there you go.
3. But then again, in those early days the internet was arguably just an online library, so it makes sense.
4. The indexing of the internet helped, but at some point a database is only useful if you know what you’re looking for and where to look for it. Also, people don’t actively search that much anymore. It seems that algorithmic feeding has become the norm for the majority. Fixing the problem requires fixing the algorithm.
5. Or not financially interesting.
6. It’s easiest to paint this with scientific discoveries, but it goes beyond that of course. The painting of the Sistine Chapel, The Beatles writing ‘A Day in the Life’, the opening riff of ‘Stairway to Heaven’. For those things to happen you need a combination of talent, skill, and circumstance, christened with some divine luck to tie it all together.
7. What I mean by this is work where a good part of the intended meaning got lost due to changes in time, language, culture, preconceived beliefs, etc.
8. Everyone is familiar with the “if you can’t explain it to a child, you don’t understand it yourself” quote, typically attributed to Einstein, although probably not something he ever said. It’s also a ridiculous statement.