The last two decades have completely transformed the way we know. Thanks to the rise of the Internet, information is far more accessible than ever before. It’s more connected to other pieces of information and more open to debate. Organizations — and even governmental projects like Data.gov — are putting more previously inaccessible data on the Web than people in the pre-Internet age could possibly have imagined. But this change raises another, more ominous question: Is this deluge overwhelming our brains?
In his new book, “Too Big to Know,” David Weinberger, a senior researcher at Harvard’s Berkman Center for Internet and Society, attempts to answer that question by looking at the ways our newly interconnected society is transforming the media, science and our everyday lives. In an accessible yet profound work, he explains that in our new universe, facts have been replaced by “networked facts” that exist largely in the context of a digital network. As a result, Weinberger believes we have entered a new golden age, one in which technology has finally caught up with humans’ endless curiosity, and one that has the potential to revolutionize a wide swath of occupations and research fields.
Salon spoke to Weinberger over the phone about the rise of the information cloud, the demise of expert knowledge, and why this is the greatest time in human history.
In the book you mention the “smartest guy in the room” metaphor. According to your book, that’s an outdated metaphor. Now it’s the room itself that’s smart.
In the West we’ve pegged knowledge to what fits in books or gets written down. That’s been our medium for preserving and communicating knowledge. Because books are written by individuals, it has often made knowledge seem like the product of individuals, even though everybody has always understood that individuals are working within the social network. With the new medium of knowledge — the Internet — knowledge not only takes on properties of that medium but also lives at the level of the network. So rather than simply trying to cultivate smart people, we also need to be looking above the level of the individual to the network in which he or she is embedded to see where knowledge lives.
Historically, we have thought of knowledge as something that experts have on a given topic. We call on experts to explain things to us, on TV or in the news. How is this changing?
In the West, knowledge begins as a winnowing process. It goes back to ancient Greece, where the rich, free menfolk were debating politics and steering the state. Many opinions were expressed, but only some of them were true, so knowledge came to be defined as the winnowing of those opinions down to the rare gems of truth. That idea — that knowledge is what makes it through a winnowing process — not by coincidence fits perfectly with the paper medium we have used for it. Paper is expensive, libraries are small, and very few people can get published. So we've thought of knowledge as that which makes it through a very small aperture.
Some people have described this process as a pyramid.
In 1988, Russell Ackoff, an organizational theorist, proposed a pyramid that has become standard in many business environments. You have data at the bottom, then information, and then knowledge — and at the top, wisdom, as if wisdom were the reduced set of knowledge. The idea is in line with our traditional notion of knowledge, which assumes there's too much to know, more than can fit into any skull, so we need to come up with strategies to deal with it. That pyramid is the information age's elaboration of this: at every step you gain quality and value by reducing what was at the lower steps. We've had this reductive sense of knowledge for about 2,500 years.
In the last few years, there’s been concern that the Internet is giving us far more information than our brain can handle. But that’s not a new concern…
Yes. Clay Shirky very pithily said a few months ago, “There’s no such thing as information overload — only filter failure.” I think Clay was partly trying to provide some historical continuity so that we shouldn’t feel like we need to freak out — we have faced this same issue over and over again, and we need to fix our filters. But filters work differently in the digital age. Physical filters work by removing whatever doesn’t make it through, whether it’s the manuscripts sent to publishers or journals that never see the light of day, or books that don’t make it into the library, or coffee grounds. Digital filters don’t remove anything; they only reduce the number of clicks it takes to get to something.
So by “filter” you’re referring to things like, for example, a blog post about the “top 10 magazine articles of 2011.”
Yes, exactly. In the digital age we filter forward instead of filtering out. As a result, all that material is still available to us and to others to filter in their own ways, and to bring forward in other contexts. That is a very significant difference. You may filter those 10 articles, but all the other ones will still show up in a search, or tomorrow you may get them in an email from a friend or Google+ recommending that particular link. Nothing is removed. It’s all there and available. That is a huge difference.
You talk about the rise of the networked fact. What is that?
Over the past couple hundred years, we’ve had this idea that knowledge is composed of facts about the world, and together we are engaged in this multigenerational enterprise of gathering facts and posting them, and ultimately we’ll have a complete picture of the world. That view of facts as the irreducible atoms of knowledge has some benefit, but we’re seeing a different type of fact emerge on the Net as well. Traditional facts are still there. Facts are facts. But we’re seeing organizations of all sorts releasing their data, their facts, onto the Web as huge clouds of triples [another word for linked data]. They’re a connection of two ideas through some relationship — that’s why they’re called triples — but not only can they be linked together by computers, they themselves consist of links. Each of the elements of a linked atom is a pointer to some resource that disambiguates it and explains what it is.
Let’s think of an example of a fact.
OK. I’m in Edmonton, Canada, right now, so let’s say: “Edmonton is in Canada.”
OK, so, if the triple is “Edmonton is in Canada,” ideally each of those should link to some other spot on the Web that explains exactly which Edmonton, because there’s probably more than one, along with which Canada (though there’s probably only one). And “is in” is a very ambiguous statement, so you would point to some vocabulary that defines it for geography. Each of these little facts is designed not only to be linked up by computers, but it itself consists of links. It’s a very different idea than that facts are bricks that lay a firm foundation. The old metaphor for knowledge was architectural and archaeological: foundations, bricks. Now we have clouds.
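The “Edmonton is in Canada” example can be sketched in a few lines of Python. This is an illustrative sketch, not a real linked-data library: the `Triple` class and the specific DBpedia URIs are my own choices, picked only to show the key point — that each of the three elements of a networked fact is itself a link that disambiguates the term, rather than a bare string.

```python
from typing import NamedTuple

class Triple(NamedTuple):
    """One 'networked fact': every element is a URI, not a bare word."""
    subject: str    # which Edmonton? a link that disambiguates it
    predicate: str  # what "is in" means, defined in a shared vocabulary
    obj: str        # which Canada?

# "Edmonton is in Canada" expressed as a triple (URIs are illustrative)
fact = Triple(
    subject="http://dbpedia.org/resource/Edmonton",
    predicate="http://dbpedia.org/ontology/country",
    obj="http://dbpedia.org/resource/Canada",
)

# A machine can follow any of these links to learn more, which is how
# clouds of triples get stitched together into larger structures.
for element in fact:
    print(element)
```

Because the subject and object are shared URIs, a second triple about Edmonton (say, its population) published by a different organization can be joined to this one automatically — no brick-by-brick foundation required.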
One of the advantages of networked thinking is that we can harness diverse opinions in a way that was impossible before. Why is this important?
We’ve known for a long time, and I think culturally we’ve accepted, that diversity is an important thing in the work of knowledge. Doris Kearns Goodwin’s “Team of Rivals” takes Lincoln’s Cabinet as an example. Knowledge was traditionally about driving out difference and settling things, and not coincidentally the medium of knowledge was amenable to that. If you print a book, you can’t really change it. Books settle things, and so does knowledge. We don’t say we know something if there is significant debate about it. We have viewed diversity as a path to settling things and believe we get the best results if we have a diverse set of people.
But in the networked world that’s shifting a bit. It’s not simply that diversity is a good means to the ends of knowledge; knowledge consists of a network of people and ideas that are not totally in sync, that are diverse, that disagree. Books generally have value because they encapsulate some topic and provide you with everything you need to know — when you’re reading one, you cannot easily leap out of the book to get to the next book. The Web only has value because it contains difference. When I link to you, I’ll say why I am linking to you and I’ll explain the difference between the site you’re currently reading and the one I’m pointing you at. We are beginning to think of knowledge itself as having value insofar as it contains difference.
Some writers won’t hyperlink their pieces on principle, because they think it ruins the reading experience.
There are lots of ways to write, and people should write the way they want to. But if an author posts something that has no links out, it nevertheless becomes a part of the Web — other people will link to it and say what they think about it. There’s no escaping that now. But I should acknowledge I wrote a paper book. There are no hyperlinks in the paper book. There are URLs, but you can’t click on them, so apparently I do think there is still some value in writing in non-linked media.
But parts of your book are probably going to make their way online.
And I blogged about many parts of it and discussed it in my course, so it is a continuous process. Nevertheless, especially for people of my generation, books still count.
In the book you explain that one of the advantages of the Web is that it is cumulative, by which you mean that online facts will never go away. But if someone posts something that says “Thomas Rogers shot someone’s dog,” the Internet will never let me live that down, despite the fact that it’s untrue. Isn’t that a reason to be worried?
All of these blessings are highly mixed, but the good part about the Internet being cumulative is that we don’t have to take special steps to preserve that which is of value, even material that is tiny or insignificant, or that only has value when aggregated with tons of other pieces that in themselves have no intrinsic worth. You used to have to go to great lengths to preserve that stuff and make it public. The Net lowers the hurdle for what we record. The good side is that we’re able to go through it all and discover what escaped us. The bad part is that stuff that once would have escaped malevolent eyes no longer does.
One of the characteristics of the Web seems to be the flattening of news. If you look at a site like Buzzfeed, it has reports about the death of Kim Jong Il right next to viral videos about cats. It’s jarring — and seems a little amoral.
You are pointing to the benefits of having a very small aperture for news. That aperture was controlled by full-time professional editors; now what comes through the news hole is anything anybody is interested enough in to post. On the other hand, when you have so few apertures for news, controlled by such a similar set of people, you get a certain limited set of stories. We at least now have the opportunity to create filters that let in more than the traditional room of middle-aged white men did. If we’re not reading the stuff that matters, it’s our fault.
Sites like Buzzfeed or Reddit that aggregate based on user suggestions reflect what users are interested in. It turns out that what we’re interested in is a pretty broad range of stuff, and most of it isn’t all that serious. A lot of it is just funny, or in some cases distasteful, but that is what we’re interested in. The challenge is trying to educate our interests, but that’s what education is about.
In the book you talk quite a lot about Darwin as an example of an old-school information gatherer. What would a modern-day Darwin look like?
I use Darwin throughout the book because there’s no arguing with him as a deep and wonderful thinker. Darwin is also a peculiarly good example because he was so reluctant to write; he remained relatively private about his theory for decades and published only when somebody else, Wallace, was on the verge of beating him to it. So here we have Darwin working alone and talking with a refined social network (he had colleagues who were very important and helpful to him), but working slowly, spending seven years working to determine a single fact: whether barnacles were crustaceans or mollusks. The first five chapters of “The Origin of Species” explain his theory, and the next six chapters deal with objections that either had been raised or he imagines would be raised.
Darwin today would not be operating this way. He would very likely be tweeting from the Beagle. He would be announcing his findings and initial ideas online, and people would be arguing with him all along, taking his ideas, applying them elsewhere, pushing back, criticizing him deeply — all of the things we do on the Web. That work has revolutionary, incredible value, but put onto the Web, it gets teased out, amplified, and corrected, as well as misunderstood and degraded. Nevertheless, that Web itself has more value than the individual content, so I would expect that Darwin today would be gathering his data from clouds of linked data, trying out ideas on the Web, and refining those ideas in the tussle. The old rhythm of knowledge and science, not coincidentally, is the rhythm of publishing. The Web has completely broken that rhythm.
Do you think all of these changes are good or bad?
It’s both good and bad. It’s both impossible and unhelpful to ask if it’s making us smarter or stupider. But I am actually very hopeful. Ask anybody who is in any of the traditional knowledge fields, and she or he will very likely tell you that the Internet has made them smarter. They couldn’t do their work without it; they’re doing it better than ever before, they know more; they can find more; they can run down dead ends faster than ever before. In the sciences and humanities, it’s hard to find somebody who claims the Internet is making him or her stupid, even among those who claim the Internet is making us stupid. And I believe this is the greatest time in human history.
That’s a very lofty statement.
I’m pretty confident about that one, actually. Curiosity can lead you in lots of bad directions. It can steer you wrong and waste your time, but it is fundamental. We need it more than anything else if we’re going to try to understand our world. Now we have a medium that is as broad as our curiosity.
Thomas Rogers is Salon's former Arts Editor. He has written for the Globe & Mail, the Village Voice and other publications. He can be reached at @thomasmaxrogers.