Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002

How the Wayback Machine is trying to solve the web’s growing linkrot problem

The Verge: “We’ve been talking a lot about the future of the web on Decoder and across The Verge lately, and one big problem keeps coming up: huge chunks of the web keep going offline. In a lot of meaningful ways, large portions of the web are dying. Servers go offline, software upgrades break links and pages, and companies go out of business — the web isn’t static, and that means sometimes parts of it simply vanish. It’s not just the ‘really old’ internet from the ’90s or early 2000s that’s at risk. A recent study from the Pew Research Center found that 38 percent of all links from 2013 are no longer accessible. That’s more than a third of the collected media, knowledge, and online culture from just a decade ago — gone. Pew calls it ‘digital decay,’ but for decades, many of us have simply called it linkrot. Lately, that means a bunch of really meaningful work is gone as well, as various news outlets have failed to make it through the platform era. The list is virtually endless: sites like MTV News, Gawker (twice in less than a decade), Protocol, The Messenger, and, most recently, Game Informer are all gone. Some of those were short-lived, but some outlets that were live for decades had their entire archives vanish in a snap.

But it’s not all grim. For nearly as long as we’ve had a consumer internet, we’ve had the Internet Archive, a massive mission to identify and back up our online world into a vast digital library. It was founded in 1996, and in 2001, it launched the Wayback Machine, an interface that lets anyone call up snapshots of sites and look at how they used to be and what they used to say at a given moment in time. It’s a huge and incredibly complicated project, and it’s our best defense against linkrot.”
