Data is everywhere, and whatever we do we contribute to an information explosion. Phone calls, emails, tweets, Skype chats, internet searches, sat-nav enabled journeys: pretty much all activity in the modern world relentlessly adds to the huge amount of content being generated, transmitted or stored every second of every day. Even while we are asleep, networks are pinging our phones and electronically chatting to them, so this digital traffic never stops.
It has been estimated that in all of human history up to the year 2000, the equivalent of five exabytes of data was generated - this represents the whole of human knowledge, albeit expressed as an unimaginably large number. In comparison, we now generate more than five exabytes every day, and the rate at which we do so is continually rising.
Humankind is drowning in information, and it is debatable whether the majority of it will be of any lasting value. The vast amount of data being created masks within it smaller amounts of knowledge which really are valuable to humankind, but the problems come when one asks how we identify that important information and then, more crucially, how we capture it so that future generations can access it.
Selecting and archiving the right information in the right way is now becoming a major issue, but instead of new technologies being part of the solution, the sheer pace of technological change is actually becoming part of the problem. One of Google’s senior executives, Vint Cerf, recently spoke about this situation and warned of what he called ‘bit rot’: the situation where we are unable to recover or read digital information in the future, leading to a ‘forgotten generation or even a forgotten century’. The crux of the argument is this: although we are now very efficiently digitising much of human knowledge, what happens in the future when new hardware or software is incompatible with the way in which we have stored that information? The actual digital bits may well survive, but we might not be able to access them.
Ancient civilisations etched their languages into stone tablets or vellum sheets, and those writings have survived for thousands of years because the only reading tool needed is the human eye; no special interface, no computer, no program required. Nowadays, it is common for large companies, government agencies and many public bodies like hospitals to have customised in-house software to keep records. These bespoke systems (often costing millions of dollars) are amongst the most likely to fall foul of the ‘bit rot’ problem: by their very nature, such specialist software packages are not widely available, and so are unlikely to be future-proof.
Many people now advise taking the old-fashioned option for really important things: printing those irreplaceable digital family photographs and keeping back-up paper copies of key e-records. Even in high-tech micro technology facilities, there is a good reason why traditional lab books are still used. After all, short of being burnt in a fire, paper records are likely to last far longer than the latest PCs or tablets. There is indeed much to be said for simply writing things down!