It's commonly said that Sir Tim Berners-Lee was the man who 'invented the internet' at the turn of the 1990s. While this is only partially true, the reason he developed his brand new system for communication is the same one that is still pushing the boundaries today - ease of collaboration.
But back in a time when Google wasn’t a verb, Amazon was just a rainforest and Tinder was something you started a campfire with, the internet was being devised and built for very different reasons than those it is commonly used for today.
Internet communication has its origins far earlier in history than many might think. By the early 1960s, as Cold War fears of communications blackouts from Soviet nuclear attacks weighed heavily on the United States, scientists had already developed the concept of 'packet-switching', a method for transmitting electronic data from one computer to another over distance. This would enable far more resilient communication than vulnerable phone lines should there be an attack.
Somewhat appropriately, the first message ever sent from one computer to another was an error. On 29th October 1969, the Advanced Research Projects Agency Network (ARPAnet) attempted to send the message 'LOGIN' from a device at UCLA to another at Stanford (each was about the size of a small house). The network crashed, with the Stanford computer only receiving 'LOG'. Nonetheless, the idea had worked, and throughout the 1970s and 80s further developments were made, such as the Transmission Control Protocol and Internet Protocol, or TCP/IP, a communications model that set standards for how data could be transmitted between multiple networks.
While Sir Tim Berners-Lee didn't conceive of the internet itself, he can best be described as the visionary 'chef' who brought together the composite ingredients to create the earliest version of the 'web' that we know today: the World Wide Web, complete with HTTP, URLs and the whole concept of websites and hyperlinks. But why did he do it? The answer: to meet the demand, at his workplace CERN in Geneva, for automated information-sharing between scientists in universities and institutes around the world.
The internet is now used for everything from e-commerce and communication to gaming. It’s difficult to conceive of any business or industry that doesn’t rely heavily on it in some way. And just like the reason for which it was invented, it still has a huge application in scientific research.
We live in the data age as much as we live in the digital age. With data strategy increasingly important to organisations, research-driven institutions such as life sciences organisations need to up their game in the technology they use to communicate data to multiple people around the world. Enter cloud technology.
Cloud solutions are to the 21st century what the original internet was to the 20th. They are fast becoming the standard means of making data sharing and transfer more efficient. This is vital in something like drug research, which involves multiple organisations across a variety of geographies and can often be hindered by data being stored and forgotten about in one place while the same research is taking place elsewhere. This is of particular relevance given the coronavirus pandemic and the time-critical race for a vaccine.
Using traditional pen and paper methods of data capture, even when allied with spreadsheets, means manually transferring data between systems, which is time-consuming and can result in inaccurate data. Reports even estimate that employees can spend as much as 30 per cent of their workday searching for information. Inefficiencies and concerns around accuracy are rife and create significant business issues. With cloud technology, this data capture and sharing is automatic.
Rather than being limited to sending individual files to individual people, cloud platforms offer a way to bring together all data in a safe, secure and instantly accessible environment. Additionally, when linked to in-lab technologies, such as electronic laboratory notebooks (ELNs) and laboratory information management systems (LIMS), experimental data can be uploaded the moment it is captured, minimising human error.
This human error element is key. Research and data, along with the methods by which they are captured, must be meticulously documented from a compliance standpoint. They must also be easily and readily available for audits. Failure to meet these requirements can result in significant financial penalties. With a cloud solution, applications can be embedded with technology to standardise all data logging and automatically fulfil regulatory requirements in a secure way, with constant backups. The 'lost USB stick' could well be a thing of the past with data stored centrally and securely.
The original internet was fundamentally developed so scientists could communicate and access data from anywhere in the world. Cloud technology, a natural evolution of that system, is being used for this exact purpose decades later. The need for humans to collaborate and communicate in ever more complex and widespread ways is pushing the technological boundaries ever further and will have a lasting impact on our efficiency as a society.