The Evolution of Cybersecurity: Part 1
Written by: Lindsay McKay
The first hackers were not criminals and had no malicious intent. They were technology enthusiasts whose only goals were to explore, optimize, and tinker. This type of hacker flourished in the 60s and 70s; it was not until the 1980s, when turnkey personal computers became widely available, that a new type of hacker emerged. These new hackers were concerned with personal gain. Instead of using their technological know-how to improve computers, they used it for criminal activities, including pirating software, creating malicious viruses, and breaking into systems to steal sensitive information. But wait: the first computer worm was developed in 1971, before the internet even existed, so how did that work? Let’s look through the history and evolution of the internet and cybersecurity, from the ARPANET to the commercialization of anti-virus software and the moment the cybersecurity industry began to be recognized in its own right.
The ARPANET was the precursor to the internet. Computers first communicated with one another in 1961 in an MIT lab, using packet-switching technology. Years later, in 1969, the Pentagon’s Advanced Research Projects Agency completed the development of ARPANET, interconnecting four university computers, with nodes installed at the University of California, Los Angeles, the Stanford Research Institute, the University of California, Santa Barbara, and the University of Utah. The first message was sent on October 29, 1969, from UCLA to the Stanford Research Institute; Stanford’s computer crashed before the message was completed.
In 1971, a developer working on the ARPANET created the first computer worm, known as ‘Creeper’, a program that moved from one computer to another. To him it was a fun experiment, with the message reading: I’m the creeper, catch me if you can. To combat this intrusion, a colleague created the first cybersecurity program, called ‘Reaper’, which scoured the ARPANET to find and delete the worm. Creeper was the first self-replicating program, and Reaper was the very first example of antivirus software. This cheeky battle between two coworkers became a moment of cultural significance, exposing the vulnerabilities of interconnected computers.
From Cheeky to Malicious
Before the 1990s, only academics, librarians, engineers, the military, and governments had access to the ARPANET. In 1991 the World Wide Web went public, thanks to British scientist Tim Berners-Lee, but prior to this, in 1988, a computer worm known as the ‘Morris Worm’ infected over 6,000 computers, roughly 10% of those connected to the ARPANET. Again, this worm had no malicious intent; its creator, a graduate student who released it from MIT, claimed the damage it caused was unintended. The response to this worm was the creation of the Computer Emergency Response Team (CERT), whose role was to coordinate information and responses to computer vulnerabilities and security. CERTs would become the first big players in the cybersecurity industry. While they were able to fight and respond to viruses as they emerged, they worked purely reactively and could not prevent outbreaks.
Unfortunately, the ‘Morris Worm’ paved the way for the creation of truly malicious programs, which exploded with the launch of the World Wide Web and throughout the 90s. Email became the main target of viruses, with ‘Melissa’ and ‘ILOVEYOU’ infecting tens of millions of computers and causing worldwide failures of email systems. During this time, the anti-virus industry boomed, with vendors such as McAfee, Norton, and Kaspersky shipping products that detected threats by scanning all the files on a system and comparing them to a database containing “signatures” of known malware.
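The signature-matching idea behind those early scanners can be sketched in a few lines of Python. This is an illustrative toy, not how any commercial product actually works: the database layout, function names, and the use of whole-file SHA-256 digests as “signatures” are assumptions made for the example (real scanners match byte patterns and heuristics, not just complete-file hashes).

```python
import hashlib
from pathlib import Path


def file_signature(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan(directory: Path, signature_db: dict[str, str]) -> list[tuple[Path, str]]:
    """Walk every file under `directory` and flag those whose digest
    appears in `signature_db`, a map of digest -> malware name."""
    hits = []
    for path in sorted(directory.rglob("*")):
        if path.is_file():
            malware_name = signature_db.get(file_signature(path))
            if malware_name is not None:
                hits.append((path, malware_name))
    return hits
```

A 1990s scanner worked on the same principle at far greater scale, with signature databases shipped as regular updates. The approach’s built-in weakness, which attackers soon learned to exploit, is that any file whose signature is not yet in the database passes through undetected.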
Cybersecurity Gets Recognized
People were growing tired of being purely reactive, and the need for regulation and for accessible cybersecurity education and resources for professionals was becoming apparent. During this time, many non-profits and certification associations established themselves, one of them being the Computing Technology Industry Association (CompTIA). Today, CompTIA is one of the most highly respected and reputable associations, offering beginner-to-expert certification exams and partnering with institutions worldwide to provide cybersecurity courses. In 1993, the first entry-level IT certification, the CompTIA A+, was launched. In 1999, the CompTIA Network+ certification exam followed, aimed at those specializing in network technologies. Around the year 2000, there was a need for an entry-to-intermediate-level certification for professionals pursuing a career in information security, so CompTIA launched the CompTIA Security+ certification in 2002 to address it. After what was then the largest data breach in history, the attack on Yahoo, the outlook for information security was bleak. To tackle this gloomy time, CompTIA launched its first cybersecurity analyst certification, the CompTIA CySA+. Through the exam prep course, professionals learn how to apply behavioural analytics to prevent and combat cyberattacks, supporting experts in the role of a threat hunter.
Check out The Evolution of Cybersecurity: Part 2 to learn about different cybersecurity technologies, the rise of connected devices, cyberattacks on automobiles and more!
Are you prepared for an IT disaster? Check out How to handle an IT disaster without losing your cool on Innovation Networks.