I just wrapped up the toughest yet most rewarding semester of my degree (and yes, I absolutely slayed it). This semester pushed me to go over 72 hours without sleep, but every second of those sleepless nights was worth it for the concepts I explored and the things I learned. It also inspired this post.
In my courses, I can’t help but dive into the philosophical side of things, constantly asking thought-provoking questions (which, depending on their passion for teaching, makes my professors either love me or, well, not-so-love me). But with this post, I’m taking that curiosity to a whole new level: challenging how much of technology is actually inspired by us—humans.
I’ve always been fascinated by the human touch in technology, especially after reading an MIT article on the brain and machine learning models and watching a Nobel laureate’s talk on the same topic. Back then, I mostly thought about the neuroscience aspect—how human learning has shaped the fields of machine learning and artificial intelligence. However, taking classes like cybersecurity and computer networks this semester, I realized it goes far beyond that. Whether fundamental or advanced, almost every technological concept we use has roots in human behavior, biology, or natural processes.
One day, as my cybersecurity professor explained the origins of the "Trojan horse" in cyberattacks, I grabbed my phone and jotted down this blog idea in my notes app. In this post, I’ll dive deeper into some of the ideas I explored this semester while briefly touching on others.
Communication Protocols and the Internet
Let’s start with data communication protocols, which I explored in my Data Communication and Computer Networks course. A communication protocol is essentially a system of rules that enables two or more entities in a communication system to transmit information via variations of a physical quantity, such as an electrical or optical signal.
These protocols define the rules, syntax, semantics, synchronization of communication, and even methods for recovering from errors. They can be implemented through hardware, software, or a combination of both.
Interestingly, this concept originates from human social protocols. In a human context, "protocol" refers to a formal set of procedures or etiquette, often used in royal, diplomatic, or official settings. These procedures are designed to manage relationships and ensure smooth interactions. When we compare the two, both are essentially rulebooks for facilitating communication—one governs interactions between humans, while the other manages interactions between devices or network systems.
Specific network protocols also reflect human-inspired designs. For example, internet protocols like TCP/IP closely mimic structured communication norms used by humans. One standout example is TCP’s three-way handshake, which initiates a communication session: one device sends an initial signal and waits for an acknowledgment from the other before any data flows. It’s remarkably similar to how humans start a conversation—one person says “hello,” and the interaction only proceeds when the other responds.
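To make the analogy concrete, here’s a toy sketch of the three-way handshake simulated with plain function calls (no real sockets). The SYN/ACK flag names and sequence-number arithmetic follow TCP; everything else is illustrative.

```python
# A toy simulation of TCP's three-way handshake. Real TCP runs this
# over the network; here we just log the "conversation openers".

def handshake(client_isn: int, server_isn: int) -> list[str]:
    """Return the sequence of segments exchanged before data flows."""
    log = []
    # 1. Client says "hello": a SYN with its initial sequence number.
    log.append(f"client -> server: SYN seq={client_isn}")
    # 2. Server acknowledges and sends its own "hello" back.
    log.append(f"server -> client: SYN-ACK seq={server_isn} ack={client_isn + 1}")
    # 3. Client acknowledges; the conversation can now proceed.
    log.append(f"client -> server: ACK ack={server_isn + 1}")
    return log

for segment in handshake(client_isn=100, server_isn=300):
    print(segment)
```

Just like the “hello” exchange, neither side starts talking in earnest until both have confirmed the other is listening.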
Another fascinating parallel is error checking in data transmission. When devices detect and correct errors during communication, it’s akin to humans clarifying misunderstandings in a conversation to ensure they’re on the same page.
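The clarifying-a-misunderstanding parallel can be sketched with a simple checksum. Real TCP/IP uses a 16-bit one’s-complement checksum; this toy version just sums bytes modulo 256, but the idea—recompute, compare, and ask for a repeat if they disagree—is the same.

```python
# A minimal sketch of error detection in data transmission.
# Toy checksum: sum of all bytes mod 256 (real protocols use
# stronger checks like CRCs or one's-complement sums).

def checksum(data: bytes) -> int:
    return sum(data) % 256

def receive(data: bytes, expected: int) -> str:
    # The receiver recomputes the checksum, like a listener asking
    # "did I hear you right?" before continuing the conversation.
    if checksum(data) == expected:
        return "ACK"   # "got it, we're on the same page"
    return "NAK"       # "please repeat that"

frame = b"hello"
chk = checksum(frame)
print(receive(frame, chk))     # ACK: message arrived intact
print(receive(b"jello", chk))  # NAK: one corrupted byte is detected
```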
Artificial Neural Networks: Mimicking the Brain
Now let’s move on to my favorite concept, which I learned in my all-time favorite university course, Deep Learning, taught by my favorite computer science professor, Prof. Tamer Elsayed. In machine learning, a neural network (also known as an artificial neural network or ANN) is a model inspired by the structure and function of biological neural networks in the human brain. An ANN consists of interconnected units or nodes called artificial neurons, which are loosely modeled after the neurons in the brain.
These neurons are linked by edges that mimic synapses. Each artificial neuron receives signals (real numbers) from its connected neurons, processes them, and sends a signal to others—behavior strikingly similar to how signals are transmitted in the brain. Recent advancements have even focused on creating artificial neurons that more closely mimic biological ones, significantly improving model performance. This concept has inspired me to take a neuroscience course next semester—so QU med students will have an “irrelevant” engineering student among them <3
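A single artificial neuron is simple enough to sketch in a few lines. In this illustrative example, the weights play the role of synapses, the weighted sum models the cell body integrating incoming impulses, and a sigmoid activation loosely stands in for the neuron’s firing rate.

```python
import math

# One artificial neuron: weighted inputs in, one signal out.

def neuron(inputs: list[float], weights: list[float], bias: float) -> float:
    # Weighted sum of incoming signals, like a cell body integrating
    # impulses arriving along its dendrites.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes the result into (0, 1), a loose
    # analogue of a firing rate.
    return 1 / (1 + math.exp(-z))

# Signals arriving from three "connected neurons":
print(round(neuron([0.5, -1.0, 2.0], [0.4, 0.3, 0.1], bias=0.1), 4))  # ≈ 0.5498
```

Stack many of these units into layers, connect the layers, and you have the ANN architecture described above.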
Another fascinating type of neural network we explored in this course is Convolutional Neural Networks (CNNs). These architectures are designed to process and analyze visual data, like images, and are modeled after the way the human brain handles visual information.
How CNNs Mirror the Human Visual System
The human brain’s visual system processes images by breaking them down into essential features through specialized neurons in the visual cortex, which identify edges, shapes, patterns, and colors. Early layers focus on basic elements like edges and gradients, while deeper layers analyze abstract patterns such as object forms and facial recognition.
Similarly, Convolutional Neural Networks (CNNs) mimic this functionality with components like convolutional layers, which detect patterns such as edges, textures, and gradients, resembling the role of neurons in the visual cortex. Pooling layers simplify data while preserving key features, reflecting the brain’s efficiency in processing visual information. Finally, fully connected layers combine these extracted features to make decisions, such as identifying objects or faces.
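The convolution-then-pooling pipeline can be shown by hand, without a deep learning library. This sketch slides a small vertical-edge kernel over a tiny “image” (the kernel playing the part of an early visual-cortex neuron) and then max-pools the result, keeping only the strongest responses; all the numbers here are illustrative.

```python
# Hand-rolled convolution and max pooling, the mechanics behind a
# CNN's early layers (real CNNs learn their kernels during training).

def convolve(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            # Dot product of the kernel with one image patch.
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def max_pool(fmap, size=2):
    # Keep only the strongest response in each window, simplifying
    # the data while preserving the key feature.
    return [[max(fmap[i + a][j + b] for a in range(size) for b in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

# A 5x5 "image": dark left half, bright right half.
image = [[0, 0, 1, 1, 1]] * 5
# A vertical-edge detector: fires where brightness jumps left-to-right.
kernel = [[-1, 1], [-1, 1]]

print(max_pool(convolve(image, kernel)))  # the edge's location survives pooling
```

Even after pooling throws most of the numbers away, the strong response marking the edge remains—exactly the kind of efficient feature extraction the visual cortex is credited with.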
Natural Language Processing (NLP): Understanding and Generating Text
The second concept, which I also studied with Prof. Tamer in this course and an earlier one called Information Retrieval, focuses on Natural Language Processing (NLP). NLP models, such as BERT, are inspired by human linguistics—how we understand context, syntax, and semantics in conversations. BERT’s ability to encode and process text mirrors how humans make sense of complex language structures.
On the other hand, NLP decoder models like GPT replicate the human ability to generate text. These models create sentences that are not only grammatically correct but also meaningful, resembling the process by which humans learn to communicate effectively.
Cybersecurity Attacks: Borrowing Nature’s Tricks
My cybersecurity course was another favorite this semester, as it made me feel a bit more dangerous and invincible. The cybersecurity field is deeply interconnected with human behavior, the natural world, and the systems we create.
Malware
Let’s start with malware, particularly Trojan Horses, Worms, and Viruses. Trojan horses, inspired by the Greek myth, disguise themselves as legitimate software to trick users into downloading or executing them, giving attackers access to systems, data, or networks—just like the Greeks hid inside a wooden horse to infiltrate Troy.
Worms are self-replicating malware that spread across networks without human interaction or a host file, exploiting vulnerabilities in software or operating systems. Much like biological worms consume resources and damage their environment, digital worms wreak havoc across networks.
Meanwhile, viruses require a host file to spread, activating when an infected program or file is executed. They replicate by attaching themselves to other programs or files, much like biological viruses infect living cells to multiply.
Blockchain and Consensus
Blockchain is a decentralized system for maintaining a record of transactions across a peer-to-peer network, especially in cryptocurrency. In a blockchain, consensus ensures that all participants (nodes) agree on the validity of a transaction or block before adding it to the chain. This is strikingly similar to how communities or organizations make decisions—whether through voting in elections or reaching unanimous agreements in juries.
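The tamper-evidence that makes this shared agreement possible can be sketched in a few lines. In this toy example, each block stores the hash of its predecessor, so any node can independently re-verify the whole record—the way a jury re-examines the evidence before agreeing on a verdict. Real consensus protocols (proof of work, voting-style agreement) add much more on top; this shows only the chaining.

```python
import hashlib

# A toy blockchain: each block's hash covers the previous block's
# hash, so rewriting history anywhere breaks every later link.

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(transactions: list[str]) -> list[dict]:
    chain, prev = [], "0" * 64  # genesis predecessor
    for tx in transactions:
        h = block_hash(prev, tx)
        chain.append({"data": tx, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain: list[dict]) -> bool:
    # Any node can recompute every link and check the record.
    return all(b["hash"] == block_hash(b["prev"], b["data"]) for b in chain)

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
print(is_valid(chain))                   # True: all nodes would agree
chain[0]["data"] = "alice pays bob 500"  # tamper with history...
print(is_valid(chain))                   # False: the chain exposes it
```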
The Immune System in Cybersecurity
Cybersecurity systems borrow heavily from the human immune system in how they protect networks and devices from malicious attacks. Firewalls act as the first line of defense, much like the skin or physical barriers in the body. Antivirus software scans for and removes malicious software, resembling white blood cells neutralizing pathogens. Intrusion Detection Systems (IDS) monitor for suspicious activity, similar to how antibodies identify threats. Finally, machine learning models learn from past attacks to predict and defend against future ones, mirroring the immune system's memory function.
Other Bio-Inspired Technologies
Evolutionary Algorithms
Evolutionary algorithms mimic natural genetic evolution and the adaptive behavior of living organisms. These algorithms are based on Darwin’s principle that, in a population competing for limited resources, only the fittest individuals survive. This concept is applied to optimization problems where an objective function needs to be maximized or minimized. For example, in soft sensor design, evolutionary algorithms are commonly used for training models and hyperparameter tuning.
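The survival-of-the-fittest loop is easy to sketch. In this minimal genetic-algorithm example, the objective is simply maximizing f(x) = −(x − 3)², whose optimum is x = 3; the population size, mutation scale, and fitness function are all illustrative stand-ins for a real tuning problem.

```python
import random

# A minimal genetic algorithm: select the fittest, recombine, mutate.

def fitness(x: float) -> float:
    return -(x - 3) ** 2  # toy objective; peak at x = 3

def evolve(generations: int = 60, pop_size: int = 30) -> float:
    random.seed(42)  # reproducible run for this sketch
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: only the fittest half survives to reproduce.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        # Crossover (averaging) plus small random mutation.
        children = [(random.choice(parents) + random.choice(parents)) / 2
                    + random.gauss(0, 0.1)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

print(round(evolve(), 2))  # converges close to the true optimum, 3.0
```

Swap in a model’s validation score as the fitness function and the same loop becomes the hyperparameter tuner described above.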
Swarm Intelligence
Swarm intelligence is inspired by the collective behavior of decentralized and self-organized systems like ant colonies, bee hives, bird flocks, and fish schools. While individual members of these groups are simple and lack sophisticated decision-making, their collective behavior allows them to accomplish complex tasks, such as finding the shortest route to food, allocating tasks efficiently, and defending their colonies. This phenomenon is mirrored in metaheuristic optimization techniques, where simple agents interact with each other and their environment to solve complex problems. For example, algorithms inspired by ant colonies replicate how ants find the shortest path between their nest and a food source to search for optimal solutions in design domains. Additional strategies are used to avoid getting trapped in confined areas of the search space. These swarm-based techniques have proven to be robust, particularly in engineering applications like traffic optimization and design problems.
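One well-known swarm method, particle swarm optimization (PSO), can be sketched briefly: each simple agent (“particle”) remembers its own best position and is also drawn toward the swarm’s best, the way a flock converges on a food source. The target function and coefficients below are illustrative choices, not a tuned implementation.

```python
import random

# Bare-bones particle swarm optimization in one dimension.

def pso(f, lo=-10.0, hi=10.0, n=20, steps=80):
    random.seed(0)  # reproducible run for this sketch
    pos = [random.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]                # each particle's personal best
    gbest = min(pos, key=f)       # the swarm's best so far
    for _ in range(steps):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            # Inertia + pull toward personal best + pull toward swarm best.
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i]
    return gbest

# Minimize (x - 2)^2: the swarm should settle near x = 2.
print(round(pso(lambda x: (x - 2) ** 2), 2))
```

No single particle knows where the optimum is; the answer emerges from their interaction, which is exactly the point of swarm intelligence.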
Many technological advancements are inspired by humans, their bodies, their environment, and the creatures around them. This reminds me that inventions rarely come from nowhere. The ability to critically examine everyday activities, behaviors, and knowledge—and then connect and apply them to entirely different fields—is one of the most fascinating aspects of human creativity. Reflecting on even the smallest details of life, instead of merely consuming information and going through the motions, is a skill every person should invest more time in developing.
Resources:
Lectures
Deep Learning
Data Communication and Computer Networks I (DCCN-I)
Cybersecurity Fundamentals
Books
Cisco CCNA 1: Introduction to Networks (Version 5)
Understanding Deep Learning by Simon J.D. Prince
Computer Security: Principles and Practice (4th Edition) by William Stallings & Lawrie Brown
Articles
Online References
Scientific Papers
Analysis of Swarm Intelligence-Based Algorithms for Constrained Optimization
The Role of Artificial Intelligence-Driven Soft Sensors in Advanced Sustainable Process Industries: A Critical Review
KM, till next time <3