Shocking tech facts

Since the dawn of the internet, nothing has changed faster than technology. In just a few decades, we've witnessed a rapid transformation in how we live, communicate, work, and even think.

You only have to look at the recent rise of AI to understand this. Before the launch of ChatGPT in 2022, many outside the tech space considered AI something that belonged to the realm of science fiction rather than a tool of today.

Now everything is ‘powered by AI’ – and it seems like every tech company and its dog is investing in new AI initiatives and developing its own large language models (LLMs) and machine learning (ML)-powered technologies.

It’s not the first time a technological shift has revolutionized the industry, either. Before everyone was talking about AI, they were talking about the Metaverse, and before that it was cloud computing, smartphones, the internet, and so on.

With technology moving so quickly, some of the advances and innovations in the industry can seem unbelievable. But you’d be surprised at some of the bizarre stories that have led us to the technologies we use today. 

Unbelievable tech facts


From mind-boggling inventions to internet oddities, the tech world is full of surprising facts that will make you laugh, scratch your head, and perhaps even question whether they're real.

In this top 10, we're counting down surprising technology facts that sound fake but are actually real. These tech facts will leave you shocked!

The first computer mouse was made of wood

The first computer mouse, invented by computer scientist Doug Engelbart in 1964, looked nothing like the sleek, ergonomic mice we use today. Engelbart's original mouse was a simple rectangular block of wood with a single button on top and two metal wheels on the bottom for tracking movement across a surface.

The wooden prototype was part of a broader vision to make computers more interactive and user-friendly. Engelbart’s invention revolutionized how humans interacted with machines, shifting from complex, text-based commands to a more intuitive point-and-click interface. Though it took decades for the mouse to become a standard part of personal computers, Engelbart's wooden prototype laid the foundation for the modern graphical user interface, making computing accessible to millions.

A single Google search uses more computing power than the entire Apollo program

During the Apollo moon landings, NASA relied on hardware like the Apollo Guidance Computer (AGC), a machine roughly the size of a briefcase with about 4KB of RAM, 72KB of read-only memory, and a clock speed of around 2 MHz (laughably modest by today’s standards). In contrast, when you perform a simple Google search today, it triggers a vast network of data centers filled with thousands of servers that process the query in milliseconds using complex algorithms. That requires far more computational power than NASA had at its disposal during the entire Apollo program.

The amount of processing, memory, and energy used by Google’s infrastructure for just one search outstrips what NASA used to put astronauts on the moon. This incredible leap in computing power demonstrates how far technology has come, making what once seemed impossible now achievable with the push of a button.
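To put that gap in perspective, here's a quick back-of-envelope comparison in Python. The AGC figures are the commonly cited specs, while the smartphone numbers are rough assumptions chosen purely for illustration:

```python
# Back-of-envelope: Apollo Guidance Computer vs. a modern smartphone.
# AGC figures are the commonly cited specs; the phone figures are
# illustrative assumptions, not benchmarks.

agc_ops_per_sec = 85_000        # AGC: ~85,000 instructions per second
agc_ram_bytes = 4 * 1024        # ~4KB of RAM (2,048 16-bit words)

phone_ops_per_sec = 1e12        # assumed: ~10^12 operations per second
phone_ram_bytes = 8 * 1024**3   # assumed: 8GB of RAM

print(f"Speed:  ~{phone_ops_per_sec / agc_ops_per_sec:,.0f}x faster")
print(f"Memory: ~{phone_ram_bytes / agc_ram_bytes:,.0f}x larger")
# Speed:  ~11,764,706x faster
# Memory: ~2,097,152x larger
```

Even under these conservative assumptions, the phone in your pocket outclasses the AGC by six to seven orders of magnitude, and a Google data center multiplies that across thousands of servers.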

The first computer bug was an actual bug

The term “computer bug” traces back to an incident in 1947, when a team of engineers including the legendary computer scientist Grace Hopper encountered an unexpected problem while working on the Harvard Mark II, an early electromechanical computer. The machine, which was the size of a room and filled with thousands of relays and switches, was malfunctioning. Upon investigation, the team found the culprit: a moth had flown into one of the relays, physically jamming the machine and causing a failure. The engineers carefully removed the moth and, in a humorous moment, taped it into the project’s logbook with the entry, "First actual case of bug being found."

This event wasn’t the first time the term “bug” was used to describe technical glitches. The term had been used in engineering circles for years to refer to mechanical or electrical issues, dating back to the 19th century. In fact, Thomas Edison used "bug" to describe flaws in his inventions. However, this incident marked the first recorded instance of a “bug” in the context of computing, forever linking the term with software and hardware errors. Today, debugging—the process of identifying and fixing errors in a computer program—remains one of the most important aspects of software development.

A hacker once used a refrigerator to send spam

In 2014, cybersecurity firm Proofpoint discovered that a botnet—a network of compromised devices—had hijacked various smart appliances, including a refrigerator, to send out over 750,000 phishing emails. These devices, which were connected to the internet but lacked adequate security, became easy targets for hackers.

The incident revealed a significant weakness in the emerging world of smart technology, where everyday appliances like refrigerators, TVs, and thermostats are connected to the web but often have minimal security protections. Other connected devices have since been recruited into botnets too, from routers and security cameras to smartwatches. As the Internet of Things (IoT) continues to expand and more devices gain connectivity, the potential for such cyber exploits has grown, highlighting the need for stronger security measures to protect even the most unassuming devices from being hacked.

The first digital camera was the size of a toaster

The first digital camera, invented by Kodak engineer Steve Sasson in 1975, was a groundbreaking piece of technology that looked more like a toaster than the miniature devices we carry today. This early prototype weighed about 8 pounds and was roughly the size of a bread toaster, making it cumbersome by modern standards. It featured a rudimentary design, with a 0.01-megapixel sensor that captured black-and-white images. The camera recorded images onto a cassette tape, and it took around 23 seconds to capture a single image—a far cry from the instant gratification of today’s digital photography.

Sasson’s invention represented a major shift in how we think about photography, paving the way for the transition from film to digital media. However, despite its revolutionary nature, Kodak's leadership was initially hesitant to embrace digital technology, fearing it would cannibalize their lucrative film business. As a result, the digital camera languished in obscurity for years, even as the technology rapidly advanced in the following decades.

The internet weighs as much as a strawberry

Computer scientist John D. Kubiatowicz of the University of California, Berkeley, estimated that the electrons in motion required to power and store the vast amounts of data on the internet collectively add up to an incredibly tiny mass – around 50 grams, roughly the same as a large strawberry.

The reasoning is that while the individual electrons responsible for storing and transferring data are nearly weightless, multiply them by the enormous quantity of data involved (around 40 zettabytes by recent estimates) and their combined mass becomes measurable, albeit barely. While the internet feels weightless in our digital interactions, this quirky fact gives a tangible dimension to something we typically consider intangible.
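For the curious, here's a rough sketch in Python of how such an estimate can be assembled. The electrons-per-bit figure is an illustrative assumption rather than Kubiatowicz's actual method, so treat the output as an order-of-magnitude exercise:

```python
# Order-of-magnitude sketch: weighing the internet's electrons.
# electrons_per_bit is an illustrative assumption; real hardware varies
# enormously, so the result is a rough magnitude, not a measurement.

ELECTRON_MASS_KG = 9.109e-31   # rest mass of a single electron
internet_bytes = 40e21         # ~40 zettabytes of data
electrons_per_bit = 40_000     # assumed charge used to represent one bit

total_bits = internet_bytes * 8
mass_grams = total_bits * electrons_per_bit * ELECTRON_MASS_KG * 1000

print(f"Estimated mass: ~{mass_grams:.0f} grams")
# Estimated mass: ~12 grams (the same order of magnitude as a strawberry)
```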

There are more mobile phones than people on Earth

The proliferation of mobile devices has been nothing short of extraordinary. According to the GSMA’s real-time tracker, there are currently around 8.5 billion mobile connections worldwide, while the global population stands at around 8 billion. This is because many people own multiple devices: smartphones, tablets, and wearable gadgets. The rise of IoT has contributed as well, with many non-phone devices (like smartwatches or smart meters) having their own cellular connections.

In some developed nations, it's common for individuals to have multiple mobile numbers or devices for personal, work, or travel purposes. This widespread connectivity reflects how essential mobile technology has become for communication, work, and entertainment across the globe, from the most remote villages to densely populated cities.

NASA’s internet speed is 13,000 times faster than yours

While typical residential broadband speeds hover around 100 Mbps, NASA has achieved transfer speeds of up to 91 Gbps in tests over the US Department of Energy’s Energy Sciences Network (ESnet), a high-performance research network. The oft-quoted ‘13,000 times faster’ figure dates from a 2013 demonstration that compared that 91 Gbps with average home connection speeds of the time (under 10 Mbps); even against today’s 100 Mbps broadband, it is still roughly 900 times faster. This incredible bandwidth is essential for handling the enormous volumes of data generated by space missions, satellites, and scientific experiments.

While most home users deal with speeds measured in megabits per second (Mbps), NASA’s multi-gigabit links are needed to move the vast amounts of data its missions produce. At 91 Gbps, a high-definition movie downloads in well under a second. Data from the Hubble Space Telescope, the Mars rovers, and other missions (high-resolution images, telemetry, and scientific measurements) must be transmitted back to Earth and then shuttled between research centers for analysis, and these high-speed connections let scientists and researchers transfer it efficiently, allowing for timely insights and discoveries.
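A few lines of Python make the difference concrete; the 5GB movie size and ideal-throughput assumption are ours, for illustration only:

```python
# Transfer time for a 5GB HD movie at two link speeds
# (ideal throughput; real transfers add protocol overhead).

def download_seconds(size_gb: float, speed_mbps: float) -> float:
    """Seconds to move size_gb gigabytes at speed_mbps megabits per second."""
    bits = size_gb * 8e9               # gigabytes -> bits
    return bits / (speed_mbps * 1e6)   # bits / (bits per second)

for label, mbps in [("Home broadband, 100 Mbps", 100),
                    ("NASA ESnet demo, 91 Gbps", 91_000)]:
    print(f"{label}: {download_seconds(5, mbps):.2f} s")
# Home broadband, 100 Mbps: 400.00 s
# NASA ESnet demo, 91 Gbps: 0.44 s
```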

More people use the internet than have access to clean drinking water

As staggering as it sounds, the number of people online now rivals, and by some estimates exceeds, the number with access to clean drinking water. Approximately 5.5 billion people, about 67% of the world’s population, use the internet, a number that has skyrocketed with the rapid spread of mobile devices and affordable connectivity. By comparison, around 74% of people, or about 5.9 billion, have access to safely managed drinking water, leaving more than 2 billion without a reliable supply.

This surprising disparity highlights a paradox in global development: while technology, especially the internet, has expanded quickly even in remote and underdeveloped regions, essential resources like clean water still lag behind in terms of availability. This reflects how the digital revolution has outpaced efforts to solve basic human needs, and raises questions about the allocation of resources and priorities in global infrastructure development.

You could store the entire internet in under 100 grams of DNA

DNA, the molecule that encodes the genetic information in all living organisms, has the potential to hold immense amounts of data in an incredibly compact form. Researchers estimate that a single gram of DNA could theoretically store up to 215 petabytes (215 million gigabytes) of data. In recent studies, scientists have successfully encoded various types of data—such as text, images, and even video—into DNA sequences, demonstrating its capability as a storage medium. This approach offers significant advantages over traditional digital storage methods: DNA is incredibly stable, can last for thousands of years under the right conditions, and requires minimal energy to maintain.

As of now, the total amount of data on the internet is estimated to be around 40 zettabytes (40 billion terabytes). At the demonstrated density of 215 petabytes per gram, all of that information could in theory be stored in under 200 kilograms of DNA, and at the theoretical maximum density of roughly 455 exabytes per gram, in well under 100 grams. While DNA data storage is still largely experimental and faces challenges in terms of cost and read/write speeds, it represents a revolutionary step forward in data preservation and has the potential to address the growing demand for more efficient and sustainable storage solutions in our increasingly data-driven world.
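The arithmetic behind those figures is easy to verify; this short Python sketch simply divides the internet's estimated size by the two published density figures mentioned above:

```python
# How much DNA would the internet need? Uses the density figures above.

INTERNET_PB = 40e6            # ~40 zettabytes, expressed in petabytes
DEMONSTRATED_PB_PER_G = 215   # Erlich & Zielinski (2017): 215 PB per gram
THEORETICAL_EB_PER_G = 455    # theoretical limit of ~2 bits per nucleotide

demonstrated_g = INTERNET_PB / DEMONSTRATED_PB_PER_G
theoretical_g = INTERNET_PB / (THEORETICAL_EB_PER_G * 1000)  # EB -> PB

print(f"At demonstrated density: ~{demonstrated_g / 1000:.0f} kg of DNA")
print(f"At the theoretical limit: ~{theoretical_g:.0f} g of DNA")
# At demonstrated density: ~186 kg of DNA
# At the theoretical limit: ~88 g of DNA
```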