The Computing Future

For our modern world, computing is as essential as the air we breathe. But after a long process of evolutionary improvement, the field is due for disruptive change.

The future of computing is more exciting than it has been in a long time. This article shows why.

Nowadays the term computing is very broad. It covers everything needed to handle information with computers, e.g. hardware and software and their respective development.

Entire disciplines such as Computer Engineering, Information Technology, and Cybersecurity also fall under this term.

Computing History in a Nutshell

I want to start this article with a very brief history of computing and, based on that, extrapolate where the journey could go.

Computing is as old as mankind. Maybe even older: Japanese scientists have shown that rhesus monkeys are pretty good at adding up numbers.

However, we are the only species on this planet that has perfected the tools to support our calculations.

The Ishango bone (c. 19,000 BC, Congo) is probably the earliest known computing tool in human history. Well documented are the Babylonian abacus (c. 2,400 BC), the Chinese abacus (c. 190 AD), and the description of the binary system by Pingala (c. 200 BC, India).

In medieval times, Persia and the Arab world were strongholds of scholarship and research. The work of the Persian al-Khwarizmi (algorithms, 820), the Arab Al-Kindi (cryptography, 850), and Ramon Llull laid important foundations and partly influenced the great European thinkers decades and centuries later.

The further evolution of computing in Europe led to the first mechanical calculator, a.k.a. the machine arithmétique. Pascal's invention (France, 1642) was a breakthrough that later inspired inventors like Leibniz.

However, the great leap forward did not come until the end of the 1950s, with the introduction of the first transistors that could be mass-produced (Bell Labs, 1959).

The transition to digital electronics was a real revolution and is therefore called the “Third Industrial Revolution”.

Nowadays, we speak of the Fourth Industrial Revolution. To be honest, that is a little premature.

The End of Moore's Law

It is premature because up to now we have not yet witnessed a true revolution. Right now we still follow Moore's Law (the number of transistors on a chip doubles roughly every two years) with the existing transistor-based technology.
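To get a feel for what this exponential law implies, here is a minimal back-of-the-envelope sketch in Python. It takes the Intel 4004 (1971, roughly 2,300 transistors) as an illustrative baseline and assumes a constant doubling period of two years; the output is a rough projection, not manufacturer data.

# Rough projection of transistor counts under Moore's Law.
# Assumptions: baseline Intel 4004 (1971, ~2,300 transistors),
# constant doubling every two years. Illustrative only.
def moores_law(year, base_year=1971, base_count=2_300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(year, f"{moores_law(year):,.0f} transistors")

Fifty years of doubling turns a few thousand transistors into tens of billions, which is roughly where today's largest chips sit.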

Yet it is also clear that Moore's Law is finite. Modern chips are currently manufactured using a 7 nm process.

On the one hand, the engineering craftsmanship behind this technology is incredible. On the other hand, the physical limits are already visible, as Intel is still struggling with the introduction of its 10 nm manufacturing process.

Another fact is becoming ever clearer: until now, we have only taken evolutionary steps, not revolutionary ones.

In essence, we have been perfecting a technology from the middle of the last century.

At the same time, computing as we know it today is coming under pressure from several megatrends.

I'm talking about IoT, AI, and Edge Computing. All these trends are interconnected and will change computing as we know it forever.

The Future of Computing

The history of computer science is full of hilariously wrong predictions. I do not intend to add to that list, but I can try to give a rough outlook.

As I argued in the previous section, I believe that our current technology is doomed to stagnate. At the same time, the amount of data is growing exponentially.

We can't solve this by simply building bigger cloud data centers, as computing increasingly shifts towards the edge.

Driverless cars are a prominent example: this application requires low latency and offline functionality while handling enormous amounts of data.

In the future, we need powerful computing resources everywhere – not only in the cloud.

Quantum Computing

So, as we are running into a dead end with the current technology stack, where is the way out?

In my opinion, the only way out is basic research to enable a new key technology with higher potential.

Two candidates are already in focus.

The first and better known is quantum computing.

Quantum computing is considered a key technology of the 21st century, and big tech companies as well as governments are driving research forward with enormous sums of money.

In October 2019, Google claimed to have achieved quantum supremacy, meaning that a quantum computer solved a task that is practically out of reach for conventional machines.

The potential for use cases like searching huge databases is there without a doubt.
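To illustrate where such an advantage could come from, here is a rough sketch assuming Grover's algorithm for unstructured search (not something Google's 2019 experiment demonstrated): a classical search over N unsorted entries needs on the order of N lookups in the worst case, while Grover's algorithm needs roughly (π/4)·√N quantum queries.

import math

# Back-of-envelope query counts for unstructured search:
# classical worst case ~N lookups vs. Grover's ~(pi/4)*sqrt(N) queries.
# Purely illustrative; ignores data loading, error correction and hardware overhead.
def classical_queries(n):
    return n

def grover_queries(n):
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (10**6, 10**9, 10**12):
    print(f"N={n:,}: classical ~{classical_queries(n):,}, Grover ~{grover_queries(n):,}")

For a trillion entries, that is the difference between about 10^12 lookups and under a million quantum queries, which is why database-style search keeps coming up as a candidate use case; the practical caveats remain substantial.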

However, quantum computers are still decades away from being miniaturized and mass-producible. Current research still revolves around fundamental issues such as controlling quantum decoherence.

Biocomputers

Elon Musk's fear of a possible singularity (an AI superintelligence) led to the founding of the startup Neuralink.

The goal is human enhancement via brain-computer interfaces, so that we can keep pace with technological progress.

The first prototype, shown in late August 2020, had received FDA Breakthrough Device designation, which enables limited human testing.

At this point one might wonder why we don’t actually take the biological route right away.

Biocomputers, for example, use DNA to compute and offer the advantage of massive parallelization at much lower energy consumption than comparable conventional electronics.

Another huge benefit of biocomputers lies in the potential of all biologically derived systems to self-replicate and self-assemble.

But despite remarkable progress in recent decades, biocomputers are still further away from production use than quantum computers.

The Future of Computing and the Way Forward

There is no ready-to-use technology in sight that could replace the ubiquitous digital electronics in the near future.

Especially not one that meets the requirements of megatrends like Edge Computing and IoT, such as offline functionality and low latency.

Various parties are currently making great efforts to be the first in the new key technology.

The main focus is on quantum computing, but with great uncertainties.

To move forward, we need to squeeze everything out of the existing technology and develop intelligent algorithms at the software and application level, while at the same time pushing basic research to the production level.

If not, we as humanity will soon find ourselves in a position where computing becomes a hindrance to new research breakthroughs, such as autonomous driving.
