
Quantum Computing

The End of the Silicon Era and the Start of the Quantum Supremacy Period

March 14, 2024


Classical digital computers are rapidly approaching the limits of their computing capability, marking a significant milestone in the evolution of the silicon-based architecture that has shaped the world for decades. The development of computers that operate at the atomic, or quantum, level now appears inevitable. As a result, competition among countries and technology companies to advance quantum computing has intensified, driven by the immense potential the technology offers.

Quantum computers promise such immense power that they can be regarded as the ultimate computing machines. In theory, they could crack many of the electronic security codes in use today. This alarming prospect has prompted the US National Institute of Standards and Technology (NIST), the federal agency responsible for setting national measurement and technology standards, to issue recommendations. These guidelines aim to help large companies and government agencies prepare for the inevitable transition to the quantum era. NIST has already stated that quantum computers will eventually be able to break 128-bit AES encryption, which numerous organizations use to safeguard their data, and on which annual expenditure amounts to tens of billions of dollars across various countries.

Quantum computers have far-reaching applications beyond the security and military domains. They promise to revolutionize fields such as medicine, space exploration, environmental studies, climate research, and energy. Unlike regular computers, quantum computers can perform computations that were previously impractical. A notable breakthrough occurred in 2019, when Google introduced its Sycamore processor. This quantum computer solved a carefully chosen mathematical problem in just 200 seconds, a task Google estimated would take the most powerful traditional digital computers 10,000 years to complete. Another significant development took place in 2020, when China's Quantum Innovation Institute unveiled a quantum computer reported to be 100 trillion times faster than a classical supercomputer on a specific sampling task.

As a result, major technology companies such as Google, Microsoft, Intel, IBM, Rigetti, and Honeywell are racing to build quantum computers, each working on its own prototypes. In 2021, IBM introduced its 127-qubit quantum processor, named Eagle, as the competition expands beyond American corporations to include Chinese companies.

The End of the Silicon Era

During the 1950s, computers were so large that they occupied entire rooms. Only big companies and government institutions, such as the Pentagon and major banks, could afford them. For instance, the ENIAC computer could perform tasks in thirty seconds that would take a human twenty hours. However, due to their size and cost, computers remained inaccessible to most people. A revolution in the development of electronic chips changed this. Over the decades, the size of electronic chips gradually decreased. Eventually, a single chip the size of a fingernail could hold approximately a billion transistors. Transistors are devices that amplify or switch electrical impulses by controlling the flow of electrons. In binary terms, a state of 1 signifies the passage of electrons through the transistor, while a state of 0 denotes no flow of electrons. This advancement enabled engineers and scientists to create small, portable computers. Consequently, the advent of cell phones, Internet of Things devices, and other microelectronics became possible.

As a result of the rapid development of electronic chips, Gordon Moore, one of Intel's founders, formulated Moore's Law in 1965. He initially observed that the number of transistors on a chip doubled roughly every year, and in 1975 he revised the period to every two years; the often-quoted 18-month figure is usually attributed to his Intel colleague David House. It is important to note that this principle is an empirical observation that has guided the semiconductor industry, rather than a strict physical law. Moore's Law has had a significant impact on the growth of the electronics sector, driving innovation, reducing production costs, and enabling the creation of smaller, more powerful electronic devices at affordable prices.
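To make the doubling rule concrete, here is a minimal Python sketch that projects transistor counts from the Intel 4004's 2,300 transistors in 1971, assuming a clean two-year doubling; real chips only loosely track this idealized curve, so the numbers are illustrative rather than historical.

```python
# Minimal sketch of Moore's Law as a doubling rule.
# Baseline: Intel 4004 (1971, ~2,300 transistors); the clean two-year
# doubling is an idealization, not measured industry data.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project the transistor count for a given year."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```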

However, in recent years, scientists have observed that Moore's Law is nearing its physical limit. The primary reason is that transistors have become so small that current fabrication technology can barely accommodate them: electronic chips have shrunk to the point where a transistor is only about twenty atoms wide. Once its critical features reach approximately five atoms across, the electron's location becomes uncertain due to the principles of quantum mechanics. Electrons are the charge-carrying components of transistors and play a crucial role in the operation and processing of data within a computer.

This uncertainty in the position of electrons, combined with the tiny size of the transistor, can result in electrons "leaking", or tunneling, out of their designated path. Tunneling can short-circuit the chip or generate excessive heat, potentially melting it. In essence, if we continue to rely solely on silicon, Moore's Law will inevitably fall short. This computing architecture has reached its end, and we may be witnessing the conclusion of the silicon age.

This does not imply that progress in computing technology has halted. Instead, innovation has shifted from simply packing more transistors onto a chip to advances in system design, new materials, and alternative computing architectures. In other words, while Moore's Law may no longer hold in its traditional form, computing technology continues to evolve in new directions, with quantum computing leading the way.

The Beginning of the Quantum Era

The transition from the silicon age to the quantum age, signaled by the end of Moore's Law, marks a fundamental shift in computing. With transistors shrinking toward widths of just a few atoms, engineering now takes place at the atomic level itself. Instead of conventional transistors that gate the flow of electrical current, quantum computers exploit the behavior of individual atoms and subatomic particles. This approach harnesses quantum mechanics to process and analyze vast amounts of data concurrently, in contrast to the sequential processing of digital computers.

To illustrate this, consider a maze with a mouse searching for a piece of cheese at the end. While a digital computer must examine each possible pathway in sequence, a quantum computer can, in effect, assess all pathways at once, arriving at a solution much faster. This remarkable capability expands the realm of solvable problems, including some previously deemed intractable by digital means. A classical bit can hold only one of two values at any moment, 1 or 0, corresponding to current flowing or not flowing. A quantum computer, however, employs quantum bits, or qubits, which are governed by the principles of quantum mechanics. Qubits can exist in a superposition of states, known as quantum superposition, allowing more than one value to be processed simultaneously.
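In the standard mathematical picture, a qubit is just a two-component complex vector, and superposition means the vector has weight on both components. The following minimal Python/NumPy sketch illustrates the textbook formalism (not any particular quantum machine): a Hadamard gate places a qubit into an equal superposition of 0 and 1.

```python
import numpy as np

# Textbook qubit formalism, for illustration only.
zero = np.array([1, 0], dtype=complex)  # the state |0>
one = np.array([0, 1], dtype=complex)   # the state |1>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ zero            # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2  # Born rule: probabilities of reading 0 or 1
print(psi)                # [0.707..., 0.707...]
print(probs)              # [0.5, 0.5] -> both outcomes equally likely
```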

One way to picture this is through spin. Whereas a classical bit registers whether electric current flows, a qubit encodes the spin state of a particle such as an electron, proton, or neutron. That spin can point up, point down, or, crucially, exist in a superposition of both at once, so a qubit is in state 0, state 1, or a weighted combination of the two, conventionally written α|0⟩ + β|1⟩. The advantage of a qubit lies in its ability to represent zeros and ones concurrently. This superposition of states grants quantum computers the remarkable capability to explore many computational paths and perform numerous calculations simultaneously, surpassing the efficiency and speed of regular computers on certain problems.

Quantum superposition, in short, is the fundamental feature of quantum mechanics that sets quantum computing apart from traditional computing: it describes how quantum particles, such as electrons or photons, can exist in multiple states simultaneously.

In addition to superposition, qubits exhibit quantum entanglement. This property allows them to interact and become correlated with one another. Each qubit added to an entangled system doubles the number of states the system can represent at once, so n qubits span 2^n possible states. Ordinary bits cannot do this; they take on independent values and do not become entangled. This exponential growth of the state space with each additional qubit, rather than any fixed multiple, is what makes quantum computers fundamentally more powerful than digital computers for certain tasks.
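The doubling can be seen directly in the same NumPy formalism: two qubits already require a state vector of 2^2 = 4 amplitudes, a CNOT gate entangles them into a Bell state, and appending a third qubit doubles the vector again. As before, this is an illustrative sketch of the linear algebra, not of real hardware.

```python
import numpy as np

# Entangling two qubits into a Bell state (textbook linear algebra).
zero = np.array([1, 0], dtype=complex)                       # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard

# CNOT flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(H @ zero, zero)  # qubit 1 in superposition, qubit 2 in |0>
bell = CNOT @ state              # Bell state (|00> + |11>) / sqrt(2)

print(bell)                      # amplitude only on |00> and |11>
print(len(bell))                 # 2 qubits -> 2**2 = 4 amplitudes
print(len(np.kron(bell, zero)))  # a third qubit doubles this to 8
```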

The situation, however, is far from simple. Superposition and entanglement are fragile: external disturbances can alter a particle's spin, destabilizing the qubit and driving up quantum errors, a phenomenon known as decoherence.

When we attempt to measure a qubit to determine whether it is in the 0 or 1 state, the superposition collapses into one of those two states. Before the measurement, the qubit can exist in both states at once; once the measurement is performed, only one outcome remains. Consequently, any measurement of, or unintended interaction with, a qubit may collapse its superposition, changing the possible outcomes of a computation. Keeping these states stable against such disturbances is therefore a central challenge in the development and operation of quantum computers.
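Measurement collapse can be mimicked by sampling. The sketch below uses the same NumPy formalism with arbitrary example amplitudes: repeated simulated measurements of a superposition weighted 80/20 toward 0 recover those statistics.

```python
import numpy as np

rng = np.random.default_rng()

# Illustrative amplitudes: 80% chance of reading 0, 20% chance of reading 1.
psi = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)
probs = np.abs(psi) ** 2  # Born rule

def measure(probabilities):
    """Collapse the qubit: return 0 or 1 with the Born-rule probabilities."""
    return rng.choice([0, 1], p=probabilities)

outcomes = [measure(probs) for _ in range(10_000)]
print(sum(o == 0 for o in outcomes) / len(outcomes))  # approximately 0.8
```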

Designing quantum software and algorithms requires a deep understanding of quantum mechanics, as well as proficiency in new coding and computation methods. Despite the challenges involved, research in the field of quantum computing is progressing rapidly, leading to significant advancements in areas such as quantum chemistry, algorithm enhancement, and cybersecurity.

The transition to the quantum age is rapidly approaching. Scientists are steadily learning to manage quantum decoherence and to keep qubits stable for as long as possible. Such breakthroughs will lead to new and unprecedented discoveries, further widening the technological gap between developed and less developed countries.