Google's Quantum Computer Turned Back On And What It Revealed Scared Scientists



Quantum computing joining forces with AI would mark a significant collaboration between advanced software and hardware technologies. However, the potential outcomes of merging robotics with quantum computers are causing concern among scientists. What secrets will be unlocked? Will this collaboration work, or will it ruin the entire research effort? Join us as we unravel the chilling revelation uncovered after Google's newest quantum computer was activated! Michio Kaku, a theoretical physicist and futurist, often explores the exciting possibilities and challenges brought by new technologies, like quantum computers. As scientists, we love thinking about new, creative ideas that could change how we see the world. Kaku's concerns about quantum computers focus mainly on how they might affect security, privacy, and society, rather than on fear of the technology itself. So,

how do these quantum computers work? Well, they play by different rules than regular computers. Instead of using regular bits that can only be 0 or 1, quantum computers use qubits, which can be 0, 1, or both at the same time! This mind-bending feature allows them to solve problems that classical computers can't even touch. Quantum computers have the power to explore vast landscapes of possibilities and give us reasonable answers. But building them is like something out of a science fiction movie. Qubits are made from exotic materials and kept at near-absolute-zero temperatures to coax out their quantum properties. However, just having the hardware isn't enough. We need special instructions, called quantum gates, to manipulate these qubits and perform calculations. But here's the tricky part: quantum systems are delicate

and prone to errors. Scientists have to use complex protocols to fix mistakes without messing up the qubits. It's like being quantum mechanics ninjas, always swooping in to correct errors without disturbing the delicate state of the qubits. And here's an interesting tidbit:

errors often come from qubits interacting with their environment. So, keeping these quantum systems stable requires extreme measures. That's where dilution refrigerators come in. These superpowered freezers help minimize noise and keep the quantum magic running smoothly. Imagine constructing a computer and submerging it into a large tank of liquid helium. While the

astonishing capability of qubits existing in multiple states at once promises to transform how we solve problems, quantum computers earn their reputation as the ultimate computing machines by operating at the scale of individual atoms, performing calculations far beyond our current capabilities. However, keeping these machines running presents a unique challenge. Superconducting qubits, a common type, require temperatures near absolute zero, colder than even the depths of space. This is where dilution refrigerators come in: the unsung heroes of quantum computing. These powerful freezers tirelessly maintain an icy environment for the delicate qubits, but they come at a cost: they consume massive amounts of energy, sometimes tens of thousands of watts. Surprisingly,

while the computations themselves may be energy-efficient for specific tasks, the supporting infrastructure behind quantum computers comes with a hefty energy price tag. Dilution refrigerators and the intricate electronics needed to control and read qubit states are real power hogs. So, how do these mind-bending qubits translate into mind-blowing speed? Here's where the true magic of quantum computing unfolds. Imagine a million coins spinning in the air, each capable of landing on heads or tails. A regular computer can only check each coin one by one. However, a quantum computer can analyze all the possibilities at the same time, thanks to a phenomenon called superposition. The incredible parallel processing power of quantum

computers allows them to solve problems much faster than even the most powerful classical computers. But it's not just about raw power: quantum computers also use special algorithms designed to take advantage of the unique properties of qubits. For example, imagine searching through a massive unsorted database to find a specific item. A classical computer would have to check each entry one by one. However, a quantum algorithm, such as Grover's search, could find the

item in a fraction of the time. This ability to dramatically speed up computations opens up new possibilities in fields like medicine, material science, and artificial intelligence, where complex simulations and calculations are currently limited by classical computing. Moreover, the advancements in quantum computing are set to change digital security.

Redefining Cryptography and Security

Traditional cryptographic systems, like public key cryptography, which secures our emails and financial transactions, rely on mathematical challenges that are difficult for classical computers to solve. However, with the increased computational capabilities of quantum computers, these cryptographic systems become vulnerable to attacks. This highlights the need for new cryptographic

methods to secure our digital information in the era of quantum computing. However, this security relies on the limitations of classical computing. In the world of quantum computing, the rules of the game change entirely. Qubits, the building blocks of quantum computers, can exist in multiple states simultaneously, allowing quantum computers to perform many calculations at once. This ability poses a direct threat to the foundation of public key cryptography.
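
To see why, consider a toy RSA setup in plain Python. The primes here are deliberately tiny, hypothetical illustration values; real keys use primes hundreds of digits long, which makes the factoring step below infeasible for classical machines but not for a quantum computer running Shor's algorithm:

```python
# Toy RSA with deliberately tiny primes (hypothetical illustration values;
# real RSA uses primes hundreds of digits long).
p, q = 61, 53
n = p * q            # public modulus (3233)
e = 17               # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)  # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# Anyone who can factor n recovers the private key. Trial division is
# trivial here but infeasible classically at real key sizes; Shor's
# algorithm would make this step fast even for large n.
def factor(n):
    for f in range(2, int(n ** 0.5) + 1):
        if n % f == 0:
            return f, n // f

fp, fq = factor(n)
d_attacker = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, d_attacker, n) == msg  # the attacker can now decrypt
```

The entire private key falls out of the factorization, which is exactly the step Shor's algorithm accelerates.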

For example, RSA encryption, which relies on the difficulty of factoring the product of two large prime numbers, could potentially be undone by a quantum computer in mere seconds or minutes – a task that would take a classical computer thousands of years. Quantum computers have the power to compute things that are currently beyond our reach. The introduction of Shor's algorithm in 1994 was a landmark moment, demonstrating the potential of quantum computers to solve previously overwhelming problems at unimaginable speeds. This algorithm can efficiently factor large numbers and solve discrete logarithm problems, making encryption methods like RSA, Diffie-Hellman, and ECC vulnerable. The implications of this are profound, highlighting the urgent need for new cryptographic standards resistant to the capabilities of quantum computing. However,

on the flip side of this looming threat is the emergence of quantum cryptography, offering a glimmer of hope in securing communications against the prowess of quantum computing. Quantum key distribution, QKD, a novel approach in this field, leverages the principles of quantum mechanics to secure communication channels. The foundation of QKD's security lies not in computational complexity, but in the physical properties of quantum particles. A standout feature of this method is its incorporation of the no-cloning theorem, which states that it is impossible to exactly copy an unknown quantum state. This concept ensures that if someone tries

to listen in on a conversation, it will be noticed: measuring a quantum system disturbs it, which alerts the people talking that there's a security problem. As we learn more about quantum computing, it's amazing to see how it can change the world. It's like a huge step forward in computer power, promising to solve problems that regular computers couldn't touch. But, like with any big new technology, quantum computing brings both good things and challenges.
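
A classical simulation can illustrate the measurement-disturbs-the-state effect behind QKD. This is a minimal sketch of the BB84 protocol's basis-sifting step, with hypothetical photon counts and a fixed random seed; it is not a real quantum implementation:

```python
import random

def measure(value, basis, meas_basis, rng):
    """Measuring in the preparation basis returns the encoded bit;
    measuring in the other basis gives a random outcome."""
    return value if meas_basis == basis else rng.randint(0, 1)

def bb84_error_rate(n_photons=4000, eavesdrop=False, seed=7):
    """Fraction of errors in the sifted key (positions where Alice's and
    Bob's bases matched). Without an eavesdropper this is 0; with one,
    roughly a quarter of the sifted bits are disturbed."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        bit, a_basis = rng.randint(0, 1), rng.randint(0, 1)
        value, basis = bit, a_basis            # the photon in flight
        if eavesdrop:                          # Eve measures and resends
            e_basis = rng.randint(0, 1)
            value, basis = measure(value, basis, e_basis, rng), e_basis
        b_basis = rng.randint(0, 1)
        b_bit = measure(value, basis, b_basis, rng)
        if b_basis == a_basis:                 # kept after basis sifting
            sifted += 1
            errors += (b_bit != bit)
    return errors / sifted

print(bb84_error_rate(eavesdrop=False))  # 0.0
print(bb84_error_rate(eavesdrop=True))   # roughly 0.25
```

Alice and Bob compare a random sample of their sifted bits: a noticeable error rate means someone measured the photons in transit.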

It's not just about making encryption stronger. Quantum computers might be able to break the codes we use now sooner than we think, which could mess up our digital privacy. Stuff that's safe now might not be safe later, leaving people open to problems like someone stealing their identity or their money. And it's not just about keeping secrets safe. Quantum computing is also going to shake things up in finance. It's going to make a big change in how we look at money and trading,

and it could change digital currencies in a big way. So, while it's exciting, it's also a little scary to think about all the changes it might bring.

The Challenges to Global Stability and Fairness

Cryptocurrencies, like Bitcoin and Ethereum, are known for their strong security, relying on complicated codes to keep transactions safe. But the power of quantum computing could change all that. Quantum computers are so smart that they might be able to crack these codes, which could lead to some serious problems. Imagine someone being able to spend the same money twice or even create new money without permission. It's not just a small issue—it could mess up the whole system, allowing bad people to mess with transaction records or steal money. Even just the idea

that quantum computers could break into cryptocurrencies could make people lose faith in them, causing prices to drop and making people rethink using digital money. And it's not just about cryptocurrencies. Quantum computing could shake up the world of finance in a big way. It could help us understand financial systems better and predict what might happen in

the market way faster and more accurately than regular computers ever could. That could change how we handle money and investments, making things more efficient but also more complicated. This big step forward in how we analyze things with quantum computing could completely change how we think about and deal with financial markets. But, it's not all smooth sailing. Some big problems come with this quantum advantage, especially when it comes to fairness in the market.

Think about it: if only a few people had access to quantum computers, they could predict what's going to happen in the market way before everyone else, giving them a huge advantage. This wouldn't just mess up how the market works, but it would also bring up some serious questions about whether it's fair or not. To make sure everyone has a fair chance in the market, we might need to change the rules. We'd have to make sure that people with quantum computers

can't use them to cheat or mess with how the market works. It's all about making sure the market stays fair and trustworthy, even as technology keeps moving forward. Another potential risk of quantum computing is that it could make some countries more vulnerable to warfare. Quantum computers are incredibly powerful and can solve tough problems, which is useful for tackling challenges in our world. However, they also pose a challenge when it comes to countries and their military strategies. For example, imagine one country has quantum computing technology, while another doesn't. This creates a big

problem. The country with quantum computing has a significant advantage. They can use it to outsmart and outmaneuver their opponent at every turn. Meanwhile, the other country, lacking quantum technology, struggles to defend important assets like military secrets or critical infrastructure. This unequal access to advanced technology could lead to an information gap in warfare, putting some nations at a disadvantage. So, while quantum

tech holds great promise, it also raises concerns about fairness and security in global conflicts. The shortage of helium is something that often goes unnoticed when we talk about quantum computing. But, just like your computer needs a fan to stay cool, quantum technology relies on helium for cooling. Helium isn't unlimited, and it can be pretty expensive. This brings up

two big concerns. First, companies using quantum computers need a steady supply of helium to keep their machines running smoothly. Think of it like needing a specific ingredient for your favorite recipe—if it's hard to find, cooking becomes a real challenge. Similarly, if companies can't get enough helium, operating quantum computers becomes a struggle. Secondly,

because helium isn't plentiful, there's a risk that only a few organizations will have the capability to operate quantum computers. This concentration of expertise might not be good for the overall progress of quantum computing. So, the availability of helium is more crucial than we might realize for the future of this advanced technology. The growing concerns posed by quantum computing highlight the pressing need to switch to post-quantum cryptographic methods, as current encryption techniques are at risk of being compromised by quantum attacks. Let's look into some aspects of public key encryption methods and their contenders.

We'll start by examining some performance tests regarding key exchange and digital signatures. Public key encryption involves using a pair of keys: one to encrypt and another to decrypt. For example, Alice's public key encrypts, while her private key decrypts. On the other hand,

for digital signatures, we flip the process. We use Alice's private key to sign a message and her public key to verify the signature. In key exchange, we typically use methods like elliptic curve Diffie-Hellman, where both Bob and Alice generate private keys, exchange public keys, and derive a shared key using a key derivation function. However, these methods face risks from

quantum computers. RSA, ElGamal, and elliptic curve methods are all vulnerable to attacks from quantum computers. Therefore, we need to transition to post-quantum crypto methods. For elliptic curve methods, we use a base point on a curve and a secret key to generate a public key. Unfortunately, methods like ECDSA are vulnerable to quantum computers. Widely used curves like Curve25519, secp256k1, and P-256 offer standard signature and key exchange methods but are equally susceptible to quantum attacks. Now, let's explore the existing hash-based methods and their performance.
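
The key-exchange flow described above can be sketched with classical finite-field Diffie-Hellman, a stand-in for the elliptic-curve version; the group parameters below are toy values chosen for illustration, not production parameters:

```python
import hashlib
import secrets

# Finite-field Diffie-Hellman with toy parameters (a classical stand-in
# for the elliptic-curve exchange; illustrative values only).
p = 2 ** 127 - 1     # a Mersenne prime, large enough for a demo
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private key
b = secrets.randbelow(p - 2) + 2   # Bob's private key

A = pow(g, a, p)                   # public values, exchanged in the clear
B = pow(g, b, p)

# Each side combines its own private key with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob

# A key derivation function turns the shared secret into a symmetric key.
key = hashlib.sha256(shared_alice.to_bytes(16, "big")).digest()

# Security rests on recovering a from g^a mod p being hard (the discrete
# logarithm problem), which Shor's algorithm solves on a quantum computer.
```

An eavesdropper sees only `A` and `B`; classically that reveals nothing useful, but a quantum attacker can extract the private keys, which is why this whole family of schemes needs replacing.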

Navigating the Future of Digital Security

In the past, creating a large number of private keys required a stateful system, often built on a structure called a Merkle tree. This method had its limitations. However, newer techniques have emerged to enhance security and efficiency by making the process stateless and integrating symmetric key methods. One such

approach is McEliece, a code-based method with a long-standing track record of security. Another method is multivariate quadratic polynomials, which includes techniques like the oil-and-vinegar method, where polynomials are created with a hidden trapdoor. Learning with errors, LWE, combined with lattice-based methods is gaining popularity due to its ability to produce compact digital signatures and facilitate key exchange. Lastly, isogenies use elliptic curves to transition from one curve to another. We have a variety of options, including hash-based methods, multivariate quadratics, code-based methods, and isogenies. Determining the best method for post-quantum cryptography is crucial. In 2016, the U.S. Department of Commerce's National Institute of Standards and Technology,

NIST, initiated a post-quantum cryptography project, conducting three rounds of evaluations. They've chosen the first batch of encryption tools built to withstand future quantum computers. These powerful computers could potentially break the security measures used in our everyday digital activities, like online banking and email. These four selected encryption algorithms will be part of NIST's post-quantum cryptographic standard, which is expected to be finalized in about two years. Commerce Secretary Gina M. Raimondo highlighted the importance of this announcement. She emphasized that it's a crucial step in protecting our sensitive data from potential cyberattacks by quantum computers. With NIST's expertise and dedication to advanced technology,

businesses can innovate while ensuring the trust and confidence of their customers. This decision came after a six-year effort managed by NIST. They called upon cryptographers worldwide to develop encryption methods that could resist attacks from more powerful quantum computers. The selection marks the beginning of the final phase of NIST's project to standardize post-quantum cryptography. "NIST is always looking ahead to understand what the U.S. industry and society

might need in the future. We know that once quantum computers powerful enough to break our current encryption systems are created, our information could be at risk," explained Laurie E. Locascio, the Under Secretary of Commerce for Standards and Technology and NIST Director. "That's why our post-quantum cryptography program has brought together the brightest minds in cryptography from around the world to develop these quantum-resistant algorithms.

These algorithms will set a standard and greatly enhance the security of our digital information." In addition to the four selected algorithms, NIST is considering four more for inclusion in the standard. They will announce the finalists from this group at a later date. NIST is releasing its choices in two stages to ensure a wide range of defense tools. Cryptographers have recognized from the start of NIST's effort that different systems and tasks require different encryption tools. A useful standard would offer solutions tailored for various situations, use different approaches for encryption, and provide multiple algorithms for each use case in case one is found to be vulnerable. Encryption uses math to protect sensitive electronic information, including

the secure websites we surf and the emails we send. Widely used public-key encryption systems, which rely on math problems that even the fastest conventional computers find intractable, ensure these websites and messages are inaccessible to unwelcome third parties. However, a sufficiently capable quantum computer, which would be based on a different technology than conventional computers, could solve these math problems quickly, defeating encryption systems. To counter this threat, the four quantum-resistant algorithms rely on math problems that both conventional and quantum computers should have difficulty solving, thereby defending privacy both now and down the road. The algorithms are designed for two main tasks for which encryption is typically used: general encryption, used to protect information exchanged across a public network; and digital signatures, used for identity authentication. All four of the algorithms were created by experts collaborating

from multiple countries and institutions. NIST has chosen the CRYSTALS-Kyber algorithm for general encryption, especially when we're accessing secure websites. When it comes to verifying identities in digital transactions or signing documents online, digital signatures play a crucial role.

NIST has carefully selected three algorithms for this purpose: CRYSTALS-Dilithium, FALCON, and SPHINCS+. CRYSTALS-Dilithium stands out as the primary choice, recommended by NIST for its high efficiency. However, FALCON is preferred in situations where smaller signatures are needed. On the other hand, SPHINCS+ offers a different approach based on hash functions, making it valuable as a backup option. Meanwhile, four other algorithms are still being considered for general encryption, following various approaches that don't involve structured lattices or hash functions. During this developmental phase, NIST encourages security experts to explore these new algorithms but advises against integrating them into systems just yet, as there may be slight changes before finalization.
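
To get a feel for how a signature can be built from hash functions alone, here is a minimal sketch of a Lamport one-time signature, the simplest ancestor of the hash-based family SPHINCS+ belongs to. This illustrates the idea only, not the SPHINCS+ construction, and each key pair must only ever sign one message:

```python
import hashlib
import secrets

H = lambda data: hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def _bits(msg):
    # The 256 bits of the message digest select which secrets to reveal.
    digest = H(msg)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(msg, sk):
    return [sk[i][bit] for i, bit in enumerate(_bits(msg))]

def verify(msg, sig, pk):
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(_bits(msg)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
assert not verify(b"tampered", sig, pk)
```

SPHINCS+ removes the one-time restriction by organizing many such key pairs under trees of hashes, which is also why its signatures are comparatively large.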

To prepare for the transition, users can assess their systems for applications using public-key cryptography, which will need replacement before quantum computers become a cryptographic concern. Additionally, they can inform their IT departments and vendors about the upcoming changes. Now, with the NIST project reaching its final stages, the need to transition from traditional public key methods to post-quantum cryptography is becoming increasingly urgent. Currently, there are four contenders for key exchange and three finalists for digital signatures. NIST aims to ensure diversity by considering alternative winners. This approach ensures that if one method, such as a lattice-based approach,

is selected, there will be alternative options available during the standardization process.

The Diversity of Encryption Methods

We have a variety of encryption methods to consider, each with its unique characteristics. McEliece, a well-established code-based method, has been in use for quite some time. Kyber and Saber are lattice-based methods, while NTRU is another lattice method that's gained recognition. Lattice methods might dominate the key exchange arena. When it comes to digital signatures, we see a mix of lattice methods like Dilithium and Falcon, along with a multivariate polynomial method called Rainbow, which is also known as an "oil and vinegar" method. As for alternatives,

we have methods like BIKE, FrodoKEM, HQC, and SIKE, which is the only isogeny-based method among them. The finalists for key exchange include Classic McEliece, Kyber, NTRU, and Saber, each varying in the sizes of keys they generate and their performance. To evaluate their performance, we utilized the liboqs library, which enables testing on different systems. In terms of signature schemes supported by this library, we have SPHINCS+, Rainbow, Picnic, Falcon, and Dilithium. For key exchange, we have Classic McEliece, FrodoKEM, Kyber, NTRU, Saber, and SIKE, each offering different levels of security and efficiency. Level one serves as the baseline, providing security equivalent to 128 bits, similar to what we get with AES-128. Level three offers

192-bit security, while level five offers 256-bit security. Each level has its own implementation tailored to its specific security requirements. We can examine the impact of these implementations on a Raspberry Pi. However, if we consider the number of cycles more broadly, it depends heavily on the clock speed, which ultimately determines the total time taken for execution. The results are color-coded for clarity. Values between one and two times the baseline are shown in green, between two and ten in yellow, and various shades of orange indicate factors ranging from 10 to over 1000 times the baseline. Dark orange indicates a factor exceeding 1000 times the baseline.
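
The color bands just described amount to a simple threshold rule; as a sketch, with the band boundaries taken from the description above:

```python
def slowdown_color(factor):
    """Map a method's slowdown factor, relative to the fastest baseline,
    to the color bands used in the benchmark tables described above."""
    if factor < 2:
        return "green"        # within 1-2x of the baseline
    if factor < 10:
        return "yellow"       # 2-10x
    if factor <= 1000:
        return "orange"       # 10-1000x, shaded by magnitude
    return "dark orange"      # more than 1000x slower

assert slowdown_color(1.5) == "green"
assert slowdown_color(40) == "orange"
```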

When it comes to key generation and exchange, Bob encrypts a key, encapsulates it, and sends it to Alice. Alice then decapsulates it using her private key. These operations are crucial not only for public key encryption but also for key exchange protocols. Looking at the number of cycles taken by each method, Kyber stands out at the top, performing well in both encapsulation and decapsulation. It consistently sits in the fastest band in each area. Following Kyber, we have Saber, whose lightweight version performs decently, particularly at level one. NTRU joins alongside Saber and Kyber, showing strong performance. As we move down the list, we encounter methods like HQC and BIKE,

followed by FrodoKEM. Further down, we see the isogeny-based methods making their appearance. At the end of the list, we find the code-based method Classic McEliece. Generally, non-lattice methods seem to struggle a bit more with key generation but perform reasonably well in encapsulation and decapsulation. However, the isogeny-based and code-based methods struggle significantly, with large factors indicating performance issues, especially the key sizes produced by Classic McEliece, which are much larger compared to lattice-based methods.
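
The encapsulate/decapsulate flow benchmarked above (Bob encapsulates a shared secret under Alice's public key; Alice decapsulates it with her private key) can be sketched as a toy KEM built on Diffie-Hellman. This is an illustrative stand-in with hypothetical toy parameters; real finalists like Kyber implement the same three-operation interface over lattices:

```python
import hashlib
import secrets

# Toy KEM over a small Diffie-Hellman group (illustration only).
p, g = 2 ** 127 - 1, 3

def keygen():
    sk = secrets.randbelow(p - 2) + 2
    return sk, pow(g, sk, p)           # (private key, public key)

def encapsulate(pk):
    # Bob: pick an ephemeral secret, send its public value as ciphertext,
    # and hash the shared group element into a symmetric key.
    eph = secrets.randbelow(p - 2) + 2
    ct = pow(g, eph, p)
    shared = hashlib.sha256(pow(pk, eph, p).to_bytes(16, "big")).digest()
    return ct, shared

def decapsulate(ct, sk):
    # Alice: recover the same symmetric key with her private key.
    return hashlib.sha256(pow(ct, sk, p).to_bytes(16, "big")).digest()

alice_sk, alice_pk = keygen()
ct, k_bob = encapsulate(alice_pk)      # Bob's side
k_alice = decapsulate(ct, alice_sk)    # Alice's side
assert k_alice == k_bob
```

The benchmarks in this section time exactly these three operations (key generation, encapsulation, decapsulation) for each candidate scheme.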

If we were to devise a scoring system, here's how the methods stack up: Kyber emerges as the fastest across key generation, encapsulation, and decapsulation cycles, followed by Saber, NTRU, and then BIKE, McEliece, HQC, and so forth. Generally, it's the lattice methods that shine brightest across all aspects, with Kyber notably outperforming the rest. Now, turning our attention to key sizes, we notice something interesting: the isogeny methods excel in producing smaller key sizes for both public and cipher keys, as well as secret and shared values. Comparatively, Kyber and other methods generate larger public keys, although typically only two to three times larger than those from isogeny methods. However, their private keys tend to be substantially larger than the compressed versions produced by isogeny methods. When we delve deeper, we find that Classic McEliece and other code-based methods encounter challenges, particularly in the size of the public and private keys they generate, although the resulting cipher tends to be relatively small compared to others. Once more, across various aspects, the lattice methods

consistently show commendable performance, particularly in terms of their key size. Now, let's delve into digital signatures, including Dilithium, Falcon, and Rainbow, along with their alternative counterparts. While the evaluations for the Raspberry Pi are detailed, let's take a broader view and examine the number of cycles required. Interestingly, the hash-based zero-knowledge proof method, Picnic, emerges as the leader in key generation.

However, it falls short when it comes to signing and verification. The hash-based methods, likewise, struggle across key generation, signing, and verification. However, lattice-based methods continue to perform admirably, offering a balanced compromise in key generation, signing, and verification. Regarding key sizes, we notice that the hash-based methods tend to excel. However, their signature

sizes are not as impressive. For instance, Picnic encounters challenges in signature size, and Rainbow, the multivariate method, struggles with both public and secret key sizes. Overall, each method exhibits its own set of strengths and weaknesses. However, the lattice methods appear to offer the best compromise, although they may not always produce the smallest key or signature sizes. Yet, they remain generally acceptable across all aspects. In summary, hash-based methods excel in generating small key sizes for digital signatures but tend to produce larger overall signatures. They also appear to be slower when tested on the Raspberry Pi. On the other hand, multivariate methods have larger key sizes but smaller signatures. However, they too

exhibit sluggish performance on the Raspberry Pi. In contrast, lattice methods emerge as a favorable compromise for both key and signature sizes. Among them, Dilithium stands out as the top performer for digital signatures. For key exchange on the Raspberry Pi, Kyber demonstrates the best performance, while slower methods like NTRU and the isogeny-based methods lag in terms of key generation speed. Raspberry Pi has proven to be a valuable tool for testing the CRYSTALS-Kyber algorithm. Raspberry Pis are tiny computers. The biggest one is about the size of a deck of cards, while the smallest is

just a bit bigger than a stick of gum. You might look at one and think, "That doesn't look like any computer I've seen before!" But appearances can be deceiving. Despite its quirky looks, a Raspberry Pi has everything a regular computer has. It's got ports for connecting a monitor, keyboard, mouse, and even the internet. It's the real deal—a full-fledged computer! The only difference is

that while most computers run Windows or macOS, Raspberry Pi runs something called Linux. Specifically, it's a version called Raspbian, made just for Raspberry Pi. But here's the thing: not all Raspberry Pis are the same. There are several versions, each with its own features.

First up, we have the Pi Zero. It's the tiny one, but don't let its size fool you—it packs a punch! However, because it's so small, you might need a few extra adapters to connect everything to it. Once you get the hang of it, the Pi Zero is perfect for projects with limited space. It's the cheapest option, but it's not as powerful as its bigger siblings. Now,

we have the Model A series, the middle child of the Raspberry Pi family, so to speak. It's not as speedy as the Model B, but it's not as tiny as the Pi Zero either. It's got a faster processor than the Pi Zero, along with a full-sized USB port, audio port, and HDMI port. The newer versions even come with built-in wireless and Bluetooth. Next up, we have the Model B series. This one's got all the bells and whistles. It's so powerful it can probably knit a Hogwarts sweater while rescuing a cat from a tree. It boasts four USB ports, a full-sized Ethernet jack, and up to four gigabytes of memory. With a quad-core

processor and support for dual monitors, it's more powerful than many laptops on the market. But here's the real magic of Raspberry Pi: the GPIO pins. These little pins allow you to send and receive electrical signals, controlled right from the operating system. They're what make Raspberry Pi so versatile and popular among makers and hobbyists alike. Being able to manipulate electrical signals allows you to control various electronic devices.

Starting with simple applications, such as LED lights, motors, buttons, switches, radio signals, audio signals, and LCDs, the possibilities are endless. You could design your own mini arcade system with joysticks and buttons. The GPIO pins of the Raspberry Pi bridge the gap between computing and the physical world, offering endless creative opportunities.

What are your thoughts on quantum computers and the potential dangers they bring? Let us know your opinion in the comments below. And if you enjoyed watching this video, make sure to give it a thumbs up and subscribe to the channel.

2024-04-09 13:39

