Qubit Design Basics


  • View profile for Jeremy Connell-Waite

    Global Communications Designer 👁️ 🐝 Ⓜ️ | Author of “The 109 Rules of Storytelling”

    87,496 followers

    Tell me about QUANTUM COMPUTING in 2 minutes or less, using language my kid can understand. Challenge accepted. This was a question I got recently in a Q&A. I tried to channel my inner Hemingway: big ideas, small words, and short sentences! So if you fancy learning something new today, here's my take, and some useful resources worth checking out if you want a deeper dive. ⬇️

    Imagine a computer that doesn’t just think in ones and zeros, like the ones we use today. A quantum computer uses "qubits" instead of bits. A bit can be a 1 or a 0. But a qubit can be both at the same time — this is called "superposition". It’s like flipping a coin and having it be heads and tails until you look.

    Quantum computers also use something called entanglement. When two qubits are entangled, what happens to one instantly affects the other, even if they’re far apart. This lets quantum computers connect ideas in powerful new ways.

    Because of superposition and entanglement, a quantum computer can explore many answers at once instead of one by one. That makes it super fast for some problems. It could help discover new medicines, protect data (search "quantum safe"), fight climate change, or even train smarter (ethical) AI.

    But quantum computers are very hard to build. Qubits are delicate and can lose their power if they get too hot or too noisy. Scientists all over the world are racing to make them stronger and more stable. Quantum computers have to be kept at extremely low temperatures (-459°F), which is even colder than outer space!

    If they succeed, quantum computers could solve problems so big that today’s fastest supercomputers would take thousands of years to finish. Quantum computers won’t replace classical computers, but they will help us solve many problems that we’ve never been able to solve before.

    Quantum computers are not just faster: they give us a whole new way to understand the world. [263 words / 2 minutes]

    ⬇️ Want a Deeper Dive?
    🥶 WATCH: Quantum computers explained by MKBHD [17 mins] https://lnkd.in/eNdRycfu
    📒 READ: Wired's Easy Guide to Quantum Computing - Why It Works & How It Could Change The World https://lnkd.in/eiuAHxnQ
    📖 FREE book "The Quantum Decade" from the IBM Institute for Business Value https://lnkd.in/ejMCnKTX
    🗺️ FUTURE: The Next 5 Years? Technology Atlas by IBM https://lnkd.in/ePaWdATp
    📝 LEARN: 10 FREE courses (most courses cost $2,500+; these 10 will get you started) https://lnkd.in/eM3k-Dtt

  • View profile for Peter Bordow

    Distinguished Engineer, Managing Director and PQC/Quantum Systems & Emerging Technologies R&D Leader for Cybersecurity at Wells Fargo

    6,076 followers

    I'm excited to share this Case Study for Quantum Entropy Injection into HSMs for Post-Quantum Cryptographic (PQC) Key Generation that our amazing PQC team and I recently completed.

    In cybersecurity, entropy is the measure of randomness in a string of bits. In cryptography, entropy is used to produce random numbers, which in turn are used to produce cryptographic keys. As entropy increases, randomness gets better, keys become more difficult to determine, and security improves. Entropy is also important for generating other critical security parameters such as seeds, salts, and initialization vectors for cryptographic algorithms.

    Financial institutions must deal with the constant risk of cyber-attacks, underlining the responsibility to maintain and strengthen digital security for customers’ trust and integrity. A foundational step toward addressing these issues is generating stronger cryptographic keys with better entropy (as part of a broader Defense in Depth PQC strategy). Using random bits (from quantum-sourced entropy) with proven randomness and unpredictability is pivotal for both today’s classical cryptography and tomorrow’s quantum-resistant cryptography.

    Wells Fargo, Thales, and Quantinuum, working in collaboration, demonstrated the ability to generate strong cryptographic keys within the cryptographic boundary of a Thales Luna HSM, a FIPS 140-2 Level 3 cryptographic module, using external entropy. The keys were generated from random bits with verified quantum entropy acquired from the Quantinuum Origin trapped-ion quantum computer and validated using a Bell test to prove they met the threshold for quantum entropy. This cryptographic solution gives Wells Fargo a proven quantum entropy source to generate ultra-secure keys that can be designed and deployed at scale.
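As a rough illustration of entropy as a measure of randomness in a bit string, here is a minimal sketch of my own (not the min-entropy assessment an HSM vendor would actually run, which follows NIST SP 800-90B): it estimates Shannon entropy, so a biased stream scores low while a balanced stream approaches the 1-bit-per-symbol maximum.

```python
import math
from collections import Counter

def shannon_entropy(bits: str) -> float:
    """Estimate Shannon entropy (bits per symbol) of a bit string."""
    n = len(bits)
    counts = Counter(bits)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("0000000000000001"))  # heavily biased: ~0.34 bits/symbol
print(shannon_entropy("0110100110010110"))  # balanced: 1.0 bits/symbol
```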

  • View profile for Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 12,000+ direct connections & 35,000+ followers.

    35,184 followers

    MIT Sets Quantum Computing Record with 99.998% Fidelity

    Researchers at MIT have achieved a world-record single-qubit fidelity of 99.998% using a superconducting qubit known as fluxonium. This breakthrough represents a significant step toward practical quantum computing by addressing one of the field’s greatest challenges: mitigating the noise and control imperfections that lead to operational errors.

    Key Highlights:
    1. The Problem: Noise and Errors
    • Qubits, the building blocks of quantum computers, are highly sensitive to noise and imperfections in control mechanisms.
    • Such disturbances introduce errors that limit the complexity and duration of quantum algorithms. “These errors ultimately cap the performance of quantum systems,” the researchers noted.
    2. The Solution: Two New Techniques
    To overcome these challenges, the MIT team developed two innovative techniques:
    • Commensurate Pulses: timing quantum pulses precisely so that counter-rotating errors become uniform and correctable.
    • Circularly Polarized Microwaves: by creating a synthetic version of circularly polarized light, the team improved control of the qubit’s state, further enhancing fidelity. “Getting rid of these errors was a fun challenge for us,” said David Rower, PhD ’24, one of the study’s lead researchers.
    3. Fluxonium Qubits and Their Potential
    • Fluxonium qubits are superconducting circuits with unique properties that make them more resistant to environmental noise than traditional qubits.
    • By applying the new error-mitigation techniques, the team unlocked fluxonium’s potential to operate at near-perfect fidelity.
    4. Implications for Quantum Computing
    • Achieving 99.998% fidelity significantly reduces errors in quantum operations, paving the way for more complex and reliable quantum algorithms.
    • This milestone represents a major step toward scalable quantum computing systems capable of solving real-world problems.

    What’s Next?
    The team plans to expand this work by exploring multi-qubit systems and integrating the error-mitigation techniques into larger quantum architectures. Such advances could accelerate progress toward error-corrected, fault-tolerant quantum computers.

    Conclusion: A Leap Toward Practical Quantum Systems
    MIT’s achievement underscores the importance of innovation in error correction and control to overcome the fundamental challenges of quantum computing. This breakthrough brings us closer to large-scale quantum systems that could transform fields such as cryptography, materials science, and complex optimization.
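To put 99.998% in perspective, a quick back-of-the-envelope calculation (my own arithmetic, not a figure from the MIT paper): at that fidelity each gate fails with probability 2e-5, so on average tens of thousands of single-qubit gates can run before the first error appears.

```python
import math

fidelity = 0.99998
error_per_gate = 1 - fidelity          # 2e-5 error probability per operation

# Expected number of gates before the first error (geometric distribution):
print(round(1 / error_per_gate))       # 50000

# Circuit depth at which the chance of running error-free drops below 50%:
print(round(math.log(0.5) / math.log(fidelity)))
```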

  • View profile for Dr. Ayesha Khanna

    AI Entrepreneur and Advisor. Board Member. Forbes Groundbreaking Female Entrepreneur in Southeast Asia. LinkedIn Top Voice for AI.

    81,386 followers

    Here’s something that has never happened before: JPMorganChase says it has used quantum computing to create truly random numbers for the first time ever.

    Here’s why this is big: random numbers used by computers aren’t truly random. They’re created by algorithms that follow rules, so if you know the starting point, you can guess the rest. That’s a problem for things like encryption, where predictability is a risk. Quantum computers, however, can tap into the randomness of quantum physics itself—producing numbers with no predictable pattern at all. And for the first time, JPMorgan has proven that kind of randomness is real.

    🔐 Random numbers are key to keeping digital systems secure, from cryptography and online banking to encrypted messaging apps. For example, when you send a message on WhatsApp, it’s scrambled using a secret key so only the person you’re messaging can read it. That secret key is based on randomly generated numbers. If the numbers aren’t truly random, hackers could figure them out and decrypt your messages.

    To verify their output, JPMorgan employed US Department of Energy supercomputers, demonstrating that the randomness met rigorous mathematical standards. For a technology still in its early stages, this is a big step toward real-world quantum applications. It could support use cases in high-stakes industries like finance and critical infrastructure, where verifiable randomness is crucial for audit trails, encryption protocols, and risk modeling.

    So, what’s in a random number? Just your bank account, your messages, your identity—pretty much everything worth protecting.

    #artificialintelligence #innovation
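The "if you know the starting point, you can guess the rest" problem is easy to demonstrate with an ordinary software PRNG (a toy illustration only, nothing to do with JPMorgan's actual protocol): two generators given the same seed emit identical "random" streams.

```python
import random

a = random.Random(42)   # Mersenne Twister PRNG, seeded
b = random.Random(42)   # same seed: an attacker's perfect replica

stream_a = [a.randint(0, 255) for _ in range(8)]
stream_b = [b.randint(0, 255) for _ in range(8)]

# The stream is fully determined by the seed, hence fully predictable:
print(stream_a == stream_b)  # True
```

This is why deterministic generators alone are unsuitable for key generation, and why a source of physical (here, quantum) randomness matters.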

  • View profile for Jaime Gómez García

    Global Head of Santander Quantum Threat Program | Chair of Europol Quantum Safe Financial Forum | Representative at EU Quantum Industry Consortium, AMETIC | LinkedIn QuantumTopVoices 2022-2024 | Quantum Leap Award 2025

    16,157 followers

    Microsoft and Quantinuum reach a new milestone in quantum error correction. The collaboration claims to have used an innovative qubit-virtualization system on Quantinuum's H2 ion-trap platform to create 4 highly reliable logical qubits from only 30 physical qubits.

    What is quantum error correction? Physical qubits, with error rates on the order of 10^-2, are combined to deliver logical qubits with error rates on the order of 10^-5. According to their press release, this is the largest gap between physical and logical error rates reported to date, and it allowed them to run more than 14,000 individual experiments without a single error. (https://lnkd.in/dzETsvVA)

    The race for qubit count seemed to end in 2023, with the latest update to IBM's roadmap focusing on quality rather than quantity (https://lnkd.in/dFu52wJR, "Until this year, our path was scaling the number of qubits. Going forward we will add a new metric, gate operations—a measure of the workloads our systems can run."), and with other developments in quantum error correction, like the one announced in December by Harvard University, Massachusetts Institute of Technology, QuEra Computing Inc. and National Institute of Standards and Technology (NIST)/University of Maryland (https://lnkd.in/dkW-TT-w).

    Practical quantum computing gets a little closer, although it is still a distant target.

    Microsoft press release: https://lnkd.in/deJ4QCBk
    Quantinuum's press release: https://lnkd.in/d4Wnmvdq
    More details from Microsoft: https://lnkd.in/dusfZ4KY
    Paper: https://lnkd.in/dpPCX3td

    #quantumcomputing #quantumerrorcorrection #technology
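The jump from 10^-2 physical to 10^-5 logical error rates reflects the textbook below-threshold scaling for error-correcting codes, p_L ≈ A·(p/p_th)^((d+1)/2), where d is the code distance. The sketch below uses an illustrative threshold and prefactor of my own choosing (not Quantinuum's or Microsoft's numbers) purely to show how suppression compounds with distance.

```python
def logical_error_rate(p_phys: float, d: int,
                       p_threshold: float = 1e-1, prefactor: float = 0.1) -> float:
    """Textbook below-threshold scaling: p_L ~ A * (p/p_th)^((d+1)/2).
    Threshold (1e-1) and prefactor (0.1) are illustrative assumptions."""
    return prefactor * (p_phys / p_threshold) ** ((d + 1) // 2)

# With a 1e-2 physical error rate, each step up in code distance
# buys roughly another factor of 10 in logical error rate:
for d in (3, 5, 7):
    print(d, logical_error_rate(1e-2, d))
```

Under these assumed parameters, distance 7 already lands at the 10^-5 scale quoted in the post; the real suppression factor depends on the code and hardware.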

  • View profile for Kurt Cagle

    Editor In Chief @ The Cagle Report | Ontologist | Author | Iconoclast

    25,355 followers

    US researchers have achieved quantum teleportation over 30 kilometers using standard internet fiber optic cables, a major step towards secure quantum networks. This process used entangled particles to transmit quantum states while coexisting with regular internet traffic, proving compatibility between quantum and classical communication. The breakthrough, published in Optica, eliminates the need for costly infrastructure, paving the way for advanced applications in quantum computing, faster data sharing, and highly secure communication systems. This milestone demonstrates the practicality of integrating quantum technology into existing networks. Source – ZME Science

    I have regularly been critical of quantum computing, but there's another area of quantum mechanics - entanglement - that I think holds far more potential short term. Entanglement (aka "spooky action at a distance," per Einstein) causes two particles to effectively act as if they were the same particle, even when separated by sizeable distances. If you influence one particle, the other particle will change state without any intervening transmission, and this change of state (such as polarity) can then be detected. This experiment showed that you can transmit one of a pair of such particles across fiber optic cables and maintain entanglement.

    The upshot is very interesting, because it means that messages can be sent point to point without having to be routed through a complex network. Not only would this have a huge impact on the speed of such systems, but the communication would be completely secure, as there is no possibility of a man-in-the-middle attack. It also reduces the need for big cryptographic keys and future-proofs against quantum decoding.

  • View profile for Kai Beckmann

    Deputy Chairman of the Executive Board at Merck KGaA

    30,529 followers

    For the first time, researchers were able to observe quantum coherence at room temperature.

    The majority of quantum chips only function near absolute zero, approximately minus 273 degrees Celsius. This is because qubits are fragile and only operate error-free without external influences. That's why the most prominent part of a quantum computer is its cooling unit – the so-called cryocooler. Additionally, these sensitive computers need to be shielded from the outside world using a vacuum. At room temperature, the quantum information stored in qubits loses its superposition and entanglement.

    While quantum coherence at room temperature was observed for just over 100 nanoseconds, the findings could help pave the way for designing new materials for innovative solutions. Improving concepts for qubits and quantum processors could bring us one step closer to practical quantum computers.

    Research is ongoing, and at the center of it all are materials.

    #quantumcomputing #innovation #technology via SciTechDaily https://lnkd.in/dDm6HhX8

  • View profile for Marco Pistoia

    CEO, IonQ Italia

    18,083 followers

    Two days ago, we were proud to see Nature publish our article on Certified Quantum Randomness, a task we demonstrated on the Quantinuum H2 trapped-ion #quantum computer and one that is unattainable on any classical supercomputer. Unlike the randomness sources accessible on today's classical computers, the output of our #quantumcomputing-based protocol can be certified to be random under certain computational-hardness assumptions, with no trust required in the hardware generating the randomness.

    We are humbled by the enthusiastic response we received from the scientific community and industry. To better explain the usefulness of Certified Quantum Randomness in industry, we wrote a companion perspective paper, entitled "Applications of Certified Randomness," now available as an arXiv preprint at the following URL: https://lnkd.in/eCX7vDXP

    In this perspective, we explore real-world applications for which the use of certified randomness protocols may lead to improved security and fairness. We identify promising applications in areas including #cryptography, differential #privacy, financial markets, and #blockchain. Through this initial exploration, we hope to shed light on potential applications of certified randomness.

  • View profile for Laurent Prost

    Product Manager at Alice & Bob

    5,474 followers

    Google's Willow chip shows that quantum error correction is starting to work. Just "starting", because while the ~1e-3 error rate reached by Willow is good, it has been achieved by others without error correction.

    So, how do we get error rates we couldn't reach with physical qubits alone? Easy: you "just" add more qubits to your logical qubit. But because quantum computing suffers errors along two dimensions, a 2D structure (the surface code) is usually required to correct them. This means that increasing protection against errors causes the number of qubits to grow quickly.

    With a surface code, protecting against 1 error at a time during an error correction cycle requires 17 qubits. 2 errors at a time? 49 qubits. 3 errors at a time? 97 qubits. This is the max Willow could achieve. This quadratic scaling leads Google to expect that reaching a 1e-6 error rate on a Willow-like chip will require some 1457 physical qubits (protecting against 13 errors at a time).

    And this is why Alice & Bob is going for cat qubits instead. By reducing error correction from a 2D to a 1D problem, cat qubits make the scaling of error rates much more favorable. Even with the simplest error correction code (a repetition code), correcting one error at a time requires only 5 qubits. 2 errors? 9 qubits. 3 errors? 13 qubits. 13 errors? Just 53 qubits instead of 1457!

    This situation is summarized in the graph below. It is taken from our white paper (link in the 1st comment), and I added a point corresponding to the biggest Willow experiment.

    Now, to be fair, Alice & Bob still needs to release the results of even a 5-qubit experiment. But when that is done, there is a fair chance the error rates will quickly catch up with those achieved by Google and others, because so few additional qubits are needed to improve error rates.

    There are big challenges on both sides. Mastering cat qubits is hard. Scaling chips is hard. But consistent progress is being made on both sides too. Anyway, I can't wait for the moment when I can add the Alice & Bob equivalent of the Willow experiment to the chart below. And for once, I hope it will be up and to the left!
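The qubit counts quoted above follow from the standard formulas for these codes: a distance-d surface code uses 2d²-1 physical qubits (d² data plus d²-1 ancilla) and a distance-d repetition code uses 2d-1, where distance d = 2t+1 corrects t errors per cycle. A quick sketch reproducing the numbers in the post:

```python
def surface_code_qubits(t: int) -> int:
    """Physical qubits (data + ancilla) in a surface code
    correcting t errors per cycle: 2d^2 - 1 with d = 2t + 1."""
    d = 2 * t + 1
    return 2 * d * d - 1

def repetition_code_qubits(t: int) -> int:
    """Physical qubits in a repetition code correcting t errors: 2d - 1."""
    d = 2 * t + 1
    return 2 * d - 1

for t in (1, 2, 3, 13):
    print(f"t={t}: surface={surface_code_qubits(t)}, "
          f"repetition={repetition_code_qubits(t)}")
# t=1:  surface=17,   repetition=5
# t=13: surface=1457, repetition=53
```

The quadratic-versus-linear gap in these two formulas is exactly the scaling argument the post makes.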

  • View profile for Maciej Malinowski

    Lead Quantum Computer Architect at Oxford Ionics // Building next-gen compute hardware // Atoms and bits

    2,512 followers

    The real significance of Google's Willow quantum chip...

    Fundamentally, building quantum computers (QC) is about achieving low operation errors. Sure, other metrics matter too, but the error rate is the big one. If you look at the landscape of QC applications, many of them require *ridiculously* low error rates - say 1 error in 10^12 operations or less. Nobody thinks this can be achieved through hardware engineering alone - it needs quantum error correction (QEC) for sure.

    But should we be confident that QEC will actually work? Sure, it will work to some extent - but can it work well enough to reach error rates as low as 1e-12 or less? QEC makes non-trivial assumptions about the nature of the physical errors which are never quite true, and deviations from those assumptions could plausibly derail QEC by setting a "logical noise floor" - an error rate below which QEC ceases to help.

    The previous most thorough search for the logical noise floor in QEC was performed by Google in 2023. At that time, they found that QEC ceased to work at a rather high error rate of 1e-6. This was due to high-energy cosmic rays hitting their qubit chips, causing large-scale correlated errors which cannot be taken out by QEC. That's a *big* issue!

    Google's latest chip incorporates design changes to make it immune to cosmic ray errors. After incorporating those changes, the logical noise floor search was repeated and reported in the recent paper. It turns out the mitigation worked, and the logical noise floor was pushed all the way down to a new record of 1e-10, i.e. 1 error per 10^10 operations! This is the most convincing evidence to date that - in a well-engineered QC - QEC is actually capable of pushing error rates down to levels compatible with most known QC applications. To me, this repetition-code result is actually the most important finding reported in Google's paper!

    Funnily enough, Google's team reports that they actually don't know where this residual error is coming from. Error rates this low are also really challenging to study, because it can take considerable data acquisition time to establish meaningful statistics. But I'm sure they'll figure it out soon enough... 😇
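A classical toy model shows why a repetition code suppresses errors (it ignores everything quantum: no phase errors, no syndrome extraction, so it sketches only the counting argument, not Google's experiment): majority voting over d noisy copies fails only when more than half of them flip, so the failure rate drops rapidly as d grows.

```python
import random

def majority_flipped(flips) -> bool:
    """Decoding fails when a majority of the copies flipped."""
    return sum(flips) > len(flips) // 2

def logical_error_prob(p: float, d: int,
                       trials: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of the majority-vote failure rate when
    each of d copies flips independently with probability p."""
    rng = random.Random(seed)
    failures = sum(
        majority_flipped([rng.random() < p for _ in range(d)])
        for _ in range(trials)
    )
    return failures / trials

# 10% physical error rate: more copies, sharply fewer logical errors
for d in (1, 3, 7):
    print(d, logical_error_prob(0.1, d))
```

For independent errors this suppression continues indefinitely; the point of the noise-floor search is precisely that correlated real-world errors (like cosmic ray strikes) can break the independence assumption this toy model relies on.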
