Cultivating Optimism: A Skill for Success

Optimism is a remarkable and transformative belief that invites us to view our circumstances through a lens of potential and hope. It asserts that improvement is not just a distant dream but a reality within our grasp, even in the face of complex challenges and significant constraints. Instead of waiting for absolute certainty to take action, we are encouraged to trust that clarity and solutions will emerge through our engagement with the world around us.

This perspective extends far beyond a mere positive attitude; it fundamentally shapes our approach to life’s obstacles. By allowing us to let go of fear and embrace new possibilities, optimism empowers us to move forward, even when the path ahead seems uncertain. It prevents us from being immobilized by the risks and potential failures that often overshadow our ambitions.

Where fear may narrow our focus and amplify doubts, optimism broadens our horizon, revealing opportunities alongside risks. It fosters creativity, bolsters innovative thinking, and instills the confidence needed to confront difficulties. Setbacks transform from mere failures into invaluable learning experiences that guide our next steps. By adopting this optimistic mindset, we pave the way for decisive action amid uncertainty, driven by the belief that our efforts will ultimately yield positive outcomes. Over time, this approach diminishes the hold of fear and fortifies our confidence in the pursuit of success.

The Art of Navigating Uncertainty

In professional environments, optimism shows up when we share ideas, ask for feedback, and see pushback as helpful instead of negative. In our personal lives, it means making progress even when we don’t know the outcome, trusting that moving forward will make things clearer.

People who expect success and move forward with confidence stay engaged longer. Instead of viewing setbacks as reasons to give up, they see them as signs to change their approach. This attitude promotes ongoing improvement in our ideas. Rather than asking, “What if this fails?” optimism leads us to think, “What if this helps me grow and teaches me something important?”

Research shows that optimism is more than just feeling good; it can lead to real success. A well-known study found that optimistic workers performed better than those who weren’t as hopeful, not just because they worked hard, but because they believed success was possible, even when challenges appeared [1]. By focusing on our ideas’ potential for success rather than fears of failure, we can turn setbacks into chances to learn and adopt a mindset where anything seems possible, even with tough problems.

Stories of Growth in Design

When faced with criticism or obstacles, an optimistic designer doesn’t back down – instead, they listen carefully, rethink their approach, and make improvements. Take, for example, a designer presenting ideas to stakeholders who seem doubtful; rather than viewing skepticism as rejection, optimism encourages open dialogue and teamwork, which can lead to better outcomes. This collaborative atmosphere can foster creative solutions that might not have been considered otherwise.

Additionally, the influence of optimism on problem solving is profound. An optimistic designer approaches challenges with a solution-oriented mindset, exploring multiple angles and possibilities. They become adept at adjusting their strategies in real-time, allowing them to pivot when necessary. This flexibility not only enhances the design process but also builds resilience, enabling designers to bounce back from setbacks with renewed vigor.

This positive mindset helps designers stay focused on what works, learn from setbacks, and see every challenge as something that can be overcome, even when the problems are tough. It nurtures a culture of innovation, where experimentation and risk-taking are encouraged. This can lead to groundbreaking ideas that elevate design projects. When designers embrace an optimistic view, they are more likely to inspire those around them, fostering an environment where creativity thrives.

Furthermore, optimism extends beyond design into everyday life and long-term goals. It shapes how we interact with colleagues, clients, and even family, promoting stronger relationships built on trust and mutual respect. Instead of worrying about being judged or failing, optimism encourages us to engage with the world in meaningful ways. This perspective allows us to approach uncertainty with confidence, viewing our efforts, big or small, as essential steps in our journey toward success. It empowers individuals to take initiative, volunteer for new projects, and seek out opportunities for growth, ultimately contributing to both personal and professional development.

Embracing the Power of Optimism

Optimism is not a magical gift; it’s a skill you can nurture through purposeful habits:

  • Shift your self-talk: See setbacks as valuable lessons rather than proof of shortcomings, without being consumed by how the setback made you feel. Our self-talk drives our actions, and the more we cultivate positive self-talk, the more we will see it reflected in our actions and outcomes, whether at work or in everyday life. This process requires intentional effort and mindfulness; we can start by recognizing negative patterns in our self-talk and consciously replacing them with affirming statements. Over time, by continuously reinforcing positive narratives, we establish a healthier mindset, which not only benefits our personal growth but also enhances our relationships and overall well-being.
  • Acknowledge progress: Small achievements build confidence, making bigger goals seem within reach. Completing even the smallest task leaves a mark on our subconscious mind, fostering a positive feedback loop that encourages us to keep pushing ourselves to achieve more every day. This recognition not only propels our motivation but also reinforces our belief in our capabilities, allowing us to set and pursue increasingly ambitious objectives with a sense of purpose and enthusiasm.
  • Connect with supportive people: Our choices in company and environment shape our mindset and actions. Surrounding ourselves with individuals who uplift, encourage, and inspire can significantly influence our personal growth and overall well-being. Seeking out a community that shares similar values, aspirations, and interests can foster a sense of belonging and motivation, enhancing our ability to navigate life’s challenges and achieve our goals.
  • Express gratitude: Pay attention to what you have and what matters now, instead of dwelling on what’s absent. Practicing gratitude can have a profound impact on your overall well-being and happiness. It helps you appreciate the positive aspects of your life, leading to a clearer perspective. Gratitude changes our mindset to one of opportunity and abundance rather than one of lack. This shift in focus is essential, as it trains our subconscious mind to find the opportunities that surround us, ultimately fostering a more fulfilling and enriched life. By regularly acknowledging and valuing the good in your life, you can create a positive feedback loop that enhances your emotional resilience and overall life satisfaction.

Conclusion

Building optimism enriches both our personal and professional lives, enabling us to approach uncertainty with curiosity rather than apprehension. When we examine design, careers, relationships, and ambitions through an optimistic lens, we empower ourselves as a collective: fear no longer constrains our choices or actions. Instead, we learn to identify new possibilities, viewing setbacks as stepping stones toward success because we trust in our collective capacity to take constructive action.

By replacing fear with optimism, we unlock a wealth of opportunities that the universe has to offer. An optimistic perspective allows us to see the abundance that surrounds us, revealing resources and connections that we might have previously overlooked. As we cultivate an optimistic outlook together, we align ourselves with the flow of life, inviting growth and prosperity into our shared experiences.

In essence, optimism acts as a magnifying glass that amplifies our awareness of possibilities, encouraging us to reach for what is available and achievable. As we embrace this powerful mindset collectively, we not only enhance our potential as individuals but also inspire those around us to recognize the vast opportunities that lie ahead. This shift towards optimism nurtures a culture of collaboration and support, where everyone is empowered to explore their unique paths while contributing to the abundance available in the universe.

[1] Seligman, M. E. P. (2006). Learned optimism: How to change your mind and your life (2nd ed.). Vintage Books.

Transformative Discovery: Integrating Coaching Principles for Project Success

The Human-Centered Approach to Discovery

At the core of effective discovery work lies the importance of coaching when gathering requirements. Over time, I’ve realized that meaningful insights rarely emerge from rigid templates or formal interviews; instead, they arise through genuine conversations where people feel supported enough to pause, think deeply, and express what they need.

Often, an initial request such as “We need a dashboard,” or “Can you shorten this workflow?” uncovers more fundamental issues like decision-making, team alignment, confidence, or communication barriers. By approaching discovery with a coaching mindset, we can reveal these underlying concerns rather than just addressing superficial symptoms. If you’ve ever experienced a discovery session that seemed more like coaching than interviewing, you’ll recognize the value of intentionally cultivating this dynamic.

Reflecting on my recent years of interviews, I’ve noticed a shift: they increasingly resemble coaching sessions. Initially, I thought I was merely “collecting requirements,” but over time, it became clear I was guiding people in clarifying their actual needs. Rather than just recording their requests, I was facilitating their thinking.

In early design meetings, users typically begin with basic asks: “We want a dashboard,” “Can you make this workflow shorter?” “Can we have a button that does X?” These are useful starting points, but they seldom tell the whole story. When I consciously adopt a coaching approach, slowing down, listening attentively, and posing thoughtful questions, the dialogue changes dramatically. At that moment, our focus shifts beyond the user interface into deeper topics: friction, decision-making processes, confidence, accountability, ambiguity, and the human elements hidden beneath feature requests.

Many professionals who have spent decades in their roles rarely get the chance to reflect on the patterns shaping their daily work. So, when I ask something as straightforward as, “What’s the hardest part about planning next season?” the answer often reveals gaps and bottlenecks behind the scenes, rather than issues with the software itself. These stories simply don’t surface during standard meetings.

Uncovering Deeper Insights through Curiosity and Coaching

Curiosity allows us to explore areas untouched by process charts and requirement documents. Prioritizing the individual over the process exposes context that’s invisible on paper, like emotional burden, workplace politics, quiet worries, workarounds, and shared tribal knowledge. Coaching fosters an environment where all these factors come to light, transforming them into valuable material for design decisions.

I used to think the better I got at systems, the less I’d need to do this. But it turned out the opposite is true. The better the system, the more human the conversations become. Coaching is almost like a bridge, helping people cross from “I think I need this feature” to “Here’s what I’m actually trying to solve.”

Active Listening and Guided Curiosity

Active listening forms the core of my approach, ensuring I deeply understand not just participants’ words but the meaning behind them. I reflect statements back — such as, “So it sounds like the challenge isn’t entering the data, it’s aligning on which data to trust, right?” — to confirm genuine understanding. This often transforms technical discussions into conversations about alignment, ownership, or governance.

A key tool is the “Five Whys” technique, which I use as a guide for curiosity rather than a rigid checklist. If someone requests better notifications, I’ll probe: “Why is that important?” and follow with questions like, “Why is it hard to notice things right now?” or, “What happens when you miss something?” By the fourth or fifth ‘why,’ the conversation surfaces underlying factors such as workload, confidence, or fear of missing out, revealing emotional and operational triggers beneath the initial request.

In workplaces, these deeper issues often connect to organizational culture. For example, a request for faster workflows sometimes indicates a real need for predictability or reduced chaos, rooted in communication or authority structures rather than the system itself. Recognizing these patterns enables more effective design decisions by addressing root causes instead of just symptoms.

Intentional silence is another valuable technique. After asking a question, I resist filling the pause, giving participants space to think and speak freely. This silence often prompts unfiltered insights, especially when someone is on the verge of articulating something new. Allowing this space helps participants trust and own their insights, leading to more meaningful outcomes.

Future-Focused Exploration and Empowering Language

I also employ future-anchoring questions like, “Imagine it’s six months after launch — what does success look like for you?” or, “If the system made your job easier in one specific way, what would that be?” These help participants shift from immediate concerns to aspirational thinking, revealing priorities such as autonomy or coordination that guide design principles.

Tone and language are critical for psychological safety. I aim to make discovery feel inviting, often assuring participants, “There’s no wrong answer here,” or encouraging them to think out loud. When people use absolutes — “We always have to redo this,” “No one ever gives us the right information” — it signals where they feel stuck. I gently challenge these constraints by asking, “What might need to change for that to be different?” This opens possibilities and helps distinguish between real and internalized limitations. Coaching-based discovery is key to uncovering and addressing these constraints for lasting change.

Reflections and Takeaways

Coaching Tools as Foundational Practice

Initially, I viewed coaching tools as separate from implementation work, and more of an optional soft skill than a crucial element. Over time, my outlook changed: I saw these tools as fundamental to successful outcomes. I noticed that the best results happened when participants truly took ownership of the insights we discovered together. That sense of ownership was strongest when the understanding came from them, even with my guidance. Insights gained this way tend to last longer and have a greater impact.

My approach to discovery has evolved significantly over time. Initially, I viewed discovery as a process focused on extracting insights from users. More recently, it has transitioned into facilitating users’ own self-discovery, enabling them to articulate intuitions and knowledge that may have previously been unexpressed. This progression from a transactional checklist to a collaborative and transformative meaning-making practice has had a substantial impact on my design methodology.

Efficiency through Early Alignment and Clarity

Contrary to prevailing assumptions, coaching-based discovery does not impede project timelines. Although it demands greater initial investment of time, the resulting enhanced alignment and mutual understanding often expedite progress. Early engagement in substantive discussions enables teams to minimize rework, clarify decision-making processes, and avoid misinterpretations, which can ultimately result in projects being completed ahead of schedule due to unified objectives.

Efficiency is driven by clarity. When users feel acknowledged and their perspectives are incorporated, their level of engagement and willingness to collaborate increases. The trust established during these interactions persists throughout testing, feedback, and rollout stages, mitigating many subsequent problems that typically occur when user requirements are not considered from the outset.

Strong Implementation Questions Are Strong Coaching Questions

At their core, effective implementation questions are essentially strong coaching questions. These are fuelled by curiosity, maintain a non-judgmental tone, and aim to empower others. Instead of guiding someone toward a set answer, such questions encourage individuals to uncover their own insights about the work.

Regardless of the type of discovery — be it design, implementation, or workflow — insight comes from those directly involved. Coaching goes beyond mere technique; it represents a mindset based on the belief that people already hold valuable wisdom. The coach’s job is to help draw out this knowledge, using thoughtful questions.

A key moment in coaching-based discovery happens when someone has a sudden realization, saying things like, “I’ve never thought about it that way,” or “Now I understand why this keeps happening.” These moments are where improvements in design and implementation begin.

Such realizations act as anchors throughout a project. When team members shift their understanding, these breakthroughs can be revisited during times of complexity or tough decisions, providing direction as a “north star” to keep teams aligned.

Coaching is not just a resource; it should be demonstrated in everyday interactions. As teams experience its benefits, they often adopt coaching practices with each other, leading to genuine transformation that extends past individual projects and influences wider workplace culture.

Ultimately, the real value of this work lies not just in the solutions themselves, but in the conversations that reshape how people engage with their work.

Atom Loss: A Bottleneck in Quantum Computing

It was believed that a reliable quantum computer running indefinitely was a decade or more away. With these new advancements in mitigating atom loss, quantum computers running indefinitely and producing reliable results are only a few years away.

Introduction

Until recently, quantum computers faced a significant obstacle known as ‘atom loss’, which limited their advancement and their ability to operate for long durations. At the heart of these systems are quantum bits, or qubits, which represent information in a quantum state, allowing them to be 0, 1, or both simultaneously thanks to superposition. Qubits are realized in carefully controlled quantum systems, such as individual atoms, and engineered through precise manipulation and measurement of quantum mechanical properties.

Historically, this atom loss phenomenon restricted quantum computers to performing computations for only a few milliseconds. Even the most sophisticated machines struggled to operate beyond a few seconds. However, recent breakthroughs by Sandia National Laboratories and Harvard University researchers have changed this landscape dramatically. At Harvard, researchers built a quantum computer that sustained operations for over two hours [1], a substantial improvement over previous limitations. This advancement has led scientists to believe they are on the verge of enabling quantum computers to run continuously, potentially without time constraints.

What causes atom loss?

Atom loss presents a significant challenge in quantum computing, as it results in the loss of the fundamental unit of information – the qubit – along with any data it contains. During quantum computations, qubits may be lost from the system due to factors such as noise and temperature fluctuations. This phenomenon can lead to information degradation and eventual system failure. To maintain qubit stability and prevent atom loss, a stringent set of physical, environmental, and engineering conditions must be satisfied.

Environmental fluctuations

Maintaining the integrity of qubits in a quantum computing system is heavily dependent on shielding them from various environmental disturbances. Qubits are highly sensitive to noise, electromagnetic fields, and stray particles, any of which can interfere with their quantum coherence. Quantum coherence describes the ability of a qubit to remain in a stable superposition state over time; the duration of this coherence directly affects how long a qubit can function without experiencing errors.
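To make the link between coherence time and usable computation more concrete, here is a small back-of-the-envelope sketch in Python. The coherence time and gate duration below are illustrative assumptions, not figures from the article or from any particular hardware; the point is simply that the fraction of coherence remaining falls off roughly exponentially with the number of operations performed.

```python
import math

# Illustrative numbers only (not from the article): a qubit with a 10 ms
# coherence time running gates that each take 1 microsecond.
T2 = 10e-3          # coherence time, in seconds
gate_time = 1e-6    # duration of one gate, in seconds

def coherence_remaining(n_gates: int) -> float:
    """Rough fraction of coherence left after n gates, assuming exponential decay exp(-t / T2)."""
    return math.exp(-n_gates * gate_time / T2)

for n in (100, 1_000, 10_000, 100_000):
    print(f"{n:>7} gates -> ~{coherence_remaining(n):.1%} coherence remaining")
    # ~99.0%, ~90.5%, ~36.8%, ~0.0% respectively
```

Under these assumed numbers, a few hundred operations are essentially safe, but by a hundred thousand operations the qubit has effectively lost its quantum state, which is why longer coherence times translate directly into longer, more reliable computations.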

One fundamental requirement for preserving quantum coherence is the maintenance of cryogenic environments. Qubits must be kept at temperatures near absolute zero, which is essential for eliminating thermal noise and fostering the quantum behaviour necessary for reliable operations. Even slight fluctuations in temperature or the presence of external electromagnetic influences can cause the delicate quantum state of a qubit to degrade or flip unpredictably, leading to information loss and system errors [2].

These stringent environmental controls are critical for ensuring that qubits remain stable and effective throughout quantum computations, highlighting the importance of addressing environmental fluctuations as a key challenge in quantum computing.

Trap imperfections

Neutral atom processors have become a prominent platform for achieving large-scale, fault-tolerant quantum computing [3]. This approach enables qubits to be encoded in states that possess exceptionally long coherence times, often extending up to tens of seconds. The extended coherence time is crucial for maintaining quantum information over prolonged computations, which is essential for complex and reliable quantum operations.

The operation of neutral atom processors relies on the use of optical tweezer arrays. These arrays are dynamically configured, allowing qubits to be trapped in arbitrary geometries and enabling the system to scale to tens of thousands of qubits. The flexibility in configuration and scalability makes neutral atom processors especially suited for advancing quantum computing technology beyond previous limitations.

Despite these advantages, neutral atom processors are not immune to challenges. Atom loss remains a significant issue, arising from several sources. Heating within the system can cause atoms to escape their traps, while collisions with background gas particles further contribute to atom loss. Additionally, during the excitation of an atom from one quantum state to another, such as the transition to a Rydberg state, anti-trapping can occur, leading to the loss of qubits from the processor array.

Readout errors

During the process of reading out quantum information, qubits may be displaced from their positions within the two-dimensional arrays. This readout operation, which involves imaging the qubits to determine their quantum state, can inadvertently lead to the loss of qubits from the processor array. Such atom loss poses a risk to the integrity and continuity of quantum computations.

To address this challenge, neutral atom processor arrays are typically designed with additional qubits that act as a buffer. These extra qubits ensure that, even when some atoms are lost during readout or other operations, enough qubits remain available for the system to continue performing calculations reliably.
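A rough sketch of why this buffer matters: if each readout round carries some small chance of losing any given atom, the expected number of survivors shrinks with every round, so the array has to start with more atoms than the computation strictly needs. The loss probability, array size, and number of rounds below are invented purely for illustration and are not taken from the article or any real system.

```python
# Illustrative arithmetic only; all numbers below are assumptions.
n_qubits_needed = 1_000       # qubits the computation actually needs at the end
p_loss_per_readout = 0.01     # assumed chance a given atom is lost in one readout round
n_readouts = 50               # assumed number of readout (imaging) rounds

# Expected fraction of atoms that survive all readout rounds.
survival = (1 - p_loss_per_readout) ** n_readouts
print(f"expected survival per atom: {survival:.1%}")      # ~60.5%

# Spare capacity needed so that, on average, enough atoms remain at the end.
n_to_load = n_qubits_needed / survival
print(f"atoms to load up front: ~{n_to_load:.0f}")        # ~1,650
```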

Another approach to mitigating atom loss during readout is to slow down the imaging process. By reducing the speed of readout operations, the likelihood of displacing qubits can be minimized, thereby decreasing the rate at which atoms are lost from the array. However, this strategy comes with a trade-off: slowing down readout operations leads to reduced overall system efficiency, as calculations take longer to complete [4]. As a result, there is an inherent balance between maintaining qubit integrity and preserving the speed and efficiency of quantum computations.

Imperfect isolation

Maintaining perfect isolation of qubits from their environment is an immense challenge, primarily because it demands highly sophisticated and costly shielding methods. In practice, it is virtually impossible to completely shield quantum systems from external influences. As a result, stray electromagnetic signals, fluctuations in temperature, and mechanical vibrations can penetrate these defences and interact with quantum systems. Such interactions are detrimental, as they can disrupt the delicate balance required for quantum operations and ultimately lead to atom loss within the processor array [5]. These environmental disturbances compromise the stability and coherence of qubits, posing a significant obstacle to the reliability and scalability of quantum computers.

Recent solutions and research

Multiple research teams are developing ways to reduce atom loss by detecting and correcting missing atoms in quantum systems, improving calculation reliability.

Researchers at Sandia National Laboratories, in collaboration with the University of New Mexico, have published a study demonstrating, for the first time, that qubit leakage errors in neutral atom platforms can be detected without compromising or altering computational outcomes [6]. The team achieved this by utilising the alternating states of entanglement and disentanglement among atoms within the system. In experiments where the atoms were disentangled, results showed substantial deviations compared to those observed during entanglement. This approach enabled the detection of the presence of adjacent atoms without direct observation, thereby preserving the integrity of the information contained within each atom.

Ancilla qubits are essential in quantum error correction and algorithms [7]. These extra qubits help with measurement and gate implementation, yet they do not store information from the main quantum state. By weakly entangling ancilla qubits with the physical qubits, it becomes possible for them to identify errors without disturbing the actual quantum data. Thanks to non-demolition measurements, errors can be detected while keeping the physical qubit’s state intact.
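The following is a minimal state-vector sketch in plain NumPy of the general idea behind ancilla-based checks; it is my own illustration, not the Sandia team’s method or a neutral atom implementation. An ancilla is entangled with two data qubits so that measuring the ancilla reveals their parity (the error syndrome) while leaving the data superposition itself untouched.

```python
import numpy as np

def cnot(n, control, target):
    """CNOT on an n-qubit register (qubit 0 is the leftmost bit of a basis label)."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1.0
    return U

def ancilla_parity_check(data_state):
    """Attach an ancilla in |0>, CNOT both data qubits onto it, and return
    the probability that the ancilla reads 1 (odd parity = error flag)."""
    state = np.kron(data_state, [1.0, 0.0])          # data qubits 0,1 plus ancilla qubit 2
    state = cnot(3, 1, 2) @ (cnot(3, 0, 2) @ state)  # fold both data bits into the ancilla
    return sum(abs(state[i]) ** 2 for i in range(8) if i & 1)

a, b = 0.6, 0.8                                      # arbitrary superposition amplitudes
ket00 = np.kron([1.0, 0.0], [1.0, 0.0])
ket11 = np.kron([0.0, 1.0], [0.0, 1.0])
healthy = a * ket00 + b * ket11                      # encoded state a|00> + b|11>

X = np.array([[0.0, 1.0], [1.0, 0.0]])
flip_q1 = np.kron(np.eye(2), X)                      # bit-flip error on data qubit 1
damaged = flip_q1 @ healthy                          # a|01> + b|10>

print(ancilla_parity_check(healthy))                 # 0.0 -> no error flagged
print(ancilla_parity_check(damaged))                 # 1.0 -> error flagged, a and b untouched
```

Only the ancilla is ever measured, so the error is flagged without collapsing the a and b amplitudes that carry the actual quantum information.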

A group of physicists from Harvard University have recently created the first quantum computer capable of continuous operation without needing to restart [1]. By inventing a technique to replenish qubits in optical tweezer arrays as they exit the system, the researchers managed to keep the computer running for more than two hours. Their setup contains 3,000 qubits and can inject up to 300,000 atoms each second into the array, compensating for any lost qubits. This approach enables the system to maintain quantum information, even as atoms are lost and replaced. According to the Harvard team, this innovation could pave the way for quantum systems that can function indefinitely.

Conclusion

It was previously believed that atom loss could seriously hinder the progress of quantum computing: atom loss and qubit leakage were serious errors that could render calculations unreliable. With the breakthroughs introduced by researchers at Sandia National Laboratories, the University of New Mexico and Harvard University, along with a host of other teams around the world, the revolutionary advances quantum computers could bring to scientific research, medicine and finance are closer than ever. A reliable quantum computer running indefinitely was once thought to be a decade or more away; with these new ways of mitigating atom loss, quantum computers running indefinitely and producing reliable results may be only a few years away.

[1] Harvard Researchers Develop First Ever Continuously Operating Quantum Computer

[2] Quantum Chips: The Brains Behind Quantum Computing

[3] Quantum Error Correction resilient against Atom Loss

[4] Novel Solutions For Continuously Loading Large Atomic Arrays

[5] Quantum Decoherence: The Barrier to Quantum Computing

[6] A breakthrough in Quantum Error Correction

[7] Ancilla Qubit

Bringing Ideas to Life: My Journey as a Product Architect

My work is about helping clients and organizations bring their ideas to life, transforming understanding into development, and development into reality, with as little friction and as much functionality as possible.

Lately, I have been reflecting on what drew me, as a designer, to write about topics such as artificial intelligence and quantum computing. I have been fascinated with both topics and how they have transformed the way we view the world. Everything we see today in terms of advancements in AI and quantum computing started with an idea, brought to life through innovation and perseverance.

In AI, there was the idea that machine learning would transform the way we do business by leveraging large amounts of data to provide valuable insights, something that would not be easily attainable through human effort. In quantum computing, there was the idea that applying the way particles behave in the universe to computing would unlock a vast potential for computing capabilities and power, beyond what classical computers can achieve. So many other advancements and achievements in AI and quantum computing continue to be realized through the conception of ideas and the relentless pursuit of ways and methods to implement them.

Everything starts with an idea

Beyond AI and quantum computing, everything we see around us started with an idea, brought to life through continued and persistent effort to make it a reality. Every building we see, every product, every service and all material and immaterial things in our lives are the product of an idea.

As a designer and product architect, I also help make ideas a reality through persistent effort and the application of methodology that lays a roadmap for the implementation of those ideas. Similarly, AI and quantum computing are fields that are bringing novel and exciting concepts to life through the development and application of scientific methodology.

While thinking about all of this, I pondered how I would define my work and role as a designer. How would I describe my work, knowing that most of us use technology without thinking about the journey a product takes from idea to experience? What value do I bring to organizations that hire me to help them with their problems? In an age where products are incorporating ever more advanced and sophisticated technology, as is the case with AI and quantum computing, how does my work extend beyond simply developing designs and prototypes?

To answer these questions, I am drawn back to the fact that everything around us starts with an idea. As a designer, it is extremely rewarding to me to help make ideas for my clients a reality while navigating the conceptual, technical and implementation challenges.

Making the invisible useful

I’ve been thinking a lot about the similarities between how we design physical spaces and how we design digital ones. Just as a building starts as an idea in an architect’s mind, so do the products that I work on and help a multitude of organizations bring to life. As a designer, I help lay the foundations for a product idea by thoroughly understanding the motivations and needs behind it, and what benefits and improvements implementing it would bring.

Buildings serve needs by providing housing for people or serving as places to work, and for businesses and organizations to operate. A well-designed building offers an effortless flow that draws people in and makes them want to stay. Similarly, great digital design allows for seamless navigation, creating an experience that feels natural and engaging. Before an architect devises plans and drawings for a building, they must first maintain a clear vision of the idea in their mind, understand the needs behind it and ensure that their designs and plans meet those needs.

From there, the idea and concept of the building in the architect’s mind are translated into plans and drawings. Those plans are drawn and shared with a builder, who in turn collaborates with the architect to bring them to life. Without the architect and their clear vision of the idea and concept behind the building, the building would not exist, at least not in the shape and form that the architect would have imagined. It would not properly serve the needs and bring about the benefits that accompanied the original idea.

Just like a building architect, as a product architect I must also understand the needs behind digital products to create experiences that truly serve the user. Through this process, I envision flows and interactions that will enable users to achieve their goals in the simplest and easiest way possible, reducing friction while also achieving the desired business value and benefit. Like an architect, I collaborate with members of technical teams so that the idea behind the product can be realized to its full potential through detailed roadmaps, designs and prototypes.

Figure 1. Architects are masters of the invisible made useful.

An architect must possess technical and creative skills that enable them to visualize the idea of a building. The same is true for me as a product architect. Without the ability to clearly articulate complex technical concepts through detailed designs and specifications while also applying a creative lens, product ideas would not be realized to their full potential.

In summary, how do I define my work? My work is about helping clients and organizations bring their ideas to life, transforming understanding into development, and development into reality, with as little friction and as much functionality as possible. I can help you and your organization achieve the same. Let me show you how.

The Principles of Quantum Computing Explained

Today, a variety of companies are producing mainstream quantum hardware and making tools available to developers, turning quantum computing technology that was theoretical a few decades ago into a reality.

Introduction

During one of his Messenger Lectures at Cornell University in 1964, the renowned Nobel laureate and theoretical physicist Richard Feynman famously remarked, “I think I can safely say that nobody understands quantum mechanics”. Feynman emphasized how counterintuitive quantum mechanics is, and encouraged his listeners simply to accept how atoms behave at the quantum level, rather than trying to force a classical understanding onto it [1].

At its core, quantum theory describes how light and matter behave at the subatomic level. Quantum theory explains how particles can appear in two different places at the same time, how light can behave both as a particle and a wave, and how electrical current can flow both clockwise and counter-clockwise in a wire. These ideas can seem strange to us, even bizarre, yet quantum mechanics gave rise to a new world of possibilities in science, technology and information processing.

What is a quantum computer?

While classical computers use bits that can be either 0 or 1, quantum computers use quantum bits (qubits) that can be 0, 1 or both at the same time, suspended in superposition. Qubits are created by manipulating and measuring systems that exhibit quantum mechanical behaviour. Because qubits can hold superposition and exhibit interference, they can solve problems differently than classical computers.
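As a small illustration (my own sketch in plain NumPy, not code from the article or a quantum SDK), a single qubit can be represented as a two-entry vector of amplitudes. Applying a Hadamard gate to the |0> state puts the qubit into an equal superposition, and squaring the amplitudes gives the 50/50 probabilities seen when it is measured.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the classical-like state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

qubit = H @ ket0                              # equal superposition of |0> and |1>
probabilities = np.abs(qubit) ** 2            # measurement collapses to 0 or 1
print(qubit)           # [0.7071 0.7071]
print(probabilities)   # [0.5 0.5] -> a 50/50 outcome on measurement
```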

Quantum computers perform computations by manipulating the quantum states of qubits in a controlled way to execute algorithms [2]. A quantum computer can steer an arbitrary input quantum state into another quantum state through a sequence of controlled operations. This enables quantum computers to accurately simulate the behaviour of small particles that follow the laws of quantum mechanics, such as an electron in a hydrogen molecule. Quantum computers can also be used to run optimization and machine learning algorithms efficiently.

For example, a classical computer might apply a brute-force method to solve a maze by trying every possible path and remembering the paths that don’t work. A quantum computer, on the other hand, may not need to test each path one by one. Instead, it can encode the maze so that every path is explored in superposition, with a probability amplitude associated with each one. Because these amplitudes behave like waves, paths that lead to dead ends can interfere destructively and cancel out, while paths that lead to the exit reinforce one another, so a measurement returns the solution with high probability.

Principles of quantum computing

Quantum computing relies on four key principles:

Superposition – a qubit in superposition holds a weighted combination of 0 and 1 at once, and groups of qubits together span a complex, multi-dimensional computational space. This allows complex problems to be represented in new ways within that space. Measuring the quantum state collapses it from the superposition of possibilities into a definite binary state that can be registered as a 0 or a 1 [3].

Entanglement – the ability of qubits to correlate their state with other qubits. Entangled qubits behave as a single connected system, so measuring one qubit immediately reveals information about the qubits it is entangled with.

Interference – qubits placed in a state of collective superposition structure information in a way that behaves like waves, with an amplitude associated with each possible outcome. These waves can either reinforce one another or cancel each other out, thus amplifying or suppressing the probability of a specific outcome. Both amplification and cancellation are forms of interference.

Decoherence – occurs when a system falls out of its quantum state into a classical one. This can be triggered intentionally, through measurement of the quantum system, or unintentionally by environmental disturbances. Quantum computers must avoid or minimize unwanted decoherence.

Combining these principles can help explain how quantum computers work. By preparing a superposition of quantum states, a quantum circuit written by the user uses operations to entangle qubits and generate interference patterns, as governed by a quantum algorithm. Outcomes are either canceled out or amplified through interference, and the amplified outcomes serve as the solution to the computation.
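Continuing the same NumPy sketch from earlier (again my own illustration, not drawn from the article), the snippet below shows two of these principles directly: entanglement, where a Hadamard gate followed by a CNOT produces a Bell pair in which only the correlated outcomes 00 and 11 ever appear, and interference, where applying a Hadamard gate twice makes the two paths to the outcome 1 cancel while the paths to 0 reinforce.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Entanglement: H on qubit 0, then CNOT, produces the Bell state (|00> + |11>)/sqrt(2);
# the two qubits are perfectly correlated, so 01 and 10 are never measured.
ket00 = np.array([1.0, 0.0, 0.0, 0.0])
bell = CNOT @ (np.kron(H, I2) @ ket00)
print(np.abs(bell) ** 2)            # [0.5 0.  0.  0.5]

# Interference: two Hadamards in a row. The two "paths" to outcome 1 carry
# opposite amplitudes and cancel, while the paths to 0 reinforce, so the
# qubit returns to |0> with certainty rather than staying 50/50.
ket0 = np.array([1.0, 0.0])
print(np.abs(H @ (H @ ket0)) ** 2)  # [1. 0.]
```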

Conclusion

Today, a variety of companies are producing mainstream quantum hardware and making tools available to developers, turning quantum computing technology that was theoretical a few decades ago into a reality. Superconducting quantum processors are being delivered at regular intervals, increasing quantum computing speed and capacity. Researchers are continuing to make quantum computers even more useful, while overcoming challenges related to scaling quantum hardware and software, quantum error correction and quantum algorithms.


Designing solutions that work for users is what fuels my work. I’d love to connect and talk through your design ideas or challenges. Connect with me today on LinkedIn or contact me at Mimico Design House.


References

[1] Quantum Mechanics by Richard P. Feynman

[2] The basics of Quantum Computing

[3] What is quantum computing?