Tag: Quantum Computer

Has The Age Of Quantum Computing Arrived?

Ever since Charles Babbage’s conceptual, unrealised Analytical Engine in the 1830s, computer science has been trying very hard to race ahead of its time.

Particularly over the last 75 years, there have been many astounding developments – the first electronic programmable computer, the first integrated circuit computer, the first microprocessor.

But the next anticipated step may be the most revolutionary of all.

Quantum computing is the technology that many scientists, entrepreneurs and big businesses expect to provide a, well, quantum leap into the future.

If you’ve never heard of it there’s a helpful video doing the social media rounds that’s got a couple of million hits on YouTube.

It features the Canadian prime minister, Justin Trudeau, detailing exactly what quantum computing means.

Trudeau was on a recent visit to the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, one of the world’s leading centres for the study of the field.

During a press conference there, a reporter asked him, half-jokingly, to explain quantum computing.

Quantum mechanics is a conceptually counterintuitive area of science that has baffled some of the finest minds – as Albert Einstein said “God does not play dice with the universe” – so it’s not something you expect to hear politicians holding forth on.

Throw it into the context of computing and let’s just say you could easily make Zac Goldsmith look like an expert on Bollywood.

But Trudeau rose to the challenge and gave what many science observers thought was a textbook example of how to explain a complex idea in a simple way.

The concept of quantum computing is relatively new, dating back to ideas put forward in the early 1980s by the late Richard Feynman, the brilliant American theoretical physicist and Nobel laureate.

He conceptualised the improvements in speed that a quantum computer might achieve. But theoretical physics, while a necessary first step, leaves the hard work to practical application.

With normal computers, or classical computers as they’re now called, there are only two options – on and off – for processing information.

A computer “bit”, the smallest unit into which all information is broken down, is either a “1” or a “0”.

And the computational power of a normal computer is dependent on the number of binary transistors – tiny power switches – that are contained within its microprocessor.

Back in 1971 the first Intel processor was made up of 2,300 transistors. Intel now produce microprocessors with more than 5bn transistors. However, they’re still limited by their simple binary options.

But as Trudeau explained, with quantum computers the bits, or “qubits” as they are known, afford far more options owing to the uncertainty of their physical state.

In the mysterious subatomic realm of quantum physics, particles can act like waves, so that they can be particle or wave or particle and wave.

This is what’s known in quantum mechanics as superposition. As a result of superposition, a qubit can be 0, or 1, or 0 and 1 at once. That means it can work on two calculations at the same time.

Two qubits can work on four calculations at once, three qubits on eight, and so on in an exponential expansion. That leads to some inconceivably large numbers, not to mention some mind-boggling working concepts.
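The doubling described above can be made concrete with a short sketch. This is purely a classical illustration of the counting argument, not a quantum simulation: an n-qubit register is described by 2^n basis states, so every added qubit doubles the number of values held in superposition.

```python
# Classical illustration of qubit scaling: an n-qubit register spans
# 2**n basis states, so each extra qubit doubles the count.
for n in [1, 2, 3, 10, 50]:
    print(f"{n} qubit(s) -> {2 ** n} simultaneous basis states")
```

At 50 qubits the count already exceeds a quadrillion, which is why even modest quantum machines are hard to simulate classically.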

At the moment those concepts are closest to entering reality in an unfashionable suburb in the south-west corner of Trudeau’s homeland.

Please like, share and tweet this article.

Pass it on: New Scientist

The Quantum Computer That Could Spell The End Of Encryption

The researchers from Massachusetts Institute of Technology (MIT) and Austria’s University of Innsbruck call it “the beginning of the end for encryption schemes”.

Most encryption used today relies on integer factorisation, or “the factoring problem”, and its security comes from the difficulty of factoring large numbers.

For example, finding the prime factors of the number 15 is fairly easy, as it’s a small number.

However, a larger number, such as 91, may take some pen and paper.

An even larger number, say with 232 digits, has taken scientists two years to factor, using hundreds of classical computers operating in parallel.
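The asymmetry described above can be sketched with a few lines of code. This is a minimal trial-division factoriser, not the far cleverer sieve algorithms used in the record-breaking factorisations; it handles 15 and 91 instantly but would be hopeless against a 232-digit number.

```python
def prime_factors(n):
    """Factor n by trial division: fine for small numbers,
    hopeless for the hundreds-of-digits numbers used in encryption."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))  # [3, 5]
print(prime_factors(91))  # [7, 13]
```

The running time grows roughly with the square root of n, so each extra pair of digits on the number multiplies the work tenfold, which is the hardness that encryption exploits.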

In encryption, two different but intimately related numbers are used for encryption and decryption, making the operation easy to perform but hard to reverse.

A quantum computer, however, is expected to outperform traditional computers and crack this problem: by encoding data in the ‘spin’ of individual electrons, it can use hundreds of atoms working essentially in parallel to factor huge numbers quickly.

Unlike the binary 1-or-0 bits of a conventional computer, quantum bits, or qubits, can exist in multiple states at once.

This means they can perform multiple calculations in parallel and hold far more information than normal bits.

For example, a computer with just 1,000 qubits could easily crack modern encryption keys while smartphone games like Angry Birds typically use 40,000 conventional bits to run.

It typically takes about 12 qubits to factor the number 15, but researchers at MIT and the University of Innsbruck in Austria have found a way to pare that down to five qubits, each represented by a single atom.

The researchers designed and built a quantum computer from five atoms held in an ion trap. The computer uses laser pulses to carry out the algorithm on each atom, correctly factoring the number 15.

“The approach thus provides the potential for designing a powerful quantum computer, but with fewer resources,” said the research paper.

“We factor the number 15 by effectively employing and controlling seven qubits and four ‘cache qubits’ and by implementing generalised arithmetic operations, known as modular multipliers.”

The system is designed in a way that more atoms and lasers can be added to build a bigger and faster quantum computer, able to factor much larger numbers.

The scientists said the results represent the first scalable implementation of Shor’s algorithm, a quantum algorithm devised by mathematician Peter Shor in 1994 to solve the factorisation problem.
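The number-theoretic heart of Shor’s algorithm can be emulated classically, which helps show why it works. The sketch below finds the period r of a^x mod N by brute force and then derives factors from a^(r/2); the quantum speed-up, not reproduced here, comes from finding that period exponentially faster. The function name and structure are illustrative, not taken from the paper.

```python
from math import gcd

def shor_classical(N, a):
    """Classically emulate the core of Shor's algorithm: find the
    period r of a**x mod N, then read off factors of N from
    gcd(a**(r//2) - 1, N) and gcd(a**(r//2) + 1, N)."""
    # Find the smallest r > 0 with a**r = 1 (mod N) by brute force.
    # A quantum computer finds this period exponentially faster.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2:
        return None  # odd period: retry with a different a
    x = pow(a, r // 2, N)
    candidates = {gcd(x - 1, N), gcd(x + 1, N)} - {1, N}
    return sorted(candidates) or None

print(shor_classical(15, 7))  # period of 7 mod 15 is 4 -> factors [3, 5]
```

For N = 15 and a = 7 the period is 4, so 7² = 49 ≡ 4 (mod 15), and gcd(3, 15) = 3 and gcd(5, 15) = 5 recover the factors, exactly the computation the five-atom machine performed.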

“We show that Shor’s algorithm, the most complex quantum algorithm known to date, is realisable in a way where, yes, all you have to do is go in the lab, apply more technology, and you should be able to make a bigger quantum computer,” said Isaac Chuang, professor of physics and professor of electrical engineering and computer science at MIT.

“It might still cost an enormous amount of money to build – you won’t be building a quantum computer and putting it on your desktop anytime soon – but now it’s much more an engineering effort, and not a basic physics question.”

The researchers claimed the ion-trap quantum computer returns the correct factors with a confidence level exceeding 99 per cent.
