Tag: super computers

Has The Age Of Quantum Computing Arrived?

Ever since Charles Babbage’s conceptual, unrealised Analytical Engine in the 1830s, computer science has been trying very hard to race ahead of its time.

Particularly over the last 75 years, there have been many astounding developments – the first electronic programmable computer, the first integrated circuit computer, the first microprocessor.

But the next anticipated step may be the most revolutionary of all.

Quantum computing is the technology that many scientists, entrepreneurs and big businesses expect to provide a, well, quantum leap into the future.

If you’ve never heard of it, there’s a helpful video doing the social media rounds that’s got a couple of million hits on YouTube.

It features the Canadian prime minister, Justin Trudeau, detailing exactly what quantum computing means.




Trudeau was on a recent visit to the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, one of the world’s leading centres for the study of the field.

During a press conference there, a reporter asked him, half-jokingly, to explain quantum computing.

Quantum mechanics is a conceptually counterintuitive area of science that has baffled some of the finest minds – as Albert Einstein said, “God does not play dice with the universe” – so it’s not something you expect to hear politicians holding forth on.

Throw it into the context of computing and let’s just say you could easily make Zac Goldsmith look like an expert on Bollywood.

But Trudeau rose to the challenge and gave what many science observers thought was a textbook example of how to explain a complex idea in a simple way.

The concept of quantum computing is relatively new, dating back to ideas put forward in the early 1980s by the late Richard Feynman, the brilliant American theoretical physicist and Nobel laureate.

He conceptualised the possible improvements in speed that might be achieved with a quantum computer. But theoretical physics, while a necessary first step, leaves the hard work of actually building such a machine to practical application.

With normal computers, or classical computers as they’re now called, there are only two options – on and off – for processing information.

A computer “bit”, the smallest unit into which all information is broken down, is either a “1” or a “0”.

And the computational power of a normal computer is dependent on the number of binary transistors – tiny power switches – that are contained within its microprocessor.

Back in 1971 the first Intel processor was made up of 2,300 transistors. Intel now produce microprocessors with more than 5bn transistors. However, they’re still limited by their simple binary options.

But as Trudeau explained, with quantum computers the bits, or “qubits” as they are known, afford far more options owing to the uncertainty of their physical state.

In the mysterious subatomic realm of quantum physics, particles can act like waves, so that they can behave as a particle, a wave, or both at once.

This is what’s known in quantum mechanics as superposition. As a result of superposition, a qubit can be a 0, a 1, or both a 0 and a 1 at once. That means it can perform two equations at the same time.

Two qubits can perform four equations. And three qubits can perform eight, and so on in an exponential expansion. That leads to some inconceivably large numbers, not to mention some mind-boggling working concepts.
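
To make the “two, four, eight, and so on” point concrete, here is a minimal Python sketch. It runs on an ordinary classical machine with NumPy, not on any quantum hardware, and the function name equal_superposition is ours purely for illustration: it builds the state vector of n qubits in superposition and counts how many classical values that state spans at once.

```python
import numpy as np

def equal_superposition(n_qubits):
    """State vector of n qubits, each put into an equal mix of 0 and 1."""
    qubit = np.array([1.0, 1.0]) / np.sqrt(2)   # amplitudes for the 0 and 1 states
    state = qubit
    for _ in range(n_qubits - 1):
        # The joint state of independent qubits is their Kronecker (tensor) product.
        state = np.kron(state, qubit)
    return state

for n in (1, 2, 3, 10):
    # An n-qubit state vector carries 2**n amplitudes, one per classical bit pattern.
    print(n, "qubit(s) span", len(equal_superposition(n)), "classical values at once")
```

Running it shows 1, 2, 3 and 10 qubits spanning 2, 4, 8 and 1,024 values respectively, which is the exponential expansion described above.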

At the moment those concepts are closest to entering reality in an unfashionable suburb in the south-west corner of Trudeau’s homeland.

Please like, share and tweet this article.

Pass it on: New Scientist

This Supercomputer Comes In As The Fifth Fastest Machine In The World

The top two spots on the list of the world’s most powerful supercomputers have both been captured by the US.

The last time the country was in a similar position was three years ago.

The fastest machine – Titan, at Oak Ridge National Laboratory in Tennessee – is an upgrade of Jaguar, the system which held the top spot in 2009.

The supercomputer will be used to help develop more energy-efficient engines for vehicles, model climate change and research biofuels.

It can also be rented out to third parties, and is operated as part of the US Department of Energy’s network of research labs.

The Top 500 list of supercomputers was published by Hans Meuer, professor of computer science at the University of Mannheim, who has been keeping track of developments since 1986.

It was released at the SC12 supercomputing conference in Salt Lake City, Utah.




Mixed processors

Titan leapfrogged the previous champion, IBM’s Sequoia – which is used to carry out simulations to help extend the life of nuclear weapons – thanks to its mix of central processing unit (CPU) and graphics processing unit (GPU) technologies.

According to the Linpack benchmark it operates at 17.59 petaflop/sec – the equivalent of 17,590 trillion calculations per second.

The benchmark measures real-world performance – but in theory the machine can boost that to a “peak performance” of more than 20 petaflop/sec.
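
As a quick sanity check on those units (a sketch of the arithmetic only, using the figures quoted above): one petaflop/sec is 10^15 floating-point operations per second, and one trillion is 10^12.

```python
# 1 petaflop/s = 10**15 floating-point operations per second; 1 trillion = 10**12.
linpack_pflops = 17.59                 # measured Linpack figure quoted above
ops_per_second = linpack_pflops * 1e15
print(f"{linpack_pflops} petaflop/s = {ops_per_second / 1e12:,.0f} trillion calculations per second")
# Prints: 17.59 petaflop/s = 17,590 trillion calculations per second
```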

To achieve this the device has been fitted with 18,688 Tesla K20x GPU modules made by Nvidia to work alongside its pre-existing CPUs.

Traditionally supercomputers relied only on CPUs.

CPU cores are designed to handle between one and a few streams of instructions at speed, but are not efficient at carrying out many at once.

That makes them well suited for complex tasks in which the answer to one calculation is used to work out the next.

GPU cores are typically slower at carrying out individual calculations, but make up for this by being able to carry out many at the same time.

This makes them best suited for “parallelisable jobs” – processes that can be broken down into several parts that are then run simultaneously.

Mixing CPUs and GPUs together allows the most appropriate core to carry out each process. Nvidia said that in most instances its GPUs now carried out about 90% of Titan’s workload.
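
The CPU/GPU division of labour described above can be illustrated without any GPU at all. The sketch below, in plain Python with NumPy and offered purely as an analogy rather than anything from Titan’s actual software, contrasts a dependent calculation, where each step needs the previous answer, with a parallelisable job, where one operation is applied to many independent values and could therefore be spread across thousands of cores.

```python
import numpy as np

# Dependent (CPU-style) work: each step needs the answer from the step before,
# so the calculation cannot be split across many cores.
def compound_interest(principal, rate, years):
    balance = principal
    for _ in range(years):
        balance = balance * (1 + rate)   # the next value depends on the previous one
    return balance

# Parallelisable (GPU-style) work: the same operation is applied to many
# independent values, so the work could be shared out across thousands of cores.
def scale_many(values, rate):
    return values * (1 + rate)           # NumPy applies this to every element at once

print(compound_interest(1000.0, 0.05, 10))        # a sequential chain of calculations
print(scale_many(np.arange(1_000_000.0), 0.05))   # one bulk, data-parallel operation
```

In this analogy the loop plays the role of CPU-style work, while the single bulk operation stands in for the GPU-style workload that Nvidia says now accounts for about 90% of Titan’s calculations.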

Please like, share and tweet this article.

Pass it on: Popular Science