Tag: processors

Google’s First Mobile Chip Is An Image Processor Hidden In The Pixel 2

One thing that Google left unannounced during its Pixel 2 launch event on October 4th is being revealed today: it’s called the Pixel Visual Core, and it is Google’s first custom system-on-a-chip (SoC) for consumer products.

You can think of it as a very scaled-down and simplified, purpose-built version of Qualcomm’s Snapdragon, Samsung’s Exynos, or Apple’s A series chips. The purpose in this case?

Accelerating the HDR+ camera magic that makes Pixel photos so uniquely superior to everything else on the mobile market.

Google plans to use the Pixel Visual Core to make image processing on its smartphones much smoother and faster, and the Mountain View company also plans to use it to open up HDR+ to third-party camera apps.

The coolest aspect of the Pixel Visual Core might be that it’s already in Google’s devices. The Pixel 2 and Pixel 2 XL both have it built in, but it lies dormant until activation at some point “over the coming months.”

It’s highly likely that Google didn’t have time to finish optimizing the implementation of its brand-new hardware, so instead of yanking it out of the new Pixels, it decided to ship the phones as they are and then flip the Visual Core activation switch when the software becomes ready.

In that way, it’s a rather delightful bonus for new Pixel buyers.

The Pixel 2 devices are already much faster at processing HDR shots than the original Pixel, and once the Pixel Visual Core is live, they’ll be even faster and more efficient.

Looking at the layout of Google’s chip, which is dubbed an Image Processing Unit (IPU) for obvious reasons, we see something resembling a regular eight-core SoC.

Technically, there’s a ninth core, in the form of the power-efficient ARM Cortex-A53 CPU in the top-left corner.

But the important thing is that each of those eight processors that Google designed has been tailored to handle HDR+ duties, resulting in HDR+ performance that is “5x faster and [uses] less than 1/10th the energy” of the current implementation, according to Google.

This is the sort of advantage a company can gain when it shifts to purpose-specific hardware rather than general-purpose processing.

Google says that it will enable the Pixel Visual Core as a developer option in its preview of Android 8.1 Oreo, before updating the Android Camera API to give third-party camera developers access to HDR+.

Obviously, all of this tech is limited strictly to the Pixel 2 generation, ruling out original Pixel owners and other Android users.

As much as Google likes to talk about enriching the entire Android ecosystem, the company is evidently cognizant of how much of a unique selling point its Pixel camera system is, and it’s working hard to develop and expand the lead that it has.

As a final note, Google’s announcement today says that HDR+ is only the first application to run on the programmable Pixel Visual Core, and with time we should expect to see more imaging and machine learning enhancements being added to the Pixel 2.

Please like, share and tweet this article.

Pass it on: Popular Science

Graphene Could Soon Make Your Computer 1000 Times Faster

Researchers from several universities have teamed up to develop a radical kind of transistor.

Instead of using silicon, the team used graphene to build logic gates that use less power but could work 1,000 times faster than current ones.

The discovery of graphene in 2004 set off a flurry of studies to isolate other two-dimensional materials. Graphene was found to be a wonder material, possessing a set of unique and remarkable properties.

One of these is its ability to conduct electricity ten times better than copper, the most commonly used conductor in electronics.

At room temperature, graphene is also capable of conducting electricity 250 times better than silicon, a rate faster than any other known substance.

These properties led a team of researchers from Northwestern University, The University of Texas at Dallas (UT Dallas), University of Illinois at Urbana-Champaign, and University of Central Florida (UCF) to consider developing a graphene-based transistor.

In a study published in the journal Nature Communications, the team found that a graphene-based transistor could actually work better than silicon transistors used in today’s computers.

A quick explanation first: transistors are key to today’s computer circuits, acting as on/off switches that let electronic signals and electrical power through.

When put together, transistors form logic gates, the core of microprocessors: each gate takes inputs and produces outputs that are either 0s or 1s (so-called binary bits).

These are what allow microprocessors to solve logic and computing problems.
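To make that idea concrete, here is a minimal sketch in Python. It is purely illustrative and not tied to graphene or any particular hardware; the helper names nand, not_, and_, and or_ are made up for the example. It treats an idealized transistor pair as a NAND gate and composes the other basic gates from it, the same way real logic families build everything up from a small set of primitives.

```python
def nand(a: int, b: int) -> int:
    """Idealized NAND gate: the output goes low only when both inputs switch 'on'."""
    return 0 if (a and b) else 1


# NAND is functionally complete, so the other basic gates can be built from it.
def not_(a: int) -> int:
    return nand(a, a)


def and_(a: int, b: int) -> int:
    return not_(nand(a, b))


def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))


if __name__ == "__main__":
    # Print the truth tables to show the composed gates behave as expected.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b} | NAND={nand(a, b)} AND={and_(a, b)} OR={or_(a, b)}")
```

Whether the underlying switch is a silicon MOSFET or a graphene transistor, the logic layered on top looks the same; the speed and power gains the researchers describe come from how fast and how efficiently the physical switch itself can toggle.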

“If you want to continue to push technology forward, we need faster computers to be able to run bigger and better simulations for climate science, for space exploration, for Wall Street,” co-author Ryan Gelfand, an assistant professor at UCF, said in a press release.

“To get there, we can’t rely on silicon transistors anymore.”

Please like, share and tweet this article.

Pass it on: New Scientist

Google’s Getting Serious About Building Its Own iPhone

Google unveiled its first custom-designed smartphone chip on Tuesday, the Pixel Visual Core, which is used in its new Pixel 2 and Pixel 2 XL smartphones.

The Pixel Visual Core enables smartphones to take better pictures using HDR+, a technology that can take clear pictures even if there’s a lot of brightness and darkness in the same shot.

One example might be taking a picture of a shadowy skyscraper against a bright blue sky.

With HDR+, you’ll be able to capture both the skyscraper and the blue sky, without either washing out because parts of the image are too bright or too dark.
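Google hasn’t published the internals of HDR+, so the sketch below is only a toy illustration of the general idea of merging differently exposed information, not Google’s pipeline (which actually merges a burst of short exposures). It assumes a couple of aligned, bracketed grayscale frames with pixel values in [0, 1]; the function name toy_exposure_fusion and the sigma weighting parameter are invented for the example. Pixels are blended with weights that favor well-exposed values, so the darker frame supplies the sky and the brighter frame supplies the shadowed skyscraper.

```python
import numpy as np


def toy_exposure_fusion(frames, sigma=0.2):
    """Blend aligned, exposure-bracketed frames (grayscale, values in [0, 1]).

    Each pixel in each frame is weighted by how close it sits to mid-gray,
    so well-exposed regions dominate the result. A toy stand-in only.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])  # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize weights per pixel
    return (weights * stack).sum(axis=0)


# Hypothetical 2x2 scene: sky detail survives in the short exposure,
# shadow detail survives in the long exposure.
dark = [[0.05, 0.45], [0.02, 0.10]]
bright = [[0.60, 0.98], [0.35, 0.55]]
print(toy_exposure_fusion([dark, bright]))
```

The point of the Pixel Visual Core isn’t the math itself, which is simple enough to run anywhere, but doing this kind of per-pixel work across full-resolution bursts quickly and at a fraction of the energy a general-purpose processor would need.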

While the chip exists in current phones, it isn’t yet activated, but will be in a future software release.

The Pixel 2 and Pixel 2 XL aren’t the first smartphones to offer HDR support, but Google is using the new processor to try to make its photos the best.

Google said that the Pixel Visual Core will be accessible by camera applications created by other developers, not just the built-in camera app, and that it plans to activate access to the core through software updates “in the coming months.”

Please like, share and tweet this article.

Pass it on: Popular Science