Month: March, 2018

Universe’s First Stars Detected? Here Are The Facts!

Stars are our constant companions in the night sky, but seas of twinkling lights weren’t always a feature of the cosmos.

Now, scientists peering back into deep time suggest that the earliest stars didn’t turn on until about 180 million years after the big bang, when the universe as we know it exploded into existence.

For decades, teams of scientists have been chasing—in fact, racing—to detect the signatures of these first stars.

The new detection, from a project called EDGES, is in the form of a radio signal triggered when light from those stars began interacting with the hydrogen gas that filled primordial empty space.

If the signal stands up to scrutiny, the detection simultaneously opens up a new line of cosmological inquiry and offers a few conundrums to tackle.

“The era of cosmic dawn has been entirely uncharted territory until now,” says physicist Cynthia Chiang of the University of KwaZulu-Natal in South Africa.

“It’s extremely exciting to see a new glimpse of this slice of the universe’s history, and the EDGES detection is the initial step toward understanding the nature of the first stars in more detail.”




Cosmic Dawn

Shortly after the universe was born, it was plunged into darkness. The first stars turned on when hot gas coalesced around clumps of dark matter, then contracted and became dense enough to ignite the nuclear hearts of infant suns.

As those early stars began breathing ultraviolet light into the cosmos, their photons mingled with primordial hydrogen gas, causing it to absorb background radiation and become translucent.

When that happened, those hydrogen atoms left an imprint on radio waves at a predictable frequency (the hydrogen 21-centimeter line), which astronomers can still observe today with radio telescopes.

The same process is going on in modern stars as they continue to send light into the cosmos.

But the radio waves produced by those first stellar gasps have been traveling through space for so long that they’ve been stretched, or redshifted.

That’s how astronomers identified the fingerprints of the earliest stars in radio waves detected by a small antenna in western Australia.
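
For a rough sense of how dramatic that stretching is, here’s a back-of-the-envelope sketch in Python. It assumes the hydrogen signal starts out at the well-known 21-centimeter line frequency of about 1420 MHz and uses the roughly 78 MHz dip reported by the EDGES team, a figure from the published result rather than from this article.

    # Rough sketch: how much the hydrogen signal has been stretched (redshifted)
    rest_mhz = 1420.4      # rest frequency of the 21 cm hydrogen line
    observed_mhz = 78.0    # approximate center of the dip reported by EDGES

    redshift = rest_mhz / observed_mhz - 1
    print(f"redshift z is about {redshift:.1f}")  # ~17, i.e. roughly 180 million years after the big bang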

From Light to Dark

If the signal is real, it presents a challenge for some scientists who’ve been thinking about how the early universe worked.

For starters, the time frame during which these earliest stars emerged lines up well with some theories, but it’s not exactly bang on with others.

In previous work, UCLA astronomer Steven Furlanetto and his colleagues started with actual observations of the earliest known galaxies, and then rewound the cosmic clock using computer models, searching for the age at which a signal from the first stars might appear.

The universe’s first galaxies are thought to be small, fragile, and not that great at birthing stars, so Furlanetto wouldn’t expect the signal to peak until about 325 million years after the big bang.

But if the first stars had already furnished enough light to make their presence known 180 million years after the big bang, those early galaxies must have been doing something different.

As well, the primordial hydrogen gas is absorbing photons at rates that are at least two times higher than predicted.

That’s problematic for some ideas about the temperature of the early universe. It means that either the primordial gas was colder than expected, or background radiation was hotter.

One provocative explanation for the chillier-than-expected gas is that it was losing heat to dark matter, the mysterious stuff that makes up the bulk of the universe’s mass but doesn’t behave like normal matter and has proven tricky to understand.

It regularly evades direct detection, and scientists are struggling to pin down what, exactly, it is and how it has influenced the structure of the universe through time.

But, she notes, it’s way too early to accept that conclusion.

An alternate possibility is that there are simply more photons for the hydrogen gas to absorb, though it’s not obvious where all those photons would come from in the early universe.

So she and others are waiting for independent confirmation of the EDGES result before diving too deep into the possible dark matter scenarios.

Please like, share and tweet this article.

Pass it on: Popular Science

Google Clips: A Smart Camera That Doesn’t Make The Grade

Picture this: you’re hanging out with your kids or pets and they spontaneously do something interesting or cute that you want to capture and preserve.

But by the time you’ve gotten your phone out and its camera opened, the moment has passed and you’ve missed your opportunity to capture it.

That’s the main problem that Google is trying to solve with its new Clips camera, a $249 device available starting today that uses artificial intelligence to automatically capture important moments in your life.

Google says it’s for all of the in-between moments you might miss when your phone or camera isn’t in your hand.




It is meant to capture your toddler’s silly dance or your cat getting lost in an Amazon box without requiring you to take the picture.

The other issue Google is trying to solve with Clips is letting you spend more time interacting with your kids directly, without having a phone or camera separating you, while still getting some photos.

That’s an appealing pitch to both parents and pet owners alike, and if the Clips camera system is able to accomplish its goal, it could be a must-have gadget for them.

But if it fails, then it’s just another gadget that promises to make life easier, but requires more work and maintenance than it’s worth.

The problem for Google Clips is it just doesn’t work that well.

Before we get into how well Clips actually works, I need to discuss what it is and what exactly it’s doing because it really is unlike any camera you’ve used before.

At its core, the Clips camera is a hands-free automatic point-and-shoot camera that’s sort of like a GoPro, but considerably smaller and flatter.

It has a cute, unassuming appearance that is instantly recognizable as a camera, or at least an icon of a camera app on your phone.

Google, aware of how a “camera that automatically takes pictures when it sees you” is likely to be perceived, is clearly trying to make the Clips appear friendly, with its white-and-teal color scheme and obvious camera-like styling.

But among the people I showed the camera to while explaining what it’s supposed to do, “it’s creepy” has been a common reaction.

One thing that I’ve discovered is that people know right away it’s a camera and react to it just like any other camera.

That might mean avoiding its view when they see it, or, like in the case of my three-year-old, walking up to it and smiling or picking it up.

That has made it tough to capture candids, since, for the Clips to really work, it needs to be close to its subject.

Maybe over time, your family would learn to ignore it and those candid shots could happen, but in my couple weeks of testing, my family hasn’t acclimated to its presence.

The Clips’ camera sensor can capture 12-megapixel images at 15 frames per second, which it then saves to its 16GB of internal storage that’s good for about 1,400 seven-second clips.

The battery lasts roughly three hours between charges.

Included with the camera is a silicone case that makes it easy to prop up almost anywhere or, yes, clip it to things. It’s not designed to be a body camera or to be worn.

Instead, it’s meant to be placed in positions where it can capture you in the frame as well.

There are other accessories you can buy, like a case that lets you mount the Clips camera to a tripod for more positioning options, but otherwise, using the Clips camera is as simple as turning it on and putting it where you want it.

Once the camera has captured a bunch of clips, you use the app to browse through them on your phone, edit them down to shorter versions, grab still images, or just save the whole thing to your phone’s storage for sharing and editing later.

The Clips app is supposed to learn based on which clips you save and deem “important” and then prioritize capturing similar clips in the future.

You can also hit a toggle to view “suggested” clips for saving, which is basically what the app thinks you’ll like out of the clips it has captured.

Google’s definitely onto something here. The idea is an admirable first step toward a new kind of camera that doesn’t get between me and my kids. But first steps are tricky — ask any toddler!

Usually, after you take your first step, you fall down. To stand back up, Google Clips needs to justify its price, the hassle of setting it up, and the fiddling between it and my phone.

It needs to reassure me that by trusting it and putting my phone away, I won’t miss anything important, and I won’t be burdened by having to deal with a lot of banal captures.

Otherwise, it’s just another redundant gadget that I have to invest too much time and effort into managing to get too little in return.

That’s a lot to ask of a tiny little camera, and this first version doesn’t quite get there. To live up to it all, Clips needs to be both a better camera and a smarter one.

Please like, share and tweet this article.

Pass it on: Popular Science

First Wattway Solar Road Pilot In US Pops Up In Rural Georgia

The first Wattway solar road pilot in America has popped up in rural west Georgia.

The Ray C. Anderson Foundation, named for sustainable manufacturing pioneer Ray Anderson, is testing renewable technologies along an 18-mile stretch of road, and recently installed 538 square feet of Colas‘ Wattway solar road system near the border between Georgia and Alabama.

Part of Georgia’s Interstate 85 was named for Anderson, but over five million tons of carbon dioxide are emitted yearly on that road portion alone.

Anderson’s family felt placing his name there didn’t honor his legacy, and began to look into renewable technologies to clear the air – so to speak.




Thus began The Ray, an 18-mile living laboratory for clean technologies, including not only the solar roads, but also a solar-powered electric vehicle charging station, and WheelRight, a system people can drive over to test their tire pressure, which could lead to improved fuel efficiency.

The first Wattway solar panel pilot is part of The Ray near a Georgia Visitor Information Center in West Point, Georgia.

According to Wattway by Colas, the 538-square-foot pilot is expected to produce about 7,000 kilowatt-hours per year, which will help power the center.
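
To put that figure in everyday terms, here’s a quick sketch, using only the 7,000 kilowatt-hour estimate above, of what it works out to as a steady average output.

    # What 7,000 kWh per year means as a continuous average output
    annual_kwh = 7_000
    hours_per_year = 365 * 24
    average_kw = annual_kwh / hours_per_year
    print(f"average output is about {average_kw:.2f} kW")  # roughly 0.8 kW around the clock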

And these technologies are just the beginning. The foundation will also construct bioswales, or shallow drainage ditches filled with native Georgia plants to capture pollutants during rain.

In a right-of-way space, they’ll build a one megawatt solar installation. They’re working with the Georgia Department of Transportation to bring such ideas to life along the 18-mile road stretch.

Not only will several of their projects beautify the highway, but they will also generate clean energy and bring in money for investors. And other parts of the state have shown interest in building their own Wattway roads.

The Ray executive director Allie Kelly dreams of a day when highways will “serve as a power grid for the future,” but she believes that day is coming sooner than we may think.

She told Curbed, “We’re at a tipping point in transportation. In five to ten years, we won’t remember a time when we invested a dime in infrastructure spending for a road that only did one thing.”

Please like, share and tweet this article.

Pass it on: Popular Science

Megapixels Don’t Matter Anymore. Here’s Why More Isn’t Always Better.

For years, smartphone makers have been caught up in a megapixel spec race to prove that their camera is better than the next guy’s.

But we’ve finally come to a point where even the lower-end camera phones are packing more megapixels than they need, so it’s getting harder to differentiate camera hardware.

Without that megapixel crutch to fall back on, how are we supposed to know which smartphone has the best camera?

Well thankfully, there are several other important specs to look for in a camera, and it’s just a matter of learning which ones matter the most to you.




Why Megapixels Don’t Matter Anymore

The term “megapixel” actually means “one million pixels,” so a 12-megapixel camera captures images that are made up of 12,000,000 tiny little dots.

A larger number of dots (pixels) in an image means that the image has more definition and clarity, which is also referred to as having a higher resolution.

This might lead you to believe that a camera with more megapixels will take better pictures than a camera with fewer megapixels, but that’s not always the case.

The trouble is, we’ve reached a point where all smartphone cameras have more than enough megapixels.

For instance, a 1080p HD TV has a resolution of 2.1 megapixels, and even the highest-end 4K displays top out at 8.3 megapixels.

Considering that nearly every smartphone camera has a double-digit megapixel rating these days, your photos will be in a higher resolution than most screens can even display.

Simply put, you won’t be able to see any difference in resolution between pictures taken by two different smartphone cameras, because most screens you’ll be viewing them on aren’t capable of displaying that many megapixels.

Really, anything greater than 8.3 megapixels is only helpful for cropping. In other words, if your phone takes 12-megapixel photos, you can crop away roughly 30% of the image, and the resolution will still be just as high as a 4K TV.
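
Here’s a quick sanity check of those numbers in Python. The 4000 x 3000 sensor is an assumption about a typical 12-megapixel phone camera, since exact dimensions vary by model.

    # Sanity-checking the resolution math above
    def megapixels(width, height):
        return width * height / 1_000_000

    print(f"1080p TV: {megapixels(1920, 1080):.1f} MP")   # ~2.1 MP
    print(f"4K TV:    {megapixels(3840, 2160):.1f} MP")   # ~8.3 MP

    # Assume a typical 12 MP phone photo of 4000 x 3000 pixels
    photo_mp = megapixels(4000, 3000)   # ~12 MP
    uhd_mp = megapixels(3840, 2160)     # ~8.3 MP
    print(f"you can crop away {1 - uhd_mp / photo_mp:.0%} and keep 4K resolution")  # ~31%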

Pixel Size Is the Real Difference Maker

The hot new number to judge your phone’s camera by is the pixel size. You’ll see this spec listed as a micron value, which is a number followed by the symbol “µm.”

A phone with a 1.4µm pixel size will almost always capture better pictures than one with a 1.0µm pixel size, thanks to physics.

If you zoomed in far enough on one of your photos, you could see the individual pixels, right? Well, each of those tiny little dots was captured by microscopic light sensors inside your smartphone’s camera.

These light sensors are referred to as “pixels” because, well, they each capture a pixel’s worth of light. So if you have a 12-megapixel camera, the actual camera sensor has twelve million of these light-sensing pixels.

Each of these pixels measures light particles called photons to determine the color and brightness of the corresponding pixel in your finished photo.

When a bright blue photon hits one of your camera’s light sensors, it tells your phone to make a dot with bright blue coloring.

Put twelve million of these dots together in their various brightnesses and colors, and you’ll end up with a picture.
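
To see why that 1.4µm-versus-1.0µm difference matters more than it looks, here’s a tiny sketch. It relies only on the fact that a pixel’s light-collecting area grows with the square of its width.

    # Why small micron differences add up: area scales with the square of pixel size
    def relative_area(pixel_size_um):
        return pixel_size_um ** 2

    gain = relative_area(1.4) / relative_area(1.0)
    print(f"1.4 um pixels collect about {gain:.1f}x the light of 1.0 um pixels")  # ~2.0x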

A Little Aperture Goes a Long Way

The next key spec to look for is the camera’s aperture, which is represented as f divided by a number (f/2.0, for example).

Because of the “f divided by” setup, this is one of those rare specs where a smaller number is always better than a larger one.

To help you understand aperture, let’s go back to pixel size for a second.

If larger pixels mean your camera can collect more light particles to create more accurate photos, then imagine a pixel as a bucket, and photons as falling rain.

The bigger the opening of the bucket (pixel), the more rain (photons) you can collect, right?

Well, aperture is like a funnel for that bucket. The bottom of this imaginary funnel has the same diameter as the pixel bucket, but the top is wider—which means you can collect even more photons.

In this analogy, a wider aperture gives the photon bucket a wider opening, so it focuses more light onto your camera’s light-sensing pixels.
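
As a rough illustration of how much a small change in f-number matters, here’s a sketch comparing two hypothetical apertures (the specific f-numbers are examples, not specs from any particular phone). The light a lens gathers scales roughly with the square of 1 divided by the f-number.

    # Comparing two hypothetical apertures: smaller f-number means more light
    def relative_light(f_number):
        return (1 / f_number) ** 2

    gain = relative_light(1.8) / relative_light(2.2)
    print(f"f/1.8 gathers about {gain:.1f}x the light of f/2.2")  # ~1.5x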

Image Stabilization: EIS vs. OIS

With most spec sheets, you’ll see a camera’s image stabilization technology listed as either EIS or OIS. These stand for Electronic Image Stabilization and Optical Image Stabilization, respectively.

OIS is easier to explain, so let’s start with that one. Simply put, with this technology your camera sensor physically moves to compensate for any shaking while you’re holding your phone.

If you’re walking while you’re recording a video, for instance, each of your steps would normally shake the camera—but OIS ensures that the camera sensor remains relatively steady even while the rest of your phone shakes around it.

EIS, by contrast, is done in software: the camera crops in slightly on the sensor and digitally shifts and stretches each frame to counteract shake. In general, though, it’s always better to have a camera with OIS.

For one, the cropping and stretching can reduce quality and create a “Jello effect” in videos, but in addition to that, EIS has little to no effect on reducing blur in still photos.

Now that you’ve got a better understanding of camera specs, have you decided which smartphone you’re going to buy next?

If you’re still undecided, you can use our smartphone-buyer’s flowchart at the following link, and if you have any further questions, just fire away in the comment section below.

Please like, share and tweet this article.

Pass it on: Popular Science

Why Can’t We Feel Earth’s Spin?

Earth spins on its axis once in every 24-hour day. At Earth’s equator, the speed of Earth’s spin is about 1,000 miles per hour (1,600 kph).
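
That 1,000 miles per hour is easy to check yourself: divide Earth’s equatorial circumference, roughly 24,901 miles, by the 24 hours it takes to complete one spin.

    # Back-of-the-envelope check of the equatorial spin speed
    circumference_miles = 24_901   # Earth's equatorial circumference
    hours_per_day = 24
    print(f"about {circumference_miles / hours_per_day:.0f} mph")  # ~1,038 mph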

The day-night cycle has carried you around in a grand circle under the stars every day of your life, and yet you don’t feel Earth spinning.

Why not? It’s because you and everything else – including Earth’s oceans and atmosphere – are spinning along with the Earth at the same constant speed.

It’s only if Earth stopped spinning, suddenly, that we’d feel it. Then it would be a feeling similar to riding along in a fast car, and having someone slam on the brakes!




Think about riding in a car or flying in a plane. As long as the ride is going smoothly, you can almost convince yourself you’re not moving.

A jumbo jet flies at about 500 miles per hour (about 800 km per hour), or about half as fast as the Earth spins at its equator. But, while you’re riding on that jet, if you close your eyes, you don’t feel like you’re moving at all.

And when the flight attendant comes by and pours coffee into your cup, the coffee doesn’t fly to the back of the plane. That’s because the coffee, the cup and you are all moving at the same rate as the plane.

Now think about what would happen if the car or plane wasn’t moving at a constant rate, but instead speeding up and slowing down. Then, when the flight attendant poured your coffee … look out!

Earth is moving at a fixed rate, and we’re all moving along with it, and that’s why we don’t feel Earth’s spin. If Earth’s spin were suddenly to speed up or slow down, you would definitely feel it.

The constant spin of the Earth had our ancestors pretty confused about the true nature of the cosmos. They noticed that the stars, and the sun and the moon, all appeared to move above the Earth.

Because they couldn’t feel Earth move, they logically interpreted this observation to mean that Earth was stationary and “the heavens” moved above us.

With the notable exception of the early Greek scientist Aristarchus, who first proposed a heliocentric model of the universe in the third century B.C.E., the world’s great thinkers upheld the geocentric idea of the cosmos for many centuries.

It wasn’t until the 16th Century that the heliocentric model of Copernicus began to be discussed and understood.

While not without errors, Copernicus’ model eventually convinced the world that Earth spun on its axis beneath the stars … and also moved in orbit around the sun.

Bottom line: Why don’t we feel Earth rotating, or spinning, on its axis? It’s because Earth spins steadily – and moves at a constant rate in orbit around the sun – carrying you as a passenger right along with it.

Please like, share and tweet this article.

Pass it on: Popular Science

NASA’s Exoplanet-Hunter TESS Gets Prepped For Launch

Final preparations are underway here at Kennedy Space Center to get NASA’s next planet-hunting spacecraft, the Transiting Exoplanet Survey Satellite (TESS), ready for its planned April 16 launch.

The satellite, built by Orbital ATK, arrived here on Feb. 12 after a 17-hour drive down from Orbital’s facility in Dulles, Virginia, and was ushered inside the Payload Hazardous Servicing Facility (PHSF) to be readied for launch.

However, before it hitches a ride to space atop SpaceX’s Falcon 9 rocket, NASA invited members of the media to get a close-up look at TESS inside a specialized clean room.

The PHSF is one of the last stops a spacecraft makes before launch. Inside this unique facility, engineers conduct final tests and load hazardous fuels, such as the hydrazine that will help propel the spacecraft.

Therefore, anyone who enters must follow a strict protocol, including wearing a special suit known as a bunny suit.




Before entering the clean room, a group of eager journalists were regaled with mission specifics by the TESS team, which included the mission’s principal investigator, George Ricker of MIT’s Kavli Institute for Astrophysics and Space Research.

The TESS mission, which is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and operated by the Massachusetts Institute of Technology (MIT), will spend at least two years studying more than 200,000 of the closest and brightest stars in our solar neighborhood.

TESS will scan the sky, looking for tiny dips in starlight. These dips in brightness — known as transits — could indicate that one or more planets is orbiting the star.

Ricker said that the team expects to discover several thousand planets during the spacecraft’s mission.

The Kepler Space Telescope, NASA’s planet-hunting powerhouse, has identified more than 2,000 confirmed exoplanets using the same “transit” technique as TESS.

However, TESS has a much larger field of view — nearly 20 times larger than Kepler’s — potentially allowing it to surpass Kepler in the number of exoplanet discoveries.

Thanks to Kepler, we now know that planets around other stars are very common. Kepler spent its primary mission staring at a narrow patch of sky to answer that very question.

Unfortunately, all of Kepler’s discoveries are too far away for follow-up study.

Scheduled to launch next year, NASA’s James Webb Space Telescope will scan the targets identified by TESS to look for water vapor, methane and other atmospheric gases. And, with a little luck, Webb might even spot signatures indicative of life beyond Earth.

TESS will launch into a high, elliptical orbit around Earth that is in a 2:1 resonance with the moon — it will orbit twice for every one time the moon goes all the way around.
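
In rough numbers, that resonance pins down how long each TESS orbit takes: half the moon’s sidereal month of about 27.3 days, as the quick sketch below shows.

    # The 2:1 lunar resonance in rough numbers
    lunar_sidereal_days = 27.3          # time for the moon to circle Earth once
    tess_period_days = lunar_sidereal_days / 2
    print(f"TESS orbital period is about {tess_period_days:.1f} days")  # ~13.7 days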

This type of orbit has multiple benefits: it is very stable, it keeps the spacecraft largely clear of space debris and harsh radiation, and it allows the spacecraft to easily communicate with the ground.

However, this type of orbit limits the number of launch opportunities, as it must be synchronized with the moon’s orbit around the Earth. After launch, it will take the spacecraft two months to reach its destination.

During our visit, engineers were prepping the spacecraft for final testing before launch. That testing included final checkouts of the solar arrays and is expected to be completed February 21.

Next, TESS will be mated to the launch vehicle.

Originally slated to launch on March 20, TESS is currently scheduled to lift off on April 16, following a one-month delay requested by the launch provider, SpaceX. However, TESS must launch by June per congressional mandate.

Please like, share and tweet this article.

Pass it on: New Scientist

Any Aliens On Exoplanet Proxima B Likely Wiped Out Last Year

The chances that the nearest planet beyond our solar system might be habitable and perhaps even host quirky, telepathic aliens have dimmed significantly thanks to one day last year when the star Proxima Centauri shone exceptionally bright.

Astronomers observed a huge stellar flare from the star just four light-years away that increased its brightness a thousandfold for about 10 seconds and probably showered nearby planet Proxima b with energetic particles.

The findings were published Monday in The Astrophysical Journal Letters.

“March 24, 2017, was no ordinary day for Proxima Cen,” said Meredith MacGregor, an astronomer at the Carnegie Institution for Science, in a statement.




“It’s likely that Proxima b was blasted by high-energy radiation during this flare.”

That’s bad news for the prospects of life there. Any intelligent beings would have basically experienced the horrible ending of the 2009 Nicolas Cage science fiction flick “Knowing.”

It was already known the dwarf star was prone to outbursts of smaller X-ray flares.

But if Proxima b has also been on the receiving end of major flares like the one MacGregor and her colleagues caught with Chile’s Atacama Large Millimeter/submillimeter Array (ALMA), it’s not going to be worth planning a vacation there any century soon.

“Over the billions of years since Proxima b formed, flares like this one could have evaporated any atmosphere or ocean and sterilized the surface,” she said.

There is hope for the future habitability of Proxima b, though. Eventually, Proxima Centauri will calm down and begin to cool into a white dwarf that might be far more hospitable.

Unfortunately, that’s unlikely to happen for at least another 4 trillion years, or roughly 300 times the current age of the universe.

Stubborn old star.

Please like, share and tweet this article.

Pass it on: Popular Science

World’s Biggest Plane, Stratolaunch, Marks Another Key Milestone

Rockets have been the way to get satellites into orbit since the dawn of the space age. But Microsoft co-founder Paul Allen hopes to shake that up with help from the world’s biggest airplane.

“Stratolaunch” is a 500,000-pound beast with twin fuselages and a wingspan of 385 feet. Allen’s Seattle-based company is developing it as a platform for lifting rockets into the stratosphere before launching them into space.

It’s seen as a cheaper, more reliable route to low-Earth orbit (LEO) — the sweet spot for many kinds of satellites.

The plane is still in development and has yet to fly, but last December it taxied out onto the runway at the Mojave Air & Space Port in Mojave, California. In another test last Sunday, it hit a new top taxi speed of 46 miles per hour.




If all goes according to plan, the plane will take its first test flight next year. As to when Stratolaunch might begin commercial operations, no date has been given.

Air-launching rockets into space isn’t a new idea. The Pegasus XL rocket built by aerospace contractor Orbital ATK launches from a modified Lockheed TriStar jetliner.

NASA and Richard Branson’s Virgin Group have similar projects under development, as does the Defense Advanced Research Projects Agency (DARPA).

But none of these other platforms is quite on the scale of Stratolaunch. Powered by six huge Pratt & Whitney turbofan engines, the aircraft is intended to carry up to 550,000 pounds to an altitude of 35,000 feet.

It has room between its fuselages to suspend rockets from the central portion of the wing. The company has partnered with Orbital ATK to launch its Pegasus XL rocket and aims eventually to carry three on each mission.

Stratolaunch was designed by Mojave, California-based Scaled Composites, which specializes in concept aircraft.

The company won the Ansari X Prize to launch the first private, reusable, manned spacecraft in 2004 with its SpaceShipOne, which was also launched from a plane.

Despite the ambitious nature of the project, space entrepreneur Gary Hudson thinks it has a good chance of success — in part because of Allen’s deep pockets.

Please like, share and tweet this article.

Pass it on: Popular Science

Kids Can Sign Up As Citizen Scientists With New App That Feeds Australian Species Database

Eight-year-old Brisbane boy Griffin Chong and 10-year-old Austin McConville from Melbourne are topping the scoreboard.

The pair are keen players on an online app that aims to make science fun while adding to a national database of Australian species.

It is one of a rising number of online projects and websites that are making science more accessible.

“You get to learn more animals and birds and insects,” Austin said.

The keen animal spotters take and upload photos and then identify their finds and those made by others.




They also compete with other players, many of them adults. Out of 5,000 players, Griffin is rated sixth at identifying species and Austin is the number one bird identifier.

“I just see the scientific name lots and then one day I just remember it,” Griffin said.

Jacki Liddle said her son Griffin had had an interest in science and animals from an early age.

“He’s just got that time and enthusiasm that kids can have,” she said.

Kids are “sponges for information”

Austin’s father Andrew McConville said his son was a third-generation bird-watcher, but while animal spotting had long been a family activity for them, he had been impressed by the contribution children could make to science through the new platforms.

“They’re naturally curious and just sponges for information,” he said.

Paul Flemons, who heads digital collections and citizen science for the Australian Museum, said there had always been interest in citizen science.

He said leaps in technology and apps and platforms like e-Bird, iNaturalist, Australasian Fishes and DigiVol were making it easy for people to get involved, especially kids.

Last year, 50,000 people took part in a national online project called Wildlife Spotter, launched by the ABC, and one-quarter of them were children.

“They’re great platforms for kids to become part of the citizen science community and people feel like they’re making a difference,” he said.

Australian National University evolution ecologist Professor Craig Moritz said there were other benefits too.

“One of the really cool things about citizen science is it actually gets kids out in the real world getting interested in our biodiversity, that’s a huge plus,” he said.

“The other side of it is generating information that can then be interpreted to understand how that biodiversity is changing.”

And possibly a whole new variety of researchers.

“Right now I would like to be an ornithologist,” Austin said.

“Scientist!” Griffin Chong added of his future plans.

Please like, share and tweet this article.

Pass it on: Popular Science