
First ‘Space Nation’ Set To Blast Off From Earth

The first self-declared nation in space finally launches Saturday aboard a commercial spacecraft set to blast off from NASA’s Wallops Flight Facility in Virginia.

Although the physical territory of Asgardia will consist solely of what’s basically just a floating file server in orbit, the self-proclaimed “space kingdom” insists the deployment of the satellite Asgardia-1 is just the beginning of a much grander vision of a true space state.

Asgardia is probably one of the few self-declared sovereign states you could fit in a backpack.

Asgardia-1, a small cubesat that’s roughly the size of a loaf of bread, is among the 14 cubesats that will be launched from Wallops early Saturday aboard an Orbital ATK Cygnus spacecraft bound for a long stop at the International Space Station.

It will then have to wait at the ISS for a month before Cygnus detaches and heads to a higher altitude where the satellite can be deployed.

Asgardia-1 will carry some key files, like the national constitution, flag and database of all its “citizens”, but most of its storage is filled with files uploaded by citizens.

So far, over 100,000 humans have accepted the terms of the constitution and uploaded over 18,000 files to the satellite, according to Asgardia’s website.

The reaction to the space nation-building project has been mixed.

While over half a million would-be Asgardians have requested citizenship, others point out that the idea of claiming territory in space could conflict with existing law.

It’s also a stretch to consider Asgardia’s free orbiting cloud storage service an actual nation for even more fundamental reasons.

The ramshackle treehouse I built in my backyard has been declared sovereign territory by at least one young girl dabbling in imaginary megalomania.

But because no other nation on Earth has acknowledged that claim of independence, it carries about as much weight as Asgardia’s assertion of nationhood.


Twins! Distant Galaxy Looks Like Our Own Milky Way

Almost like a postcard from across the universe, astronomers have photographed a spiral galaxy that could be a twin of our own Milky Way.

The distant galaxy, called NGC 6744, was imaged by the Wide Field Imager on the MPG/ESO 2.2-metre telescope at the European Southern Observatory’s La Silla Observatory in Chile.

The pinwheel lies 30 million light-years away in the southern constellation of Pavo (The Peacock).

We are lucky to have a bird’s-eye view of the spiral galaxy because of its orientation: face-on, as seen from Earth. It’s a dead ringer for our own home in the cosmos, scientists say.

“If we had the technology to escape the Milky Way and could look down on it from intergalactic space, this view is close to the one we would see — striking spiral arms wrapping around a dense, elongated nucleus and a dusty disc,” according to an ESO statement.

There is even a distorted companion galaxy — NGC 6744A, seen here as a smudge to the lower right of NGC 6744, which is reminiscent of one of the Milky Way’s neighboring Magellanic Clouds.

The main difference between NGC 6744 and the Milky Way is the two galaxies’ size. While our galaxy is roughly 100,000 light-years across, our “twin” galaxy extends to almost twice that diameter, researchers said.

The photogenic object is one of the largest and nearest spiral galaxies to Earth.

It’s about as bright as 60 billion suns, and its light spreads across an area of the sky about two-thirds the width of the full moon, making the galaxy visible as a hazy glow through a small telescope.

The reddish spots along the spiral arms in NGC 6744 represent regions where new stars are being born.

The picture was created by combining four exposures taken through different filters that collected blue, yellow-green and red light and the glow coming from hydrogen gas.

These are shown in the new picture as blue, green, orange and red, respectively.


The Origins Of Our Species Might Need A Rethink

On the outskirts of Beijing, a small limestone mountain named Dragon Bone Hill rises above the surrounding sprawl.

Along the northern side, a path leads up to some fenced-off caves that draw 150,000 visitors each year, from schoolchildren to grey-haired pensioners.

It was here, in 1929, that researchers discovered a nearly complete ancient skull that they determined was roughly half a million years old.

Dubbed Peking Man, it was among the earliest human remains ever uncovered, and it helped to convince many researchers that humanity first evolved in Asia.

Since then, the central importance of Peking Man has faded. Although modern dating methods put the fossil even earlier, at up to 780,000 years old, the specimen has been eclipsed by discoveries in Africa that have yielded much older remains of ancient human relatives.

Such finds have cemented Africa’s status as the cradle of humanity, the place from which modern humans and their predecessors spread around the globe, and relegated Asia to a kind of evolutionary cul-de-sac.

But the tale of Peking Man has haunted generations of Chinese researchers, who have struggled to understand its relationship to modern humans.

“It’s a story without an ending,” says Wu Xinzhi, a palaeontologist at the Chinese Academy of Sciences’ Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) in Beijing.

They wonder whether the descendants of Peking Man and fellow members of the species Homo erectus died out or evolved into a more modern species, and whether they contributed to the gene pool of China today.

Keen to get to the bottom of its people’s ancestry, China has in the past decade stepped up its efforts to uncover evidence of early humans across the country.

It is reanalysing old fossil finds and pouring tens of millions of dollars a year into excavations. And the government is setting up a US$1.1-million laboratory at the IVPP to extract and sequence ancient DNA.

In its typical form, the story of Homo sapiens starts in Africa. The exact details vary from one telling to another, but the key characters and events generally remain the same. And the title is always ‘Out of Africa’.

In this standard view of human evolution, H. erectus first evolved there more than 2 million years ago.

Then, some time before 600,000 years ago, it gave rise to a new species: Homo heidelbergensis, the oldest remains of which have been found in Ethiopia.

About 400,000 years ago, some members of H. heidelbergensis left Africa and split into two branches: one ventured into the Middle East and Europe, where it evolved into Neanderthals; the other went east, where members became Denisovans, a group first discovered in Siberia in 2010.

The remaining population of H. heidelbergensis in Africa eventually evolved into our own species, H. sapiens, about 200,000 years ago.

Then these early humans expanded their range to Eurasia 60,000 years ago, where they replaced local hominins with a minuscule amount of interbreeding.


This Supercomputer Comes In As The Fifth Fastest Machine In The World

The top two spots on the list of the world’s most powerful supercomputers have both been captured by the US.

The last time the country was in a similar position was three years ago.

The fastest machine – Titan, at Oak Ridge National Laboratory in Tennessee – is an upgrade of Jaguar, the system which held the top spot in 2009.

The supercomputer will be used to help develop more energy-efficient engines for vehicles, model climate change and research biofuels.

It can also be rented to third parties, and is operated as part of the US Department of Energy’s network of research labs.

The Top 500 list of supercomputers was published by Hans Meuer, professor of computer science at the University of Mannheim, who has been keeping track of developments since 1986.

It was released at the SC12 supercomputing conference in Salt Lake City, Utah.

Mixed processors

Titan leapfrogged the previous champion, IBM’s Sequoia – which is used to carry out simulations to help extend the life of nuclear weapons – thanks to its mix of central processing unit (CPU) and graphics processing unit (GPU) technologies.

According to the Linpack benchmark it operates at 17.59 petaflop/sec – the equivalent of 17,590 trillion calculations per second.

The benchmark measures real-world performance – but in theory the machine can boost that to a “peak performance” of more than 20 petaflop/sec.
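For scale, the headline number is just a unit conversion: one petaflop is 10^15 floating-point operations, so 17.59 petaflop/sec = 17.59 × 10^15 = 17,590 × 10^12 operations per second – that is, 17,590 trillion calculations per second, taking one trillion as 10^12.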

To achieve this the device has been fitted with 18,688 Tesla K20x GPU modules made by Nvidia to work alongside its pre-existing CPUs.

Traditionally supercomputers relied only on CPUs.

CPU cores are designed to handle between one and a few streams of instructions at speed, but are not efficient at carrying out many at once.

That makes them well suited for complex tasks in which the answer to one calculation is used to work out the next.

GPU cores are typically slower at carrying out individual calculations, but make up for this by being able to carry out many at the same time.

This makes them best suited for “parallelisable jobs” – processes that can be broken down into several parts that are then run simultaneously.

Mixing CPUs and GPUs together allows the most appropriate core to carry out each process. Nvidia said that in most instances its GPUs now carried out about 90% of Titan’s workload.
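As a loose illustration of what makes a job “parallelisable” (a minimal sketch in Python, not Titan’s actual software; the workload, function names and worker count here are invented for the example), here is a calculation that splits into chunks with no dependencies between them:

```python
# A minimal sketch of a parallelisable job: summing squares over a large range.
# The workload, names and worker count are illustrative only.
from multiprocessing import Pool

def partial_sum(bounds):
    # Each chunk is independent: no chunk needs another chunk's result,
    # so all chunks can run at the same time on separate cores.
    start, end = bounds
    return sum(i * i for i in range(start, end))

def parallel_sum_of_squares(n, workers=8):
    # Split [0, n) into `workers` contiguous chunks and fan them out.
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers - 1)]
    chunks.append(((workers - 1) * step, n))
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # The parallel result matches a plain sequential loop;
    # only the scheduling differs.
    n = 1_000_000
    assert parallel_sum_of_squares(n) == sum(i * i for i in range(n))
    print("ok")
```

A CPU-friendly task is the opposite case: each step needs the previous step’s answer, so the work cannot be fanned out across many cores this way.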


Neutron Star Smash-Up Produces Gravitational Waves And Light In Unprecedented Stellar Show

The 2015 detection of gravitational waves – ripples in the very fabric of space and time – was one of the biggest scientific breakthroughs in a century.

But because it was caused by two black holes merging, the event was all but invisible, detectable indirectly via the LIGO facility.

Now a team of scientists has announced the fifth detection of gravitational waves, but there’s a groundbreaking difference this time around.

The ripples were caused by the collision of two neutron stars, meaning the event was accompanied by light, radio, and other electromagnetic signals for the first time.

First predicted by Albert Einstein over 100 years ago, gravitational waves are caused by cosmic cataclysms like the collision of two black holes, but they weaken as they spread across immense distances.

By the time they reach us here on Earth, the distortions are occurring on the subatomic scale.

To observe waves that tiny, LIGO beams lasers down a 4-km (2.5-mi) long tunnel and measures how gravitational waves might warp the beam as they wash over our local corner of spacetime.
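To get a sense of how tiny those distortions are, here is a rough back-of-the-envelope figure (assuming a typical strain amplitude of around 10^-21, the order of magnitude usually quoted for events in LIGO’s band): the change in the 4-km arm length is ΔL = h × L ≈ 10^-21 × 4,000 m = 4 × 10^-18 m, hundreds of times smaller than the width of a proton.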

That delicate process is effective at confirming the phenomenon, but still somewhat indirect.

“This is the first time that the collision of two neutron stars has been detected, and this is the closest and most precisely located gravitational wave signal we’ve received,” says Susan Scott, the Leader of the General Relativity Theory and Data Analysis Group at Australian National University (ANU), which played a key role in the observation.

“It is also the loudest gravitational wave signal we’ve detected.”

The collision occurred in a galaxy called NGC 4993, which lies about 130 million light-years away – that might sound far, but it’s much closer than previous observations, which occurred at distances of billions of light-years.

As well as producing gravitational waves, the neutron stars’ collision sent a host of electromagnetic signals sweeping across the universe, including a short gamma ray burst, X-rays, light and radio waves.

These were picked up by observatories all over the world, helping pinpoint the source.

ANU was among those, using SkyMapper and the Siding Spring Observatory in New South Wales, Australia, to observe the brightness and color of the light signals given off.

Along with learning more about gravitational waves, the discovery can teach astronomers about neutron stars.

Created when larger stars collapse, neutron stars are relatively tiny – only about 10 km (6.2 mi) wide – and incredibly dense, with very strong magnetic fields. Other than that, not a whole lot is known about them.

“With this discovery we have the opportunity to learn so much more about neutron stars, which have been quite a mystery to us,” says Scott.

“Unlike black holes, neutron star collisions emit other signals such as gamma rays, light and radio waves, so astronomers around the world were able to observe the event through telescopes. This is an amazing time to be a scientist.”


Are Fidget Toys Legitimately Good For Your Brain?

“Fidget” isn’t exactly a word with the most positive of connotations. For many of us, it recalls veiled childhood threats of “stop fidgeting, or...”, and then the promised removal of something we value more highly than fidgeting.

Type “stop” into Google’s search box and “stop fidget” is one of the first recommendations its autocomplete feature presents you with.

But fidgeting, like beloved 1990s TV properties, is making a comeback.

Last year, the creators of Fidget Cube, a Kickstarter desk toy allowing users to click, roll, flip, glide, spin and perform assorted other fidgety verbs, set out to raise $15,000 to make their product a reality.

They wound up raking in $6,465,690 from 154,926 backers.

Fidget Cube has inevitably been followed by a number of other crowdfunding campaigns designed to appeal to the twitchy fingers of those who supported it.

One was a fidget pen called Think Ink, which combines a titanium pen exterior with a number of tactile elements for distracted fingers to play with. It made more than quadruple its funding target.

“I made this for my daughter,” co-founder Kent Lyon said.

“She had just started a new job, which she was nervous about, and started noticing that she was fidgeting a whole lot. Whether it was clicking her pen or playing with her hair, she found that she couldn’t stop doing something with her hands.” Lyon gave Think Ink the subtitle “Fidget to focus.”

But is this really a thing — or is the idea that a distracting toy can actually help us just a pseudoscientific marketing ploy?

It’s tempting to bust out the klaxons at the breaking news that a fidget toy purveyor thinks fidget toys increase productivity.

However, it just may be correct.

Research has shown that even small repetitive activities can increase the levels of neurotransmitters in the brain in a way that increases our ability to focus and pay attention.

Even if the fidgeting you are carrying out involves minimal concentration (fidgeting with a pen, chewing gum, or doodling on a piece of paper), this type of multitasking can positively impact the outcome of a particular task.

This is especially noticeable when dealing with children with ADHD, as Purdue University professor Sydney Zentall has noted in her work.

According to Zentall, while failure to stay on task can reduce work speed and production, there is no evidence that most “distractions” increase errors among children with ADHD.

Surprisingly, she said, these kinds of fidget distractions “may actually help the child perform in the classroom, especially when tasks are long and tedious.”

“That is, off-task looking may provide ‘doses’ of environmental stimulation that the child needs.”

There is even evidence that fidgeting can have a positive impact on people’s physical health.

Studies of the physical benefits of fidgeting are relatively few and far between, but a 2008 study tracked the daily movements of a group of slim and overweight women and discovered that the slimmer group tended to fidget more.

“If the obese women adopted the activity patterns of the lean women,” the authors of the study noted, “they might burn an extra 300 calories per day.”

Sure, you’re never going to match a five-mile run by playing with your Fidget Cube, but the findings suggest that every little bit helps.

Ultimately, we’re still a long way from the makers of fidget-focused desk toys being able to make explicit medical claims for their devices — but it seems that there is real scientific evidence to suggest that fidgeting has an important role to play in our lives.


According To Researchers, Global Carbon Emissions Rising Again After Brief Plateau

For three years in a row, the world’s carbon emissions were virtually stable — holding steady after decades of growth.

But now they’re on the rise again, which is bad news for efforts to fight climate change, according to a team of researchers who have released a new study on the topic.

Seventy-six scientists from around the world contributed to the Global Carbon Project, or GCP, which released its annual “Carbon Budget” yesterday.

The budget estimates that total global carbon emissions from fossil fuels and industrial sources will rise by 2 percent in 2017.

There’s a fair amount of uncertainty in that projection, with possible values from 0.8 percent to 3 percent — but the researchers are confident it represents an overall rise, fueled in part by changes in the Chinese economy.

The anticipated change is a “big rise,” lead author Corinne Le Quéré tells NPR. “And this is contrary to what is needed in order to tackle climate change.”

It’s a shift from the more hopeful findings from the last few years. From 2014 to 2016, according to the GCP analysis, the rate of emissions was basically flat.

Scientists agree that a reduction in carbon emissions is necessary to keep global warming at 2 degrees Celsius or less, the target established by the global accord on climate change.

That level of climate change is still projected to have a range of damaging effects, including devastation for some island nations — but it will be far from the worst-case scenario projected if emissions continue to rise.

The increase in carbon emissions is not distributed evenly around the world.

The U.S. and the countries of the European Union, which once generated nearly all of the world’s fossil-fuel and industrial carbon emissions, now contribute less than half of the world’s cumulative emissions.

Their contributions are expected to continue to fall in 2017, albeit at a lower rate than they had previously been falling.

Chart: Annual global fossil fuel and cement emissions. Total global emissions from fossil fuels and cement production (which the Global Carbon Project analyzes to quantify industrial carbon output) have been rising, in general, for decades. The pace had slowed to a near standstill over the last three years. This year, however, researchers anticipate a 2 percent rise in the annual release of carbon dioxide from fossil fuels and industry.

Emissions from China, India and the rest of the world, however, are projected to show a marked increase in 2017.

The result is “an emissions tug-of-war,” as the CICERO Center for International Climate Research put it in a press release.

That makes it hard to tell what’s going to happen next, because the trend is “so fragile,” as Le Quéré told NPR yesterday.

“It’s the difference between emissions rising in parts of the world and decreasing in other parts of the world,” she says. Overall? “Frankly, it could really go either way.”

And it’s crucial for that upward trend to start moving down, and quickly, she says.

She points to already-evident consequences of global warming: warmer oceans that can fuel more powerful storms and rising sea levels that cause more devastating coastal surge damage.

“In order to tackle climate change, you have to go down to almost zero emissions,” she says. “The faster we do it, the more we limit the risks from climate change.”


Why Are We Still Using Electroconvulsive Therapy?

The idea of treating a psychiatric illness by passing a jolt of electricity through the brain was one of the most controversial in 20th Century medicine.

So why are we still using a procedure described by its critics as barbaric and ineffective?

Sixty-four-year-old John Wattie says his breakdown in the late 1990s was triggered by the collapse of his marriage and stress at work.

“We had a nice house and a nice lifestyle, but it was all just crumbling away. My depression was starting to overwhelm me. I lost control, I became violent,” he explains.

John likens the feeling to being in a hole, a hole he could not get out of despite courses of pills and talking therapies.

But now, he says, all of that has changed thanks to what is one of the least understood treatments in psychiatry – electroconvulsive therapy (ECT).

“Before ECT I was the walking dead. I had no interest in life, I just wanted to disappear. After ECT I felt like there was a way out of it. I felt dramatically better.”

The use of electricity to treat mental illness started out as an experiment. In the 1930s psychiatrists noticed some heavily distressed patients would suddenly improve after an epileptic fit.

Passing a strong electric current through the brain could trigger a similar seizure and – they hoped – a similar response.

By the 1960s it was being widely used to treat a variety of conditions, notably severe depression.

But as the old mental asylums closed down and aggressive physical interventions like lobotomies fell out of favour, so too did electroshock treatment, as ECT was previously known.

The infamous ECT scene in One Flew Over the Cuckoo’s Nest cemented the idea in the public’s mind of a brutal treatment, although by the time the film was released in 1975 it was very rarely given without a general anaesthetic.

Perhaps more significantly, new anti-depressant drugs introduced in the 1970s and 1980s gave doctors new ways to treat long-term mental illness.

But for a group of the most severely depressed patients, ECT has remained one of the last options on the table when other therapies have failed.

Annually in the UK around 4,000 patients, of whom John is one, still undergo ECT.

“It’s not intuitive that causing seizures can be good for depression, but it’s long been determined that ECT is effective,” says Professor Ian Reid at the University of Aberdeen, who heads up the team treating John.

In the 75 years since ECT was first used scientists have argued about why and how it might work. The latest theories build on the idea of hyperconnectivity.

This new concept in psychiatry suggests parts of the brain can start to transmit signals in a dysfunctional way, overloading the system and leading to conditions from depression to autism.


Neptune’s Moon: Triton

We don’t know with what beverage William Lassell may have celebrated his discovery of Neptune’s moon, Triton, but beer made it possible.

Lassell was one of 19th century England’s grand amateur astronomers, using the fortune he made in the brewery business to finance his telescopes.

He spotted Triton on 10 October 1846 — just 17 days after a Berlin observatory discovered Neptune.

Curiously, a week before he found the satellite, Lassell thought he saw a ring around the planet. That turned out to be a distortion caused by his telescope.

But when NASA’s Voyager 2 visited Neptune in 1989, it revealed that the gas giant does have rings, though they’re far too faint for Lassell to have seen them.

Since Neptune was named for the Roman god of the sea, its moons were named for various lesser sea gods and nymphs in Greek mythology.

Triton (not to be confused with Saturn’s moon, Titan) is far and away the largest of Neptune’s satellites. Dutch-American astronomer Gerard Kuiper (for whom the Kuiper Belt was named) found Neptune’s third-largest moon, Nereid, in 1949.

He missed Proteus, the second-largest, because it’s too dark and too close to Neptune for telescopes of that era.

Proteus is a slightly non-spherical moon, and it is thought to be right at the limit of how massive an object can be before its gravity pulls it into a sphere.

Proteus and five other moons had to wait for Voyager 2 to make themselves known. All six are among the darker objects found in the solar system.

Astronomers using improved ground-based telescopes found more satellites in 2002 and 2003, bringing the known total to 13.

Voyager 2 revealed fascinating details about Triton. Part of its surface resembles the rind of a cantaloupe.

Ice volcanoes spout what is probably a mixture of liquid nitrogen, methane and dust, which instantly freezes and then snows back down to the surface.

One Voyager 2 image shows a frosty plume shooting 8 km (5 miles) into the sky and drifting 140 km (87 miles) downwind.

Triton’s icy surface reflects so much of what little sunlight reaches it that the moon is one of the coldest objects in the solar system, about -400 degrees Fahrenheit (-240 degrees Celsius).

Triton is the only large moon in the solar system that circles its planet in a direction opposite to the planet’s rotation (a retrograde orbit), which suggests that it may once have been an independent object that Neptune captured.

The disruptive effect this would have had on other satellites could help to explain why Nereid has the most eccentric orbit of any known moon: it’s almost seven times as far from Neptune at one end of its orbit as at the other end.
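That seven-fold spread follows directly from the orbit’s shape. For an ellipse with semi-major axis a and eccentricity e, the ratio of the farthest to the closest distance is r_max / r_min = a(1 + e) / a(1 − e) = (1 + e) / (1 − e); plugging in Nereid’s eccentricity of roughly 0.75 gives 1.75 / 0.25 = 7.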

Neptune’s gravity acts as a drag on the counter-orbiting Triton, slowing it down and making it drop closer and closer to the planet.

Millions of years from now, Triton will come close enough for gravitational forces to break it apart, possibly forming a ring around Neptune bright enough for Lassell to have seen with his telescope.


Scientists Found Out That Wounds Sustained At Night Heal Twice As Slowly As Injuries Sustained During The Day

Body clocks cause wounds such as cuts and burns sustained during the day to heal around 60 percent faster than those sustained at night, scientists have discovered in a finding that has implications for surgery and wound-healing medicines.

In a study published in the journal Science Translational Medicine on Wednesday, the scientists showed for the first time how our internal body clocks regulate wound healing by skin cells, and optimize healing during the day.

Burns that happened at night took an average of 60 percent longer to heal than burns that occurred during the day, the scientists found.

Night-time burns – sustained between 8pm and 8am – were 95 percent healed after an average of 28 days, compared with only 17 days if the burn happened between 8am and 8pm.
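A quick check shows the two sets of figures agree: (28 − 17) / 17 ≈ 0.65, so the night-time burns took roughly 60 to 65 percent longer to reach the same stage of healing, in line with the headline figure.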

Body clocks – known as circadian rhythms – regulate almost every cell in the body, driving 24-hour cycles in many processes such as sleeping, hormone secretion and metabolism.

The key to accelerated daytime wound healing, the scientists found, was that skin cells moved more rapidly to repair the wound and there was also more collagen – the main structural protein in skin – deposited around the wound site.

“This is the first time that the circadian clock within individual skin cells has been shown to determine how effectively they respond to injuries,” said John O’Neill, who co-led the research at Britain’s Medical Research Council Laboratory of Molecular Biology.

“We consistently see about a two-fold difference in wound healing speed between the body clock’s day and night. It may be that our bodies have evolved to heal fastest during the day, when injuries are more likely to occur.”

Treatment of wounds costs health services worldwide billions of dollars a year – in Britain’s National Health Service alone, the costs are estimated around 5 billion pounds ($6.56 billion) a year.

Experts say this is partly due to a lack of effective drugs to speed up wound closure.

John Blaikley, a clinician scientist from Britain’s University of Manchester, said these new insights into the circadian factors important in skin repair should help the search for better wound-healing drugs.

It could also help doctors improve outcomes by changing what time of day surgery is carried out, or when medicines are given, he said.
