Tag: science

Learn How to Sleep on Your Back (It Isn’t Easy)

If you’re able to sleep on your back, you’re one of the few. Only 14% of us sleep on our backs. What’s so great about it?

Back sleeping can help to reduce back and neck pain, minimize the effects of acid reflux, decrease wrinkles, and even help to maintain perky breasts.

While back sleeping can exacerbate snoring issues for some and isn’t recommended during pregnancy, it’s considered the healthiest way to sleep.




Sleeping this way makes it easy for your head, neck, and spine to maintain a neutral position; they’re in near ideal alignment when lying on a flat surface. Most doctors and sleep experts recommend it–if you can pull it off.

It’s possible to learn how to sleep on your back, but it’s not easy for everyone.

How to Sleep on Your Back–First a Few Tips:

  • Use positioning pillows. Extra pillows in the bed can help to keep your body positioned in whatever specific way works best for you. A pillow under each arm is a preferred technique, but do whatever works for you.
  • Keep a pillow under your knees to help maintain proper alignment of your back. This can help if you’re experiencing any lower back discomfort.
  • Be persistent. Always roll to your back when you catch yourself positioned otherwise.
  • Use a pillow that will hold your head in place. What’s the best pillow for back sleepers? A malleable type like a buckwheat pillow works well. It will prevent your head from rolling from side to side and give you the best support.


Pass it on: Popular Science

How 17th Century Fraud Gave Rise To Bright Orange Cheese

The news from Kraft last week that the company is ditching two artificial dyes in some versions of its macaroni and cheese products left me with a question.

Why did we start coloring cheeses orange to begin with? Turns out there’s a curious history here.

In theory, cheese should be whitish — similar to the color of milk, right?

Well, not really. Centuries ago in England, lots of cheeses had a natural yellowish-orange pigment. The cheese came from the milk of certain breeds of cows, such as Jersey and Guernsey.

Their milk tends to be richer in color from beta-carotene in the grass they eat.




So, when the orange pigment transferred to the cow’s milk, and then to the cheese, it was considered a mark of quality.

But here’s where the story gets interesting.

Cheese expert Paul Kindstedt of the University of Vermont explains that back in the 17th century, many English cheesemakers realized that they could make more money if they skimmed off the cream — to sell it separately or make butter from it.

But in doing so, most of the color was lost, since the natural orange pigment is carried in the fatty cream.

So, to pass off what was left over — basically low-fat cheese made from white milk — as a high-quality product, the cheesemakers faked it.

Grain-fed vs. grass-fed: American cows are mostly grain-fed, whereas almost all cows in New Zealand are pasture-raised.

“The cheesemakers were initially trying to trick people to mask the white color [of their cheese],” explains Kindstedt.

They began adding coloring from saffron, marigold, carrot juice and later, annatto, which comes from the seeds of a tropical plant.

The devious cheesemakers of the 17th century used these colorings to pass their products off as the full-fat, naturally yellowish-orange cheese that Londoners had come to expect.

The tradition of coloring cheese then carried over to the U.S. Lots of cheesemakers in Indiana, Ohio, Wisconsin and New York have a long history of coloring cheddar.

The motivation was part tradition, part marketing to make their cheeses stand out. There was another reason, too: It helped cheesemakers achieve a uniform color in their cheeses.

But Kindstedt says it’s not a tradition that ever caught on in New England dairy farms. And that’s why to this day, we still see lots of naturally white cheddar cheese from places such as Vermont.

With the boom in the artisanal food movement, we’re starting to see more cheese produced from grass-fed cows.

And as a result, we may notice the butterlike color in summer cheeses — similar to what the 17th century Londoners ate.

“We absolutely see the color changes when the cows transition onto pasture in early May,” cheesemaker Nat Bacon of Shelburne Farms in Vermont wrote to us in an email.

He says it’s especially evident “in the whey after we cut the curd, and also in the finished cheese. Both get quite golden in color, kind of like straw, with the beta-carotenes the cows are eating in the fresh meadow grasses.”


Pass it on: Popular Science

Ice Chunks Are Fascinating To Look At, But Can Cause Serious Flooding

Mohawk River ice jam: Giant chunks of ice washed up on the shore of the Mohawk River.

The weather has been all over the map in the last few weeks.

Record-breaking frigid temperatures and wind chills kicked off 2018, followed by temperatures into the 50s and 60s last week.

Temperatures plummeted to start off this week, and central Pennsylvania woke up Wednesday to a few inches of fresh snowfall.




A gradual warmup is expected into this weekend, with highs in the 50s predicted. The upcoming warmup has meteorologists keeping an eye on what the warmer weather could mean for ice jams in rivers and streams.

“It’s something we’re going to have to keep an eye on,” said Craig Evanego, a meteorologist with the National Weather Service.

Evanego explained the warmer temperatures could enable ice jams to break free and move down rivers and streams.

He added that some areas in the northern part of the state are already experiencing ice jam issues on localized streams thanks to elevated water levels.

Frozen river ice: Frozen river ice melts in layers as chunks wash up on shore.

“With the gradual warmup, we’ll see if things will begin to thaw and move down the (Susquehanna) river,” Evanego said.

Senior meteorologist Alex Sosnowski with AccuWeather said that thanks to the persistent cold, pretty thick layers of ice have been able to form.

Along with fluctuating temperatures, Sosnowski said river levels are a little higher, adding that another rain event is expected from Monday to Wednesday next week.

Mohawk River ice jam: Jams can cause floods, which threaten buildings near the banks.

Sosnowski explained a major risk with ice jams is that when they break free, they send a surge of water down the river, which can cause flooding in unprotected areas.

Sosnowski said that levee systems should be able to protect against any flooding caused by ice jams, and that unprotected areas are at the most risk.

Sosnowski encouraged those who want to go out and observe ice packs to do so carefully, as they can break away and begin drifting downstream at any time.


Pass it on: Popular Science

NASA Rovers Set New Record For Longest Mission On Mars

NASA’s long-lived twin Mars rovers Spirit and Opportunity have set a new endurance record on Mars, with Opportunity hot on the heels of its sister robot for the title of longest-running mission on the Martian surface.

Opportunity today matched the Mars mission lifespan of NASA’s iconic Viking 1 lander, which spent six years and 116 days (a total of 2,245 Martian days, or sols) working on the red planet in the mid-1970s and early ’80s.

If Opportunity survives three weeks longer than its older robotic twin Spirit, which has been silent for weeks but may actually be hibernating, the rover will take the all-time record for the longest mission on Mars.

The two solar-powered rovers experienced their fourth Martian winter solstice – the day with the least amount of sunlight at their respective spots on Mars – on May 12, 2010.

Opportunity and Spirit were initially slated for only 90-day missions to explore the geology and chemistry of their respective landing sites.




But they blew past those deadlines and have continued their missions for far longer than NASA engineers ever thought possible.

In January of 2010, they each celebrated their sixth anniversary on Mars. That means right now both rovers are in the midst of their seventh Earth year exploring the red planet.

Spirit touched down on the surface of Mars in January 2004, ahead of Opportunity, but fell silent on March 22, 2010, when it skipped a planned communications session with controllers on Earth.

The beleaguered Spirit rover has been out of communication for weeks after entering a low-power hibernation mode once winter set in: temperatures dropped and the sun dipped lower in the sky, leaving Spirit with insufficient power to function properly.

The rover may wake up with the arrival of the Martian spring, and if so, will keep its hold on the record for the longest mission.

Spirit landed on Mars on Jan. 3, 2004, while Opportunity touched down on Jan. 25 (Eastern Time) of that year. So Opportunity would have to survive at least 22 days longer than its twin to take the Martian mission title.
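
For the record books, that 22-day gap is simple calendar arithmetic. Here is a quick Python check (the landing dates are from the article; the script itself is just an illustration):

```python
from datetime import date

spirit_landing = date(2004, 1, 3)        # Spirit: Jan. 3, 2004
opportunity_landing = date(2004, 1, 25)  # Opportunity: Jan. 25, 2004

# How much longer Opportunity must keep working to outlast its twin:
gap_days = (opportunity_landing - spirit_landing).days
print(gap_days)  # 22
```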

But, because Spirit is out of contact, mission managers may not know for several weeks whether it survived and was still operating on its record-setting day.

Opportunity, which is doing fine, is expected to breeze past Viking 1’s 2,245-sol record today with no problems. The rover also hit another milestone in March, passing the 20-kilometer (12.43-mile) mark.

While Opportunity could swipe the Mars surface mission record from Spirit, it has a long way to go to take the title for longest mission in the Martian neighborhood.

Opportunity has been steadily roving toward a huge Mars crater called Endeavour since mid-2008, when it finished its last crater pit stop, Victoria Crater.

Photos from the rover show the rim of Endeavour in the distance with vast plains of Martian sand etched with ripple-like dunes.


Pass it on: Popular Science

Blind Fish In Dark Caves Shed Light On The Evolution Of Sleep

Out of the approximately 3 billion letters of DNA that make up your genome, there are about 100 letters that neither of your parents possesses.

These are your own personal mutations. The machinery that copies DNA into new cells is very reliable, but it is not perfect. It makes errors at a rate equivalent to making a single typo for every 100 books filled with text.

The sperm and egg cells that fused to form you carried a few such mutations, and therefore so do you.
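
If you want to see how the books analogy holds up, here is a rough back-of-the-envelope check in Python (the 300,000 letters per book is my own illustrative assumption, not a figure from the article):

```python
genome_letters = 3_000_000_000  # ~3 billion letters in the human genome
new_mutations = 100             # personal mutations per person, per the article

letters_per_typo = genome_letters / new_mutations  # one error per 30 million letters

letters_per_book = 300_000  # assumed size of a typical book (hypothetical)
print(letters_per_typo / letters_per_book)  # 100.0 -- one typo per ~100 books
```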

Changes to DNA are more likely to be disruptive than beneficial, simply because it is easier for changes to mess things up than to improve them.

This mutational burden is something that all life forms have to bear. In the long run, individuals that carry harmful mutations will, on average, produce fewer offspring than their peers.




Over many generations, this means that the mutation will dwindle in frequency. This is how natural selection is constantly ‘weeding out’ disruptive mutations from our genomes.

There is a flip side to this argument, and it is the story of the blind cave fish. If a mutation disrupts a gene that is not being used, natural selection will have no restoring effect.

This is why fish that adapt to a lifestyle of darkness in a cave tend to lose their eyes. There is no longer any advantage to having eyes, and so the deleterious mutations that creep in are no longer being weeded out.

Think of it as the ‘use it or lose it’ school of evolution.

A world without light is quite an alien place. There are many examples of fish that live in completely dark caves.

Remarkably, if you compare these fish to their relatives that live in rivers or in the ocean, you find that the cavefish often undergo a similar set of changes. Their eyes do not fully develop, rendering them essentially blind.

They lose pigmentation in their skin, and their jaws and teeth tend to develop in particular ways.

This is an example of what is known as convergent evolution, where different organisms faced with similar ecological challenges also stumble upon similar evolutionary solutions.

The changes mentioned above are all about appearance, but what about changes in behavior? In particular, when animals sleep, they generally line up with the day and night cycle.

In the absence of any daylight, how do their sleep patterns evolve?

A recent paper by Erik Duboué and colleagues addressed this question by comparing four groups of fish of the same species, Astyanax mexicanus.

Three of the populations (the Pachón, Tinaja, and Molino) were blind cavefish that inhabited different dark caves, whereas the fourth was a surface-dwelling fish.

The authors defined sleep for their fish to be a period of a minute or more when the fish were not moving. They checked that this definition met the usual criteria.

Sleeping fish were harder to wake up, and fish that were deprived of sleep compensated by sleeping more over the next 12 hours (these are both situations that any college student is familiar with).
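
That one-minute immobility criterion is easy to operationalize. Below is a hypothetical sketch of how one might flag sleep bouts in tracking data; the function and data format are my own illustration, not the authors' code:

```python
def sleep_bouts(moving, times, min_still=60.0):
    """Return (start, end) intervals where the fish was immobile
    for at least `min_still` seconds (the paper's sleep criterion).

    moving: booleans, True if the fish moved during that sample
    times:  timestamps in seconds, same length as `moving`
    """
    bouts, start = [], None
    for is_moving, t in zip(moving, times):
        if not is_moving and start is None:
            start = t                    # immobility begins
        elif is_moving and start is not None:
            if t - start >= min_still:   # long enough to count as sleep
                bouts.append((start, t))
            start = None
    if start is not None and times[-1] - start >= min_still:
        bouts.append((start, times[-1]))  # still immobile at the end
    return bouts

# Example: one sample every 30 s; the fish is still from t=30 to t=150.
times = [0, 30, 60, 90, 120, 150]
moving = [True, False, False, False, False, True]
print(sleep_bouts(moving, times))  # [(30, 150)]
```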

The researchers also tracked the speeds of all the fish, and found that, while awake, the cavefish moved as fast as, or faster than, the surface fish.

In other words, the cavefish are not constantly sleep deprived and stuck in a lethargic, sleepy state. They are just as wakeful as the surface fish (if not more so), and genuinely need less sleep.

These three cavefish populations all evolved independently, and yet they have converged on remarkably similar sleep patterns.

To study the genetics of this phenomenon, the researchers cross-bred the surface fish with the cavefish. The cave dwellers and surface fish all belong to the same species, which means that they can have viable offspring.

They found that the mixed offspring (Pachón x surface and Tinaja x surface) had a reduced need for sleep that was indistinguishable from that of their cave-dwelling parent.

Thus sleep reduction is clearly a genetic trait, and a dominant one (dominant traits are present in the offspring if inherited from just one parent; a recessive trait, on the other hand, will only be present if inherited from both parents).
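
To make the dominant-versus-recessive logic concrete, here is a deliberately toy, single-gene sketch of my own (real inheritance of sleep traits is surely more complex):

```python
# Each parent contributes one allele; True marks the "reduced sleep" variant.
def shows_reduced_sleep(allele_from_parent_a, allele_from_parent_b):
    # A dominant trait appears if EITHER allele carries it;
    # a recessive trait would require BOTH (logical AND instead of OR).
    return allele_from_parent_a or allele_from_parent_b

print(shows_reduced_sleep(True, False))  # True: a cave x surface hybrid sleeps less
```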

Unlocking the secrets of sleep is inherently cool science, and it also has the potential to help people suffering from sleep disorders.

Who knows, it may even lead to the superpower of doing away with sleep altogether.


Pass it on: Popular Science

Has The Age Of Quantum Computing Arrived?

Ever since Charles Babbage’s conceptual, unrealised Analytical Engine in the 1830s, computer science has been trying very hard to race ahead of its time.

Particularly over the last 75 years, there have been many astounding developments – the first electronic programmable computer, the first integrated circuit computer, the first microprocessor.

But the next anticipated step may be the most revolutionary of all.

Quantum computing is the technology that many scientists, entrepreneurs and big businesses expect to provide a, well, quantum leap into the future.

If you’ve never heard of it there’s a helpful video doing the social media rounds that’s got a couple of million hits on YouTube.

It features the Canadian prime minister, Justin Trudeau, detailing exactly what quantum computing means.




Trudeau was on a recent visit to the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, one of the world’s leading centres for the study of the field.

During a press conference there, a reporter asked him, half-jokingly, to explain quantum computing.

Quantum mechanics is a conceptually counterintuitive area of science that has baffled some of the finest minds – as Albert Einstein said, “God does not play dice with the universe” – so it’s not something you expect to hear politicians holding forth on.

Throw it into the context of computing and let’s just say you could easily make Zac Goldsmith look like an expert on Bollywood.

But Trudeau rose to the challenge and gave what many science observers thought was a textbook example of how to explain a complex idea in a simple way.

The concept of quantum computing is relatively new, dating back to ideas put forward in the early 1980s by the late Richard Feynman, the brilliant American theoretical physicist and Nobel laureate.

He conceptualised the possible improvements in speed that might be achieved with a quantum computer. But theoretical physics, while a necessary first step, leaves the real brainwork to practical application.

With normal computers, or classical computers as they’re now called, there are only two options – on and off – for processing information.

A computer “bit”, the smallest unit into which all information is broken down, is either a “1” or a “0”.

And the computational power of a normal computer is dependent on the number of binary transistors – tiny power switches – that are contained within its microprocessor.

Back in 1971 the first Intel processor was made up of 2,300 transistors. Intel now produce microprocessors with more than 5bn transistors. However, they’re still limited by their simple binary options.
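
As an aside, those two data points track Moore's law closely. A quick check (the transistor counts are from the article; taking 2016 as the present day is my assumption based on when the piece appeared):

```python
import math

transistors_1971 = 2_300         # Intel's first microprocessor
transistors_now = 5_000_000_000  # the article's figure for a modern chip
years = 2016 - 1971              # assumed span

doublings = math.log2(transistors_now / transistors_1971)
print(round(doublings, 1))          # ~21.1 doublings
print(round(years / doublings, 1))  # ~2.1 years per doubling
```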

But as Trudeau explained, with quantum computers the bits, or “qubits” as they are known, afford far more options owing to the uncertainty of their physical state.

In the mysterious subatomic realm of quantum physics, particles can act like waves, so that they can be particle or wave or particle and wave.

This is what’s known in quantum mechanics as superposition. As a result of superposition a qubit can be a 0 or 1 or 0 and 1. That means it can perform two equations at the same time.

Two qubits can perform four equations. And three qubits can perform eight, and so on in an exponential expansion. That leads to some inconceivably large numbers, not to mention some mind-boggling working concepts.
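
One way to feel that exponential expansion: a classical computer simulating n qubits has to track 2^n numbers at once. A tiny illustrative Python sketch (a toy model, not how real quantum hardware works):

```python
import itertools

def equal_superposition(n_qubits):
    """Amplitudes for n qubits, each in an equal 0-and-1 superposition."""
    dim = 2 ** n_qubits     # one amplitude per basis state
    amp = 1 / (dim ** 0.5)  # equal weights, normalized so probabilities sum to 1
    return [amp] * dim

for n in (1, 2, 3, 10, 20):
    print(n, "qubits ->", len(equal_superposition(n)), "amplitudes to track")

# The basis states are all n-bit strings; for two qubits:
print(["".join(bits) for bits in itertools.product("01", repeat=2)])
# ['00', '01', '10', '11'] -- the four values two qubits can cover at once
```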

At the moment those concepts are closest to entering reality in an unfashionable suburb in the south-west corner of Trudeau’s homeland.


Pass it on: New Scientist

How the U.S. Built The World’s Most Ridiculously Accurate Atomic Clock

Throw out that lame old atomic clock that’s only accurate to a few tens of quadrillionths of a second. The U.S. has introduced a new atomic clock that is three times more accurate than previous devices.

Atomic clocks are responsible for synchronizing time for much of our technology, including electric power grids, GPS, and the watch on your iPhone.

On Apr. 3, the National Institute of Standards and Technology (NIST) in Boulder, Colorado, officially launched their newest standard for measuring time using the NIST-F2 atomic clock, which has been under development for more than a decade.

“NIST-F2 is accurate to one second in 300 million years,” said Thomas O’Brian, who heads NIST’s time and frequency division, during a press conference April 3.




The clock was recently certified by the International Bureau of Weights and Measures as the world’s most accurate time standard.

The advancement is more than just a feather in the cap for metrology nerds. Precise timekeeping underpins much of our modern world.

GPS, for instance, needs accuracy of about a billionth of a second in order to keep you from getting lost. These satellites rely on high precision coming from atomic clocks at the U.S. Naval Observatory.
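
Both accuracy figures are easier to appreciate as plain numbers. A minimal sketch (the constants are standard physics, not from the article):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600
SPEED_OF_LIGHT = 299_792_458  # meters per second

# NIST-F2: at most 1 second of drift in 300 million years...
fractional_error = 1 / (300e6 * SECONDS_PER_YEAR)
print(f"{fractional_error:.1e}")  # ~1.1e-16: sixteenth-decimal-place territory

# GPS: a 1-nanosecond timing error corresponds to roughly the distance
# light travels in that time.
print(SPEED_OF_LIGHT * 1e-9, "meters")  # ~0.3 m of position error
```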

GPS, in turn, is used for synchronizing digital networks such as cell phones and the NTP servers that provide the backbone of the internet.

Your smartphone doesn’t display the time to the sixteenth decimal place, but it still relies on the frequency standards coming from NIST’s clocks, which make their measurements while living in a tightly controlled lab environment.

Real-world clocks must operate under harsher conditions, such as temperature swings, significant vibration, or changing magnetic fields, all of which degrade their accuracy.

It’s important, then, that the ultimate reference standard perform much better than these real-world technologies.

What will we do once we reach the ability to break down time into super-tiny, hyper-accurate units? Nobody knows.


Pass it on: New Scientist

Engineers Create New Architecture For Vaporizable Electronics

Engineers from Cornell and Honeywell Aerospace have demonstrated a new method for remotely vaporizing electronics into thin air, giving devices the ability to vanish – along with their valuable data – if they were to get into the wrong hands.

This unique ability to self-destruct is at the heart of an emerging technology known as transient electronics, in which key portions of a circuit, or the whole circuit itself, can discreetly disintegrate or dissolve.

And because no harmful byproducts are released upon vaporization, engineers envision biomedical and environmental applications along with data protection.

There are a number of existing techniques for triggering the vaporization, each with inherent drawbacks.




Some transient electronics use soluble conductors that dissolve when contacted by water, requiring the presence of moisture.

Others disintegrate when they reach a specific temperature, requiring a heating element and power source to be attached.

Cornell engineers have created a transient architecture that evades these drawbacks by using a silicon-dioxide microchip attached to a polycarbonate shell.

Hidden within the shell are microscopic cavities filled with rubidium and sodium bifluoride – chemicals that can thermally react and decompose the microchip.

Ved Gund, Ph.D. ’17, led the research as a graduate student in the Cornell SonicMEMS Lab, and said the thermal reaction can be triggered remotely by using radio waves to open graphene-on-nitride valves that keep the chemicals sealed in the cavities.

“The encapsulated rubidium then oxidizes vigorously, releasing heat to vaporize the polycarbonate shell and decompose the sodium bifluoride. The latter controllably releases hydrofluoric acid to etch away the electronics,” said Gund.

Amit Lal, professor of electrical and computer engineering, said the unique architecture offers several advantages over previously designed transient electronics, including the ability to scale the technology.

“The stackable architecture lets us make small, vaporizable, LEGO-like blocks to make arbitrarily large vanishing electronics,” said Lal.

Gund added that the technology could be integrated into wireless sensor nodes for use in environmental monitoring.

“For example, vaporizable sensors can be deployed with the internet of things platform for monitoring crops or collecting data on nutrients and moisture, and then made to vanish once they accomplish these tasks,” said Gund.

Lal, Gund and Honeywell Aerospace were recently issued a patent for the technology, and the SonicMEMS Lab is continuing to research new ways the architecture can be applied toward transient electronics as well as other uses.

“Our team has also demonstrated the use of the technology as a scalable micro-power momentum and electricity source, which can deliver high peak powers for robotic actuation,” said Lal.

Fabrication of the polycarbonate shell was completed by Christopher Ober, professor of materials science and engineering, with other components of the architecture provided by Honeywell Aerospace.

Portions of the research were funded under the Defense Advanced Research Projects Agency’s Vanishing Programmable Resources program.


Pass it on: Popular Science

Superbugs And Antibiotic Resistance

For the last century, medical professionals and microbiologists have waged a war against germs of every type and with the breakthrough of antibiotics, changed the world in which we live.

It also changed the world for our symbionts: the 4 to 6 pounds of bacteria, fungi and viruses that have hung on to our species through thick and thin for eons; to them, we are their movable feast.

It was indeed a war that we appeared to be winning. We thought we were firmly living in the ‘Antibiotic Age’ and that it was here to stay for all time.

However, while we were basking in its potency, we were also rapidly and unwittingly sowing the seeds of its demise.

In a recent landmark report, US health policymakers warn that, with mounting evidence of superbugs overcoming our antibiotics, our situation is extremely serious.

The report gives a glimpse of the world to come, as even now there are a dozen different drug resistant microbial species that have totally overcome our existing antibiotics.

These resistant strains are now responsible for causing 2 million infections and 23,000 deaths each year in the US alone.




According to the WHO, the rapid emergence of multi-drug resistant (MDR) strains calls for a comprehensive and coordinated response to prevent a global catastrophe.

The WHO warns that, “...many infectious diseases are rapidly becoming untreatable and uncontrollable.”

CDC director Tom Frieden says that we must take urgent action to “change the way antibiotics are used” by cutting unneeded use in humans and animals and take basic steps to prevent infections in the first place.

The tools we have at our disposal, besides tracking resistant infections, are vaccines, safe food handling and patient infection control practices, paired with effective and enlightened hand hygiene.

Human populations weathered numerous plagues before antibiotics were discovered. It is edifying that geneticists have found that the human genome is littered with the remnants of our past battles with pathogens.

The difference is that today we know how to effectively apply all of the preventive measures that are at our disposal.

We should keep in mind that the advent of infectious disease adapted to humans is a relatively recent phenomenon.

The ‘Post-Antibiotic Age’, if it comes, represents the ongoing evolution between a microbe and its human host, with hand & surface hygiene reigning supreme as the most effective means of preventing infection.

These elements, along with water sanitation and hygienic treatment of human waste, have formed the basis for the hygiene revolution over the last hundred years.

Within this, the discovery and development of antibiotics is perhaps the short-lived apex or crowning glory of the revolution.

To rise to the challenge, we need to recognize that our bodies are complex ecological systems and the maintenance of our barrier function is critical to preventing skin infection and keeping out invading pathogens.

This is no more than an extension and further development of the original hygiene revolution, where we see the true relations between living organisms and the many elements of the environment.

Skin health is critical to maintaining hand hygiene compliance. Hand hygiene is certainly capable of rising to the challenge, but not if skin is damaged.

In the ‘Post-Antibiotic Age’, maintaining healthy skin will be essential to preventing a wide range of infections caused by strains we helped to create.

Healthy hands are safe hands, but hand hygiene does not have to go it alone if there is a “sea-change” with respect to how agri-food producers and healthcare professionals utilize antibiotics.

CDC Director Frieden stated that, “It’s not too late,” but that there is a list of urgent and life-threatening infections that must be addressed via a more effective collaboration; they include carbapenem-resistant Enterobacteriaceae (CRE), drug resistant gonorrhea and C. difficile.

The WHO has called for the agri-food industry to take the threat of MDRs seriously and curb overuse of antibiotics, particularly as it is estimated that antibiotic use in agriculture is at least 1,000-fold greater than in humans.

In hospitals we must embrace best antibiotic and hygiene practices to make a turn from what the Center for Global Development has called “a decade of neglect“.

We need to “Get Smart” and set targets for reducing antibiotic use in healthcare facilities.

Let’s all appreciate the good microbial flora and fauna that exist on and in us, as without these little creatures life as we know it would not exist.

We should also recognize that the more bad bugs encounter antibiotics, the more likely they are to adapt. As Health Canada puts it, “Do bugs need drugs?“.

While antibiotics have allowed us to temporarily gain the upper hand, nothing lasts forever; but with a holistic view of hand hygiene, there is no reason why we can’t continue to improve our control of infections.

But for this to happen, there can be no excuses or compromises for effective hand hygiene practices.


Pass it on: New Scientist

 

Breakthrough As Human Eggs Developed In The Lab For First Time

Women at risk of premature fertility loss might have cause for new hope as researchers reveal that human eggs can be developed in the lab from their earliest stages to maturity.

While the feat has previously been achieved for mouse eggs, and has given rise to live young after fertilization, the process has proved tricky in humans.

Experts say the latest development could not only aid the understanding of how human eggs develop, but open the door to a new approach to fertility preservation for women at risk of premature fertility loss – such as those undergoing chemotherapy or radiotherapy.

The research could be particularly relevant for girls who have not yet gone through puberty. Currently, to preserve their fertility, ovarian tissue is taken before treatment and frozen for later implantation.




“[For young girls] that is the only option they have to preserve their fertility,” said Prof Evelyn Telfer, co-author of the research from the University of Edinburgh.

But the approach has drawbacks. In the case of re-implanted tissue, “the big worry, and the big risk, is can you put cancer cells back,” said Stuart Lavery, a consultant gynaecologist at Hammersmith Hospital, who was not involved in the study.

The new research offers a way for eggs to be extracted, grown and used, without the need to re-implant the tissue.

“When you have got the eggs, of course you would have no contaminating cells – hopefully it would be an embryo that you would be implanting back in,” said Telfer.

But, she warned, it would be several years before the technique could be used in clinics, with further tests needed to make sure the mature eggs are normal and the process safe.

Writing in the journal Molecular Human Reproduction, researchers from Edinburgh and New York describe how they took ovarian tissue from 10 women in their late twenties and thirties and, over four steps involving different cocktails of nutrients, encouraged the eggs to develop from their earliest form to maturity.

Of the 48 eggs that reached the penultimate step of the process, nine reached full maturity.

Although various teams have achieved different stages of the process before, the new work is the first time researchers have taken the same human eggs all the way from their earliest stages to the point at which they would be released from the ovaries.

Before reaching this level of maturity, eggs cannot be fertilised.

Lavery added the new technique could also prove useful for women who have passed through puberty. While these women can have mature eggs collected before treatment, that approach also has problems.

Telfer adds that the new approach could also be useful for women whose eggs fail to fully develop in the body and, more fundamentally, will help boost our understanding of the mechanisms underpinning the development of human eggs.

However, it will be many years before the research leads to new fertility preservation treatments.

Among other issues, the authors note that the eggs developed faster than they would in the body, while a small cell known as a polar body – ejected in the final stages of the egg’s development when the number of chromosomes is halved – was unusually large, which might suggest abnormal development.

Prof Robin Lovell-Badge, a group leader at the Francis Crick Institute, was also cautious, noting it was possible that not all of the eggs were at the earliest stage of development to start with.

Telfer admits far more work is necessary, and she hopes to get regulatory approval for future research.

“The next step would be to try and fertilise these eggs and then to test the embryos that were produced, and then to go back and improve each of the steps.”


Pass it on: New Scientist