This Is A Bio-Inspired 3D-Printed Spider Octopod Robot

T8 robot

The T8 octopod robot is modeled after a real tarantula, and the way it moves is startlingly realistic, an effect that’s amplified by its high-resolution 3D-printed shell, which conceals the robotics inside.

Each T8 moves with the help of 26 Hitec HS-35HD servo motors: three in each leg and two to move the body. It is pre-programmed using Robugtix’s Bigfoot Inverse Kinematics Engine, which handles calculations such as trajectory planning, gait, and motor control.

All the operator has to do is press buttons on the controller, which communicates with the robot via an XBee radio module.
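The article doesn’t describe the Bigfoot engine’s internals, but the per-leg calculation an engine like it must perform is classic inverse kinematics. A minimal sketch for one two-segment leg, in Python (the planar simplification and link lengths are assumptions for illustration, not Robugtix specifics):

```python
import math

def leg_ik(x, y, l1, l2):
    """Two-link planar inverse kinematics: given a foot target (x, y)
    relative to the hip, return hip and knee joint angles in radians."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Law of cosines gives the knee bend
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    knee = math.acos(cos_knee)
    # Hip angle: direction to the target minus the inner triangle angle
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee
```

A gait engine would solve this for each of the eight legs every control tick, so the operator only ever commands body motion, never individual servos.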

It’s an impressively spooky little critter, though. Check it out in the video below.

Please like, share and tweet this article.

Pass it on: Popular Science

This Is The Newest And Fastest Way To Press Vinyl

The first new record-pressing machines built in over 30 years are finally online.

The brainchild of some Canadian R&D engineers with a background in designing MRI machines, the Warm Tone record press is everything its vintage counterparts are not: safe, fast, fully automated, reliable, run by cloud-based software, and iOS-controlled.

Unlike the old stamping behemoths, a single worker can operate several Warm Tone units at once.

Its unrivaled speed and efficiency leave the standard cycle-time benchmarks in the dust, too: 20 seconds versus 35 seconds, which translates to three records per minute instead of only two.
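The throughput figures follow directly from the cycle times; a quick check:

```python
def records_per_minute(cycle_seconds: float) -> float:
    """Records pressed per minute at a given press cycle time."""
    return 60.0 / cycle_seconds

print(records_per_minute(20))  # Warm Tone: 3.0 per minute
print(records_per_minute(35))  # vintage press: ~1.7, i.e. roughly two
```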

Pass it on: Popular Science

You’ll Never Dance Alone With This Artificial Intelligence Project

Your next dance partner might not be a person.

A new project from the Georgia Institute of Technology allows people to get jiggy with a computer-controlled dancer, which “watches” the person and improvises its own moves based on prior experiences.

When the human responds, the computerized figure or “virtual character” reacts again, creating an impromptu dance couple based on artificial intelligence (AI).

The LuminAI project is housed inside a 15-foot-tall geodesic dome, designed and constructed by Georgia Tech digital media master’s student Jessica Anderson, and lined with custom-made projection panels for dome projection mapping.

The surfaces allow people to watch their own shadowy avatar as it struts with a virtual character named VAI, which learns how to dance by paying attention to which moves the current user is doing and when.

The more moves it sees, the better and deeper the computer’s dance vocabulary. It then uses this vocabulary as a basis for future improvisation.

The system uses Kinect devices to capture the person’s movement, which is then projected as a digitally enhanced silhouette on the dome’s screens.

The computer analyzes the dance moves being performed and leans on its memory to choose its next move.

The team says this improvisation is one of the most important parts of the project. The avatar recognizes patterns, but doesn’t always react the same way every time.

That means that the person must improvise too, which leads to greater creativity all around. All the while, the computer is capturing these new experiences and storing the information to use as a basis for future dance sessions.
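As a toy illustration of learning a move vocabulary and improvising from it, here is a minimal sketch using a simple first-order transition memory (entirely hypothetical; the article doesn’t describe LuminAI’s actual model):

```python
import random
from collections import defaultdict

class DanceMemory:
    """Learns which move tends to follow which, then improvises a
    response by sampling from that memory rather than always
    replying the same way."""

    def __init__(self, seed=None):
        self.follows = defaultdict(list)  # move -> moves seen after it
        self.rng = random.Random(seed)

    def observe(self, move_sequence):
        """Store each observed (move, next move) transition."""
        for current, nxt in zip(move_sequence, move_sequence[1:]):
            self.follows[current].append(nxt)

    def respond(self, last_move):
        """Pick a response from memory; echo the partner if unseen."""
        options = self.follows.get(last_move)
        if not options:
            return last_move
        return self.rng.choice(options)

memory = DanceMemory(seed=1)
memory.observe(["step", "spin", "wave", "spin", "dip"])
print(memory.respond("spin"))  # "wave" or "dip", chosen at random
```

The more sessions such a memory observes, the more options it has per move, which is the sense in which the vocabulary gets “better and deeper” with experience.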

LuminAI was unveiled for the first time this past weekend in Atlanta at the Hambidge Art Auction in partnership with the Goat Farm Arts Center.

It was featured within a dance and technology performance, in a work called Post, as a finalist for the Field Experiment ATL grant. T. Lang Dance performed set choreography with avatars and virtual characters within the dome.

Post is the fourth and final installment of Lang’s Post Up series, which focuses on the stark realities and situational complexities after an emotional reunion between long lost souls.

Pass it on: New Scientist

Ocado’s Collaborative Robot Is Getting Closer To Factory Work

Retailer Ocado is getting closer to creating an autonomous humanoid robot that can help engineers fix mechanical faults in its factories.

The firm’s latest robot, ARMAR-6, has a human-looking torso, arms with eight degrees of freedom, hands that can grip, and a head with cameras inside. Instead of legs, it has a large wheeled base that lets it move around.

ARMAR-6 uses a three-camera system inside its head to help it detect and recognize humans and objects; speech recognition helps it understand commands; and its hands are able to pick up and grasp objects.

At present, the robot is still a prototype, but getting to this point has taken two and a half years. Four European universities have been working to create each of the systems, under the EU’s Horizon2020 project.

The retailer has already automated large parts of its warehouse operation. Its 90,000-square-metre Dordon warehouse, near Birmingham, has 8,000 crates moving around it at any one time, across 35 kilometers of conveyor belts.

However, components can break and require maintenance. This is where future versions of the ARMAR-6 robot will come in.

Other training tasks that have been worked on include getting it to find a spray bottle, pick it up, and then hand it across to a human.

“At the moment, this is a prescribed sequence,” Deacon says. “But the ultimate aim is for the robot to be able to recognize where in a maintenance task the technician is and understand from its behavioral repertoire what will be a good thing for it to do in order to assist the technician.”

Ocado’s humanoid project runs under the banner of Secondhands and involves engineers and computer scientists from EPFL, Karlsruhe Institute of Technology, Sapienza Università di Roma, and University College London.

Each university has developed individual elements of the ARMAR-6 system.

The firm first laid out the ambitious plans for the collaborative robot in 2015. Since then, it has worked on a number of robotics projects.

Most recently, it revealed a robotic arm that can pick up items using suction. The plan is for the gripper to be used in the company’s factories to lift and place thousands of different items into customers’ shopping orders.

Pass it on: New Scientist

Kiwi Bots Trundle Along Campus Streets And Deliver Food To Students

A Kiwibot delivers food around the UC Berkeley campus via Kiwi, a new on-demand delivery service.

UC Berkeley has a new on-demand delivery service. Unlike any of its predecessors, however, this one relies on robots.

Kiwi uses a fleet of 20 terrier-sized wheeled robots to pick up and deliver food and personal-care items within a roughly one square mile area centered around campus.

The vehicles operate between Shattuck and Piedmont avenues and between Hearst Avenue and Dwight Way.

“If you want to start a robot company, use a campus,” said founder and CEO Felipe Chávez, “and if you want to use a campus, use UC Berkeley.”

The city and the campus were a natural fit for Kiwi.

At the most basic level, the infrastructure (paved streets, well-maintained sidewalks, functioning traffic signals, crosswalks) and a generally law-abiding citizenry allow the robots to operate smoothly.

In addition, UC Berkeley has a high concentration of residents pressed both for time and space to prepare meals at home.

Finally, there’s Berkeley’s abundance of local dining options and a generally favorable attitude towards innovation.

Though Chávez started Kiwi in his native city of Bogotá, Colombia in 2015, using people as couriers, he switched to robots when he brought Kiwi north to Cal in January of 2017 as part of UC Berkeley’s LAUNCH program, an incubator for promising startups.

The bot roamed free-range on the plaza as Chávez sat on a bench for our interview.

He looked ahead, unconcerned as the robot made several turns around the fountain, then headed off under an alley of trees, almost out to Sather Gate before turning back and circling the plaza again.

The Kiwibot’s motions appear self-directed, but it uses the same kind of technology as Roomba, the robotic vacuum cleaner.

It recognizes boundaries and avoids obstacles by using lidar sensors and has a smartphone mounted to its hood.
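As a rough illustration of that boundary-and-obstacle behavior, here is a toy avoidance rule over a lidar scan (entirely hypothetical; Kiwi’s real navigation software is not described in this article):

```python
def steer_from_lidar(ranges, safe_distance=1.0):
    """Pick a heading from a lidar scan. `ranges` maps bearing in
    degrees (0 = straight ahead, negative = left) to measured
    distance in metres. Returns the bearing of the most open
    direction, or None if every direction is blocked."""
    open_bearings = {b: d for b, d in ranges.items() if d >= safe_distance}
    if not open_bearings:
        return None  # boxed in: stop and wait
    # Prefer the clearest direction, breaking ties toward straight ahead
    return max(open_bearings, key=lambda b: (open_bearings[b], -abs(b)))

scan = {-45: 0.4, 0: 2.5, 45: 1.2}
print(steer_from_lidar(scan))  # 0: straight ahead is clearest
```

A real delivery bot layers mapping, traffic rules, and remote supervision on top of a reactive rule like this, but the core loop of “measure distances, steer toward open space” is the same.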

Pass it on: New Scientist

This Is An Underwater Robotic Excavation System For Flooded Open-Cut Mines

The ¡VAMOS! Project (Viable Alternative Mine Operating System) is developing a novel underwater excavation system to test the technological and economic viability of the mining of inland mineral deposits in flooded open-cut mines, currently uneconomic using conventional methods.

A floating launch and recovery vessel has been built, and in July 2017, work will be completed on a remotely operated underwater roadheader and a robotic assistance vehicle.

After completion, the first of two European trials will commence.

During these trials, the road-transportable system will be tested on a range of rock-types and its technological and economic viability and socio-environmental impact will be analysed.

By demonstration of a safe, silent, clean and low-visibility system, the project hopes to encourage investment in abandoned and prospective EU mines by providing an alternative and more cost-effective excavation technique, ultimately aiming to reduce the EU’s reliance on strategically important raw materials imports.

Following a design freeze in October 2016, work is set to be completed on all system components and software by July 2017, shortly before the first European field trial, in England.

Post-trial microeconomic, environmental, and strategic foresight analyses will guide the future development of the technology vision.

Pass it on: New Scientist

The Robot Companions For The Elderly

Care homes plan to use robots to interact with the elderly – raising fears they could become a cheap replacement for staff. A £15,000 robot is to patrol care homes and seek out elderly residents in Southend to talk to.

And a separate British trial starting this month will use robots to bolster staff at homes in the UK, Poland and Greece. It is hoped that they will eventually be able to monitor pulses and signs of illness in order to alert staff.

But critics warned that machines should not replace human interaction, and called for funds to be spent on the care crisis instead as the number of older Britons increases.

Lesley Salter, the Southend councillor in charge of care, said the robot, called Pepper, was “cute, kind, engaging and learning all the time”.

She said: “We strongly believe that Pepper can have a positive impact on social care.” Pepper, which is 4ft tall and gets around on three wheels, will not be used for one-to-one personal care in Southend.

Phil Webster, of the council, said he was developing a memory game for older people involving Pepper.

The robot has cameras with shape-recognition software, as well as four microphones, which allow it to decipher voice tones and expressions in order to determine if people are happy.

Mr Webster said: “In a residential home he could patrol around and seek out people to talk to.”

“He could go up to someone of his own volition and… send an email back saying, ‘I spent some time with Henry. He says he’s happy but he looks sad.’ And you could gain more knowledge about the service users.”

But Matthew Egan, of the Unison union, said: “A smile or a hug from a machine is going to be small comfort to anyone feeling sad and alone.

“Buying robots might be cheaper than training and employing experienced staff, but they’re essentially sticking plasters masking a much bigger problem.

“As we all live longer, the one million extra care workers needed to look after us all will only materialise when the government provides the funding the system urgently needs.”

The other trial, by Lincoln University, will last about four months.

Lincoln care home resident Jean Clark, 86, who has been introduced to the robot, said: “The most important thing is health – it will be able to detect the health of the person and maybe communicate that information to a doctor.”

“My family don’t live in Lincoln, so anything that can help me and my disabled husband is fantastic.”

In Japan, a bear-shaped robot is being used to lift people out of beds and into chairs. Disability charity Scope called for more funding rather than “pipe dreams of robot carers”.

Age UK director Caroline Abrahams said: “There’s a lot to be said for making smarter use of technology to help people manage health conditions, stay independent for longer and improve the efficiency of back office functions.

“However, technology should only be introduced in situations where it delivers real benefits. When it comes to caring for older people there is no substitute for the human touch.”

Other Pepper models, made by Japanese firm Softbank, are being used to welcome bank customers and take patients to hospital departments.

It can change its eye colour and the tone of its voice to match the mood of the person it is speaking to. It can also interact through touch sensors.

Pass it on: New Scientist

Origami Robots Now Come With Their Own Tiny Exoskeletons

You’ve probably seen origami “robots” before: flat sheets of metal or plastic that fold into bots that can walk, climb, and even swim.

They’re not of much practical use right now, but they represent a promising path for robot development.

Now, in a bid to augment the bots’ abilities, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a new tool for them: origami exoskeletons.

In a paper published today, researchers describe four exoskeletons, each made out of a plastic sheet that folds into a predefined shape when heated for a few seconds.

There’s a boat-shaped exoskeleton and a glider, plus one for “walking” and another that folds up into a crude wheel for faster movement.

Each exoskeleton can be donned in turn by a tiny lead bot called Primer. This isn’t a robot as we usually think of them, but a small magnetic cube that can be controlled remotely using magnetic fields.

In the future, the researchers imagine this sort of approach to robot design could help us make multifunctional bots that can perform complex tasks remotely.

They could be used for deep-sea mining operations, for example, or for building colonies in space.

These are locations where you don’t want to waste resources shipping out lots of different bots for different jobs, so it’s more efficient to send one with a set of origami tools.

As CSAIL director Daniela Rus says: “Why update a whole robot when you can just update one part of it?”

Pass it on: New Scientist

How Humans And Robots Will Complement Each Other

Across the world, robots have replaced workers in factories, taken on the role of customer service agents in call centers, and, in the near future, will be driving our cars.

But while factory workers, customer service specialists, and taxi drivers may have a lot to worry about in the new age of automation and AI, there’s reason for hope: Robots often need humans to work with them.

TechRepublic talked to experts in robotics, AI, and finance to learn more about how humans and robots will complement each other in future jobs.

It’s clear that robots are still not good at everything—the common joke among roboticists is that if you want to stop a robot takeover, all you have to do is close a door.

So Joe Jones, founder of Harvest Automation (and original Roomba inventor), told TechRepublic that human-robot collaboration “makes designing the robot easier.”

“The model for industrial robots has been that they are big and dangerous and must be kept behind a fence, away from people,” said Jones.

“This means the robot must be entirely autonomous. But, people and robots have different strengths—if the robot must do the whole job itself, it may have to perform functions that robots aren’t especially good at.”

One primary place for robots, warehouses, is “rather challenging,” said Jones. For example, one problem is picking non-rigid objects through a hole cut in a cardboard box.

“One good solution to this problem might be to let people identify and manipulate, and have the robot drive around the warehouse carrying totes and consolidating the items picked for each order,” said Jones.

“We implemented this example when Harvest became interested in warehouse robots, and it seems to have a lot of merit.”

Steve Palomino, director of financial transformation at Redwood Software, which provides enterprise robotic process automation, sees the potential for a lot of new jobs in finance.

“[Of] all the technological advances, we haven’t had a disruption in accounting,” said Palomino.

“How humans and robots work together is that robots can take over mundane tasks, like account balances,” said Palomino.

“Right now, I have to look at your checking account, and compare it to your QuickBooks account and your Excel spreadsheet, and make sure they’re equal.”

For a machine, it’s a much easier task. “If you take a large corporation with between 2,000 and 8,000 accounts, you have hordes of humans spending hours in Excel looking back and forth trying to find variances,” he said.
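The comparison Palomino describes is straightforward to mechanize. A minimal sketch (account names, figures, and the rounding tolerance are illustrative, not from any real accounting system):

```python
def reconcile(bank, books):
    """Compare balances from two systems (e.g. a bank feed vs the
    ledger) and report accounts whose balances disagree."""
    variances = {}
    for account in set(bank) | set(books):
        delta = bank.get(account, 0.0) - books.get(account, 0.0)
        if abs(delta) > 0.005:  # ignore sub-cent rounding noise
            variances[account] = round(delta, 2)
    return variances

bank_feed = {"checking": 1200.00, "savings": 5000.00}
ledger = {"checking": 1180.50, "savings": 5000.00}
print(reconcile(bank_feed, ledger))  # {'checking': 19.5}
```

Run across thousands of accounts, a loop like this surfaces only the variances, which is exactly the part humans then investigate.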

“This will free humans to do the important things that they went to college for: complex accounting.”

Another reason that robots and workers can work together? Toby Walsh, professor of AI at the University of New South Wales, said that robots can help humans “play to our strengths”.

These are just a few of the ways that humans and robots may work together in the future—it is not comprehensive. Know of more ways that this is happening, or will happen? Let us know in the comment section.

Pass it on: Popular Science

Self-Healing Robot Can Adapt To Injury Within Minutes

From putting out forest fires to grabbing you a cup of coffee, robots have the potential to be hugely beneficial to humans.

The problem, however, is that they seem to fall apart when they’re injured. A new study published in Nature may have just overcome this hitch by creating a robot that learns to adapt to its injuries. What could possibly go wrong?

Researchers from Pierre and Marie Curie University and the University of Wyoming have created a robot that is able to get back on its feet—literally—after two of its legs were broken.

They also developed a robotic arm that is able to place a ball into a can, despite having several broken motors.

“When injured, animals do not start learning from scratch,” senior author Jean-Baptiste Mouret said in a statement.

“Instead, they have intuitions about different ways to behave. These intuitions allow them to intelligently select a few different behaviors to try out and, after these tests, they choose one that works in spite of the injury.”

For example, if you hurt your ankle, you quickly try to find a way to overcome the injury by testing out new ways to walk.

Using this principle, the researchers created an algorithm called ‘Intelligent Trial and Error’ that makes a detailed map of the different behaviors the robot can perform and allows it to adapt to unexpected situations.

“Once damaged, the robot becomes like a scientist. It has prior expectations about different behaviors that might work, and begins testing them.”

“However, these predictions come from the simulated, undamaged robot. It has to find out which of them work, not only in reality, but given the damage,” says lead author Antoine Cully in a statement.

“For example, if walking mostly on its hind legs does not work well, it will next try walking mostly on its front legs. What’s surprising is how quickly it can learn a new way to walk.”

“It’s amazing to watch a robot go from crippled and flailing around to efficiently limping away in about two minutes,” he adds.

Intelligent Trial and Error involves two crucial steps: the first uses a new type of evolutionary algorithm, called MAP-Elites, to create a behavior-performance map.

MAP-Elites draws on Darwin’s concept of ‘survival of the fittest’ to run competitions in computer simulations, which evolve artificially intelligent robots. In the second step, the robot uses the prior knowledge provided by the first step to adapt to the specific damage.
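A heavily simplified sketch of the second, trial-and-error step (the published algorithm uses Bayesian optimization over the MAP-Elites behavior-performance map; the gait names and scores below are invented for illustration):

```python
def adapt(behavior_map, measure, good_enough):
    """`behavior_map` maps each behavior to the performance predicted
    for the undamaged robot; `measure` returns the performance the
    damaged robot actually achieves. Try behaviors in order of
    predicted performance until one works well enough."""
    tried = {}
    for behavior in sorted(behavior_map, key=behavior_map.get, reverse=True):
        tried[behavior] = measure(behavior)
        if tried[behavior] >= good_enough:
            return behavior, tried  # found a compensating gait
    # Nothing reached the target: fall back to the best observed
    best = max(tried, key=tried.get)
    return best, tried

# Hypothetical gaits: the fastest pre-damage gait relies on a broken leg
predicted = {"hind-leg gait": 0.9, "front-leg gait": 0.7, "crawl": 0.4}
actual = {"hind-leg gait": 0.1, "front-leg gait": 0.6, "crawl": 0.4}
gait, trials = adapt(predicted, actual.get, good_enough=0.5)
print(gait)  # 'front-leg gait': the second guess works despite the damage
```

The pre-computed map is what makes this fast: the robot tests a handful of promising candidates rather than relearning to walk from scratch.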

Researchers hope this new technique can lead to the development of more ‘autonomous’ robots. To see the robots in action, watch the video below.

Pass it on: New Scientist