technology Archives - Answers With Joe

Tag: technology

Tech Failures That Were Ahead Of Their Time


Being first doesn’t always guarantee success. Often the public isn’t ready, or the product isn’t ready, or there’s no infrastructure in place. Whatever the reason, many times the first to market are just the first to fail. Today we celebrate those successful failures.

TRANSCRIPT:

The hype around the Segway was HUGE. Some of the biggest innovators of our time, like Apple co-founder Steve Jobs, said it would be as big of a deal as the P.C.
Before it was unveiled, people started speculating that it would change the face of transportation as we know it. Was it one of those flying cars the Jetsons promised us?
No. It was just this.

You probably remember seeing the Segway scooters back in the 2000s if you lived in a bustling city and ever went to one of those snooty coffee shops. You know, the ones where they don’t allow you to use cream or sugar because they want you to “taste the essence” of their coffee.
The Segway was a two-wheeled transportation device that could self-balance and move simply by leaning in the direction the driver wanted to go.

Medical inventor Dean Kamen designed the Segway, and he was heavy on the hype train. He predicted it would be “to the car what the car was to the horse and buggy.”

Well, here’s the first problem. The Segway’s top speed was only 10 mph. Maybe if the Segway had competed against the horse and buggy, it might’ve had a fighting chance.

Each unit cost about $5,000 and walking is pretty much free, so you can probably do the math as to why no one bought the thing.

It did find some success in sectors like security and tourism, and it was one of the few funny parts of those Paul Blart: Mall Cop movies.
In fact, it seemed like the only dependable part of its legacy was all the amusing accident photos and videos it provided us.

Even President George W. Bush was caught on camera falling off of one.
Six years after its unveiling, Segway issued a massive recall of all its units. The company said it had received six reports of riders falling off the scooters and suffering head and wrist injuries while trying to slow them down.

In 2009, Kamen sold Segway, and in 2015 it was sold again to a Chinese mobility company called Ninebot. It only sold around 140,000 units before the company stopped making them in 2020.

The Segway brand, however, hasn’t disappeared. In fact, it has used the technology to create electric scooters, e-bikes and other e-vehicles. It’s even expanded its brand into portable power stations and robotics, creating products like autonomous lawnmowers and outdoor delivery vehicles.

Electric cars are everywhere these days but it wasn’t that long ago that they were so rare and so odd looking that they were considered a novelty.

For instance, actor and noted environmentalist Ed Begley Jr. was one of the first notable names to regularly drive an electric car in the 70s and 80s, and it was basically a modified golf cart.

Imagine trying to take a date in one of those things. That’s one step away from pulling up in front of their house on a bicycle, ringing the bell, and inviting them to sit up front on the handlebars.

One of the first major attempts to bring an electric car to the market happened in 1985 with the Sinclair C5.

And well, calling it a car would be a generous assessment. It was more like a Power Wheels for grownups who had places to be but didn’t care how on time they were.

Like Thomas Edison, its creator, Sir Clive Sinclair, was actually a successful inventor. He was a computer pioneer who created the first pocket calculator and the Sinclair ZX80, the first mass-market computer to sell for less than 100 pounds.

It was called an electric vehicle, but it was really a one-seat, battery-assisted pedal tricycle, and it sat so close to the ground it was practically a death trap on wheels. It had no windshield, and the body was made of polypropylene, so you were probably gonna eat a bug or two on every trip, and that’s if you were lucky.

There was no way you could maneuver it in traffic, so the only safe way to drive it was to make sure there were no other cars on the road.

It came with a top speed of fifteen miles per hour and could only go twenty miles on a full charge, and that’s if the weather was good and there were no major inclines on the road.
There was a ton of build-up leading to its unveiling and some car makers actually worried about the competition it might create. When they finally saw it, the only competition among car makers was to see who would laugh at it the loudest without sounding like they were faking it.

Even if the design failed in execution, it did have some prescient ideas. The car was only designed to go short distances without relying on gas.

So in 1992, Sinclair released another short-range vehicle called the Sinclair Zike.
It had pedals like the C5, but it was essentially a lightweight electric bicycle powered by nickel-cadmium batteries.

Seems like an early prototype for the electric scooters and e-bikes you see around towns now, the ones you can rent with a phone app like Lime or Unagi.

Speaking of e-vehicles, let’s talk about the EV1.

It first appeared at the Los Angeles Auto Show in 1990 and it looked like the kind of vehicle you expected to lead humanity into the future except…you know, no flying. Stupid lying Jetsons.

Unlike the previous two transportation failures, it looked and operated like an actual car. It could go from zero to sixty in eight seconds.
The same year, California’s Air Resources Board passed a mandate requiring the state’s seven top-selling automakers to devote a share of their sales (two percent, to be exact) to electric vehicles.

The problem is that the public really wasn’t clamoring for an electric vehicle yet, and G.M. started to go broke within the next two years. So plans to develop and expand production of the EV1 were delayed or cancelled.

However, G.M. didn’t completely scrap plans for the EV1. Instead, they relocated a group of one hundred engineers to an off-site facility to work on it.

Two years later, they produced a fleet of fifty prototypes that they tested with real drivers. The cars received positive feedback. The company’s finances started to recover, and G.M. revived its E.V. program but it still had some major issues.

It had a decent charge time of four hours, but charging took twelve to sixteen hours on one hundred and ten volts, and cold weather reduced its range. There also wasn’t an infrastructure in place so people could charge it on the go or even in their own homes. The cars could only go around one hundred miles before they required a recharge, and they required special equipment to refill the batteries.

Even though it was marketed as a commuter car, it only had two seats. So while it may have been ideal for suburban drivers, it was unusable for families. And even though it could achieve a respectable speed, the batteries made it heavier overall and harder to control.

Eventually, they were only marketed in Los Angeles, Phoenix, Tucson, San Francisco, and Sacramento. The models sold in Arizona were an utter failure because the optional battery offered as an add-on performed poorly in hot weather.

Ultimately, G.M. built five hundred EV1s and only leased around four hundred of them.
When G.M. stopped the vehicle’s production in 2003, some of the few remaining owners held a funeral for it in the Hollywood Forever Cemetery, which may just be the saddest funeral because it was led by someone on a Segway.

Fortunately, the EV1’s influence wouldn’t die with it. Even though G.M. couldn’t afford to keep developing the EV1, Tesla Motors founders Martin Eberhard and Marc Tarpenning decided to develop their own all-electric vehicle.

It followed on similar designs but soon found ways to correct some of the mistakes G.M. couldn’t fix, like the hefty car battery and how the weather affected its charging and performance.

Tesla may still have profitability problems and they made a car that looks like it was designed by a fifth grade geometry student who just learned how to use a protractor.
But a lot of the innovations and discoveries of the EV1 helped revive G.M.’s electric vehicle program and led to all-electric trucks like the Ford F-150 Lightning and the Chevy Silverado EV.

If you were a fan of video games in the 90s, then you already know about the Virtual Boy. Or maybe you’d rather forget it.

It was a major video game maker’s first attempt to bring virtual reality into home consoles, and it was an epic failure. Yes, even bigger than the Wii U. But not by much.

Nintendo was riding high on the success of the Super Nintendo, but its popularity and innovative graphics and gameplay were starting to fade by the mid 90s. The company had the Nintendo 64 in the works, a console that would bring it into a new age of immersive games with 360-degree controls, but it needed something to tide fans over. They came up with this red monster.

Nintendo unveiled the Virtual Boy in 1995, originally calling it the VR-32 and claiming it could put players in the game. It was the company’s first 32-bit console, so it could compete with the likes of the Sony PlayStation without requiring CD storage or even a T.V. to operate.
It could produce 3-D graphics that actually felt three-dimensional. Unfortunately, it could only produce games in shades of red, and the games were lackluster at best. And even though it didn’t need a T.V., you couldn’t play it on the go, and the red screen graphics caused eye strain and headaches. Long story short, it became the company’s lowest-selling console of all time. Only fourteen games were produced for the machine.

In fact, the only thing it really did well was open the door for the V.R. machines that followed to build on its mistakes.

Most notably, Meta’s Quest series features devices that make V.R. completely self-contained in a single headset, no computer required.

There are also no controllers required, just the user’s hands.
The Virtual Boy may have been an epic failure, but it provided a proof of concept for a home V.R. console. In fact, Nintendo is actually reviving the Virtual Boy for its Switch and Switch 2 consoles as part of its classics series.

Sometimes, an innovation comes around because of competition between two rivals. Then, the public decides on the best option.

In the case of the VCR, the Betamax was the clear loser. Excuse me, second place winner.
Sony wanted to enter the burgeoning home video market in the mid 70s and came up with a prototype for a Betamax machine with Matsushita (MAT-SU-SHITA), the company that eventually became Panasonic.

Little did they know that J.V.C. was working on its own Video Home System (or VHS), and when Matsushita presented them with the VHS system, Sony still decided Betamax was the better way to go. You can probably guess what happened next.

In some ways, Betamax was actually the superior machine. The tapes were smaller, and the recorders could reproduce colors better. They could also play and fast-forward quicker, if you’re the type of person who can’t sit still and watch a movie at normal speed, I guess.
What really killed the Betamax was the recording time. It could only record a program for 60 minutes, while JVC’s VHS machine could do double that. Eventually RCA would introduce a machine that could record three hours of television programming.

If you’re a sports fan, then you know that VHS eventually won because you could record games AND flagrantly violate that “express written consent” rule you hear at the end of every game. I’m told that’s funny.

Oh and I should amend something I said earlier. Third place winner.

The Sega Dreamcast wasn’t a complete disaster. It actually produced a full library of games, or at least way more than the Virtual Boy did, and came up with a number of innovations that have become standards in modern gaming.

It was really a victim of bad timing and fierce competition. It came out in the late 1990s, when systems like the Nintendo 64 and the PlayStation, followed by the extremely popular PlayStation 2, were dominating the market.

It was able to compete with those consoles thanks to extremely sharp, high-end graphics. It also came with a wide variety of games, including the beloved Sonic the Hedgehog franchise as well as exclusives like Crazy Taxi, Jet Set Radio and Skies of Arcadia.

Unfortunately, it had a hard time getting past the failure of its predecessor, the Sega Saturn, and all the add-on consoles that erased the success of the Sega Genesis.
But it introduced a lot of innovative features and designs that a lot of modern players take for granted.

For starters, it pioneered online multiplayer on consoles. Now pretty much every game has a multiplayer feature so you can play with gamers from all corners of the globe and learn all sorts of interesting curse words and slurs. The Dreamcast, however, came with a built-in modem and offered a dedicated subscription-based online gaming service.

It even allowed for crossplay with games like Quake III: Arena where Dreamcast players could virtually explode players on PCs.

It forced rival console and game companies to focus on improving graphics, thanks to the Dreamcast’s Z-buffer bandwidth, which enabled better depth of field and sharper colors and edges.

Even one of its strangest games introduced a microphone as an accessory. What was it called?

Seaman.

Not gonna even go there.

Sony may be one of the major players in the gaming market with its PlayStation consoles, but its ill-fated attempt to enter it in the early 90s produced one of the worst CD-based consoles in history.

Fortunately, it made its console, the Philips CD-I, with another company. So maybe the most innovative concept of this product failure was learning how to get another company to take the blame.

The Philips CD-I came out in 1991, at a time when the compact disc was starting to dominate the music market. It was a smart move at the time. Sony and Philips had developed the CD, so they already had a corner on the market, and the prices of CD drives were dropping as they became more prevalent in computing. Discs could hold huge amounts of data and handle more complex programs, so it made sense to produce a CD-based console.

The Philips CD-I didn’t just try to be a game console. It aimed to be a home entertainment system that could do more than play games: it could play movies, run a lot of educational CD games and experiences, play music, and give users a way to share their photos on their TVs.

Because it could do so many things and offered so many different uses, it was way more expensive. Philips also worked with Nintendo to produce a series of CD games.
Unfortunately, they looked and sounded like this.

Yes, that was a Zelda game, and no, it’s not something we recorded from your nightmares.
The Philips CD-I failed hard but Nintendo definitely learned from its mistakes when it made consoles like the GameCube and the Nintendo Wii. Their biggest lesson? Don’t make games that eventually turn into tragic Internet memes.

The Internet and television seemed destined to collide in some way. WebTV, later known as MSN TV, was the first to take the chance of combining both mediums in a set top box.
It was really just a glorified computer for your television. So it had to operate as fast as a TV on the slow, sluggish speeds of 90s internet.

But the innovative part was how it let you pick the shows you wanted to watch, even if you had to wait an entire day to learn about what’s happening on that evening’s episode of Duckman.
Here’s a demo of the service in 1996. Cofounder Bruce Leak is trying to show people how easy it is to pull up the website for a little known NBC TV show called “Friends.”

So yes, it was slow and could only show you websites, but it proved that people wanted smarter TVs. It paved the way for streaming services, for networks to develop their own online programming, and even for accessing the internet and programming on smaller handheld devices.

Apple’s product line is full of massive failures. For instance, the company may be remembered for bringing the graphical interface of the Apple Macintosh to the market, but the Mac was an epic failure in its first year.

One of its biggest failures came in 1993, when it attempted to corner the personal digital assistant market with the Newton. It was a failure on just about every level.
It had awful handwriting recognition, leading to one of the greatest Simpsons memes of all time, and I’m convinced that wasn’t a goal in the marketing plan.

It cost a ton of money for a simple PDA, and worst of all, it used a stylus. Yuck. The only good thing about a stylus is how easy it is to gag yourself with it.

When Steve Jobs returned to Apple in 1997, he killed the project, but he put the team that developed it to work on a device that didn’t require a pen.

That’s right. If it wasn’t for the failure of the Newton, we wouldn’t have…

The iPhone.

The term “Google” may have wormed its way into our vernacular as the generic term for Internet searching and researching, but they’ve also had some epic failures they’d like you to forget.
Case in point: Google Glass.

Google bet a lot of time and money on this wearable tech and it got all sorts of high profile attention. It got a 12-page story in Vogue Magazine. Models wore them during New York Fashion Week on the runways. Celebrities even started wearing them like…
Beyonce and Oprah. When the media world’s biggest power duo can’t move your product, it’s time to give it up.

Its failure had nothing to do with the star power trying to push it. The thing was filled with bugs, and it felt like a step back for filming videos in an age when even the least smart phones on the market had cameras built into them.

It showed, however, that there was a case for wearable technology, but it needed to be more focused and usable. Google Glass felt more like something you needed a degree from MIT to understand the true potential of.

Without Google Glass, we might not have wearable technology like smartwatches that use your phone to handle email and texts, or Fitbit-style devices that track your movements.

“Zoom” is another one of those brand words that’s somehow become the official noun and verb for video and web conferencing. There were a LOT of attempts to make video conferencing a mainstream concept, and they go back a lot farther than Skype, which was the word for video conferencing until…well, people realized it sucked.

The videophone actually goes back decades to, believe it or not, the 1920s when Commerce Secretary Herbert Hoover took a video call in an AT&T Bell Labs “videobooth” in New York City.
AT&T unveiled a smaller but still impressive picturephone in 1964 at the New York World’s Fair. The demo let people at the fair talk with strangers at Disneyland in Anaheim, California.

Unfortunately, people weren’t interested. What?!? A technology that we thought was revolutionary when we saw it in “Blade Runner” wasn’t something people were clamoring for back then? I know!

AT&T started commercial service with calling booths in New York City, Washington DC and Chicago, but the interest wasn’t there. The big problems were that people had to schedule their calls in advance, which makes sense, since the other person had to be at a working picturephone at the time. The calls could only last 15 minutes at most, and only three cities had access to the phones. They also cost $16 for a three-minute call. By 1968, AT&T pulled the plug on the whole thing.

Now we not only have apps like Zoom that make video conferencing easy and accessible, but it’s also become a standard feature on phones.

One of the most annoying problems with the Internet, even to this day, is having to remember long URLs or even email addresses. Sure, we have features built into our browsers and our phones, but that’s because our brains can’t seem to remember anything except useless knowledge, like the number of times we spilled a drink on someone or who played Max Headroom. It was more times than I care to remember and TV’s Matt Frewer, by the way.

One attempt to bridge brands from print to the web came in the early 2000s with a device called the CueCat.

It was, literally, a cat-shaped scanner that made you realize why it’s a good thing no one ever tried to make a mouse that looked like an actual mouse.

It was also an ambitious idea to get advertisers to share their web-based properties through print media. You could hook it up to a computer and scan a special code in an ad if you wanted to learn more about it and…yeah, it was a failure.

Gizmodo actually listed it as the worst invention of the decade, and investors lost millions, probably because it was shipped to magazine subscribers for free.

But without the idea, we wouldn’t have QR codes that do pretty much the same thing as the Cuecat but with smart devices. Also, if your phone is in one of those cases that looks like a cat, we need to talk.

These are just random things that were created by accident, either while someone was trying to create a different device or stumbled upon entirely by chance.

For instance, the heart pacemaker that we know today was created by Wilson Greatbatch.
He wasn’t trying to create this lifesaving device. He was actually just trying to create a piece of equipment to record heart rhythms.

He was looking for a resistor to complete the circuitry for his device and he accidentally pulled out the wrong one.

When he installed it, he discovered the circuit could produce intermittent electrical pulses that acted just like the ones emitted by the human heart.

He realized, by accident, that this device could drive a human heart.

It took years for surgeons to actually implant the device but more than 60 years later, it’s being used by a million people all over the world.

Play-Doh is another product created for one use that ended up becoming something else entirely, solely by accident.

Joseph McVicker had taken over a struggling company that at one time was the largest wallpaper cleaner manufacturer in the world.

The company had seen better days and he realized he needed to find a new use for his product. His sister-in-law Kay Zufall read an article about how wallpaper cleaner could be used for special modeling projects.

Kay was also a nursery school teacher and the material was nontoxic so it was completely safe for children to play with, so she decided to try some on her class of kids. They loved the stuff and she suggested the name “Play-Doh.”

Now it’s a toy that we’ve all grown up with. Now we just need something that separates the colors once kids discover that you can mix them all together.

Dr. Harry Coover is credited with inventing Super Glue and it started as a complete accident.
Coover invented his compound during World War II. He worked as part of a research team at Eastman Kodak’s chemical division. He was trying to make a clear plastic using substances known as cyanoacrylates that could be used for precision gunsights to help soldiers as part of the war effort.

He didn’t meet his goal, but he found that the substance he was experimenting with was very sticky and difficult to work with. Later he found that moisture caused the chemicals to harden and bond.
They rejected the idea for gunsights but continued their research. Six years later, he re-examined the substances while overseeing a group of chemists researching heat-resistant polymers for jet airplane canopies. That’s when he and his team realized the substance required no heat or pressure to stick things together. In fact, it created a permanent bond all on its own.

Later, he patented his discovery, and it became known as Super Glue.

Yes, even that thing in your kitchen that you use to nuke half-eaten Hot Pockets, and should probably clean, was born of failure.

Percy Spencer worked for Raytheon, a then relatively new company in Cambridge, Mass. that would go on to become one of the biggest defense contractors in the US. He was the company’s fourth employee, and he only had a fifth-grade education.
Like a lot of great inventors, he was a tinkerer and completely self-taught in subjects like trigonometry, chemistry and physics.

He was already known as an adept builder who refined inventions created by his colleagues, like the gas rectifier vacuum tubes invented by Charles Smith, which eliminated the need for batteries to operate a radio.

At the time, he was working with magnetrons, high-powered vacuum tubes used in early radar systems.

One day, while observing radar sets in operation, he realized that the candy bar in his pocket had melted. These sets emitted electromagnetic waves, so he started testing them on different types of food and learned that they actually cooked the items.

He patented the process as “A Method of Treating Foodstuffs,” and his discovery became what we now know as the modern microwave.

A Technology That Would Change The World (If It Exists)

Cold fusion burst onto the scene in 1989 with the announcement that researchers Martin Fleischmann and Stanley Pons had created energy through an electrochemical process that induced fusion at room temperature.

When nobody else was able to replicate their findings, cold fusion went down as one of the biggest science fiascoes in decades. But some believe they were on the right track, and that their method could be the key to changing the world.

How Technology Destroyed The Truth

The promise of the internet was that if we connect the world and give everyone a voice, we could move forward as one. It didn’t turn out that way.

The way we consume information has changed drastically over the years.

For the majority of modern history, newspapers were the arbiter of truth, and people read the newspaper once a day and then talked about the issues with friends, family, and coworkers.

When radio came around, the news was delivered 2-3 times a day by distinguished and trusted broadcasters like Edward R. Murrow, who delivered the news right down the middle.

Broadcast TV increased the amount of news to 3-4 times a day, but news was still just something people ingested between other forms of entertainment. And different sides of the news were presented evenly, thanks to the Fairness Doctrine.

But with cable TV and the first cable news network, CNN, all that changed. Then news became the entertainment and more cable news outlets like Fox News, MSNBC, Headline News, and CNBC split into different ideological camps.

But with the rise of social media, the news became an all-day, every-day feast, and worst of all, it removed the gatekeepers, meaning anybody with any viewpoint could get their message heard.

This was supposed to be a good thing. But it has proven to divide us even further and be exploited by troll farms and moneyed interests.

Even more upsetting, this is happening at a time when we need to be on the same page to combat various existential threats.

The post-truth era could be one of the contributors to the downfall of humanity if we’re not careful.

This Isn’t The End Of Printed Photos, It’s The Golden Age

As a society, we now produce more photographs than ever before, and the total number is becoming difficult to fathom. This year, it is estimated that billions of humans armed with smartphones will take some 1.2 trillion pictures.

Many of them will be shared on social media, but many more will simply be forgotten. A few good selfies will flash before your eyes as you swipe left or right on them, late some Friday night.

But hardly any will make the transition into the physical world, bits becoming blots of ink that coalesce into an image on a piece of paper, canvas, wood, or metal — a print.

The reasons for this are rational, and there’s no point fighting progress, but nor should we ignore the value of a print. We may no longer print every photo by default, but this can actually be a good thing for printing.

It is now about quality rather than quantity, and the pictures we choose to print deserve the best treatment.

Honestly, there has never been a better time to print than now, thanks to technological advances in both digital cameras and inkjet printers.

If you haven’t yet tried your hand at photo printing, you owe it to yourself to do so, even if you’re just a casual photographer.




Print isn’t dead — it’s better than ever

It’s a common refrain in the digital age, and not just in reference to photography. Print is dead, or at least dying, right? In truth, a certain type of print has certainly declined, but this isn’t a tragedy.

Prints used to be the only way we had to view our photos. We’d drop our film off at the drugstore and pick it up 24 hours later not because it was a better system, but because it was all we had.

We tend to romanticize the print, but when printing was the norm, many photos were still lost and forgotten (and some were found again).

Most were destined for photo albums or shoeboxes that would sit around and collect dust until moving day. If fewer were forgotten, it was because fewer were made.

Far fewer, in fact — in 2000, Kodak announced 80 billion pictures had been taken that year.

Sure, that sounds like a lot (it was a new milestone at the time), but for those who think of such large numbers as vague clouds of zeros, consider that 80 billion is still 1.12 trillion shy of 2017’s 1.2 trillion photos.

For the mathematically disinclined, let’s put it another way: Subtracting the total number of photos made in the year 2000 from those made in 2017 would have no effect on the number of shirtless mirror selfies posted by lonely men on Tinder.
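For anyone who wants to double-check that “1.12 trillion shy” figure, the arithmetic is a one-liner; here’s a quick back-of-the-envelope check using the two estimates quoted in this article:

```python
# The two photo-count estimates cited above.
photos_2000 = 80_000_000_000       # Kodak's figure for the year 2000
photos_2017 = 1_200_000_000_000    # the ~1.2 trillion estimate for 2017

gap = photos_2017 - photos_2000
print(f"{gap / 1e12:.2f} trillion")            # → 1.12 trillion
print(f"{photos_2017 / photos_2000:.0f}x growth")  # → 15x growth
```

In other words, photo-taking grew roughly fifteen-fold in seventeen years, which is why the year-2000 total barely dents the modern number.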

With so many photos being taken, it’s no wonder so relatively few are being printed. Every print costs money, after all, so of course people aren’t going to print 1.2 trillion photos.

What’s more, the point of printing (often the point of taking a photo in the first place) was to share your memory with someone else.

Now that we don’t need prints to do that, it makes sense that people are choosing not to spend money on them, especially when electronically sharing images also happens to be much more convenient.

But people still love prints. Even the “low end” of printing is alive and well as instant photography has seen a huge resurgence in recent years.

Polaroid Originals has built an entire brand around it, and Fujifilm Instax cameras and film packs made up six of the top ten best selling photography products on Amazon last holiday season.

Please like, share and tweet this article.

Pass it on: Popular Science

Meet Your Future Robot Overlords

Get Brilliant at http://www.brilliant.org/answerswithjoe/
And the first 295 to sign up for a premium account get 20% off every month!

Robots no longer live in science fiction. They’re all around us. Right now. Let’s look at the current most advanced robots and see where things might go in the future.

From their first mention in a Czech play to Elon Musk’s “alien dreadnought” automated factory, robots have been slowly becoming a huge part of our lives.

The types of robots include:
Industrial/Warehouse Robots
Service/Companion Robots
Military Robots
Exploratory Robots

Industrial robots include AMRs (autonomous mobile robots), which move products around a warehouse floor.

Service and companion robots include Asimo from Honda, Romeo and Pepper from SoftBank Robotics, and Milo, a robot for autistic kids.

Military robots are often funded by DARPA and include Atlas and SpotMini from Boston Dynamics.

Exploratory robots include NASA’s space probes and rovers, such as the Curiosity rover.

Ditching Microbeads: The Search For Sustainable Skincare

Is smoother skin worth more than having potable water or edible fish?

For years, research has shown that beauty products made with microbeads, the tiny plastic particles in gritty cleansers that scrub off dead skin cells, have been damaging water supplies, marine life and the ecological balance of the planet.

Beat the Microbead, an international campaign to ban the plastic beads, reports that marine species are unable to distinguish between food and microbeads.

According to the campaign, “over 663 different species were negatively impacted by marine debris with approximately 11% of reported cases specifically related to the ingestion of microplastics“.

To make things worse, microbeads can act like tiny sponges, absorbing several other dangerous chemicals, including pesticides and flame retardants. As they ingest microbeads, marine animals also consume these other poisons.

The obvious solution to the microbead problem is to cut it off at the source.

But while major cosmetic companies like Johnson & Johnson, Unilever, and Procter & Gamble have pledged to phase out the use of microbeads in favor of natural alternatives, they also say that the shift could take several years.

And as more research is done, it appears that microbead replacements may come with dangers of their own.

Some of the natural replacements for microbeads also have negative consequences.

Greg Boyer, chair of the chemistry department at SUNY College of Environmental Science and Forestry, says one possible downside comes from degrading sugar-based exfoliants: as microorganisms break the sugars down, they biochemically “burn” them for energy.

A variety of biodegradable ingredients are available to developers.

Victoria Fantauzzi, co-founder of Chicago-based La Bella Figura Beauty, says that her company recently released a facial cleanser that uses enzymes found in papaya and pineapple, ingredients known to effectively exfoliate skin cells.


Are Quantum Computers On The Verge Of A Breakthrough?

Get Brilliant at http://www.brilliant.org/answerswithjoe/ And the first 295 to sign up for a premium account get 20% off every month!

For years now, quantum computers have been just out of reach, but some exciting new developments over the last year indicate that the age of quantum computing is a lot closer than we think.


Check out Jason’s channel: https://www.youtube.com/channel/UCS-u…

LINKS LINKS LINKS:

D-Wave video: https://www.youtube.com/watch?v=zvfkX…

Quantum Annealing Explained: https://www.youtube.com/watch?v=UV_Rl…

Supercooled qubits: https://newatlas.com/stable-supercool…

IBM’s new Neuromorphic chip: https://www.youtube.com/watch?v=nE819…

Google Bristlecone: https://www.sciencenews.org/article/g…

Silicon based quantum chip: https://gizmodo.com/new-silicon-chip-…

Moore’s Law Is Ending – Here’s 7 Technologies That Could Bring It Back To Life

Gordon E. Moore was one of the co-founders of Intel and first proposed what came to be known as Moore’s Law, which predicted that the number of transistors on a chip would double roughly every 2 years.

For nearly 50 years, the industry kept pace with this prediction, but in recent years there’s been a slowdown. Two main reasons are heat and the quantum tunneling that occurs at atomic scales.
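The doubling claim is easy to play with. Here’s a rough sketch of what exponential doubling looks like, assuming a starting point of 2,300 transistors (the count of Intel’s 4004 from 1971) — this is an illustration of the growth curve, not a model of any real roadmap:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count under Moore's Law:
    the count doubles every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 2001, 2021):
    print(year, f"{transistors(year):,.0f}")
```

Fifty years of doubling every two years multiplies the starting count by over 30 million, which is exactly why heat and tunneling eventually become the problem.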

Some of the technologies that have been theorized to break through this barrier include:

Graphene processors. Graphene carries electricity far better than traditional silicon, but it is currently very expensive to produce.

Three Dimensional Chips. Some manufacturers are experimenting with 3-D chips that combine processing and memory in one place to improve speed.

Molecular transistors. Transistors that use a single molecule to transfer electricity.

Photon transistors. These take electrons out of the process entirely and replace them with laser beams.

Quantum computers. These long-hyped machines could perform multiple calculations at once by using the superposition of quantum particles to process information.

Protein computers. These use folding proteins to make calculations.

And finally, DNA computers. DNA is a remarkably dense data storage medium, allowing scientists to store 700 terabytes of information in a single gram. But it can also be used to build logic gates and is being tested in a processing capacity.
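To put that density figure in perspective, here’s a back-of-the-envelope sketch. The 700 TB-per-gram number comes from the Harvard work linked below; the 10 TB hard drive is just a hypothetical point of comparison:

```python
TB = 10**12                       # bytes per terabyte
dna_bytes_per_gram = 700 * TB     # ~700 TB of data per gram of DNA

# Hypothetical comparison: a 10 TB hard drive's worth of data
drive_bytes = 10 * TB
grams_needed = drive_bytes / dna_bytes_per_gram
print(f"{grams_needed * 1000:.1f} milligrams of DNA")  # 14.3 milligrams of DNA
```

An entire modern hard drive’s contents in a speck smaller than a grain of rice.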

Links:

https://phys.org/news/2016-05-graphene-based-transistor-clock-processors.html

Computerphile on the physics of computer chips

Computerphile on the end of Moore’s Law:

http://www.livescience.com/52207-faster-3d-computer-chip.html

First Functional Molecular Transistor Comes Alive

https://arstechnica.com/gadgets/2015/07/scientists-build-single-molecule-transistor-gated-with-individual-atoms/

http://news.mit.edu/2013/computing-with-light-0704

Michio Kaku on Moore’s Law

https://www.engadget.com/2016/02/26/scientists-built-a-book-sized-protein-powered-biocomputer/

Harvard cracks DNA storage, crams 700 terabytes of data into a single gram

http://computer.howstuffworks.com/dna-computer.htm

https://en.wikipedia.org/wiki/Moore%27s_law

https://www.pcper.com/news/Storage/IDF-2016-Intel-Demo-Optane-3D-XPoint-Announces-Optane-Testbed-Enterprise-Customers

How To Survive The Future – The Answers With Joe Podcast

This is the audio version of the YouTube video, so some references may be made to something you can’t see.

Automation and artificial intelligence are already causing massive disruptions to commerce and industry all over the world. Economists warn that in the next 10 years, 30% of jobs could go away due to technological advancement, an unemployment rate that would be worse than even the Great Depression’s. How does society react in the face of this kind of change, and what can we do to position ourselves for the changes to come? In this audio version of my YouTube video, I discuss what I think are the best options.


Subscribe to YouTube Channel

Subscribe to Podcast
