Answers With Joe

Tech Failures That Were Ahead Of Their Time


Being first doesn’t always guarantee success. Often the public isn’t ready, or the product isn’t ready, or there’s no infrastructure in place. Whatever the reason, the first to market are often just the first to fail. Today we celebrate those successful failures.

TRANSCRIPT:

The hype around the Segway was HUGE. Some of the biggest innovators of our time, like Apple co-founder Steve Jobs, said it would be as big a deal as the P.C.
Before it was unveiled, people started speculating that it would change the face of transportation as we know it. Was it one of those flying cars the Jetsons promised us?
No. It was just this.

You probably remember seeing the Segway scooters back in the 2000s if you lived in a bustling city and ever went to one of those snooty coffee shops. You know, the ones where they don’t allow you to use cream or sugar because they want you to “taste the essence” of their coffee.
The Segway was a two-wheeled transportation device that could self-balance and move simply by the rider leaning in the direction they wanted to go.

Medical inventor Dean Kamen designed the Segway, and he was heavy on the hype train. He predicted it would be “to the car what the car was to the horse and buggy.”

Well, here’s the first problem. The Segway’s top speed was only 10 mph. If the Segway had competed against the horse and buggy, it might’ve had a fighting chance.

Each unit cost about $5,000 and walking is pretty much free, so you can probably do the math as to why no one bought the thing.

It did find some success in sectors like security and tourism, and it was one of the few funny parts of those Paul Blart: Mall Cop movies.
In fact, it seemed like the only dependable part of its legacy was all the amusing accident photos and videos it provided us.

Even President George W. Bush was caught on camera falling off of one.
Six years after its unveiling, Segway issued a massive recall of all its units. The company said it had received six reports of riders falling off the scooters and injuring their heads and wrists when they tried to slow down.

By 2009, Kamen sold Segway, and in 2015, it was sold again to a Chinese mobility company called Ninebot. It only sold around 140,000 units before the company stopped making them in 2020.

The Segway brand, however, hasn’t disappeared. In fact, the company used the technology to create electric scooters, e-bikes and other e-vehicles. It’s even expanded the brand into portable power stations and robotics, creating products like autonomous lawnmowers and outdoor delivery vehicles.

Electric cars are everywhere these days, but it wasn’t that long ago that they were so rare and so odd-looking that they were considered a novelty.

For instance, actor and noted environmentalist Ed Begley Jr. was one of the first notable names to regularly drive an electric car in the 70s and 80s, and it was basically a modified golf cart.

Imagine trying to take a date in one of those things. That’s one step away from pulling up in front of their house on a bicycle, ringing the bell, and inviting them to sit up front on the handlebars.

One of the first major attempts to bring an electric car to the market happened in 1985 with the Sinclair C5.

And well, calling it a car would be a generous assessment. It was more like a Power Wheels for grownups who had places to be but didn’t care how on time they were.

Just like Thomas Edison, its creator Sir Clive Sinclair was actually a successful inventor. He was a computer pioneer who created one of the first pocket calculators and the Sinclair ZX80, the first mass-market computer to sell for less than 100 pounds.

It was called an electric vehicle, but it was really a one-seat, battery-powered tricycle with pedals, and it sat so close to the ground, it was practically a death trap on wheels. It had no windshield and the body was made of polypropylene, so you were probably gonna eat a bug or two on every trip, and that’s if you were lucky.

There was no way you could maneuver it in traffic, so the only safe way to drive it was to make sure there were no other cars on the road.

It came with a top speed of fifteen miles per hour and could only go twenty miles on a full charge, and that’s if the weather was good and there were no major inclines on the road.
There was a ton of build-up leading to its unveiling and some car makers actually worried about the competition it might create. When they finally saw it, the only competition among car makers was to see who would laugh at it the loudest without sounding like they were faking it.

Even if the design failed upon execution, it did have some prescient ideas. The car was only designed to go short distances without relying on gas.

So in 1992, Sinclair released another short-range vehicle called the Sinclair Zike.
Like the C5, it could be pedaled, but it was essentially a lightweight electric bike powered by nickel-cadmium batteries.

It seems like an early prototype of those electric scooters and e-bikes you see around town now, the ones you can rent with a phone app like Lime or Unagi.

Speaking of e-vehicles, let’s talk about the EV1.

It first appeared at the Los Angeles Auto Show in 1990 and it looked like the kind of vehicle you expected to lead humanity into the future except…you know, no flying. Stupid lying Jetsons.

Unlike the previous two transportation failures, it looked and operated like an actual car. It could go from zero to sixty in eight seconds.
The same year, California’s Air Resources Board passed a mandate requiring the state’s seven top-selling automakers to devote a share of their sales (two percent, to be exact) to electric vehicles.

The problem was that the public really wasn’t clamoring for an electric vehicle yet, and G.M. started to go broke within the next two years. So plans to develop and expand production of the EV1 were delayed or cancelled.

However, G.M. didn’t completely scrap plans for the EV1. Instead, they relocated a group of one hundred engineers to an off-site facility to work on it.

Two years later, they produced a fleet of fifty prototypes that they tested with real drivers. The cars received positive feedback. The company’s finances started to recover, and G.M. revived its E.V. program but it still had some major issues.

Charging took a decent four hours on a dedicated charger, but twelve to sixteen hours on a standard one-hundred-and-ten-volt outlet, and cold weather reduced its range. There also wasn’t an infrastructure in place so people could charge it on the go or even at their own homes. The cars could only go around one hundred miles before they required a recharge, and they required special equipment to recharge the batteries.

Even though it was marketed as a commuter car, it only had two seats. So while it may have been ideal for suburban drivers, it was unusable for families. And even though it could achieve a respectable speed, the batteries made the car heavier overall and harder to control.

Eventually, they were only marketed in Los Angeles, Phoenix, Tucson, San Francisco, and Sacramento. The models sold in Arizona were an utter failure because the optional battery offered as an add-on performed poorly in hot weather.

Ultimately, G.M. built five hundred EV1s and only leased around four hundred of them.
When G.M. stopped the vehicle’s production in 2003, some of the few remaining owners held a funeral for it in the Hollywood Forever Cemetery, which may just be the saddest funeral because it was led by someone on a Segway.

Fortunately, the EV1’s influence wouldn’t die with it. Even though G.M. couldn’t afford to keep developing the EV1, Tesla Motors founders Martin Eberhard and Marc Tarpenning decided to develop their own all-electric vehicle.

It followed similar designs but soon found ways to correct some of the mistakes G.M. couldn’t fix, like the hefty battery pack and how the weather affected charging and performance.

Tesla may still have profitability problems and they made a car that looks like it was designed by a fifth grade geometry student who just learned how to use a protractor.
But a lot of the innovations and discoveries of the EV1 helped revive G.M.’s electric vehicle program and led to all-electric Ford F-150s and Chevy Suburbans.

If you were a fan of video games in the 90s, then you already know about the Virtual Boy. Or maybe you’d rather forget it.

It was a major video game maker’s first attempt to bring virtual reality into home consoles, and it was an epic failure. Yes, even bigger than the Wii U. But not by much.

Nintendo was riding high on the success of the Super Nintendo, but its popularity and innovative graphics and gameplay were starting to fade by the mid 90s. The company had the Nintendo 64 in the works, a console that would bring it into a new age of immersive games with 360-degree controls, but it needed something to tide fans over. They came up with this red monster.

Nintendo unveiled the Virtual Boy in 1995, originally calling it the VR-32, claiming it could put players in the game. It was the company’s first 32-bit console so it could compete with the likes of the Sony PlayStation without requiring CD storage or even a T.V. to operate it.
It could display 3-D graphics that actually felt three-dimensional. Unfortunately, it could only produce games in shades of red, and the games were lackluster at best. Even though it didn’t need a T.V., you couldn’t play it on the go, and the red screen graphics caused eye strain and headaches. Long story short, it became the company’s lowest-selling console of all time. Only fourteen games were produced for the machine.

In fact, the only thing it really did well was open the door for the V.R. machines that followed it and built on its mistakes.

Most notably, Meta’s Quest series features devices that made V.R. completely self-contained in a single headset, no computer required.

There are also no controllers required, just the user’s hands.
The Virtual Boy may have been an epic failure but it provided a proof of concept for a home V.R. console. In fact, Nintendo is actually reviving the Virtual Boy for its Switch and Switch 2 consoles as part of its classics series.

Sometimes, an innovation comes around because of competition between two rivals. Then, the public decides on the best option.

In the case of the VCR, the Betamax was the clear loser. Excuse me, second place winner.
Sony wanted to enter the burgeoning home video market in the mid 70s and came up with a prototype for a Betamax machine with Matsushita, the company that eventually became Panasonic.

Little did they know that J.V.C. was working on its own Video Home System (or VHS), and when Matsushita presented them with the VHS system, Sony still decided Betamax was the better way to go. You can probably guess what happened next.

In some ways, Betamax was actually the superior machine. The tapes were smaller, and the recorders could reproduce colors better. They could also play and fast-forward quicker, if you’re the type of person who can’t sit still and watch a movie at the normal speed, I guess.
What really killed the Betamax was the recording time. It could only record a program for 60 minutes, while JVC’s VHS machine could do double that. Eventually RCA would introduce a machine that could record three hours of television programming.

If you’re a sports fan, then you know that VHS eventually won because you could record games AND flagrantly violate that “express written consent” rule you hear at the end of every game. I’m told that’s funny.

Oh and I should amend something I said earlier. Third place winner.

The Sega Dreamcast wasn’t a complete disaster. It actually produced a full library of games, or at least way more than the Virtual Boy did, and came up with a number of innovations that have become standards in modern gaming.

Its failure was really a product of bad timing and fierce competition. It came out in the late 1990s, when systems like the Nintendo 64 and the PlayStation, followed by the extremely popular PlayStation 2, were dominating gamers’ choices.

It was able to compete with those consoles thanks to extremely sharp, high-quality graphics. It also came with a wide variety of games that included the beloved Sonic the Hedgehog franchise as well as some exclusives like Crazy Taxi, Jet Set Radio and Skies of Arcadia.

Unfortunately, it had a hard time getting over the failure of its predecessor, the Sega Saturn, and all the add-on consoles that erased the success of the Sega Genesis.
But it introduced a lot of innovative features and designs that a lot of modern players take for granted.

For starters, it popularized online multiplayer gaming on consoles. Now, pretty much every game has multiplayer features so you can play with gamers from all corners of the globe and learn all sorts of interesting curse words and slurs. The Dreamcast, however, came with a built-in modem and offered a dedicated subscription-based online gaming service.

It even allowed for crossplay with games like Quake III Arena, where Dreamcast players could virtually explode players on PCs.

It also forced rival console and game companies to focus on improving graphics, since the Dreamcast’s Z-buffer bandwidth allowed for better depth handling and sharper colors and edges.
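If you’re curious what a Z-buffer actually does, here’s a minimal sketch of the idea in Python. It’s a generic illustration of depth testing, not Sega’s actual rendering code: before a pixel is drawn, its depth gets compared to the nearest depth already recorded at that spot, so closer surfaces win no matter what order the scene is drawn in.

```python
# Minimal z-buffer sketch: a pixel is drawn only if it's closer than
# whatever has already been drawn at that screen position.
# Generic illustration, not the Dreamcast's actual rendering code.

WIDTH, HEIGHT = 8, 8

# Every pixel starts "infinitely far away" with a blank color.
z_buffer = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
color_buffer = [["."] * WIDTH for _ in range(HEIGHT)]

def draw_pixel(x, y, depth, color):
    """Keep this fragment only if it's nearer than the stored depth."""
    if depth < z_buffer[y][x]:
        z_buffer[y][x] = depth
        color_buffer[y][x] = color

# Draw a far "wall" (depth 10) first, then a near "sprite" (depth 2)
# over part of it. The depth test keeps the sprite in front even if
# they had arrived in the opposite order.
for y in range(HEIGHT):
    for x in range(WIDTH):
        draw_pixel(x, y, 10.0, "W")
for y in range(2, 6):
    for x in range(2, 6):
        draw_pixel(x, y, 2.0, "S")

for row in color_buffer:
    print(" ".join(row))
```

More bandwidth for that buffer means more of these depth tests per second, which is the advantage the Dreamcast was leaning on.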

Even one of its strangest games introduced a microphone as an accessory. What was it called?

Seaman.

Not gonna even go there.

Sony may be one of the major players in the gaming market with its PlayStation consoles, but its ill-fated attempt to enter the market in the early 90s became one of the worst CD-based consoles in history.

Fortunately, it made its console, the Philips CD-i, with another company. So maybe the most innovative concept of this product failure is learning how to get another company to take the blame.

The Philips CD-i came out in 1991, at a time when the compact disc was starting to dominate the music market. It was a smart move at the time. Sony and Philips developed the CD so they already had a corner on the market, and the prices of CD drives were dropping as they became more prevalent in computing. Discs could hold huge amounts of data and handle more complex programs, so it made sense to produce a CD-based console.

The Philips CD-i didn’t just try to be a game console. It aimed to work as a home entertainment system that could do more than just play games. It could play movies and music, offered a lot of educational CD games and experiences, and gave users a way to share their photos on their TVs.

Because it could do so many things and offered so many different uses, it was way more expensive. Philips also worked with Nintendo to produce a series of CD games.
Unfortunately, they looked and sounded like this.

Yes, that was a Zelda game, and no, that’s not something we recorded from your nightmares.
The Philips CD-i failed hard, but Nintendo definitely learned from its mistakes when it made consoles like the GameCube and the Nintendo Wii. Their biggest lesson? Don’t make games that eventually turn into tragic Internet memes.

The Internet and television seemed destined to collide in some way. WebTV, later known as MSN TV, was the first to take a chance on combining the two mediums in a set-top box.
It was really just a glorified computer for your television. So it had to operate as fast as a TV on the slow, sluggish speeds of 90s internet.

But the innovative part was how it let you pick the shows you wanted to watch, even if you had to wait an entire day to learn about what’s happening on that evening’s episode of Duckman.
Here’s a demo of the service in 1996. Cofounder Bruce Leak is trying to show people how easy it is to pull up the website for a little-known NBC TV show called “Friends.”

So yes, it was slow and could only show you websites, but it proved that people wanted smarter TVs. It paved the way for streaming services, for networks to develop their own online programming, and even for accessing the internet on smaller handheld devices.

Apple’s product line is full of massive failures. For instance, the company may be remembered for bringing the graphical interface to the market with the Apple Macintosh, but even the Mac was an epic failure in its first year.

One of its biggest failures came in 1993 when it attempted to corner the personal digital assistant market with the Newton. It was a failure on just about every level.
It had awful handwriting recognition, leading to one of the greatest Simpsons memes of all time, and I’m convinced that wasn’t a goal in the marketing plan.

It cost a ton of money for a simple PDA and worst of all, it used a stylus. Yuck. The only good thing about a stylus is how easy it is to gag yourself with it.

When Steve Jobs returned to Apple in 1997, he killed the project, but he used the team that developed it to work on a tablet that didn’t require a pen.

That’s right. If it wasn’t for the failure of the Newton, we wouldn’t have…

The iPhone.

The term “Google” may have wormed its way into our vernacular as the generic term for Internet searching and researching, but they’ve also had some epic failures they’d like you to forget.
Case in point: Google Glass.

Google bet a lot of time and money on this wearable tech and it got all sorts of high profile attention. It got a 12-page story in Vogue Magazine. Models wore them during New York Fashion Week on the runways. Celebrities even started wearing them like…
Beyonce and Oprah. When the media world’s biggest power duo can’t move your product, it’s time to give it up.

Its failure had nothing to do with the star power trying to push it. The thing was filled with bugs, and it felt like a step back for filming videos in an age when even the least smart phones on the market had cameras built into them.

It showed, however, that there was a case for building wearable technology, but it needed to be more focused and usable. Google Glass felt more like something you needed a degree from MIT to fully understand the potential of.

Without Google Glass, we wouldn’t have wearable technology like smartwatches that pair with our phones to handle email and texts, or even Fitbit-style devices that track our movements.

“Zoom” is another one of those brand words that’s somehow become the official noun and verb for video and web conferencing. There were a LOT of attempts to make video conferencing a mainstream concept, and they go back a lot farther than Skype, which was the word for video conferencing until…well, people realized it sucked.

The videophone actually goes back decades to, believe it or not, the 1920s, when Commerce Secretary Herbert Hoover took a video call in an AT&T Bell Labs “videobooth” in New York City.
AT&T unveiled a smaller but still impressive Picturephone in 1964 at the New York World’s Fair. The demo let people at the fair talk with strangers at Disneyland in Anaheim, California.

Unfortunately, people weren’t interested. What?!? A technology that we thought was revolutionary when we saw it in “Blade Runner” wasn’t something people were clamoring for back then? I know!

AT&T started commercial service with calling booths in New York City, Washington DC and Chicago, but the interest wasn’t there. The big problems were that people had to schedule their calls in advance, which makes sense since the other person had to be at a working Picturephone at the time. The calls could only last 15 minutes at most, and only three cities had access to the phones. Calls also cost $16 for three minutes. By 1968, AT&T pulled the plug on the whole thing.

Now we not only have apps like Zoom that make video conferencing easy and accessible, but video calling has also become a standard feature on phones.

One of the most annoying problems with the Internet, even to this day, is having to remember long URLs or email addresses. Sure, we have features built into our browsers and our phones, but that’s because our brains can’t seem to remember anything except useless knowledge like the number of times we spilled a drink on someone or who played Max Headroom. It was more times than I care to remember and TV’s Matt Frewer, by the way.

One attempt to bridge brands from print media to the web came in the early 2000s with a device called the CueCat.

It was literally a cat-shaped scanner that made you realize why it’s a good thing no one ever tried to make a mouse that looked like an actual mouse.

It was also an ambitious idea to get advertisers to share their web-based properties through print media. You could hook it up to a computer and scan a special code in an ad if you wanted to learn more about it and…yeah, it was a failure.

Gizmodo actually listed it as the worst invention of the decade, and investors lost millions, probably because it was shipped to magazine subscribers for free.

But without the idea, we wouldn’t have QR codes, which do pretty much the same thing as the CueCat but with smart devices. Also, if your phone is in one of those cases that looks like a cat, we need to talk.
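And just to show how far we’ve come, here’s a minimal sketch of generating one of those codes yourself. It assumes the third-party qrcode Python package (and Pillow) is installed, and the URL is just a placeholder:

```python
# Minimal sketch: encode a URL as a QR code image.
# Assumes the third-party "qrcode" package and Pillow are installed:
#   pip install qrcode pillow
import qrcode

# Any phone camera can decode this back into the URL, which is
# essentially the CueCat's print-to-web idea without the cat.
img = qrcode.make("https://example.com/magazine-ad")
img.save("ad_code.png")
```

Three lines of code doing what the CueCat needed a custom scanner, driver software and a dial-up connection to pull off.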

These are just random things that were created by accident, either while someone was trying to invent a different device or through sheer happenstance.

For instance, the heart pacemaker that we know today was created by Wilson Greatbatch.
He wasn’t trying to create this lifesaving device. He was actually just trying to create a piece of equipment to record heart rhythms.

He was looking for a resistor to complete the circuitry for his device and he accidentally pulled out the wrong one.

When he installed it, he discovered the circuit could produce intermittent electrical pulses that acted just like the ones emitted by the human heart.

He realized, by accident, that this device could drive a human heart.
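To get a rough sense of what that wrong resistor produced, here’s a sketch of a pacemaker-style pulse train in Python: brief pulses repeating about once per second, in the same ballpark as a resting heart rate. The timing values here are illustrative, not Greatbatch’s actual circuit parameters.

```python
# Illustrative sketch of a pacemaker-style signal: short intermittent
# pulses at roughly one per second (~60 beats per minute).
# These numbers are illustrative, not Greatbatch's circuit values.

PULSE_INTERVAL_S = 1.0    # one pulse per second, about 60 bpm
PULSE_WIDTH_S = 0.002     # each stimulus lasts about 2 milliseconds
SAMPLE_RATE_HZ = 1000     # one sample per millisecond

def pulse_train(duration_s):
    """Yield (time, amplitude) samples of the intermittent pulse signal."""
    for i in range(int(duration_s * SAMPLE_RATE_HZ)):
        t = i / SAMPLE_RATE_HZ
        # Amplitude is 1 only inside the brief pulse window.
        yield t, 1 if (t % PULSE_INTERVAL_S) < PULSE_WIDTH_S else 0

# Count rising edges over 5 seconds of signal: we expect 5 "beats."
beats, prev = 0, 0
for _, amp in pulse_train(5.0):
    if amp == 1 and prev == 0:
        beats += 1
    prev = amp
print(f"pulses in 5 seconds: {beats}")
```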

It took years for surgeons to actually implant the device but more than 60 years later, it’s being used by a million people all over the world.

Play-Doh is another product that was created for one use but ended up becoming something else entirely, solely by accident.

Joseph McVicker had taken over a struggling company that at one time was the largest wallpaper cleaner manufacturer in the world.

The company had seen better days and he realized he needed to find a new use for his product. His sister-in-law Kay Zufall read an article about how wallpaper cleaner could be used for special modeling projects.

Kay was also a nursery school teacher, and since the material was nontoxic and completely safe for children to play with, she decided to try some with her class. The kids loved the stuff, and she suggested the name “Play-Doh.”

Now it’s a toy that we’ve all grown up with. We just need something that separates the colors once kids discover that you can mix them all together.

Dr. Harry Coover is credited with inventing Super Glue, and it started as a complete accident.
Coover invented his compound during World War II. He worked as part of a research team at Eastman Kodak’s chemical division. He was trying to make a clear plastic using substances known as cyanoacrylates that could be used for precision gunsights to help soldiers as part of the war effort.

He didn’t meet his goal, but he found the substance he was experimenting with was very sticky and difficult to work with. Later he found that moisture causes the chemicals to harden and bond.
The team rejected the idea for gunsights but continued their research. Six years later, Coover re-examined the substances while overseeing a group of chemists researching heat-resistant polymers for jet airplane canopies. That’s when he and his team realized the substances required no heat or pressure to stick together. In fact, they created a permanent bond all on their own.

Later, he patented his discovery, and it became known as Super Glue.

Yes, even that thing in your kitchen that you use to nuke a half-eaten Hot Pocket and should probably clean was born of failure.

Percy Spencer worked for a relatively new company in Cambridge, Mass. called Raytheon, which would go on to become one of the biggest defense contractors in the US. At the time, he was the company’s fourth employee, and he only had a fifth grade education.
Like a lot of great inventors, he was a tinkerer and completely self-taught in subjects like trigonometry, chemistry and physics.

He was already known as an adept creator who could build inventions dreamed up by his fellow employees, like the gas rectifier vacuum tubes invented by Charles Smith, which eliminated the need for batteries to operate a radio.

At the time, he was working with magnetrons, high-powered vacuum tubes used in early radar systems.

One day, while observing radar sets in operation, he realized that the candy bar in his pocket had melted. The sets emitted electromagnetic waves, so he started testing them on different types of food and learned that they actually cooked the items.

He patented the process as “A Method of Treating Foodstuffs,” and his discovery became what we now know as the modern microwave.
