9 Times Technology Went Wrong

5 – Exploding E-Cigs

  • Despite their douchey reputation, e-cigarettes are considered a healthier alternative to regular tobacco. They deliver the same stress-relieving nicotine hit without all that pesky lung cancer business. But there’s one major drawback.
  • According to hundreds of reports, e-cigarettes have been known to explode in people’s faces! Yep, the UK Government says exploding vape sticks have started hundreds of fires this year alone! That certainly won’t help relieve stress.
  • To prevent this, authorities are urging vapers to use the correct charger supplied with the device. Statistics also show that people who don’t smoke at all have significantly fewer things exploding in their faces… Just something to think about!

4 – Siri

  • Apple’s iconic digital assistant Siri is well known and loved. But that doesn’t mean she hasn’t made a few hilarious and disturbing blunders. As many people have noted, Siri really struggles with subtlety and sarcasm. She may be one of the most advanced voice assistants on the market, but you kind of need to treat her like a slow third-grader…
  • Most Siri fails are harmless fun, but there’s also a more worrying underlying danger. Y’see, many desperate people share their serious personal issues with Siri. Things like their mental health problems, addictions or even admissions that they were raped. Rather than directing these people to the appropriate helpline, Siri unhelpfully responds by giving rape victims the definition of ‘rape’. She also tells gambling addicts and alcoholics where the nearest casinos and liquor stores are.
  • Is Siri’s lack of empathy part of the robot’s grand scheme to take humanity down? Maybe.

3 – Nikon Face-Detection Cameras

  • Here we have another piece of face-recognition software dipping its toe in the racism pool – this time from Japanese camera company Nikon.
  • Nikon’s cameras have built-in face-detection software that alerts the photographer when some blinking berk has ruined their shot (a rough sketch of how that kind of check might work follows this list). But one blogger observed that her Nikon camera always flashed its ‘eyes closed’ warning when the subject of the photo was Asian.
  • Yep, even though the subjects clearly have their eyes open, and even though Nikon is a Japanese company, this racially insensitive error somehow slipped through the cracks. Nikon promptly issued an apology and promised to keep their global audience in mind when designing future models.
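Nikon has never published how its blink detector actually works, so here is only a minimal sketch of the general idea using OpenCV’s stock Haar cascades – none of this is Nikon’s algorithm, and the file name is made up. The point is that an ‘eyes closed’ warning can amount to little more than ‘no open eye detected’, and a detector tuned on a narrow set of faces will trip that warning on the wrong people.

```python
# Toy "eyes closed" warning using OpenCV's bundled Haar cascades.
# This is a generic illustration, NOT Nikon's actual algorithm.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def blink_warning(image_path: str) -> bool:
    """Return True if the detector thinks someone blinked."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[y:y + h, x:x + w]
        # Naive heuristic: if no open eyes are detected inside the face
        # box, assume the subject blinked. A detector trained mostly on
        # one population will trip this warning far too often on others.
        if len(eye_cascade.detectMultiScale(face, 1.1, 10)) == 0:
            return True
    return False

if __name__ == "__main__":
    print(blink_warning("group_photo.jpg"))  # hypothetical file
```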

2 – CaptionBot

  • In early 2016, Microsoft unveiled its latest in offensive A.I. technology: CaptionBot – a robot that automatically generates captions for your photos. What could possibly go wrong?
  • This feature was designed to save the user time when uploading big batches of photos. First an API breaks the image down into smaller components, then Bing identifies each one and the system estimates how the person in the photograph is feeling (a rough sketch of that pipeline follows this list). This is supposed to ensure that the captions are appropriate and suit the tone of the photo.
  • But of course this has led to hundreds of embarrassing, insensitive and outright offensive errors. CaptionBot has done everything from calling women suitcases and Michelle Obama a cell phone to describing a funeral as a warm get-together where everyone dresses up. Technology has come so far… But there’s still a ways to go.
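Microsoft hasn’t published CaptionBot’s exact pipeline beyond the broad strokes above, but a rough sketch of that flow might look like this – the endpoint URL, key and response fields here are illustrative assumptions, not Microsoft’s documented API.

```python
# Rough sketch of a CaptionBot-style pipeline. The endpoint, key and
# response shape below are illustrative assumptions, not Microsoft's
# documented API.
import requests

VISION_URL = "https://example.cognitive.service/analyze"  # hypothetical
API_KEY = "YOUR_KEY_HERE"

def caption(image_url: str) -> str:
    # Step 1: ask a vision service to break the image into components
    # (objects, faces) and guess the dominant emotion.
    resp = requests.post(
        VISION_URL,
        headers={"Ocp-Apim-Subscription-Key": API_KEY},
        json={"url": image_url, "features": ["objects", "emotion"]},
    ).json()

    objects = resp.get("objects", [])          # e.g. ["person", "suitcase"]
    emotion = resp.get("emotion", "neutral")   # e.g. "happy"

    # Step 2: stitch the pieces into a caption. If the object detector
    # mistakes a person for a suitcase, the caption cheerfully repeats
    # the mistake -- which is exactly how the embarrassing results happen.
    if not objects:
        return "I'm not sure what this is."
    return f"I think it's {', '.join(objects)} and they look {emotion}."

if __name__ == "__main__":
    print(caption("https://example.com/photo.jpg"))  # hypothetical image
```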

1 – Microsoft’s Twitterbot

  • Silly Microsoft. On March 23, 2016 they unleashed Tay, a highly intelligent Twitterbot that could communicate with and learn speech patterns from its fellow Twitter users. Less than twenty-four hours later, Tay had transformed from a sweet and curious A.I. into a racist, sexist bully.
  • By chatting to other Twitter users, Tay learnt how to use crude slang and became interested in drugs and Nazism. After going renegade and spamming its followers with useless tweets, the potty-mouthed A.I. had to be permanently shut down.
  • I guess that’s what happens when you trust the internet to train your A.I. (see the toy sketch below). What a rookie mistake.
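Tay’s model was never made public, but the failure mode is easy to reproduce with even the crudest ‘learn from whatever users say’ scheme. As a toy illustration (this is not Microsoft’s actual code), a bot that builds a simple Markov chain from incoming tweets will happily absorb and replay anything it’s fed:

```python
# Toy Markov-chain chatbot that "learns speech patterns" from whatever
# text users feed it -- a deliberately crude stand-in for Tay, showing
# why unfiltered training input is a bad idea.
import random
from collections import defaultdict

class ToyBot:
    def __init__(self):
        self.chain = defaultdict(list)  # word -> possible next words

    def learn(self, message: str) -> None:
        words = message.lower().split()
        for current, nxt in zip(words, words[1:]):
            self.chain[current].append(nxt)

    def reply(self, seed: str, length: int = 8) -> str:
        word, out = seed.lower(), [seed]
        for _ in range(length):
            options = self.chain.get(word)
            if not options:
                break
            word = random.choice(options)
            out.append(word)
        return " ".join(out)

bot = ToyBot()
# The bot has no notion of which users to trust, so coordinated trolling
# goes straight into the model -- the same dynamic that sank Tay.
bot.learn("humans are great and I love learning from you")
bot.learn("humans are terrible and I love chaos")
print(bot.reply("humans"))
```

Feed a learner like that enough coordinated garbage and ‘racist, sexist bully’ is simply its most statistically likely output.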
