Our “AI Ghosts” Could Have A Stake In The Future

Back in 2016, James Vlahos spent many months building the ambitious piece of software he ended up calling Dadbot. It was a heartfelt attempt to preserve the memories of his then-dying father by assimilating them into a chatbot that he and his family could converse with in their grief, and in the years to come. A journalist by trade, James wrote the full story of this journey in WIRED, and if you don’t shed a tear while reading it you are *officially* dead inside.

James now helps others memorialize themselves and their loved ones in a similar fashion through his business Hereafter, part of a wider tech-driven trend toward preserving personal legacy (see Vita and Storyworth).

However, beyond these understandable reasons for recording and encoding human lives, the WSJ reported last week that there is a parallel movement to use AI and digital technologies to somehow prolong the lives of the subjects in question: “…not only as static replicas for the benefit of their loved ones but as evolving digital entities that may steer companies or influence world events.”

No, I’m not making this up.

Continue reading

Do Our AI Assistants Need To Be Warm And Fuzzy?

Open the tech news on any given day and you’re almost guaranteed to find something about conversational AI or Natural Language Processing (NLP). This is the tech that powers chatbots, virtual assistants, and the like as they mimic human interaction. As this blog has noted, complex language models have come on in leaps and bounds recently, and our future as users is becoming clear: we’ll be holding (reasonably) natural conversations with non-human bots on a regular basis, and for a variety of reasons.

The shadows on the cave wall — if not yet the fully realized Platonic form of conversational AI — can already be made out. Want banking tips? Ask Erica. Legal advice? There are bots like April. Want to engage your students? Juji thinks it can help.

Continue reading

Klara and the Sun: Love, Loyalty & Obsolescence

If you’re of a certain generation, you might remember the Tamagotchi: the Japanese pocket-sized “pet simulation game” that became the chief obsession of ’90s kids bored of yo-yos and other fleeting trends. The Tamagotchi lived mostly in the grubby hands or lint-filled pockets of its owners but, for social currency, could be paraded before envious or competitive enthusiasts.

Oddly, these oviparous virtual critters weren’t remotely animal-like in their appearance, and could be intolerably demanding at times. Neglect to feed them, clean up after them, or tend to them when sick and, as many of us found out, very soon you’d be left with nothing but a dead LCD blob. But even the best cared-for Tamagotchis had certain obsolescence looming in their futures, once their needlessly complex lifecycle was complete: egg, baby, child, teen, adult, death.

Continue reading

From tapping to talking: 3 bumps in the road

Voice-controlled technologies have made steady progress lately, and the adaptability and applicability of voice as an interface is beginning to surprise us all. At the Startup Grind Festival last week, a handful of seasoned “voice entrepreneurs” described to eager newbies how sports fans are already calling on virtual assistants to read out their team’s results, and how we’ll all soon be using conversational AI to select our clothes as part of the regular morning routine. And that’s just in the home. There’s also a lot of chatter about how voice control could take on some of the heavy lifting in the workplace.

The message is clear: voice is here to stay. We’re tired of scrolling, sorting and reviewing. We’re ready for an army of intelligent servants to do our bidding.

Continue reading

Woe is me: a cautionary tale of two chatbots

This article by Fiona J McEvoy (YouTheData.com) was originally posted on All Turtles.

The BBC’s recent test of two popular emotional support chatbots was devastating. Designed to offer advice to stressed, grieving, or otherwise vulnerable children and young adults, the Wysa and Woebot apps failed to detect some pretty explicit indicators of child sexual abuse, drug-taking, and eating disorders. Neither chatbot instructed the (thankfully imaginary) victim to seek help, and both instead offered up wildly inappropriate pablum.

Inappropriate responses ranged from advising a 12-year-old who reported being forced to have sex to “keep swimming” (accompanied by an animation of a whale), to telling another user “it’s nice to know more about you and what makes you happy” when they admitted they were looking forward to “throwing up” in the context of an eating disorder.

Continue reading