A Birth, a Bereavement, and the Essence of Snoop Dogg

Tech needs better definitions

DALL-E at its absolute finest

As the curtain fell on 2024, I had a baby. It was Friday the 13th, a date often associated with bad luck thanks to European superstitions stretching back a couple of hundred years. There is even a specific term to describe the “fear of Friday the 13th”: paraskevidekatriaphobia. Although urban legend actually has it that the shrink who coined it declared that anyone who learned to pronounce it would be cured, which suggests a sense of humor (and reminds us how extra humans can be…).

And while my husband and I likewise noted the date as a humorous quirk, bad luck would, in fact, catch up with us. Just five weeks later my Dad died. Baz Luhrmann was onto something when he said the real troubles in life blindside you on an idle Tuesday afternoon.

Though he’d been ill for a while, my father’s passing wasn’t expected, and I found myself in the strange and bewildering position of both gaining and losing a fundamental life force in such quick succession that it felt simultaneous. In the paperwork and aftermath, I occasionally confused my father’s DOD for our baby’s DOB. Those who have had similar experiences may relate to the feeling of being in some strange existential continuum, whereby it seems impossible that there wasn’t some kind of transfer. Like one Doctor Who fluidly regenerating into the next.

Continue reading

Missives from Cannes: Three Observations on Gen AI Application

Enthusiasm abounded at the World AI Cannes Festival

La Croisette de Cannes dans les années 1930

By now, we’re well-accustomed to waves of tech-based fervor. You don’t really have to touch the industry to have become a cynic. Perhaps you remember “peak blockchain” in 2019, when the technology was integrated into a toothbrush for reasons that no one will ever begin to understand?

And of course, we’ve endured fanfare and furor over the metaverse, XR, Big Data, crypto, Web3, NFTs, quantum computing and, most notably, we’ve seen AI grow legitimate roots as the defining technology of its era (while its name continues to be frequently used and abused by sneaky bandwagon jumpers…).  

On those firm foundations, generative AI is the term at the center of the current hype cycle. In Cannes last week a reported 16,000 attendees swarmed on the famed Palais des Festivals et des Congrès to learn about it, talk about it, and – for many – showcase the tentative steps they’ve taken towards real-world application.

As with previous seasons, there was an urgency (perhaps even a whiff of desperation…?) in the air as companies from industries as diverse as hospitality, finance, entertainment, marketing, and pharma joined this latest gold rush as hopefuls. And, unsurprisingly, both snake oil and substance could be found.

I was lucky enough to host the festival’s Applications Stage, and here are three broad-brush observations I made:

Continue reading

ChatGPT: A Cautionary Tale (With Some Positive Takeaways)

I haven’t posted in a while. In truth, there hasn’t been a lot that’s piqued my interest, and there are now elaborate global mechanisms and a squadron of eager commentators prepped and ready to address the issues I used to point at on this humble blog. In November, I could’ve written something predictable about the impact of ChatGPT, but I felt like I’d already played that tune back in 2020 when I attempted to summarize the intelligent thoughts of some philosophers.

ChatGPT. GPT-3. Potato. Potato.

The most interesting aspects of this kind of AI are yet to come, I don’t doubt that. But I am here to share a cautionary tale that syncs nicely with my ramblings over the last 5 (5??) years. It’s a story about reliance and truth. About the quest for knowledge, and how it almost always involves some level of fumbling around in the dark, but never more so than now.

The Uncanny Valley and the Meaning of Irony

There has been a lot of discussion about how human is too human when it comes to robots, bots, and other types of disembodied AI voices. An interest in this topic led to a frustrating Google search which led me to…you guessed it…ChatGPT.

What did we ever do without it? I’m starting to forget.

Continue reading

Insidious “corrective” image filters allow app creators to dictate beauty standards

Portrait thought to be of Simonetta Cattaneo Vespucci by Sandro Botticelli, c.1480-1485.

In the 15th century, Florentine statesman and all-round bigwig Lorenzo de’ Medici (also modestly known as “Lorenzo The Magnificent”) made some pretty outspoken comments on the looks and deportment of the ideal Italian Renaissance beauty. Despite himself being described as “quite strikingly ugly”, Lorenzo was rather specific on what should be considered desirable, basing his high standards on celebrated noblewoman Simonetta Cattaneo Vespucci. He writes:

“…of an attractive and ideal height; the tone of her skin, white but not pale, fresh but not glowing; her demeanor was grave but not proud, sweet and pleasing, without frivolity or fear. Her eyes were lively and her gaze restrained, without trace of pride or meanness; her body was so well proportioned, that among other women she appeared dignified…in walking and dancing…and in all her movements she was elegant and attractive; her hands were the most beautiful that Nature could create. She dressed in those fashions which suited a noble and gentle lady…” (Commento del magnifico Lorenzo de’ Medici sopra alcuni de’ suoi sonetti)

Clearly beauty standards have evolved since Lorenzo’s time — and thankfully we’re probably less concerned about the restraint of our gaze and the beauty of our hands — but this notion of one common beauty ideal for women, dictated from without, unfortunately persists. And while Renaissance women agonized about achieving Simonetta’s bodily proportions and alabaster skin, their 21st century counterparts are turning to technological, and even surgical correction to emulate the new, algorithmically dictated standards for attention-worthy good looks.

Continue reading

Will Google’s Controversial LaMDA Help or Hinder Internet Discovery?

In his online Masterclass on the art of writing, renowned journalist Malcolm Gladwell explains the shortcomings of Google when it comes to research and discovery. “The very thing that makes you love Google is why Google is not that useful,” he chirps. To Gladwell, a Google search is but a dead end when a true researcher wants to be led “somewhere new and unexpected”.

In juxtaposition to Google’s search engine stands ye olde library, which Gladwell calls the “physical version of the internet” (sans some of the more sophisticated smut…). In a library — should it be required — guidance is on-hand in the form of a librarian, and unlike the internet there is a delightful order to things that the writer likens to a good conversation. Discovery can be as simple as finding what books surround the book that inspired you…and following the trail. Gladwell elucidates: “The book that’s right next to the book is the book that’s most like it, and then the book that’s right next to that one is a little bit different, and by the time you get ten books away you’re getting into a book that’s in the same general area but even more different.”

There is something altogether more natural and relational about uncovering the new — and the forgotten — in the context of a library or a conversation. Hidden gems lay undisturbed, unlike popularity-ranked internet search results that spew out the obvious and the familiar.

Enter LaMDA AI.

Continue reading

Klara and The Sun: Love, Loyalty & Obsolescence

If you’re of a certain generation, you might remember the Tamagotchi: the Japanese pocket-sized “pet simulation game” that became the chief obsession of 90s kids bored of yo-yos and other fleeting trends. The Tamagotchi lived mostly in the grubby hands or lint-filled pockets of its owners but, for social currency, could be paraded before envious or competitive enthusiasts.

Oddly, these oviparous virtual critters weren’t remotely animal-like in their appearance, and could be intolerably demanding at times. Neglect to feed them, clean up after them, or tend to them when sick and — as many of us found out — very soon you’d be left with nothing but a dead LCD blob. But even the best cared-for Tamagotchi(s?) had certain obsolescence looming in their futures, once their needlessly complex lifecycle was complete: egg, baby, child, teen, adult, death.

Continue reading

AI Ethics for Startups – 7 Practical Steps

Radiologists assessing the pain experienced by osteoarthritis patients typically use a scale called the Kellgren-Lawrence Grade (KLG). The KLG estimates pain levels based on the presence of certain radiographic features, like missing cartilage or damage. But data from the National Institutes of Health revealed a disparity between the level of pain as calculated by the KLG and Black patients’ self-reported experience of pain.

The MIT Technology Review explains: “Black patients who show the same amount of missing cartilage as white patients self-report higher levels of pain.”

But why?

Continue reading

Deepfaking the Deceased: Is it Ever Okay?

In February last year, the world balked as the media reported that a South Korean broadcaster had used virtual reality technology to “reunite” a grieving mother with the 7-year-old child she lost in 2016.

YouTube.com

As part of a documentary entitled I Met You, Jang Ji-sung was confronted by an animated and lifelike vision of her daughter Na-yeon as she played in a neighborhood park in her favorite dress. It was an emotionally charged scene, with the avatar asking the tearful woman, “Mom, where have you been? Have you been thinking of me?”

“Always”, the mother replied. 

Remarkably, documentary makers saw this scene as “heartwarming”, but many felt that something was badly wrong. Ethicists, like Dr. Blay Whitby from the University of Sussex, cautioned the media: “We just don’t know the psychological effects of being ‘reunited’ with someone in this way.”

Indeed, this was uncharted territory.

Continue reading

Playing to the Algorithm: Are We Training the Machines or…?

It is our human inclination to want to look good. Our desire to impress keeps the fashion industry alive; it also motivates many of us to work or study hard, and there are billions of dollars to be made from our desperation to look visibly fit and healthy. So, it should come as no surprise that as algorithms hold more and more sway over decision-making and the conferral of status (e.g. via credit or hiring decisions), many of us are keen to put our best foot forward and play into their discernible preferences.

This is certainly true of those in business, as discovered by the authors of the working paper How to Talk When A Machine is Listening: Corporate Disclosure in the Age of AI. An article posted by the National Bureau of Economic Research describes the study’s findings:

Continue reading