AI and the future shape of product design

This article by Fiona J McEvoy (YouTheData.com) was originally posted on All Turtles.


These days we talk so much about artificial intelligence and its creators that it’s easy to overlook the increasingly prolific role AI itself is playing in product creation and design. Across different industries, the technical and the creative are being drawn ever closer together to create a range of products that might otherwise never have been conceived.

Blowing past the wind tunnel

Take, for example, the new aerodynamic bicycle presented this month at the International Conference on Machine Learning, which was designed using Neural Concept software. By employing AI in the design phase, a small team from the French college IUT Annecy was able to completely bypass the usual methods of aerodynamic testing – a process that normally requires a great deal of time and computing power.


What AI can learn from nature

This article by Fiona J McEvoy (YouTheData.com) was originally posted on All Turtles.

Image: Leonardo da Vinci, Design for a Flying Machine, c. 1488

In designing his famous flying machine, Leonardo da Vinci took inspiration from bird flight. The inventor’s Codex on the Flight of Birds details bird behaviors and makes proposals for mechanical flight that would influence the development of the first modern airplane hundreds of years later.

Birds aren’t the only animals to influence scientific progress. For many years, scientists have sought to unlock the extraordinary qualities of shark skin, which offers huge advantages for both increasing speed and repelling germs. Recently, Walmart filed a patent for a swarm of robotic bees that it hopes to use for the autonomous pollination of crop fields. Perhaps unsurprisingly, the humble original is perfectly designed for the task.


AI that understands how you really feel

This article by Fiona J McEvoy (YouTheData.com) was originally posted on All Turtles.


The way we interact with technology keeps changing. Of late, many more of us are using speech and gesture to give instructions to our devices, and it’s actually starting to feel natural. We tell Alexa to turn the lights off, silence our smartwatches by smothering them with our palms, and unlock our phones with a look. For this to work as seamlessly as it does, our devices have to watch and listen to us attentively. Pretty soon they could begin to understand and anticipate our emotional needs, too.

The move towards what’s been called implicit understanding – in contrast with explicit interaction – will be facilitated by technologies like emotion-tracking AI: technology that uses cues from our vocal tone, facial expressions, and other micro-movements to determine our mood and, from there, our needs. According to researchers at Gartner, very soon our fridge will be able to suggest food to match our feelings, and research VP Annette Zimmermann has even claimed that, “By 2022, your personal device will know more about your emotional state than your own family.”


The Problem with Next Generation Virtual Assistants


It may not seem like it, but there is quite an arms race going on when it comes to interactive AI and virtual assistants. Every tech company wants its offering to be more intuitive…more human. Yet although they’re improving, voice-activated assistants like Alexa and Siri are still pretty clunky, and often underwhelming in their interactions.

This obviously isn’t great news for developers who want to see these assistants enter the workplace and supercharge sales.