Designing for Bad Intentions: Wearables and Cyber Risks is delighted to feature a guest post by John Gray, the co-founder of MentionMapp Analytics. John is a media researcher and entrepreneur exploring how issues like the spread of misinformation and the exploitation of personal privacy are eroding trust in our social institutions and discourse. He's written numerous case studies and co-authored "The Ecosystem of Fake: Bots, Information and Distorted Realities."


"It's the bad people with bad intent that's causing the problem, not technology." – Shane Luke, Sr. Director of Digital Innovation, Nike

We exude data, like the sweat that streams off our skin. It's the norm. Just as another new normal is the news of the latest PR tour by data breach apologists, full of empty promises that "we'll do better". Like the soles of an ultra-marathoner's shoes, the clichéd technocratic mindset of "moving fast and breaking things" and "asking for forgiveness rather than permission" is beginning to wear thin.

We accept that the devices in our pockets, and on our wrists, feet, and even our faces, are communicating data. Yet the data they produce becomes a target for bad actors. As technology weaves deeper into what we wear, there's more to our fashion statements than meets the eye.

With recent data breach headlines concerning Strava and MyFitnessPal still fresh in my mind, I decided to speak with Nike's Sr. Director of Digital Innovation, Shane Luke, about data security and wearable technology. While Gartner estimates that the wearable device space was worth $30.5BN last year, it is still a space challenged with reconciling the intimacy of what we wear with the data we essentially are. Personal privacy and data security must be key considerations in product design and development.

Luke has been designing wearable products with these issues in mind since we first spoke in 2010, during his first stop at Nike as Director of Product. At the time he saw how Nike was working to connect technology with their products, and he understood that, "Computing technology was going to require digital services, the integration of sensors, plus the ability to analyze and manage data."

While in 2018 Luke still thinks about the design of tomorrow's digitally connected sports equipment and apparel, it is now personal privacy, data security, and trust that define what he delivers. Every conversation about product design for Luke starts with the question: "Why does this need to exist?" He says, "It's based on challenging assumptions about why something is good, why something is needed, why something will be better than what exists or why something needs to exist if it doesn't."

[Image: smart sneakers]

Data security presents formidable technical challenges, and putting a bunch of technically brilliant people in a room doesn't guarantee desirable outcomes. While we need brilliant technical people to solve these problems, as Luke points out, "If they don't know which challenges to solve, or necessarily how to approach them, you get to a point where you're not really defining the problem in the right way."

Throwing more technology at the challenge doesn't always deliver a solution. Reflecting on his experience as Chief Product Officer at Recon Instruments between 2012 and 2014, Luke always thought about reframing the idea of an existing problem: "As I'm trying to work my way back to a solution, obviously I have a vision of the problem and what the solution might be when I start. There's a degree of refinement as we progress, one of constantly asking if something is actually a problem. If so, will those dealing with it actually see this as the solution?"

Take Google Glass as an example. It went from being everywhere in the media to becoming the butt of many a joke – probably none more famous than the head-twitching SNL skit. Glass tried to be a general-purpose device, while Recon chose to address specific sport niches like ski goggles. Luke asks an important question: "Do I need to wear a display on my face all day long in order to have augmented reality experiences or get notifications or (speaking of privacy) take pictures?"

The absence of streets teeming with "glassholes" suggests the answer is no.

Luke and I agreed that many products in the wearables space are technically excellent, or at least have a degree of "coolness." Yet they don't have a strong raison d'être. Companies are still putting technology and data first, with need and privacy coming second. Despite countless examples of abysmal failings, some wearables continue to enjoy success. Rather than being overly critical of yesterday's failings, Luke believes there's an inherent bias which leads many people to a "dystopian extrapolation" worldview. He claims, "We look at what's happening and then extrapolate to how the worst possible outcome is going to happen if we don't act."

People have always feared technology. "The reason behind this fear is that most people don't understand privacy and security. They don't understand how it's managed by tech companies; they don't understand what kind of power these kinds of companies have or might not have, or what their products could do or not do," Luke continued. "If you combine lack of awareness and understanding with the amplification of every incident that occurs, and knowing every time something happens it's delivered faster and more widely than ever before, then of course a lot of people are going to think about the worst outcomes."

While we can't design away people's fear and bias about technology, we have to do better at designing for intent. Thinking about the admission from Alphabet Chairman Eric Schmidt that "It didn't occur to us that there were criminals", it was important to press Luke on the technocratic-utopian "rainbow and unicorn" worldview. Luke acknowledges intent is the problem: "People's intent is the problem in almost all the cases, and it's a significant issue."

It's inevitable there will be mistakes in designing complex products and systems. Having worked on products considered leading "industry firsts," Luke believes that on the whole technology is helping us immensely, and accepts that planning for every human intention is impossible. In the case of data breaches, "It's because there are bad people out there, not because of technology. People find exploits, and it's the bad people with bad intent that's causing the problem, whether it's Russian hackers or Equifax being breached." Awareness isn't enough; the design process has to acknowledge that somebody is going to try to exploit that product or system.

There are no simple answers when it comes to securing our data-selves and what we wear, but the Recon Jet provides a good example. It has a camera right on the front of it, it tracks location, and it does an enormous amount with the data it gets from the wearer. Luke starts by being clear that, "We have to be really good at explaining how the technologies work and what you're exposed to or from versus not exposed to or from. I think we have to be really good about making intent clear."

Believing in foolproof is foolhardy, and technological abstinence is a near impossibility. So long as we keep producing data, we'll be at risk. But raising the bar of education and communication between those making the devices and those wearing them is an important part of the solution. With Europe leading the way with initiatives like the GDPR, the personal data conversation is changing. Regardless, when it comes to your personal privacy and data ownership, knowing that big brothers are not your best keepers is the best cloak to wear.

“If most of us are ashamed of shabby clothes and shoddy furniture let us be more ashamed of shabby ideas and shoddy philosophies…. It would be a sad situation if the wrapper were better than the meat wrapped inside it.” ― Albert Einstein
