AI Ethics - A New Skill for UX-Designers

AI has our attention. The possibilities fill our news feeds, spark interesting conversations, and give tech leaders inspiration to explore new solutions. What will the development of this technology look like? What will this mean for us as humans? Could this impact all of society? With all the questions being asked, only one thing is absolutely clear: we’re about to enter one of the biggest transformations our society has witnessed in the last century - if not the last millennium.

A couple of weeks ago, Google held its annual developer conference, Google I/O. As usual, they introduced a range of new services and features, and every one shared a common thread: AI. Google will use AI to continue powering Google Photos, their new Google News service, and even features designed to make you use your phone less (JOMO = Joy of Missing Out). However, the big story was their demo of Google Assistant calling up a hairdresser on your behalf to book an appointment.

Have a listen:

[Embedded audio: the Google Duplex demo call from Google I/O]
The voice sounds very natural - had I been on the other end of that call, I would never have guessed I was talking to a machine. This is not only due to its clear-sounding voice and natural speech patterns, but also because Google added human speech quirks like “hmm” and “um”.

Bridget Carey of CNET was one of the first of many to call Google out on the ethical implications of this:

I am genuinely bothered and disturbed at how morally wrong it is for the Google Assistant voice to act like a human and deceive other humans on the other line of a phone call, using upspeak and other quirks of language. “Hi um, do you have anything available on uh May 3?”

If Google created a way for a machine to sound so much like a human that now we can’t tell what is real and what is fake, we need to have a talk about ethics and when it’s right for a human to know when they are speaking to a robot.
In this age of disinformation, where people don’t know what’s fake news… how do you know what to believe if you can’t even trust your ears with now Google Assistant calling businesses and posing as a human? That means any dialogue can be spoofed by a machine and you can’t tell.
- Bridget Carey

Google Assistant making calls pretending to be human not only without disclosing that it’s a bot, but adding “ummm” and “aaah” to deceive the human on the other end with the room cheering it… horrifying. Silicon Valley is ethically lost, rudderless and has not learned a thing.
- Zeynep Tufekci

I think we’re reaching the point now where the differences between companies like Apple and Google are going to become much more obvious. For years, the consensus has been that Apple has lagged behind Google in AI/personal assistants and we’re starting to understand why that has been the case. It’s always been my belief that Apple is a product-driven company focused on human needs whereas Google is a technology-driven company focused on leveraging data. It’s a fundamentally different way of valuing the user - the human.

The call that Google Assistant (powered by Google Duplex) makes to the hairdresser is less than a minute long. I find it hard to believe that the average person’s schedule is so packed that they can’t spare a minute to call for an appointment themselves. Instead, Google may actually be aiming this product at people who simply don’t want to talk to another person, something that Rene Ritchie has mentioned.

I can’t imagine someone I know (like my hairdresser) receiving an incoming call from me and instead end up talking with my digital voice assistant. It’s a great example of how from a technology standpoint, it may seem like the best thing ever. However, from a human perspective, it’s downright insulting to the person on the other end who realizes that you can’t even spare the 30 seconds it takes to call them yourself to make an appointment. This feature would make much more sense if the small business that you were calling had its own automated response capability. However, at that point, one has to ask why a phone call would even be needed in the first place.
- Neil Cybart

If you ask Oxford University or Ball State University, one of the jobs most likely to be automated in the near future is telemarketing. Telemarketing is already a problem today, while we still have humans making the calls - just imagine what happens when you remove the cost of humans from telemarketing companies. Your phone may become as filled with spam as your email inbox is.

The true power of AI like Google Duplex is that once it’s deployed, it can operate 24 hours a day, 365 days a year, for the cost of an electricity bill. And while that bill may be pretty hefty, we will have created a problem-solving, singularly focused intelligence that has no need to rest or to be valued in any way. Solving “human” problems without the burden of humanity.

Consider David Cope, who created EMI (Experiments in Musical Intelligence):

David Cope has written programs that compose concertos, chorales, symphonies and operas. His first creation was named EMI (Experiments in Musical Intelligence), which specialised in imitating the style of Johann Sebastian Bach. It took seven years to create the program, but once the work was done, EMI composed 5,000 chorales à la Bach in a single day.

Cope arranged a performance of a few select chorales in a music festival at Santa Cruz. Enthusiastic members of the audience praised the wonderful performance, and explained excitedly how the music touched their innermost being. They didn’t know it was composed by EMI rather than Bach, and when the truth was revealed, some reacted with glum silence, while others shouted in anger.

Who do you trust?

The simple fact is we’ve been using artificial intelligence algorithms for years - everything from your Google searches and finding partners on Match.com to your weekly playlist from Spotify or your Amazon shopping recommendations. As ‘makers’, we need to be clear that the tools we create have a massive impact on people’s lives. What was ‘just an app’ yesterday could be the thing forming your beliefs and inspiring your actions tomorrow.

Technology isn’t an industry, it’s a method of transforming the culture and economics of existing systems and institutions. That can be a little bit hard to understand if we only judge tech as a set of consumer products that we purchase. But tech goes a lot deeper than the phones in our hands, and we must understand some fundamental shifts in society if we’re going to make good decisions about the way tech companies shape our lives—and especially if we want to influence the people who actually make technology.
- 12 Things Everyone Should Understand About Tech

Yuval Noah Harari, author of Sapiens and Homo Deus, argues that liberalism will eventually fade away as we come to trust the algorithm more than we trust ourselves. Ray Dalio, the successful investor, has already switched his company to radical transparency, using a point system that analyzes data to rate people’s ‘believability’ rather than operating through democracy or even hierarchy. I highly recommend watching his TED Talk, How to build a company where the best ideas win, as an example of how AI can make us more honest and transparent, and guide better decision-making.

You might not have been able to tell that Google Duplex was a machine rather than an actual human being, but some claimed that the music composed by EMI obviously lacked ‘soul’ and that the human ear could tell the difference.

Critics argued the music is technically excellent, but that it lacks something. It is too accurate. It has no depth. It has no soul. Professor Steve Larson from the University of Oregon sent Cope a challenge for a musical showdown. Larson suggested that professional pianists play three pieces one after the other: one by Bach, one by EMI, and one by Larson himself. The audience would then be asked to vote who composed which piece. Larson was convinced people would easily tell the difference between soulful human compositions, and the lifeless artefact of a machine. Cope accepted the challenge.

On the appointed date, hundreds of lecturers, students and music fans assembled in the University of Oregon’s concert hall. At the end of the performance, a vote was taken.

The result? The audience thought that EMI’s piece was genuine Bach, that Bach’s piece was composed by Larson, and that Larson’s piece was produced by a computer.

What’s our responsibility?

Developing technology is exciting and inspiring. However, the old rule still applies: just because you can, doesn’t mean you should. The impact of this tech on our daily lives is growing stronger each day and the shift is just starting.

In mature disciplines like law or medicine, we often see centuries of learning incorporated into the professional curriculum, with explicit requirements for ethical education. Now, that hardly stops ethical transgressions from happening—we can see deeply unethical people in positions of power today who went to top business schools that proudly tout their vaunted ethics programs. But that basic level of familiarity with ethical concerns gives those fields a broad fluency in the concepts of ethics so they can have informed conversations. And more importantly, it ensures that those who want to do the right thing and do their jobs in an ethical way have a firm foundation to build on.

But until the very recent backlash against some of the worst excesses of the tech world, there had been little progress in increasing the expectation of ethical education being incorporated into technical training. There are still very few programs aimed at upgrading the ethical knowledge of those who are already in the workforce; continuing education is largely focused on acquiring new technical skills rather than social ones. There’s no silver-bullet solution to this issue; it’s overly simplistic to think that simply bringing computer scientists into closer collaboration with liberal arts majors will significantly address these ethics concerns. But it is clear that technologists will have to rapidly become fluent in ethical concerns if they want to continue to have the widespread public support that they currently enjoy.
- 12 Things Everyone Should Understand About Tech

There’s a rapidly growing urgency for us to have serious conversations about our ethical responsibilities, both for the products we create and for the products we choose to use. Everything is still so new. There’s no real direction or consensus to help us determine what’s considered OK and what’s way out of line. Our world moves so quickly that we hardly ever stop to consider the ethics of our choices. Instead, we simply see a machine calling a hairdresser, think how awesome that is, and move on with our day…

If you enjoyed this post (please share it), I think this is a great follow-up: The Moral Implications of our Apps
