
Summary:

The digital age has made possible many of the human-technology interactions that once were the stuff of science fiction. But at what cost? UX designers must be aware of and accountable for the human impact of their work.

Photo: assembly-line workers (Everett Collection/Shutterstock.com)

New technologies have always produced unintended consequences. But user experience (UX) designers and engineers face a number of new ethical challenges today as technology spreads into every corner of life and our interaction with, and dependence on, it deepens.

UX designers’ primary job is to improve usability and extend productivity. But they also have a responsibility to address the unintended consequences of new technologies, some of them with a clear ethical dimension. Following is a look at some of the principal ethical quandaries that UX designers will run up against and must deal with responsibly.

Human costs and de-valuing work

Much of the UX discipline’s early effort was driven by the desire to improve human performance and productivity while reducing errors. Few questioned the value of these gains, achieved by optimizing system design, augmenting human ability, and automating work, especially when automation eliminated dangerous, repetitive, or tedious tasks – think of assembly line factory jobs that in past decades injured and maimed scores of people.

But some forms of automation come at the cost of diminishing the work’s intellectual and emotional value. Consider the levels of automation found in fast-food restaurants or warehouse fulfillment centers, where work is de-humanized, worker growth is diminished, and the value of rewarding work is stripped away. Undoubtedly these issues were at play in the spate of protests and suicides by distraught Foxconn workers in recent years.

The question for the UX professional who designs these work experiences, then, is: at what point must efficiency and optimization yield to human concerns?

‘De-skilling’

Over the past two decades, there have been tremendous advances in the development of powerful support systems that augment human intelligence in demanding environments. For example, some aircraft, such as the Boeing Dreamliner and the F-35 Lightning II, have become so complicated that they challenge the human capacity to fly them without assistance from an “intelligent” assistant. This technology can reduce error and improve safety.

At the same time, UX researchers must examine the possibility that automation can create a situation where skilled operators are replaced by less-skilled ones. (On a mainstream level, that would include losing the ability to navigate without the aid of GPS, or more simply the ability to do math without using a calculator.)

In some cases, the gains from technology will outweigh the loss of skills. In others, the level of support and automation might warrant reconsideration. Whatever the outcome, it is critical that UX designers initiate this conversation, so that users of technology can make informed choices about its extent and its consequences.

Influencing user behavior

We’ve gotten pretty good at subconsciously influencing and altering behavior (by nudging, for one), which creates a vexing ethical conundrum for UX designers. UX professionals must understand that for every product created with the “best intention,” there will be another that deliberately nudges the user toward ends not in the user’s best interest. On the one hand, they recognize that human behavior often results in sub-optimum choices and actions. On the other, they recognize that they have the potential, through design, to affect that behavior in other ways – positive and negative.

So how do UX professionals define their ethical responsibilities as they subconsciously influence users’ decisions or actions? The case of producing negative outcomes is clear; less clear is who determines what is “positive.” The line between the two is often not well defined. Take, for instance, the Medicare prescription drug plan finder tool on the medicare.gov site, which navigates this dilemma well. It guides and supports the user in an unbiased fashion toward the plan that best aligns with their health needs – a great improvement over early support efforts on the site.
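To make that contrast concrete, here is a minimal, purely hypothetical sketch in Python – not the actual medicare.gov implementation; the plan names, fields, and figures are invented – of what “unbiased” guidance can look like: the ranking considers only criteria that matter to the user, and nothing that serves the vendor.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Plan:
        name: str
        annual_premium: float   # dollars per year
        est_drug_costs: float   # estimated yearly out-of-pocket cost for the user's prescriptions
        covers_all_drugs: bool  # formulary includes every drug the user takes

    def rank_plans(plans: List[Plan], coverage_penalty: float = 1000.0) -> List[Plan]:
        # Score each plan only on criteria that matter to the user: total
        # estimated yearly cost, plus a penalty when coverage is incomplete.
        # Nothing about sponsorship, placement fees, or vendor margins enters
        # the score - that is what makes the guidance "unbiased."
        def score(plan: Plan) -> float:
            penalty = 0.0 if plan.covers_all_drugs else coverage_penalty
            return plan.annual_premium + plan.est_drug_costs + penalty
        return sorted(plans, key=score)

    # Usage: the best-fitting plan is listed first, regardless of who sells it.
    plans = [
        Plan("Plan A", 480.0, 900.0, True),
        Plan("Plan B", 300.0, 700.0, False),  # cheaper premium, but misses one prescription
    ]
    for p in rank_plans(plans):
        print(p.name)

A nudging design would tilt that same score – boosting a sponsored plan or pre-selecting a costlier default – which is exactly the line between “positive” and negative influence that UX professionals are being asked to police.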

The erosion of privacy

With the best intentions, technologies have been developed to remotely monitor the activities of the elderly – what and how much they eat, where they’re located, even when they take their prescriptions. Similarly, products like VueZone or Car Connection allow parents to monitor every movement of their children – what they’re doing at home, how fast they are driving, where they are at 2 a.m.

The benefits of such technologies are real – allowing the elderly to live independently, for one, or parents to be confident in the safety of their children. Yet such constant monitoring can also have the opposite effect, leaving the individual feeling the loss of highly valued privacy and dignity. With each new capability come added consequences.

The dangers of distraction

The convergence of technologies can tax our attention spans in a way that pushes the limits of human capability. One case is the increased integration of communication, navigation, and entertainment technologies in automotive design. We now have GPS screens, entertainment monitors, hands-free cellphone use, and advanced stereo systems with various control mechanisms.

While these technologies deliver unquestionable value and pleasure to the driver and passengers, they indisputably divide the operator’s attention, distracting him or her from the primary task of driving and leading to life-threatening situations (and that’s not even including texting while driving). The problem has become so severe that the National Highway Traffic Safety Administration has created a website to address this issue.

So what responsibility do UX professionals have in these situations? The likelihood of distraction and its consequences should become an area of intense focus in the UX discipline’s research agenda.

At the end of the day, UX professionals must increasingly consider where their responsibilities lie – with the organization that reaps financial gains from the technology sold, or with the user who may suffer negative or life-threatening consequences from these products.

Bill M. Gribbons is professor of information design and corporate communication and director of the graduate human factors program at Bentley University. 



  1. Nicholas Paredes Sunday, March 10, 2013

    An article I read this morning by Michael Pollan in Lucky Peach describes the evolution of food technologies, and mentions the refinement of flours and sugars as a turning point in the manufacturing of food. I wonder how we as a society could have argued for or against these technologies. We now know that refined products have led to heart disease and diabetes. We also know that we can feed many more mouths.

    Likewise, I feel that we as designers of technology are ignorant of the complexities. They are unfathomably complex. Frankly, we could work a lot less and save the world from ourselves. A pill reminder app can promote the sale of pharmaceuticals that many probably don’t need. Alternatively, we can use the same tools to design life-saving treatments along with the necessary evolution of medical intervention.

    I do know that the most dangerous problem is our education and the subsequent changes to our socio-economic selves, not the technologies that we are developing. We tend to overstate the impact of our technologies and underappreciate the human gluttony that is stripping the planet bare. Pollan wants us to return to the kitchen with our friends and family, thereby gaining an appreciation for the food we eat.

    “We are all of us tempted to read more books, look at more pictures, listen to more music than we can possibly absorb, and the result of such gluttony is not a cultured mind but a consuming one; what it reads, looks at, listens to is immediately forgotten, leaving no more traces behind than yesterday’s newspaper,” – W.H. Auden

  2. Hi Bill,

    I am curious about what you think of the “First Things First Manifesto”?

    Eric

    http://maxbruinsma.nl/index1.html?ftf2000.htm
    http://en.wikipedia.org/wiki/First_Things_First_2000_manifesto

  3. Reblogged this on i-hilser and commented:
    Interesting ethical challenges for UX designers (and all designers by extension, IMO)
