The Risks and Responsibilities of Tech Innovation


This post originally appeared on the MIT Sloan Management Review blog.

Companies clearly have a responsibility for working conditions in their supply chain. But what are companies' responsibilities in the "demand chain"? Do companies bear responsibility for the social risks of how consumers or buyers use their products? This question has been less explored, but is no less important, particularly in light of one of the most talked-about tech products now hitting the market: Google Glass.

Google Glass is prompting concern among lawmakers and advocates for safe driving, worried about what drivers will do with the Internet at their eyeballs. A California woman charged with distracted driving last fall for wearing Google Glass behind the wheel was acquitted because police couldn't prove the device was on; even so, a spokesman for the American Automobile Association noted, "Just looking at this and using common sense, it would seem to be something someone should not be doing while they're behind the wheel."

Individuals must drive safely. But does Google also bear responsibility for potential harm caused by Glass users?

The United Nations Guiding Principles on business and human rights were developed over six years of wide-ranging research and consultation, including with companies and business lobbying groups, and unanimously endorsed by the Human Rights Council in 2011. (I served as an advisor to the effort.) The Principles state that companies must "avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur."

Google's design, manufacture, and distribution of Glass clearly constitute Google's "own activities"; therefore, if Glass distracts drivers and thereby causes traffic accidents, Google has a responsibility to address this issue.

So far, Google seems to feel otherwise. "When you're wearing Glass, we just ask you to be very aware of what's going on around you, to use it wisely, the same way you would use any technology," said a Google spokesperson.

In other words, it's not our problem.

But some might disagree. Vivek Krishnamurthy, an attorney with the law firm Foley Hoag and an expert in the intersection of technology and human rights, told me, "The key question is, does this pose a set of dangers that is different enough from other technologies that a warning" -- apart from what's buried in terms-of-agreement legalese -- "is not enough?"

There is no question that Google believes Glass is a game-changer. So what should a company in Google's position -- with groundbreaking technology that has the potential to create serious risks -- do when faced with this dilemma?

Acknowledge responsibility.

Companies cannot proudly take ownership of the positive impacts of their products while distancing themselves from the harms. Companies should acknowledge, in plain English (and Spanish, French, German, Arabic, Mandarin, and any other language necessary), that there may be risks to using their products -- while making it clear that they will be proactive in assessing and mitigating those risks.

Proactively work with others.

Companies should regularly convene advisory groups of experts across different disciplines; in Glass's case, perhaps a mix of safe-driving advocates, behavioral scientists, and experts in dual-use technology and human rights could help ensure that Glass doesn't exacerbate our worst tendencies and encourage unsafe behavior. Such external input should be factored into the product development stage; demonstrating that social risks have been evaluated should be a requirement before every new feature and app release.

Leave room for safeguards.

Krishnamurthy told me, "The company can't consider all of the risks because they're unknown -- but the company should think carefully about what they do to design around those risks." Companies should allow for the ability to push out software updates to make their products safer once the risks become clearer. There are plenty of safe driving apps for cell phones, which might offer a precedent for Google to follow.

Hold users accountable.

Microsoft suspends the accounts of Xbox Live users who violate its Code of Conduct. Google should make clear that it does not condone unsafe behavior with Glass and will participate in the prosecution of users who have violated the law.

Companies must manage the social risks that they cause or contribute to, no matter where they occur and at whose hands. As they develop technology that aims to transform how we live, it is even more critical that they acknowledge this responsibility.
