As Facebook reels, Silicon Valley dabbles in ethics

Facebook CEO Mark Zuckerberg's testimony before Congress put a spotlight on a major shift within the American tech sector: a rapid, albeit uneven, embrace of ethics.

Facebook CEO Mark Zuckerberg takes his seat to testify before a joint hearing of the Commerce and Judiciary committees on Capitol Hill in Washington, on April 10, 2018, about the use of Facebook data to target American voters in the 2016 election. (AP Photo/Pablo Martinez Monsivais)

WASHINGTON (RNS) — When Mark Zuckerberg was grilled in Congress on Facebook’s data privacy scandal, few news outlets picked up on his reference to an internal ethics team at the company.

In an exchange with Sen. Gary Peters, D-Mich., the tech titan was asked whether his company has a “set of principles” to guide its development of artificial intelligence, a technology it uses to combat misuse of the massively popular social media platform.

“We have a whole (artificial intelligence) ethics team that is working on developing basically the technology … for making sure this goes in the direction that we want,” Zuckerberg replied.


His comment came as lawmakers sought answers over the scandal involving Cambridge Analytica, a company that worked with Donald Trump’s 2016 presidential campaign and is said to have illegitimately obtained data from as many as 87 million Facebook users.

Debates are raging these days in the tech industry over issues such as data privacy, online hate speech and foreign interference in American elections, all with important ethical dimensions.

But the Facebook CEO’s comment offers a glimpse into what experts say is a shift within the American tech sector: a rapid, albeit uneven, embrace of ethics. Zuckerberg hinted at that shift in his opening remarks before the Senate, saying of the Cambridge Analytica incident, “We didn’t take a broad enough view of our responsibility, and that was a big mistake.”

A ‘groundswell’ of ethical debate

Tech companies are often loath to detail how they make ethical decisions — Zuckerberg, for instance, did not immediately provide details about his company’s principles when asked by Peters, saying he would follow up later.

Brian Green, director of technology ethics at Santa Clara University’s Markkula Center for Applied Ethics in California, said this tight-lipped approach is likely strategic.

“I think it’s because every company is afraid that they’re the one that’s going to look like they have an ethical problem,” said Green, who has spoken to at least one tech company about ethics but would not give its name. “(But) actually ethics is normal, and we’re all doing ethics all the time.”


Irina Raicu. Photo courtesy of Santa Clara University

Despite this reticence, Irina Raicu, who also works at Santa Clara as director of its Internet Ethics Program, said there has been a “groundswell” of discussion about ethics in Silicon Valley over the past few years. She pointed to a previous Facebook controversy as a major catalyst: a 2014 report that the company — along with researchers from Cornell University — manipulated the newsfeeds of 689,003 users as part of a scientific experiment to test emotional reactions.

Facebook defended itself at the time by arguing, among other things, that “none of the data used was associated with a specific person’s Facebook account.” Raicu said that episode triggered a national conversation that revolved around a specific question also at the heart of the Cambridge Analytica scandal: Yes, what Facebook did was probably legal — but was it ethical?

Raicu pointed to the “issue of content moderation” — be it the spread of “fake news” or online trolling such as the “terrible harassment of journalists on Twitter” — as fueling more recent calls for ethical reform. She also mentioned the recent debate over the spread of misinformation about survivors of the high school shooting in Parkland, Fla., including false claims that some students who spoke out in favor of gun control were “crisis actors.”

The ethics-based discourse has already produced practical partnerships. In October 2017, for example, the Anti-Defamation League announced it was teaming up with Facebook, Twitter, Google and Microsoft to create a Cyberhate Problem-Solving Lab to counter online hate speech.

“As more and more questions are posed … both the corporate departments in these companies and the people working in these companies — the technologists — have been taking more and more note of (ethics),” she said, noting that Facebook has provided research ethics training since 2014.

Raicu said the influence of ethics in tech — sparked by scandals across the industry — is becoming increasingly visible. She said “tech ethicists” are now invited onto the campuses of major tech companies to speak to employees, company officials are reaching out to Santa Clara and other groups asking for instructional materials on ethics, and the Institute of Electrical and Electronics Engineers launched a TechEthics initiative in 2016.


The field even has its own rising stars, such as Tristan Harris, a former “Google design ethicist.” Harris has accrued a following through TED Talks and his organization, the Center for Humane Technology, which is critical of any technology Harris says uses “unethical persuasion” to “hijack” our “minds and society,” such as social media platforms or smartphones. His campaign, often called the “Time Well Spent” movement, appears to be having an impact: In a January post, Zuckerberg listed “time well spent” as a design goal for Facebook, which many interpreted as a direct reference to Harris.

Green noted one of the loudest voices on the subject has been none other than Pope Francis, whose 2015 encyclical on the environment, “Laudato Si’,” praised the “immense possibilities” that technology — including information technology — offers, but warned that humanity’s “immense technological development has not been accompanied by a development in human responsibility, values and conscience.” Zuckerberg and his wife, Priscilla Chan, even met with the pontiff during a visit to the Vatican in August 2016 to discuss “how to use communications technology to alleviate poverty, encourage a culture of encounter, and to communicate a message of hope, especially to the most disadvantaged.”

Pope Francis talks to Facebook CEO Mark Zuckerberg, second from left, and Priscilla Chan, Zuckerberg’s wife, during a meeting at the Vatican on Aug. 29, 2016. Vatican spokesman Greg Burke is at far right. Photo courtesy of Osservatore Romano

Francis also railed against the overuse of social media and smartphones in an exhortation published on Monday — as have other religious leaders, such as the Dalai Lama. Even so, Green — who teaches at Santa Clara, a Jesuit university — noted the hyper-diverse tech field includes workers who claim a number of nationalities and faiths (or no faith), meaning religious voices are often lifted up alongside more secular approaches.

Disrupting ‘disruption’ culture

To become more ethical, experts say, the tech sector must re-evaluate a Silicon Valley culture that champions virtually any technology that “disrupts” societal norms.

“Social transformations are presented as default good,” said Colin Koopman, an associate professor of philosophy at the University of Oregon. He argued the relentless pursuit of transformation “doesn’t even raise the question of the specific inequalities, freedoms and other kinds of political and ethical costs that might come from those transformations.”


Meanwhile, Green said the tech sector often sees its boundaries as primarily legal, not moral.

“I think the compliance mindset has steered people toward a sort of ethical minimalism, which is something they’re now realizing might not be the best approach,” he said.

Microsoft released a book on artificial intelligence in January that argued it “could make sense” for coders to take a “Hippocratic Oath … like we have for doctors.” A month later, a group of data scientists from across the industry — including representatives from Microsoft, Pinterest and Google, according to Wired — gathered in San Francisco to draft a specific ethics code for their profession.

They produced a list of principles, such as “be responsible for maximizing social benefit and minimizing harm,” respect “human dignity” and take steps to measure and plan for bias.

Raicu’s center came up with a quintessentially Silicon Valley solution: an app for ethical decision-making.

Colin Koopman. Photo courtesy of University of Oregon

Koopman, who penned a New York Times op-ed in the wake of the Cambridge Analytica scandal calling for ethics in data science, cautioned that ethics codes can prove static or be quickly rendered obsolete by the rapid advancement of technology.

Instead, he suggested ethics review boards for companies and a “social and ethical audit culture” to help anticipate problems ahead of time.


“I don’t think principles will get us there — I think they’re important, it’s just that too often the conversation goes there first, and then sort of stays there,” Koopman said. “I point to education. … We’re teaching kids coding in high school, but you don’t hear a lot about teaching kids the basics of data privacy.”

Tech ethics 2.0

While the flurry of ethical discourse excites ethicists, many are quick to dispel any notion that Silicon Valley will become a moral utopia overnight. Raicu said the tech industry is vast and covers a number of different areas, and the field is still growing: “There are ethicists, but not a lot of them work in this particular area.”

She also argued in an article in The Atlantic that ethical training will “not inoculate technologists against making unethical decisions — nothing can do that.”

But many experts say implementing ethical frameworks, while perhaps inconvenient in the short term, is actually a sound business strategy.

“It makes good business sense that they would want to anticipate the effects of what they’re doing now,” Koopman said. “Not just, ‘How is this going to impact next quarter’s profits,’ but also, ‘How is this going to impact the company in 10 years or 20 years, where we find ourselves in a situation where we’ve amassed a huge warehouse of data on all of our users?’”

