SAN FRANCISCO – Defects in the design of Facebook’s WhatsApp platform may have led to as many as two dozen people losing their lives in India.

With its communications encrypted end-to-end, there is no way for anyone to moderate posts; so WhatsApp has become what a Washington Post report called “an unfiltered platform for fake news and religious hatred.”


WhatsApp, which offers messaging and Voice over IP service, is not used as broadly in the U.S. as in countries such as India, where it has become the dominant mode of mobile communication. But imagine Facebook and Twitter without any filters or moderation: the Wild West during the heyday of Cambridge Analytica. Now imagine millions of people who have never before been online becoming dependent on, and trusting, everything they read there, and you will realize the damage that such technologies can do.

Earlier this month, India’s Ministry of Electronics and Information Technology sent out a stern warning to WhatsApp, asking it to immediately stop the spread of “irresponsible and explosive messages filled with rumours and provocation.” It argued that the “platform cannot evade accountability and responsibility, especially when good technological inventions are abused by some miscreants who resort to provocative messages which lead to spread of violence.”

WhatsApp’s response, according to New Delhi’s The Wire, was to offer minor enhancements, public-education campaigns, and “a new project to work with leading academic experts in India to learn more about the spread of misinformation, which will help inform additional product improvements going forward.” It defended its need to encrypt messages and argued that “many people (nearly 25% in India) are not in a group”; in other words, roughly 75% of its Indian users are in groups and exposed to whatever circulates there.

Facebook is using the same tactics that it used when the United Nations found it to have played “a determining role” in the genocide against the Rohingya in Myanmar: pleading ignorance and an inability to do anything about it. It is, in effect, blaming the design of its product, and its users, for the carnage it is enabling.

Here is the real issue: Facebook’s business model relies on people’s dependence on its platforms for practically all of their communications and news consumption, setting itself up as their most important provider of factual information — yet it takes no responsibility for that information.

Facebook’s marketing strategy begins with creating an addiction to its platforms using a technique that former Google ethicist Tristan Harris has been highlighting: intermittent variable rewards. Casinos use this technique to keep us pouring money into slot machines; Facebook and WhatsApp use it to keep us checking news feeds and messages. We are their lab rats, pressing levers to receive food.

When Facebook added the news feed to its social-media platform, its intention was to become a primary source of information. It began by curating news stories to suit our interests and presenting them in a feed that we would see occasionally. Then it required us to go through this feed in order to get to anything else. Once it had trained us to accept this, Facebook started monetizing the feed by selling targeted ads to anyone who would buy them.

It was bad enough that, after its acquisition by Facebook, WhatsApp began providing Facebook with all kinds of information about its users so that Facebook could track and target them. But, in order to make WhatsApp as addictive as its social-media platform, Facebook added group-chat and news features to it, functions it was not designed to accommodate. WhatsApp started off as a private, secure messaging platform; it was never designed to be a news source or a public forum.

WhatsApp’s group-messaging feature is particularly problematic because users can remain anonymous, identified only by a mobile number. A motivated user can create or join an unlimited number of groups; and when a message is forwarded, nothing beyond the word “forwarded” indicates its accuracy or true source. End-to-end encryption prevents law-enforcement officials, and even WhatsApp itself, from viewing what is being said. No consideration was given in the design of the product to the supervision and moderation that public forums require.

Facebook needs to be held liable for the deaths that WhatsApp has already caused and be required to take its product off the market until its design flaws are fixed. That is the best way of motivating it to fix them. Rest assured: The technology industry always finds a way of solving problems when profits are at stake.

© Vivek Wadhwa