The business model causes collateral damage, but they will never walk away from it
Corporate sex scandals are the canaries in the coal mine of the business world. They are bad enough on their own. But more often than not, they also signal something amiss in a company’s culture — particularly when they come in multiples.
That was my first thought when reading about how Google paid the founder of Android $90m, while keeping quiet about a sexual misconduct claim. But it was a line in an internal email from Google chief executive Sundar Pichai that really got my attention: “In the last two years, 48 people have been terminated for sexual harassment, including 13 who were senior managers and above. None of these individuals received an exit package.”
That’s good, I guess. But really — 48 people? What does this say about Google? Indeed, what does this say about Silicon Valley in general? Toxic corporate culture, including the sexual mis-steps of the technologists, is related to the toxic business model of hyper-targeted advertising. Both display a tendency to do harm until full exposure forces action.
The problem was put in sharp focus last week by Apple head Tim Cook, who gave a speech at the EU privacy commissioners’ conference in which he decried the “data industrial complex” exemplified by businesses such as Google and Facebook. These companies make the vast majority of their money by keeping people online for as long as possible, in order to garner as much of their personal data as possible. That data can then be leveraged via paid advertisements deployed with the precision of drone strikes.
“Our own information — from the everyday to the deeply personal — is being weaponised against us with military efficiency,” said Mr Cook, whose own company makes most of its money not from data, but from hardware.
Apple has its own issues in Europe and the US — from tax offshoring to legal battles over intellectual property infringement. But it is the huge platform groups — Facebook, Twitter, Instagram, and Google — that have the deeper problems. Their business depends on manipulating behaviour, via Las Vegas-style techniques offering variable rewards to keep users hooked. Their opaque algorithms lock users into filter bubbles which they can then monetise.
It is a business model that causes endless collateral damage, as evidenced by the weekly drumbeat of scandal. But it is one that they will never walk away from voluntarily. It is simply too profitable.
I spoke to a former YouTube engineer, Guillaume Chaslot, of the Center for Humane Technology, a group of Silicon Valley refugees working to create less harmful business models. A few years back, he was part of an internal project at YouTube to develop algorithms that would increase the diversity and quality of content seen by users. But because the subtler algorithms did not improve “watch time” as much as the traditional ones, the project was dropped.
YouTube, which did not contradict this account, says that the company’s recommendation system has “changed substantially over time” and now includes other metrics beyond watch time, including consumer surveys and the number of “shares” or “likes”. But as anyone who uses the site knows, you are still pushed towards what you have spent the most time with — whether that is cat videos or political propaganda. Both Google and Facebook now throw more resources at unmasking suspect accounts and removing content. But they do not want to be censors, and are no good at it anyway, as shown by the frequent muddles over what they do and do not decide to take down.
What is more, their complex and opaque digital advertising systems are still highly vulnerable to exploitation, no matter how many people they put on the problem.
Consider last week’s news that 125 Android apps and websites were subject to multimillion-dollar digital advertising fraud. Unlike the Google sex scandal, it failed to make the front pages, but the revelation was enough to prompt Mark Warner, a Democratic senator, to write to the Federal Trade Commission, calling on it to address “the prevalence of digital advertising fraud and in particular the inaction of major industry stakeholders in curbing these abuses”.
I doubt this will happen soon. In a properly functioning market, start-ups might disrupt the paradigm with new search and social media business models that maximise utility rather than time spent online. Some have tried. But because Google and Facebook hold such a monopoly in their respective areas, innovators cannot gain traction. In this sense, data privacy and monopoly power, two issues that are often spoken about separately, are actually related. The fact that Margrethe Vestager, the EU competition commissioner, spoke at the privacy summit last week shows that Europe is beginning to understand the links.
How to fix the problem? As with sex scandals, I suspect the answer is transparency. Regulators need to force platforms to make their algorithms public. Only then will we fully understand the way in which the targeted advertising model is undermining liberal democracy, and begin to garner enough public support to force the attention merchants to shift their models.
All Rights Reserved for Rana Foroohar