Will the world ever trust Facebook again?

Wednesday, 01 August 2018

    After the data harvesting scandal that shook social media, can CEO Mark Zuckerberg bring Facebook back into the realm of public trust?


    In mid-June 2010, BP’s then chief executive Tony Hayward appeared before a US congressional hearing on the causes of the Deepwater Horizon offshore drilling rig explosion that killed 11 men, injured 17, and triggered the largest marine oil spill in history. During his testimony, Hayward refused to acknowledge legal liability and avoided answering questions on BP’s cost-cutting safety decisions. His responses ranged from “I’m afraid I can’t recall that” to, simply, “I don’t know.” Angry politicians accused BP of having a “cavalier attitude” to risk and a “systemic safety problem”.

    Almost eight years later, Mark Zuckerberg, the 34-year-old CEO of Facebook, faced a similar congressional grilling after it was revealed that London-based consultancy Cambridge Analytica had harvested data, allegedly unlawfully, from as many as 87 million users to build detailed psychological profiles of voters. While the CEO fielded 10 hours of questions over two days from nearly 100 US senators and lawmakers, global advocacy group Avaaz placed 100 life-size cut-outs of “Zuck” on the Capitol Building lawn in Washington DC as protesters chanted, “The internet is getting dark and we owe it all to Mark!”

    “It was a breach of trust between Facebook and the people who share their data with us and expect us to protect it,” Zuckerberg admitted during the hearings. But like Hayward, he masterfully dodged questions. More than 40 times, the face of Facebook told lawmakers he had no answers at hand and would get back to them later. It was easy to be left with the impression that either he didn’t know the details of how his own organisation ran, or he was hiding information and playing for time.

    In terms of trust, it’s an important distinction. Had the unruly beast called Facebook slipped its leash, escaped its owner and outgrown Zuckerberg’s apparently benign original intentions? Or had this kind of data mining and selling been the plan, at some level, for a very long time? Cambridge Analytica might have been an invader, but the trove of information was there for the taking — and Facebook Login had made it possible. The hearings turned out to be rather awkward for others, too. The not-so-tech-savvy senators, with an average age of 62, struggled to grasp what Facebook does and can do — and how to fix it.

    The real challenge for regulators and Facebook, however, is not this one data leak. It’s the growing erosion of trust in tech platforms — and perhaps all organisations — as more light is shone on the private practices behind the public idealism. Users who handed over their trust wholesale, with a confident “agree” click on the fine-print terms and conditions, are realising it has left them open to abuses they hadn’t imagined. A recent Digital Trust survey by Business Insider found that 81 per cent of Facebook users have little to no confidence in the platform to protect their data and privacy. Facebook is the least trusted platform (although, interestingly, WhatsApp and Instagram fare better despite being owned by Facebook).

    A “trust breach” amounts to a collapse of confidence in “the system” itself. What should we believe in if Facebook’s privacy and data-collection practices have failed us? Who, or what, can be relied upon? The platform, the executives, the regulators, the technology, other users? Anger, fear and disenchantment are deadly viruses that spread fast. So how well has Facebook managed its trust crisis, and can it recover?

    Responsiveness — how quickly does the company respond to the initial mistake?

    Zuckerberg and his executives stayed silent for five deadly days after the Cambridge Analytica story broke. As the crisis built up a head of steam, the hashtags #DeleteFacebook and then #WheresZuck trended on Twitter. When the CEO finally issued a 937-word statement, he posted it, rather naively, on his Facebook page. “We will learn from this experience to secure our platform further and make our community safer for everyone going forward…” it read, in part.

    It was too little, too late. Within half an hour, the post had more than 3100 comments. “Literally no-one will ever trust you again. You sold stuff to a foreign power during an election cycle without batting an eye... Enjoy your new regulations,” one comment ran, echoing a common scathing response.

    Most users or customers don’t really care about the nuances, or even the facts, of a crisis; they tune into the larger brewing narrative. “If I could live this last week again, I would definitely have Mark and myself speaking earlier,” Sheryl Sandberg, Facebook’s chief operating officer, later admitted in an interview with CNBC as Facebook shares plummeted by 18 per cent, wiping more than US$100 billion off the company’s valuation.

    It was always going to be a tough job to explain away such a breach of trust. Taking quite a while to do it, however, wasn’t smart. A delayed response creates a dangerous “trust vacuum” that quickly fills with wild accusations, long-held suspicions, confusion and panic. The crisis moves beyond its specifics, balloons and threatens to taint the entire company. People begin to fear what else can go wrong. What other shortcomings might lurk in the system that we don’t yet know about?

    Ownership — does the company take genuine responsibility for what went wrong?

    In any crisis, it is human nature to ask, “Who should we blame?” The social media giant has a history of responding to breaches by laying blame at anyone’s door but its own. For 15 years, since Zuckerberg created FaceMash, the Harvard “hot-or-not” student rating site, from his dorm room, he’s been saying sorry for mistakes, as if they were all unforeseeable, or down to forces outside his control. “This is not how I meant for things to go, and I apologise for any harm done,” he said in 2003, after FaceMash had grown into an on-campus phenomenon.

    In 2007, Facebook introduced Beacon, a system that automatically shared with advertisers information about what users were doing in apps and on third-party websites — from Epicurious.com to Travelocity.com. “We simply did a bad job with this release, and I apologise for it,” Zuckerberg admitted in a post after the controversial launch. By 2008, according to Wired, the founder had written only four posts on the Facebook blog. Each one was a kind of rote apology for a mishap, with a lukewarm promise to do better. Things would only get worse.

    “Personally, I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way is a pretty crazy idea,” he said in response to mounting criticism of foreign powers abusing Facebook to spread “fake news” during the 2016 US presidential election. Less than a year later, he had changed his tune: “For the ways my work has been used to divide rather than bring us together, I ask forgiveness and I will work to do better.”

    For a visionary, he was showing what appeared to be a limitless gift for failing to see the consequences of his own company’s workings. Or was he being disingenuous? How much longer would people trust an internet giant that could only offer apologies after the fact?

    On 25 March 2018, Facebook took out bold full-page apology advertisements in the largest British and American newspapers. It felt like yet another empty promise, implying many things had now been fixed and more changes were on the way. When Zuckerberg was asked repeatedly by journalists and lawmakers whether he wished he had made different choices to prevent the Cambridge Analytica debacle, he batted away the charge that his decisions were the root of the situation. “Social norms are changing quickly,” he said in one interview with Wired.

    It’s like a song on repeat: apologies for “moving fast and breaking things”, with big promises to become more “transparent”. Zuckerberg should take a little advice from Shakespeare: “the fault… is not in our stars, but in ourselves.”

    Empathy — does the company genuinely care about the impact its mistakes have had on people’s lives?

    Facebook’s history of apologies falls short of ever acknowledging the underlying problem: the platform’s unfathomable power, and an ad-based model that controls how a quarter of the world’s population shares and receives information. No single entity should have that kind of dominance.

    When something goes wrong, companies should try, at the very least, to convey genuine empathy for the impact on individual lives. But Zuckerberg appears to find this kind of empathy hard. At times, he seems to struggle even to see the breaches as ethical issues. Does he really care how people’s personal information is used, whether it’s to target users with special offers or expose them to nefarious political ads?

    How does he feel about the Facebook Live streaming service that has been used to broadcast shocking murders and suicides? Or the way foreign bots can spread disinformation that may sway our votes in elections and referendums? Does he ever really acknowledge that Facebook knows more about us than if it were wiretapping our phones? Does it bother him that an addiction to too many “likes” may actually be bad for us?

    When we ask, “Do we trust Facebook and its CEO?” what we are really asking is whether they care enough to do the right thing, especially once the wrong thing has been exposed.

    Without genuine concern, without some driving moral core, the string of apologies risks looking like nothing more than a self-serving smokescreen.

    Assurance — what changes are made to make people feel confident the system has been repaired?

    “How do you sustain a business model in which users don’t pay for your service?” Senator Orrin Hatch rather cluelessly asked Zuckerberg early on in the congressional hearing.

    “Senator, we run ads,” Zuckerberg replied, with what many have interpreted as a smirk. In May 2017, Facebook reported that approximately 98 per cent of its quarterly revenue came from advertising, up from 85 per cent in 2012. Zuckerberg argues that the ad-based model fundamentally serves Facebook’s ethical mission — to provide free services to as many people as possible. The complication is that you, the user, are also the product, which creates a massive conflict of interest for Facebook. When there’s a ton of money to be made in monetising your data, how much trust can you place in Facebook’s expressed good intentions to bring the “world closer together”?

    There is wide variation in how privacy and data are dealt with globally (the European Union had its General Data Protection Regulation in the pipeline for two years before it came into effect in May), but this latest crisis might be a wake-up call for governments: it is time to step in with outside regulation to harness the power of tech platforms.

    Facebook hopes to head that off at the pass. The company has already introduced an Access Your Information tool that provides a simpler way for users to manage, delete or download their personal profile information. At its recent annual F8 conference for developers, Zuckerberg announced Clear History, a new privacy feature allowing users to delete data Facebook has collected from sites and apps. The platform also announced more rigorous policies and terms that will restrict data access for third-party app developers. A program called Partner Categories, which allowed advertisers to use data brokers such as Experian and Epsilon to target ads at relevant Facebook users, will also be shut down within the next six months. These fixes are all well and good, but nothing has been done to reduce Facebook’s power or restructure its business model.

    Zuckerberg’s 15 years of “sorries” for screw-ups are symptoms of a systemic problem: a crisis of accountability. That’s not a technological problem, it’s a cultural one. How can a platform extract so much value from its users, or set up systems that allow such invasive mining of personal data, and not be held more accountable when things go wrong? Zuck could learn a thing or two from Baroness Onora O’Neill, a philosopher and professor at the University of Cambridge. “Trust, in the end, is distinctive because it’s given by other people. You can’t rebuild what other people give you. You have to give them the basis for giving you their trust.”

    So far, Facebook has been lucky. Billions of its users have been willing to forgive and forget. It’s like a drug no-one wants to give up, even when they know the harmful side effects. Addictions run their course and forgiveness is not bottomless. People may not leave Facebook, but they may use it a lot less. Convenience trumps the need for trust, but only while there’s no alternative.

    Facebook needs to do something profound and meaningful: ban all paid political advertising for a set period around an election, for instance. That would be a start in reassuring users and legislators that a platform of this size can be safely stewarded into the future. The motive for acting responsibly shouldn’t be that stocks are tanking, class actions are looming or privacy activists need to be thrown a bone.

    In July, the UK Information Commissioner fined Facebook £500,000 ($888,900) for two breaches of the UK Data Protection Act 1998: failing to safeguard users’ information and failing to be transparent about how that data was harvested. Had the company been prosecuted under the GDPR, the maximum fine would have been four per cent of global revenues; with Facebook’s annual revenue topping US$40 billion, that works out to more than US$1.6 billion.

    In Australia, where 300,000 users were affected, Facebook faces a likely class action after litigation funder IMF Bentham lodged a representative complaint with the Office of the Australian Information Commissioner.

    Right now, Facebook is a runaway train with no-one taking care of the passengers. It needs to demonstrate why it still deserves the public trust.
