Frances Haugen has Irish roots – literally.
“I’m named after my Irish grandmother”, she tells The University Times on a visit to Trinity. “She has red hair, too. It’s where I get my little red highlights in my hair.”
Irish Americans are routinely mocked for having a thin understanding of this country in the 21st century – but Haugen has an acute awareness of Ireland’s place in the modern world. The data engineer turned whistleblower is adamant that “Ireland is a tech superpower”. After the US, she says, “Ireland has the most ability to effect change”, because so many tech giants have their European headquarters located here.
“We have a history of standing up for the little guy”, she says. But right now, Ireland is not doing that when it comes to social media. Haugen told the Oireachtas media committee last month that tech companies headquartered here, like Facebook, have skirted regulation up to now – and as a result, Facebook specifically has made decisions in Dublin which have caused severe harm in places like Myanmar and Ethiopia.
Last year, Haugen blew the whistle on Facebook by leaking internal documents which suggest, among other things, that employees regularly raised concerns about the activity of drug cartels and human traffickers on the platform, but the company failed to meaningfully address them.
I came forward because I wanted to be able to sleep at night. People’s lives are on the line because of misinformation in those places
In the months since, Haugen has effectively become the spokesperson for those calling for more governmental regulation of Facebook and its tech counterparts. But she doesn’t downplay her own feelings about what she exposed: “I came forward because I wanted to be able to sleep at night … people’s lives are on the line because of misinformation in those places.”
Indeed, legal action launched in the US and UK alleges that Facebook’s negligence facilitated the genocide of Rohingya Muslims in Myanmar after its algorithms amplified hate speech and its monitoring structures failed to remove inflammatory posts.
Westerners might find it difficult to comprehend how this could be. While the destructive power of Facebook is evident in the English-speaking world through the likes of anti-vaccine movements, anyone who speaks English and has an internet connection has a plethora of options for online networking. “The most vulnerable places in the world are often the most linguistically diverse … the English web has lots of things that aren’t Facebook.”
So Ireland has to ask itself: “When [tech giants] mess up in these places – where they don’t build safety systems, where you have things go off the rails – should they get a free pass? Because what they’ve shown is that they keep cutting corners.”
“And it’s not like anyone sat down and said: ‘I want to cause ethnic violence in Africa’, or ‘I want to destabilize Southeast Asia’. No one did”, Haugen says. But she’s adamant that Facebook had ample opportunity to de-escalate this destruction – and chose not to. “They have a long, long, long series of experiments where they saw ways of making it a little bit better – but it cost little slivers of money.”
@FrancesHaugen tells @jesskellynt ‘the way we got here, was that a giant multinational technology company made decisions in isolation’. Now we have a ‘critical civilisation problem’ and transparency is key to finding the path forward @TLRHub @TCDLawSchool
— Schuler Democracy Forum (@SchulerForum) March 21, 2022
It’s no coincidence that Haugen’s flying visit to Dublin saw her come to Trinity to speak at the Long Room Hub’s Schuler Democracy Forum – specialised education, she says, could make or break how we respond to the spiralling power and influence of tech giants. If you have a problem on a global scale, with devastating and potentially irreversible consequences, you train thousands of people to expert level to deal with it.
“If we were talking about an oil company – every year globally, we graduate probably at least 50,000 environmental science majors, people who learn about, how do we monitor water quality? What is an appropriate level of rigor for making sure air pollution is acceptable? What are the effects on the human body?”
“We don’t graduate anyone who has that level of depth of understanding of the dynamics of these systems – the design of the algorithms, the choices of features, how those impact public policy outcomes. And because of that, we are forced to accept Facebook’s frame of reference of how to solve these problems.”
Haugen wants to build the “lab bench” for people to become qualified experts in social media and tech corporations. And while she says herself that she’s “super excited” by this prospect, every proposed solution to the problem of unchecked tech companies seems to run up against the same reality: the likes of Facebook are simply unwilling to be truly regulated.
“The way people are taught data science today [is] they do basically toy problems with these very, very simplified sets of data because the platforms won’t release real data. That’s never gonna happen.”
We don’t graduate anyone who has the level of depth of understanding of the design of the algorithms, the choices of features, how those impact public policy outcomes
Indeed, Haugen believes that Facebook’s greatest offence in evading external regulation isn’t its tokenistic efforts to regulate itself, but what exactly it purports to be regulating. When Mark Zuckerberg began to push back against accusations that Facebook has failed to police its platform, he wasn’t claiming to have put in place structures to prevent the use of Facebook as a tool for ethnic violence. He was positioning himself as a crusader against censorship.
“Facebook wants us to argue about censorship”, Haugen explains. “They funded an entire Facebook oversight board, where the only mandate of it is censorship.”
“If Facebook wants to be the only one who can act, they have to more adequately invest in acting. Because otherwise you see it play out [where Facebook is] literally now a weapon of war.”
But where free speech does come into it is the long-accepted truth that the loudest voices on Facebook get the most exposure. If “one to two per cent of the population [makes] up 80 per cent of all the speech”, users are inevitably being shown a distorted version of reality in their feeds. And this is how the purpose of Facebook has fundamentally changed from being a place to post pictures of your pets to something far more complex – and potentially sinister.
“The conversation here isn’t actually about censorship. It’s about product choices that Facebook made that led us to a situation where the most extreme ideas get the most distribution.”
Realistically, she continues, Facebook hasn’t been a mere connector of friends and families “since, like, the late 2000s”.
The conversation here isn’t actually about censorship. It’s about product choices that Facebook made that led us to a situation where the most extreme ideas get the most distribution
“They’ve been pushing people into these giant groups … let’s say, your friend clicks ‘like’ on one of their friend’s posts – Facebook will show that post to you, because they think you might be interested.”
Returning to her point about specialist education, Haugen points out that how to tackle this isn’t “an empirical question – it’s a philosophical question”.
“I want us to begin to be able to even have a chance to teach these classes at scale, because people keep pushing me to tell them: what is the solution? They’re like, what’s the magic bullet? Tell us the five things we need to fix.”
“And I think that’s just reinforcing the problem that got us here, which is: a small number of very privileged people were making all the decisions and all the trade-offs – and I don’t think that’s the right path forward for our civilisation.”