The only surprise was that it did not come sooner. Big Tech whistleblowers are not exactly running out of the offices of Silicon Valley, so it was with some excitement that Facebook produced a person willing to show us the laundry, the dirt still caking the content.
And the laundry in question proved to be bountiful, with internal company documents running into the thousands showing a fruit salad range of mendacity, deception and approaches to combating hate, violence and misinformation on its platform. The Wall Street Journal capitalised.
Before the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security, Frances Haugen, who revealed her identity on October 3, elaborated. Lawmakers were certainly more pleased with Haugen’s frankness, a far cry from the testimony of Facebook’s global head of safety, Antigone Davis, who gave little away the week prior.
As an algorithm specialist, Haugen spent time at Facebook dealing with civic misinformation, counterespionage and democracy. She also had previous stints at Google, Pinterest and Yelp. “Having worked on four different types of social networks, I understand how complex and nuanced these problems are,” she claimed in her opening statement. “However, the choices being made inside Facebook are disastrous – for our children, for our public safety, for our privacy and for our democracy – and that is why we must demand Facebook make changes.”
Where there were conflicts between profits and safety, these were resolved in favour of the former. “The result has been more division, more harm, more lies, more threats, and more combat.” Online discussion (what Haugen calls “dangerous online talk”) had, in some cases, “led to violence that harms and even kills people.”
The hearing itself spent much time on Facebook’s newsfeed algorithm, which emphasises interactions (likes and comments) from those the company deems the user closest to. While not pernicious in and of itself, this focus, Haugen’s documents reveal, worried the company’s own data scientists for the skewed effect it was having.
Another concern for Haugen is the company’s use of engagement-based ranking. Content receiving more reactions from users is given priority in ranking, meaning that violence and misinformation gain prominence. In “basically damning 10 years of my own work,” Haugen suggested that a chronological ranking system would be preferable.
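To make the distinction concrete, the toy sketch below contrasts an engagement-based ordering with a simple chronological one. It is illustrative only: the scoring formula, weights, field names and “closeness” values are invented for the example and are not Facebook’s actual system. The point it demonstrates is the one Haugen makes: a feed sorted by reactions surfaces whatever draws the most engagement, however inflammatory, while a chronological feed simply shows the newest posts first.

```python
# Illustrative sketch only: a made-up comparison of engagement-based versus
# chronological ranking for a hypothetical feed. Weights and "closeness"
# scores are invented; this is not Facebook's actual algorithm.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int
    comments: int
    shares: int


def engagement_rank(posts, closeness):
    """Order posts by an invented engagement score, boosted by how 'close'
    the viewer is deemed to be to the author."""
    def score(p):
        reactions = p.likes + 2 * p.comments + 3 * p.shares  # arbitrary weights
        return reactions * closeness.get(p.author, 1.0)
    return sorted(posts, key=score, reverse=True)


def chronological_rank(posts):
    """Order posts newest-first, ignoring engagement entirely."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)


if __name__ == "__main__":
    now = datetime.now()
    feed = [
        Post("quiet_friend", "Holiday photos", now - timedelta(hours=1),
             likes=3, comments=1, shares=0),
        Post("outrage_page", "Inflammatory claim", now - timedelta(hours=6),
             likes=900, comments=400, shares=250),
    ]
    closeness = {"quiet_friend": 2.0, "outrage_page": 1.0}
    # Engagement ranking puts the high-reaction post first;
    # chronological ranking puts the most recent post first.
    print([p.author for p in engagement_rank(feed, closeness)])   # ['outrage_page', 'quiet_friend']
    print([p.author for p in chronological_rank(feed)])           # ['quiet_friend', 'outrage_page']
```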
Facebook’s relationship with information – and misinformation – is deeply problematic. Safeguards were implemented in the lead-up to the 2020 US presidential election, only to be removed. After the Capitol riot of January 6, they were reintroduced. This, Haugen suggests, demonstrates a false logic at play: that using its current algorithms is necessary for profits while stressing safety would diminish them. Not so, claims the whistleblower: having oversight governed by researchers, academics, and government bodies could actually aid growth. “With appropriate oversight and some of these constraints, it’s possible that Facebook could actually be a much more profitable company five or ten years down the road, because it wasn’t as toxic, and not as many people quit it.”
These suggestions are not free of their own problems. Government oversight is hardly a guarantee of the veracity of information, and an example set in the United States is bound to be replicated in other countries. Nor is it a guarantee against censorship, ever the prerogative of moralising lawmakers keen to use the message of safety to block material.
Haugen also wishes to see reforms to Section 230 of the Communications Decency Act, which shields social media platforms from legal liability for content posted by their users. Should the algorithms in question be shown to cause harm, then the company should be made liable. “Facebook should not be given a free pass on choices it makes to prioritize growth and reactiveness over public safety.”
Zuckerberg’s response to the Haugen show was predictably filled with denial. “We care deeply about issues like safety, well-being and mental health.” He found it “difficult to see coverage that misrepresents our work and our motives.” The examples he adduced were themselves suggestive of how deep the mire has become: the creation of “an industry-leading research program to understand these important issues”; the employment of “so many people” in “fighting harmful content.” But what really irked Zuckerberg was the suggestion that “we prioritize profit over safety and well-being.”
The company chief can hardly be too bothered: he is vacationing. It fell to the demons of Facebook PR to go to work. “Today,” Director of Policy Communications Lena Pietsch fired off in a statement, “a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question.”
https://twitter.com/samidh/status/1445490555740250120
The statement had it all: demeaning the whistleblower’s testimony as irrelevant, ill-informed and unimportant, largely because she was unimportant to begin with, lacked access to the relevant channels and could not possibly have formed a valid opinion about the company. That said, Facebook did agree that it was “time to begin to create standard rules for the internet.” This involved an “over to you” message to Congress. “It’s been 25 years since the rules of the internet have been updated and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”
Beyond these disclosures, Facebook will be fighting with committed savagery to convince those on the Hill that change, were it to happen, should be minimal. From the company’s perspective, it has to be, given the central tenets of surveillance capitalism that underpin its success.
Farcebook!
Faeces book, BB. Drop a message…
How about fixing the ‘community standards’ algorithms first? There is a mass of deeply misogynistic, racist, violent content that, apparently, doesn’t breach community standards. But let one little feminist say ‘men are trash’ in response to another report of male violence, and all hell breaks loose.
Facebook is truly weird.
They’ve removed many of our posts on COVID-19 (or tagged them as unreliable) despite all being 100% factual. Yet two large groups on Facebook (which are USA based) are pages set up to promote those horse-worming pills as a cure for COVID… and FB does nothing. Maybe they are beholden to the advertising revenue that these pages pull in.
And while I’m moaning about Facebook… Carol and I were in a restaurant that had posters of silent movie stars on its walls. In the background of a photo I took of Carol was a poster showing the bare back of one of the actresses. I duly posted that photo on my Facebook page for our family to see.
I was banned from Facebook for 14 days over that photo. I was guilty of publishing nudity. A friggin’ bare back of a 1920s movie star on a background photo was considered nudity. 🤦🏻♂️😳
On another occasion I was banned for publishing a photo of Hitler. It was an article on WW2. I can only assume that according to Facebook Hitler had nothing to do with the war. 🤷🏻♂️
Michael, have you ever asked yourself, why do you bother with Farcebook?
Yeah I know, folks join because of the “convenience” in communicating, sharing photos with their families.
I often wonder if folks remember what they did to communicate before Farcebook came along.
Fact: it is primarily the “Likes”, ie the 👍 symbol, that drive the FB revenue stream, and they are very addictive. A dopamine hit!
Folks suffer withdrawal symptoms no different from those caused by actual drugs. Depression, ill moods, even suicide… FFS!!
IMO FB needs to get rid of the “likes”, “dislikes”, etc etc!
Then it may end up being a nice place to visit..
The same goes for all social media platforms that rely on participants and members judging each other without written formal argument or debate. It’s so easy for nasty folks to simply click 👎, ‘dislike’ and hound, bully another person without reason.
One of the main reasons I really like AIM is precisely that there are no “likes” or upticks, down ticks on the comments..
(And it’s a great group of folks, mostly very aware and intelligent)
BB, for us Facebook is a necessary evil as most of this site’s traffic comes from Facebook. Apart from that, I only use it to catch up with distant family or friends.
Speaking of likes and nested comments (where you could respond directly to the person you were addressing), we tried it once due to a few people requesting such. I hated it. It didn’t last a week. It made us look like a Facebook page.
“a necessary evil”
Lol, is this phrase an oxymoron?
it conjures up all sorts of weird imaginations, and excuses all sorts of shenanigans in the world….
I’m sure glad you didn’t keep the “FB responses” here at AIM…
👍 to that BB. 😄
P.S. What’s this Facebook of which you speak…. Never been a member…….heard about it, but….
P.P.S. Back in the early days of Facebook my (then) young children were bullied to the Max on faceshit. I’ve never been a fan since.
BB/LOVO, Facebook is spiralling into a right-wing sludge. Nothing is surer.
If it gets any worse it’ll be taken over by Geelong supporters.
Hmmm. Maybe not. There aren’t enough of them. 😁
BB, a necessary evil is what a business client called my firm when I worked in finance. After 35 years I’ve finally had the chance to borrow it.
By the way, BB, have you met LOVO?
LOVO can smell an opened bottle of wine from 500ks. He’s particularly fond of Blackberry Nip.
Yes, his standards are low. 😁
Never mind the morals,feel the money.
G’day BB, 🍻 🍺🍸🍷👍😉😆….’nough said
….p.s. I heard a whisper that the cellar was stripped and that the BBnip was all that was left?
That’s the spirit, LOVO.
The keys to the cellar are all yours.
G’day Migs, 😎 ….thanx…. But, hey, in the age of battery operated friction cutters ..whom the fuck needs keys..😄 Jest sayin’
Besides…it just shows how long it’s been since you ventured down to the Cellar. I sold all the BBnip to a bloke named Baccus. …..got a good price, ay…
clink
Good night all, I’m off to listen to one of my favourite albums from the mid-70’s before heading to bed – Olias of Sunhillow by Jon Anderson. If you’ve never heard it then:
Is that LOVO I see here? Thee LOVO?
Oh me Gawd,.. Roswell, …G’day Cobba. … 😆
LOVO. It is you. 😀
Praise be.
G’day LOVO
OMDog, if I drank all that piss I’d be under the table 🥴🥴🥴🥴🥴
Blackberry Nip eh, talk about a desperado from the 70’s… a disgusting drink.
What sort of handle is LOVO anyway, how about if I change the V to a C.. 😎
GL
Interesting Album, I quite enjoyed some of the tracks..
Couple of my favourites back from the 70’s….
Other People’s Rooms by Mark Almond
Free Fall Through Featherless Flight by Jeannie Lewis
Dr Kampmark’s article here makes use of some of Frances Haugen’s whistle-blowing against Facebook: “…it is with some excitement that Facebook could produce a person willing to show us its laundry, with dirt still caking the content…a fruit salad range of mendacity, deception and approaches to combating hate, violence and misinformation on its platform. The Wall Street Journal capitalised.”
Note the last little sentence: The Wall Street Journal capitalised. Surprise. Surprise.
Dr Kampmark contrasts what Haugen the critic says with what Davis the Facebook rep says. Another surprise. Facebook defends itself? But we are not given much about the details of this “salad”.
“…the choices being made inside Facebook are disastrous – for our children, for our public safety, for our privacy and our democracy – and that is why we must demand Facebook makes changes”, says Haugen.
The problem seems to be with Artificial Intelligence or algorithms “having a skewed effect”; “users are given ranking in terms of priority, meaning that violence and misinformation receive prominence.” Haugen suggests chronological publication rather than ranking users. But Haugen’s suggestions for reform, to make Facebook more successful, have their own problems.
This use of algorithms is what upset Michael here at AIMN. The algorithm picked up a bare back, because it was naked, and the name Hitler, who was indeed a bad person – but the algorithm had no idea about context – so Michael was rightly annoyed by a wrongful penalty.
Facebook’s overview was faulty. It is about metrics more than people, profits more than safety, says Haugen. “Having oversight governed by researchers, academics and government bodies could actually aid growth”, says Haugen. Dr Kampmark states his own reservations about such suggestions.
Meanwhile, there are other news outlets quite happy at last to receive Facebook money for their news published on Facebook.
So we come to the bleeding obvious. Can you imagine MSM news outlets enduring “oversight governed by researchers, academics and government bodies”? Or that a platform should be liable for what users publish there?
And what about news being published as if by some algorithm because it is so repetitive, so full of weasel words and contrived propaganda, attacking contrary views with vile misinformation, bile and vinegar?
Should we have standard rules across media in all its forms? And I do not mean just Facebook or any other bigtech site. At present we are flooded with dangerous lies and misinformation.