Four years ago, Facebook product manager-turned-whistleblower Frances Haugen delivered a warning: "The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people."
She was right, but we didn't listen. If we had, we might have avoided the absurd double standard, where patently wrong things are allowed because they're done by Big Tech.
In the real world, it is illegal to market gambling to children. So why is it OK for Kick, a popular streaming platform for videogame enthusiasts, to hire Drake to stream himself betting millions on roulette to his legion of young fans? (Kick, which advertises itself as "for users aged 13 and older," is owned by e-gambling platform Stake.)
Marketing sex toys to youth could result in jail time. And yet an investigation shows that Meta's AI chatbots, including those impersonating celebrities, have readily engaged in sexually graphic conversations with adolescents. Elon Musk's xAI, meanwhile, has a built-in "sexy" mode, and his own engineers warn that this is leading to AI-generated child sexual abuse content.
Pushing addictive toys onto kids could net huge lawsuits. Yet TikTok is accused of knowing its app causes harm, with a lawsuit alleging the company acknowledges TikTok is "addictive," that it causes its predominantly teenage user base to feel "self conscious, jealous, and bitter" and leaves them "susceptible … for developing body dy[s]morphia, eating disorders, over sexualization."
None of this is accidental. As Haugen tried to warn us, these negative outcomes (addiction, sexual exploitation, depression) are the product of company policy.
Tech executives and their teams of lobbyists convince governments that they are just as concerned about these problems as we are, and that it is their nasty users who are to blame. Meta claimed to be horrified about how Facebook corroded our democracy and facilitated a genocide in Myanmar, so they funded fact-checkers and promised to fight misinformation. TikTok has a whole program to help teens use their app in a healthy way. Twitter claimed to care about free speech.
It's as if they're laughing at lawmakers who think these efforts are genuine. Facebook killed its fact-checking program as soon as Donald Trump was elected. TikTok confesses that its youth safety products don't work and that "our goal is not to reduce the time spent" on the app. Twitter regularly bans speech its owner doesn't like.
Canada is still labouring under the delusion it can work with these bad actors. Last year, Ottawa proposed a new bureaucracy of online harm: a Digital Safety Commission, which can order illegal content be taken offline and request companies adopt "design features" to protect children. (Another effort in Parliament, to mandate that online porn sites require ID, is even more unworkable and unwise.)
This proposal is a gift to these companies. It allows them to pay lip service to these harms while doing very little differently.
Instead, the government should opt for another kind of approach: Transparency, antitrust, and liability.
Transparency
Big Tech keeps gaslighting the public because we cannot see how they manipulate their public platforms. Ottawa can mandate that tech giants provide algorithmic transparency, publishing documentation on how their systems work and why they recommend the content they do.
Antitrust
The vertical integration of these companies has given them perverse incentives. Meta isn't just a social media company: it also makes hardware, develops AI, runs the world's largest messaging platform, and is part of an online advertising oligopoly. (Meta, Google, and Amazon control 90 per cent of Canada's online advertising market.)
As a recent report from the Canadian Anti-Monopoly Project argues, without breaking this monopoly power, any legislative solutions "will inevitably be compromises in the interest of entrenched incumbents."
Liability
Canadian courts have not yet made clear whether tech companies can be held liable for the societal harms they cause. Ottawa should proactively clarify the circumstances where tech platforms can be sued for their products: not for their users' speech, but for clear cases of corporate malfeasance.
Canada can't do all this alone, and it shouldn't bother trying. Luckily, the European Union and a number of American states have already launched lawsuits, passed legislation and imposed regulation. It hasn't all been good, but Canada should join what works.
There is a growing list of reasons why we need to prioritize the fight against corporate control over the internet. If nothing else, do it for the kids who are being exploited by this rampant greed.
- Star staff