Updated: Aug 26, 2021
Social media and tech giants aren’t going to get any smaller. But they need to be regulated once and for all in their own right if we’re to live in a society that can thrive from the convenience and connectivity they offer, without having to suffer harm at their hands.
I’m a millennial (just). I was a student at one of the first institutions where Facebook was launched. I’m a consumer. I’m a technology, media and privacy lawyer. I’m a mother. I’m an early adopter of new technology.
No-one can deny the popularity, influence, and power big tech and social media platforms have.
But since social media achieved hegemony, the scandals have kept coming: severe privacy breaches, including the sale of personal data to third parties; fake news and disinformation; live footage of mass killings; and being cited as a major contributing factor in mental health issues that have in some cases led to suicide.
Many of us are conflicted. We are cynical about the level of social responsibility these organisations really have. We don’t really trust them. But we scroll through feeds daily, continue to post personal content, obtain a large proportion of our daily news from the one place, and can’t help but click through to tailored ads. And they do keep us connected with people globally, especially at times when a lot of the world remains housebound.
So where to from here? These platforms are certainly big enough, mean enough, and influential enough to warrant oversight beyond voluntary codes and loose notice and take-down procedures. The days of technology giants hiding behind safe harbours, and/or declaring themselves a mere conduit for content are surely over.
In most countries, to date, social media platforms have been left to regulate themselves so long as they comply with local laws on illegal material. The party line remains that the platforms do not, and cannot, control user-generated content, and as such the extent of their obligations is to take down offensive, harmful, defamatory or copyright-infringing content within some reasonable period after being notified that such content exists on their platform.
For every action, an opposite reaction
But for each of social media’s harmful actions has come an opposite, albeit not necessarily equal, reaction from national lawmakers and regulators. In response, platforms have managed to keep broader regulation at bay with promises of self-regulation through voluntary codes of practice, piecemeal updates to privacy settings and associated ad campaigns.
Below are five areas where laws or regulations have addressed, or propose to address, individual social media and/or big tech failures.
1. Harmful posts including live streaming of violent content and self harm
In Australia, in 2015, the Enhancing Online Safety Act created an eSafety Commissioner with the power to demand that social media companies take down harassing or abusive posts. In 2018, those powers were expanded to include revenge porn. In 2019, following the live-streaming of the Christchurch, New Zealand mosque shootings, Australia passed the Sharing of Abhorrent Violent Material Act, introducing criminal penalties for social media companies, jail sentences of up to three years for tech executives, and financial penalties of up to 10% of a company’s global turnover.
Similar rules exist in other countries. As of February this year, the UK media regulator, Ofcom, has been granted greater enforcement powers over social media platforms hosting harmful content, including cyberbullying. This comes in response to the death of 14-year-old Molly Russell, who took her own life; self-harm and suicide posts were subsequently found on her Instagram account.
In the European Union, social media platforms face fines if they do not delete extremist content within an hour.
2. Copyright infringement

Although we own the content we post on a social media platform, the platforms’ terms of service grant them extremely broad licences to use those posts, images, videos and stories as they see fit. This just adds to the swathes of free data we provide to them.
In addition, if a platform is found to host content which infringes another person’s copyright, it is the uploading user that has infringed copyright, and there is rarely corresponding liability for the platform that hosts it.
To that end, the EU’s recent copyright directive puts the responsibility on platforms to make sure that copyright infringing content is not hosted on their sites, where previous legislation only required the platforms to take down such content if it was pointed out to them.
3. Defamation

If content has been published and viewed by more than one person, and an individual feels it has lowered their reputation in the eyes of the public, that individual has grounds to file a defamation claim. Laws in this area are geared towards traditional print publishers such as newspapers. Platforms the size of Facebook, Instagram and Google, which host such large volumes of user-generated content over which they have limited if any editorial control, generally do not qualify as publishers. Nevertheless, there have been cases finding that Google could be deemed a publisher in certain circumstances. Australia, for one, is considering updates to existing defamation laws that would make platforms substantially subject to those laws as “traditional” publishers, notwithstanding the volumes of content they would have to monitor and the fact that the content is user-generated.
4. Privacy breaches
The EU General Data Protection Regulation (GDPR), the gold standard of global privacy laws, aims to regulate corporate use of personal data and to address the countless ways that social media platforms and big tech glean and use vast amounts of personal data for commercial gain, without transparency. Facebook’s largest data privacy scandal, involving the sale of personal data to Cambridge Analytica, which used the data for political advertising purposes, was revealed in 2018. Since then, lawsuits from national privacy authorities around the globe continue to abound and will likely see fines in the billions of dollars worldwide.
5. Unfair terms and impact on mainstream advertising, news and journalism
In December 2017 the Australian Competition and Consumer Commission (ACCC) was asked to conduct an inquiry into the impact of Facebook and Google on competition in the media and advertising markets, with a particular focus on their impact on news and journalism. The calls for such an inquiry came from print media, exasperated at being unable to compete with the likes of Facebook and Google for advertising revenue, and at those platforms’ ability to curate news. Among other things, the final report shone a light on the lack of transparency in the platforms’ advertising business practices, and on the role they play in the proliferation of fake news and disinformation. The ACCC made some 29 recommendations, including calling on the platforms to develop voluntary codes on fake news and disinformation, and sweeping reforms to Australian privacy law (which arguably went beyond the scope of the inquiry’s remit).
The report may make Australia one of the first countries to move toward implementing more general “social media laws”.
You’re not the boss of me
There are many of us, myself included, who cannot disentangle ourselves from big tech, and remain unwilling to sacrifice the convenience and connectivity they bring to our lives by limiting our uses.
I can’t overlook the positive disruption that these organisations also bring. Yes, they dominate the online advertising world, and certainly impose unilateral and unfair terms on news outlets, individuals and small businesses. But the answer can’t be to pander to those dynastic aging white folk in old media who don’t know how to compete and who go crying to the competition and consumer regulator.
But our recourse against these organisations cannot be to appeal through the organisation itself. If legislation, mandatory codes of practice, and the courts are good enough for everyone else, why not social media platforms and big tech?
Instead of shoe-horning social media into various existing telecommunications, copyright, privacy and defamation laws, these companies need to be regulated in their own right, as the behemoths they are. They need to be held accountable. Gaps in existing laws need to be closed so those laws apply as intended. In return, the platforms can restore credibility by competing on a more level playing field, meeting the minimum expectations society holds, and continuing to disrupt without having to be broken up.