Big Tech + Big Data = Big Problems

Comment

Image Description: the logos of Google, Apple, and Facebook aligned horizontally

CW: mention of self-harm and addiction

I’ve always been totally indifferent to the harvesting of my data. Why should I care that the government and a small cohort of multinational corporations can access the totally uninteresting mass of information that has accumulated to form my digital footprint? Neither of my parents is a CEO at one of these companies, and it is only these two individuals, and potentially my tutors, whose knowledge of my online gallivanting would make me anxious.

Nevertheless, as with most things I have little clue about, my opinion on the matter has changed as I have learnt more about the topic, and specifically about how our data is being used.

The data collected from our online activity is vast. It is not simply our search history but also the content we engage with the most; engagement is almost synonymous with watch time in many cases, though likes, comments, and shares also come under its umbrella. Facebook alone generates 1 million gigabytes of data a day, and as of January 2021 the social media company was host to 300 petabytes of user data. That’s over 4.6 million times the storage of a 64 GB iPhone!

We are told this data can then be used to provide us with a far more ‘personalised’ experience, showing us more of the content that we will want to see. And whilst this is not strictly false, I believe it to be both incredibly manipulative and damaging.

The goal of these tech and social media companies is to keep us on their products for as long as possible: this is the optimum for their ad-based business models, under which the more time a consumer spends on the platform, the more money the company makes. So, armed with an abundance of media to offer and billions of people’s data, Big Tech is trying to keep and sell our attention.

This data is used to create what is often referred to as an asymmetry of power between the company and its consumers. Using all of our collective data to feed algorithms built by some of the best minds of our time, these tech companies are able to supply us with the content that is most likely to keep us on our phones.

These predictions about which posts or videos are likely to keep our attention can be generated with great accuracy: with this abundance of data to hand, algorithms can serve the consumer content that kept other users with similar interests engaged. If you revised your GCSE Biology thoroughly, you will remember that more data leads to more accurate predictions – and if there is one thing that Big Tech is not short of, it is data.
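
To make that mechanism slightly more concrete, below is a deliberately tiny, purely illustrative sketch in Python of the sort of calculation involved. It is not any platform’s actual system; the users, videos, and numbers are all invented, and real recommenders use vastly larger models and many more signals. The idea is simply to predict how long you might watch a video you haven’t seen by weighting the watch times of users whose habits resemble yours, then to serve whatever scores highest.

# Toy, illustrative recommender: predict a user's watch time for unseen videos
# by weighting the watch times of similar users, then recommend the videos
# with the highest predicted engagement. All data here is invented.

from math import sqrt

# Hypothetical minutes each user has spent watching each video.
watch_time = {
    "alice": {"gym_tips": 12.0, "cat_clips": 3.0, "politics_rant": 0.5},
    "bob":   {"gym_tips": 10.0, "cat_clips": 2.0, "cooking": 8.0},
    "carol": {"cat_clips": 15.0, "politics_rant": 9.0},
}

def similarity(a, b):
    """Cosine similarity between two users' watch-time profiles."""
    shared = set(a) & set(b)
    dot = sum(a[v] * b[v] for v in shared)
    norm_a = sqrt(sum(x * x for x in a.values()))
    norm_b = sqrt(sum(x * x for x in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(user, k=2):
    """Rank videos the user hasn't seen by similarity-weighted watch time."""
    scores = {}
    for other, profile in watch_time.items():
        if other == user:
            continue
        sim = similarity(watch_time[user], profile)
        for video, minutes in profile.items():
            if video not in watch_time[user]:
                scores[video] = scores.get(video, 0.0) + sim * minutes
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # -> ['cooking']: the content that held similar users

With three invented users the prediction is crude, but run the same principle over billions of real profiles and thousands of behavioural signals and the platform can anticipate, with unnerving accuracy, exactly what will hold your attention.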

So, whilst we feel that we are actively choosing to click on another video in YouTube’s recommended bar or continue scrolling through our Instagram Explore page, we are often being pinned down to the screen by the Arnold Schwarzenegger of algorithmic technologies.

This is a serious issue, the severity of which becomes clear when we examine the effects social media usage appears to have on young people. Since social media first became accessible on mobile phones in 2009, there have been a number of worrying trends in teen mental health. Self-harm in girls aged 10 to 14 has increased by 189% since 2009, having remained steady for the previous decade, and the suicide rate in the same demographic has risen by 151%. A 2018 study also found that reduced social media usage among university students led to a reduction in loneliness and depressive feelings after just three weeks.

All of this seems to indicate that social media use can result in a deterioration in mental health. Whilst the technology can be used in healthy doses, for many the pull of the algorithm is simply too strong, and they begin to consume media in unhealthy quantities.

Some psychologists estimate that between 5 and 10% of Americans are already addicted to social media; if we allow these algorithms to grow stronger as they are fed more data, more and more of us are going to struggle with problematic social media use and its often dire consequences.

The persuasive nature of these algorithms is detrimental not just to our mental health but also to our political landscape. Our feeds have become a mess of sensationalist journalism, of which we only ever see one side. The personalised content provided to us by the algorithm will typically be exclusively left- or right-wing, leading to the formation of political echo chambers in which people only consume media that confirms the beliefs they already hold.

This unvaried information diet is a prime driver of the political polarisation we are experiencing. A 2017 study found that the topics being discussed in these separate echo chambers rarely overlapped, meaning individuals from different “bubbles” cannot even discuss their political positions with one another, as those positions are founded on totally different issues.

This article is not an attack on social media so much as an attack on its current state. I believe it needs to be treated, in some respects, like other damaging addictive activities such as gambling or smoking. The current infrastructure of these platforms is geared to make them as addictive as possible, and with young and impressionable people making up over 40% of their user base, this seems totally immoral.

It is time Big Tech stood up to its big responsibilities and prevented us from continuing along this trajectory. However, in the absence of regulation we are unlikely to see any of these companies alter their products in a way which would potentially harm their bottom line.

In Silicon Valley the idea of government regulation is seen as a bit of a joke, owing to the lack of tech literacy among many government officials. This is why we need to see more effort from governments to form expert groups which can tackle this growing problem and take an important step towards reducing social media usage to healthier levels.

That Facebook and its affiliated apps require independent external regulation becomes evident when we look at the failure of Facebook’s Oversight Board. The Board, first proposed in 2018, has been described as functionally analogous to the Supreme Court of the United States. However, it seems the Oversight Board is doing far fewer checks and far less balancing than the public imagined.

Lacking the power to initiate its own investigations into the company’s practices, the Board can act only when Facebook comes to it with an issue. It appears to be a large-scale shell organisation which distracts from any meaningful regulation being placed on the tech giant.

The UK appears to be waking up to these problems and has published a draft Online Safety Bill which aims to impose on these tech companies a duty of care towards the consumer. Social media companies would be held more responsible for the content on their platforms, ensuring greater efforts are made to limit the reach of illegal and harmful content online.

Whilst this bill would introduce levels of accountability previously unseen in tech, I believe there is still far more to be done.

If we are to be serious about solving these problems, we need to address them at the level of the algorithm. Big Tech needs to undergo a philosophical overhaul in which it stops employing manipulative algorithms to prey on the attention of its users. Humanitarian values need to stop being subservient to economic ones, and the influence of these platforms needs to be accounted for in their design. Such a change in ethos could prove seminal for Big Tech, moulding the shape of the industry for years to come.

Regulation which enforces this could guide the tech sphere into a more humane space, and we could see these products begin to be tailored towards their intended purpose: connecting individuals, not isolating them.

 

Image Credit: Huzaifa Abedeen via Wikimedia Commons

 
