Can participation in social media platforms contribute to racial disparities?
It’s not hard to argue that our society is hooked on the feedback loops that are mainstream social media platforms. Almost everyone, regardless of age, social class, gender, etc., has accounts on most or all of the major platforms. (This does not necessarily mean they’re active users, of course.)
One would be forgiven for thinking that there is no deeper meaning to be found in the depths of these platforms and the connections they expose and facilitate to their users.
These deeper connections and relations aren’t visible to the end-user. Nor are they visible to the humans moderating the platforms, or the companies providing access. We’re not talking about connections in the typical sense, of algorithms controlling who you interact with, and what content you’re exposed to.
Rather, I’d like to explore the conscious choices that society makes when using social media, and how those choices can help or hurt various groups.
From the perspective of just one user, it could be difficult to understand how your choices while on a given platform, say Instagram, could possibly affect the outcome of something as large as a social justice movement.¹
However, when you consider that the algorithm used to recommend posts and accounts to you is also impacted by feedback that you give it, you should be able to understand the nature of the issue. Sure, you individually won’t convince the algorithm to stop promoting, say, Black Lives Matter-related posts.
But if the algorithm notices that you and a lot of people with similar metadata² didn’t like the post (or took whatever action the platform has decided demonstrates disinterest), then it’s much more likely that users you follow, and who follow you, will never see the post. Hence, the actions of a few people can drastically bias the recommendation engine in terms of what types of content similar groups of people will see.
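The mechanism can be sketched as a toy model. Everything here — the function names, the cohort grouping, the scoring weights, the threshold — is an illustrative assumption of mine, not any platform’s actual code; the point is only that a handful of negative signals from one metadata cohort can hide a post from everyone the system considers similar.

```python
# Toy sketch of cohort-level feedback suppression (illustrative only;
# real recommendation systems are vastly more complex).

def cohort_score(signals, dislike_weight=2.0):
    """Aggregate a cohort's feedback: likes count +1, dislikes count
    -dislike_weight (negative signals assumed to weigh more heavily)."""
    return sum(1.0 if s == "like" else -dislike_weight for s in signals)

def should_show(signals_by_cohort, user_cohort, threshold=0.0):
    """Show the post to a user only if their metadata cohort's net score
    clears the threshold -- a few dislikes can hide it for the whole group."""
    return cohort_score(signals_by_cohort.get(user_cohort, [])) > threshold

signals = {
    "cohort_a": ["like", "dislike", "dislike"],  # net 1 - 4 = -3
    "cohort_b": ["like", "like"],                # net +2
}
print(should_show(signals, "cohort_a"))  # False: suppressed for this group
print(should_show(signals, "cohort_b"))  # True
```

Note that the one user who liked the post in the first cohort still never determines the outcome alone; it is the aggregate signal of people grouped with them that does.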
Now that we understand this effect exists on a small scale with one person and their immediate followers, we will move to explore how it can be visualized and understood in a larger context, like a nation’s population.
Because the social media platforms in question collect such an ungodly amount of information and metadata on their users, no two users end up in exactly the same set of groupings. Hence, one user being exposed to a new algorithmic preference will usually start the same wave of preference in that user’s other groups. Because of this, you can picture the update propagating like a snowball growing in size as it rolls down a hill.
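The snowball dynamic above can be sketched as spread through overlapping groups: because each user belongs to several groups at once, a preference adopted in one group leaks into every group that shares a member with it. This is a deliberately simplified sketch — the group names, memberships, and the "shares any member" adoption rule are my assumptions, not a real platform’s logic.

```python
# Toy snowball: a preference spreads because users belong to overlapping groups.

def spread_preference(groups, seed_group):
    """Starting from one group, repeatedly mark every group that shares at
    least one member with an already-affected group, until nothing new is
    reached. Returns the set of affected group names."""
    affected = {seed_group}
    changed = True
    while changed:
        changed = False
        for name, members in groups.items():
            if name in affected:
                continue
            # any shared member carries the preference into this group
            if any(members & groups[g] for g in affected):
                affected.add(name)
                changed = True
    return affected

groups = {
    "g1": {"ana", "ben"},
    "g2": {"ben", "cat"},  # overlaps g1 via ben
    "g3": {"cat", "dan"},  # overlaps g2 via cat, so reached second-hand
    "g4": {"eve"},         # no overlap -- the snowball never reaches it
}
print(sorted(spread_preference(groups, "g1")))  # ['g1', 'g2', 'g3']
```

Notice that "g3" shares nobody with the seed group, yet is still reached through the chain of overlaps — that second-hand spread is the snowball.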
A great example of this effect in action can be seen in TikTok’s recommendation algorithm. If you and your friends like a few posts with the same hashtag, it’s almost certain that another friend of yours will start to see that type of post, even if nobody from your friend group has ever sent it to them. The effect is a bit like the unsettled feeling you get when Gmail shows you an ad related to a post you saw on Facebook.
The exact ways in which this algorithmic snowball can impact a social justice movement of course vary, depending mainly on how much of the organization’s popularity has been garnered through social media promotion. And if such an organization wanted to work around this phenomenon, it could simply spend a couple of dollars to promote its post, which, if it gains likes from different demographics, will then start to snowball positively among them.
Of course, on most platforms, users have some form of a “don’t show me this ad” button, which could work fairly effectively to keep those posts away from the demographics to which that user belongs. One starts to see that this is an endless arms race between people wanting attention and those who are annoyed at the “next big thing.”
When I started this piece, I was hoping that I could end it with some sort of recommendation, or theory for the average person to adopt into their lifestyle. I’m not so sure there is one, but it is certainly interesting to noodle over the implications of the smallest actions you take on any social media platform. Heck, for all we know, the number of seconds you spent “focused” on a post might be an input factor for the recommendation algorithm. At this point, any data has become fair game.
Sources & Notes:
1. For the sake of this op-ed, I’m going to leave out my personal feelings on social media “slacktivism.” Trust that there’s fair consideration of the issue, regardless.
2. Information that is used to group you amongst people with similar interests, and hence, preferred types/content of ads. (That’s what makes money for the platforms!)