Comment: You are what you eat

Social media is undermining our shared reality, and those of us in the agriculture sector need to take better note of the trend. 

Why? Because not doing so could mean a continued separation between rural and urban Canada – and between all of us.

Social media is a drug of sorts, in that it can be used responsibly as a means of recreation. Without constant vigilance, though, it can turn into a habit that clouds the mind and skews perceptions of reality. 

But while we as individuals can’t change the algorithms that exacerbate social divisions, we can be more mindful of how easily social media undermines the way we think. Put more bluntly, we can start thinking more critically, and stop visiting ImRight.com quite so often. 

Generally, Twitter and other social media platforms work by measuring how individuals engage with them. The information gathered from that engagement is then used to predict what else those individuals might like to watch, read and share. 


The more likes, replies, comments, link sharing and scrolling time, the more refined the algorithm gets. The result is a timeline filled with more targeted articles, connections, advertisements, and so on. 
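The feedback loop described above can be sketched in a few lines. This is purely illustrative – the weights, topic labels and function names here are invented for the example, not any platform’s actual code – but it shows the basic mechanic: weighted engagement signals accumulate into a profile, and that profile then decides what gets shown next.

```python
# Illustrative sketch of engagement-based feed ranking.
# All weights and names are hypothetical, for explanation only.

ENGAGEMENT_WEIGHTS = {"like": 1.0, "reply": 2.0, "share": 3.0, "dwell_seconds": 0.1}

def update_interest_profile(profile, topic, engagement):
    """Fold one burst of engagement into a per-topic interest score."""
    score = sum(ENGAGEMENT_WEIGHTS.get(signal, 0.0) * amount
                for signal, amount in engagement.items())
    profile[topic] = profile.get(topic, 0.0) + score
    return profile

def rank_feed(profile, candidate_posts):
    """Order candidate posts by the user's accumulated topic scores."""
    return sorted(candidate_posts,
                  key=lambda post: profile.get(post["topic"], 0.0),
                  reverse=True)

# A user who lingers on farming content quickly skews their own feed:
profile = {}
update_interest_profile(profile, "farming", {"like": 3, "dwell_seconds": 120})
update_interest_profile(profile, "politics", {"like": 1})

posts = [{"id": 1, "topic": "politics"}, {"id": 2, "topic": "farming"}]
ranked = rank_feed(profile, posts)  # the farming post now outranks the rest
```

The self-reinforcing part is visible even in this toy version: every click raises a topic’s score, which surfaces more of that topic, which invites more clicks.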

My own Twitter feed, for example, is full of farming-related content. I’m also an outdoors enthusiast, so there’s a good dose of fishing, hunting and hiking material. The suggested content I see now is much more closely related to these themes than it was when my profile was a blank slate. A good algorithm can, in this way, introduce new and interesting content that users might not otherwise discover. 

The dark side? That content can be highly skewed, incorrect and troubling from a personal and social cohesion perspective. 

In my own case, I’ve been alarmed to see an uptick in highly charged political advertisements “suggested” to me, the suggestion clearly coming in part from my interests and who I follow. I may find such advertisements disagreeable, but the algorithm and those paying for the advertisement space don’t know that. 

If I were less disgusted and more inclined to agree with the recommended content, I could be one more step down an ever-narrowing path. 


I’ll highlight two issue-specific examples – glyphosate and COVID vaccines – to illustrate my point. 

Decades of study indicate glyphosate is one of the most benign agrochemical products ever made, but just a couple of clicks on (highly dubious) articles suggesting a strong link with cancer makes the algorithm think I want to see more articles featuring “glyphosate” and “cancer.” That’s fine for me, because I’m aware of the controversy and the question of validity. But someone who doesn’t know what glyphosate is, how it’s used, or how it works could quickly become convinced the chemical presents an existential health crisis. This is a large part of the reason the controversy around the chemical remains active today.

Vaccines are a good example to highlight how a social media denizen doesn’t have to click on an anti-vaccination post in order to expose themselves to anti-vaccination messaging. Following a politician who rails against vaccines or other health technologies might be enough. Being a member of a community or parenting group whose members discuss concerns about the safety of such technologies might be enough. Having Facebook “friends” who vocally oppose particular health mandates might be enough. 

For many this won’t be an issue, but for some, it could be the difference between believing vaccines are a life-saving tool and believing they’re part of a board-of-shadowy-figures plot to establish a new world order via microchip, whatever that’s supposed to mean. 

A very real slippery slope

That last sentence is not meant to sound outlandish. I personally know several people who hold such beliefs. But they didn’t get there overnight – it took a lot of scrolling. It’s a well-documented phenomenon, and one often identified by journalists trying to draw attention to an endemic, potentially fatal problem. 

The BBC’s “undercover voters” initiative, for example, offers an in-depth look at how different social media engagement can lead to drastically different takes on reality. This project is particularly enlightening because it shows how people with different cultural leanings and demographics are exposed to wildly different, often self-reinforcing perspectives on the same subjects. It highlights a phenomenon affecting everyone – liberal or conservative, religious or atheist, and so on. 

Indeed, the slippery slope of social media algorithms is part of how tribal and conspiratorial thinking thrives, not to mention political extremism. It’s a strange paradox – the more open you are to ever-more targeted messaging, the harder it is to consider wider perspectives. 

There’s a reason your aunt and uncle went from believing the moon landing was faked to becoming card-carrying members of QAnon. There’s a reason your sibling is now convinced homeopathic medicine cures cancer, and why an otherwise reasonable, intelligent friend now believes the constitution and public institutions matter less than a violent takeover of the government.


The speed at which information spreads on social media is an additional, massive complicating factor. It takes seconds – literally seconds – for false headlines, photos with misleading captions, out-of-context videos or quotes and other false information to spread under the guise of fact. 

As we in the agriculture sector know so well, it’s nearly impossible to get the cat back in the bag by chasing it with facts. Once false information exists, it’s free to reaffirm existing beliefs and draw others into the echo chamber. 

Despite the potential of such platforms, Twitter, Facebook, YouTube, and even podcasts can poison us. 

First, because the algorithms are designed in a way that amplifies outrage and tribalism, enabling the downward spiral of narrowed thinking and self-delusion. Second, because we as individuals and a wider global society have utterly failed to recognize and be wary of how easy it is to descend into self-affirming and misinformed ways of thinking. 

It’s not all Mark Zuckerberg’s fault. We’ve been lazy. 

We call it a “social media feed” because these platforms are literally feeding us what they think we want to eat, however unhealthy the meal might be. 

We must be mindful enough to keep a balanced diet – or at the very least recognize we are willingly digesting junk food.

Source: Farmtario.com
