• 0 Posts
  • 7 Comments
Joined 1 year ago
Cake day: October 1st, 2023



  • I have a similar background, and I actually am automating my home. However, what Google/Alexa tout as automation isn’t really automation; I would still have to say something or press a button.

    I have a pretty healthy Home Assistant setup, with things like electrochromic film on my windows that dims them when someone is sitting nearby and the sun is at the right angle to be in their eyes, because I hate having to hold my head in one position just to keep the sun out of my eyes.

    I picked an extreme example, but I’ve also got things like reminders when my laundry or dishes are done (running off a metered plug, so it just detects power spikes from the machines), presence detectors in rooms to automate lights on/off, and a whole slew of things that happen when I click the play button on Plex (lights go out, curtains close, windows dim). I’ve got humidity sensors in the bathroom for starting/stopping the vent fan, and particulate/heat/humidity sensors for starting and stopping the hood vent in the kitchen.
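
    The laundry-done reminder is just edge detection on a smart plug’s power reading. Here’s a minimal sketch of that logic; the function name, thresholds, and readings are all made up for illustration, not a real Home Assistant API:

```python
# Hypothetical sketch of the metered-plug "laundry done" automation:
# detect a power spike (cycle started), then a drop back to idle (cycle done).
# Thresholds and sample data are invented for illustration.

def laundry_state(power_watts, running, on_threshold=10.0, off_threshold=3.0):
    """Return (new_running, finished) given the latest power reading."""
    if not running and power_watts > on_threshold:
        return True, False   # power spike: the machine started a cycle
    if running and power_watts < off_threshold:
        return False, True   # power fell back to idle: cycle finished
    return running, False    # no state change

# Simulated smart-plug readings in watts, sampled once a minute.
readings = [1.2, 1.1, 450.0, 500.0, 480.0, 2.0, 1.5]
running = False
notifications = []
for w in readings:
    running, finished = laundry_state(w, running)
    if finished:
        notifications.append("Laundry is done!")

print(notifications)  # exactly one reminder, fired when power dropped to idle
```

    The hysteresis (separate on/off thresholds) matters: washers pause mid-cycle, so a single threshold would fire false "done" reminders.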

    Obviously each of these only saves a few seconds here and there, but it’s nice not having to think about these things anymore.



  • The answer is yes, and to significant effect. I only skimmed this article, but it doesn’t seem to focus on the important factor: algorithmic content feeds.

    Modern social media platforms (Facebook, Reddit, YouTube, X, etc.) are all set up with one goal in mind: make as much money as possible. That in itself isn’t a problem, depending on who you ask, but let’s pick one platform as an example and see why this eventually causes political polarization.

    For this demonstration, I will pick Facebook, but this could just as easily be done with any free, ad-supported website/app.

    Okay, so to reach their goal of making as much money as possible, Facebook shows ads between posts. Companies pay Facebook to show those ads, with the promise that they will reach people who fit a demographic likely to be interested in the product. Once the ad has been viewed by enough people, Facebook stops running it unless the company pays again.
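
    That pay-per-view arrangement can be sketched as a toy model; the numbers (budget, cost per view) are invented for illustration and don’t reflect Facebook’s actual pricing:

```python
# Toy model of the pay-per-impression scheme described above.
# All figures are hypothetical; real ad platforms bill via auctions/CPM.

def serve_ads(budget_cents, cost_per_view_cents, matching_viewers):
    """Show the ad to matching viewers until the paid budget runs out."""
    shown = 0
    while budget_cents >= cost_per_view_cents and shown < matching_viewers:
        budget_cents -= cost_per_view_cents
        shown += 1
    return shown  # after this, the ad stops until the company pays again

# A $5.00 budget at 2 cents per view reaches 250 of 1000 matching viewers.
print(serve_ads(budget_cents=500, cost_per_view_cents=2, matching_viewers=1000))  # 250
```

    The takeaway is just that impressions are the unit being sold, which is why the rest of the system is built to maximize how many of them each user generates.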

    Now that we know how they make money, let’s look at how they ensure as many people view as many ads as possible. This mostly boils down to two metrics:

    1. Time spent on the platform
    2. Engagement (views, link clicks, comments, likes, messages, posts, etc.)

    If you spend more time on Facebook, you will see more ads. To maximize time spent on the platform, Facebook keeps track of everything you do, both on their site and off. I won’t go into specifics here, but they use web cookies to track your browsing history and things like app permissions to track your location and what you do on your phone. From this data, and potentially other data they purchase from data brokers, they build a pretty good profile of what you would be interested in seeing. They then show you relevant ads and relevant posts to keep you on the site.

    Keeping engagement high means you are more likely to click on an ad, which pays out more than a mere view. To keep you fully engaged, Facebook tracks what you like to view and interact with, as discussed above, and puts it in front of you. But Facebook also knows what type of content garners the most interaction.

    This is where the whole system leads to political polarization. Two types of content bring the most engagement: controversy and content designed to make you angry. So what does Facebook do? It puts the most controversial, rage-baity article, the one that makes your political opponents seem like absolute monsters, in front of you. Oftentimes these posts are deeply misleading, full of misinformation both deliberate and non-malicious. They get people riled up, so people are very likely to engage with them. And because Facebook knows you are less likely to stay on the site if it shows you something you don’t engage with, it avoids showing you posts that tell the other side of the story, so you are caught in an echo chamber of your own ideas and the misinformation of the outrage-inducing posts you have seen.

    Facebook won’t show you posts about issues where you and your political opponents actually agree, because if something doesn’t get you worked up, you aren’t likely to engage with it. It also won’t show you posts whose engagement comes mostly from your political opponents, since those likely don’t match the data profile it has on you.
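
    The ranking logic described above can be caricatured in a few lines. This is a deliberately simplified toy, not Facebook’s actual code; the scoring function, the "outrage" weight, and the sample posts are all invented to show how optimizing for predicted engagement tilts a feed toward rage-bait:

```python
# Toy engagement-maximizing feed ranker. Everything here is hypothetical:
# real ranking systems use learned models, not a hand-written multiplier.

def predicted_engagement(post, user_interests):
    # First, match the post against the user's data profile...
    profile_match = sum(1.0 for tag in post["tags"] if tag in user_interests)
    # ...then weight by how provocative the post is, because outrage drives
    # clicks and comments. This one multiplier is what tilts the feed.
    return profile_match * (1.0 + post["outrage"])

user_interests = {"politics", "local_news"}
posts = [
    {"title": "Both parties agree on infrastructure bill", "tags": {"politics"}, "outrage": 0.1},
    {"title": "THEY want to destroy your way of life",     "tags": {"politics"}, "outrage": 0.9},
    {"title": "Gardening tips for spring",                 "tags": {"hobbies"},  "outrage": 0.0},
]

feed = sorted(posts, key=lambda p: predicted_engagement(p, user_interests), reverse=True)
print([p["title"] for p in feed])
# The rage-bait headline outranks the consensus story despite an identical
# topic match, and the off-profile post sinks to the bottom.
```

    Note that nobody had to program "show polarizing content" explicitly; it falls out of ranking purely by predicted engagement.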

    News content showing both sides agreeing is already hard to find, since news sites also know that rage-inducing content gets more views and more eyes on their ads, so they primarily focus on controversial content anyway.

    Enough of this over time will make you think that everyone on the Internet agrees with you, since Facebook doesn’t show you the content that those who oppose your ideas are engaging with. This supports an us-vs-them mentality and creates pockets of social media where either left-leaning or right-leaning content is all that’s shown, which breeds political polarization.

    Thanks for coming to my TED Talk, sorry it was so long.

    tl;dr: Social media exists to make its owning companies money; politically polarizing content makes them more money, so in a way social media exists to make politically polarizing content.