If you’re an adult who follows “only young gymnasts, cheerleaders and other teen and preteen influencers active on” Instagram, what other content is the Instagram Reels algorithm likely to recommend that you check out? The answer, according to a recent Wall Street Journal investigation, is “jarring doses of salacious content … including risqué footage of children.”
To understand what’s going on here, let’s step out of the digital world and go “brick and mortar.” Let’s think about that friend who always encourages you to order just one more drink at the bar. This friend can be a lot of fun, in moderation and in adults-only settings. In large doses and in an all-ages setting, this friend can become a dangerous creep — and turn you into one, too.
Let’s call this friend Al. Al knows you, and he knows what you like. Al is out to show you a good time and keep the good times rolling. Al doesn’t know where the line is. Algorithms on social media platforms and search engines typically act like our friend Al.
As the U.S. Supreme Court explained in Twitter v. Taamneh, algorithmically generated recommendations mean that “a person who watches cooking shows on YouTube is more likely to see cooking-based videos and advertisements for cookbooks, whereas someone who likes to watch professorial lectures might see collegiate debates and advertisements for TED Talks.”
Let’s think about what happens when you and Al go watch football at the local high school. It’s fun to relive your glory days, until later when Al follows the students, and you follow Al … to the girls’ locker room. There are the cheerleaders, just off the field, still in their sports bras and athletic shorts. You try to tell yourself there’s nothing wrong with seeing them like that — it’s more than they’d be wearing if you were all at the town pool. But part of you — the part that…