
Study: Social media probably can’t be fixed

Jimmy_James

Wise, Aged Ars Veteran
166
I still think it's the algorithms. Myspace and Facebook were fine back in the aughts. They were great for keeping tabs on the lives of friends that had moved to different cities, especially in the days prior to a visit or a phone call. You could keep yourself sort of in the loop to make your conversations with them better, and to know when you should reach out. But then Facebook moved away from the "wall" and started blasting things at you that it thought you'd want to see. That wasn't what people signed up for originally, and it completely changed the dynamic.
 
Upvote
264 (276 / -12)
Treat them like any other publisher and make them liable for any libel they publish.

Things will get cleaned up real quick.

Unless they have a duty to maintain the truth, they will disregard it for greater engagement. We absolved them of a responsibility given to every other publishing medium. Cable news was raked over the coals for propagating lies about voting machines. They can still lie about other things, but there is an outer boundary they can't cross.
 
Upvote
92 (133 / -41)
How would you effectively model clout chasers and people who post content purely for monetization? Rage baiting for engagement has been around since yellow journalism. Wouldn't a cultural shift need to occur to temper what is occurring in social media? I don't see a way social media will positively change us in the near future. I'm glad there are at least some novel studies looking to model how a fix could be attempted.
 
Upvote
82 (83 / -1)

matheme

Wise, Aged Ars Veteran
122
Subscriptor
How valid is a study that uses as agents LLMs that are (for a large part) trained on exchanges on the existing social media platforms? I would think that these 'agents' have incorporated all the existing flaws of these platforms, and are thus not ideal to check if they would 'behave' differently when structural changes to these platforms are tested for their ability to ameliorate the situation.
 
Upvote
209 (213 / -4)

85mm

Ars Scholae Palatinae
947
Subscriptor++
A lot of these power law distributions could be solved by putting a cap on number of followers. Facebook used to be great for keeping in contact with people I actually knew, family, friends, even acquaintances. Relationships where both parties would recognise each other in whatever sphere we interacted.

I would love to see that sort of distribution tested with this model.
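As a toy sketch of what that cap might do (all numbers here are made-up assumptions, not anything from the study): give a synthetic population a Zipf-like follower distribution, clamp everyone at a cap, and compare how much of the total audience the top 1% of accounts holds before and after.

```python
# Toy illustration of a follower cap flattening a power-law audience
# distribution. The cap value and the Zipf-like synthetic data are
# arbitrary assumptions for demonstration only.

def top_share(followers, top_frac=0.01):
    """Fraction of all follows held by the top `top_frac` of accounts."""
    ranked = sorted(followers, reverse=True)
    k = max(1, int(len(ranked) * top_frac))
    return sum(ranked[:k]) / sum(ranked)

def apply_cap(followers, cap):
    """Clamp every account's follower count at `cap`."""
    return [min(f, cap) for f in followers]

# Zipf-like population: the rank-r account has ~1,000,000 / r followers.
population = [1_000_000 // r for r in range(1, 10_001)]

before = top_share(population)
after = top_share(apply_cap(population, cap=5_000))
print(f"top 1% share of all follows: {before:.2f} -> {after:.2f}")
```

With these invented numbers, the top 1% of accounts go from holding roughly half of all follows to holding around a tenth, which is the kind of shift in the distribution it would be interesting to see the model tested against.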
 
Upvote
117 (119 / -2)

musicssound

Wise, Aged Ars Veteran
166
How valid is a study that uses as agents LLMs that are (for a large part) trained on exchanges on the existing social media platforms? I would think that these 'agents' have incorporated all the existing flaws of these platforms, and are thus not ideal to check if they would 'behave' differently when structural changes to these platforms are tested for their ability to ameliorate the situation.
pretty valid; the users of any new social media are also likely to have been "trained" on existing social media platforms. a deeper change is possible, though. we were having a fantastic time over on Cohost because there was a slow scale-up from a small group with a clear philosophy of interaction, leading many of the users to call it a "social media detox", but we never got that big before it shut down tbh. (screwed over by the payment processors, just like Itch before that became big news: the devs bet all their efforts on a Patreon competitor but realized too late that the payment processors would never accept their policies around adult content, and ran out of money before they could come up with another idea.)

edit: and it's not like we didn't have drama lmao, we did see some similar problems to other social media, just less of them.
 
Upvote
4 (21 / -17)

rgvtim

Smack-Fu Master, in training
81
How would you effectively model clout chasers and people who post content purely for monetization? Rage baiting for engagement has been around since yellow journalism. Wouldn't a cultural shift need to occur to temper what is occurring in social media? I don't see a way social media will positively change us in the near future. I'm glad there are at least some novel studies looking to model how a fix could be attempted.
IDK exactly, but on platforms like Facebook, a simple thumbs down, where your reaction was included in the algorithm that pushes stuff onto a feed, would be a good start. Reddit ain't perfect, but it does seem to take negative reactions into account. Facebook isn't even doing the bare minimum.
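The difference can be shown in a few lines (the weighting and the vote counts below are arbitrary assumptions, not any platform's real formula): score content by raw engagement and a downvote is indistinguishable from an upvote, but fold downvotes in as a penalty and rage bait stops winning.

```python
# Hypothetical illustration: folding negative reactions into the ranking
# score instead of treating every reaction as "engagement".

def engagement_score(ups, downs):
    """Engagement-only ranking: a downvote counts the same as an upvote."""
    return ups + downs

def net_score(ups, downs, down_weight=1.5):
    """Ranking where downvotes actively push content down (weight is arbitrary)."""
    return ups - down_weight * downs

rage_bait = (100, 400)   # lots of angry reactions
cat_photo = (120, 5)

# Under engagement-only ranking the rage bait wins; with downvotes
# counted against it, it loses.
print(engagement_score(*rage_bait) > engagement_score(*cat_photo))  # True
print(net_score(*rage_bait) > net_score(*cat_photo))                # False
```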
 
Upvote
58 (59 / -1)

Jackattak

Ars Tribunus Angusticlavius
6,812
Subscriptor++
Same here. I was able to get off my blood pressure medication.
I have PTSD from military service and was finding myself triggered and thinking bad thoughts far too often.

With a (now) 7 year old little girl I'm trying to raise to be a productive rainbow unicorn in the world, social media was cancer for me.

We're so much happier and healthier.
 
Upvote
174 (175 / -1)

citizencoyote

Ars Scholae Palatinae
1,452
Subscriptor++
So misinformation, often highly polarized information as AI models become more powerful, that content is going to take over. I have a hard time seeing the conventional social media models surviving that.
This is both very bleak and somewhat heartening, in that it sounds like current social media is doomed to collapse and something else will rise to take its place. Maybe that's already happening, with curated spaces like what some are doing with WhatsApp and similar apps. I'm torn on whether this is good or bad, since it can lead to isolation. However, removing the heavy hand of the uber-influencers should help moderate the extremes. As Törnberg said, most people hold fairly similar and moderate views, and I would posit that two random people could find quite a bit of common ground. A group of average people is unlikely to wind up spouting Nazi drivel at one another, and that's how you get people to move away from extremism.

So yeah, burn it all down. Let something emerge from the ashes, hopefully not one dependent on engagement and influencers.
 
Upvote
54 (56 / -2)

jnk1000

Ars Scholae Palatinae
839
Subscriptor++
If one advances the premise that social media is a thing that can or should be fixed, then they are missing the most basic realization about social media.

It can't be fixed and trying to fix it is a fool's errand. It's rotten to the core and the only sensible fix would be to fly it into the Sun or equivalent.
 
Upvote
40 (48 / -8)
After reading Doctorow's "The Internet Con: How to Seize the Means of Computation", which talks at some length about the issues of these monolithic services, I think the best approach is in tearing things back to old-internet days.

The scale at which something like FB operates means that it's bound to be rife with bad actors and effective moderation is impossible. It's full of rot and needs to be uprooted entirely at this point.

I'm thinking like privately hosted servers with FB-like clones that are capped at a few hundred users, and with actual sysop admins involved in keeping the peace and moderating the content. (The problem mainly lies in finding the time to set it up and then in convincing the bulk of your friends group to actually move, of course, but "FB but ad-and-bullshit-free" might be enough of a carrot for some contingent.)
 
Upvote
68 (74 / -6)

bert23

Seniorius Lurkius
30
I deactivated my Facebook account around 4 years ago.

I've logged in two or three times since and I can't believe how bad it is. It's an unusable cesspool.

Not only that but it's a bit of a ghost town. Most people on my friends list haven't posted or updated their pics in years or have deactivated their accounts too.

I'm in the UK and that's my experience anyway.

Not sure what Instagram is like nowadays. TikTok seems to be quite popular but tbh I don't use any of them.

Social media is boring and i'm not interested these days. I use Reddit but I'm a lurker mostly.
 
Upvote
79 (81 / -2)

Case

Ars Tribunus Angusticlavius
6,709
I'm sure I'm overly cynical...but how could anyone be surprised by how social media has turned out?

The web itself should have been a clue. Overrun with ads and hackers from the very start. Give the public the ability to, from the safety of their basements and with no oversight, interact with each other, and OF COURSE it's going to be full of bots, trolls, snakeoil influencers, misinformation and generally hateful assholes (and now AI, wonderful).

It's sad to reread the more idealistic visions of a connected world in various sci-fi novels (from Arthur C. Clarke and many others). A very nice thought; you just need better humans for that to happen. Instead, we've invented tech that is turning "us" into worse ones.

Ironic that I'm posting this on a form of social media, of course, but my only way of coping with the toilet has been to largely climb out of it. Granted, I'm old and it puts me into the luddite-old-guy-who-doesn't-understand-tech category at a glance, but whatever :)

Only way to win is not to play, as I see it. But then my livelihood (thankfully) doesn't require me to have a fucking "brand".
 
Upvote
63 (70 / -7)
The real headline for this article should read something like: “Humanity can’t be fixed, social media serves to accelerate its decline.”

If it weren't for social media being a lifeline for small businesses, I'm fairly sure it would have died off ages ago. Social media is a place where the worst of the internet gets a voice, and frankly that sucks.
 
Upvote
97 (101 / -4)

ars_matey

Smack-Fu Master, in training
5
A lot of these power law distributions could be solved by putting a cap on number of followers. Facebook used to be great for keeping in contact with people I actually knew, family, friends, even acquaintances. Relationships where both parties would recognise each other in whatever sphere we interacted.

I would love to see that sort of distribution tested with this model.
That was Path (I'm not allowed to post a link, I guess). I wish it had been successful; I rather liked it.
 
Upvote
2 (2 / 0)

and-yet-it-grooves

Smack-Fu Master, in training
9
Subscriptor++
What are the primary negative unexpected consequences that have emerged from social media platforms?

Petter Törnberg
: First, you have echo chambers or filter bubbles. The risk of broad agreement is that if you want to have a functioning political conversation, functioning deliberation, you do need to do that across the partisan divide. If you're only having a conversation with people who already agree with each other, that's not enough.
This is a great interview, but the concept of "echo chambers" is still one that I don't understand. Particularly because we only apply it online, and only to these specific sites. If you told a group of Democratic Socialists that whenever they have a meeting they also had to let in a bunch of Proud Boys, you'd be rightfully laughed at.

I wouldn't want to share a room with a bunch of nazis, so why would I want to be on the same social media site as them?
 
Upvote
102 (117 / -15)

iquanyin

Ars Tribunus Militum
1,909
Treat them like any other publisher and make them liable for any libel they publish.

Things will get cleaned up real quick.

Unless they have a duty to maintain the truth, they will disregard it for greater engagement. We absolved them of a responsibility given to every other publishing medium. Cable news was raked over the coals for propagating lies about voting machines. They can still lie about other things, but there is an outer boundary they can't cross.
are newspapers liable for the comment section? no. i'm not interested in repealing section 230. here's what works for me: i spend time curating my feed. i block and mute whenever i run across an asshole or liar. if i find i need to do too much of that on a given platform for using it to be enjoyable, i delete my account for that platform.

making a business liable for what users say (vs paid reporters, ie, its own employees) would just make comment sections go away everywhere. i don't find that a good outcome.
 
Upvote
18 (34 / -16)

muddledzen

Wise, Aged Ars Veteran
395
Subscriptor++
Haven't used social media (outside of comment sections on a couple of websites) in 15 years and I haven't regretted it.

My friends who do are some of the unhappiest people I know, and they're constantly in some sort of outrage cycle about something unimportant that I'm blissfully unaware of.

Side bonus: battery life on my devices is generally 2x to 3x of theirs.
 
Upvote
38 (43 / -5)

chaos215bar2

Ars Tribunus Militum
2,773
Subscriptor++
pretty valid; the users of any new social media are also likely to have been "trained" on existing social media platforms. a deeper change is possible, though. we were having a fantastic time over on Cohost because there was a slow scale-up from a small group with a clear philosophy of interaction, leading many of the users to call it a "social media detox", but we never got that big before it shut down tbh. (screwed over by the payment processors, just like Itch before that became big news: the devs bet all their efforts on a Patreon competitor but realized too late that the payment processors would never accept their policies around adult content, and ran out of money before they could come up with another idea.)

edit: and it's not like we didn't have drama lmao, we did see some similar problems to other social media, just less of them.
The difference is, actual humans can learn and change their behavior over time. Chatbots trained on current social media postings ultimately will never change their overall behavior, they'll just respond differently given different inputs.

It strikes me that this team seemingly did not test the simplest yet most drastic change possible: don't promote content a user hasn't already subscribed to in some way. If Facebook, say, were just a chronological feed of posts from friends, it's hard to imagine most of the real issues impacting users wouldn't be solved.

You're still going to get "elite" posters with disproportionately high numbers of subscribers / "friends", i.e. celebrities, but it's not really clear that's a problem that actually needs to be solved.
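The "subscribed-only, newest first" feed described above is short enough to state directly; this is a minimal sketch, and the field names (author, timestamp) are hypothetical, not any platform's schema.

```python
# Minimal sketch of a "no promoted content" feed: only posts from accounts
# the viewer follows, newest first; nothing is injected or ranked.

def chronological_feed(posts, following):
    """Return followed-only posts in reverse chronological order."""
    return sorted(
        (p for p in posts if p["author"] in following),
        key=lambda p: p["timestamp"],
        reverse=True,
    )

posts = [
    {"author": "alice", "timestamp": 3, "text": "trip photos"},
    {"author": "viral_page", "timestamp": 5, "text": "rage bait"},
    {"author": "bob", "timestamp": 1, "text": "new job!"},
]
feed = chronological_feed(posts, following={"alice", "bob"})
print([p["author"] for p in feed])  # -> ['alice', 'bob']
```

The design choice is the point: with no ranking step, there is no engagement signal for rage bait to exploit, because the viral_page post never enters the feed at all.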
 
Upvote
48 (52 / -4)

iquanyin

Ars Tribunus Militum
1,909
This is a great interview, but the concept of "echo chambers" is still one that I don't understand. Particularly because we only apply it online, and only to these specific sites. If you told a group of Democratic Socialists that whenever they have a meeting they also had to let in a bunch of Proud Boys, you'd be rightfully laughed at.

I wouldn't want to share a room with a bunch of nazis, so why would I want to be on the same social media site as one?
i think it's right-wing claptrap that passed unexamined (by critical thinking) into mainstream thought. humans prefer to mostly or always spend time on what they are drawn to. i realize that being online adds an unfortunate element to this not found offline: paid, bad-faith, "i'm in it for the money" types worldwide can fool the unwary into thinking they are just like their offline buddies. that's much harder to do offline, and generally happens only with things like undercover investigations, or spy stuff for key people in government.

but that's life. if we have enough time as a species, most will learn to spot the crap and not engage. some won't but ...we cannot control everything. either we do adapt in a (fairly) healthy way or we don't. lotsa words to say this: i agree with you.
 
Upvote
-17 (2 / -19)

CUclimber

Ars Legatus Legionis
19,535
Subscriptor
IDK exactly, but on platforms like Facebook, a simple thumbs down, where your reaction was included in the algorithm that pushes stuff onto a feed, would be a good start. Reddit ain't perfect, but it does seem to take negative reactions into account. Facebook isn't even doing the bare minimum.
I'm not sure how most people use Reddit, but I have mine filtered down and subscribed to a bunch of special interests that are well-moderated, and it's wonderful. Rock climbing, 3d printing, some narrowly-focused subreddits on photography gear... I get a huge value from it and I rarely get sucked in to the toxicity that is all over the site on the big default subreddits.

You can't filter or subscribe to categorized content on most platforms, so you get that Bo Burnham "a little bit of everything, all of the time" situation, which is just an all-around disaster.
 
Upvote
68 (68 / 0)

Veritas super omens

Ars Legatus Legionis
25,376
Subscriptor++
are newspapers liable for the comment section? no. i'm not interested in repealing section 230. here's what works for me: i spend time curating my feed. i block and mute whenever i run across an asshole or liar. if i find i need to do too much of that on a given platform for using it to be enjoyable, i delete my account for that platform.

making a business liable for what users say (vs paid reporters, ie, its own employees) would just make comment sections go away everywhere. i don't find that a good outcome.
While it is not an optimal solution, the outcome of the current system will be worse, much worse.
 
Upvote
1 (7 / -6)

poltroon

Ars Tribunus Militum
1,832
Subscriptor
Interesting interview and I really appreciate the digging deeper here past the initial study.

We have defined "social media" to mean platforms à la Facebook and other very large entities, but I don't see this kind of toxicity in smaller groups that are moderated by people. In part I think that's because the moderators basically tamp down people who write outrageous, untrue things.

The large social media entities get large specifically because they want and tolerate and benefit from these attention-seekers and say "oh noes it's too hard to have humans read it and make judgements." But that's honestly false: they have specifically chosen to make that hard and to benefit from unfettered growth, like cancer, over healthy conversation.

I appreciate the mention of Bluesky coming in but Bluesky (a) isn't moderated and (b) for me it's not a bad experience at all. It shows me people I follow, and I don't follow crazy folk.

I appreciate that the authors are thoughtful about the use of LLMs and also that it's hard to do without LLMs - but given that LLMs are so thoroughly trained in this environment, I wonder how much this is just them reproducing what already exists.

Reddit is an interesting place to study because it's the same underlying tech and rules managing multiple communities. Some are pleasant and some are toxic. What differentiates them?

Even Facebook, inside private groups, can be perfectly useful and valuable, when they have good human moderation.

I've moderated and written moderation tools for medium sized communities. If you're willing to throw the banhammer, they don't turn to slop.
 
Upvote
82 (84 / -2)

GMBigKev

Ars Praefectus
5,228
Subscriptor
Not really; just stop using social media.

Amen. And nothing valuable would be lost.

Sure it can, just don't use it.

Problem is that even if you don't use it yourself there's no real way to avoid it entirely. How much news media regurgitates Twitter / Bluesky? How many headlines have been printed based on the latest thing some politician or CEO has said on their Social Media profile? How many times have you been linked a social media profile in order to make a complaint about something?

Not personally using it doesn't stop the issues involved.
 
Upvote
76 (80 / -4)

efbrazil

Wise, Aged Ars Veteran
123
I don’t like seeing “social media” treated as though all social platforms are just variants of social graphs like facebook or twitter. Is wikipedia social media? What about reddit, which is based around groups and not identity? What about arstechnica discussion forums? If the goal is to be constructive, then look at building on “social media” models that capture some of the attributes the authors judge as good.
 
Upvote
60 (60 / 0)

metavirus

Ars Scholae Palatinae
642
Subscriptor++
Problem is that even if you don't use it yourself there's no real way to avoid it entirely. How much news media regurgitates Twitter / Bluesky? How many headlines have been printed based on the latest thing some politician or CEO has said on their Social Media profile? How many times have you been linked a social media profile in order to make a complaint about something?

Not personally using it doesn't stop the issues involved.
Not personally using it, multiplied by 100,000,000 other people not personally using it, would certainly help. No single drop in the bucket causes the bucket to overflow.
 
Upvote
23 (27 / -4)

GMBigKev

Ars Praefectus
5,228
Subscriptor
This is a great interview, but the concept of "echo chambers" is still one that I don't understand. Particularly because we only apply it online, and only to these specific sites. If you told a group of Democratic Socialists that whenever they have a meeting they also had to let in a bunch of Proud Boys, you'd be rightfully laughed at.

I wouldn't want to share a room with a bunch of nazis, so why would I want to be on the same social media site as them?

I agree with this. Am I part of an "echo chamber" because I don't want to explain to the four-hundredth Nazi that trans people deserve the right to self-determine? I curate my IRL friends so that we're all progressive, queer, goth/metal-enjoying nerds. Should I be forced to include some right-wing bigot in my friends group?
 
Upvote
46 (61 / -15)
Friendly reminder that this comment section is "social media" by many definitions. And has many of the characteristics of other web forums and news comment sections.

Anyway, I'm mostly surprised/intrigued by the use of LLMs to model human behavior. Obviously it is a big assumption that LLMs will act like humans in a novel environment. From now on are we going to see tons of LLM psychology experiments?
 
Upvote
46 (46 / 0)

GMBigKev

Ars Praefectus
5,228
Subscriptor
Not personally using it, multiplied by 100,000,000 other people not personally using it, would certainly help. No single drop in the bucket causes the bucket to overflow.

Yes, well, if you have a strategy for getting 100 million people to stop using social media, I'd be interested to hear it.
 
Upvote
36 (36 / 0)