"The [structural] mechanism producing these problematic outcomes is really robust and hard to resolve."
> Or should we nuke our social media accounts altogether?

Yes.
> Not really; just stop using social media.

Amen. And nothing valuable would be lost.
> Yes.

I'm three years into life without social media and so far it's been one of the best decisions I've ever made.
> I'm three years into life without social media and so far it's been one of the best decisions I've ever made.

Same here. I was able to get off my blood pressure medication.
> How valid is a study that uses as agents LLMs that are (for a large part) trained on exchanges on the existing social media platforms? I would think that these 'agents' have incorporated all the existing flaws of these platforms, and are thus not ideal to check if they would 'behave' differently when structural changes to these platforms are tested for their ability to ameliorate the situation.

pretty valid, the users of any new social media are also likely to have been "trained" on existing social media platforms. a deeper change is possible, we were having a fantastic time over on Cohost because there was a slow scale up from a small group with a clear philosophy of interaction, leading to many of the users calling it a "social media detox", but we never got that big before it shut down tbh. (screwed over by the payment processors just like Itch before that became big news, the devs bet all their efforts on a Patreon competitor but realized too late that the payment processors would never accept their policies around adult content and ran out of money before they could come up with another idea.)
> How would you effectively model clout chasers and people who post content purely for monetization? Rage baiting for engagement has been around since yellow journalism. Wouldn't a cultural shift need to occur to temper what is occurring in social media? I don't see a way where social media will positively change us in the near future. I'm glad at least there are some novel studies that are looking to model how it can at least be attempted.

IDK exactly, but on platforms like Facebook a simple thumbs down, where your reaction was included in the algorithms that push stuff onto a feed, would be a good start. Reddit ain't perfect, but it does seem to take negative reactions into account. Facebook is not even doing the bare minimum.
> Same here. I was able to get off my blood pressure medication.

I have PTSD from military service and was finding myself triggered and thinking bad thoughts far too often.
> So misinformation, often highly polarized information as AI models become more powerful, that content is going to take over. I have a hard time seeing the conventional social media models surviving that.

This is both very bleak and somewhat heartening, in that it sounds like current social media is doomed to collapse and something else will rise to take its place. Maybe that's already happening, with curated spaces like what some are doing with WhatsApp and other platforms. I'm torn on whether this is good or bad, since it can lead to isolation. However, removing the heavy hand of the uber-influencers should help moderate the extremes. As Törnberg said, most people hold fairly similar and moderate views, and I would posit that two random people could find quite a bit of common ground. A group of average people is unlikely to wind up spouting Nazi drivel at one another, and that's how you get people to move away from extremism.
> A lot of these power law distributions could be solved by putting a cap on the number of followers. Facebook used to be great for keeping in contact with people I actually knew: family, friends, even acquaintances. Relationships where both parties would recognise each other in whatever sphere we interacted.

That was Path (I'm not allowed to post a link, I guess); I wish it had been successful, I rather liked it.
I would love to see that sort of distribution tested with this model.
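As a toy sketch of how a follower cap could be tested against a power-law distribution (not from the article; the cap of 150 and the tail exponent are arbitrary illustrative choices): draw heavy-tailed follower counts, clip them at the cap, and compare what share of all follows the top 1% of accounts holds.

```python
import random

random.seed(0)

def follower_counts(n, alpha=1.8):
    # Heavy-tailed (Pareto-like) follower counts via inverse-transform sampling.
    return [int(1 / (1 - random.random()) ** (1 / alpha)) for _ in range(n)]

def top_share(counts, frac=0.01):
    # Fraction of all follows held by the top `frac` of accounts.
    ranked = sorted(counts, reverse=True)
    k = max(1, int(len(ranked) * frac))
    return sum(ranked[:k]) / sum(ranked)

counts = follower_counts(100_000)
capped = [min(c, 150) for c in counts]  # hypothetical hard cap on followers

print(f"top 1% share, uncapped: {top_share(counts):.2f}")
print(f"top 1% share, capped:   {top_share(capped):.2f}")
```

Since the cap only trims the largest accounts, the top 1% share always shrinks; a fuller test would plug the capped network into an agent-based model like the one in the article.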
> What are the primary negative unexpected consequences that have emerged from social media platforms?
>
> Petter Törnberg: First, you have echo chambers or filter bubbles. There is broad agreement that if you want to have a functioning political conversation, functioning deliberation, you do need to do that across the partisan divide. If you're only having a conversation with people who already agree with each other, that's not enough.

This is a great interview, but the concept of "echo chambers" is still one that I don't understand. Particularly because we only apply it online, and only to these specific sites. If you told a group of Democratic Socialists that whenever they have a meeting they also had to let in a bunch of Proud Boys, you'd be rightfully laughed at.
> Treat them like any other publisher and make them liable for any libel they publish.
>
> Things will get cleaned up real quick.
>
> Unless they have a duty to maintain the truth, they will disregard it for greater engagement. We absolved them of a responsibility given to every other publishing medium. Cable news was raked over the coals for propagating lies about voting machines. They can still lie about other things, but there is an outer boundary they can't cross.

are newspapers liable for the comment section? no. i'm not interested in repealing section 230. here's what works for me: i spend time curating my feed. i block and mute whenever i run across an asshole or liar. if i find i need to do too much of that on a given platform for using it to be enjoyable, i delete my account for that platform.
> pretty valid, the users of any new social media are also likely to have been "trained" on existing social media platforms. a deeper change is possible, we were having a fantastic time over on Cohost because there was a slow scale up from a small group with a clear philosophy of interaction, leading to many of the users calling it a "social media detox", but we never got that big before it shut down tbh. (screwed over by the payment processors just like Itch before that became big news, the devs bet all their efforts on a Patreon competitor but realized too late that the payment processors would never accept their policies around adult content and ran out of money before they could come up with another idea.)
>
> edit: and it's not like we didn't have drama lmao, we did see some similar problems to other social media, just less of them.

The difference is, actual humans can learn and change their behavior over time. Chatbots trained on current social media postings ultimately will never change their overall behavior; they'll just respond differently given different inputs.
> This is a great interview, but the concept of "echo chambers" is still one that I don't understand. Particularly because we only apply it online, and only to these specific sites. If you told a group of Democratic Socialists that whenever they have a meeting they also had to let in a bunch of Proud Boys, you'd be rightfully laughed at.
>
> I wouldn't want to share a room with a bunch of nazis, so why would I want to be on the same social media site as one?

i think it's righting claptrap that passed unexamined (by critical thinking) into a mainstream idea. humans prefer to mostly or always spend time on what they are drawn to. i realize that online adds an unfortunate element to this not found offline: paid and bad faith and "i'm in it for the money" kinds worldwide can fool the unwary into thinking they are just like their offline buddies. that's much harder to do offline, and generally happens only with things like undercover investigations, or spy stuff for key people in government.
> IDK exactly, but on platforms like Facebook a simple thumbs down where your reaction was included in the algorithms that pushed stuff onto a feed would be a good start. Reddit ain't perfect, but it does seem to take negative reactions into account. Facebook is not even doing a bare minimum.

I'm not sure how most people use Reddit, but I have mine filtered down and subscribed to a bunch of special interests that are well-moderated, and it's wonderful. Rock climbing, 3D printing, some narrowly focused subreddits on photography gear... I get huge value from it and I rarely get sucked into the toxicity that is all over the site on the big default subreddits.
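No one here knows any platform's actual ranking code, but the difference the commenters are pointing at can be sketched in a few lines (all names and numbers hypothetical): a score that counts every reaction as engagement rewards rage bait, while a score that lets downvotes subtract does not.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int
    comments: int

def engagement_score(p: Post) -> float:
    # "All engagement is good" ranking: outrage rises to the top.
    return p.upvotes + p.downvotes + p.comments

def feedback_score(p: Post, penalty: float = 2.0) -> float:
    # Negative reactions actively push a post down instead of boosting it.
    return p.upvotes + p.comments - penalty * p.downvotes

posts = [
    Post("rage bait", upvotes=50, downvotes=400, comments=300),
    Post("climbing trip report", upvotes=120, downvotes=5, comments=40),
]

for score in (engagement_score, feedback_score):
    ranked = sorted(posts, key=score, reverse=True)
    print(score.__name__, "->", [p.title for p in ranked])
```

Under the first score the rage-bait post ranks first; under the second it falls to the bottom, which is roughly the "thumbs down that the algorithm actually respects" being asked for above.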
> are newspapers liable for the comment section? no. i'm not interested in repealing section 230. here's what works for me: i spend time curating my feed. i block and mute whenever i run across an asshole or liar. if i find i need to do too much of that on a given platform for using it to be enjoyable, i delete my account for that platform.
>
> making a business liable for what users say (vs paid reporters, ie, its own employees) would just make comment sections go away everywhere. i don't find that a good outcome.

While it is not an optimal solution, the outcome of the current system will be worse, much worse.
Sure it can, just don't use it.
> Problem is that even if you don't use it yourself there's no real way to avoid it entirely. How much news media regurgitates Twitter / Bluesky? How many headlines have been printed based on the latest thing some politician or CEO has said on their social media profile? How many times have you been linked a social media profile in order to make a complaint about something?
>
> Not personally using it doesn't stop the issues involved.

Not personally using it x 100,000,000 persons not personally using it would certainly help. No single drop in the bucket causes the bucket to overflow.