censorship and content moderation are difficult problems. i don’t think the solution is a handful of american corporations being in charge of, and thought-policing, large chunks of our online communication.

leaving moderation to centralized entities clearly isn’t working, but decentralization also prevents effective censorship. so maybe these are not the tools we should be using.

the problem affects more than just criminal use, and it’s growing beyond the political, harming our collective mental health. social media is very hard to moderate without raising ethical questions, and impossible to moderate without introducing subjective bias.

we need to better understand how ideas—or weaponized memes—spread throughout the system, and come up with countermeasures to mitigate the system’s amplifying and accelerating effects, regardless of message content.

it’s not even a new problem. we’ve failed to mitigate the dangers of mass media from the start.

i certainly don’t expect any solutions from the likes of facebook or google, because their very business model is part of the problem.

what social media platforms do right now is segregate people into ideological bubbles, causing the well-known echo chamber effect, which leads to radicalization, atomization, and people losing touch with reality. i think this is the wrong way to go about it.

maybe the way to limit the spread of information should be based more on geographical closeness than closeness of interests.

another factor to consider is that people tend to form parasocial relationships with influential personalities the moment they click the follow button.
