PRIVACY + SECURITY BLOG

News, Developments, and Insights

It is hard to imagine a world without social media. People increasingly rely on social media to maintain friendships, share photos and happenings with family, and keep current with the news. But there's a dark side: more superficial relationships, cyberbullying, harassment, hate speech, and manipulation. Social media has become a cesspool of lies and misinformation campaigns, a place where radicalized hate groups can spread their venom, recruit more members, and rally their followers to attack.

Several prominent social media sites are struggling to figure out what to do. In the early days of the commercial Internet (mid-1990s through early 2000s), idealists pushed a vision of the Internet as a free speech zone. Bad speech would be countered and beaten by good speech, lies would be defeated by truth, and freedom and happiness would reign. Platforms could simply remain neutral and rarely intervene, letting the battles be fought in the faith that the forces of good would eventually win out over the forces of evil.

But this view is naive. Over the past 10 to 15 years, we have seen lies, hate, harassment, defamation, invasion of privacy, and many other social ills fester online. Social media platforms must wake up and recognize that reality has not lived up to the earlier idealism. A position of neutrality is no longer appropriate. Platforms must intervene more; they must govern.

Social media platforms currently have little experience or skill with governance. They don't have enough personnel with the background to formulate wise rules, procedures, and due process. But the call for platforms to govern is growing louder, and they can't keep avoiding it. This is why social media companies should start hiring more people from the humanities, who often have a background in thinking through complicated moral and philosophical issues.

* * * *

This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy and data security training. He also posts on his LinkedIn blog, which has more than 1 million followers.

Professor Solove is the organizer, along with Paul Schwartz, of the annual Privacy + Security Forum events.
