Online Reputation Management for Crowd-Sourcing Platforms: Cleaning up After the “Bewildered Herd”
As the Web 2.0 model has shifted toward content generated by users (often referred to as “crowdsourcing”) rather than by administrators, it has presented a fairly novel problem: proofing the contributions of the masses.
The “Bewildered Herd” is a phrase attributed to Walter Lippmann, one of the early scholars of journalism and public relations. Lippmann contended that the public was essentially too inept to govern itself, and that society could function only if smart people made up its mind for it. To wit:
“The public must be put in its place, so that it may exercise its own powers, but no less and perhaps even more, so that each of us may live free of the trampling and the roar of a bewildered herd.”
(Walter Lippmann, Public Opinion, 1922)
Crowdsourcing is a term coined by Jeff Howe of Wired, and Clay Shirky has explained the concept at length.
On the whole, user-generated contributions are amazingly effective and have accomplished an enormous amount of the work of building the Internet. Occasionally, though, there are problems. Here are some of the sites I try to watch regularly for inaccuracies and misinformation:
- Google Local
- Yahoo Answers
- Google Sidewiki
Which crowdsourcing sites do you monitor for inaccuracies?