Sharing Information Corrupts Wisdom of Crowds
By Brandon Keim
When people can learn what others think, the wisdom of crowds may veer towards ignorance.
In a new study of crowd wisdom — the statistical phenomenon by which individual biases cancel each other out, distilling hundreds or thousands of individual guesses into uncannily accurate average answers — researchers told test participants about their peers’ guesses. As a result, their group insight went awry.
“Although groups are initially ‘wise,’ knowledge about estimates of others narrows the diversity of opinions to such an extent that it undermines collective wisdom,” wrote researchers led by mathematician Jan Lorenz and sociologist Heiko Rauhut of Switzerland’s ETH Zurich, in Proceedings of the National Academy of Sciences on May 16. “Even mild social influence can undermine the wisdom of crowd effect.”
The effect — perhaps better described as the accuracy of crowds, since it best applies to questions involving quantifiable estimates — has been described for decades, beginning with Francis Galton’s 1907 account of fairgoers guessing an ox’s weight. It reached mainstream prominence with economist James Surowiecki’s 2004 bestseller, The Wisdom of Crowds.
[Figure: Study participants were asked how many murders occurred in Switzerland in 2006. At the end of each round of questioning, they were given small payments for coming close to the actual answer (gray bar). At left, the range of responses among participants who received no information about others’ guesses.]
As Surowiecki explained, certain conditions must be met for crowd wisdom to emerge. Members of the crowd ought to have a variety of opinions, and to arrive at those opinions independently.
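When those two conditions hold, simple averaging is surprisingly powerful. The toy simulation below (made-up numbers, not the study's data) sketches why: if 1,000 people guess independently with large but symmetric error, the errors largely cancel, and the mean of the guesses lands far closer to the truth than a typical individual does.

```python
import random
import statistics

random.seed(0)
TRUTH = 200.0  # hypothetical quantity being estimated (illustrative only)

# 1,000 independent guesses with large but symmetric individual error,
# so individual biases tend to cancel in the average.
guesses = [TRUTH + random.gauss(0.0, 80.0) for _ in range(1000)]

typical_individual_error = statistics.mean(abs(g - TRUTH) for g in guesses)
crowd_error = abs(statistics.mean(guesses) - TRUTH)

print(f"typical individual error: {typical_individual_error:.1f}")
print(f"error of the crowd mean:  {crowd_error:.1f}")
```

The crowd mean's error shrinks roughly with the square root of the group size, which is the statistical core of the effect.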
Take those away, and crowd intelligence fails, as evidenced in some market bubbles. Computer modeling of crowd behavior also hints at the dynamics underlying crowd breakdowns, with the balance between information flow and diverse opinions becoming skewed.
Lorenz and Rauhut’s experiment fits between large-scale, real-world messiness and theoretical investigation. They recruited 144 students from ETH Zurich, sat them in isolated cubicles, and asked them to guess Switzerland’s population density, the length of its border with Italy, the number of new immigrants to Zurich, and how many crimes were committed in 2006.
After answering, test subjects were given a small monetary reward based on their answer’s accuracy, then asked again. This proceeded for four more rounds; some students never learned what their peers guessed, while others were told.
As testing progressed, the average answers of independent test subjects became more accurate, in keeping with the wisdom-of-crowds phenomenon. Socially influenced test subjects, however, actually became less accurate.
The researchers attributed this to three effects. The first they called the “social influence effect”: opinions became less diverse. The second was the “range reduction effect”: in mathematical terms, the correct answer drifted toward the edges of the group’s range of guesses. Exacerbating it all was the “confidence effect,” in which students became more certain about their guesses.
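A small simulation (again with invented numbers, not the experiment's data) illustrates how the first two effects can interact. Start with a skewed crowd whose typical guess is near the truth but whose average runs high, then let each participant repeatedly shift toward the group mean, a crude stand-in for social influence. The spread of opinions collapses while the group's average error does not improve, and the truth ends up outside the narrowed range of guesses.

```python
import random
import statistics

random.seed(1)
TRUTH = 200.0  # hypothetical quantity being estimated (illustrative only)

# A skewed but individually reasonable crowd: the typical guess sits near
# the truth, but a long right tail pulls the arithmetic mean above it.
guesses = [TRUTH * random.lognormvariate(0.0, 0.6) for _ in range(200)]
initial_spread = statistics.stdev(guesses)

for _ in range(8):  # eight rounds of feedback about the group estimate
    group_mean = statistics.mean(guesses)
    # Social influence: each participant moves halfway toward the group mean.
    guesses = [g + 0.5 * (group_mean - g) for g in guesses]

final_spread = statistics.stdev(guesses)
truth_in_range = min(guesses) <= TRUTH <= max(guesses)

print(f"spread of opinions: {initial_spread:.1f} -> {final_spread:.1f}")
print(f"group mean: {statistics.mean(guesses):.1f} (truth {TRUTH})")
print(f"truth still inside the range of guesses? {truth_in_range}")
```

Because this update rule leaves the group mean unchanged, the crowd grows dramatically more unanimous without growing any more accurate, a rough analogue of diversity loss plus range reduction.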
“The truth becomes less central if social influence is allowed,” wrote Lorenz and Rauhut, who think this problem could be intensified in markets and politics, systems that rely on collective assessment.
“Opinion polls and the mass media largely promote information feedback and therefore trigger convergence of how we judge the facts,” they wrote. The wisdom of crowds is valuable, but used improperly it “creates overconfidence in possibly false beliefs.”