The social influence bias is an asymmetric herding effect on online social media platforms: users overcompensate for negative ratings but amplify positive ones. Positive social influence accumulates and can result in a rating bubble, while negative social influence is neutralized by crowd correction. The phenomenon was first described in a 2014 paper by Lev Muchnik, Sinan Aral and Sean J. Taylor; the question was later revisited by Cicognani et al., whose experiment reinforced Muchnik and his co-authors' results.
Online customer reviews are trusted sources of information in many contexts, such as online marketplaces, dining, accommodation, movies, and digital products. However, these online ratings are not immune to herd behavior: subsequent reviews are not independent of each other. Because preceding opinions are visible to a new reviewer on many such sites, his or her judgment of a given product, service, or online content can be heavily influenced by the earlier evaluations. This form of herding behavior inspired Muchnik, Aral and Taylor to conduct their experiment on influence in social contexts.
Muchnik, Aral and Taylor designed a large-scale randomized experiment to measure social influence on user reviews. The experiment was conducted on a social news aggregation website similar to Reddit. Over the 5-month study, the authors randomly assigned 101 281 comments to one of the following treatment groups: up-treated (4 049), down-treated (1 942), or control (the proportions reflected the observed ratio of up- and down-votes). Comments in the first group were given an up-vote upon creation, those in the second group a down-vote upon creation, and the comments in the control group remained untouched. A vote is equivalent to a single rating (+1 or −1). Because other users are unable to trace a user's votes, they were unaware of the experiment. Due to randomization, comments in the control and treatment groups did not differ in expected rating. The treated comments were viewed more than 10 million times and rated 308 515 times by successive users.
The up-vote treatment increased the probability that the first viewer would up-vote by 32% over the control group (Figure 1A), while the probability of down-voting did not change, meaning that users did not correct the random positive rating. The upward bias remained in place for the observed 5-month period. The accumulating herding effect increased a comment's mean rating by 25% compared to the control group (Figure 1C). Positively manipulated comments received higher ratings at all parts of the distribution, so they were also more likely to collect extremely high scores. The negative manipulation created an asymmetric herd effect: although the negative treatment increased the probability of subsequent down-votes, the probability of up-voting also grew for these comments. The community performed a correction that neutralized the negative treatment and resulted in final mean ratings no different from the control group's. The authors also compared the final mean scores of comments across the most active topic categories on the website. The positive herding effect was present in the "politics," "culture and society," and "business" categories, but not in "economics," "IT," "fun," and "general news".
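The asymmetric dynamic described above can be illustrated with a toy Monte Carlo simulation. All probabilities and multipliers below are invented for illustration; they are not the parameters estimated by Muchnik, Aral and Taylor, and the model is only a minimal sketch of the mechanism: a visible positive score attracts extra up-votes (herding), while a visible negative score attracts both extra down-votes and corrective up-votes.

```python
import random

# Assumed baseline per-viewer vote probabilities (illustrative only)
P_UP, P_DOWN = 0.10, 0.05
HERD = 1.32     # up-vote boost while the visible score is positive (assumed)
CORRECT = 2.0   # corrective up-vote boost while the score is negative (assumed)
PILE_ON = 1.3   # extra down-voting on negatively scored comments (assumed)

def simulate_comment(treatment, n_voters=200, seed=0):
    """Final score of one comment given its treatment at creation."""
    rng = random.Random(seed)
    score = {"up": 1, "down": -1, "control": 0}[treatment]
    for _ in range(n_voters):
        p_up, p_down = P_UP, P_DOWN
        if score > 0:        # positive herding accumulates uncorrected
            p_up *= HERD
        elif score < 0:      # crowd correction: more up-votes AND more down-votes
            p_up *= CORRECT
            p_down *= PILE_ON
        r = rng.random()
        if r < p_up:
            score += 1
        elif r < p_up + p_down:
            score -= 1
    return score

def mean_score(treatment, trials=2000):
    return sum(simulate_comment(treatment, seed=i) for i in range(trials)) / trials
```

Averaged over many simulated comments, the up-treated mean ends above the control mean, while the down-treated mean ends close to the control mean because the corrective up-votes absorb the initial down-vote, mirroring the asymmetry reported in the study.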
The skewed nature of online ratings makes review outcomes different from what they would be without the social influence bias. A 2009 experiment by Hu, Zhang and Pavlou showed that the distribution of reviews of a product made by unconnected individuals is approximately normal, yet the rating of the same product on Amazon followed a J-shaped distribution with twice as many five-star ratings as other scores. Cicognani, Figini and Magnani came to similar conclusions after their experiment on a tourism services website: positive preceding ratings influenced raters' behavior more than mediocre ones. Because positive herding goes uncorrected while negative ratings are neutralized, community-based opinions end up upward-biased.
Main article: Media bias § Social media bias
Social media bias is also reflected in the hostile media effect. Social media plays a role in disseminating news in modern society, where viewers are exposed to other people's comments while reading news articles. In their 2020 study, Gearhart and her team showed that viewers' perceptions of bias increased, and their perceptions of credibility decreased, after seeing comments that conflicted with their own opinions.
Yu-Ru and Wen-Ting's research examines how liberals and conservatives conducted themselves on Twitter after three mass shooting events. Although both groups expressed negative emotions towards the incidents, they differed in the narratives they pushed: the two sides often contrasted in what they identified as the root cause and in who they deemed the victims, heroes, and villains. There was also a decrease in conversation that was considered proactive.
Media bias is also reflected in the search systems of social media. Kulshrestha and her team found in a 2018 study that the top-ranked results returned by these search systems can influence users' perceptions when they search for events or people, an effect that is particularly pronounced for political and polarizing topics.