Consequences of homophily in online networks

Consider yourself and your closest group of friends. The chances are high that you and your friends are similar in several respects: you share a common interest, you might be the same age, maybe you are coworkers, or maybe you live in the same city. This is an essential principle in social network theory called homophily. Homophily means that people are more likely to form friendships with other people who are similar to themselves. These similarities can be either immutable (unchangeable), like ethnicity or nationality, or mutable (changeable), like interests or beliefs.
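Easley and Kleinberg (2010) describe a simple way to check for homophily in a network: if a fraction p of people belong to one group and a fraction q to the other, randomly formed friendships would make roughly 2pq of all ties cross-group, so observing noticeably fewer cross-group ties is evidence of homophily. The sketch below illustrates that test on a small, invented friendship list; the names and ties are purely hypothetical.

```python
# A minimal sketch of the homophily test described by Easley & Kleinberg (2010):
# with a p-fraction in group A and a q-fraction in group B, random friendships
# would make roughly 2*p*q of all edges cross-group. Noticeably fewer
# cross-group edges suggests homophily. The toy network below is invented.

def homophily_test(edges, group):
    """edges: list of (person, person) pairs; group: dict person -> group label."""
    people = list(group)
    p = sum(1 for person in people if group[person] == "A") / len(people)
    q = 1 - p
    expected_cross = 2 * p * q                       # expected share if ties were random
    cross = sum(1 for a, b in edges if group[a] != group[b])
    observed_cross = cross / len(edges)
    return observed_cross, expected_cross

edges = [("ann", "bea"), ("ann", "cal"), ("bea", "cal"), ("dan", "eve"), ("cal", "dan")]
group = {"ann": "A", "bea": "A", "cal": "A", "dan": "B", "eve": "B"}

observed, expected = homophily_test(edges, group)
print(f"cross-group edges: {observed:.2f} observed vs {expected:.2f} expected")
# An observed share well below the expected share is evidence of homophily.
```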

So what happens when people only interact with others who are alike in ethnicity, age, social class or political opinion?

“Homophily limits people’s social worlds in a way that has powerful implications for the information they receive, the attitudes they form, and the interactions they experience” (McPherson et al., 2001, p. 415)

In online social networks like Facebook and Twitter, when people only interact with other people who share identical views or opinions, they may end up in what is called an “echo chamber”. An echo chamber forms when a group of people voice the same opinions without ever encountering opposing arguments. Such groups can also act as focal points for other people wanting to engage in these discussions.

This increases polarization around hotly debated topics such as politics, where each side encounters a flood of opinions that only support its own point of view.

In real life it is natural for people to connect more often with similar people, but on social media this tendency has become exaggerated, with many platforms encouraging it by enabling users to shut out unwanted information.

Both Facebook and Google use algorithms that personalize content to match what the user usually browses: search results on Google, or group recommendations and “people you may know” suggestions on Facebook. With these algorithms in place it is very easy to become trapped in a filter bubble, where browsing history and search personalization further restrict the information a person receives on a polarized subject.
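Neither Facebook nor Google publishes its ranking code, so the following is only a caricature of engagement-driven personalization, not their actual algorithms: items are scored by how often the user has already clicked on their topic, which means a few early clicks quickly narrow what the feed shows.

```python
# A deliberately simplified caricature of engagement-driven personalization
# (not Facebook's or Google's real algorithm): items are ranked by how often
# the user has already clicked on their topic, so early clicks narrow what is
# shown later. Topics and items are invented for illustration.

import random
from collections import Counter

ITEMS = [("politics-left", i) for i in range(50)] + \
        [("politics-right", i) for i in range(50)] + \
        [("sports", i) for i in range(50)]

def recommend(click_history, n=5):
    """Rank items by how much the user has engaged with each topic."""
    topic_counts = Counter(topic for topic, _ in click_history)
    scored = [(topic_counts[topic] + random.random(), (topic, item))
              for topic, item in ITEMS]
    return [pair for _, pair in sorted(scored, reverse=True)[:n]]

# Simulate a user who starts with one political click and then always clicks
# the top-ranked result in their feed.
history = [("politics-left", 0)]
for _ in range(20):
    feed = recommend(history)
    history.append(feed[0])

print(Counter(topic for topic, _ in history))
# After a few rounds the feed is dominated by a single topic.
```

Even this crude feedback loop collapses the feed onto one topic within a handful of rounds, which is the essence of the filter bubble Pariser (2011) describes.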

How do we prevent further polarization and make information from both sides more available? One improvement would be for platforms to display articles or information presenting the other side of a contested topic whenever a user is viewing content from one side.

Another solution would be for people to connect with others holding opposing views, but this only works when a person is already aware of the problem in the first place. That is difficult, since we can presume the average person is not well informed about how homophily and social media algorithms work, or how they shape what appears in their news feeds.

However, if just one person in a polarized group alters their views toward the opposite position, they can affect other people in the group. This is because of social influence: a person in a homophilous network can, over time, influence other members of the group to modify their own mutable characteristics. This could shift the flow of information and expose people to new and different views.
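As a toy illustration of this dynamic, the sketch below uses a simple opinion-averaging update (a DeGroot-style model, which is my own assumption rather than something taken from the sources): every member of a small, fully connected group nudges their opinion toward the average of their friends each round, and flipping a single member's view is enough to pull the whole group away from its original consensus. Names and numbers are invented.

```python
# A toy sketch of social influence inside a tightly knit (homophilous) group:
# each round, everyone nudges their opinion toward the average of their friends.
# Flipping one member's opinion slowly pulls the rest of the group along.

opinions = {"ann": 1.0, "bea": 1.0, "cal": 1.0, "dan": 1.0}      # 1.0 = original shared view
friends  = {"ann": ["bea", "cal", "dan"], "bea": ["ann", "cal", "dan"],
            "cal": ["ann", "bea", "dan"], "dan": ["ann", "bea", "cal"]}

opinions["ann"] = -1.0   # one member adopts the opposite position (-1.0)

for _ in range(10):
    updated = {}
    for person, view in opinions.items():
        neighbour_avg = sum(opinions[f] for f in friends[person]) / len(friends[person])
        updated[person] = 0.8 * view + 0.2 * neighbour_avg       # partial adjustment
    opinions = updated

print({p: round(v, 2) for p, v in opinions.items()})
# The group no longer sits at 1.0: one dissenting voice has shifted everyone's view.
```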

In conclusion, given how important online social networks are in our everyday lives, it is necessary for people to understand how much homophily shapes the information they receive online and in their news feeds.


Easley, D. & Kleinberg, J. (2010). Networks, Crowds and Markets: Reasoning about a Highly Connected World. Cambridge University Press.

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.

The Ringer. (2016, November 16). https://www.theringer.com/2016/11/16/16045730/social-media-echo-chamber-2016-election-facebook-twitter-b433df38a4cb

McPherson, M., Smith-Lovin, L., Cook, J. M. (2001). Birds of a Feather: Homophily in Social Networks. Annual Review of Sociology, Vol. 27, 415-444. https://doi.org/10.1146/annurev.soc.27.1.415