Q&A

How Opinions Become Chaotic

Physics 16, 181
Sarah Marzen finds that potential sociological strategies for combating the spread of disinformation can have the opposite effect, leading to a population with more polarized or chaotic views.

With the birth of the internet came the ability of every individual—from a teenager to a dictator—to spread lies far and wide. Social media then amplified the problem, with research from the World Health Organization and others showing that posts containing falsehoods get shared 70% more frequently than those that don’t. This issue can affect everything from the policies government officials put in place to the personal healthcare decisions individuals make.

The problem counts among one of many information-related issues that Sarah Marzen of the Claremont Colleges, California, is working to understand. Using tools from machine learning and information theory, she probes questions ranging from how memory works to how disinformation propagates through a community, the latter being an avenue she only recently started exploring. Marzen spoke to Physics Magazine about her love of all things related to information and about how her experience with mental illness has influenced the problems she tackles.

All interviews are edited for brevity and clarity.

The propagation of false or inaccurate information is an increasing problem for society. What aspect of this issue are you currently studying?

How people’s opinions evolve over time as they interact with information on some topic.

How do you measure this change?

My group has developed a toy model that can evaluate the behavioral changes of individuals who send and receive information privately. The model assigns to each person an initial bias score, indicating what type of information appeals to their initial beliefs. The model also assigns to each piece of information a credibility score that depends on who sent it and on its content. That credibility score determines whether, after reading the information, an individual’s views on a topic change. We’ve found that individuals typically maintain their initial opinions. But under certain conditions, their beliefs can become chaotic, meaning that an individual’s view constantly changes in a weird, aperiodic way that makes it hard to predict what it will be in the future.
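Marzen doesn’t spell out the model’s equations in this interview, so the following is only a minimal sketch of the kind of update rule she describes: simulated agents with fixed bias scores privately pass messages, weigh each message by a perceived credibility, and shift their opinion toward messages they find credible. For simplicity the credibility here depends only on the message content and the receiver’s bias and current belief, not on the sender’s identity, and every name and parameter value (BIAS_STRENGTH, LEARNING_RATE, the sigmoid credibility function) is an illustrative assumption rather than the group’s actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_AGENTS = 50        # number of individuals in the toy population
N_STEPS = 2000       # rounds of private message passing
BIAS_STRENGTH = 2.0  # how strongly confirmation bias weights credibility (assumed)
LEARNING_RATE = 0.3  # how far a belief moves toward a credible message (assumed)

# Each agent starts with a belief in [-1, 1] and a fixed bias score in [-1, 1]
beliefs = rng.uniform(-1, 1, N_AGENTS)
bias = rng.uniform(-1, 1, N_AGENTS)

def credibility(receiver_bias, receiver_belief, content):
    """Perceived credibility of a message: higher when the content sits close to
    the receiver's current belief and on the side their bias already favors."""
    agreement = -abs(content - receiver_belief)
    return 1.0 / (1.0 + np.exp(-BIAS_STRENGTH * (agreement + receiver_bias * content)))

for _ in range(N_STEPS):
    sender, receiver = rng.choice(N_AGENTS, size=2, replace=False)
    content = beliefs[sender]  # the sender shares their current view
    c = credibility(bias[receiver], beliefs[receiver], content)
    # The receiver's belief moves toward the message in proportion to its credibility
    beliefs[receiver] += LEARNING_RATE * c * (content - beliefs[receiver])
    beliefs[receiver] = np.clip(beliefs[receiver], -1, 1)

print("final spread of opinions:", beliefs.std())
```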

What are those conditions?

Not what you would expect. Chaos emerges when individuals actively work to fight their innate tendency to favor information that confirms their original beliefs. That action is an intervention strategy that my group thought might be a valid approach for encouraging opinions to converge among a group of individuals. But we see that it can instead do the opposite. If individuals try too hard to fight their confirmation biases, they end up with chaotic beliefs that change in a seemingly random manner.
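The qualitative story, a deterministic update rule that settles onto a fixed opinion for weak correction but wanders aperiodically once the correction is turned up too far, mirrors the textbook route to chaos in simple one-dimensional maps. The sketch below uses the standard logistic map purely to illustrate that transition; the parameter r plays the role of the “fight your bias” knob and is not a quantity from Marzen’s model.

```python
import numpy as np

def iterate_map(r, x0=0.4, n_transient=500, n_keep=50):
    """Iterate the logistic map x -> r*x*(1-x) and return its long-run values."""
    x = x0
    for _ in range(n_transient):  # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_keep):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

# Weak correction: the opinion-like variable settles onto a single fixed value
print(np.unique(np.round(iterate_map(r=2.8), 4)))

# Strong correction: the same deterministic rule visits many values aperiodically
print(len(np.unique(np.round(iterate_map(r=3.9), 4))))
```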

Are there conditions when the model does show consensus?

Yes, when none of the individuals have preassigned cognitive biases. But as soon as those get added in, opinions either polarize or become chaotic.

How might this information help social media platforms trying to combat the spread of misinformation or individuals trying to bring about consensus on a topic?

We don’t really know. It would be great if everybody could just eliminate their biases and agree on what a given set of data shows. But everyday experience shows that is unlikely to happen. Scientists are looking at ways to control chaos in all kinds of systems, so it could be possible to inject information in a way that stabilizes the system. But that’s an open question.
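As one illustration of what “injecting information to stabilize the system” could mean, the sketch below applies a crude proportional-feedback version of the small-perturbation idea behind chaos-control schemes such as OGY to the chaotic logistic map from the previous sketch: wait until the state wanders near the unstable fixed point, then apply a small nudge toward it. The gain, window, and nudge cap are arbitrary choices for the demonstration, not anything taken from the interview or the underlying study.

```python
import numpy as np

R = 3.9             # puts the logistic map in its chaotic regime
X_STAR = 1 - 1 / R  # the map's unstable fixed point
ALPHA = 0.8         # fraction of the remaining gap closed by each nudge (assumed)
WINDOW = 0.05       # only intervene when the state is already near the target
MAX_NUDGE = 0.05    # keep each injected correction small

x = 0.3
history = []
for _ in range(3000):
    x = R * x * (1 - x)  # uncontrolled chaotic update
    if abs(x - X_STAR) < WINDOW:
        # small corrective nudge toward the fixed point, applied only inside the window
        x += float(np.clip(ALPHA * (X_STAR - x), -MAX_NUDGE, MAX_NUDGE))
    history.append(x)

# Once captured, the orbit stays pinned near X_STAR and its spread collapses
print("spread over the last 100 steps:", np.std(history[-100:]))
```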

Looking at information in sociological contexts is a relatively new research avenue for you. What precipitated your transition to investigating problems in this field?

I am a diagnosed paranoid schizophrenic and have been for about a decade now. Soon after the onset of my schizophrenia, I experienced psychosis accompanied by hallucinations. Specifically, I thought that there was an organization filled with men who had a hold on what information was out in the world. And they were after me because I was trying to combat the misinformation and disinformation that they put out.

My new research projects all stem from that experience—both from me thinking about misinformation in a scientific context and from me trying to understand my illness a little bit better.

While discussions about mental illness diagnoses are becoming more common, there is still a lot of stigma around them. Can you tell me about your experiences?

Everybody has been incredibly supportive. I know that’s not been the case for other people, but I feel like things are changing. Society increasingly understands that mental illnesses are diseases and that, as with all diseases, the symptoms can make seemingly simple everyday tasks difficult. It is hard to tell family, friends, and colleagues, but most people are kind and compassionate, and they want to help.

Do you have any advice for scientists grappling with mental illness?

Don’t keep it to yourself. The people around you can, and will want to, find ways to support you. You won’t be a burden.

–Katherine Wright

Katherine Wright is the Deputy Editor of Physics Magazine.

