April 16, 2017
About the author: Joni holds a PhD in marketing. He is currently a postdoctoral researcher at the Qatar Computing Research Institute and the Turku School of Economics. Contact: joolsa (at) utu.fi
However, after the meeting I realized this might not be enough, and might in fact be naïve thinking. It may not matter that algorithms and social media platforms tell people ‘this is false information’. People might choose to believe the conspiracy theory anyway, for various reasons. In those cases, the problem is not a lack of information; it is something else.
And the real question is: Can technology fix that something else? Or at least be part of the solution?
Because, technically, the algorithm is simple:

IF a user’s feed is one-sided THEN show opposing content
=> results in a balanced and informed citizen!
But, as said, if the opposing content goes against what you want to believe, then the problem is not a lack of exposure to that content.
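The naive rule above can be sketched in a few lines of code. This is only an illustration, not any platform’s actual ranking logic: the `stance` labels, the `rebalance_feed` function, and the dominance threshold are all hypothetical, and a real system would first need a stance classifier, which is a hard problem in itself.

```python
from collections import Counter

def rebalance_feed(history, candidates, threshold=0.7):
    """Naive 'balanced view' rule (hypothetical): if one stance dominates
    a user's recent history, rank opposing candidates first."""
    counts = Counter(item["stance"] for item in history)
    total = sum(counts.values())
    if total == 0:
        return candidates  # no history, nothing to balance against
    stance, n = counts.most_common(1)[0]
    if n / total >= threshold:
        # Promote content that opposes the dominant stance.
        opposing = [c for c in candidates if c["stance"] != stance]
        same = [c for c in candidates if c["stance"] == stance]
        return opposing + same
    return candidates

# A user who has mostly consumed 'pro' content gets 'anti' content first.
history = [{"stance": "pro"}] * 8 + [{"stance": "anti"}] * 2
candidates = [{"id": 1, "stance": "pro"}, {"id": 2, "stance": "anti"}]
print([c["id"] for c in rebalance_feed(history, candidates)])  # → [2, 1]
```

Note what the sketch cannot capture: it changes what the user *sees*, but nothing in it changes what the user chooses to *believe*, which is exactly the gap the text points to.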
These are tough questions, sitting at the intersection of sociology and algorithms. Some of the solutions may border on manipulation, but, as any propagandist could tell you, manipulation has to be subtle to be effective.
The major risk is that people might rebel against a balanced worldview. It is good to remember that ‘what you need to see’ is not the same as ‘what you want to see’. There is little that algorithms can do if people want to live in a bubble.
Originally published at https://algoritmitutkimus.fi/2017/04/16/the-balanced-view-algorithm/