Technology is helping to redesign 2016 as a modern 1933

date: Sep 25, 2016
slug: 2016-technology-is-helping-to-redesign-2016-as-a-modern-1933
status: Published
tags: technology, education, journalism
type: Post
ogImage:
summary: Technology, media bias, and the rise of extremism: The impact of social networks and information bubbles on political discourse and democratic societies.

After an economic crisis that only the very wealthy did not feel, a political radicalization brought back to the table topics that political calm had pushed away: immigration, inflation, poverty, unemployment, uncertainty. From this scenario emerged a populist character, a demagogue with vast gaps of ignorance, who led the polls with a rhetoric of xenophobia, intolerance, militarism, and interventionism. You can see this state of affairs in the US with Donald Trump, in ‘Brexit’, and in the evolution of the right in France and Germany, but the description above is not current - it is the rise of Nazism and fascism before World War II. And now, as before, the tragedy seemed unlikely until it happened.
Hitler did not have a social network when he plotted a coup in a brewery a decade before coming to power, but if he could have had one, he certainly would have wanted it. In certain scenarios, social media becomes a machine for producing hate and conspiracy theories, which leads, mostly, to a venomous rage that makes people averse to dialogue. The echo chamber of a Facebook or a Twitter, for example, is an unparalleled experiment, one that has created hubs of anger, usually in the hands of the most extreme.
None of the networks set out to do this, of course, but it doesn't matter. Trolls and resentful people love microcosms where they can escape their insignificance. Rational discourse is not attractive to the algorithm, and rabid bile spreads exponentially because of how the networks enabled by social media actually work. Extremists entrench themselves in their own certainty because they no longer see contrary opinions within their immediate circle - the Internet hides what you do not want to see. Slowly, the rabid grow more certain of their ideas and lose the ability to debate, which is the core of civilization. In summary, no one wants to be happy anymore - everyone just wants to be right.
What is the role of the media in this scenario? The media seems to have forgotten its function of checking claims against reality. The maxim applies to most democratic regimes. Even outlets with very high standards of process and quality, like the New York Times, were sucked into the ideological war of this moment of radicalization. The truth is that technological advances often come with a built-in tragedy. The development of chemistry made World War I a dehumanizing massacre through the use of chemical weapons; the improvement of airplanes did the same in World War II, and so on. Technology is neither good nor bad - but its use can go from the sublime to the catastrophic.
Donald Trump is a crude, scatological reactionary construct, irrefutable proof that democracy often fails, but he undoubtedly receives more aggressive treatment from the "liberal" media (in quotes because the term has a different connotation in Brazil than in the rest of the world). Yes, he provokes this treatment, and even profits from it (it is estimated that his campaign has generated the equivalent of $2 billion in free media so far), but the hole is deeper. It was all but proven that the Democratic party tilted the primary toward Hillary Clinton's candidacy because it believed that a polarized dispute between Bernie Sanders and Trump would have a more uncertain result. For this apparent setup, the candidate and the party received only a fraction of the criticism they deserved. As Getúlio Vargas used to say, "for friends, everything; for enemies, the law".
"Impartial" coverage is a delirious invention of manual writing authors. Nothing human is impartial. The variation is that there can be well-done or poorly done coverage. All coverage is originated with bias because it is an irresistible trait of human behavior.
Here is the question: if a publication, no matter how good, knows that a politician has told a lie and that his followers will believe that lie anyway (since they live immersed in conspiracy theories), can the publication (or the social network) ethically omit it? In the case of Facebook, was there or was there not bad faith when its editors "limited" the reach of clearly Republican articles and discussions? Did they really limit them, or did they simply react according to their own principles and beliefs? Is it possible to strip away all of one's prejudices?

A lie told a thousand times…

At the last Online News Association conference, Facebook's product director, Fidji Simo, said that the company takes the spread of fake news very seriously. The company's policy probably is exactly that. Facebook's business model sits on top of user trust, and deliberate manipulation would be a shot in the foot, something its management rarely takes. The company has already shown that it has this concern. In the episode of the supposed manipulation of Trending Topics by editors, the response came quickly: the team that curated the tool was dismissed and dismantled, and algorithms took over the choice of the most discussed topics.
But what if Facebook (and the other companies that make surreal profits from managing this information) can't do anything about it? And if they can, should they be the only ones to bear this responsibility? How has the digitization of interpersonal relations altered their very design?
This is the crux of the problem of information bubbles. "The Filter Bubble" is, in fact, the title of a book on the subject. Eli Pariser, its author, basically argues that algorithms, by incessantly filtering the information that reaches us, neutralize any possibility of exposure to new ideas, different opinions, discussion. You know that annoying friend who keeps poking you on Facebook? In the virtual world of the networks, this is often the only contact you have with people who disagree with you.
The American informational titans do not create the bubbles deliberately, but their business model favors them. Broadly speaking, social networks help you connect with people you agree with. Facebook wants you to be happy, and it is much easier to make you happy by showing you opinions similar to yours. A happy you is a much more affable customer, prone to buying the things offered by advertisers, who in turn pay rivers of money to put their faces in the middle of your happiness.
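To make the bubble mechanism concrete, here is a deliberately toy sketch in Python of a feed ranker that rewards agreement. The Post class, the stance score, and the agreement_weight are invented for illustration only; they do not describe any real platform's algorithm, just the filter-bubble logic reduced to its simplest form.

```python
# Illustrative sketch only: a hypothetical feed ranker that scores posts by how
# well they match a user's existing views. All names and weights are made up.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    stance: float  # -1.0 (opposes the user's views) .. +1.0 (agrees with them)

def rank_feed(posts: list[Post], agreement_weight: float = 0.9) -> list[Post]:
    """Order posts so that agreeable content rises to the top.

    The higher the agreement_weight, the less often the user ever sees a
    dissenting opinion - which is the "bubble" in "filter bubble".
    """
    def score(post: Post) -> float:
        novelty = 1.0 - abs(post.stance)  # mildly reward neutral content
        return agreement_weight * post.stance + (1 - agreement_weight) * novelty

    return sorted(posts, key=score, reverse=True)

feed = [
    Post("friend_a", "You are so right about this!", stance=0.9),
    Post("friend_b", "Here is a neutral news report.", stance=0.0),
    Post("annoying_friend", "Actually, I completely disagree.", stance=-0.8),
]

for post in rank_feed(feed):
    print(post.author, "->", post.text)
# The dissenting post lands at the bottom; raise agreement_weight toward 1.0
# and it effectively disappears from what the user ever reads.
```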
No one in their right mind disputes that the digital medium has redesigned the global economy. There are studies on this and basically no controversy. Non-financial relations between people have also undergone a radical mutation. Contacts between acquaintances, and even between near strangers, now have their genesis in calculations made by the complex algorithms that manage our digital lives, and this has implications. A study by a Canadian researcher makes a consistent argument: if algorithms determine a large part of people's lives, wouldn't it be fair to demand that they stop being black boxes, as they are today at Facebook, Google, and the other managers of our digital life? And, on the other hand, would it be fair to force companies to reveal trade secrets on the basis of a perception of reality that is entirely a matter of opinion?
Between 1922 and 1936, prejudices and intolerance that were latent in Europe caused seven countries (Italy, Soviet Union, Austria, Germany, Portugal, Spain, and Greece) to gain extremist leaders whose mottos were exactly the same as those defended by Trump, Le Pen, and the like. The belligerent rhetoric that gained significant space after the economic jolt of 2008 is worryingly similar to that of the beginning of the last century. In the period before World War II, technology did not play as relevant a role in the radicalization of the socio-political scenario as it does today.
Consolidated democracies like the USA, France, and England today see belligerent rhetoric gain electoral traction. This intolerance is born, believe it or not, in your reaction to the annoying friend who pokes you over divergent political beliefs on a social network. The next time these childish provocations start, seek that friend out and try to understand their point of view while explaining yours. More than saving a friendship, this gesture can help disinfect a society that is increasingly infected.

© Cassiano Gobbet 2023 - 2024