
Why do political views so often come in a package?


Hello, today we're going to talk about politics, though not exactly about politics itself. More precisely, about a pattern associated with it.

Have you ever noticed that if someone supports women's rights and opposes racism and homophobia, you can often almost automatically predict their position on international conflicts? For example, they're highly likely to sympathize with the actions of Hamas or, at least, justify them in moral terms. Conversely, if someone displays homophobic, sexist views and appeals to "traditional values" and religion, you can assume they'll support strict immigration policies, a strong state, and conservative political forces.

Why does this happen? Why do beliefs that at first glance seem unrelated form stable and predictable combinations?

The saying "if I can predict someone's political views from just one of their positions, they are not a serious thinker" doesn't originate with any one author. It's a general conclusion drawn from classic studies of mass political consciousness. In particular, political scientist Philip Converse demonstrated back in 1964 that the beliefs of most people are structured and interconnected, and John Zaller later clarified that these connections are shaped by elite cues and the social environment. In other words, the predictability of views isn't random, but the result of specific cognitive and social mechanisms.


Cognitive dissonance and the striving for internal logic

One of the key mechanisms explaining the "package-like" nature of political views was described by psychologist Leon Festinger. In his work, "A Theory of Cognitive Dissonance," he argues that humans strive for internal consistency in their beliefs.

Festinger puts it very clearly:

"Individuals strive for internal consistency." In other words, our psyche is structured in such a way that contradictions within our belief system cause tension. This state, cognitive dissonance, is perceived as psychological discomfort that must be resolved.

It's important to understand that this isn't just a matter of "unease" or mild doubt. Dissonance can cause quite noticeable internal tension, anxiety, and even irritation. This is precisely why people strive to get rid of it as quickly as possible. And here's the key point: dissonance can be resolved in a variety of ways, and not always through a rational reexamination of beliefs. Festinger also noted:

"When dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information that would likely increase the dissonance." This means that a person not only eliminates contradictions but also begins to avoid information that could exacerbate them. This directly links cognitive dissonance to selective information perception.


How exactly is dissonance resolved?

When a person encounters a contradiction, they have several strategies. They can change their beliefs, change their interpretation of the facts, or simply devalue the source of the information. In practice, the second and third options are most often used because they require less effort.

For example, if a person considers themselves a humanitarian and a supporter of human rights, and then encounters a situation where "their side" displays violence, conflict arises. Admitting that "their" side may be wrong is painful because it undermines identity.

Instead, cognitive reprocessing occurs. Violence can be reframed as a "necessary measure," a "contextually justified action," or a "response to a more serious threat." Thus, facts are not completely denied, but their meaning is altered to fit within an existing system.


Self-justification and self-image defense

This idea was further developed by psychologist Elliot Aronson, who demonstrated that cognitive dissonance is particularly powerful when it affects self-image.

Aronson wrote:

"People are not just rationalizing; they are rationalizing to maintain a positive self-image." This means that a person defends not just beliefs, but a self-image of themselves as a good, reasonable, and moral person. This is why, having adopted one moral position, a person begins to build an entire system around it. This system serves not only to explain the world but also to maintain self-esteem. If one part of this system is destroyed, the entire structure risks being called into question. Therefore, the psyche resists change.


Why one idea leads to others

Cognitive dissonance explains why accepting one idea often leads to the adoption of a whole set of related positions.

This occurs because each new idea is tested not on its own, but for compatibility with an existing system. If an idea fits seamlessly into this system, it is accepted almost automatically. If not, it is either rejected or rethought. Over time, a coherent structure of beliefs develops, in which individual elements support one another. This creates a sense of logic and coherence. However, this logic is often internal, not objective. It is based not on an independent analysis of each position, but on their mutual consistency.
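To make this "compatibility testing" concrete, here is a minimal toy sketch, entirely my own illustration rather than anything from the literature: a candidate idea is accepted only when it agrees with the beliefs already held, given how positions are coupled together. The issues, coupling values, and threshold are all invented for the example.

```python
# Toy model of "package" formation: a candidate idea is evaluated for
# compatibility with beliefs already held, not on its own merits.
# Issues, couplings, and the threshold are all invented.
held_beliefs = {"issue_a": 1, "issue_b": 1}  # stances encoded as +1 / -1

# How strongly pairs of positions "go together" in this toy ideology.
# The couplings are social conventions, not logical entailments.
coupling = {
    ("issue_a", "issue_c"): 0.9,   # packaged on the same side
    ("issue_b", "issue_c"): 0.7,
    ("issue_a", "issue_d"): -0.8,  # packaged on opposite sides
}

def compatibility(new_issue: str, new_stance: int) -> float:
    """Average agreement between a candidate stance and the held beliefs."""
    scores = []
    for issue, stance in held_beliefs.items():
        c = coupling.get((issue, new_issue), coupling.get((new_issue, issue)))
        if c is not None:
            scores.append(c * stance * new_stance)
    return sum(scores) / len(scores) if scores else 0.0

def consider(new_issue: str, new_stance: int, threshold: float = 0.3) -> bool:
    """Accept the idea only if it fits the existing system."""
    if compatibility(new_issue, new_stance) >= threshold:
        held_beliefs[new_issue] = new_stance
        return True
    return False

print(consider("issue_c", +1))  # True: fits the package, accepted "automatically"
print(consider("issue_d", +1))  # False: clashes with the package, rejected
```

Running the sketch, the idea that "fits" is adopted with no further scrutiny, while the incompatible one is rejected outright, which is exactly the asymmetry described above: each position is tested against the system, never on its own.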


The illusion of rationality

One of the most interesting effects is that a person may sincerely believe their views are the result of rational analysis. In fact, the opposite occurs: a position is formed first (often under the influence of emotions or the social environment), and then arguments are selected to support it. This process is described as "motivated reasoning" and is closely related to cognitive dissonance. People don't simply seek truth; they seek confirmation of their existing position.


Two different sets of logic

Interestingly, cognitive dissonance operates symmetrically across different ideological systems. If a person is guided by humanistic values, they will strive to support anything related to the protection of the weak and oppressed. If a person is guided by order and stability, they will consistently support the structures that uphold them. In both cases, a sense of logic emerges. But this logic is not universal; it is internal, tied to a starting point.

Thus, two people can reach opposing conclusions, yet both consider themselves consistent and rational.


Amplification of the effect through the social environment

Cognitive dissonance intensifies when a person is in a group with similar views. In this case, any contradictions not only cause internal discomfort but can also lead to social pressure. A person begins not only to avoid contradictory information but also to more actively demonstrate agreement with the group. This further entrenches the belief system. Over time, this can lead to alternative positions being perceived not just as erroneous, but as morally unacceptable.


Cognitive dissonance is not simply a psychological effect. It is a fundamental mechanism that explains why political views strive for consistency and form stable "packages." It demonstrates that a person is not so much seeking the truth as striving to preserve the integrity of their worldview and self-image. This is why changing one's views is not simply an intellectual process. It always poses a psychological risk to oneself and the social group in which one finds oneself.


Political identity as a form of social belonging

Lilliana Mason demonstrates that in modern societies, political views are no longer simply a set of opinions on specific issues. They are increasingly becoming part of a person's social identity. In her book, Uncivil Agreement, she formulates this as follows:

“Partisan identities have become aligned with multiple social identities, making them stronger and more emotionally charged.” This means that political stances no longer exist in isolation. They are intertwined with lifestyle, culture, education, media consumption, and even aesthetic preferences. A person doesn't simply "believe something is right"; they become part of a specific social camp.


Politics as a cultural code

When a political stance becomes part of an identity, it begins to function as a cultural code. Through it, a person signals to others who they are, what circles they belong to, and with whom they associate themselves. This manifests itself on many levels. Political views begin to influence which sources of information a person considers reliable, which jokes they perceive as acceptable, what topics they discuss, and even the language they use. Over time, a coherent "package" of not only ideas but also a style of thinking develops. This package becomes recognizable and socially legible. And it's important to note that this isn't just a matter of conscious choice. Many of these elements are learned through environment and repetition. A person gradually begins to speak, think, and react in ways that are customary within their group.


Why changing positions becomes "expensive"

When beliefs are embedded in identity, changing them ceases to be a simple intellectual act. It's no longer a question of "agree or disagree." It's a question of "who am I?" Social identity theory, developed by Henri Tajfel, explains this mechanism. Tajfel wrote:

"Individuals strive to maintain a positive social identity by favoring their in-group over out-groups." In other words, a person strives to maintain a positive image of the group to which they belong because this is directly related to their self esteem. Admitting that "our" position may be wrong threatens not only our beliefs but also our sense of self-worth as part of a group. Therefore, changing our position is perceived as a loss, sometimes even as a betrayal.


Fear of social isolation as a factor in thinking

Stepping outside the group's stance is associated with the risk of social isolation. This risk is not always expressed in overt conflict. More often, it manifests itself in milder forms: decreased support, distancing, and loss of approval. Nevertheless, from a psychological perspective, even such signals are significant. Humans are social beings, and belonging to a group is a basic need. Political scientist Elisabeth Noelle-Neumann noted in her "spiral of silence" theory:

"People fear isolation and therefore tend to align with what they perceive as the dominant opinion." This means that a person can not only change their views but also suppress doubts if they conflict with the group's position.


Why we defend "our own" rather than ideas

When political identity becomes entrenched, an important shift occurs. A person begins to defend not so much specific ideas as their group. This leads to the same actions being evaluated differently depending on who commits them. If an action comes from "one's own," it is more often justified or mitigated. If from "them," it is condemned more harshly. This effect is due to the fact that moral evaluation begins to depend on group affiliation. Facts themselves fade into the background, giving way to interpretation.


Ignoring facts as a defense of identity

This results in a situation in which a person may ignore or reinterpret facts that contradict their position. This is not necessarily a conscious distortion; more often, it is an automatic defense. Research in the field of motivated reasoning shows that people tend to accept information that confirms their views and critically consider information that contradicts them. This process is closely linked to identity. Recognizing an inconvenient fact can mean the need to reconsider one's position, and thus a threat to social belonging.


Formation of a rigid "us" and "them"

At a certain stage, a clear division between "us" and "them" emerges. This is not simply a difference in views, but a difference in perception of the world.

"We" begin to be perceived as:

1) More moral.

2) More rational.

3) More informed.

"They" in turn, are perceived as:

1) Misguided.

2) Manipulable.

3) Dangerous.


This process is described as increasing intergroup polarization. It makes dialogue between groups extremely difficult, because each side perceives the other not just as an opponent, but as a threat.


Emotional strengthening of identity

Importantly, political identity is accompanied by strong emotional involvement. This makes it resilient to change. Lilliana Mason emphasizes that modern political conflicts are becoming less ideological and more affective. People may not have a deep understanding of the issues, but they still experience strong emotions regarding "us" and "them." This strengthens the defense of a position. The stronger the emotion, the more difficult it is to change one's opinion.


Political identity transforms views from a set of ideas into part of social belonging. This makes them more stable, more emotional, and less susceptible to rational revision. As a result, a person begins to defend not so much the truth of their beliefs as their place in the group. And this is precisely why changing a political position isn't simply a change of opinion. It is always, in part, a change of self.


Reference groups and the desire to fit in

The concept of a reference group was introduced by sociologist Robert K. Merton and has become a key tool for understanding how human beliefs are formed. Merton defined reference groups as the social groups an individual looks to when forming their views, behavioral norms, and self-esteem, regardless of whether they are actually a member. In his formulation:

"Reference groups are those groups which individuals use as a standard for evaluating themselves and their own behavior." This means that a person evaluates themselves not in absolute terms, but through the lens of a significant social group. And this is where an important shift occurs: views are formed not only as a result of analysis, but also as a means of fitting in.


The desire to belong as a basic mechanism

The desire to be part of a group is not just a social habit, but a basic psychological need. Abraham Maslow placed belonging and acceptance near the center of his hierarchy of needs, and psychologists Roy Baumeister and Mark Leary later stated the point directly:

"The need to belong is a fundamental human motivation." This means that a person will strive for acceptance even at the cost of changing their own beliefs. Moreover, this change is rarely perceived as a compromise. On a subjective level, it is experienced as a "natural development" of views.

When a person says to themselves, "I want to be part of this environment" or "I want to be perceived as this kind of person," they initiate a process of adaptation. This process almost never begins with political views. It begins with more superficial things like language, appearance, and interests. But gradually it reaches deeper convictions.


The mechanism of gradual adaptation

Adaptation to a reference group occurs in stages. First, a person adopts modes of expression, then norms of behavior, and finally interpretations of reality. Sociologist Herbert Blumer, developing the ideas of symbolic interactionism, noted:

"People act toward things on the basis of the meanings those things have for them." These meanings are formed through social interaction. In other words, a person learns not only what to think but also how to perceive events. At some point, they begin to not only repeat the group's positions, but to see the world as the group does. This is the moment when views cease to be borrowed and become "their own."


Political views as a social password

Within a reference group, political beliefs begin to function as a marker of belonging. They become a kind of "password" that identifies who is inside and who is outside. This mechanism can be described through the concept of social signaling. When a person expresses a certain position, they not only communicate their opinion but also demonstrate their affiliation. Sociologist Pierre Bourdieu wrote:

"Taste classifies, and it classifies the classifier." In other words, our preferences, including political ones, not only reflect us but also place us in a certain social category. Therefore, "correct" views within a group become a way to maintain one's status. And deviating from them is risky.


Different groups have different normalities

One of the key consequences of this process is the formation of different systems of "normality" in different social circles. In one environment, a certain position is perceived as obvious and morally obligatory. In another, it's absurd or even dangerous. This isn't necessarily related to level of education or awareness. It's linked to the norms accepted in a particular group. This effect is reinforced by the fact that a person most often interacts with those who already share their views. As a result, their worldview becomes increasingly homogeneous.


The unconscious nature of the process

The key point is that this process is rarely conscious. People don't formulate it as a strategy. They don't think, "I'll change my views now to fit in." Instead, they feel that certain ideas simply "sound right." This is due to the effect of cognitive availability and frequency of repetition. The more often a person hears a certain position in their environment, the more natural it seems to them. Psychologist Solomon Asch, in his experiments on conformity, showed that people tend to agree with the opinion of a group even when it is clearly wrong. He wrote:

"That intelligent, well meaning young people are willing to call white black is a matter of concern." This effect demonstrates how powerful group pressure can be, even in simple situations. In politics, where everything is much more complex and less clear-cut, this effect is only amplified.


Internalization of group norms

Over time, external pressure turns into internalized conviction. The person stops feeling like they're "fitting in." They feel like they've always thought this way. This process is described as internalization. External norms become part of the internal value system. As a result, the person's position completely aligns with the group's position, but is perceived as the result of personal choice.


The cost of nonconformity

It's important to understand that deviating from the norms of the reference group has a social cost. This can include loss of approval, a reduction in status, or even exclusion from the group. Therefore, even if a person has doubts, they may suppress them. This is due not only to the desire to avoid conflict but also to a deeper need to maintain belonging. Thus, beliefs are maintained not only cognitively but also socially.


Reference groups play a key role in shaping political views. They provide a framework within which "correct" and "incorrect" positions are defined. As a result, a person can arrive at certain beliefs not through analysis, but through a process of gradual adaptation and a desire to belong. This is precisely why political views so often coincide within groups and differ so sharply between them. Because these differences are based not only on logic but also on social structure.


Cognitive economy and ready-made belief systems

One of the key explanations for why political views are formed in "packages" is the principle of cognitive economy. Its essence is that the human brain strives to minimize resource expenditure when processing information. Political scientist John Zaller, in his work "The Nature and Origins of Mass Opinion," demonstrates that most people do not form political beliefs through consistent and in-depth analysis. He writes:

"Citizens do not hold fixed opinions on most issues; rather, they construct opinions on the spot from the considerations that are most salient to them." This means that opinions do not always exist as stable structures. They are often formed "on the spot"—from the ideas and signals that are available at the moment. And here a key problem arises: if a person lacks the time, knowledge, or motivation for analysis, they will rely on simplified models.


Limited cognitive resources

The idea of bounded rationality was developed by Herbert Simon. He argued:

“The capacity of the human mind for formulating and solving complex problems is very small compared to the size of the problems.” This means that a person cannot objectively process all available information. Politics is a particularly complex field, where each topic involves numerous factors, historical contexts, and contradictions. Under such conditions, attempting to analyze everything from scratch becomes virtually impossible. Therefore, the brain uses shortcuts, so-called heuristics.


Heuristics as a thinking tool

Psychologists Daniel Kahneman and Amos Tversky demonstrated that humans make decisions using quick and simplified rules. They noted:

“People rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations.” In a political context, this means that instead of analysis, a person asks themselves a simpler question: “Who is right?” or “Whose position is this?” If the source is credible, the position is accepted. If not, it is rejected.


Signals as a substitute for analysis

This is where the mechanism described by John Zaller comes into play. People are guided by signals from:

1) Opinion leaders.

2) Media.

3) Social groups.


Zaller emphasizes that the perception of information depends not so much on its content as on who conveys it. This creates a situation in which a person may support a position not because they have analyzed it, but because "their" people support it.
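A minimal sketch of this cue-taking logic, my own illustration of the general idea rather than Zaller's formal model: the judgment reduces to a lookup of source trust, and the content of the claim is barely consulted. The sources and trust values are invented.

```python
# Toy sketch of cue-taking: the verdict tracks the source, not the content.
# Sources and trust values are invented for illustration.
trust = {"in-group pundit": 0.9, "neutral expert": 0.5, "out-group pundit": 0.1}

def evaluate(claim: str, source: str) -> str:
    """Accept or reject based on who says it; the claim itself is unused."""
    return "accept" if trust.get(source, 0.5) >= 0.5 else "reject"

claim = "Policy X reduces unemployment."
for source in trust:
    print(f"{source} says {claim!r} -> {evaluate(claim, source)}")
```

The same claim is accepted or rejected depending solely on who conveys it, which is the substitution of signal for analysis described above.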


Formation of ready-made packages of views

Over time, these signals accumulate into stable systems. A person doesn't simply accept individual ideas; they accept a holistic set. Political scientist Philip Converse described this as a structure of beliefs, where individual elements are interconnected. Importantly, however, these connections are often not logical. They are social. That is, idea A is connected to idea B not because one follows from the other, but because they both belong to the same ideological "package."


How does this mechanism work in practice

When confronted with a new topic, a person rarely analyzes it from scratch. Instead, they implicitly ask themselves: "Which side does this belong to?" After this, a position is formed automatically, based on an existing system. This process can be described as categorization. New information is not evaluated independently, but rather fits into an existing structure. As a result, thinking becomes fast but formulaic.


The illusion of understanding

One of the side effects of cognitive economy is the illusion of understanding. A person may be confident that they have "figured it out" when in fact they have simply matched it to an existing category. Daniel Kahneman describes this as the operation of "System 1"—fast, intuitive thinking that creates a sense of certainty without deep analysis. This explains why people can hold strong opinions on complex issues without having detailed information.


Systemic persistence and resistance to change

Once a belief system is formed, it begins to reinforce itself. New information is not evaluated neutrally, but through the lens of existing beliefs. Information that confirms a position is readily accepted. Information that contradicts it is criticized or ignored. This makes the system stable but closed.
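As a crude illustration of this asymmetry, here is a toy sketch of my own, not a model from the literature: confirming evidence is taken at face value while disconfirming evidence is heavily discounted, so confidence drifts upward even under a perfectly balanced stream of evidence. All weights and step sizes are invented.

```python
# Toy sketch of asymmetric evidence weighting in motivated evaluation.
# All weights and step sizes are invented.
belief = 0.8  # confidence that "my side is right", in [0, 1]

def update(belief: float, supports: bool, step: float = 0.1) -> float:
    # Confirming evidence is accepted at face value; disconfirming
    # evidence is scrutinized and heavily discounted.
    weight = 1.0 if supports else 0.2
    delta = step * weight if supports else -step * weight
    return min(1.0, max(0.0, belief + delta))

# Equal amounts of pro and con evidence still push confidence upward.
for supports in [True, False, True, False]:
    belief = update(belief, supports)
    print(f"evidence {'pro' if supports else 'con'} -> belief = {belief:.2f}")
```

Nothing in the sketch rejects counter-evidence outright; it is merely weighed less, yet the net effect is a belief system that reinforces itself exactly as described above.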


Social reinforcement of cognitive economy

Cognitive economy is reinforced in a social environment. When surrounded by people with similar views, a person needs to analyze even less. The group already performs this function for them. As a result, a collective system of views is formed that is perceived as obvious and self-evident.


Cognitive economy explains why people use pre-existing belief systems instead of independent analysis. This isn't a sign of stupidity, but an adaptive mechanism. It allows us to quickly navigate a complex world, but it makes our thinking dependent on external cues. This is why political views often appear as "packages." Because they are formed not as a result of consistent analysis, but as a result of simplification and social coordination.


System justification and psychological stability

The theory of system justification, developed by John Jost, explains one of the most paradoxical aspects of political thinking: people tend to defend and justify the existing social order, even if it conflicts with their interests or values. Jost formulates this as follows:

“People are motivated to defend and justify the existing social, economic, and political arrangements.” At first glance, this seems illogical. Why would someone support a system that might be unfair to them? However, from a psychological perspective, this is not only logical but also predictable.


The need for stability as a basic factor

To understand this mechanism, it is important to consider that humans require not only physical security but also predictability. The brain perceives uncertainty as a threat. Even if the current system is imperfect, it is understandable. It provides a sense of structure and control. John Jost also notes:

“System justification serves a palliative function, reducing anxiety and uncertainty.” In other words, justifying the system serves a calming function. It reduces anxiety associated with uncertainty and possible change. Recognizing that the system is unfair triggers a chain of thoughts:

If the system is bad → It needs to be changed → Changes are unpredictable → The future is uncertain.

It is this last step that causes anxiety. Therefore, the psyche often chooses the easier path—adapting perception rather than changing one's worldview.


Rationalization as a defense tool

When a person encounters a contradiction between their values and reality, the rationalization mechanism is activated. This isn't necessarily a conscious lie to oneself. It's the processing of information in such a way that it becomes compatible with the existing system. For example, if a person sees actions that they would condemn in another situation, they can change their interpretation of these actions. They can be presented as "forced," "contextually justified," or "having no alternative." This process allows one to maintain the feeling that the world as a whole remains logical and manageable.


The paradox of supporting a disadvantaged system

One of the most interesting aspects of system justification theory is that people can support a system that is objectively disadvantageous to them. John Jost points out:

"Even members of disadvantaged groups may exhibit outgroup favoritism." This means that representatives of less privileged groups can support structures that maintain their own inequality. Why does this happen? Because the alternative, acknowledging injustice, requires a re-evaluation of the entire system and causes internal conflict. Supporting the system, even an imperfect one, allows one to avoid this conflict.


Redefining moral norms

When a system is justified, not only are the facts reinterpreted; moral criteria also shift. Actions that might be considered unacceptable in an abstract situation begin to be evaluated differently depending on the context. For example, violence can be justified as:

1) Necessary.

2) Protective.

3) Temporary.

4) "Lesser Evil."


This isn't just a double standard. It's the adaptation of a moral system to the existing reality. Thus, morality ceases to be universal and becomes dependent on context and belonging.


Connection to cognitive dissonance

System justification theory is closely related to the concept of cognitive dissonance. When a conflict arises between values and reality, a person can either change their beliefs or change their interpretation of reality. In conditions of social instability, the latter option is psychologically easier. Therefore, instead of saying, "This system is unfair," a person can say, "The situation is more complex," "There is context," or "It's not that simple." These formulations help reduce internal tension without having to change their fundamental worldview.


Illusion of control and predictability

Another important aspect is the illusion of control. Even if a person cannot actually influence the system, the feeling that it is understandable and logical creates a sense of security. Social systems, no matter how imperfect they may be, provide people with guidelines. They define what is considered normal, acceptable, and expected. The destruction of these guidelines is perceived as chaos. Therefore, a person may prefer an imperfect order to uncertain change.


Social reinforcement of system justification

This mechanism is reinforced in groups. If others are also inclined to justify the system, this becomes the norm. A person receives confirmation of their interpretations and is less likely to encounter alternative views. As a result, a collective justification is formed that further entrenches the existing order.


Why is this important for understanding political views?

System justification helps explain why people can support political positions that at first glance seem contrary to their own interests or values. It also explains why changing views is slow and often accompanied by strong resistance. It's not just about logic, but also about psychological stability.


System justification is a mechanism that allows people to maintain a sense of stability and predictability in a complex world. It demonstrates that people defend not only their interests or values, but also the very structure of the reality in which they live. This is precisely why political views often prove resilient even in the face of contradictory facts. Because changing them requires not just a revision of opinions, but a rejection of a psychologically comfortable worldview.


Differences in moral foundations

One of the most fundamental approaches to explaining differences in political views is offered by Jonathan Haidt within the framework of Moral Foundations Theory. His key idea is that people use different "moral lenses" when interpreting the same events. Haidt puts it this way:

"Morality binds and blinds." In other words, morality simultaneously unites people within a group and "blinds" them to alternative viewpoints.


Morality as intuition, not reasoning

One of Haidt's most important theses is that moral judgments are primarily intuitive, not rational. He captured this in the title of his famous paper:

"The Emotional Dog and Its Rational Tail." The metaphor means that an emotional reaction occurs first, followed by a rational explanation. People do not reach a conclusion through analysis, but rather adapt arguments to fit an existing sense of "right/wrong." Thus, when two people argue, they often don't compare arguments. They defend already formed intuitive reactions.


Different sets of moral foundations

Haidt identifies several basic moral foundations, the most important of which are:

1) Care/Harm.

2) Fairness/Cheating.

3) Loyalty/Betrayal.

4) Authority/Subversion.

5) Sanctity/Degradation.


The key point is that different groups of people assign different weights to these foundations. Some emphasize care and fairness; for them, reducing suffering and protecting the vulnerable is paramount. Others are more focused on loyalty, order, and authority; for them, maintaining structure and stability is more important.


Different interpretations of the same reality

This leads to fundamentally different perceptions of the same events. One person sees the situation primarily as suffering and injustice. Their attention automatically focuses on victims, inequality, and violations of rights. Another person in the same situation sees a threat to order, a violation of rules, or an undermining of stability. Their attention is directed at the consequences for the system and the group. Importantly, both perceptions subjectively seem "obvious." Each person sincerely believes they are simply "looking at the facts," when in fact, they are interpreting them through their own moral system.


Why misunderstandings occur

Due to differences in moral foundations, people often not only disagree but also misunderstand each other at a fundamental level. Haidt emphasizes:

"We are all intuitive politicians, judging first and reasoning later." This means that the dispute begins after the conclusion has been reached. Arguments become a tool for defense, not a search for truth. The result is an effect in which one side perceives the other not as possessing a different logic, but as someone ignoring the "obvious."


Morality and group identity

Moral foundations are closely linked to social identity. Groups form and reinforce specific moral priorities. If the value of care dominates in a group, its members will be sensitive to suffering and injustice. If the values of order and loyalty dominate, members will be more sensitive to threats to the system and violations of rules. Over time, this leads to morality becoming part of the group norm.


Moral asymmetry

An interesting effect is that each group tends to view its morality as universal. That is, people think not "these are my values," but "these are the right values." As a result, alternative approaches are perceived not as equal, but as mistaken or even dangerous. This increases polarization because compromise begins to be perceived as a moral betrayal.


Emotional intensity and moral conflicts

Because moral judgments are tied to emotions, debates on these topics become particularly intense. The more strongly a person feels that their core value is being violated, the more severe their reaction becomes. This explains why political discussions often quickly devolve from arguments to emotional reactions.


Why dialogue becomes difficult

When people use different moral foundations, they effectively speak different "languages." One appeals to justice and suffering, the other to order and stability. Arguments from one side may simply not be accepted by the other because they don't address its core values. This makes dialogue not just emotionally fraught, but structurally difficult.


Differences in moral foundations show that political disagreements are not simply a matter of facts or logic. They are differences in fundamental ways of perceiving the world. People don't simply reach different conclusions. They base their decisions on different criteria for what is considered important.


Social media and the amplification of radicalism

The digital environment not only reflects existing social processes but significantly amplifies them, creating conditions for accelerated polarization of opinions. One of the key mechanisms of this phenomenon is the so-called "group polarization" effect, described in detail by Cass Sunstein. He demonstrated that discussion within homogeneous groups does not lead to the averaging of positions but, on the contrary, to their amplification. As he writes, "people predisposed to a particular point of view become even more confident and radical in their beliefs after discussion with like-minded individuals."

Social media creates the perfect conditions for this effect. Platform algorithms, whether on Facebook, YouTube, or TikTok, are optimized to retain user attention. To this end, they display content that evokes the greatest response, often emotional. This results in the formation of so-called "information bubbles" or "echo chambers," where people are primarily exposed to confirmation of preexisting views. As Eli Pariser, the author of the "filter bubble" concept, notes, "You don't decide what you see; algorithms decide it for you, and they select content you're most likely to like." This leads to alternative viewpoints gradually disappearing from the user's view, creating the illusion that their position is not only widespread but also the only reasonable one.

Furthermore, as Shoshana Zuboff emphasizes, digital platforms are embedded in the logic of so-called "surveillance capitalism," in which user behavior is actively modeled: "algorithms don't just predict our behavior, they seek to shape it." This means that radical content not only spreads faster; it becomes economically profitable.

As a result, complex social and political issues are inevitably simplified. Nuances disappear, and moderate positions lose visibility and appeal. As Jonathan Haidt writes, "social media doesn't just allow people to express their views, it structures those views in such a way that extreme positions gain an advantage." The result is a binary perception of reality: the world begins to be divided into "right" and "wrong," "us" and "them," with no middle ground. This increases conflict, reduces the capacity for dialogue, and makes society more vulnerable to manipulation.
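The feedback loop between engagement-optimized ranking and opinion drift can be shown in a toy simulation, my own sketch of the dynamic Sunstein and Pariser describe, not any platform's actual algorithm; the engagement function, weights, and update rule are all invented.

```python
import random

random.seed(0)

# Toy engagement-optimized feed: posts are positions on a -1..1 axis.
# Engagement is assumed to grow with agreement and with extremity.
# Everything here is invented for illustration.
posts = [random.uniform(-1, 1) for _ in range(200)]
user_opinion = 0.6

def predicted_engagement(user: float, position: float) -> float:
    agreement = 1 - abs(user - position) / 2  # 1 = same view, 0 = opposite view
    extremity = abs(position)                 # emotionally charged = extreme
    return agreement * (0.5 + extremity)

for day in range(5):
    # Rank all posts by predicted engagement; show the user the top 10.
    feed = sorted(posts, key=lambda p: predicted_engagement(user_opinion, p),
                  reverse=True)[:10]
    # Exposure nudges the user toward the average of what they see.
    seen_mean = sum(feed) / len(feed)
    user_opinion += 0.3 * (seen_mean - user_opinion)
    print(f"day {day}: user opinion = {user_opinion:+.2f}")
```

Even with posts drawn uniformly across the spectrum, the feed keeps selecting agreeable-but-extreme content, so the simulated user's position ratchets toward the pole of their starting side rather than toward the center.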

Thus, social media is not a neutral communication tool, but an active factor that increases the radicalization and polarization of modern society.


The emotional nature of political thinking

Drew Westen's research shows that political thinking is largely driven by emotion rather than rational analysis. When confronted with political information, people first experience an immediate emotional reaction, such as sympathy or antipathy, and only then do the mechanisms of logical reasoning kick in. As he notes, "We don't so much make political decisions with reason as we use reason to justify decisions already made emotionally." This conclusion is supported by neuropsychological data: when evaluating political figures or ideas, the areas of the brain associated with emotions and social evaluations are activated, rather than those responsible for cold, analytical thinking.

Rationality plays a supporting role in this process. It acts not as an independent judge weighing arguments, but as an advocate defending an already formed position. Jonathan Haidt expresses a similar idea, comparing reason and intuition: "Intuition is the elephant, and reason is the rider; the rider doesn't control the elephant, but rather explains where it has already gone." This metaphor aptly illustrates the limited role of rationality in political judgment.

As a result, political views become particularly resistant to change. Because they rest not only on ideas but also on deeply rooted emotions, fear, anger, a sense of belonging, or identity, attempts to change them through logical arguments often prove ineffective. As George Lakoff emphasizes, "People vote not for their interests, but for their identity." This explains why political discussions often reach a dead end: participants speak the language of arguments, but in reality defend emotionally charged positions. Political thinking should therefore be viewed not as a purely rational process, but as a complex interplay of emotions, identity, and subsequent logical justification.


Conclusion

Political views are formed not as a result of cold and dispassionate analysis, but as a result of a complex interaction between psychological mechanisms and the social environment. They are influenced by emotions, group affiliation, cultural context, and the information environment in which a person finds themselves. As Jonathan Haidt notes, "We first feel, then we reason," and this logic underlies most of our beliefs.

Predictability of political positions is not a sign of weak thinking, but a natural consequence of the human desire for internal consistency, social belonging, and simplification of a complex reality. People tend to seek confirmation of their views and avoid information that contradicts them. In this sense, as Leon Festinger, the author of the theory of cognitive dissonance, wrote, "people seek to reduce internal tension, even if this means ignoring the facts."

The problem arises not when a person has a fixed position, but when they stop questioning it. A lack of internal verification turns beliefs into dogmas, making them immune to new data and alternative viewpoints. This is especially dangerous in the digital environment, since, as Cass Sunstein points out, "echo chambers reinforce certainty, not accuracy."

Perhaps the key sign of independent thinking lies not in originality of views, but in the ability to step outside one's own belief system and critique it. This requires not only intellectual effort but also a psychological willingness to question. That's why the key question to ask yourself is: "Why do I think this way, and what would it take for me to change my mind?"

by @lev_me_vision
