Besides logical fallacies, which a speaker may use intentionally or unintentionally to make an argument seem more persuasive or valid, a critical thinker needs to be aware that our own thought processes are not always clear or rational. They can be altered by a “filter” that is activated unconsciously, automatically and in a way that is not easy to control, and that can distort how we perceive things and get in the way of making good logical decisions. This “filter” is based on our personal experience and preferences, pre-existing opinions and built-in inclinations, and it is called “cognitive bias”. Our brain activates it in order to categorise, prioritise and process the vast amount of input it receives.

Newspapers tend to have an organisational ‘view’ or political slant, and this affects both what they report and how they report it. In the same way, we all have opinions and tendencies, shaped by cultural norms and beliefs, and these affect the way we form our opinions and evaluate what we read or hear, leading to systematic deviations from rationality or good judgement.

Communication strategies aimed at convincing people of opinions and ideas, used for example in politics and advertising, can exploit our cognitive biases to shape our views, relying less on evidence, reason and logic and more on appeals to our flawed patterns of feeling and thinking.

Being aware of how our brain works in this respect, and of what kinds of biases exist, helps us avoid being influenced by them when forming our own opinions.

Below is a list of some of the most common cognitive biases, divided into three main categories:


Selection Biases

Selection biases are caused by choosing non-random data for analysis. The bias arises from a flaw in the sample selection process: some information is unconsciously chosen or disregarded, misleading the analyst into a wrong conclusion.

Confirmation Bias

The tendency to easily accept information that confirms your point of view and ignore or reject information that does not support it.

Anchoring Bias

The tendency to place excessive weight or importance on one piece of information – often the first piece of information you learned about a topic.

Absence of Evidence

A failure to consider how complete the available evidence is, and to address the impact of missing information on analytic conclusions. The absence of information does not indicate the absence of a problem; it may simply mean that information about a potential issue could not be obtained.


Social Biases

Social biases result from our interactions with other people. The way we process and analyse information depends on our relationship with the people who provided us with that information or with hypotheses.


Attribution Error

Overemphasising personality-based explanations for behaviours observed in others, while under-emphasising the role and power of situational influences on the same behaviour.

Thinking that a farmer managed to sell more wheat because he/she is very hard-working, and not because he/she had the opportunities (maybe he/she lives closer to the market), means (maybe he/she used new fertilizers) and support (several members of his/her family help him/her) to achieve such results.


Mirror Imaging (also known as projection)

Assuming that others will act as we would in similar circumstances, or that the same dynamic is at play when something appears to happen in a context similar to one from the past.

At the beginning of the Ebola crisis, humanitarian actors assumed that affected communities would be open to sensitisation campaigns and were surprised by the aggressive attitude of the affected populations.

Stereotyping

Expecting a group or person to have certain characteristics without real information about them. Stereotyping allows us to quickly identify strangers as friends or enemies, but we tend to overuse it even when no danger is perceivable.


Process Biases

Process bias is our tendency to process information based on cognitive factors rather than evidence. When we process information, we often display inherent thinking errors that prevent an analyst from accurately understanding reality even when all the needed data and evidence are at hand.

Negativity Bias

Paying more attention, and giving more weight, to negative rather than positive experiences or other kinds of information.

cluster icon

Clustering Illusion

Overestimating the value of perceived patterns in random data. The human brain excels at finding patterns and relationships, but tends to overgeneralise. In particular, we often confuse correlation with causation: two things may be correlated, meaning they appear to follow the same path, without one causing the other.

Framing

Being influenced in our decisions by how a situation has been presented.


Functional Fixedness

The tendency to use an object or an idea only in the way it is traditionally used.

Mere Exposure Effect

The tendency to like something just because you are familiar with it.

Not Invented Here Bias

The tendency to discount information, ideas, standards, or products developed outside of a certain group.


Reactance

The urge to do the opposite of what you are asked to do in order to preserve your freedom of choice.


Status Quo Bias

The tendency to want things to stay relatively the same as they have always been.

