
Yet another bias?



The term unconscious bias has become familiar to many people. Many of our clients request unconscious bias trainings, workshops and presentations. We are pleased about this, because talking about unconscious bias is important. In addition to the well-known gender bias, there are several other types of bias - so that no one loses track, here is an overview.


First of all: our thinking is subject to various distortion effects - and unconscious bias is one of them. In science, these effects were investigated long before the term even existed, especially with regard to errors in data sets, calculations, perception and thought patterns. In statistics, for example, biases are systematic errors that can distort an entire data collection. There are also biases that seem to be programmed into our way of thinking: so-called cognitive biases.


Other biases are socialized and grow out of social structures. We are often influenced, usually without realizing it, by sexist, racist or classist narratives; the list goes on. As a result, our perceptions, the decisions we make and the judgments we pass are distorted. Gender bias, for example, is - as the name suggests - a bias based on someone's gender (1). Similar to a small error at the beginning of a mathematical calculation, gender bias distorts everything that follows: a kind of learned, sexist thought error.


There is a long list of types of bias that play a major role in everyday professional life - even if most of them run in the background. At first, it seems that HR departments are particularly affected, as biases are especially visible in selection processes. However, the following overview shows that everyone in the organization can start with themselves. It is important to review your own beliefs and proactively rethink processes and procedures in order to correct inscribed biases. Some of the bias types listed have their own detailed entry with further studies.


Gender bias 


The term gender bias covers various types of bias towards a person on the basis of their (perceived) gender identity. It therefore encompasses a whole range of very different cognitive distortions that disadvantage women, non-binary and trans people in particular.


One example of this is the prevailing androcentrism - i.e. the fact that men and their perspectives, needs and problems are set as the norm. In an androcentric system, this norm-setting is not consciously practiced; rather, it is regarded as neutral and self-evident. This type of bias has become widely known, especially since 2019, when the award-winning book "Invisible Women" by Caroline Criado Pérez (2) was published. It describes in detail the extent to which our androcentric social order and the associated standardization of men as the norm systematically disadvantage women in medicine, road traffic, politics and the world of work.



In addition to androcentrism, there are other facets of gender bias, such as gender insensitivity and double standards of evaluation. The latter has been researched in great detail and shows that people in the same situations, with the same behaviors or characteristics, are evaluated differently depending on their gender. More examples, information on the current state of research and what can be done about this can be found in the corresponding blog entry.


Affinity bias 


In very simplified terms, affinity bias describes the phenomenon that people prefer other people who resemble them. This can be based on obvious shared traits that lead one person to see themselves in the other. However, the bias can also take effect in more subtle ways, for example when two people have similar body language. People often do not realize that the sympathy they feel is based on a similarity - it happens subconsciously.


Affinity bias is also at the core of phenomena such as the Thomas cycle, which scientists such as Dr. Philine Erfurt-Sandhu (3) have been researching for years. This describes the appointment of German board members according to the principle of similarity, in this particular case based on the first name Thomas and the typical social identities associated with it. For example, there are more people named Thomas or Michael on the boards of DAX, MDAX & Co. than there are women overall (4). The Thomas cycle makes it clear that different types of bias can work together and intertwine - it is often difficult to say which bias is taking effect. A decision can therefore be affected by both gender bias and affinity bias.


Race bias


Race bias is well researched, with a growing body of studies analyzing skin color and (perceived) ethnic origin and their impact in the workplace. For example, a study published in 2004 by Bertrand and Mullainathan examined the response rate to job applications sent out in Boston and Chicago. The CVs sent out by the test groups were completely identical - except for the applicants' first names. It turned out that applicants with White, European-sounding names had to send out an average of ten applications to receive a callback. Tamika, Aisha, Rasheed and Tyrone, on the other hand, had to send out 50 percent more applications for the same result (5).


There are also experiments in German-speaking countries that show a bias against Black people and people of color - whether in the search for housing or in the workplace.


The halo effect


Studies on the halo effect have been carried out since the 1920s - so the phenomenon was known in research long before the current diversity debates (6).


The halo effect does exactly what the name implies: it puts a halo around a person and causes them to be judged through a filter - and therefore more positively than they should be. Certain information in a person's CV can stand out and make an impression: the name of an elite university, an aristocratic-sounding surname or a renowned company. This information creates a kind of "anticipatory trust", and everything the person says or does later in the assessment process, e.g. at a job interview, is cast in a more positive light.


Conversely, the horn effect can attach two devil's horns to a person from the outset because of the "wrong" information - such as a dialect, an external feature that irritates or does not "fit in", or a bad residential area. As a result, any information the person reveals about themselves is viewed particularly critically and the encounter is doomed to failure.


Bandwagon bias


The bandwagon bias does what its name suggests: a bandwagon rolls through town, someone jumps on and rides along. The bias is based on the observation that people (unconsciously) orient themselves towards individual pieces of information when making decisions or judgments. Anchoring mechanisms, for example, lead people to be influenced by existing figures when assessing prices (or appropriate salaries). This anchoring effect (7) is often used in advertising: a product is labeled with two prices, the supposed original price, now crossed out, and the supposedly reduced price. Regardless of how much the product is worth or whether a reduction actually took place, we think we have landed a bargain.


In the world of work, the bandwagon bias is particularly evident in application processes in which one person gives their opinion of a candidate - and then asks the others for theirs. These later opinions tend to align with the one already voiced. If there is also a hierarchical relationship, the effect can be reinforced: the opinion of the person with the most perceived power is the one people follow. What can be done about this? One solution is application procedures in which reviewers judge separately and only enter into discussion after a written evaluation (e.g. in keywords, with post-its or similar). Notes on candidates' documents should also be avoided, as they influence the opinion of whoever receives the documents next.


Not only relevant for recruiters


What to do with all the different types of bias? It sounds banal and yet it is still not the norm: set criteria. The best way to combat the types of bias mentioned above is to have carefully considered, clearly defined and transparently communicated criteria for everyone involved in the application process. Of course, this also includes the question of how the criteria can be measured. For example, what are the criteria for measuring "openness"? 



Bias can distract us from seeing the information we ought to see.


Another tip for making bias visible is to keep observation and interpretation separate; these two levels should never be mixed. An observation is everything that could be captured on camera. Everything beyond that is interpretation, i.e. the effect the observation had on the observer. However, there is no panacea against bias. It is worth critically reorganizing processes, training everyone involved and constantly taking a hard look at yourself: Why do I favor this candidate? How would this statement have come across coming from the previous candidate?


It is also important to note that the types of bias shown here are not only relevant for HR professionals. Biases affect us all at work, regardless of the role we perform. They determine how we perceive and judge each other, how we speak to each other, who we like to have lunch with and who we confide in. They therefore play a significant role in shaping work culture - to the disadvantage of some and the advantage of others. Knowing bias, naming it and declaring war on it is essential for an equal working culture.


(1) Tversky, A., Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124–1131.

(2) Criado Pérez, Caroline (2019). Invisible Women.

(4) AllBright Stiftung gGmbH (2017). Ein ewiger Thomas-Kreislauf? Wie deutsche Börsenunternehmen ihre Vorstände rekrutieren [An eternal Thomas cycle? How German listed companies recruit their boards].

(5) Bertrand, M., Mullainathan, S. (2004). Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination. American Economic Review, 94(4), 991–1013.

(6) The term was first used by Thorndike, E. L. (1920). A Constant Error in Psychological Ratings. Journal of Applied Psychology, 4(1), 25–29.

