5.2 Understanding potential biases
In section 2, we discussed how to deal with errors, duplicates, missing data, and outliers in your dataset. Such issues are data quality limitations that need to be honestly acknowledged in the methodology section of your report. Doing so helps others understand how reliable your data and conclusions are, and points to contextual challenges that may arise if similar data collection or research is repeated at other times or in other locations.
It is also helpful to be able to spot limitations in data collected or analysis drafted by others: see ACAPS - Spot Dubious Data.
Given the contexts in which humanitarian and development professionals operate, the data we collect generally has significant limitations, often resulting from the need to be pragmatic about research design on tight timelines. Limitations can occur in many ways and at many levels. Without claiming to be exhaustive, we list here some common limitations and how they may affect your work, to encourage you to think about the possible biases in the data you collect yourself.
One of the main limitations often lies in the sample chosen, which can be difficult to make representative, especially when there are strong access constraints in challenging or hard-to-reach locations.
The difficulty of including a particular group, often due to access constraints, can lead to the ‘invisibilisation’ of certain issues or even cause harm. If decisions are made on the basis of biased or partial data, the perspectives and behaviours of certain population groups may be discounted from the conclusions drawn from the analysis.
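To see how under-coverage of a hard-to-reach group can skew a headline figure, consider a minimal simulation with entirely invented numbers (the group sizes, need rates, and reachability probability below are illustrative assumptions, not real data):

```python
import random

random.seed(42)

# Hypothetical population (invented figures, for illustration only):
# Group A (accessible):    70% of the population, 20% report a critical need.
# Group B (hard to reach): 30% of the population, 60% report a critical need.
population = ([("A", random.random() < 0.20) for _ in range(7000)]
              + [("B", random.random() < 0.60) for _ in range(3000)])

true_rate = sum(need for _, need in population) / len(population)

# A biased sample: suppose access constraints make group B members only
# one tenth as likely to be reached as group A members.
biased_sample = [(g, n) for g, n in population
                 if random.random() < (1.0 if g == "A" else 0.1)]
biased_rate = sum(n for _, n in biased_sample) / len(biased_sample)

print(f"true prevalence of need: {true_rate:.2f}")   # roughly 0.32
print(f"biased sample estimate:  {biased_rate:.2f}")  # roughly 0.22
```

Because group B is nearly invisible in the sample, the estimated prevalence of need is substantially lower than the true value, so the scale of the problem among the hardest-to-reach group is exactly what the analysis understates.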
To illustrate, Caroline Criado Perez, in Invisible Women: Exposing Data Bias in a World Designed for Men (2019), provides many examples of the impact of the absence of gender-disaggregated data on (especially political) decision-making. You can find a good summary in Janet Anderson’s article “Caroline Criado Perez’s book Invisible Women calls for rethinking algorithms before it’s too late”.
Further, a lack of trust among a population, or part of a population, can lead individuals not to answer, or to give untrue answers to, many survey questions. As an example, see this case study based on ACAPS’ experience in Cox’s Bazar.
Other key elements that can lead to the collection of partial or incorrect information include inadequate preparation of questionnaires, in particular insufficient attention to contextualized translation. Language barriers or poor translation of questionnaires can lead to highly biased responses and, in turn, biased analyses. Translators Without Borders has published a report focusing specifically on potential translation issues with enumerators in humanitarian surveys, and what this means for quantitative data collection and results - The words between us, TWB, 2018.
The same is true of contextual misunderstanding, which can lead to questions being asked in a way that is incomprehensible to the target population (or to certain groups within it). Contextual misunderstanding produces questions that (at best) yield biased data or (at worst) put sensitive or inappropriate questions to respondents.
It is therefore essential to understand which data collection methods are appropriate for the context and the target population, the right way to ask questions, and the right language to use, as well as the need to create a climate of trust. This is not always possible.
Whether or not it is possible to address these sources of bias, they must be explicitly stated in your analysis so that they do not distort the decision-making that results from it.
An additional layer to consider when discussing the limitations of analysis is how cognitive biases, present in all human beings, affect our reasoning. Cognitive biases shape the reasoning that guides our analysis and are therefore inherent in any conclusion we draw from the data. These biases should always be acknowledged so that their impact can be mitigated.
ACAPS provides more details on how biases can affect our analysis here.