6.1 Needs definition
TABLE OF CONTENTS
- Case study: too-rapid implementation of a tool
- Case study: overzealous staff
- Case study: culturally inappropriate data collection (or contrary to empowerment)
- Key resources
Case study: too-rapid implementation of a tool
The situation
You provide logistics support for an NGO assisting refugees in a context of war, and you have produced, and keep up to date, an online map of the security risks and the locations of armed groups in your area of intervention. The link is not easily found on the Internet, but you have shared it by email with the relevant teams to facilitate their travel; the free version of the mapping tool does not require an account.
You learn from the local radio that one of the armed factions has become aware of the map’s existence, resents the disclosure of information about it, and is disparaging your organisation’s work with local communities, accusing you of working for other factions.
What are the potential risks?
Whilst the risks will always be context-dependent, we can well imagine:
- For the beneficiaries, a loss of confidence in the NGO’s actions and in the principle of neutrality of NGOs, or even the risk of becoming targets of intimidation, pressure or violence by the armed group.
- For the teams and the NGO, wasted time, reputational damage, or even poor security decisions because they will no longer have access to critical, good-quality data.
What to do?
The immediate priority is to decommission the map, or to shut down access (for instance by setting appropriate user rights) if that can be done quickly, and then to put in place mitigation measures defined according to the potential damage caused (protection of your teams, protection of the beneficiary populations, interactions with the communities concerned, a communication campaign, etc.).
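To make the idea of “appropriate user rights” more concrete, here is a minimal sketch of an access check placed in front of such a map. It assumes the map is served from a small web application that your team controls (the case study does not name the mapping platform), and the framework, account name and password used here are purely illustrative:

```python
# Minimal sketch: only serve the security map to authenticated users.
# Assumptions: the map is a local file (map.html) served by a small Flask app
# that the team controls; in reality credentials should be managed properly
# (individual accounts, hashed passwords), not hard-coded as below.
from flask import Flask, Response, request, send_file

app = Flask(__name__)

# Hypothetical account issued only to the relevant teams.
USERS = {"logistics_team": "change-me"}

def authorised(auth):
    return auth is not None and USERS.get(auth.username) == auth.password

@app.route("/security-map")
def security_map():
    if not authorised(request.authorization):
        # Ask for credentials instead of serving the map to anyone with the link.
        return Response("Access restricted", 401,
                        {"WWW-Authenticate": 'Basic realm="security map"'})
    return send_file("map.html")

if __name__ == "__main__":
    app.run()
```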
How could the situation have been avoided?
This is a typical case of an NGO staff member thinking they were doing the right thing, without being aware of data protection or of how to apply the “do no harm” principle to data management tools.
An awareness-raising, situational and/or peer-exchange session facilitated by a focal point, covering topics such as the steps needed to choose or implement a tool responsibly, or how to conduct a contextualized risk analysis, would probably have led the staff concerned to ask themselves the right questions, at the right time, about the risks of their approach.
It is of course also the responsibility of the Head of Mission to ensure that all teams are aware of the contextual risks and equipped to deal with them.
Case study: overzealous staff
The situation
You are a project manager coming from a medical NGO that handles very sensitive data, and you start working in a small rural development NGO that supports farmers with training and advice on their farms and on the fair sale of their agricultural products. At the start of a project aimed at understanding farmers’ knowledge of new seeds they are about to use, a data collection exercise is carried out among them. As you would normally do, you produce a thorough risk analysis, mapping all of the actors in the area and their potential interest in the data; you then run a five-day training course for the three project enumerators so that they can take ownership of the collection among the farmers and avoid any blunder; and you draw up a collection plan involving data encryption and a server not connected to the Internet. Your head of mission tells you that, between the licence costs and the training time, there is no budget left on the project.
What are the potential risks?
- The solution is not adapted to the specific needs of the activity and is very expensive.
- The technical complexity of what has been implemented will be difficult to pass on to the rest of the team.
- Maintaining such a solution over time, with handover and training of teams, is overly ambitious.
- There is a risk that the tool or procedure will be abandoned, or never properly taken up by the team.
- Technical one-upmanship does not guarantee data security.
What to do?
Reconsider your approach, fall back on a simpler collection method, and avoid going that far in future.
How could the situation have been avoided?
There is no universal guide for defining your data protection needs: every situation differs in terms of context (political, legislative, technological, ethical, partnerships…), of risks, and of the resources allocated to the subject.
Nevertheless, in this case, in a context that appears low-risk at first sight, the approach implemented by the project manager seems disproportionate to the potential risks (a list of farmers supported for skills development). Let us not forget that perfect is the enemy of good: there is no need to implement unnecessarily complex procedures and training if the situation does not require it.
In this type of situation, it is essential to remain firmly practical: start with a quick general analysis, perhaps assessing one or two actors at risk more closely (e.g. grain companies with special interests in the region, or personal data legislation), exchange with other actors who may have faced similar issues to check the coherence of the approach envisaged, and establish a plan designed in the light of these elements.
The opposite case (too little attention to data protection) clearly occurs much more frequently, but it is also important not to overdo it, so that responsible data management is not seen as a cumbersome normative exercise rather than an opportunity for practices that are more respectful of people and organisations.
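As a way of keeping such an analysis quick and proportionate, the sketch below illustrates a very simple likelihood-and-impact scoring of a couple of actors at risk. Neither the scoring method nor the actors and values are prescribed by this guide; they are illustrative assumptions only:

```python
# A deliberately simple risk scoring sketch: a short list of actors, each with
# an assumed interest in the data, a likelihood (1-3) and an impact (1-3).
# All names and scores are hypothetical and would need to be set from the
# contextual research described above.

actors = [
    # (actor, interest in the data, likelihood 1-3, impact 1-3)
    ("Grain company active in the region", "commercial reuse of the farmer list", 2, 2),
    ("National data protection authority", "compliance with personal data legislation", 1, 2),
]

def risk_level(score):
    """Translate a likelihood x impact score (1-9) into a coarse level."""
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

for actor, interest, likelihood, impact in actors:
    score = likelihood * impact
    print(f"{actor} ({interest}): {risk_level(score)} risk, score {score}/9")
```

The point is not the tool (a sheet of paper works just as well) but the discipline of naming the actors, their interest in the data and the plausible damage before designing the collection plan.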
Case study: culturally inappropriate data collection (or contrary to empowerment)
The situation
During a “Knowledge, Attitudes, Practices” survey on water, hygiene and sanitation (a “WASH KAP” survey) in a refugee camp, carried out to better understand the needs in terms of awareness activities and associated distributions, you ask your enumerators (mainly men) to question the girls and women concerned about menstruation. You also ask them to take pictures of the types of sanitary protection used and of the places where they are changed.
The context is one in which the subject is relatively taboo and not discussed with the men of the households, who are generally the people the enumerators have access to for their questionnaire.
Your organisation then receives various complaints about this, and the authorities request that the activities in this area cease.
What are the potential risks?
- Risk of stigmatisation of the data subjects, the beneficiaries of the activities and the enumerators, and of non-compliance with ethical principles.
- Risk of loss of credibility and access for teams, or even violence against them.
What to do?
In the short term, there is little to do other than to review the modalities and content of the collection, if there is still time, in order to adapt them better to the context in question. This could for instance involve a consent request before the questions are asked.
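By way of illustration, the sketch below shows how a consent question can gate a sensitive module in a scripted questionnaire, so that the questions are only ever asked when consent has been explicitly given. The case study does not specify a data collection platform, so the wording and structure here are assumptions, not a template:

```python
# Illustrative consent gate for a sensitive survey module. The questions and
# consent wording are hypothetical; adapt both to the context and language.

SENSITIVE_MODULE = [
    "Which types of sanitary protection do you use?",
    "Where do you usually change them?",
]

def run_interview(ask):
    """Run the sensitive module only if consent is explicitly given.

    `ask` is any function that poses a question and returns the answer
    (the built-in input() is used below for demonstration).
    """
    answers = {}
    consent = ask(
        "We would like to ask a few questions about menstrual hygiene to plan "
        "awareness activities and distributions. You can decline with no "
        "consequence for the assistance you receive. Do you agree? (yes/no) "
    )
    answers["consent"] = consent.strip().lower()
    if answers["consent"] == "yes":
        for question in SENSITIVE_MODULE:
            answers[question] = ask(question + " ")
    # If consent is refused, the sensitive questions are never asked and only
    # the refusal itself is recorded.
    return answers

if __name__ == "__main__":
    print(run_interview(input))
```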
How could the situation have been avoided?
This type of situation could have been avoided by doing a little contextual research (discussions with other NGOs, key informants, focus groups…) to understand what is and is not acceptable in the context in question, and what the populations concerned actually want. The collection and the activities then need to be adapted, respectfully, so that they are in line with local cultural norms: for example, by investing in the identification of enumerators or “relay persons” within the community who can address this sensitive topic without causing offence, or by holding a group discussion to present and explain the need for this information and to determine with members of the community the best way to collect sufficient, good-quality data.
Key resources
- This OCHA guidance on a data impact assessment - Note #5: Data impact assessment in support of the Operational Guidance on Data responsibility in Humanitarian Action (IASC)
- These Engine Room resources deciphering the concepts and steps of responsible data management: an introduction to a number of principles and an introductory workshop on good reflexes in program data management
- The checklist in part 6 of the Netherlands Red Cross 510 Data Responsibility Policy
- Several proposals from other sector resources are available in Section 7, such as an analysis grid (produced by Care) to help you gauge your organisation’s maturity level in terms of responsible data management
- This Data Ethics Canvas by the Open Data Institute
- This MERL Tech / Clear Global resource, the Responsible Data in M&E (RDiME) Alliance, to help you think about “Data Governance” (in an African context)
- Articles to read about problematic situations in program data management related to poor risk analysis: here, here and here