A TECHtonic Shift

Positive and negative effects of digitisation on human rights

At the BSR Conference in 2014, whilst introducing two speakers, Eileen Donahoe of Human Rights Watch pretty much nailed why those working in human rights should be thinking about the impact of technology on what they do, and why the authors of this blog believe we need to be talking a lot more about the good, the bad and the ugly effects on human rights of the ‘rapid expansion in ICT and the exponential adoption of digital technology’.

As Ms Donahoe says herself,

“This is an exceedingly complex area and we are in a period of profound societal change and disruption, almost a tectonic shift… None of our social, political and legal institutions have caught up with the implications of this shift and our understanding of how to protect human rights is being deeply challenged.

Specifically, the well-established human rights framework envisions a primary relationship between governments and their citizens, where governments have a duty not to violate their own citizens’ rights and an obligation to prevent human rights abuses by others. The UN Guiding Principles on Business and Human Rights reaffirm this principle but elaborate the responsibility of non-state actors, namely businesses, to respect human rights. Several significant global shifts have challenged this basic model. Here are three:

  • The internet: In many ways it is a boon to the exercise of human rights, but it has also contributed to the distribution of power away from governments to non-state actors. It challenges the concept of the sovereign nation state, geographical territorial boundaries and the principle of non-intervention. A significant portion of this power has been distributed away from governments to the private sector, especially tech companies. States frequently rely on, and even require, private sector businesses to facilitate law enforcement and foreign intelligence surveillance.
  • Digitisation: The advancement in digital technology has many positive effects, but it has also meant that governments have an advanced ability to monitor citizens’ movements, censor speech, block and filter access to information, and track communications. This puts privacy under assault and has negative implications for the exercise of all fundamental freedoms. When everything you say or do can be tracked, intercepted, monitored or surveilled by your government, it has a chilling effect on what you feel free to say, where you feel free to go and with whom you choose to meet.
  • The follow-on response of many governments around the world to the Snowden revelations: with lots of mixed motivations, governments have proposed regulations on businesses related to data retention and data localisation, sometimes with good intentions, sometimes not. These measures may also have the effect of fragmenting the open, interoperable internet as a global platform. From a human rights point of view, the loss or fragmentation of the internet would be a tragic result, because the internet itself has facilitated so much human rights work.”

It cannot be in doubt that technology has facilitated human rights work and the creation of human rights groups who use the power of technology in exciting and innovative ways. Bytes for All, a Pakistani human rights organisation and research think tank with a focus on ICTs, is one such group. One of its projects, Take Back the Tech, which was awarded the inaugural GEM-TECH Award from UN Women and the International Telecommunication Union for efforts to reduce threats online and build women’s confidence and security in the use of ICTs, focuses on the strategic use of ICTs by women and girls to fight violence against women in Pakistan.

A different type of organisation working with tech for good is Defindia, based in New Delhi, which focuses on the delivery of ICT content. It wants individuals and communities to use technology to their own advantage. One of its projects, eMSME, is a web service package designed to provide services at minimal cost to very small entrepreneurs. Defindia wants communities to use ICT for their empowerment, providing accessible and affordable digital content tailored to cultural specifics and language needs.

Tactical Tech is a group looking at the use of information in activism. Their Evidence & Action programme looks at the use of data, design and technology in campaigning and their Privacy & Expression programme helps activists understand and manage their digital security and privacy risks.

Funding for some of these organisations comes from tech firms set up for ‘social good’. One such company is Benetech, a leading Silicon Valley-based non-profit tech company, whose founder is social entrepreneur and former rocket scientist (and at this point the child in me says ‘Wow, that is cool, I want to say I’m a former rocket scientist’) Jim Fruchterman. The company funds programmes on literacy, human rights and the environment. The Martus Project is a human rights initiative it funds, which aims to lower the barriers to strong digital security so that those working with vulnerable populations, such as refugees, are able to collect, store and transfer individually identifiable information in secure ways.

So far so good. But what is the bad, and indeed the ugly? As Eileen Donahoe points out, the impact of advances in technology on freedom of expression and privacy is huge. The threats to these fundamental human rights are rightly receiving a lot of attention. But what else? Prepare yourselves, roboteers: Professor Noel Sharkey, formerly of the UK programme Robot Wars and now part of the Campaign to Stop Killer Robots, speaks about the perils of autonomous weapons, the ‘moral buffer’ of fully autonomous drones used for targeted killings, and the pressing dangers that military robots pose to peace and international security. As he warned, robots cannot reason at all:

“I would go to bed at night and I would be thinking about this and I would have horrible dreams: this child with a toy gun running out in front of a soldier, his mother screaming at him to come back. A soldier would understand that setting, a robot wouldn’t. It would think ‘Oh, there is something happening here’ and gun down the child.”

And the ugly? Forbes published an article at the end of 2014 titled ‘People, not Technology, Must Solve Humanity’s Problems’. The ugly truth is that social media has amplified already existing race and class divisions. It has exposed us as the contradictory beings we really are.

“2014 to you may have seemed like every other year where we bicker, spit, and spew vile words at each other under the guise of some transparent progress. Yet in our fight for human equality in areas of poverty, race, religion, and gender, our technology has actually shown us how brutally unequal we really are….

Our technology has shown us how afraid we really are of each other. Our technology has shown us how much we’ve stopped listening to each other, and instead it has shown how we try to obtain the last word in a desperate bid of righteousness.

I don’t care if you’re sitting on the far left or the far right on issues that matter to our society. We’ve hyperpolarized our opinions to a point that nothing makes sense anymore…

We’ve confused privilege with human rights, and created an entirely different culture war that seeks to undermine the one thing we should be showing: kindness to one another.

We’ve drowned out the real agents of change, and given platforms to people who only speak nonsense.

In 2015 let’s press mute on those people.”

The article ends with a plea: ‘Let’s use our technology to move the existence of all people forward, not just yours.’ Here’s to hoping we do.


Camilla Wood

UK-based Legal Aid Lawyer
