The Computer Says…
I have just returned from holiday with friends in a warmer, sunnier place (just had to get that in), where my only stress was keeping myself right on the various documents and protocols required pre and post departure. In particular, we all found the return journey processes a little stressful, as they had to be completed online and saved to our phones because we had no access to a printer! (Paper is my security blanket!)
This experience reflects how quickly society has adapted to new technology shaping the way we live, work, and travel. Apps for this – vaccine passports for that – rapidly evolving technology is enabling us to reopen to the wider world, yet at the same time, potentially shutting it down for others.
This verification of our “status” brings into sharp focus the way our personal data is managed, stored and shared. It also raises a whole raft of issues about our reliance on computer technology – which we all know is great when it’s working. But what about when it isn’t?
The recently opened public inquiry into the Post Office and the Fujitsu Horizon computer system is unearthing tales of truly horrendous and shocking experiences caused by computer errors that have forever tarnished the lives of over 3,500 subpostmasters and their families, including the over 700 who were wrongly convicted. Tragically, at least 33 people are thought to have died before their names were cleared. Read further from The Guardian here and the Huffington Post here.
This issue was discussed on last week’s BBC Question Time. I am paraphrasing Juergen Maier, former CEO of Siemens, when he said this was a case of the computer saying one thing while human logic should have prompted exploring some other explanation, rather than apparently assuming the organisation had so many “corrupt” people working for it across the country. Sadly, there appeared to be very little faith in the premise that most people are good people – and instead, the computer won the argument.
And sadly, the computer is continuing to win the argument when it comes to equality and bias as Caroline Criado Perez points out in her book Invisible Women. Computer algorithms have been consistently churning out data and solutions that cater for a world designed for men. But there are increasingly wide and far-reaching consequences if we don’t get to grips with this. And yet, so far, it’s unclear what will be the catalyst for tech companies to make widescale changes to combat this.
A 2019 article by openDemocracy also explores the issue of algorithms perpetuating inequality and bias, citing a number of examples that should give us all pause for thought – and yet we are not even scratching the surface in terms of the scale of the problem.
With the accelerated use and application of technology during the pandemic, the pace of change through recovery shows no sign of letting up. Yet still, only 22% of professionals in AI and data science fields are women, and they are more likely to occupy jobs with less status (or influence). So, when will we see the scale of strategic intervention that will make these necessary changes happen?
The tech world already knows the problems with current algorithms and how these are shaping our future – but there does not seem to be any cohesive desire or motivation to fix it. Technology brings many benefits and influences so many aspects of our lives. However, our data is harvested daily across a spectrum of sources, with conclusions drawn about us from it through the algorithms that are used.
How comfortable are we that built-in bias is shaping the choices we are offered, and those we subsequently make? While technology undoubtedly brings society great benefits, what else might we be losing in our haste to embrace the technology rollercoaster – or is it increasingly only the computer that gets to say no?