
New Zealanders uneasy over automated decision-making

New Zealanders are uneasy with how automated decision-making systems are used in society, particularly by the Government, according to new research. 

The Digital Council for Aotearoa report, Towards trustworthy and trusted automated decision-making in Aotearoa, notes that while New Zealanders appreciate the way the technology can be used to improve people's lives, some remain uncomfortable with it.

Automated decision-making is when some aspects of decision-making processes (eg, visa applications, media consumption, recruitment, youth support, parole decisions, surgical waiting lists) are carried out or informed by computer algorithms.

Concerns about bias and discrimination

"People appreciate that automated decision-making is useful for processing data at speed and at scale, and as an assistant to people," says Digital Council chairperson, Mitchell Pham. 

"But people worry that systems, programmers and decision-makers can introduce bias into these."

As one workshop participant said, "Algorithms are only as good as the people who designed them. Machine learning might help with that, but right now most of the algorithms are people-designed, so people's individual biases and so on can come into play on that."

More transparency and better communication needed

"The Government and private companies hold significant datasets about the lives of individuals, whānau and communities," says Pham. 

"Communities need assurance and transparency about the data that feeds algorithms: what algorithms are doing, their purpose, who is making the decisions that affect people's lives, and what will be done with the data they use and collect," he explains.

"There was a general consensus among participants about transparency and how people should be ... able to see the impact of the algorithm: so we know what decisions it's making, what impacts those decisions have, who is designing it and making the decisions about design, how often it's getting things wrong."

Better involvement and representation needed

"People who use automated decision-making tools are making sizeable moral and ethical decisions that positively or negatively impact people's lives," says Pham. 

"Those people need to be a good reflection of the lived experience of the citizens they are there to serve."

A workshop participant commented that, "My hunch is that the health professionals and systems creating these algorithms don't have disabled people's views in them. And what those people see as quality of life is quite often different to what we think of as quality of life."

Another participant added, "My biggest concern is that the algorithms involved with identifying these kids are probably created by white middle-aged men, which therefore marginalises indigenous people and values. So my concern is the bias of the programmer will come through in the algorithm.

"I think the idea is quite cool and can see the intended benefits, but I'd feel more comfortable if indigenous people are involved in the processes of creating these algorithms."

What it would take for people to have more trust in algorithms and those using them

"People are not passive mathematical problems to be fixed by remote," says Pham. 

"The answer isn't hidden in an algorithm. The government needs to be looking at relational responses to addressing public unease. 

"Communities have the right and necessity to be involved in creating algorithms that are used on their people. This is a key part of honouring Te Tiriti o Waitangi, and was heard by the Digital Council from Māori, Pasifika, migrant, disability and youth communities alike," he says.

According to the report, participants talked about their desire for a focus on data and interventions that reflect the strengths and aspirations of whānau and communities. They also talked about wanting to see themselves reflected in teams that design, build, use and make decisions based on algorithms. 

"They want opportunities to exert choice and control over how data is used," Pham says.

"They want consent and buy-in that is actively sought rather than assumed or opt-out."