Dr Andrew Chen
We’ve heard a lot about algorithms in recent years, whether they’re calculating your insurance premiums, running our traffic lights, or deciding what content to show you on social media platforms. While algorithms have a broader meaning, we often think of them as pieces of software running on some computer somewhere, making decisions that affect our lives. It turns out the government has many scenarios where it would like to improve our lives, and algorithms can help it make better decisions, respond to changing situations faster, and allocate resources more efficiently and fairly.
However, there is also the potential for the government to misuse algorithms or use them poorly, whether intentionally or accidentally. The types of decisions that government makes are consequential – they can affect a lot of people very quickly, and the impacts can be significant for individuals. At Koi Tū, we are focused on the long-term impacts and issues that affect how our societies function. Trust in government is crucial for a strong society, yet mistakes in using algorithms and other new technologies can undermine that trust. This is why the government has also introduced an Algorithm Charter, which sets underlying principles and commitments that government agencies should consider when developing algorithms.
The Algorithm Charter, launched in July 2020, asks government agencies to assess six key areas: transparency, partnership and consistency with the Treaty of Waitangi, engaging with impacted communities, understanding limitations and bias, human rights and ethics, and retaining human oversight. It provides a very high-level framework to help policymakers evaluate whether they are doing the right things when using algorithms, and whether they are mitigating the risks.
However, the Charter is not without its pitfalls. Government agencies commit to the Charter voluntarily, and there are no enforcement mechanisms or centralised reporting to ensure that agencies are living up to it. There is significant variation between agencies in how and when they apply the Charter. The Charter doesn’t really engage with Māori, including in terms of Māori data sovereignty. And agencies are now finding edge cases where the Charter and its associated risk matrix don’t really apply – for example, the Charter asks agencies to evaluate negative impact as “unintended harms for New Zealanders”, which leaves out harms to people overseas, to the environment, and to other systems.
These shortcomings are, of course, unintentional, and sometimes we have to operationalise something to find out why it might not work as fully as intended. Stats NZ and the Government Chief Data Steward (GCDS) have recently released their one-year review of the Charter, which reflects on the experiences of government agencies and subject matter experts. It’s clear that the Charter has had a positive overall impact in its first two years, but there is still so much more potential. Greater coordination and sharing of best practice between agencies would significantly lift the bar, supported by better templates and accountability. Maintaining a high standard and repeatedly demonstrating good behaviour helps to build trust between the government and its citizens – we still have a long way to go.
Andrew has written more comprehensively about the strengths and weaknesses of the Algorithm Charter in a new BWB Text, More Zeroes and Ones (eds Anna and Kelly Pendergrast), available in bookstores.