Algorithmic accountability: robodebt and the making of welfare cheats

Publication Name

Accounting, Auditing & Accountability Journal


Abstract

Purpose: The paper aims to investigate how accounting techniques, when embedded within data-driven public-sector management systems, mask and intensify the neoliberal ideological commitments of powerful state and corporate actors. The authors explore the role of accounting in the operationalisation of "instrumentarian power" (Zuboff, 2019) – a new form of power that mobilises ubiquitous digital instrumentation to ensure that algorithmic architectures can tune, herd and modify behaviour.

Design/methodology/approach: The authors employ a qualitative archival analysis of publicly available data related to the automation of welfare-policing systems to explore the role of accounting in advancing instrumentarian power.

Findings: In exploring the automation of Australia's welfare debt recovery system (Robodebt), this paper examines a new algorithmic accountability that has emerged at the interface of government, technology and accounting. The authors show that accounting supports both the rise of instrumentarian power and the intensification of neoliberal ideals when buried within algorithms. In focusing on Robodebt, the authors show how the algorithmic reconfiguration of accountability within the welfare system intensified the inequalities that welfare recipients experienced. Furthermore, the authors show that, despite its apparent failure, it worked to modify welfare recipients' behaviour to align with the neoliberal ideals of "self-management" and "individual responsibility".

Originality/value: This paper addresses Agostino, Saliterer and Steccolini's (2021) call to investigate the relationship between accounting, digital innovations and the lived experience of vulnerable people. To anchor this, the authors show how algorithms work to mask the accounting assumptions that underpin them and assert that this, in turn, recasts accountability relationships. When accounting is embedded in algorithms, the ideological potency of calculations can be obscured, and when applied within technologies that affect vulnerable people, they can intensify already substantial inequalities.

Open Access Status

This publication is not available as open access
