Kelly Moes

Disability Studies Researcher | Intracranial Hypertension Specialist

Automated Ableism: Disability Surveillance, Equity, and Bias in Australian Social Services


Book chapter


Maria Ionita, Kelly Moes, Katie Ellis
[under review], 2024

Cite

APA
Ionita, M., Moes, K., & Ellis, K. (2024). Automated Ableism: Disability Surveillance, Equity, and Bias in Australian Social Services. In [under review] (Ed.).


Chicago/Turabian
Ionita, Maria, Kelly Moes, and Katie Ellis. “Automated Ableism: Disability Surveillance, Equity, and Bias in Australian Social Services.” In , edited by [under review], 2024.


MLA
Ionita, Maria, et al. Automated Ableism: Disability Surveillance, Equity, and Bias in Australian Social Services. Edited by [under review], 2024.


BibTeX

@inbook{maria2024a,
  title = {Automated Ableism: Disability Surveillance, Equity, and Bias in Australian Social Services},
  year = {2024},
  author = {Ionita, Maria and Moes, Kelly and Ellis, Katie},
  editor = {[under review]}
}

Abstract

This chapter problematises welfare agencies' expanding reliance on algorithmic programs and surveillance as determining factors in eligibility for disability services and supports. Examination of three case studies from the Australian context (Robodebt, the Disability Support Pension, and the justice system) shows that algorithmic technologies, such as risk assessment metrics and data-driven surveillance, constitute intrusive practices that reinforce existing prejudices in the policing of people with disability.

It is argued that algorithmic profiling of people with disability enacts a hegemonic process of knowledge creation, not only influencing the ways in which disability identity is assigned and configured across data sets but also reifying ableism as a value system and relegating disability to the confines of health-driven, pathologised ways of thinking. By relying on predetermined metrics and narrow eligibility criteria for services, these processes flatten the complexity of the disability experience. As these examples illustrate, the use of surveillance and networked data to monitor compliance can increase the stigmatisation of, and discrimination against, people with disability. The collection and analysis of personal data in the context of welfare administration enable multifaceted forms of surveillance of people with disability, sanctioning normative assumptions and labelling certain behaviours as compliant or 'risky'. The consequences are far-reaching, perpetuating the marginalisation and disenfranchisement of people with disability.




