Idaho Medical System Algorithm

Short Title Idaho Medical Algorithm Caused Cut of Funds for Disabled People
Location Idaho
Date 2011

Taxonomy Aggregation, Exclusion, Decisional Interference, Distortion
Information Identifying, Medical and health
Threat Actors State of Idaho

Individuals
Affected People on a state disability program in Idaho
High Risk Groups Disabled people
Secondary Consequences Lost Opportunity, Financial Cost

After the State of Idaho introduced a tool to determine costs for home care by means of an algorithm instead of manual calculation, funds for some beneficiaries dropped by as much as 42 percent.

Description

Around 2011, the state of Idaho started using a tool to determine costs for home care for people with disabilities. People could apply to have the state pay for visits from a professional caregiver, allowing them to stay in their own homes rather than move to a full-time care facility. Before the introduction of this automated decision-making tool, the costs were calculated manually by medical workers, individually for each patient. But after the tool was put in place, funds for some beneficiaries dropped by as much as 42 percent.
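The actual Idaho formula was not public until the 2016 court decision and is not reproduced here. As a purely hypothetical sketch, the snippet below illustrates one way such an automated tool can produce sharp, arbitrary cuts: a linear scoring formula that silently treats a missing or lost assessment field as zero. All field names and dollar weights are invented for illustration.

```python
# Hypothetical illustration only -- the real Idaho formula and its inputs
# are not reproduced here. This shows how a formula that silently defaults
# missing assessment data to zero can sharply cut a computed care budget.

WEIGHTS = {           # invented dollar weights per assessment item
    "mobility": 120.0,
    "self_care": 150.0,
    "medication": 90.0,
}
BASE = 300.0          # invented base monthly amount

def automated_budget(assessment: dict) -> float:
    """Compute a monthly budget; missing fields silently default to 0."""
    return BASE + sum(WEIGHTS[k] * assessment.get(k, 0) for k in WEIGHTS)

# A complete record vs. one where a field was lost (e.g. in data entry).
complete = {"mobility": 3, "self_care": 3, "medication": 2}
flawed = {"mobility": 3, "medication": 2}  # self_care score missing

full = automated_budget(complete)
cut = automated_budget(flawed)
print(full, cut, round(100 * (full - cut) / full, 1))  # → 1290.0 840.0 34.9
```

A manual reviewer would notice and question the missing score; the automated formula simply computes a lower number, which is one plausible mechanism behind large, unexplained drops in individual budgets.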

Several privacy violations can be seen here. Among them is Exclusion: until the court decision in 2016, the state refused to reveal the formula it used, as well as the exact data that was processed. This lack of transparency made it impossible for an average person to understand or challenge the use of their personal information.

Once the formula was revealed, it became clear that the data it processed was flawed and that the tool was making arbitrary and irrational decisions with major impacts on people's lives. This can be seen as a form of Decisional Interference: the algorithm interfered with patients' personal affairs by limiting their options, namely the amount of state-financed medical help they would receive.

Even though Aggregation of data can be used to benefit individuals (e.g. faster or easier access to needed medical help), in this case the data that was combined with other personal information was flawed, so the formula produced unreasonable results. This led to Aggregation violating the privacy of patients and caused severe secondary consequences, such as Lost Opportunity to receive proper medical help, as well as Financial Cost from the reduced funds for some individuals.

It is unclear whether the flawed pieces of data themselves constituted personal information, but their use can nevertheless be seen as Distortion, because they were fed into a formula whose output can be seen as personal data. The inaccurate inputs led to misleading and false conclusions about individuals and caused the tool's irrational calculations.

Risk Statistics

Laws and Regulations

Sources

https://themarkup.org/ask-the-markup/2020/03/03/healthcare-algorithms-robot-medicine
https://www.theverge.com/2018/3/21/17144260/healthcare-medicaid-algorithm-arkansas-cerebral-palsy
https://www.aclu.org/blog/privacy-technology/pitfalls-artificial-intelligence-decisionmaking-highlighted-idaho-aclu-case