Liquid war fighting algorithm

Miriam is a survivor of what’s known as “coerced debt,” a form of abuse usually perpetrated by an intimate partner or family member. While economic abuse is a long-standing problem, digital banking has made it easier to open accounts and take out loans in a victim’s name, says Carla Sanchez-Adams, an attorney at Texas RioGrande Legal Aid. In the era of automated credit-scoring algorithms, the repercussions can also be far more devastating.

Credit scores have been used for decades to assess consumer creditworthiness, but their scope is far greater now that they are powered by algorithms: not only do they consider vastly more data, in both volume and type, but they increasingly affect whether you can buy a car, rent an apartment, or get a full-time job. Their comprehensive influence means that if your score is ruined, it can be nearly impossible to recover. Worse, the algorithms are owned by private companies that don’t divulge how they come to their decisions. Victims can be sent in a downward spiral that sometimes ends in homelessness or a return to their abuser.

Credit-scoring algorithms are not the only ones that affect people’s economic well-being and access to basic services. Algorithms now decide which children enter foster care, which patients receive medical care, which families get access to stable housing. Those of us with means can pass our lives unaware of any of this. But for low-income individuals, the rapid growth and adoption of automated decision-making systems has created a hidden web of interlocking traps.

Fortunately, a growing group of civil lawyers are beginning to organize around this issue. Borrowing a playbook from the criminal defense world’s pushback against risk-assessment algorithms, they’re seeking to educate themselves on these systems, build a community, and develop litigation strategies. “Basically every civil lawyer is starting to deal with this stuff, because all of our clients are in some way or another being touched by these systems,” says Michele Gilman, a clinical law professor at the University of Baltimore. “Am I going to cross-examine an algorithm? If we want to be really good holistic lawyers, we need to be aware of that.”

Gilman has been practicing law in Baltimore for 20 years. In her work as a civil lawyer and a poverty lawyer, her cases have always come down to the same things: representing people who’ve lost access to basic needs, like housing, food, education, work, or health care. Sometimes that means facing off with a government agency. Other times it’s with a credit reporting agency, or a landlord. Increasingly, the fight over a client’s eligibility now involves some kind of algorithm. “This is happening across the board to our clients,” she says. “They’re enmeshed in so many different algorithms that are barring them from basic services. And the clients may not be aware of that, because a lot of these systems are invisible.” She doesn’t remember exactly when she realized that some eligibility decisions were being made by algorithms.







