Forums before death by AOL, social media and spammers... "We can't have nice things"
|    alt.culture.alaska    |    People's weird obsession with Alaska    |    51,804 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 50,016 of 51,804    |
|    Democrat Incompetents to All    |
|    Blue State of Michigan's mistake led to     |
|    15 Feb 21 10:25:06    |
XPost: alt.gossip.celebrities, alt.politics.democrats.d, sac.general
XPost: alt.rush-limbaugh
From: democrat-fools@nytimes.com

Democrats, too stupid to program computers.

LANSING — Brian Russell never committed unemployment insurance fraud, or even attempted to do so.

And he had no idea an automated state of Michigan system had accused him of doing anything wrong until 2016, when officials seized his nearly $11,000 tax refund check.

The state finally cleared Russell in 2018, but the false fraud debacle — which has hurt tens of thousands of innocent Michigan residents — undermined his ability to provide for his two kids and led to a bankruptcy filing.

"It's devastating," Russell, a 43-year-old maintenance electrician from Zeeland, told the Free Press. "You would think if they were going to put something that huge in place, they would have someone — or even a team of people — overlooking it and making sure there were no problems."

Experts say the MIDAS (Michigan Integrated Data Automated System) false fraud fiasco, while unique to Michigan in terms of the details, is one of the most glaring national examples of how the use of artificial intelligence by governments is harming citizens. Those most likely to be harmed by such systems, they say, are the economically disadvantaged.

"We're seeing more and more of these kinds of atrocities," said Rashida Richardson, director of policy research at the AI Now Institute, a nonprofit connected with New York University that researches the social implications of artificial intelligence.
Other examples of "intelligent" government computer systems running amok, in Michigan and elsewhere, include:

- In another Michigan case, the Department of Health and Human Services used an automated system to disqualify those with outstanding felony warrants from receiving state food assistance. Between the end of 2012 and the start of 2015, the system produced false matches that improperly disqualified more than 19,000 residents from food assistance. A 2013 federal class-action lawsuit led to an out-of-court settlement and reinstatement of those improperly disqualified.

- In Idaho, the introduction of an automated system to determine the dollar value of disability services available to Medicaid recipients resulted in large cuts for many recipients. A court later found that the system was unlawfully arbitrary, unfair and lacked due process. There have been similar cases related to disability benefits in Arkansas and Oregon.

- In Houston, where a system of algorithms was used to evaluate the performance of teachers, teachers were able to overturn the system on due process grounds. They successfully argued that because the vendor considered the evaluation system a trade secret, they were denied the right to use the data to understand or improve their performance.

- In the District of Columbia, an automated system used to assess the risk of violence among youth in the juvenile justice system was found to be racially discriminatory as applied to one young defendant deemed "high risk" and in need of detention. The system is still in use.

- Other concerns relate to the use of facial recognition technology, which is extensively used by police in Detroit, and "predictive policing," which the Michigan State Police has shown interest in.
The American Civil Liberties Union (ACLU) and other groups are pointing to disasters like MIDAS to push for laws that limit, regulate and increase transparency in the ways governments collect and use data for computerized decision-making.

More: Michigan residents falsely accused of jobless fraud can sue, Supreme Court says

More: State names jobless advocate to lead Unemployment Insurance Agency

Richardson said governments can be expected to continue to expand the range of applications as technology advances and the marketing of systems by software vendors expands.

The "creepiest example" of a new system Richardson is aware of is soon to be implemented in Allegheny County, Pennsylvania, where officials have been using a "family screening tool" and predictive analytics to try to head off child abuse. Starting in January, the county is planning to assign each child and family a "risk score" at birth, according to a county fact sheet and news media reports.

Jim Hendler, a computer science professor and director of the Institute for Data Exploration and Applications at Rensselaer Polytechnic Institute in Troy, New York, said many concerns about government use of artificial intelligence are well-founded and others may be overblown.

[continued in next message]

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
(c) 1994, bbs@darkrealms.ca