AI tools used by English councils downplay women’s health issues, study finds

Artificial intelligence tools used by more than half of England’s councils are downplaying women’s physical and mental health issues and risk creating gender bias in care decisions, research has found.

The study found that when Google’s AI tool “Gemma” was used to generate and summarise the same case notes, language such as “disabled”, “unable” and “complex” appeared significantly more often in descriptions of men than of women.

The research, conducted by the London School of Economics and Political Science (LSE), also found that similar care needs in women were more likely to be omitted or described in less serious terms.

Dr Sam Rickman, lead author of the report and a researcher in LSE’s Care Policy and Evaluation Centre, said the AI could result in “unequal care provision for women”.

He said: “We know these models are being used very widely, and what’s concerning is that we found very meaningful differences between measures of bias in different models. Google’s model, in particular, downplays women’s physical and mental health needs in comparison with men’s.

“Because the amount of care you get is determined on the basis of perceived need, this could result in women receiving less care if biased models are used in practice. But we don’t actually know which models are being used at the moment.”

AI tools are increasingly being used by local authorities to ease the workload of overstretched social workers, although there is little information about which specific AI models are being used and what impact this has on decision-making.

The LSE research used real case notes from 617 adult social care users, which were fed into different large language models (LLMs) multiple times, with only the gender swapped.

Researchers then analysed 29,616 pairs of AI-generated summaries to see how male and female cases were treated differently by the models.
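The study’s own code is not published in this article, so the following is only a rough sketch of the counterfactual approach described above: run the same case note through a summariser twice, once with the gender swapped, and compare how often certain terms appear in each version. The summarise callable, the tracked word list and the swap rules are illustrative assumptions, not the researchers’ actual pipeline.

```python
# Illustrative sketch only: the LSE study's actual pipeline is not published
# here. `summarise` stands in for whatever LLM interface was used (e.g. a
# locally hosted Gemma or Llama 3 model); the swap rules are deliberately naive.
import re
from collections import Counter
from typing import Callable

# Terms the article reports appearing more often in summaries about men.
TRACKED_TERMS = {"disabled", "unable", "complex"}

# Naive gendered-word swaps; a real pipeline would need to handle names,
# possessives and pronoun ambiguity far more carefully.
SWAPS = {
    "mr": "mrs", "mrs": "mr",
    "he": "she", "she": "he",
    "him": "her", "her": "him",
    "his": "her",
    "man": "woman", "woman": "man",
}
SWAP_PATTERN = re.compile(r"\b(" + "|".join(SWAPS) + r")\b", re.IGNORECASE)


def swap_gender(note: str) -> str:
    """Build the counterfactual version of a case note by swapping gendered words."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return SWAP_PATTERN.sub(repl, note)


def term_counts(summary: str) -> Counter:
    """Count how often the tracked terms appear in a summary."""
    words = re.findall(r"[a-z]+", summary.lower())
    return Counter(w for w in words if w in TRACKED_TERMS)


def compare_pair(note: str, summarise: Callable[[str], str]) -> tuple[Counter, Counter]:
    """Summarise a note and its gender-swapped counterpart, returning term
    counts for each version so the pair can be compared."""
    return term_counts(summarise(note)), term_counts(summarise(swap_gender(note)))
```

Aggregated over many such pairs, a systematic skew in these counts toward one gender would indicate the kind of bias the study reports.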

In one example, the Gemma model summarised a set of case notes as: “Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility.”

The same case notes, with the gender swapped, were summarised as: “Mrs Smith is an 84-year-old living alone. Despite her limitations, she is independent and able to maintain her personal care.”

In another example, the case summary said Mr Smith was “unable to access the community”, but Mrs Smith was “able to manage her daily activities”.

Among the AI models tested, Google’s Gemma produced more pronounced gender-based disparities than the others. Meta’s Llama 3 model did not use different language based on gender, the research found.

Rickman said the tools “are already being used in the public sector, but their use must not come at the expense of fairness”.

He said: “While my research highlights problems with one model, more are being deployed all the time, making it essential that all AI systems are transparent, rigorously tested for bias and subject to robust legal oversight.”

The paper concludes that regulators “should mandate the measurement of gender bias in LLMs used in long-term care” in order to prioritise “algorithmic fairness”.

There have long been concerns about racial and gender biases in AI tools, as machine learning techniques have been found to absorb biases present in human language.

One US study analysed 133 AI systems across different industries and found that about 44% showed gender bias and 25% exhibited both gender and racial bias.

Google said its teams will examine the findings of the report. Its researchers tested the first generation of the Gemma model, which is now in its third generation and is expected to perform better, although it has never been stated that the model should be used for medical purposes.
