More and more, facial recognition is being deployed in ways presented as a public health benefit. Australia recently expanded a program that uses facial recognition to enforce COVID-19 restrictions. People in quarantine are subject to random check-ins, in which they must submit a selfie to confirm they are following the rules. Location data is also being collected, according to Reuters.
When it comes to basic needs like emergency aid for housing and meals, Greer says the first priority is making sure everyone has access to help. Preventing fraud is a reasonable-sounding goal, he adds, but the most immediate aim should be getting people the benefits they need.
“Systems must be built from the outset with human rights and the needs of vulnerable people in mind. Anything less is unthinkable,” Greer says. “They can’t be bug fixes after things have already gone wrong.”
ID.me’s Hall says his company’s services are preferable to existing authentication methods and have helped reduce “major” unemployment fraud since states implemented face-verification checks. He says users filing unemployment claims, whether verifying on their own or via a video call with an ID.me representative, pass at a rate of about 91%.
“[That] was our goal going in,” he says. “If we could automate 91% of that, then states with fewer resources could use those resources to provide white-glove concierge service to the 9%.”
According to Hall, when users fail the facial recognition process, ID.me emails them to follow up.
“Everything about this company is about helping people access what they deserve,” he says.
technology in the real world
The months JB survived without an income were tough. The financial strain was stressful enough on its own, and other problems, such as a broken computer, compounded the anxiety. Even his former employers couldn’t, or wouldn’t, help him cut through the bureaucracy.
“It’s so isolating to be like, ‘No one is helping me in any of this,’” says JB.
On the government side, experts say it makes sense that the pandemic pushed new technology to the fore, but cases like JB’s show that technology by itself isn’t the whole answer. Anne L. Washington, an assistant professor of data policy at New York University, says it’s tempting to declare a new government technology successful when it works most of the time in testing, even if it fails 5% of the time in the real world. She compares the result to a game of musical chairs, in which five people in a room of 100 are always left without a seat.
“The problem is that governments get some kind of technology and it works 95% of the time – they think it’s solved,” she says. Instead, human intervention is becoming more important than ever. “They need a system that will regularly take care of the five people left standing,” Washington says.
There is an additional layer of risk when a private company is involved. Washington says the biggest problem with rolling out a new technology is where the data is kept. Without a trusted entity that has a legal duty to protect people’s information, sensitive data may fall into the wrong hands. For example, how would we have felt if the federal government had entrusted our Social Security numbers to a private company when the program was created?
“The problem is that governments get some kind of technology and it works 95% of the time – they think it’s solved”
Anne L. Washington, New York University
The widespread and unchecked use of facial recognition tools also has the potential to affect already marginalized groups more than others. Transgender people, for example, have detailed common problems with tools like Google Photos, which can ask whether pre- and post-transition photos show the same person. That means having to reckon with the software over and over.
“[There’s] this fallacy in technology’s ability to reflect the breadth of true diversity and edge cases that exist in the real world,” says Daly Barnett, a technologist at the Electronic Frontier Foundation. “We can’t rely on it to accurately classify and reflect these beautiful edge cases.”
worse than failure
Conversations about facial recognition typically center on how the technology can fail or discriminate. But Barnett encourages people to think beyond whether biometric tools work or whether bias creeps in; she pushes back on the idea that we need them at all. Indeed, activists like Greer warn that such tools can be even more dangerous when they work flawlessly. Facial recognition is already used to identify, punish, or suppress protesters, and people have resisted. In Hong Kong, protesters wore masks and goggles to hide their faces from police surveillance. In the United States, federal prosecutors dropped charges against a protester who had been accused of assaulting police officers and identified using facial recognition.
