Accused of Cheating by an Algorithm and a Professor He’s Never Met

Dr. Orridge did not respond to requests for comment for this article. A Broward College spokesperson said the college could not discuss the case because of student privacy laws. In an email, the spokesperson said faculty were “doing their best” with what they saw in the Honorlock reports, and that a first warning for dishonesty would appear on a student’s record but would not carry more serious consequences, such as preventing the student from graduating or transferring credits to another institution.

Honorlock has not previously disclosed exactly how its AI works, but a company spokesperson said that it performs face detection using Rekognition, an image analysis tool that Amazon began selling in 2016. The Rekognition software looks for facial landmarks (the nose, eyes, eyebrows and mouth) and returns a confidence score that what is on screen is a face. It can also infer the emotional state, gender and angle of a face.
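For readers curious what that looks like in practice, here is a minimal sketch of a DetectFaces call against Amazon Rekognition using the boto3 Python SDK. The API and the response fields shown are real parts of Rekognition; the webcam-frame file name and the idea that Honorlock feeds single frames in exactly this way are assumptions for illustration.

```python
# Hedged sketch: calling Amazon Rekognition's DetectFaces API via boto3.
# The API call and response fields are real; the frame capture and its
# use in a proctoring pipeline are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "webcam_frame.jpg" is a hypothetical captured frame, not a real asset.
with open("webcam_frame.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request emotions, gender, pose, etc.
    )

for face in response["FaceDetails"]:
    # Confidence (0-100) that the detected region contains a face.
    print("face confidence:", face["Confidence"])
    # Facial landmarks such as nose, eyeLeft, eyeRight, mouthLeft.
    print("landmarks:", [lm["Type"] for lm in face["Landmarks"]])
    # The emotion Rekognition is most confident about.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print("inferred emotion:", top_emotion["Type"])
    # Inferred gender and face angle (pitch, roll, yaw in degrees).
    print("inferred gender:", face["Gender"]["Value"])
    print("pose:", face["Pose"])
```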

Brandon Smith, Honorlock’s president and director of operations, said Honorlock will flag a test taker as suspicious if it detects multiple faces in the room, or if the test taker’s face disappears from view, which can happen when people cover their faces with their hands in frustration.
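A rough sketch of that rule, assuming each webcam frame has already been passed through DetectFaces, might look like the following; this is not Honorlock’s actual code, and the function name and flag labels are hypothetical.

```python
# Minimal sketch (not Honorlock's actual code) of the flagging rule the
# article describes: a frame is suspicious if no face, or more than one
# face, is detected. `face_details` is the "FaceDetails" list that
# Rekognition's DetectFaces returns for a single webcam frame.
def flag_frame(face_details: list) -> "str | None":
    if len(face_details) == 0:
        # The test taker's face disappeared, e.g. covered by their hands.
        return "face_missing"
    if len(face_details) > 1:
        # More than one face detected in the room.
        return "multiple_faces"
    return None  # exactly one face: nothing to flag
```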

Honorlock sometimes uses human workers to monitor test takers; if a quiz triggers a high number of flags, “live proctors” will pop in by chat to find out what is going on. Recently, these proctors discovered that Rekognition was mistakenly registering faces in photos or posters as additional people in the room.

When that happens, Honorlock tells Amazon’s engineers. “They take our real data and use it to improve their AI,” Mr. Smith said.

Rekognition is supposed to be a step up from what Honorlock used previously. Mr. Smith said an earlier face-detection tool from Google was worse at detecting the faces of people with a range of skin tones.

But Rekognition has also been accused of bias. Joy Buolamwini, a computer researcher and executive director of the Algorithmic Justice League, has found in a number of studies that gender-classification software, including Rekognition, worked least well on dark-skinned women.
