Experts tell lawmakers racial ‘bias’ inherent in facial recognition technology may NEVER be fixed

By Jon Dougherty

(NationalSentinel) As the development of so-called “facial recognition technology” for use by law enforcement and intelligence services continues, a group of experts told Congress this week that racial and other demographic “biases” inherent in the artificial intelligence-driven systems may be impossible to fix completely.

Technology has dramatically improved in recent years, NextGov reports, but the demographic biases in the software remain:

Facial recognition tools tend to work less effectively for women, people of color and the elderly, but those demographic differences are shrinking as the technology improves, according to Charles Romine, director of the Information Technology Lab at the National Institute of Standards and Technology. 

In 2017, NIST expanded a program to help government vendors test and improve their facial recognition systems, and algorithms’ overall performance has made “significant progress” since then, Romine said. Though accuracy still varies widely from system to system, he said, some of today’s best algorithms correctly identify subjects about 99.7% of the time.

But while a rising tide lifts all boats, Romine said, technologists may never be able to build a system that identifies every type of person with the same level of accuracy.

“It’s unlikely that we will ever achieve a point where every single demographic is identical in performance across the board, whether that’s age, race or sex,” he told the House Homeland Security Committee on Tuesday. “We want to know just exactly how much the difference is.”
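Measuring that gap is, at bottom, a matter of breaking test results out by demographic group and comparing accuracy rates. As a rough, hypothetical sketch (this is not NIST’s actual test protocol, and the group labels and results below are invented purely for illustration), per-group accuracy might be tabulated like this:

```python
# Hypothetical illustration of a demographic breakdown of accuracy.
# Each record is (was_the_match_correct, demographic_group); the data
# and group names are made up for the sake of the example.
from collections import defaultdict

results = [
    (True, "age_18_35"), (True, "age_18_35"), (True, "age_18_35"),
    (True, "age_65_plus"), (False, "age_65_plus"), (True, "age_65_plus"),
]

totals = defaultdict(int)   # trials per group
correct = defaultdict(int)  # correct identifications per group

for is_correct, group in results:
    totals[group] += 1
    if is_correct:
        correct[group] += 1

for group in sorted(totals):
    accuracy = correct[group] / totals[group]
    print(f"{group}: {accuracy:.1%} correct over {totals[group]} trials")
```

A gap between those per-group rates is the kind of difference Romine said NIST wants to quantify.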

NIST is finalizing a report Thursday that describes demographic differences in facial recognition, based on data collected during its vendor testing program. Romine informed lawmakers those results are slated to be released this fall.

He joined three Homeland Security Department officials to discuss the scale and scope of the agency’s programs to develop facial recognition and other biometric technology.

Several lawmakers were interested in the technical limitations of the tools, but much of the hearing focused on the legality and integrity of DHS’s various pilot programs, which is understandable given the technology’s demographic recognition shortcomings.


DHS has facial recognition technology operating in various pilot programs, including at the border, ports of entry, airports, and other locations. Officials said the programs are being operated within DHS’s current authorities and that they are a convenience to field agents.

“I am not opposed to biometric technology and recognize it can be valuable to homeland security and facilitation,” Chairman Bennie Thompson, D-Miss., said during the hearing, NextGov reported. “However, its proliferation across [Homeland Security] raises serious questions about privacy, data security, transparency, and accuracy.”


As is generally the case, the technology is advancing faster than the process of regulating it. Currently, there aren’t many laws that deal specifically with facial recognition technology, its uses, and where the biometric data comes from.

The hearing just happened to follow a report by the Washington Post this week claiming that Immigration and Customs Enforcement and the FBI are tapping into state driver’s license databases to collect data on millions of Americans without their consent.
