Thomas Brewster
After the deadly terrorist attacks on Israel by Hamas two weeks ago, which left 1,300 people dead and hundreds more unaccounted for, a group of cyber experts began a volunteer project to try to identify the missing. They wrote code that trawled social media sites including Telegram, Twitter and TikTok to gather images and video of the assault. Then, using Amazon’s facial recognition algorithm, Rekognition, they compared those images to a database of photos provided by official Israeli government sources and the families of the missing.
Over the course of two weeks, the project, led by Refael Franco, former deputy director of the Israeli government’s National Cyber Directorate, was able to identify some 60 missing people and provide new leads on the whereabouts and status of five others. It hasn’t yet resulted in an actual rescue, but the project is promising enough that it has been handed over to the Israel Defense Forces (IDF) and deployed as part of its rescue operations.
Franco, now cofounder of security startup Code Blue Cyber, told Forbes the project was handed off to the Israeli government because it had gathered “a lot of sensitive data that you don't want to share with civilians.” According to the IDF, as many as 200 Israelis are either missing or are known to have been kidnapped by Hamas.
Refael Franco (front left) in his "cyber war room" of experts trying to help locate missing individuals from the Israel-Hamas war.
Franco said his volunteer team of AI specialists and cybersecurity and intelligence analysts started with a map of locations where they believed Hamas operated. The map was based on analysis of Hamas videos of captured Israelis, carried out by team members with prior field experience around Gaza. Working in a “cyber war room,” the team then looked for social media accounts relevant to those locations, such as neighborhood Facebook groups or pages sharing war footage, and began scraping images and videos.
Some social media sites, like Facebook, had to be searched manually because they don’t allow scrapers. But Telegram, which has proven to be one of the most popular apps for sharing wartime imagery, could be scraped automatically, said Franco.
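The article doesn’t describe the team’s actual tooling, but collecting photos and videos from a public Telegram channel can be done with an open-source client library. Below is a minimal sketch using the Telethon library; the API credentials and channel names are placeholders for illustration, not details from the story.

```python
# pip install telethon
# Sketch only: credentials and channel names below are hypothetical.
import asyncio
import os

from telethon import TelegramClient

API_ID = 12345                                   # placeholder, from my.telegram.org
API_HASH = "0123456789abcdef0123456789abcdef"    # placeholder
CHANNELS = ["example_war_footage_channel"]       # hypothetical public channels

async def download_media(client: TelegramClient, channel: str, limit: int = 500) -> None:
    """Download photos and videos posted to a public channel for later analysis."""
    out_dir = f"downloads/{channel}"
    os.makedirs(out_dir, exist_ok=True)
    async for message in client.iter_messages(channel, limit=limit):
        if message.photo or message.video:
            # Telethon auto-generates a filename when given a directory path.
            await message.download_media(file=out_dir)

async def main() -> None:
    # First run prompts for a phone number / login code to create the session.
    async with TelegramClient("scraper_session", API_ID, API_HASH) as client:
        for channel in CHANNELS:
            await download_media(client, channel)

if __name__ == "__main__":
    asyncio.run(main())
```

Downloads like these would then feed the filtering and face-matching steps described below.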
One of the initiative’s main partners was three-year-old Israeli startup Tagbox, which typically uses its artificial intelligence algorithms to sort and categorize images and videos for businesses. But it partly shifted its focus for the last two weeks to assist Franco's effort, helping filter the scraped images for relevance before running them through the Amazon Rekognition tool.
“We had close to 1,000 people that we didn't have a status for, that could be alive or dead, here or in Gaza,” said Tagbox cofounder and CEO Guy Barner. “We just downloaded everything that might be somewhat relevant and we analyzed all of those materials. It was tens of thousands of photos and video files.” Amazon’s Rekognition algorithm was able to pick up on obscured faces that a human analyst couldn’t see, he said.
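Neither Franco nor Barner detailed how the matching was wired up. As a rough illustration of the kind of workflow Rekognition supports, the sketch below uses the standard boto3 APIs to index reference photos of the missing into a face collection, then search each scraped image against it. The collection name, S3 bucket and similarity threshold are assumptions, not details from the project.

```python
# pip install boto3
# Sketch only: this is not the team's pipeline; names and thresholds are assumed.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "missing-persons"      # hypothetical collection name
REFERENCE_BUCKET = "reference-photos"  # hypothetical S3 bucket of known photos

def build_collection(reference_keys: dict[str, str]) -> None:
    """Index one reference photo per person, keyed by a person identifier."""
    rekognition.create_collection(CollectionId=COLLECTION_ID)
    for person_id, s3_key in reference_keys.items():
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"S3Object": {"Bucket": REFERENCE_BUCKET, "Name": s3_key}},
            ExternalImageId=person_id,  # ties the indexed face back to a person
            MaxFaces=1,
            QualityFilter="AUTO",
        )

def match_scraped_image(path: str, threshold: float = 90.0) -> list[tuple[str, float]]:
    """Return (person_id, similarity) candidates for the largest face in an image."""
    with open(path, "rb") as f:
        image_bytes = f.read()
    try:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": image_bytes},
            FaceMatchThreshold=threshold,
            MaxFaces=5,
        )
    except rekognition.exceptions.InvalidParameterException:
        return []  # Rekognition found no searchable face in the image
    return [
        (match["Face"]["ExternalImageId"], match["Similarity"])
        for match in response["FaceMatches"]
    ]
```

Candidate matches at this stage are only leads; as described next, every flagged image still went to human reviewers.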
Once the technology had flagged a match, the team worked alongside the Israeli military to manually review the images. “It was a really tough job because you have to go over really, really bad photos and videos,” Barner said. “It was heart wrenching.”
Franco’s application of facial recognition is the second documented use case of the technology — most often associated with surveillance — in the Israel-Hamas war. Last week, Forbes reported that facial recognition made by Corsight AI, an Israeli company, was being used by a hospital to check inpatients against a database of missing individuals.
Amazon’s Rekognition tool launched in 2016, offering police and the public cheap and easy-to-use facial recognition. Researchers and reporters soon found it could be used for quick-and-easy surveillance operations, including on consenting Forbes employees, while others discovered it wasn’t always accurate. BuzzFeed News found Rekognition mistakenly identified criminals as celebrities, while the American Civil Liberties Union (ACLU) discovered it falsely matched 28 members of Congress with individuals in arrest mugshots. The ACLU said the false matches disproportionately affected people of color. Amazon later stopped sales to police departments over ethical concerns.