Swimming, soccer, and surveillance: Paris preps for an AI-monitored Olympics

When this year’s Summer Olympics kick off in Paris next week, nearly 100 boats filled with the world’s leading athletes are expected to chug their way down the Seine River. Around half a million fans will cheer as their nations’ sporting ambassadors pass the Louvre, the Eiffel Tower, and a travel guidebook’s worth of other historic monuments. But fans won’t be the only ones watching. Thousands of CCTV cameras overlooking the river will monitor the proceedings in real time. Behind the scenes, powerful new artificial intelligence models will churn through the footage, searching for any signs of danger hidden in the crowds. The controversial new AI-enabled surveillance system, which critics argue may violate broader European Union privacy laws, is one of several ways France is using technology to make this year’s Olympic Games one of the most tightly monitored in memory.

AI surveillance will look for crowd disturbances

French lawmakers passed a new law late last year temporarily granting law enforcement the ability to use “experimental” artificial intelligence algorithms to monitor public video feeds and provide “real-time crowd analyses.” In practice, the AI detection models will reportedly parse the feeds of thousands of CCTV cameras, looking for potentially dangerous anomalies hidden within the Olympic crowds. Those warning signs could include people wielding weapons, larger-than-expected crowds, fights and brawls, and unattended luggage.

A policeman stands in front of a giant screen featuring videos taken from surveillance cameras in the streets of Levallois-Perret, outside Paris on January 10, 2012 at Levallois police station. Credit: LIONEL BONAVENTURE/AFP via Getty Images

France is partnering with a number of tech companies for the AI analyses, including Wintics, Videtics, Orange Business, and ChapsVision. Law enforcement has already tested the new system at several subway stations, the Cannes Film Festival, and a packed Depeche Mode concert. Paris Police Chief Laurent Nunez recently told Reuters the concert trial went “relatively well” and that “all lights are green” for the system’s use during the Olympics.

If the AI model detects a potential threat, it will flag it for a human law enforcement officer, who then decides whether to take any further enforcement action. French officials maintain the real-time analyses will take place without ever using facial recognition or collecting other unique biometric identifiers. Instead, law enforcement and their private partners say the model will only measure “behavioral” patterns such as body movement and positioning. The AI, officials claim, cannot identify individuals based on their biometric data.

“It’s not about recognizing ‘Mr. X’ in a crowd,” French Interior Minister Gérald Darmanin reportedly said during a meeting with French lawmakers earlier this year. “It’s about recognizing situations.”
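To make the reported detect-then-review workflow concrete, here is a minimal, purely illustrative sketch in Python. The anomaly categories, confidence threshold, and function names are assumptions based on the behaviors described above; the actual systems built by Wintics, Videtics, Orange Business, and ChapsVision are not public.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical anomaly categories drawn from the article's description;
# the real vendors' category lists and scoring are not public.
ANOMALY_TYPES = {"weapon", "crowd_surge", "fight", "unattended_luggage"}

@dataclass
class Detection:
    camera_id: str
    anomaly_type: str
    confidence: float  # model score between 0.0 and 1.0

def flag_for_review(detection: Detection, threshold: float = 0.8) -> Optional[str]:
    """Return an alert string for a human operator, or None.

    The model only surfaces behavioral anomalies; any enforcement
    decision is left to the human officer who reviews the alert.
    """
    if detection.anomaly_type not in ANOMALY_TYPES:
        return None
    if detection.confidence < threshold:
        return None
    return (
        f"ALERT camera={detection.camera_id} "
        f"type={detection.anomaly_type} conf={detection.confidence:.2f} "
        "-> route to human operator"
    )

if __name__ == "__main__":
    example = Detection("seine-cam-14", "unattended_luggage", 0.91)
    print(flag_for_review(example))
```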

But some critics question whether it’s technically possible to conduct this kind of AI video analysis without inadvertently collecting and comparing some biometric identifiers. Doing so could place France in violation of Europe’s General Data Protection Regulation (GDPR) and the recently enacted EU AI Act. A coalition of 38 European civil society organizations argued in an open letter earlier this year that the model’s reported monitoring of gait, body positions, and gestures may still qualify as collecting biometric markers used to identify certain individuals or groups. If that’s the case, the groups argue, the system would violate existing GDPR rules limiting the breadth of biometric data collection permitted in public spaces.

GDPR does allow certain exceptions to its biometric collection rules under a public interest allowance, but rights groups argue the permissions granted in the French case are overly broad and disproportionate to any apparent threats. Rights groups and some lawmakers opposing the fast-tracked law also worried it could set a dangerous precedent for future public surveillance bills and potentially undermine broader EU efforts to rein in AI surveillance. Amnesty International adviser on AI regulation Mher Hakobyan said the surveillance power, even if temporary, “risks permanently transforming France into a dystopian surveillance state.” Human Rights Watch, which sent its own letter to French lawmakers opposing the measure, similarly fears it poses a “serious threat to civic freedoms and democratic principles” and risks further exacerbating racial disparities in law enforcement.

“The proposal paves the way for the use of invasive algorithm-driven video surveillance under the pretext of securing big events,” Human Rights Watch wrote in its letter. “The mere existence of untargeted (often called indiscriminate) algorithmic video surveillance in publicly accessible areas can have a chilling effect on fundamental civic freedom.”

Others, meanwhile, worry the supposedly temporary measures will inevitably become the status quo. The surveillance law officially sunsets in 2025, though lawmakers will have the opportunity to extend its shelf life if they wish. Supporters of the expanded powers argue they are necessary tools to bolster the country’s defenses against potentially deadly terrorist attacks. France has experienced more than half a dozen major attacks in the past two decades, including a series of shootings in 2015 that left 130 people dead. That incident led France to declare a temporary state of emergency, which it ended up extending for more than two years.

“We’ve seen this before at previous Olympic Games like in Japan, Brazil and Greece,” La Quadrature du Net digital rights activist Noémie Levain said during an interview with the BBC earlier this year. “What were supposed to be special security arrangements for the special circumstances of the games, ended up being normalized.”

France ramps up security for massive outdoor opening ceremony

France’s emphasis on security at this year’s Olympic Games extends beyond video surveillance. Authorities have designated the area immediately surrounding the stretch of the Seine River where the opening ceremony will take place an “anti-terrorism perimeter.” The roughly 3.7-mile stretch will be subject to heightened security between July 18 and 26.

Roughly 20,000 French residents who live and work within that perimeter will reportedly be forced to undergo background checks prior to the games to determine whether they have any alleged ties to Islamist extremist groups. Those individuals will each receive a government-issued QR code they will need to navigate the area during the event. Well-armed police and military units, which have become a common sight throughout Paris over the past decade, will reportedly number ten times their normal presence. Local law enforcement will reportedly work alongside hundreds of bomb-disposal divers, antiterrorism units, and specialized forces trained to take down potential drone threats.

For years, the Olympics have served as a testbed for nations around the world to advertise and deploy their newest digital monitoring tools. China famously used facial recognition at security checkpoints during the 2008 Beijing Olympics and again during its more recent winter games. Russian intelligence officials overseeing the 2014 Winter Olympics in Sochi similarly monitored the digital communications and internet traffic of competitors and attendees alike. In each of these cases, host nations justified stepping outside the bounds of ordinary surveillance operations as a means of ensuring security during a time of unprecedented attention. There is legitimate cause for concern: the Olympics have been the site of violence on more than one occasion. But even after the immediate perceived threat subsides, host nations have been known to hold on to their newfound monitoring capabilities, a practice activists say ultimately degrades civil liberties over time. Whether France will follow that same playbook remains to be seen.
