Facial Recognition Gets Attorney Mum Booted from Rockettes Show

Published on 04/01/2023 11:53 AM

A lawyer from New Jersey was denied entry to a Rockettes show after a face recognition system flagged her as a security threat because of her employer's legal battle with the show's operator.

Kelly Conlon told NBC New York that she and her daughter were turned away from Radio City Music Hall in New York City when the venue's face recognition system identified her as a lawyer at Davis, Saperstein and Salomon (DSS) and declared her a prohibited person.

“It was pretty simultaneous,” Conlon explained. “They said that ‘Our face recognition picked you up’ – they knew my name before I told them. They knew the firm I was associated with before I told them. And they told me I was not allowed to be there.”

“I was just a mom taking my daughter to see a Christmas show,” Conlon told the I-Team. “I did wait outside. It was embarrassing, it was mortifying.”

DSS is currently involved in injury litigation against a restaurant now owned by Madison Square Garden (MSG), which also operates Radio City Music Hall.

Conlon told security staff she had no involvement in any cases against MSG, but she was barred nonetheless because the recognition system had singled her out.

MSG told NBC New York that it has a policy of barring attorneys involved in litigation against it from attending events at its venues, and that Conlon's firm had twice been warned that its staff were not to attend events at MSG venues.

"In this particular situation, only the one attorney who chose to attend despite being notified in advance that she would be denied entry was not permitted to enter," MSG explained to The Register.

Surveillance as punishment 

MSG Entertainment has yet to confirm or deny whether its face recognition security systems identified Conlon, but the entertainment company did confirm that its database contains only individuals who are prohibited from entering its venues.
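For readers curious how this kind of watchlist-style matching generally works, here is a minimal, purely illustrative Python sketch – not MSG's actual system, whose details have not been disclosed. It assumes a separate model has already converted each face photo into a fixed-length numeric embedding, and simply checks whether a visitor's embedding is close enough to any enrolled "prohibited" embedding.

```python
import numpy as np

def cosine_similarity(a, b):
    # How closely two face embeddings point in the same direction (1.0 = identical).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_watchlist(visitor, watchlist, threshold=0.6):
    # Return the best-matching watchlist name if it clears the threshold, else None.
    best_name, best_score = None, -1.0
    for name, enrolled in watchlist.items():
        score = cosine_similarity(visitor, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Toy usage: random vectors stand in for real embeddings produced by a face model.
rng = np.random.default_rng(0)
watchlist = {"prohibited_person": rng.normal(size=512)}
visitor = watchlist["prohibited_person"] + rng.normal(scale=0.05, size=512)
print(match_watchlist(visitor, watchlist))  # prints "prohibited_person"
```

The threshold is the crucial design choice in any system built this way: set it too loosely and innocent visitors can be misidentified as people on the list, the kind of failure critics describe below.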

Face recognition technology has been used at MSG venues since at least 2018, when The New York Times revealed that visitors’ faces were being scanned.

The venue operator claimed at the time that it introduced the technology to improve security across its venues, explaining that the systems “provide a safe and wonderful experience for guests.”

But in June of last year, the company added a new policy to its security regulations banning lawyers like Conlon from entering its venues while litigation is ongoing.

MSG’s blacklisted law firms have disputed this policy on multiple occasions, with some unsuccessfully challenging the ban in court. 

​​"Separating a mother from her daughter and Girl Scouts she was watching over — and to do it under the pretext of protecting any disclosure of litigation information — is absolutely absurd,"  Sam Davis, Partner of DSS, highlighted.

"This whole scheme is a pretext for doing collective punishment on adversaries who would dare sue MSG in their multi-billion dollar network,” he added.

"We are confident that our policy is in compliance with all applicable laws including the New York State Liquor Authority," he told the Register. 

More than 20 active lawsuits are pending against MSG Entertainment and its properties in the state, according to New York court records.

Face recognition risks highlighted

As required by New York's biometric privacy laws, the company has signage outside all of its venues informing visitors of the presence of face recognition, but critics believe many visitors are still unaware of the extent of the technology's usage – and of its risks.

Experts and privacy advocates have long warned of the potential risks of face recognition technology, arguing that it is not only intrusive but also ineffective.

The failure rate of face recognition, most pronounced when identifying women, children, and minorities, has been well documented, with Congress even holding a number of hearings on its risks in 2019.

Because of this inaccuracy, face recognition tech, particularly when used for surveillance by police and security bodies, has come under intense legal scrutiny in recent years.

Just last May, Clearview AI, a startup that scraped images from social media to build a database, was forced to restrict the sale of its database to US businesses and security bodies following a legal battle with the UK's data protection body.

The firm was also slapped with a £7.5 million ($9.43 million) fine by the UK watchdog for illegally obtaining the images online.
