
A pilot program in the U.K. to enhance police capabilities via artificial intelligence has proven successful but could pave the way for a slide into a future of “predictive policing,” experts told Fox News Digital. 

“Artificial intelligence is a tool, like a firearm is a tool, and it can be useful, it can be deadly,” Christopher Alexander, CCO of Liberty Blockchain, told Fox News Digital. “In terms of the Holy Grail here, I really think it is the predictive analytics capability that if they get better at that, you have some very frightening capabilities.” 

Police forces in several British communities have experimented with an artificial intelligence (AI)-powered camera system to help catch drivers committing violations such as using their phones at the wheel or driving without a seat belt. Violators caught using a phone while driving could face a fine of £200 ($250).

One trial, carried out over a week at sites across East Yorkshire and Lincolnshire, caught 239 drivers breaking road rules, the BBC reported. An earlier trial in Devon and Cornwall in late 2022 caught 590 drivers not wearing seat belts over a 15-day period.


New mobile technology capable of automatically detecting motorists who fail to wear a seat belt or who are holding phones at the wheel is being used in the U.K. for the first time under plans to boost road safety. (National Highways UK)

Safer Roads Humber, which helped set up the trial in cooperation with Humber Police, told Fox News Digital that the program is not fully AI-run and keeps humans in the loop to check for errors. The AI uses computer vision to determine whether a driver is not wearing a seat belt or is using a phone, and positive results go to a human reviewer to double-check.

The initial review process takes up to five seconds, with false positives automatically deleted, a Safer Roads Humber spokesperson explained. The system transmits its results over mobile phone signals, so humans can check them remotely.
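The workflow Safer Roads Humber describes amounts to an automated detector whose positive results are gated by a human reviewer, with anything the reviewer does not confirm being thrown away. The sketch below is purely illustrative and is not the vendor's software; every name, threshold and data value in it (Detection, detect_offenses, human_review, the confidence scores) is hypothetical, intended only to model a flag-then-confirm pipeline of this kind.

```python
# Minimal illustrative sketch, NOT the actual Safer Roads Humber system: a computer-vision
# detector flags possible offenses, a human reviewer confirms them, and unconfirmed hits
# (false positives) are discarded. All names, thresholds and data are hypothetical.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    image_id: str
    offense: str        # e.g. "phone_use" or "no_seat_belt"
    confidence: float   # detector confidence score between 0.0 and 1.0

def detect_offenses(candidates: List[Detection], threshold: float = 0.8) -> List[Detection]:
    """Stand-in for the computer-vision model: keep only confident positives."""
    return [d for d in candidates if d.confidence >= threshold]

def human_review(flagged: List[Detection], confirm: Callable[[Detection], bool]) -> List[Detection]:
    """Send each flagged image for a human check; anything not confirmed is dropped."""
    return [d for d in flagged if confirm(d)]  # false positives are simply deleted

if __name__ == "__main__":
    frames = [
        Detection("img_001", "phone_use", 0.92),
        Detection("img_002", "no_seat_belt", 0.55),  # below threshold, never reaches a reviewer
        Detection("img_003", "no_seat_belt", 0.88),
    ]
    flagged = detect_offenses(frames)
    # Dummy reviewer that only confirms the phone-use hit, for demonstration purposes.
    confirmed = human_review(flagged, confirm=lambda d: d.offense == "phone_use")
    print(confirmed)  # only human-confirmed offenses would go forward for enforcement
```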


Permanent implementation of the system would require more cameras, but the cameras and equipment can be mounted on a vehicle or trailer that can be left at the side of a road for weeks or even months, the spokesperson said.

“Personally, I believe a mobile solution would work best as it would ensure road users change their behavior at all times rather than just at a static point,” Ian Robertson, partnership manager for Safer Roads Humber, said. 

National Highways says it identified more than 750 seat belt and mobile phone offenses along a short section of the M6 in Merseyside earlier this month. It was the highest number at a single site in a study that has visited numerous sites across the Midlands and South West. (National Highways UK)

Brian Cavanaugh, a visiting fellow in the Border Security and Immigration Center at The Heritage Foundation, raised concerns that surveillance-heavy countries such as the United Kingdom could invest more heavily in pairing AI with their already-massive surveillance systems, which could give rise to more authoritarian state control as an unintended consequence.

“I absolutely see this as a slippery slope,” Cavanaugh told Fox News Digital. “You’re going from an open and free society to one you can control through facial recognition [technology] and AI algorithms – you’re basically looking at China.

“The U.K. is going to use safety and security metrics to say, ‘Well, that’s why we did it for phones and cars.’ And then they’re going to say, ‘If you have, say, guns … what’s next on their list of crimes that you crack down on because of safety and security?'” he added. “All of a sudden, you’re creating an authoritarian, technocratic government where you can control society through your carrots and sticks.


“I believe there is the capacity to move from observations to predictive measures, but with that you have the possibility of false positives and the risk of a margin of error.” 

Cavanaugh argued that a better use of AI in policing would focus on understanding crime indexes, using the data to make better-informed decisions about resource allocation and deployment. He stressed the need to keep human discretion at the core of any policing policy and to ensure society never lets AI "take the place of the officer."
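As a rough illustration of the kind of data-informed resource allocation Cavanaugh describes, the sketch below divides a fixed number of patrol shifts across areas in proportion to reported incident counts, with the final decision left to a human. The district names, figures and approach are entirely hypothetical and are not drawn from any real policing system.

```python
# Hypothetical illustration of data-informed resource allocation: split a fixed number of
# patrol shifts across districts in proportion to reported incident counts. The figures and
# district names are invented; a human decision-maker would review any such proposal.
incident_counts = {"district_a": 120, "district_b": 45, "district_c": 35}
total_shifts = 40

total_incidents = sum(incident_counts.values())
proposed_allocation = {
    district: round(total_shifts * count / total_incidents)
    for district, count in incident_counts.items()
}

print(proposed_allocation)  # {'district_a': 24, 'district_b': 9, 'district_c': 7}
```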

A police facial recognition van in front of the Cardiff City Stadium for the Cardiff City-Swansea City Championship match Jan. 12, 2020, in Cardiff, Wales. (Matthew Horwood/Getty Images)

Alexander described the more extreme version of this practice as “predictive policing,” akin to the kind of enforcement seen in the movie “Minority Report.” 

The Israel Defense Forces (IDF) recently discussed how it used AI to help determine targets during conflict and even used available data to pinpoint possible locations of enemy combatants or terrorists, an effort that resulted in successful operations against at least two Hamas commanders in 2021.

Data Science and AI Commander Col. Yoav said AI helped the IDF do in days what might have taken “almost a year” to complete otherwise. 


“We take original subgroups, calculate their close circle [of personal connections], calculate relevant features, rank results and determine thresholds, using intelligence officers’ feedback to improve the algorithm,” he explained. 


Alexander warned that such developments will often start in the military and intelligence community, then “trickle down” to the private sector. 

“Presumably, you’re going to have more and more data,” Alexander argued. “People are going to think more about collecting it, and we’re going to get better and better at predictive capabilities, and … could the police show up in riot gear two hours before a riot even starts?”


He also used the example of the IRA, asking if British police could even end up using AI to obtain warrants and execute a search “just as people are setting up shop.” 

“I think the predictive capabilities are where the focus is … and it makes all the sense for it to be in the future,” he concluded. 

Peter Aitken is a Fox News Digital reporter with a focus on national and global news. 
