The use of facial recognition by police comes down to a choice between privacy and preventing terror attacks, a security expert has said ahead of the first legal battle in the UK over the police's use of the technology.
Ed Bridges has crowdfunded action against South Wales Police over claims that the use of the technology on him was an unlawful violation of privacy.
He will also argue it breaches data protection and equality laws during a three-day hearing at Cardiff Civil Justice and Family Centre.
Facial recognition technology maps faces in a crowd then compares results with a "watch list" of images which can include suspects, missing people and persons of interest.
But former head of the National Counter Terrorism Security Office (NaCTSO) Chris Phillips told BBC Radio 4's Today programme that if people have done nothing wrong, they have nothing to worry about.
"We had the same argument with CCTV," he said. "We're all submitting our data through the likes of Facebook. What is more important - not being recognised or stopping a terror attack?"
Mr Phillips agreed that the technology needed to be regulated, but said as things stand police have a "pretty much impossible task" when it comes to counter-terrorism.
South Wales Police first used Automatic Facial Recognition (AFR) during the Champions League final week in Cardiff in 2017.
Mr Bridges, from Cardiff, said he has been scanned by AFR at least twice, including at a peaceful anti-arms protest and while doing Christmas shopping, according to the campaign group Liberty, which represents him.
Liberty claims South Wales Police have used facial recognition technology "on around 50 occasions".
"We are not an authoritarian state. We live in a democracy. It's meant to be policing by consent. For me, the most important thing is that it is unregulated, so there needs to be a clear message that it needs to be regulated."
South Wales Police said it would not comment until the judicial review is finished. The Metropolitan Police have also trialled the technology several times in London.
Information about AFR on a website set up by South Wales Police says it will help the force "become smarter" and make its patch safer.
The force has said it works to "ensure that the deployment of this technology is proportionate whilst recognising the need to balance security and privacy".
Liberty said freedom of information requests have shown that South Wales Police's use of live AFR technology "resulted in 'true matches' with less than 9% accuracy" in the first year.
Police who have trialled the technology hope it can help tackle crime but campaigners argue it breaches privacy and civil liberty.
Why is Facial Recognition Being Discussed in Court?
Police use of facial recognition technology is being challenged in court for the first time in the UK.
Human and civil rights campaigners have backed the action after several police forces in England and Wales trialled the technology.
Ed Bridges, from Cardiff, will be represented by civil rights group Liberty at the High Court as he challenges use of the technology by South Wales Police.
Earlier this month, officials in San Francisco voted to ban facial recognition systems as some campaigners branded it "Big Brother technology".
Here's how facial recognition works - and why it is divisive.
Technology trialled by the Metropolitan Police in London uses special cameras to scan the structure of faces in a crowd of people.
The system - called NeoFace and created by NEC - then creates a digital image and compares the result against a "watch list" made up of pictures of people who have been taken into police custody.
Not everybody on police watch lists is wanted - they can also include missing people and other persons of interest.
If a match is found, officers at the scene where cameras are set up are alerted.
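The matching process described above can be sketched in a few lines of code. This is a toy illustration only, not how NeoFace actually works: its face-embedding model is proprietary, so the feature vectors, names and similarity threshold below are all made up for demonstration.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def check_against_watchlist(face, watchlist, threshold=0.9):
    """Return watch-list entries similar enough to trigger an officer alert."""
    return [name for name, ref in watchlist.items()
            if cosine_similarity(face, ref) >= threshold]

# Hypothetical feature vectors standing in for real face embeddings.
watchlist = {
    "missing person A": [0.9, 0.1, 0.3],
    "suspect B": [0.2, 0.8, 0.5],
}
scanned_face = [0.88, 0.12, 0.31]  # numerically close to entry A
print(check_against_watchlist(scanned_face, watchlist))  # → ['missing person A']
```

The threshold is the crux of the accuracy debate: set it low and the system alerts on many innocent passers-by (false matches); set it high and it misses genuine matches.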
The Met has used the technology several times since 2016, according to its website.
This includes Notting Hill Carnival in 2016 and 2017, Remembrance Day in 2017, the Port of Hull docks (assisting Humberside Police) last year, and the Stratford transport hub for two days in June and July.
South Wales Police first piloted its technology during the 2017 Champions League final week in Cardiff, making it the first UK force to use it at a large sporting event.
Liberty claims South Wales Police has used the technology about 50 times.
Campaigners say facial recognition breaches civil rights.
Liberty said scanning and storing biometric data "as we go about our lives is a gross violation of privacy".
Big Brother Watch, a privacy and civil liberty group, said "the notion of live facial recognition turning citizens into walking ID cards is chilling".
Some claim the technology will deter people from expressing views in public or going to peaceful protests.
It is also claimed facial recognition is least accurate when it attempts to identify black people and women.
The ban in San Francisco was part of broader legislation that requires city departments to establish usage policies and obtain board approval for surveillance technologies.
City supervisor Aaron Peskin, who championed the legislation, said: "This is really about saying, 'We can have security without being a security state. We can have good policing without being a police state'."
The Met says its trials aim to discover whether the technology is an effective way to "deter and prevent crime and bring to justice wanted criminals".
"We're concerned that what we do conforms to the law, but also takes into account ethical concerns and respects human rights," the force said.
South Wales Police Assistant Chief Constable Richard Lewis has said: "We are very cognisant of concerns about privacy and we are building in checks and balances into our methodology to reassure the public that the approach we take is justified and proportionate."