Privacy concerns over live facial recognition are 'much smaller' than the need to protect the public from 'a knife through the chest', Met Police chief Cressida Dick has said.
She told a conference in Whitehall that critics of the use of such technology would need to justify to victims of crime why police should not be allowed to use these methods.
She said that if artificial intelligence could help identify potential terrorists, rapists or killers, most members of the public would want them to use it.
Recent Metropolitan Police use of facial recognition led to the arrest of eight criminals who would not otherwise have been caught, delegates heard.
Use of the technology has been criticised as a violation of privacy, including by Silkie Carlo of civil liberties group Big Brother Watch, who objected to a deployment by the Met at Oxford Circus last week.
People scanned by the cameras are checked against "watchlists" - said to contain suspects wanted by police and the courts - and approached by officers if there is a match.
The Met claims that the technology has a very low failure rate, generating a false alert only once in every 1,000 cases.
However, using a different metric, research last year from the University of Essex found that the technology achieved only eight correct matches out of 42 across the six trials it evaluated.
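The two figures are not directly contradictory, because they use different denominators: the Met's rate counts false alerts per face scanned, while the Essex study counted correct identifications per alert raised. A minimal sketch, using hypothetical numbers (the total crowd size is assumed for illustration; only the 8-of-42 split comes from the Essex figure), shows how both claims can describe the same system:

```python
# Hypothetical illustration: a low per-scan false-alert rate and a low
# per-alert accuracy can coexist, because most scanned faces raise no alert.

faces_scanned = 42_000   # assumed total crowd scanned (illustrative only)
alerts = 42              # alerts raised by the system (Essex figure)
correct = 8              # alerts that identified the right person (Essex figure)

false_alerts = alerts - correct                  # 34 wrong alerts
false_alert_rate = false_alerts / faces_scanned  # wrong alerts per face scanned
precision = correct / alerts                     # share of alerts that were right

print(f"False alert rate: about 1 in {round(1 / false_alert_rate):,} faces scanned")
print(f"Accuracy per alert: {precision:.0%} of alerts correct")
```

With these assumed numbers, the per-scan false-alert rate is on the order of 1 in 1,000, even though fewer than one in five alerts points at the right person.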
The latest algorithm used by the Met is said to show no bias on the basis of ethnicity, although it is less accurate for women than for men.