In 2020, the Canadian Parliament held hearings on the use of facial recognition technology (FRT) by law enforcement agencies.[1] The hearings were prompted by concerns that the technology could be used for racial profiling and could infringe individual privacy rights, and they resulted in calls for greater transparency and accountability in its use.
FRT has become increasingly popular among businesses for various applications, including security, marketing, and customer service. However, with the widespread adoption of this technology, there are growing concerns over its potential impact on privacy, human rights, and data protection.
Corporate responsibility in the deployment of FRT is vital to protecting individual rights and maintaining public trust. Businesses must prioritize transparency and accountability in their use of the technology to mitigate the risks associated with its deployment.
Privacy Implications from the Use of FRT
The use of FRT in public spaces has significant implications for privacy and human rights. A central concern is the potential for mass surveillance, which could enable the tracking and monitoring of individuals without their knowledge or consent. The technology also raises questions about accuracy, bias, and discrimination, as it has been shown to have higher error rates for women and for people with darker skin tones.[2]
There are also concerns that the technology could be used for profiling, tracking political or religious affiliations, and targeting vulnerable groups.[3] Given these risks, companies must ensure that their facial recognition systems are regularly audited and monitored to reduce bias and errors. Notably, several tech companies have pledged to use the technology ethically and responsibly. Microsoft, for example, has called for federal regulation of FRT and has stated that it will not sell the technology to law enforcement agencies until proper safeguards are in place.[4]