As the use of facial recognition technology has become increasingly prevalent, concerns about its potential for misuse have also grown. At Herta, we recognize the importance of responsible use of this technology and have made it our mission to always implement ethical guidelines that prioritize privacy, security, and fairness.
In recent years, facial recognition has drawn attention across different sectors because of the laws passed in some countries. But beyond the legal aspects we discussed in our last blog post, we also believe it is important to adhere to certain ethical principles that ensure the good use of artificial intelligence. We think these principles, based on proportionality and ethics, must be closely tied to our culture, so that everyone working at Herta abides by them.
In this blog post we will delve into Herta's 7 ethical standards:
- Facial recognition without demographic bias. To achieve this, Herta uses a carefully selected and refined training dataset (more than 50 million images), with a particular focus on collecting images from underrepresented groups. In addition, it applies a number of other methods:
  - Usage balancing, to indicate to the network which groups require special consideration.
  - Simultaneous training with multiple objectives.
  - Gaining knowledge about specific groups within the identification task.
  - Using layers that have demonstrated better generalization.
- No processing of biometric data when it is not needed. In cases where facial identification is not necessary, the system handles only the images, without cross-referencing biometric data against the database. In other words, the system will not identify subjects, even if they are registered in the database, unless it is necessary.
- Face blurring or masking. All faces detected by the system that do not correspond to persons registered in the database are blurred or hidden in real time. This feature makes the software a strong solution under European law: by anonymising the captured images, we support and uphold data protection requirements (Recital 26 GDPR).
- Deletion of detected or identified images. The user can choose a retention period for faces detected or identified by the system, after which they are deleted.
- Encryption. All communication in the system (from the camera through the edge station and the server to the database) is encrypted.
- Centralisation of the database of enrolled subjects. This provides security guarantees in processing, such as integrity and confidentiality.
- Limited access to data on external devices. Mobile phones, tablets, electronic agendas, and similar devices can connect to the system and receive alerts, but cannot download any data from the database.
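To make the first standard more concrete, here is a minimal sketch in Python of group-balanced sampling, the kind of usage balancing described above. The group labels, toy dataset, and function names are hypothetical illustrations for this post, not Herta's actual training code:

```python
import random
from collections import Counter

def balanced_weights(group_labels):
    """Assign each sample a weight inversely proportional to the size
    of its demographic group, so every group contributes equally in
    expectation when batches are drawn."""
    counts = Counter(group_labels)
    return [1.0 / counts[g] for g in group_labels]

def sample_batch(samples, group_labels, batch_size, rng=random):
    """Draw a batch with group-balanced probabilities."""
    weights = balanced_weights(group_labels)
    return rng.choices(samples, weights=weights, k=batch_size)

# Toy dataset: group "A" is heavily overrepresented (90 vs. 10 images).
labels = ["A"] * 90 + ["B"] * 10
samples = list(range(100))

random.seed(0)
batch = sample_batch(samples, labels, batch_size=1000)
drawn_groups = Counter(labels[i] for i in batch)
print(drawn_groups)  # each group is drawn roughly half the time
```

With uniform sampling, group "B" would appear in only about 10% of draws; the inverse-frequency weights push both groups toward equal representation in every batch.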
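The deletion standard above can be sketched as a small retention filter: records older than the user-chosen period are purged. This is a hypothetical illustration (the record format and function name are invented for the example), not Herta's implementation:

```python
from datetime import datetime, timedelta, timezone

def purge_expired(records, retention, now=None):
    """Drop face records captured before the retention cutoff;
    return only the records that are kept."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - retention
    return [r for r in records if r["captured_at"] >= cutoff]

# Example: a 7-day retention period chosen by the user.
now = datetime(2024, 1, 10, tzinfo=timezone.utc)
records = [
    {"face_id": 1, "captured_at": datetime(2024, 1, 9, tzinfo=timezone.utc)},
    {"face_id": 2, "captured_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
]
kept = purge_expired(records, retention=timedelta(days=7), now=now)
print([r["face_id"] for r in kept])  # [1]
```

Running a filter like this on a schedule ensures that detected or identified faces never outlive the configured retention window.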
As we have seen, all these ethical rules contribute to the security and protection of users’ data, and Herta is firmly committed to an ethical and respectful deployment of this technology. The benefits run in both directions: on one hand, the software helps establish a safety net at the places where it is installed; on the other, it protects the rights of the people on site.
We believe that the responsible use of facial recognition technology is crucial for building trust with our customers and the broader public, and we are committed to setting a positive example in the industry. We look forward to sharing our insights and engaging in discussions around the ethics of facial recognition technology.