Compliance with data privacy regulations is an essential, non-negotiable part of the go-to-market strategy for video analytics technology providers. Public confidence is a critical factor in the growing acceptance of video analytics solutions, and it can only be achieved through open communication and transparency.
Uniquely identifying biometric data is personal and off limits
Data privacy regulations, such as Europe’s General Data Protection Regulation (GDPR), provide both technology providers and businesses with clear guidelines on how technology should be developed and used in order to preserve the privacy rights of individuals. Compliance with European regulations means that businesses and technology providers adhere to an extremely rigid set of constraints in how they process and manage personal data, so the public can rest assured that their personal data is safe and never used without their consent in a manner that would put them at risk. Video analytics systems, by their very nature, contain personal information. As a result, solutions for extracting vital marketing, operational, or health & safety data are highly specialized and must operate strictly within the boundaries of the regulations.
People’s faces, how they walk, their fingerprints, and their heartbeats are all examples of uniquely identifying biometric elements, and are therefore considered personal information. Similarly, the numerical representations produced by neural network processing, sometimes referred to as ‘templates’, are also considered uniquely identifying personal information. In fact, the European regulations go so far as to prohibit the generation of ‘templates’ without the express permission of the individual to whom they belong. This makes a lot of practical sense: although transforming an image of a face into a numerical representation might make it less recognizable to a person, and might even be considered a method of anonymization, the output of this process is a uniquely identifying biometric signature. It can therefore expose the individual to a higher risk of re-identification and future invasions of privacy.
For this reason, it is critical that GDPR-compliant video analysis systems guarantee by design that uniquely identifying biometric information is processed in very specific ways, and that no ‘template’ is ever generated by the system without the individual’s consent.
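Why a ‘template’ remains re-identifying can be sketched with a toy numerical example. This is not a real face-recognition model: the `fake_embed` function and the similarity thresholds below are illustrative assumptions only. The point it demonstrates is that even after an image is reduced to an anonymous-looking vector, two captures of the same person still match each other by similarity, which is exactly what makes the vector biometric data.

```python
import numpy as np

# Toy illustration (NOT a real face-recognition model): a 'template' is just
# a numeric vector derived from an image. Even though the vector looks
# anonymous, it can still re-identify a person via similarity matching.

rng = np.random.default_rng(0)

def fake_embed(identity_seed: int, noise: float = 0.05) -> np.ndarray:
    """Stand-in for a neural-network embedding: the same person (seed)
    always yields a similar vector, with small capture-to-capture noise."""
    base = np.random.default_rng(identity_seed).normal(size=128)
    v = base + rng.normal(scale=noise, size=128)
    return v / np.linalg.norm(v)  # unit-length template

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of two unit vectors is just their dot product."""
    return float(a @ b)

# Two captures of 'person 42' and one capture of 'person 7'
a1, a2 = fake_embed(42), fake_embed(42)
b = fake_embed(7)

# A stored template (a1) matches a fresh capture of the same person (a2)
# far better than a capture of a different person (b) -- re-identification.
assert cosine(a1, a2) > 0.9
assert cosine(a1, b) < 0.5
```

Because the stored vector lets anyone with a new image single the person out again, it offers no real anonymization, which is why the regulations treat template generation itself as processing of biometric data requiring consent.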
Making information easily accessible to customers
Why C2RO is a recognized pioneer in privacy-aware AI video analytics
C2RO’s organizational, technical, and physical safety measures related to the Perceive™ software were carefully audited by the company’s Office of the DPO through an in-depth Privacy Impact Assessment (PIA). The results of the PIA were then reviewed by the CNIL, the French national data privacy supervisory authority, as well as by a Canadian law firm, both of which concurred that Perceive does not present high residual risks to the privacy rights of individuals. This makes C2RO the first AI video analytics technology provider to be compliant with both European and Canadian data privacy laws.
Read our related news on our GDPR and PIPEDA compliance
C2RO, the pioneer in GDPR-compliant AI Video Analytics, continues its commitment to respecting data privacy by achieving its PIPEDA milestone in Canada.
C2RO has proudly announced today that a Canadian law firm has verified that the company’s privacy-aware AI video analytics technology, Perceive™, is in alignment with the 10 PIPEDA principles. This makes C2RO one of the first AI video analytics technology providers to be compliant with both European and Canadian data privacy laws.
C2RO Perceive AI Video Analytics Software Raises the Bar and Sets New Standards for GDPR Compliance in Europe
C2RO has proudly announced today that the CNIL, an independent French administrative regulatory body whose mission is to ensure that data privacy law is properly applied by technology firms, concurred with C2RO’s Office of the Data Protection Officer (DPO) that the company’s AI video analytics solution, Perceive™, does not present a high residual risk to the privacy rights of individuals.