Implementation of Privacy by Design in face-recognition software

Brief description of the case

In this case study, we describe how we helped implement Privacy by Design principles during the development of a mobile application. Privacy problems are usually visible already at the design stage of applications and software, and that is when they are easiest to fix.

Using the Privacy by Design approach can save a lot of money:

1) you avoid fines that can run into the millions;

2) your application will not have to be reworked later because it was not accepted by key online platforms.

In this case, we used data pseudonymization as the solution. You will see a concrete example of how this can be done when it is not possible to obtain consent from the people whose personal data you process.

A distinctive feature of this case is that the customer was actively involved in the process and well informed. This helped us solve the problem in the shortest possible time.

FYI: just a couple of months ago, during a GDPR implementation at a large company, we encountered the same functionality. So this time we could propose a ready-made solution.

We were contacted by a company that develops mobile applications. The company employs more than 100 people, and this is not their first or only product. The application was planned as a free addition to their main service.
Project goals and objectives
  • Conduct an audit on compliance with the GDPR.
  • Evaluate the possibility of implementing Privacy by Design.
  • Create a system that preserves the principle of transparency.
  • Protect the company from the risk of getting a fine or complaint for violating the GDPR.
Project summary

The client had developed a new application that collects information about users, compiles lists of numbers from their phone books, and sends them to the service. As a reference, we took a similar product operating in Russia. However, our client chose a different market, the European one, where the GDPR applies: there was no legal basis for processing this data, which violated the principle of transparency. Working with the customer, we found the optimal solution to this problem and saved the company from the risk of a fine or a complaint for violating the GDPR.

Project results

As a result, the company implemented a tool that maintains the principle of transparency and complies with the Regulation. The app was successfully launched and is now used by millions of people.


Siarhei Varankevich CIPP/E, CIPM, MBA
Founder of Data Privacy Office LLC. Data Protection Trainer and Principal Consultant.
MBA, Certified Information Privacy Professional (CIPP/E), Certified Information Privacy Manager (CIPM). Started working with the draft version of the GDPR in Munich in 2015 and defended his MBA thesis on the Regulation in Bremen in 2016.

Siarhei has delivered hundreds of consultations on GDPR issues to companies around the world and has helped implement GDPR programs as an external project manager in over 50 companies.

On LinkedIn

Two employees from the client's side also worked on the solution.

Why did the company choose us?

Firstly, the specialists available on the Russian market mainly focus on local legislation, while the customer needed a guarantee that they would comply with the requirements of the European GDPR and would not be subject to a large fine.

Secondly, our company employs professionals certified in information privacy (CIPP/E, CIPM), as well as specialists with experience in information security. Many of them live in the EU and collaborate with us remotely; they speak both Russian and the main European languages.

Our client understood that they would receive the same service as from European consultants, but in a language they understand and at a very reasonable cost.

In addition, we could not only solve the problem but also train the company's employees so that in the future they could find and eliminate possible risks themselves.

Initially, several specialists from the company attended our 4-day GDPR DPP training course. These were employees involved in software development: business representatives, managers, and engineers.

The training saved time later on, as long explanations were no longer needed and the client could do part of the work independently.


What did we do to solve the problem?

First, we tried to find exceptions in the GDPR, or additional clarifications, that would let us work around the client's problem. However, this did not lead to any results.

Then we proposed holding a Privacy by Design consultation based on the method of R. Jason Cronk.

The great risk was that we did not have the consent of the people whose phone numbers were in the database. The data was processed covertly, so these people were not aware of what was happening with their personal information.

The problem is that such data attracts attackers: it can be stolen and used for their own purposes. Quite simply, even a company employee could copy it to a USB flash drive and walk away with it.

We needed to eliminate this risk in order to comply with the GDPR.

We prepared several options for solving this problem in advance and presented them to the client at the beginning of the consultation; the rest emerged in the course of our joint work.

It is important that the client was involved at every stage of the discussion and was therefore clearly aware of all the pros and cons. In fact, it was their own decision.

Time frame

We solved this problem in a 2-hour consultation. The client also prepared in advance by drawing up a data flow diagram, which took another 2 hours.

Problem solving process

Together with the customer, we determined the optimal solution: pseudonymization of the phone numbers of those people whose consent to the processing of personal data we did not have.

The program hashed each phone number, producing a code from which the number itself cannot be determined. However, by hashing a number again inside the company and comparing the results, you can tell which number a given hash corresponds to. This allowed us to identify which numbers in the database belonged to customers who had consented to the processing.
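The matching described above can be sketched as follows. This is a minimal illustration, not the client's actual implementation: the hash function (SHA-256), the digit normalization, and all names and numbers are our assumptions for the example.

```python
import hashlib


def pseudonymize(phone: str) -> str:
    """Hash a digit-normalized phone number; the raw number is never stored."""
    digits = "".join(ch for ch in phone if ch.isdigit())  # strip +, spaces, dashes
    return hashlib.sha256(digits.encode()).hexdigest()


# Numbers of registered users who gave consent (already known to the service).
consented = {"+49 151 2345678", "+33 6 12 34 56 78"}
consented_hashes = {pseudonymize(n) for n in consented}

# Hashes uploaded from a user's phone book -- the server only ever sees these.
uploaded = [pseudonymize("+49 151 2345678"), pseudonymize("+7 999 000 1122")]

# Re-hashing and comparing: a match means the contact is already a consenting user.
matches = [h for h in uploaded if h in consented_hashes]
print(len(matches))  # → 1
```

Because hashing is deterministic, the service can recognize numbers it already knows without ever receiving the unknown numbers in the clear.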

We could justify such processing of the hashes of non-users on another legal basis: legitimate interest. This helped avoid possible complications and comply with the requirements.

If a number is encoded as a hash, little can be done with it directly. In theory, the numbers can still be recovered by brute force, but this takes considerable time and effort. Therefore, we recommended that spent hashes be deleted from the system immediately.
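The brute-force risk is real because the phone number space is small. The toy sketch below enumerates only a 4-digit suffix under an assumed prefix (all values are illustrative); a full national numbering plan is on the order of 10^9 to 10^10 candidates, which is why deleting spent hashes promptly matters.

```python
import hashlib


def brute_force(target_hash: str, prefix: str = "+49151", digits: int = 4):
    """Recover a hashed number by trying every suffix in a small space."""
    for i in range(10 ** digits):
        candidate = prefix + str(i).zfill(digits)
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None


secret = "+491512345"
recovered = brute_force(hashlib.sha256(secret.encode()).hexdigest())
print(recovered)  # → +491512345
```

A keyed hash (e.g. HMAC with a secret key held by the company) would raise the attacker's cost further, but deleting hashes once they have been matched removes the target altogether.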

What is the result?

The company successfully launched the application. Millions of people now use it.

The company’s privacy policy says the following:

“We process data from the phone books of our users, but these numbers come to us in the form of hashes. This data is stored on a secure server, and it would take a disproportionate amount of effort to extract and decode it. The hashes are still considered personal data, but we process them on the basis of legitimate interest, and consent is not required.”

Just a couple of months ago, during a GDPR implementation at a large company, we found the same functionality in their mobile application. The situation was very similar.


Recommended services

We can audit your compliance with the GDPR: external and internal audits of projects, processes, or instances of processing.
GDPR Implementation
GDPR Roadmap+ Implementation Program
Training and consulting support for the working group on GDPR implementation, as well as ad hoc consultations on problematic areas.

