Security and privacy violations can cost companies dearly, not only in fines but also in reputational damage.
Mapping your data and its use is an important part of monitoring, and it is an area where AI can help. To find out more, we spoke to Amar Kanagaraj, the founder and CEO of oneDPO, a data protection and management company specializing in AI.
BN: Artificial intelligence excels at detecting patterns and trends, so how can it help protect data?
AK: Organizations have complex data ecosystems, and that complexity grows with the volume, variety, and velocity of their data. Under these circumstances, identifying data protection risks is hard: you're looking for a needle in a huge pile of needles!

Moreover, identifying data protection issues requires a good understanding of the context in which data is collected and used. For example, a company may have permission to use a person's phone number for security purposes, but not for marketing purposes.
To cope with this complexity, companies are adding manual processes to handle privacy issues. The problem is that manual processes don't scale. AI-based solutions, by contrast, can adapt to the data. AI can identify sensitive personal information in any data source within the organization, and it can help organizations understand what data they have and how they use it. AI/ML can continuously search for patterns and flag privacy issues.
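To make the idea concrete, here is a minimal sketch of automated sensitive-data discovery. This is a generic pattern-based scanner for illustration only, not oneDPO's product; real AI-driven tools combine ML classifiers with context, but even this toy shows how detection can be applied uniformly across data sources instead of relying on manual review.

```python
import re

# Illustrative patterns only; production systems use ML models plus context.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def scan_for_pii(text):
    """Return (pii_type, match) pairs found in free text."""
    hits = []
    for pii_type, pattern in PII_PATTERNS.items():
        hits.extend((pii_type, m) for m in pattern.findall(text))
    return hits

record = "Contact Jane at jane.doe@example.com or 555-867-5309."
print(scan_for_pii(record))
# → [('email', 'jane.doe@example.com'), ('phone', '555-867-5309')]
```

The same scan function can be pointed at databases, log files, or document stores, which is what makes an automated approach scalable where manual audits are not.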
BN: Does the amount of data currently collected by companies increase the risk?
AK: Today's economy collects an astonishing amount of data, from sensors to mobile applications. As the volume of data increases, the complexity of managing and governing it grows exponentially. Data does not remain at its source within an organization; it flows through the entire organization, being transformed along the way, which multiplies the amount of data. Most companies do not have a clear picture of where and how their data is stored and managed, and the hazier that picture becomes, the greater the risk of a data breach.
Moreover, the regulatory environment is becoming incredibly complex as numerous data protection laws emerge, each with its own restrictions on data processing. That is why data minimization, one of the basic principles of privacy, is gaining popularity: it recommends that companies limit the collection and processing of data to the minimum necessary for their business activities.
BN: How can you ensure that an AI is well trained to detect privacy issues?
AK: The effectiveness of AI/ML depends largely on the data used for training. In some areas of data protection, we have a rich set of training data. For example, as security threats and incidents have increased, so has the volume of incident and user-activity data. That historical data provides the training data needed to apply AI/ML to identifying security issues. In recent years, a class of security products called UEBA (User and Entity Behavior Analytics) has emerged. UEBA tools use AI/ML to identify security risks, analyzing numerous parameters and models of user behavior to spot insider threats.
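The core idea behind UEBA-style detection can be sketched very simply: compare each user's activity today against their own historical baseline and flag large deviations. This toy z-score check is an illustration of the principle only; commercial UEBA products model many more signals.

```python
import statistics

def flag_anomalies(history, today, threshold=3.0):
    """history: {user: [daily event counts]}; today: {user: count}.
    Flags users whose count exceeds mean + threshold * stdev of their baseline."""
    flagged = []
    for user, counts in history.items():
        mean = statistics.mean(counts)
        stdev = statistics.pstdev(counts) or 1.0  # avoid division by zero
        if (today.get(user, 0) - mean) / stdev > threshold:
            flagged.append(user)
    return flagged

history = {"alice": [10, 12, 11, 9, 10], "bob": [5, 6, 5, 7, 5]}
today = {"alice": 11, "bob": 60}  # bob suddenly touches 60 files
print(flag_anomalies(history, today))  # → ['bob']
```

Because the baseline is learned per user rather than hard-coded, the same check adapts as behavior changes, which is what distinguishes this approach from static rule lists.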
Data protection, by contrast, is still largely in its infancy, so the availability of training data for AI is a challenge. On the positive side, PrivacyTech is constantly innovating to offer new solutions. At oneDPO, we use AI and privacy engineering to solve data protection problems. AI itself has also advanced rapidly in recent years; many techniques now make sophisticated AI possible even when a company works with a limited data set.
BN: Can AI help identify data that could lead to privacy breaches?
AK: Governments recognize the importance of protecting consumer privacy and are enacting laws, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), to control how consumer data is stored and used. These new data protection laws broaden the scope of what counts as personal data. Previously, organizations treated directly identifying information, such as social security numbers, as personally identifiable information (PII). Under the new laws, companies must also protect data that could indirectly lead to identification. For example, if the combination of postal code, age, and gender identifies an individual, all three together should be treated as PII.
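The postal code, age, and gender example can be checked mechanically. The sketch below counts how many records share each combination of those quasi-identifiers; any combination held by fewer than k people singles someone out (this is the intuition behind k-anonymity). It is an illustrative exercise, not a description of any particular product.

```python
from collections import Counter

def risky_combinations(records, quasi_ids=("zip", "age", "gender"), k=2):
    """Return quasi-identifier combinations shared by fewer than k records,
    i.e. combinations that effectively identify an individual."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return [combo for combo, size in groups.items() if size < k]

people = [
    {"zip": "94107", "age": 34, "gender": "F"},
    {"zip": "94107", "age": 34, "gender": "F"},
    {"zip": "10001", "age": 52, "gender": "M"},  # unique: identifies someone
]
print(risky_combinations(people))  # → [('10001', 52, 'M')]
```

A dataset with no risky combinations is said to satisfy k-anonymity for those fields; fields that fail the check should be handled as PII even though none of them is identifying on its own.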
With the new laws, older approaches are insufficient. As legislation evolves, AI-based data protection solutions can quickly adapt to new and complex requirements. As the tools mature, AI can play an important role not only in ensuring compliance, but also in delivering stronger privacy.
BN: Is it important to include all copies of data, including backups, in the implementation of data protection?
AK: When it comes to data protection, backups should be treated with the same care as the primary data. Under the GDPR, an organization that collects, stores, or uses personal data of EU citizens must protect all of that data, including copies. Organizations must therefore ensure that personal information in backups is encrypted and properly managed. When companies introduce data protection technologies into their operations, they should include backups in the initiative.
BN: Will strong data protection slow down companies' innovation? How can AI help?
AK: In the past, companies did not consider data protection a top priority. Driven by new regulations and growing customer awareness, they have begun to implement data protection processes and policies. In the absence of appropriate tools, many of these policies are enforced manually, with additional controls and limitations. These rudimentary methods restrict the flow of data and the value that can be derived from it.
New PrivacyTech companies like ours use technology to simplify privacy protection. Using AI and other privacy technologies, these new tools aim to automate the monitoring, detection, and prevention of privacy issues. We are committed to embedding privacy in corporate culture, processes, and systems. Automated solutions increase trust and enable better data flow, which significantly accelerates innovation. As the technology advances, companies will no longer have to trade faster internal data flows for consumer privacy.
Photo credit: vaeenma/depositphotos.com