Chatbots can play a major role in data compliance solutions
Data compliance is eating up a growing amount of time for businesses, with legal, IT and every data-handling function having to dot their ‘i’s and cross their fingers that they haven’t made an expensive mistake. Automation and chatbots can play a growing role in helping companies train their staff and deal with customers under GDPR and other data laws.
The European Union’s rollout of the General Data Protection Regulation (GDPR) in 2018 forced every business that handles the personal data of EU residents to ensure it is compliant, regardless of where in the world it operates. Even businesses not affected by GDPR will likely face broadly similar rules, as other states and regions bring their own laws on data and AI technologies into force.
Around the world, India is drafting its own Personal Data Protection Act, while the Cyberspace Administration of China (CAC) has a new set of Data Protection Regulatory Guidelines; between them, these cover a huge percentage of the world’s population.
In the US, California is leading the way with the new California Consumer Privacy Act, which came into force at the start of this year, hot on the heels of laws trying to limit fakery with bots. All laws have critics and advocates, but rules like these will be commonplace within a few years.
The basics of the regulations are simple and bring 21st-century protection to personal data that was sorely missing. Companies or organizations that store or process personal data need to ensure that people consent to its storage, that the data can be provided to them on request, and that it can be deleted on request, aka the ‘right to be forgotten’. There are already plenty of examples of businesses or organizations being flooded with such requests, and the volume will only grow as consumer awareness does.
So, if someone signs up for your company newsletter, they have to consent to you storing their email address. If someone talks to your chatbot, any identifying data needs to be similarly managed - but what if your legal team doesn’t know the chatbot stores personal data? And what is personally identifying data anyway? If someone talking to a bot calls himself Dave from a small village, is that enough to make the conversation personally identifiable? Here’s where things get tricky, and businesses need to take steps to make these rules explainable to each worker and customer.
Chatbots to the GDPRescue
Every person who works with data in a business needs to know the rules, and as requests from customers or users about GDPR come in, there needs to be a consistent way to handle them. Automation through chatbots can help in both of these respects.
Within the business, a chatbot can provide information to all workers in the form of a Q&A approach. The bot can help teams and new hires understand what data is “personally identifying”, how your business handles that data in respect of GDPR (or similar regulations), who to talk to if there is an issue and how their own data is being stored or used (for larger organizations).
The internal chatbot provides a single source of information that can be verified by company leaders or legal teams to ensure its accuracy. Legal teams can also use the bot to explain which rules apply when legislation overlaps or the company works across various jurisdictions. This saves a lot of time when a company is growing fast or already has a sizeable footprint.
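An internal Q&A bot of this kind can start as little more than keyword matching over a legal-team-approved answer list. The sketch below is a hypothetical example; the answers shown are illustrative stand-ins for whatever your legal team signs off on:

```python
# Minimal keyword-matched internal FAQ bot. The answer text is
# illustrative; in a real deployment every reply would be written
# and verified by the legal team.
FAQ = {
    ("personal", "identifying"): (
        "Personally identifying data is anything that can single out an "
        "individual: names, email addresses, IP addresses, device IDs."
    ),
    ("breach", "report"): (
        "Report any suspected breach to the data protection officer "
        "immediately; GDPR requires notifying regulators within 72 hours."
    ),
    ("fine", "penalty"): (
        "Serious GDPR breaches can be fined up to 20 million euros or 4% "
        "of global annual turnover, whichever is higher."
    ),
}

def answer(question: str) -> str:
    words = question.lower()
    for keywords, reply in FAQ.items():
        if any(keyword in words for keyword in keywords):
            return reply
    # Always hand unknown questions to a human rather than guess.
    return "I don't know - please ask the legal team."
```

The fallback matters: a compliance bot that guesses is worse than no bot, so anything unmatched is routed to a human.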
The bot can easily be updated to reflect any local or regional laws that also come into play. It can also be used to explain the consequences of a data breach, and what the costs to the company can be to highlight the urgency, with fines of up to 20 million euros for serious breaches of GDPR.
For customers, current chatbots can be updated to explain privacy policies and to capture consent. Smart bot developers can separate personally-identifying information from anonymous information at source, and automate the ‘right to be forgotten’. For businesses that have multiple sources of personal data, the bot can create lists that the appropriate person or team can use to find and delete any remaining data.
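Separating personal data at source can be as simple as extracting identifiable fields into their own store as each message arrives, so that honouring a deletion request is a single operation. A minimal sketch, assuming email addresses are the only PII in play (a real system would cover names, phone numbers and more):

```python
import re

# Sketch: strip obvious PII from chat messages at source. Identifiable
# fields go into a separate store keyed by session, so the 'right to be
# forgotten' is one delete. The pattern is illustrative, not exhaustive.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

pii_store: dict[str, list[str]] = {}    # session_id -> extracted PII
transcripts: dict[str, list[str]] = {}  # session_id -> anonymised lines

def log_message(session_id: str, text: str) -> None:
    emails = EMAIL_RE.findall(text)
    if emails:
        pii_store.setdefault(session_id, []).extend(emails)
    anonymised = EMAIL_RE.sub("[email removed]", text)
    transcripts.setdefault(session_id, []).append(anonymised)

def forget(session_id: str) -> None:
    # Deleting the PII store severs the link to the individual; the
    # anonymised transcript can be kept for analytics.
    pii_store.pop(session_id, None)
```

Because the transcript never contains the raw identifiers, analytics and bot-training pipelines downstream never need to be scrubbed when a deletion request arrives.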
Any text in a customer-facing chatbot should be simple and free of legal jargon, while not encouraging people to sign up under false pretences. If you hate those cookie/advert pop-ups that say “accept” but offer no option for “no thank you”, you’ll appreciate the sentiment.
California’s CCPA also adds the right to opt out of the sale of personal information, with extra legal protections for minors. It also grants a right to non-discrimination in price or service, which might affect those offered different rates based on location or personal circumstances, something Californian insurers and retailers will have to explain.
Therefore, chatbots will be ideal in explaining these rules and the implications to customers in a friendly manner, while developers need to ensure that new and future bots are compliant along with any other data services that access or are accessed by them. Existing bots need to be retrofitted to match any applicable laws, and future efforts should be designed around a privacy-first perspective.
While your business use of personal data might be fairly benign, its management and protection need to be clear and explainable to all. It will only take a few high-profile hacking incidents, another election-cycle social media scandal or two, or political actors accessing data they shouldn’t to create a tide of complaints that could sweep in huge numbers of businesses.
Having a chatbot on hand to explain the issues and what to do for all parties will be a key way to protect your business and save having to individually answer every request or query as the rules impact every worker and business around the world.