AI is the shiny object of the business world right now. Everyone’s looking at how artificial intelligence could help (or harm) their industry.
But in this case, AI kind of deserves to be the shiny object. It’s a rapidly evolving sector, and there are opportunities to leverage AI in nearly every department of a company, including human resources.
For HR teams, there are emerging uses for AI that could give your company a considerable advantage in employee training and development. With AI, HR teams can design training and development programs that are more effective and tailored to employees’ individual needs.
Considering that 85% of talent developers use internally created content to train employees, firms that use AI wisely have a competitive edge.
But (sorry, there is a “but” here) artificial intelligence comes with a certain amount of risk—including privacy risks that can jeopardize employee trust and your business’ reputation.
Five ways artificial intelligence is being used for employee development and training (and what the privacy risks are)
1. Personalized learning and development
We’ve known since elementary school that everyone has different styles of learning. Some prefer video, while others would rather claw their eyes out than sit through a 20-minute explainer video when they could read the same material as text.
AI has the potential to enable personalized learning experiences. It can adapt to user preferences or adjust a curriculum to meet their needs.
Privacy risk involved: getting too personal with data
To personalize anything, you need access to personal information, which creates an inherent privacy risk. To avoid missteps, businesses must establish practices for obtaining consent, providing user control over their data, and maintaining data security.
Beyond consent, all data should be reviewed before it goes into an AI model.
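One part of that review can be automated: stripping direct identifiers from records before they ever reach a model. The sketch below is a minimal, hypothetical example (the field names and salt are assumptions, not from any specific HR system) of replacing identifying fields with a stable pseudonym so the model never sees who a record belongs to.

```python
import hashlib

# Hypothetical fields that directly identify a person and should
# never reach the model.
DIRECT_IDENTIFIERS = {"name", "email", "employee_id"}

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace direct identifiers with a stable salted hash,
    leaving non-identifying training data (scores, department) intact."""
    clean = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            clean[key] = digest[:12]  # stable pseudonym, not the raw value
        else:
            clean[key] = value
    return clean

record = {"name": "Jane Doe", "email": "jane@example.com",
          "department": "HR", "quiz_score": 87}
print(pseudonymize(record, salt="rotate-me-periodically"))
```

Because the hash is salted and stable, the same employee maps to the same pseudonym across records (so learning progress can still be tracked), without exposing the underlying identity to the model.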
2. Translation and accessibility
Many businesses have employees who speak English as a second language. Translating employee training modules into different languages can take a lot of time and money. AI can speed this process up dramatically by translating materials into other languages, which can then be reviewed for accuracy.
Similarly, for any employees with auditory or visual limitations, AI can speed up the process of providing alternative training models without delays or significant added costs. This helps ensure all employees have equal access to career training and development.
Privacy risk involved: adhering to compliance requirements
Increasing accessibility options is great, but businesses should approach it carefully.
To start, don’t overlook employee rights. In some jurisdictions, employees have rights over how their data is used and can opt out of certain types of processing.
AI-driven training that pulls from special categories of personal data (such as health information or personal information that may reveal racial or ethnic origin) may require compliance with privacy laws like the EU’s General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA), as amended by the CPRA, to name a few.
3. Identifying skill gaps
Identifying skill gaps in each employee can take…entirely too much time. Even if you manually assess each person and their skill proficiencies, you’ll still have to do it every year.
AI can analyze performance metrics to identify where employees need additional proficiency to perform their job functions or earn a promotion. It can also help build tailored courses relevant to their needs and career development.
Privacy risk involved: meeting employee expectations for privacy
You might be identifying skill gaps among your entire employee base, but not all employees participating in training may be equally covered by privacy laws.
Yet privacy expectations often don’t match privacy reality. Consider expanding the application of privacy practices to include all employees so they feel equally informed and protected.
4. Adaptive testing
Almost every employee knows what it’s like to complete training on a subject they know like the back of their hand. It can feel like a waste of time and resources. Meanwhile, other employees may find the same training module too difficult.
AI in learning and development has the potential to create adaptive tests that adjust questions based on an employee’s performance. This can help gauge precisely where each employee is in their skill level and provide a more accurate assessment of their knowledge and abilities.
Privacy risk involved: purposeful, consent-driven data collection
Analyzing and processing data from adaptive testing may inadvertently reveal or infer information that learners might not want to disclose. To safeguard privacy in adaptive learning, businesses must prioritize the learners’ consent, preferences, and rights over their data and ensure all data is used for relevant purposes.
5. Building effective course series
For businesses that use externally developed content to help develop employee skills (e.g., Photoshop, C++ programming, or leadership training), finding the right series of courses for your employees can be tricky.
With AI, teams can input their desired competencies and end goals and ask an AI model to generate potential course series based on online reviews, curriculum, budget, and more. This gives HR teams a strong starting point to design a course for specific departments or positions within the company.
Privacy risk involved: unvetted vendors and untrained users
Having your team build courses using AI might seem like an efficient way to produce resources. However, before anyone inputs anything anywhere, it’s important to lay some groundwork:
- First, any AI vendor you use should be evaluated and put through a third-party/vendor risk assessment like any other vendor.
- Next, the software should be vetted to ensure the model won’t use data inputs for other purposes. If it will, that creates a privacy risk that must be addressed before continuing.
- Finally, users should be trained on what information may be entered into the system. Employees handling such data must receive proper training on the privacy regulations relevant to the company and their obligations under applicable privacy laws.
How to approach the use of AI in employee training and development without jeopardizing privacy
Before you start implementing and adding AI tools to your business operations, take some time to research the risks associated with those AI tools and how they might impact your business.
Many AI tools come with some sort of privacy risk, from data breaches to evolving legislation in the EU and employee data privacy.
Ensure your data privacy program is up-to-date
AI models use data, and AI models for employee training may rely on different types of employee data.
HR teams need to consider employee data privacy rights under any applicable jurisdictions. If you have employees in the EU or California, you might be limited in how you can use automated decision-making technology for profiling employees. Additionally, more local (think: city level) privacy regulations may apply to your business.
Data privacy laws are also becoming more common in the United States, where they are state-specific. Even if data privacy laws don’t impact your use of employee data this year, you could face issues in the future as the legal landscape changes.
As a general rule, stay current on your privacy operations to limit your risk exposure and prevent any problems before they occur.
Train your employees on the use cases and limits of artificial intelligence
This is a bit meta, but before you start using AI to train your employees, make sure your IT, HR, and other relevant departments have all received training on the use cases and limits of artificial intelligence.
To protect your business, provide additional training to anyone involved in employee training and development so that they understand your company policy on using different AI models (what they may be used for, approved use cases, and how to evaluate vendors). This will help prevent well-meaning employees from overstepping.
Once everyone is on the same page, you can implement AI tools that add value to your employee training and development.
Keep your eye on regulations
The California Privacy Protection Agency, often a harbinger of where US privacy regulation is headed, just released draft AI regulations.
These regulations apply to California residents, including consumers, employees, job applicants, and others. Employers using automated decision-making technology (ADMT) must notify job applicants and employees if an employment decision was based on ADMT and provide access to information about how the technology was used.
Yes, these are draft regulations, but they are a strong indication that legal guidance is coming.
With great power (and employee data) comes great responsibility
AI tools can streamline essential functions for employee training and development and give training and development staff more time to build up their programs.
That said, AI is not an “ask for forgiveness, not permission” situation. Many complications can arise from rushing headfirst into AI training and development tools without a detailed risk analysis and appropriate compliance measures.