LinkedIn has paused the training of its generative AI models on UK user data, following concerns raised by the Information Commissioner’s Office (ICO) [1] [5] [6] [7] [8] [9].

Description

UK users were previously opted in by default to having their data used for AI training, without their knowledge, prompting concern among privacy experts. The ICO confirmed that LinkedIn has halted the use of UK users’ information for AI training and expressed interest in further engagement with the company. The pause follows criticism that LinkedIn had not extended the same protections to UK users as to those in the European Union, the EEA [5] [8], and Switzerland [2] [3] [5] [6] [8] [9]. Microsoft-owned LinkedIn has stated that it will not use customer data from these regions for AI model training, and it allows users elsewhere to opt out of data usage for this purpose. The EU AI Act requires transparency in the use of user-generated content for AI modeling and mandates explicit permission before user data is used for AI [3].

In contrast, Meta has resumed its GenAI training program with UK user data after consulting with the ICO [4], despite backlash over the lack of fully informed consent from users. Privacy advocates have also raised concerns about potential data exposure through employee use of GenAI, with reports indicating incidents at one in five UK businesses. LinkedIn’s suspension of model training with UK user data follows Meta’s acknowledgment that it has collected non-private user data for similar purposes since 2007. The ICO has welcomed LinkedIn’s action and says it will continue to monitor developers of generative AI to safeguard data privacy rights. The episode underscores the ICO’s role in regulating AI systems and highlights the importance of data protection compliance in AI development.

LinkedIn’s recent changes to its privacy policy have drawn particular criticism because of the absence of explicit user consent [7]: users were automatically enrolled in data sharing without clear notification [7].
Other tech giants [5] [7], including Meta [5], have faced similar backlash for using user data in AI training [7], pointing to a broader industry problem with consent and transparency in data practices [7]. Privacy advocates are pushing for a shift from opt-out to opt-in consent [7], stressing that users should actively agree before their data is used for AI model training [7]. GDPR and European data protection regulators are widely seen as setting the standard for safeguarding privacy worldwide.

Conclusion

LinkedIn’s decision to halt AI model training with UK user data has significant implications for data privacy and user consent in AI development. It highlights the importance of transparency [2], explicit consent [7], and compliance with data protection regulations when user data is used for AI modeling. LinkedIn’s actions and the ICO’s response underscore the need for ongoing monitoring and regulation of generative AI to protect data privacy rights, and they signal a broader industry shift toward opt-in consent, greater accountability, and stronger protection of user data.

References

[1] https://dig.watch/updates/uk-user-data-pulled-from-linkedins-ai-development
[2] https://legaltechnology.com/2024/09/20/linkedin-suspends-opt-out-ai-model-training-for-uk-following-ico-concerns/
[3] https://www.darkreading.com/cyber-risk/linkedin-user-data-collection-ai-training
[4] https://www.infosecurity-magazine.com/news/linkedin-pauses-genai-training-ico/
[5] https://techcrunch.com/2024/09/20/linkedin-has-stopped-grabbing-u-k-users-data-for-ai/
[6] https://thehackernews.com/2024/09/linkedin-halts-ai-data-processing-in-uk.html
[7] https://www.allaboutai.com/uk/ai-news/linkedin-suspends-ai-model-training-in-uk-over-privacy-issues/
[8] https://www.techradar.com/pro/security/the-linkedin-ai-saga-shows-us-the-need-for-eu-like-privacy-regulations
[9] https://www.neowin.net/news/linkedin-halts-ai-training-on-uk-user-data-following-ico-complaint/