By Julia Waller
It started, as these things often do, with a small line in a settings menu.
Back in November, LinkedIn made a change that many users will have scrolled straight past. Since then, the platform has been using public member data to help train its own AI systems. Profiles, posts and public activity are now part of the raw material shaping how LinkedIn’s future tools work.
Private messages are not included, which will be a relief to many. Still, it marks a meaningful shift in how professional data is treated on the world’s largest business network.
What actually changed
LinkedIn confirmed that, from November 2025, it has been using public member data across the EU, UK, Canada, Switzerland and Hong Kong to train its AI models. The stated aim is to improve search, recommendations and new AI-driven features.
By default, members are opted in. If you do nothing, your public activity is included. You can opt out, but only prospectively: opting out stops future use, while anything already collected remains part of the training data.
This is not hidden or underhanded, but it is easy to miss unless you actively review your data settings.
Why this matters, even if it feels abstract
On one level, this is simply how modern platforms operate. Many people will shrug and move on.
But for anyone who uses LinkedIn as more than a digital business card, it is worth pausing. If you spend time crafting posts, sharing insight, refining your CV or building a professional voice, that work may now help train automated systems designed to replicate, summarise or repurpose similar content.
The upside is clear enough. Better recommendations, smarter tools and a platform that, in theory, understands its users more accurately.
The trade-off is subtler. Your words, ideas and experience contribute to something you do not control, are not credited for, and may never see directly.
What it means for businesses and advisers
For companies, advisers and professional services firms, this is a new layer to consider.
Content posted on company pages or by staff acting in a professional capacity may now feed directly into AI tools owned by the same platform distributing that content. That is a shift from content simply being seen or shared to content actively shaping the systems behind the scenes.
There may be benefits in visibility and relevance. There is also a lingering question around ownership, context and unintended reuse. None of this is unique to LinkedIn, but it is becoming harder to ignore.
It is another reminder that public content rarely stays in the box we imagine it lives in.
The wider direction of travel
LinkedIn is not acting in isolation. Meta, Google and others are all moving in the same direction. Platforms increasingly want to train their own AI models using their own ecosystems, rather than relying on scraped or third-party data.
This feels like the beginning rather than the end. AI can be a powerful tool, but as it learns more from our behaviour, language and patterns, new risks emerge alongside the efficiencies. More convincing scams, deeper impersonation and blurred lines between human and automated voices are already part of the conversation.
It is the familiar, slightly weary debate about data being used in ways that stretch beyond our original intent. The difference now is scale and speed.
What you can do
If you would prefer not to take part, opting out is straightforward:
Go to Settings → Data privacy → How LinkedIn uses your data → Data for generative AI improvement, and switch the toggle off.
Even if you leave it on, the important thing is awareness. Knowing how your data is used allows you to make deliberate choices about what you share and how you share it.
Changes like this rarely arrive with much noise. They appear quietly in settings menus, policy updates and footnotes, then gradually reshape how platforms behave and how professionals engage with them.
This shift aligns with patterns already emerging in Remit Consulting’s work on AI in real estate. Not dramatic disruption, but steady integration. Tools learning from behaviour, systems becoming more predictive, and data taking on a longer life than many users expect.
There is no single right response. Opting out or staying in is a personal and organisational choice. What matters more is awareness. Understanding how these platforms evolve, and how our professional activity feeds into that evolution, is becoming part of the job.
