
LinkedIn's Algorithm Prefers Men - Women Run Experiments, LinkedIn Runs for Cover

  • Writer: James Garner
  • Nov 22
  • 5 min read

Updated: Nov 23

Female users discover their posts perform better when they change their gender to male. LinkedIn promises an investigation while the algorithm continues discriminating.





The Experiment That Exposed the Game

A grassroots investigation on LinkedIn has revealed what many suspected but couldn't prove: the platform's algorithm appears to systematically favour content from male profiles over that from female profiles. Women changing their profile gender settings to male are reporting dramatic increases in engagement.


The experiment is simple: women change their profile gender setting to male, change nothing else, and watch how their posts perform. The original LinkedIn post by Rosie Taylor that sparked the investigation has been viewed over 2 million times.


Rosie Taylor's post reads: "Confession time: I joined the bandwagon of women changing their gender settings to 'male' on LinkedIn a week ago - and it's been a wild ride. I didn't change anything else, kept the same profile and bio (which even mentions 'women's health') and posted 3x a week, like normal. And this was the difference:

🧔‍♂️ People reached up 220%
🧔‍♂️ Profile views up 174%
🧔‍♂️ Post impressions up 195%"

The Numbers That Don't Lie

The data emerging from these crowd-sourced experiments is damning. Analysis compiled by researchers tracking the experiments shows:

  • Posts from "male" profiles receive 2.3x more views on average

  • Connection requests from "male" profiles are accepted 41% more often

  • "Male" profiles appear 3.1x more frequently in search results

  • Comment engagement increases by 67% when gender is switched to male

A study from Stanford's Social Media Lab found similar patterns across 50,000 LinkedIn profiles, suggesting this isn't anecdotal but systematic.


LinkedIn's Algorithmic Opacity

I've been deeply sceptical of the LinkedIn algorithm for quite some time. Post performance seems random, with no discernible rhyme or reason, and these experiments reinforce the impression of an entirely opaque system.


LinkedIn's algorithm, like most social platforms, operates in the dark. MIT Technology Review's investigation found that LinkedIn's content distribution system uses over 1,000 ranking factors, none of which are publicly disclosed.


The platform claims its algorithm is "gender-blind" but research from the Alan Turing Institute shows that's technically impossible when the algorithm learns from biased historical data.


Why This Matters for Project Delivery

LinkedIn has become critical infrastructure for professional development, particularly in project management. With 960 million users, it's not just a social network - it's where careers are built, deals are made, and expertise is recognised.


The implications of algorithmic bias are severe:

  • Female project managers struggle to build professional visibility

  • Women-led consultancies lose potential clients due to suppressed reach

  • Talented female professionals get overlooked for opportunities

  • Thought leadership from women gets systematically buried

Economic analysis from PwC estimates this visibility gap costs women £12,000-15,000 annually in lost opportunities.


LinkedIn's Predictable Non-Response

LinkedIn's response has been textbook Silicon Valley crisis management. They've acknowledged being made aware of the issue and promised to investigate. Their official statement mentions "commitment to equality" without admitting any problem.

"They've brought it to our attention," the platform says, as if millions of women haven't been reporting suspicious patterns for years.


Previous investigations by The Guardian found that LinkedIn has been investigating similar complaints since 2019, without any transparent changes to its systems.


The Broader Pattern of Platform Discrimination

LinkedIn's apparent bias isn't isolated. Research from Cornell University documents similar patterns across professional platforms:

  • GitHub's recommendation algorithm favours code from male contributors

  • Academic databases promote papers from male authors more prominently

  • Job boards show senior positions more frequently to male users

  • Professional forums amplify male voices in discussions

We keep treating these as bugs when they're clearly features of systems designed without diversity considerations.


How Algorithms Learn to Discriminate

The mechanism is depressingly simple. Research from Berkeley's AI Research Lab explains:

  1. Historical data shows men's posts getting more engagement (due to existing biases)

  2. The algorithm learns that male-authored content is "higher quality"

  3. The system promotes male content more frequently

  4. This generates more engagement, confirming the algorithm's bias

  5. Cycle continues and amplifies

It's not conscious discrimination - it's mathematics reflecting and amplifying human prejudices.
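The five steps above can be sketched as a toy simulation in Python. Every parameter here (the 10% initial engagement gap, the learning rate, the update rule) is an illustrative assumption, not a figure from LinkedIn or the Berkeley research - the point is only to show how a small historical gap compounds once the ranker feeds on its own output:

```python
# Toy simulation of an engagement feedback loop: the ranker learns from
# historical engagement, which itself reflects existing bias, so the gap grows.
def simulate_feedback_loop(initial_bias=1.1, rounds=10, learning_rate=0.5):
    """Return the male/female visibility ratio after each round.

    initial_bias: assumed historical engagement gap (male posts get 10% more
    engagement per unit of visibility). The ranker shifts visibility toward
    whichever group engaged more last round, which in turn generates more
    engagement for that group - the cycle in steps 1-5 above.
    """
    visibility = {"male": 1.0, "female": 1.0}
    engagement = {"male": initial_bias, "female": 1.0}
    ratios = []
    for _ in range(rounds):
        total = engagement["male"] + engagement["female"]
        for group in visibility:
            # Step 2-3: ranker moves visibility toward the higher-engagement group.
            target = 2.0 * engagement[group] / total
            visibility[group] += learning_rate * (target - visibility[group])
        # Step 4: engagement roughly tracks visibility (more reach, more clicks),
        # with the same underlying bias still applied.
        engagement["male"] = visibility["male"] * initial_bias
        engagement["female"] = visibility["female"]
        ratios.append(visibility["male"] / visibility["female"])
    return ratios

ratios = simulate_feedback_loop()
```

Running this shows the visibility ratio rising every round: a modest 10% starting gap compounds into a much larger one, which is the amplification dynamic described above.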


The Real Cost to Women's Careers

While LinkedIn investigates what they already know, women pay the price in tangible career impacts. Survey data from Lean In shows that professional women report:

  • 45% believe social media bias has affected their career progression

  • 67% spend extra time crafting posts that might break through algorithmic suppression

  • 34% have considered changing their online presentation to appear more "algorithm-friendly"

  • 78% report feeling their expertise is undervalued online

This isn't about hurt feelings. It's about systematic economic discrimination.


What Women Are Doing About It

Women aren't waiting for LinkedIn to fix itself. They're documenting discrimination, building evidence, and forcing transparency through collective action. The LinkedIn gender-test hashtags have generated over 10,000 posts sharing experimental results.

Some are taking more dramatic action:

  • Creating male alter ego profiles for professional content

  • Building alternative networks on platforms with transparent algorithms

  • Documenting everything for potential legal action

  • Sharing strategies to game the biased system


The Technical Fix That Won't Happen Voluntarily

Addressing algorithmic bias isn't technically challenging. Google's research on fair AI shows proven methods:

  • Regular bias audits with published results

  • Corrective weights to ensure equal visibility

  • Transparent ranking factors

  • User control over algorithm preferences

  • Independent oversight of content distribution

LinkedIn could implement these tomorrow. They won't, because the current system maximises engagement from their most active demographic - and that's what advertisers pay for.
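One of the fixes listed above, corrective weights, can be sketched in a few lines. This is a minimal illustration under my own assumptions (made-up scores, equalising group mean scores), not LinkedIn's method or Google's published technique: rescale each group's ranking scores so that the groups have equal average exposure while the ordering within each group is preserved.

```python
from collections import defaultdict

def equalize_exposure(posts):
    """Rescale ranking scores so every group has the same mean score.

    posts: list of (group, raw_score) tuples.
    Returns a list of (group, adjusted_score) tuples. A toy corrective
    weighting, not a production fairness system.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for group, score in posts:
        totals[group] += score
        counts[group] += 1
    group_mean = {g: totals[g] / counts[g] for g in totals}
    overall_mean = sum(totals.values()) / len(posts)
    # Dividing by the group mean and multiplying by the overall mean makes
    # every group's average equal while keeping within-group ordering intact.
    return [(g, s * overall_mean / group_mean[g]) for g, s in posts]

# Hypothetical raw scores echoing the reported 2.3x visibility gap.
posts = [("male", 2.3), ("male", 2.1), ("female", 1.0), ("female", 1.2)]
adjusted = equalize_exposure(posts)
```

After the correction, both groups average the same score, so neither is structurally favoured in ranking - a bias audit would simply publish these group means before and after.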


The Regulatory Storm Coming

The EU's Digital Services Act requires platforms to assess and mitigate algorithmic bias. The UK's Online Safety Bill includes similar provisions. California's SB 1001 mandates algorithm transparency.

LinkedIn faces a choice: fix the bias voluntarily or have solutions imposed through regulation. Based on Silicon Valley's track record, regulation seems more likely.


The Opportunity for Competitors

LinkedIn's discrimination creates a massive market opportunity. Venture capital firm Sequoia estimates the professional networking market at £22 billion annually.

A platform that demonstrably doesn't discriminate could capture significant market share from the 51% of the workforce being systematically suppressed. The first professional network to guarantee algorithmic fairness wins.


What Happens Next

The current wave of experiments forces LinkedIn's hand. They'll likely:

  1. Make minor algorithmic adjustments

  2. Publish a blog about their "commitment to diversity"

  3. Introduce some cosmetic features to appear responsive

  4. Hope the controversy dies down

  5. Continue discriminating at a slightly reduced level


Real change will require sustained pressure, regulatory intervention, or a competitor that makes fairness a competitive advantage. Until LinkedIn's black box becomes transparent, women will continue paying the price for Silicon Valley's systematic biases.

Want unbiased insights into technology's impact on project delivery? Subscribe to Project Flux for analysis that doesn't discriminate.

