LinkedIn conducted social experiments with 20 million users over five years

LinkedIn conducted experiments with more than 20 million users over five years that, while intended to improve how the platform works for members, could have affected some people’s livelihoods, according to a new study.

In experiments conducted around the world between 2015 and 2019, LinkedIn randomly varied the ratio of weak to strong connections suggested by its “People You May Know” algorithm, the company’s automated system for recommending new connections to its users. Researchers from LinkedIn, MIT, Stanford, and Harvard Business School then analyzed aggregate data from the tests in a study published this month in the journal Science.

LinkedIn’s algorithmic experiments may surprise millions of people because the company did not inform users that the tests were underway.

Tech giants like LinkedIn, the world’s largest professional network, routinely run large-scale experiments in which they test different versions of app features, web designs, and algorithms on different people. The long-standing practice, called A/B testing, aims to improve consumer experiences and keep users engaged, which helps companies make money through premium membership fees or advertising. Users often have no idea that companies are running the tests on them. (The New York Times uses such tests to evaluate headline wording and to make decisions about the products and features the company launches.)
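The mechanics of A/B testing described above can be pictured with a short, hypothetical sketch (the function name and variant labels are invented for illustration, not any company’s actual code): each user is assigned deterministically to a variant, typically by hashing a user id together with an experiment name, so the same person always sees the same version.

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing the experiment name together with the user id gives a
    stable, roughly uniform assignment without storing any state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same bucket for a given experiment.
bucket = ab_bucket("user-42", "headline-test")
```

Because assignment is a pure function of the inputs, analysts can later reconstruct who was in which group from ids alone.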

But the changes made by LinkedIn are indicative of how such tweaks to widely used algorithms can turn into social engineering experiments with potentially life-altering consequences. Experts who study the social impacts of computing said conducting large-scale, lengthy experiments on people that could affect their job prospects, in ways that are invisible to them, raised questions about industry transparency and research oversight.

“The findings suggest that some users had better access to job opportunities, or a significant difference in access to job opportunities,” said Michael Zimmer, an associate professor of computer science and director of the Center for Data, Ethics and Society at Marquette University. “These are the kinds of long-term consequences that need to be considered when we think about the ethics of engaging in this kind of big data research.”

The study in Science tested an influential theory in sociology called “the strength of weak ties,” which argues that people are more likely to obtain employment and other opportunities through distant acquaintances (“weak ties”) than through close friends (“strong ties”).

The researchers analyzed how LinkedIn’s algorithmic changes had affected users’ job mobility. They found that relatively weak social ties on LinkedIn proved to be twice as effective in securing employment as stronger social ties.

In a statement, LinkedIn said that during the study it had “acted consistently with” the company’s user agreement, privacy policy, and member settings. The privacy policy notes that LinkedIn uses members’ personal data for research purposes. The statement added that the company used the latest “non-invasive” social science techniques to answer important research questions “without any experimentation on members.”

LinkedIn, which is owned by Microsoft, did not respond directly to a question about how the company had considered the possible long-term consequences of its experiments on users’ employment and economic circumstances. But the company said the research had not disproportionately benefited some users.

The goal of the research was “to help people at scale,” said Karthik Rajkumar, a LinkedIn applied research scientist who was one of the study’s co-authors. “No one was put at a disadvantage in finding a job.”

Sinan Aral, a professor of management and data science at MIT and lead author of the study, said the LinkedIn experiments were an effort to ensure users had equal access to job opportunities.

“Doing an experiment with 20 million people and then implementing a better algorithm for everyone’s job prospects as a result of the knowledge you learn from it, that’s what they’re trying to do,” Professor Aral said, “instead of anointing some people to be socially mobile and others not.” (Professor Aral has done data analysis for The New York Times and received a research grant from Microsoft in 2010.)

User experiments conducted by large Internet companies have a checkered history. Eight years ago, a Facebook study was published describing how the social network had quietly manipulated the posts appearing in users’ news feeds to analyze the spread of negative and positive emotions on its platform. The week-long experiment, conducted on 689,003 users, quickly generated a backlash.

The Facebook study, whose authors included a company researcher and a Cornell professor, argued that people had given implicit consent to the emotion manipulation experiment when they signed up for Facebook. “All users agree before creating an account on Facebook,” the study said, “which constitutes informed consent for this research.”

Critics disagreed, with some criticizing Facebook for invading people’s privacy while exploiting their moods and causing them emotional distress. Others argued that the project had used an academic co-author to lend credibility to problematic business research practices.

Cornell later said that its internal ethics board was not required to review the project because Facebook had independently conducted the study and the professor, who had helped design the research, had not directly engaged in experiments with human subjects.

LinkedIn’s professional networking experiments were different in intent, scope, and scale. They were designed by LinkedIn as part of the company’s ongoing efforts to improve the relevance of its “People You May Know” algorithm, which suggests new connections to members.

The algorithm analyzes data such as members’ employment history, job titles, and links to other users. It then tries to estimate the probability that a LinkedIn member will send a friend invitation to a suggested new connection, as well as the probability that the new connection will accept the invitation.
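As a rough illustration of the kind of signals such a ranking system might combine (a toy sketch with invented field names and weights, not LinkedIn’s actual model), one could score a candidate connection from shared employers, mutual connections, and title overlap:

```python
from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    title: str
    company_history: set = field(default_factory=set)
    connections: set = field(default_factory=set)

def invite_score(a: Member, b: Member) -> float:
    """Toy relevance score for suggesting b as a connection for a."""
    shared_jobs = len(a.company_history & b.company_history)
    mutuals = len(a.connections & b.connections)
    same_title = 1.0 if a.title == b.title else 0.0
    # The weights here are arbitrary placeholders; a real system would
    # learn them from historical invitation and acceptance data.
    return 0.5 * shared_jobs + 0.3 * mutuals + 0.2 * same_title
```

Candidates would then be ranked by this score before being shown to the member.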

For the experiments, LinkedIn adjusted its algorithm to randomly vary the prevalence of the strong and weak ties the system recommended. The first wave of testing, conducted in 2015, “had more than four million experimental subjects,” the study reported. The second wave, conducted in 2019, involved more than 16 million people.

During the tests, people who clicked on the “People You May Know” tool and looked at the recommendations were assigned to different algorithmic paths. Some of those “treatment variants,” as the study called them, caused LinkedIn users to form more connections with people with whom they had only weak social ties. Other tweaks made people form fewer connections with weak ties.
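One way to picture such a “treatment variant” (a hypothetical sketch; the variant names and ranking rule are invented for illustration) is a recommender that re-ranks the same candidate pool to favor either weak or strong ties, using mutual-connection counts as a proxy for tie strength:

```python
def recommend(candidates, variant, k=5):
    """Return the top-k candidate names, biased by experimental variant.

    candidates: list of (name, mutual_connection_count) pairs.
    variant:    "more_weak" surfaces ties with few mutual connections;
                "more_strong" surfaces ties with many.
    """
    ranked = sorted(candidates, key=lambda c: c[1],
                    reverse=(variant == "more_strong"))
    return [name for name, _ in ranked[:k]]

candidates = [("Ana", 2), ("Ben", 25), ("Cal", 10)]
recommend(candidates, "more_weak", k=2)  # → ["Ana", "Cal"]
```

Holding the candidate pool fixed and varying only the ranking is what lets researchers attribute differences in outcomes to tie strength rather than to who was available to connect with.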

It is unclear whether most LinkedIn members understand that they could be subject to experiments that may affect their job opportunities.

LinkedIn’s privacy policy says the company may “use personal data available to us” to research “workplace trends, such as the availability of jobs and the skills needed for these jobs.” Its policy for outside researchers seeking to analyze company data clearly states that those researchers will not be able to “experiment or test our members.”

But neither policy explicitly informs consumers that LinkedIn may experiment on or test its members.

In a statement, LinkedIn said: “We are transparent with our members through the research section of our user agreement.”

In an editorial statement, Science said: “It was our understanding, and that of the reviewers, that the experiments conducted by LinkedIn operated under the guidelines of their user agreements.”

After the first wave of algorithmic testing, researchers at LinkedIn and MIT came up with the idea of analyzing the results of those experiments to test the strength of weak ties theory. Although the decades-old theory had become a cornerstone of the social sciences, it had not been rigorously tested in a large-scale prospective trial that randomly assigned people to social connections of different strengths.

External researchers analyzed aggregate data from LinkedIn. The study reported that people who received more recommendations for moderately weak contacts generally applied for and accepted more jobs, results that fit with the weak ties theory.

In fact, relatively weak connections (people with whom LinkedIn members shared just 10 mutual connections) proved to be much more productive for job searches than stronger connections with whom members shared more than 20 mutual connections, according to the study.

One year after connecting on LinkedIn, people who had received more recommendations for moderately weak connections were twice as likely to land jobs at the companies where those acquaintances worked, compared with users who had received more recommendations for strong ties.

“We found that these moderately weak ties are the best option for helping people find new jobs, much more so than the strongest ties,” said Mr. Rajkumar, the LinkedIn researcher.

The 20 million users involved in the LinkedIn experiments created more than 2 billion new social connections and completed more than 70 million job applications that led to 600,000 new jobs, the study reported. Weak connections proved more helpful for job seekers in digital fields like artificial intelligence, while strong ties proved more helpful for jobs in industries that relied less on software, according to the study.

LinkedIn said it had applied the weak-tie findings to several features, including a new tool that notifies members when a first- or second-degree connection is hiring. But the company has not made any study-related changes to its “People You May Know” feature.

MIT Professor Aral said the deeper meaning of the study was that it showed the importance of powerful social media algorithms, not only in amplifying problems like misinformation, but also as critical indicators of economic conditions like employment and unemployment.

Catherine Flick, a senior fellow in computing and social responsibility at De Montfort University in Leicester, England, described the study as an exercise in corporate marketing.

“The study has an inherent bias,” said Dr. Flick. “It shows that if you want to get more jobs, you should be on LinkedIn more.”
