Science/Tech

Title: Facebook manipulates users' minds in unethical experimental programming
Source: Slate
URL Source: http://www.slate.com/articles/healt ... r_or_sadder_to_manipulate.html
Published: Jun 29, 2014
Author: Katy Waldman
Post Date: 2014-06-29 16:34:08 by titorite
Keywords: None
Views: 200
Comments: 1

Facebook has been experimenting on us. A new paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to study “emotional contagion through social networks.”

The researchers, who are affiliated with Facebook, Cornell, and the University of California–San Francisco, tested whether reducing the number of positive messages people saw made those people less likely to post positive content themselves. The same went for negative messages: Would scrubbing posts with sad or angry words from someone’s Facebook feed make that person write fewer gloomy updates?

They tweaked the algorithm by which Facebook sweeps posts into members’ news feeds, using a program to analyze whether any given textual snippet contained positive or negative words. Some people were fed primarily neutral to happy information from their friends; others, primarily neutral to sad. Then everyone’s subsequent posts were evaluated for affective meanings.
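
For the curious, the mechanics the paper describes are simple: a post counts as positive or negative if it contains at least one word from a sentiment word list (the published study reportedly used the LIWC dictionaries), and posts matching the targeted emotion are then omitted from the feed at some rate. Below is a minimal Python sketch of that approach; the word lists, function names, and omission rate are hypothetical stand-ins for illustration, not the study's actual values.

    import random

    # Tiny stand-in word lists; the real study reportedly used the much
    # larger LIWC dictionaries. These values are hypothetical.
    POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
    NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible", "lonely"}

    def classify(post):
        """Label a post by whether it contains at least one positive
        or negative word, mirroring the per-post test the paper describes."""
        words = set(post.lower().split())
        has_pos = bool(words & POSITIVE_WORDS)
        has_neg = bool(words & NEGATIVE_WORDS)
        if has_pos and not has_neg:
            return "positive"
        if has_neg and not has_pos:
            return "negative"
        return "neutral"

    def filter_feed(posts, suppress, omit_prob=0.5):
        """Randomly drop posts matching the suppressed emotion; the
        omission probability here is a made-up default, not the study's."""
        return [p for p in posts
                if classify(p) != suppress or random.random() >= omit_prob]

    feed = ["Feeling great today!", "So lonely and sad.", "Meeting at 3pm."]
    print(filter_feed(feed, suppress="negative"))

Run repeatedly, the filter drops the gloomy post about half the time while the neutral and happy posts always get through, which is the asymmetry the researchers then measured in users' own subsequent posts.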

The upshot? Yes, verily, social networks can propagate positive and negative feelings!

The other upshot: Facebook intentionally made thousands upon thousands of people sad.

Facebook’s methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” says James Grimmelmann, a professor of technology and the law at the University of Maryland. “This is the kind of thing that would require informed consent.”

Ah, informed consent. Here is the only mention of “informed consent” in the paper: The research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

That is not how most social scientists define informed consent.

Here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you ... for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

So there is a vague mention of “research” in the fine print that one agrees to by signing up for Facebook. As bioethicist Arthur Caplan told me, however, it is worth asking whether this lawyerly disclosure is really sufficient to warn people that “their Facebook accounts may be fair game for every social scientist on the planet.”

Any scientific investigation that receives federal funding must follow the Common Rule for human subjects, which defines informed consent as involving, among other things, “a description of any foreseeable risks or discomforts to the subject.” As Grimmelmann observes, nothing in the data use policy suggests that Facebook reserves the right to seriously bum you out by cutting all that is positive and beautiful from your news feed. Emotional manipulation is a serious matter, and the barriers to experimental approval are typically high. (Princeton psychologist Susan T. Fiske, who edited the study for PNAS, told The Atlantic that this experiment was approved by the local institutional review board. But even she admitted to serious qualms about the study.)

Facebook presumably receives no federal funding for such research, so the investigation might be exempt from the Common Rule. Putting aside the fact that obeying these regulations is common practice even for private research firms such as Gallup and Pew, the question then becomes: Did Cornell or the University of California–San Francisco help finance the study? As public institutions, both fall under the law’s purview. If they didn’t chip in but their researchers participated nonetheless, it is unclear what standards the experiment would legally have to meet, according to Caplan. (I reached out to the study authors, their universities, and Facebook, and will update this story if they reply.)

Even if the study is legal, it appears to flout the ethical standards spelled out in instructions to scientists who wish to publish in PNAS. “Authors must include in the Methods section a brief statement identifying the institutional and/or licensing committee approving the experiments,” reads one requirement on the journal’s website. (The study did not.) “All experiments must have been conducted according to the principles expressed in the Declaration of Helsinki,” reads another. The Helsinki standard mandates that human subjects “be adequately informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher, the anticipated benefits and potential risks of the study and the discomfort it may entail.”

Over the course of the study, it appears, the social network made some of us happier or sadder than we would otherwise have been. Now it’s made all of us more mistrustful.




#1. To: titorite (#0)

Check the end-user agreement.

Deasy  posted on  2014-06-29   16:40:14 ET

