Facebook evil -- not even hiding it anymore

Started by Darren Dirt, April 04, 2012, 09:25:53 AM


Darren Dirt

http://techcrunch.com/2012/04/03/facebook-threatens-to-sue-techcrunch-commenter

cliffs: somebody made a Chrome extension that pissed off Facebook, and merely for taking part in a DISCUSSION about the extension somebody gets THREATENED WITH A LAWSUIT by Facebook.



yeah.
_____________________

Strive for progress. Not perfection.
_____________________

Darren Dirt

Reason #142 why I don't have Facebook...

Large-scale emotional-manipulation lab-rat experiments without user consent or even awareness! #andProudOfItApparently


(quote)
http://www.newstatesman.com/internet/2014/06/facebook-can-manipulate-your-mood-it-can-affect-whether-you-vote-when-do-we-start

Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion.

Dr Adam Kramer, the man behind the study and a longtime member of the company's research team, commented in an excited Q&A that "Facebook data constitutes the largest field study in the history of the world." The ethics of this situation have yet to be unpacked.

I put a number of questions to Dr Kramer, over the course of three weeks of phone calls, emails and direct messages. I was repeatedly told that Adam Kramer was too busy to talk to me and would remain too busy for the foreseeable future, although Dr Kramer himself told me that he couldn't speak to me without the say-so of the press team. Facebook were unavailable for comment.

Facebook went to some lengths to be as unavailable as possible for comment without directly telling me where to shove my inquiry. Facebook were unavailable for comment in the way that a man who, on hearing the doorbell, runs out of the back door and over the garden wall is not at home to visitors.

I asked if it was possible for users to find out if their own newsfeeds had been altered. No answer.

I asked if it was possible for users to opt out of any further such studies. No answer, but if I'd got one, I suspect it would have been "no": all users agree implicitly to be experimented upon when they sign up for the service.

I asked if anyone had bothered to check up on all the people in whom negative emotions were apparently induced. No answer.

The one thing Facebook?s representatives would tell me is that yes, they had indeed carried out the study and yes, they had been looking into the effects of emotional contagion for some time...

The ethics of altering people's experience of the world on this scale, without their consent, for the purposes of research, do not appear to trouble the Facebook research team.

Emotional engineering is, and always has been, Facebook's business model.

Writing at Medium, Zeynep Tufekci identifies this as "a Gramscian model of social control: one in which we are effectively micro-nudged into 'desired behaviour' as a means of societal control". That sort of control was never possible on this scale before.

The studies, taken individually, are creepy quasi-consensual experiments on individuals' most intimate feelings and most important choices. Taken together, they form a terrifying pattern.

Facebook's service is not free. Facebook's product is your information, your worldview, your memories and experience, and that is what you pay with every time you log in. That information is power of a quality that can be traded upon and sold.

More people live a part of their lives on Facebook than live in any single country on earth, apart from China. It is, effectively, a country itself, a country of pure information where the authorities know everything you do and can change everything you see, without even telling you first.

The simple answer would, of course, be that if you don't want to be spied on, emotionally manipulated and studied, quit Facebook.

(/quote)

_____________________

Strive for progress. Not perfection.
_____________________

Mr. Analog

How is it spying if you willingly give out the information?
By Grabthar's Hammer

Darren Dirt

http://blogs.wsj.com/digits/2014/10/02/facebook-changes-guidelines-on-user-experiments/

"Mike Schroepfer, chief technology officer at Facebook, said in a blog post that the public backlash over the emotions experiment caught Facebook off guard..."

R U @%&#ING KIDDING ME?

And these are the moral gatekeepers of your personal data, people? #AnotherReason

_____________________

Strive for progress. Not perfection.
_____________________

Mr. Analog

What personal data?

The "personal" data you willingly shared with a 3rd party corporation for the sole purpose of sharing it with people?
By Grabthar's Hammer

Darren Dirt

Quote from: Mr. Analog on October 03, 2014, 01:50:48 PM
What personal data?

The "personal" data you willingly shared with a 3rd party corporation for the sole purpose of sharing it with people?

Sorry, to clarify: the link I just posted is Facebook's recent response to what I posted about a while ago -- where they chose a subset of FB users and modified their feeds in order to see what impact that would have on their activities on FB. It's about non-consensual manipulation of the users themselves, not about using the data of the users in whatever profit-based ways they want. And my post today is highlighting the whole "we did not expect the backlash" tone. These are the people guiding the FB ship.
_____________________

Strive for progress. Not perfection.
_____________________

Thorin

No, they're not the moral gatekeepers.  They have never claimed they have any morals.  They have claimed that they can make money by giving away free access to their website, which means they need to find another way to make that money.
Prayin' for a 20!

gcc thorin.c -pedantic -o Thorin
compile successful

Mr. Analog

The real story here is that people think that content facilitators like Facebook have moral obligations.
By Grabthar's Hammer

Tom

You'd think people should have moral obligations.
<Zapata Prime> I smell Stanley... And he smells good!!!

Mr. Analog

Quote from: Tom on October 03, 2014, 02:17:43 PM
You'd think people should have moral obligations.

Mandating morality requires defining it

Good luck with that one! :)
By Grabthar's Hammer

Tom

"Do unto others"? Not always good enough... but usually?
<Zapata Prime> I smell Stanley... And he smells good!!!

Darren Dirt

Quote from: Mr. Analog on October 03, 2014, 02:21:03 PM
Quote from: Tom on October 03, 2014, 02:17:43 PM
You'd think people should have moral obligations.

Mandating morality requires defining it

Good luck with that one! :)

Defining, mandating... ARGH!

Let me try again. What I actually meant was to focus on this:

The recent shareholder-friendly public response was (in my opinion) another clear demonstration of the morals (or slight lack thereof) guiding the folks who are responsible for maintaining the security of FB users' private/personal data ... they were surprised by a backlash in response to a nonconsensual manipulation of the user base -- so whatever sociopathic block caused that blind spot makes you wonder what else they will be morally fine with in the future that most people would consider obviously immoral. That's all I meant.


Damn often-unclear English language! (especially at the end of an exhausting week)

For some reason reminds me about this I recently saw:
http://www.frivolity.com/teatime/Songs_and_Poems/english_is_tough_stuff.html
and also this: https://duckduckgo.com/?q=eye+have+a+spelling+checker
_____________________

Strive for progress. Not perfection.
_____________________

Mr. Analog

Quote from: Tom on October 03, 2014, 02:45:56 PM
"Do unto others"? Not always good enough... but usually?

Well, to put it specifically, here's an example:

Some people consider dancing to be morally objectionable; is it Facebook's job to censor images, video, text, etc. of that activity?

If it is not covered by law or by their own moral ideals then they will not interfere with content other people share because it is costly and contentious.

So back on point (sorry Darren!): the point of contention was that Facebook decided to filter content for some users and measure their statuses to see if the filtering had an effect. I can perfectly understand how someone would feel about having their emotions tinkered with, y'know, for "science", especially if they are a heavy user of Facebook.

To a degree I can understand Facebook's motivation: they want a happy whitewashed community that presents a fun and happy world that promotes nothing but Fun Timez for all. (In fact, one of the reasons I don't usually go on FB very much is that most people on there are complaining about something or posting generally negative pop-sci clickbait article garbage, but that's beside the point.)

Complaining to Facebook about what they did with users' content feeds is an appropriate response; leaving Facebook because of it is even more appropriate.

Implying that Facebook collects private information or has a moral obligation to the public good is factually wrong.

The level of actual surprise at being "caught off guard" by their users' reaction is debatable; having seen how some managers think in large organizations, I kind of think that yes, someone could be that cartoonishly thick.

Is Facebook Evil? Yes, but only for certain values of evil

(The same way 1+1=3 for large values of 1)
By Grabthar's Hammer

Tom

You know though, usually you need to get explicit permission from someone to perform experiments on them :o
<Zapata Prime> I smell Stanley... And he smells good!!!