Women like porn, but Facebook doesn’t like women liking porn

Women like porn too, it turns out. No, seriously. Some really do.

Facebook, however, apparently doesn’t like that fact.

On July 27, 2010, Facebook removed the Our Porn, Ourselves Facebook campaign page. After the page was removed, the anti-porn organization Porn Harms claimed victory and thanked Facebook for the deletion on its own Facebook page and Twitter feed. Our deleted group had roughly 3,500 members, most of whom were women (I combed through the member logs frequently), and it had over three times as many members as Porn Harms’ anti-porn page.

According to Facebook, the deletion was in response to reported violations of Facebook’s Terms of Service, which include obscenity among the prohibited content. As an active and high-profile figure in online social media, I am no newcomer to social media or to implementing a Terms of Service. I also knew that someone was persistently trying to get every piece of art removed from our gallery: regardless of the content, nearly every user-uploaded photo was mysteriously being flagged and removed.

Wanna place bets that Facebook’s censorship (and yes, deleting something as obscene when it contains absolutely no obscenity is very likely a specious excuse, and therefore outright censorship outside the bounds of the site’s own terms and conditions of use) was done entirely at the behest of the anti-porn folks whose proverbial lunch was being eaten, judging by the member counts of the two groups?

Hat tip to @antiheroine (Skepchick Jen) for tweeting about this — yet another example of someone cheating at the rules of the intertubes to get their way when reality contraindicates their favored positions.

Heh. Positions.


8 thoughts on “Women like porn, but Facebook doesn’t like women liking porn”

  1.

    Honestly? I could flag my brother’s photo of everyone at his wife’s 33rd as inappropriate or obscene, and FB would take it down even though there isn’t anything inappropriate or obscene about it – it’s just a bunch of people standing around in a room. Hell, you could flag a photo of someone’s backyard obscene and it would be removed.

    It is my humble opinion that FB’s admins don’t even look at half the stuff they delete because it was flagged. Given the number of complaints they get, I’m not surprised – do you really think you can pay anyone to read through hundreds of pages in a day and then judge whether they really violate TOS or not?

    FB needs to hire some more admins. (I’ll do it! I need the money! I like sitting on FB all day! I really have nothing better to do! I would like to be paid to sit on FB all day! Pleeeeaaaaase?)

  2.

    That’s the whole problem though — they can’t look at every complaint, but they also won’t delete something just because one person flagged it as offensive. The anti-porn people are therefore using the fact that there’s a small army of them (smaller, mind you, than the members of the pro-porn group, but still), and flagging stuff as obscene spuriously, knowing full well that they’re gaming the system.

    Because it’s not that any of the stuff is actually obscene, it’s just that they’re personally offended because of their own sexual mores, or their religious beliefs, or what have you. They’re making their voice count for more than anyone else’s. That’s cheating, far as I’m concerned. And yeah, FB having more admins would obviate the problem, but the only real way to put a stop to the cheating is to fix the process by which people can flag something as obscene — possibly by defining obscenity very strictly, and imposing a penalty for falsely marking things as obscene.

  3.

    Speaking as someone who deals with Facebook as a company pretty regularly, Dartigen has it right. Zuckerberg has been pretty proud of their “Zero Tech Support Due To Quality of Product” mindset, but sometimes that means administration and not just problem solvers. Those poor bastards who actually remove flagged images must be crazy overworked, and they CERTAINLY don’t want to get into the process of judging what is and isn’t obscene or amoral. Numbers are all they have. I don’t even think more admins would solve it unless Facebook changes their stance on being neutral in these things, which would be a big step. They’re not Google, and they don’t even pretend to have “don’t be evil” ideals that they stick by even in the face of consumer pressure.

    This is why democracy and capitalism work so terribly well together, and it should serve as a reminder: the internet works on money like everything else. The best thing the offended parties can do isn’t to try to get Facebook to spend more money on admins to defend their ideals. It’s to get off Facebook, send a letter (or better yet, try to get a large press release out) letting them know that their policies, or lack thereof, have made them decide not to use the product, and give Facebook as much bad press as possible over it. Vote with the wallet and stick to it. Fighting the whack jobs on the platform itself isn’t going to help; it would be much better to fight them outside of Facebook, where control over what is being said doesn’t cater to the loudest organized group.

  4.

    You know me, dude. And you know I absolutely hate to cede ground to idiots who can outscream you. That’s not to say it isn’t a Sisyphean task to combat this “cheating”, especially when one cannot expect the people running the platform to take sides. But as with the right-wingers gaming Digg, one might perhaps expect a technological solution to be put into place (e.g., removing the “bury” / “report obscene” mechanism). Or putting anything marked as potentially obscene behind a click-through, age-verification wall.

    Or perhaps, as I’ve suggested, a simple definition of the word “obscene”. It’s too vague to be meaningful, just as it is in the obscenity laws put into place in democracies like those we find in North America. And it’s too easily abused by people with tender sensibilities, like the same right-wingers and religious nuts that put those rules into place. Defining “obscene” would at least mean someone couldn’t say “it disagrees with me, therefore it’s obscene”. At least not in good conscience. It may not fix the problem, but it would make the cheating all the more obvious to outside observers, which might make the victims feel a little better.

  5.

    I know my suggestions certainly won’t make people feel better, and worse, they would make the people responsible think they had a “victory” (“We drove them off Facebook, yaaay!”), but eventually this comes down to which party is willing to take more drastic action, and that is something fundies tend to have us beat on. Drastic actions to change Facebook policy are going to be a much riskier struggle than publicizing, off of the platform that caused the problem, the stupid shit the fundies are doing. Violet has the right plan as the founder of the group, mentioning all of her places of positive publicity for her cause and framing this as a moral issue. Bad publicity outside their own product will do them more harm than internal complaints. Facebook should know that people are going to hear about this.

    But I still think Facebook picked a side, and that means they shouldn’t continue to get the business of the people they chose to side against. Otherwise Facebook wins and, at the end of the day, they are the ones who caved. Fundies are always going to pull this crap; it’s the businesses that cater to them that are the problem. Being on Facebook is not a fundamental right, it’s a personal business decision, and when you fight it, you’re always fighting with a company acting in what it defines as its own best interests. You have to prove to them otherwise, as that was the basis of their decision, not a moral one.

    That being said, using this issue to create a “Change Facebook’s Often-Abused Obscenity Reporting Policy” group might not be a bad idea, if this is a torch Violet’s community wishes to take up.
