The NSFW Filter: Maybe Ello Won’t Force Me To See Porn

November 7, 2014

Why The NSFW Filter Isn’t Good Enough On Ello and Other Social Networks

There’s a new social network to join and it’s been getting a lot of buzz. It’s called Ello.

Ello is the brainchild of a group of designers and artists. Its slogan is “Simple, beautiful and ad-free.” The designers wanted to create a social network that is as visually appealing as it is simple to use. It’s open to everyone, but its layout is designed especially for artists to share images of their work.

The network is still in Beta testing, but it has been getting a lot of press. It’s different from Facebook, but is nonetheless seen as a competitor to the giant social network we all hate so much yet still feel compelled to use.

Much of the chatter and speculation around Ello has been about their privacy policies and how they’re going to generate money for their investors. They’ve promised to never have ads or sell user data.

A few weeks ago they made further strides toward making good on their promises by announcing that Ello has become a Public Benefit Corporation (PBC). As a Public Benefit Corporation, they “exist to produce a benefit for society as a whole” – not just to make a profit (as they stated in a recent newsletter).

This all sounds very promising. But I must admit that I was particularly excited about Ello’s content / censorship policies. They stated that “nudity or sexually explicit content” would not be censored; they would just require that the profile or content be marked “NSFW” (not safe for work).

This is a big step up from other big social networks, like Facebook, Google+, Instagram and YouTube. These more established social networks prohibit nudity and sexual content through totally vague, meaningless and, dare I say, sexist policies. They can’t effectively police their own networks, so they rely on users to report offending content, which results in totally random censorship (that, at times, even contradicts their own terms and agreements). People are frustrated that they can’t predict what will be censored any more than they can predict winning lottery numbers.

As a prime example that I’ve mentioned before – Facebook’s policy specifically states that they permit breastfeeding photos, but they continue taking down users’ breastfeeding photos.

Although I think it’s awesome that Ello won’t practice random censorship, I still find their NSFW policy to be problematic.

Below is my email to Ello in which I expressed my concerns:

NSFW used to be about what’s “safe for work,” but now it’s used as a default to hide whatever a user might not want to see. And it’s assumed a user may not want to see nudity or sexual content, and needs a single filter to turn off both.

Well, that simply isn’t reality. Many people are fine with seeing simple or artistic nudity, but don’t want to see hardcore porn.

To include both within the “adult” filter is really not helpful to the user, nor is it fair to say all nudity is “adult.” Nudity isn’t always sexual, and sexual content doesn’t always include nudity.

We can all recognize that there is a world of difference between an artistic nude photograph (like the work of Spencer Tunick) and an image of hardcore pornography.

The Ello filter, however, leaves me no choice but to take the risk of seeing both.

So shouldn’t there be a setting that filters out porn, but lets through simple / artistic / nonsexual nudity? A filter that lets me see my friend’s breastfeeding photos, but hides most content that looks like it came from a magazine on the adult rack?

I know there can be a fine line between “sexual” and “nonsexual” when it comes to photos. But just like you are leaving it up to the user to decide what is “sexually explicit,” maybe you can also let them decide whether their nudes are “sexually explicit” or not.

A user could click a button to allow nudity, knowing that they still may see some “sexual” stuff. But the sexually explicit filter would at least already cut out a lot of it.
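To make that suggestion a little more concrete, here’s a rough sketch of how two separate labels and two separate settings might interact. This is purely hypothetical (the names and structure are mine, not anything Ello has actually built), just to illustrate letting nudity through while still filtering out sexually explicit content:

```python
# Hypothetical sketch of a two-level content label / filter -- not anything
# Ello has actually built. Posters label their own content; readers choose
# what they are willing to see.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    contains_nudity: bool = False         # simple / artistic / nonsexual nudity
    sexually_explicit: bool = False       # porn or other explicit sexual content

@dataclass
class ViewerSettings:
    show_nudity: bool = False             # opt in to nonsexual nude content
    show_sexually_explicit: bool = False  # separate opt-in for explicit content

def visible(post: Post, settings: ViewerSettings) -> bool:
    """Return True if this viewer's settings allow the post to be shown."""
    if post.sexually_explicit and not settings.show_sexually_explicit:
        return False
    if post.contains_nudity and not settings.show_nudity:
        return False
    return True

# A viewer who allows nudity but not explicit content:
viewer = ViewerSettings(show_nudity=True)
breastfeeding = Post("friend", "breastfeeding photo", contains_nudity=True)
adult_rack = Post("someone", "adult-rack material", contains_nudity=True, sexually_explicit=True)
print(visible(breastfeeding, viewer))  # True
print(visible(adult_rack, viewer))     # False
```

Two checkboxes instead of one: that’s really all I’m asking for.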


Naturally, there are other aspects to consider when creating a content filter, and I’m sure their policy is still being developed. But other questions remain:

  • What is nudity exactly? Is a nude statue or painting still NSFW? Will Ello be sexist like other networks and say that nudity includes female breasts, but not male breasts?
  • What about graphic violence? Or images of medical procedures?

Next, let’s take a look at how other websites and social networks filter content. Are there any well-developed systems out there?

It’s useful to look at how other sites go about filtering content from searches, without actually removing or censoring it:


Google has a SafeSearch filter that decides what content is safe for a user to see, instead of letting users decide.

1 – Google – The Google search engine manages the censoring of content very poorly (no surprise there). Google considers nudity and sexual content to be one and the same. They have a “SafeSearch Filter” (because, you know, looking at images of our body parts is dangerous), and they write: “We do our best to keep the SafeSearch filter as thorough as possible, but sometimes sexual images, like porn or nudity, make it through.”

2 – Twitter also uses a type of filter system and doesn’t remove “potentially offensive content.” It offers the option of automatically hiding “sensitive content” such as “nudity, violence or medical procedures.” Users who share this type of media are supposed to indicate this in their profile settings. That’s as much description as they give for what sensitive content means. It’s strange that they don’t even mention sexual content, but maybe they figure it’s enough to list “nudity.”


When you select the option to not see sensitive media, you start seeing warnings on content that may be considered “sensitive.” This gives you the option to skip over it if you don’t want to see it, or to click a button and view it if you wish. This is a bit simplistic in that there’s no indication of what the content may actually contain. Nudity, porn, etc. all get rolled into one, so the user has to guess what the content may be based on the words / context. What I do like about this approach is that the user can decide what to view on a case-by-case basis.

Currently on Ello, it seems you have to go back into your settings and turn off the NSFW filter in order to see any profile that contains this label. I feel it would be much more useful and effective to follow Twitter’s example and just click a button when you want to view potentially “sensitive” content.

3 – Tumblr – Tumblr also has an NSFW flagging policy for blogs that have “sexual or adult-oriented” content. It makes no mention of nudity, violence or other graphic material at all. What makes Tumblr frustrating is that in 2013, they opted to make their search operate more like Google’s. It automatically filters out NSFW / “Adult” blogs unless you change your search settings. But they remove entire blogs marked NSFW from their searches (not just single posts). I came across one blogger’s letter to Tumblr in which they argue for giving users control over what they see by labeling posts and allowing the user to click a button to view them (much like Twitter).
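What Twitter gets right, and what that blogger was asking Tumblr for, is the per-post, click-to-view approach rather than one account-wide switch buried in settings. Purely as an illustration (the names here are made up; this isn’t how any of these sites are actually built), that approach might look something like this:

```python
# Hypothetical sketch of per-post warnings with click-to-view -- the viewer
# decides case by case instead of flipping one global NSFW switch.

def render(post, revealed_ids):
    """Show the post body, or a warning placeholder the viewer can click through."""
    if post["sensitive"] and post["id"] not in revealed_ids:
        label = post.get("label", "sensitive media")
        return f"[Warning: may contain {label} - click to view]"
    return post["text"]

posts = [
    {"id": 1, "text": "Beach day!", "sensitive": False},
    {"id": 2, "text": "Figure study", "sensitive": True, "label": "nonsexual nudity"},
]

revealed = set()                 # posts this viewer has chosen to reveal
for post in posts:
    print(render(post, revealed))

revealed.add(2)                  # the viewer clicks through on post 2
print(render(posts[1], revealed))
```

Note the label on the warning: saying what kind of sensitive content it is (nudity, porn, violence, medical) would also solve the guessing problem I mentioned above.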

I’m no expert on setting up Internet filters. That said, I feel it is important to discuss these issues, as many (perhaps even most) people don’t even think about them. Of all the concerns being voiced about Ello, the NSFW policy doesn’t seem to be one of them, though I feel it should be!

Today’s social networks (and search engines) can do a lot better when it comes to filtering content. Take it from someone who inadvertently sees a lot of porn while trying to find nonsexual nude images online.

Ello could become a great network for artists, educational organizations and naturists – a much better alternative to Facebook. But it’s still in Beta with much to be developed. We will have to wait and see how it all plays out in the end.

For the record, I did get a response to my letter to Ello. Their response was: “We will be developing more systematic privacy features down the road.”

Since Ello is still growing and being developed, now is a good time to send feedback! Your voices can make a difference – now more than later. I encourage everyone to write to Ello at: hello [at] ello.co and share your ideas, thoughts and concerns about their NSFW policy. Or just leave a comment on this blog as I will be sending them this link and I know they will be reading the comments!

Check out Ello at http://ello.co (I haven’t been active on it, but we are @youngnaturists and @yna)

Young Naturists & Nudists America


Category: Felicity's Nudist Blog, Censorship, Social Activism

About the Author

Author of Felicity's Blog. Co-founder of Young Naturists America. 3rd-generation nudie. Avid reader. Feminist. 70% vegan, 30% vegetarian. When I'm not busy eating, I'm writing about naturism, censorship, topfree equality, body image and other fun topics. I like feedback, so plz leave a comment when you've got something to say!
  • OculusDar

    hdgi27 How do I know I don’t want to?  How do I know if what you SAY is something I MAY not want to see is actually something I don’t want to see?  I really don’t care if YOU want to put a “warning” on your selfies, really I don’t.  I’m going to disregard whatever warning you provide in any case.  I’m just like that.  Insatiable curiosity.  I don’t worry about un-seeing things.  If what I see is distasteful to me I will simply put it out of my mind, or store it in the trash bin of my consciousness.  No biggie.  It rarely happens so my trash bin is pretty much empty.  All this nonsense about people being scarred for life from seeing something that someone else decided is inappropriate for them is, frankly, absurd.  

    I would certainly put a warning sign on the Sun if I thought some kid was going to stare directly into it for any period of time.  I wouldn’t want him to go blind or anything.  Since I can’t put a warning sign on the Sun I guess I will just have to be a responsible parent and tell them about the dangers myself.  Looking at pictures, on the other hand, is hardly going to cause anyone to go blind…or be scarred for life.  Especially if those pictures depict nude people or people engaging in the quite natural and common to every human being act of sex.

  • hdgi27

    I agree with you. I prefer some porn, and reflect that in my photos of myself; however, I would click a button to warn people of the content of my selfies. I don’t want you to see them if you don’t want to.

  • OculusDar

    CampFullMonte OculusDar
    >Is the “flaw” in my thinking that I simply don’t have the “right” to choose what I share with whom or what I accept from others? That still doesn’t sit right with me. <

    It is not enough just to think…one must Reason with some degree of Logic.  Especially in today’s world of the Socialist Collective Agenda and the mindbenders hard at work to destroy any trace of individualism and personal responsibility.  The flaw, one of them as there are many flaws in most people’s thinking these days, is thinking that it’s okay to allow someone else to do your thinking for you.  Consider the cliche, Who will control the controllers?  I paraphrase.  I am not willing to give up control of my life, or my work, to anyone, much less self-appointed police state Nazis and self-righteous Socialist bullies.  I sound a bit bitter, don’t I?  Yup!  VERY!

    I have been an Artist for over 50 years and an advocate for Freedom longer than that.  The situation today is the worst I have ever seen it.  Hysteria runs rampant and Freedom in America is just about finished.  The Neo-cons and the Socialists have joined forces and are waging war against Freedom and personal responsibility [The Neocon-Socialist Alliance].  You may think I’m just being paranoid, but I assure you it is not without REASON.  Laws have been passed that make us ALL felons now.   This is not paranoia, or a delusion.  I’ve seen the laws, I know what they say, I know what they mean.  I know the intentions of the traitors and criminals that made them, and are actively enforcing them.  Madmen and fools.  Not a good combination for the survival of Freedom.  

    The flaw is in thinking that government and PUBLIC corporate [fictitious] entities have the right to tell us what we can see, read, and do.  I can tell you that this kind of foolish thinking will be the end of Freedom and personal choice for everyone.  Everyone except the controllers and masters of the Matrix that is well on its way to complete power over all of us.  

    This foolish notion that we can trust strangers to decide what it is we can see, show, share, read, and do, or rely on their judgments of what is best for us, what is politically correct and acceptable, what THEY think we want to be exposed to, is pure insanity and will not end well for freedom or us.  You can trust me on this.  Or…you can think it through for yourself.  Just look at the facts, the evidence, which is abundantly clear to those with a clear eye and logical mind.  It is all so very obvious.  And it boggles my mind when I see people totally ignoring what is so clearly a conspiracy to turn us all into mindless zombies and slaves to the Matrix elite.  

    You may think me crazy, and you may be right, but I have been observing and studying the Human Condition for almost 7 decades and what I have seen, what I see now, is more than a bit disturbing. It’s scary ugly. And it’s going to get worse. Much worse. I ask only that you keep in mind what I have told you…and continue to THINK. Best to think clearly and logically if you can manage it. Outside the tangled web of deception that now threatens Freedom and Individualism.

    One more thing.  I refuse to be labeled or categorized, and I refuse to allow my Work to be put in some socialist inspired category or censored by self-appointed tyrants and bullies.  If you don’t want to SEE my Art then you will have to make that decision all by your lonesome, on your own time, of your own free will.  The way it was meant to be.

  • CampFullMonte

    OculusDar CampFullMonte OK – I can see where you’re coming from in terms of your position regarding censorship and labeling. So a sincere thanks for taking the time to explain that. I think I also understand the rhetoric that suggests perceptions around censorship are “flawed” and maybe my perception IS flawed but I’m not yet convinced. I completely agree with your comments about personal responsibility being at an all-time low (not just in the US). It is scary. I often describe this trend as legislating natural selection out of the human evolutionary process. Ultimately I want that personal responsibility. In this context, I want the freedom (responsibility) to decide what I want to share with whom. I want the freedom (responsibility) to refuse what others may want to share with me. However, when the sender is anonymous to me or there is an intermediary messenger or transport mechanism, I have to rely on that mechanism to properly flag (censor) on my behalf what I am offering or being offered. For that reason alone, in this context, I guess I want a level of, in your terms, censorship. Which probably brings us back to my thinking being “flawed”. Maybe it is, but I’ll be damned if I can come up with a mechanism that would deliver me the ability to take responsibility in this way that would not be considered censorship in your terms. Is the “flaw” in my thinking that I simply don’t have the “right” to choose what I share with whom or what I accept from others? That still doesn’t sit right with me. In any event – thank you for making me think about it.

  • OculusDar

    CampFullMonte OculusDar Perhaps you misunderstand my position regarding censorship and labeling. Contrary to what you choose to believe, the two are directly related and the results are the same. It is common for people to become confused when attempting to justify a rationalization that is flawed. The premise may APPEAR to be sound to the flawed point of view, but the fact remains it is just another form of censorship. ANY restriction upon the presenter, in effect, censors the presenter and limits access to and by the viewer. For example, FORCING me to label my work according to the narrow-minded categories offered does, in fact and effect, censor me and my work. If I refuse to categorize my work to fit in one of the limited categories, I run the very high risk of being censored, banned, banished and oppressed. My work is suppressed and I have aided and abetted those who would censor me. In other words, I either agree to be labeled and thus, through self-imposed restriction, be kept from presenting my work to all, the entirety of potential viewers, or be denied access outright to my potential audience. The system, in effect, controls my output and outreach to the public at large and limits what I can show and to whom I may show it. This is contrary to the principles of Free Speech and Freedom of the Press. It is not only wrong, it is illegal in America and most civilized countries where Free speech is respected and protected by law. In America those protections no longer exist. In America Freedom is no longer respected. It is the flawed thinking of most Americans that is to blame for this criminal and insane war being waged against Free Speech and the Principles of Freedom and personal responsibility. The Neo-con Socialist Alliance, which is at the core of all things evil in America, has been pushing the Collective Agenda for several generations. The very fact that most Americans refuse to acknowledge or recognize the symptoms and tactics of this very real conspiracy tells me that they, the mindbenders, have been quite successful in indoctrinating the general public and are a very real threat to Freedom and individualism. Personal responsibility is at an all-time low in American society. This is obvious. And it’s scary.

  • I totally support a more detailed classification where a user can choose what they are open to view or not view and to classify their own material posted based on this same classification.  I do not see that as censorship in any way but the right to exercise an individuals right of choice.

    While I agree that Flickr took a step in this direction, I found it still fell short. The problem comes with the details of their policy and the policing of it. Unfortunately, due to the high volume, most systems rely on the reporting of inappropriate content by the users. The users reporting content often have not read the policy and often have no way of clarifying why it is being reported. Generally there is some further form of review before any action is taken, but these reviewers are human and inconsistent. I don’t think any system relying on the subjectivity of a group of superusers will ever work.

    I think you would need to have very clear categories. Just looking at nudity and not some of the other categories mentioned above (sexual content, violence, medical procedures etc.), it could include all nudity, buttocks OK, male breasts, female breasts, female genitalia, male genitalia, transgender etc. Maybe then have a system where, when an image is reported as a violation, the report includes the same classification to clarify where the user saw a violation, and then have the next 5 or 10 users weigh in on whether the image violates the policy the user reported. Based on that, either take an action against the user that posted the item or against users that report images that, in the opinion of a vast majority of users, are not a violation of the policy. In some cases this latter action could be adjusting the settings of a user that continuously looks for these images just to report them.

    In the end everyone’s freedom ends when they start infringing on someone else’s freedom.

  • CampFullMonte

    We stopped using Flickr as a photo sharing site a few years ago when they changed their user interface, but I have to say their content flagging system worked well for us, once you understood it :-) It followed an approach similar to the one I think you are advocating: categorizing the content you upload and share. From memory they had three simple categories: Safe, Moderate and Restricted. As the sharer you were expected to appropriately categorize your uploads, and as a browser it was your responsibility to indicate if you did not want to view shares in one or more of those categories. This, coupled with another 3 categories of people with whom you were willing to share your images (e.g. public, friends and family or combinations thereof – irrespective of how they might have indicated what they wanted to see) left many grey areas but was a start. The system was policed by site users who were able to report images that were not correctly categorized. Flickr then ruled on these user reports and had various actions they could take; the ultimate was to ban users. It wasn’t foolproof and the system did sometimes fail or rule unjustly, but it was a reasonable attempt. As you rightly pointed out in earlier exchanges with OculusDar, this approach is not censorship but labeling or categorizing. It supports the very human rights he is fighting for. A mechanism that supports my right to decide what I share with whom and my right to decide what I want to see.

  • OculusDar

    FelicityJones OculusDar I have to say this.  I think you are an intelligent person and I can appreciate what you do for nudism and, consequently, human beings and the Greater Good.  What I do is a bit more involved than simple promotion and defense of nudism.  That’s not to say you are not involved with the more complex issues that come with exercising the Natural human rights associated with and attached to nudism and that life choice.  The human condition and the Greater Good are at the core of what I do, as Artist and advocate for Freedom.  Having said that…

    This notion that people should be protected from themselves, not allowed to make their own decisions about what they see, or read, or do is not a healthy policy, and it certainly is inappropriate for a Free society. This is my major concern. Freedom. And personal responsibility. Freedoms [some people commonly call them rights] cannot exist in a society that allows tyranny and bullying to prevail and control what we can or cannot see, read, do as individuals. It is contrary to the very principles of Freedom. I have mentioned hypocrisy before, and I will not elaborate further on that here, except to remind you that it is common in society and especially when it comes to nudists and nudism. Hypocrisy contaminates the minds of people and corrupts their cause. I hate to see it prevail so commonly in the nudist community and generally in a Free society.

    As for censors suffering from psychological scarring, that should tell you something about the evils of censorship. It’s not what they see that scars them…it’s their job that scars them. The cure for that is simple. They shouldn’t be censoring. Free societies do not employ censors or engage in such tyrannical and bullying tactics. Police State governments do that sort of thing. And, it’s wrong as well as illegal. To be sure, it IS illegal in America. No matter how one wants to twist it, or alter and ignore the principles of Freedom to suit their emotionally manipulated hysteria and delusions, this is wrong, always wrong, and damaging to people and destructive to Freedom. This is indisputable if one understands the basic principles of Freedom and comprehends the real issues that threaten Freedom and diminish personal responsibility in a Free society. Regards, De McClung

  • OculusDar It’s not the same as censorship…it’s not removing content, it’s labeling it. Putting a warning label on something, like “graphic violence,” is not the same as censoring it. Saying “don’t look at it” is not a solution. What’s seen cannot be un-seen. If something is going to be psychologically scarring, you should have a choice to not look at it. 
    Other social networks censor some pretty serious shit and I’m not talking about just porn. The people employed as censors start to suffer from psychological distress, from looking at all the nasty stuff on the Internet, like animal abuse, human abuse, graphic medical videos. They have to get therapy. I think there are some things that should be censored, like child porn and other forms of abuse. But at least the labeling systems could mean these censors won’t have to do this work anymore.

  • OculusDar

    I’m going to put out a concept for you to think about.  You may disagree and others may not like what I’m about to say but…I speak the truth by way of Reason and the facts.  

    Here goes:  Censorship of ANY kind is wrong, wrong wrong.  Period!

    OK, that was rather short and to the point.  Maybe I should explain why it is wrong.  For one thing, in a FREE society, neither the government nor a PUBLIC entity [corporation] is allowed BY LAW [Excuse the caps but I hate quote marks] to infringe the rights of the People. EVER!  2.  FREE speech, and this includes ANY form of expression, can NEVER be censored or banned or made illegal by a government or corporate entity no matter what that speech may be.  This is not debatable.  It’s the LAW!  And it’s a basic core element of the Principles of Freedom.  Either we are FREE or we are not!

    3.  I know you want to argue with me over this but I will not argue.  There is NO valid argument against the truth, the truth supported by the facts.  Conclusions of fact follow logical thinking.  Tyrants and bullies are NEVER right!  And that’s what people are that want to control and censor what we can see, read, and do.  I will never allow a government or a PUBLIC entity to tell me what I can see or share with others.  It is anti-freedom and against every principle of freedom.  Certainly in a FREE society. 

    4.  If you don’t want to SEE something..then don’t look at it.  That doesn’t give you the right to tell others what they can or cannot see, or share, or express.  That would make you a tyrant and a bully.  And we all know how contemptible tyrants and bullies are.  Asking the government or a public corporation to censor what people have every right to see, share and express is wrong wrong wrong.  PERIOD!

    D. L. McClung, Director, TAPEFA, The Antigone Project, Exercising Freedom Advocacy.

  • NotanlinesGuy

    Agreed, and well written. I think that porn is fine, but it needs its own classification, as there is a big difference between porn and simple nudity or artistic nudity.