Porn again

First things first: Google is to be applauded for fighting a subpoena requiring it to turn over data on people’s searches to the federal government. The government isn’t seeking the data for a criminal investigation; it’s on a fishing expedition to build a case for getting an anti-porn law through the courts. As John Paczkowski writes, the subpoena puts us on a “very slippery, very dangerous slope.” If privacy laws are to mean anything, they need to apply to governments as well as companies.

But if Google’s on the right side of this fight, that doesn’t mean it’s necessarily doing the right thing in making the foulest sort of pornography – rape and incest images, for example – readily accessible to children. If you’d like to see the kind of stuff that Google disseminates, just go to Google Image Search, turn off the SafeSearch filtering (it’s just a checkbox), and do a search on “rape.” Trust me: it’s nasty. Google’s not alone, of course. You can do the same thing on Yahoo – though at least Yahoo forces you to click on two checkboxes before it coughs up the bad stuff.

I know this is a complicated issue. My own instincts run toward the libertarian when it comes to placing controls on what’s online. But there is a case to be made that placing some controls on the accessibility of some online content is in the best interests of society. (In fact, Google is actively censoring video content already.) And even if you reject that case (as principled people can), you need to at least consider the consequences of pretending there’s no problem here. If the internet community doesn’t police itself, it may well end up being policed by the police. Like it or not, some slippery slopes have to be negotiated.

10 thoughts on “Porn again”

  1. BC

    This story isn’t about censorship or filtering.

    Clearly, the government thinks that Google isn’t doing a good enough job indexing this dodgy material. They want the raw data so they can data mine it further.

    Why can’t FBI agents go to Google, turn off SafeSearch, type in “child porn” (or, as you suggest, “rape”), and then go down the list and investigate each of the sites in turn?

    Google’s PageRank algorithm should turn up the top offenders automatically. This should make the work of law enforcement really, really easy.

    The dodgy sites that show up in the top few pages for those search terms should expect a lot of law enforcement attention.

  2. Michal Migurski

    If there’s an Internet Community, then there’s probably a need for self-policing to mitigate a problem.

    I don’t believe there is such a thing, though. The “Internet Community” is nothing more (or less!) than the sum total of people using connected machines at any given time, connecting to one another and moving bits. Even Google itself is just one node in the graph, not any kind of infrastructure. I could imagine a situation where the FBI may want to investigate Google for “publishing” illegal material, or use Google to find original publishers, but I don’t see how this generalizes to the whole net.

    The safe-search “rape” example reminds me of the old doctor joke – Patient: “My arm hurts when I bend it like this!” Doctor: “Then don’t bend it like that.”

  3. Nick

    Michal: Guy goes to see his doctor, says, “Doc, my eyes hurt when I open them.” Doctor says, “So don’t open them.”

    Your point’s well-taken on “internet community.” I should have come up with a more precise phrase. I guess I was referring to those companies and opinion-makers who actively seek to influence the technological and legal shape of the net. The fact that Google’s banning some types of video from its service means that lines are already being drawn. So you can’t argue it’s impossible to draw lines.

  4. Seth Finkelstein

    Nick, there exists material which is legal in Denmark which is child porn in the US. Do you really mean all the Internet must voluntarily conform to US law as applied in its most restrictive jurisdiction? Because that’s the implication here.

    This is bad punditry. It sounds good – “if the internet community doesn’t police itself …”

    But:

    1) There is no such thing as “the Internet community” here.

    2) We’re talking about the sexual mores of the ENTIRE WORLD!!!

    Again, I ask you – what, *specifically*, *in detail*, would you have happen? No meaningless rhetorical declarations of the type “We must wring our hands over how bad it is, and clean up the Internet”.

    A lot of very smart people have been arguing over this topic for at least a decade; really, it’s not for lack of trying.

  5. Nick

    Seth: My post is about tightening restrictions on access to certain types of content that a society might deem harmful to minors. Whether that content is “legal” or “illegal” is a different issue altogether; that obviously varies and always will vary from country to country, jurisdiction to jurisdiction. Nowhere do I imply that “all the Internet must voluntarily conform to US law as applied in its most restrictive jurisdiction.” Please.

    Google, Yahoo et al. already actively filter the content they deliver (I assume, though I don’t know for sure and don’t plan to try to find out, that they make reasonable efforts to filter from U.S. eyes images of child porn that violate U.S. law). As I wrote in an earlier comment, the question isn’t whether you draw lines; it’s where you draw them. My own belief is that the lines could be tightened up a bit. I’m fully aware that you’ll never achieve a perfect system, but that doesn’t mean you can’t try to achieve a less imperfect system. It’s when you do nothing that you invite in the legislators and the judges to draw the lines for you. Take your pick.

  6. Seth Finkelstein

    Nick, as you note, Google and Yahoo already respond to government laws; that is, when they receive a judicial declaration that such-and-such a site violates the laws of a country, they remove it from their index (this has been in the news here and there, with China especially).

    So they don’t “do nothing”. But I read you as wanting them to be proactive, to take steps before any judicial declaration, to try to keep judges and legislators pleased (“invite in the legislators and the judges to draw the lines for you”). The fallacy of this approach is that it tends to become far, far broader than anything a legislator or judge could impose – because it’s trying to second-guess them, which leads to erring on the side of caution.

    Is this reading correct? Do you in fact want the search engines to try to blacklist sites pre-emptively over fear of what will happen if they don’t?

    That’s a specific recommendation, and it can be debated if you want; it’s been around forever. But we have to be clear about what is being proposed.

    Just as a legal issue, is it clear that the term “harmful to minors”, as a legal standard, *VARIES* *OVER* *THE* *US*? It is not one thing; it varies per jurisdiction. And that’s not nitpicking: it can vary substantially. The difference between San Francisco and Memphis can be significant.

    Look at one implication – if we’re talking lines, doesn’t the line *have* to be at least to the side of the most restrictive jurisdiction in the US? Are you really proposing that the line cross something which is arguably a law violation (somewhere) in the US?

  7. Jason Golod

    First of all, I am glad Google “stood up” against the request from the Feds. They can get a random sample of what people are searching for elsewhere. One interesting point that nobody seems to have touched on is that Google can do whatever the hell they want with their search engine. If they don’t want to show images (which aren’t even theirs…but that is another copyright story) that are tagged rape, child porn, or whatever the heck people looking for that crap search for, then it is their prerogative not to show them. Are you not going to use Google because they won’t show you images of rape? I seriously doubt it. Is making the decision not to show images of rape a serious concession to anyone? If it is, it is to the masses of people around the US and the world. The funny thing is that Google doesn’t use the “do no evil” mantra anymore. Now it is “don’t be evil.” I don’t want anyone telling me what to do, and I am sure Google doesn’t either. I don’t want the government to be able to get access to Google’s logs whenever they want. But I don’t see any harm in Google putting the smackdown on images of rape and child pornography themselves.

  8. João Pedro Pereira

    I’m surprised to still read about an “Internet community”.

    There’s no longer any such thing as an Internet community. The Internet is almost global, and it’s present in daily life (in developed countries, at least).

    There’s no reason for the online space not to be policed by the same authorities that police the offline world.

  9. Marcelo Lopez

    “There’s no longer any such thing as an Internet community. The Internet is almost global, and it’s present in daily life (in developed countries, at least).” – João Pedro Pereira

    João, you must not be paying much attention to things around you. There IS such a THING as an internet “community”. There are MANY, in fact. Are there Portuguese-specific portals in Brazil? Are there Mandarin-specific portals in Beijing? The answer is YES, to both. Whether or not this is the definition that Nick uses is not the issue, because this is the definition MOST people do take. Yahoo.com is a community. Yahoo 360! is a community. Google Groups are communities.

    It is in each of those microcosms that issues of politics, legality, and such apply – and they apply locally.

    I don’t disagree with Google. As a matter of fact, whether for profit or for altruism, I’m glad to see someone speak up and say no. Even the founding fathers said there were times when people needed to say “NO!”

  10. Brad

    It’s easy to say “NO” when there is likely to be no financial disincentive. Google can easily say no in the US because, really, what’s going to happen?

    But when there is a risk of it hurting the company’s back pocket, they can’t change their practices fast enough.

    Case in point: the news this week that Google in China will actively filter out (censor) all material that the CCP (Chinese Communist Party) deems unfit for publication.

    If they didn’t adapt to CCP policy – guess what, no Google in the world’s second-largest internet user base.

    Google is a business and will adapt to the requirements made by regulators. They won’t go out of their way to do this extra work because it is the right thing to do; they will only do it if they are required to. It’s a very simple commercial concept. Things like rape and child porn imagery, if illegal in a country, should be actively filtered in that country.

    A government can make it quite simple – tell the search engines, “Allow the reproduction of that material in your search results and you will be punished.” Simple, really.
