2 of 4. Looking at the issue of self-made images of teens in the light of IWF statements.

This item is part of a special series of posts. The Internet Watch Foundation is a key organisation setting standards for how CSEM on the internet is handled, as well as co-ordinating detection and removal activity. What do their statements and practices imply about what they implicitly believe about CSEM? Four posts from the Celibate Pedophiles blog explore this.

In a complicated landscape of types of images, one major distinction is the age of the children. Images of younger children (let's say 0-8) are directed and produced by adults, and the child has no agency; I'll return to those in a later post. Images of older children (let's say 12-17) are, in today's "wired" world, typically self-produced and freely sent to someone else online. The main complaint is that the recipient then releases them to the world at large, in a betrayal of the child's trust. But it's worth reflecting that this same dynamic can play out with adults. A 20-year-old or a 50-year-old would also be distressed to have images they provided privately shared publicly. Anyone would be upset by blackmail.
At one point, the IWF podcast describes how careful work can be required to verify that an image is in fact illegal, since teens often look indistinguishable from young adults. If the analysts can't tell which images are illegal without detective work, the viewers can't either.
So how can a viewer tell if the pixels are from a crime scene or not? When StopItNow [addresses this issue](https://www.stopitnow.org.uk/concerned-about-your-own-thoughts-or-behaviour/concerned-about-use-of-the-internet/get-the-facts/no-grey-area/), they say in a section titled "No Grey Area": "Uncertainty about a child's age: As an adult, you will be able to clearly identify who is a child and who is an adult. There is no grey area here. If there is ANY DOUBT, do not access the images." This verges on the incoherent. No one can detect by looking whether someone's 18th birthday is in the recent past or the near future. Taken at all literally, this would prohibit viewing the vast majority of adult porn, which features reasonably young actors. If StopItNow thinks this advice is reasonable, perhaps it is because they are actually against adult pornography as well and see no problem with simply telling people never to look at porn.
In a society of just laws, the defendant gets the benefit of the doubt. You should be at legal risk for viewing images only if a reasonable person would be certain they are of a person under 18 — and an interest in small, thin adult women with small breasts should put you at no risk. In practice, viewing most child abuse images of 13-year-old girls would not be prosecutable on this basis. This is in a narrow sense a digression, since IWF's mission is not to help convict people of viewing images but to remove them from the web. However, their series of videos is also clearly intended as an exhortation to people not to view such images, with episodes titled "It's not just an image" and "If you're watching, you're an offender".
If you can find the creator or original distributor of an image, determining whether the child is over or under 18 is important since it determines whether the image is evidence of a crime or not. If not, a reasonable IWF choice would be to ignore images that look like they could easily be images of young-looking adults.
IWF's approach also completely downplays any agency on the part of teens under age 18. They would like to present a simple story where any sexual picture of anyone under 18 is CSEM, the same category that encompasses 3-year-olds. Yet this narrative suffers from some uncomfortable facts that IWF and its allies would like to keep behind the curtain. UK law grants 16-year-olds the right to decide when to have sex with someone. It is curious that, in contrast, a 16-year-old can be nothing but an innocent victim when it comes to sending images to someone. At one point the IWF podcast notes in passing that children react differently to finding their abuse images are out there, then immediately focuses on those who are highly distressed. They describe all self-made material as being forced or coerced. The voices of teens who say, "Yeah, I put it out there because I wanted to and I don't have any problem with it" are silenced. The podcast notes that teens very rarely make a report, and goes through the reasons — they think the man is their boyfriend, or they feel ashamed or are worried about the additional publicity. Left out is the idea that this is an entirely rational calculation on their part, in terms of the costs and benefits to them. It also ignores the mundane possibility that they just don't think it's a big deal.
Another perspective on this is that as part of their journey to becoming independent adults, teens face dangers. They can start using drugs. They can stop applying themselves to schoolwork. They can engage in unsafe sex with peers. We know they can be the victims of devastating cruelty from their peers as part of jockeying for status or defining who belongs to the in-group. Online, they can engage in sexual talk with others, or send or receive sexual pictures. In some cases the result may be sexual pictures that are released onto the web and become visible to the IWF and others. Because of that visibility, IWF magnifies this particular form of distress to appear far larger than it really is — one tiny part of teens making bad choices and suffering the consequences.
For some perspective, I think it is clearly immoral to convince anyone to send sexual pictures under false pretenses, and it is immoral to release publicly anything sent with the understanding it was private. Younger victims may get more sympathy from us and bring forth our protective instincts, but their situation is fundamentally no different from that of adult victims.
With all that in mind, let's return to what a viewer should do if they see a sexual image of someone who looks to be on the young side. A young-looking 20? An early-maturing 13? The legal situation is the province of local laws, lawyers, and case law that is typically unsettled. But what about the moral situation, where we can hope for some conclusions that are independent of jurisdiction and do not require specialized training in the law?
In favor of not looking are: The idea that all porn is bad. The idea that a special interest in people who look on the young side is bad. The sense that the harm from any single act of viewing is great, so that it is vital to be on the safe side. The sense that when teens make images of themselves, they have no agency and are nothing but helpless victims.
In favor of looking are: The idea that looking at porn can be a valid choice some people make for purposes of pleasure and satisfaction. The idea that an interest in the young-looking (but at least late pubescent) is natural and common. The idea that the moral status of what you look at depends on what you can actually see, not what might be going on behind the scenes. To expand on this point, a clearly adult porn actress might have been blackmailed into appearing, or it might be a choice forced by limited economic opportunities. An adult amateur might have been pressured by her boyfriend. Self-produced material by an adult might have been intended for private use only and not public distribution. Going beyond the realm of the sexual, any child who performs publicly, whether a singer, model, athlete, or actor, might be doing so under parental pressure and not from a free choice of their own. Most products you buy have long and complicated supply chains behind them, and some components likely are made in areas with poor labor conditions or repressive governments. Is it your moral duty to investigate all of that before making a purchase?
IWF and similar organizations take the first position, but it is based on assumptions that many other reasonable and moral people do not share. The law (UK law in particular) is more informed by the first position than the second, but many people might feel it is unjust and work to change it (even as they obey the existing laws).
My comment, next day: Rereading this a day later, I add two things: (1) In much of the world, images of people who look like they might be adults or might be minors are illegal if they are in fact minors, and viewing them can carry extreme penalties. Be very cautious. (2) There are other aspects to images that might cause moral problems. For instance, visible evidence of coercion or distress could just be adult actors pretending, but it poses more of a problem as the chances that a person is underage go up. More generally, any evidence that the images are not self-produced should cause moral qualms. And if the person appears so youthful that only a very unusual adult would look like that, that too is morally problematic.
This content was taken from Ethan's longstanding blog, Celibate Pedophiles. Some of the titles and taglines have been edited for their inclusion at thepword.
You can see an earlier version of the blog at the Wayback Machine.