
Pornhub was on NCMEC's radar before NYT article, says Vice President.


According to the agency’s Vice President, Pornhub was on NCMEC’s radar long before the Nicholas Kristof New York Times article.

The Cybertip line, a service provided by NCMEC, is a clearinghouse for information on child sexual abuse. It receives reports from members of the public and from electronic service providers such as Google, Microsoft, and Dropbox. In this interview, the agency's Vice President reveals that, while Pornhub is not a corporation based in the United States, NCMEC had been receiving and reviewing reports of child abuse and exploitation on the platform long before the New York Times article.


As much as we are taught how the world ought to be, rarely are we taught how it actually is. People lie, cheat, and steal all around us, and while sometimes they face the music of justice for their actions, other times they walk free. As COVID-19 pushed much commerce and business onto the internet, it sparked a big payday and a recruitment drive for web-based paywall services like Pornhub, Patreon and OnlyFans, which sold pornography throughout the mandatory lockdowns across the world.

But just as one person's trash is another person's treasure, one person's pain and suffering can become another's payday. As child sexual exploitation on the internet grows noticeably out of control, governments appear to turn a blind eye to this multi-billion-dollar operation. The most recent normalization event, for example, involved a PhD student who, a few months ago, attempted to humanize pedophiles by relabelling them with the less "offensive" category of "minor-attracted persons".

The National Center for Missing and Exploited Children (NCMEC) is a non-profit organization in the United States that receives information on children who have been abducted or sexually victimized. Analysts review each report, assess its severity, and make the information available to law enforcement.

Whether or not the police choose to act on that information is out of NCMEC's hands; the organization has no policing ability, and a tip, and perhaps a victimized child's fate, can slip through its fingers. John Shehan is the Vice President of the Exploited Children Division (ECD) at NCMEC in Alexandria, Virginia. He was generous enough to speak with POPTOPIC about his role as VP and his agency's work combating the exploitation of children on the internet and in real life. Despite seeing the worst side of humanity on a daily basis, the division he presides over appears determined to put a stop to the monstrosities involving children that are recorded and sometimes sold over the internet.

Interview with John Shehan.

POPTOPIC: What kinds of things do you do as Vice President of NCMEC's Exploited Children Division?

MR. SHEHAN: Sure, so I joined NCMEC in February of 2000, so I've been there for a little over 21 years now… of course I didn't start off as Vice President; I held a number of roles within the center. My entire career has been focused on combating the sexual exploitation of children.

As the Vice President, I have a variety of different roles and responsibilities. Number one is oversight of our Cybertip line: the Cybertip line was created in March of 1998 and receives reports of suspected child exploitation. Those reports can vary between child sexual abuse material, child sex trafficking, and the enticement of children, as well as online or offline events, so things like child molestation, those kinds of reports as well.

John Shehan, NCMEC

The tip line, since 1998, has received a little over 100 million reports, and what we do is serve as a clearinghouse of information: reports come in, we review that information, we assess the potential risk to a child, and we make that information available to law enforcement. We're a private non-profit organization (NPO); we're not an investigatory agency, so we don't have any law enforcement powers or abilities. Again, we serve as a clearinghouse and make that information available to law enforcement for independent review and possible investigation. So one of the areas I oversee is the Cybertip line.

The other is the Child Victim Identification Program (CVIP). We created the program in 2002; I was there when we created it. It was born out of our desire to know who the children are who appear in sexually abusive images and videos. We were seeing the same images of these children being sexually abused over and over again on websites.

Abused little girl

Really for our own mental sanity, we wanted to know who these children were; we wanted to know that they could be rescued from situations of sexual abuse and exploitation. At the same time, we tried to focus our attention on those unidentified children, hoping to see background clues or information in that imagery that would help law enforcement identify those children.

A third one is our survivor services: for cases of children, some of them now adults, whose imagery is still recirculated online, we work with survivors to remove that imagery. We notify hosting providers, create child sexual abuse hash lists, and provide those to industry so they will voluntarily and proactively remove said content.

Then we host roundtable working groups with survivors as well to learn from their life experience. I conduct media interviews like this, educate members of the public, and I testify on behalf of the organization, whether that's to congressional subcommittees, federal courts, state courts, you name it. Other duties on the side, I guess you could say. So that's a high-level overview of my role as Vice President.
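The child sexual abuse hash lists Mr. Shehan mentions work by matching a digital fingerprint of an uploaded file against fingerprints of already-known abuse imagery. The Python sketch below is a minimal, hypothetical illustration of that matching step only: it uses an ordinary SHA-256 digest and a plain-text hash list, whereas real programs rely on purpose-built perceptual hashes such as PhotoDNA and formal industry hash-sharing arrangements, so none of the file names or details here reflect NCMEC's actual tooling.

# Minimal, hypothetical sketch of hash-list matching. Not NCMEC's tooling.
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set:
    # Load known hex digests (one per line) into a set for fast lookups.
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def sha256_of_file(path: str) -> str:
    # Compute the SHA-256 digest of the file's bytes in 1 MB chunks.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_content(upload_path: str, hash_list: set) -> bool:
    # True if the uploaded file's digest appears on the shared hash list.
    return sha256_of_file(upload_path) in hash_list

# Hypothetical usage: block the upload and file a report if it matches.
# known = load_hash_list("shared_hash_list.txt")
# if matches_known_content("incoming_upload.bin", known):
#     ...remove the file and report it to the Cybertip line...

In practice the point of sharing such lists, as Mr. Shehan describes, is that a platform can recognize and remove known material proactively instead of waiting for a new report.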

POPTOPIC: So, NCMEC does not investigate, they just assess threats and make the information available to law enforcement, is that correct?

MR. SHEHAN: Yes, so our organization was created in 1984 and we were born out of a tragedy. Some people are familiar with Adam Walsh, the child who was abducted in 1981 at a local shopping mall while his mom was just a few aisles away in the store. His parents began a desperate effort to locate their son, working with local law enforcement and reaching out to the community.

It was learned about 10 days later that Adam had not only been abducted but also brutally murdered. And the Walshes channelled that anguish and helped to spark a national outcry for more to be done on issues relating to missing and exploited children. So, Adam was abducted in 1981. In 1984, NCMEC was created as a private NPO; we're a 501(c)(3). No employees of NCMEC are law enforcement, and we have no law enforcement authority.

Certainly, we are authorized by Congress to do certain programs of work, one of which is the Cybertip line. But no, we do not have any law enforcement abilities: we can't serve subpoenas, we don't do search warrants, we don't kick in doors. Clearly, what we are is a clearinghouse for information.

Again, the Cybertip line will get tips from members of the public as well as electronic service providers (ESPs), so the companies. There is a law in the United States (18 USC 2258A) that requires ESPs, companies like Google, Microsoft, and Dropbox, you name it, to report into the Cybertip line when they become aware of child sexual abuse material (CSAM) on their platforms.

We receive those tips, and if you can imagine, there are people all around the world using different types of platforms. So when they're uploading CSAM to those services and the companies report it to us, we in turn use the geographical information, oftentimes the upload IP address, to geolocate where in the world the content was uploaded and created, and we make that information available to law enforcement. So NCMEC makes Cybertip line reports available to state and federal law enforcement here in the United States, and also to about 140 countries and territories around the world.
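To make the routing Mr. Shehan describes concrete: an upload IP address can be resolved to a country, and the lead is then passed to whichever body covers that jurisdiction. The Python sketch below is purely illustrative; it assumes the third-party geoip2 package and a local GeoLite2-Country database file, and its routing table is a hypothetical one that only echoes the examples given in this interview.

# Illustrative only: geolocate an upload IP and pick a destination agency.
# Assumes `pip install geoip2` and a GeoLite2-Country.mmdb file on disk.
import geoip2.database

# Hypothetical routing table echoing the interview's examples.
ROUTING = {
    "US": "U.S. federal/state law enforcement",
    "DE": "German national reporting channel",
    "CA": "RCMP",
}

def route_report(upload_ip: str, db_path: str = "GeoLite2-Country.mmdb") -> str:
    # Resolve the IP to a country, then choose where the lead would go.
    with geoip2.database.Reader(db_path) as reader:
        record = reader.country(upload_ip)
    code = record.country.iso_code or "??"
    if code in ROUTING:
        return f"{record.country.name}: {ROUTING[code]}"
    if record.continent.code == "EU":
        return f"{record.country.name}: Europol"
    return f"{record.country.name}: national police via Interpol"

# print(route_report("203.0.113.7"))  # documentation-range IP for testing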

POPTOPIC: Recalling the scandal involving Pornhub and their legal scrutiny, what was NCMEC's involvement? Particularly after Nick Kristof's New York Times article.

MR. SHEHAN: So we were aware of the sexual exploitation of children on platforms like Pornhub and others owned by MindGeek well before that story broke, and if you do a little bit of research, you'll find that our own CEO testified to the Canadian Parliament about their site.

While Pornhub is not a U.S. company, we do allow them to report suspected child sexual exploitation into our platform. We certainly notify them whenever we become aware of potential child sex abuse material that they're hosting, whether that be imagery or videos, but they were on our radar well before that article came out.

POPTOPIC: Was there anything that your agency could have done to open a law enforcement investigation?

MR. SHEHAN: Any tip that's made into the Cybertip line is made available to law enforcement. Every single tip. Now, like I mentioned earlier, we're not law enforcement, we have no law enforcement ability, so I can't open an investigation and I can't make law enforcement open an investigation. We provide them with tips and information, they review it, and then they make their independent determination whether or not to investigate.

Human Trafficking

Oftentimes it depends on the information within the tip, but every single Cybertip line report that comes in is made available to law enforcement. In fact, the FBI, Homeland Security, the Secret Service, military liaisons, and the Department of Justice's Child Exploitation and Obscenity Section have access to every single Cybertip line report, even those that we may have sent to a local jurisdiction.

POPTOPIC: Would there be a reason why law enforcement would choose not to act on a report made on your Cybertip line?

MR. SHEHAN: Well, I can't speak on behalf of law enforcement for one particular Cybertip line report, but I can tell you that the tip line receives massive numbers of reports every day. Right now, in 2021, we're averaging 70,000 tip reports a day, and those are both domestic and international. Again, we make them all available to law enforcement, and they are making their way through them.

Sometimes it depends on the information that's provided to them, sometimes it depends on data retention, or the response that they get back from legal process. To get to why a specific report was or was not acted on, it would really come down to law enforcement, because it was made available to them, to see what happened and why.

POPTOPIC: I take it your Cybertip line doesn't rank things by priority so law enforcement can take that into consideration? For example, a single explicit picture versus a child who was kidnapped?

MR. SHEHAN: No, we certainly assess the risk of reports. When they come through, we try to put a priority assessment on that lead. So for those where we think a child is in imminent danger, it doesn't matter what time it is; our team is 24/7.

Those are the ones that wake us up in the middle of the night, they wake law enforcement up in the middle of the night, and they act upon those, absolutely. So we do certainly try to rank priorities and we try to escalate leads to law enforcement; that's what happens.

POPTOPIC: Would you also say that an instance of child sex trafficking in another country that comes through the Cybertip line would carry any weight in your priority ranking system?

MR. SHEHAN: Well, yes, of course we get reports of child sex trafficking and exploitation; as far as where the jurisdiction is, it makes no difference to our ranking level. We rank the risk or the priority level of the report based on the perceived risk to the child within that report.

So if we had information of child sex trafficking, and the child who is being sex trafficked, if their life is in immediate danger, that's a priority 1. And in the middle, it's probably a priority 2, where we need law enforcement to get to it sooner rather than later. And depending on the country, if it goes to Europe, most likely it's going to go to Europol.

Europol

We work really closely with Interpol and Europol. There are some countries in Europe that take reports directly from us, like Germany; they're accepting them. There are plenty of countries in the Asia Pacific region that are receiving reports. All around the world, we're making those reports available. Typically the ones ranked priority 1 or 2, the higher rankings, the ones where a child's life is in danger, those are the ones that law enforcement works first.

POPTOPIC: Moving back to Pornhub, in his testimony to the Canadian Parliament earlier this year, David Tassillo, the COO of MindGeek, said his company is compliant and always reports unlawful content specifically to NCMEC, not to the Canadian Centre for Child Protection, the domestic Canadian agency that has jurisdiction in Canada, where MindGeek is based. Would there be a reason why his company would report to NCMEC and not Canada's national agency?

MR. SHEHAN: Well, that's an interesting question, and I think there are some, especially in Canadian law enforcement, who would debate whether they're a Canadian company. They are truly headquartered in a location in Europe, and I say this because before we fully allowed Pornhub (MindGeek) to report into the Cybertip line, we had to ensure they were not a Canadian company.

And I say that because in Canada there are mandatory reporting laws that exist if you are an electronic service provider in Canada. For example, a few years ago Kik was a Canadian company, but it is now a U.S. company and now, under U.S. law, reports to the Cybertip line. But before, Kik, being a Canadian company, was required by Canadian law to report to the RCMP.

So we had contacted the RCMP to make sure that MindGeek/Pornhub were not considered a Canadian company, because if they were, they should be reporting to them and not us. And we were given the green light; they were not considered a Canadian company for purposes of reporting.

POPTOPIC: On Pornhub's Trust and Safety webpage, they say they participate in NCMEC's Electronic Service Provider Program. What is that?

MR. SHEHAN: That's reporting to us. So, if you go to NCMEC.org and scroll down to the middle of the page, you'll see a section called 'By the Numbers'. You will find on that page, for both 2019 and 2020, the number of reports by electronic service provider, and you'll also see the number of reports by the countries we made them available to.

POPTOPIC: So, going back to Pornhub, you said that you only make information available to law enforcement.

MR. SHEHAN: Correct.

POPTOPIC: If you see something that is egregiously illegal and needs to be taken down from the internet, does NCMEC not reach out to ESPs, like, for example, Pornhub, if it finds something on there?

MR. SHEHAN: We would notify them. So for Cybertip reports themselves, for tips that we receive, whether from a member of the public or from a company, if you look at 18 USC 2258A, it requires the companies to report to the Cybertip line, and it basically requires and authorizes NCMEC to make those reports available to law enforcement.

Now we are able to share elements of Cybertip line reports with industry, with companies, for the purpose of reducing CSAM online. So we would notify a hosting provider when we are aware of CSAM on their platform that's live and accessible to the public, and we also provide hash values to those companies that want to use them to help identify, remove and report CSAM.

POPTOPIC: But in terms of a cease and desist, you wouldn't send that directly to companies as an urgent matter.

MR. SHEHAN: We would send them the link to the video or whatever they were hosting. If they were hosting CSAM, we would send them a notice notifying them. But NCMEC doesn't have the ability to remove anything from the internet. We don't have the legal authority to do a cease and desist; we can make them aware, and most companies review it and remove it when we tell them.

NCMEC’s VP discusses Elles Club controversy.

To shift gears, we asked about the case involving Elles Club, seeing as she has featured in POPTOPIC's columns.

MR. SHEHAN: So I am familiar with reports regarding the model, and I can assure you that the reports that came through were submitted and made available to law enforcement. In this case it was Europol.

POPTOPIC: So Pornhub removed the content, and ManyVids removed the content. I think there was a third one that also removed the content, but I'm not remembering it. Patreon, however, kept the content up. Was your agency aware of that?


MR. SHEHAN: I don't know that offhand. I haven't gone through all the different reports with the Patreon pieces. So no, I'm not familiar with that.

When asked why the illegal content was removed from several other websites in other countries but not from Patreon, Mr. Shehan maintained that his agency's hands were tied.

MR. SHEHAN: Well, I think that goes back to what I mentioned to you earlier, that NCMEC has no ability to remove anything from the internet. We can make someone aware, and maybe they review those. I have not seen the actual imagery that we're talking about here, so whether it's a difficult call or whatever Patreon's reasoning is behind not removing it, I think that's a question you want to ask them.

We certainly make companies aware when we are made aware that there's CSAM on those platforms, and we provide all of those reports to law enforcement as well. We don't have any ability to remove anything from the internet. We serve as a clearinghouse to make sure that everybody who needs to know does know. I think the better question is getting Patreon on the phone and figuring out what went wrong or what they did in those areas.

POPTOPIC: They've been pretty evasive to everybody, is my understanding. We've reached out for comment and they like being very mysterious. Do you know if anyone from NCMEC reached out to Patreon to inform them?

MR. SHEHAN: I haven't read any. I haven't read any case records of this particular scenario, so I don't know offhand. I do know that my entire team of staff is poring through reports made by members of the public or by ESPs, and there are some companies that report third-party hosting that's not even on their platforms, and we're sending out notices, tens of thousands of notices, on a regular basis for their removal.

And we monitor after these notices, and we track all that information. Anytime a provider isn't taking content down, of course, that information is made available to law enforcement as well. And if you're looking at the law, 18 USC 2258A requires companies, when they're made aware of CSAM they're hosting, to submit a report through the Cybertip line, and for a failure to do so, there are penalties. The people who enforce those penalties are not NCMEC; that's going to be the Child Exploitation and Obscenity Section of the Department of Justice.

Patreon’s integrity questioned.

In 2020, NCMEC released its online reporting statistics. Although the numbers are provided by the companies themselves, Mr. Shehan was somewhat skeptical about their integrity in reporting such figures.

POPTOPIC: So I'm looking at your 2020 report (now these are dated), but only 19 cases of CSAM were reported by Patreon. Do you suspect that number may underrepresent what might actually be on their website?

MR. SHEHAN: Oh, I think you're going to find a number of companies on our website in that particular year: take a look at Apple, take a look at Amazon. There are a lot of very big providers that could be doing more. It's very easy to answer that.

Patreon integrity
(Photo Illustration by Pavlo Gonchar/SOPA Images/LightRocket via Getty Images)

POPTOPIC: You make a very compelling case that companies could be doing more but they're not. So I guess this is a question with an eye to the future: if NCMEC knows that these companies aren't complying fully with the law, what do you think you guys as an organization would be… what do you think the next step for NCMEC would be in terms of changing the ways that companies are sort of side-stepping their responsibilities, their obligations to the law and to the public?

MR. SHEHAN: Yeah, so I think if you backtrack to around March of 2020, you'll find that I testified in front of a congressional subcommittee on the EARN IT Act, where you'll see that we are very vocal that companies need to be doing more and they need to be incentivized, potentially through regulation, because right now, whether companies take those steps to identify and remove CSAM is voluntary.

Companies don't have to do it. So I think whether it's the U.S. Congress, or maybe the European Commission, or in the UK, there are a lot of laws that could be coming down the pipe in the near future that are a bit more focused on regulation when it comes to CSAM. Again, if you watch my testimony, you can Google it; it was March 2020, the congressional subcommittee on the EARN IT Act, and you'll see I and some others testified about that very question.

POPTOPIC: Do you have any plans to make NCMEC more vocal, in addition to your testimony to Congress?

MR. SHEHAN: We're still very much pushing and hoping to see EARN IT move forward this year, even in the late season, but yeah, we are very open and honest about what we think about that, as well as about what's going to happen as more and more companies go fully encrypted, because that in some ways could become the answer as to why they don't need to do scanning. If they just encrypt everything, they have no ability to scan because it's all hidden.

POPTOPIC: Okay. Especially with payment services, everything…

MR. SHEHAN: Livestreaming right now is incredibly difficult because everything is encrypted. If you've seen our webpage, we've got something at missingkids.org/(did not understand), and you'll see we've done a lot of work on that in Europe. In Europe, the privacy changes that took effect in December of 2020 basically stopped companies from being able to scan.

External web camera on a computer monitor

And so we… there's a part of our webpage where we do a lot of awareness work on that. We had a huge, um… petition to get people signing, and I testified a number of times, remotely, to the European Commission on how they needed to change that, and they did; they actually changed it back so that companies can proactively scan.

We returned to the example of Elles Club and companies failing to adhere to basic reporting procedures.

POPTOPIC: So on Pornhub, this model sold her content on ManyVids, won a bunch of accolades and recognition, and when it was reported to them, they complied and took the illegal content down. But then they were asked in a follow-up back in May whether or not they had reported the matter to the police.

And they were evasive. They said, 'Thanks for the report, the information has been forwarded to the appropriate contacts and channels.' When asked if that included law enforcement, there was no response; there was just another message from the sender: did you reach out or something?

ManyVids said that reporters remain confidential and that they had already sent the report off for review. Asked for clarification, 'did you report it to the police after you banned the account for selling child porn on your website?', their last response before completely ignoring the messages was: 'We have standard operating procedures for this type of report and we have taken all of those steps.' So, very evasive to a question with a dichotomous answer of yes or no.

MR. SHEHAN: Yeah, they don't sound very transparent.

POPTOPIC: It should be a yes or no whether you told the police something.

MR. SHEHAN: Yes, that's an important area for companies that are engaged in combating CSAM; they need to produce transparency reports. They need to be more transparent about what they're doing. And choosing to do business in the adult industry, more of these things are required; I think you have higher expectations than for other ESPs to be ready to respond when these types of situations occur.

POPTOPIC: Absolutely. And like you said, it's voluntary. They can choose to act, or they can choose to kind of keep themselves out of it.

MR. SHEHAN: Yeah, and now, I've heard of the site, but I'm not familiar… are they a U.S. company? Where are they headquartered?

POPTOPIC: That I don't know.

MR. SHEHAN: Because if they're a U.S. company, they would have to report it to NCMEC, but if they're not a U.S. company, then that reporting requirement won't apply to them.

POPTOPIC: So they are… it says headquartered in Montreal, Canada.

MR. SHEHAN: There you go, so they're reporting to the RCMP.

POPTOPIC: Okay. Alright. And just out of curiosity, do you guys have a network with law enforcement agencies and other reporting agencies, like the RCMP, where they can transfer information to law enforcement as well?

MR. SHEHAN: Yes, of course. Like I said earlier, we work very closely with law enforcement in around 140 countries. The RCMP, if we have something regarding Canada, they get our leads. We also work really closely with international NGOs; you mentioned the Canadian Centre for Child Protection, and we have worked with them since day one.

We were created a little before them; they actually modeled their original reporting form after ours back in the day. We work really closely with (unheard) up there, so yeah, we've got deep international relations just by the nature of the work we do. You have to.

Government doesn’t take this seriously enough…

Throughout the interview, NCMEC Vice President Mr. Shehan reiterated that his organization, chartered by the U.S. Congress, has no law enforcement abilities. It really makes me wonder just how well the American government equips this flagship organization (or others that do similar work) to undertake such an enormous task.

As society continues to evolve into a more digital age, things become simpler. Entire incomes are made by uploading videos onto websites that run advertisements and share the revenues with creators. Public forums such as YouTube have strict rules against material that involves sexual content, whereas premium-paywall services like Patreon and OnlyFans require viewers to purchase subscriptions to access content from creators who are defined as "partners".

Joe Biden

With petabytes of data shared on the internet each second, it seems impractical to monitor such a massive volume of information to ensure that it is legally compliant. As we have already seen with Pornhub, Patreon and OnlyFans, many cases of abuse tend to slip through the cracks, and companies will continue to knowingly turn a blind eye until someone sounds the alarm, just like Nick Kristof in his New York Times article, or Ian Miles Cheong and Laila Mickelwait, who have been tugging the sleeves of our society's prefects to take action. The question remains unanswered: why are the governments of the world pretending this isn't a real problem?

If you have witnessed child abuse on the internet or in real life, you are encouraged to contact your local law enforcement agency. You can also report illegal content or actions involving minors to NCMEC's Cybertip hotline at cybertip.org. Big thanks to Mr. Shehan for his time, and an even bigger thank you to his team, which handles atrocities many of us would never have the stomach for.