Josh Findley

Hoan Ton-That interviews Josh Findley, Special Agent at U.S. Department of Homeland Security


By Hoan Ton-That


Hoan Ton-That: Josh, thanks for being here. Why don't you tell us about yourself, where you work, and what you do?


Josh Findley: I'm a special agent for the Department of Homeland Security, Homeland Security Investigations. For about the last 16 or 17 years I've specialized in child sexual exploitation investigations, and I've found it really rewarding to have a positive impact on children's lives.


Hoan Ton-That: Great. So how did you get into law enforcement, and then into this specialty of child exploitation investigations?


Josh Findley: So I originally got out of high school and decided I wanted to be in law enforcement, but I was 18 years old and too young to do that. So I became a military policeman. After that I wanted to move up in the ranks to the Army Criminal Investigation Division, and I was able to do that for several years. But I saw my civilian counterparts making more money and having a little more freedom to take leave and get time off, so I left the Army and became a special agent for the Treasury Department at that time.


Hoan Ton-That: Oh interesting.


Josh Findley: And so after that, in 2004, I worked one of my first cases that involved a survivor of child exploitation. At that point we still talked about "victims" of child exploitation. Since that time, we've really evolved in our care and our understanding of what kids have gone through, hence the change in terminology to "survivors."


Hoan Ton-That: Interesting. So when you worked that first case, how did it impact you and how you thought about your job in law enforcement?


Josh Findley: It was so rewarding to see the impact, the potential impact, that this type of law enforcement had on children's lives. And I had a good friend who told me a religious scripture quote: if you save the life of one child, you save the world. That always stuck with me. Being able to give back to the most vulnerable people in our society is really where I felt a calling to go in my career.


Hoan Ton-That: That's amazing.


Hoan Ton-That: Clearview AI is a facial recognition company. What makes us unique is that we've been able to collect over 10 billion photos with faces from the open internet, and our technology is used in an after-the-fact investigative capacity.


So it's not real time. And Josh here was one of our first customers at Homeland Security. I remember him calling me up in 2019 to tell an amazing story of identifying a child predator they hadn't been able to identify before. That is what made me really excited to serve our customers: hearing the stories about why they do what they do. It's very rewarding to see, and all credit goes to them. It inspires us every day. Thanks.


I'll give a quick overview of the technology and how it works for those who haven't seen it, and some of the misconceptions about facial recognition, and then we'll go into more of how Josh and his colleagues use it inside Homeland Security. So this is what it looks like. It's like Google for faces: instead of typing in words or text, you upload a photo.


So, law enforcement will upload a photo, say, from a crime scene. In a lot of cases you're trying to identify a victim of child abuse or a perpetrator, and all you have is a photo. The tool then shows you online links that might match it, which you click on to do further investigation. So it's not definitive identification technology; it's just a lead. That's basically what it is.
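Conceptually, this kind of search-by-face works as a nearest-neighbor lookup over face embeddings: each indexed photo is reduced to a numeric vector, the probe photo is reduced the same way, and the most similar vectors are returned as candidate links. The sketch below is purely illustrative; the vectors, URLs, and function names are hypothetical and do not reflect Clearview's actual system.

```python
# Hypothetical sketch: search-by-face as nearest-neighbor lookup over
# face embeddings. Data and names are illustrative, not a real API.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "index": an embedding paired with the public URL it came from.
index = [
    ([0.9, 0.1, 0.0], "https://example.com/photo-a"),
    ([0.1, 0.9, 0.1], "https://example.com/photo-b"),
    ([0.8, 0.2, 0.1], "https://example.com/photo-c"),
]

def search(probe, k=2):
    """Return the k most similar indexed photos: leads, not identifications."""
    ranked = sorted(index,
                    key=lambda item: cosine_similarity(probe, item[0]),
                    reverse=True)
    return [url for _, url in ranked[:k]]

# A probe close to photo-a's embedding surfaces photo-a and photo-c first.
print(search([0.85, 0.15, 0.05]))
```

The key point the sketch illustrates is the one made above: the system ranks candidates by similarity and hands back links, and a human investigator must still verify whether any candidate is actually the same person.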


Hoan Ton-That: So how did you end up hearing about Clearview? And tell us about that first case that we all know about now.


Josh Findley: Okay. So this was back in 2019, and I was detailed to our cyber crime center. One of the things I was tasked with was taking these images where we would just have a face, and no other information about who the offender was or who the survivor was. Just a face. At that point, our normal procedure was to create what was essentially a wanted poster and distribute it among our agents and task force officers in the field: hey, has anybody seen this person? Has anybody worked on this case? Do we know who this is? If we didn't get a good response, the next step was to circulate it to all law enforcement. And if that was still negative, we would go and get a John Doe warrant. If we can establish that someone is in the United States and all we have is a face, a judge can issue a warrant for their arrest based just on their picture. Then you go to the general public, put this person's picture on public-facing websites, and see if the public has any leads, which oftentimes has resulted in hundreds or thousands of leads that we track down throughout the country.

Well, when I originally sent this image out, one of our agents in New York City had, through a task force, found someone who had access to your software. This was a suspect who had taken pictures of his own face while committing the abuse. So they took it upon themselves to put the suspect's picture into Clearview, and it came back with a positive result. They sent that to me, and at that point I had to track down who this person is and whether this is actually the person. Because with all technology, there's never 100%, there's never anything that's a guarantee. It's our job as investigators to actually prove the case.


Hoan Ton-That: So what's amazing about the case you worked on is its visceral nature. The image was found in a child abuse video that had been sold on the dark web, and it only appears for a few frames, is that correct?


Josh Findley: Yeah. There were only three different frames that had the suspect's face.


And the results from Clearview were from a bodybuilding show in Las Vegas. The person who publicly posted this image is the person in front, a bodybuilder who was appearing at that show. Again, this image in itself told us nothing. It didn't necessarily tell us that this was actually our suspect.


Hoan Ton-That: And that was the only thing that came up?


Josh Findley: Yes.


Hoan Ton-That: How did it feel when you saw the image?


Josh Findley: I was thinking, "Is it a match?" Just because technology says something, it always needs to be verified by human eyes, given how important it is and the question of how accurate it's going to be. And I had done a lot of these comparisons with different pictures that had been sent out, so I thought, "Of anybody, this definitely looks like the guy." But the information I got really told me nothing about who this person is, except that he was in the background at a bodybuilding show. There's a company logo, and I can tell that he's standing behind it. It gave me a date and a time that he was at a certain location. I considered getting with Vegas Metro PD and looking at surveillance footage to see if I could track that down. But I started looking at all the principals of this company, to see if I could identify whether he was actually one of their salespeople or something like that. I was unable to do that. I found several pictures, but none were a close match. So I finally contacted the company, and they were able to give me a name to look at.


Hoan Ton-That: Interesting. So that's what's fascinating about this case, and kind of the magic of facial recognition, especially when you have something as accurate as Clearview AI, to get that one result. At the time we only had 3 billion images, and it comes back in about a second. So after you identified him, what's the process like in actually proving out a whole case?


Josh Findley: Right. And this goes somewhat to the privacy concerns regarding your software. After we had a name, we were able to do some Google searches, get on some of our undercover accounts, and access his Facebook page, which he had exerted privacy over, so Clearview would never have those images.


Hoan Ton-That: Mm-hmm


Josh Findley: So because we were able to get to his Facebook page, we were able to see more pictures of his face, pictures of the child, pictures of things in the background: flooring, shower curtains, those kinds of things. And all of this leads us to the standard of probable cause, right?


Hoan Ton-That: Yeah.


Josh Findley: Clearview gave us just a lead. So now we go out and do our homework, and we find all these extra things, and that brings us to the level of probable cause, where we apply for a search warrant. We get the search warrant, go into his computer, find all of the images, and eventually present that to a grand jury. Then we move forward to an indictment, an arrest, and the rescue of the survivor in this case.


Hoan Ton-That: Yeah. And so just to finish the story, I mean, there's a lot of in betweens and twists and turns when you're doing an investigation. But, what happened at the end?


Josh Findley: He was convicted federally and in the state of Nevada, and he is serving out a 25-year prison sentence. He will not harm any children for the next 25 years.


Hoan Ton-That: Wow. What about victim identification? Have you used Clearview for that kind of purpose as well?


Josh Findley: So we have. A perpetrator can take an image they've created of a child's face and put it on the internet. Or in some cases, we have predators out there who are prolific sextortion artists, who sextort hundreds and hundreds of children, collecting images and pressuring them into these things. Eventually, hopefully, one of those kids comes forward and we track it back. Then we can find a computer with hundreds of images of children, and we have no clue who they are. Most likely those are your tweenagers to teenagers; that seems to be the target age of a lot of sextortionists. So we can have hundreds of these children we don't know, and they can be out there wondering when this guy is going to hit them up again for another picture. They don't understand that he's been arrested. They don't have the ability to get counseling.


So Clearview has actually helped us a lot when we have all those unknown faces. We can start to identify them quicker, get them the help they need, and give them the knowledge that the offender has been stopped.


Hoan Ton-That: So that's something I had never known happened until I met you and you explained these cases. You could live in a really nice neighborhood, the safest area possible, and your child could be being abused online without your knowledge. So it's a phenomenal thing to help the victims of these crimes as well.


Josh Findley: Absolutely.


Hoan Ton-That: Because they don't know their perpetrators are now gone.


So, when you were in HSI, how did you develop policies internally? What was your thinking about the use of this new technology? We've improved our platform a lot to work with customers on compliance and reporting. But take us through the process, for anyone else in government, of how to procure it and how to explain it to their agency and others.


Josh Findley: Right. Facial recognition and the use of AI are such pioneering tools, and honestly there are a lot of misconceptions, like real time versus the facial recognition your iPhone uses. There are a lot of broad policies people want to implement, but we needed to really narrow down the use of this tool and segregate it out policy-wise. We want to use more caution with this tool than we do about unlocking our cell phones every day. So we went through, with our privacy assessment team, identifying which kinds of cases we would use it for and how we were going to use the tool. At that point our headquarters was also coming up with a broader policy for how facial recognition would be used and reported in our case files. Once we got through the privacy assessment and got it authorized, not only through our parent agency but through the Department of Homeland Security's privacy assessment, we were able to expand its use to all of our child exploitation investigators throughout the country. Not that they all have licenses, but we can use it for those kinds of cases. We're also developing uses to help protect our undercover agents in the field.


Hoan Ton-That: So, I want everyone to give Josh a round of applause for the great work they do at Homeland Security. Thanks for being a customer of Clearview. We really appreciate it, and we are inspired every day by your stories. Thanks again, Josh.


Josh Findley: And I want to thank you. You guys have been dedicated partners in helping us find the subjects who are abusing kids, and you've been supportive of what we do. We really appreciate your tool and its ability to help us track down offenders and do our jobs more efficiently.