
'Take It Down': New tool allows teens to remove explicit images


"Once you send that photo, you can't take it back," goes the warning to teenagers, often ignoring the reality that many teens send explicit images of themselves under duress, or without understanding the consequences.

A new online tool aims to give some control back to teens, or people who were once teens, and take down explicit images and videos of themselves from the internet.


Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children, and funded in part by Meta Platforms, the owner of Facebook and Instagram.

The site lets anyone anonymously — and without uploading any actual images — create what is essentially a digital fingerprint of the image. This fingerprint (a unique set of numbers called a "hash") then goes into a database and the tech companies that have agreed to participate in the project remove the images from their services.
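The key privacy property described here is that the image never leaves the user's device; only its fingerprint does. As a minimal sketch of that flow (illustrative only: Take It Down uses perceptual image hashes rather than a plain cryptographic hash, and the function name here is hypothetical):

```python
import hashlib

def fingerprint(image_path: str) -> str:
    """Compute a digital fingerprint (hash) of an image file locally.

    Only this short hex string -- never the image itself -- would be
    submitted to the shared database for participating platforms to
    match against. A cryptographic hash (SHA-256) is used here purely
    for illustration of "hash in, image stays local."
    """
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

The same file always produces the same fingerprint, which is what lets platforms recognize an exact copy without ever receiving the original image.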

Now, the caveats. The participating platforms are, as of Monday, Meta's Facebook and Instagram, Yubo, OnlyFans and Pornhub, owned by Mindgeek. If the image is on another site, or if it is sent in an encrypted platform such as WhatsApp, it will not be taken down.

In addition, if someone alters the original image — for instance, cropping it, adding an emoji or turning it into a meme — it becomes a new image and thus needs a new hash. Images that are visually similar — such as the same photo with and without an Instagram filter — will have similar hashes, differing in just one character.
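That "similar hashes" behavior can be illustrated with a toy matcher — a sketch, not NCMEC's actual matching logic — in which two hashes count as the same underlying image if they differ in at most a small number of positions:

```python
def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count the positions where two equal-length hash strings differ."""
    if len(hash_a) != len(hash_b):
        raise ValueError("hashes must be the same length")
    return sum(a != b for a, b in zip(hash_a, hash_b))

def is_probable_match(hash_a: str, hash_b: str, max_distance: int = 1) -> bool:
    """Treat near-identical hashes (e.g. filtered vs. unfiltered copies
    of the same photo) as matches, while distant hashes stay distinct."""
    return hamming_distance(hash_a, hash_b) <= max_distance
```

Under this scheme, a filtered copy whose hash differs in one character still matches, but a crop or meme edit that produces a substantially different hash does not — which is exactly the limitation the article describes.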

"Take It Down is made specifically for people who have an image that they have reason to believe is already out on the Web somewhere, or that it could be," said Gavin Portnoy, a spokesman for the NCMEC. "You're a teen and you're dating someone and you share the image. Or somebody extorted you and they said, 'if you don't give me an image, or another image of you, I'm going to do X, Y, Z.'"

Portnoy said teens may feel more comfortable going to a website than involving law enforcement, which, for one, wouldn't be anonymous.

"To a teen who doesn't want that level of involvement, they just want to know that it's taken down, this is a big deal for them," he said. NCMEC is seeing an increase in reports of online exploitation of children. The nonprofit's CyberTipline received 29.3 million reports in 2021, up 35% from 2020.

Meta, back when it was still Facebook, attempted a similar tool for adults in 2017. It didn't go over well, because the site asked people to, essentially, send their (encrypted) nudes to Facebook — not the most trusted company even in 2017. The company tested the service in Australia for a brief period but didn't expand it to other countries.

But since then, online sexual extortion and exploitation have only gotten worse, for children and teens as well as for adults. Many tech companies already use this hash system to share, take down and report to law enforcement images of child sexual abuse. Portnoy said the goal is to have more companies sign up.

"We never had anyone say no," he said.

Twitter and TikTok so far have not committed to the project. Neither company immediately responded to a message for comment Sunday.

Antigone Davis, Meta's global head of safety, said Take It Down is one of many tools the company uses to address child abuse and exploitation on its platforms.

"In addition to supporting the development of this tool and having reporting and blocking systems on our platform, we also do a number of different things to try to prevent these kinds of situations from happening in the first place. So, for example, we don't allow unconnected adults to message minors," she said.

The site works with real as well as artificial intelligence-generated images and "deepfakes," Davis said. Deepfakes are created to look like real people saying or doing things they didn't actually do.