ABC News in Australia reports that Facebook is “teaming up with Government to stop nude photos ending up on Messenger, Instagram”:
Facebook is partnering with a small Australian Government agency to prevent sexual or intimate images being shared without the subject’s consent.
e-Safety Commissioner Julie Inman Grant said victims of “image-based abuse” would be able to take action before photos were posted to Facebook, Instagram or Messenger.
“We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly,” Ms Inman Grant said.
I guess you’ve got to be pretty worried that some toe-rag is interested in sharing nude photographs of you, if you’re prepared to ask for Facebook’s help in this way.
As far as I’m aware, Facebook hasn’t published any information on how it plans to implement this. I would imagine that they are using similar technology to that used by internet companies to identify child sexual abuse images – where they don’t need to store a copy of the actual offending content, but instead have a database of “fingerprints” that can identify images and videos.
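Facebook hasn't published the mechanics, so purely as an illustration of the fingerprinting idea, here is a toy "average hash", one of the simplest perceptual fingerprints. Everything in this sketch (the function name, the 8×8 grid size) is my own assumption, not Facebook's method; PhotoDNA-style algorithms are proprietary and far more sophisticated. The point is just that an image can be reduced to a short bit pattern that survives simple edits, so a service can match images without keeping the pictures themselves.

```python
# Toy perceptual "average hash" -- an illustrative sketch, NOT Facebook's
# actual (unpublished) fingerprinting. Shrink the image to an 8x8 grid of
# brightness averages, then record which cells are brighter than the mean.

def average_hash(pixels, hash_size=8):
    """Fingerprint a grayscale image given as a 2D list of 0-255 values.

    The image must be at least hash_size pixels in each dimension.
    Returns a tuple of hash_size * hash_size bits (64 by default).
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for row in range(hash_size):
        for col in range(hash_size):
            # Average the block of source pixels mapping onto this grid cell.
            y0, y1 = row * h // hash_size, (row + 1) * h // hash_size
            x0, x1 = col * w // hash_size, (col + 1) * w // hash_size
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return tuple(1 if c > mean else 0 for c in cells)

# The same picture at twice the resolution produces the same fingerprint,
# even though the raw bytes are completely different.
original = [[(x + y) * 8 for x in range(16)] for y in range(16)]
doubled = [[(x // 2 + y // 2) * 8 for x in range(32)] for y in range(32)]
assert average_hash(original) == average_hash(doubled)
```

Real systems add normalisation, frequency-domain transforms and tolerance thresholds when comparing fingerprints, but the storage property is the same: only the bit pattern needs to be kept, never the image.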
My hope and expectation is that Facebook will automate the process as much as possible, but that there may need to be some human involvement to review submitted images.
My guess is that Facebook will tightly control who in the company can review and access submitted images, and that they will be blurred to protect people’s privacy, before they are converted into “fingerprints” and then permanently wiped.
You probably do need some human involvement to prevent people chucking images into the system which *wouldn’t* be classified as “revenge porn” (perhaps with mischief in mind, or perhaps in an attempt to prevent the spread of images that they were trying to suppress for other reasons).
This “human” element is probably the most risky part of the process, and there will be many people ready to castigate Facebook if it screws this up.
“Revenge porn” is horrendous enough as it is, without technology companies making the problem worse. Facebook knows that there will be many people concerned about how it handles such sensitive content, and I imagine they have put a good deal of thought into minimising the chances that anything goes wrong.
By the way, “revenge porn” is a horrendous phrase. We need to think up a better one.
Update 9 November 2017: Facebook has published some more details of its scheme.
For more discussion on this topic, be sure to listen to this episode of the Smashing Security podcast:
This transcript was generated automatically, probably contains mistakes, and has not been manually verified.
Netsparker is the web application security scanner that can automatically find security flaws in your website, so you can fix them before hackers exploit them.
If you want to automatically check your web applications for cross-site scripting, SQL injection, and other vulnerabilities and coding errors that can leave you and your business exposed to malicious hackers, check out Netsparker.
Try it out now by downloading the demo from www.netsparker.com/smashing. And thanks to Netsparker for supporting the show.
Smashing Security, Episode 52: Facebook tackles vengeful scumbags and a sex toy privacy boob with Carole Theriault and Graham Cluley.
And it's people who I didn't even know listen to the podcast, but clearly do. But that was the one thing which really got them interested.
If you don't know what we're talking about, go listen to episode 51.
Just before Christmas, I think it is.
So we have picked a few of the topics, things which have caught our interest, which we will chat about right now.
And the first thing I wanted to talk about was Facebook and revenge porn. Actually, I don't want to talk about revenge porn because I really dislike that phrase, revenge porn.
And similarly with revenge porn, I don't know what it should be called. Maybe—
But of course, the newspapers love the phrase revenge porn. That's the sort of thing you may have seen in the press, and it's a ghastly phenomenon. We all know what it is.
It's a serious problem.
Maybe they hacked it and stole it from your computer, or maybe you shared it with them because you were in an intimate relationship with them, which then went sour.
And then people threaten you, maybe blackmail you, or, just because they don't like you anymore, post it all over social media and send it to all of your friends and your parents and your employer.
And it's just, you know, it's horrible.
The internet has made this so much easier and social networks have made it so much simpler for someone in just a rage-fueled moment to share those sort of images with everybody who you know.
And ghastly. And as a consequence, people have obviously been very traumatized. In the worst cases, people have even committed suicide.
And meanwhile, everybody at your school for instance, has seen these photos and you feel you can't go there anymore. Horrible, horrible thing.
Well, Facebook is one company which is trying to take a stand against this.
For some time, you've been able to report image-based abuse on Facebook, and we'll put a link in the show notes so you can read more about that.
If you spot some images of you or video of you which shouldn't be being shared.
And so they're running a small test in Australia.
And this is what got everyone's attention in the press, where some of the press presented this as: upload all your naked photos to Facebook, and then they will try and prevent them from being posted and shared anywhere on Facebook, Messenger and Instagram.
And understandably, a lot of people got pretty irate about that. "What are you talking about, Mark Zuckerberg? What a perv, you know, asking people to do that sort of thing."
Because they don't want people uploading that sort of thing and sharing them on the network.
And so what they have is a database of fingerprints, or checksums if you like, which can identify offending images and videos, and they store that database.
So they don't store the images themselves; a database of checksums and fingerprints is probably the best way to describe it.
And what they're trying to do is they're trying to expand that and say, look, maybe we can use this as well against image-based abuse against adults too.
So what they're saying is: if you have an image or videos, or if you believe that someone is going to start sharing images of you, then there's a process you can go through to basically tip us off, and for us to keep an eye open and try to prevent it from happening.
So it's not asking you, "Hey, upload your holiday snaps," or, you know, your photos from when you were out at the club the other night. It's nothing like that.
So what they're trying to do is that they're using some smarter algorithms apparently to try and identify, I guess, the body or the shape or something.
So even if it's resized and altered in different fashions, they can still detect it and intercept it and prevent it from being shared.
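Facebook hasn't described those algorithms, but a rough sketch of how the matching side could work might look like this (the 16-bit fingerprints, names and threshold below are invented for illustration): each reported image becomes a fixed-length bit string, and an upload is rejected if its fingerprint is within a small Hamming distance of any blocklisted entry, which is how a resized or re-encoded copy can still be caught.

```python
# Hypothetical matching step -- not Facebook's published design. A fingerprint
# is a fixed-length bit string; an upload is blocked when it sits within a
# small Hamming distance of any blocklisted fingerprint, so lightly edited
# copies of a known image still match while unrelated images do not.

BLOCKLIST = {
    0b1011001110001111,  # fingerprints of reported images (made-up values)
    0b0000111101011100,
}

MAX_DISTANCE = 2  # tolerate a few flipped bits from resizing/re-encoding

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_blocked(fingerprint: int) -> bool:
    """True if the fingerprint is close to any blocklisted entry."""
    return any(hamming(fingerprint, known) <= MAX_DISTANCE
               for known in BLOCKLIST)

# A near-copy of the first entry (one bit flipped) is still caught,
# while an unrelated fingerprint is allowed through.
assert is_blocked(0b1011001110001101)
assert not is_blocked(0b0110110001110010)
```

At Facebook's scale a linear scan over the blocklist wouldn't do; production systems use indexes built for near-duplicate lookup, but the block/allow decision is the same shape.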
Now, I realize they're hashing it and they're using all kinds of algorithms.
However, you are sending them a nude pic of you, one that you do not want online, one that you want removed from the internet.
Right now, as I said, the scheme has only been run in Australia, and you can go to the eSafety Commissioner's official website, so not a Facebook website, and complete an online form.
And what it's saying is that you have to send the image to yourself on Facebook Messenger, so you don't send it to any other Facebook users.
The eSafety Commissioner, which is where you've filled in this form, notifies Facebook about what has happened, but they don't get to see the image.
And Facebook doesn't store the photograph itself; it stores the fingerprint, and that is what prevents anyone else uploading the photograph to the service in future.
The eSafety Commissioner just provides the website, so the image matching may not be their expertise. It may be something Facebook wants to regularly update, the algorithms and so forth.
And at the moment, this is something which has just been done as a trial.
So maybe if they decide, you know, this worked really well and we want to do more of this, maybe we want to roll it out over the rest of the world.
Maybe in future there will be something like that which could actually be built into the software. Who knows? It's possible. But right now that isn't happening.
And of course, there is the potential that a Facebook employee might see something which you don't want them to see.
But then I think if you find yourself in that desperate situation where all of your schoolmates or your family or whoever might get to see things which you don't want them to see, then you may think, I don't care if some anonymous Facebook employee who doesn't know me gets to see this.
You know, it's not pleasant, but then you're not in a nice situation anyway.
I don't know how big a thing this is, but if it's affecting tens of thousands of people—
And obviously, there is an enormous amount of that kind of policing and moderation which is going on on Facebook. It often has been less than satisfactory, I think.
So you are not having to upload any picture other than the one that's currently on Facebook doing the rounds that you want removed.
But although some have raised their eyebrows about this, I actually think we should give Facebook some credit for at least trying to do something about a serious problem which has affected many people.
So you had to have something which was plugged into your computer and maybe it would be like a big boxing glove, which would extend out at you.
So anytime you were a bit of a twit online or just some vile toe rag, it went into your face and gave you a good punch on the nose.
That I think is what computers are missing right now. And maybe that would solve this problem. TM it, Graham. TM it. Hey, not just on the nose.
Maybe it could punch you somewhere else as well.
So they've admitted, well, what they call a minor bug in their Android app, which basically resulted in sound recordings being captured while the app was in use and stored on your phone's local storage.
I think the sound portion of it is mainly so that you can time the vibration to go along with whatever music you're listening to or something like that.
So he was cleaning up his phone and he stumbled across this audio file that he hadn't expected to be there.
And it turned out that it was a 6-minute recording of his session using the Lovense remote app.
So apparently it's a cache file that the app uses to store the sound that it needs to monitor to make use of the sound features.
And it was supposed to be deleted at the end of the session. But this bug meant that it wasn't actually deleting it.
Can I use that again sometime? You need it to have recorded the last 6 minutes or whatever.
It was only about a month ago, with their Hush, as it was called, a Bluetooth-controlled bottom-pleasuring device.
So they were using Bluetooth Low Energy, which is very cheap and efficient, but also has very poor security.
So it was pretty easy for anybody that was in the vicinity to hijack the—we talked about this as well.
Yeah, they had to pay out $4 million to settle a class action lawsuit. Their toys and apps were gathering too much sensitive data.
Not malware, but they were actually sending feedback to the developers. This is We-Vibe, and the data included the temperature levels and vibration intensity.
And also they similarly had, they were using Bluetooth and were very easy to hijack by anyone nearby.
So Lovense, which this particular case is about, have said that no data was being sent to their servers. Everything sent between the users is peer-to-peer and encrypted in transit.
And they did say, yes, the cache file is required for the sound feature to operate. And they've issued an update, obviously. We actually had a look at their website.
They have some pretty reasonable privacy advice and they mentioned the encryption.
They mentioned not storing things centrally, but they also note it would be nearly impossible for someone to obtain any of the content that's happening on our platform.
So that "nearly" is very important.
And there are also several commenters who've noted the sound recording issue, but they don't seem too concerned.
One review actually says, "Apparently this app records. Creepy." But he still gave it 3 stars.
I mean, okay, this is a sex toy or whatever, but with any app you have to be very careful, because you don't necessarily know what it's doing, what it might be storing, or how securely it's keeping that information.
And the same goes for these speakers that you speak back to.
And even if you've just got a phone in your pocket, you're not in private.
I don't want to think about things like that. Okay.
Actually, the guy on Reddit who brought it up, he said he was playing pool in a bar when the recording was made.
So let me get started at the beginning.
The FBI got wind of something fishy going on when Washburn Computer Group, Gammell's former employer, reported getting hit by numerous DDoS attacks on their websites between July 2015 and September 2016.
Right? So they called the FBI and they're saying, look, we're getting hit all the time.
Now, the sender of these mocking emails had created Google and Yahoo accounts, and rather than use an anonymous name, he actually used the name of another employee at Washburn.
So the FBI— so this is how it all works— the FBI subpoenaed Google and Yahoo for the detailed logs of these email accounts, and guess what?
They found a direct link to Gammell's CenturyLink IP address and IPVanish VPN service.
So Gammell used to work at this company, and then he left and set up his own kind of soldering training company, and I think was looking to try and get a deal to kind of do training for Washburn, and it all fell flat.
So the FBI now have probable cause to subpoena Google for Gammell's official Gmail address, and they get the information and they find some treasure, including registration emails and pro account purchases for a number of DDoS booter services, such as cStress, BooterBox and vDOS.
Now, booter services are these kind of rent-a-DDoS, DDoS-for-hire services. And this is where Gammell gets extremely unlucky.
One of the DDoS booter services Gammell registered with, vDOS, suffered a data breach in the summer of 2016.
And a security researcher, presumably working on cleaning up the DDoS incident, handed over all the DDoS logs to the FBI.
And here the FBI were able to establish that Gammell was likely behind a load of DDoS attacks on servers belonging to companies like Wells Fargo and JPMorgan Chase, and, the one he might regret most, the Minnesota Judicial Branch.
It starts with, "Dear colleagues, this is Mr. 'You underestimate your capabilities.'" And he ends with, "We will do much business. Thank you for your outstanding product. Smile emoji."
It's a really good story. So it's true that apparently in a number of his emails, Gammell claimed to be a member of Anonymous, the hacker collective. Our Mr. Gammell was arraigned in a Minnesota court this week. Washburn said that it suffered losses of over $15,000, which I was surprised wasn't missing a zero, but there—
But do you ever really get that? Oh, well, I'm very satisfied with my revenge. I'll now move on with my life. I think it's just—
Netsparker is a web application security scanner. It can quickly and automatically find the flaws in your website's security, so you can fix them before hackers exploit them.
You can try it out right now. Download a demo from www.netsparker.com/smashing. On with the show.
Could be a funny story, a book that they've read, something interesting, TV show, movie, record, app, website, podcast, whatever. It doesn't have to be security related necessarily.
It could be about absolutely anything at all. But my pick of the week this week is something a little bit useful.
And what are you going to do when you're no longer around? How are your loved ones going to handle it? Now, you might want all that information to be deleted once you've gone.
Or you might want to hand some of it over to someone else to look after.
Or you might be on a rubber dinghy in the middle of the Atlantic waiting for someone to pick you up for months on end.
But anyway, the thing is that for Google to decide your account is inactive, you have to decide what the timeout period is, the period of inactivity that must occur before it assumes you've gone a clunker.
So what you do in advance is you set up your trusted contacts.
You tell Google: these are the people I want you to alert if I haven't been active on my account.
And you have to give it a phone number as well for these people because they don't simply want to rely on those people's email addresses in case their email address is actually compromised.
And you can decide which bits of the Google universe you want to share.
So you might decide, well, I do want to share my contacts, but I don't want to share my email, for instance, right? If only to invite people to the memorial or something like that.
Because I don't know who they are.
And then suddenly 6 months later you get a phone call from Google saying, oh, we think Dave might be dead. That's, you know, that's going to kind of bring up all those bad memories.
But certainly I can imagine plenty of scenarios where people would want this information to be shared with their nearest and dearest.
And yes, maybe a little bit upset, but maybe after 3 months you'd be able to cope a little bit better with it.
So I think you basically set up a trusted person and then they have to request access and you could set a timeout to say, okay, so if my wife tries to access my LastPass and says it's an emergency, it sends me a message.
And if I don't respond within that time, it assumes that I'm dead or incapacitated or in a Turkish prison.
It then allows them access, rather than just waiting and then suddenly starting to spam people.
And then I set the time to say, okay, give me 24 hours, in case I'm not actually in prison.
I don't know why I assume you're going to die before me. I'm sure that's not the case.
So I thought I'll pop out into my shed and I will knock something up out of bits of wood that I have lying around. I have lots of kind of scraps and recycled, reclaimed things.
I thought that'll be it, make a nice little wine rack.
So I've been tinkering away, and one of the things it's reminded me of is I have some very, very lovely tools, thanks to my very, very kind in-laws, from a Canadian store called Lee Valley Tools.
And these guys make quite beautiful, you know, they're lovingly crafted, hardwearing, and very classical looking stuff.
They don't just make that kind of traditional classical stuff, though; they have lots of inventive gadgets and gizmos, with new ideas as well.
And they have kitchen things as well as kind of woodworking stuff, and they ship internationally.
So I, my pick of the week is for anybody that's interested in woodworking or has a friend that's interested in woodworking or just wants a stunning plane to sit on their mantelpiece as a decoration, you know, pop to Lee Valley.
They ship internationally. It's a great, great website.
Well, it's great to know, John, that while you're whittling away in your shed, you've got such fine equipment in your hands.
Now Christmas is coming and, you know, we're all thinking of what to get kids in our lives, but wouldn't it be great to give them something that they love that also teaches them something super valuable?
I bought these for my niece and nephew a few years ago, and they went down a storm. And what I'm talking about is Snap Circuits from Elenco Electronics.
These are color-coded electronic kits for kids aged 8 and above.
And with Snap Circuits, kids can build hundreds of different projects like mini motors and speakers and lamps and doorbells and burglar alarms and all sorts of stuff.
And it's really, really fun. The pieces are really good quality. They have a really satisfying snap when you put them into place.
And it's a really great way to introduce kids to electronics.
And you know what, you might just learn a little bit more if you're not really au fait with the ins and outs of electricity.
And if you go to the website, there's tons and tons of different Snap Circuits sets you can buy.
Now, the one thing I would warn is not to buy directly from the website, because they don't have HTTPS. So maybe go to Amazon or another trusted online retailer, or a physical shop, to purchase them.
But I promise you won't be disappointed in either one of your picks.
So you can actually test the circuitry using your iPhone or Android. It's pretty cool stuff.
You can also join us on Facebook at smashingsecurity.com/facebook, or you can buy some swag at smashingsecurity.com/store.
Thank you very much, John, for joining me and Carole Theriault.
Until next time, cheerio. Bye-bye.

So, how does it get consent? Does it message you out of the blue and say "bob" is trying to publish an intimate picture of you? Do they show you the picture? What if it's not you? Then it would be Facebook spreading it. What if you're a twin?
How about more articles that say… don't allow others/self to take intimate pictures of you. Unless they're actual non/never digital images stored in a controlled safe place [and that's not even 100% safe], assume it'll eventually fall in the wrong hands.
My guess is that Facebook doesn't bother asking for consent.
If it determines the image is in its database of dodgy images it simply won't allow it to be uploaded to its servers.
We therefore have to trust the Facebook image matching algorithms! Mmmmm, judging by the way they seem to lose perfectly legitimate comments at random from the site (of which I have proof), I don't think I'd trust much of their software capabilities.
There is another phrase used instead of "revenge porn" – "image based sexual abuse". I'm guessing the media isn't using it as the "porn" part likely sells more copies for them.
Don't take nude photos. Problem solved.
Yes, go ahead and upload your nude photos to Facebook. What can POSSIBLY go wrong?!
Did you see this thread by Alex Stamos, and the article linked there?
https://twitter.com/alexstamos/status/928740488395608065
It is something Facebook, and many organisations working with victims, have thought about very carefully. No solution is perfect, but if you're worried about your nudes being shared among your classmates, with your family or within your social circle, should you really worry about some anonymous person at Facebook also having access to your nudes?
Also, someone else made the good point that the use of "your" in many articles (including your blog) implies that this is something for the general public. It's for a very specific group of people, for whom this may be the least bad of all bad options.
Why does the photo need uploading?
Surely, like several firms on the market that use a Photoshop plugin to calculate a fingerprint, it could actually be done in the browser or by a local utility?