
Anti-porn “shameware” apps take a privacy pounding, is your image already being used by AI, and deepfake danger continues to deepen.
All this and much more is discussed in the latest edition of the award-winning “Smashing Security” podcast by cybersecurity veterans Graham Cluley and Carole Theriault, joined this week by Host Unknown’s Thom Langford.
This transcript was generated automatically, probably contains mistakes, and has not been manually verified.
It's interesting. The system claims to be able to distinguish between porn and non-porn images.
Right.
I had a professor who said that anything that was longer than it was wide was a phallic symbol. So, you know, just saying. I was like, a toaster?
Fridge?
Smashing Security.
Episode 291, Deepfake Dangers, AI Image Opt-out, and Controlling Your Urges with Carole Theriault and Graham Cluley. Hello, hello, and welcome to Smashing Security episode 291. My name's Graham Cluley.
And I'm Carole Theriault.
And Carole, who have we got in the hot seat joining us this week?
Well, it is the sometimes wonderful Thom Langford from Host Unknown. Welcome, Thom.
Sometimes? Well, let's hope I'm wonderful today then.
We'll see. How you doing, Thom?
I'm very good. It's always a pleasure to be here, I have to say.
You've been a busy boy though.
No, I know, I know. It's, we've been all over the place and it's, yeah, just busy, busy, busy. What can I say? I was photographing a wedding just the other week actually.
There you go, you see, multi-talented.
Well, you know, something like that.
One talent at least.
Yeah, that's right.
Let's not waste Busy Thom's time here; let's kick off and thank this week's sponsors, Bitwarden, Kolide, and the Cybersecurity Inside podcast. It's their support that helps us give you this show for free. Now, coming up in today's show, Graham, what have you got?
Oh, I'm going to be telling you how technology can help you get over your filthy little habit.
Oh God, why would I want to get over that?
It's too early for this. Thom, what about you?
I'm going to be talking about how the internet really doesn't forget, even when it shouldn't have remembered in the first place.
Intriguing.
And I'm talking deepfakes, Clint Eastwood style. The good, the bad, and the ugly. All this and much more coming up on this episode of Smashing Security.
Now, chums, chums, I've got a question for you, which is this. Do you have a porn problem? Thom, I'm looking at you principally.
I wouldn't call it a problem, more a hobby.
Pastime?
Yeah.
Right. Okay. All right. Well, some people think that they do have a problem. Maybe they think they spend too much time bashing the bishop, polishing the lighthouse, pulling the pud, tally-whacking. Have you got any favorite phrases you like to use, Thom?
Spanking the monkey.
We've been recording for not even 3 minutes.
Well, three minutes. I want to— I know this is a serious point. I've been reading Wired magazine, and they have investigated something called accountability apps, which I want to talk to you about today. They tell the story of a chap called Hao Wei Lin, who was attending an Evangelical Baptist church in the deep South of the US. And he had a problem, which was that he was going to regular weekly one-on-one sessions with the church leader to see how his faith was going. His particular concern was that he was gay, and he thought that he might get kicked out of the church. And he was reassured (this is a happy ending, if you like) to be told that God still loved him in spite of his, quote, "struggle with same-sex attraction."
And this is what the church leaders told him?
Yes, that's right. And that they would welcome him into the group, which is a marvellous thing. Good times, right? But of course, God alone can't fix the, quote, "problem" of being gay. And so at his next one-on-one session with the church leader (not that kind of session), Hao Wei Lin was told to install an app on his smartphone. So the church leader says, "Remember what we were talking about last week? I think you should install this app." It wasn't Pornhub or Grindr he was being told to install. He was being told to install an app called Covenant Eyes. Covenant Eyes is an app which monitors everything that users see and do on their smart devices.
Whoa.
Okay, no, no, no. What do you mean everything the user sees? It's not a Google Glass.
Well, no, it's not plugged into your spectacles.
It's plugged into something else.
It takes screenshots, at least one per minute.
What?
Of your activity.
At least one per minute?
Yes, yes.
You couldn't even get through to lunchtime without your battery dying, surely?
Well, it is apparently doing that. And then if you look at anything that the app considers questionable, it sends a report to the person that you have identified as your ally, your assistant.
Oh my God. This is not bossware. This is Godware, TM, Carole Theriault.
Well, before you start claiming the trademark, there are already apps like this. These apps already have names.
Oh my gosh.
Or accountability apps.
Shameware?
Shameware, that's right. The idea is that if you share with maybe a close friend, an ally, details as to your porn habit, or details as to how often you're looking up rude things on the internet, then that may encourage you to do it less. And the concern here is that this church is telling its congregation to install these apps, so that your activity is reported to someone who's your spiritual elder.
Let me just put this in a different context. Imagine this was a cupcake app, right?
Yes.
So, every time you eat a cupcake, it starts yelling at you, going, "You disgusting cake-eating fatty fat fat, blah blah," right? And that's supposed to stop me, as I cry, sobbing, stuffing icing into my face.
I saw you with that lemon drizzle. I know what was going on there.
You harlot.
So the idea is that after an evening's porn perusal, the Covenant Eyes app tells your friend what you've been up to. So it gives them a full report of where you've been, what you've searched for. One of your buddies, right? Carole, you're my bud bud, right?
Right.
So you would get a report from me as to where I've been on the internet, what I've been searching for.
Well, what if I don't want that?
Well, no, 'cause you've agreed to do this, because you're my bud bud. You're helping me with my problem.
Oh, right. I'm like your mentor, your guiding light to salvation.
You're someone I trust, and you will receive blurred screenshots of whatever I've been looking at. And you can then call me up and say, hey, hey, how's it going? How you doing this morning? Hey, everything all right over there? You have a good evening?
Can you send me the picture with it not blurred?
Well, sometimes the blurring doesn't obscure much, does it? It isn't always difficult to make out what's going on. I remember, Carole, long, long ago when we worked at Sophos, we did a press release about a piece of malware called Bad Bunny.
Yes, I remember Bad Bunny. We talked about this before.
Have we? Well, some listeners may not know about Bad Bunny. I know. But so maybe—
Maybe they shouldn't.
Would you like to describe the Bad Bunny malware?
No, not at all.
No.
No, no, no, no.
So Bad Bunny was a piece of malware which displayed an image, didn't it?
As a payload, yeah.
As a payload of two people leapfrogging, but the person behind—
Missed.
No, the person behind was dressed up in a full-size bunny rabbit outfit.
Oh, cute.
Yeah. Yeah, it was very cute. It was very cute. There was nothing rude about it at all. It was all in the mind of the person watching, thinking that Easter was coming. So it wasn't anything like that. But we pixelated out the eyes, didn't we, of the bunny? That was the thing.
And the human involved.
The recipient.
And the recipient as well. That's right, yes. So I'm just saying that sometimes, you know, the blurred image, you can still get the gist of what's going on.
Graham, you know, you're getting lewd.
Look, I'm sorry. It's just because Thom was coming on the show.
It's what's in the news. Let's face it. We are merely holding a mirror up to society.
Can I say, I think I've done very well not reporting at all on the chess scandal, which has been going on for the last few weeks.
I'm surprised you haven't, actually.
But I—
I thought that'd be right up your street, or passage, or whatever you like to call it.
I haven't given it a moment's thought. I didn't even know about it.
Oh, you know. Well, you're missing out on a lot of good gossip. Anyway—
It's great.
Anyway, anyway.
Makes me want to learn Morse code.
Anyway, Covenant Eyes, this app tells your friends what you've been up to, gives them this report.
Covenant Eyes. It's terrific, even the name of it.
It's awful. Well, I went to check out the website of Covenant Eyes. Turns out around about 1.5 million people have installed it. They've got this really professional promotional video featuring a sort of cut-price, Poundland Tom Cruise, who describes his porn habit and his cutesy wife and how their marriage has improved since he installed this.
Until he's regularly shamed into not doing anything sexual.
Either that or his wife now finally gets to see what he was looking at.
Anyway.
Every night.
Yeah, she gets a little report. Oh, okay, that's all right. Let's try that one.
The reason why these apps apparently appeal to people is this: why would anyone want to watch porn if they're going to have to talk to their parents or their church leader about it?
Isn't it rule 34 of the internet?
Oh, I don't know.
Maybe some people enjoy that.
Well, yeah, I know a few people who would probably watch more of it if they knew someone was watching them watch it.
Well, well, Thom, interesting you should say that, because I had you in mind. You can subscribe. You can say, I know someone, I have a friend who. You can subscribe for $16.99 per month, and there is a 30-day money-back guarantee.
You don't get any data back though.
Well, this is the thing. I don't know if you were to send an enormous amount of data, which I imagine you might, Thom. Maybe there should be a platinum plan or something.
You know what's also interesting is how they would— would they pixelate images? Is that— they wouldn't want to affect—
Yes, they blur out images. They do blur out images. So it's not a way of getting—
So they might get it wrong, right? Thom might be looking at just a huge donut, for example, and it misconstrues that.
Chocolate donut.
Anyway, moving on. The app, it's not spyware. I don't think it's really spyware. It isn't secretly spyware.
Godware, TM, Carole Theriault.
It's quite brazen about what it's aiming to do, which is to help people with their porn addiction.
I do have a serious question.
Yes, go ahead.
So, at some point, the images are captured and then blurred.
Yes, yes.
Where is that stored? Is there an unblurred version? Is that copyright? I could be looking at pictures of, I don't know, past girlfriends, past partners, etc.
Oh yeah, that's possible. Yeah.
So where and how is this data being secured? Is it obfuscated permanently? Is it reversible? Is there effectively now another storage area of pornography that could be used for nefarious purposes?
So it's interesting. The system claims to be able to distinguish between porn and non-porn images.
Right.
I had a professor who said that anything that was longer than it was wide was a phallic symbol. So, you know, just saying. I was like, a toaster?
Fridge?
Fridge. Anyway, so the images are uploaded to a server under the control of Covenant Eyes, and it claims that the images are blurred. Now, according to Wired's investigation, when its reporters set themselves up as users, they received only slightly blurred images.
So—
What? Slightly?
I love how vague this is.
Do you know, slightly blurred is not always good. There's a coffee shop I go to, and across the road, I think someone has basically their loo right in front of the window, with some very ineffective opaque plastic coating.
Oh, so you can work out the general—
Yeah. When they're, you know, when they're finished and stuff and yanking up their trousers. Yes.
So let me tell you about some other accountability or shameware apps. There's one called Fortify. And what you can do with this is log information about when you last masturbated, where you were when it happened, and, this was the one which intrigued me, what device you used. I mean... device?
I was entirely manual today.
I'm with Thom. I think people must be installing this because they want to. I just wonder what the app creators collect as well, right?
Ah, yes.
Yeah, exactly. Exactly.
Well, in the case of Fortify, it asks you how challenging your urges were during the day. So you can choose from a sliding scale from very easy to very difficult with different smiley faces.
Are you kidding me?
They should have a vinegar strokes face, surely.
I don't even know what that means.
He's younger than me.
I have no idea what it means. It's not about youth, dude.
I'll tell you later.
Please don't.
There's also, it gives you trophies and rewards and allows you to, quote, celebrate your victories. Now, what form the celebration takes if you've had a one-week streak of not stroking...
I don't know.
There's even in this Fortify app an SOS button. So I imagine—
Oh my God. This is just fun. Wouldn't you do this with your partner? You could do this if you were living long distance, right? You could have this little relationship thing and they're like, oh, oh, oh.
What does SOS stand for? Is it save our sperm?
Oh my God.
Stop our spaffing. Anyway, so Wired looked into this Fortify app as well. And they found that the form you use to log your masturbation stats with Fortify, it unfortunately has a Facebook tracking pixel on it.
Oh my God.
What? Explain that to me.
Oh my God.
What that means is that Facebook is able to track you as an individual and how often you go to Fortify and potentially what you might be entering on it. What could possibly go wrong with Facebook?
Changing your ads.
Facebook knowing how often you're wanking off. But there's another problem. It turned out—
Do you like more disinfectant wipes?
Easy access underwear.
Kleenex man-sized tissues.
Snap-on, snap-off trousers.
It turned out there's a bug in Fortify, which means it also passes your account password.
Shut up.
Fortify account password in plain text.
No!
To Facebook as well.
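For readers wondering what the tracking pixel Graham mentions actually is, mechanically it's just an invisible one-by-one image whose source points at Facebook's `facebook.com/tr` endpoint, so every page load fires a request that can be tied to your Facebook cookie. A minimal sketch of spotting one in a page's HTML (the sample markup and the helper name below are illustrative assumptions, not taken from Fortify's real pages):

```python
import re

# Hypothetical example of what a Meta/Facebook pixel looks like in page markup:
# an invisible 1x1 image whose src hits the facebook.com/tr endpoint.
page_html = """
<html><body>
  <h1>Log your progress</h1>
  <img height="1" width="1" style="display:none"
       src="https://www.facebook.com/tr?id=123456789&ev=PageView" />
</body></html>
"""

# Match an img src pointing at Facebook's pixel endpoint.
PIXEL_RE = re.compile(r"""src=["']https://www\.facebook\.com/tr[^"']*["']""")

def has_facebook_pixel(html: str) -> bool:
    """Rough check: does this HTML contain a Facebook tracking pixel?"""
    return bool(PIXEL_RE.search(html))

print(has_facebook_pixel(page_html))  # prints True
```

A regex check like this is only a rough screen; real audits (as Wired's was) inspect the actual network requests a page makes, which also catches pixels injected by JavaScript.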
Graham, why did you go first this week? You could have just invited me to do my story first. How is anyone supposed to follow this?
So there's a— I think there's a problem with these kinds of apps.
Do you? Well, you know, you're an expert, so I'm listening very carefully. Okay, so you think this isn't very good.
So I think it's your choice whether you want to install an app like this and get your mate, get your bud bud Carole, to call you up in the morning and say, "Hey, what did you get up to last night?" As if I don't know. But there's—
Got this detailed report.
From your bloodshot eyes and the—
The burst blood vessels in your cheeks.
And the hairy palms. It's definitely a problem if churches are telling their congregations, or indeed religious cults, anyone who's in a position of spiritual authority, because apparently hundreds of people at this particular church have installed the app. And get this: if you volunteer for the church, it is mandatory. You have to agree to install the app before you're allowed to work for the church or do any voluntary work.
The thing is, Graham, I think you're not thinking about this as a religious person.
Oh, okay.
So if you were Catholic, for instance, couldn't this app save you a lot of time at the confession?
I'll airdrop you my sins.
Right, exactly. This comes up in the Wired article. They say that the church elders, they like it.
Of course.
It makes it much easier to know what to talk about.
Exactly. You don't have to pussyfoot around.
There's huge amounts to laugh at here and there's huge amounts to ridicule here because the apps—
They haven't even started yet.
Yeah, exactly. But Carole, I think you were hinting at a very valid point. Any kind of addiction is going to be destructive.
Anything in excess.
Yeah. Anything excess is going to be damaging, et cetera, et cetera.
Binging on podcasts, cheese sandwiches, whatever it might be.
Exactly. But people do need help and all that sort of thing. The problem is when that help is not entirely focused on the individual and has ulterior motives or flawed frameworks or facilities in which it operates. And that's what I think we have here. For churches, it's about control. And from an app perspective, they're obviously very poorly built, you know, sending passwords in plain text. So it's actually very troubling.
What about pervy vicars, right?
Yes. And if you think about certain religious— Scientology. If you think about particular groups where they've collected information about their members in the past and they use it as blackmail.
Yeah, very troubling on many counts, preying on people when they are at their lowest and potentially seeking help.
Now, I've probably gone on too long about this already, but I just want to mention one thing from the security point of view, which is the latest development: following the Wired report, Google has removed Covenant Eyes from the Google Play store, because it was apparently exploiting Android's accessibility features in order to see what was going on on the screen. These are features which are built into the operating system to help people with poor eyesight, for instance: screen reading, those sorts of things. So there's a certain irony here, that features which are normally used legitimately to help someone with poor eyesight are now being used on someone who has maybe ended up with poor eyesight because of their wanking problem. It's essentially the same thing.
It's still on iOS. I'm sorry, there's somebody come to my door.
Professional.
Amazon delivery.
That was typical. Right.
Is that the Kleenex from Facebook that's just arrived? Thom, what have you got for us this week?
Well, a little bit more serious, although it does still cover pornography because, well, it's a hobby.
What is going on this week?
No, no, no, no, no. You know, I'm taking a very serious and sober view of this. So, we all know that AI is a big thing now. It's certainly hit the media recently with these AI image-generating tools like DALL-E and Stable Diffusion. In fact, you were talking about it fairly recently. You type in something, and up comes an image.
Yeah.
And they're powered by massive datasets of images that are scraped from the internet. So, you know, you set your little algorithm off for your AI, and it scrapes up every image it can, and every bit of context it can about that image, so that when you type in something like, I don't know, "Thom doing a podcast", it pulls together something that vaguely resembles Thom doing a podcast, or whatever that might be, according to the datasets.
Up comes an image of a dumpster fire.
Yes, exactly.
Yeah. I've heard your podcast. Yeah.
Floating down a river. Yeah.
The problem is, what if one of those images is of you? There's actually no easy way for you to opt your image out from these AI datasets that are being used.
So what you're saying is that the images are scraped from the internet and you haven't been asked permission for that?
Yes.
Right.
To be used for potentially commercial purposes. So this is actually from a vice.com article. It talks about how, in one example, sensitive images can even end up powering these AI tools. For instance, there was one individual who had a particularly sensitive photograph taken of her by her medical practitioner for the purposes of a procedure she was undergoing. And this image, which was guaranteed to be used only for the purposes of the procedure and not shared elsewhere, was found in a dataset belonging to an AI company called LAION, L-A-I-O-N, which was used to train Stable Diffusion and Google's Imagen. So you've got a company that builds huge datasets, and those datasets are then shared with various other AI companies, which allows them to then generate their images.
But surely that sounds like a surgeon or a doctor has been careless with the privacy of her photograph because they should have—
Careless?
Did they post it up on the internet somewhere and leave it publicly accessible?
Well, exactly. This is where it gets very, very difficult, because the traceability of that is poor: companies are acquiring large datasets of images without effectively knowing their provenance. They could have been stolen, they could have been illegally obtained one way or another and made available. And nobody is actually claiming any kind of responsibility for this as a result.
So you could imagine, for example, if you were, I don't know, say a plastic surgeon, and you had all these close-up images of body parts, it's basically pseudo-anonymized or basically anonymized because you don't have headshots that go along with it. So do you care?
Well, unless there is a headshot, right? You don't know, but—
Yeah, yeah, you don't know, exactly. But I guess if you're not recognizable, do you care? And I think the answer is yes, because we have no idea how much of this information will be used in the future.
Or if you're not recognizable in that particular crop of that picture, but there is a larger image of you that is identifiable. But there's other elements to this as well. So Motherboard, who are doing the research into this, has also found that some of the worst images that have ever been posted online are also included in the dataset, including ISIS executing people, real nudes that were hacked from celebrities' phones, all that sort of stuff. So stuff that was very clearly illegal or grossly offensive that have no place on the internet as such or in the public domain are being used to train these AIs.
Is that because it's just being scraped without any regard to what is in the content?
I think that's exactly it. And companies will probably say, "Hey, we are just gathering what's out there." But there's no accountability for the content as a result. You know, there is a moral imperative here. So, you know, LAION doesn't even go into detail about the not-safe-for-work and violent images that appear in the dataset. It does say that it does not contain images that may be disturbing to viewers (which is untrue), but that links in the dataset can lead to images that are disturbing or discomforting, depending on the filter or search method employed. The FAQ goes on to say: we cannot act on data that are not under our control, for example, past releases that circulate via torrents. This sentence could potentially apply to something such as Scarlett Johansson's leaked nudes, which already existed on the internet. And basically it absolves the dataset creators of responsibility.
There's gotta be some money being made. I think they're scraping the data and they're providing those datasets to companies like Google, et cetera.
Yeah. Yeah.
You'd expect Google to actually look at this and go, whoa, right? Let's not use this dataset.
What? Do no evil, Google? Sure.
I'm calling on them right now on a very, very influential show.
It sounds like Google's thing entirely. It's the way they've always operated, isn't it? Scoop up whatever they like.
And the article goes on to talk about how responsibility has to sit with the developers of the AI and machine learning tools, and with the people who are actually creating these datasets. It shouldn't sit with the individual whose photos are in there.
No.
It should be on the companies that are basically scraping the data, making money. They need to make sure that the data they're gathering is valid, is legal, etc.
Ethical.
Ethical, absolutely. Of course they won't.
No, they're never going to do that. They're never going to do that unless someone comes at them with a great big cricket bat.
Yeah, or a regulation.
Yeah. The best way to police this, or to make this happen, is to put the responsibility on the people who are actually using these images. And the article goes on to quote somebody saying that algorithmic destruction, that is, being forced to delete models trained on ill-gotten data, is an actual deterrent, because that's going to cost money and time.
Surely they've got a backup, haven't they?
You would like to think so.
Someone erases the awesome code.
There is, however, a little ray of hope for us.
Oh yes.
An artist and musician, Holly Herndon, has created a website that makes it easy for people to search if their images have been used to train AI. This website is called Have I Been Trained? The link's in the show notes. Very simple. You type in words, your name, you can upload a photo, et cetera, et cetera. I gave it a try. I uploaded one of my sort of publicity photos and found there's an awful lot of bald, bearded white men out in the world, is all I can say, who all look very, very similar to me.
They could be the stunt double in Host Unknown: The Movie, couldn't they?
They could. Exactly.
If someone was filming a sex scene with you and didn't want it to actually be you, obviously, Thom, they could ask for one of these people.
They couldn't afford my moneymaker. Oh my.
You know what? Now, I've had a go at 'Have I Been Trained?' I haven't uploaded a photograph, but I entered my name, to see what would happen.
Yeah.
I entered your name as well.
Oh, did you? Now, I don't think they look particularly like me. There's one who looks like a jockey, and a couple who look like murderers.
Yeah, I had a whole bunch that looked very odd, I have to say, when I put my name in.
So are LAION making money out of these? They all look vaguely—
They all have very crazy eyebrows though, Graham.
The AI algorithm starts with the eyebrows, works out from there.
They're scraping this data. Carole, what have you got for us this week?
Okay, so meet Patrick Hillmann. He is the chief communications officer at Binance, the world's largest crypto exchange, with $25 billion in volume, says CSO Online in an article from about two days ago. And he starts his day like any other, you know: butt scratch, five-mile run, a kale and goji berry smoothie, and a scroll through the daily deluge of email.
He's not running the Fortify app, clearly, or Covenant Eyes.
I made three of those things up. So there he is reviewing his email, and he spots messages from clients about a recent video call with investors. And there are six of these emails. One of them says, thanks for the investment opportunity. Another one says, I have some concerns about your investment advice. Another one complains that the video quality wasn't very good, and one even asks outright, "Can you confirm the Zoom call we had on Thursday was you?" No way.
Way!
So according to CSO Online, this is where Patrick Hillman got that sinking feeling in his stomach that someone had deepfaked his image and voice well enough to hold a 20-minute investment Zoom call trying to convince his company's clients to turn over their bitcoin for a scammy investment.
It's happened. It's happened.
Right? So he says, quote, the client I was able to connect with shared with me links to faked LinkedIn and Telegram profiles claiming to be me, inviting them to various meetings to talk about different listing opportunities. Then the criminals used a convincing-looking holograph of me in Zoom calls to try and scam several reps of legitimate cryptocurrency projects.
Holograph. Is this guy a fan of Star Trek: The Next Generation? What?
Now, there are a few different approaches, right, for how deepfakes are created, but many deepfakes use generative adversarial networks, okay, or GANs. G-A-Ns. I don't know how you say the acronym. But this is basically where two machine learning models duke it out, right? One model trains on a dataset and then creates a video forgery, while the other attempts to detect the forgery. And the forger keeps creating fakes until the other model can't detect them.
Okay. Yeah.
Right? And so, of course, the larger the set of training data, the easier it is for the forger to create a believable deepfake. And this is a massive challenge, says Eric Horvitz of Microsoft. This is in a brand new paper on the subject, link in the show notes. So, over time, the generator learns to fool the detector. And with this process, he says, at the foundation of deepfakes, neither pattern recognition techniques nor humans will be able to reliably recognize deepfakes. So, thoughts?
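The forger-versus-detector loop Carole describes can be sketched with a toy, one-dimensional GAN. Everything below is an illustrative assumption for the sketch (the Gaussian "real data", the linear generator, the logistic discriminator, and the hyperparameters), not anything from the paper she cites; real deepfake GANs use deep networks over images, but the adversarial training loop has the same shape:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# "Real data": samples from N(4, 1). The generator must learn to mimic this.
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Generator (the forger): g(z) = a*z + b, with noise z ~ N(0, 1).
a, b = 1.0, 0.0
# Discriminator (the detector): D(x) = sigmoid(w*x + c), P(x is real).
w, c = 0.1, 0.0

lr, batch = 0.05, 64

for step in range(3000):
    # --- Detector update: push D(real) toward 1 and D(fake) toward 0 ---
    x_real = real_batch(batch)
    z = rng.normal(0.0, 1.0, batch)
    x_fake = a * z + b
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    # Cross-entropy gradients w.r.t. the pre-activation s = w*x + c:
    # real examples contribute -(1 - D), fake examples contribute +D.
    gs_real = -(1.0 - d_real)
    gs_fake = d_fake
    w -= lr * ((gs_real * x_real).mean() + (gs_fake * x_fake).mean())
    c -= lr * (gs_real.mean() + gs_fake.mean())

    # --- Forger update: push D(fake) toward 1 (non-saturating loss) ---
    z = rng.normal(0.0, 1.0, batch)
    x_fake = a * z + b
    d_fake = sigmoid(w * x_fake + c)
    gx = -(1.0 - d_fake) * w        # dLoss/dx_fake, backpropagated by hand
    a -= lr * (gx * z).mean()
    b -= lr * gx.mean()

fakes = a * rng.normal(0.0, 1.0, 10000) + b
# With these settings the forger's output mean typically ends up near the
# real mean of 4, i.e. the detector can no longer tell the samples apart.
print(f"mean of generated samples: {fakes.mean():.2f}")
```

The point of the sketch is the alternation: each detector improvement hands the forger a sharper gradient to train against, which is exactly why, as Horvitz warns, a sufficiently trained generator eventually fools both detectors and humans.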
Well, I'm very happy to be corrected if I'm wrong, but this is the first time I've heard of a video deepfake being used to run a con in reality.
Oh no, no, there was a big one. A bank, was it in Austria, Graham? Where they—
That was a voice one, though, wasn't it?
No, video. And— Oh, actually, no, they weren't using a— sorry, a quote, holograph. The guy was masked, actually. That's right.
Oh, the one involving—
The guy was masked.
It was basically old school deepfake.
Yeah, there was one involving the French Defence Minister. They created a set to make it look like it was inside the French—
But it was an actual physical mask, wasn't it?
Yeah, it was a physical mask, that's right.
Yeah, it was Mission: Impossible style, you know?
Yeah, yeah, yeah.
That kind of thing.
But this, I think, is the first time that I've heard of this actually being used in the wild, as it were. Because a couple of years ago, end of 2020, you know when you get those things of, what predictions have you got for the next year?
And so I said back then that a video deepfake would be used for running a con. And of course, 2021 went by and it didn't happen. But now it has. It's a little bit behind schedule.
You're a soothsayer, Thom.
Well, something like that.
We'll ask you for the lottery numbers later.
Yeah, they won't be right for this Wednesday, but maybe for about 7 Wednesdays in advance.
So it's not that this chap from Binance actually went out, had a few lagers, got a bit crazy, went on Zoom, and gave out some bad financial advice.
Do you know what? That was my initial thought. He went on a bender, got some gear up his hooter and he was off.
Although I suspect this argument will be used in the very near future when someone does something fucking stupid online, right?
Yeah, yeah. It wasn't me, it was a deepfake.
Exactly.
You were in my office.
That'd be Shaggy's next song. It wasn't me, it was a deepfake.
Now, obviously, or maybe not obviously, but obviously to me, 'cause I'm very, very smart. No, no, but obviously audio deepfakes would be easier to create, wouldn't they?
Yes, yes.
Because you only have the audio to worry about, not having to match up the video as well, and all that kind of stuff. So it could be fairly easy to take someone like you, Graham, right? You're a podcaster, or Thom, and I could get so much audio from you that I could probably get an audio deepfake to convincingly have you sing I'm a Little Teapot, short and stout.
Yeah.
So the idea is how would we be able to tell?
Yeah.
I was just on the MIT deepfake site, because they have a few tests there, link in the show notes again, and I was feeling quite smug, because I was like, oh, of course that's not Trump, and oh, I can see that. No, no, no, no, no. But I got screwed on one. I literally... yeah. So I was thinking about this, but according to The Conversation, researchers at the University of Florida developed a technique that measures the acoustic and fluid dynamic differences, I know it sounds complicated, between voice samples created organically by human speakers and those generated synthetically by computers.
Yes, that's going to be my suggestion of how to tackle the problem. Yes, it seems fairly straightforward.
Fluid dynamics, something, something differences.
Okay, so this group hypothesized that deepfake audio samples would not be constrained by the same anatomical limitations that us humans have. And if machines could detect that difference, could that not be very helpful in detecting fake messages or fake voice alerts? And they were right. The testing results not only confirmed their hypothesis, but revealed something super interesting. It was common for deepfake audio to result in vocal tracts with the same relative diameter and consistency as a drinking straw, in contrast to human vocal tracts. So they were much more limited and not as variable in shape.
Oh, okay.
Right? So by estimating the anatomy responsible for creating the observed speech, their findings suggest that it's very possible to identify whether the audio was generated by a person or a computer.
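For the curious, the anatomical-plausibility idea behind that research can be sketched in a few lines of Python. This is a toy illustration only: it uses the textbook quarter-wavelength resonator model rather than the Florida team's actual fluid-dynamics pipeline, and the formant value and the 12-20 cm "plausible" range are illustrative assumptions, not their published thresholds.

```python
# Toy illustration: estimate an "effective vocal tract length" from the
# first formant frequency using the quarter-wavelength resonator model
# (a uniform tube, closed at the glottis, open at the lips).
# This is NOT the University of Florida detector, just the acoustic
# intuition behind it: implausible tract geometries flag synthetic audio.

SPEED_OF_SOUND = 343.0  # metres per second, in air at about 20 degrees C

def vocal_tract_length(f1_hz: float) -> float:
    """For a quarter-wavelength resonator, F1 = c / (4 * L), so L = c / (4 * F1)."""
    if f1_hz <= 0:
        raise ValueError("formant frequency must be positive")
    return SPEED_OF_SOUND / (4.0 * f1_hz)

def plausible_human_tract(f1_hz: float,
                          min_m: float = 0.12,
                          max_m: float = 0.20) -> bool:
    """Adult vocal tracts are roughly 12-20 cm long; an implied length far
    outside that range (the drinking-straw case) suggests synthetic audio."""
    return min_m <= vocal_tract_length(f1_hz) <= max_m

# A neutral, schwa-like vowel with F1 around 500 Hz implies L = 343/2000, about 17 cm.
print(round(vocal_tract_length(500.0), 4))   # 0.1715
print(plausible_human_tract(500.0))          # True
print(plausible_human_tract(5000.0))         # implausibly short tube: False
```

The real system estimates the whole tract shape over the course of an utterance, not a single length from one formant, but the principle is the same: speech that could only have come from impossible anatomy is probably not speech at all.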
That's interesting. So at the moment, yes, quite exactly. Because it could be introduced later. So for instance, if I understand correctly, a very simple characteristic would be, here I am speaking quite closely to the microphone, right? And if I spoke like this the entire time, that would be unusual. But maybe in a normal human conversation, sometimes you would move further away.
Yeah, exactly like Thom's been doing this whole episode. Doesn't sit still for one second.
I never sit still, Thom.
Feels like he's on a unicycle.
Oh, the chair is creaking, creaking. As long as you're not on a ping pong stick, that's all we care about.
Ping pong stick.
Yes.
Graham. So, so these—
What?
These deepfakes, they may become the new ransomware of tomorrow, which is very scary for a lot of us. And we've been talking about it for a while, but now we're seeing examples of it. According to MIT, these are the things that you want to look for. We've talked about this before, but it's worth just going through really quickly. So you want to look at cheeks and foreheads. The wrinkles in these areas are often non-existent. So yay for wrinklies, and screw poor plastic surgery people!
Are we back to Graham's story?
Oh my goodness.
I'm not even— I'm ignoring that. So shadows in the areas around the eyes, nose, and open mouth. Shadows would often be poorly formed. Glasses. So position and angle of any lighting glare in the lenses, right? Should shift correctly as the head moves.
No one's going to do this.
Yes, they will. They're gonna have to. 'Cause there's no technology at the moment that we can really reliably—
I've got another answer. I've got another answer, right? Deepfakes are clearly going to become a bigger and bigger problem, which means we should stop trusting people that we communicate with via computers. We should—
Yes, so don't listen to a word that we say, listeners.
Get back in the office. Get back in the office. Yes, like Jacob Rees-Mogg would recommend. I think we have to start having face-to-face meetings again. Holograph.
Carry a baguette and whack them, so you check that they're solid as well. That is, for now, the best way to conduct any important meeting.
See, I hate this, because I actually don't really like looking at people very much, right? Like, and I—
I don't like looking at you either, Carole.
I have a very old television, and part of my love for my old television is it's not high-def. So I don't have to look at people's pores, you know, or war paint on like US newscasters. And now I've got to study their faces just to make sure they're not lying to me and they're full of shit. Anyway, so deepfakes, they're a-coming.
Cybersecurity continues to be a hot topic. It's relevant for all of us, no matter what field we are in. And the Cybersecurity Inside podcast is a fantastic resource to stay up to date on the latest news and trends, whether you're a security expert or just want to know more about the subject. The Cybersecurity Inside podcast is hosted by Tom Garrison and Camille Morhardt, and they and industry guests make it easy to understand and learn more about today's most important security topics. Recent episodes have included the ethics of AI and machine consciousness, where we're headed with the cloud, how small businesses can get access to cybersecurity resources, ransomware viruses, and so much more. Every episode, you will walk away smarter about cybersecurity and have fun while you're at it. So what are you waiting for? Check out cybersecurityinside.com/smashing to listen to the latest episode. That's cybersecurityinside.com/smashing, or search for Cybersecurity Inside wherever you listen to podcasts. And thanks to them for supporting the show.
October is Cybersecurity Awareness Month, and Bitwarden would like to remind everyone about key actions that the US Federal Agency for Cybersecurity recommends that you take. Number 1, use strong passwords. Bitwarden can generate and store strong passwords for you. And 2, enable multifactor authentication on all your accounts, including your password manager. And of course, it's recommended that you keep your software up to date and take steps to recognize and report phishing. Bitwarden supports security for all with fully featured free accounts available to everyone. This Cybersecurity Awareness Month, protect yourself and help protect loved ones by educating them on password security and starting up a free Bitwarden account today at bitwarden.com/SmashingSecurity. That's bitwarden.com/smashing. And thanks to Bitwarden for sponsoring the show.
Kolide sends employees important, timely, and relevant security recommendations for their Linux, Mac, and Windows devices right inside Slack. Kolide is perfect for organizations that care deeply about compliance and security, but don't want to get there by locking down devices to the point where they become unusable. So instead of frustrating your employees, Kolide educates them about security and device management while directing them to fix important problems. Sign up today by visiting smashingsecurity.com/kolide. That's smashingsecurity.com/kolide. Enter your email when prompted, and you will receive a free Kolide goodie bag after your trial activates. You can try Kolide with all of its features on an unlimited number of devices for free, no credit card required. Try it out at smashingsecurity.com/kolide. That's smashingsecurity.com/kolide. And thanks to Kolide for supporting the show.
And welcome back. Can you join us at our favorite part of the show, the part of the show that we like to call Pick of the Week.
Pick of the Week. Pick of the Week.
Pick of the Week is the part of the show where everyone chooses something they like. Could be a funny story, a book they've read, a TV show, a record, a movie, a podcast, a website, an app, whatever they wish. Doesn't have to be security-related necessarily.
Better not be.
Well, my Pick of the Week this week is not security-related. My Pick of the Week this week is The Joy of Sets.
Of what?
The Joy of Sets. Are you familiar with this?
Did you design this entire show for Thom Langford?
So, yes, I did actually.
You did?
So, Thom, be honoured, because he didn't think about me or what I think about any of this.
Oh, Carole, you may have misheard me. I said sets.
S-E-T. Like badger sets?
Not quite badger sets.
Okay.
As in the TV, television sets, because The Joy of Sets, link in the show notes, is an archive the BBC have put together of more than 100 empty sets from different shows they've had over the years. There's no actors getting in the way. It's just the beautiful set of, for instance, the Liberator from Blake's 7.
I've just opened that one. And there's 7 people in it.
Oh, are there? You see?
But you're right. They're virtually all, every other one is virtually empty.
There you go. There you go. And so it's kind of curious, and it's a wonderful sort of— It's brilliant. And you know how you can use these, Thom, don't you? Because you love a green screen.
Yeah.
You could put yourself on the Multi-Coloured Swap Shop in place of Noel Edmonds, Top of the Pops, Steptoe and Son, any of these things, Fawlty Towers. Anyway, I thought it's marvellous. I like it very much.
It's brilliant. I love it.
This is The Joy of Sets.
You see, I didn't grow up here, so a lot of these shows were before my time.
It's very nostalgic, but if you like old TV shows, Yes Minister, Grange Hill, Only Fools and Horses, you're going to like The Joy of Sets.
The Good Life.
Oh, Barbara. Oh, she's so sexy.
Yes.
Wasn't she, eh?
Yeah, let's objectify them now. Right. Okay.
My pick of the week is the Steam Deck.
What's that?
So there was a company that came along back in 1996 called Valve, Valve Software. And they produced a series of games called Half-Life. Initially based on the Quake engine, wildly successful games. They've spawned multiple sequels, huge amounts of community love, et cetera. Valve evolved from just a software company into a hardware company, into a services company. They launched, I think it was in 2002, 2003, something like that, they launched the thing called Steam, which is a gaming platform where you buy your games online, you store them in their library, or your library. You have the software on your computer and you can download and delete and upload, and it will store all of your high scores, and it's where you download your patches, and it's very community-based. It's great, it's fantastic. Primarily for Windows PC, but there is some support for Mac games on there. So I'm a gamer from the days of yore, from the '90s. I cut my teeth on Doom and Doom 2 and Quake, et cetera. Had the regular LAN parties. And I used to go to Steam and, you know, download some old games that I used to play. They have plenty of new stuff there as well. The problem being, since I'm an Apple person all round, I was quite limited in what I could actually find to play, until the Steam Deck came along. Now, the Steam Deck is a handheld console, slightly larger than the Nintendo Switch. So a big screen in the middle with, sort of, you know, connected joypads on the side. It's effectively a self-contained battery-powered computer. It runs SteamOS, which is a form of Linux, on an AMD processor. Wi-Fi, et cetera, et cetera. And it connects directly to Steam, and you can download virtually all of it, I think it's something like 90+% of Steam's catalog of games, and run them all from a battery-powered handheld, or you plug in a little USB-C hub and you put it up onto your TV or your screen and run them off there.
You can use a Bluetooth controller, or even, if you're old school with Quake and stuff like that, a keyboard and mouse. And to top it all, you can also go into desktop mode and you've got a standard Linux desktop, which you can then download and run all the productivity tools on. So if you're out and about and suddenly you're in a pinch and you need to make a Teams call or jump onto the internet for something, you can still do that. There's trackpads on either side that you can use as a mouse if you need, there's an on-screen keyboard. It's fabulous. I've had it about 4 days and I love it. I love it.
It's a little— No.
Oh, Thom.
It's a little bit like Fleabag in terms of—
Oh, Thom.
You love a gadget, don't you? I love a gadget. I've downloaded the entire Myst series. Oh yeah, Myst. Yeah, the entire Myst series. I've got Quake, Quake II, Quake III Arena, Quake Champions. Thom, I've got a question for you. When do you find time to masturbate?
I know, I know.
Oh my God.
Well, that's when the web browser comes in handy on it.
Can we? This sounds like a very impressive gadget. So it plays the games well. It's not cheap.
It's about £500 for the top end one. Yeah.
£350 for the low end.
Yeah.
But I guess the thing is that this means all of your games are now portable, and it's more portable than a laptop, for instance, and it's more set up for gaming.
It's perfect for somebody who's either trying to relive their youth a little bit, you know, and can access all these old games, or somebody who's just a casual, every now and then gamer, doesn't want to invest in a big old gaming rig or anything like that.
Are there free games on Steam as well? Because I am a cheapskate, you see, having spent £400 or whatever it is on this device, I probably wouldn't be happy actually spending any money on the games.
There's loads of free games. And also some of the older games, the ones that you and I grew up with, are £3, £4.
So, Thom, you've had this four days. That's what makes me a little bit nervous, because I think you do get very excited about your toys.
You don't have to buy it right now.
Just wait a few months.
Well, exactly. That's what I want. I want Thom to come back in a few months and tell us if he's still playing on this or whether the— what are they? The joystick knob. Whether that's—
Yeah, whether the knob's fallen off.
Whether your knob's fallen off. That kind of thing. That's what I want to know.
You've heard this episode is exclusive. Excruciating.
Carole, what's your pick of the week?
So, my pick of the week is a TV show, a series called Am I Being Unreasonable? And it's flipping great. It stars Daisy May Cooper. She sports this massive shearling taupe coat and these '70s shades and these bedazzled bootcut leggings. And she wanders around this English village, very grumpily and friendless. That's how it opens up. Graham, I got you to watch it, and I think you hoovered up the whole series as well.
You did. Yeah. I wasn't sure after the first episode. I was a bit confused as to what was going on because it is very twisty-turny, very twisty-turny. But by about episode three, I was hooked. And then episode four, oh my goodness, the whole world has changed. What is going on now? This is on BBC iPlayer, Am I Being Unreasonable? And it's very good.
What's kind of cool about it is you watch the first episode and you think you get the gist. You get a few little glimpses of what you think might be going on. And for people like me, you're like, oh, I get it. I get it. I know what's happening. I've got it. Got it. Got it. And she totally just veers. You'll get it categorically wrong.
Right? Multiple times.
Multiple times. And it's a bit like Murder, She Wrote, where it's really not obvious.
It's nothing like Murder, She Wrote.
Don't listen to him.
It's nothing like Murder, She Wrote. I'll tell you what it's a little bit like. It's a little—
Columbo.
Yeah, it is.
It's funny, but there's also a lot of darkness there. And—
That's what I compare it to as well. It's a comedy thriller, they call it, but it is noir. Interestingly, The Guardian only gave it three out of five stars, saying it doesn't cohere. And I don't agree. I thought it was really fresh, very funny, quirky, surprising, and very charming. I was like, again, BBC, very cute.
The one I was really impressed with was an actor called Lenny Rush, who plays her son. And I thought he stole just about every scene he was in. Because usually, when I see a child in a TV show, I kind of think, "Oh, God, they're going to be painful." He was so funny. No, totally agree. But I think it's the relationship between the three, you know, between all of the actors that really makes it, because they all are strong. But somehow, there's a little fizz of magic between all of them and how it works. 'Cause she's a tall lady, right? Daisy May Cooper is no wallflower.
I'm gonna look at that tonight, I think.
I'd recommend it as well. I'd recommend if you're not sure after the first episode, keep going.
Keep going.
Yeah, if you're exactly like Graham.
If you think you've worked it out, think, "Oh, I know what this is." Just younger and better looking. It will be rewarding, I think.
That's called Am I Being Unreasonable? on BBC iPlayer. It may be other places. Well, I don't know. And that is my pick of the week.
Great pick of the week, Carole. And that just about wraps up the show for this week. Thom, I'm sure lots of our listeners would love to follow you online and find out what you're up to. What's the best way for folks to do that?
Oh, I'm on Twitter at @ThomLangford. That's Thom with an H. You can also get me on my other day job as a podcast host of Host Unknown at podcast.hostunknown.tv.
Terrific. And you can follow us on Twitter @smashinsecurity. No G, Twitter wouldn't allow us to have a G. And we also have a Smashing Security subreddit. And to ensure you never miss another episode, please follow Smashing Security in your favorite podcast app, such as Apple Podcasts, Spotify, and Google Podcasts.
And deep, deep thank yous to our episode sponsors Bitwarden, Kolide, and the Cybersecurity Inside podcast. And of course, to our wonderful Patreon community. It's thanks to them all that this show is free. For episode show notes, sponsorship info, guest list, and the entire back catalog of more than 290 episodes, check out smashingsecurity.com.
Until next time, cheerio. Bye-bye.
Bye.
I'm recording locally.
Yep, same here. I am recording locally. I can see the little red line going across.
What do you use, Audio Hijack?
GarageBand.
Oh, hello.
Who the fuck is that?
That was me.
Who's the amateur on the show?
Who's the amateur on the show?
The thing is, it comes up on my phone, on my—
Have you heard of this feature called Do Not Disturb?
Yeah, alright, alright, chill chick. It comes up on my bloody— Bloody— Oh, there it is. God, I can't remember.
This is what happens when you invite geriatrics on the show.
Computer! That's it.
Oh, that was the word. The word was computer. We were all wondering.
Bloody hell. It's what happens when I get invited on other people's podcasts. I start to lose all my wordy things.
Gorgeous listeners, and this is how we are starting this show. So wish us luck.
Hosts:
Graham Cluley
Carole Theriault
Guest:
Thom Langford
Episode links:
- The Ungodly Surveillance of Anti-Porn ‘Shameware’ Apps – WIRED.
- Covenant Eyes.
- Sick and tired of trying to quit porn? You’re not alone – Covenant Eyes promotional video.
- Fortify.
- AI Is Probably Using Your Images and It’s Not Easy to Opt Out – Vice.
- ISIS Executions and Non-Consensual Porn Are Powering AI Art – Vice.
- Have I been trained?
- The Deepfake Danger: When It Wasn’t You On That Zoom Call – CSO Online.
- Deepfake Audio Has A Tell – Researchers Use Fluid Dynamics To Spot Artificial Imposter Voices – The Conversation.
- DeePhy: On Deepfake Phylogeny – Cornell University.
- On The Horizon: Interactive And Compositional Deepfakes – Microsoft.
- Detect DeepFakes: How to counteract misinformation created by AI – MIT Media Lab.
- New Deepfake Threats Loom, Says Microsoft’s Chief Science Officer – Venture Beat.
- The Joy of Sets – BBC Archive.
- Steam Deck.
- Am I Being Unreasonable? – BBC iPlayer.
- Smashing Security merchandise (t-shirts, mugs, stickers and stuff)
Sponsored by:
- Bitwarden – Password security you can trust. Bitwarden is an open source password manager trusted by millions of individuals, teams, and organizations worldwide for secure password storage and sharing.
- Kolide – the SaaS app that sends employees important, timely, and relevant security recommendations concerning their Mac, Windows, and Linux devices, right inside Slack.
- The Cyber Security Inside podcast – Relevant cybersecurity topics in clear, easy-to-understand language. With every episode, you’ll walk away smarter about cybersecurity, and have fun while you’re at it!
Support the show:
You can help the podcast by telling your friends and colleagues about “Smashing Security”, and leaving us a review on Apple Podcasts or Podchaser.
Become a Patreon supporter for ad-free episodes and our early-release feed!
Follow us:
Follow the show on Bluesky at @smashingsecurity.com, or on the Smashing Security subreddit, or visit our website for more episodes.
Thanks:
Theme tune: “Vinyl Memories” by Mikael Manvelyan.
Assorted sound effects: AudioBlocks.

