What Happens When Big Tech Says No to Government?
In this episode of Cyber Made Human, we sat down with Paul Ducklin to discuss government influence on big tech; specifically, Apple’s move to pull its Advanced Data Protection encryption for iCloud from the UK in response to the government’s new legislation. We also look at how the technological policies of today will affect the next few years.
You can watch the full episode on our YouTube and Spotify pages. Check out the full episode transcript below to learn all about this topic and our discussion on it.
Disclaimer: This transcript is an outline of the dialogue exchanged in this episode and may therefore contain inconsistencies with the video version.
Our book recommendations for this episode were:
Alice: Fragile Beauty: The Elton John and David Furnish Photography Collection
To discover more book recommendations, check out the Cyber Made Human Bookshelf
What Happens When Big Tech Says No to Government Transcript
Alice: Paul Ducklin, welcome back to Cyber Made Human.
Paul: Well, thank you for having me back, Alice. It means I probably didn’t say anything regrettable last time I was on.
Alice: No, not too bad. So things have changed slightly. We’ve got people actually sitting behind the camera this time.
Paul: Yes. Not recording and then realising one of them was focusing on the wrong place.
Alice: That didn’t happen.
Paul: No, it didn’t. But when that happened, we did record it twice. It’s always better the second time, so that’s the excuse.
Alice: I don’t recall. I think we got it. We got it sorted the first time. It was perfect.
Paul: It’s the content that matters.
Alice: Absolutely. And it was a very popular episode, actually.
Lots of people have referenced it as their favourite. Welcome back to series two.
Paul: Thank you.
Alice: So, for anyone who doesn’t know yet who’s been living under a rock, you’ve been working in cybersecurity now for how many years?
Paul: Let’s just say it’s three decades.
Alice: You and I met at Sophos, where we both worked on the Naked Security team.
Paul: Am I allowed to say it again? The award-winning Naked Security team.
Alice: It was award-winning. A very popular podcast. We do say so ourselves.
Paul: And a very special kind of content, because it wasn’t a sales spiel, which so much cybersecurity content is these days. It was all about getting everybody to lift their game a little bit.
Which, in my opinion, pushes back against the crooks much more strongly than if one group like Google or Microsoft or Amazon lifts the bar a lot in their own sphere. That’s not as good as if we all do a little bit extra, because if we all do a bit, the crooks have to work harder everywhere.
Alice: That’s interesting. So today we’re going to be talking a bit about the UK government having changed their law around encryption, specifically in cloud data, and how that’s impacted cloud services. In the past, and you can probably explain this better than me, data was encrypted before it got to the cloud, so even the service provider, e.g. Apple, couldn’t see it, and it was decrypted when you took it out of the cloud. But that’s no longer the case, is that right?
Paul: That’s right. Now with Apple, that was a special extra feature; I believe they called it iCloud Plus. If you think about it, what a lot of people want to use the cloud for is not just backup. They want to upload files so they’re safe in off-site storage and can be downloaded later, but also so they can easily give other people the power to download them. In other words, there’s an element of backup and of file sharing involved, which is very, very handy. But sometimes, for example if all you’re doing is a backup, it’s nice to think that even the cloud provider, whether they want to or not, can’t look at it. Therefore they can’t be forced to look at it, they can’t expose it by accident, and they can’t expose it if someone inside their organisation goes rogue. So, as you say, the idea is kind of like the end-to-end encryption you get with websites, where data is encrypted as it leaves your browser and only decrypted when it reaches the server at the other end.
We’d spoken about that in the last episode, and in a way Apple’s iCloud Plus is end-to-end encryption too: even if all you ever do is upload your data and then download it again, each end of the encryption is you. And, as you say, the big difference is between a decryption key that only you hold, and a decryption key that is uploaded and stored somewhere else.
And companies like Apple are smart about how they do this. It might be comparatively easy for the crooks, or for a rogue operator, to get at the raw data, but to get both the data and the decryption key would be quite hard. Still, there’s something nice about knowing that the encryption key only ever existed on your computer, that by the time the data hits the cloud it’s, if you like, pre-scrambled, and that the key is never, ever shared. That’s kind of important if you just want to be absolutely certain that you’ve done it right. It’s not a question of being some kind of hacktivist or rogue operator; you’re simply encrypting your stuff yourself and then backing it up somewhere else.
But the UK government, unfortunately, doesn’t seem to see it that way. They think that if you put something in a cloud service, it’s unacceptable for the provider to let you, in effect, put a locked box (a pre-locked cash box, say) inside a safe deposit box, which is sort of the analogy.
What they want is for you to put it in a box, but it has to be unlockable while it’s in storage. And you kind of think, what is the point of that?
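The “pre-scrambled” backup Paul describes can be sketched in a few lines of code. To be clear, this is a toy illustration, not Apple’s implementation: it builds a counter-mode keystream out of HMAC-SHA256 from Python’s standard library, whereas a real service would use a vetted authenticated cipher such as AES-GCM. The point it demonstrates is simply that the key is generated on, and never leaves, your own device, so the cloud only ever stores ciphertext.

```python
# Toy sketch of client-side ("end-to-end") encryption: the file is
# scrambled with a key that never leaves your machine, so the cloud
# provider only ever stores ciphertext. The keystream here is built
# from HMAC-SHA256 in counter mode purely for illustration; real
# systems use vetted ciphers such as AES-GCM.
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from key + nonce + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)                       # fresh per message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = os.urandom(32)                             # stays on your device
backup = encrypt(key, b"family photos")          # this is all the cloud sees
assert decrypt(key, backup) == b"family photos"
```

Because the provider never sees `key`, it cannot decrypt the blob even under legal compulsion; the flip side, of course, is that if you lose the key, the backup is gone for good.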
Alice: Do you know what’s led to that change?
Paul: Well, it always seems to end up with “we need to think of the children”, you know? Pedophiles might use it, terrorists might use it.
Well, if they’re smart, they might just encrypt the data before they upload it, and then what are you going to do anyway? But the real problem, to me, is that it takes the human, or the humanity, out of the equation. The legitimate version goes: we might need to look at this data, we might need to track this specific person, because we’ve got a well-formed suspicion about them, one for which we can actually get a warrant, which a judge will authorise, and it’s all going to be above board.
But instead of just doing it that way, what they want is a situation where essentially anybody can spy on everybody, just in case somebody needs to spy on somebody. And wouldn’t it be better, the reasoning goes, if we just restricted it to: somebody can spy on somebody, but there are controls that prevent it all going haywire?
Alice: So that would mean there would still be a back door, though, at the cloud operator.
It would still mean that, if there was a warrant to say this person’s got illegal imagery or is planning terrorism, there is a back door there, and that back door could still be exploited by criminals.
Paul: No, what I’m suggesting is that it’s a very bad idea, because experience suggests that there are very many other ways you can acquire the necessary actionable intelligence.
For example, we seem to have managed to bust ransomware criminals and dark web marketeers simply by analysing their Bitcoin transactions, which are only pseudo-anonymous, not fully anonymous. There have been some massive busts where billions of dollars have actually been recovered.
And bad operators have been caught and sent to prison, whether for selling stuff that’s plainly illegal, for laundering money, or, in one very tragic case, for child abuse imagery; that was a case where the servers were operated out of South Korea. A US journalist and writer called Andy Greenberg wrote a fascinating book about this called Tracers in the Dark.
And the idea is that by “dark” he means not only that it’s the dark web, but that everything is under-illuminated; the investigators are fumbling around a bit, but they follow the money. The primary investigator worked for the IRS, the US revenue service, and they figured, well, following the money usually works.
The fact that it’s gone through the Bitcoin blockchain prevents us from just looking up who did it, like we could if it were, say, a credit card or bank transaction. But it doesn’t stop us forming opinions about who spoke to whom, and when, and why, and where, and eventually finding out who’s at the nexus of all of it.
And they did these busts without cracking any encryption, without breaking any blockchain hashes, without cracking anybody’s Bitcoin private keys, and without having access to the raw data on the servers; in some cases without even knowing where the servers were. They were simply able to come up with sufficient evidence to say: this person clearly traded with that person, and we have good evidence of what they were doing when they traded. More than enough to obtain a conviction in court.
So there are all sorts of ways you can acquire that data. For example, if you have a warrant, you could implant malware, or call it spyware, on somebody’s computer. Not everybody’s computer; that would be a very bad idea. But what if you just figured: I have a reasonable suspicion that this person is up to something bad, so I will put them under surveillance, which you are probably going to need to do anyway to gather evidence of the quality you need?
Alice: I’ve got a couple of questions there, though. One of them is: through the work that both of us have done, we know that a lot of criminals, even cyber criminals, aren’t that intelligent. So when the government uses these kinds of examples as a reason why they need access to everyone’s data, a lot of those people don’t even encrypt their data, or don’t understand how things work.
But the other side of that is, if they actually are super-intelligent technical people who can build their own encryption, then even if you do implant spyware, surely those people can just build something that it isn’t able to get through?
Paul: Exactly. Though I think you need to be a little bit careful about saying, well, most crooks are unintelligent, like they’re dumb.
Alice: Because I’ll get in trouble again.
Paul: No, I’m not trying to be inoffensive to crooks; I’m trying to be inoffensive to everybody. It doesn’t really come down to intelligence, it comes down to your operational practices, and you’re right that the vast majority of cybercrime is conducted without super-fancy techniques. The reason for that may not be that the crooks are unintelligent; maybe it’s that they’re actually smart enough to realise that an old-school technique that has worked really well for two or three years will keep working. There might be a small cadre of people you can’t crack with it, but anybody who hasn’t been really good with their cybersecurity, who has been a little bit sloppy at some point, may, as you say, fall for your phishing attack. And the problem is that in many cases, if you fall for something as simple as a phishing attack, and many of them are surprisingly believable, the crooks may end up with your password and your encryption credentials. So whether your stuff is encrypted in the cloud or not, the crooks are still able to masquerade as you.
Alice: And I suppose the solution for anybody watching this who isn’t cyber savvy would be to always have multifactor authentication turned on. So even if someone does have your login credentials, there’s a second barrier to entry.
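One common form of that second barrier is the time-based one-time password (TOTP), the six-digit code from an authenticator app. It is worth seeing how little machinery is involved; this sketch implements the standard TOTP algorithm (RFC 6238) with nothing but Python’s standard library. The secret used below is the RFC’s published test key, not a real credential.

```python
# Time-based one-time passwords (RFC 6238): a shared secret plus the
# current time yields a short-lived code, so a stolen password alone
# is not enough to log in.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at=None, digits: int = 6, step: int = 30) -> str:
    """Return the TOTP code for `secret` at Unix time `at` (default: now)."""
    counter = int(time.time() if at is None else at) // step   # 30-second window
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this key at Unix time 59 yields "287082"
print(totp(b"12345678901234567890", at=59))
```

The server and the authenticator app each compute the same code independently, so nothing secret travels over the network at login time beyond the code itself, which expires within seconds.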
Advert – The Cyber Made Human Podcast is produced by Alice Violet Creative, my content marketing agency.
Based in Cheltenham, we specialise in complex brands, which primarily means those in emerging technologies, cybersecurity and intelligence. We’re able to take abstract, clinical, and difficult topics and make beautiful, compelling and results-driven content. So get in touch with us for digital marketing and all your content needs.
Alice: So the origin of this story is the UK government changing their law. Do you know what other countries’ laws are like? Because could somebody just say, well, I’m going to use a VPN and pretend that I’m based in a different country? Would that be a workaround?
Paul: I guess it would be in the immediate term.
I think it’s just in the UK that Apple decided: we are not going to put in, as you say, a back door. We’re not going to provide a way that we can keep offering something called iCloud Plus that doesn’t actually work as advertised, the way it does elsewhere in the world. Now, some people have said Apple shouldn’t have caved.
They should have just shown the UK government the hand and said: we’re going to continue offering the service, and to hell with it, you’ll have to kick us out of the country entirely. But from Apple’s point of view, I don’t know whether that would really be good or bad, you know?
Alice: Interesting. When you were saying that about the UK government and Apple, and how Apple has, in effect, complied with the new legislation, and that some people said that was a bad thing: I can see arguments on both sides there. We’re seeing things like AI and all these emerging technologies, and the government can’t catch up with them, and actually you want the government to be putting in protections for a lot of things.
However, the counterargument is that companies like Apple, which are actually offering users safety and security and privacy, are then unable to do so without being in breach of the government’s rules. That’s a really difficult one; I don’t know whose side to be on.
Paul: My own opinion is that the arguments against what the UK government has done are easy to understand, and overwhelming in showing that it’s a bad idea. From my point of view there are several reasons, but the main one is that I do not think you can make cybersecurity stronger by deliberately making the really important parts of it weaker.
Like, how can that work?
Alice: I think in this case, I completely agree that the government shouldn’t have access to everybody’s encrypted data. I think that’s just asking for trouble, as you say.
Paul: Yes,
Alice: But for people who aren’t following the news or aren’t in this sector, when they hear that, there’s a clear argument for why we shouldn’t let the government tell Apple what to do. And that’s dangerous when you’re looking at Meta or X and how they’re doing certain things that are unethical, because suddenly it’s: no, we don’t want the government to have control over tech companies at all.
In this instance with Apple, I can get it with the cloud. But it’s a difficult one, encouraging tech companies to disagree with the government, because then there’s no accountability, and they are so powerful.
Paul: Well, to be fair to Apple, in this case they haven’t said: we are going to carry on anyway, and to hell with His Majesty’s government. They have said: we are simply not going to offer this service in the UK, because we don’t want to introduce what we would consider a back door (in general, because these things tend to spread like cancer), and we don’t want to offer the service under different terms and conditions, which would give a false sense of security.
If there is a shared key, if the government can decrypt it with a warrant, then that isn’t the iCloud Plus service. So we’re just going to say: you can’t have it, and let’s see what people think about that. But the other argument that goes along with it, as I said, is that I don’t see how you can strengthen something like cybersecurity by deliberately weakening the most important parts of it.
Last time we tried it, and the time we tried it before, and so on, it didn’t work. Why is it suddenly going to work this time? What does His Majesty’s government know now that we didn’t know before? My thought on that is: nothing. It’s just something that sounds good. Let’s think of the children.
We’ve even got a former Secretary of State for Defence in the United Kingdom tweeting that the only people who love crypto are pedophiles. What kind of absurd hornet’s-nest-stirring statement is that?
Alice: Yeah, and that’s a difficult thing, because when you’re trying to have a debate about it, or let people educate themselves in digital literacy and cyber, and do podcasts like this, and someone is throwing out phrases like “pedophiles love encryption”, that makes everybody feel like they’re hiding something if they want to keep their data private: their pictures of their children, normal pictures of their children, on their phones, or their identity off social media.
Because they are saying, well, what have you got to hide? Well, nothing. I just don’t want my private family photos available to everybody.
Paul: Well, I think this idea that if you’ve got nothing to hide then you’ve got nothing to fear, and therefore you shouldn’t care about this law, is one of the biggest myths in cybersecurity, because everybody has something that they wish to hide. Or perhaps “hide” is the wrong word; that sounds...
Alice: Keep it private. You’re allowed to keep it private.
Paul: Private, yes. But also, most of us have things that we have already entered into some kind of legal agreement with someone else to keep private.
Like the PIN code on your bank card. So there can be all sorts of problems with these laws. The third thing that I don’t like about it is what’s often called the principle of unintended consequences: what else is the government going to want to look at? And some people go, well, I trust the government.
And other people say, well, I don’t. But it doesn’t matter: you can trust the government and the people involved, and you can think the public sector is kind of on our side, but that’s not enough. You have to trust everybody who provides any sort of IT service to the government as well.
Alice: It’s about technology advancing. If you watch a film like the Harry Potter films from 10 years ago, they look really low quality now, yet at the time they were really high-end, amazing films; now you look at them and think, gosh, they look terrible.
And it’s the same with technology and cybersecurity: what we’re using now to encrypt data might seem really archaic in 10 years’ time. So I think it’s a huge risk, making everybody’s data available now and storing it on these cloud services. I mean, I’d love to touch on what a cloud server actually is.
It’s a massive data centre. It’s a physical place. I think people imagine that the cloud is some, you know, abstract concept, but actually it has a physical consequence.
Paul: As opposed to the data just getting vaporised, going up there and hovering around? Well, that would be even worse. If you’ve ever seen the lenticular clouds that form above mountains,
they seem to just hover above the mountain, but they don’t: if you look really closely, they’re actually forming as the air goes over the mountain, and dissolving, or evaporating, on the other side. So that would be pretty bad, ’cause clouds are by definition ephemeral.
But you’re right. When we talk about the cloud, that metaphor comes from network diagrams four decades ago. Traditionally you’d have, well, my computer and a modem, and you might have a router at your employer, and then your branch office over here, and you didn’t want to try and draw the whole internet and all the weird interconnections in between.
So you just drew a little puffy cloud.
Alice: Oh, really? That’s where it came from? It should have been a spiderweb or something.
Paul: Well, we didn’t have the World Wide Web then, so we couldn’t have borrowed the web. The idea was: we’ll just draw a cloud, because it’s easy to do on a whiteboard. In fact, the very first time I saw that metaphor used, in the very early days of the internet, was in a presentation at the place where I was working at the time.
Someone had come over from the US, one of the big names; I want to say it was Vinton Cerf, and I did meet him a couple of times, but it wasn’t him. And he said that the first time he ever saw the cloud drawn, it was known as the PFC, because the only board marker left that hadn’t been used was pink.
So for the rest of that session it was referred to as the pink fluffy cloud. It really was just a way of saying: well, there’s a lot of complexity in the network. The cloud, the little PFC thing in the diagram, is the metaphor that shows how your data travels.
But these days, as you say. It really refers to these server rooms,
Alice: Right, the buildings,
Paul: I mean, we were chatting about this beforehand and you said, don’t they have bicycles and scooters inside? Yes, they do. Because, say you’re worried about uptime and you want to go and fix a server:
if it’s 800 metres away, it’s going to take you five minutes to walk there, so why not just use a scooter and whiz along? And obviously there’s a whole ecological and environmental side to that, like: just how much energy are these things eating up to store stuff?
Maybe if we didn’t collect it all in the first place, that would be a better solution to our cybersecurity troubles; if we just drained some of the data lakes a little bit. But the cloud is also, quite literally, somebody else’s computer. Your data is stored somewhere. And the idea that it’s possible to store things on somebody else’s computer in a way that they cannot decrypt, even if they want to, seems to me to be a very, very good idea indeed.
Doesn’t it? It means you can have all the benefits of remote storage looked after by someone who’s good at managing power supplies, good at managing air conditioning, and good at detecting that discs are about to fail and migrating your data automatically. And even if they’re approached by law enforcement, or even if your data gets migrated in an emergency, say because of an earthquake, to another country, you don’t have to worry about suddenly falling under different regulations.
I don’t want this to sound corny, but the very title of this series is Cyber Made Human, and I think that what we risk with all these laws that say “we’re going to capture the data and scan it automatically for everybody” is that they remove the human part from cyber. And I think that’s not just a pity.
I think it’s dangerous, because it means you’re collecting these giant data lakes, which will overflow, which will have breaches, which will become a focal point for cyber crooks. The stuff that’s supposed to protect us from a few rotten people in our midst may actually become a vehicle for those very rotten people to attack us in the future.
So I think we need to find human-centric ways of solving this problem. Like I said, don’t rely on a system where anybody can spy on everybody but we try to keep it under control. Instead, we should have a system where some human decides that there is a good reason, one they can use to convince other people, say in a court of law, face to face, to put this person under
possibly even extreme surveillance, because we are likely to get something useful out of it that will make society safer. Not just: we’re going to try and look at everything in the hope that we might find something. I just can’t see how that can be an endgame that could possibly succeed. Not that I feel strongly about it.
Alice: No. Well, thank you. My final question for you is, of course, the Cyber Made Human bookshelf. So what’s your recommendation for our listeners this month? Doesn’t have to be cyber-related.
Paul: Could it be a video clip that I watched?
Alice: I suppose so. Is it Sixty Seconds Security with Duck?
Paul: Well, those are great, by the way; follow me, Paul Ducklin, on LinkedIn for one a week, and they really are 60 seconds: I time them exactly, which is harder work than it seems. But no, I watched a couple of selections from a very old TV series, Blackadder Goes Forth, where Blackadder explains how the superpowers of the 1910s had put in place all these safety measures, so that they were all so powerful that they didn’t dare attack each other.
And of course they’re in the trenches, Blackadder and Baldrick, and Baldrick asks: so what was wrong with the plan, Captain Blackadder? I won’t repeat the answer, because it’s a rude word, but the problem with the plan is that it just wasn’t a very good one, and it didn’t work. It somehow seems to be quite pertinent today.
Alice: It was a case of comedy that made me feel sad, unfortunately.
Paul: Yeah. But it’s the best kind of comedy, actually. It’s supposed to make you thoughtful, isn’t it?
Alice: It is, definitely. Well, my recommendation this month: I went to the Victoria and Albert Museum recently; I’m a member there, I love it. And I went to the Elton John photography exhibition, which is a collection that he and his partner have put together over the years, and it’s an amazing collection of imagery. A lot of it is black and white, and I love black and white photos, so I would really recommend it.
I bought the book. So even if the exhibition is no longer on and you are interested in that photography collection, the coffee table book version is absolutely stunning and has insights on their collection that they’ve had in their different homes. So that’s my suggestion this month.
Paul: I love black and white photography as well.
Alice: Not just because it’s old; actually, a lot of it isn’t old. There were different themes running through it. Some of it was of African tribespeople with jet-black skin and white paint on them, which was really beautiful. And some of it was photography that’s been made black and white, like a press photo of someone jumping from the Twin Towers.
Paul: And it’s amazing how much fun you can have playing with black and white yourself, if you’ve got a Photoshop licence, or the open-source GIMP. You just, you know, drop the red, drop the blue.
My understanding is that most black and white print paper was mainly responsive to green light, so just take the green channel and desaturate. It’s amazing how different the image can look. It’s the same sort of data, but the information you feel you’re getting out of it can be surprisingly different, and often a lot richer, in my opinion.
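Paul’s green-channel trick is as easy to try in code as in an image editor. The sketch below is illustrative only: it works on plain (r, g, b) tuples rather than a real image file (an image library such as Pillow would supply those), and compares taking just the green channel with a naive equal-weight average.

```python
# Two ways to turn an RGB image into monochrome. Old black-and-white
# print paper responded mostly to green light, so keeping only the
# green channel often looks more "film-like" than averaging channels.
def green_channel_mono(pixels):
    """Map each (r, g, b) pixel to a single grey level: just the green."""
    return [g for (r, g, b) in pixels]

def average_mono(pixels):
    """Naive equal-weight average of the three channels, for comparison."""
    return [round((r + g + b) / 3) for (r, g, b) in pixels]

pixels = [(200, 120, 40), (10, 180, 240)]   # a tiny two-pixel "image"
print(green_channel_mono(pixels))           # [120, 180]
print(average_mono(pixels))                 # [120, 143]
```

Notice how the two methods agree on the first pixel but diverge on the second: that per-pixel difference is exactly the change in tonal character Paul describes.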
Alice: Yeah, I agree. Well, thank you so much for your insights, as always, and it’s been lovely to have you here in Cheltenham.
Paul: Thank you and a great pleasure to attend.
Watch the episode now!
Watch on Spotify
Watch on YouTube
GET IN TOUCH FOR ALL YOUR 2025 EVENT NEEDS
PHOTOGRAPHY | VIDEO | LIVE STREAMS | LIVE PODCASTING | SHOW REELS