Cyber Made Human Podcast: Responsible AI for Creators and Brands

Deepfakes & AI Ethics

Oct 28, 2025

What happens when anyone can make a convincing fake? In this episode of Cyber Made Human, we get practical about deepfakes: consent, intellectual property, data privacy, misinformation, and what responsible AI for creators and brands actually looks like.

You can watch the full episode on our YouTube and Spotify pages. Check out the full episode transcript below to learn all about this topic and our discussion on it.

Disclaimer: This transcript is an outline of the dialogue exchanged in this episode and may therefore contain inconsistencies with the video version.

Our book recommendations for this episode were:

Alice: The Beauty Myth by Naomi Wolf

Richard Norton (Norts): Perfume by Patrick Süskind

Derek Ahmedzai: The Labyrinth House Murders by Yukito Ayatsuji

To discover more book recommendations, check out the Cyber Made Human Bookshelf.


Deepfakes & AI Ethics Transcript

 

Alice: In this episode, I sit down with Richard Norton, AKA Norts, and Derek Ahmedzai from The Peeps, an agency specialising in AI content. We discuss the ethics of AI: things like responsible AI, what it means for consumers when they don’t know they’re looking at deepfakes, and what creating things like deepfakes means for inspiring the next generation of what’s possible.

Let’s get started.

Alice: So Norts, welcome to Cyber Made Human. You’ve brought a surprise guest with you, Derek.

Norts: I would say a bonus for all.

Alice: Bonus guest, surprise guest, welcome Derek.

Derek: Hello.

Alice: So this is the first time I think that we’ve had two guests on Cyber Made Human at once. So welcome.

Norts: Thank you.

Alice: And I know you, Norts, from the collaboration we did with CyNam, the Securing the Future of Technology video that we did in Bristol.

Norts: You crossed a line there. No one knows me; I don’t know myself, to be honest.

Alice: And Derek, I know you because I saw your wonderful talk with Norts. Where was that?

Derek: That was in Bristol a couple of weeks ago at the Watershed.

Alice: Watershed, very good.

Norts: Specifically, the Pervasive Media Studio lecture theatre, if we’re gonna be precise.

Alice: So, although this is called Cyber Made Human and we obviously specialise in cyber, and you guys are not cyber, I would say your industry is adjacent: you’re creatives like us. But you work in the AI, augmented reality, machine learning space.

Derek: Yeah.

Alice: And I do think that that has huge implications and overlap with cybersecurity in terms of intellectual property, data privacy, et cetera, securing your data, but also questions around ethics, and so I’d love to dive into some of that with you today.

Norts: Go for it.

Alice: Well, before we do, I’d love to hear a bit about The Peeps, how it started, how you two met, and what you guys do.

Derek: Should you go or should I go?

Norts: You go, you have first dibs.

Derek: Yeah, let’s go. Alright. So my background is creative technology. I’ve worked in agencies since, I dunno, the year 2000, doing websites and apps and CD-ROMs and Flash websites and all that kind of stuff. I went freelance and then worked for the company where Norts was the creative director. That’s how we met.

Alice: So Norts was your manager?

Norts: Hmm, yeah. He was a loose collaborator, someone hired in to work with us, not on this specifically. The agency I was at, we were quite forward thinking about how to use social media when it was innocent and fun and playful, not the septic pool of horribleness it is now. We kind of hacked our way to doing things on those really primitive social media platforms before you could really do them, early Twitter and early Facebook and stuff. And we needed someone who could get under the hood, and that was Derek.

Alice: Yeah, nice. And how did you go on to found The Peeps together?

Derek: So, we didn’t really cross over for quite a few years, the wilderness years, until about 2017. I was very into making bots and things on Twitter, doing things with very basic, not quite artificial intelligence, but early stuff like early neural networks or Markov chains, just to generate silly Twitter bots that made silly jokes.

Alice: So like the private jet tracker type stuff?

Derek: Not even as useful as that. Things that would tweet all the rude things that Shakespeare wrote, or all the swear words out of Spartacus, just silly things like that. And then I was looking at using the new AI tools that were coming out around that time.

Norts: Well, you know, up to 2016 or so, I was under witness protection. So we can’t talk about any of that, right? That’s not true. But it just adds an air of mystery, doesn’t it?

Alice: You’re unemployed, aren’t you!

Norts: I was a musician originally. I thought I’d be a professional musician; that’s what I thought my life would be. Hasn’t turned out that way. It is what it is. Up to about 25, 26, then I got out of the music industry because I was sick of living on people’s floors and in vans, not making much money. I got a job as the world’s oldest junior copywriter and kind of enjoyed that. Didn’t really know what it was about, but I just pursued it. And for many years henceforth I was a writer, then a creative director. That’s the sort of journey.

So like you say, if you hang around long enough in an agency, you end up being the most senior creative, don’t you? It’s a very simple thing to do.

So I did some agency side, did some client side, then went back agency side ’cause they seemed to have much more exotic fun. Met Derek through the course of working at the agency I was at. But I suppose, when it comes to the AI world, the fundamental moment of learning was persuading the agency I was at to send me to South by Southwest in Austin, just because it seemed like a bit of a laugh. That’s where all the cool people were, wasn’t it? The intersection of music and cinema and marketing and advertising. And it was there I saw someone give a talk about the possibility of using machine learning, as it was called then. Remember those giddy days in AI, Derek? You know when someone discusses something you’ve never thought of before and you think, that’s what I’m gonna do? Now, back then, there were probably only about two things in the world that had used AI.

So I came back to my agency evangelising about the future: oh my God, it’s AI, it’s gonna be transformative. And they didn’t really wanna know. So myself and someone else there, Kerry Harrison, and Derek kind of got together and thought, you know what, we should try and do something with this. We actually set up as a business called Tiny Giant in 2017, and The Peeps came out of the body of Tiny Giant. So it was myself, Kerry and Derek. Kerry then decided to go off and do other things, maybe three years in. That was in our Web3 phase, wasn’t it? The difficult album, as we call it. And in those first three years, where we’d used neural networks and machine learning to create stuff for people, people thought we were berserk, because what you could create was interesting, but it didn’t have a use yet.

So the people who would hire us, in fact Marika from Cheltenham Science Festival, were brave and bold pioneers in the sense that they wanted to show: isn’t this a different way of looking at the world, through advertising or marketing or whatever you’re trying to do expressively, through machine learning? Which was true. But, as everyone said, what does it do?

So whereas in 2017, when we first mooted it and were desperate for anybody to give us anything to demonstrate, we were making things with cupcakes, weren’t we? And cocktails. Homemade neural networks, random ideas to hack away at. People probably thought we’d had mental breakdowns and were going into technology in a funny sort of way. Then, for about a week, we were considered seers, weren’t we? How did you know? How did you guys know?

Norts: But now it’s so democratised, isn’t it? It’s everywhere. Isn’t everybody sick of AI now? We’re not really sick of it, but we understand there are what I would call creative tensions in what we do, what others want to do, what anybody can do. But a lot of it can end up being quite loopy. By that I mean rubbish.

Alice: And how do you feel now that it’s become mainstream then? Do you feel like a bit disappointed because it’s no longer so niche? I mean, there’s obviously much more demand for your services now, probably.

Norts: Yeah, because all technologies kind of come and go, don’t they, on that hype cycle. There was a period when it was metaverses; there was a period when it was crypto. So AI will come and maybe drop off, or various parts of it might. But what I think you can do that’s interesting with it is not necessarily what everybody else is doing. So that’s still our differentiator.

Alice: So there’s a couple of things I want to touch on there ’cause I want to dive into the kind of ethics of AI and data and things. What’s your kind of opinion on social media at the moment?

Norts: Well, I don’t have any. Apart from LinkedIn, I removed myself from all social media about seven months ago. Just for what it communicated to the world, what it does to people, what people close to me at the time wasted on it, their view of the world shaped by experiences that are inauthentic. I just find it a bit depressing. So, in that binary way I operate in my head, I thought the best way I can start is by doing this myself; I’m not gonna tell people to do this or not do that. I suppose in the last 18 months it’s probably been exemplified most by what’s happened with Twitter and X.

But I think the decline of Twitter as a force for creative good started long before that. To be honest, it goes back to the point Derek touched upon, and that we touched on when we met a couple of weeks ago: any creativity of any form, most of it, of human origin, putting AI aside for a minute, is crap. Most people aren’t very good at it. Everybody can do it, but what can be considered good is quite a small portion of it.

Alice: Well, it’s also subjective.

Norts: Yeah. But unfortunately, most people’s subjective view is crap as well.

Derek: Interesting. I don’t necessarily agree with Norts.

Norts: Well, that’s where we are different.

Derek: I live on Twitter and Bluesky and Letterboxd. They’re the first three things I open in the morning. Over the last few years, I’ve tuned my Twitter by blocking every advert I ever see, any word I don’t want to see. So it’s almost nice, you know; I just follow creative people doing creative things.

Alice: The only thing I would say with that is, by training your algorithm to just show you what you want, although positive in terms of your experience with X, you are creating an echo chamber for yourself. Although a lot of the conversation you’d otherwise see might be damaging, it means that the one in ten posts that could actually make your thinking a little bit broader is now not available to you, and you’re creating just one view of the world, which is just your version.

Derek: Maybe. Yeah, I think it’s possible to have a more balanced social feed.

Alice: I think the risk with algorithms is that people are becoming extreme in their views, either left or right, because they’re creating echo chambers through algorithms, intentionally or not, and yours is very intentional. And although I’m sure you’re doing it to avoid harmful content, I personally would just worry: if I don’t see the other side, what’s the risk there?

Derek: Yeah. But then I might just be tempted, as Norts did, to unsubscribe or turn it off.

Norts: One of the benefits of having more time from not looking at social media at all, no doom scrolling, is that it allows me more time to lie on the floor in the foetal position. So who’s the winner there?

Alice: So you only use LinkedIn now?

Norts: Yeah, and that’s reluctantly, ’cause the same rule applies. Most of what’s on LinkedIn is awful, isn’t it?

Alice: Do you think so?

Norts: Yeah, there’s that way people write on LinkedIn, isn’t there? It’s like people see it and then they emulate it because they think that’s the way to write.

Alice: Well, that is the issue with algorithms: they encourage certain behaviour. LinkedIn encourages self-celebration, doesn’t it? And it is very cringey sometimes. Also, everyone’s learning.

Norts: Not a bad thing. I’m a fan of sorts. But obviously people will use technologies, and I’ll call it out, ChatGPT or some other large language model, to help them with whatever they write. You can see that immediately.

Alice: Oh yeah. So in terms of the work that you do, I know that you create things like an AI version of a CEO to announce a talk, coming out of a helicopter and walking on stage at work events and things like that. They’ve obviously given you permission to use their likeness in the work, and they’re paying you to do that. I wonder what your opinion is on using images that you just find online, and what, in your view, the ethics are around using that content for AI?

Norts: We’ve spoken to the people we think are important, like lawyers and things like that. We decided that we will, very deliberately, and I have to say ‘we’, though since it’s representative of us it’s mostly me, aim to use subversive means to communicate something funny, using people who we think are awful. Now, they might not be awful, might they, but that’s the general perception. There are people in the world who, by the nature of their narcissistic personalities or their desire for self-promotion, are probably worth taking down a peg or two, by doing something funny with all the various techniques in AI, just to create a little bit of mirth.

Alice: And what would an example of that be?

Derek: A recent one, ’cause someone asked me just yesterday on a WhatsApp group, how did you do that? It was taking President Trump and basically making him snog all his favourite world leaders over the top of a very beautiful song. And it was very beautiful, wasn’t it? And so, you know, it’s that. Is that bringing harm and detriment to the world? Maybe. If enough people said, I don’t like what you’re doing there, I wouldn’t do it. But we just do it for promotional sake; we do it because it seems like fun and slightly subversive.

Derek: The Dor Brothers are a classic example of that. Those videos they make with all their kind of world characters, they dial it up and do it big style and that is definitely a promotional vehicle for them to champion their skillset.

Norts: Which is not only using AI; they’re also really good storytellers and editors, and they’ve got their finger on the cultural zeitgeist, if I can use that term. And I kind of think that’s what we try to do in a smaller way.

Alice: Because they always tend to be topical. Like Elon Musk, to me, he’s just there to be parodied. I see the things that you post on LinkedIn of President Trump kissing Putin, and although I can see what you mean by it being a parody, I do wonder if it’s a slippery slope towards things like deepfake porn, for example, because if that was Hillary Clinton and Rachel Reeves kissing, it’s got a very different connotation.

Yeah. And by doing it with Putin and Trump, although you know that nuance, a lot of people don’t. Are you then giving permission for people to find, you know, leaders of companies and do stuff like that, or reality TV stars and celebrities? It does feel like it’s just a little bit close to the mark and could easily go in the wrong direction, and seem like an endorsement of that direction. I don’t know what your thoughts on that are.

Derek: Yeah, I think that’s a very fair point. I’m not quite so against it now, but even just using unpopular world leaders like Trump and Putin in deepfakes, I didn’t like it at the beginning. I think me and Kerry were the same, ’cause it kind of shows them in a good light that they didn’t earn, you know, gives them positive publicity that they don’t deserve.

Norts: Is it positive publicity though?

Derek: Well, either way, it’s like you’re basically making fan art of these world leaders. That’s what all this AI stuff kind of is. So using them at all, I guess, is some kind of endorsement.

Norts: That sounds like your filter at work again, filtering things out. Whereas I say take them head-on.
I see what you’re saying. All I can say in defence of us and myself, and I’m not really defending, I’m on the front foot, is that I don’t think it’s for me to be the determinant of how other people use it. I make a critical choice: I decide what I may or may not put out that I consider funny. Some people will like it, some people won’t. But I’m not telling anybody to do anything. I totally don’t think that by me showing those actions, well, it’s not like I’m explaining the technique for doing those things. It’s merely putting out a message at that moment, from me or us, about something in the world that kind of irks me. I would like to think that what we do, when we do that sort of stuff, is deliberately about making a statement that might be humorous, but it’s also a little bit awkward.

Alice: Yeah. Do you think that a way of doing that could have been to show them hugging or shaking hands, rather than the sexualisation of it? That’s the bit that for me was a bit like, oof, if that was me kissing a random leader.

Norts: It’s not though, is it?

Alice: No, but it could be someone’s,

Norts: But it’s not that, is it? Because in that scenario you have, in particular, what I consider odious people having a soft, I wouldn’t even say a sexualised side, a sensual side, that isn’t really there.

Alice: Both of those people are probably quite homophobic though, I would say.

Norts: Do we know that? We don’t know anything, really. There’s a lot of assumptions in this debate.

Derek: Mm. But yeah, it’s noted. But at the end of the day, you’ve gotta crack a joke to find out if it’s funny, haven’t you?

Alice: Yeah, and I’m the first person to love a bit of dark humour. I just wonder, with AI and deepfakes, that’s something I worry about for the future: as this becomes more available to people, what could that turn into? And that’s the crux of it, I think. Before AI, people would do this in Photoshop or After Effects, and if you really wanted to do it, you’d have to spend a lot of time. Whereas with these tools, the availability of things that let you do this for free or by buying an app, I personally don’t think those tools should be available to everybody.

Alice: Okay. You’re giving tools to people to be able to create content that probably shouldn’t be created. If they had to do it manually, that would act like a filter on their time, and they’d say, no, I’m not gonna do it. Whereas now, because things are so easy, there’s nothing to stop them in that sense.

Derek: I think with stuff like that, I do prefer people to have access. We’ve had conversations about quantum encryption and decryption and all these kinds of new technologies that can cause huge amounts of risk, and I actually do advocate for them being more mainstream and available, because the more people have access to them, the more people become conscious of them. So when you do see a deepfake, you know it’s fake; or if you need to protect yourself, you know what the risk is. Whereas if something is secret, then when bad actors use it, the rest of society can’t really protect themselves against it.

Alice: It’s true. But it’s like an arms race, isn’t it? The more people are creating the software, the better it’s gonna get, ’cause more people see it and improve it, whereas if there were fewer developers and fewer people using it, it might not have reached the lifelike quality it has.

Derek: Yes, or the level of detail it currently has. And yeah, some things are useful. But is deepfake tech something that everybody in the world actually really needs access to?

Norts: I go back to the point we made before, about any manifestation of machine learning technology platforms that you use. I was listening to something yesterday, a video on YouTube where someone was critiquing AI music creation. They started off quite cynical about it, but by the end they came to understand that actually, if you put a human hand on the creation of music with AI, you’ll probably create something better than the purely human version. Not in the sense that the technology creates mysterious chords or notes that have never been heard before, but in the sense that the strange way that stuff is made means you can take non-obvious routes to create something, whether it’s the melody or the chords or the arrangements or whatever you bring together, and create new forms of music.

Alice: Mm. But the only reason that comes about is because there’s a human who’s also making the decisions.

Norts: And it’s the same with everything we do. I mean, every single thing we’ve ever done. It’s not like we have an infinity machine, we discussed that once, didn’t we, Derek, that just pumps out stuff we have no hand in, where we press the button and let it do it. The only time we do good stuff, and we do a lot of bad stuff, is when we determine which is good.

Alice: So you create it: you have the idea, you use the tools to create, it comes out the other end, and you curate it and decide.

Norts: So, for example, if we’re making a video for somebody, a two-minute video that’s purely AI, we’re going to, very instinctively, storyboard what that idea is, work out our characters, work out the visual universe we’re operating in. Then we’ll start making it. 90% of what we knock out, probably more, we don’t like, so that gets chucked. You have to make human decisions on what you think’s good. Then you’ve got to edit it, put music on it, work at it all, like you do with a podcast, and then you decide at the end: that’s when I’m finished, and that’s what I’m happy with.

Alice: Right. Yeah.

Norts: And I go back to the point that most people, when faced with that creative challenge, even if it’s not as complex as that, aren’t very good at it. Or they use the first output and call it a day.

Alice: But also, you are both leaders in creativity. So it’s not just a human in the loop; it’s a very skilled human in the loop. That knowledge, that aesthetic, that view and quality control, all of that’s really important in the process as well. It’s not that anyone can discern what looks good. It’s still a skill.

Norts: Yes, it’s the domain, it’s what you’re trying to do. Ours is creative things. When we were Tiny Giant, we did experiments with food and drink. We would create a neural network that would generate cocktail recipes, because we’d scraped a whole bunch of data off the internet and put it into our neural network. Then we gave the results to a mixologist, who would actually take these, look at them and decide; some were good, some were rubbish. And he tried to make a few of them to see if they were good or not. So rather than just, you know, spitting out some recipes and calling it a day.

Derek: Yeah. But because they had the expert, they knew what was good and what wasn’t.

Alice: So, a different framing of that I’d be interested in. You talk about taking recipes that already exist and feeding them into a large language model, and that outputting these great recipes that you then finesse with a cocktail specialist, which all sounds great.

And I don’t necessarily have opinions on what the data is that’s being fed into these engines, but I know there are intellectual property issues,

Alice: But last year Adobe came under a lot of scrutiny because their AI engine, Firefly, was being trained on the work creatives were making in their platforms, Photoshop, Premiere, etc., and that was feeding the AI engine to create new stuff. They’ve since backtracked because of the uproar. As fellow creatives, I’d just want to know what your opinion is on data sets and intellectual property; what are your thoughts?

Derek: They should only be using information that they have the ability and rights to use.

Alice: Right, but that’s not the case.

Derek: Yeah. But the Adobe thing, that is bad, because they sold their whole suite on the fact that it was all ethically trained.

Alice: Yeah.

So the genie is out of the bottle in all of these cases, because there are too many layers. It’s not necessarily Adobe doing the training; taking all that information to train might be outsourced to another company, and then people can say they don’t necessarily know. But it was just this last week, wasn’t it, that Meta were found to have trained their text engine on tons and tons of copyrighted books.

Derek: Yeah, personally I think it has to be opt-in. Even with the kind of AI standards and guidelines that we have, there doesn’t seem to be a way to stop people from using models that have already been trained, because it’s all open source. You can just go and find these models and download them for free, because they’re non-commercial. On the one hand, that makes R&D really easy, because you can build on other people’s work, which is the beauty of open source software. On the other hand, you’ve got access to all this information that is either in a grey area or completely unauthorised.

Alice: Yeah. So then my final question for you both, before we go on to the Cyber Made Human bookshelf, is: what do you think the future of AI is in our industry?

Norts: In a funny sort of way, I’m very optimistic, to be honest. Because even in the short window of our seven years in creative technology, hardcore creative technology beyond AI, there have been these interesting little troughs and peaks, and the hype curves come. We were talking this morning about agentic AI; that’s the buzz thing, isn’t it? All these things pop up and go. Some things take off, some things don’t. I think the world will, in a funny sort of way, be more creative, because the tools at people’s disposal will lead, I think, to incredible new ways of expressing the human condition which we don’t know yet. We’re still climbing the mountain.

In the same way that maybe photography changed art, we don’t know what kinds of creative expression will emerge. I’m not saying advertising and marketing should be aligned with artistic expression, but it is a certain type of art in itself. At the end of the day, you’re trying to make people emotional so they make a decision about something or other. So part of me is very excited about that and how it manifests, ’cause that’s the world we explore. However, the line will always be there: most of it will be rubbish.

Alice: I do have a question there before we move on to your thoughts on the future. When you say advertising’s always been built around an emotion, that’s a really interesting thing, because we often think of social media now as being about provoking an emotion, rage bait, anger, crying, laughing, sharing, and actually I’d never thought of advertising as always having been like that.

Norts: My understanding, from my learning in advertising, is: why do you write an ad? Is it an informative ad? Is it insightful? But the best advertising, and if there was ever a golden era of advertising, we’re definitely not in it anymore, is built on this idea that you trigger emotional responses: something makes you laugh, something makes you cry, something makes you worry, etc. Because ultimately that leads to the point where people think a bit more about what they’ve just seen or experienced, which puts them in a state of mind to make a decision at the end of the day. Like it or not, advertising and marketing is probably about making people accumulate things or make decisions they weren’t planning to make.

Derek: I didn’t know that existed, but now I do, I want it, because it makes me feel thus. So the best way to do that is to tell quite strong, funny, entertaining stories that make you think about those things.

Alice: What’s the future of AI and creativity?

Derek: I think there are gonna be people using tools that are more targeted at what you’re actually trying to do. So rather than general-purpose things like ChatGPT, it’s probably gonna be plugins in your After Effects or plugins in your code editor, smaller helpers built around what it is you’re trying to do. I think those are gonna be very big and very useful. And, kind of like Norts said, it does make a lot of things easy that you would previously have had to outsource. Why would you pay a writer when you can just use an LLM? Why would you pay an illustrator when you can use an image generator? I’m seeing lots of people on social media who were those illustrators and writers now losing out on jobs. So it’s gonna be a hard transition, I think.

Alice: Okay. So finally, Cyber Made Human bookshelf. We’re book lovers here.

Norts: And is it a real bookshelf or metaphorical bookshelf?

Alice: Both.

Alice: Okay. So there’s a digital version, a metaphorical version, and a soon-to-be physical version, which will probably just have a QR code linking to the digital version. It can be any book you like: your favourite book of all time, one that really changed your thinking, your current read. What would you recommend?

Derek: I’ve got a physical book with me; this is just what I’m reading right now. It’s a Japanese detective novel, of which I’m quite fond at the moment.

Alice: Nice.

Derek: And my shelf at home is just full of these. They’re like the old-fashioned ones: they’ve got a cast of characters and a map.

Alice: Wow. I love it.

Norts: Very good. Nice.

Alice: But you didn’t bring one.

Derek: I told him it was very important that we bring our books. We must bring our books.

Alice: So what’s yours?

Norts: Well, I’m gonna change my answer. I was gonna talk about the book I’m currently reading, which is Revolution by Professor, about the interregnum, the most glorious part of British history, when there were no royals in charge.

Alice: Nice.

Norts: But I’m not talking about that. If I’m talking about a profound book, like you said, a book that changed my thinking, I’m gonna say the last fiction book I ever read, after which I thought, I’m never gonna read another fiction book, ’cause there won’t be any as good as that. Which is Perfume by Patrick Süskind.

Alice: Okay.

Norts: Which is basically the story of a French kid born without a smell of his own who becomes a master perfumer, and he ends up being eaten by everybody. I think that’s a nice metaphor. And it was made into a film, wasn’t it? When books get turned into films, that’s not the purpose of books. But yeah, I’ve never read another fiction book since.

Alice: Well, my recommendation this episode is gonna be The Beauty Myth by Naomi Wolf. It’s about how women are conditioned to think about their appearance in every aspect, from the top of your head to the tips of your toes, constantly, and how roles like air hostesses, PAs and waitresses are all sexualised. It’s just a really interesting book, and I’m somebody who loves clothes, loves makeup, can’t deny it, but it really made me think, whoa. It very much changed my thinking, and I’ve since listened to a podcast episode on it from Breaking Down Patriarchy, which is made by women who are former Mormons in Salt Lake City, Utah. They evaluate feminist literature, and they did an episode on The Beauty Myth, and coming from them as well, it was just really interesting. I’d recommend it for men and women. I know you’ve read it.

Norts: Yeah, I think it’s just a really amazing read. It’s very interesting.

Alice: Yeah. So thank you so much for being on Cyber Made Human and coming to Cheltenham.

Derek: Thank you.

Norts: Five days across the continent.

Alice: You’ve made it. Thank you.

Watch the episode now!

Watch on Spotify

Watch on YouTube

GET IN TOUCH FOR ALL YOUR 2025 EVENT NEEDS

PHOTOGRAPHY | VIDEO | LIVE STREAMS | LIVE PODCASTING | SHOW REELS