Transcript

The following is a transcript of the video excerpts, presented in a readable format:

hey hey hello everyone welcome to the Flow State podcast i’m Stuart P Turner and today I’m very excited to be joined by Sarah Pelicanos i forgot to ask how to pronounce your surname Sarah i’m really sorry Pelicanos Pelicanos yeah um thank you to my husband who um has left me to a life of spelling out my last name so love it i’m like super bad at pronouncing anyone’s name from anywhere so I’m glad I got that at least perfect thanks Mr P for for the challenge so Sarah thanks very much for coming on and welcome um would you like to give a brief introduction to yourself and what you are all about for those people who may not know you absolutely

um so Sarah Pelicanos um I am an agency owner so I own 22 Digital which is a um boutique agency located in the valley uh we work with a lot of national and international brands um that are uh basically looking to solve particular market problems um also delivering on their uh full stack marketing capabilities as well so we kind of do everything from business analysis problem solving strategy through to implementation um I’m very much personally a a problem solver through and through so you know whenever there’s a problem even if it’s outside of the marketing realm I have to understand it in its entirety and I have quite an affinity for AI um just understanding the ethical considerations and the morality behind it um it’s led me to quite a few different uh speaking opportunities which I’ve had I also do a lot of um uh presentations and speaking ops um at a few conferences recently announced South by Southwest so I’ll be there yeah which is going to be great on Brain Waves to Bottom Lines so understanding that sounds really exciting actually So tell me a bit off-piste but tell me a little bit about what you’re going to be talking about because this sounds like it’s going to be a super exciting talk obviously without giving away giving the Oh yes I can’t give away too much um but yeah no I um

I will be presenting um on so you can see it all on the website but it’s essentially just understanding how your brain responds to marketing stimuli and how you can use that to be able to better deliver on your digital experiences so you know between your brain to your mouth there’s a lot of um bias that’s involved uh so what we’re trying to do is eliminate that bias to the best of our ability and understand how the brain responds to for example like color psychology um different navigational experiences uh things that will basically just create a more streamlined process for your brain while navigating online so yeah interesting so that’s like that’s pretty exciting for me because I’m big on the blend of you know the intellectual and animal side of uh of human human life totally so that look that sounds super exciting so if you’re going to South by Southwest make sure you see that because that sounds awesome wonderful well look thanks for the intro Sarah let me let me dive into

a few related things um and the reason we’re we’re all gathered here so speaking of psychology I thought we’d mix things up a bit now and do some what’s been happening in the news interestingly since I started speaking to you about this show like I’ve seen tons of stories that I could have dropped in here but I picked out a sort of top four from all the stuff that I’d found um so things that have been happening what do you think about these right so directly related to work from what you were just saying YouTube recently announced that they’re going to cut all the AI crap from the network which is quite exciting there’s an article on TechCrunch about it which I’ll share with the episode so you must have seen this in your agency work Sarah where if you’re not familiar um you may have seen these videos of just weird robot voice over badly spliced together images or whatever um purely made for like monetization I imagine i can’t I don’t really understand why you’d make them otherwise so what do you think about this is this good is it bad like

um I think it it’s it’s about I guess the the ethos of the platform right like YouTube is for content creators so if you’re creating the content creator to then create the content are you then like one step removed where it’s no longer content creators publishing to a platform which is you know I think it just it really depends on the intention of the platform right so like if that’s what YouTube’s intent is and that’s you know kind of like how they have created their platform and that’s what they’re aiming to do then it makes sense i’m not opposed to it i I you know whenever I see those AI videos where it’s you know the voice over doesn’t quite match it does make you question and for someone like you know my grandma for example she’s going to fall for it so I I’m not opposed to removing that yeah I like it no I was I was very much the same to be honest although I think um you know there’s enough to your point around the intention there’s enough human rubbish on YouTube to last a lifetime already right exactly i want to watch that yeah yeah just you know just leave us leave us with the people who are happy to spend their actual human human time doing it yeah no like I think I think it makes sense like I think my only other kind of thought around it was um very practically like I’m sure you’re the same if you’re running obviously if you’re running like ads or trying to promote stuff across YouTube like the less garbage in the inventory theoretically the better but um totally and like if you are running ads it’s dependent on the like topics around the particular um content right so yeah um if they’re then creating content that’s not really that valuable or or things and then you have ad placement Yeah you then as a marketer have to consider do I want to show up against content like that and I I don’t think and you know I may be wrong on this because um you know one step removed being a business owner um is like is there the ability to not show up against AI generated videos and things like I don’t believe so so look I’d be I’d be surprised but again I’m like I haven’t run any YouTube advertising for quite some time but I always remember when we were doing it you know I’ve worked at GroupM for a long time and like brand safety was an ongoing battle slash constant source of frustration because it’s a very murky murky area and like at my you know as of my last using any of those tools none of them work as well as they claim if you know better feel free to feel free to come back and correct one or both of us but yeah I don’t know i don’t know i think this the cynical side of me says that I’m sure Google will say that they can do it but I doubt they can yeah I doubt it weird all right well look I think that’s good less less garbage is is typically better so I feel like we’re on the same page on that front um I’ve dropped the article if you want to read that moving on to a related piece of news possible dead internet incoming browser browser wars 2.0 or maybe 3.0

so if you haven’t seen Anthropic I think it is um OpenAI have just announced or well they’ve teased slash you know announced that they’re thinking of launching a browser perplexity I think are going to launch a browser as well i’d be very surprised if everyone else who’s running a big AI model is not thinking the same things um what does this mean for our browsing of the internet Sarah and specifically does it mean that SEO might really be dead for real this time that’s such a such a complex topic right is the whole um is SEO dead and the whole zeroclick search and that whole environment i look I don’t think SEO can ever truly be dead because when it changes right when LLMs um or or all these new you know AI generated search engines or search browser whatever it is um when they’re referring to content you know it’s based off of um the prompts that are put in of course but then the the prompts have to be in relation to what it’s searching for and then the probability that’s going to show up blah blah all that stuff um I won’t bore everyone with the details but um in that regard I think SEO is actually more critical than ever because you know if we’re thinking historically so or well current on Google um on Google at the moment you know you’re you’re working really hard to show up on you know first page results or or in um Gemini’s uh AI generated results at the moment um you know you’re doing your best in search to be able to be relevant to these very particular prompts and then when it’s going to be say a browser that’s run by open AI or anthropic or whoever um you know you’re then the list becomes so much smaller right they only put forward one two three maybe brands so I think it’s from a search perspective it’s making sure that you’re really really really specific Um and you’re going to have to be you know really understanding your core audience but then also I think that there’s a shift towards having that direct to consumer relationship as well so brands are going to have to really focus on connecting with their users outside of search engines too so I think it’s it’s SEO cannot die because um you need like it needs a database to be able to pull from um and when I’m thinking of search I’m thinking of you know having really strong websites so they’re really relevant having different information sources going out there so it’s pulling your relevant content based off of various prompts um things like that so you’re kind of having to work with the user and also with the LLM so I think it’s just changing rather than dying interesting well that’s reassuring as uh my my original background was pretty pretty heavy in SEO so obviously I’m kind of I’m still a fan i don’t really do it anymore but you know it’d be sad it would be sad if it if it did go i don’t think it’s possible because because it has to pull from various databases right and a lot of that has to do with websites and I I was actually I responded to something on LinkedIn uh a little bit ago about um you know there was the whole zeroclick search of things and I’m like but they have to be able to pull from somewhere and your website has to host that information and or you know potentially journal articles or contribution to content externally and that is you know when we’re looking at say chat GPT the most universal one that everyone knows right um or Gemini because maybe is you know that’s Google relevant so maybe let’s stick to either one of those when they’re looking at content to put forward to you they want to make sure it’s as 
reliable as possible um I mean you would hope right and you know that’s the dream right yeah that’s the dream so when we’re looking at that you need things like really solid websites but you also need really valuable content going out that’s ideally not um generated by ChatGPT uh but stuff that has you know rapport behind it like a doctorate like someone who created the content has a doctorate or has some form of um some form of accreditation behind them um it’s content that is unique it’s content that isn’t based off of probability it’s you know that side of things um because it needs to be able to pull from that otherwise it’s just going to absorb itself and we’re just going to be in this nice sea of of generated content that is ever so slightly different each time um until it eventually just Yeah gets absorbed interesting yeah look I mean I think there’s there’s a few interesting points that you raised there and going in reverse order I um I agree with you i think that the the challenge around this whole shift I guess is is ex exactly the one you raised about this two the two sides of what’s the out the output or the outcome going to be and how how much is human behavior really going to change around that new way of browsing because if you if you wind back to the what’s now the olden days of the internet when Google first launched you know their whole vision was to be a effectively like an index for the internet like you know you go into a library back in the olden days before computers the Dewey decimal system yeah exactly you know remember those like all these little cards and stuff you know it’s like that was a fairly a fairly altruistic and um useful endeavor where they started but then to your point 20 or 30 years maybe at this point I I need to check the dates but an alarming amount of time later which is where we are now we’ve now got Google essentially not just being the reference library but telling us which books are most important advertising their own books also not even bothering to tell you about the books they’re telling you about something else now um and yeah this whole AI browser evolution feels like a worse version of that basically because at least Google has a huge database of like the internet right whereas these models are all scraping from god knows where the same sources some different ones well that’s the thing you want to be aware of the you want to be aware of what’s available and then you make your selection based off of that but human nature is very much uh and I think we spoke on this last time we um had a conversation is path of least resistance so if you’re as a human if I have a question and you’re going to give me the answer straight away and it sounds right I’m probably going to take it you know regardless of AI reveal the secrets of my entire career to this day Sarah that’s basically 90% of what I do all the time well you know it’s like and that comes down to I guess awareness of of AI um ethical considerations is you know things like AI hallucinations it’s going to come forward with you with something that it believes is correct and you know not to AI as well i’m not blaming AI here um because it believes that it’s definitely AI but you know you can be nicer but it believes it’s the correct answer and it’s coming forward to you with like yes this is the right answer but it’s not because it’s falsely generated something that you know it believes it’s correct but because it didn’t either have all the information there was an error in fine-tuning you know something like
that it hallucinates um so without that critical thinking component as a human to be like challenging things we’re going to be in a lot of misinformation so that is very true and even I know we’re going to talk about this in in a little bit but even the the language that you’re using there I think is you know part of the root of this behavioral challenge right because when you’re talking about you know it thinking it’s generating the right response right like it’s it you know it doesn’t does it it’s like it’s generated the correct output based on the the input but we’re all you know and this isn’t obviously I’m not just saying you’re doing this everyone’s doing this right we’re all like it thinks it’s given me an answer sounds confident like it doesn’t it doesn’t sound anything you have interpreted that that’s supposed to be the correct answer based on the output of a machine that does not think in the sense that we now think that they do which is quite you know I guess the crux of a lot of our current challenges right um and like yeah look I agree with you I think and I’ve been whinging on about this for years so I won’t I won’t bore you with the the entire version but the you know the sort of joy of discovery on the internet is is very hard to find now like if you you know you I think you might be old enough to remember this um I just I’ll tread carefully to avoid age-based insults but before Google when there was more than one search engine way back in the day remember like Dogpile yeah Yahoo um all the AOL search you know when you when you got a search search function tied to your ISP like back then you would obviously naturally have to browse around you actually discovered things the way that search engines now work through links and you know talking to people but it kind of feels like if AI is just giving you stuff like that’s gonna shrink even more and like you know to your point our ability to actually think about how to find things possibly is going to suffer well probably abs absolutely and I think that um that’s just not developing certain core competencies of your brain right because you’re you’re just receiving information sure the information that you’re absorbing you’re getting that right that’s great but you’re not challenging it in any way um and you’re just receiving whatever it delivers to you you know it’s it’s like

um I’m going to use So so I’m celiac um and I obviously can’t have gluten absolutely yeah i wasn’t always so I know what bread tastes like even worse i know it’s so much worse it’s so much worse um but for example you know using this as an example is that if I am let’s say I’m treated as celiac but I’m not celiac so I’m given gluten-free stuff every single day um and I’m not questioning it i’m not like oh what’s you know bread taste like what’s this taste like think about what it really tastes like try not to remember um but you know I’m not questioning it because it’s just what I’m given each day you know and and I’m not even aware of bread or pizza or anything like that because I’m just given this gluten-free diet all the time so and it’s handed to me on a plate so why would I critically think of anything outside because I’m not aware of everything else that exists and then I’m missing out on you know if I wasn’t celiac I’m missing out on bread i’m missing out on pizza i’m missing out on all these other areas that I’m not aware of because I’m just referencing only what’s given to me on my plate and nothing else that exists beyond that so um I really like to talk in metaphors so I hope that that I love it i love it no I’m the same like I think I mean you know going way back to um the the classics from my degree like if you think the the analogy doesn’t quite hold against the bread analogy but like the you know we’re in the Plato’s cave situation where like you familiar with the the um story not story the the analogy of the cave and like the people inside it i don’t think so no I know Plato but I don’t think I know the cave analogy well it’s one of the better things that he said amongst all the you know the the other random stuff but the short version of it is basically that um he was uh he was talking about how you exactly what you’re saying about how do you know that reality is real basically it’s like an early version of the the brain in a jar or like you know The Matrix type um type analogy so he basically said that look like reality as you know it before you become enlightened like he was was uh we’re all you know me and you were sat around a fire in a cave trying to remember this as accurately as possible um and there’s people in the cave that are you know making like shadow play on the wall and we’re we’re watching them and we’re like “This is this is all real.” You know that’s that’s everything that’s real and then uh as part of this uh this sort of metaphor he’s like “Oh imagine one of those people gets up right?” And they turn around away from the fire and they’re like “Oh my god there’s a way out of the cave.” And they walk up a path and then they emerge into the currently blinding sunlight and you know see real real life like all these things that they’ve been shown merely as shadows um and he was don’t quote me exactly on this but he was using this um this kind of analogy as a way to sort of describe becoming more you know more learned and understanding like the real core concepts that sit behind the things that we interpret through our senses but the point he was making was you know the one you’re making now right where like we seem to be intentionally moving away from things that are real and forcing ourselves down into this weird fake reality um and his point was like look as you said if you’ve never eaten real bread that might be offensive to anyone who is gluten-free/seliac but whatever real bread’s I mean look gluten-free bread is not that good exists it exists and it 
serves a purpose but you know to your point if you’d never eaten it and suddenly you eat real bread you’d be like “Oh my god what what is this?” Like how do how do I even understand that this is bread when it comes to That’s kind of the point so like I think the that’s the the concerning the point I’m trying to make here which is incredibly rambling is exactly that like we’re sort of we seem to be moving away from things that are really real to us and and increasingly creating to your point these weird sort of um reflections of of what is real to us and like the more we lean into those the more danger there is that we kind of forget the real stuff of like being interacting with each other right like which is incredibly dangerous when you market that’s spot on it’s like it is incredibly dangerous even beyond like if I just think the commercial side with marketing sure but like going into I guess how humans work as well you know there’s a great book which um I think I recommended to you last time is Lost Connections which is the book since you recommended it oh you did right here it’s on my reading list this book is the book it is a very very good book would recommend it and it’s all about um and it’s been a little bit since I’ since I’ve read it but um the gist of it is is that it’s about how humans like why people have depress or or get depressed essentially i don’t want to say have depression but get depressed and the connections that they’ve lost as humanity or or humans um and a key point of that is the value um there’s a value of connection which is um being able to you know be with other humans or the value of community so being a part of a community being a um you know being able to add to that community being able to see the value that you add amongst your people right so you know if um and if you’re thinking of the people that you’re surrounded by it could be so much as you know your work people it could be your neighborhood people it could be your um your family you know it it’s being able to contribute to that community in a way that is outside of technology and I think that you know with all of this um dependency on AI and people forming these very meaningful relationships with AI is challenging that and I think that that runs the risk of people potentially falling more into a state where they’re not you know as mentally well as they could be so Yeah and it’s it’s a scary place yeah yeah look totally and you know leading nicely on to my next fun well maybe it’s not fun this next I’ll say thought-provoking piece of news from this week was exactly as you just said Sarah reading that people are not only I mean we know people are in relationships with with um AI already that’s been happening for a while I I mean even since we did the first series but people are now actually marrying their AI partners apparently which is pretty I won’t say crazy that would probably be very offensive i think it’s quite legally binding is it legally binding i don’t know if it legally could be because I don’t know that an AI instance or what whatever you would describe a conversation I guess would Yeah entity is it right yeah it can’t own assets or anything like that like so if the person was to you know touch wood pass or anything like that the AI wouldn’t then absorb their life like as in their entities their assets their financial stuff right sure crazy I mean look this tell me what you think about this because following what you were just saying this as we were saying before I’m I’m of two minds about this
um you know positive takeaway I think is like what we’re all grown-ups you know just live and let live if people want to marry people have been marrying weird stuff for years right like there’s I know they quote in the article there was a lady who married the Berlin Wall there was someone that um married a married a train station as well there was that TV show um Oh I can’t remember it was like early 2000s and then Oh my god I think I watched the show there was a guy who married a car as well I think yeah like My Strange Addiction or something like that yeah I’ll try and find it so we can share share the dream very strange though hey I mean like fine do what you like you know if you’re not hurting me if you want to marry a wall but then as as we were saying before I think you have to question the mental state of people who feel Yeah that that’s something that’s going to enrich their lives right i mean like it’s not again it’s not real is it like you’ve created an artifice around something which you would assume suggests some kind of deeper seated issues they might correct yeah and you’d have to consider as well like why they as a human feel the need to do that and should like do we as humans are part of this community you know humans supporting humans do we then have an obligation to see why they do that and then help them around it because I don’t think the solution is cutting someone off from that if that’s giving them you know joy but I think it’s not a sustainable like it’s not a sustainable solution to marry AI because all you’ve done is create an echo chamber and you know your AI person for better use of words is is void of free will so is that then does that say something about the person or does it not say something about the person you know it opens up so many ethical considerations so it’s just Yeah and look my

I mean there’s two films that I think essentially explain exactly what’s going to happen here which if you haven’t seen them you should watch both because they’re awesome one is Her with Joaquin Phoenix in which is amazing probably the most awkward sex scene in the world as well without him spoiling anything it’s really good um and the other one is Ex Machina which is the I love that movie i love that movie can I spoil i was going to watch it last night and then got Yeah yeah go for it just spoiler alert i’ve seen it mute it just for like 10 seconds i’ll wave to say that I’m done um the ending that she gets out what I was like what the hell so good so good like it’s um Look that’s that’s what’s going to happen right i mean there’s there’s obviously the you know Terminator the Matrix like they’ve all covered it as well in a more you know aggro fashion but I just think the especially Her like it’s just so tragically filmed and you’re like again spoilers for Her stop listening if you haven’t watched it but when he realizes that she’s been talking to like what it’s like two or three hundred people at the same time and the that just that exactly is what’s happening right now and even with um it was Scarlett Johansson’s voice wasn’t it it was yeah although I actually found out reading um because I’m just obsessed with film trivia of films I watch as well i found out that they actually had a different um actor who recorded the entire film previous to Scarlett Johansson and then for some reason they decided they didn’t like her and they got Scarlett to re-record everything bit gutted if you I mean I assume she got paid but still it’s a bit really good film though like both of those really good and and the whole thing with ex Ex Machina Am I pronouncing that right Ex Machina I think so um is like so they bring in um just premise of the movie is they bring in I can’t remember his name it’s been a while since I’ve watched it but they bring him in for a Turing test which is basically to test sentience of uh AI um I think that’s the definition i can’t remember off the top of my head but um and he doesn’t realize that he’s you know actually living amongst Yeah oh that’s a spoiler sorry i’ll just mark mega spoilers for this entire sorry um but it’s it’s just so interesting but you know and that goes on to I’ve had a few conversations i don’t want to say heated because I’m not an angry person passionate conversations about the um inevitability of sentience some people you know believe that AI can become sentient i am of the mind of no it cannot because um sentience in my mind is uh driven and it depends on definition right you have to define it what determines sentience in my mind it’s free will and it’s also based off of experience so AI cannot at this stage cannot have opinions based off of an experience that it never had right so for example if I fall off my bike when I’m 8 years old and now I have this fear of riding bikes does AI have that no because it you know never fell off its bike so unless I told it that in which case that’s void of free will so you know you can’t develop one without voiding the other yeah and look I’m actually I’m just checking this as well because I’m very loose on some of the some of the definitions of this stuff but um even if you look at the I know it’s very bad form to use Wikipedia as a primary source for this stuff but it’s there so sentience the definition of sentience there right is um the ability to experience feelings and sensations so even if you just use that really
narrow definition um feelings and sensations you would have to define so narrowly to apply to artificial intelligence at present that they would become effectively meaningless because it’s like me writing a word document and saying that you know the word processor is experiencing feelings and sensations because I’m describing them in the words like it it isn’t it’s just it’s having them input into it and it’s doing stuff with them it doesn’t actually experience them at all so totally totally I feel like we’re yeah we’re playing with definitions to a sort of very loose degree there that makes them almost meaningless so I agree with you i just I don’t think it’s I think it’s miles away it’s not possible it just it can’t be possible because it’s based off of like so if if we go by that definition right experiencing feelings and and emotions is what you said correct yes yeah yeah i’ll check I’ll check in as well so if you’re experiencing Oh well okay no I know if it’s experiencing feelings and emotions deter like define experience isn’t experience if you’re experiencing feelings and emotion is it just that you’re voicing it or are you feeling those feelings and what to determine like you know that then comes down to how you define everything so you know based off of that definition sentience may be possible but when we say sentient they are not a freethinking human being you know so and I often equate it to and um I apologize for the analogy but I often equate um like the determination of AI to sociopath right so someone that sociopaths by definition you know let me just get that definition up as well actually no you look up that definition i’ll look out while he’s talking so I don’t Yeah so they don’t go off the screen but um you know sociopath from memory or maybe it’s psychopath by definition is someone who does not necessarily experience emotions but mimics it um and that you know is really what AI is they’re mimicking um or they’re mimicking what they’ve been told to to mimic or how they’ve been told to act or things like that and that’s fine but don’t then go and say you know that it’s uh as an entity is experiencing those emotions because they’re not actually feeling it so this is brilliant so I’m being corrected by Google’s crap AI in search now yeah man sociopath is an outdated informal term used to describe someone diagnosed with antisocial personality disorder or ASPD formal diagnosis it refers to individuals who exhibit a pattern of disregard for the rights and feelings of others often characterized by manipulation deceit and a lack of empathy so essentially exactly what you just said but quelling over the word rather than the definition well there there’s a lot of really good books on um I think there’s a book called The Sociopath I Know and then there’s like um uh some I can’t remember the psychopath one like there’s um in psychology you know they study a lot uh not that I’ve studied psychology by the way um I just study a lot of um neuroscience and psychological books but I haven’t formally studied um you know it’s uh there’s a lot of questions around sociopaths psychopaths all that thing and how they think so um and then yeah that’s it’s just interesting there’s a actually a question that if you’re interested that um and uh I I heard this from one of my old friends um I think it was her first year of psychology is how I heard about it and apparently it’s widely known i didn’t think it was widely known i’ve asked a few people oddly enough on my first date with my husband um 
because I didn’t know him that well I asked him this question when I got just ch something it might be quite important i was like I just need to know if you’re a psychopath so so I asked him the question and we’re married we have a beautiful baby like you know it so it works out so if you need some wedding tips this is one of them u So let me let me ask you this question um and I apologize to to any people who have heard this before if I butcher it I’m going to do it my best this is recalling over a decade ago so um I will do my absolute best but the idea is that if you can answer this in the way it’s intended that you could have like the tendency of a psychopath or something like that so if you get it right I’m just gonna hang up the phone and All right so let’s try now i’ve only ever known one person to get it right um Oh my god so right so if I get it right I am a psychopath or if I just you can think like one doesn’t necessarily mean that you are one but it means that you could channel that part of your brain right interesting um yeah I I got it horrifically wrong so which makes sense but I live my life with um sunshine colored glasses on so fair enough fair enough all right ready for the question i’m I’m ready okay so and I’m really trying not to butcher it uh so there are two sisters and um you know they’re very close love each other very much all that they have a very close family um very much uh close-knit uh do everything together blah blah blah fast forward their mom passes very sad so they go to the funeral um you know they’re both very upset very distraught they loved their mother so much they’re all so close and one of the sisters um at the funeral so she’s been single for a very long time and she meets this guy amazing right and and he’s the love of her life like they just hit it off straight away they’re in love she’s just swept off her feet he’s saying all the right things he’s you know romantic he’s handsome he’s all of these things and and he’s brilliant and she’s just like “This is the guy I’m going to marry.” However she didn’t get his name you know okay at the funeral that they at the funeral yeah at the funeral get his name and um and so she’s just you know she’s devastated obviously didn’t get his name doesn’t know um how to reach him and then two weeks later her sister dies why did her sister die oh my god i mean are the two related was it interesting well I mean there’s so many different ways you could you could interpret that though i mean maybe the two were Sorry I think I butchered it she killed her sister hold on yeah she killed her sister yeah yeah yeah sorry interesting all right i mean I think I feel like my response would be the same though i mean did she was one anything to do with the other or was you know I don’t know i imagination’s going wild with reasons why she could kill her sister why Why do you think what is what is your feeling of why you think she killed her sister and they got on super well before right you were saying they were you know super close right well I mean look the detective in me would say “I have no idea would need to sort of you know start to dissect their relationship and their history and understand maybe there was some sort of secret hidden animosity and like they didn’t get on as well as they thought um you know the more sort of I suppose um you know drama loving side of me would suggest that maybe they you know maybe they both love the guy who knew maybe he spoke to both of them maybe one maybe the maybe the other sister 
imagined the guy to deal with the the trauma of you know losing your mom like I don’t know I could I could write many books just from the starting point I don’t know does this mean that I could be a psychopath or am I just well no completely completely off on the answer which is great so I will stay on this conversation now not like oh yeah I I would have killed her as well was that would that have made me a psychopath if I was like yeah she sounds like she sounds like a so The answer the answer is um is that she killed her sister because it was a family funeral and she wanted to see the guy again and so she knew that she would see him again if another family member died brilliant right i hadn’t thought that but maybe eventually my brain would never have thought of that it would never never in a million years have thought of that so that’s such an almost you know sort of autistic level of reasoning just so literal it makes sense it makes it makes sense but like I just couldn’t comprehend it you know like I couldn’t think of it so So interesting surely just just ring someone and ask him who the who the guy was he was on the guest list yeah like look I was sad that my mama died but he was that he was that guy that I was talking to do you remember wow well that’s that’s reassuring i’m glad that I’m not you know so congratulations he wouldn’t find that way and to to anyone that knows that question I’m so sorry if I ruined it i as in like didn’t say it correctly i’m pretty sure I didn’t but I got the gist of it i Well look I enjoyed it correct or otherwise I love it well you’ve given me a lot to to think on Sarah there I’ll take that away take that away from today um look I mean I guess rounding off I feel like we could talk about the um the relationship and personal implications of of people getting into deep relationships with not other people quite endlessly but I think the Yeah look let’s let’s pick that one back up amazing i’m going to I’ll keep following developments i’m going to share that there’s a Guardian story about these people i actually found that series as well it was called um My Strange Addiction that’s it it was My Strange Addiction yeah 2010 to 2011 it was on horrify the guy the guy with the the pool toys as well i remember that one the car he loved his car loads without getting too off topic titles i’m pretty sure one episode was eats cat food smells mothballs that was urine drinker that was another one a in love with balloons i remember that one i remember that one this was a such a good show it was so interesting anyone that didn’t get to didn’t get to witness like the beauty of 2010 like earlier 2000 TV for ages they did they did another season in 2015 apparently so good so good yeah that was quite eats bricks that was another good one there was one that eats mattresses i remember that one um grandma lover that’s one of the other Jesus well we all love our grandmas so maybe not maybe maybe not the way that those people feels like feels like there’s some significant differences in the loving being applied there anyway getting back getting back on top we’ll we’ll pick that back up in another episode range of sessions um so

um look I think the important rounding that off I think the important takeaway what I’ve been thinking about it sounds like you have as well Sarah on the the marrying AI subject is like there is obviously like much more awareness of mental health as a as an area of health that we should all be aware of now like you know easily over the last five years since uh since the time we don’t speak about uh aka COVID um and the lockdowns but I think um yeah look the danger is like we don’t doesn’t feel like particularly in the UK and Australia where where I’ve got the most experience like we really have the tools or the proper infrastructure to support people who have these issues like you know it’s kind of thrown at like medical professionals it’s thrown at the police it’s thrown on social services but none of them are properly funded or trained to the degree that they need to be so it does feel like it could feed into what’s already quite a big potential problem for a lot of Yeah and you know like when I when I read this topic point for this conversation right it it really really pulled at me from a morality perspective and I was questioning why right because I was like you know person to person so we’re thinking only fans or something like that right they make a choice as a human to pay for this content and that person that created the content they made the choice to create the content and that’s fine everyone’s adults they can make their own decisions And so I was like why do I not feel the same way about that that I do about someone having you know a AI generated partner um and I think it came down to free will um because you know the person to person aspect there’s both choices being made right and they’re choices that are that are their own decisions with AI it’s void of choice and it’s someone else’s choice to create something and I think that and I was having a conversation with someone else about this recently is that when you when you open up sorry when you open up the I don’t know if you heard that my calendar but uh when you open up this pool of possibilities of how people can choose to create their AI partners choose to have how their AI partners act whatever it is um when we go to the question of and this links really directly back to the psychopath question you don’t think like a psychopath so how can we know the implication how do we know the implications if we as you know your everyday person we don’t think that way so how can you put parameters in place barriers in place things like that when it’s literally incomprehensible for you because you just don’t think that way so is opening it up in a pool like that is that high risk i believe so i don’t think that I don’t think it should be you know opened to that extent so well you’ve touched on a couple of things that are definitely related there that I think are another sort of you know current issues for everyone like the like the first of those is um the general sort of level of discourse has definitely dropped across the internet right i mean everyone’s more polarized there’s all this like rage baiting stuff going on and it’s it’s just driving people to I think to your point previously think less critically and reasonably about stuff um and you know to your point like um when again going back to like my time at uni like I did philosophy degree so a lot of that is like putting yourself in other people’s shoes trying to think in ways that you don’t think normally and trying to um you know genuinely understand how you can arrive at 
thinking like something even if that isn’t how you think um and that’s fine everyone should you know should be able to do that like you know to your point you don’t have to be a psychopath to understand how they think and to some degree to understand the way that they operate like and it doesn’t make you one to think like that because everybody you know everybody has even you know emergent dark thoughts about stuff is like a common feature of like you know depression and stuff if that starts to get on top of you it’s often a sign that you’re um you know you’re not sort of mentally balanced so I think that’s a challenge and people seem very unwilling at the moment to step into other people’s shoes right like you know I mean you look at without getting into any political stuff like the the level of political discourse in the US between you know the Democrats and the you know the the Republicans where you’re just like it’s enough it’s enough just for someone to say that they’re from the opposing way of thinking that people are like “Oh some kind of you know degenerate moron.” And you’re like “That’s not really not a helpful way to engage with each other is it?” Like it doesn’t help us come together people should really be looking to to understand and I I think that you know to your point there like there’s a saying that I really like which is you know when you say you know everything you know nothing because no one can know everything and no one can know everything about every topic it’s just impossible um so I think that you know a mindset shift needs to happen for people to just be open to hearing other people out obviously there’s limitations right you’re not going to hear out people that are just hateful and you know evil or anything like Oh I mean you might because you might want to understand like why they think that way but you know I’m not advocating for that be like “No good point let’s Yeah yeah let’s do some bad stuff great.” um but you know like I think that when people have different opinions rather than going on the defense or the offense I don’t even know which one it is in that regard because they both just kind of attack um rather than doing that I think it needs to come from a point of understanding and then also challenging which is like the the the educational and informative conversations that they’re more conversations that get people to think in different ways and you know you can’t hate people for having different opinions obviously as long as their opinions aren’t hateful but um you know you you just because someone disagrees in politics or someone disagrees in in um you know how businesses are run or disagrees in you know this or that or whatever look to understand first and then you can look to persuade because you’re never going to convince someone if you come at it from the attack people don’t respond that way they just don’t so you know it needs to be more tactful than that um and you know I was always taught that um every conversation is a chess game so you know I don’t plan on losing chess anytime soon so I’d love to not lose but I’m not very good at chess i also am not very good at chess i’m good at verbal chess not good enjoy the game but you know don’t have the skills really my husband my husband me every time he’s so good but I don’t know like I’ve been YouTubing i’m like please show me tactics to I’m more of an Uno guy these days that’s my more my Monopoly deal monopoly deal all the way i will win i like you just need the determination and some Alan Sugar 
style grit to to win at Monopoly right i like it i like it um no look I I agree i think um I think that’s and and the flip side of this stuff I guess is like that’s where these things I think are quite positive because the drifting out of the pessimistic side briefly like a lot of the stories you hear from people um who’ve been developing these sort of you know more personal broadly let’s call it relationships with AI and I think where they’re actually better than you know to use the OnlyFans example like where that’s turned into a mad industry now is like preying on similarly lonely people let’s be honest mainly lonely men you Jesus creep maybe maybe not creepy lonely men let’s say let’s not offend them all um I’ll try and I’ll try and you know do what we were just talking about so if you put yourself in their position right and it’s the same with gambling or any of these industries where they feed off addictive behavior the appeal of stuff like OnlyFans is like you know you have a relationship with this person that you’re talking to but you don’t really because a lot of the bigger models employ armies of you know offshore cheap labor to have like fake conversations with people and I I’m sure most of them are using like AI as well now so they’re you know the use case is the same where they’re like look as you were saying they they’re making money fair play like I don’t have anything against people trying to make money and survive you know the sort of capitalist nightmare that the Western world has turned into but I think the where it becomes a challenge is a large proportion of those people I would be pretty confident in saying are like addicted in some way to the either the you know the idea of having these relationships or the you know gambling works because people just get addicted to like just paying for stuff like you know or look at gacha games in in the mobile world and microtransactions like they all feed off the same sort of fundamental psychological challenges that people have um well then that yeah that flips onto the responsibility of the big corps that are doing it you know they’re exploiting people that have addictive behaviors and that’s that’s wrong that’s ethically wrong and but then also like as humans we have free will um well you would hope um and so you know there like we can make those choices for ourselves and it’s that serotonin release but generally it’s like a serotonin disorder where you know they need that release all the time and that’s the addictiveness right so yeah and look big complication yeah totally and I think sorry I was trying to make a positive point there um so the positive sorry let’s get negative the positive part of that no I agree with you though but I think the positive part is like you know can you can you afford to do it so for example going back to what we were saying earlier like mental health support is hard to get now right Australia is actually quite ahead of the game here where like you can get I think it’s 20 sessions a year now that are subsidized um which is awesome so like if you’re struggling you can go get referred and you can you know it’s it’s more affordable it’s not as affordable as it could be but it’s better than better than nothing right yeah way better but a lot of people have you know they can have conversations with like chat for example and if you know the kinds of questions to ask and you’re happy to be open you can actually I think do some of the basics of working on yourself a bit but then the problem is you need a
certain level of self-awareness to get to that stage anyway and a lot of people maybe don’t have that or aren’t ready for that without speaking to someone to unlock those things I feel like that base level is good that that’s accessible to everyone um and you know if you are lonely and you just want companionship and something fulfills that need for you I think that’s fine um but like with gambling right it’s always a case of like what what can you afford to lose and I feel like people don’t know what they’re going to lose if they get too deep into these things because to your point earlier if you start to lose your ability to connect with real people um yeah and like the um I know we’re talking talking a lot a lot of sex adjacent talk today but like if you look at like the pornography industry the the danger of that is like you know it’s fine again adults being adults people can do whatever they like i think the adult industry is totally fine broadly but it supports a lot of dangerous stuff as well because again you find that people become addicted like they forget how to they don’t know what a real relationship’s like like they don’t understand how to be intimate with people because they think it’s all tied to just like the weird fake stuff you see online and it’s exactly the same with Only Fans and exactly the same with the AI conversations like none of it’s is not real it’s all like for show to a degree well with the AI it’s an echo chamber right so it’s mirroring it’s mirroring what you’re putting in so if you’re exactly you know asking for all this I guess looking at that side of thing if you’re saying like a relationship wise that’s what you’re looking for that doesn’t necessarily mean that’s what a human to human relationship is going to be like a you know marriage is hard as hell it’s you know it’s every day you living with other people is a challenge like they don’t just always agree and want to talk to you correct yeah yeah and I think that that that’s a probably an area where people might struggle if they have that dependency on AI is is that you know people do challenge you people you have complications you have arguments you you know you have to compromise you have to go against your ego a lot of the time when it comes to like person to person relationships versus AI is is not like that so yeah exactly and look I think I mean you’ve characterized that much more uh concisely and eloquently than than I did i think that’s exactly the issue is like if you always get exactly what you want you don’t necessarily understand what you need and you’re going to really grow as a person right because just you know you’re like the um the sort of uh well name and experiment Pavlov’s dog right where like the bell rings and you get a little biscuit and like that’s you know that’s your life essentially yeah but you made it for yourself instead of having a weird doctor do it to you I guess is where that falls down but yeah that’s so spot on is like you can’t but like you can’t always get what you want much like the song you can’t always get what you want the is the Rolling Stones famous many years ago you literally can’t because otherwise there will be a world of brats you know imagine then having to go into a workplace and then not getting what you want and then someone’s just like “Well no you can’t do that.” And then someone’s going to be like “Well I don’t understand why.” And it’s not even going to be their fault really i quit i’m done way yeah they’ll be like “Well what do you mean I can’t 
just be on break all day?” And and that’s obviously an over dramatization but you know and then someone’s like “No well you have to work you know like this is a business.” um and then they’re going to be like well I’m used to getting what I want so you know and then that goes into back to the very start that book of Lost Connections is like you need the value of adding to community you need the value of status you need the value of contribution and if you shape your brain in a way where you’re always getting whatever you want you know you’re probably going to live a very selfish life so exactly and like historically those would have been the going back to the animal stuff right those those are not useful functional animals are they the ones that only ever do stuff for themselves are very quickly ostracized or often killed in nature or die right absolutely you have to be part of the the I don’t know what the word is but like the the pack I guess yeah the great human collective Sarah yeah no look I I totally agree like I think um I think what you know winding all that back to to where we started I think that’s the exact challenge like there’s a lot of and and the difference I guess with this is why like you know I’ve been really enjoying continuing to talk about like the AI and like it’s broad implications for work and personal life because the psychological implications of what is happening right now are much greater than the previous big evolutions in digital technology and like you know going back to what we were talking about with search search shaped how we behave and how we use the internet but not in a way that fundamentally changed what we’re actually doing with the information we find whereas you know AI is applying a layer of presupposition in between you and what you’re trying to do which I think is like much more sort of um interesting and potentially like harmful depending on how it gets used it’s I think that’s where the real sort of you know the real benefits or the real sort of nightmare stuff’s going to happen i know probably a topic for another time but look let’s come back to something a bit more positive to possibly round off as I feel like we’re we’re overtime as usual which is a regular for me um the Swiss our friends the Swiss you know known for delicious chocolate and snow skiing snow sports i’m trying to think of more things I know about Sweden at the moment um Sweden switzerland switzerland waterfalls i’m losing it yes nature nature laconic people full of you know interesting things to say about where they live very beautiful country very beautiful actually i’m running out of things I know from the yeah the people I know from over that way as well now that’s it apologies everyone for that description but um what’s coming out of Zurich some cool stuff so our friends over there have been building what they describe as an ethical AI model for the people so this sounds pretty cool on paper and if anyone could do it I feel like I feel like they can well that’s where the saying comes from i’m Switzerland so exactly a neutral you know broad-minded people you would hope in this area so this is really interesting because they’ve basically been developing an LLM that is um open source obviously um it’s been designed with a sort of broad society-oriented data set and what they’re hoping to do with it is I think effectively you know release it into the world as a resource that we can all use to do you know not the stuff that everyone else in AI is doing which is essentially 
hoard data and try and sell it back to everyone at an alarming rate um what are your thoughts on this versus how AI is currently being developed Sarah and like I think if you need something a bit more specific this is more aligned with where I was hoping AI was going to go you know it's multilingual it's inclusive it's scaled responsible data practices are one of the big headlines um public access and reuse you know they're just giving it to people i think very similar to Wikipedia right yeah this is probably where it should have started right like this is what should have been launched first in my opinion rather than with the you know lizard people in Silicon Valley and look I understand the release of um things like chat well ChatGPT was the first one right um that was widely accessed in the public and you know there have been many many implications for it and you know I don't blame the developers or anything like that because I think that it's rapidly evolving and to my point earlier you know you can't think like a psychopath so how can you determine what parameters should be in place um obviously I don't know if some of the developers are psychopaths I can't say that could be to each their own um but you know I think that something like this probably should have been put first because then it sets the tone for it moving forward right rather than it being this free-for-all to begin with um where you know they've had to have these negative situations happen for regulations to then be put in place um you know where um some unfortunate situations have happened and you know I won't go into all the details but you know they kind of I think released it too early without understanding the adoption it should have maybe been like a pilot or something like that or designed like this where it's designed for the greater good first and foremost and then you can evolve from there so yeah I think it's you know if anyone was going to do it it's Switzerland yeah look yeah I agree with you i think um I don't know i'm in a constant sort of you know battle with myself around private industry because obviously I mean you are as well like being a business founder and owner you obviously have to lean into being a capitalist to a degree because otherwise why would you be doing what you've done or what you're doing now but I also feel like you know if I wind back to my younger more pretentious um self who seemed more sure about a lot of stuff i feel like possibly I'm turning a bit into what I used to hate when I was younger but then you know as you get older you sort of realize that it's not quite as clear-cut as that is it really like yeah I think the older you get the more you realize that everything kind of operates in the gray right like nothing is as simple as yes or no um and that's okay you know I think a lot of people don't know how to have that shift from you know it was yes or no and now it's all gray and like I was so confident when I was younger that this was the direction I wanted like me when I was younger I was a nightmare i was like my opinion is the opinion and like this is fact this is why we get on so well now yeah you're just like you're an idiot and also wrong yeah I was a nightmare and as I've gotten older still a nightmare but less so you just become more aware of your nightmare qualities I feel don't you and they're like maybe
just don't say that just think it and go home and you know also you know just being aware of other people and you know I've always been a very empathetic person you know I tend to like uh what's the word um self-deprecate no that's not the word like I self no not self-sabotage i don't do that the one where basically I put others above me most of the time i don't know what the word is i can't think of it off the top of my head self-sacrifice there we go um and I you know I've been taught that that nature is not necessarily always positive but actually it's not something that I want to change because I quite like that about myself and I quite like you know being able to put other people first and things like that and as I've gotten older I probably think it's become more evident but it's just knowing when self-sacrificing is appropriate you know like I'd do it for my family my friends you know um some key people rather than everyone when I was younger it was like oh a random you know wants $500 sure here you go now I'm down $500 like that's obviously a dramatization i never did that because I didn't have $500 when I was younger no I know what you mean though i think um interesting i'll send you something else that you'll enjoy reading um on this very subject because as usual we haven't made it as far forward as everyone thinks in terms of human thinking but going back to our old friends the ancient Greeks again Aristotle whose teacher slash mentor was um Plato came up with um the I think he called it the doctrine of the mean which is exactly this right where he was like every vice can turn into a virtue and every virtue can turn into a vice yes yeah and to use your example he was like if you're rich right and you want to donate to charity but you donate so much to charity that you become poor then that has become a vice because you've obviously been doing it for um non-self-supporting reasons is the simplest way to think about it and he's like you've obviously been donating because you want people to think you're great and like whatever or you're an idiot because you don't realize if you give away all your money that you're going to be poor and now you can't help anyone so he's like that's vice because of the incorrect intention whereas he was like you can also be you know selfless but you need to be selfless and also look after yourself essentially because if you you know if you become a poor fool you're then a burden on society and all the other stuff so like he listed I think it was 10 yeah I remember studying this yeah yeah there was also I think in that same course because it was a course in my degree um ethics um and then Aristotle came into it with the virtues and the vices side of things um and pretty classic I remember um I can't remember who said this or where it's from or anything like that which is a great reference point we've been full of those all the way through this I'll try and dig them out but um it was that there's no such thing as an altruistic action and the more you think about it the more you're like ah is there because you know like you donate to charity so that people know that you donate to charity so that they think that you're a good person or you know you go do x y z so that people like you feel better about yourself you go do this and that and blah blah blah and I'm like I can understand that thought process but I also think it's really sad because it's taking away
any form of people feeling that positive association with doing good as well it's voiding that yeah and look I think that's where there's a lot of nuance in the way that Aristotle described all this and the reason that he pulled this together which if you only looked at purely that section probably misses the kind of broader context is that it's quite a relativist perspective that he is proposing in that the whole idea is that there are no you know no fixed or firm rules it's a sort of way for you to guide your decision-making and to use exactly that analogy right it's totally fine to you know donate to charity and then tell people you've done it to be like look you know I'm donating because you're supporting a cause and it does make you feel good because to your point you're building those connections with you know the great unwashed mass of humanity and trying to help people and you're supporting the social contracts right but um yeah the point he's making is like if you only do it because you want other people to think that you're great then you might as well not bother because that just makes you a you know pretty selfish sort of yeah self-centered person because the intent behind it is not you know altruistic the intent behind it is egotistical because you want to look great and be like look how good I am so I mean and coming back to um you know I guess we should probably sum up roughly Sarah just to be conscious of your time no no it's fine i could talk all day about this but um I feel like we've got enough to talk for several more episodes so let's keep some in the tank for next time but look I think were I to sum up if it's even possible everything we just talked about i think the key takeaway for me at present um and this is true professionally and personally is like what AI cannot currently do and to me is still a long way off and this is true for the technology industry broadly is you cannot ever truly understand someone's intent and the way that they decide to do what they do um and there's no amount of digital signals or AI conversations that are going to tell you the reason that I've decided to do something unless I explicitly and very honestly tell you why I've decided to do it um and that essentially is I think the thing that can never be really replicated and where I still have quite a lot of faith that despite all the you know hype and tech bro chat no machine will ever replace that because unless you can start really digging directly into someone's brain it's impossible it's impossible to tell that and you would have to subscribe to a very reductionist idea of our own sentience and sapience in order for that to even be possible which we're miles away from in terms of medical science so I feel like we're safe for now is my key takeaway so what are your thoughts Sarah like you know how are you feeling about life at the moment you're not a psychopath which just makes me feel better very reassuring that's my other key takeaway from this conversation safe to be around everyone um no look it's just such an interesting conversation and I think that there's just so much ambiguity around everything and you know I sure as hell don't have the answers um and I don't really think anyone does i think that we're all just evolving with
it and seeing how humans interact with AI um and you know there is a lot of risk but also there's you know my biggest takeaway which um we can chat more on in another episode is you know not letting it replace critical thinking or analysis because you know then you won't develop that competency in your brain so you know just consider orienteering back in the day uh when you would use navigational skills now we have Google Maps you put me in the middle of anywhere I am lost if I'm away from my house I'm not getting back home so I'm the same i'm like I could do the directions but I don't know the direction anything's in basically which is my next challenge north and south yeah I'm pretty good there east and west got that but then I'm lost we used to have those skills so yeah can you do division yeah can I help there i got a calculator yeah exactly calculator don't do long division anymore so yeah ironically maths was never really my strong suit until I started doing logic at uni without the numbers in it and then it all started to make sense strangely I was exactly the same well I mean no that's not true i have always been good at math but not actually a math genius so uh no I've always been good at math but I understood it more in its complexities as I got older and was understanding tech and how everything syncs and links to each other so um which you know silent little plug here I am um launching something new in the coming months which will be yeah talking about some of that stuff and going through it teaching people how things connect so I feel like you have to come back now so we can hear about this mysterious thing that you will be launching Sarah that's very exciting nice well look before we end up being on all day Sarah I think like I mean we've covered a million things it was really interesting discussing all this stuff with you so thanks for coming on definitely come back so just final thoughts like how can people get in touch with you or find you if they do want to have a chat with you about any of this stuff yeah absolutely South by Southwest obviously I am very open to chats all the time um I always like to give back to the community i teach it at uni and um you know I uh well I'm not really on a break at the moment obviously having had a baby um sorry I feel like you've earned one oh thanks but you know I um you know I'm always happy to help mentor teach everything like that and give back to people so if you ever want to contact me contact me on LinkedIn um you know the best place to go is there because I pretty much accept everyone so um yeah if I don't accept your connection I apologize sometimes I just um forget so that's why I say pretty much accept everyone on it because sometimes I forget um but just message me and I will accept it so very open there you go i'm the opposite yeah I love talking to everyone so awesome well look on that note Sarah thanks again for coming on um definitely let us chat again and there you go that's done i've got I still haven't thought of a good ending so I guess we'll just end that's the end bye sounds good.