Transcript
Hey hey hey, hello everyone. I am Stu, this is the Flow State podcast, and I'm joined by not one but two people today: Ashton and Jen. Welcome back, thanks for rejoining and being brave enough to come back on. Hello hello hello, pleasure to be back. Pleasure. Yeah, great to have you both back on.
So obviously we're gathered here today once again to discuss the continued, exploding, unchecked use of AI, this time within the context of ESG: environmental concerns and sustainability. Crucially, is the use of AI undermining your brand credentials?
Before we dive into that, I've been doing things a bit differently from the previous time I spoke to you both, just giving a bit of what's-happening-in-the-news context. Obviously none of us are formal environmentalists, but as concerned citizens this is all stuff that's kind of high on the agenda. So there's been a ton of recent reports about the projected requirements for energy and also water, which I know you mentioned, Jen, with our current obsession and continued growth in the use of cloud services, of which AI is a massive one.
There's a recent report which I'll share from the IEA, the International Energy Agency, which projects that data centers are going to double their use of electricity by 2030. That is surprisingly close, only four and a half years away, so that's a lot of electricity and a lot of water. They think that's going to account for around 1.5% of the world's electricity, which is actually quite massive when you think about the relative imbalance of where AI companies are based and who's developing these things. As broad context, that's fairly concerning.
Data centers, as you both know from our notes and previous chat, use a lot of electricity and a lot of water and don't really do a lot for the economies where they're based. Obviously we need them because that's where the cloud is. As my mom used to ask, "where is the cloud?" It's in a massive computer, basically, sometimes underwater; I'm not sure if people are aware of that, but these do get built in the sea as well. And they're often built where it's cheap to build them. So, not on our doorstep, but often in someone else's backyard.
The broad context, I guess, just to set the scene of what we want to talk to you both about, is that we're building all this really awesome stuff. AI obviously is useful, but people are just using it everywhere and not really thinking about it. I want to share a quote from a recent story from The Wrap, and then I'll throw to you two to share your views on this. The CEO of Europe's largest publisher has apparently mandated that you now only have to explain if you didn't use AI in the newsroom. We're not here to talk about news obviously, but I think this attitude is indicative of the current growth in the use of AI. It's not just an expectation; people are actively being told to use it without necessarily much thinking about the why, and you might get in trouble if you don't use it. That's kind of concerning when you're not really considering these broader implications of the use. This is probably just one example amongst many, so I wanted to throw that one out there because I think it really exemplifies the crux of the issue.
So without further ado, Ashton, I know you had a good example of this in practical terms from a brand perspective. What are your thoughts on this? How do we address it? What can we do as individuals? It does feel a bit like something I complained about last time, which is, "I'm diligently recycling my cans, yet Jeff Bezos has just had probably the least sustainable wedding in the world in Venice." Is this something we can even address as individuals, just as a starting point?
Yeah, it's a really interesting kind of meta concept. I think what you just brought up, that unthinkingness around "just use AI, don't even think about it," probably speaks to the broader psyche of not considering what our usage of this technology in particular is actually doing on an environmental level. I don't know if it's still the current figure, but it takes a bottle of water to process a prompt on ChatGPT. So every time you ask ChatGPT, "tell me this recipe," or "give it to me," again and again and again, we're just sort of pouring bottles and bottles of water down the drain. Knowing that, does it stop your usage of the technology? I'm going to say no.
I think unless your values in your day-to-day really steer you towards the lowest environmental impact possible, you're probably just sacrificing that. You're just thinking, "I have to use this, I have to use it because I need to be more efficient, I'm expected to do it in my work," as that article says, and "I'm gonna get left behind if I'm not using it" in a work sense and also in a societal sense. So it's a really crazy balancing act: how do I lean in and use these technologies and be part of this global society doing that, but consider the implications? Can my actions really impact change, or am I just another number?
Yeah, look, I think that’s the crucial issue with any of these things, right? It’s like obviously we have our own moral and ethical standpoints on these things, but it’s an area that’s full of contradictions. I have another example of this. My neighbors over the road are very active in the local community and are very sort of environmentally minded, and they’ve got an ebike that they ride everywhere on, like go down to town, do their shopping on it, and stuff. That’s awesome, like I could ride a bike to town, but I don’t. I usually just drive down because I don’t want to have to carry a massive heavy bag back up, and my bike is not an ebike so I’m on leg power only. So, it doesn’t mean I don’t care about the environment, but also I can’t spend an extra hour on my shopping trip all the time. So, does that make me a horrible person? I don’t know. What are your thoughts on that, Jen? How do you address this in your personal life?
Well, I think the biggest issue is that unless you completely go off-grid and live in the Alaskan wild or something like that, you can't avoid it. Even with the ebike solution, you think you're doing the right thing, but creating those batteries and disposing of them at end of life, there's no way you can do that in a clean way. And it's the same thing with AI: there is no clean way you can do it necessarily. It really is just an amplification of some of the issues we have already created for ourselves from the sheer amount of data that we're using and creating on a day-to-day basis as everything shifts online and everything gets stored.
And also it is pretty much entirely unavoidable now. Every time you're interacting with an airline, with a bank, with a government service, etc., there is some AI happening in the background that's supporting some of those systems or processes. You're using a chatbot, there's decision-making, machine learning AI, like something is being used there. If you're in an office space and you're using Microsoft Copilot, it is now there whether you want it or not. I run my business on Google, and the latest update to the Google thing has Gemini top to bottom; I can't opt out of it. So unless I go completely off-grid, I am using AI in one shape or form.
As an individual, it makes it very, very hard. Even just saying thank you after ChatGPT gives you an answer consumes that same bottle of water, because the model working out, "Do I need to respond to this? What did they say?", even that takes compute power. You can make microscopic decisions, but it really is going to have to be the companies that are running these things, and the companies that are using them to provide services, that make the big macro decisions.
Yeah, and look I think that brings us around to the example that I think you brought up Ashton. Obviously, as individuals these things are all directly under our control. I’m perfectly happy to reconcile my trips in my car versus other things that I think I do to reduce my own footprint as much as it’s possible to do so. You’ve got to be realistic about it, and as much as my dad keeps trying to convince me that it’s a sensible retirement plan, I don’t want to move to the Alaskan wilderness to live in a shack and have to fight off bears and stuff.
Look, I think where it comes to brands, this isn't a new issue, right, because since cloud became a thing, this was always going to happen; cloud infrastructure is expensive and energy hungry. It was never going to just start and go away; it was always going to get bigger and bigger. I do know that some of the bigger companies like Google, Meta, and IBM are all investing very heavily in renewables, which is great. But the fact of the matter at present is that we're never going to be able to shift enough energy generation to renewables in such a short time frame to cover the amount of energy we're going to use. That existential challenge is where I see the brands below them needing to step in. These are the companies in the mid to bottom end of the top end of town, who do have the kind of power to do stuff, but I don't see a lot of them actually doing anything that's really real. I see a lot of hangover from the start of the climate crisis and people being like, "Oh no, we're green, we've bought some trees and they give out oxygen and whatever."
Are any brands really doing anything? I think it’s worth talking about the example you raised Ashton of the B Corp accreditation, because that used to be the big thing right in sustainability. Tell us about your sort of views on that and whether it means anything anymore.
Yeah, it’s a great question. A very topical news story this week when we were sort of planning for this conversation was the fact that Princess Polly achieved B Corp status. Princess Polly is a fast fashion company. Great success story as far as the e-commerce world goes, but as soon as they announced that, everyone kind of raised their eyebrows and went, “Really, how?”. It was everyone going, “How is this possible when we’re talking about fast fashion?” like those things seem in direct contradiction to each other.
It made a lot of people question how you achieve B Corp status, then. Is it so easy that you can just swap your packaging for compostable bags and that's kind of all you really need to do? How are we thinking about, like you say, "we're planting all these trees to offset the carbon from shipping stuff across the world"? Not to besmirch Princess Polly, because they've met the criteria, they've done what they have to do. So in which case you go, "Well, what is that criteria and is it really keeping up with consumers' expectations?" And we're not even talking about the technological side. Even with a company like Princess Polly, is that taking into account the chatbots, the AI they are using for their operations, supply chain, distribution, all of those things as well?
I feel like the impact is going to be so latent when we go, "Oh wait, we're going to have to actually legislate to protect ourselves from this." What I'm really interested in is how the generational shift is going to impact that. I look at a company like Princess Polly, which is very much for the younger generations (Gen Z and leaning into Gen Alpha). All the reports show that they are a lot more sustainably minded because of the climate change aspect, but they're also the tech leaders, they're also the heaviest AI users and adopters. So I'm really curious how that's going to play out in the next generational wave.
Yeah, I completely agree. I think e-commerce and fashion in particular are a really pertinent example here. Fashion as a whole is not renowned for having amazing environmental or even social practices. Third world sort of fashion production, child labor, underpayment, the whole labor side’s bad already. Then there’s all the shipping and the international infrastructure, and truckloads of clothing being disposed of.
I remember when Primark launched in the UK, there was another big uptick in interest around this because it's super cheap. The commentary at the time was like, "How can this t-shirt, for example, that costs like £3 or the equivalent of about $6, how is it physically possible for that to be sustainable to produce unless something illegal is happening or some other nefarious something?"
Bringing this back to what you were saying Ashton, where this becomes an issue is like social commerce and the growth of that side of things. Social commerce is massive. Instagram and Facebook have pushed this for ages. The whole like buy the outfit, your Shein haul or your Temu haul.
You speak to a lot of government types, Jen, what are you hearing from them? What is happening at the top level to address these things from an environmental standpoint? It's very hard to say from a policy standpoint what they're looking at doing here. They're certainly grappling with the ethical side of things. From a government procurement perspective, this is still something that's very top of mind for them. They do have things like: "show me your anti-slavery policy," "show me the number of women on your board," "show me if you have any indigenous support," and so on. They quite often have carbon targets that they're trying to meet themselves. They're very conscious of suppliers having good environmental standards. When they're procuring, if it's hardware, they want to know what disposal looks like; if it's a data center, they want to know what the data center management looks like.
But again, when it comes to AI, it's getting pushed so fast: how quickly can policy keep up with that, and how can standards keep up with that? There's a big discussion going on when you bring private sector and public sector together around accountability. Who is accountable for looking at these things? Is it the government that's procuring these, or should they be setting standards around what is acceptable, or is it the provider being accountable for how they deliver that service, or is it the end user?
My view is it should be a little bit of all of them. It also means there needs to be greater transparency. Those data center companies need to be more transparent about their practices, about their energy, their water usage, the labor practices—all of that needs to be much more transparent than maybe they are now. Maybe the government needs to be asking more difficult questions and using that procurement action to drive the behavior that they want to see. And not just government; it’s the other big buyers of those—the banks—who use those massive data center resources to run their businesses. They should be looking at these things too, because again, they have their own carbon targets that they’re trying to hit as part of their bigger ESG programs.
It's an interesting point, Jen, about the companies. Imagine if the companies themselves, the big players in utilizing these resources (the energy, the water, for the data centers), imagine if ChatGPT had to be transparent around data usage. Every time you did a prompt, imagine knowing and seeing a counter of how much water you had used. I feel like at scale that has the potential to change behavior. I can't see that happening anytime soon, because they'd need to be held accountable to doing that, and the only thing that can really hold big business to account is usually money. So again, it comes back to how you exert change by impacting the bottom line.
Yeah, I think that’s a really interesting point, because this is where there’s a real tension between the free market and government regulation and oversight. I was listening to quite a mad interview with Peter Thiel last week. He was obsessed with this idea of seasteading for a while. His big theory from a couple of years ago was that we’re in a period of stagnation as a species, and that we’re not inventing things as fast as we have, we’re not sort of progressing the collective human body of knowledge quickly enough. His view is that a lot of these things are sort of shuffling the deck chairs on the Titanic.
He's very behind this kind of chaotic-style change, just letting things happen to try and unblock what's going on. But it sounds like you're both more of the opinion that we need more of a responsible attitude and more of that accountability and transparency. If we just carry on running things unchecked, we're going to ruin the place that we live. It feels like we should be more mindful, right? Should we not be pushing for more brand accountability, more authenticity?
Ashton, are you dealing with a lot of B2C brands? Are you seeing this kind of same attitude across the conversations with them, or are people kind of just trying to push it behind the curtain a bit and be like, “Oh, we’ll deal with that later”?
I really think it depends on the industry. It is the industries that are probably held to more account around the transparency, like what’s actually happening behind the scenes here. Because there’s been so much backlash from greenwashing in the past, a lot of companies in that industry—natural resources, mining, big banks (around who is actually funding negative climate change and who’s not)—anyone in those industries has kind of already felt the effects and it’s very visible. They’re like, “The magnifying glass is on us. So we have to be careful, we have to be transparent”.
But I think for a lot of companies, especially e-commerce, it’s an interesting one because we kind of want what we want and we want it now. Australians are pretty patient because we’re so far away. I think our expectations are pretty low sometimes. In other parts of the world, same-day delivery is very normal, and it would take a lot for people to part ways with that once you get used to convenience. It’s like, would you rather choose convenience over environmental impact?
I think Mecca is probably a good example of this because they have two different packaging options when you purchase online. You can go, “Yes, I want the full packaging, all the nice stuff,” or you can have more sustainable packaging, which is just a low-fi cardboard box and some eco-friendly brown paper instead. You can actually make that choice for yourself. I’ve always seen Mecca as quite a leader in their industry. That’s a really good example of giving people more of an active choice to make more sustainable decisions and then kind of make that connection to the brand.
I was going to say, I think we might be looking at this through a very privileged viewpoint though, right? A lot of people just don't have the mind space to even be thinking about these sorts of things. They're just worried about, "I need to pay the electricity bill this month, I need to get the kids off to school." It's just too much decision fatigue, it's too much information. They just take the path of least resistance. They don't have the time or the money or the resources to stop and think about all of this. Or maybe just not the educational resources.
We know about this stuff because we're in the industry. The vast majority of people are not in that same position, so again, this is where you can't push it all back on an individual. Especially if you're not being transparent, people just simply don't know what happens. 95% of people probably don't even know when they're talking to a chatbot these days.
Yeah, I mean you only know when it just fails to do even the simplest task, right? Or if a media report is created by Gen AI, or that bloody Wimbledon influencer. That's a really interesting one. People had a more positive reaction to that AI influencer: "Wow, look at her, she's so beautiful." And then there were real influencers. There was a woman wearing her Wimbledon whites, and she unexpectedly got her period, and there was backlash about that. It's like, "Yeah, but that's a real person. That actually happens in real life." Influencers menstruate. That for me was such an interesting one: so robots, yes, people, no?
I find it really funny that Meta platforms are strangely puritanical. They have the least care about the content that's on their platforms, but they absolutely hate specific elements of the sexualization of women specifically. The minute you show something, or someone talks about a period, or it's the wrong bit of a body shown, they're like, "Crap, that is too far," and the hammer comes down on everyone.
This influencer stuff’s the same. It’s undermining what was quite a fun industry of just “do stuff you like and make a bit of money”. Now it’s a career aspiration. Ask kind of 12-year-olds what they want to be when they grow up: content creator.
That just reminded me, I think it was The Iconic that was also in the news this week because they said, "This is an AI person" for some of their models in the clothes. Usually when you see the clothes, they're like, "Our model is 5 foot 7, usually takes a size 8/10," blah blah blah. They had this kind of warning label: "Just letting you know this is actually an AI person." That's an interesting one, because how many times have I bought something, and it looked great on the model but looks terrible on me, even though we're the same size? Does it really matter then if it's a real person or not, because they don't have my body anyway?
I have weirdly long monkey arms and legs, so shirts and trousers and stuff are always slightly too short but fit everywhere else. I either have to have weirdly baggy clothes that are the right length or properly fitting ones that are slightly too short. It's a real rarity for me to find something that actually fits properly. But then, clothes are one of those things that have to be real. You need to see them on you to know if they look nice.
To me, that just seems like a real stupid use of AI. I thought augmented reality was going to be much bigger there, where you just take a photo of yourself, or shoes is a good one, and then you can see them on you. For them to lean into AI just seems really stupid because you’re like, you need to see clothes on a real person. All of the glasses companies do it, like Bailey Nelson. Glasses are a perfect example where some could look awesome on a shelf or a model, and then you put them on, you’re like, “Jesus, what is wrong with my face?”
On the subject of fashion, because the fashion industry is so prevalent and moves so fast, it does feel like they should be more in control of the technologies they’re bringing on board. I want to counter your example of Princess Polly with Patagonia at the other end of the spectrum. They’ve done so much to really lean into their environmental credentials. But then, to your point, Jen, Patagonia is a wanky fancy clothes travel brand. It obviously works for them, and the guy is minted. You pay a premium for that environmental viewpoint.
And that feels like maybe where the challenge is, because I’ve got a lot of sympathy for people who just can’t think about this stuff. You shouldn’t have to think, “Has this pair of shoes that I’ve bought meant that the part of a rainforest has been burned down to get it to me so cheaply?”. It feels like that should be the responsibility of brands.
Going back to your example of the B Corp certification, Ashton, it feels like that doesn't mean anything anymore and nobody's taking any real personal accountability. If you really care about your customers and looking after them, you don't just do what they say; you should be doing what you know is best for them and for everyone else. It doesn't feel like a lot of people are really doing that at the moment.
A free market economy basically says the consumer will choose. You’ve got the Patagonia option, you’ve got the Temu option. If you’re somewhat informed, your dollar is going to go where your values go. In a free market, basically they’re saying it will self-regulate. If there is demand for people wanting to be more environmentally conscious, they will start supporting the more environmentally conscious brands.
This is again where the government comes in. In some big markets, there’s a big move away from regulation. The US federal government is saying, “We don’t want this regulated; it should again be free market, the market should be up to decide, the consumer should be able to decide”. Whereas some other markets are more regulatory heavy and they’re actually saying, “We should have more regulation around this for privacy reasons or transparency reasons”. It’s the same concept as like food labeling. Regulation is in place to educate the user so the user can make the decision in a more informed way.
Maybe that's what's needed more around AI. Maybe it's not regulation that says you can do this, you can't do that. Maybe it's saying, "How do you do that?" and putting a water bottle counter on it.
That's a really great example with the health star ratings. When you look at something you're like, "Oh, it's four stars, great, it's healthy." But the way that system works is it's only compared within that category. So if you're shopping for chips and you're like, "These ones are one star and these ones are four star," you think, "Oh, these are healthy," but they're actually just not as bad as Twisties. So I think it comes down to education on what these systems actually mean.
I went to a gig last night, I went to The Hives; they’ve got a new song, it’s called “Legalize Life”. More regulation is not the answer. Give people choice and agency in what they’re doing, but make sure that they’re educated about those choices.
I want to link that back, Stu, to the quote you shared from the chief of the newsroom over in Europe: "Yep, just use AI, don't even think about it." That lack of media literacy is really concerning to me. I think we're going to lose a lot of critical thinking around, "Well, hang on a second, what does this purchase mean? What is my ChatGPT usage leading to?" So I think critical analysis and actually questioning our choices is probably the thing that's going out the window a little bit. That's probably what we need to focus on more than regulating the systems.
Yeah, it's the right regulation, the right kind of regulation in the right areas, regulating the right behaviors. That's the crux of the challenge. Sometimes the citizen isn't always right in that area. You have to get past this short-termism and recognize that some of these decisions we're making now are going to have massive long-term implications. You would hope that somebody is taking a bigger view of the world, which is supposed to be government.
I don't think companies themselves are actually fully aware of the ways they can control the impact that their use of AI has. Things like deep learning, feeding content and data into a system to enable it to make decisions, are very data heavy, very energy heavy. It could be something as simple as being very specific: let's just build the data set that we really need to use, let's not include everything.
From a Gen AI perspective, using Gen AI to create images takes enormously more energy than a Q&A type of response. If you're a company that has the need, like a fashion company, to generate a huge number of images, rather than just letting your team have free rein in some ChatGPT type of tool, create a curated set of images that they then use as an image library. It's not 20 people doing it in a free-for-all. You can put controls around that: policy, process, and tools. You just need to be mindful of it, make sure it's communicated, make sure people adhere to the decisions you make, and have a way to track it.
If you’re a big miner or you’re a big bank, you’re already reporting on ESG standards, and it takes a team of people to do that, to measure and report on things. Bigger companies are already doing it, but small companies just don’t have the resource to do it. That’s where you need to look at, well, maybe this is where the providers can step in and lend them a hand.
There's only so much external regulation can really do. There's obviously a fair bit of stress on the social contract and the economic pressures bearing down on you. You just have to look after yourself at the end of the day, right?
Coming back around to work, focusing on people in marketing or who are looking after brands. Marketing is often the one that is saddled with or is in control of the brand of a business. They’re often the ones that are tasked with being the brand guardian and champion. It feels like a lot of the power to communicate and control this stuff should sit with marketers. Marketing is now stepping into the world of a relatively technical role, just given everyone’s obsession with the tools.
The two things that naturally now sit with a potential marketing team or senior marketing person are: one, actually internally making sure that you do live up to your brand values, and two, having some level of governance over the use of tools that impact on those values.
The current situation is that marketing as a department and often as a discipline seems to have kind of lost a lot of its confidence broadly and seems to get kicked around a fair bit, especially in B2B. How do you think we reconcile those two things? How do you think marketing people can re-empower themselves around these issues to just kind of get on the front foot and actually take charge of some stuff? You should be able to walk into a boardroom and be like, “You guys are not doing the right things with these tools. Our brand is about this”.
I’m going to disagree with you on that because I think the problem with making this a brand issue is that this is where things like greenwashing come in. It’s not a brand issue; it is a core company value issue regardless of what you project externally. It is a pure “how do we operate and live as a business” issue, and so it sits at every part of the company. The brand to me is how you live externally, but as a core, your culture, your core values, and your social license sits with everybody, not just marketing. My concern is if marketing takes the lead on this, it will become a “what does it look like outside the business” issue. To me, the grown-up companies are building ethical AI committees where it’s got representation from across the business. You need to look at it within every business function and across every business system that you use as well.
Interesting. No, I get where you're coming from there, Jen. I guess maybe marketing being in charge of it is the wrong characterization, but shouldn't the internal aspects of that also really be wrapped into your brand? I think the brand piece and the "who controls it" piece are probably separate. Because I've sort of lived this a lot: it becomes a marketing problem to solve. Even if marketing doesn't necessarily get to decide this or that, if something goes wrong, it's a marketing problem.
Marketing has a strong influence in this space because we’re at the front line of talking to all the stakeholders publicly. We can get that sentiment, we’ve got the data. That’s probably the most powerful thing. If you’re sitting in front of a room being like, “Guys, we’ve got a problem,” and it’s actually impacting brand sentiment compared to competitors, or revenue’s fallen off a cliff with this demographic because they see us in this light. I think it becomes kind of marketing as the bearer of bad news sometimes and the ability to influence.
But because the problem is often a cultural problem, it is kind of permeated through the company. If your issue is actually in supply chain, if the practices are kind of what’s bringing the company down, that’s not a problem for marketing to solve. That’s where you need everyone, that cross-functional committee, to go, “Okay, this is all of our problem to solve”. It just ends up being sort of marketing as the messenger a lot of the time. I can speak to that because I’ve just spent 18 months trying to step outside the marketing stream a little bit and step into that broader operational general manager space because I felt like my level of influence wasn’t strong enough as I needed it to be.
So, poor marketers, yes, it does become kind of “we’re the messenger and it becomes our problem to solve ultimately”. But it takes kind of someone to bring everyone together and go, “This is actually a company problem; this isn’t a marketing problem”.
Yeah, it's a complex challenge. If you're working with companies over and above a certain size, then you do have the option of that level of governance. But that middle ground of midsize companies, where maybe you're not quite big enough to require a separate function but you probably should have one, and it ends up sitting with one or two people, that's probably the hardest space. Startups are a little bit easier because you live and breathe a culture every day. At the top of the chain, you've got the big companies with the resources and the regulation. But it is almost that in-between where you're like, "We're kind of not lean enough, we kind of might have strayed away from our values, but no one's really putting pressure on us to change anything." So that probably takes an internal champion, which again often comes back to marketing.
I think this is a bit off topic, but it was a very amusing example: the Woolies CEO getting confronted last year in Wollongong by some lady who was filming her on TikTok about the price gouging scandal that was going on. She responded like the worst corporate robot you've ever seen, just not even saying real human words. She was like, "God, we thank you for your feedback, and we'll get back to you." That to me just exemplifies a huge failure to actually do her job as a CEO, because your job effectively is to be a human shield for whatever's getting thrown at the company, and also a failure as a human to actually engage with someone like a normal person. Woolies' customer is the average Australian. That just exemplifies the failure of the brand and the values at every level.
Anyway, this has been a relatively wide-ranging discussion. I still haven’t thought of a decent sign off for the show. If you guys were going to wrap up everything we’ve talked about, how do we individually weigh up the pros and cons of what we’re doing against the mandate of efficiency at all costs, growth at all costs? What should we be doing to sort of dictate how we behave and how we do use these tools? Ashton, how do you guide your decisions on a day-to-day basis?
I think I’m going to throw the word northstar out there. The biggest trap is companies or brands thinking that they don’t have to worry about this kind of stuff. My advice would be: don’t wait. Actually kind of take affirmative action and remember that, like Jen said, in a free market, consumers will choose. If you want to be competitive and you want your company to sort of be sustainable and keep being a company, you do need fundamental values and you do need to stand for something because that’s where consumer loyalty will come into play. Acquisition costs are not going to get any cheaper; it’s going to become harder and harder to gain those customers, so that loyalty is essential. The way to get to those people is to have very sort of firm values and to take affirmative action and to do things before you’re actually asked to do them.
I’d come back to that transparency as well. It’s one thing around like the influencer sort of thing, “this was created by Gen AI,” but it’s another thing to come back to “these are the business principles that we’re following”. Putting the effort into disclosing that, but tracking it, etc., not just for yourself but within your upstream and downstream channels and business partnerships. Regardless, you should be looking at doing that anyway, because while there may be a lot of uneducated users/customers out there right now, I think they’re going to start to become educated and question these things much faster. So, get on the front foot, be more transparent around how you’re operating, whether you’re selling to consumers or you’re selling to businesses, because those demands are going to be coming, and you might as well get started now rather than be caught on the back foot.
Yeah, I agree with all of those sentiments. I do think we do have some individual agency still and some control over these things. Hopefully that leaves everyone with something to mull over in the day-to-day.
Thank you very much both of you for joining me again. Any final thoughts, what’s happening in both of your worlds? Jen, I know you’ve got an event coming up soon, right, in Sydney?
So I run the meetup program for the CMO Alliance here in Australia, and we partner with the Product Marketing Alliance. We've got a session coming up in August: on the 21st of August we're looking at running our Sydney event, and we're aiming for Brisbane in the first week of September. If anybody wants to find me on LinkedIn, I'm happy to share all of those details. Stuart is usually a willing participant in these sessions as well.
How about you, Ashton, what’s going on over the next couple of weeks?
LinkedIn always. I'll be heading along to a couple of events. I'll be at Something Digital in Brisbane, part of Something Fest, on the 25th and 26th of August. I'm actually running a master class at Something Digital as well, around how to unlock your innovation mindset, which I'm super excited about. And then myself and Sarah, the founder of 22 Digital, will be down at South by Southwest in October for the whole week. Ready for lots of chats and drinks and music, and just sponge mode, soaking up absolutely everything. So yeah, Something Digital and South by Southwest, hit me up and catch up.
Awesome, well that all sounds very exciting. I wish I had similar plans, to be honest. I think the meetups are awesome, I’ve been to a couple of those, so definitely worth heading up to. And obviously South by Southwest would be amazing. I’ve heard very good things about the events around Brisbane as well.
Awesome. Look, thank you both for coming on again, it's been a pleasure. And I'm sure we'll chat again after this. This is the awkward ending. Thanks Stu, thanks everyone.