Even in the AI Era, Communication is the Central Theme

Rob Collie

Founder and CEO

Justin Mannhardt

Chief Customer Officer

AI is rewriting the rules of analytics. Copilot can pull answers straight from your semantic model and bypass the dashboard entirely. But for all the tech fireworks, the same old truth holds: communication is still the hardest part. Stakeholders don’t always know what they want, builders don’t always know how to translate it, and requirements docs have never fixed that gap. Copilot just puts the tension in sharper focus.

Rob and Justin dig into why vanishing chat histories aren’t just inconvenient, they erase the most honest record of what stakeholders actually care about. Screenshots and Word docs are a band-aid, not a solution. Persistent, shareable conversations could change the way model developers and business users collaborate, but only if governance and security evolve fast enough to keep up. Along the way, they show why usage data from Copilot queries is miles ahead of click stats on a dashboard and why the story of your data has always hinged on the same thing: people understanding each other.

Dashboards may have set the stage, but conversation is where the real action is. Listen now and see what happens when the chat itself becomes the deliverable.

Episode Transcript

Rob Collie (00:00): Hello, Justin, and welcome back from death's door.

Justin Mannhardt (00:04): Thank you.

Rob Collie (00:04): You go to great lengths to avoid sitting down to record podcasts with me, like contracting the plague.

Justin Mannhardt (00:11): This is what is the most unfair reality about it, although I'm very grateful for this, is everybody else in the household just dodged it, and that rarely happens. Simultaneously, I'm like, WTF, everybody, this happened to dad, why isn't it happening to anybody else? But I'm also, I'm glad the kids and April and everybody's healthy and happy.

Rob Collie (00:35): Yeah, if you could choose, that's what you would want.

Justin Mannhardt (00:38): Yeah.

Rob Collie (00:38): At the same time though...

Justin Mannhardt (00:41): Because when y'all get sick, I get sick.

Rob Collie (00:44): Exactly. The children are the vector. And doesn't your wife also work in the medical profession, so-

Justin Mannhardt (00:50): She does, yeah.

Rob Collie (00:50): ... everyone's on the front lines of germs all the time. You're not on the front line of germs, you're in a private office by yourself.

Justin Mannhardt (00:57): Yep.

Rob Collie (00:57): You're not going to be the one that brings it home.

Justin Mannhardt (00:59): Of course not.

Rob Collie (01:00): But you're the one that gets hit the hardest, because everyone else's immune systems are battle-tested.

Justin Mannhardt (01:06): That's actually a theory that I think I've proven, because of all the kids going to daycare for so many years, and now the school, April worked in the ER for a number of years and was a primary care provider, and just always healthy.

Rob Collie (01:21): We eat Ebola for breakfast, we just sprinkle it on our cereal.

Justin Mannhardt (01:25): I don't believe this, but April never actually tested positive for COVID ever.

Rob Collie (01:32): She was swimming in COVID.

Justin Mannhardt (01:35): Yeah. She was just living her life at the place where testing happens or where vaccines were getting given out and just sick people all the time, no, she's fine.

Rob Collie (01:50): All right. So we were talking earlier today, the solo podcast, you had some thoughts after listening to it last week. Why don't we dive in there? There was something particular that you wanted to talk about, I think.

Justin Mannhardt (02:01): There were some interesting nuanced threads that I hadn't really stopped to think about that I started thinking about after your last episode. We've been talking about together, and you talked about it last week, how tools like Copilot over semantic models and Power BI are going to fundamentally change how we think about BI and data and dashboards, and the thought I've been having has to do with what's become, for me anyways, a very well-understood dynamic between stakeholders and people building solutions for those stakeholders. That's how we've been doing this game forever.

(02:48): And one of the challenges that emerges in that environment, you could be building Power BI solutions, you could be building Power Apps, you can just be building software, is the success people have as stakeholders communicating and articulating what it is they want or need... We always used to say people don't know what they need until they see it, and so we always acknowledge it's very hard for them to articulate what they're really trying to ask for. And at the same time, it can be challenging for the team or the person trying to build a solution for someone to understand what it is they are wanting, and that's why here at P3 we've always valued a high tempo, very iterative process, because we know this exists.

(03:41): And so, I just got to thinking, now Copilot is going to enter the chat, no pun intended, and it's going to be sitting there in front of a stakeholder or a user, and this user probably has the same type of challenge in being able to articulate what it is they need. We've seen some really impressive things, where Copilot can answer clear questions with the right answers. I was like, well, where does Copilot need to be in understanding that the user maybe isn't quite understanding it themselves, what they're trying to get at, and then how that affects someone that's responsible for semantic models, whether they have that type of telemetry to understand that that challenge has occurred and what they might do about it. So it made me think about some interesting things about the dynamics between people, the way Copilot would serve the user in that scenario, and what we think the evolution of that Copilot experience needs to look like with that in mind.

(04:45): That's something I haven't really pulled back the curtain on myself too much yet, and so that was one of my reactions. I've been daydreaming about it, Rob.

Rob Collie (04:53): It certainly invokes some of the themes, that if you zoomed back and said what are some of the themes of this podcast over the past several years, a lot of these same concepts are coming up here, they're just in a completely different light. You're highlighting the communication cost, but also the communication bottleneck between creator and consumer of dashboards, and the communication bottleneck is what made traditional BI projects so expensive, this communicating via requirements documents, which are themselves a very, very, very inefficient artifact, they consume tremendous amounts of time to create, they're full of miscommunication, and they rest on the flawed assumption that people actually know what they want, they actually know what they need, and can communicate it, and that you're talking to the right people when you're asking those questions, and then also, when you turn it around and produce it for them, that you produce it in a way that they can understand. So that's always been a theme here.

(06:04): With our Faucets First methodology, not only do we give the business value quicker, quicker ROI, you actually get to the meat of what's real instead of spending so much time focused on a requirements document that just soaks up such a huge percentage of the cost of the project, developing that thing and then executing against it, when it's a myth that it's ever going to be correct.

Justin Mannhardt (06:28): It makes me question, and I wonder how well a reasoning model would do at thinking through this, how many total person-hours have been lost to trying to explain the difference between business requirements and technical requirements. Those are the two prominent sections on a requirements doc template.

Rob Collie (06:50): You're kidding. I didn't even know that. That's just awful.

Justin Mannhardt (06:53): Yeah. I'm going back 10 years of my career or more and just [inaudible 00:06:57] who is this for? Is this for me, is this for you? I don't get it.

Rob Collie (07:01): What it says to me, this is the half of the document that I read and this is the half of the document that you read. Isn't that great? That's going to work great when each of us are paying attention to half of the document.

(07:13): But also, another theme of ours has been how when you're doing it right, and it's so hard to do this, you're working backwards from your stakeholders, your users, their workflows and their needs, and not working forward from the data, you're not trying to push things at them. Like at Microsoft, when we were designing software, we would do user visits and customer visits and things like that for exactly that purpose rather than trying to just push things at them, we were doing it right. And the leverage on a dashboard, it's great, but it's not like we're building multi-billion dollar software. The time to spend with those people doesn't necessarily feel sufficiently high ROI. But again, when you're doing it right, that's what you should do.

(07:58): All that communication and miscommunication costs. Communication with yourself, what do I really need? I don't know. When someone asks me what I really need data-wise, oftentimes I'd feel like I was being interrogated. I'm supposed to have a smart answer, my biggest goal in that conversation is to come off sounding smart and, more importantly, not look dumb, as opposed to actually answering the question in a way that's going to be meaningfully helpful to me. And then, there's just so many places where this breaks down. Even if the creator of the dashboards and the data models actually nails it, the output, the way it's described, the way it's laid out, whatever, might not be the way that the person was thinking about it, it might still be intimidating or hard to understand.

(08:44): And one of the things that I learned the hard way multiple times is that when a dashboard user isn't getting value out of the dashboard, the reason for it is they either blame themselves or they blame the data, they blame the whole data culture, as opposed to identifying, no, there's nothing wrong with me and there's nothing wrong with being data-driven, it's just that the dashboard needs to be improved. That's the least obvious conclusion that they come to. So they say, "I'm out on dashboards, dashboards in general, I'm out on them because they never have done what I wanted," and it's like, no, they were poorly executed.

(09:20): The punchline on all of this is that all of these barriers, all these obstacles in between, it's like Clark Griswold in Christmas Vacation where he dramatically, in slow motion, brings the two ends of the extension cord together and plugs them in and everything finally lights up, it's that kind of moment, because when a user sits down with a chat interface and there's no one looking over their shoulder, they're just going to ask for what they need, they're going to ask questions of it that they wouldn't ever even come up with if they were looking at dashboards, because the dashboards just tell them what's possible, it limits them.

Justin Mannhardt (09:55): But it creates an unconscious cognitive bias from the jump, the moment you see something, there's a lot of theory and science around the way you would lay out a dashboard because of this, so now the thing's gone.

Rob Collie (10:08): Yeah, the thing that tells you what's possible, that limits your thinking, is gone. Interestingly, this is a really great example of this, something I would never ever, ever, ever think to build a dashboard for, nor would anyone ever think to ask for a dashboard that did this, but when they sit down in front of a chat interface, with my hockey model, Ryan Spar, the commissioner of the hockey league, asked Copilot, "Hey, can you take the current roster of players and assign them into seven evenly balanced teams?" That's a question which would never ever come up, like an optimization type question. It turned out the current version of Copilot didn't do a great job of it because it was assigning teams and using the same players on multiple teams.

Justin Mannhardt (10:54): Oh.

Rob Collie (10:56): So I think that's one of those things that could get fixed. You're just not going to run into these barriers.

Justin Mannhardt (11:04): There are some interesting new twists and turns, I think, coming at us.

Rob Collie (11:10): Indeed, yes.

Justin Mannhardt (11:11): For example, I played with Copilot yesterday against our internal P3 reporting.

Rob Collie (11:20): Oh, excellent.

Justin Mannhardt (11:22): I didn't have a real catalyst, other than let me just hammer this thing and ask lots of questions, and I had some cool takeaways. That chat is gone, I can't go back and get to that today. Microsoft has got to fix that.

Rob Collie (11:37): Right. Like in the same way that you can go back to a ChatGPT chat or a Claude chat, that was a valuable conversation I'd like to revisit without having to ask all the questions all over again.

Justin Mannhardt (11:46): Yeah. I was like, I can't even remember the questions I asked. Something simple like that will get fixed. But I think an assumption we have is this chat experience is going to serve a lot of the front-end need, make charts, point me to the right charts, take me to the right reports. And so, in that exchange I was having the other day, it created some little snapshot visuals along the way, and it's almost like, well, I want those artifacts to come back so I can click on them and see them. To your point about the autopilot we have when we go to the reports we use every day, the dashboards we use every day, and we can just be so efficient with our mouse, clicking around and finding the things we want to look for, it's like there's an interesting hybrid experience that isn't quite there yet. Whereas I had this whole exchange the other day with Copilot, it did some things that I found interesting, but it's just gone.

(12:48): Where human-in-the-loop is going to become a thing is, for example, if you were my BI person that I've worked with for years and I find something interesting or I get stuck and I want to get you involved in all of that. Now we've got human A, human B, and Copilot, so what's this new dynamic of collaboration going to look like? I think we understand and have theories about parts of it, like how the end user's experience is going to change dramatically and how the model developer's responsibility is going to change dramatically. But how the collaboration between all these parties is going to change is something I think we're wrestling with behind the scenes as well. But yeah, it's an interesting exercise to play out the movie before you get there, what is this going to look like in these different moments?

Rob Collie (13:39): I am really, really, really excited about how the usage data of all of this is now going to be useful in a way that it's never been before. I publish a bunch of dashboards, then I go look at usage statistics and I see who's engaged and who isn't. So I'm like, oh boy, not everyone's engaging, I really wish people were engaging. Or you look at the people who are engaging and try to divine their story from their usage data, what are they really trying to do? You've got all these clicks and not-clicks and everything, but you have no idea what their intent is, you're trying to read the tea leaves, and this is even if you've gotten that far. We've had Gil on the podcast with BI Pixie because it does a better job than the default instrumentation, it actually tells you which visuals were being clicked and not just the queries that were being generated. Of course, you need that.

(14:37): But when you get to stare directly into their intent in the form of questions that were being asked, you're going to learn so much. You're going to learn so much about what people are doing and what they want to do and all kinds of ideas and opportunities that emerge from that, and also things that just aren't working. They're asking questions that your model isn't capable of answering, and it's not your fault that your model can't answer it because you never knew that this was even a thing. Now, you know. So now, you've got to do something about it or decide that it's low ROI and you shouldn't. But the reason to add capabilities to the model is to meet stakeholder and user needs and demands. I can actually imagine myself just sitting there, in all-day sessions, just reading the questions that were being asked. If I had built a model for some very large organization, dozens or hundreds of users, I could just pore through the logs of questions, and also what the answers were, all day long.

Justin Mannhardt (15:44): I know you could, but we'd probably build an AI apparatus to do all that too.

Rob Collie (15:53): That would take a little bit of the fun out of it now, wouldn't it? Fine, I would read that then.

Justin Mannhardt (15:58): No, I couldn't resist. It's like, no, you wouldn't.

Rob Collie (16:02): That sounds like a perfectly unstructured source of information that could be chewed up by an LLM and...

Justin Mannhardt (16:06): Yeah, and I think that's going to be really important. It can't be overstated: the ability to get that type of instrumentation is gold. Imagine not having that. Because you know there are going to be frustrated user experiences with this thing, no doubt about it. Just take the example I shared with you, the one I was playing with the other day: well, Justin, what'd you ask it? I can't remember. So having access to that type of insight is going to be really useful.
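
To make the instrumentation idea concrete, here's a minimal sketch of what "pore through the question logs" could look like, assuming a hypothetical CSV export of Copilot questions (model, user, question, whether it was answered). No such export exists in Power BI today, which is exactly the gap Rob and Justin are describing, and in practice you'd probably hand the raw questions to an LLM for theme summarization as Justin suggests; even simple grouping, though, surfaces the models with the most unanswered questions.

```python
# A sketch of mining a *hypothetical* Copilot question-log export.
# The CSV schema (user, model, question, answered) is made up for illustration.
import csv
from collections import Counter, defaultdict


def load_question_log(path: str) -> list[dict]:
    """Read the hypothetical query-log export into a list of dicts."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def unanswered_by_model(rows: list[dict]) -> dict[str, list[str]]:
    """Group the questions each model could not answer -- the gaps worth fixing."""
    gaps: dict[str, list[str]] = defaultdict(list)
    for row in rows:
        if row.get("answered", "").lower() != "true":
            gaps[row["model"]].append(row["question"])
    return gaps


def top_terms(rows: list[dict], n: int = 20) -> list[tuple[str, int]]:
    """Crude view of intent: the most common meaningful words across all questions."""
    words: Counter = Counter()
    for row in rows:
        words.update(w.lower().strip("?,.") for w in row["question"].split() if len(w) > 3)
    return words.most_common(n)


if __name__ == "__main__":
    rows = load_question_log("copilot_questions.csv")  # hypothetical export file
    for model, questions in unanswered_by_model(rows).items():
        print(f"{model}: {len(questions)} unanswered questions")
    print(top_terms(rows))
```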

Rob Collie (16:42): That's right. Even when you and I are capturing our experiences with it, we're taking screenshots and putting it in Word. When I want to share something like this on ChatGPT, I share the chat, which is also an interesting security thing. When they eventually add the share a chat feature to Copilot in this space, it's going to have to respect data security, it shouldn't leak answers. Oh, it's going to be fun.

Justin Mannhardt (17:13): Yeah. I was talking with Charlie on our team about this yesterday, about some of the things we hope will get addressed here. Today, governance is a concern we're hearing from customers, they want to understand how all that's going to work. Today, Copilot is either on or it's off, and you can control whether your entire organization has it or it's limited to a security group. But once it's on, I can use the standalone Copilot experience with every piece of content I have access to in our tenant. It doesn't matter where that content is, it doesn't matter what workspace it's in, it just works, and it'll respect row-level security and that kind of stuff, but there's nothing there yet to tell Copilot, "Ignore this." That's something that I think we're expecting to get quite a bit better pretty soon.

Rob Collie (18:05): I thought there's already a setting somewhere that's like this has been prepped and so it should be exposed.

Justin Mannhardt (18:10): Well, based on my experience, I can ask it a question about anything that's in there and it'll work with it. To my knowledge, there wasn't a way to exclude something from Copilot's... It's like the Eye of Sauron, it sees all.

(18:24): When you think about how an individual interacts with data, there's the isolated question of the moment, hey, what happened yesterday? Or, tell me about this, and maybe then there's some follow-up questions. And then, at another point, and this is where I think I was yesterday, there's more of a project lens, like I'm working on more of an initiative and I have lots of questions, and so that's where the ability to go back and recall these chats, work with Copilot, enrich the data, work with it some more, and keep track of all of that comes in, like we're used to with something like a ChatGPT or a Claude where you can go back through the conversational history. I think that's going to be a really important feature with this that's going to help a lot of users understand where they're going in a way that they also can't really do with dashboards.

(19:20): I do this all the time, where I'll go on a dashboard and I'll be clicking around and then I'll be like, I can't really remember where I was at here. I think that's also another maybe sleeping secret with all of this that, if that does happen, could be pretty interesting.

Rob Collie (19:35): That's actually a really good point. I have, from time to time, made PowerPoints that are just screenshots of reports in various states, and I wasn't making them as presentations. People hearing that are saying, "So you made a presentation with dashboards in it. Of course, we all have done that." That's not what I'm talking about. I'm talking about an artifact that I created for me so that I understood a story for myself from the data. Every time you come back to it and re-engage with it, it's such a gnarly detailed thing, so you don't even really quite remember... Not only was I putting a screenshot on each slide, but I was also putting some bullet point notes of my own diagnosis of what was going on there. It's external memory for me, so that when I come back to it, I don't have to recreate those multiple hours of thinking, and these conversations are that kind of artifact... I think I would conduct it differently in this new world if I could come back to the chat.

Justin Mannhardt (20:39): Yeah. I think that's what it's like, okay, we've got this chat-based experience, it's powered by AI and it can work over our semantic models and it's awesome and it's going to get more awesome in the future, and so you think about what happens after that. I was talking to someone earlier today, and the phrase storytelling is used a lot in analytics and data and other domains as well, but this is just a riff on the point you made last week, the dashboard has always been in control of the story.

(21:10): So now, the user interacting this way with Copilot, they're in control of the story, they're in the choose-your-own-adventure spot, but they're also going to want to be able to remember that story, share that story, collaborate on that story. All of those types of features are things that I hope our friends over in Redmond are considering, because I think that's where some of the real hype-to-adoption transition could face some friction, in my view: if I can't get to that level of action and momentum with my team and with other people and persuade people of ideas, then we're all taking screenshots and putting them in Word documents.

Rob Collie (21:54): I introduced my friend at Orangetheory Fitness, the one I talked about in last week's podcast, to a screen grab tool for the first time in his life. He'd never used a screen capture tool. For his purposes, the Snipping Tool built into Windows was sufficient. But I pinned it to his Start menu.

Justin Mannhardt (22:11): Oh, you didn't make him download the fancy thing?

Rob Collie (22:13): No, I didn't make him go to WinSnap or Snagit. I'm team WinSnap, for some reason. The whole world seems to be using Snagit. If you're in my circle, you use WinSnap. Anyway. The idea of even just taking screen captures is not in most people's workflow, but the frequency with which I reach for the screen capture tool when I'm using Copilot is telling, isn't it? Because I know that it's fleeting, I know that it's going away, I better grab it now and put it somewhere else. As you're pointing out, I think it's a really, really astute observation that these chats become artifacts in their own right, they're almost worthy of being documents and maybe even something that itself should be searchable by the broader O365 Copilot, right?

Justin Mannhardt (23:01): Yeah, sure.

Rob Collie (23:02): I had this conversation with our business data about this topic, I want that conversation to be an input to the summary write-up that I want to do, and it might even be a summary write-up for me over the course of seven, eight, nine different things that I've done, it's not just that one thing.

Justin Mannhardt (23:23): Oh, very, very, very interesting. What a cool thought. You made me think about there's a... Feature is the wrong word, it's a general capability that I know I value and I know you valued it, is the ability to have... The conversation itself is an artifact, but then there's something else that's an artifact. So for me, for example, I've really enjoyed using canvases in ChatGPT, so if I've got projects or plans or ideas, that we can be co-editing a document, essentially, or like what you've been showing me with the agents you've been working on, you really value having this database behind it that you can save things into and update together, and same kind of ideas here with Copilot/Power BI is we're creating a story that we want to tell or getting onto something we want to leverage, where does that go? Does it go into a presentation? Does it go into a summary? I certainly want to be able to come back to the conversation.

(24:24): If we all had perfect empty schedules, we could sit at problems and work them until we're satisfied, but we start on a problem, then we have to stop, go to a meeting, we want to come back to these things.

Rob Collie (24:34): It's easy to understand why, in this early version, they haven't made chats persistent or shareable, even just thinking through, again, just the security concerns. Should my long-running chats be something that other business colleagues can search? Yes and no. Let's assume that they have permission to the data that I was using and there isn't row-level security that means that they're different than me, that's something that has to be evaluated, first of all.

(25:06): Those chats, if they become artifacts in any persistent sense and discoverable sense, they need to pass through the entirety of the security model and it needs to be, by the way, the lowest common denominator of access, because I can have a chat with multiple different models at the same time, so if I go to share that chat with someone, you might have access to two of those three models, but you don't have access to the third one, or you don't have row-level security access to the types of things that I was seeing, it's going to have to stop that chat from being shared or redact whole sections of it. You're still going to probably be able to infer some things from the rest of the chat.

(25:48): It's so wild, the security, the share button makes it so easy. Of course, we have the same problem with documents today. I could take screenshots, put them in a Word doc and share them with you, and next thing you know, the cat's out of the bag. But it just seems psychologically a little closer to the data when I'm sharing a chat that's all about the data. It's one of those things that IT could be blamed for, whereas no one's going to blame them for someone slapping together a Word doc.
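
For the sharing scenario Rob walks through, here's a rough sketch of what a "lowest common denominator" check could look like. The Chat and AccessDirectory structures are entirely hypothetical, nothing like this API exists in Fabric today; the rule is just the one described above: the recipient needs access to every model the chat touched, and their row-level-security slice has to match the chat owner's, otherwise the share is blocked (or would need redaction).

```python
# Hypothetical sketch of a "lowest common denominator" share check for a
# Copilot chat that queried several semantic models. Not a real Fabric API.
from dataclasses import dataclass, field


@dataclass
class Chat:
    owner: str
    models_referenced: set[str]  # semantic models the chat actually queried


@dataclass
class AccessDirectory:
    # user -> set of models they can read
    model_access: dict[str, set[str]] = field(default_factory=dict)
    # (user, model) -> the RLS filters applied to that user on that model
    rls_filters: dict[tuple[str, str], frozenset] = field(default_factory=dict)

    def can_view_share(self, chat: Chat, recipient: str) -> bool:
        """Recipient must have access to *every* referenced model and see the
        same RLS-filtered slice as the chat's owner; otherwise block or redact."""
        recipient_models = self.model_access.get(recipient, set())
        if not chat.models_referenced <= recipient_models:
            return False
        for model in chat.models_referenced:
            owner_slice = self.rls_filters.get((chat.owner, model), frozenset())
            recipient_slice = self.rls_filters.get((recipient, model), frozenset())
            if owner_slice != recipient_slice:
                return False
        return True


# Example: Justin lacks the Hockey model and sees a different RLS slice of Sales,
# so Rob's chat can't be shared with him as-is.
directory = AccessDirectory(
    model_access={"rob": {"Sales", "Hockey"}, "justin": {"Sales"}},
    rls_filters={("rob", "Sales"): frozenset({"Region=All"}),
                 ("justin", "Sales"): frozenset({"Region=Central"})},
)
chat = Chat(owner="rob", models_referenced={"Sales", "Hockey"})
print(directory.can_view_share(chat, "justin"))  # False
```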

Justin Mannhardt (26:15): It's funny too, let's say I've got a document and it's in a particular team, SharePoint or OneDrive and everything, and there's only certain people that have access to it, but I create a share link and it's like, well, now, anybody could have this link. I rarely connect the dots of, oh, I need to change that to not be anyone.

Rob Collie (26:34): Yeah. I probably think about that more than 99% of users of Microsoft software, and I'm still a coin flip as to whether I think about it. So it's an interesting question. Here's the thing, if you're the Copilot team, you don't want the bad PR of Copilot being a leakage vector, even if it's no different than the Word doc screenshot vector for sharing stuff. You know that the world is capable of seizing on this as bad PR. Even me bringing it up is me intuitively being aware that this thing could get blamed for something, kind of unfairly, but it doesn't matter. They've got to be really careful about this.

Justin Mannhardt (27:24): Yeah, you're going to have this new way of interacting with things. And so, I think some things are still going to be the same, like when I interact with a dashboard, I take away things from that interaction and decide I need to talk to you or Kellan or my team about something, or I've got to go call a client, or I've got to change my mind about how I'm approaching a project I'm working on. We're going to have all those same what's next after these experiences with Copilot, and I think the artifacts being saved or ability to share them or move those into other workflows is an interesting idea that I hope is coming along.

Rob Collie (28:00): It's really neat to me how much communication remains the theme, what are these insights and understandings, and also, honestly, formulation of strategies, which is something that we're just not accustomed to asking our data models about. Some of our chat experiences with other data sources, we've got a couple in our private interface, our front-end that we've been cooking up for interacting with LLMs, we have a couple of data sources loaded that are just fronted by SQL, they're not sophisticated semantic models, and so-

Justin Mannhardt (28:33): Just the table.

Rob Collie (28:34): Just the table, and they're performing incredibly well in terms of the type of conversation you can have with them, it's just jaw-dropping. And I think one of the things that Charlie and I have been talking about is using these experiences as a means of setting the bar for what we should expect from Copilot.

(28:50): The thing that's unfair is that these data sources are relatively simple by comparison. If we had five different line of business systems that we needed to splice together with all kinds of really nuanced business logic that's specific to our company, then these raw SQL experiences wouldn't work. But for the data sources we've loaded, they're more than adequate, and they are not just answering questions, but they're also able to help brainstorm strategic courses of action, which is, again, brand new.
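
As a point of reference, the "just a table fronted by SQL" pattern Rob describes is roughly this: give an LLM the schema plus the natural-language question, let it write a SELECT, run the query, and hand back the rows. Here's a minimal sketch; ask_llm() is a stand-in for whatever model call P3's internal harness makes (that harness isn't public), and the sales schema is made up for illustration.

```python
# A rough sketch of chatting with "just a table fronted by SQL" via an LLM.
import sqlite3

SCHEMA = "CREATE TABLE sales (order_date TEXT, region TEXT, amount REAL);"


def ask_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g., a chat-completions request)."""
    raise NotImplementedError("wire up your LLM client here")


def answer_question(conn: sqlite3.Connection, question: str) -> list[tuple]:
    """Have the LLM write a SELECT for the question, then run it and return rows."""
    prompt = (
        "You write SQLite queries. Schema:\n"
        f"{SCHEMA}\n"
        f"Question: {question}\n"
        "Reply with a single SELECT statement and nothing else."
    )
    sql = ask_llm(prompt)
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("refusing to run non-SELECT SQL")  # basic guardrail
    return conn.execute(sql).fetchall()


# Usage sketch (once ask_llm is wired up):
# conn = sqlite3.connect("ops.db")
# answer_question(conn, "What was total revenue by region last quarter?")
```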

Justin Mannhardt (29:28): Brand new.

Rob Collie (29:30): It's really unbelievable. All these different types of communication, for example, these insights and even new strategies for optimization and things like that, until they can be socialized and activated, they don't touch the world.

Justin Mannhardt (29:43): Bingo.

Rob Collie (29:44): They don't make any improvement. And so, it is about communication over and over and over again, and as you say, just in the course of this chat, you pointing out that your Copilot chat is itself a valuable thing to revisit. So obvious to me now that you've said it, hadn't struck me yet. I was happily taking screenshots and dumping them into Word docs. What the heck was I... Now, I'm going to be very unsatisfied with that experience waiting for the real one.

Justin Mannhardt (30:12): Yeah, yeah.

Rob Collie (30:13): So thank you.

Justin Mannhardt (30:14): And honestly, it hadn't dawned on me, because I'd used Copilot on several occasions just to be like, what is it capable of, but this was the first time I was like, I actually want to go pick up where I left off.

Rob Collie (30:27): Yeah. I think I frequently want to pick up where I left off, but I'm not allowed to. But I can with this custom UI Harness that we've got, right?

Justin Mannhardt (30:36): Totally.

Rob Collie (30:36): I do, I came back to it today and asked a crazy question. I'm just deliberately trying to stump these things and failing to stump them. So I'm looking forward to continuing to discover these sorts of nuances. So many of the assumptions we make about this stuff, whether explicitly formulated or unconscious, are invalidated. It's like, nope, nope, new rules, and you're going to go discover those new rules and those new needs.

Justin Mannhardt (31:10): Well, I don't know about you, but I will be staying in the chat, to bring the pun full circle, because these are the things to figure out.

Rob Collie (31:18): Well, it's good to have something so exciting, so energizing. I'm as jazzed about all of this as I was about Power BI in the first place, and the last time I was this excited about something, I started a company. It's a big deal.

Check out other popular episodes

Get in touch with a P3 team member


Subscribe on your favorite platform.