Good Tech Things and the Limits of AI, w/Forrest Brazeal

Rob Collie

Founder and CEO. Connect with Rob on LinkedIn

Justin Mannhardt

Chief Customer Officer. Connect with Justin on LinkedIn


Dive into this week’s episode where we explore the fascinating intersection of technology and creativity with Forrest Brazeal. Forrest, a former software engineer turned tech educator, uses his unique talents in art and music to break down the barriers of traditional tech communication.

This episode features Forrest sharing his journey from coding complex software to crafting educational and engaging content that lights up the tech community. He discusses how artificial intelligence is woven into the fabric of our daily tools, making it essential yet invisible, and gives practical examples that bring this technology to life for listeners of all backgrounds.

Forrest also takes us behind the scenes of his creative process. Whether it’s sketching out a cartoon that simplifies cloud computing or composing a catchy tune about network security, his methods are as effective as they are enjoyable. It’s a fresh take on tech education that proves learning about technology doesn’t have to be daunting—it can actually be a lot of fun.

Intrigued by the blend of tech and creativity? Subscribe to Raw Data by P3 Adaptive on your favorite podcast platform for new episodes on data, tech, and their business impact every week.

Good Tech Things
The AI Cartoon – Pile of Complexity

Episode Transcript

Rob Collie (00:00:00): Hello friends. In our episode a while back with Sadie St. Lawrence of Women in Data, we talked about so-called slash careers, people who don't have a single role but multiple, thus requiring the use of slashes in between said roles like in your LinkedIn profile.

(00:00:16): Well, today's guest, Forrest Brazeal, embodies that as much as anyone I've ever met. Software engineering leader, developer advocate and evangelist, author of one of the best blogs I've encountered in a while. Oh, and he is also an illustrator, songwriter, musician, a simplifier-of-complicated-things kind of communicator, and now co-founder of a tech influencer platform that's filling a hole in the marketplace. Speaking of holes in the market, he has his own name for the slash phenomenon: sometimes you discover that there's a you-sized hole in the world, a niche where your particular mix of skills and experience meets the marketplace.

(00:00:52): Well, Forrest's version of that has a very unique shape, and for the first time ever in the four years doing this show, I felt the symptoms of imposter syndrome in the presence of a guest. It's funny. We've interviewed some real luminaries on this show like Shishir Mehrotra, who only ran YouTube for a while, or Arun Ulag, who owns basically the majority of the Microsoft technology that we care about as a corporate vice president. Dave Gaynor, also a corporate vice president, who runs a big chunk of the Office org, and even former Bing Chief Marketing Officer, Mike Nichols. Heck, we've had former NFL quarterbacks on this show like Hugh Millen. And none of them rocked me on my heels at all.

(00:01:37): So why did Forrest? Well, he and I, our talents run in similar lanes like observations, insights, cutting through the bullshit, simplifying things down to plain language. Well, then he goes and adds things like illustrator and singer/songwriter, and suddenly I'm questioning, well, who the hell am I? It got me to thinking about imposter syndrome and whether it requires some degree of similarity and running in the same lanes. It probably does, doesn't it? Well, that's a subject for another day. And I'm glad he's an illustrator because it was originally one of his web comics in my LinkedIn feed that led us to reaching out about appearing on our show.

(00:02:15): Now, he's not from our so-called corner of the tech landscape. He spent more time with Google and AWS tech platforms than with Microsoft platforms, and he's more of a software engineering professional than a data professional. But those worlds intersect quite a bit in the domain of AI. And his experience being from distinctly outside of our usual ecosystem makes his input even more valuable to us while we all try to get a handle on the trajectory of AI and its implications for our businesses and careers. I don't want to spoil too much in the intro here, but it was just kind of next level. Just a super smart, compelling, and authentic human being who's also incredibly up-to-date with the goings-on in the higher echelons of the software world. It was one of those conversations that I'm probably going to find myself circling back around to from time to time.

(00:03:02): So now we bring you the oh-so-uniquely-shaped Forrest Brazeal.

Speaker 2 (00:03:08): Ladies and gentlemen, may I have your attention, please?

Speaker 3 (00:03:12): This is the Raw Data by P3 Adaptive Podcast with your host Rob Collie and your co-host, Justin Mannhardt. Find out what the experts at P3 Adaptive can do for your business. Just go to P3Adaptive.com. Raw Data by P3 Adaptive is data with the human element.

Rob Collie (00:03:40): Welcome to the show, Forrest Brazeal. How are you today?

Forrest Brazeal (00:03:43): I'm doing great, Rob. Thanks so much for having me.

Rob Collie (00:03:45): This is one of those really cool, not really chance encounters, but just like something goes through your feed, this massive river of information flowing across our screens all the time and it catches my eye. And one of the beautiful things about having a podcast is that it's a professional excuse to meet interesting people. We're completely leaning into that benefit today.

(00:04:07): You're kind of difficult to define professionally. You're one of those slash people. You've got a lot of things going on in parallel. Why don't you tell us a little bit about your history and take us up through a little bit of today, what you're up to these days?

Forrest Brazeal (00:04:18): Oh, I'd be happy to. Well, first off, Rob, thanks again for having me. My background is as a software engineer, a leader of software engineering teams. That's my education, that's my experience. I've worked for a lot of companies ranging from startups to the Fortune 50, including some folks in Indianapolis where I think some of y'all are based, just helping them migrate large legacy systems to the cloud, all that kind of thing. I definitely consider myself more sympathetic to the ops side of the great DevOps divide. It's in my DNA. It's in my background, but I think you're absolutely right that for a lot of us, and I think this is true of just about everyone, if you continue to grow professionally, you reach a point where there's not a great descriptor for what you do. You'll discover that there's just kind of a you-shaped hole in the world.

(00:05:01): Only your skills, your connections, the absolute limit of what you're able to push yourself to do professionally can fill that void, and it's up to you to decide if you're going to step into that space or if you're just going to allow yourself to continue to be defined by whatever pops up in job postings. I've been very privileged and fortunate in my career to be able to gradually over time accrue this weird collection of things that I enjoy doing. Again, being in the software engineering space for a long time, cloud engineering, cloud architecture, and then transitioning from that for a few years, kind of more into the business go-to-market side of tech.

(00:05:32): So I had been consulting for a while. This was back in the mid-2010s with a small little cloud training startup called A Cloud Guru. They had been started out of Australia by a couple of brothers named the Kroonenburgs, and I came on when they were just a few people to help them figure out how to talk to engineers more effectively. It turned out that I really liked doing that. As you've mentioned, I like to draw cartoons. I like to sing. I've got a YouTube channel where I do ridiculous tech songs, and if you have not been exposed to those yet, count yourself lucky. But I came on eventually to do that full time and we ended up selling A Cloud Guru to Pluralsight in the summer of 2021 for about $2 billion. It was one of the largest acquisitions by any tech company in the history of Australia.

Rob Collie (00:06:12): Wow.

Forrest Brazeal (00:06:13): After we finished that, I was at Pluralsight for about five minutes, and then I went from there to Google Cloud and I worked at Google Cloud until January of this year. I had all sorts of made up titles again.

Rob Collie (00:06:22): We have to call you something, Forrest.

Forrest Brazeal (00:06:24): You have to call me something. Yeah. Cloud Bard was my unofficial title when I first got there, and then Google released something called Bard.

Rob Collie (00:06:30): Bard. Yeah.

Forrest Brazeal (00:06:31): However, now, Bard has been deprecated again in favor of Gemini. There is no Google Bard, so I think that I survived the Google deprecation cycle and I get my preferred title back. So I did that at Google for a couple of years. Basically, I was their guy who helped them figure out how to talk to developers, and we got to do some really exciting fun things through that. I ran a newsletter for a while called The Overwhelmed Person's Guide to Google Cloud, where we just tried to contextualize that whole fast moving ecosystem for people, and that was a real blast.

(00:06:57): But I wrapped that up in January of 2024, just a couple of months ago, and kind of took a big leap into exploring whether what I enjoy doing, which is communicating technical things to people in really unusual ways, would make sense as something I could do on my own. So you've encountered some of the cartoons and other things that I do. I actually do have a startup that I've just recently launched with another wonderful, wonderful co-founder in tech, Emily Freeman, which is focused on scaling more of those types of services to be able to provide them to a whole bunch of businesses. We're very excited about that. We're learning a lot through it. I don't know. I'm just having a good time. It's been a good year.

Rob Collie (00:07:32): The AI cartoon, that was reason enough to have you on the podcast. It doesn't require a lot of evidence, especially when you have that sense. That's kind of the spider sense. This would be an interesting conversation. But then when I started researching you, I've got to tell you, I got a pretty solid dose of the imposter syndrome from reading your writing. So I recognized some similarities. There's definitely a similarity between me and you, but as I'm reading the things and I'm seeing the things you're doing, I'm like, "But I can't do these things."

(00:07:57): There are few things more unsettling than finding a seeming kindred spirit, but you're like, "They're a little bit more eloquent. They're a little bit more of this, a little bit more of that." It's one thing when you're different. You're like, "Oh, this is my gig, and he's really good at it."

Justin Mannhardt (00:08:09): You mean singing, Rob?

Rob Collie (00:08:11): Well, I can't sing for sure. I can't write music. I can't sing. I can't even draw. Damn it.

Forrest Brazeal (00:08:18): You warned me before we started recording that you had a tendency to get too self-deprecating, and I'm just going to call it out. I think you're there.

Rob Collie (00:08:23): I appreciate the coaching. See, that's exactly what a Forrest Brazeal would do, right? That's the kind of thing he does, right?

Forrest Brazeal (00:08:29): There you go. No, but you've mentioned this AI cartoon a couple of times now, so I will call it out, and I've shared the link with you all, but if you go to goodtechthings.com, you can find not only that cartoon, but many, many, many other cartoons, which are mostly just my attempt to... They're thinking aids for me to help me think through, how do I feel about AI? How do I feel about the new AI Safety and Security Board that DHS just announced? How do I feel about what it means to be able to debug certain kinds of issues and how that relates to my imposter syndrome? These are things that are easier for me to get a handle on if I can either write about them or if I can draw about them, and they seem to help other folks, too, so I'm just going to keep doing it as long as people will still let me.

Rob Collie (00:09:06): Well, I'm definitely going to be subscribing. So many of these things you're mentioning even in that short little intro are things that I very much vibe with: the communicating of complex tech subjects in a way that's approachable. It's weird. Tech is almost... It finds itself very frequently in a place where it couldn't be more opaque if it were deliberate. It's just so overwhelming, but everything worth doing can be explained simply. Everything worth pursuing. There is a simple explanation for it. It's just that so often the software vendors, for instance, and the platform providers, they don't really take the time to do that. They never stop to bring it down simply. It's amazing what a gap that leaves. We're witnessing another sort of, not revolution, but another generation of big changes coming to the Microsoft platform that we work so closely with, and it is a mess to understand what's really going on. It is not simple at all, even though it probably can be and definitely should be.

Forrest Brazeal (00:10:09): I think you're making an insightful point. It's easy to say, "Oh, well, these vendors are all complexity merchants and they're intentionally making things harder." I think it's overly cynical to say that. I think that in large part, the reason we have such a hard time talking about technology is just because we tend not to be gifted communicators in general in a technical space. And then add onto that, there's ego at play. There are folks who feel like, if I want to be taken seriously and I want to advance in my career, the rational action for me is to make what I'm doing sound as impressive as possible, and that's how you end up with these layers and layers of complex stacks. And you take a step back and you're like, "Hang on. This is not rocket science."

(00:10:48): I wrote a book years ago for Wiley called The Read Aloud Cloud, which was intentionally designed to make the fundamentals of cloud computing so simple that even an adult could understand them. What I discovered as I was going through that process was that for a whole lot of the concepts, I would find myself trying to explain them and going, "Well, hang on. There's really nothing to this."

(00:11:10): I would go around to all these non-technical people in my life and they'd be like, "Well, what's Kubernetes?" Or, "What's this and that?" And I'd be like, "Oh, you'll never get it. Unless you're in the special club, you'd never understand this. It's too hard."

(00:11:20): And then you take five minutes and explain it without the jargon and you're like, "Actually, this is far less complicated than what the doctors and lawyers and accountants that I know do every day." And it's really just ego and insular jargon on our part that's making us think it's some special, arcane, unique thing. That's a mindset thing. That's not an inherent property of the technology.

Rob Collie (00:11:38): Yeah.

Forrest Brazeal (00:11:38): So if we can just be ruthlessly intentional about speaking directly and plainly in small words about what we do, then we can actually start solving the real problems, which always tend to be human problems. They don't tend to be technical problems at all.

Justin Mannhardt (00:11:51): It's interesting, Forrest, because there are two sides to this coin. There's the side where the ego comes into play and we just have a hard time using simple language and explaining things very clearly. And then the people being communicated with, business leaders, executives, the majority of them have a hard time saying, "Forrest, I didn't understand a single thing you just said there." They sort of nod and say, "Yes, we need the cloud."

(00:12:15): So I think encouraging both sides to speak more clearly, more simply, down to earth as much as we can and when you don't get it, say, "I don't get this. Can you explain it to me again?"

Forrest Brazeal (00:12:26): I feel so bad for tech executives sometimes because everybody that's trying to sell to them thinks that they need to be talked to like they're numbers in a spreadsheet, not people, and they've got to use language like, "Let's leverage actionable insights," and, "Let's do strategic transformation."

Rob Collie (00:12:42): Data ipsum.

Forrest Brazeal (00:12:43): Yes, executives are people, too. They take great joy in just being spoken to plainly, in my experience. I don't know. If we're not willing to do that, what are we even doing here?

Rob Collie (00:12:53): There are so many professional and even sort of personal psychological incentives to have it be an ivory tower, even around really, really fundamental things. I take great glee in tearing down these sorts of things, mostly because, like you, I was a software engineer. I was in the belly of the beast. I was one of the people committing these sins. I often refer to myself as a recovering software engineer. I'm here to atone for my past sins.

Forrest Brazeal (00:13:20): I don't want to be down on software engineers. I actually love working with software engineers. Having been exposed to various industries, we're quick to take for granted how amazing the people that end up in software engineering often tend to be. They're honest people. They're people that care about doing a good job. They're people who are intellectually curious. They're people who are systems thinkers. Go work in... Pick another professional industry off the top of your head. You are not going to encounter people like that. I have found overwhelmingly that the software engineers I've worked with are people with great integrity who care about getting the job done. I can work with that. The communications issues are just kind of a smaller thing that we talk about.

Rob Collie (00:13:58): Totally. That's another thing about me. I'm actually not down on software engineers either. In fact, the thing you were just talking about, I actually screenshotted from your Cynicism is the Mind Killer post exactly that topic, the one that you were just talking about. If you don't know any better, you might not appreciate how atypically wholesome these people are.

(00:14:17): The other thing that I really liked from that segment of that blog post was the part about all the different backgrounds that these people come from. They're not churned out by the machine. They have interesting backgrounds. A lot of times, even blue-collar backgrounds, and then they have a collision with software development or a collision with cloud tech. They get that itch and it's like, "Oh my gosh, this is my calling." And that's a super, super strong parallel with what we're doing here at our company. We're a 70-person data analytics consulting firm with about 50 consultants, and the overwhelming majority of those people came from non-tech backgrounds and had those collision moments. We call it the data gene lying dormant. And that integrity thing you're talking about, the highest-integrity group of people that I've been around are these people that love building data models and dashboards and crunching numbers and all that kind of stuff. They're some of the best people that you'll ever meet, and it was really cool to see you calling out that same kind of vibe. It's the software gene, data gene. They're similar.

Justin Mannhardt (00:15:22): They're similar genes. It's the maker gene.

Forrest Brazeal (00:15:25): Yeah, that's right, and I think you could even go farther, and this calls back to my bias for the ops side, so I'm just recognizing that right now, but the rock-solid, real-world ops skills that folks tend to bring into tech when they come from a background of working in the trades, or anything where they have to be hands-on, customer facing, troubleshooting issues involving real infrastructure. I know a guy who was a residential and commercial plumber in Atlanta for a number of years before getting into tech, and some of the things that he had to do, basically tracing techniques, right? Running smoke through pipes and figuring out where there's a leak in a giant building.

(00:15:56): He was basically pair programming with a colleague to work their way through a large, complex plumbing system and arrive at the root of it, ruling out possibilities as they go along. Realizing that business continuity is something that you have to think about when you're troubleshooting a system live in production, if you will. You can't just shut down an entire office building while you figure out a plumbing problem. You got to figure out how to troubleshoot live. Those are all things that translate incredibly well to tech. In fact, tech almost feels like a pale imitation of some of the challenges that you have to solve in that field.

(00:16:24): Unfortunately, I think as an industry we've not done a good job of recognizing how incredibly impactful that skill set can be. We have a preference for folks who come out of four-year college computer science programs. No shade to those folks. I was one of them myself. Those people need jobs too, but those folks are not going to show up on day one understanding business continuity. They just don't have the seasoned maturity. I've spent a lot of my spare time working on this in my career. I run an initiative called The Cloud Resume Challenge, which is specifically designed to surface people like that and get them matched with tech jobs. And the thing is these people don't need my help. They're great at what they do. The people who need help are the hiring managers to understand why these are folks that are worth taking a chance on.

Justin Mannhardt (00:17:02): I wonder if we could draw some parallels and even get your take, digging in a bit deeper. The same thing applies in our world. Someone that's been on the ops or the business side of the house has a very intimate understanding of the business problems and the business challenges. This is my experience. I sort of learned how to do analytics and all that sort of stuff out of necessity. I have to solve this problem. And when people come out of programs where they've learned the technology, the languages, and all these things, they almost need to develop in a different way to bring it all together. So you're talking about surfacing people that maybe have an opportunity to branch over into tech. What about the people that started their journey in tech but have this gap on the ops side like you're describing? What's been your experience with that?

Forrest Brazeal (00:17:49): The thing with the ops skills is there's no school where you can learn them except the school of hard knocks.

Justin Mannhardt (00:17:54): That's right.

Forrest Brazeal (00:17:54): You have to get to a point where you've been the one on call at 2:00 in the morning when something goes down a few times before you can really trust yourself and be trusted in that situation. I mean, the answer, ideally, is similar to how we get people comfortable in the real infrastructure world: you need some kind of apprenticeship model where you put those people alongside seasoned engineers and let them debug these issues in a slightly lower-stakes setting before they have to handle it all on their own. We've been bad in general in tech about applying that apprenticeship model, but tech, software engineering, it's a more trade-like calling than we've tended to want to give it credit for, and it's far past time that we developed standardized apprenticeship structures in tech.

Justin Mannhardt (00:18:33): I love it.

Rob Collie (00:18:33): The close association between computer science education and math education... When I graduated with my computer science degree and then landed at Microsoft, I looked back and wondered, "I don't really think I learned anything in school that was truly helping me here." There was one time I cackled with glee and walked the halls and said, "Hey folks, I just used O notation. Write it down." That was the algorithmic complexity stuff. I used it to prove something, so those four years spent in school, they were worth it for that moment.

Forrest Brazeal (00:19:05): Yeah. I think a good computer science program is going to go a long way toward teaching you to learn how to learn. They're going to throw you into open-ended projects without a lot of step-by-step handholding tutorials, and you're going to be required to bang your head against the problem for a while. I remember that being the most helpful thing about my computer science education, but yeah, I mean, have I had to calculate the time or space complexity of an algorithm very frequently in my career? Not very frequently.

Rob Collie (00:19:30): No. You can just generally tell, okay, this is slow. This needs to get faster. For me, the journey was... As a college student, I was kind of obsessed with this notion of abstract formulations of the world as if they were unlocking some deeper truth and I wanted to hide behind those rather than embrace the world as it was. So I've gone through quite a journey of change. It started at Microsoft, but then certainly has continued since leaving. Oh, right. Yeah. The tools, the technology, all of that abstract computer science, that's only useful if we can bend it to the will of what the people actually need and meet them where they actually are. And that sounds so simple and so obvious, and yet 22 year old me would've looked at whoever was saying that and gone, "What are you talking about?"

Forrest Brazeal (00:20:22): Right. There's got to be a pure technical solution to this, right?

Rob Collie (00:20:25): Yeah. We need to decompose this into just a series of symbols and abstract concepts.

Forrest Brazeal (00:20:30): Yes.

Rob Collie (00:20:30): That does bleed into software engineering, that same mentality. Former version of me was always trying to introduce new nouns into the product, like these new abstract concepts like a content class as if the humans looking at the software were going to know what that meant, imbuing these objects with meaning, some sort of deep abstract meaning. What they really want is verbs.

Forrest Brazeal (00:20:53): Yes.

Rob Collie (00:20:54): Go do this thing for me. It's kind of neat to reflect on that transformation. And also honestly, how little the Microsoft environment of the late nineties, early 2000s, how little help that environment was in making that transformation. There wasn't anyone there really formally urging all of these computer science graduates through that transformation. You just had to luck into the right managers and things like that. I wonder how different it is today, whether there's a rehabilitation that takes place explicitly, "Welcome to the real world. We're going to start solving problems."

Forrest Brazeal (00:21:29): I think there's something almost tragically humorous in that idea that we have a whole industry full of people who are convinced that there's a pure logical solution for everything, and then coming to find out that all the problems are actually human problems. That is endlessly funny. It will never not be funny, and I think that underlies a lot of the cartoons that I draw and things like that. It's kind of bittersweet and wistful because, again, the people that believe this are so pure of heart and so adorable.

Rob Collie (00:21:56): They are. Yeah.

Forrest Brazeal (00:21:57): It's like a Greek tragedy.

Rob Collie (00:21:59): It is. It totally is. And you need people who think that way to build some of the software that the world needs.

Forrest Brazeal (00:22:06): That's right.

Rob Collie (00:22:07): You need the people who are committed to that ethos, even to that mythos. Some software is more transactional than others, but some of the things that we rely on at our company are the result of genius, true computer science genius, that has come out of Microsoft. So you need that commitment, but then you also need this translation layer.

Forrest Brazeal (00:22:28): That's right. And a lot of us are not solving problems of that nature.

Rob Collie (00:22:31): Totally true.

Forrest Brazeal (00:22:32): We're releasing hundreds of thousands of computer science graduates into the world every year (I don't actually know what the number is, but that sounds right) who are convinced that their work day is going to look like sitting around being Leslie Lamport for eight hours a day. I remember it being quite a shock to me when I discovered that I was not spending a lot of my time hardcore algorithm programming.

Rob Collie (00:22:50): That's right. Yeah.

Justin Mannhardt (00:22:51): Forrest, we need you to figure out how to make this button turn a different color. "Oh, okay. Got it."

Rob Collie (00:22:57): Yeah. How do we rationalize all of these customization options across all these controls in a way that people are going to understand? It's always the human.

Forrest Brazeal (00:23:04): Yep.

Rob Collie (00:23:04): I was going to ask you, at what age were you first drawing and realizing that you were good at it?

Forrest Brazeal (00:23:09): I'm not willing to accept the premise that I'm good at it. I think I'm pretty barbaric, but I remember being so obsessed with newspaper comic strips as a kid. I had all the Calvin and Hobbes books and this and that. I would just pore over the old Peanuts collections. I would pore over those for hours and hours. I remember being 11 or 12 and being so convinced that I wanted to be a syndicated newspaper cartoonist. I had read all about this and I remember I drew up all these cartoons and I sent them to, I think it was either Universal Features or King Features Syndicate, which were the two big ones at that time.

(00:23:39): To be clear, I'm not that old, and the newspaper comic industry was already well into its death agonies by the time I was doing this. So it was not a brilliant career move with any future, even if I had been any good at it, but I remember I spent all this time drawing those. I had this Bristol board and I had all this stuff to try to do it just the way Bill Watterson did it, and I sent all this stuff in and I remember getting the form rejection letter back, of course. I remember feeling like my life is over. I think a good lesson for 11 or 12-year-old Forrest was that you kind of just get to do what you want in life and you don't have to wait for some gatekeeper like a newspaper syndicate to say, "Yes, you're good enough to do this."

(00:24:15): I draw things that I like and I'm thrilled that other people seem to enjoy them. Ultimately, it's about creative satisfaction for me.

Justin Mannhardt (00:24:21): Yeah, and just the courage to put it out in the world. I think there's a lot of people listening to this podcast or people we're all connected with that appreciate the humility. You maybe say, "Well, I'm not a great artist, but I like what I'm doing." There's people that have interesting things to say and communicate. Just ship it. Just put it out into the world. You'll find people that like it. You'll find people that don't like it, but if you have something to share, share it.

Forrest Brazeal (00:24:47): There's an interesting balance there because I see a lot of advice out there saying, "Just get into a cadence, ship stuff, record a video every week, write a newsletter post every week. You'll get better and the good stuff will come." And I do think that's true. You have to have the reps. You have to keep working at it. But at the same time, I think that advice pulls a lot of people into a mode where they are just doing content creation for the sake of it, in scare quotes. And instead of writing, I don't know, 100 blog posts, they're writing the same blog post 100 times. They're not really learning. They're not growing from that, and they're just kind of shouting in the void forever. I try to guide people in a different direction than that because I think that it ultimately just ends up with them getting frustrated and abandoning the things that maybe they are passionate about.

(00:25:24): What I try to encourage people to do is write about something you enjoy, and if you really don't feel like there's something interesting enough to you right now to write about, that's okay. It's okay not to write something this week, but approach the week with, what am I interested in this week? What's interesting to me? What do I really want to explore and learn about? To me, that's the place to start from, not from, "Oh, I have to get something out this week, so I better come up with something." That's a more sustainable and I think a more human and joyful way to approach creating things.

Rob Collie (00:25:49): Are you ever surprised at the uptake some of these things get? I think I saw something from you on LinkedIn where you were reflecting or surprised at how much that AI cartoon seemed to hit a nerve.

Forrest Brazeal (00:26:02): It's bewildering. The AI cartoon specifically, I drew that more than a year ago just after ChatGPT had first come out, and I remember it didn't get a lot of attention at the time. Fine. You just never know how these things are going to do. It expressed how I was feeling. Sometimes it just takes a while for people to get to a place where they're ready to hear that. I think the hype cycle was so huge at that moment. And now all of a sudden, people are coming up against the limitations of these services and they're ready to have that conversation.

(00:26:29): I think that the mistake is in trying to chase what you think is going to be the most dank meme that's going to strike the most people, but just speak honestly about how you feel. I have been consistently bewildered that people enjoy these enough to share them and reshare them and they'll pop up in places that I never would've expected. They'll pop up in serious investor reports and things like that. I'm like, "Whoa, this was not intended to be serious thought leadership. You need to replace that with a graphic drawn in PowerPoint."

Rob Collie (00:26:57): Was that AI cartoon, would you say that's been the one that's gone the most viral?

Forrest Brazeal (00:27:02): I don't even think that's the most viral of my AI specific cartoons. There's another one about what developers are good at versus what AI is good at that I think has been seen even more than that, and you can find that on goodtechthings.com as well. It's a little Venn diagram. There's a whole bunch of them that will surface from time to time. As a creator, it's shocking how little you're able to tell whether something you've done is good or not.

Rob Collie (00:27:22): Yeah. You can't predict. Right.

Forrest Brazeal (00:27:24): A tremendous point of humility. When you have a lot of ideas, every idea that pops into your head seems like the best idea you've ever had.

Rob Collie (00:27:31): True.

Forrest Brazeal (00:27:32): You have to rely on other people to help you triage that. It scares me even now that when I think of something, I just don't have any perspective on it until it's out there. I cannot predict what's going to happen to it.

Rob Collie (00:27:42): I feel that in my bones for sure. Do you have anyone in your life that sort of helps you as a sounding board or as an editor or anything like that or pretty much it's just you and the internet?

Forrest Brazeal (00:27:56): I'm grateful to have many, many wonderful colleagues, friends, family members in my life who keep me honest. The internet is an amazing thing because in some ways, the only true honesty I think comes from random people on the internet. People that you know and love are never going to give you the unvarnished truth. But man, people on Reddit, they are mean. They have no filter. If something strikes them as cringey, or strikes them as inaccurate, or strikes them as pedantically incorrect in some tiny detail, they will be all over it. And once you get used to that and you learn to decouple your self-worth from it, it's an incredible gift.

Justin Mannhardt (00:28:32): I'm ready. I want to bend this conversation into the AI bucket since we brought up the cartoon.

Rob Collie (00:28:37): Let's do it.

Justin Mannhardt (00:28:37): So, Forrest, you said we're about one year removed from the cartoon that we've been referencing that caught our attention and we liked. The hype train just keeps on rolling. There's been some really good articles I've read recently about, is generative AI hitting a plateau? Are people really using this stuff? Considering that most technical problems are actually human problems, where are we with generative AI? What's your take on the current state of affairs? I mean, we saw GPT-4o last week. What's your read on where this is going and how might it actually land in the world?

Forrest Brazeal (00:29:12): I'm pretty thankful that I don't live in Silicon Valley. Nothing against people that do, but I think that the hype cycle has been so intense there over the past year and a half, and I mean, I don't know how recently you've been out in the Bay Area, but every billboard is gen AI. Every marketing budget is being reallocated. I'm hearing wild stories from CMOs that I know at large tech companies saying that their entire DevRel team is just being switched to focus on talking about generative AI all the time. Forget all their other products that actually make money. All the growth dollars are flowing into this. So you're seeing it on every billboard, you're seeing it on every booth at every conference, every talk at every conference. I don't remember a hype cycle quite like this since I've been in tech, and I'm in my second decade now. There just hasn't been something burning quite this white hot. Even cloud.

Justin Mannhardt (00:30:01): No, I agree.

Forrest Brazeal (00:30:02): More enterprisey, more stodgy than this is.

Justin Mannhardt (00:30:05): Right.

Forrest Brazeal (00:30:06): And if I lived in the middle of that, I don't think I would have a chance of maintaining any perspective on it because you just have so much jargon and there are so many venture capital dollars floating around. When the money is weighting the conversation like that, it's impossible to see anything else. But I think we are now getting to a point where, at least for folks directly outside of the maelstrom, the eye of the storm there, we're starting to get a decent handle on what LLMs are good for and what they're not good for. They're really, really interesting as just generalized interfaces to human speech. They're amazing at doing that. They're not as amazing at actually having something worthwhile to say.

(00:30:46): So we've seen a lot of creative work with LLMs applied to it. Music, art, text, creative text, prose text, and I think while you don't have the problem of factuality there, you do have the problem that it does seem to be asymptotically mediocre. There's no spark of originality to it. You've seen that by how quickly people have grown to roll their eyes when they see obviously AI-generated text or AI-generated videos. Even the AI-generated blog headers everyone was using for a while, they've lost their novelty and now people look at them and it just looks cheap. It looks like someone clearly spent two seconds on this.

(00:31:22): My buddy Corey Quinn said the other day, and I'm upset at how good this is. He said, talking about AI-generated marketing copy, "Why should I be bothered to read something that someone else couldn't be bothered to write?"

Justin Mannhardt (00:31:36): Wow.

Rob Collie (00:31:37): That is awesome.

Forrest Brazeal (00:31:38): I think that's brutally accurate. And again, if you're not financially incentivized not to see that, it seems pretty obvious. I'm not down on generative AI. I'm not down on LLMs. I think they're really, really interesting and good at what they do. We all remember how magical it felt the first time we got a look at ChatGPT. Specifically, I remember my brother who works at another AI company out in the valley. I'd been on this long text chain with him and one of my other siblings, and he took the whole text chain and put it into GPT-3.5, and it output a hallucinated, imaginary text conversation we might have between the three of us, and I remember reading it and just feeling these chills go down my spine at how absurdly good it was at capturing the nuances of how we talked and the kinds of things we might say. Do you want a really hot take?

Rob Collie (00:32:24): Yeah.

Forrest Brazeal (00:32:25): I don't even know if this is too hot for this podcast or not.

Justin Mannhardt (00:32:27): Yes, please.

Forrest Brazeal (00:32:28): This is my really hot take. I think that what generative AI is ultimately incredibly good at is exposing how little thought and how little original effort goes into so much of the knowledge work that we're doing day-to-day. It's exposing that so many of the emails we send, the briefs that we write, even the public-facing content that we create, have no original value. They're just pale copies of something that's already been done 1,000 times. Generative AI is really good at doing those things. And I think that if we lived in a just world, it would shock people into realizing, "Hey, I've got to do better." If generative AI can do this, that automatically means it has little value. It's not that generative AI is creating something that's valuable for you. In many cases, generative AI is exposing that what you were doing didn't have value to begin with. That is really the existential thing that's hurting a lot of people when they try to think about this.

(00:33:22): I firmly believe that original creative work in either a business or a traditionally artistic domain that looks freshly at the landscape and takes a human perspective is unmatched. Generative AI is nowhere close to playing in that ballpark and has shown no signs of being able to generalize enough to do that. We have not seen any emergent general capabilities out of LLMs that would imply they're going to get any better at doing that, but I think a lot of folks find that hard to hear, too.

Justin Mannhardt (00:33:49): They do. I think you can make a two-column list, in a way, which is, here are the opportunities and possibilities that are most realistic and most helpful to humanity with generative AI. I use these things every single day in my job. I love the way you characterized that for us. It's the things I know are of little value. I need help getting off the blank page with some type of proposal or thoughts for a presentation. I would normally get on a call with someone like Rob and be like, "Okay, I'm thinking this, this, this," and we would rubber duck and we'd refine it. So that's really good. To your point, there's really good things there.

(00:34:24): But I think the big problems, to your point, I haven't seen anything that convinces me we're on the way to really solving this in a way that it would take it to the next level, which is the hallucination problem. Not only solving for that problem, but solving for what is truth and what is fact and who decides?

Forrest Brazeal (00:34:41): Yeah.

Justin Mannhardt (00:34:41): And then what I call the being inspired problem. The LLM is never going to be inspired. So in the work you're doing as a software engineer or as an analytics professional or anything in tech, the real breakthroughs happen when someone is inspired or they have this inspired moment to go somewhere they didn't think they could go, and the LLM doesn't know how to do that. So I see these things where they're like, "Oh, look, I uploaded my data frame of data and it spits out all these statistics and all these visuals." I was like, "Yeah, but it's never going to be inspired. It's going to give you the expected insights, the expected five-number summaries." It's never going to be inspired for the secret sauce. It's the human that's going to get us to that point. I think it's important that we remember that.

Forrest Brazeal (00:35:25): To be fair, I know there's lots of domains, like certain kinds of financial modeling and things like that, where that's actually exactly what you do need, right? And it just then becomes a tremendous labor-saving device for those folks, but you'll still need a professional in the loop. The one thing we haven't touched on here, of course, is code, and I didn't mention that at all when I was talking about creative domains. I really do think that code is a little bit fundamentally different because, as a general rule, atomic pieces of code are not creatively differentiated. The creativity is coming at a higher level. It's coming at the prompt level, the architectural level, in a lot of cases. So I think that's why programming-related use cases have been so immediately successful for LLMs.

(00:36:00): I think GitHub Copilot has demonstrated that it is a legitimately useful tool that people are not going to want to give up. It's not perfect, but is it worse than Stack Overflow? I'd argue it's not. How many outdated answers are there at the top of Stack Overflow? What gives LLMs an advantage over Stack Overflow is that they're obviously much faster and more convenient and they're way friendlier, aren't they? Way friendlier to use.

Justin Mannhardt (00:36:23): Right. You think Reddit is mean?

Forrest Brazeal (00:36:27): ChatGPT has never once closed your prompt as being off-topic or as a duplicate.

Justin Mannhardt (00:36:31): Right.

Forrest Brazeal (00:36:31): It will patiently work with you as long as you need and that's why, of course, you now see Stack Overflow officially partnering with OpenAI. They can read the writing on the wall. So that has fundamentally changed the way we retrieve that type of knowledge. The question then remains, are we at a plateau of how useful LLMs can be for coding without putting a bunch more human interaction in the loop? Fast forward three years when we're three years farther down the road of new versions of programming languages being released and new paradigms being developed, and if we don't have that human knowledge base of people going and posting on Stack Overflow... Because so many programming problems are very niche, they're one off. They're edge cases. It takes a human going and tinkering around with it for five hours and beating their heads against it to figure it out, and that's why those little nuggets of wisdom on Stack Overflow were always so valuable because only 10 people in the world are ever going to run into that, but those 10 are going to be so grateful that the one shared it.

(00:37:18): That's the sort of thing that LLMs are less likely to come up with without human trainers putting that in. And if human trainers aren't incentivized to do that because they know all their work is just getting sucked into an LLM and regurgitated without attribution, how does that look down the road? Do we have a steady decline in the quality of what these programming assistants are able to surface? I don't know. I think that's an unsolved problem. I think it's very concerning. But at the moment as we're theoretically sitting at the top of knowledge and LLMs are in their first phase of competence, yeah, it's pretty great.

(00:37:47): It reminds me of a book I remember reading years ago. It was one of those James Herriot books, the veterinarian who was in Yorkshire back in the thirties and forties, and he was describing his first encounter with penicillin, which had just been discovered, and I think it was just after World War II, so it was just becoming available to vets out in the country like he was. He had a bunch of sheep that had some classic sheep disease where one member of the flock gets it, you just write them all off. You just take the insurance write-down right then because the whole flock is going to die.

Justin Mannhardt (00:38:15): Yeah.

Forrest Brazeal (00:38:16): As kind of a last resort, he went in at the end of the evening and he dosed them with this penicillin concoction and he came back in the morning and every one of those sheep was standing on their feet. They were 100 percent healthy and he had that prickling feeling in his spine that we were describing when we saw ChatGPT for the first time like, what just happened? The world is completely different now. Penicillin did not continue to act like that. Bacteria developed some resistance to it, and it did not have that miraculous effect after too many more years. I believe that's where we are with LLMs right now, and I think the knowledge bases are going to decay. People are going to get jaded about them and they're going to settle into being, like antibiotics today, useful, couldn't live without them, but they don't feel as incredibly magical as they did in the forties and fifties when they truly could seemingly bring people back from death.

Rob Collie (00:38:59): We should describe this AI comic that we've been talking about. It's an audio podcast, so we're going to have to do the one-picture-is-worth-1,000-words thing. Can we just briefly resummarize what the point of that comic was?

Forrest Brazeal (00:39:12): I'll give you the alt text version of it. It's a three-panel cartoon. The first panel says current situation, and it depicts a sad engineer standing next to a pile of complexity, languages, tool chains, and infra, and then what flows out the other end of that ultimately is the apps that we ship. The middle panel says what we think AI will do. All of a sudden the sad engineers are replaced with some happy non-engineers, and they are interfacing with a magical bubble of AI, and what comes out the other end of the AI is this kind of grayed-out, hidden pile of complexity that no longer matters. Presumably the languages, tool chains, and infra. The AI just talks to that, and then we get apps hopefully that are better and faster. And then the bottom panel says what is actually going to happen. Instead of happy non-engineers, we have even sadder engineers who are working with a new pile of complexity, AI pipelines, templates, a new set of enterprise business processes built up around AI. That is flowing into our magical AI service.

(00:40:02): And on the other side of that, we still have the old pile of complexity and there are yet sadder engineers who are now stuck maintaining that, right? They've been pushed down the stack and then that somehow rolls out to apps, which we presume are at maximum as good as they were before, perhaps worse and slower. I mean, heck, it blows my mind. And I say this as a former employee of Google, it blows my mind that you go to Google search today and there's a spinning JavaScript animation at the top of that 10 blue links result page while it waits for the generative AI summary to load. Would you ever have thought that that would be Google's search result experience? Their whole thing was snap your fingers and here's your 10 blue links, and now it's like a static website going back to hit a Lambda function or something, right? It's unbelievable what that's done to the experience and we're just like, "Yeah, okay, whatever." I guess generative AI just makes everything worse.

Rob Collie (00:40:50): Yeah. Meanwhile, every web page, every website on the internet is being penalized for its load times by the gods of Google.

Forrest Brazeal (00:40:57): I'm not even criticizing them. They have to do it. There's no other option. I think it's going to be the default now. You have to put web on the end of your queries to not have this. At some point, we just accepted that that was a better experience. I don't think it is.

Rob Collie (00:41:08): Yeah. There's almost like a prisoner's dilemma thing going on here, right? Everyone is having to rush in a marketing sense. We're leaders. We can't be last. We can't lag behind, so everyone is just diving in. Microsoft Copilot for Office. I was like, "Oh, I can't wait." I had envisioned what it would do for me in PowerPoint and it's very expensive. It's very exclusive. It's precious. It's dear. This is one of those things. It's very hyped when you finally get Copilot enabled in your work environment. I went in there and it won't do a thing for me. It's been delivered to the market as like, "Ta-da." It is way below MVP, but there's money changing hands over this now. It's not even a proof of concept as far as I'm concerned. You were talking about the white hot hype cycle and everything. I'm so unaccustomed to software being delivered as if it's a big deal, as if it's actually useful, and then it's like DOA. That is a new experience for me.

Forrest Brazeal (00:42:06): I have not encountered that in tech before either, and it's such a departure from those kind of honest integrity-filled software engineering ethics that we've been used to where we really felt that truth was defined by, "Okay, what do the bits actually do?" Right? We're not defined by, "What does the press release say? What does the stock price say?" But it does feel like a lot of these things are narrative-based development more than it is actually creating something that's designed to be useful.

(00:42:28): I remember being so struck. I was watching Google Cloud Next, which was in April, and it was the first one I hadn't had to help put on in years, and so I was kind of watching it with some schadenfreude, enjoying not having to have been involved with it, but every talk, every speaker was talking about generative AI and it was all these executives in suits and it was clear that obviously they don't understand how the technology works. They're just reading whatever script their marketing people have put together for them.

(00:42:51): I remember thinking, "Hang on, it kind of feels like gen AI is already this stodgy enterprise thing, but we never got to a point where it actually worked. Did we just skip the whole part of the cycle where all the cool kids were using it and getting a lot of value from it?" No. It's like somehow we went straight from the promise to, "Oh, now it's wrapped up in all this enterprise language and we're going to be signing these big deals for it." But there was never value demonstrated at any step along the way. It's like we just skipped all that. That blows my mind.

Justin Mannhardt (00:43:18): One hundred percent. It makes me think about the people that listen to our show, developers, business leaders, analysts, software people, people in tech in general. Every single one of these people, self-included, is being asked constantly, what are we doing about AI? If you're a software engineer, you're probably on a team where someone is asking you, "How are we going to get AI into our product?" If you're an executive, your CTO, your CEO is asking you, "What is our AI strategy?"

(00:43:50): I wonder if we could take this from a couple of different lanes on this question. For the people you love to help, if you're being asked that question, "How are you going to bring AI into the product, bring AI into your analytics?" whatever that might be, what advice do you have for these people when they're in these moments? There's a meme, I forget where it comes from, where it's like, if your solution looks like this where you're just making a POST request to OpenAI's API and then just getting the response, don't do that.

(00:44:15): What should people be saying, thinking about, doing in terms of this type of integration?

Forrest Brazeal (00:44:21): If it's a thin wrapper around ChatGPT, that's probably not a good sign. To be fair, I do a fair amount of product marketing, positioning, consulting. It's just something that comes over my transom from time to time for various startups and companies around tech. This is a question that comes up. What I always encourage folks to go back to is, okay, what is the actual problem you're solving for the customer? The word AI hopefully isn't included in that. AI is not a problem to solve. That's a solution in search of a problem. What is the actual problem we're solving? And usually, particularly if you're trying to sell to an enterprise buyer, there are four things that will cause someone to actually pull the trigger on paying for a technical product. We're in a recession. It's 2024. I don't know if we're in a recession or not, but growth budgets are not what they were. Hiring budgets are not what they were. That's borne out very quickly by a look at the job market.

(00:45:07): Everybody's budget is tight, so if you're going to convince an engineering director to spend money on a software tool, you are going to have to be very crisp about what the value prop is. The four things that I tend to talk about are, does it take care of downtime? Does it reduce the possibility of downtime? Does it reduce the possibility of dangers like breaches, security issues? Is there a possibility of delays, in that I'm going to lose a lot of money if I'm not able to ship my product? And then the fourth one, it's just dollars in general. Does this represent a significant cost savings for me? Downtime, dangers, dollars, delays. If you're not reducing one of those four things, then I don't care how AI-centric the thing is. It's dead on arrival.

(00:45:41): I do see some AI-involved tools that do help solve some of those problems. Ironically, some of them solve the problems by just routing around organizational constraints. I talk to people whose products are like that. The value of the AI is not, "Oh, it generates a report that we couldn't have generated on our own." It's, "If I get AI involved, then I can blame the AI for what goes wrong instead of the security team blaming me for what goes wrong." It creates a connection point where there previously wasn't one between two teams that don't like each other very much. Again, not a technical problem, a human problem, and there are cases in large organizations where having an AI agent in the middle of things is useful, but it's not for the reasons you might expect.

(00:46:19): I do think that with the LLM stuff, I haven't seen a lot of cases where it's contributing a ton of value on its own at an organizational knowledge level. At an individual level, as a productivity aid, it potentially does have some value, but "it helps me be more productive personally" continues to be a hard sell when you're trying to get a manager to buy a tool for you.

Rob Collie (00:46:40): It's not just a prisoner's dilemma dynamic for all the software vendors, where they all have to bring these things to market or else be perceived as not with it, just like you were pointing out. On the consumption side, the customer side, everyone is feeling the same sort of pressure to bluff that they're using it. Everyone is sort of caught in the same dynamic right now.

Forrest Brazeal (00:47:00): You know when I'll believe that gen AI actually provides useful value to an end user, to a consumer, as part of an e-commerce flow? When it shows up in the Amazon Prime signup flow, because they've optimized the heck out of that. There is not a single pixel in the Amazon Prime signup flow that detracts from you signing up for Amazon Prime. The day I see Amazon Q, or whatever they're calling their thing now, show up in the Prime flow, I'll know it's legit. Specifically the signup flow.

Rob Collie (00:47:29): Yeah. Wait for that spinny to load.

Justin Mannhardt (00:47:31): That's a stake in the ground.

Forrest Brazeal (00:47:33): There are no spinnies there. Yeah, I'm calling it right now: show me a loading spinny in the Prime signup flow, not the cancellation flow. I can very much believe that you might have to talk to an LLM to get Prime someday, but yeah, I haven't seen that yet.

Rob Collie (00:47:45): It's taken a little while for the truth of this stuff to work its way through the system. That prickling feeling you described, your first brush with doing something scary, and then the seemingly exponential advancement that we thought we were experiencing last year. You start to extrapolate that intuitively in your own brain and you're just like, "This is going to explode tomorrow and everything is over." That was the feeling, but it does seem not to have continued. In hindsight, when GPT first debuted on the cultural stage, it arrived almost in its final form as far as what we're seeing today. It's definitely gotten better.

Forrest Brazeal (00:48:24): Yes.

Rob Collie (00:48:25): It's been able to do certain things that it wasn't able to before. One of the questions I was asking myself in the beginning was, is this the sort of thing that continues with time?

Forrest Brazeal (00:48:33): Yeah.

Rob Collie (00:48:34): Just continues to improve. Even sustained linear improvement over time would be really kind of dramatic.

Forrest Brazeal (00:48:41): Well, I would agree.

Rob Collie (00:48:42): Is it subject to massive breakthroughs that just qualitatively change its capabilities overnight? A lot of things you've been saying echo our impressions of it, which is that it hasn't really been working that way.

Forrest Brazeal (00:48:54): Yeah, it definitely appears to be more of a difference of degree than of kind.

Rob Collie (00:48:57): But in the beginning, it really did seem very easy to take a couple of quick data points in the early days of these LLMs exploding onto the stage and think, "Oh my gosh, the world is going to be a fundamentally different place in a year." It hasn't played out like that, but I do vividly remember the extrapolation I was doing, like, "Oh my gosh, this is going to run away."

Forrest Brazeal (00:49:20): I do actually believe the world is going to be a fundamentally different place, but I don't think it's going to be through really visible top-down cultural institutions. We're not suddenly going to see all movies replaced by AI generation. I've never understood the people who are like, "What about when AI can generate 100 different versions of your favorite movie for you?" Who wants that? People just assert that like it's something everyone would want and it would be so cool, and I genuinely don't get it. I want to see a human's creativity and versatility on display. Everything else just feels fake and cheap.

(00:49:48): But where I'm going with that is to say I think the fundamental change we'll see is going to happen a little more under the surface, and it's going to look more like a bunch of folks who do clerical types of work suddenly having trouble finding work, or having trouble finding full-time work and having to do contracts, because a lot of people that are running businesses are now able to do a lot of the form filing and stuff off the side of their desks with the help of an LLM. You're going to see folks who are really struggling to get junior-level programming jobs, because there is nothing that a junior software engineer can contribute that ChatGPT or GitHub Copilot cannot contribute.

(00:50:22): I think those are concerns long-term, again, because they're not allowing folks to develop the expertise they would need to be able to function as an expert partner to an LLM, the way I think some of these fields will become. I think we're probably borrowing from the future there to gain some efficiency and some cost savings in the present, but I do think that's going to be fundamentally disruptive. It's just that the Rust Belt gets a lot of attention because it's easy to see an abandoned factory. It's easy to write a feature story about 300 coal miners who were all laid off at the same time. It's much harder to write a story about, "Oh, these scattered folks who were working in a dentist's office now have to come in on contract, or they're only in 20 hours a week, because a lot of the insurance filings they were doing are now being done by LLMs." Right? That's just not as visible.

(00:51:07): It'll creep up on folks and it'll just sort of come to be accepted as a fact of life. But I think you'll look back on it 20 years from now and say, "Wow, life is incredibly different than it used to be." It's like looking back at life in 2006 and saying, "How different was my life without a smartphone?" But it took a few years. Even for me, it took a few years for my smartphone to kind of creep in and replace every other device I owned.

Justin Mannhardt (00:51:26): Yeah. There's a bifurcation of possibilities in the future that I'd kind of like to echo back into the conversation. First, we were talking about the hope that the mediocrity of what the LLMs do sometimes inspires or forces humanity to be better. I think that's true, that we're going to realize that we don't like the AI movies, we don't like the AI music, we don't like the AI creative output. I think there'll be use cases for it, it will be used, but we'll realize, "Look, we're already enough." It makes me wonder, because of what you were just talking about where it is useful, the sort of clerical, routine, maybe junior code development, virtual assistant type stuff, is that then the limit? Meanwhile, OpenAI is out there saying, "We're going to bring AGI to the world."

Forrest Brazeal (00:52:17): I mean, until OpenAI stops bleeding talent, I'm not really trusting anything that comes out of Sam Altman's mouth. I don't know what's going on over there, but it does not seem like they're in a great spot. Look, you said bifurcation; I would say a bimodal distribution of what's going to happen. I do absolutely agree that for the top quartile of creatives, gen AI is just going to push them to get better, more original, sharper at what they do. I even see this in small ways. If I'm typing in Google Docs, writing something creative, and it pops up with a suggested completion for a sentence, then I know not to use that. That would be the clichéd version of what I'm about to say.

Rob Collie (00:52:56): Yeah, the formulaic, predictable version.

Forrest Brazeal (00:52:58): Right. I look at it as a cliché warning. It's like, "Oh, well, I'm going to say something different now." I think those are actually ways in which LLMs can guide people toward increasing creativity in a surprising way, even at the sentence level. But I do think that at the lower end of the cost spectrum, you're going to see just a continued profusion of... I think the term I've seen used for it is slop. It's like spam, but it's LLM-generated spam. It's slop. You're going to see that on social media feeds. I mean, you're already seeing it. When is the last time you logged onto Facebook? It's all this AI-generated imagery, and it's being commented on by some combination of bots and people who are not as AI-fluent and don't realize that these are not real images.

(00:53:40): I think you're going to see more and more of that, again, combined with access to entry-level professional jobs being denied to people by LLMs. I think it is going to cause some problems. I'm not convinced that's a default positive, but yeah, at the top end, it's going to create some really cool stuff for sure.

Rob Collie (00:53:54): Have you had any exposure to what Microsoft calls the Citizen Developer phenomenon?

Forrest Brazeal (00:54:00): Yeah. This is kind of that. Sometimes it's called low code, no code. That idea.

Rob Collie (00:54:04): Microsoft has actually done... I'm a reasonably hardened Microsoft cynic in a lot of ways. I worked there for a decade and a half. But I also know what they're really good at, and I think they've done something truly phenomenal. Heck, I've voted with my feet. I founded a company where what we basically do is utilize that platform for business value for our clients. It is a form of coding. Have you seen what the innards of a Power BI model look like?

Forrest Brazeal (00:54:31): Yeah. I used to work for a company that had their own BI DSL that hooked into Microsoft products. It was called BimlScript. Varigence was the name of the company.

Justin Mannhardt (00:54:38): Oh, yeah.

Forrest Brazeal (00:54:39): I used to work there. I've had a little exposure to what BI tooling looks like for a non-developer.

Rob Collie (00:54:44): At first glance, you would think that the low-code, no-code world would be almost the more junior stuff in a way. But the things that we do, all the things that Justin was talking about, like the inspiration, there's something very creative going on in turning these big piles of data into something meaningful that's infused with all the business logic and the business meaning. The GitHub Copilot experience has grown up around procedural "real code." We haven't really seen it at its full strength yet at generating things in the Power BI language, for instance. I'm really interested. It's one of the things I'm watching. How good does it get at that stuff? Is it going to be just inherently more capable in that space, or is there something about it that will actually make it inherently less capable? The one thing I'm convinced of is that it won't be the same.

Justin Mannhardt (00:55:33): It's a safe bet.

Forrest Brazeal (00:55:34): You know what the greatest low-code platform in the world is, right?

Rob Collie (00:55:36): Excel.

Forrest Brazeal (00:55:37): Microsoft Excel. Yeah.

Rob Collie (00:55:38): Yeah.

Forrest Brazeal (00:55:39): The end game of all of this is getting to where we can drive any business process off of a spreadsheet interface, right?

Rob Collie (00:55:46): It all comes back to Excel in the end. We spent all these years escaping it. Then it just pulls us back in.

Justin Mannhardt (00:55:52): Forrest, you said something earlier about a project you're working on. I think you referred to it as the Overwhelmed Developer's Guide. One of the things we've liked talking about on this show is the fact that because of the hype, because of social media, because of everything you're seeing in your feed, there is a tendency to get overwhelmed, and we have a term that we like to use called FOBO, the fear of becoming obsolete. What we try to do is explain to people how to deal with the FOBO, deal with the overwhelm. What are some of the top things you would say to people who are overwhelmed by all this hype, who fear becoming obsolete in their role? What's your perspective on that issue?

Forrest Brazeal (00:56:34): To the fear of being obsolete, I would say if you ever lose that fear, it's probably too late. Fear of being obsolete is the first sign that you're still alive and kicking. That means there's time to keep learning. Let's be realistic. I spent my entire career in IT kind of running half a step ahead of the automation reaper, going from blade servers in a closet to virtualization, to the cloud, to higher-level managed services, and each time you learn and you keep moving. You see what the next level up the abstraction stack is. You take the opportunities if they're available where you work; you go after them elsewhere if they're not. Over time, you develop a little bit of a sixth sense for that.

(00:57:10): What I always encourage people to do when they ask about this is: don't worry about being automated out of a job. Think about how you're going to automate yourself into a job. What would it look like for me to operate at the next higher level of abstraction than I'm currently at? I keep coming back to ops examples in this conversation, so: if I am a Microsoft SQL Server database administrator, I can see that a lot of my clientele is moving to these higher-level, kind of serverless SQL services, Cloud Spanner, things like that (Spanner technically isn't a classic SQL database, I don't think, but it has a SQL interface now), or BigQuery. If I see that that's where my clientele is going, I'm going to have to figure out what it looks like for me to continue to add value in that world. And it might be that I have to become more than just a traditional database administrator where tuning queries and debugging execution plans is all that I do.

(00:58:07): It might be the case that I need to become more of a holistic data architect for a platform, and that's going to involve learning some new skills: probably getting more comfortable with scripting and coding, and getting more comfortable with plugging together multiple elements of a cloud data stack. Or conversely, it could be that I'm at a company that's going back to the data center and they need someone who can understand how to do large-scale migrations, things that I wasn't comfortable with before. As long as you're always thinking ahead to "What do I need to do to be indispensable at the next higher level of the stack?" you're going to be fine. There's so much to do out there, and every hiring manager is desperate for competent people with a track record. Continue to leverage what you've already learned and make incremental steps. You're going to be just fine.
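[As a purely illustrative aside on what "getting more comfortable with scripting" against a serverless SQL service can look like, here is a minimal sketch that runs a query against BigQuery with Google's Python client library. The project, dataset, table, and column names are hypothetical.]

```python
# Minimal sketch: querying BigQuery from Python with Google's official
# client library (pip install google-cloud-bigquery).
# Project, dataset, table, and column names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-analytics-project.sales.orders`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

# client.query() submits the job; .result() waits for it and returns rows.
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```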

Rob Collie (00:58:49): I wanted to ask you about your Cloud Resume project.

Forrest Brazeal (00:58:52): Yeah. The Cloud Resume Challenge.

Rob Collie (00:58:54): You described it as, in some sense, a discovery process where people are mentoring themselves up through it, but from the hiring manager's perspective, you're helping the hiring manager more than anything by identifying talent for them. In one of your blog posts, you were sort of lamenting the low completion rate of people going through The Cloud Resume Challenge, and you were wondering aloud, walking through the process: I could make it easier, I could make it easier to get through and increase the graduation rate essentially, but then it might lose its value, the thing that it's actually doing that's good today. And then the thing that really struck me was the inner dissonance about the gatekeeping nature of it. It's hard because it needs to be. It's challenging because it needs to be, but at the same time, that's discouraging so many people. Have you found any peace on this topic? I have my own version of this, and I have not come to a peaceful conclusion.

Forrest Brazeal (00:59:51): It's interesting. I got so much mail about that issue, more than I think I've ever gotten, and it was not just one-line responses. It was people sending me these long reflective essays of their own creation in which they were going back and tracing their own career journey and thinking about how they got to where they are in tech. Some of these were people who had used The Cloud Resume Challenge as part of their career transition. Some were people who had just checked in for part of it, and they were quick to tell me, "Hey, the reason I didn't finish this was because I only needed part of it in order to get hired, and I used those skills and I moved on. Hopefully that's okay with you." Of course it is. I say elsewhere that it's intentionally designed that way. The challenge is modular. You can pick and choose which pieces you want to work on.

(01:00:30): But I think where I've come to is that it's okay for some things to be hard, and it's okay not to pretend that they're easy. It drives me up the wall when you see folks talking about "learn data science in two weeks" or "learn Python in a month of lunches," something like that. You're writing checks that you're never going to be able to cash. It's the get-rich-quick scheme applied to knowledge, and there is no such thing.

Rob Collie (01:00:55): Seven minute abs.

Justin Mannhardt (01:00:57): Why aren't they six minute abs?

Forrest Brazeal (01:01:01): No, not six minute abs. That would be ridiculous.

Rob Collie (01:01:03): Yeah.

Forrest Brazeal (01:01:07): Yeah. That would just be pure snake oil. So I don't have any attacks of conscience about admitting that hard things are hard. I do think there are things that can be done to make the experience of learning less discouraging. You used the word discouraging, and I definitely don't want people to feel that way if it's something they can power through. The reality is there's a point on the learning curve for all of us where we just want to give up. We don't feel like we can do it if it's something sufficiently hard. The way I've historically gotten past that, for things I've found to be very difficult, has usually been having someone around me to encourage me, someone to keep me going, someone to motivate me, someone to push me through.

(01:01:43): So over the last couple of years of its four-year life, I've really tried to invest more of my time and effort in building stronger community support motions around The Cloud Resume Challenge. We've got a large Discord server now with about 10,000 folks in it, and there are a number of other guidebooks and resources available to help you. And of course, I try to make myself available personally as much as I can. We still have a low completion rate, but that's okay. If you're going through it and you're stuck and you feel like you're not getting help from the community and you're frustrated, it could just be the case that you've learned something about yourself, which is that this is not something that's interesting to you or that you want to do, and that's okay. That's a reasonable outcome of this.

(01:02:17): But if I can help 100 or 200 people a year discover that this is really something they want to do and that they actually do enjoy it even though it's difficult, I consider that to be a success, and it's ended up being far, far more than that. As long as I continue to have that impact for even one or two people, it's worth doing. There are a billion resources out there on the internet. Everybody can find something that suits their learning style. This is one particular thing, and it's resonated well for some folks. It's the way I like to learn. I'm going to keep doing it.

Rob Collie (01:02:42): Yeah. I mean, look at it through the lens of the known positive outcomes. You could shut it down. Would the world be a better place if you shut it down? No.

Forrest Brazeal (01:02:52): Yeah, I don't know. I'd probably sleep a little more, but other than that.

Rob Collie (01:02:57): So what are the services that freeman + forrest is going to be providing? It's kind of new. You just recently announced it, right?

Forrest Brazeal (01:03:03): We actually just publicly announced it yesterday, but it's something we've been doing behind the curtain, in stealth, for a little bit. Myself and my co-founder, Emily Freeman, are both veterans of the big tech world. I was at Google Cloud. She had sort of a parallel job title to mine at AWS, left AWS about the same time I left Google, and worked at Microsoft before that. Both of us have a lot of experience, not only as engineers, but running product marketing teams, community teams, developer relations teams, that kind of thing.

(01:03:28): We were comparing notes earlier this year and discovered that we both had had a similar problem, which was we would constantly have executives coming to us saying, "Hey, we've got this big new thing rolling out. It's probably a generative AI thing. We would love to get a whole bunch of influencers to make noise about this. We want to see 300 important people all talking about our product and creating and showing off content. Can you go create us a list of all these influencers?"

(01:03:51): And we would be like, "Okay," and we would go and comb the internet and make a list of people that we thought would be great if we could get them to talk about the product, and then we'd hand it back to our marketing teams. And the marketing teams would look at it and be like, "Well, what do I do with this? I can't go through the procurement process for all 300 of these people. I don't know how to work with them. I don't have a process. I don't even know how to price them." It would just be a nightmare and a headache.

(01:04:10): And then on the other side, you've got this incredible, huge community of tech creators. I don't even want to use the word influencers because I don't think most people that are influential in tech think of themselves as influencers the way that a TikTok influencer who's selling shoes and hats thinks of themselves as an influencer. These are folks who, they might have a relatively small niche following, but they've got a little newsletter. They've got a podcast. They've got some followers on LinkedIn just because they're earnest about sharing what they're passionate about. As I was saying before, they create when they want to. The thing is there's so much value locked up in their audiences. These are folks who can speak really effectively to a technical buyer. Of the few thousand people that follow them, a disproportionate number are going to be people who control the budget strings inside of an engineering organization.

(01:04:55): Marketing orgs in tech have historically struggled to reach that audience. They don't listen to traditional PR. Traditional social media has fractured so much. Who's even on Twitter/X anymore? LinkedIn is what it is; some people are active on it, some people aren't. Who knows who's on Mastodon or Bluesky, or whatever the new ones are; I can't keep up. So marketing teams are looking for different places to put their growth spend. They've realized that the people these technical decision-makers listen to are their friends, their peers, the trusted voices in the group chat they follow. So how do I go in and get some brand awareness in those places? Traditional influencer marketing platforms built for consumer brands can't work with these tech influencers, because their audiences are valued for their quality, not their quantity. It's a fundamentally different value equation. There's no platform doing this.

(01:05:38): What Emily and I have built is called freeman + forrest. It's the first influencer marketing platform built specifically for enterprise tech. It's a two-sided market: we're helping large brands turn a budget into influence by working with a large network of amazing, amazing influencer partners across cloud, security, generative AI, and various other niches in tech. We're early on with it. If you're a growth person listening to this and you have interest, feel free to reach out to me. But it's been very well received so far by both sides of the market. What we've heard most from people is, "I've been longing for something like this. Why hasn't this existed before now?" It's been a lot of fun.

Rob Collie (01:06:15): I wasn't even aware, honestly, of the consumer influencer distribution model. I didn't know that was even a thing.

Forrest Brazeal (01:06:21): It's very B2C. It's all about, "I've got this movie or this dress or something, and I want influencers to talk about it."

Rob Collie (01:06:27): Well, I knew that there's absolutely a market where I can pay to have influencers talk about my new shoes or whatever. I just never really thought about the idea of there being platforms to help organize it and make it sane, easier for procurement.

Forrest Brazeal (01:06:39): There are agencies, there are platforms, all kinds of stuff. [inaudible 01:06:43], they run very sophisticated campaigns, and they have to at the scale they're operating at. There's great, great infrastructure for them, but there is no infrastructure for tech, and the few teams that are doing it are reinventing the wheel every time. Their marketing teams are spending a ton of admin cycles on this, and they're getting themselves into an echo chamber because they're only working with their friendlies, the influencers that they know, so they're just hitting the same people every time.

Justin Mannhardt (01:07:03): The quality-versus-quantity dynamic, B2C versus B2B and specifically in tech, resonated strongly when you said that. That's very, very true.

Rob Collie (01:07:13): And how do you measure that? How do you measure the quality of an audience? That seems like a difficult problem.

Forrest Brazeal (01:07:18): It is a difficult problem. We have a vetting process, a little bit of secret sauce that we go through. Both Emily and I have a long history as influencers, if you will, in our own right, and so we have a little bit of prior art there. But yeah, we have an intake process and we vet folks and we look at the size of their audience and where it is and what type of content they're putting out, and that helps us to ballpark where they have impact.

Rob Collie (01:07:38): Well, that makes sense. So there's experienced human beings in the loop here.

Forrest Brazeal (01:07:42): I don't know what to tell you. AI is not ready to solve this problem yet.

Justin Mannhardt (01:07:45): There's no LLM.

Rob Collie (01:07:48): How old fashioned.

Forrest Brazeal (01:07:51): I know. I know. Honestly, if I were trying to position a tech product today, I might just lean into "no AI required." This is the only conversation you're going to have today that has no AI in it. I feel like that would be very refreshing.

Rob Collie (01:08:02): Oh, yeah, and brilliant. We're ready for that. I don't know how often you do stuff like this. I know that you give talks. Sometimes there's a piano involved. Do you travel with your grand piano?

Forrest Brazeal (01:08:14): I don't travel with a grand piano. Usually we hook that up at the site, and there are a few reasons for that, one being that a grand piano is basically the last thing you would ever want to transport anywhere. It makes far more sense to set it up on the receiving end. It's not always the same grand piano, but I'll bring in a number of songs and we'll have some connective tissue and we'll tell a story through the music. Believe it or not, and I know this is going to disappoint you, I'm actually working on a set, a show, whatever you want to call it, right now that is all about generative AI, about our feelings about it, and about the limitations of it. I think we're at a point where it feels only just to apply some human emotion to that situation and process it that way.

Rob Collie (01:08:54): Is that a show that you just take to a conference? Do you go on tour with this? What's the delivery vehicle?

Forrest Brazeal (01:08:58): I have to be circumspect about how often I do this because it would be easy to do it all the time, and I have to actually do my regular job and see my family and things like that. For example, I'm doing a set at fwd:cloudsec, which is an amazing cloud security conference held in Arlington, Virginia, this year. That'll be in just a few weeks. I've got another one in Atlanta just a week or two following that. I'm prepping for a much larger event out in Vegas in August, which I'll have more to share about soon. It hasn't been publicly announced yet. I primarily just work with folks that I like and respect in the space and set up things that work for them at their conferences. I don't tour as a solo act or anything like that.

Justin Mannhardt (01:09:31): You're not going to do the Jonathan Coulton thing?

Forrest Brazeal (01:09:36): No.

Justin Mannhardt (01:09:37): Code Monkey.

Forrest Brazeal (01:09:38): I haven't so far. Or there's another group out of Seattle, I think it's called Tech Roast, that goes around and does stand-up comedy where they roast people's tech stacks. It's the sort of thing that could only be born in a tech hub.

Justin Mannhardt (01:09:49): Right, right.

Forrest Brazeal (01:09:51): I haven't been to one of their shows, but I've heard people really enjoy it. So yes, there are people who make a full-time living that way, I guess. But I have lots and lots of things I enjoy doing. This is just one of them. It brings me great joy to roll it out from time to time.

Justin Mannhardt (01:10:03): That's great.

Rob Collie (01:10:03): I so appreciate you spending the time with us.

Forrest Brazeal (01:10:05): Of course.

Justin Mannhardt (01:10:06): This was very fun.

Rob Collie (01:10:07): Justin stayed in Indy. He delayed his flight back so that he could have this conversation.

Justin Mannhardt (01:10:13): I'm happy to do it, Forrest. I really appreciate the time and the perspective.

Forrest Brazeal (01:10:16): Well, y'all have been a blast to hang out with. Thank you so much for inviting me. It was a pleasure to chat.

Speaker 3 (01:10:20): Thanks for listening to The Raw Data by P3 Adaptive podcast. Let the experts at P3 Adaptive help your business. Just go to P3adaptive.com. Have a data day.


