episode 214
Why We Should Stop Paying Attention to the % of AI Projects which Fail (and Instead Learn Why the Others Succeed)
This episode starts with a familiar scene. A role opens, the applications pour in, and suddenly you’re staring at a mountain of resumes that deserve real attention but arrive faster than anyone can process. The mix has everything… experienced candidates, newcomers trying to break in, and a growing stack of AI-generated submissions that look sharp until you ask a second question.
That’s where Haystack came in. Instead of using AI as a blunt filter, Rob and the team treated it like a collaborator. Teach it what matters. Teach it what P3 looks for in a teammate. Teach it how to separate real signal from polished noise. What came back wasn’t a robot recruiter. It was clarity.
And Haystack is only half the story. As the conversation unfolds, Rob and Justin zoom out into the broader pattern they’re seeing across all the small, useful agents taking shape inside P3. The stuff that isn’t blind hype. The stuff that quietly fixes overloaded parts of the business and makes the human decisions easier to get right.
Because that’s the through-line here. When AI handles the overflow, people get to spend their time on the work that actually requires judgment.
Queue it up and hear what happens when AI stops pretending to be magic and starts doing real work. And if you’ve got a corner of the business that’s begging for that kind of clarity, we can help you find the tiny build that changes everything.
Episode Transcript
Speaker 1 (00:04): Welcome to Raw Data with Rob Collie. Real talk about AI and data for business impact. And now CEO and founder of P3 Adaptive, your host, Rob Collie.
Rob Collie (00:20): All right, welcome back, Justin. We are stringing them together here, two in a row, I think, of me and you.
Justin Mannhardt (00:25): Definitely two in a row.
Rob Collie (00:27): We're numbers people, so we like to keep track of streaks-
Justin Mannhardt (00:29): Streaks.
Rob Collie (00:30): ... and things of that sort.
(00:32): Before we dive into the real stuff, I want to share a vicious rumor that I've heard. And it only feels like a rumor to me because, and you know this, I've been in almost like a total news blackout for the better part of six months now. I pay attention to tech news because I have to, but US news, world news. And it's not that I don't care. It's actually that I care too much. There's just so many unreasonable things going on everywhere.
Justin Mannhardt (01:00): All the time.
Rob Collie (01:01): When I see them, I don't just see the unreasonable thing that's happening. I also see what that's doing to the future. You can point anywhere in the world, and I see things like this happening that's like, "Oh, and this is also poisoning our future," and things like that. It's like it's just too much for me.
(01:16): So, in order to take better care of myself and the people around me, including my business colleagues, I just took Facebook and Reddit and moved them to another screen on my phone. I no longer have this compulsion to click them all the time. I don't go to news sites. Things still leak through. And one of the things that leaked through to me, yesterday, was that apparently MTV is no more.
Justin Mannhardt (01:45): I'm learning about this from you right now.
Rob Collie (01:47): Really?
Justin Mannhardt (01:48): Yes. I'm also a bit of a news hermit, but oh, my god.
Rob Collie (01:51): So, now we need to fact-check this. Is MTV actually gone? I think it's actually gone.
Justin Mannhardt (01:59): This is the search summary. MTV is not completely gone, but it is shutting down five of its music channels December 31st. These channels include MTV Live, Club MTV. No, these have to be gone. Okay. No, that seems right, so five channels are going away.
Rob Collie (02:18): I didn't even know there were five channels. To me, MTV is a channel.
Justin Mannhardt (02:22): MTV Live, Club MTV, MTV 80s, MTV 90s, and MTV Music.
Rob Collie (02:29): MTV Music would have been like the thing that we used to think of as MTV.
Justin Mannhardt (02:32): I believe so.
Rob Collie (02:33): Okay. So, it's gone.
Justin Mannhardt (02:36): What's left that... Dear listener, please help us. What's staying?
Rob Collie (02:40): Anyway, I was alive and conscious and paying attention when MTV debuted.
Justin Mannhardt (02:47): The main MTV channel will continue to operate-
Rob Collie (02:50): Oh, really?
Justin Mannhardt (02:51): ... focusing on reality shows instead of music content.
Rob Collie (02:56): So, they're keeping the name, but all vestiges of actual MTV, RIP. Bye-bye.
Justin Mannhardt (03:04): Video Killed the Radio Star.
Rob Collie (03:06): 44 to 45 years.
Justin Mannhardt (03:08): Wow. Had a good run.
Rob Collie (03:11): I don't know when its good run ended, but you know.
Justin Mannhardt (03:13): Had a run.
Rob Collie (03:16): We're a pop culture podcast at times, apparently. Shall we move on to things that we're actually qualified to talk about?
Justin Mannhardt (03:22): They learned something important about the world today.
Rob Collie (03:24): So, yeah, so this is a multistep news leakage, and it's really the important stuff, what happens to MTV.
Justin Mannhardt (03:31): Right.
Rob Collie (03:32): Okay. So, I think we have a couple things to talk about today that are pretty interesting, and one of them is circling back on... We talked last week about the AI arms race in the job market and how we have entered that race out of necessity. Just even in the intervening week, between last week's episode and this one, oh, my gosh, have I experienced a lot through this lens? It is a game changer. Last week we were referring to it as the interview or application screening bot or whatever, but we've since renamed it Haystack.
Justin Mannhardt (04:13): Haystack. This is an important thing, this agentic AI solution naming.
Rob Collie (04:19): I agree.
Justin Mannhardt (04:20): There's a whole series on this. We just haven't gotten there yet.
Rob Collie (04:24): Yeah, like our internal copywriting agent in marketing that helps us write ads and stuff like that is Griff, which is actually an acronym. Generous is what the G in Griff stands for. It's like our stealth core values, in a way, are in Griff.
Justin Mannhardt (04:38): Stealth Griff.
Rob Collie (04:40): But Haystack, we chose this name deliberately because of our experiences of using it. Basically, I want to talk about the ROI of Haystack.
Justin Mannhardt (04:49): We got some impact.
Rob Collie (04:50): Last week, we talked about the necessity for it and all that kind of stuff.
Justin Mannhardt (04:54): The pain, the problems, yeah.
Rob Collie (04:56): But the value of it is so much clearer to me seven days later. That's a real short period of time. So we ended up, I think, with well over a thousand applications for this one job, this one web dev job.
Justin Mannhardt (05:16): And fast, too.
Rob Collie (05:18): Yeah. It was in an eye blink. Based on the majority of candidates kind of like refusing to engage with the application, again, because they're overwhelmed, a staggering percentage of them are just pressing go on some sort of AI system that's off the shelf. There's SaaS solutions out there that do this. They're not even having to homegrow these things. They can just sign up for them, and it's already plugged into all the APIs. It's like, "Yeah, we can handle Breezy. We can handle LinkedIn. We can handle all these sorts of things. We just automatically apply for every job that comes across the plate."
(05:51): In the end, your hiring philosophy for any job should be to find someone who is great in the job, is a great fit, both for the job and the culture of your company and all of that. Just filling a role is not the standard to set, and it's not the standard that we've ever set. Our success, as a company, owes a lot to this belief. If people aren't showing you that they're great, you have to be okay with letting them go, moving on from them so that the good ones can shine through.
(06:27): Now, here's the thing. If we hadn't had Haystack, we hadn't had this agent, there aren't that many people here at the company, at P3, who really could be manually reviewing these applications and rendering a judgment on them. It's a pretty short list of who we've got here, me and you.
Justin Mannhardt (06:44): And even then, it's hard. You got a thousand, and there's only so many things that a human could look at in bulk to make some sort of big cuts. But it's hard to keep in your brain, like, "Okay, candidate A." Then 20 candidates later, "Okay, where does candidate one rank, now that I've been through 20?" That's a very difficult task.
Rob Collie (07:08): And even just advance them to the next step of the funnel or disqualify them, even that binary decision-
Justin Mannhardt (07:14): That's hard.
Rob Collie (07:15): ... it takes a lot of clicking around in this hiring interface, and a lot of reading, and a lot of making judgment calls over and over and over again, and it's exhausting. So, I estimate that if I had done that manual review, that first review of all thousand plus candidates, that would've probably taken me a full week.
Justin Mannhardt (07:37): Easily. I don't even know that you could honestly say that you would've done it well.
Rob Collie (07:41): No. This is something that I think is really important to always circle back to is like, it would've been a week of the CEO of this company's time or a week of the chief customer officer's time. You could put a value on that. You could put a price on that, and it's high. Also, you got to recognize that you're not going to get that week. That's not going to happen.
Justin Mannhardt (07:59): Yeah, that week doesn't exist.
Rob Collie (08:03): Okay, so now what? I went through and did that first manual review. I made it through like 100 one time, and that was the most I could probably ever do. Because, again, it's also exhausting. It drains your energy as well.
(08:15): All of our candidates for the job that we actually would've had a chance to seriously engage with would've had to come from that first 100. 900 of them would've never been looked at, or not looked at in time to make a difference. We would've just called it, "Okay, here's our pool," because even the review of 100 only advanced like 10 or 15 to the next step.
(08:37): Just today, I ran Haystack. I went to Haystack and said, "Hey, let's grab the latest batch that's come in." It's kind of slowed to a trickle at this point, so there were only nine in the newly applied state. But it just went and burned through them pretty quickly, dequeued seven of them. Again, seven of nine were just AI slop. And I looked at the two that it advanced, and one of those was, really, still AI slop. I've tuned Haystack to not be overly aggressive, so it's okay if it misses a couple, but then one of the nine was amazing.
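The triage loop Rob describes — pull the newly applied batch, screen each application, dequeue the slop, advance the rest — could be sketched roughly like this. Note that `llm_screen` here is a keyword-based stand-in for the real LLM call: Haystack's actual rubric, model, and hiring-system integration aren't public, so every name and phrase below is purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Application:
    candidate: str
    resume_text: str

# Stand-in for the real LLM call. A real build would send the resume plus a
# company-specific rubric to a model; this keyword check only illustrates
# the shape of the verdict it returns.
def llm_screen(resume_text: str) -> dict:
    generic_phrases = ["results-driven", "synergy", "passionate self-starter"]
    slop_score = sum(p in resume_text.lower() for p in generic_phrases)
    return {
        "advance": slop_score == 0,  # tuned leniently: only clear slop is cut
        "reason": "generic AI-style phrasing" if slop_score else "real signal",
    }

def triage(newly_applied: list[Application]) -> tuple[list, list]:
    """Split a batch into advanced and dequeued piles, mirroring the
    'grab the latest batch, burn through it' workflow from the episode."""
    advanced, dequeued = [], []
    for app in newly_applied:
        verdict = llm_screen(app.resume_text)
        pile = advanced if verdict["advance"] else dequeued
        pile.append((app.candidate, verdict["reason"]))
    return advanced, dequeued
```

The point of the sketch is the deliberately lenient threshold: as Rob says, it's tuned so that missing a couple of slop resumes is acceptable, because the cost of dequeuing a genuine needle in the haystack is much higher.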
Justin Mannhardt (09:14): Really was, yeah.
Rob Collie (09:15): Have you looked at it?
Justin Mannhardt (09:16): Yeah, I did. Yeah, I read it.
Rob Collie (09:18): I have been very respectful because this is ultimately your position you're going to hire for. I have not been moving candidates to the step of let's get them a screening face-to-face interview. I've been waiting and letting you do that. I've been moving them to a step before that. This time, though, I'm like, "Mm-mm."
Justin Mannhardt (09:37): We go, yeah.
Rob Collie (09:38): The needle in the haystack, the diamond in the rough. We wouldn't have noticed this person.
Justin Mannhardt (09:43): No. If anything, the probabilities that we would've just been exhausted and missed them is reasonably high.
Rob Collie (09:51): So there's multiple ways to view the ROI here. First of all, I think that the chances that we're going to get someone great in this job are much higher as a result of having Haystack.
Justin Mannhardt (10:06): I agree.
Rob Collie (10:07): Secondly, the time it's going to take us to get to a really good person is significantly shortened. We talked about how it would've been a week of manual effort, which isn't going to happen. So, if you think about that, okay, maybe we'll slog through some here and there. It might make a two, three, four week difference in the duration of this process. A better result, sooner, and we don't have to burn a week of my time or your time, like two of the most highly leveraged individuals with the company.
(10:43): Which, by the way, the normal ROI calculation would probably only factor that last thing in, and it shouldn't. The total impact of Haystack is already... I'm already just like, "How did we ever live without this thing?" And, by the way, if you think about it, six months ago we set this internal goal of having agents where we look at them afterwards and go, "How did we ever live without this?" Check.
Justin Mannhardt (11:07): Yeah. You do need to think about impact on those dimensions. AI, I think, lends itself to easily going down the time-savings, cost-savings type of analysis, which only gets you to the same outcome. And I remember this was a lesson that I learned early on in my career. I hadn't quite found my footing yet at the company I was working at, and I had an opportunity to meet with the CEO, and I was explaining how it was taking a certain department far too long to do something. He's like, "I don't really care. It's the same outcome." So, yes, it's great that we save time and we can move faster, but better outcomes also need to be a measuring stick for AI solutions. Because if all we do is get the same outcomes, have we improved?
Rob Collie (12:02): Yeah, there's a point at which things become so much faster that they happen differently. Different things happen that wouldn't have happened before. It is really an overly simplistic view to say, "Well, it's a week of person X's time." No, it has far greater impact than that. So, helping the really, really, really good candidates rise to the top faster or rise to the top, at all, is a really big deal.
(12:35): I have ideas on how to further improve Haystack, but at this point I'm almost like, "What's the next thing to do?" Because it's already so good that it's going to be a little while before I feel like we want to circle back and add to it.
Justin Mannhardt (12:47): I want to emphasize something we've been talking about recently, as well, which is that customizing these things to your business is really crucial, and it's also not hard. That's the idea we've had. I think that initial batch you did by yourself helped you frame the process in a very clear, specific-to-us way. Before you go off thinking about where to deploy AI, how well do you understand the process and your philosophy and what you're doing there? Because that's what got us to where we are with Haystack: having that level of clarity. This is not just, hey, we started sending resumes to an LLM; we're asking it specific, nuanced things we're looking for.
(13:36): And the one that we had come through today, yeah, it was like, bingo, like yes, on the money, kind of the stuff we're looking for. So, just the fact that you had that clarity of what you wanted to have happening in this process was, I think, really important.
Rob Collie (13:54): Yeah, I agree. And you can think of it through like a human lens. Let's say, in your job, you have this super talented, super intelligent intern pipeline. It's almost like a PEZ dispenser. You can just pop one off the top whenever. Okay.
(14:07): So, this new hire who's quite a bit sharper, in some ways, than the average new hire, how would I train them to hand off the job? How would I delegate this job to them? I would have to go do it myself a little bit, first, figure out, "Okay, here's my process. Here's the process I go through to screen these people." So, then I turn around and I teach this other person, well, think of it as a person for the moment, what that process looks like. And, of course, that process is, as you say, it's going to be very customized to us, and it's going to be customized to that particular position and the nuances of that position, and also what we need from it, and what fits us, and all that kind of stuff.
(14:46): But then I would need to let them go try a few and come back and say, "Okay, I put them into the following buckets." And then I would need to review their work and offer additional feedback, like calibration, like, "No, no, no, no. Those should also be in this bucket, and that one should move over there," and all that kind of stuff. And that is exactly the process I went through building and training up Haystack. Went through all of that. Now, it did require my data gene brain, 100%.
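That review-and-calibrate step — run the agent on a batch, compare its buckets to your own judgment, and turn every disagreement into a tuning note — can be sketched as a tiny helper. The bucket labels and candidate names here are invented for illustration; in practice the "fix" for each disagreement is an edit to the agent's instructions, not to code.

```python
def calibration_report(agent_buckets: dict[str, str],
                       human_buckets: dict[str, str]) -> list[str]:
    """List every candidate the agent bucketed differently than the human.
    Each disagreement is a prompt-tuning opportunity ('no, those should
    also be in this bucket')."""
    notes = []
    for candidate, human_label in human_buckets.items():
        agent_label = agent_buckets.get(candidate)
        if agent_label != human_label:
            notes.append(f"{candidate}: agent said {agent_label!r}, "
                         f"should be {human_label!r}")
    return notes
```

Run this after each training batch; an empty report means the agent's judgment has converged on yours for that batch.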
(15:20): Some number of months ago, I met with Amir Netz, the grand architect of everything that we all love, the godfather of Power BI and many other things at Microsoft. I hadn't tried vibe coding yet, and he said, "You should go try it because what you're going to learn is that, A, it's really cool, but B, it's not for everyone." And I completely understand what he means by that now. Learning what the manual human process looks like is a crucial first step.
Justin Mannhardt (15:48): Yeah. If you want AI to get great results, but you don't understand what it is you want the AI to do and achieve and how it would go about it, you're not ready yet. And it's not hard to get ready. That's the encouraging thing there.
Rob Collie (16:05): I think this does circle back to something we talked about last week, which is that the out-of-the-box, off-the-shelf experience of using ChatGPT for things that it's good at gives you the wrong impression.
Justin Mannhardt (16:16): That's right.
Rob Collie (16:17): It gives you the impression that it doesn't require anything extra. It knows how to cook chicken. It knows how to search the web. There's so many things it knows how to do, and you don't need to help it. You don't need to train it. You don't need to customize it. So, when you turn around to business, you're kind of expecting to just say, "Okay, go do it." Because that's all you really have to tell it, off the shelf, is, "Just go do it", and it does it.
(16:39): So, it sets you up for failure. That's that chasm that you have to cross. Through that lens, this becomes very, very, very easy to understand why something like 95% of AI projects fail. We talk about that. We don't focus enough on the 5% that are succeeding. It's more like being used as like, "This stuff sucks." It's, "Okay, 5% are succeeding. What are they doing?"
(17:03): Bill Krolicki's vendor bot, which we've talked about multiple times, and he was on the show to talk about it as well, is an example of the 5%. Haystack is an example of the 5%. So, you want to be in the 5%. Don't get distracted by the 95%. It's easy to be dumb.
Justin Mannhardt (17:19): Yeah. It's easy. Yeah.
(17:24): Well done. Let's see. Now that was a good use of time and effort, creating an ole Haystack there.
Rob Collie (17:29): Yeah, and I learned a lot.
Justin Mannhardt (17:30): Can I call it Ole Haystack?
Rob Collie (17:31): Ole Haystack. You can call it, yeah, good ole Haystack.
Justin Mannhardt (17:35): I'm looking forward to the avatar for this one.
Rob Collie (17:37): Oh, yeah. The first five versions of it are going to come out looking like Cousin Itt. It's like a haystack with eyes.
Justin Mannhardt (17:48): Yeah.
Rob Collie (17:53): Thankfully, Cousin Itt has become relevant again, thanks to the reboots of The Addams Family and everything. Otherwise, this would be another dated cultural reference. Some people will listen and just be like, "What's MTV? Was that ever important?"
Justin Mannhardt (18:09): Is that like TikTok? No.
Rob Collie (18:11): Nope.
Justin Mannhardt (18:12): Just different source of brain rot.
Rob Collie (18:13): I mean, yeah, it was just like TikTok in that it was the thing that was rotting our brains that everyone was concerned about.
(18:20): Okay, there's something else going on, too. We've got all these internal agents in various stages of development and deployment.
Justin Mannhardt (18:28): Yeah, several at this point.
Rob Collie (18:30): We've got some things going on in Danielson Labs that we're just going to have to drag him onto the podcast.
Justin Mannhardt (18:36): Yeah, we'll trick him into like a meeting, and then it'll just be a podcast recording.
Rob Collie (18:40): The meeting is in the SquadCast. It's because Teams is down this week, for some reason. But you've been working on one. Are we getting close to going live with this one?
Justin Mannhardt (18:49): We're at the event horizon. For context, we haven't been shy about it on the podcast: AI is a big deal, and we think we can help our clients with it. And helping our team help our clients with AI is a priority of ours, as leaders.
(19:05): So, one of the things we're working on to finish out the year here is putting together what we've had a working title for: a playbook. Here's the types of solutions we can offer clients or talk to them about. Here's how you can help them think about where to use AI within their business, so on and so forth. We started putting together things like demos and use case ideas or frameworks to think about these things, how we might go to market with those ideas, et cetera, et cetera.
(19:33): So, of course, I'm getting through this project, and I go, "Why isn't this thing an AI agent? Of course, it should be an AI agent." And I think I Slacked you, and you're like, "Yeah, and it should do this and this and this and this and this."
(19:46): So, yeah, we're super close, and I think we finally have picked out a name for this thing. I got to give you your due credit. This was your idea. We're going to call this one Opi.
Rob Collie (19:57): Yes.
Justin Mannhardt (19:58): Why are we going to call it Opi?
Rob Collie (20:00): Well, because it helps us, both us and our clients, spot opportunities to use AI to revolutionize their business or to have the Haystack type of experience for any number of business processes. We could call it Opportunity Spotter, or Identifier, or something like that, but that would be kind of boring. Instead, you give it something that sounds like it might be the name of a bot. We haven't created the avatar for Opi yet. That's coming.
(20:29): Going back to other things we talked about before, it is definitely non-trivial to start identifying places where AI can improve your business. You're sort of starting from zero. Most people still haven't seen any examples of this working, so it's why we talk about things like Haystack. It's not your business, dear listener, but your business probably does have that kind of problem somewhere. And even if it doesn't, you can extrapolate. Any examples help you build those muscles.
(21:02): So, a small group of us here at P3 have been deep down the rabbit hole developing these muscles, and I wouldn't say that we're done developing them. We've developed a lot of ability here. Every day or at least every week, like a new chamber in my mind opens up, and I go, "Oh," like, "We can do that."
Justin Mannhardt (21:27): Yeah. Then when everybody's working on things, and you come back together and you see how someone solved an interesting problem, whether it was a technical problem or the way you wanted the AI system to behave, then you get this sort of chain reaction of like, "Oh." Yeah, it's so cool.
Rob Collie (21:44): And because it's a small number of us who set aside the time to go and develop this, most of our company doesn't have that time every day. They're off doing today's billable work.
(21:56): I had the ability. It really started with me. I had the most time that I could set aside, and then you and Kellan benefited from that. And then Kellan just goes like, "Here, hold my..." you know.
Justin Mannhardt (22:08): "See you later."
Rob Collie (22:10): Kellan like grabbed my hand and slingshotted himself into orbit, and I don't even see him anymore. But anyway, if you think of this like concentric circles, like a visual, there's a small group that's been developing this capability to spot opportunities. It makes sense that we are the ones who have been spearheading the development of these internal agents because we're the only ones who've had the time to build those muscles, not just on how to build them, but even just to spot the opportunity to do it. I'm going through these applications in our HR system, and I'm going, "Oh, we can do better," so I wouldn't have connected those dots maybe even six weeks ago that, yes, we can.
(22:54): So, then we have the consulting team, the next ring out from us. And the ring further out from that is our clients, which is what really matters. So, how do we get this newfound wisdom out? It's not super easy to acquire; it doesn't just pop into your head. How do we get it shared with the next ring out so that it can then get to the ring where it matters, which is the clients? Most of our clients haven't had time to do this muscle-building process that we've been doing.
(23:28): So, I think of this Opi agent, and I can't wait for it to go live, I think of it as almost like a wisdom broadcaster or a wisdom distributor. That's what's happening. Some very, very, very subtle and nuanced new patterns of thought are needed. And how best to get those out to everyone? Well, you want to have a conversation with some entity that has that wisdom.
Justin Mannhardt (23:59): When we think about this idea of propagating ideas and best practices: we're running a company, and it's not possible for you, or me, or Kellan to go be with every consultant, with every client, but we've got these ideas that we want to propagate. So, when you infuse that type of knowledge into an agentic AI solution, now you've got this opportunity to do that very efficiently, and then to learn from the feedback loop coming the other way, too, to reinforce that and improve it over time.
(24:37): I think that's what is really neat about what we're doing with our platform is just to have a way to share these things. My earliest uses of AI, going back a couple of years, I'd go into something like ChatGPT and say, "Hey, I talked to my client about this type of project. Here's the transcript from the call. Can you help me get a first draft of a proposal for a project together?" And it would do that, but I'd have to edit it, and it wouldn't get it quite right, and it would misplace technology.
(25:05): Whereas, Opi is really capable of understanding and guiding so many things. It knows that it needs to work with the user to understand where they are in the process; if they're totally confused and need help, it can guide them through that. My fingers are crossed that we'll get to that "I don't know how we ever lived without this" state because, even before AI existed, this was always a challenge. Sharing these ideas and best practices across 50, 60 people is not easy.
Rob Collie (25:38): So walk me through, and I kind of know the answers, but walk me through the workflow of how you expect Opi to be used by our consulting team.
Justin Mannhardt (25:46): And I think this is a useful framework for thinking about custom AI solutions, too. It's a chat-based agent. A user would come up to it, and its main job is to support our consultants in the process of working with their clients on opportunities to help them with their business.
Rob Collie (26:05): So, a consultant is going to be walking up and having a chat with Opi.
Justin Mannhardt (26:10): That's right. Some simple ideas here would be the consultant wants to get some help thinking through how to even engage their client in a conversation about AI. "Where do I start?" Or a client has said, like, "Hey, we're interested in how we could use AI with our Power BI models. Do you guys help us with that?" And the consultant could bring that question from the client to Opi.
(26:36): It has a knowledge base underneath it that has been tuned so it understands how we go about the process, from blue sky to actual "Let's sign an engagement agreement and move forward with something." It can support them all the way through that process.
(26:55): So, what's important about the process is, again, the customization part, it's our process. It's the way we like to go through this process. It's not a generic sales process or a generic partnership framework. It's the way we do it.
(27:11): Opi is designed to always ensure that it understands where the consultant is along that journey, everywhere from "How do I get started?" to "The client wants to move forward. I need help reviewing an MSA," and to support them all throughout that process. In the knowledge base, it has everything from the services we do and don't provide to the assets we have, like demos or materials they can share, to the things they need to be gathering, plus suggestions. It's a guide and a partner, working across that spectrum.
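One simple way to picture the architecture Justin describes — a chat agent that first figures out where the consultant is in the process, then grounds its answer in a curated knowledge base — is a stage router in front of the prompt. Everything here (stage names, snippets, keywords) is invented for illustration; a real build like Opi would use an LLM for routing and a proper retriever, not keyword matching.

```python
# Hypothetical knowledge base keyed by stage of the engagement process.
KNOWLEDGE_BASE = {
    "getting_started": [
        "Conversation starters: ask where work piles up faster than people can process it.",
        "Share a concrete example build (e.g., a resume-screening agent) to spark ideas.",
    ],
    "scoping": [
        "Services we do and don't provide; demos and materials that can be shared.",
    ],
    "closing": [
        "Checklist for reviewing an MSA and moving to an engagement agreement.",
    ],
}

STAGE_KEYWORDS = {
    "getting_started": ["where do i start", "how do i begin", "first conversation"],
    "scoping": ["use case", "demo", "power bi"],
    "closing": ["msa", "agreement", "move forward"],
}

def detect_stage(message: str) -> str:
    """Route the consultant's message to a stage of our process."""
    text = message.lower()
    for stage, keywords in STAGE_KEYWORDS.items():
        if any(k in text for k in keywords):
            return stage
    return "getting_started"  # default: assume they're at the blank page

def build_prompt(message: str) -> str:
    """Ground the reply in our process, not a generic sales framework."""
    stage = detect_stage(message)
    context = "\n".join(KNOWLEDGE_BASE[stage])
    return (
        "You are a guide for consultants following our engagement process.\n"
        f"Current stage: {stage}\n"
        f"Relevant guidance:\n{context}\n\n"
        f"Consultant: {message}"
    )
```

The design choice worth noticing: the process knowledge lives in data, not in the model, which is what makes it "our process" rather than a generic sales framework, and what lets the feedback loop improve it over time.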
Rob Collie (27:46): That first part seems like a really hard problem. There's this chasm to cross, both maybe for the consultant, but also for the client. How do you imagine something you haven't seen yet? How does Opi handle that? Where do I start from the consultant? How does Opi help lead the consultant and then, ultimately, the client to a place where they can say, "Oh, yeah, that would actually help"?
Justin Mannhardt (28:15): Today, Opi has two assets. One is an asset of examples. And I've actually taken a lot of things you and I have talked about on the podcast in terms of frameworks and how to think about AI and building that. And Opi can feed the consultant with some thought starters, help them just think through. I would do this with another person, just like sparring ideas. Opi can work as a sparring partner in that way, maybe to help them think.
(28:44): And then as we do more AI projects, it also has a library of project examples, like, "Here's actual things that we've built." I need to add Haystack in there now, and Opi, itself, would go in there. So, that's some resources that it has. And then it also has a guidance framework of just conversational starters, like questions you could ask your client that aren't big blue sky. But then the other thing it does is it has a capability just like, "Hey, here's the people at P3 that can help you with this. Would you like me to help you loop them in on this?" So, that's the ways Opi is set up to help people on that front end.
(29:23): The worst thing about all of this is the blank page. When I was first starting this, I wanted Opi to help get people off the blank page because ideas breed ideas. You could go talk to someone about Haystack, and they'll be like, "Wow, that's really interesting. That reminds me, I have this thing over in purchasing that's like Haystack." We used to see this with Power BI: the method we'd use to solve one problem would carry over to a very similar thing.
(29:51): That's something that's going to need to be cared for over time, also, in a solution like this, so it gets better and better. I think it's probably similar to the experience you had with Griff, refining it over time with those sorts of things.
Rob Collie (30:04): Yeah. The blank page is so tricky because you kind of need to go look at your entire business, and you're looking for things that are currently happening, but happening poorly, or slowly, or whatever. But you're also looking for things that aren't happening that could be happening. It's like the old joke of like, "Try to imagine nothing." It's very hard to do, right? like-
Justin Mannhardt (30:26): It's super hard to do.
Rob Collie (30:27): ... I'm imagining an empty room. "Oh wait, it's a room. There's walls." Things that aren't happening are hard to spot. Haystack is like the mix of things that were happening and weren't.
Justin Mannhardt (30:37): And in some cases, I think a classic technique that I don't think really resonates naturally with the majority of people is to talk about a concept of like, "Well, where is the pain?" I don't know. People might not be experiencing pain, but they also don't understand it could be a lot better or that they could do something differently, to your point, understanding the possibilities, understanding what's happening today in a company. There's probably some other company out there in the world, Rob, that they've got two people, and they're happily reviewing all of the resumes. Just, "Do-do-do-do-do."
Rob Collie (31:15): The pain that you've always lived with is pain that's invisible. So, again, now you're trying to spot something that's almost designed to not be spotted.
(31:24): Yeah, I do think that, fast-forward, I don't know, a year or two years tops, and our customers, our clients, our partners, we're going to need their help to help them. Our job is to show them enough examples so that the light bulb starts going off for them because we do learn about our clients' businesses, but we only learn about the things that are clear trouble spots for them at the moment, not typically seeing the entire picture the way that they do. So, I do think that Opi is going to be useful for this. Do you ever envision Opi being used by a consultant while they're sharing their screen with the client?
Justin Mannhardt (32:06): Your question is interesting because this has been a point of trepidation, since generative AI burst onto the scene, is this insecurity about using it to do our work. My stance is, well, why not? Why wouldn't we embrace the fact that we're using AI to be more successful together with our clients?
Rob Collie (32:31): Yeah.
Justin Mannhardt (32:32): Opi, in its current form, probably not because there's some internal things that Opi... Not that we're trying to hide anything, it just wouldn't make any sense to the client, like, "Who is Heather? Why does she need to do something?" But that front end part of Opi, that I could see. I could see a situation where we have that as a shared asset with clients.
Rob Collie (32:55): So, I have a personal agent. No reason to get into the details of it, but one of the personal agents I built for myself, it's not work related. It's for my wife and I. And the first question it asks is, "Who's here?" Is it Rob? Is it Jocelyn? Is it both of us? And it behaves differently. It doesn't keep secrets or anything, but it knows each of us has a different perspective. Each of us has a different set of things that we're working on. Opi could, potentially, have that same first question.
Justin Mannhardt (33:24): I don't know, Rob. Maybe there's a version of Opi that's just out there for the world, to help them think through where to leverage AI.
Rob Collie (33:31): People are going to be trying to jailbreak it to write JavaScript for them and-
Justin Mannhardt (33:34): Yeah, right. I went to great lengths with Opi to make sure that it wouldn't do certain things. And that was sort of the hammer in my testing: trying to get it to do those things. It was quite successful. I got to give it credit. We've got some smart and creative people here. Maybe they'll figure it out.
Rob Collie (33:56): I bet a version of Opi that we password-protected would end up having its password given to clients. It'd be interesting. If Opi is about what we're saying it's for, why bottleneck it just through the consultants? Obviously the consultants need it, but why not light it up?
Justin Mannhardt (34:16): Why not?
Rob Collie (34:19): Ah.
Justin Mannhardt (34:19): That's a really interesting idea.
Rob Collie (34:20): I mean, it's so cool. Plus it also becomes an example for the client. It's another example of something that helps them build their muscles, as well. It's like, "This thing is real. This thing was helpful."
Justin Mannhardt (34:30): You talk about, well, how do you help someone in that blank page, blue sky state? Examples and demonstrations are so helpful and important because of what we've been talking about. We all understand the off-the-shelf ChatGPT experience. It's probably even hard when you hear us on this podcast. I'll be like, "Yeah, but we customize it to your business." Well, what does that mean? And what's the difference in that experience when we show that to people?
(34:57): One of the things I've been doing, I've been talking to a lot of clients over the past several weeks about AI. When they see that, it helps them. They go, "Oh, I see why that's different." It helps move them along the thought process.
Rob Collie (35:12): We will watch Opi's career with great interest.
Justin Mannhardt (35:14): We will.
Rob Collie (35:15): In a sense, a lot of the things we've been talking about in this episode and in many others are all about these things you need to learn to do, like spotting opportunities, learning what customizing things looks like, and all that kind of stuff. That is the art of being in the 5% of projects that succeed. As more and more people get into the 5%, well, the 5% is going to become 10% and so on and so on.
(35:39): So, yeah, I'm very pleased to have been doing our part to increase that 5%. Haystack is in the winner's circle. It's already sitting there. It's finished its round of golf. It's in the clubhouse. Opi is about to tee off, and we're going to see. Griff is already a big success. We have things going on in Danielson Labs that will be ready any day now because, again, they just keep getting better and better and better. Like, why release these things when they keep getting better?
Justin Mannhardt (36:10): Yeah, right, or you're just like, because you can work so fast, the temptation to continue iterating is insane.
Rob Collie (36:18): Yeah.
Justin Mannhardt (36:19): You don't feel the pressure. It's just a different experience when you had to redact so you could ship and all this stuff, you know. It's, "Ugh."
Rob Collie (36:28): Yeah.
Justin Mannhardt (36:28): But I think Danielson Labs is cooking up something that's going to tee off initially on Monday, as well. So, I'm going to leave that one for a future episode, but we're pretty excited about that use case, as well.
Rob Collie (36:43): I've seen previews.
Justin Mannhardt (36:45): Have you seen the Slack messages?
Rob Collie (36:48): No-
Justin Mannhardt (36:48): Oh, my gosh.
Rob Collie (36:49): ... I haven't.
Justin Mannhardt (36:49): That's a future episode. I can't spill the beans, and I can't steal the man's thunder.
Rob Collie (36:54): That's a good segue, a good teaser, until next time.