The map is not the territory
How do you navigate the pace of AI disruption? This mental model helps you decode AI hype, catch cartographer bias, and avoid being blinded by the past.
Show Notes
Summary
Eric and John break down the mental model "the map is not the territory" and pressure-test it against AI hype, career war stories, and the beloved platitude "perception is reality." They walk through Shane Parrish's three principles: 1) reality is the ultimate update, 2) consider the cartographer, and 3) maps can influence territories, and show why each one matters when billions are flowing into AI and the territory is shifting under everyone's feet.
Key takeaways
- "Perception is reality" is a useful awareness tool and a terrible life principle. It helps you understand why people behave the way they do, but centering your life around it leads to incongruity and character problems.
- Reality will update your map whether you like it or not. AI skeptics who refuse to revise their position as capabilities improve are a real-time case study in map–territory mismatch. The faster the territory changes, the more dangerous a stale map becomes.
- The cartographer always has a bias. Whether it's a CRO whose commission rewards higher ACV (average contract value) or a frontier-model company that needs to justify billions in investment, the person drawing the map has incentives baked in. Always ask who made the map and what they gain from it; the sketch after this list makes the commission arithmetic concrete.
- Maps shape the territory they claim to describe. The ROI-first map for AI is concentrating nearly all successful tooling around knowledge-worker productivity (especially coding), even though AI is capable of far more. That’s limiting what gets built and funded.
- Touch the territory. Financial models, performance reviews, product demos, and AI benchmarks are all maps. The risk you miss is always the one the map doesn't show, so get your hands on the actual thing before making big decisions.
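The CRO story from the episode boils down to simple deal-count arithmetic. Here is a minimal sketch with made-up numbers (the episode makes its numbers up too): the bookings target, the ACV tiers, and the flat 10% commission rate are all hypothetical, not figures from the show.

```python
# Hypothetical sketch of the "consider the cartographer" incentive math.
# All numbers are assumptions: a fixed bookings target, three ACV tiers,
# and a flat 10% commission rate.

TARGET_BOOKINGS = 1_000_000  # hypothetical annual new-bookings goal
COMMISSION_RATE = 0.10       # hypothetical flat commission rate

for segment, acv in [("large", 50_000), ("mid-market", 10_000), ("small", 2_000)]:
    deals_needed = TARGET_BOOKINGS / acv
    commission = TARGET_BOOKINGS * COMMISSION_RATE
    # Bookings and commission are identical in every row; only the number
    # of deals to close (i.e., the work) changes with ACV.
    print(f"{segment:>10}: ACV ${acv:>6,} -> {deals_needed:>3.0f} deals, "
          f"${commission:,.0f} commission")
```

Running it reproduces the split John describes in the episode: 20 large accounts, 100 mid-market, or 500 small accounts all hit the same target and pay the same commission, so a commission-paid mapmaker has every reason to draw the high-ACV route.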
Notable mentions and links
- Charlie Munger of Berkshire Hathaway fame is credited with championing the idea of collecting mental models from many disciplines to improve decision-making.
- Shane Parrish is a Munger disciple who runs the Farnam Street blog and wrote the book series The Great Mental Models.
- You can read the Farnam Street blog post on this mental model.
Transcript
00:00:00,200 --> 00:00:17,540 [Eric] [upbeat music] Welcome back to the Token Intelligence Show. John, one concept that you and I love is, uh, this concept of mental models. 00:00:17,540 --> 00:00:18,099 [John] Yes. 00:00:18,100 --> 00:00:38,940 [Eric] And we have actually used these mental models as a way to sort of navigate a lot of questions about AI. And we talked about a mental model on the very first show that we recorded, and we thought that we would just do some dedicated episodes to exploring these mental models. 00:00:38,940 --> 00:00:41,000 [John] Yes. Bottlenecks was the first one we did. 00:00:41,000 --> 00:01:31,760 [Eric] Bottlenecks was the first one. And for those of you who didn't catch the first episode, mental models actually stem from Charlie Munger of, you know, Berkshire Hathaway fame. And there's a guy named Shane Parrish, who was a disciple of Charlie Munger's, and Shane Parrish uncovered this... Not uncovered, I mean, it was, it was fairly widely known, but Charlie Munger had this belief that if you studied th- all of these different disciplines in the world, you could derive mental models from different disciplines and then use those to, uh, apply those mental models, um, you know, as appropriate for decision-making and problem-solving. 00:01:31,760 --> 00:01:32,480 [John] Yep. 00:01:32,480 --> 00:02:03,700 [Eric] And so the thesis is, okay, if you have a good handle on these mental models, which give you a, a foundational understanding of how things work, then you're way better at decision-making and way better at problem-solving because you're sort of using these baseline, you know, mental models. And Charlie Munger said that there are probably 80 or 90 models, you know, that you need to know. And then, you know, a, a small fraction of those, let's say five or 10, really carry most of the freight, I think is, is the term that he used, which I like. 00:02:03,700 --> 00:02:08,360 [John] What- do you have that shortlist, like, in your mind? 'Cause I don't think that I do. 00:02:08,360 --> 00:02:08,400 [Eric] I- 00:02:08,400 --> 00:02:12,020 [John] I have my favorites, I would say, but I, I haven't quite shortlisted for me. 00:02:12,020 --> 00:02:13,950 [Eric] I haven't finished reading the full series, which- 00:02:13,950 --> 00:02:13,950 [John] Okay 00:02:13,950 --> 00:02:16,920 [Eric] ... because the first couple books, I've- there's so much meat in there that I- 00:02:16,920 --> 00:02:17,070 [John] Right 00:02:17,070 --> 00:02:18,500 [Eric] ... tend to keep returning to those. 00:02:18,500 --> 00:02:18,920 [John] Right. 00:02:18,920 --> 00:02:20,980 [Eric] So I don't know. We should- we'll do that in a future episode- 00:02:20,980 --> 00:02:21,049 [John] That'd be fun 00:02:21,049 --> 00:02:22,360 [Eric] ... is try to, like, shortlist the models. 00:02:22,360 --> 00:02:23,180 [John] Yeah. 00:02:23,180 --> 00:02:24,080 [Eric] Um- 00:02:24,080 --> 00:02:25,740 [John] This is a top one, though. This is a fun one. 00:02:25,740 --> 00:02:37,220 [Eric] Yeah, this is, this is absolutely in the top, in the top several for me. Um, Shane Parrish, though, ran a blog called Farnam Street. Farnam Street is, um, a nod to Berkshire Hathaway. 00:02:37,220 --> 00:02:37,310 [John] Mm. 00:02:37,310 --> 00:02:39,840 [Eric] So, um, you know, in Nebraska- 00:02:39,840 --> 00:02:39,910 [John] Right 00:02:39,910 --> 00:03:00,820 [Eric] ... where, um, where Warren Buffett lives. And he, uh, explored all sorts of topics. He has a great podcast called The Knowledge Project. 
He explored all sorts of topics, but he really dug into mental models and ended up writing a book series called The Great Mental Models. Uh, and actually, we have- 00:03:00,820 --> 00:03:01,799 [John] Yeah. 00:03:01,800 --> 00:03:06,640 [Eric] We have one right here. Uh, we'll show it to, um, those of you on video. 00:03:06,640 --> 00:03:08,079 [John] That is the one- 00:03:08,080 --> 00:03:09,300 [Eric] This is volume three- 00:03:09,300 --> 00:03:09,340 [John] Mathematics 00:03:09,340 --> 00:03:09,470 [Eric] ... Systems- 00:03:09,470 --> 00:03:10,840 [John] Systems and Mathematics 00:03:10,840 --> 00:03:14,430 [Eric] ... and Mathematics. So actually, here I can- I can put this up here. 00:03:14,430 --> 00:03:14,540 [John] There you go. 00:03:14,540 --> 00:03:18,370 [Eric] Um... Oh, well, that's not centered, sorry. 00:03:18,370 --> 00:03:18,920 [John] Yeah. There you go. 00:03:18,920 --> 00:03:19,420 [Eric] There we go. [chuckles] 00:03:19,420 --> 00:03:21,549 [John] Your c- your camera's, like, moving around. [chuckles] 00:03:21,549 --> 00:03:22,460 [Eric] It is moving around. 00:03:22,460 --> 00:03:22,700 [John] I found it. 00:03:22,700 --> 00:03:48,240 [Eric] Yeah, but volume one is General Thinking Concepts, volume two is Physics, Chemistry, and Biology, volume three is Systems and Mathematics, and volume four is Economics and Art. So bottlenecks came from volume three, but we're actually going to talk about, um... We're going to talk about a, the mental model called The Map Is Not the Territory, and it is the, I believe, the very first model in the very first- 00:03:48,240 --> 00:03:48,760 [John] Mm 00:03:48,760 --> 00:03:56,630 [Eric] ... book, uh, from General Thinking, and it's a great one. So you wanted to frame this- 00:03:56,630 --> 00:03:57,549 [John] I have a set up. Yeah 00:03:57,549 --> 00:04:01,040 [Eric] ... with a, with a question, which I think is a really compelling question. 00:04:01,040 --> 00:04:18,140 [John] Yeah. So pretty much my entire business career, I've heard this business platitude that I think conflicts a little bit with this, right? So I think a lot of you have heard it, too. So the mental model today that we're talking about is called The Map Is Not the Territory. 00:04:18,140 --> 00:04:19,000 [Eric] Yep. 00:04:19,000 --> 00:04:26,940 [John] But a lot of times in business, I've heard this probably at every job I've, I've ever had, is, um, perception is reality. 00:04:26,940 --> 00:04:27,440 [Eric] Mm. 00:04:27,440 --> 00:04:30,620 [John] So that's, that's what I wanna dig into. 00:04:30,620 --> 00:04:30,640 [Eric] Yep. 00:04:30,640 --> 00:04:41,540 [John] We're gonna go into the model first, so you can better unpack "the map is not the territory." But in the back of your mind, and what I've been thinking about is, like, wait, but if perception is reality, how does that fit in with this? 00:04:41,540 --> 00:04:46,050 [Eric] Right. Well, let me ask you, let me ask you a question, and you can defer this to later in the show- 00:04:46,050 --> 00:04:46,440 [John] Right 00:04:46,440 --> 00:04:47,950 [Eric] ... if you want there to be a big reveal. [chuckles] 00:04:47,950 --> 00:04:50,160 [John] [chuckles] Yeah. 00:04:50,160 --> 00:04:58,700 [Eric] Do you generally agree with the idea that perception is not reality? Or sorry, that perception is reality? 00:04:58,700 --> 00:05:05,390 [John] I, I don't know. I've gone back and forth and back and forth on it, truly. 00:05:05,390 --> 00:05:05,520 [Eric] Okay. Yeah.
00:05:05,520 --> 00:05:15,580 [John] Truly, where, where early on when I first heard it, I would say vehemently disagreed. Then, like, as I've progressed, like, I see the aspects where it's true. 00:05:15,580 --> 00:05:16,500 [Eric] Mm-hmm. 00:05:16,500 --> 00:05:22,880 [John] And then I f- I'm really... I, I don't think I can give a, like, full answer to it. I can unpack that. 00:05:22,880 --> 00:05:30,320 [Eric] What you're saying, it sounds very much like when it comes to perception being reality, that the map is not the territory. [chuckles] 00:05:30,320 --> 00:05:30,820 [John] Yeah. Oh, wow! 00:05:30,820 --> 00:05:32,200 [Eric] And, and distilling complexity- 00:05:32,200 --> 00:05:32,280 [John] Wow 00:05:32,280 --> 00:05:33,700 [Eric] ... is hard. That was meta. You're welcome. 00:05:33,700 --> 00:05:34,870 [John] Yeah, that was super meta. Wow! 00:05:34,870 --> 00:05:35,340 [Eric] You're welcome. 00:05:35,340 --> 00:05:36,620 [John] But let me give you an example. 00:05:36,620 --> 00:05:36,760 [Eric] Yep. 00:05:36,760 --> 00:05:55,630 [John] In the... 'Cause I, I think everybody has a different context for that s- that phrase, "perception is reality." And I think for some people, it's actually immensely helped them, and for other people, like, not so much. My original, like, founding perception of hearing this, my very first job, and this is so funny looking back on it. My boss comes in, or boss's boss, I think, comes into the room. 00:05:55,630 --> 00:05:56,180 [Eric] Mm-hmm. 00:05:56,180 --> 00:06:02,200 [John] And, and, and it was just one of those, like: "Let me teach you a story," you know, I was really... or, "Let me teach you through a story"- 00:06:02,200 --> 00:06:02,420 [Eric] Right 00:06:02,420 --> 00:06:02,700 [John] ... type of- 00:06:02,700 --> 00:06:02,730 [Eric] Right 00:06:02,730 --> 00:06:07,416 [John] ... type of moments. Um-... It's like, all right, early on in my career when I was in the military- 00:06:07,416 --> 00:06:07,846 [Eric] Mm-hmm. 00:06:07,846 --> 00:06:10,916 [John] -was in-- I was working with a team. We were in charge of this warehouse. 00:06:10,916 --> 00:06:11,216 [Eric] Mm-hmm. 00:06:11,216 --> 00:06:14,616 [John] We had to inventory everything every week- 00:06:14,616 --> 00:06:14,645 [Eric] Yep 00:06:14,645 --> 00:06:15,635 [John] ... kinda manage a warehouse, right? 00:06:15,635 --> 00:06:16,515 [Eric] Yep. 00:06:16,515 --> 00:06:21,316 [John] And, you know, at times it was really busy and hectic, but there's other times, like, we just, we had a lot of downtime. 00:06:21,316 --> 00:06:22,036 [Eric] Mm-hmm. 00:06:22,036 --> 00:06:27,395 [John] Um, and, [chuckles] and he's like: What, what my, you know, manager, or whatever the- 00:06:27,395 --> 00:06:27,406 [Eric] Yeah 00:06:27,406 --> 00:06:28,075 [John] ... position was at the time- 00:06:28,075 --> 00:06:28,676 [Eric] Commanding officer. 00:06:28,676 --> 00:06:40,866 [John] Yeah, commanding officer, yeah. What he taught me is that regardless of what was happening, you needed to look busy, and the hack was that you needed to have a clipboard and a pen or a pencil in your hand at all times when you're walking around. 00:06:40,866 --> 00:06:40,876 [Eric] [chuckles] 00:06:40,876 --> 00:06:50,596 [John] So if you're walking around aimlessly, clipboard and pencil. If you're actually doing work, clipboard and pencil. And then he just walks out.
[laughing] Like, so it's like a- 00:06:50,596 --> 00:06:51,376 [Eric] That's your lesson- 00:06:51,376 --> 00:06:51,466 [John] So we- 00:06:51,466 --> 00:06:52,395 [Eric] ... early in your career? 00:06:52,395 --> 00:06:56,515 [John] Yeah. So it was like, uh, I mean, we're-- I'm a couple months into my first job- 00:06:56,515 --> 00:06:57,256 [Eric] Wow 00:06:57,256 --> 00:07:08,596 [John] ... and, like, walks out, and you're like... And, and, and, and the, the lesson, the takeaway is, like, perception is reality. And then walks out. And, and I'm just sitting there like: I don't know what to think about this. 00:07:08,596 --> 00:07:11,716 [Eric] Perception is reality. Mic drop. I'm gone. 00:07:11,716 --> 00:07:11,876 [John] Yeah. 00:07:11,876 --> 00:07:13,336 [Eric] Did you go buy a clipboard? 00:07:13,336 --> 00:07:15,885 [John] [chuckles] Uh, no, I didn't. 00:07:15,885 --> 00:07:16,356 [Eric] [chuckles] 00:07:16,356 --> 00:07:19,235 [John] I had a notebook, which may be similar in a business context. 00:07:19,236 --> 00:07:20,236 [Eric] Yeah. 00:07:20,236 --> 00:07:26,976 [John] But it was so... And, and, and his point to us was like: "You guys should look bu- busy even if you don't have work to do." 00:07:26,976 --> 00:07:27,075 [Eric] Yeah. 00:07:27,075 --> 00:07:27,796 [John] That was his point. 00:07:27,796 --> 00:07:28,916 [Eric] Yeah, yeah. 00:07:28,916 --> 00:07:30,436 [John] Tie it into perception is reality. 00:07:30,436 --> 00:07:32,695 [Eric] Okay. All right, we're gonna test- 00:07:32,695 --> 00:07:32,706 [John] [chuckles] 00:07:32,706 --> 00:07:37,655 [Eric] ... we're gonna test perception as reality, um, through this mental model. So- 00:07:37,655 --> 00:07:38,336 [John] Yeah 00:07:38,336 --> 00:08:49,756 [Eric] ... uh, I'm gonna read from, um, I'm gonna read from Shane Parrish's source material on, um, on The Map is Not the Territory. So select quotes from The Map is Not the Territory: "The map of reality is not reality. Even the best maps are imperfect. That's because maps are reductions of what they represent. If a map were to represent the territory with perfect fidelity, it would be-- it would no longer be a reduction, and thus would no longer be useful to us. A map can also be a snapshot from a point in time, representing something that no longer exists. We use maps every day to simplify complexity. A great example is the financial statements of a company, which are meant to distill the complexity of thousands of transactions into something manageable. Yet they tell us nothing about whether the product is good for the customer or what's really going on in the company. A policy document on office procedure, a manual on parenting a two-year-old, or your performance review: all are models or maps that simplify some complex territory to guide you through it. Relying solely on maps can lead you to the wrong conclusion. You need to actually touch the territory." Uh, and then the, um, 00:08:51,796 --> 00:09:44,906 [Eric] uh... And then the post goes on to say: "Some of the biggest map-territory problems are the risks of the territory that are not shown on the map. When we're following the map without looking around, we trip right over these risks. Any user of a map or model must realize that we do not understand a model, map, or reduction unless we understand and respect its limitations. If we don't understand what the map does and doesn't tell us, it can be useless or even dangerous.
In order to use a map or model as accurately as possible, we should take into account three important principles. Number one, reality is the ultimate update. Number two, consider the cartographer. And number three, maps can influence territories." Um, and I'll just give a brief-- I'll read a, like a brief sentence or two on each of these. Um- 00:09:44,906 --> 00:09:46,415 [John] So reality is the ultimate update. 00:09:46,415 --> 00:09:46,836 [Eric] Yep. 00:09:46,836 --> 00:09:49,276 [John] Consider the cartographer, so the mapmaker. 00:09:49,276 --> 00:09:49,876 [Eric] Yep. 00:09:49,876 --> 00:09:51,435 [John] Maps can influence territories. 00:09:51,435 --> 00:09:51,656 [Eric] Yep. 00:09:51,656 --> 00:09:52,736 [John] Got it. 00:09:52,736 --> 00:10:39,056 [Eric] So, um, uh, for reality is the ultimate update, a map captures a territory in a moment in time. Just because it might have done a good job at depicting what was at that-- what was at the time it was made, there is no guarantee that it depicts what there is now, uh, or what there will be in the future. The faster the rate of change in the territory, the harder it will be for a map to keep up to date. Uh, consider the cartographer. Maps are not objective creations. They reve- reflect the values, standards, and limitations of their creators. And then maps can influence territories. Um, a model can influence the decisions we make in the territory when we try to fit complexity into our own, um, simplification- 00:10:39,056 --> 00:10:39,066 [John] Mm 00:10:39,066 --> 00:10:48,705 [Eric] ... or our own model of simplification. So I thought what we could do to break this down is go through each of these, uh, points, and each of us can kinda- 00:10:48,705 --> 00:10:48,705 [John] Mm 00:10:48,705 --> 00:10:56,516 [Eric] ... give an example or two about how we're seeing this play out. And I think this is particularly important because we're going through such a big change- 00:10:56,516 --> 00:10:56,746 [John] Right 00:10:56,746 --> 00:11:06,276 [Eric] ... with AI, where the territory is changing rapidly, and so a lot of maps are having to be thrown out and recreated. So- 00:11:06,276 --> 00:11:14,195 [John] Yeah, well, and we were talking about this before the show. All three of these points are actually really relevant for the current AI, like- 00:11:14,195 --> 00:11:15,586 [Eric] Investment landscape. 00:11:15,586 --> 00:11:21,036 [John] All of the... Investment landscape, the, what reality is versus perceived reality. [chuckles] 00:11:21,036 --> 00:11:21,046 [Eric] Yep. 00:11:21,046 --> 00:11:22,056 [John] Like, it's all- 00:11:22,056 --> 00:11:22,195 [Eric] Yeah 00:11:22,195 --> 00:11:23,036 [John] ... interconnected, I think. 00:11:23,036 --> 00:11:30,856 [Eric] Well, so let's do that. Let's, let's just break down how these relate to AI, and then let's give some maybe examples from our careers, you know- 00:11:30,896 --> 00:11:30,906 [John] Yeah 00:11:30,906 --> 00:11:54,856 [Eric] ... or even personal lives that relate to this. So reality is the ultimate update. I think when I think about AI and reality as the ultimate update for the map is not the territory, I'm gonna give a very specific example here. I'm gonna, I'm gonna think about the person who, um, started out very skeptical of AI. 00:11:54,856 --> 00:11:55,656 [John] Yes.
00:11:55,656 --> 00:12:20,124 [Eric] Um, you know, sort of the most vocal of these people, wrote blog posts, you know, or social media posts about, you know, the limitations and, um, [clears throat] and the problems, you know, and all of the danger inherent in non-deterministic, you know, generation, et cetera.... who progressively, over time, were forced by reality to update their map of- 00:12:20,124 --> 00:12:20,304 [John] Right 00:12:20,304 --> 00:12:21,724 [Eric] -what AI is useful for- 00:12:21,724 --> 00:12:21,734 [John] Right 00:12:21,734 --> 00:12:23,164 [Eric] -and what it's capable of. 00:12:23,164 --> 00:12:23,264 [John] Right. 00:12:23,264 --> 00:12:33,624 [Eric] Um, and so, it-- but interestingly, like we talked about in the sunk costs episode, um, which we, which was our previous episode- 00:12:33,624 --> 00:12:33,944 [John] Right 00:12:34,004 --> 00:12:37,084 [Eric] ... there are a lot of people who are not updating their map. They're not actually- 00:12:37,084 --> 00:12:37,234 [John] Yeah 00:12:37,234 --> 00:12:38,403 [Eric] -using reality to update their map. 00:12:38,404 --> 00:12:50,183 [John] Right. Well, and I, I think the trickiest part for anybody, me included, and reality is the ultimate update, is sorting out what I want to believe to be true versus what is true. 00:12:50,184 --> 00:12:52,164 [Eric] Mm. Yes. Yes. 00:12:52,164 --> 00:12:57,404 [John] And getting-- being honest enough with yourself to figure out the difference. 00:12:57,404 --> 00:12:57,424 [Eric] Yep. 00:12:57,424 --> 00:13:02,344 [John] And sometimes reality will align with what you want to believe is true, and sometimes it won't. 00:13:02,344 --> 00:13:04,914 [Eric] Unfortunately, [chuckles] that is- 00:13:04,914 --> 00:13:05,794 [John] [chuckles] 00:13:05,794 --> 00:13:11,864 [Eric] ... as you, as you get older and each decade passes, you progressively realize that the delta is- 00:13:11,864 --> 00:13:16,164 [John] Larger than you ever could have imagined? [laughing] 00:13:16,164 --> 00:13:22,164 [Eric] [laughing] Uh, okay, consider the cartographer. Uh, break that down for me as far as AI. 00:13:22,164 --> 00:13:25,724 [John] Yeah. So t-this one 00:13:25,724 --> 00:13:36,364 [John] I think is really complex, in that-- so consider the cartographer or the mapmaker. So I think it would be fair to say that a lot of the companies building these, the frontier models- 00:13:36,364 --> 00:13:36,804 [Eric] Mm-hmm 00:13:36,804 --> 00:13:40,704 [John] ... these large AI companies, are kind of the cartographers of the day. 00:13:40,704 --> 00:13:40,804 [Eric] Yep. 00:13:40,804 --> 00:13:43,383 [John] Or the thought leaders in the space, which there's a big overlap there. 00:13:43,384 --> 00:13:44,224 [Eric] Yep. 00:13:44,224 --> 00:13:52,804 [John] So they are absolutely influencing, you know... So the consider the cart- cartographers, essentially talking about bias. Like, w- 00:13:52,804 --> 00:13:52,874 [Eric] Yep 00:13:52,874 --> 00:13:57,784 [John] ... if you created the map, you as the mapmaker are interpreting reality through your- 00:13:57,784 --> 00:13:57,794 [Eric] Yep 00:13:57,794 --> 00:13:59,544 [John] ... own lens and creating a map. 00:13:59,544 --> 00:13:59,704 [Eric] Yep. 00:13:59,704 --> 00:14:07,204 [John] So I think there's a big, a, a, a big consideration here, is if, if you have invested billions of dollars into AI, like, you want it to work- 00:14:07,204 --> 00:14:07,274 [Eric] Yes. 00:14:07,274 --> 00:14:08,384 [John] -you want it to be true. 00:14:08,384 --> 00:14:08,844 [Eric] Yep. 
00:14:08,844 --> 00:14:16,274 [John] You want to just continue to, um, kind of amp up the potential gains, the potential- 00:14:16,274 --> 00:14:16,274 [Eric] Yep 00:14:16,274 --> 00:14:22,824 [John] ... outcomes, all of that. So I-- yeah, I think it's such a, such a big deal right now, 00:14:22,824 --> 00:14:23,564 [John] to think about that. 00:14:23,564 --> 00:14:33,044 [Eric] Uh, uh, it really is because I think about ROI, which, you know, ROI for not only the frontier models- 00:14:33,044 --> 00:14:33,054 [John] Right 00:14:33,054 --> 00:14:43,164 [Eric] ... and the people who are funding them, but also, um, you know, the, the companies that are purchasing, you know, AI products, tokens- 00:14:43,164 --> 00:14:43,184 [John] Right 00:14:43,184 --> 00:14:44,384 [Eric] ... API access- 00:14:44,384 --> 00:14:44,824 [John] Right 00:14:44,824 --> 00:14:57,093 [Eric] ... that need to prove, you know, some sort of ROI from this. Um, and so that actually, I think, leads well into point number three: maps can influence territories. So one of the- 00:14:57,093 --> 00:14:57,114 [John] Yeah 00:14:57,114 --> 00:15:10,844 [Eric] ... things I've been thinking about here with AI is that, um, you know, you and I have both talked about how we believe that AI is going to unleash a gigantic boon in economic creativity, um- 00:15:10,844 --> 00:15:11,764 [John] Right 00:15:11,764 --> 00:15:26,234 [Eric] ... because it is such a powerful tool and capable of so many different things across so many different disciplines. But really, the main impact that it is currently having is productivity in knowledge work. 00:15:26,234 --> 00:15:26,304 [John] Right. 00:15:26,304 --> 00:15:28,484 [Eric] And I would argue specifically, that- 00:15:28,484 --> 00:15:28,954 [John] Software engineering- 00:15:28,954 --> 00:15:31,104 [Eric] ... that overindexes for software engineering- 00:15:31,104 --> 00:15:31,274 [John] Right 00:15:31,274 --> 00:15:51,004 [Eric] ... right, in the current state. And so that, I think, is because, like, the map is influencing the territory, in that the map that I think all of the money that has been put into this, the map that they have is figuring out the quickest path to ROI. 00:15:51,004 --> 00:15:51,304 [John] Yes. 00:15:51,304 --> 00:15:54,914 [Eric] And the quickest path to ROI is knowledge worker productivity, right? 00:15:54,914 --> 00:15:54,944 [John] Right. 00:15:54,944 --> 00:15:57,364 [Eric] Is, like, sort of increasing, you know- 00:15:57,364 --> 00:15:57,374 [John] Right 00:15:57,374 --> 00:15:59,304 [Eric] ... increasing leverage, whatever that looks like. 00:15:59,304 --> 00:15:59,484 [John] Yeah, right. 00:15:59,484 --> 00:16:16,464 [Eric] That could be through output, that could be through cost optimization, et cetera, right? But that's absolutely influencing the territory, and I think that's why you've seen the most successful tools, um, all, you know-- you know, as we think about investment, Silicon Valley- 00:16:16,464 --> 00:16:17,084 [John] Right 00:16:17,084 --> 00:16:22,264 [Eric] ... and the tech space, almost all of the tools are related to increasing knowledge worker productivity. 00:16:22,264 --> 00:16:22,584 [John] Right. 00:16:22,584 --> 00:16:27,884 [Eric] Right? Um, that have been, like, successful on a large scale. Um, you know, which is super interesting. 00:16:27,884 --> 00:16:27,924 [John] Right.
00:16:27,924 --> 00:16:35,384 [Eric] So that map is absolutely influencing the territory, even though AI is capable of, you know, all sorts of different things- 00:16:35,384 --> 00:16:35,524 [John] Right 00:16:35,524 --> 00:16:36,244 [Eric] ... beyond- 00:16:36,244 --> 00:16:36,364 [John] Right 00:16:36,364 --> 00:16:38,924 [Eric] ... you know, just knowledge worker productivity. 00:16:38,924 --> 00:16:46,814 [John] Yeah. I mean, I mean, it's a huge thing, and, and like you said, the subset of, like, software engineering is even more concentrated inside of knowledge work- 00:16:46,814 --> 00:16:46,894 [Eric] Mm-hmm 00:16:46,894 --> 00:16:51,164 [John] ... primarily because in, in a lot of ways, it's easier. 00:16:51,164 --> 00:16:51,464 [Eric] Yep. 00:16:51,464 --> 00:16:54,654 [John] The, the-- when you get to other knowledge work, it's just messier. 00:16:54,654 --> 00:16:54,684 [Eric] Yep. 00:16:54,684 --> 00:17:07,364 [John] Like, you've got, you've got stuff in people's brains, you've got stuff in messy formats like email, whereas, whereas, um, in a lot of ways, the knowledge work that's most organized is code. 00:17:07,364 --> 00:17:07,583 [Eric] Yep. 00:17:07,584 --> 00:17:09,744 [John] Like, that's the most organized knowledge work. 00:17:09,744 --> 00:17:10,003 [Eric] Yes. 00:17:10,003 --> 00:17:16,784 [John] And because of that, um, that's why I think it's, it's the, the biggest, the first frontier- 00:17:16,784 --> 00:17:16,964 [Eric] Sure 00:17:16,964 --> 00:17:17,724 [John] ... because of that. 00:17:17,724 --> 00:17:20,064 [Eric] Right, and the data is more easily accessible, right? 00:17:20,064 --> 00:17:20,784 [John] Yeah, k- sure. 00:17:20,784 --> 00:17:22,144 [Eric] Code is the most- 00:17:22,144 --> 00:17:22,184 [John] Right. 00:17:22,184 --> 00:17:24,554 [Eric] It is the most documented output of knowledge work- 00:17:24,554 --> 00:17:25,344 [John] Mm-hmm 00:17:25,344 --> 00:17:26,524 [Eric] ... probably in history. 00:17:26,524 --> 00:17:33,504 [John] Well, and, and the other thing, too, just to be, really to distill, like, what AI is good at, it's, like, really good at reading fast. Um- 00:17:33,504 --> 00:17:33,604 [Eric] Yep. 00:17:33,604 --> 00:17:38,664 [John] Can you read in your native language faster, or can you read code faster? 00:17:38,664 --> 00:17:38,944 [Eric] Yep. 00:17:38,944 --> 00:17:40,184 [John] I mean, you- 00:17:40,184 --> 00:17:40,374 [Eric] Oh, [chuckles] 00:17:40,374 --> 00:17:44,463 [John] ... I think, native language. But, but, uh, but not just you, but I think even some- 00:17:44,464 --> 00:17:45,094 [Eric] Yep 00:17:45,094 --> 00:17:48,244 [John] ... like, really, like, talented developers, it's still their native language. 00:17:48,244 --> 00:17:48,904 [Eric] Yep. 00:17:48,904 --> 00:17:51,764 [John] So there's that. And then writing, think about that same thing. 00:17:51,764 --> 00:17:52,664 [Eric] Mm-hmm. 00:17:52,664 --> 00:17:54,264 [John] Like, which one can you do faster? 00:17:54,264 --> 00:17:54,764 [Eric] Mm-hmm. 00:17:54,804 --> 00:17:59,744 [John] So there's, it's just a really interesting, like, trade-off, even for people that are really- 00:17:59,744 --> 00:17:59,924 [Eric] Yep 00:17:59,924 --> 00:18:12,224 [John] ... literate in whatever language they maybe write code in. They're-- I, I think for a lot of-- I think most people, they're still, like, faster in their native language if they can communicate in that. 
And there's, there's still a translation process- 00:18:12,224 --> 00:18:12,394 [Eric] Yep 00:18:12,394 --> 00:18:20,664 [John] ... that nobody thought much about, where they have to go to the meetings and speak in English, and then they have to go away and write in code. There's still this translation process- 00:18:20,664 --> 00:18:20,784 [Eric] Yep 00:18:20,784 --> 00:18:36,308 [John] ... um, that when you find something that's, like, essentially not, not any slower in reading in English versus in Python... Same with writing, not really any slower in writing in English versus Python. Like, it's just, it, it's a-... so it's a huge opportunity. 00:18:36,308 --> 00:18:37,398 [Eric] Yep, totally agree. 00:18:37,398 --> 00:18:41,048 [John] And that's why I think a lot of the money's focused on that. And they're expensive people, that's the other thing. 00:18:41,048 --> 00:18:41,578 [Eric] Yes. [chuckles] 00:18:41,578 --> 00:18:51,947 [John] The ROI problem of, like, if you look at your headcount for an organization, you know, the end, uh, software engineering or technical people do tend to be more expensive. 00:18:51,947 --> 00:18:51,967 [Eric] Yep. 00:18:51,967 --> 00:18:54,788 [John] So that's the other reason why there's a kind of a target there, I think. 00:18:54,788 --> 00:19:04,368 [Eric] Yep. Yep. Okay, let's talk about some lessons from our careers where we either sort of understood that the map is not the territory or did not take that to heart. 00:19:04,368 --> 00:19:04,888 [John] [chuckles] 00:19:04,888 --> 00:19:10,758 [Eric] So I'm gonna- I'm going to let you choose one of the three, um- 00:19:10,758 --> 00:19:10,868 [John] Oh, man 00:19:10,868 --> 00:19:13,947 [Eric] ... principles here and share an anecdote, and then I can take one. 00:19:13,947 --> 00:19:18,288 [John] Yeah. I mean, I, I think I have to get back to the perception is reality thing. 00:19:18,288 --> 00:19:19,127 [Eric] Yeah. 00:19:19,128 --> 00:19:19,568 [John] Really. 00:19:19,568 --> 00:19:23,538 [Eric] Are you, are you any more convicted about the truth of it? [chuckles] 00:19:23,538 --> 00:19:28,788 [John] Here... And this isn't like a s- uh... Well, other than like the example kind of I already gave from one of my first jobs. 00:19:28,788 --> 00:19:29,467 [Eric] Yeah. 00:19:29,467 --> 00:19:36,788 [John] Um, I think, I think it's a- I think it's a reality to be aware of and a horrible principle to live by- 00:19:36,788 --> 00:19:36,798 [Eric] Mm 00:19:36,798 --> 00:19:40,768 [John] ... is kind of my, like, best way to communicate what I think about it. 00:19:40,768 --> 00:19:41,447 [Eric] Yeah. 00:19:41,447 --> 00:19:56,488 [John] And it's a horrible principle to live by, in that, like, if perception is reality, that means that things that people can't see are less important or maybe not important at all. And if you really are living by that, I think you end up with some pretty deep character problems- 00:19:56,488 --> 00:19:56,578 [Eric] [chuckles] Yes 00:19:56,578 --> 00:20:01,888 [John] ... and some, like, deep, like, incongruity in your life. 00:20:01,888 --> 00:20:02,308 [Eric] Yep. 00:20:02,308 --> 00:20:19,228 [John] Um, so I think horrible principle to live by. Really helpful thing to be aware of is, and just being... Honestly, the, the upside of it, the better part of it, is being better able to understand what other people are perceiving- 00:20:19,228 --> 00:20:19,328 [Eric] Yes 00:20:19,328 --> 00:20:20,928 [John] ... versus your perception. 00:20:20,928 --> 00:20:21,348 [Eric] Yep. 
00:20:21,348 --> 00:20:24,168 [John] That's the positive of it and the necessary part of it. 00:20:24,168 --> 00:20:24,578 [Eric] Yep. 00:20:24,578 --> 00:20:32,108 [John] So I think that's, to me, why it's complicated, where i- in a sense, you cringe at it, 'cause if somebody's gonna, like, center their life around the principle- 00:20:32,108 --> 00:20:32,238 [Eric] Right 00:20:32,238 --> 00:20:33,288 [John] ... it's, like, super destructive. 00:20:33,288 --> 00:20:33,848 [Eric] Right. 00:20:33,908 --> 00:20:39,588 [John] But if somebody's gonna completely ignore it for what it is, as far as... Like, that's a problem, too. 00:20:39,588 --> 00:20:47,028 [Eric] Yeah. I think that the way that this has played out for me is that 00:20:47,028 --> 00:20:52,908 [Eric] I think that before I started to understand 00:20:52,908 --> 00:21:04,658 [Eric] that the map is not the territory, I would use my map of expected behavior out of people given the defined inputs, right? 00:21:04,658 --> 00:21:04,687 [John] Okay. 00:21:04,687 --> 00:21:14,308 [Eric] So you have a certain context, you have a certain motivation or incentive, and so, you know, you have a map in your mind of how someone's going to behave or handle a certain situation. 00:21:14,308 --> 00:21:15,068 [John] Yes. 00:21:15,068 --> 00:21:20,048 [Eric] When there's deviation from that map, that's difficult to understand, right? 00:21:20,048 --> 00:21:20,128 [John] Right. 00:21:20,128 --> 00:21:21,998 [Eric] This is like reality is the ultimate update. 00:21:21,998 --> 00:21:22,018 [John] Right. 00:21:22,018 --> 00:21:31,788 [Eric] And so I remember experiencing this, you know, some... We had some friends years ago who were going through some, like, very difficult, you know, relational problems. 00:21:31,788 --> 00:21:32,687 [John] Mm-hmm. 00:21:32,687 --> 00:21:39,828 [Eric] And I had a map of why that was, you know, wha- I, I had a map of why that was happening. 00:21:39,828 --> 00:21:41,348 [John] Mm. 00:21:41,348 --> 00:22:11,868 [Eric] Um, and the way things unfolded, like, dramatically deviated from that map, and one of the... That is, I think- a- and I, I almost remember the moment specifically when perception is reality clicked for me, where I realized, like, okay, there are some objective things happening here, but the perception of these people involved is different from the objective reality, yet they are behaving according to their perception, right? 00:22:11,868 --> 00:22:11,888 [John] Yeah. 00:22:11,888 --> 00:22:20,258 [Eric] And so they were operating within the reality that was r- best reflected by their perception or their map, you know? And so I was like, "Oh, right!" Like- 00:22:20,258 --> 00:22:27,687 [John] So you feel like you, as a third party, are like, "I can see what's going on here," and then you also, like, start to understand what they think is going on- 00:22:27,687 --> 00:22:28,068 [Eric] Correct 00:22:28,068 --> 00:22:29,888 [John] ... and kind of feel that tension essentially. 00:22:29,888 --> 00:22:29,908 [Eric] Correct. 00:22:29,908 --> 00:22:30,328 [John] Yeah. 00:22:30,328 --> 00:22:31,568 [Eric] Yeah, yeah. Yeah. 00:22:31,568 --> 00:22:42,128 [John] Yeah. I think, I think for me, like, I kind of shared the clipboard example, uh, updated, like something that, where I learned... So that was the one side, where it's like, uh, you know, you- you're kind of repulsed by that example. 00:22:42,128 --> 00:22:42,248 [Eric] Yes. 00:22:42,248 --> 00:22:43,687 [John] It's like, no. 
00:22:43,687 --> 00:22:43,788 [Eric] Right. 00:22:43,788 --> 00:22:51,447 [John] Like, somebody's eventually gonna figure out that, like, we're not actually that busy, and they're gonna, like, adjust to the problem. It probably would've been better to just, like- 00:22:51,447 --> 00:22:51,548 [Eric] Yep 00:22:51,548 --> 00:23:05,567 [John] ... you know. But the opposite example, a little bit, a couple years later in my career, is we were, um, making, um, some pretty important software selection with a, with a pretty decent chunk of investment for the company- 00:23:05,568 --> 00:23:06,187 [Eric] Mm-hmm 00:23:06,187 --> 00:23:09,908 [John] ... um, that, that we were at. We're, you know, buying a piece of software, 00:23:09,908 --> 00:23:11,288 [John] and 00:23:11,288 --> 00:23:21,888 [John] did eval- did evaluations and, and picked something, and, and it was like back in the days where, like, you know, pretty long implementation process and, like, pretty, pretty high, you know, capital investment. 00:23:21,888 --> 00:23:22,608 [Eric] Yep. 00:23:22,608 --> 00:23:28,467 [John] So get through that, and then I start using it. I was one of the people that actually had- 00:23:28,467 --> 00:23:28,478 [Eric] Mm-hmm 00:23:28,478 --> 00:23:31,268 [John] ... to use the software. I'm like: "This is not very good." 00:23:31,268 --> 00:23:31,278 [Eric] Mm. 00:23:31,278 --> 00:23:34,518 [John] And it was back in the days where you, you kind of did demos and you got a flavor for it- 00:23:34,518 --> 00:23:34,518 [Eric] Mm 00:23:34,518 --> 00:23:37,088 [John] ... but, like, you didn't necessarily get to have hands on it till you bought it. 00:23:37,088 --> 00:23:38,028 [Eric] Yep. 00:23:38,028 --> 00:23:40,717 [John] And I was getting that sense, I'm like: Okay, uh, this is kind of buggy. 00:23:40,717 --> 00:23:40,728 [Eric] Mm-hmm. 00:23:40,728 --> 00:23:45,678 [John] Like, you know, you're going down that road, and you're like, "Oh, no, we just spent a lot of money on this," right? 00:23:45,678 --> 00:23:46,208 [Eric] [chuckles] Yeah. 00:23:46,208 --> 00:23:46,628 [John] Then- 00:23:46,628 --> 00:23:47,788 [Eric] There are a lot of zeros there- 00:23:47,788 --> 00:23:48,258 [John] There's a lot of zeros there 00:23:48,258 --> 00:23:49,408 [Eric] ... especially early in your career. [chuckles] 00:23:49,408 --> 00:23:52,608 [John] Yeah, yeah. And I was IC or, like, a team lead or something. 00:23:52,608 --> 00:23:53,368 [Eric] Mm-hmm. 00:23:53,368 --> 00:24:06,237 [John] And then this other piece of software comes out. Cheaper, way better to use. Like, essentially, if you evaluate something on cost, it's better. 00:24:06,237 --> 00:24:06,268 [Eric] Mm-hmm. 00:24:06,268 --> 00:24:09,148 [John] If you evaluate it on end-user experience, it's way better. 00:24:09,148 --> 00:24:09,248 [Eric] Mm-hmm. 00:24:09,248 --> 00:24:12,228 [John] If you'd have evaluated it on developer experience, it's also better. 00:24:12,228 --> 00:24:13,048 [Eric] Mm. 00:24:13,048 --> 00:24:17,937 [John] So, like, all three. But as you can imagine, you just spent a bunch of money on implementation- 00:24:17,937 --> 00:24:17,937 [Eric] Yep 00:24:17,937 --> 00:24:19,748 [John] ... you bought some software, and you're under contract- 00:24:19,748 --> 00:24:20,378 [Eric] Yep 00:24:20,378 --> 00:24:23,888 [John] ... for however many years, maybe it might have been two years at the time. 00:24:23,888 --> 00:24:24,508 [Eric] Wow. 
00:24:24,568 --> 00:24:27,268 [John] That's a tricky situation- 00:24:27,268 --> 00:24:27,628 [Eric] Yes 00:24:27,628 --> 00:24:40,648 [John] ... that I probably should have handled better, but I essentially went down the road of, like: "Look, we made the wrong decision-... like, we should go ahead and, like, use this other tool. 00:24:40,648 --> 00:24:40,947 [Eric] Mm-hmm. 00:24:40,948 --> 00:24:46,708 [John] Because, because it was something that was like, uh, impacted the company's perception with customers and all- 00:24:46,708 --> 00:24:46,738 [Eric] Yep 00:24:46,738 --> 00:24:57,198 [John] ... these things, and, and I hate that we made this decision, like, we just gotta do this. So but for, you know, for my boss and the people above me, like, they're not seeing it that way, right? 00:24:57,198 --> 00:24:57,208 [Eric] Yeah. 00:24:57,208 --> 00:25:00,528 [John] They're seeing, like, "We gotta stand behind our investment. Like, we sure don't wanna be wrong." 00:25:00,528 --> 00:25:01,668 [Eric] That was their map, right? 00:25:01,668 --> 00:25:07,938 [John] Yeah, that was their map. And it was a good learning for me. I definitely should have, like, better considered that. 00:25:07,938 --> 00:25:07,968 [Eric] Yeah. 00:25:07,968 --> 00:25:08,628 [John] You know what I mean? 00:25:08,628 --> 00:25:09,688 [Eric] Yep. Yeah, yeah, totally. 00:25:09,688 --> 00:25:18,358 [John] Um, now, putting myself in their shoes, I still think the right thing should have been to just be honest and be like, "Hey, this is a newer tool. We weren't aware of it." 00:25:18,358 --> 00:25:18,368 [Eric] Right. 00:25:18,368 --> 00:25:18,668 [John] Like... 00:25:18,668 --> 00:25:23,888 [Eric] Because in aggregate, the savings from using a better tool over two years is gonna be way higher, right? 00:25:23,888 --> 00:25:24,008 [John] Yeah. 00:25:24,008 --> 00:25:24,588 [Eric] You just write the other cost off. 00:25:24,588 --> 00:25:27,078 [John] And just the customer experience, and the like- 00:25:27,078 --> 00:25:27,078 [Eric] Yep 00:25:27,078 --> 00:25:29,748 [John] ... and setting us apart as, like, a better- 00:25:29,748 --> 00:25:29,848 [Eric] Right 00:25:29,848 --> 00:25:34,868 [John] ... solution or whatever. Um, or at least it should have been, like, evaluated- 00:25:34,868 --> 00:25:35,068 [Eric] Yep 00:25:35,068 --> 00:25:39,548 [John] ... openly versus, like, "This is what we have. We're not even gonna consider, consider it." 00:25:39,548 --> 00:25:39,817 [Eric] Yeah, yeah. 00:25:39,817 --> 00:25:50,688 [John] So there's two sides to it, but it was a good learning and the perception is reality piece for me, and that, like, we wanna [chuckles] we wanna maintain this perception that we're competent, we made the right decisions- 00:25:50,688 --> 00:25:51,088 [Eric] Mm-hmm 00:25:51,088 --> 00:25:52,298 [John] ... we know what we're doing. 00:25:52,298 --> 00:25:52,307 [Eric] Yep. 00:25:52,308 --> 00:25:54,068 [John] Whereas reality, like, we screwed up. 00:25:54,068 --> 00:25:54,587 [Eric] Yeah, yeah, totally. 00:25:54,588 --> 00:25:55,448 [John] Um, yeah. 00:25:55,448 --> 00:26:02,388 [Eric] Totally. Okay, I have one more, um, I have one more to add here, which is for consider the cartographer. 00:26:02,388 --> 00:26:02,458 [John] Mm-hmm. 00:26:02,458 --> 00:26:03,618 [Eric] So which, which leads right into that- 00:26:03,618 --> 00:26:03,618 [John] Yep 00:26:03,618 --> 00:26:10,428 [Eric] ... 
because I do think a lot of the deviation between map and territory that we see in the context of what you just talked about- 00:26:10,428 --> 00:26:10,578 [John] Mm-hmm 00:26:10,578 --> 00:26:19,248 [Eric] ... is, you know, people will call it sort of the, the, [chuckles] the pejorative term is, like, kingdom building within a company. 00:26:19,248 --> 00:26:19,708 [John] Oh, sure. 00:26:19,708 --> 00:26:20,668 [Eric] Like kingdom building- 00:26:20,668 --> 00:26:20,678 [John] Yeah. 00:26:20,678 --> 00:26:20,888 [Eric] You know? 00:26:20,888 --> 00:26:21,248 [John] Oh, yeah. 00:26:21,248 --> 00:26:22,288 [Eric] Sort of building a fiefdom, right? 00:26:22,288 --> 00:26:22,868 [John] Uh-huh. Yeah. 00:26:22,868 --> 00:26:26,418 [Eric] And so you have a map, and that's just how... You sort of try to force reality into the map, which is- 00:26:26,418 --> 00:26:26,418 [John] Right 00:26:26,418 --> 00:26:28,008 [Eric] ... also, like, maps can influence territories. 00:26:28,008 --> 00:26:28,068 [John] Yeah. 00:26:28,068 --> 00:26:36,028 [Eric] But consider the cartographer. Okay, there was a start-up [clears throat] my friend was telling me about. They were in, you know, a leadership team meeting. 00:26:36,028 --> 00:26:37,208 [John] Mm-hmm. 00:26:37,208 --> 00:26:39,188 [Eric] And 00:26:39,188 --> 00:26:48,748 [Eric] the quick background is that the market had shifted. The market was shifting more towards, you know, sort of after the zero interest rate bubble- 00:26:48,748 --> 00:26:48,757 [John] Yes 00:26:48,757 --> 00:26:58,308 [Eric] ... and the market is shifting much more towards, you know, self-service and, you know, sort of like forcing a lot of companies to go back and reconsider PLG- 00:26:58,308 --> 00:26:58,448 [John] Okay 00:26:58,448 --> 00:27:07,488 [Eric] ... um, you know, type motions as a component of their go-to-market, you know, because, you know, companies weren't flush with cash to spend on software anymore, right? 00:27:07,488 --> 00:27:07,508 [John] Interesting. Okay. 00:27:07,508 --> 00:27:12,738 [Eric] And so being able to charge, like, you know, being able to charge sort of whatever you wanted within reason. 00:27:12,738 --> 00:27:23,548 [John] So, yeah, so okay, so you're saying considering these models of, like, "Oh, we actually ca- [chuckles] we can't afford to pay all these, like, salespeople and account management people. Like, we, we need to, like, better enable self-service," essentially? 00:27:23,548 --> 00:27:35,288 [Eric] Well, I, I think even deeper than that, the, the market was producing a bunch of companies who were building really good products with, like, a lower price, self-serve type motion, right? 00:27:35,288 --> 00:27:35,848 [John] Oh, okay. 00:27:35,848 --> 00:27:37,548 [Eric] And so you're competing more heavily- 00:27:37,548 --> 00:27:37,678 [John] Got it 00:27:37,678 --> 00:27:39,588 [Eric] ... against that, those types of companies- 00:27:39,588 --> 00:27:39,597 [John] Yeah 00:27:39,597 --> 00:27:45,948 [Eric] ... right? Which is really hard when you've built a company, you know, around a sales-led motion. 00:27:45,948 --> 00:27:46,428 [John] Right. 00:27:46,428 --> 00:27:48,157 [Eric] Um, and so that- 00:27:48,157 --> 00:27:49,838 [John] A more, a more people-intensive process, essentially. 00:27:49,838 --> 00:27:50,388 [Eric] Exactly, right? 00:27:50,388 --> 00:27:50,428 [John] Yeah. 00:27:50,428 --> 00:27:51,828 [Eric] And so competition's increasing- 00:27:51,828 --> 00:27:51,948 [John] Yep 00:27:51,948 --> 00:27:52,648 [Eric] ... 
there's pricing pressure, right? 00:27:52,648 --> 00:27:53,208 [John] Yep, got it. 00:27:53,208 --> 00:27:59,868 [Eric] And so that's the territory, and so in this leadership meeting, 00:27:59,868 --> 00:28:21,548 [Eric] the, the CRO, Chief Revenue Officer, um, made... had done a bunch of work and made a very compelling and, like, mathematically accurate case for, you know, the way that we actually break through the next barrier of growth is to increase average contract value. Um, right? 00:28:21,548 --> 00:28:22,868 [John] Okay. 00:28:22,868 --> 00:28:31,518 [Eric] So we need to go from... I mean, I'm just gonna make up numbers here, 'cause I don't know the numbers, but let's just say average contract value is, like, the typical SaaS 20 grand. [chuckles] 00:28:31,518 --> 00:28:33,148 [John] Yep, 20 grand. 00:28:33,148 --> 00:28:39,368 [Eric] Um, we need to go from 20 grand, like, if we can get that up to, like, 50 grand or 60 grand- 00:28:39,368 --> 00:28:39,498 [John] Mm-hmm 00:28:39,498 --> 00:28:44,388 [Eric] ... and we maintain the same close rates, that's like, that's actually the catalyst for what, you know- 00:28:44,388 --> 00:28:44,458 [John] Right 00:28:44,458 --> 00:29:02,428 [Eric] ... what changes the business, which was mathematically true, and apparently, it was th- you know, that... I, I, I believe that that strategy won the day because it was an airtight, you know, mathematical, um, case based on our own data, based- 00:29:02,428 --> 00:29:02,447 [John] Mm 00:29:02,447 --> 00:29:04,168 [Eric] ... you know, based on this person's experience. 00:29:04,168 --> 00:29:05,767 [John] Right. 00:29:05,768 --> 00:29:14,748 [Eric] But the reality on the ground was the market dynamics are changing, and so even if that was true in theory, the pricing pressure in the market wouldn't allow that to happen. 00:29:14,748 --> 00:29:15,388 [John] Oh, sure. 00:29:15,388 --> 00:29:22,708 [Eric] And so it was just really interesting because if you step back and think about it, the cartographer 00:29:22,708 --> 00:29:27,527 [Eric] makes their money through commission on contracts, right? 00:29:27,528 --> 00:29:27,647 [John] Right. Right. 00:29:27,648 --> 00:29:35,178 [Eric] And so... And I don't- I'm not saying that person, you know, th- they were making a decision that I think made sense with all the inputs that they have. 00:29:35,178 --> 00:29:35,208 [John] Right. 00:29:35,208 --> 00:29:36,708 [Eric] Maybe they were denying reality- 00:29:36,708 --> 00:29:36,718 [John] Right 00:29:36,718 --> 00:29:38,608 [Eric] ... of, like, with the, the shifts in the market. 00:29:38,608 --> 00:29:39,548 [John] Right. 00:29:39,548 --> 00:29:46,428 [Eric] But either way, they were- they are incentivized to build a map that optimizes for commission, right? 00:29:46,428 --> 00:29:46,628 [John] Right. 00:29:46,628 --> 00:29:50,197 [Eric] Which means that if ACV goes up, like, you know, it, it sort of benefits them. 00:29:50,197 --> 00:29:50,248 [John] Right. 00:29:50,248 --> 00:29:51,368 [Eric] And, and it was also true- 00:29:51,368 --> 00:29:51,478 [John] Right 00:29:51,478 --> 00:29:53,878 [Eric] ... that that would, you know, could change the financial trajectory of the company. 00:29:53,878 --> 00:29:58,778 [John] Which, yeah, it benefits the company, too. If it were to have happened in that way, then great. 00:29:58,778 --> 00:29:58,788 [Eric] Exactly. 00:29:58,788 --> 00:30:00,168 [John] It's good- it is good for everyone. 00:30:00,168 --> 00:30:01,088 [Eric] Right. 
00:30:01,088 --> 00:30:04,788 [John] But, but as far as the bi- the cartographer bias, like [chuckles] it's- 00:30:04,788 --> 00:30:04,908 [Eric] Yes 00:30:04,908 --> 00:30:05,688 [John] ... pretty strong there. 00:30:05,688 --> 00:30:06,808 [Eric] Yeah, exactly. 00:30:06,808 --> 00:30:10,868 [John] And, and even it's strong for another reason: it is less work to do it that way. 00:30:10,928 --> 00:30:11,647 [Eric] Yes, 100%. 00:30:11,648 --> 00:30:18,188 [John] That's, I think that's the biggest bias here, is you could technically get there two different ways. So say if it's just a number- 00:30:18,188 --> 00:30:18,378 [Eric] Mm-hmm 00:30:18,378 --> 00:30:24,188 [John] ... like, great, we can, we can land 20 large accounts or 100 mid-market accounts or- 00:30:24,188 --> 00:30:24,378 [Eric] Yep 00:30:24,378 --> 00:30:30,528 [John] ... 500 smaller accounts. I mean, small, like, less, [chuckles] less is, is easier. 00:30:30,528 --> 00:30:30,767 [Eric] Sure, sure. 00:30:30,768 --> 00:30:31,808 [John] So there's the other piece. 00:30:31,808 --> 00:30:32,338 [Eric] Absolutely. 00:30:32,338 --> 00:30:32,338 [John] Yeah. 00:30:32,338 --> 00:30:32,348 [Eric] Absolutely. 00:30:32,348 --> 00:30:33,428 [John] Yep. 00:30:33,428 --> 00:30:40,568 [Eric] All righty, well, that has been, uh, The Map Is Not The Territory on mental models, on token intelligence. 00:30:40,568 --> 00:30:41,178 [John] It's a good one. 00:30:41,178 --> 00:30:53,728 [Eric] It's a good one, and we will catch you on the next show. [upbeat music]
