Why can't we find a metaphor for AI?
Stochastic parrot. Intern. Exoskeleton. Every AI metaphor shapes what you build and what you ignore, but the deeper question is why we can’t find a metaphor that fits.
Show Notes
Summary
Eric and John trace five years of AI metaphors: stochastic parrot, blurry JPEG, intern, calculator for words, autonomous agent, digital employee, exoskeleton. Every metaphor suffered from a form of near-sightedness, capturing what the technology felt like in the moment, but missing what it was becoming.
Then they ask the harder question: what happens when a technology is so transformative that no metaphor holds? They pull in horseless carriages, Gilded Age empires, and biblical prophecy to argue that the best frame for AI is no frame at all.
Key takeaways
- Your metaphor is your ceiling: Call it a parrot and you'll use it cautiously. Call it a calculator and you'll use it practically. Your mental model for AI shapes what you believe is possible.
- Count metaphors per year, not features: The fact that we've burned through seven frames in five years is a clear indicator that AI will be more transformative than most people can imagine.
- Expect the best metaphors to break: When a technology is truly transformative, like rail, electricity, and the internet, it stops being described by analogy and starts being described on its own terms.
- Watch the agent economy, not just individual agents: The frontier isn't AI serving humans, it's AI systems interacting with each other, buying, selling, and bidding, which raises hard questions about trust and infrastructure.
- Use metaphors as a design check: Unlike replacement metaphors, the exoskeleton recenters the human. It's a useful test: does this tool amplify skill, or does it just hide the absence of it?
- Study the Gilded Age parallels: Rail, oil, steel, and banking each started as a single focused industry and ended up reshaping everything around them. AI is following the same playbook.
Notable mentions and links
- The book of Ezekiel, Chapter 1, contains a vision of "a wheel within a wheel" — a biblical example of reaching for metaphor when direct language fails to capture something genuinely new.
- "Stochastic parrot" was coined in a 2021 academic paper by Emily Bender, Timnit Gebru, and others, framing large language models as systems that statistically mimic text without real understanding.
- Ted Chiang's 2023 New Yorker essay "ChatGPT Is a Blurry JPEG of the Web" compared language models to lossy compression — you get most of the information, but you'll never get the exact original back.
- The "intern" metaphor (2023), popularized by Wharton's Ethan Mollick, communicated that AI output needs to be checked, reviewed, and supervised — useful framing during the era of hallucination anxiety.
- Simon Willison's "calculator for words" (2023) reframed language models as tools that manipulate language the way calculators manipulate numbers: powerful, but not a search engine replacement.
- The "autonomous agent" metaphor (2024) emerged alongside real-world deployments: Klarna announced its AI had replaced 700 customer service workers, and Eric and John built their own SEO content agent using Google Sheets and the ChatGPT API.
- The "exoskeleton" metaphor (2025–2026) recenters the human: AI augments what you can already do rather than replacing you, but it's only as good as the operator wearing it.
- The TI-83 Plus Silver Edition comes up as a nostalgia touchpoint — John and Eric bond over graphing calculators as their first experience of a machine doing complex operations they couldn't easily do by hand.
- Polymarket is referenced as a platform where autonomous agents could participate in prediction markets, illustrating the agent-to-agent commerce concept.
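For listeners who want a concrete picture of the persona-loop agent Eric and John describe, here is a minimal sketch in today's terms. It is a reconstruction from the conversation, not their original spreadsheet: the persona prompts are hypothetical stand-ins (the originals were purchased), and the model name and function names are assumptions.

```python
# Minimal sketch of the 2024-era persona loop, assuming the modern
# OpenAI Python client. The persona prompts below are illustrative
# placeholders, not the purchased originals from the episode.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Each step hands the previous persona's output to the next one,
# mirroring the Google Sheets loop described in the episode.
PERSONAS = [
    ("researcher", "You are a researcher. List the key facts and angles on the topic."),
    ("content writer", "You are a content writer. Draft an article from these notes."),
    ("SEO expert", "You are an SEO expert. Rewrite the draft to rank for relevant queries."),
]

def run_pipeline(topic: str) -> str:
    text = topic
    for name, system_prompt in PERSONAS:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; the original used an early ChatGPT API model
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": text},
            ],
        )
        text = response.choices[0].message.content
    return text

if __name__ == "__main__":
    print(run_pipeline("How graphing calculators changed high school math"))
```

Note what the loop is not doing: it never decides its own next step. The sequence of personas is fixed in advance, which is exactly the distinction Eric draws in the transcript between an autonomous agent cycling through a preset structure and a digital employee that helps construct the flow itself.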
Transcript
00:00:00,200 --> 00:00:17,940 [Eric] [upbeat music] Welcome back to the Token Intelligence Show. John, I'm gonna start this episode out by asking you 00:00:17,940 --> 00:00:26,960 [Eric] how you explain AI to someone who isn't... Maybe they aren't as familiar with it as you are. 00:00:26,960 --> 00:00:45,400 [John] Okay. So how do I explain AI? I... The, the first thing that comes to mind that I use as far as, like, an analogy, um, would be talk to it as if it were a person. And this analogy quickly breaks down, but if you're treating it like a computer, that's the first thing that I think the- 00:00:45,400 --> 00:00:46,269 [Eric] Oh, interesting 00:00:46,269 --> 00:00:47,000 [John] ... mistake people make. 00:00:47,000 --> 00:00:49,720 [Eric] Okay. W- describe that more. 00:00:49,720 --> 00:01:03,330 [John] Yeah. So, so often, like, I'll get... Somebody will say, "Oh, I use ChatGPT," and, and then, like, I've heard this. I think you and I were in this conversation together. They say, "I use ChatGPT, but I just went back to Google." It's just, you know, it's easier for me- 00:01:03,330 --> 00:01:03,720 [Eric] Very common 00:01:03,720 --> 00:01:04,180 [John] ... to get better- 00:01:04,180 --> 00:01:04,680 [Eric] Very common 00:01:04,680 --> 00:01:05,040 [John] ... better results. 00:01:05,040 --> 00:01:06,300 [Eric] Like, shockingly common. 00:01:06,300 --> 00:01:06,550 [John] Yeah. 00:01:06,550 --> 00:01:07,820 [Eric] But we're very biased. 00:01:07,820 --> 00:01:17,360 [John] Yeah. For, for sure. Um, and then, and then my follow-up question, [laughs] assuming the person, like, was interested, which I think is- you're- we are currently facing- 00:01:17,360 --> 00:01:17,580 [Eric] Right 00:01:17,580 --> 00:01:21,020 [John] ... some, uh... You know, people that just don't wanna, don't wanna use the new technology. 00:01:21,020 --> 00:01:21,100 [Eric] Right. 00:01:21,100 --> 00:01:22,880 [John] But pretend you're interested. 00:01:22,880 --> 00:01:23,360 [Eric] Yep. 00:01:23,360 --> 00:01:24,900 [John] And 00:01:24,900 --> 00:01:32,740 [John] the, the idea is basically do you, um, [tsks] um... Like, how do I get better is the question. 00:01:32,740 --> 00:01:32,840 [Eric] Yeah. 00:01:32,840 --> 00:01:43,180 [John] And, and the thought behind it is you can ask the tool to help you get better. You can help it guide you to ask it, like, the right... like, in a better way. 00:01:43,180 --> 00:01:43,500 [Eric] Yes. 00:01:43,500 --> 00:01:46,200 [John] And that, people struggle with that a lot. 00:01:46,200 --> 00:02:04,160 [Eric] It's very unintuitive because it's one of the, it's one of the characteristics of AI as a technology that I don't think we have experienced before, or if we have, on an extremely limited basis. 00:02:04,160 --> 00:02:04,780 [John] Right. 00:02:04,780 --> 00:02:11,590 [Eric] So for example, a- an example would be you could Google, "How do I get better at Googling?" 00:02:11,590 --> 00:02:11,609 [John] [laughs] Right. 00:02:11,609 --> 00:02:13,740 [Eric] And there would be information and articles- 00:02:13,740 --> 00:02:13,750 [John] Right 00:02:13,750 --> 00:02:14,920 [Eric] ... that you could go read. 00:02:14,920 --> 00:02:15,600 [John] Yep. 00:02:15,600 --> 00:02:25,690 [Eric] But the direct nature of asking AI how to wield AI is, especially in the form factor, is unprecedented- 00:02:25,690 --> 00:02:25,690 [John] Right 00:02:25,690 --> 00:02:27,480 [Eric] ... in how helpful it is. 00:02:27,480 --> 00:02:33,579 [John] Right. It... And it's... 
And there's two levels because one, I'd started this off with, like, talk to it like a person. 00:02:33,580 --> 00:02:34,240 [Eric] Mm-hmm. 00:02:34,240 --> 00:02:35,900 [John] That helps on one level- 00:02:35,900 --> 00:02:35,950 [Eric] Yep 00:02:35,950 --> 00:02:40,620 [John] ... to be more fluid, give more context, like, not treat it like a query in Google. 00:02:40,620 --> 00:02:41,510 [Eric] Mm-hmm. 00:02:41,510 --> 00:02:52,690 [John] But I'm, I'm about to contradict myself with what I just said of, like, there's this non-person aspect, which is also weird, which is the meta, um, h- how could I have asked you this better? Or- 00:02:52,690 --> 00:02:53,030 [Eric] Right 00:02:53,030 --> 00:02:55,440 [John] ... if I need to do this again, what should I ask you? 00:02:55,440 --> 00:02:55,640 [Eric] Right. 00:02:55,640 --> 00:02:56,400 [John] Things like that- 00:02:56,400 --> 00:02:56,489 [Eric] Right 00:02:56,489 --> 00:02:59,720 [John] ... which are really weird, and you wouldn't really probably do that with a person. 00:02:59,720 --> 00:03:15,040 [Eric] Right. Well, what's interesting to me about what you're describing is that it's... It is a good metaphor. Talk to it like you would talk to a person, you know, or someone you work with. Ask for, you know- 00:03:15,040 --> 00:03:15,560 [John] Right. 00:03:15,560 --> 00:03:21,680 [Eric] That, that's a good way to describe it for someone to have more productive interactions- 00:03:21,680 --> 00:03:21,690 [John] Mm 00:03:21,690 --> 00:03:23,690 [Eric] ... with AI and, you know, make it a- 00:03:23,690 --> 00:03:23,690 [John] Yeah 00:03:23,690 --> 00:03:38,320 [Eric] ... more useful tool for them to use. But what's interesting is you even contradicted yourself because that is a very narrow, um... That's a very narrow view of all of the different things that AI can do, all of the things that it can be used for- 00:03:38,320 --> 00:03:38,420 [John] Right 00:03:38,420 --> 00:03:42,440 [Eric] ... all the different ways that you can interact with it, all the different ways that it behaves. 00:03:42,440 --> 00:03:42,940 [John] Yeah. 00:03:42,940 --> 00:03:51,579 [Eric] Right? Because if you think about agents, you know, especially if you think about Claude Code, it can now release 00:03:51,580 --> 00:03:56,460 [Eric] multiple agents. So you're talking with an agent that is coordinating multiple other agents. 00:03:56,460 --> 00:03:57,260 [John] Right. 00:03:57,260 --> 00:03:57,970 [Eric] And- 00:03:57,970 --> 00:04:00,400 [John] 'Cause mine assumes, like, one-to-one and, like, a- 00:04:00,400 --> 00:04:00,600 [Eric] Right. 00:04:00,600 --> 00:04:03,840 [John] There's a bunch of assumptions behind the thing, the analogy I just used. 00:04:03,840 --> 00:04:04,160 [Eric] Right. 00:04:04,160 --> 00:04:05,340 [John] Yeah. 00:04:05,340 --> 00:04:16,180 [Eric] What, what intrigued me about this is hearing different people describe AI. Actually, a better way to say that would be hearing different people struggle to describe AI. 00:04:16,180 --> 00:04:16,440 [John] Right. 00:04:16,440 --> 00:04:38,659 [Eric] So I was skiing with the kids a couple weeks ago, and you know, you're on the ski lift or, you know, you're getting off the ski lift, and people are standing around and talking, and several times I heard people describing to someone else how they're using AI or how it's changing, and what really struck me was that it's pretty difficult to describe it. And- 00:04:38,660 --> 00:04:38,670 [John] Right 00:04:38,670 --> 00:04:41,800 [Eric] ... 
the metaphors that people use are really imperfect. 00:04:41,800 --> 00:04:42,120 [John] Right. 00:04:42,120 --> 00:04:47,260 [Eric] So I actually, of course, because this is [laughs] this is who I am, 00:04:48,320 --> 00:04:51,320 [Eric] I thought about this a lot, and I actually did a bunch of research. 00:04:51,320 --> 00:04:51,980 [John] Nice. 00:04:51,980 --> 00:04:58,000 [Eric] So this is going to be a little bit of a quiz for you on the history of AI metaphors. 00:04:58,000 --> 00:04:58,980 [John] Awesome. Can't wait. 00:04:58,980 --> 00:05:05,720 [Eric] But I thought... Okay. The... But before we get there, what are other things that are hard to describe? 00:05:05,720 --> 00:05:07,120 [John] About AI? 00:05:07,120 --> 00:05:07,880 [Eric] No, in general. 00:05:07,880 --> 00:05:08,660 [John] Oh, yeah. 00:05:08,660 --> 00:05:23,940 [Eric] Like, things that are difficult. 'Cause a couple things popped into my head, right? But you're, you are very good generally at taking even complex things and, like, trying to break them down and explain them. So other than AI, what's something recently that you struggled to, like, articulate? 00:05:23,940 --> 00:05:33,460 [John] Yeah. So that's so funny because, um, I've got a friend that, that's doing, um, some, like, AI website agency work and- 00:05:33,460 --> 00:05:33,550 [Eric] Yep 00:05:33,550 --> 00:05:50,099 [John] ... and I was talking to him [laughs] about it, and I was like, "What, what a blessing to, to be able to go home or, you know, or have dinner with your family on a weekend and tell your mom, like..." Your mom's like, "Oh, how's work going?" And, and to be... or her to be able to articulate that you make websites. 00:05:50,100 --> 00:05:51,719 [Eric] [laughs] So true. 00:05:51,720 --> 00:05:53,330 [John] So that's one r- 00:05:53,330 --> 00:05:53,330 [Eric] Yeah 00:05:53,330 --> 00:05:58,520 [John] ... but my version of it, like, it's like, "What do you do?" Like, what do I do- 00:05:58,520 --> 00:05:58,610 [Eric] Mm-hmm 00:05:58,610 --> 00:05:59,980 [John] ... you know, in, in my day job? 00:05:59,980 --> 00:06:01,180 [Eric] Mm-hmm. 00:06:01,180 --> 00:06:02,880 [John] Extremely [laughs] hard to articulate. 00:06:02,880 --> 00:06:03,440 [Eric] Mm. 00:06:03,440 --> 00:06:03,880 [John] Um- 00:06:03,880 --> 00:06:04,200 [Eric] Interesting 00:06:04,200 --> 00:06:14,888 [John] ... in that, like- Like in working with data, so the... I don't even know if analogy's the right word here, but I explain what I do to non-technical people usually in terms of spreadsheets- 00:06:14,888 --> 00:06:15,088 [Eric] Mm. 00:06:15,088 --> 00:06:16,867 [John] Because that's like their interface for data. 00:06:16,868 --> 00:06:17,648 [Eric] Yeah. 00:06:17,648 --> 00:06:29,928 [John] So imagine a spreadsheet, and i- it was, you know, more data than can fit in one spreadsheet, and then, you know, and then I kinda go from there, like unpacking what, what work with databases and models and- 00:06:29,928 --> 00:06:30,088 [Eric] Yeah 00:06:30,088 --> 00:06:31,068 [John] ... architecture looks like. 00:06:31,068 --> 00:06:31,648 [Eric] Yeah, yeah, yeah. 00:06:31,648 --> 00:06:39,548 [John] So I think that's my best, my best example that's, that's top of mind would be that spreadsheet analogy for data infrastructure. 00:06:39,548 --> 00:06:41,588 [Eric] Yeah. 00:06:41,588 --> 00:06:50,108 [Eric] Uh, for me, I'll tell you what comes to mind, which this is [laughs] this is totally random, but I'm excited.
00:06:50,108 --> 00:07:12,228 [Eric] We were driving somewhere. We were driving down the highway, and it was sort, a sort of rural, the... It was a big interstate, but the family's driving and, and it sort of passed through like a rural area where these, these big rolling hills, and you can kind of see. You know, it's farmland and just, you know, open land. And 00:07:12,228 --> 00:07:19,828 [Eric] my daughter, who was five or six at the time, uh, 00:07:19,888 --> 00:07:23,428 [Eric] asks, "Hey, Dad, where do hills come from?" 00:07:23,428 --> 00:07:24,528 [John] [laughs] 00:07:24,528 --> 00:07:25,378 [Eric] Where do hills- 00:07:25,378 --> 00:07:25,378 [John] Yeah 00:07:25,378 --> 00:07:26,148 [Eric] ... come from? 00:07:26,148 --> 00:07:26,408 [John] Sure. 00:07:26,408 --> 00:07:31,578 [Eric] Right? And it was really difficult to answer 'cause it's like- 00:07:31,578 --> 00:07:31,578 [John] Right 00:07:31,578 --> 00:07:32,828 [Eric] ... well, 00:07:32,828 --> 00:07:36,088 [Eric] there are so many fundamental things- 00:07:36,088 --> 00:07:36,628 [John] Right 00:07:36,628 --> 00:07:42,828 [Eric] ... you know, that you need to explain in order to, you know, p- like 00:07:42,828 --> 00:07:44,588 [Eric] [laughs] there's just a lot there- 00:07:44,588 --> 00:07:44,598 [John] Yeah 00:07:44,598 --> 00:07:54,848 [Eric] ... right? Like, I, I... And it changes over... Like land changes over time. Like, there's so many things. You know, you can, uh, actually move land around with like big machinery. There's just like all sorts of things, right? 00:07:54,848 --> 00:07:55,708 [John] Yeah. God made the hills. 00:07:55,708 --> 00:07:56,988 [Eric] Yeah. God made the hills, right? 00:07:56,988 --> 00:07:59,128 [John] [laughs] But yeah. Yeah. 00:07:59,128 --> 00:08:00,508 [Eric] And I s- I sort of- 00:08:00,508 --> 00:08:02,778 [John] Or the tectonic plates. You could get into that. I mean [laughs] 00:08:02,778 --> 00:08:04,008 [Eric] I did. Well, I actually did. 00:08:04,008 --> 00:08:04,538 [John] Of course you did. 00:08:04,538 --> 00:08:14,798 [Eric] But my, but my kids, like, definitely, you know, my kids are now used to... The older two are used to, like if they ask a question like that, I'm probably gonna try- 00:08:14,798 --> 00:08:14,868 [John] [laughs] Right, right, right 00:08:14,868 --> 00:08:15,848 [Eric] ... to cover the bases. 00:08:15,848 --> 00:08:16,187 [John] Okay. 00:08:16,188 --> 00:08:16,428 [Eric] Right? 00:08:16,428 --> 00:08:17,168 [John] Sure. Right. 00:08:17,168 --> 00:08:20,418 [Eric] [laughs] From God to tectonic plates, right? 00:08:20,418 --> 00:08:20,448 [John] Okay, got it. 00:08:20,448 --> 00:08:21,718 [Eric] And I definitely covered that, you know? 00:08:21,718 --> 00:08:22,408 [John] You got the whole... 00:08:22,408 --> 00:08:22,728 [Eric] Yeah. 00:08:22,728 --> 00:08:22,928 [John] Yeah. 00:08:22,928 --> 00:08:23,328 [Eric] Exactly. 00:08:23,328 --> 00:08:23,768 [John] Awesome. 00:08:23,768 --> 00:08:25,918 [Eric] It's actually... It's a good exercise in doing that. 00:08:25,918 --> 00:08:26,988 [John] [laughs] Right. 00:08:26,988 --> 00:08:32,168 [Eric] Uh, the funniest part though was after I got done with my explanation and I thought, "That was a pretty good- 00:08:32,168 --> 00:08:32,248 [John] Mm-hmm 00:08:32,248 --> 00:08:35,288 [Eric] ... explanation." You know? Like I feel like for a five, six-year-old- 00:08:35,288 --> 00:08:35,708 [John] Right 00:08:35,708 --> 00:08:37,097 [Eric] ... 
they have a good handle on that, and they- 00:08:37,097 --> 00:08:37,218 [John] Yeah 00:08:37,218 --> 00:08:43,168 [Eric] ... picked up some knowledge that will be useful later on about geology and blah. 00:08:43,168 --> 00:08:44,828 [Eric] And 00:08:44,828 --> 00:08:49,248 [Eric] my daughter thought for a second and she said, "What are hills even for?" 00:08:49,248 --> 00:08:49,408 [John] [laughs] 00:08:49,408 --> 00:08:53,168 [Eric] No, no, no, no. She said, "What do hills even do?" [laughs] 00:08:53,168 --> 00:08:54,928 [John] [laughs] 00:08:54,928 --> 00:08:56,148 [Eric] What do hills even do? 00:08:56,148 --> 00:08:56,438 [John] Awesome. 00:08:56,438 --> 00:08:58,148 [Eric] It's like, okay, great. 00:08:58,148 --> 00:08:58,328 [John] [laughs] 00:08:58,328 --> 00:09:00,948 [Eric] I'm glad that... I'm glad I taught you something. Um- 00:09:00,948 --> 00:09:03,288 [John] Right. 00:09:03,288 --> 00:09:05,328 [Eric] The... But 00:09:05,328 --> 00:09:09,208 [Eric] I feel like that when people ask me about AI. 00:09:09,208 --> 00:09:09,338 [John] Yeah. 00:09:09,338 --> 00:09:12,127 [Eric] And I keep reaching for metaphors, and I can't find it. 00:09:12,128 --> 00:09:12,468 [John] Right. 00:09:12,468 --> 00:09:18,688 [Eric] The other example actually, you mentioned that God made the hills, but the other example that came to mind was biblical prophecy. 00:09:18,688 --> 00:09:19,548 [John] Okay. 00:09:19,548 --> 00:09:29,418 [Eric] So, you know, I mean, there's a, you know... And this is like, there's a lot of even like pop culture, you know, references to like the Book of Revelation- 00:09:29,418 --> 00:09:29,548 [John] Mm-hmm 00:09:29,548 --> 00:09:32,108 [Eric] ... and the beast and the end times and dystopian- 00:09:32,108 --> 00:09:32,118 [John] Yeah 00:09:32,118 --> 00:09:35,428 [Eric] ... future and all that sort of stuff. Um, 00:09:35,428 --> 00:09:40,298 [Eric] but there is some wild stuff, and I actually pulled a quote that I wanna share- 00:09:40,298 --> 00:09:40,458 [John] Awesome 00:09:40,458 --> 00:09:44,938 [Eric] ... with you. Okay, you ready for this? So, I... This is what we should do. I'm gonna read this, and then I- 00:09:44,938 --> 00:09:45,008 [John] Okay. 00:09:45,008 --> 00:09:47,568 [Eric] You try to describe to me what you think- 00:09:47,568 --> 00:09:48,468 [John] [laughs] Okay 00:09:48,468 --> 00:09:50,018 [Eric] ... this is describing. You ready? 00:09:50,018 --> 00:09:53,228 [John] Awesome. 00:09:53,228 --> 00:10:30,368 [Eric] This is a vision that Ezekiel is having. "Now as I looked at the living creatures, I saw a wheel on the earth beside the living creatures, one for each of the four of them. As for the appearance of the wheels and their construction, their appearance was like the gleaming of beryl. And the four had the same likeness, their appearance and construction, being as it were, a wheel within a wheel. When they went, they went in any of their four directions without turning as they went. And their rims were tall and awesome, and the rims of all four were full of eyes all around." So give me a- 00:10:30,368 --> 00:10:30,908 [John] [laughs] 00:10:30,908 --> 00:10:31,508 [Eric] Just- 00:10:31,508 --> 00:10:31,728 [John] Yeah 00:10:31,728 --> 00:10:32,158 [Eric] ... break that- 00:10:32,158 --> 00:10:32,587 [John] Painting, painting a picture for me? 00:10:32,588 --> 00:10:34,847 [Eric] Break that down for me, John. 00:10:34,848 --> 00:10:57,428 [John] Oh, man. The thing that comes to mind, um... Two things that come to mind. 
Did you ever have one of those, those spinning tops with the lights on it that you could like pull the... Uh, uh, it w- there's a gear inter- internal gear, and then like a, a plastic piece that you could pull, and it would like hit all the, the points in the gear and then make it spin, and there'd be lights. It was- 00:10:57,428 --> 00:10:58,087 [Eric] Oh, yeah. 00:10:58,088 --> 00:10:58,948 [John] I don't know what they're called. 00:10:58,948 --> 00:10:58,988 [Eric] Yeah. 00:10:58,988 --> 00:11:01,288 [John] It's not just a standard top. It's like a high-tech top. 00:11:01,288 --> 00:11:08,268 [Eric] Yeah, I know what you're talking about. Yeah. The, the mechanism inside would actually rotate, and then it would rotate the entire thing. 00:11:08,268 --> 00:11:09,148 [John] Exactly. 00:11:09,148 --> 00:11:09,638 [Eric] You know, the outer- 00:11:09,638 --> 00:11:09,828 [John] Right 00:11:09,828 --> 00:11:10,358 [Eric] ... shell of the ball. 00:11:10,358 --> 00:11:12,888 [John] And there's typical maybe like noise and lights and, and stuff. 00:11:12,888 --> 00:11:14,288 [Eric] It was gyroscopic. 00:11:14,288 --> 00:11:16,528 [John] Yeah, exactly. So something w- like that- 00:11:16,528 --> 00:11:16,898 [Eric] Okay 00:11:16,898 --> 00:11:25,027 [John] ... combined, but like that's on a really small scale. So let's like scale it up, and now I'm thinking like astronaut training where they have like the- 00:11:25,028 --> 00:11:25,438 [Eric] Oh, yeah. Yeah 00:11:25,438 --> 00:11:26,848 [John] ... the circular thing that goes- 00:11:26,848 --> 00:11:26,958 [Eric] Sure. Yeah, yeah, yeah, yeah 00:11:26,958 --> 00:11:33,178 [John] ... in all the different directions. But then there's like a, like a outer piece that, that reminds you of like the- 00:11:33,178 --> 00:11:33,178 [Eric] Yeah 00:11:33,178 --> 00:11:34,068 [John] ... the top thing. 00:11:34,068 --> 00:11:34,267 [Eric] Yeah. 00:11:34,268 --> 00:11:37,168 [John] That's, that's what I got for you. [laughs] 00:11:37,168 --> 00:11:37,668 [Eric] Well done. 00:11:37,668 --> 00:11:41,278 [John] Which, which I think is a UFO, is what I just described. [laughs] 00:11:41,278 --> 00:11:43,568 [Eric] Well, I think p- I think there is like a- 00:11:43,568 --> 00:11:44,408 [John] There's a whole thing. 00:11:44,408 --> 00:11:45,428 [Eric] There's a whole thing about that. 00:11:45,428 --> 00:11:45,968 [John] Around that. 00:11:45,968 --> 00:11:46,598 [Eric] Yeah, yeah. 00:11:46,598 --> 00:11:46,648 [John] Yeah. 00:11:46,648 --> 00:11:51,228 [Eric] Okay. All right. So are you ready for the history? 00:11:51,228 --> 00:12:00,568 [John] Yeah. Well, so there's a history lesson of AI metaphors starting from kind of the, the beginning of at least generative AI to present. Is that, is that kinda what we're doing? 00:12:00,568 --> 00:12:01,588 [Eric] That's what we're gonna do. 00:12:01,588 --> 00:12:01,968 [John] Awesome. 00:12:01,968 --> 00:12:02,488 [Eric] You ready? 00:12:02,488 --> 00:12:02,788 [John] Yep. 00:12:02,788 --> 00:12:11,060 [Eric] Okay. Guess which year the first one shows up? Okay. I, I used AI to, like, do a bunch of research on this- 00:12:11,060 --> 00:12:11,070 [John] Mm-hmm 00:12:11,070 --> 00:12:13,580 [Eric] ... and pull references and try to put together a rough timeline. 00:12:13,580 --> 00:12:13,600 [John] Right. 00:12:13,600 --> 00:12:14,720 [Eric] This isn't perfect, but- 00:12:14,720 --> 00:12:15,060 [John] Right 00:12:15,060 --> 00:12:22,140 [Eric] ... in terms of sort of the major, uh, cited references, this is what shows up.
00:12:22,140 --> 00:12:30,179 [John] Okay. So if my re-recollection is correct, I think ChatGPT 3.5 is around 2021. Is that about right? 00:12:30,180 --> 00:12:31,100 [Eric] Yes, yes. 00:12:31,100 --> 00:12:31,620 [John] Okay. 00:12:31,620 --> 00:12:31,780 [Eric] Yeah. 00:12:31,780 --> 00:12:32,040 [John] So I would imagine- 00:12:32,040 --> 00:12:33,580 [Eric] 2021 is where we start. 00:12:33,580 --> 00:12:34,519 [John] Okay. Nice. 00:12:34,520 --> 00:12:40,540 [Eric] That is where we start. Okay, good job. This is... This'll be interesting 'cause 00:12:40,540 --> 00:12:49,800 [Eric] I have seen this term pop up occasionally, but don't hear it a ton. It, uh... the first metaphor is stochastic parrot. 00:12:50,940 --> 00:12:52,880 [Eric] [laughs] Oh, man. 00:12:52,880 --> 00:12:54,740 [Eric] That's a... It sounds very academic. 00:12:54,740 --> 00:12:55,880 [John] Yeah. 00:12:55,880 --> 00:12:59,140 [Eric] It sounds... And which actually it came from a paper. Um- 00:12:59,140 --> 00:13:00,300 [John] Okay 00:13:00,300 --> 00:13:10,840 [Eric] ... but the idea was that LLMs, you know, basically use statistics or an algorithm to regurgitate information- 00:13:10,840 --> 00:13:11,060 [John] Right 00:13:11,060 --> 00:13:15,280 [Eric] ... you know, back to the user. Um, so stochastic parrot. 00:13:15,280 --> 00:13:23,920 [John] I think I missed that one. I, I may have heard it, like, before you mentioned it now. Like, you know, maybe I heard it once or twice, but that... I th- I think I might have missed that. 00:13:23,920 --> 00:13:24,790 [Eric] Yeah. [laughs] 00:13:24,790 --> 00:13:25,620 [John] [laughs] 00:13:25,620 --> 00:13:31,690 [Eric] Uh, uh, my guess is that it didn't really, it didn't really [laughs] get wide distribution because- 00:13:31,690 --> 00:13:31,690 [John] No 00:13:31,690 --> 00:13:33,160 [Eric] ... it's kind of abstract. 00:13:33,160 --> 00:13:33,720 [John] Yep. 00:13:33,720 --> 00:13:33,740 [Eric] Uh- 00:13:33,740 --> 00:13:36,230 [John] Like, that didn't come up at any family dinners at- 00:13:36,230 --> 00:13:36,240 [Eric] [laughs] 00:13:36,240 --> 00:13:40,180 [John] ... Thanksgiving in 2021. The stochastic parrots were not part of the- 00:13:40,220 --> 00:13:40,230 [Eric] [laughs] 00:13:40,230 --> 00:13:41,820 [John] ... the menu. 00:13:41,820 --> 00:13:44,700 [Eric] [laughs] It's kind of a good dig though. 00:13:44,700 --> 00:13:45,880 [John] [laughs] Yeah. 00:13:45,880 --> 00:13:45,920 [Eric] Um- 00:13:45,920 --> 00:13:49,180 [John] It's almost an insult to the technology, right? Like, like- 00:13:49,180 --> 00:13:49,860 [Eric] It is 00:13:49,860 --> 00:13:50,880 [John] ... like a- 00:13:50,880 --> 00:13:51,170 [Eric] Well, like- 00:13:51,170 --> 00:14:00,540 [John] Which is, which is interesting coming from an academic p-paper because obviously all the generative AI stuff c-come, came out of labs and academia. That's where it came from. 00:14:00,540 --> 00:14:01,660 [Eric] Yeah. That's interesting. 00:14:01,660 --> 00:14:02,000 [John] So I don't know. 00:14:02,000 --> 00:14:05,479 [Eric] Well, we'll link to it in the show notes, and maybe we're misinterpreting it. 00:14:05,480 --> 00:14:06,120 [John] Yeah, maybe. 00:14:06,120 --> 00:14:13,040 [Eric] But interesting that you say it's, uh, you know, it's a negative view on the technology itself because the- 00:14:13,040 --> 00:14:16,120 [John] Which maybe it wasn't at the time. It definitely would be now. 00:14:16,120 --> 00:14:22,580 [Eric] It definitely would be now, but this... 
what's interesting is the next major one that came up is 2023. 00:14:22,580 --> 00:14:23,439 [John] Okay. 00:14:23,440 --> 00:14:27,440 [Eric] And someone called AI a blurry JPEG. 00:14:27,440 --> 00:14:28,840 [John] [laughs] Okay. 00:14:28,840 --> 00:14:29,560 [Eric] Um- 00:14:29,560 --> 00:14:31,520 [John] So who called it a blurry JPEG? 00:14:31,520 --> 00:14:36,180 [Eric] Uh, Ted Chiang, and this was in The New Yorker. 00:14:36,180 --> 00:14:37,060 [John] Okay. 00:14:37,060 --> 00:14:41,100 [Eric] And it initially had surprised me. I totally missed this one. I did not- 00:14:41,100 --> 00:14:42,000 [John] Yeah. I did not 00:14:42,000 --> 00:14:42,780 [Eric] ... I did not see this. 00:14:42,780 --> 00:14:43,780 [John] That did not cross my- 00:14:43,780 --> 00:14:44,100 [Eric] Uh 00:14:44,100 --> 00:14:44,660 [John] ... radar. 00:14:44,660 --> 00:14:58,820 [Eric] But I actually think it's a better articulation. You know, we're, we're sort of... The, the technology has advanced now, but I think it's a better articulation, or at least a more concrete way to explain the stochas- the stochastic parrot- 00:14:58,820 --> 00:14:59,340 [John] Okay 00:14:59,340 --> 00:15:06,560 [Eric] ... um, analogy. But essentially it says it's a lossy compression of web knowledge. 00:15:06,560 --> 00:15:07,460 [John] Okay. 00:15:07,460 --> 00:15:17,400 [Eric] Right? So AI can retain patterns, and it can, you know, form the structure of an image, but it's a blurry JPEG. 00:15:17,400 --> 00:15:24,570 [John] Yes. I think around the same time, I didn't hear that one, but I did hear the geographic map a lot. 00:15:24,570 --> 00:15:24,620 [Eric] Yeah. 00:15:24,620 --> 00:15:29,080 [John] I think around the same time, and it was usually in context for prompting. 00:15:29,080 --> 00:15:29,560 [Eric] Mm-hmm. 00:15:29,560 --> 00:15:34,650 [John] Whereas if you were bad at prompting, you were gonna get all these points around like- 00:15:34,650 --> 00:15:34,940 [Eric] Oh, interesting 00:15:34,940 --> 00:15:41,380 [John] ... maybe a s- uh, a state. Like, you could get somewhere near California or South Carolina or Texas. 00:15:41,380 --> 00:15:42,280 [Eric] Mm-hmm. 00:15:42,280 --> 00:15:48,260 [John] But you're bad at prompting. And if you get good at prompting, you can get to the zip code level or the city level. 00:15:48,260 --> 00:15:48,460 [Eric] Right. 00:15:48,460 --> 00:15:53,210 [John] And that was the idea of like, oh, you gotta really nail prompting. Like, this is the core skill set. 00:15:53,210 --> 00:15:53,240 [Eric] Yeah, yeah. Totally. 00:15:53,240 --> 00:15:55,460 [John] You gotta get good at it so you can really- 00:15:55,460 --> 00:15:55,670 [Eric] Yeah 00:15:55,670 --> 00:15:58,360 [John] ... hone in your, um, your points on your map. 00:15:58,360 --> 00:15:58,840 [Eric] Right. 00:15:58,840 --> 00:16:00,240 [John] Which I haven't heard that much anymore. 00:16:00,240 --> 00:16:07,740 [Eric] Yeah. That is really interesting though. Uh, which is... I, I mean, that, that was a... that's a great, that's a great metaphor- 00:16:07,740 --> 00:16:08,000 [John] Yeah 00:16:08,000 --> 00:16:08,240 [Eric] ... I think. 00:16:08,240 --> 00:16:08,780 [John] Yeah. 00:16:08,780 --> 00:16:09,000 [Eric] Um- 00:16:09,000 --> 00:16:13,400 [John] And it's still somewhat true, but, but in some ways it's less relevant than- 00:16:13,400 --> 00:16:13,409 [Eric] Right 00:16:13,409 --> 00:16:14,220 [John] ... it used to be. 00:16:14,220 --> 00:16:29,220 [Eric] Right. Totally. Uh, okay. The next one's a big one.
So this is the same year as the blurry JPEG. Uh, and I should have looked at the actual months on these, but this is interesting that these two- 00:16:29,220 --> 00:16:29,700 [John] Yeah 00:16:29,760 --> 00:16:41,280 [Eric] ... uh, that these two metaphors emerge in the same year. Uh, the next one is 2023 again, um, but it's when people started calling AI an intern. 00:16:41,280 --> 00:16:43,560 [John] Yes. Yeah, I remember that. 00:16:43,560 --> 00:16:48,080 [Eric] Right? And so that is, that is sort of the first personification- 00:16:48,080 --> 00:16:49,200 [John] Mm-hmm 00:16:49,200 --> 00:16:54,820 [Eric] ... which is really interesting. And I, I, I remember this very clearly at work 00:16:54,820 --> 00:16:59,500 [Eric] because, you know, the... I was in the, um, product organization. 00:16:59,500 --> 00:16:59,980 [John] Mm-hmm. 00:16:59,980 --> 00:17:10,960 [Eric] And so working, you know, day to day with engineers, and everyone's trying to evaluate how do we place this? How do we use this? What's our mental model for this technology? 00:17:10,960 --> 00:17:11,400 [John] Right. 00:17:11,400 --> 00:17:18,940 [Eric] And intern was pretty apt, right? Where it's like, okay, this can create leverage, but you have to manage it very- 00:17:18,940 --> 00:17:18,950 [John] Yeah 00:17:18,950 --> 00:17:20,220 [Eric] ... very closely. 00:17:20,220 --> 00:17:31,190 [John] The intern, I think the two components implicit in that were, one, the management that you just mentioned, and then two, the, um, fairly constant redirection of like- 00:17:31,190 --> 00:17:31,720 [Eric] Yes 00:17:31,720 --> 00:17:33,520 [John] ... "Don't do this, do this." 00:17:33,520 --> 00:17:33,960 [Eric] Right. 00:17:33,960 --> 00:17:42,260 [John] Like, that's a constant piece, and then the constant piece of, like, managing a lot around it so it can be productive versus it self-managing. 00:17:42,260 --> 00:17:53,500 [Eric] Totally. Okay. Now this is where things get, uh, interesting because it starts, at this point, it starts to flip-flop between human and non-human. 00:17:53,500 --> 00:17:56,000 [John] Okay. So, like, we just had human and intern. 00:17:56,000 --> 00:17:56,119 [Eric] Right. 00:17:56,120 --> 00:17:58,000 [John] And then we're gonna flip to a non-human one. 00:17:58,000 --> 00:18:08,480 [Eric] Yep. Which just to put this in perspective, this is fascinating. So, you know, we'll say 2020, 2021 is sort of when 00:18:08,480 --> 00:18:20,323 [Eric] the, um, let's say practical viability of this technology starts to become more widespread. 00:18:20,324 --> 00:18:20,964 [John] Okay. 00:18:20,964 --> 00:18:23,764 [Eric] Which is when the metaphors really start to show up, right? 00:18:23,764 --> 00:18:32,114 [John] Yes. And, and maybe we just practically mean by that the ChatGPT app on the iPhone app store is in the top five or 10 or something. 00:18:32,114 --> 00:18:36,344 [Eric] Sure. And people are using the tool to do meaningful things- 00:18:36,344 --> 00:18:36,924 [John] Mm-hmm. Yeah 00:18:36,924 --> 00:18:37,784 [Eric] ... you know, let's say in their- 00:18:37,784 --> 00:18:37,894 [John] Right 00:18:37,894 --> 00:18:38,354 [Eric] ... day-to-day work- 00:18:38,354 --> 00:18:38,354 [John] Yeah 00:18:38,354 --> 00:18:45,164 [Eric] ... as an example, right? In two years, it goes from parrot to human. That's wild. 00:18:45,164 --> 00:18:45,174 [John] Okay. 00:18:45,174 --> 00:18:45,564 [Eric] Right? 00:18:45,564 --> 00:18:46,964 [John] True. Yeah.
00:18:46,964 --> 00:18:48,864 [Eric] That to me, that was really- 00:18:48,864 --> 00:18:48,974 [John] Stochastic- 00:18:48,974 --> 00:18:49,764 [Eric] ... that really struck me 00:18:49,764 --> 00:18:51,004 [John] ... parrot to intern. 00:18:51,004 --> 00:18:51,984 [Eric] To intern. 00:18:51,984 --> 00:18:52,024 [John] Yeah. 00:18:52,024 --> 00:18:54,504 [Eric] That's a l- that's a big, [laughs] that's a- 00:18:54,564 --> 00:18:54,574 [John] Yeah, yeah 00:18:54,574 --> 00:19:11,764 [Eric] ... huge step, right? Um, Simon Willison, who is, thinks and writes a lot about AI, you know, wonderful thinker and author, he called AI, this is also in 2023, a calculator for words. 00:19:11,764 --> 00:19:13,344 [John] Okay. 00:19:13,344 --> 00:19:16,584 [Eric] So first of all, which... Did you have a graphing calculator? 00:19:16,584 --> 00:19:17,764 [John] TI-83 Plus. 00:19:17,764 --> 00:19:18,544 [Eric] Plus. 00:19:18,544 --> 00:19:19,244 [John] Silver. 00:19:19,244 --> 00:19:19,284 [Eric] Okay. 00:19:19,284 --> 00:19:21,494 [John] Silver edition. Or maybe it was just colored. 00:19:21,494 --> 00:19:21,524 [Eric] Silver? 00:19:21,524 --> 00:19:22,564 [John] Yeah, it was silver. 00:19:22,564 --> 00:19:22,674 [Eric] Oh. 00:19:22,674 --> 00:19:24,254 [John] It was, like, semi-transparent. 00:19:24,254 --> 00:19:24,284 [Eric] Man. 00:19:24,284 --> 00:19:25,654 [John] You could, like, see the circuit board through it. 00:19:25,654 --> 00:19:26,224 [Eric] I remember that. 00:19:26,224 --> 00:19:26,804 [John] Yeah. 00:19:26,804 --> 00:19:28,654 [Eric] Yeah. That was actually kind of- 00:19:28,654 --> 00:19:28,854 [John] It was kind of a big deal 00:19:28,854 --> 00:19:31,044 [Eric] ... a statement if you had a colored- 00:19:31,044 --> 00:19:31,054 [John] [laughs] 00:19:31,054 --> 00:19:32,304 [Eric] 'Cause I just had the black one. 00:19:32,304 --> 00:19:33,524 [John] Ugh, okay. 00:19:33,524 --> 00:19:34,064 [Eric] You know? 00:19:34,064 --> 00:19:35,614 [John] You know what, you know why I got that one? 00:19:35,614 --> 00:19:35,854 [Eric] Why? 00:19:35,854 --> 00:19:40,124 [John] 'Cause I lost my black one. [laughs] 00:19:40,124 --> 00:19:40,884 [John] I'm pretty sure that's true. 00:19:40,884 --> 00:19:46,404 [Eric] I just love the idea of, like, let's just inject the plastic with a different color- 00:19:46,404 --> 00:19:46,504 [John] Yeah 00:19:46,504 --> 00:19:47,224 [Eric] ... and we can- 00:19:47,224 --> 00:19:48,604 [John] I'm pretty sure it was more expensive, and it- 00:19:48,604 --> 00:19:49,144 [Eric] Oh, it totally was 00:19:49,144 --> 00:19:54,944 [John] ... and I think, and I, I think it was one of those things where I needed one last minute and bought, had to [laughs] get the more expensive one. 00:19:54,944 --> 00:19:55,804 [Eric] Yeah, totally. 00:19:55,804 --> 00:19:56,584 [John] Um, but- 00:19:56,584 --> 00:19:59,504 [Eric] There was nothing more satisfying than the click, like, when you- 00:19:59,504 --> 00:19:59,694 [John] Oh 00:19:59,694 --> 00:20:01,244 [Eric] ... slid the cover over. 00:20:01,244 --> 00:20:03,404 [John] On or off. It definitely clicked- 00:20:03,404 --> 00:20:03,444 [Eric] Both 00:20:03,444 --> 00:20:04,724 [John] ... yeah, I think it was both directions. 00:20:04,724 --> 00:20:04,963 [Eric] Yeah. 00:20:04,964 --> 00:20:05,204 [John] Yeah. 00:20:05,204 --> 00:20:06,984 [Eric] It was just so satisfying. 00:20:06,984 --> 00:20:07,084 [John] Yeah. 00:20:07,084 --> 00:20:09,804 [Eric] Um, a nice piece of hardware, actually. 00:20:09,804 --> 00:20:10,744 [John] Yeah. 
00:20:10,744 --> 00:20:30,224 [Eric] Uh, Texas Instruments. There you go. Calculator for words, though. This one was interesting. This one, when I first came across it in my research, it seemed like, uh, okay. But then I thought, oh, wow, if you think about the TI-83 Plus as a graphing calculator, especially when you started to get into, like- 00:20:30,224 --> 00:20:30,524 [John] Mm-hmm 00:20:30,524 --> 00:20:35,624 [Eric] ... you know, more advanced calculus and other things like that, it was phenom- I mean, it was insane. 00:20:35,624 --> 00:20:35,904 [John] Yeah. 00:20:35,904 --> 00:20:44,224 [Eric] Right? Like, the amount of work that it actually compressed into, um, you know, the ability to, like, hit a couple buttons and- 00:20:44,224 --> 00:20:44,413 [John] Mm-hmm 00:20:44,413 --> 00:20:45,264 [Eric] ... sort of make that happen. 00:20:45,264 --> 00:20:45,344 [John] Yeah. 00:20:45,344 --> 00:20:54,684 [Eric] Right? And so I thought, oh, that's actually a very astute, uh, metaphor, because it, it absolutely is the same thing with words, right? 00:20:54,684 --> 00:20:54,804 [John] Yeah. 00:20:54,804 --> 00:21:05,723 [Eric] The ability to take an immense amount of num- you know, you know, verbal or, you know, yeah, sort of, like, word input and, like, do very complex things with it, like, very, very quickly. 00:21:05,724 --> 00:21:06,104 [John] Mm-hmm. 00:21:06,104 --> 00:21:07,104 [Eric] So I thought that was a good one. 00:21:07,104 --> 00:21:08,744 [John] Yeah. 00:21:08,744 --> 00:21:08,784 [Eric] Do y- 00:21:08,784 --> 00:21:10,084 [John] So we're in 2023. 00:21:10,084 --> 00:21:10,864 [Eric] 2023. 00:21:10,864 --> 00:21:11,684 [John] Yeah. 00:21:11,684 --> 00:21:27,304 [Eric] Um, do you want to... Okay, the next one actually is, um, okay, the next one's not human. Um, but this is 2024. 00:21:27,304 --> 00:21:28,604 [John] Okay. 00:21:28,604 --> 00:21:33,364 [Eric] And it is autonomous agent. So we go from intern- 00:21:33,364 --> 00:21:33,644 [John] Okay 00:21:33,644 --> 00:21:37,094 [Eric] ... calculator for words. You know, those are sort of around the same time. 00:21:37,094 --> 00:21:37,634 [John] Right. 00:21:37,634 --> 00:21:49,764 [Eric] And then we move on to autonomous agent, you know, so sort of this machine that's acting independently, um, and can actually do things end to end, which I think was a big step, right? 00:21:49,764 --> 00:21:50,904 [John] And this is 2024? 00:21:50,904 --> 00:21:52,744 [Eric] This is 2024. 00:21:52,744 --> 00:21:55,584 [John] What were they doing in 2024 00:21:55,584 --> 00:22:00,164 [John] that was autonomous agent, agentic? 00:22:00,164 --> 00:22:02,154 [Eric] That's a good question. Hold on, I'm thinking back to this. 00:22:02,154 --> 00:22:03,864 [John] Early coding tools maybe? 00:22:03,864 --> 00:22:06,264 [Eric] I think early coding tools, for sure. 00:22:06,264 --> 00:22:09,364 [John] Oh, customer service. That was one of the early ones where they- 00:22:09,364 --> 00:22:10,504 [Eric] Oh, customer service- 00:22:10,504 --> 00:22:11,594 [John] ... they were trying to do that with- 00:22:11,594 --> 00:22:11,594 [Eric] ... right 00:22:11,594 --> 00:22:12,414 [John] ... with voice and chat and stuff 00:22:12,414 --> 00:22:14,864 [Eric] There was that big Klarna. Yeah, there was the big Klarna thing where they laid off a bunch of people. 00:22:14,864 --> 00:22:16,003 [John] Which I think was '24. 00:22:16,004 --> 00:22:17,724 [Eric] That was '24. That was definitely '24. 00:22:17,724 --> 00:22:18,864 [John] Okay. Yeah. 
00:22:18,864 --> 00:22:39,944 [Eric] And then the other thing I would say, though, is, um, compared to what we would call an autonomous agent now, it seems very primitive. But actually, do you remember this? This is in 2024. You and I built, and really you built, uh, an agent for generating- 00:22:39,944 --> 00:22:40,214 [John] Oh, yeah 00:22:40,214 --> 00:22:40,844 [Eric] ... SEO content. 00:22:40,844 --> 00:22:41,764 [John] I forgot about that. 00:22:41,764 --> 00:22:42,874 [Eric] And- 00:22:42,874 --> 00:22:42,874 [John] Yeah 00:22:42,874 --> 00:22:52,364 [Eric] ... it, again, like, a lot of this stuff is packaged up now, but it required very little human intervention because it would essentially, like, go through a loop. It was sort of the- 00:22:52,364 --> 00:22:52,544 [John] Right 00:22:52,544 --> 00:22:56,164 [Eric] ... the loop, the Ralph Wiggum loop before that was a thing. 00:22:56,164 --> 00:22:56,804 [John] Yeah. 00:22:56,804 --> 00:22:56,834 [Eric] And we- 00:22:56,834 --> 00:23:01,304 [John] 'Cause we did it with Google Sheets and then the API for, like, ChatGPT, I think. 00:23:01,304 --> 00:23:03,044 [Eric] The API for ChatGPT. 00:23:03,044 --> 00:23:03,364 [John] Yeah. Yeah. 00:23:03,364 --> 00:23:04,444 [Eric] So, like, yeah- 00:23:04,444 --> 00:23:04,454 [John] It worked 00:23:04,454 --> 00:23:14,924 [Eric] ... that's crazy, actually. So if we just stop and think about this for a second, it was, we give it an input. It would actually, we had, we... This is amazing. I haven't thought about this in a while. 00:23:14,924 --> 00:23:15,644 [John] Me either. 00:23:15,644 --> 00:23:23,083 [Eric] We literally purchased... Do you remember this? We purchased, uh, system prompts that were personas. 00:23:23,084 --> 00:23:24,023 [John] Oh, right. Yeah. 00:23:24,024 --> 00:23:25,654 [Eric] So this is, like, skills before skills, right? 00:23:25,654 --> 00:23:26,804 [John] I do remember that. Yeah. Uh-huh. 00:23:26,804 --> 00:23:31,584 [Eric] So we would give an input, like, we wanna create this... Or we were trying to create content- 00:23:31,584 --> 00:23:31,594 [John] Mm-hmm 00:23:31,594 --> 00:23:32,684 [Eric] ... you know, at scale. 00:23:32,684 --> 00:23:33,364 [John] Yep. 00:23:33,364 --> 00:23:35,574 [Eric] And that was SEO optimized. 00:23:35,574 --> 00:23:35,624 [John] Yep. 00:23:35,624 --> 00:23:40,424 [Eric] And so you would give it input, and then it would assume a persona- 00:23:40,424 --> 00:23:40,924 [John] Yep 00:23:40,924 --> 00:23:42,704 [Eric] ... you know, that we had purchased. 00:23:42,704 --> 00:23:45,004 [John] Yeah, and we worked, yeah, and we worked really hard, like, customizing them and- 00:23:45,004 --> 00:23:45,264 [Eric] Mm-hmm 00:23:45,264 --> 00:23:46,083 [John] ... yeah, I remember that. 00:23:46,084 --> 00:23:48,384 [Eric] And it was a, it was a function in a Google Sheet- 00:23:48,384 --> 00:23:48,424 [John] Mm-hmm 00:23:48,424 --> 00:23:50,094 [Eric] ... that would, like, assume the persona. 00:23:50,094 --> 00:23:50,104 [John] Yep. 00:23:50,104 --> 00:23:51,944 [Eric] That, you know, all that was in a field. 00:23:51,944 --> 00:23:51,963 [John] Yep. 00:23:51,964 --> 00:23:53,704 [Eric] Make the API call to GPT. 00:23:53,704 --> 00:23:53,924 [John] Yep. 00:23:53,924 --> 00:23:56,174 [Eric] Generate something, then return it, and then on- 00:23:56,174 --> 00:23:57,114 [John] And then loop into the next one 00:23:57,114 --> 00:24:00,004 [Eric] ... and then loop into the next one, and it would assume a different persona, right? 
00:24:00,004 --> 00:24:00,034 [John] Yeah. 00:24:00,034 --> 00:24:01,384 [Eric] 'Cause it's like you're a content writer. 00:24:01,384 --> 00:24:01,644 [John] Yep. 00:24:01,644 --> 00:24:02,804 [Eric] You're an SEO expert. 00:24:02,804 --> 00:24:03,494 [John] Yep. 00:24:03,494 --> 00:24:05,744 [Eric] You're a researcher, you know, like, whatever. 00:24:05,744 --> 00:24:07,314 [John] I totally forgot about this. 00:24:07,314 --> 00:24:11,374 [Eric] Right. But at the time, we were like, it was really cool. I mean, we were like- 00:24:11,374 --> 00:24:11,374 [John] Yeah 00:24:11,374 --> 00:24:12,324 [Eric] ... "This is crazy." 00:24:12,324 --> 00:24:12,434 [John] Yeah. 00:24:12,434 --> 00:24:16,904 [Eric] This, I mean, that was an autonomous agent where it's like I dump this in, and then I get output. 00:24:16,904 --> 00:24:16,924 [John] Yeah. 00:24:16,924 --> 00:24:18,684 [Eric] But it's actually gone through, like, multiple steps. 00:24:18,684 --> 00:24:21,184 [John] Yeah. I just didn't know how to sell it, I guess. 00:24:21,184 --> 00:24:21,744 [Eric] I know. 00:24:21,744 --> 00:24:22,184 [John] [laughs] 00:24:22,184 --> 00:24:24,064 [Eric] Man, you really missed the boat on that one. 00:24:24,064 --> 00:24:25,884 [John] I know, right? 00:24:25,884 --> 00:24:26,164 [Eric] Ah. 00:24:26,164 --> 00:24:26,264 [John] Yeah. 00:24:26,264 --> 00:24:28,904 [Eric] I'm so glad you didn't raise money in '24- 00:24:28,904 --> 00:24:29,324 [John] Yeah, me too 00:24:29,324 --> 00:24:30,324 [Eric] ... for AI. [laughs] 00:24:30,324 --> 00:24:31,864 [John] Especially that use case. 00:24:31,864 --> 00:24:35,994 [Eric] Especially that. [laughs] P- we purchased prompts. How great is that? 00:24:35,994 --> 00:24:37,054 [John] Yeah. True. 00:24:37,054 --> 00:24:39,674 [Eric] Uh, okay. The next one's human. 00:24:39,674 --> 00:24:39,713 [John] Okay. 00:24:39,714 --> 00:24:41,534 [Eric] Do you wanna guess what it is? 00:24:41,534 --> 00:24:45,004 [John] So we, so we had two non-human ones, and we're still in twenty twenty-four? 00:24:45,004 --> 00:24:45,884 [Eric] Yep. They're around the same time. Um- 00:24:45,884 --> 00:24:47,054 [John] They're twenty-five now. 00:24:47,054 --> 00:24:49,494 [Eric] This is twenty-four. 00:24:49,494 --> 00:24:52,494 [John] Okay. I would've said twenty-five, but bear with me. 00:24:52,494 --> 00:24:54,144 [Eric] Well, we'll say, you know, twenty-four, twenty-five. 00:24:54,144 --> 00:24:54,934 [John] Right, late twenty-four. Okay. 00:24:54,934 --> 00:24:55,854 [Eric] Yeah. 00:24:55,854 --> 00:24:57,814 [John] Are we to like an AI employee? 00:24:57,814 --> 00:24:58,213 [Eric] Yes. 00:24:58,214 --> 00:24:58,534 [John] Okay. All right. 00:24:58,534 --> 00:24:59,334 [Eric] Digital employee- 00:24:59,334 --> 00:25:00,074 [John] Digital employee 00:25:00,074 --> 00:25:04,954 [Eric] ... was the, was the, uh, you know, the term that came up, the metaphor that came up. 00:25:04,954 --> 00:25:15,214 [John] Which is... Okay. Digital employee versus autonomous agent. What, what's, what's the difference in your mind? 00:25:15,214 --> 00:25:18,514 [Eric] The, 00:25:18,514 --> 00:25:26,314 [Eric] uh, I mean, from experience, I think the big difference is the 00:25:26,314 --> 00:25:31,134 [Eric] amount of influence that context can have. 00:25:31,134 --> 00:25:31,653 [John] Okay. 00:25:31,654 --> 00:25:45,954 [Eric] And if the-- if you give a model or an agent, um, the right context, it is actually good at making decisions independently of you. 00:25:45,954 --> 00:25:46,794 [John] Okay.
00:25:46,794 --> 00:25:51,734 [Eric] And so I would say, and actually if we think about this stepwise, so we have a parrot 00:25:51,734 --> 00:25:54,394 [Eric] to an intern 00:25:54,394 --> 00:26:09,884 [Eric] to an autonomous agent to a digital employee. And so if we think about that, the, you know, the intern to the autonomous agent is really interesting because we set that process up, but we didn't have to check on it. It was just- 00:26:09,884 --> 00:26:09,884 [John] Mm. 00:26:09,884 --> 00:26:11,804 [Eric] ... sort of flowing through the process. 00:26:11,804 --> 00:26:11,813 [John] Right. 00:26:11,814 --> 00:26:18,614 [Eric] But it wasn't really making decisions. It was cycling through a structure that we set up for it. 00:26:18,614 --> 00:26:20,234 [John] Okay. That makes sense. 00:26:20,234 --> 00:26:24,284 [Eric] Right? And it was autonomous, so... And it was generative, so like- 00:26:24,284 --> 00:26:24,284 [John] Yeah 00:26:24,284 --> 00:26:31,034 [Eric] ... it was, you know, sort of like looking at the input and, you know, it wasn't a machine that produced the same thing every time, right? 00:26:31,034 --> 00:26:40,894 [John] So I might say that AI employee is going to do more work to self-gather context and self-gather whatever it needs to accomplish a task. 00:26:40,894 --> 00:26:40,914 [Eric] Right. 00:26:40,914 --> 00:26:44,693 [John] Whereas the other one's more like 00:26:44,694 --> 00:26:50,494 [John] set up, execute A, B, the loops. Like you pre-set up the loops and it, and it runs, and it does a thing- 00:26:50,494 --> 00:26:50,504 [Eric] Right 00:26:50,504 --> 00:26:53,954 [John] ... but the inputs are kind of the same every time and the context is preset. 00:26:53,954 --> 00:26:54,543 [Eric] Yeah. 00:26:54,543 --> 00:26:54,543 [John] Okay. 00:26:54,543 --> 00:27:11,714 [Eric] Well, I, I would actually say, to put a very sharp point on this, I think the big, the big difference there is that if we, if you and I were using the tools that we use today to do that same thing, we wouldn't have to purchase prompts first of all. 00:27:11,714 --> 00:27:14,374 [John] [chuckles] Sure. 00:27:14,434 --> 00:27:23,614 [Eric] But the biggest thing is that the, um, is that AI would actually put the, put the... It would construct the flow. 00:27:23,614 --> 00:27:24,594 [John] Okay. 00:27:24,594 --> 00:27:26,014 [Eric] It would actually help define- 00:27:26,014 --> 00:27:26,034 [John] Mm 00:27:26,034 --> 00:27:26,654 [Eric] ... the process. 00:27:26,654 --> 00:27:28,134 [John] Yeah. That makes sense. 00:27:28,134 --> 00:27:30,674 [Eric] Right? And, and plan mode is a really- 00:27:30,674 --> 00:27:31,043 [John] Like, like an employee would 00:27:31,043 --> 00:27:33,644 [Eric] ... good example of that. Like an employee would, right? 00:27:33,644 --> 00:27:34,974 [John] Which is the analogy, yeah. 00:27:34,974 --> 00:27:48,294 [Eric] So I mean, even I would say in, in a lot of the tools today, you know, Claude Code, if you, if you dumped in a big requirement like that multi-step process, et cetera- 00:27:48,294 --> 00:27:48,323 [John] Mm-hmm 00:27:48,323 --> 00:27:54,294 [Eric] ... Claude Code would, it would actually refine the plan and, and start asking you questions- 00:27:54,294 --> 00:27:54,883 [John] Yeah. Yeah 00:27:54,883 --> 00:28:00,404 [Eric] ... to validate things and formulate a, you know, formulate what needs to happen- 00:28:00,404 --> 00:28:00,404 [John] Yeah 00:28:00,404 --> 00:28:01,814 [Eric] ... before it started doing anything. 
00:28:01,814 --> 00:28:02,574 [John] Yep. Okay. 00:28:02,574 --> 00:28:05,514 [Eric] That I think is the big, is the gigantic step there. 00:28:05,514 --> 00:28:07,874 [John] Yep. Yep. Agreed. 00:28:07,874 --> 00:28:09,974 [Eric] All right. And then 00:28:09,974 --> 00:28:31,134 [Eric] we have the last one, which is twenty twenty-five, twenty twenty-six, um, you know, which we're early, early in twenty twenty-six. Um, okay. I'm gonna... This one is, this is absolutely fascinating to me. So we sort of have these, um, machine type metaphors and then these human metaphors, you know, that are sort of- 00:28:31,134 --> 00:28:31,334 [John] Mm-hmm 00:28:31,334 --> 00:28:32,794 [Eric] ... you know, being used- 00:28:32,794 --> 00:28:32,804 [John] Back and forth 00:28:32,804 --> 00:28:33,734 [Eric] ... to describe the technology. 00:28:33,734 --> 00:28:33,754 [John] Yep. 00:28:33,754 --> 00:28:38,354 [Eric] Okay? The most recent one is a combination. 00:28:38,354 --> 00:28:39,774 [John] Okay. 00:28:39,774 --> 00:28:40,934 [Eric] Do you-- Can you guess what it is? 00:28:40,934 --> 00:28:46,674 [John] I- I've heard, I've heard one, um, recently, and I bet it's the same one. Um, exoskeleton. 00:28:46,674 --> 00:28:47,174 [Eric] Yes. 00:28:47,174 --> 00:28:52,824 [John] Yeah. I heard that, I heard that on a call or something in, like in twenty twenty-six, and I hadn't heard that one. 00:28:52,824 --> 00:28:53,914 [Eric] Yeah. 00:28:53,914 --> 00:28:55,853 [John] Brain in a jar is another one. [chuckles] Like the- 00:28:55,853 --> 00:28:56,324 [Eric] Brain in a jar 00:28:56,324 --> 00:28:58,904 [John] ... brain in the jar and the exoskeleton- 00:28:58,904 --> 00:28:58,904 [Eric] [chuckles] 00:28:58,904 --> 00:29:00,574 [John] ... which is weird. Um- 00:29:00,574 --> 00:29:01,313 [Eric] That is weird. 00:29:01,314 --> 00:29:13,874 [John] But, but the idea is because, 'cause I think you're, like you may be going the people route, but there's this other exoskeleton thing where you've got like the brain in the jar, which is, which is like the AI or the autonomous thing- 00:29:13,874 --> 00:29:14,024 [Eric] Mm-hmm 00:29:14,024 --> 00:29:15,614 [John] ... and then you're giving it an exoskeleton- 00:29:15,614 --> 00:29:15,734 [Eric] Oh 00:29:15,734 --> 00:29:16,594 [John] ... and it can do things. 00:29:16,594 --> 00:29:17,794 [Eric] Interesting. 00:29:17,794 --> 00:29:20,243 [John] And th- because that's one exoskeleton metaphor. 00:29:20,243 --> 00:29:20,484 [Eric] Mm-hmm. 00:29:20,484 --> 00:29:30,604 [John] And the other one is the, what I think you and I were thinking of is the person like operating a, a thing. So I think that one can go both ways, which is- 00:29:30,604 --> 00:29:32,354 [Eric] That is really interesting. 00:29:32,354 --> 00:29:32,394 [John] Yeah. 00:29:32,394 --> 00:29:42,194 [Eric] Okay. That is... Yeah. The, the research that I did, the exoskeleton drew on exoskeletons on like mechanical- 00:29:42,194 --> 00:29:42,774 [John] Mm-hmm 00:29:42,774 --> 00:29:47,414 [Eric] ... exoskeletons on humans who are doing some sort of physical work. 00:29:47,414 --> 00:29:47,854 [John] Mm-hmm. 00:29:47,854 --> 00:30:11,094 [Eric] So they use this example of people in factories, you know, you know, things in factories, there's been an immense amount of automation already with robotics and stuff, but there are still humans in factories, right?
And so a good example is just down the road here in South Carolina, um, BMW has a car factory, and they make all of the X vehicles in the world are made- 00:30:11,094 --> 00:30:11,194 [John] Yeah 00:30:11,194 --> 00:30:12,313 [Eric] ... you know, not far from here. 00:30:12,314 --> 00:30:12,574 [John] Mm-hmm. 00:30:12,574 --> 00:30:14,804 [Eric] And you can actually go tour the plant, which is amazing. 00:30:14,804 --> 00:30:14,834 [John] Yeah. 00:30:14,834 --> 00:30:18,384 [Eric] So listeners, if you're ever in the area, you should totally go see it 'cause it is- 00:30:18,384 --> 00:30:18,934 [John] Yeah. It's super cool 00:30:18,934 --> 00:30:19,274 [Eric] ... it is- 00:30:19,274 --> 00:30:19,854 [John] Yep 00:30:19,854 --> 00:30:22,574 [Eric] ... insane. But there are still people there, right? 00:30:22,574 --> 00:30:23,214 [John] Yep. 00:30:23,214 --> 00:30:29,774 [Eric] Um, and so an increasing trend is actually like augmenting 00:30:29,834 --> 00:30:42,734 [Eric] humans who are doing things in a manufacturing or, you know, warehouse or whatever context with exoskeletons, and it does two things. It increases the, like their sort of like overall productivity and decreases injury- 00:30:42,734 --> 00:30:42,914 [John] Mm 00:30:42,914 --> 00:30:43,554 [Eric] ... significantly. 00:30:43,554 --> 00:30:43,794 [John] Sure. 00:30:43,794 --> 00:30:44,094 [Eric] Right? 00:30:44,094 --> 00:30:44,414 [John] Yeah. 00:30:44,414 --> 00:30:50,312 [Eric] And so the combination of those two things means like way more productivity, you know, overall. 00:30:50,312 --> 00:30:50,422 [John] Yeah. 00:30:50,422 --> 00:30:58,542 [Eric] Um, you know, which is pretty interesting. But what's... You know, if, if I go back to our mental model, the map is, you know, the map is not- 00:30:58,542 --> 00:30:58,652 [John] Yeah 00:30:58,652 --> 00:31:03,832 [Eric] ... the territory. One of the key things is considering the cartographer, right? 00:31:03,832 --> 00:31:03,862 [John] Right. 00:31:03,862 --> 00:31:12,102 [Eric] Which is like, okay, in this, the, the blog post at least that I read about this, and I actually had found it, you know, 'cause it was trending on Hacker News, but, you know- 00:31:12,102 --> 00:31:12,192 [John] Yeah 00:31:12,192 --> 00:31:19,342 [Eric] ... it's a, it's a becoming a more common metaphor for AI. The company builds product development software. 00:31:19,342 --> 00:31:20,342 [John] Okay. 00:31:20,342 --> 00:31:23,621 [Eric] And they 00:31:23,622 --> 00:31:26,522 [Eric] are... One of their values is, like, keeping the human in the loop- 00:31:26,522 --> 00:31:26,532 [John] Sure 00:31:26,532 --> 00:31:27,762 [Eric] ... which makes sense and, you know- 00:31:27,762 --> 00:31:28,022 [John] Right 00:31:28,022 --> 00:31:31,242 [Eric] ... having done a lot of product work, I think that there's a lot to be said for that, right? 00:31:31,242 --> 00:31:31,272 [John] Right. 00:31:31,272 --> 00:31:51,302 [Eric] That AI is more of an exoskeleton where it can, you know, uh, consume, distill, you know, research, all of these things, but ultimately, as a product manager, I need to have, like, insight and empathy and other things that, that AI, you know, the AI can't have in order to do my job well. 00:31:51,302 --> 00:31:51,502 [John] Yep. 00:31:51,502 --> 00:31:56,582 [Eric] Right? But even the exoskeleton I think is really imperfect. 00:31:56,582 --> 00:32:04,702 [John] Yeah. Yeah, I mean, even what I just said, I think, I think there's two ways to interpret that, one.
00:32:04,702 --> 00:32:09,322 [John] And then the other thing that, that I've heard for 2026, so if I had to pick one for 2026- 00:32:09,322 --> 00:32:10,402 [Eric] Yes 00:32:10,402 --> 00:32:16,722 [John] ... it is, um... And this actually was a thing in '24 and '25. It w- Do you remember the swarm stuff- 00:32:16,722 --> 00:32:16,732 [Eric] Oh, yeah, yeah, yeah 00:32:16,732 --> 00:32:18,862 [John] ... like agent swarms and all that stuff? 00:32:18,862 --> 00:32:24,162 [Eric] Someone [chuckles] used that. A vendor used that on a call, and I was like, "Oh yeah, agent swarm." 00:32:24,162 --> 00:32:31,102 [John] Yeah. You know what? You know what I've heard though that's like the new version of that? Agent economy. 00:32:31,162 --> 00:32:33,862 [Eric] Whoa. 00:32:33,862 --> 00:32:35,362 [Eric] Okay, this is very interesting- 00:32:35,362 --> 00:32:35,412 [John] Have you heard that one? 00:32:35,412 --> 00:32:39,302 [Eric] ... actually. I have not heard that. I have not heard that. 00:32:39,302 --> 00:32:48,002 [John] And the idea... 'Cause, 'cause, like, recently there's been several startups and several sites where it's agents interacting with other agents. 00:32:48,002 --> 00:32:49,242 [Eric] Mm-hmm. 00:32:49,242 --> 00:32:49,642 [John] And then- 00:32:49,642 --> 00:32:51,732 [Eric] Like, what's an example beyond Moldbook? 00:32:51,732 --> 00:32:59,802 [John] So there... Yeah. [chuckles] There is the Moldbook, which there's a lot of specu- Like, there's a lot of people skeptical of, like... Some of that was just people, like, messing around. 00:32:59,802 --> 00:33:00,762 [Eric] There was definitely that- 00:33:00,762 --> 00:33:00,872 [John] Right 00:33:00,872 --> 00:33:02,262 [Eric] ... which was absolutely hilarious. 00:33:02,262 --> 00:33:09,572 [John] It was funny, yeah. Um, but there's a couple of, there's a couple of startups where the idea is, and I'll have to... We'll have to put them in the show notes. 00:33:09,572 --> 00:33:09,762 [Eric] Mm-hmm. 00:33:09,762 --> 00:33:16,642 [John] But the id- idea behind it is, like, zero humans involved. You, like, sign up for an account, and you, like, kinda, like, provision an agent. 00:33:16,642 --> 00:33:16,812 [Eric] Mm-hmm. 00:33:16,812 --> 00:33:18,162 [John] And you give it, like, a budget. 00:33:18,162 --> 00:33:18,382 [Eric] Mm-hmm. 00:33:18,382 --> 00:33:20,641 [John] And then, like, some guidelines for what- 00:33:20,642 --> 00:33:21,022 [Eric] Mm-hmm 00:33:21,022 --> 00:33:22,362 [John] ... what it is, what it should do. 00:33:22,362 --> 00:33:23,102 [Eric] Yeah. 00:33:23,102 --> 00:33:25,822 [John] And, um, and it does things. So 00:33:25,822 --> 00:33:31,882 [John] the ones that are actually making money are typically some kind of trading on, like, a Polymarket or a- 00:33:31,882 --> 00:33:31,962 [Eric] Mm-hmm 00:33:31,962 --> 00:33:38,302 [John] ... um, Polymarket or one of their competitors. I can't remember the top competitor for them. That type of world. 00:33:38,302 --> 00:33:38,542 [Eric] Mm-hmm. 00:33:38,542 --> 00:33:39,912 [John] And betting basically. 00:33:39,912 --> 00:33:40,782 [Eric] Mm-hmm. 00:33:40,782 --> 00:33:46,282 [John] Um, I haven't heard of too many that are, that are outside of that world that are making money- 00:33:46,282 --> 00:33:46,451 [Eric] Yep 00:33:46,451 --> 00:33:57,992 [John] ... allegedly.
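John's description of provisioning an agent (an account, a budget, some guidelines, then it acts on its own) is effectively a config schema. Here is a hypothetical Python sketch of what that provisioning step might look like; none of these names come from a real platform, and the budget-enforcement logic is illustrative only.

```python
from dataclasses import dataclass

@dataclass
class ProvisionedAgent:
    """An agent as John describes it: a budget plus plain-language guidelines."""
    budget_usd: float      # hard spending cap the platform enforces
    guidelines: list[str]  # plain-language constraints on behavior
    spent_usd: float = 0.0

    def record_spend(self, amount: float) -> None:
        # The platform, not the agent, holds the budget ceiling.
        if self.spent_usd + amount > self.budget_usd:
            raise ValueError("trade would exceed the provisioned budget")
        self.spent_usd += amount

# Provisioning: sign up, set a budget, hand over guidelines, let it run.
agent = ProvisionedAgent(
    budget_usd=100.0,
    guidelines=[
        "only trade liquid prediction markets",
        "never stake more than $10 on a single position",
    ],
)
agent.record_spend(8.50)  # one small position on a prediction market
print(f"remaining budget: ${agent.budget_usd - agent.spent_usd:.2f}")
```

The interesting design question the hosts circle is trust: once agents transact with each other, the budget cap and the guidelines are the only levers the human still holds.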
Um, but yeah, there's, there's multiple startups in this space now where, where it's this agent-to-agent economy, and they wanna be, like, the platform for agents to, like, interact with each other- 00:33:57,992 --> 00:33:57,992 [Eric] Wow 00:33:57,992 --> 00:33:59,942 [John] ... and do commerce, which is the economy part. 00:33:59,942 --> 00:34:02,142 [Eric] Yeah. That is so wild. 00:34:02,142 --> 00:34:02,902 [John] Yeah. 00:34:02,902 --> 00:34:07,242 [Eric] That is so wild. Okay. That actually... Okay. So 00:34:07,242 --> 00:34:10,762 [Eric] that was our brief history lesson on metaphors, 00:34:10,822 --> 00:34:13,862 [Eric] which if any of our listeners have another one that we missed- 00:34:13,862 --> 00:34:14,602 [John] Yeah 00:34:14,602 --> 00:34:14,982 [Eric] ... please- 00:34:14,982 --> 00:34:15,462 [John] Love to hear it 00:34:15,462 --> 00:34:24,302 [Eric] ... yeah, let us know. Uh, you can go to the site and figure out how to contact us. Um, but I actually went even further back in history and tried to think about- 00:34:24,302 --> 00:34:24,562 [John] Mm 00:34:24,562 --> 00:34:34,142 [Eric] ... what were other, what were other things where the metaphors were just... the early metaphors were woefully inadequate. 00:34:34,142 --> 00:34:34,642 [John] Okay. 00:34:34,642 --> 00:34:35,302 [Eric] Right? 00:34:35,302 --> 00:34:35,312 [John] Yeah. 00:34:35,312 --> 00:34:50,182 [Eric] That, that really... That didn't... Number one, like, something was difficult to describe, and so you describe it in terms that distill it down into, um, you know, a much narrower view than what it actually is and the impact it's having. 00:34:50,182 --> 00:34:50,302 [John] Right. 00:34:50,302 --> 00:34:55,082 [Eric] Okay? So there's a really popular one, but I think it's a really good one. 00:34:55,082 --> 00:34:55,642 [John] Okay. 00:34:55,642 --> 00:34:57,382 [Eric] I'm gonna ask, do you know- 00:34:57,382 --> 00:34:58,272 [John] I've got one too actually. 00:34:58,272 --> 00:34:59,201 [Eric] Okay. You've got one too? 00:34:59,202 --> 00:34:59,342 [John] Yeah. 00:34:59,342 --> 00:35:00,602 [Eric] Okay. Tell me yours first. 00:35:00,602 --> 00:35:00,712 [John] Okay. 00:35:00,712 --> 00:35:01,462 [Eric] And we'll see if it's the same one. 00:35:01,462 --> 00:35:07,462 [John] Mine is, mine is silly. Um, and I don't even know that it's true, but I'm pretty sure it's true. 00:35:07,462 --> 00:35:07,861 [Eric] Okay. 00:35:07,861 --> 00:35:09,762 [John] So before we had, like, airplanes- 00:35:09,762 --> 00:35:09,902 [Eric] Mm 00:35:09,902 --> 00:35:10,381 [John] ... and flight. 00:35:10,382 --> 00:35:11,162 [Eric] Mm-hmm. 00:35:11,222 --> 00:35:18,162 [John] I think the metaphor, and the obvious metaphor throughout all of human history before then was a, was a bird of some sort, right? 00:35:18,162 --> 00:35:18,542 [Eric] Hmm. 00:35:18,542 --> 00:35:20,202 [John] Hawk, eagle, like whatever. 00:35:20,202 --> 00:35:20,622 [Eric] Mm-hmm. 00:35:20,622 --> 00:35:27,962 [John] And I think everybody just assumed that the way to fly was to mimic the flapping and the, you know, like how a bird would fly. 00:35:27,962 --> 00:35:29,182 [Eric] Oh, right. Right. 00:35:29,182 --> 00:35:36,202 [John] But then when we get to flight, we end up with the ability to have a jet, completely different mechanism than a bird- 00:35:36,202 --> 00:35:36,822 [Eric] Mm 00:35:36,822 --> 00:35:40,542 [John] ... 
and carry, you know, 100 people or however many people- 00:35:40,542 --> 00:35:40,782 [Eric] Right 00:35:40,782 --> 00:35:41,742 [John] ... in a thing. 00:35:41,742 --> 00:35:41,962 [Eric] Right. 00:35:41,962 --> 00:35:44,282 [John] And that, and the metaphor completely breaks down- 00:35:44,282 --> 00:35:44,292 [Eric] Oh, man 00:35:44,292 --> 00:35:44,842 [John] ... from a bird. 00:35:44,842 --> 00:35:45,822 [Eric] That is fascinating. 00:35:45,822 --> 00:35:46,842 [John] Yeah. 00:35:46,842 --> 00:35:47,582 [Eric] That's fascinating. 00:35:47,582 --> 00:35:56,382 [John] And then, of course, there's this middle ground where, like, obviously, like, some of the early things, like, there's some mimicking of, of wings and, like, only one person could fly at a time or whatever. 00:35:56,382 --> 00:35:56,762 [Eric] Right. 00:35:56,762 --> 00:36:02,182 [John] But it quickly progressed into this thing that, like, does not mimic how a bird would fly at all. 00:36:02,182 --> 00:36:21,881 [Eric] Right. Right. Totally. Yeah, and if you think about, uh, describing what a modern jet is, um, to someone, you know, it's crazy, especially, like, you know, the, the re- the mega jets that are s- you know, where you fly- 00:36:21,882 --> 00:36:21,892 [John] Yeah 00:36:21,892 --> 00:36:22,782 [Eric] ... across the world, right? 00:36:22,782 --> 00:36:23,582 [John] Yeah. 00:36:23,582 --> 00:36:26,132 [Eric] Um, they carry so many people. 00:36:26,132 --> 00:36:26,172 [John] Yeah. 00:36:26,172 --> 00:36:27,752 [Eric] They carry so much weight. 00:36:27,752 --> 00:36:27,782 [John] Yeah. 00:36:27,782 --> 00:36:29,512 [Eric] They themselves weigh so much. 00:36:29,512 --> 00:36:29,512 [John] Yeah. 00:36:29,512 --> 00:36:42,242 [Eric] And so if you imagine, it's like, okay, there's a metal tube that weighs however many tons with hundreds of people on it, and it goes 500 miles an hour for 20 hours. 00:36:42,242 --> 00:36:43,552 [John] Yeah. P- people would have- 00:36:43,552 --> 00:36:43,572 [Eric] It's like- 00:36:43,572 --> 00:36:44,302 [John] ... just, like- 00:36:44,302 --> 00:36:45,012 [Eric] It's like, what? 00:36:45,012 --> 00:36:45,652 [John] Like, no. 00:36:45,652 --> 00:36:45,682 [Eric] Yeah. 00:36:45,682 --> 00:36:46,982 [John] Like, the physics are wrong. 00:36:46,982 --> 00:36:47,482 [Eric] Yeah, totally. 00:36:47,482 --> 00:36:49,462 [John] Like, the physics just feel wrong. 00:36:49,462 --> 00:36:49,902 [Eric] Yeah. 00:36:49,902 --> 00:36:58,818 [John] And anyways, you have, you have your own analogy here, but, but- Yeah. I, I think AI is, is, is gonna be that where it's like, no, the, these physics feel wrong even though they're not- 00:36:58,818 --> 00:36:58,828 [Eric] Yes 00:36:58,828 --> 00:37:00,138 [John] ... in a lot of applications. 00:37:00,138 --> 00:37:06,958 [Eric] Yeah, exactly. I think that's a really good analogy. The one that came to mind for me was, um, horseless carriage. 00:37:06,958 --> 00:37:07,858 [John] Okay, for a car. 00:37:07,858 --> 00:37:08,477 [Eric] For a car. 00:37:08,478 --> 00:37:08,818 [John] Yeah. 00:37:08,818 --> 00:37:10,888 [Eric] Uh, and- 00:37:10,888 --> 00:37:10,888 [John] Mm-hmm 00:37:10,888 --> 00:37:22,578 [Eric] ... you know, I mean, Henry Ford, I think, I, I think Henry Ford saw this, which is why he's celebrated as such a visionary, right? 'Cause he's like, "Well, if I asked people what they wanted, you know, they would've asked for a faster horse," right? 00:37:22,578 --> 00:37:23,717 [John] Right. [chuckles] Yeah.
00:37:23,718 --> 00:37:24,218 [Eric] And- 00:37:24,218 --> 00:37:25,037 [John] Right 00:37:25,037 --> 00:37:26,078 [Eric] ... um- 00:37:26,078 --> 00:37:31,038 [John] 'Cause the carriage was an add-on. You know what I mean? [laughs] 00:37:31,038 --> 00:37:31,478 [Eric] Totally. 00:37:31,478 --> 00:37:32,448 [John] Like, you needed the horse and you could- 00:37:32,448 --> 00:37:33,578 [Eric] It was like purchasing a prompt. 00:37:33,578 --> 00:37:38,318 [John] It... [laughs] You needed the horse, and you could ride the horse without a carriage no problem. 00:37:38,318 --> 00:37:38,718 [Eric] Right. Exactly. 00:37:38,718 --> 00:37:44,848 [John] But like, you know, the carriage is nice if you wanna have s- you know, have s- multiple people riding and stuff, but it- 00:37:44,848 --> 00:37:44,978 [Eric] Right 00:37:44,978 --> 00:37:45,878 [John] ... it was an add-on. 00:37:45,878 --> 00:37:46,758 [Eric] Right. I, you know- 00:37:46,758 --> 00:37:50,358 [John] It's like a trailer for a car. It's like, cool, you don't need a trailer for a car- 00:37:50,358 --> 00:37:50,438 [Eric] Right 00:37:50,438 --> 00:37:51,858 [John] ... but it's nice in some situations. 00:37:51,858 --> 00:38:05,538 [Eric] Well, if you th- actually, if you think about everything involved with caring for a horse, that is... I mean, a horseless carriage, w- this is what's interesting, that is actually a huge shift. 00:38:05,538 --> 00:38:06,118 [John] Yeah. 00:38:06,118 --> 00:38:09,597 [Eric] Right? Like, okay, carriage is an add-on to a horse. 00:38:09,598 --> 00:38:09,798 [John] Mm-hmm. 00:38:09,798 --> 00:38:20,618 [Eric] But even if you say horseless carriage, that's a major shift, like, even in just the cost basis for transportation, right? So let's say I have- 00:38:20,678 --> 00:38:21,158 [John] Yeah 00:38:21,158 --> 00:38:42,518 [Eric] ... let's say I have a carriage that's pulled by two horses, and so I have to own and, like, care for two horses. And I just... I, my sister rode horses for a time when we were younger, and I just remember, like, being shocked at how much went into this. 00:38:42,518 --> 00:38:42,978 [John] Sure. 00:38:42,978 --> 00:38:45,488 [Eric] So she rode horses. I had a dirt bike. 00:38:45,488 --> 00:38:45,518 [John] [laughs] 00:38:45,518 --> 00:38:47,008 [Eric] And I would, I would, you know- 00:38:47,008 --> 00:38:47,598 [John] Yeah 00:38:47,598 --> 00:38:53,998 [Eric] ... this is, this is horrible, but I would constantly remind my parents, like, "Isn't it amazing how much cheaper it is-" 00:38:53,998 --> 00:38:55,998 [John] [laughs] For me to be into dirt bikes. 00:38:55,998 --> 00:38:58,018 [Eric] It's like, let's, let's amortize this out, right? 00:38:58,018 --> 00:38:58,038 [John] It's hilarious, yeah. 00:38:58,038 --> 00:39:01,427 [Eric] Like, this is essentially the cost is gonna go to zero, you know. 00:39:01,427 --> 00:39:01,438 [John] Yeah. 00:39:01,438 --> 00:39:02,717 [Eric] And gas was super cheap- 00:39:02,718 --> 00:39:02,858 [John] Yeah 00:39:02,858 --> 00:39:04,168 [Eric] ... you know, especially at that time, right? 00:39:04,168 --> 00:39:04,178 [John] Yeah. 00:39:04,178 --> 00:39:05,318 [Eric] And I'm like... 00:39:05,318 --> 00:39:05,918 [John] So funny. 00:39:05,918 --> 00:39:06,318 [Eric] You know? 00:39:06,318 --> 00:39:09,978 [John] Well, and think about a modern neighborhood 00:39:09,978 --> 00:39:11,898 [John] with everybody has a horse. 00:39:11,898 --> 00:39:12,978 [Eric] Everyone has a horse. 
00:39:12,978 --> 00:39:23,018 [John] Like, I mean, sure, maybe you don't need a garage, but, like, that same, like, square footage, like, you can't... That can't just be a barn. Like, you need [laughs] you need- 00:39:23,018 --> 00:39:23,178 [Eric] Totally 00:39:23,178 --> 00:39:25,258 [John] ... more space for a horse than a garage. 00:39:25,258 --> 00:39:25,958 [Eric] Right. Well, okay. 00:39:25,958 --> 00:39:25,967 [John] Um- 00:39:25,967 --> 00:39:37,238 [Eric] This is, this is what's interesting, and, and I'm so glad you brought up the, um, agent economy term because I think that's getting at one of the core things that 00:39:37,238 --> 00:39:49,148 [Eric] I uncovered in studying this. And it's that the, um, truly transformative technologies escape the metaphor, right? 00:39:49,148 --> 00:39:49,178 [John] Yeah. 00:39:49,178 --> 00:39:50,978 [Eric] People struggle, it escapes the metaphor- 00:39:50,978 --> 00:39:51,108 [John] Yeah 00:39:51,108 --> 00:39:53,858 [Eric] ... and then it eventually becomes its own, its own thing, right? 00:39:53,858 --> 00:39:53,967 [John] Right. 00:39:53,967 --> 00:39:55,467 [Eric] And so you're not trying to describe- 00:39:55,467 --> 00:39:56,478 [John] No metaphor needed. 00:39:56,478 --> 00:39:59,238 [Eric] Yeah, no metaphor needed, right? It's just an automobile. 00:39:59,238 --> 00:40:00,398 [John] Yep. 00:40:00,578 --> 00:40:12,278 [Eric] But when it escapes the metaphor, or, or part of the process of it escaping the metaphor or, or maybe I guess I would say, like, the downstream effects of it, is that it reshapes things around it. 00:40:12,278 --> 00:40:12,918 [John] Yep. 00:40:12,918 --> 00:40:37,958 [Eric] And that can happen on a large scale or on a small scale. So for example, automobiles, you know, you have the horseless, uh, the horseless carriage. And of course it, it certainly changed on a practical level, people thinking about, "Okay, what's my cost basis for, you know, transportation?" Um, but it actually reshaped cities. 00:40:37,958 --> 00:40:38,008 [John] Yeah. 00:40:38,008 --> 00:40:39,178 [Eric] And it reshaped the way we- 00:40:39,178 --> 00:40:39,208 [John] Mm-hmm 00:40:39,208 --> 00:40:42,358 [Eric] ... thought about, um, the way that goods were transported. 00:40:42,358 --> 00:40:42,898 [John] Yep. 00:40:42,898 --> 00:40:50,798 [Eric] And, um, made things like the highway system, um, you know, an actual thing, right? 00:40:50,798 --> 00:40:50,808 [John] Yeah. 00:40:50,808 --> 00:40:51,898 [Eric] Or som- or something that could- 00:40:51,898 --> 00:40:51,908 [John] Mm-hmm 00:40:51,908 --> 00:40:53,738 [Eric] ... even be considered. Um- 00:40:53,738 --> 00:40:54,698 [John] Yep 00:40:54,698 --> 00:41:01,898 [Eric] ... and so that reshaping is, I think, what we're going through with AI right now. 00:41:01,898 --> 00:41:01,998 [John] Right. 00:41:01,998 --> 00:41:17,978 [Eric] And we're still in the early phases of it. Um, but what's... I think what's really interesting, if you think about the horseless carriage, um, or even the airplane, um, is that those things took decades- 00:41:17,978 --> 00:41:18,498 [John] Right 00:41:18,498 --> 00:41:25,918 [Eric] ... in order to sort of fully reshape, you know, the, the larger landscape around them, right? 00:41:25,918 --> 00:41:26,218 [John] Right. 
00:41:26,218 --> 00:41:35,378 [Eric] So it wasn't like when Henry Ford introduced the Model T, immediately cities started thinking about how do we put a bunch of paved roads in, and what are- 00:41:35,378 --> 00:41:35,408 [John] Right 00:41:35,408 --> 00:41:41,957 [Eric] ... the width of those roads, and do houses need a garage, you know. Well, they probably just used the horse carriage house initially, right? 00:41:41,958 --> 00:41:42,638 [John] Yeah. 00:41:42,638 --> 00:41:43,658 [Eric] You know, as their garage- 00:41:43,658 --> 00:41:44,338 [John] Yeah, yeah 00:41:44,338 --> 00:41:46,358 [Eric] ... for their Model T. [laughs] 00:41:46,358 --> 00:41:47,478 [John] That's funny, but yeah. 00:41:47,478 --> 00:41:54,978 [Eric] Uh, so I think that's what's really interesting because we... AI has happened... We started in 2021. 00:41:54,978 --> 00:41:55,318 [John] Mm-hmm. 00:41:55,318 --> 00:42:00,178 [Eric] Right? And, and the metaphors have changed so dramatically. 00:42:00,178 --> 00:42:00,698 [John] Mm-hmm. 00:42:00,698 --> 00:42:10,368 [Eric] From... I mean, think about this. Think about the distance between these two things. A parrot, like a, a bird that can say- 00:42:10,368 --> 00:42:10,387 [John] Yeah 00:42:10,387 --> 00:42:11,298 [Eric] ... things back to you- 00:42:11,298 --> 00:42:11,428 [John] Right 00:42:11,428 --> 00:42:12,618 [Eric] ... is how we're describing this. 00:42:12,618 --> 00:42:13,318 [John] Right. 00:42:13,398 --> 00:42:18,448 [Eric] And then you're talking about an, an... and then [chuckles] agent economy. 00:42:18,448 --> 00:42:18,457 [John] Right. 00:42:18,458 --> 00:42:20,938 [Eric] Like, literally an entire economy- 00:42:20,938 --> 00:42:21,138 [John] Right 00:42:21,138 --> 00:42:22,918 [Eric] ... that has market dynamics- 00:42:22,918 --> 00:42:23,218 [John] Right 00:42:23,218 --> 00:42:28,618 [Eric] ... and where these independent technologies are interacting with each other on behalf of humans. 00:42:28,618 --> 00:42:29,018 [John] Yeah. 00:42:29,018 --> 00:42:31,098 [Eric] Right? And that's half a decade. 00:42:31,098 --> 00:42:31,658 [John] Yeah. 00:42:31,658 --> 00:42:32,858 [Eric] Which is crazy. 00:42:32,858 --> 00:42:35,118 [John] Yeah. 00:42:35,178 --> 00:42:39,558 [John] Yeah, and, and I think the other thing that's really interesting 00:42:39,558 --> 00:42:46,158 [John] is all of the technological innovations of the past, it, it seemed like there were often, like, two or three things happening at once. 00:42:46,158 --> 00:42:47,228 [Eric] Mm-hmm. 00:42:47,228 --> 00:42:52,298 [John] And this feels like one thing. So, like, like Gilded Age, I think, was rail- 00:42:52,298 --> 00:42:52,518 [Eric] Mm 00:42:52,518 --> 00:42:56,438 [John] ... rail transportation... like pipelines for like oil 00:42:56,438 --> 00:42:56,938 [Eric] Mm-hmm 00:42:56,938 --> 00:42:58,338 [John] And water maybe too. 00:42:58,338 --> 00:42:58,438 [Eric] Yep. 00:42:58,438 --> 00:42:59,478 [John] But definitely oil. 00:42:59,478 --> 00:43:00,398 [Eric] Steel. 00:43:00,398 --> 00:43:01,198 [John] Steel. 00:43:01,198 --> 00:43:01,417 [Eric] Mm-hmm. 00:43:01,418 --> 00:43:05,368 [John] Good. Yeah, that's Carnegie. And two or three other things. Um- 00:43:05,368 --> 00:43:08,208 [Eric] Uh, yeah, modern, like banking and financial system. 00:43:08,208 --> 00:43:09,098 [John] Yeah. J.P. Morgan. Yeah. 00:43:09,098 --> 00:43:09,808 [Eric] That was a big deal- 00:43:09,808 --> 00:43:09,808 [John] Morgan 00:43:09,808 --> 00:43:10,638 [Eric] ... with all that.
00:43:10,638 --> 00:43:17,568 [John] Or not J.P. Morgan. Yeah. Anyways, but yeah, and, and there'll be things like that too. This feels more focused 'cause those, those are- 00:43:17,568 --> 00:43:17,568 [Eric] Interesting 00:43:17,568 --> 00:43:27,098 [John] ... those are intertwined 'cause like the oil and rail are intertwined 'cause they're moving barrels of oil on rail, and then they moved to, to transporting via pipes and... You know what I mean? 00:43:27,098 --> 00:43:27,378 [Eric] Mm-hmm. 00:43:27,378 --> 00:43:34,278 [John] So, so pieces of it are intertwined, and the financial, you know, the modern banking is funding all of it. Like- 00:43:34,278 --> 00:43:34,458 [Eric] Mm-hmm 00:43:34,458 --> 00:43:38,478 [John] ... they're definitely intertwined. But in some ways this feels even more focused. 00:43:38,478 --> 00:43:41,858 [Eric] Interesting. What is it, what's it more focused on? 00:43:41,858 --> 00:43:50,098 [John] In that like, things are even more intertwined. Like you could do rail, and sure you're moving a lot of oil, which is related to oil pipelines- 00:43:50,098 --> 00:43:50,138 [Eric] Mm-hmm 00:43:50,138 --> 00:43:52,258 [John] ... but like you can move people on rail, too. 00:43:52,258 --> 00:43:52,878 [Eric] Right. Right. 00:43:52,878 --> 00:44:10,138 [John] Whereas with AI, like, it, and maybe this is like just the software technology space. It's like you've got a couple options, but the products are like the same thing. Like I can't move my oil on rail or ship or via a, a, um, pipeline. 00:44:10,138 --> 00:44:10,418 [Eric] Mm. 00:44:10,418 --> 00:44:13,168 [John] I don't have three options to like transport my product. 00:44:13,168 --> 00:44:13,518 [Eric] Interesting. 00:44:13,518 --> 00:44:16,858 [John] I have like three options, but it's the same medium. 00:44:16,858 --> 00:44:17,418 [Eric] Interesting. 00:44:17,418 --> 00:44:20,698 [John] It's an API that like does a, you know- 00:44:20,698 --> 00:44:21,258 [Eric] Right. Right 00:44:21,258 --> 00:44:23,438 [John] ... call, call and response, via API. 00:44:23,438 --> 00:44:36,158 [Eric] Well, I actually think about it in an inverted way. So I agree with you that it is, that the medium is far more focused, but I think the impact is far broader. 00:44:36,158 --> 00:44:36,997 [John] Yes. 00:44:36,998 --> 00:44:37,578 [Eric] Right? 00:44:37,578 --> 00:44:37,818 [John] Right. 00:44:37,818 --> 00:44:38,358 [Eric] Um- 00:44:38,358 --> 00:44:38,558 [John] Right 00:44:38,558 --> 00:44:41,278 [Eric] ... and because it's digital, it can happen way faster. 00:44:41,278 --> 00:44:41,858 [John] Yeah. 00:44:41,858 --> 00:44:48,618 [Eric] So the medium is super focused, but it... You know, if you think about the steel industry, 00:44:48,618 --> 00:44:52,478 [Eric] you know, eventually steel started to be used in everything, right? 00:44:52,478 --> 00:44:52,728 [John] Mm-hmm. 00:44:52,728 --> 00:44:55,118 [Eric] And petroleum is used in like all sorts of stuff- 00:44:55,118 --> 00:44:55,128 [John] Yeah 00:44:55,128 --> 00:45:11,658 [Eric] ... right? But I think the initial use cases for those were narrower, and then they expanded over time, you know, as manufacturing advanced, costs decreased, et cetera, et cetera, right? In trains, really you're transporting stuff, right? 00:45:11,658 --> 00:45:11,748 [John] Mm-hmm. 00:45:11,748 --> 00:45:21,877 [Eric] And so it certainly was disruptive in the transportation industry. And so even though the medium is super focused, the impact is everything, right?
00:45:21,878 --> 00:45:22,438 [John] Right. 00:45:22,438 --> 00:45:27,248 [Eric] And that, I think, the impact is everything, and it's happening super quickly, and I think that is a- 00:45:27,248 --> 00:45:27,308 [John] Right 00:45:27,308 --> 00:45:30,618 [Eric] ... huge contributor to the difficulty in using metaphors- 00:45:30,618 --> 00:45:30,627 [John] Right 00:45:30,627 --> 00:45:33,438 [Eric] ... to explain it, even if we look at, you know, history. 00:45:33,438 --> 00:45:35,398 [John] Yeah. Yeah, and I guess 00:45:35,398 --> 00:45:41,908 [John] I agr- I, I agree with you. And it, it, and I think it looks like an inverted pyramid where everything like initiates- 00:45:41,908 --> 00:45:41,918 [Eric] Yeah 00:45:41,918 --> 00:45:44,198 [John] ... from this very small point and then out really wide. 00:45:44,198 --> 00:45:45,097 [Eric] Yes. 00:45:45,098 --> 00:45:49,878 [John] Whereas like, let's say like Gilded Age, I feel like there was a couple 00:45:49,878 --> 00:45:54,938 [John] f- like stakes in the ground that like there was a lot of overlap and then definitely went wide. 00:45:54,938 --> 00:45:55,458 [Eric] It definitely went wide. 00:45:55,458 --> 00:45:58,858 [John] But it feels like it's all on one like stake. 00:45:58,858 --> 00:45:58,998 [Eric] Right. Right. 00:45:58,998 --> 00:46:00,618 [John] Or one point. Like... 00:46:00,618 --> 00:46:00,988 [Eric] Yeah. 00:46:00,988 --> 00:46:01,318 [John] Yeah. 00:46:01,318 --> 00:46:03,708 [Eric] And it's, it's a s- it's ap- like its application- 00:46:03,708 --> 00:46:03,978 [John] And wider 00:46:03,978 --> 00:46:05,458 [Eric] ... is immediately wide, right? 00:46:05,458 --> 00:46:05,778 [John] Yeah. Yeah. 00:46:05,778 --> 00:46:22,058 [Eric] Because, I mean, it, it is wild if you think about, um... I mean, think about 10 years ago, what if I told you that there was going to be a technology that was going to 00:46:22,058 --> 00:46:25,038 [Eric] impact, um, 00:46:25,038 --> 00:46:27,158 [Eric] the medical field, 00:46:27,158 --> 00:46:35,838 [Eric] the law field, the academic research field, the software engineering field, the creative field- 00:46:35,838 --> 00:46:36,698 [John] Finance 00:46:36,698 --> 00:46:47,198 [Eric] ... the finance field. I mean, all of these things, right? I mean, it just, you know, where it's like, okay, the, the impact is so broad. 00:46:47,198 --> 00:46:47,238 [John] Yeah. 00:46:47,238 --> 00:46:51,338 [Eric] It's not concentrated, right? But you're s- That is a really interesting way to think about it- 00:46:51,338 --> 00:46:51,347 [John] Yeah 00:46:51,347 --> 00:46:52,578 [Eric] ... as an upside down- 00:46:52,578 --> 00:46:52,918 [John] Yeah 00:46:52,918 --> 00:46:59,698 [Eric] ... uh, triangle. All right, do you w- Okay, so agent economy, any other metaphors you wanna throw out there? 00:46:59,698 --> 00:47:01,658 [John] Um, I wanna go back to one. 00:47:01,658 --> 00:47:02,118 [Eric] Okay. 00:47:02,118 --> 00:47:04,338 [John] That is my favorite, and it's back to the explaining- 00:47:04,338 --> 00:47:04,558 [Eric] Mm-hmm 00:47:04,558 --> 00:47:05,178 [John] ... to people thing. 00:47:05,178 --> 00:47:05,858 [Eric] Mm-hmm. 00:47:05,858 --> 00:47:11,898 [John] Um, the... And I, I hadn't thought of this, but the math calculator or the calculator for words- 00:47:11,898 --> 00:47:11,908 [Eric] Mm-hmm 00:47:11,908 --> 00:47:18,978 [John] ... I really like. 
But a little bit of a derivative of that is thinking about it in terms of a function- 00:47:18,978 --> 00:47:19,518 [Eric] Mm 00:47:19,518 --> 00:47:21,998 [John] ... and, uh, what's called a non-deterministic function. 00:47:21,998 --> 00:47:22,858 [Eric] Mm-hmm. 00:47:22,858 --> 00:47:27,438 [John] And, and a lot of the skill, like, like in coding they, they talk about vibe coding. 00:47:27,438 --> 00:47:28,398 [Eric] Mm-hmm. 00:47:28,398 --> 00:47:38,638 [John] But I think it gets at the skill of I'm controlling inputs into this thing and kinda have a gut sense for what the output might be. 00:47:38,638 --> 00:47:40,038 [Eric] Mm. 00:47:40,038 --> 00:47:43,498 [John] And the... So I think the calculator, 'cause calculators are obviously, like, functions. 00:47:43,498 --> 00:47:43,938 [Eric] Right. Right. Right. 00:47:43,938 --> 00:47:46,028 [John] I think that's the interesting like human piece here- 00:47:46,028 --> 00:47:46,318 [Eric] Oh, yeah 00:47:46,318 --> 00:47:47,838 [John] ... where there's this very- 00:47:47,838 --> 00:47:48,158 [Eric] Yeah 00:47:48,158 --> 00:47:53,778 [John] ... vibey gut feel thing that people are, you know, calling it vibes or- 00:47:53,778 --> 00:47:53,908 [Eric] Mm-hmm 00:47:53,908 --> 00:47:55,578 [John] ... remember Windsurf, like surfing- 00:47:55,578 --> 00:47:55,668 [Eric] Mm-hmm 00:47:55,668 --> 00:47:56,638 [John] ... like this vibey thing- 00:47:56,638 --> 00:47:57,278 [Eric] Yeah 00:47:57,278 --> 00:48:04,278 [John] ... that is like inputs into the non-deterministic thing and then getting a good or a desired output. 00:48:04,278 --> 00:48:04,838 [Eric] Yeah. 00:48:04,838 --> 00:48:06,078 [John] Um... 00:48:06,078 --> 00:48:15,298 [Eric] So if someone asks you at a dinner party and they say like, "Okay, describe AI," you're gonna say, "It's kinda like a TI-83, but not a Plus. It's a TI-83-" 00:48:15,298 --> 00:48:15,658 [John] Yeah 00:48:15,658 --> 00:48:16,258 [Eric] "... Vibe-" 00:48:16,258 --> 00:48:16,408 [John] Uh-huh 00:48:16,408 --> 00:48:17,737 [Eric] "... but with words, not numbers." [laughs] 00:48:17,738 --> 00:48:22,178 [John] Exactly. Exactly. Imagine you had a TI-83. [laughs] 00:48:22,178 --> 00:48:23,658 [Eric] The TI-83 Vibe. 00:48:23,658 --> 00:48:27,118 [John] And you put words in, and you got vibes out. [laughs] 00:48:27,118 --> 00:48:29,678 [Eric] No, is it vibes in, vibes out? 00:48:29,678 --> 00:48:30,938 [John] I mean, we're going that direction. 00:48:30,938 --> 00:48:31,398 [Eric] We're g- [laughs] 00:48:31,398 --> 00:48:33,278 [John] For sure. Yeah. 00:48:33,478 --> 00:48:37,958 [Eric] Okay, mine is, mine is kind of silly. 00:48:37,958 --> 00:48:40,098 [John] [laughs] Sillier than that? 00:48:40,098 --> 00:48:45,848 [Eric] Sillier than a TI-83 Vibe with words, not numbers. [laughs] 00:48:45,848 --> 00:48:47,898 [John] [laughs] I'm ready. 00:48:47,898 --> 00:48:50,318 [Eric] Mine's... Okay, mine's more of like a visual picture, so... 00:48:50,318 --> 00:48:52,078 [John] Okay. 00:48:52,078 --> 00:48:55,338 [Eric] And this is definitely because I was reading some Dr. Seuss books to my daughter. 00:48:55,338 --> 00:48:56,698 [John] Oh, great. 00:48:56,698 --> 00:49:02,558 [Eric] Uh, but Dr. Seuss has drawn these really amazing machines. Uh- 00:49:02,558 --> 00:49:02,568 [John] Ooh 00:49:02,568 --> 00:49:20,548 [Eric] ... and actually, like there's some, you know, they're, they are a little bit ambiguous, where you kind of get the sense of like, it can do this and this, but then there's a lot of...
You know, it's a static picture, and it's in the context of, you know, the, the funny rhymes that he's put together. 00:49:20,548 --> 00:49:20,918 [John] Yeah. Right. 00:49:20,918 --> 00:49:33,258 [Eric] Um, but I was looking at this machine, and I thought, "That thing is crazy. Like, what all can this thing do?" And so I kind of envision myself, you know, on this machine- 00:49:33,258 --> 00:49:33,658 [John] Yeah 00:49:33,658 --> 00:49:40,088 [Eric] ... and I have the ability to, like tell it what I want to happen. 00:49:40,088 --> 00:49:40,918 [John] Mm-hmm. 00:49:40,918 --> 00:49:46,378 [Eric] And I don't really know what's going on inside of it, but there are all these trap doors and like weird arms and other stuff. 00:49:46,378 --> 00:49:47,158 [John] Things are whirling and twirling. 00:49:47,158 --> 00:49:50,588 [Eric] And it can just like pop out an arm and like do something, and you're like- 00:49:50,588 --> 00:49:50,588 [John] Right 00:49:50,588 --> 00:49:52,828 [Eric] ... "Oh, whoa, that's cool." You know? [laughs] 00:49:52,828 --> 00:49:52,828 [John] Right. 00:49:52,828 --> 00:49:55,198 [Eric] Like it can just sort of like do all this stuff. 00:49:55,198 --> 00:49:55,598 [John] Right. 00:49:55,598 --> 00:49:57,418 [Eric] You know, and clean up the house, uh- 00:49:57,418 --> 00:49:57,898 [John] Yeah 00:49:57,898 --> 00:49:58,838 [Eric] ... you know, after the Cat in the Hat does his- 00:49:58,838 --> 00:50:01,438 [John] Find a mouse. [laughs] 00:50:01,438 --> 00:50:04,158 [Eric] [laughs] 00:50:04,218 --> 00:50:04,658 [Eric] Exactly. 00:50:04,658 --> 00:50:05,018 [John] Yeah. Exactly. 00:50:05,018 --> 00:50:08,968 [Eric] So there... Yeah. That's, that's what we're doing is riding on a Dr. Seuss machine. [laughs] 00:50:08,968 --> 00:50:10,557 [John] [laughs] Awesome. 00:50:10,558 --> 00:50:16,538 [Eric] All right. Well, thanks for joining us, and we'll catch you on the next one. 00:50:16,538 --> 00:50:26,478 [John] Yeah. [upbeat music]
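A closing note on John's "non-deterministic function" framing, since it is the most code-shaped metaphor in the episode: a calculator is a deterministic function, while a language model samples from a distribution, so the same input can yield different outputs. The toy sketch below illustrates the distinction; `sample_model` is a stand-in rather than a real model API, and the temperature knob here is simulated (in real model APIs, temperature scales the randomness of token sampling).

```python
import random

def calculator(x: float) -> float:
    # Deterministic: the same input always yields the same output.
    return x ** 2 + 1

def sample_model(prompt: str, temperature: float = 0.8) -> str:
    # Non-deterministic stand-in: a real model samples tokens from a
    # distribution; here we just pick among canned phrasings.
    phrasings = [
        f"One take on '{prompt}'.",
        f"Another plausible answer to '{prompt}'.",
        f"A third way of putting '{prompt}'.",
    ]
    if temperature == 0:
        return phrasings[0]          # greedy decoding: deterministic again
    return random.choice(phrasings)  # sampling: varies run to run

assert calculator(3.0) == calculator(3.0)  # always equal
print(sample_model("describe AI"))         # may differ on every run
```

John's "vibes" are the operator's feel for steering the inputs so the sampled output lands where they want it, which is exactly the skill the plain calculator metaphor leaves out.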
