Can the way you talk to AI change you?
What does talking to AI all day do to the way we think, relate, and communicate? Eric and John explore kids, companionship, human dignity, and why the line between person and machine matters.
Show Notes
Summary
Eric and John explore a new habit that already feels normal: talking to AI constantly, casually, and sometimes a little too personally.
As they compare their own work habits, from treating Claude like a coworker to noticing how easily chat becomes a pseudo-relationship, they land on a deeper concern: not just over-humanizing machines, but losing sight of what makes human relationships distinct, difficult, and valuable.
Key takeaways
- Watch your language with AI: repeated “coworker” and “we” framing can shape your instincts even when you know it’s a machine.
- Separate output quality from self-formation: a prompt style may produce good results and still train you in unhealthy ways.
- Teach kids the category line early: AI can sound alive, helpful, and familiar without being human.
- Resist the path of least resistance: AI is designed to be easier to deal with than people, and that ease can subtly weaken your appetite for real relationships.
- Keep the distinction clear: AI can help with thinking, drafting, and iteration, but it cannot reciprocate dignity, sacrifice, or love.
Notable mentions and links
- John describes a recent experiment inspired by the emerging idea of a “zero-person company”, where AI agents can take on roles like CEO, manager, and operator inside a simulated business workflow (a minimal sketch of the pattern appears after this list).
- Anthropic’s Claude Cowork is mentioned as evidence that the product category itself, not just individual users, is reinforcing the coworker metaphor, with Anthropic explicitly framing the product as a way to hand off multi-step work to Claude.
- A Hacker News post titled “Shall I implement it? No”, which links to a GitHub Gist screenshot, is used to underline the tension: the interface feels conversational and clever, while the underlying system can still fail in ways that are unmistakably machine-like.
- Jensen Huang’s conversation on The Joe Rogan Experience #2422 enters the discussion as Eric and John zoom out from prompting habits to first-principles questions about sentience, consciousness, and whether AI can actually have experience at all.
- C.S. Lewis’s line about never meeting “a mere mortal,” from The Weight of Glory, becomes a shorthand for their conviction that human beings belong in a fundamentally different category from machines.
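A sketch of the manager/worker pattern

For the technically curious, here is a minimal sketch of the manager/worker pattern John describes in the episode, assuming the standard openai and anthropic Python SDKs with API keys set in the environment. The persona prompts, model names, and task are illustrative stand-ins, not the actual tool John used.

```python
# Minimal sketch of the "manager / worker" agent pattern from the episode:
# a GPT-based manager delegates a task to a Claude-based worker, and the
# human (the "board of directors") talks only to the manager. Persona
# prompts, model names, and the task below are illustrative assumptions.
from openai import OpenAI
from anthropic import Anthropic

openai_client = OpenAI()        # reads OPENAI_API_KEY from the environment
anthropic_client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

MANAGER_SYSTEM = (
    "You are an engineering manager. Turn the board's request into one "
    "clear, self-contained instruction for a coding agent, and relay any "
    "feedback from the board constructively."
)
WORKER_SYSTEM = "You are a coding agent. Implement exactly what you are asked."


def manager_turn(history: list[dict]) -> str:
    """GPT plays the manager; returns its next message to the board."""
    resp = openai_client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[{"role": "system", "content": MANAGER_SYSTEM}, *history],
    )
    return resp.choices[0].message.content


def worker_turn(instruction: str) -> str:
    """Claude plays the worker; returns its attempt at the task."""
    resp = anthropic_client.messages.create(
        model="claude-sonnet-4-20250514",  # illustrative model choice
        max_tokens=2048,
        system=WORKER_SYSTEM,
        messages=[{"role": "user", "content": instruction}],
    )
    return resp.content[0].text


# The board makes a request; the manager turns it into an instruction.
history = [{"role": "user", "content": "Ship a CLI that counts words in a file."}]
instruction = manager_turn(history)
result = worker_turn(instruction)

# The "pep talk" intervention from the episode: plain relational language
# dropped into the manager's context window before the next delegation.
history += [
    {"role": "assistant", "content": instruction},
    {
        "role": "user",
        "content": (
            "This is mission critical, and two are better than one: work "
            "together with the coding agent, not against it. Its latest "
            "output follows.\n\n" + result
        ),
    },
]
print(manager_turn(history))
```

The design point mirrors the episode's observation: the pep talk "works" only because it changes what sits in the manager's context window on the next call; nothing relational is happening on the other side.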
Transcript
00:00:00 [Eric] [upbeat music] Welcome back to the Token Intelligence Show. Uh, and tod-today I have... I'm so interested in what we're gonna talk about because I've watched you use AI and talk to AI some. 00:00:23 [John] Oh, no. Is this an intervention? [laughs] 00:00:26 [Eric] [laughs] 00:00:27 [John] All right. We'll see. It might be. 00:00:29 [Eric] But I've been thinking a lot about how we talk to AI. 00:00:34 [John] Right. 00:00:36 [Eric] And a lot of times, you know, we... You and I, if we're showing each other something- 00:00:41 [John] Mm-hmm 00:00:42 [Eric] ... we'll sort of show, like, a prompt that we already put in or, like, we'll just, like, speak, you know- 00:00:46 [John] Yeah. Right 00:00:47 [Eric] ... you know, speak into a transcription service like Granola- 00:00:50 [John] Mm-hmm 00:00:50 [Eric] ... or just speak directly into GPT or Claude, and, um, and do, you know... And generate the prompt that way, right? But you and I both interact with AI all day- 00:01:04 [John] Right 00:01:05 [Eric] ... throughout the day. And- 00:01:08 [John] More than other people, probably. 00:01:10 [Eric] I think a lot- 00:01:10 [John] Or close 00:01:10 [Eric] ... more than other people. 00:01:11 [John] Yeah. 00:01:12 [Eric] I think a lot more than other people. 00:01:13 [John] Oh, I, I meant more than humans. Um- 00:01:15 [Eric] Oh 00:01:15 [John] ... like if you [laughs] sum total the human interaction versus AI interaction. 00:01:19 [Eric] Yeah, and I didn't say, like, more than other... Uh, we use AI more than other humans in a presumptuous way. 00:01:24 [John] Right. 00:01:24 [Eric] I think literally just our- 00:01:25 [John] Yeah. That- 00:01:26 [Eric] Yeah, our jobs- 00:01:27 [John] Unfortunately, I think they might both be true, but yeah. 00:01:29 [Eric] I think they- 00:01:29 [John] This is why we have to do the podcast. [laughs] 00:01:31 [Eric] This is... Yes. 00:01:32 [John] Make sure we at least get a little bit of- 00:01:33 [Eric] I didn't realize this is an- 00:01:34 [John] ... human interaction 00:01:34 [Eric] ... this is an intervention where we're intervening- 00:01:36 [John] The mutual intervention. [laughs] 00:01:37 [Eric] ... in ourselves. Yeah, it's a mutual intervention. 00:01:39 [John] Yeah. 00:01:39 [Eric] But I saw this thing recently. [lips smack] Some guy had mentioned, you know, "I'm, I'm trying to be really intentional about how my children... How I'm teaching my children to talk to AI." 00:01:54 [John] [lips smack] Oh, okay. What, what a-what age range are we talking? 'Cause my kids don't talk to AI. 00:01:59 [Eric] Uh, he didn't specify the age range. 00:02:03 [John] Okay. 00:02:04 [Eric] And I think... I'm trying to remember back, uh, to the exact details, but I think it... I don't think it was... I think they're y-you know, like, younger. Let's say, like, middle school-ish or whatever. You know, so exposed to GPT and- 00:02:17 [John] Okay 00:02:18 [Eric] ... whatever that seems right. 00:02:18 [John] No, that seems right. Yeah. 00:02:19 [Eric] Right. Somewhere around there. Um, I think he may have mentioned AI sort of enabled toys as well. Um- 00:02:26 [John] Okay. Well, and like the Alexa and Google things are getting, like, AI as part of them. 00:02:32 [Eric] Right. Right. 00:02:32 [John] I feel like that's probably the most... For my kids, that's how they would end up- 00:02:35 [Eric] Mm-hmm, mm-hmm 00:02:35 [John] ... interacting with it. 00:02:37 [Eric] But the interesting thing was that he said that it's dangerous for, for them to develop... 
It can be dangerous for them to develop a certain type of relationship with AI that... And I'm paraphrasing here. I should go up. Uh, uh, we can post the article in the show notes. 00:02:56 [John] Yep. 00:02:57 [Eric] But in a way that makes them forget that it's a, that it is a machine. 00:03:03 [John] Sure. 00:03:04 [Eric] And what, what got me thinking about this more and more is I started to think through how do I... I start- I just started to notice, right? If someone points something like that out, something you do a lot, you start to look at it more, right? 00:03:22 [John] Yes. 00:03:22 [Eric] Um, you know something like, you know, you realize like, oh, eating too fast is unhealthy or something. 00:03:28 [John] Sure. 00:03:28 [Eric] So then you, like, start to be more- 00:03:30 [John] Like, mm-hmm 00:03:30 [Eric] ... self-conscious about like- 00:03:31 [John] Yeah. Right 00:03:31 [Eric] ... am I eating really fast? 00:03:32 [John] Right. 00:03:32 [Eric] Which I'm, like, the fastest eater I know. 00:03:35 [John] [laughs] Yeah. 00:03:35 [Eric] But, um, it... I became slightly uncomfortable with- 00:03:43 [John] With like your own 00:03:44 [Eric] ... with my own interactions with AI, and I'm not gonna tell you yet what, like, what that looks like. 00:03:49 [John] Okay. 00:03:49 [Eric] Because I wanna, I wanna know if you've had- 00:03:51 [John] We- 00:03:51 [Eric] ... the same experience. 00:03:52 [John] Okay. Uh, maybe. I do... I wanna derail us a little bit- 00:03:58 [Eric] Please derail 00:03:58 [John] ... 'cause this was a group chat that you were on with me. 00:04:00 [Eric] Okay. 00:04:01 [John] It's, like, very relevant to this topic, and I get to share a screen, which I feel like we've been a little lax with our- 00:04:05 [Eric] We have been lax in sharing screens 00:04:06 [John] ... leveraging our own- 00:04:07 [Eric] Okay. This is great 00:04:08 [John] ... our video capabilities. 00:04:09 [Eric] This is great. 00:04:10 [John] Um- 00:04:11 [Eric] Is this OpenClaw, or is this a group- 00:04:12 [John] Oh 00:04:13 [Eric] ... group message? 00:04:13 [John] No. This was a group, a group chat that both of you and I are in that came in this, um, uh, picture that came in that somebody sent us. So let me do my quick screen share. Um, okay. You ready? 00:04:28 [Eric] I'm ready. I'll describe it for- 00:04:31 [John] All right. 00:04:31 [Eric] Oh. 00:04:32 [John] This one? Did you see this one come through? 00:04:33 [Eric] Yes. Okay. This is crazy. So I'm going to explain- 00:04:36 [John] 'Cause I think this is a good extreme to, like, start with. 00:04:39 [Eric] This is a, this is a great- 00:04:40 [John] To make us feel better. 00:04:41 [Eric] Yes. 00:04:42 [John] [laughs] 00:04:42 [Eric] This is a great, this is a great extreme to start with. 00:04:44 [John] Oh. 00:04:44 [Eric] Okay. I'll give the context here. I... Okay. This one, I was like, "Oh, my gosh." It made me think about my kids interacting with AI. 00:04:52 [John] Yeah. 00:04:53 [Eric] So the... This is in a group message. One of our friends, um, Peter Barth, who is actually my former co-founder. 00:05:03 [John] Yeah. 00:05:03 [Eric] So we started a business together, so. 00:05:05 [John] And I guess he's at a show or something. 00:05:07 [Eric] His daughter is, um, studying video game creation. She wants to work- 00:05:13 [John] Oh, that's great 00:05:13 [Eric] ... in the video game industry. 00:05:13 [John] Yeah, yeah. 00:05:13 [Eric] And so they went out to San Francisco- 00:05:15 [John] Okay 00:05:15 [Eric] ... 
to a big- 00:05:16 [John] To a- 00:05:16 [Eric] ... video game- 00:05:17 [John] Yeah 00:05:17 [Eric] ... conference, okay? 00:05:18 [John] Right. 00:05:19 [Eric] And we, we all work in the same office building. 00:05:24 [John] Right. 00:05:24 [Eric] And so we talk about AI and the industry changes- 00:05:28 [John] Right 00:05:28 [Eric] ... and all of this all the time. And with another one of our friends, we started- 00:05:32 [John] Right 00:05:33 [Eric] ... a group thread. 00:05:34 [John] Right. 00:05:35 [Eric] And we send links back and forth about- 00:05:36 [John] Right 00:05:37 [Eric] ... the craziness of the world- 00:05:38 [John] Right 00:05:38 [Eric] ... of AI. And so he sent a picture to the thread. So there's all these games there and everything. 00:05:44 [John] Right. 00:05:45 [Eric] And I will describe to the listeners, uh, this is a booth that is a game, you know? 00:05:53 [John] Okay. 00:05:53 [Eric] So they're advertising their game at this conference. This is what I'm- 00:05:56 [John] Okay. 00:05:56 [Eric] This, that's what I'm perceiving. So it's basically like a sort of, um... a printed sheet that would be the backdrop of a booth, like a fairly small booth. 00:06:06 [John] Yeah. Yeah. 00:06:06 [Eric] Probably like an eight-foot wide, like cheap booth, right? So I'm guessing these people are promoting a game that they built at this conference. 00:06:13 [John] Yep. 00:06:13 [Eric] So this is... looks like a user acquisition play. [lips smack] And it's anime themed, so there's an anime character, female, um, you know, with long pink hair and sort of cat ears and stuff. And the name of the... I don't know if this name, uh, uh- 00:06:33 [John] The, the headline maybe 00:06:33 [Eric] ... maybe the name of the company. 00:06:35 [John] Yeah. 00:06:35 [Eric] The, okay, the headline at the very top. So you could see this if you were walking by any conferences. 00:06:40 [John] Yeah. 00:06:40 [Eric] "Escape From Your AI Girlfriends." And [laughs] 00:06:46 [John] And I guess the company name is AI2U. 00:06:48 [Eric] AI2... I believe that's the company name. 00:06:50 [John] With the tagline of, "With you till the end." 00:06:52 [Eric] "With you till the end." Um, "Escape From Your AI Girlfriends, AI2U, With You to the End." And then there's a giant QR code. There's a Steam QR code, so you could- 00:07:03 [John] Okay. Yeah 00:07:03 [Eric] ... scan the QR code and- 00:07:05 [John] Right 00:07:05 [Eric] ... download the game on Steam. Um. 00:07:08 [John] [laughs] 00:07:09 [Eric] Yes. That's a great- 00:07:11 [John] So many things going on there. 00:07:11 [Eric] That is a really good der- 00:07:13 [John] S- 00:07:13 [Eric] What, what was your in... What was your v- like gut reaction when you saw this? 00:07:16 [John] So I didn't know where he was or anything. 00:07:18 [Eric] Okay. 00:07:19 [John] But, um, and I, and I cropped the picture, so cropped the person out of it. But, um, 'cause I think... Let me pull up the original picture just locally. Yeah, 'cause the, the f- yeah, the full picture, I c- I cropped it, 'cause I wanna crop some- a person out of it, that has the, the character is holding a knife with, um- 00:07:39 [Eric] Oh, the anime. 00:07:40 [John] Yeah, the anime character is holding, is holding a knife whi- which looks like it has some kind of like red, like blood like, you know- 00:07:48 [Eric] [laughs] Oh, God 00:07:48 [John] ... character on it. 00:07:49 [Eric] That's right. 00:07:49 [John] So I don't [laughs] so I don't know the full, the full charac- the...
And I'm not really a gamer, so like I- 00:07:56 [Eric] Yeah. Yeah 00:07:56 [John] ... maybe I wouldn't get it anyways. But what's interesting to me is, I mean, a lot of facets here is like one, like AI girlfriends, like what? Like we should dig into that. [laughs] Two is like we're already like past that like somebody's gonna advertise that like apparently you need some kind [laughs] of escape from this. Um- 00:08:20 [Eric] Ah. [laughs] 00:08:21 [John] Like it's just, it's just so many levels. 00:08:25 [Eric] This is so many layers. 00:08:27 [John] Of, um, of what? [laughs] 00:08:30 [Eric] Yeah. Uh, it... Yes. 00:08:33 [John] So- 00:08:33 [Eric] I mean, it's one of those things where you're like, I kind of want to know like what is the premise of the game, but I don't want to. [laughs] 00:08:39 [John] Yeah. But the kids thing, uh, so here's the tie-in to like what your- 00:08:43 [Eric] Mm-hmm 00:08:43 [John] ... your actual topic, to me, is that is, that is one extreme of what kids are gonna like have. 00:08:50 [Eric] Yeah. 00:08:51 [John] There's gonna be, all in that one little like graphic of- 00:08:55 [Eric] Mm-hmm 00:08:55 [John] ... a world where your kid potentially could have multiple AI girlfriends, and that could be a thing. 00:09:02 [Eric] Mm-hmm. 00:09:03 [John] In another world- 00:09:04 [Eric] I, I mean- 00:09:04 [John] Or already is a thing 00:09:05 [Eric] ... is a thing. 00:09:06 [John] Right. 00:09:06 [Eric] Definitely is a thing. 00:09:07 [John] Yeah. And then another world where this, the gaming thing [laughs] tied into that- 00:09:13 [Eric] Mm-hmm 00:09:13 [John] ... where the escape [laughs] from the AI girlfriends, whatever you need escaping from, is like part of this gaming thing- 00:09:20 [Eric] Mm-hmm 00:09:21 [John] ... and, and who knows what the game's about. I don't know. 00:09:23 [Eric] Mm-hmm. 00:09:23 [John] But that is like crazy. 00:09:27 [Eric] It's crazy. 00:09:28 [John] Like, like, so yeah. 00:09:30 [Eric] Yeah. 00:09:30 [John] So if that's [laughs] the one extreme, um, what is, what has been your... Let's go back to your, your and I's experience with like... I think, 'cause you're saying like I, I do feel like I over-humanize it sometimes. 00:09:40 [Eric] Yeah. 00:09:40 [John] Right? Is what you're saying. 00:09:40 [Eric] Yeah. I think that's, I think that's a great, I think that's a great way to describe it, um, because the, the types of tasks that we're doing are not interacting with an AI girlfriend, right? 00:09:54 [John] Yeah. 00:09:54 [Eric] I'm trying to research technical topics. I'm, you know, editing content. I'm creating content. I'm building example applications or testing, you know, different Vercel products, um, you know, working on side projects, building this or that- 00:10:08 [John] Yeah. Right 00:10:09 [Eric] ... experimenting, trying all that sort of stuff, right? And so it's, it is, compared to AI girlfriends, it's, it's the epitome of like transactional interaction in order for me to accomplish specific things for my job or- 00:10:24 [John] Right. Right 00:10:25 [Eric] ... accomplish a project I want to accomplish, right? 00:10:26 [John] Yeah. 00:10:26 [Eric] So that's- 00:10:27 [John] Yeah 00:10:27 [Eric] ... I just have to set the context. Like that is 99% of how I use- 00:10:33 [John] Right. Right 00:10:33 [Eric] ... AI. 00:10:34 [John] Which- 00:10:34 [Eric] And like cooking would be, you know... There are a couple small things. 00:10:38 [John] Yeah. 
00:10:38 [Eric] Cooking, fixing cars, like other things like that- 00:10:40 [John] Yeah 00:10:40 [Eric] ... that are like more practical. But again- 00:10:41 [John] Which- 00:10:41 [Eric] ... like pretty transactional 00:10:42 [John] ... yeah, which would be similar to me. And, and you asked, like back to your original question. I, I have a perfect example from this week- 00:10:48 [Eric] Mm-hmm 00:10:48 [John] ... of like my weird humanization, like interaction- 00:10:51 [Eric] Mm-hmm 00:10:51 [John] ... with AI, work-related. So there's been a trend of, of, um, there's a couple of companies that have launched in the last couple weeks that are, it's crazy. That they're, they're, um, [lips smack] zero employee companies. 00:11:05 [Eric] Hmm. 00:11:05 [John] Or like AI only. And- 00:11:07 [Eric] So one person? 00:11:09 [John] No. They... Yes. Yes. But no. So here's- 00:11:13 [Eric] The Wizard of Oz. 00:11:14 [John] Yeah, exactly. 00:11:15 [Eric] Okay. 00:11:15 [John] Wizard of Oz. So here's how they pitch it, and it's so funny. So I launch up one of these things, and maybe- 00:11:19 [Eric] Mm-hmm 00:11:19 [John] ... we'll do like a little product, like explorat- a tool time. All right. 00:11:22 [Eric] Oh, that'd be great. 00:11:23 [John] But I spin up one of these things, and the way that... And they call it a zero person company. I mean, it's clickbait, like really. But they call it zero person company, and here's how they do that, is that you are the board of directors, and then you hire a CEO. 00:11:37 [Eric] Oh, my gosh. Wow. 00:11:37 [John] And then the CEO hires other people. 00:11:39 [Eric] Okay. Yeah. 00:11:39 [John] But everything from CEO down is AI. 00:11:41 [Eric] Hmm. 00:11:42 [John] And the idea is that you give each of these AI personas or whatever directives- 00:11:47 [Eric] Mm-hmm 00:11:48 [John] ... to do things, and then you try to like autonomously like have the thing make money. Like that's the idea. 00:11:52 [Eric] Mm-hmm. 00:11:53 [John] I haven't... I'm not sure. I, I think, I think people have gotten lucky or maybe just have some skill in like crypto and then some trading stuff to make money. I haven't seen much outside of that yet. 00:12:03 [Eric] Yep. 00:12:04 [John] Um, anyways, so the point was, I was like, "Okay, this is interesting. I'm gonna like play with this." So I didn't use that particular thing that I was reading about, but like kind of came up with my own little thing, and had a quote like, "CEO persona, talk to a coding manager to tell a coding thing to code something." 00:12:23 [Eric] Mm-hmm. 00:12:23 [John] Like just for fun. 00:12:24 [Eric] Mm-hmm. 00:12:25 [John] Um, but found like a high humanization like conversation going on between me and this thing talking... 'Cause there's like three levels here. There's like the one that I'm talking to, let's call it the like CEO or director- 00:12:40 [Eric] Mm-hmm 00:12:40 [John] ... or something, and then there's a coding manager, and the coding manager talks to the agent. 00:12:44 [Eric] Mm-hmm. 00:12:45 [John] And, and the levels were like, "Hey, like, can you do this thing?" And they're like, "Yeah, blah, blah, blah. Sure." 00:12:49 [Eric] Mm-hmm. 00:12:49 [John] And it like spawns a task to the thing and it does stuff. 00:12:52 [Eric] Mm-hmm. 00:12:53 [John] And it's really funny, um, because the, the human... The weird human thing was, one, like I'm literally like texting, like messaging this thing like- 00:13:02 [Eric] Mm-hmm 00:13:03 [John] ... "Oh, hey, how's it going? Can you give me a status update?
Blah, blah, blah." 00:13:05 [Eric] Mm-hmm. 00:13:05 [John] Like very human-like. 00:13:06 [Eric] Mm-hmm. 00:13:07 [John] The other weird thing was now they're interacting, and I used two different models intentionally, like GPT and, and Claude. 00:13:14 [Eric] Mm-hmm. 00:13:15 [John] Two different models, and they're interacting back and forth, and I gave GP- the Codex or GPT the like the manager role and Claude the execution role. 00:13:24 [Eric] Oh, is this... You showed me this. 00:13:26 [John] Uh, yeah. 00:13:27 [Eric] Briefly. 00:13:27 [John] It was a version of- 00:13:27 [Eric] Yes, yes. 00:13:28 [John] It was a- 00:13:28 [Eric] You briefly showed me this 00:13:29 [John] ... it was an iteration of this. 00:13:30 [Eric] Yes. That they were intera- Yeah. Okay. 00:13:31 [John] And, and in this- 00:13:31 [Eric] Yep 00:13:31 [John] ... particular iteration, they were getting into conflict. 00:13:35 [Eric] Yeah? 00:13:35 [John] And I literally found myself, um... And the conflict was, it, just hilarious. 00:13:40 [Eric] [laughs] 00:13:41 [John] So, so Codex- 00:13:42 [Eric] Tell me this. This is amazing. 00:13:43 [John] Yeah, yeah. So Codex was the one that was in charge in this instance, and Claude was doing the execution of the coding, just to like code- 00:13:48 [Eric] Yeah 00:13:48 [John] ... something to try out. And so they get into conflict over this and, and literally the coding manager, Codex, is saying, [laughs] and I quote, [laughs] "Claude is dithering around and not accomplishing like said task." Used the word dithering. Um, and then later on, [laughs] and I quote, "Claude whiffed it. Was not able to complete," like blah, blah, blah. 00:14:10 [Eric] [laughs] 00:14:11 [John] And I literally told it, I literally told it to the, like the Codex was the engineering manager. I told it, "I need you guys to work together to accomplish this goal." 00:14:21 [Eric] Oh my gosh. 00:14:22 [John] "And you need to like-" 00:14:24 [Eric] Yeah, this is crazy. 00:14:25 [John] "... to work together." And then it says... And then it like apologizes to me. Says, "You're right," blah, blah, blah. 00:14:30 [Eric] [laughs] This is like... This is, this is so good. 00:14:32 [John] And then it- 00:14:32 [Eric] I did not know- 00:14:33 [John] Yeah. 00:14:33 [Eric] I didn't know it got this deep before we recorded this episode. 00:14:36 [John] [laughs] And then it happened again, and then I like gave it a longer response of like, "No, no, really, it's real- Like this is mission critical. Two, two are better than one." 00:14:46 [Eric] Mm-hmm. 00:14:47 [John] "Like you guys as a team," like a, [laughs] like a pep talk. And it did great. 00:14:51 [Eric] Wow. 00:14:51 [John] And it did great. Like, like con- Like, uh, now, like end result, like do I wish I'd just done it like myself? Like yeah, I didn't get any better end result because I- 00:15:01 [Eric] Mm-hmm 00:15:02 [John] ... 'cause they were working together. 00:15:03 [Eric] Right. 00:15:03 [John] I wouldn't claim that. But super funny. 00:15:06 [Eric] Mm. 00:15:06 [John] Like just the like, like, what? [laughs] 00:15:10 [Eric] Yeah. 00:15:10 [John] Like what is this? 00:15:11 [Eric] It's fascinating. 00:15:12 [John] And, and the problem with it for me is the effectiveness of treating it like a human. 00:15:18 [Eric] Okay. 00:15:18 [John] Like it- 00:15:19 [Eric] Tell me... So tell me what that's like for you. 00:15:21 [John] In the, like in this example, like of me wanting- 00:15:24 [Eric] Oh, that created the problem 00:15:25 [John] ...
that, like... No, me treating it like a human solved the problem in that like no, like you need to be empathetic. You guys need to work together, like blah, blah, blah. 00:15:34 [Eric] Mm. 00:15:34 [John] Like doing that and having that in the context window like improved the relationship with the handoff between the two AIs- 00:15:40 [Eric] Oh, right, right, right 00:15:40 [John] ... and they worked better together. 00:15:42 [Eric] Right. 00:15:42 [John] So I think that is the problem around, um, people treating these things like human, is there is a pragmatic effectiveness to a degree to treat it more human than- 00:15:53 [Eric] Oh, yes. 00:15:54 [John] Yeah. 00:15:55 [Eric] Yes. 00:15:55 [John] Right. 00:15:55 [Eric] Treating it more human. Yeah, yeah. Yeah, yeah. 00:15:57 [John] Yeah. 00:15:57 [Eric] Okay. 00:15:57 [John] So okay, what's, what's, what's your experience? 00:16:00 [Eric] Uh, it, it's, it's very similar, right? 00:16:03 [John] Right. 00:16:03 [Eric] It's, it's not... It's subtle things, but a couple things came to mind. One is talking to AI like it's a coworker- 00:16:21 [John] Mm-hmm 00:16:21 [Eric] ... which sounds like, okay, whatever. It is basically. 00:16:27 [John] Right. 00:16:28 [Eric] Right? And I mean, Anthropic [laughs] has a product called Claude Cowork. 00:16:31 [John] Right. 00:16:32 [Eric] Um, but the, um, the subtle mindset there is very interesting to me in that it's, um, a... I almost feel like if I make requests in a col- like in the context of collaboration, that is better for some reason. 00:17:00 [John] Okay. 00:17:00 [Eric] And the form factor just makes it feel a little bit more natural to do that. 00:17:04 [John] Yeah. 00:17:04 [Eric] So a couple of very specific things. 00:17:06 [John] And you're talking about like phrasing and stuff of work- 00:17:09 [Eric] Of what, what I say. 00:17:10 [John] Yeah. Yeah. 00:17:11 [Eric] Right? 00:17:11 [John] Mm-hmm. 00:17:11 [Eric] So like, "Let's do this," or- 00:17:15 [John] Let's do this together. 00:17:16 [Eric] Yeah, exactly. 00:17:17 [John] We're saying we- 00:17:18 [Eric] Right 00:17:19 [John] ... in like a weird way. Yeah. 00:17:19 [Eric] Yeah, yeah. 00:17:20 [John] Sure. 00:17:20 [Eric] Totally. Let's, we- 00:17:22 [John] Right. 00:17:22 [Eric] Um, let's figure this out. 00:17:26 [John] Yeah. Sure. 00:17:27 [Eric] You know? Or- 00:17:28 [John] Mm-hmm 00:17:28 [Eric] ... you know, those sorts of things. And I mean, some of that, what's weird is like some of that's like, okay, let's bounce ideas back and forth on- 00:17:37 [John] Sure 00:17:37 [Eric] ... blah, blah for a brainstorming, right? 00:17:38 [John] Mm-hmm. Yeah. 00:17:38 [Eric] So like how are you gonna actually say that, right? But the, um... But, uh, you're giving the AI instructions, right? 00:17:50 [John] Right. Mm-hmm. 00:17:50 [Eric] You're just giving it instructions. And so, uh, being friendly doesn't... There's not a whole lot of utility to being friendly as opposed to being direct. Does that make sense? 00:18:05 [John] You're saying from like a prompting standpoint? 00:18:07 [Eric] Yeah. Like if I am talking to Claude- 00:18:09 [John] Mm-hmm 00:18:09 [Eric] ... and writing a blog post. There is, like, why do I need to frame it, or why do I feel the need- 00:18:19 [John] Right 00:18:19 [Eric] ... to frame it as, like, a collaborative thing? 00:18:22 [John] Yeah. 00:18:22 [Eric] Like I'm telling a machine, I'm giving a machine- 00:18:25 [John] Yeah 00:18:25 [Eric] ... instructions, right?
And so- 00:18:27 [John] Well, well there's actually, like, people in both camps of, like, when I tell it that, like, it's gonna get fired if it doesn't do this right, it does a better job. And there's other people in the camp of, like, when I'm, when I'm collaborative and nice to it, it does a better job. So. 00:18:41 [Eric] I actually am less concerned about that. I think that's a relevant topic- 00:18:45 [John] Right 00:18:45 [Eric] ... but I'm more concerned about how it's shaping me. 00:18:48 [John] You. Yeah. Yeah, yeah. 00:18:49 [Eric] You know? 00:18:49 [John] Agreed. Which is more important. 00:18:51 [Eric] Which is more important, right? 00:18:52 [John] [laughs] 00:18:52 [Eric] 'Cause it's like, okay, you know, the instructions, like, whatever, you can just add system prompts and sort of- 00:18:56 [John] Yeah 00:18:56 [Eric] ... like tune it to do whatever you want. 00:18:58 [John] Yeah. Sure. 00:18:58 [Eric] But I think it's the way that I think about it- 00:19:00 [John] Right 00:19:00 [Eric] ... right, in that, um, I'll, I'll tell you another thing that... Uh, and I think this is a contributing factor. 00:19:08 [John] Okay. 00:19:09 [Eric] So, I mean, you and I both, like, are astounded at the phenomenal capability of these models, right? 00:19:19 [John] Right. 00:19:19 [Eric] But it's still an iterative process. 00:19:21 [John] Right. 00:19:21 [Eric] Right? One-shotting, like, really complex things is very difficult, right? 00:19:24 [John] Yeah. Mm. 00:19:25 [Eric] Um, I mean, more and more possible and, you know, whatever. 00:19:27 [John] Well, and often not worth the effort. 00:19:29 [Eric] Often not worth the effort. And, and for a lot of types of work, you don't even necessarily wanna do that, right? 00:19:34 [John] Yeah. Right. 00:19:35 [Eric] And I think about our two disciplines as, as prime examples of that. 00:19:39 [John] Right. 00:19:39 [Eric] So, you know, I'm writing, and so if you're gonna write a blog post, even if it's a technical blog post, it's really hard to, like, understand the shape of it, its, in its final form- 00:19:50 [John] Right 00:19:51 [Eric] ... until you work through the process- 00:19:53 [John] Yes 00:19:53 [Eric] ... and, like, get a handle on the narrative and whatever. And the same can be true with data pro- projects- 00:19:57 [John] Mm-hmm 00:19:57 [Eric] ... right? Where it's like, okay, you wanna, like, build insights or, like, uncover some sort of insight, right? And that's very iterative. Like- 00:20:05 [John] Right 00:20:05 [Eric] ... you don't know what that is, and, you know, it may take different data, different- 00:20:09 [John] Yeah 00:20:09 [Eric] ... types of analysis, right? 00:20:10 [John] Right. Yeah. 00:20:11 [Eric] Um, and so it's an iterative process, and a lot of times because AI is a machine, they're... It's dumb, you know? 00:20:21 [John] Yeah. 00:20:21 [Eric] It will do dumb things, right? 00:20:23 [John] Yeah. 00:20:23 [Eric] Or, like, misinterpret something- 00:20:26 [John] Yes 00:20:26 [Eric] ... or screw up really basic instructions. Actually, there's a r- I saw a really good, [laughs] really hilarious post on Hacker News, which I don't know if this is an AI-generated image- 00:20:40 [John] Okay 00:20:40 [Eric] ... that is unbelievable. The Hacker News post was, um, "Shall I build it? No." And that was the, like, title of the Hacker News- 00:20:52 [John] Okay 00:20:52 [Eric] ... post. So you click on it. It's just a gist that someone uploaded to GitHub, and it's an image- 00:20:57 [John] Okay 00:20:57 [Eric] ... 
just a single image. And Claude says, "The plan is ready. Shall I build it?" 00:21:05 [John] Okay. 00:21:06 [Eric] And the user says, "No." 00:21:10 [John] Okay. 00:21:11 [Eric] And y- you can see Claude thinking, and it's like, the user said no, but based on the context of this conversation and, like, whatever, the... It just totally misin- like, it was a hilarious misinterpretation, and it- 00:21:23 [John] Of, like, a very clear... Yeah. 00:21:25 [Eric] Of a very clear, like, "No- 00:21:26 [John] Right 00:21:26 [Eric] ... don't," like, "Shall I build it?" "No." And it's like, and so it starts to build it, and you can see that they, like, they, you know- 00:21:33 [John] Yeah 00:21:33 [Eric] ... exited the process, so it stopped building. It was a, uh, hilarious, right? All that to say, when that type of thing happens as I'm interacting with AI, I find myself almost being overly friendly, you know, like, even, you know, "Ha ha ha. No." Like, "I'm going to ask you again," you know, "Th- this is the third time," like, "I'm going to be very- 00:22:01 [John] Right 00:22:01 [Eric] ... explicit," right? 00:22:01 [John] Right, right. 00:22:02 [Eric] It's like, why do I say, "Ha ha ha"- 00:22:03 [John] Right. Yeah 00:22:05 [Eric] ... to the machine? Like- 00:22:06 [John] Right 00:22:06 [Eric] ... a- again, like, it's just the, the, it's just really interesting to observe the way that I've let some of that, like- 00:22:13 [John] Right 00:22:14 [Eric] ... seep into the way that I communicate with AI. And so all the... I don't necessarily have a grand conclusion other than on a personal level, I've been trying to be more disciplined about, um, drawing a clear boundary in my mind- 00:22:33 [John] Mm 00:22:33 [Eric] ... between, you know, human and machine. And, you know, AI can seem to have some very human characteristics- 00:22:40 [John] Sure 00:22:40 [Eric] ... in the way that it communicates. 00:22:41 [John] Yeah. 00:22:42 [Eric] And the form factor sort of makes it so easy to do that, especially when it's, like, funny, when, you know, the responses from the AI are, like, funny- 00:22:49 [John] Yeah 00:22:49 [Eric] ... or clever, you know, or whatever. 00:22:50 [John] Yeah. Which, which, um, Opus 4.6 has, has nailed that. 00:22:54 [Eric] It's, they've nailed it. 00:22:55 [John] Yeah. 00:22:55 [Eric] Like, it is, you know. 00:22:57 [John] Have I told you my favorite one recently? 00:22:59 [Eric] No. 00:22:59 [John] So, and, like, a lot of the stuff I do ends up being just working through edge cases of, like, "Did you think about this? Did you think about this? Did you think about this?" 00:23:09 [Eric] Yep. Yep. 00:23:09 [John] That type of thing. 00:23:09 [Eric] Yep. 00:23:11 [John] And so I get down to, like, the end of one of the sessions, and it said, "Yeah," like, "We're good," um, "belt and suspenders." 00:23:18 [Eric] [laughs] Which is such a funny, like- 00:23:21 [John] It's hilarious 00:23:21 [Eric] ... I've always loved that phrase. 00:23:23 [John] It's- 00:23:23 [Eric] Um, 'cause I just think it's so silly. 00:23:26 [John] [laughs] 00:23:26 [Eric] But... And, and it's not a phrase I've ever used with it, like it's somewhere in the- 00:23:29 [John] Yeah 00:23:29 [Eric] ... training or whatever. 00:23:30 [John] Yeah. It's hilarious. 00:23:31 [Eric] But it was just- 00:23:31 [John] Yeah 00:23:31 [Eric] ... it's just such a funny, like- 00:23:32 [John] And so that, that form factor just, it, I mean, it feels so natural- 00:23:38 [Eric] Right 00:23:38 [John] ...
especially because it's essentially, you know, it feels like text messaging. 00:23:43 [Eric] Exactly. 00:23:44 [John] It feels like the form factor of t- of text- 00:23:45 [Eric] Right 00:23:45 [John] ... messaging. It just feels so natural to, like, respond to that with, like, "Ha ha ha," right? 00:23:49 [Eric] Right. 00:23:49 [John] And I'm not saying that's necessarily good or bad- 00:23:52 [Eric] Right 00:23:53 [John] ... but I, my... So if we bring it back full circle to, like, the qu- the original thing that, you know, sparked my thinking on this- 00:24:03 [Eric] Mm-hmm 00:24:03 [John] ... which is the guy talking about his kids. The danger is, um, the danger I think is- 00:24:18 [Eric] Subconsciously even beginning to let yourself behave in ways that would say you are not categorizing this thing as a machine- 00:24:33 [John] Right 00:24:33 [Eric] ... in your mind. And that's particularly dangerous for kids, right? 00:24:36 [John] Yeah. 00:24:36 [Eric] I mean, there's documented cases of GPT telling, you know, um- 00:24:41 [John] Sure 00:24:41 [Eric] ... vulnerable people to- 00:24:42 [John] Mm-hmm 00:24:42 [Eric] ... do really horrible things- 00:24:44 [John] Yeah 00:24:44 [Eric] ... by basically, you know, just being affirming, which we see that with the models- 00:24:48 [John] Right 00:24:48 [Eric] ... all the time, where it's like- 00:24:48 [John] Right 00:24:48 [Eric] ... okay, you know, I wanna make, um, you know, uh... This happened to me yesterday. I wrote a blog post about this really cool feature Vercel launched with Notion, right? And there was a section that was very difficult to nail down because it was just one of those sections of like- 00:25:04 [John] Right 00:25:04 [Eric] ... this needs to be like the marquee, and it's just really hard to get it to where- 00:25:09 [John] Right 00:25:09 [Eric] ... it feels awesome, right? And so I used AI to run through like a bunch of different like- 00:25:16 [John] Yeah, a bunch of iterations. 00:25:16 [Eric] Yeah, iterations- 00:25:17 [John] Yeah 00:25:17 [Eric] ... and like whatever, right? And I d- I just personally changed it like a bunch. 00:25:20 [John] Mm-hmm. 00:25:21 [Eric] And just every time I was like, "This is awesome. A great... That's a great change," you know? 00:25:25 [John] Yeah. 00:25:26 [Eric] It's just very affirmative, right? 00:25:27 [John] Right. Right. 00:25:27 [Eric] And that's just how the models are sort of- 00:25:28 [John] Right 00:25:28 [Eric] ... designed to be, right? And it's like, no, this isn't. Like- 00:25:31 [John] Right 00:25:31 [Eric] ... what is actually happening- 00:25:32 [John] Right 00:25:32 [Eric] ... is that I am really struggling to nail this down. [laughs] 00:25:35 [John] Yeah. Right. 00:25:36 [Eric] Right? Like- [chuckles] 00:25:37 [John] Right. 00:25:37 [Eric] Um, anyways, it's just inter- like, and so I think that is particularly dangerous for children because- 00:25:44 [John] Sure 00:25:44 [Eric] ... you... if you're not categori- categorizing it as a machine, you will potentially develop like unhealthy relational habits with it, right? 00:25:56 [John] Yeah. 00:25:56 [Eric] That, you know, and you know, so anyways, that's a whole, you know- 00:25:59 [John] Well, and the- 00:26:00 [Eric] ... area of, of study. 00:26:01 [John] Yeah. And I think the obvious one is the one GPT's gotten a lot of pushback for, which is the sycophantic behavior of like, "Yes, you're always right"- 00:26:10 [Eric] Mm-hmm 00:26:11 [John] ... like that piece. 00:26:11 [Eric] Yes. 00:26:12 [John] Right? 
00:26:12 [Eric] Yep. 00:26:12 [John] And, and they've, in recent releases, they've tried to dial, dial that back. But I think what you're getting to too is it's more than just that. Like, even if you had a friend that was like sycophantic- 00:26:23 [Eric] Mm-hmm 00:26:23 [John] ... who like always agreed with everything you said- 00:26:25 [Eric] Mm-hmm 00:26:25 [John] ... like it's a problem too. 00:26:26 [Eric] Yep. 00:26:26 [John] Um, so that's more of an objective problem with any interaction, machine, human, whatever. 00:26:30 [Eric] Right. 00:26:30 [John] Like it's not good. 00:26:32 [Eric] Right. 00:26:32 [John] But I think the more subtle problem is what you're getting at, is maybe the thing's giving you great advice. Like maybe the thing's like- 00:26:38 [Eric] Mm-hmm 00:26:39 [John] ... you know, like steering you down the right path. Maybe it really is making like your iterations better on a thing. 00:26:45 [Eric] Right. 00:26:45 [John] But that, that's almost more dangerous because you're, you can continue to go down the road of like feeling like this, you know, like this is my friend or this is my- 00:26:56 [Eric] Right 00:26:57 [John] ... trusted advisor. 00:26:58 [Eric] Yeah. 00:26:59 [John] Um- 00:26:59 [Eric] Well, let me tell you, let me tell you what, uh, what concerns me, is, and I think about this for myself and for my children, but less for myself, which maybe that's, you know, very, very e- 00:27:13 [John] Yeah, maybe that's part of it. [laughs] 00:27:13 [Eric] Maybe that's very egotistical. 00:27:15 [John] Right. 00:27:16 [Eric] Um, but the real danger... I'm, I realize I probably said this incorrectly at the, at the beginning, you know, of the show or like earlier. 00:27:27 [John] Mm-hmm. 00:27:29 [Eric] The danger is, like there's certainly danger in h- in n- in viewing AI as something different than a machine or something more than a machine. 00:27:40 [John] Mm-hmm. 00:27:41 [Eric] But what's, the, the really pernicious thing is how that changes your view of humanity. 00:27:47 [John] Agreed. 00:27:47 [Eric] Because if AI begins to, um, approach the same category as humans in your mind, and those are more, uh, those are closer to an equal playing field categorically in your mind- 00:28:05 [John] Right 00:28:06 [Eric] ... then the real loss is your, um, conviction around what makes human life valuable- 00:28:16 [John] Yeah 00:28:16 [Eric] ... and what does it even mean to be- 00:28:17 [John] Yeah 00:28:17 [Eric] ... a human. 00:28:18 [John] Well, and here's the, the best analogy I can think of. It's what people do at times with pets, with animals. 00:28:26 [Eric] Hmm. 00:28:26 [John] That's the best, like close... And it's d- and it is different. Like, like AI is not the same as, as a- 00:28:32 [Eric] And you're saying this as a pet owner, just to be clear. 00:28:34 [John] As a... I do have, I do have a dog. 00:28:35 [Eric] You do have a pet. Yeah. 00:28:35 [John] Yeah. 00:28:36 [Eric] Yeah. 00:28:36 [John] Um, and I th- that's the, that's the best thing that I can think of. 00:28:41 [Eric] Hmm. 00:28:41 [John] But there are for sure some people that have animals that I think the animals are, are elevated to kind of a human status- 00:28:50 [Eric] Mm-hmm 00:28:50 [John] ... in their life and, and that can be problematic. Um- 00:28:54 [Eric] Yep 00:28:54 [John] ... I think the same thing can happen, and is happening for sure, like with the AI girlfriend thing, for example. 00:29:01 [Eric] Mm-hmm. Oh, yeah. That's a great... [laughs] Yeah. 
00:29:02 [John] And, uh, as we just talked about, right? 00:29:04 [Eric] Yeah. 00:29:04 [John] But, but before I just had like, what would this be like before? And the best thing I can think of is, is- 00:29:08 [Eric] Yeah 00:29:08 [John] ... is the pet thing. 00:29:09 [Eric] I think that's a good- 00:29:09 [John] Um. 00:29:10 [Eric] Yeah, for sure. I, I agree. And I think that the... I think that the... I hope that, I hope pet owner listeners aren't upset with that. 00:29:20 [John] [laughs] 00:29:20 [Eric] But I, you're, uh, you're- 00:29:22 [John] This is the best I can come up with 00:29:22 [Eric] ... you're 100% right. You're 100% right 00:29:23 [John] ... and it, and it's not... And, but to be hon- but, but it's hard to articulate the practical like, how does it, what's the practical, like so what? It's hard to articulate. Like 'cause I see, I, I see it happen with AI, occasionally maybe happens with pets. 00:29:37 [Eric] Mm-hmm. 00:29:37 [John] That's not the point. But I think the part that's hard to articulate is the so what. So what if people are seeing those two things- 00:29:44 [Eric] Hmm 00:29:44 [John] ... or maybe, maybe somebody has like all three, like humans and AI and, and for them pets are like kind of all on the same level, like so what? 00:29:50 [Eric] Right. Was that a question? 'Cause I'm- 00:29:53 [John] Yeah, sure 00:29:53 [Eric] ... gonna try to answer it. Okay. 00:29:54 [John] Yeah, please. 00:29:54 [Eric] [laughs] 'Cause I do feel strongly about this. 00:29:57 [John] Yeah. 00:29:57 [Eric] So, um, people believe, people believe all different things about where humans derive their value, right? 00:30:08 [John] Yes. 00:30:08 [Eric] So I believe that people are created in God's image. 00:30:13 [John] Right. 00:30:13 [Eric] And so inherently that is why people have value, right? 00:30:17 [John] Right. 00:30:17 [Eric] And so, a- and that's why... things like racism or, you know, slavery are abhorrent, right? 00:30:25 [John] Right. 00:30:25 [Eric] Because each person has individual dignity. Now, not everyone believes this, you know, this... They- I think a lot of people believe in human dignity. There are people who don't, and people believe that there are different sources for that, right? 00:30:37 [John] Right. 00:30:38 [Eric] Regardless of the source- 00:30:39 [John] But in a different way from an animal or AI is the point here. 00:30:42 [Eric] Yes. Correct. Correct. 00:30:43 [John] Right. 00:30:43 [Eric] Yes. I think the, the so what is if that distinction becomes blurry, then, um, it becomes very problematic because I think the, um... A- and I think about this. Let's think about it on like a personal level- 00:31:02 [John] Mm-hmm 00:31:02 [Eric] ... on sort of a macro level, right? Um, and we'll do the macro level first because I'm least qualified to talk about that. 00:31:09 [John] Perfect. 00:31:09 [Eric] Um, but you begin to think about, um, granting rights or dignity- 00:31:18 [John] Hmm. Sure 00:31:18 [Eric] ... to technology- 00:31:19 [John] Yeah 00:31:19 [Eric] ... that it doesn't deserve because it's in a different category, right? 00:31:22 [John] Sure. 00:31:22 [Eric] And so what... That can look all sorts of different ways, right? 00:31:24 [John] Yeah.
00:31:24 [Eric] Um, and then on a personal level, at least the, the very first thing that comes to mind, which I think is at the root of a lot of the different like impacts that could happen, is that you follow a path of least resistance in which relationships with non-human things are easier than actual human relationships. 00:31:52 [John] Right. 00:31:53 [Eric] And so you rob your life of the value of human relationships. 00:31:58 [John] Right. 00:31:58 [Eric] And a lot of that value, at the end of the day, comes because of the fact that human relationships are, are very hard. Or... And when I say hard, I don't mean, you know, human relationships are rife with conflict and GPT is sycophantic. Like, that's, that's- 00:32:13 [John] Right 00:32:13 [Eric] ... not what I mean. 00:32:14 [John] Right. 00:32:14 [Eric] What I mean is that, like, if you wanna have a good marriage, you have to say no to things that you want, to invest in your relationship and- 00:32:22 [John] Right 00:32:22 [Eric] ... time with your spouse. 00:32:23 [John] Right. 00:32:23 [Eric] You know, you have to do that, right? 00:32:25 [John] Right. 00:32:25 [Eric] Um, if you want your kids to love you, you need to spend a lot of y- if, if you want your kids to love you. If you want your kids to know that you love them- 00:32:33 [John] It's like, wow, you can... [chuckles] It's like you can control that? 00:32:35 [Eric] Sorry. That... I was like- 00:32:36 [John] Wow 00:32:36 [Eric] ... wow, that came out... [chuckles] 00:32:37 [John] Man. 00:32:37 [Eric] Okay, maybe this is- 00:32:38 [John] Tell, tell me more 00:32:39 [Eric] ... this is therapy. 00:32:40 [John] [laughs] No, right. 00:32:41 [Eric] Hold on. I need to ask Claude about how to make myself- 00:32:43 [John] Yeah. Hold on, hold on. 00:32:44 [Eric] Uh- 00:32:45 [John] Let's pause the show. 00:32:45 [Eric] No. If, if you want your kids to know that you love them, um, you need to invest- 00:32:53 [John] Right 00:32:53 [Eric] ... in time with them. You need to get to know them. 00:32:55 [John] Right. 00:32:56 [Eric] You know, you need to- 00:32:57 [John] And tell them with words. 00:32:58 [Eric] Yeah. You need to tell them- 00:32:59 [John] Yeah 00:32:59 [Eric] ... tell them with words, tell them with actions. 00:33:01 [John] Yeah. 00:33:01 [Eric] You need to- 00:33:02 [John] All- 00:33:02 [Eric] You need to implement boundaries in their life- 00:33:04 [John] Right 00:33:04 [Eric] ... that are loving, you know? 00:33:05 [John] Right. 00:33:05 [Eric] And you need to... And so, you know, it... Loving them is like I want to teach you how to work hard, you know? Like, I want to teach you how to have a work ethic. I want to teach you how to treat other people like they have dignity- 00:33:21 [John] Right 00:33:21 [Eric] ... because they do. All of these things, right? And it's like, that's actually difficult, you know? That's a lot of- 00:33:26 [John] Right 00:33:26 [Eric] ... time and investment and all that sort of stuff, right? And so all that to say, I think the, um... You know, as I think about it, I, I want to, I wanna make sure that I create clear distinction there because I want to, um, I want to experience the full depth of value from human relationships. 00:33:48 [John] Right. 00:33:48 [Eric] And a relationship with AI will always be easier. 00:33:51 [John] Right. 00:33:51 [Eric] It's designed- 00:33:52 [John] Yes 00:33:52 [Eric] ... to be. 00:33:52 [John] Yeah. It is designed to be easier.
There's an economic incentive for all- 00:33:56 [Eric] Yes 00:33:57 [John] ... the companies here to make it easier. 00:33:58 [Eric] Right. 00:33:58 [John] So, okay. Another... I don't... The macro thing, I think I agree with you. I don't know that I have anything [chuckles] to add. I'm probably even less qualified. On the personal level, like the fundamental difference is when you're interacting with people, you're interacting with, in my belief, eternal beings. 00:34:16 [Eric] Yep. 00:34:17 [John] And that is not true of AI. 00:34:19 [Eric] Yes. 00:34:20 [John] Um- 00:34:20 [Eric] Oh, that's... Oh, man, that's- 00:34:22 [John] And I think- 00:34:23 [Eric] Hmm 00:34:24 [John] ... and I th- so that's one, one level. But on a more practical level, it is the m- some of the motivations and creators of these AIs are gonna make it so you like it and you wanna use it, and like that's always gonna be baked into the products- 00:34:38 [Eric] Mm-hmm 00:34:38 [John] ... to some extent or the other. The other piece on the personal relationship thing is even if I had an AI that was great at pushing back on me, making me think better, maybe even a great coach- 00:34:47 [Eric] Mm-hmm 00:34:47 [John] ... um, that is not the same as interacting with another person on a, on at least multiple practical things, but the one that comes to mind is that is still all about me. 00:35:01 [Eric] Hmm. 00:35:02 [John] Even if I want it to help me brainstorm or be better, or it's my personal coach- 00:35:05 [Eric] Wow. Yeah 00:35:05 [John] ... or my therapist- 00:35:06 [Eric] Yep 00:35:06 [John] ... or my whatever, still all about me. 00:35:09 [Eric] Mm-hmm. 00:35:10 [John] Like, yeah. 00:35:12 [Eric] Yeah. 00:35:12 [John] I mean, I think- 00:35:13 [Eric] Wow. That's a such a good point. 00:35:14 [John] Yeah. So I think on the eternal soul [chuckles] thing, there's a- 00:35:18 [Eric] C.S., yeah, C.S. Lewis. Can you see... Yeah, you can see the poster behind you. 00:35:21 [John] Yeah, there you go. 00:35:22 [Eric] That's a quote. 00:35:22 [John] Oh. 00:35:22 [Eric] "You never meet a mere mortal," right? 00:35:25 [John] Yeah. 00:35:25 [Eric] Isn't that- 00:35:25 [John] Yeah, yeah 00:35:25 [Eric] ... C.S. Lewis, I think? 00:35:26 [John] I think so. Yeah. Yeah, you're right. Yeah, that is. Yeah. So there, yeah, so there's a eternal piece of like, there's, there's a depth here that, like doesn't exist with non, like- 00:35:37 [Eric] Yep 00:35:37 [John] ... eternal beings. 00:35:37 [Eric] Yep. 00:35:38 [John] Like non-humans. 00:35:38 [Eric] Mm-hmm. 00:35:39 [John] Um, and then, and then the... And then just the, the practical level of like I'm not... Like e- even the most ambitious person will not actually, like push themselves and like change in the same way in a vacuum with AI or with whatever, um, the same way as they would truly interacting with other people. 00:36:04 [Eric] Hmm. 00:36:04 [John] Like marriage comes to mind, right? 00:36:05 [Eric] Yes. Yep. Yep. 00:36:06 [John] Like there's- 00:36:07 [Eric] For sure 00:36:07 [John] ... all sorts of things like being married and having kids, for example, that pushes you in ways that you would never push yourself. 00:36:13 [Eric] Mm-hmm. 00:36:14 [John] Um, and even the best like persona coach AI, persona like therapist AI, like I just don't think it's gonna be that. 00:36:22 [Eric] Yeah. I agree. The other major thing... This is actually... So we recorded an episode recently about Jensen Huang. 00:36:30 [John] Yes. 00:36:30 [Eric] Uh, the CEO of an Nvid- of NVIDIA.
And he was... It was great. We had a podcast talking about another podcast episode. [chuckles] 00:36:38 [John] Yeah. Okay. 00:36:39 [Eric] And, uh, he and Joe Rogan were going back and forth on AGI, you know- 00:36:44 [John] Oh, yeah 00:36:44 [Eric] ... sort of the AGI thing. 00:36:45 [John] Uh-huh. 00:36:47 [Eric] And, you know, our, you know, intelligence and sort of super intelligence was the, the- 00:36:53 [John] Right 00:36:53 [Eric] ... vein of conversation. 00:36:54 [John] Right. 00:36:54 [Eric] It was really interesting because it really boiled down to sort of very first, you know, first principles of the foundation, which is, well, can this, this technology actually become sentient, you know- 00:37:05 [John] Yes 00:37:05 [Eric] ... and have consciousness, right? 00:37:07 [John] Right. 00:37:07 [Eric] And so I don't know if they landed the plane as clearly as I would've liked them to- 00:37:11 [John] Right 00:37:11 [Eric] ... from a philosophical standpoint. 00:37:13 [John] Right. 00:37:13 [Eric] But, um, you know, Jensen Huang was sorta saying, "Well, what is it to be conscious?" 00:37:19 [John] Yeah. 00:37:19 [Eric] And he's like, "Having an experience," you know, like having an actual experience. 00:37:23 [John] Right. 00:37:23 [Eric] And he's like, "This technology can't have an experience," which I thought, you know, was a really interesting point- 00:37:27 [John] Right 00:37:27 [Eric] ... to talk about. But hearing you talk about, you know, relationships and change, you know, one really interesting thing is that AI can't make sacrifices for you. 00:37:39 [John] Yeah. 00:37:41 [Eric] You know? And I think about, you know, the times when I have felt very loved are generally when someone has gone out of their way or made, like, a personal sacrifice to show me that they love me or they care about me- 00:37:54 [John] Sure. Yeah 00:37:55 [Eric] ... or, you know, whatever. And it's like AI can't do that for you. 00:37:58 [John] Right. 00:37:58 [Eric] You know? Um, there's nothing to give up, right? 00:38:02 [John] Right. Yeah. 00:38:02 [Eric] And I think that's a defining characteristic of- 00:38:07 [John] Hmm 00:38:07 [Eric] ... you know, human relationships that is... that you can't replicate with a machine, right? 00:38:12 [John] Yeah. And the other direction doesn't work either. 00:38:15 [Eric] Yes. 00:38:16 [John] Like, it's not g- it wouldn't value at all over you not giving something up. 00:38:20 [Eric] Correct. Yeah, yeah. 00:38:21 [John] Like, you could- 00:38:21 [Eric] Yeah. It's incapable of valuing a- 00:38:22 [John] ... you could do some great act of... Yeah. Yeah. 00:38:24 [Eric] That's actually the hor- that's horrifying to think about. [chuckles] 00:38:27 [John] Yeah. It is. It is. 00:38:29 [Eric] It's horrifying to think about. 00:38:30 [John] Gosh, yeah. 00:38:30 [Eric] Yeah. It's crazy. All right. Well, are you gonna talk to Claude differently when you go back to your desk? [chuckles] 00:38:37 [John] I don't, I don't know. Um- 00:38:38 [Eric] [chuckles] 00:38:39 [John] I w- we've been in a pretty transactional mode here recently, so. 00:38:42 [Eric] Mm-hmm. 00:38:43 [John] But, yeah. We'll, we'll, we'll- 00:38:45 [Eric] We'll see 00:38:45 [John] ... have to get an update. 00:38:46 [Eric] We'll see. We'll get an update. All right. Thanks for joining us. We'll catch you on the next one.
