J.D. (00:01.25) Welcome to another episode of AI Update, brought to you by InformAven. I'm J.D. Mosley-Matchit, the founder and CEO of InformAven. And our guest for this episode is Dr. Tim Dasey, the author of Wisdom Factories: AI, Games, and the Education of a Modern Worker, and his newest book, AI Wisdom, Volume 1: Metaprinciples of Thinking and Learning. Tim has 35 years of AI and learning science experience, most of it in leadership at MIT. And he helps educators and institutions think clearly about AI. Welcome to the podcast, Tim.

Tim Dasey (00:40.211) Thank you, JD. Pleased to be here.

J.D. (00:42.798) Now, you have training in both machine and human learning. Plus, you've spent more than 30 years developing AI and leading research and development groups at MIT Lincoln Laboratory. Then in 2022, you opened your own consulting and training business. That's quite a journey.

Tim Dasey (01:03.465) Well, I think one of the things I'm fortunate to have had in my career is a lot of variety. A lot of different kinds of problems and audiences to serve. And that, as I preach to students, allows you to adapt.

J.D. (01:20.43) This is true. This is very true. Okay, among many other concepts, your latest book explains how AI learning methods reveal better ways to structure the way we educate people. So let's start with this first question. There's a lot of anxiety regarding higher education and the ways in which it may be failing today's students. So can you give us your perspective about the current state of higher education administration with respect to the AI-related transformations that seem to be coming, whether the institutions are ready or not?

Tim Dasey (01:58.431) I should say, I tend to be rather direct. So one of the things I want to say up front is that institutions are having a lot of things thrown at them right now at every level. And so I'm not discounting the real issues that everybody has to deal with day to day.
But if I take an abstracted view of what we're seeing, it's really been, what, two...

J.D. (02:03.074) Okay.

Tim Dasey (02:26.425) It's almost three years since ChatGPT came out. And what I would say is that the vast majority of college administrators and leaders that I have interacted with, and that I also hear about through a network of other consultants, honestly are not knowledgeable about the AI issues. And as you say, there are a lot of emotional reactions. I mean, frankly, there are emotional reactions for me, right? I see a lot of colleagues who can't get jobs now, because they were coders or whatever. I see a lot of changes coming that I'm fairly scared about. But what I think the situation at colleges is now, at almost every level, when I talk to administrators, when I talk to the academics and professors, I just get a question: can you help explain this to me? Because I really don't know what's going on. There's a lot of misinformation, and they can't sort out who to believe. So a lot of what I try to do is create nuance where, maybe when I go into the situation, there are very black-and-white views of what to do. If it's a never-touch-AI response, or it's a good-for-everything response, that's just not where we want to be, right? It's a very case-by-case situation. And I think there is a certain sense that I've gotten, and this is true at the corporate level, but certainly at the college level, that leaders are kind of waiting for it all to settle down. And it isn't going to settle down. We're at the beginning stages of this industry, right?

J.D. (04:16.555) No.

Tim Dasey (04:21.483) And so, you know, one of the things I talk about with some of my professional college colleagues, some of whom are in the business of closing down colleges or helping them when they get into financial trouble, is that they come to me and say, look, these people are not operating these colleges like a business.
And, you know, the academic side of the house and the research side of the house, let them be ideologues, right? At some level, that's what we need them to be. But you're running a business on the administrative side. And if you're captive to all the concerns that the academic side has, which I think is what sometimes spills over into the attitudes, then you're really doing a disservice to your students and your institution by putting them in a more precarious spot. There are going to be new models entirely.

J.D. (04:55.468) Mm-hmm.

J.D. (05:00.002) Yes.

Tim Dasey (05:20.675) New, perhaps personless colleges, at least on the instructional side and perhaps on the administrative side. And that's not a 20-years-away prospect. That might be a five-years-away prospect. For the big, established, highly reputable and sought-after institutions, the in-person learning model is still going to have a market. But a lot of the smaller places that maybe are hanging on right now, if they don't transform, my fear is they'll fail. And whether that's before or after these other market entrants come is unclear, but there is going to be a major disruption in this industry.

J.D. (06:11.224) I quite agree with you. Well, certainly, generative AI in particular is a whirlwind of change. And higher education is famous for moving at a more deliberate pace. So can you provide some guidance for how institutions should effectively plan for these AI-related transformations when everything keeps changing so rapidly?

Tim Dasey (06:24.297) Mm-hmm.

Tim Dasey (06:37.393) Right. And this is where I like to relate how learning happens in any system, including our brain, and what those conditions need to be for an organization to learn and adapt. You just have to have a bunch of very simple basics. You have to have goals and mission, of course, and values to guide those. You have to have a learning mechanism.
Okay, so in our brain, that's the synapses changing as we learn. What is that in your organization? That's the learning mechanism. The next thing you need is, you know, andragogy or pedagogy, right? You need a process for the learning. That's true at the organizational level too. And one of the things that the technology industry really learned the hard way, over the course of decades, is that planning in a fast-changing world, which technology has long been, planning two or three years out and then starting a long march toward that goal, just doesn't work. In the software world, they call that the waterfall process, right? Instead, the industry has gone to what they call the Agile process. And the assumption of the Agile process, which I'm extrapolating here to institution-wide learning, is that you don't know what the future will be. You assume that any big bet you make is likely to be wrong, because things will change in the market, in the employee population, in the student population. What it does is make sure that you take a tangible step, get something working, even if it's a partial solution, get feedback from the right players, throw it out if it isn't working, correct, and move on. Highly iterative processes that make sure it's not just thinking, it's doing.

Tim Dasey (08:59.003) And that's true for any uncertain, complex world, where any strategy or plan you make might not survive. As they used to say, and I think it was General Eisenhower who said this, though I'm not sure of the attribution: no plan survives the first shot. Absolutely. And you will get the plan wrong if you have this episodic process where every two years you get together to restrategize.

J.D. (09:12.289) Hmm.

J.D. (09:18.67) Yes.
Tim Dasey (09:28.699) It needs to be every month, every year, and with an attitude that something will change this month, but we get to decide what that is. And you just start chipping away. Unlike a lot of the technology-related manifestations at colleges and universities in the past, where an IT department could set up some kind of system-wide benefit, I think here it's very different. A lot of the benefit comes from each individual at their desk doing things just a little differently. It's that accumulated benefit, not necessarily the enterprise function. It's having a really systemic understanding, not just of the technology, but of your job and how it fits in the organization, and the ways you might rejigger a process to do it differently. That, in my experience, and I've dealt with a lot of regimes and realms that don't change very much, so law enforcement, public health, the government itself, these are all entities that are not used to changing much, what ends up being the biggest impact is the attitude that you're going to change.

J.D. (10:42.414) Until recently.

Tim Dasey (10:55.571) With respect to AI right now, and I think we'll get into this in a bit, that attitude is tough to get through. There are a lot of emotional barriers. But we really have to. It can't be one of these things where you put your hands up, because there are really existential threats that you have to be careful about. Even with respect to academic integrity: if business doesn't believe that the students are going to your classes, or doing their homework, you've lost your entire credibility as an institution. And that may happen to the sector unless people are careful. For example, one thing I've been saying for a couple of years regarding policy is, if you have a remote student and much of your learning is done remotely, you better know that the student is actually there,
and it's not a fully interactive avatar that looks just like them. And right now we have no way of doing that. Those kinds of issues are existential, because if the business world stops believing that people are getting educated, they're either going to do the assessments themselves and not rely on your credential, or they're going to educate them themselves and just have them skip the college process. All of this stuff I'm putting out there sounds massive. And I think if there's one message, it's that you really do have to consider this a big deal. Everybody's having to deal with cost-cutting issues right now, and funding issues. And, you know, if you don't think AI is part of the solution to that, then you just haven't been paying enough attention.

J.D. (12:51.926) I quite agree with you. Now, we both agree that not everyone thinks that AI is a good thing for higher education. There are lots of polarized views and even misconceptions about AI that are making it difficult for higher education leaders to bridge all of the competing points of view. Is there a middle ground, and is there any way to constructively redirect

Tim Dasey (13:03.081) Mm-hmm.

J.D. (13:18.764) all of the energy that's being exerted against AI?

Tim Dasey (13:24.881) I believe there is. There are two paths to that, I think. One is that they're all correct, to some degree. So those that are worried about, will students learn? Is this replacing things that are core to being human? Those are all really valid concerns. But if they can't see that there are some times when it might be useful, maybe a service to students or employees that nobody has the time for and there's no budget to hire somebody to do, then we can't start that conversation. And if you're getting a salesperson from a tech company telling you it's the answer to everything, I can tell you that's not true. So it's somewhere in between.
The analogy I use is, it's a bit like somebody coming up to you and saying, we shouldn't use computers, they're bad. Or, we shouldn't use computers, they're good. Or electricity, it's bad. Yeah, sometimes, right? But that's not the right conversation. Or, is it thinking? Is it reasoning, really? Are we using the right terms?

J.D. (14:38.038) Mm-hmm. Mm-hmm.

Tim Dasey (14:50.459) Not the right conversation. It's fine for people to philosophize in their own regimes, but the question is, what is it we want to preserve? And what is it we want to change? Because we wanted to change it anyway. Most institutions have been trying to get their data shop in order for two decades now. Now you've got a technology that actually can,

J.D. (15:12.078) Hmm.

Tim Dasey (15:18.047) because it can write software for you. Maybe it can now start patching these systems together in a way that lets you look across the whole enterprise to do things like institutional research or financial analysis or learning analysis. You just couldn't do that before. The other thing I see is that the philosophical or emotional reactions tend to be very field- and domain-specific. I will hear a lot from the humanities community decrying this, but they don't seem to mind that the coding jobs are going away. I hear nary a mention of that. It's really about whether an essay or an email can be written instead of being a human expression. I look at it and say there's no simple

J.D. (15:56.888) Mm-hmm.

Tim Dasey (16:15.135) answer to these questions. What you need to do is develop an ethical framework by actually discussing individual important issues. Rather than have it be about all the voices battling each other, let's make it about some small solution. Even if that solution is, we're not going to change that piece right now.
So one way to do that: there are a couple of techniques that I think are somewhat useful, and you can do this for your own job or the whole organization. One is what a psychologist named Gary Klein described as a pre-mortem analysis. That is, you take a bunch of people who are doing strategic analysis or planning for you, and you say, okay, you folks assume that whatever we're trying to get done in five years actually worked out great.

Tim Dasey (17:14.003) And the rest of you assume that it didn't, that something went badly wrong. Now I want each of you to work backward and tell us what you think were the reasons it succeeded or failed. If it is to succeed, why will that happen? And if it is to fail, why will that happen? And when you do that analysis, I've never seen any group of people come back and say that the failure path meant they should just do nothing now. They're always saying, well, it's because we didn't do something. We weren't paying enough attention. We didn't get on it early enough. Our competition moved faster, whatever it might be. So sometimes it's just reframing that conversation and saying, okay, we don't want to get to a point where students cognitively offload to the technology. But we know that there are ways around that. There's one woman, her name is Michelle Kassorla, at a community college down in Georgia, who teaches English literature and writing, and she's doing amazingly creative stuff with her students using AI. And what she finds is that they come out scoring better, in the summative assessments, on all the same kinds of things she wanted them to score on to begin with. The same thing can be true on the business side, on the legal side. There just has to be every individual looking at their little space and saying, I want this job to survive. Be selfish. I want this job to survive.
But for that to happen, I'm going to have to expand the job to be able to do X, Y, and Z, and I need some help with automation to do the stuff that's routine. And we can get there. We can get there with these conversations. So a classic conversation that's happening right now is that there are a bunch of people saying, we're going to abstain from using AI because it uses power, and it's an increasing fraction of the power going to these data centers, et cetera. And what I would say to that is,

Tim Dasey (19:36.863) sure, but you haven't done a power budget for your whole life. You're actually still using more energy scrolling social media, right? But secondly, you can turn that concern into something actionable. Consider the way a college or university might implement its access to AI for students and employees. You could buy a block subscription that's $20 a month for each person, or something like that. Or you could have an interface that uses the API. What's the advantage of the API? The API charges by how much you use it. So there's a direct financial incentive for departments and schools not to overuse it, right? There's a tangible way to turn the concern into something real. For those that are worried about AI messing up learning, have them design the checks and balances, right? Have them design the evaluation frameworks.

J.D. (20:22.158) Mm-hmm.

Tim Dasey (20:36.329) So we can come to some agreement on when we see that it's not the right approach. But it's not a binary, right? Because at some level, the students are going to have to know this. I mean, I'll give an example. Think about when you get AI agents. AI agents are AI that will talk and act on their own. So it's not just that you're having a conversation to solve a problem with some AI; you're sending it off to go do something for you.
And when that happens, of course, you have to set boundaries, you have to monitor, you have to do all of these things. And once those agents start going out into the world and interacting with other agents, now you get a sort of AI society, where norms will start to emerge between these pieces of automation. So what does that mean for sociology? Well, maybe sociologists have a job boom coming, right? But that boom is to help support how you manage these AI societies. Okay? So we don't want to shut off the possibility that this might be a productive path. Similarly, it might be that your ability to recruit goes through a big transformation, because now I can go find people out there who might be prospective students, and I can customize the message and have a dialogue with them to make them feel welcome in this community in a way that I never could before. And you can get really embarrassed by doing that wrong. So we want the people that are highly critical of AI

J.D. (22:29.44) Yeah.

Tim Dasey (22:32.831) to be the folks that say, well, I'm going to do the testing to make sure this isn't going to embarrass us. You can employ the people who are the naysayers to be your guardrails and to help you form those.

J.D. (22:39.147) Makes sense.

J.D. (22:46.55) I like that. The collaboration concept. Okay, well, let's get practical. The job market seems totally unpredictable, and now we have more doubt than ever regarding the skills that we need to teach our students. So how should administrators deal with all of that?

Tim Dasey (22:50.047) Yeah.

Tim Dasey (23:06.823) Yeah, and I'm not going to get into the nuts and bolts here of teaching and all of that. But if we stay on the strategic side, let's pick an example. Over the last 10 years, say, nursing schools have gone through a huge boom. But they're expensive to set up. They require a lot of infrastructure.
They require relationships with medical facilities to do it right. It's complicated, and that's a bet you've got to make. That's at least a 10-year bet. Now, we have these jobs, and I can tell you nursing's not going away. But I have no idea what to tell you is coming. I do think there are some overall directions that schools should consider. One is that we may not want to characterize a lot of credentials by

J.D. (23:36.59) Mm-hmm.

J.D. (23:41.838) Yes.

J.D. (23:47.914) Mm-hmm. Right.

Tim Dasey (24:04.765) the knowledge domain or the exact industry that they're serving. Sure, you could have a bunch of medical degrees, but maybe you have somebody who's an expert in judgment and goes around and interacts with people in various industries to help them improve their ability to decide things better. And maybe they're using AI with that, or maybe not. But that's an example of what I would call a cross-cutting skill. Schools do have that now, in the form of things like philosophy majors, right? But the divisions for those are going to be different. I think a lot of schools have dealt with that by trying to create a lot of interdisciplinary connections. But the next step of that is really

J.D. (24:57.826) Yes.

Tim Dasey (25:00.593) to create different divisions. I think we'll see that more on the STEM side than anything else, because that's just changing so fast. If I'm a chemist, should I be teaching somebody to do certain things manually now? I was talking to the dean at a business school who was particularly worried about accounting. And I said, you should be worried about accounting, right? And the people in your organization who are doing accounting should worry about their jobs. But I don't know many of these schools that don't value these long-term people who have been there, and the institutional knowledge that they have.
They have just as much of an incentive to make you into something a little different. And my guess is that sometimes that little different is going to be painful, but sometimes it may put you in a better spot as a job. That's the kind of discussion to have with your college or university at the larger level: we don't want to take your job away, right? But right now, I don't know how to reach out to all those students I want to reach out to, and you don't have the time to put all those emails together. Or, I want to be able to have something crawl my learning management system and find out whether the teachers are implementing, let's say, a more experiential form of learning, if that was a strategic goal. Like, how are we doing? That's information spread over a whole institution that is just too much complexity for brains to handle. So there are just some things we want to do, and I'm saying "we" with the royal we here. There are some things you want to do that are just going to take

Tim Dasey (26:54.609) leaps of faith: that you have people who can adapt, that those people are still valuable to have, but we're not exactly sure what they'll be doing in a year or two. And in the meantime, maybe we get ourselves to a better financial spot, or we get ourselves to really stand out among the crowd. Because right now, if you are moving forward with AI, you're the unusual school. Most schools are still at the level of, yeah, we've written a policy down, but I don't even know how that's going, they might say. And I think a lot of that is just because, as I talked about at the beginning, there's not this learning crank, this adaptation machine, that's regularly part of administration.

J.D. (27:44.674) That's so true. So with the world changing so rapidly, it's pretty clear that the things that we teach our students have to adapt quickly in order to keep pace.
But curriculum development must be done thoughtfully and carefully. There have to be checks and balances to ensure that the changes that we make are productive. So what steps would you suggest institutions take to adjust their education programs more quickly?

Tim Dasey (28:14.687) I think there are two ways to answer that question. I mentioned one, which is that you might just start new credentials and new majors and all of that. That's the big push. But really, you have to figure out a way to meet them where they are. They don't care about teaching this class that you're dreaming up as much as, I've got to get the class together tomorrow and I have a bunch of work to do before that. And even at the provost and dean level, those are people who are just way too busy. So the question is, can you find something durable? That's really what I wrote my latest book about: to say, hey, there are some durable skills specifically related to AI. And because those durable skills, at root level, are about how things think and learn, you can relate them to any learning. Every class should be about thinking and learning. So one of the things I've taught is to look for what I would call systemic levers that maybe teachers wouldn't fight so much about. If I said to a school, okay, go ahead and teach everything the way you're teaching it now, but there's one class-wide project for the whole term that every class has, and that is: how am I going to teach this next time? You and the students work on what next time looks like. And they're learning about how learning works and how the choices about how you learn can manifest. And guess what? When you work with AI, you're teaching it, too.

Tim Dasey (30:09.511) It's like that exchange student that showed up in your class. They have a brain already.
But when you're interacting with it, you're teaching it other stuff about what's particular to your needs. So that's an example of a system-wide effort: we're just going to teach people meta-skills about learning. Okay? And I can apply that to whatever I'm teaching in a way that the individual teacher can take control of. But it has a crossover benefit to how we interact with the machine. So I would look for those sorts of opportunities where you can move the needle without necessarily having to convince everybody to use AI.

J.D. (30:49.014) I like that. Well, Tim, it's clear that you've spent a lot of time wrestling with the challenges that AI is presenting in higher education. And you've noted how important it is for our institutional leaders to focus on the deeper meta-principles of AI, rather than getting caught up in the complexity of rapid technological change. You've basically taken your 35 years of experience with artificial intelligence

Tim Dasey (31:11.038) Hmm.

J.D. (31:16.95) and made it available to administrators who probably feel a bit blindsided by this technology. Thank you so much for giving us so much to consider.

Tim Dasey (31:29.535) Thank you, JD. This is a valuable space, because I don't see a lot of others covering higher ed administration and what they can do. So I really appreciate it.

J.D. (31:40.888) Thanks. For more information about AI news and trends that are directly impacting administrators in higher education, please follow InformAven on LinkedIn and visit our website at informaven.ai.

J.D. (31:58.442) Okay. Now, is there anything that we didn't cover that you would like to cover?

Tim Dasey (32:05.021) Let's see.

J.D. (32:05.654) Because I can slot things in. Like I said, I've got the editing suite.

Tim Dasey (32:10.067) I mean, there are little things. For example, one of the things I usually preach when I talk to administration-level folks is resource management, right?
Where would I see more or less energy from an institution in the future? So I mentioned, for example, if you have a CTL, a center for teaching and learning,

J.D. (32:22.658) Mmm.

Tim Dasey (32:36.671) that's going to become more important in your transformation, right? Or if you have an IR, an institutional research, shop. A lot of times those are just pieces of people here or there, right? Well, I think that's going to have to be enough of a thing that you get the right kinds of information. Anyway, it's little things like that, but I'm not sure it's necessary. How are we on time? Was that about a half hour, something like that?

J.D. (32:45.774) Mm-hmm.

J.D. (32:57.196) I hear you. Yeah, and that's typically about the way it goes. And as I said, I break it down into the five questions that I throw out on LinkedIn, so that people can get a snippet about AI and higher education administration each week. So every Tuesday at two o'clock, I'll take one of the questions and throw it up there. But what I'll do first is take our full interview, which I also

Tim Dasey (33:02.952) Okay.

J.D. (33:29.336) put up on my website. I'll take the full interview, edit it, and send it to you for approval before I upload it and before I chop it up into the five questions for LinkedIn.

Tim Dasey (33:36.946) Okay.

Tim Dasey (33:43.271) That's very kind of you, but just letting you know, I won't really filter myself very much. I mean, yeah, I appreciate the offer. It's really not been offered to me before, which is nice.

J.D. (33:50.784) It's okay. It's okay.

J.D. (33:58.188) Well, I used to be a journalist in a previous lifetime, and I know what it's like to have your words held hostage by someone else. So I don't want you to feel that way. I want to make sure that you feel good about it as well.
One of the things that you mentioned that I really appreciated was this notion that it shouldn't just be left up to the educators, it shouldn't just be left up to administration, it shouldn't just be left up to IT, but perhaps there should be some way of having someone pay attention to what's going on at a higher level across the institution. And it seems as though a lot of times that gets kicked over to IT, but I know that IT already has their hands full. I've never seen an IT office in any institution that wasn't already up to their eyeballs with the LMS and the SIS and all the other systems. And it's not the right, I agree, they don't have the right skill set.

Tim Dasey (34:59.903) Yeah, it's not the right skill set. It's funny. I have a term I've coined called productivity therapy. It's meant to give the sense that, with AI, you have to not just understand the IT part; that actually might be the easy part.

J.D. (35:16.3) Ooh, I like that.

J.D. (35:33.048) Right. Mm-hmm.

Tim Dasey (35:34.301) You've got to understand how this job works and how it fits in the organization, and even how this person works best. And so what I see happening is,

J.D. (35:42.882) Mm-hmm. Mm-hmm.

Tim Dasey (35:49.143) I would say most colleges have gone out and gotten people with AI backgrounds to be their AI leads, if they can afford a separate one, right? Or they'll take somebody from their tech ranks and bring them up to that spot. I don't think that's the right skill at all. I think you're much better off having someone

J.D. (35:55.608) Mm-hmm. Mm-hmm.

J.D. (36:04.012) I agree.

Tim Dasey (36:07.177) who understands how to change and analyze operations. So it might be the industrial engineer, or the operations researcher, or the HR person who understands people really well.
To me, empowering people at those individual levels requires somebody to sit down and work with them on how it will help them specifically, not

J.D. (36:10.52) Mm-hmm.

J.D. (36:22.574) Mm-hmm.

Tim Dasey (36:34.867) departments generically. So I've been calling that person a productivity therapist. I think that's one major structural thing. And maybe, you know, the other thing that I tend to tell leadership in particular is, you're not going to understand this unless you try using it.

J.D. (36:44.631) I like that.

J.D. (36:58.272) It's true. It is so true. Oh my God, it's so true. Mm-hmm. Mm-hmm.

Tim Dasey (36:59.711) You're just not going to.

Tim Dasey (37:04.703) It's a very experiential kind of thing. One thing that's been clear at the K-12 and higher education levels is that when educators and administrators get trained, even with a basic AI 101, they're less hostile.

J.D. (37:20.13) Yes. Yes. I agree.

Tim Dasey (37:22.661) So I still think that's the priority.

J.D. (37:26.754) Yep, yep, I agree. Okay, thank you so much, Tim. I really appreciate you taking the time. I love the thoughts that you shared today, and I have your books. I haven't had a chance to read them; I'm trying to get through them, but you know that there is just so much.

Tim Dasey (37:40.155) Thanks. This was fun.

Tim Dasey (37:46.815) Well, you know, it's funny: I always tell people that people don't read books anymore, but it helps my career if I hand them out. And I actually made this for schools to be able to upload into AI, to kind of provide a context for curriculum design. So it is what it is. Yeah.

J.D. (37:58.059) Exactly.

J.D. (38:06.2) And I appreciate that. Yes, yes, I appreciate that. In fact, if you have a PDF version, I would pay for it in addition to the Kindle version that I've already paid for.

Tim Dasey (38:16.243) Well, no, happy to. Would you prefer PDF or the e-book?

J.D. (38:21.184) I prefer PDF because I have the e-book.
I've got the Kindle version; I went on Amazon and bought it. But if I could have a PDF, that would be great, because I would love to be able to tell Perplexity or ChatGPT, or whichever one I'm working with, to take it from Tim Dasey's framework, or context, or frame of reference,

Tim Dasey (38:23.825) Okay.

Tim Dasey (38:47.987) Yes.

J.D. (38:50.292) and analyze this situation for me. I would love to be able to do that.

Tim Dasey (38:54.503) I'm happy to; actually, both books work that way. What I have told people is that I hope they turn off the setting that allows the AI to train on the conversations, right? Because otherwise that just sort of keeps me from giving them my book, right?

J.D. (38:59.022) What?

J.D. (39:08.51) Right. And no, I don't have them training on any of mine. Exactly. Yeah, no, I definitely locked down everything, because I am working with institutions and I don't want any of their information floating around out there, much less my information floating around out there.

Tim Dasey (39:26.591) Yeah, but happy to send you those.

J.D. (39:29.518) Thanks so much. That is so generous. Thank you. And if there's anything that I can ever do, let me know.

Tim Dasey (39:36.895) This has been great. It's good exposure for me, and hopefully it helps your career.

J.D. (39:40.906) Okay, and like I said, I'll probably get to this if not this week, then next week, and I'll get the edited version off to you for your approval.

Tim Dasey (39:51.625) Thank you, JD. It's been a pleasure.

J.D. (39:53.196) And remember, don't shut down this window until you see a message that says that it has uploaded all of the information on your side. Yes.

Tim Dasey (39:58.492) That it's all been uploaded. Okay. I'm just going to leave it be, because I have to go somewhere else anyway.

J.D. (40:04.876) Okay, sounds great. Okay, see you later. Thanks so much.
Tim Dasey (40:08.415) Take care.