Leadership Voyage

S2E15: Using AI to Upgrade Yourself with Don Schmincke

Season 2 Episode 15

Text Jason @ Leadership Voyage

Don Schmincke has been accused by a NY press agency of providing “the most provocative and sensational view of business than any other speaker today.”

What else would you expect from an MIT and Johns Hopkins researcher who was nearly arrested as a capitalist spy in the Soviet Bloc, got shot off an aircraft carrier, survived in the Kurdish capital as the Ayatollah held hostages in Tehran, and developed missile inertial guidance systems while his frat brothers took Vegas (later portrayed in the movie “21”)?

Don Schmincke’s irreverent humor and unconventional methods provide audiences with such a refreshing change from status-quo topics that he’s been called the world’s “management renegade.” His patent-pending offerings transcend typical programs via refreshing alternatives to trendy theories, unproven methods, and phony “experts.” The industry agrees.

Don's latest book "Unleash Your Potential" is available on Amazon here.

==========
Artificial Intelligence 

  • why haven’t we asked artificial intelligence how to humanize us?
  • latest book was a humorous experiment
  • ChatGPT came up with title and defined the chapters
  • it took ChatGPT 30-40 minutes to write the book
  • ChatGPT is “still an infant” but could see us and tell us how to improve ourselves

AI's view on improving ourselves

  • the answers were primarily a representation of what's been written, such as understand yourself, build goals, and build habits
  • but it added two interesting ideas: improve your mental health and build strong relationships

How to use AI

  • use it as a tool for speed
  • it’s not too different from when the calculator was invented
  • gather information efficiently as a research source (a small illustrative sketch follows below)
  • but it’s not infallible and makes mistakes
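
For listeners who want to try the "efficient research assistant" workflow Don describes, here is a minimal sketch. It assumes the openai Python package (v1+) and an OPENAI_API_KEY environment variable; the model name and prompt are purely illustrative and not anything used for the book.

    # Minimal sketch of using ChatGPT as a fast research assistant (assumptions noted above).
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    question = (
        "Summarize the recurring recommendations in popular self-help books, "
        "and list sources I should verify myself."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": "You are a research assistant. Cite sources when you can."},
            {"role": "user", "content": question},
        ],
    )

    # Treat the output like a grad student's draft: a starting point to verify, not a final source.
    print(response.choices[0].message.content)

As Don notes in the episode, the output still needs a human check; citations in particular should be verified before you quote them.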

Don't abdicate your thinking

  • it doesn’t provide insights
  • it’s an efficient algorithm 
  • humans provide insight on what they see or how they ask
  • the danger comes when we allow AI to be autonomous with other machines

Have fun with it!

  • experiment
  • people use it for things like constructing workout programs
  • don’t be afraid of it
  • it’s still learning as well

==========

Leadership Voyage
site: leadership.voyage
email: StartYourVoyage@gmail.com
youtube: https://www.youtube.com/@LeadershipVoyage
linkedin: https://www.linkedin.com/in/jasonallenwick/, https://www.linkedin.com/company/leadership-voyage-podcast/
music: by Napoleon (napbak)
https://www.fiverr.com/napbak
voice: by Ayanna Gallant
www.ayannagallantVO.com
==========

Instacart - Groceries delivered in as little as 1 hour.
Free delivery on your first order over $35.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.



1
00:00:02.200 --> 00:00:05.860
Jason Wick: Okay. 

2
00:00:06.090 --> 00:00:06.900
Don Schmincke don@sagaleadership.com: 

3
00:00:08.170 --> 00:00:21.860
Jason Wick: All right. Welcome back to another episode of Leadership Voyage, everybody. It's good to be with you again, and I am very happy to be with a special guest today, Don Schmincke. Don, it's great to meet you this morning.

4
00:00:22.390 --> 00:00:50.170
Jason Wick: So we're gonna get into some of the interesting stuff that you've been working on and writing about pretty recently. But before that, I found a snippet from your biography which I just can't resist reading to the audience. And here is the snippet: what else would you expect from an MIT and Johns Hopkins researcher who was nearly arrested as a capitalist spy in the Soviet Bloc, got shot off an aircraft carrier,

5
00:00:50.290 --> 00:01:00.830
Jason Wick: survived in the Kurdish capital as the Ayatollah held hostages in Tehran and developed missile inertial guidance systems while his frat brothers took Vegas. Now.

6
00:01:01.150 --> 00:01:13.020
Jason Wick: how could I resist? Could you kick things off for us here and just tell us about any one of the stories that we just heard a little peek of? No, it was funny.

7
00:01:13.170 --> 00:01:22.659
Don Schmincke don@sagaleadership.com: Yeah, I was doing planetary physics at MIT, and I got started in human studies at Hopkins. And then I got involved with

8
00:01:22.720 --> 00:01:27.740
Don Schmincke don@sagaleadership.com: studying why management theory has such high failure rates. And then

9
00:01:27.940 --> 00:01:39.209
Don Schmincke don@sagaleadership.com: no one was talking about why. And so I started training CEOs, and it took off. But the expeditions I did, because I wanted to do field testing and learning, because we were

10
00:01:39.580 --> 00:01:55.020
Don Schmincke don@sagaleadership.com: taking, you know, these best-selling management theories and looking at their failure rates in implementation. And I thought, you know, we need to do some deeper work on our species. So I ended up getting involved with projects that sometimes took me into

11
00:01:55.020 --> 00:02:17.309
Don Schmincke don@sagaleadership.com: dangerous areas, you know. Like when the Soviet Bloc fell, I thought I'd study empire collapse as it was happening. So, you know, I snuck in and did that, and was almost arrested. But yeah, all of these different situations were part of, I don't know, 20 or more expeditions that I

12
00:02:17.400 --> 00:02:27.779
Don Schmincke don@sagaleadership.com: got involved with to try to understand humans and how to lead better. So you know, we started applying a lot of this about 20 years ago, and companies started taking off.

13
00:02:27.960 --> 00:02:41.280
Don Schmincke don@sagaleadership.com: you know, executive leadership started improving dramatically. And so my books, some of the articles, a lot of the videos, and my speeches all sort of center around that. So it's kind of a neat area for me. I love to learn. I love to teach.

14
00:02:41.280 --> 00:03:07.589
Don Schmincke don@sagaleadership.com: And it's such a unique niche that it's just been, you know, a kick. So that's what I'm doing today. No, that's great. And I also just want to say thanks for having a sense of humor and laughing about it. I mean, obviously that bio was written with a sense of humor, but it's just great to hear the juxtaposition of such intense situations with the laughing about it. I do appreciate that. I also think

15
00:03:07.620 --> 00:03:24.440
Jason Wick: you're the first person to talk about their expeditions, that I've had this, podcast I love that expeditions that's beautiful. Now, you were talking a little bit about how you first got Interested in in studying the failures

16
00:03:24.460 --> 00:03:33.450
Jason Wick: of management philosophies, or consulting, as you were saying. Could you tell us a little bit more about what motivated you to dive into that?

17
00:03:34.040 --> 00:03:42.949
Don Schmincke don@sagaleadership.com: Yeah. When I was studying and teaching, I ended up teaching at Johns Hopkins, and I ended up running into a lot of executive MBA

18
00:03:43.060 --> 00:03:49.739
Don Schmincke don@sagaleadership.com: students who were managers in their companies, and they were complaining that

19
00:03:49.850 --> 00:04:13.790
Don Schmincke don@sagaleadership.com: these best-selling theories weren't really resulting in the kind of outcomes that they were looking for. And so they basically asked me, could this be a biological issue? I had done biomedical engineering studies at MIT, as well as a number of other scientific explorations. So I thought that'd be kind of a cool project, and

20
00:04:13.810 --> 00:04:41.130
Don Schmincke don@sagaleadership.com: I wasn't sure what I was going to find out. But I ended up starting to ask questions and going into areas people didn't really want me to expose. And I found out that, you know, a lot of what we were doing today was failing because we've missed what our ancestors have been doing, which is aligning the beliefs of people. You know, we were just throwing a bunch of tools at the problem. You know, you've got to do this, do that, and

21
00:04:41.180 --> 00:05:04.149
Don Schmincke don@sagaleadership.com: you know, every book is now what you've got to do and how to do it, but it doesn't alter your behavior. So it'd be getting more into that. And you know, I just had some brilliant people around me. It was really fun. And I connected with some real geniuses. And we started looking at the human brain, we started looking at behavior and how beliefs occur in the brain, and it was

22
00:05:04.150 --> 00:05:18.180
Don Schmincke don@sagaleadership.com: fabulous. It's just been a lot of fun. Still doing it today. We're still learning a lot, still making mistakes, but I work with a number of companies, which gives me good field testing for a lot of the models that we currently use.

23
00:05:18.540 --> 00:05:34.910
Jason Wick: Very cool. Yeah, thanks for getting into that a little more deeply. Yeah, really interesting insight to hear. So some of these best-selling management theories are not working because they don't align with people's beliefs. Really cool insight. Thanks for talking about that.

24
00:05:35.620 --> 00:05:55.430
Jason Wick: Well, what I'd love to dive into the most today is around your most recent book, I believe it is. And the title, if I've got it right, is Unleash Your Potential: How Artificial Intelligence Wants to Upgrade You, a ChatGPT Experiment. So I actually think this will be a really

25
00:05:55.490 --> 00:06:19.429
Jason Wick: awesome discussion for a few reasons. So I encounter people who, you know, they know ChatGPT, they work with it daily, they use Google Bard, they use whatever different tools are out there. And then I've encountered some folks in the last few weeks, and they've never even heard of ChatGPT. And I find it interesting, because it just kind of depends where you're dialed in to what you're learning about.

26
00:06:19.440 --> 00:06:27.989
Jason Wick: So could you first tell us a little bit about how you were first inspired to write a book about this?

27
00:06:28.700 --> 00:06:32.620
Don Schmincke don@sagaleadership.com: Well, you know, we were doing AI,

28
00:06:33.000 --> 00:06:43.299
Don Schmincke don@sagaleadership.com: learning how to program AI back in the eighties, I mean the seventies actually, at MIT. It was like a required course for a lot of students. And so it wasn't like it is today. We were just

29
00:06:43.350 --> 00:06:45.250
Don Schmincke don@sagaleadership.com: figuring out how to, you know.

30
00:06:45.280 --> 00:06:57.059
Don Schmincke don@sagaleadership.com: use languages like Lisp and object-oriented types of algorithms. But before, well, before ChatGPT released,

31
00:06:57.180 --> 00:07:08.210
Don Schmincke don@sagaleadership.com: a lot of my frat brothers were arguing about, and it's always delightful debates around, you know, humanizing AI and

32
00:07:08.560 --> 00:07:21.080
Don Schmincke don@sagaleadership.com: what are the pros and cons of that, and what are the various modalities that should be explored. And so when ChatGPT released, I guess at the end of last year, I was one of the, as were my brothers,

33
00:07:21.290 --> 00:07:24.149
Don Schmincke don@sagaleadership.com: early adopters, and so

34
00:07:24.230 --> 00:07:32.340
Don Schmincke don@sagaleadership.com: I thought it was interesting. There was so much debate going on around humanizing artificial intelligence that

35
00:07:32.690 --> 00:07:38.500
Don Schmincke don@sagaleadership.com: I just had this crazy idea, like, how come we haven't asked artificial intelligence how to humanize us?

36
00:07:38.630 --> 00:07:45.020
Don Schmincke don@sagaleadership.com: It was just sort of a joke thing. It was a humorous

37
00:07:45.120 --> 00:08:02.219
Don Schmincke don@sagaleadership.com: exploration. So I jumped on ChatGPT when it first came out, and, you know, they let me in, because at the time it was hard to, you know, get into it. But I started asking it, and it started talking to me. So we had a dialogue going on.

38
00:08:02.400 --> 00:08:22.429
Don Schmincke don@sagaleadership.com: And this was fascinating. And so I took the dialogue, and I actually said, hey, okay, we're going to do a book, and what would you call it? And it came up with the title. And it told me the chapters it wanted, and then we started drilling down in each of the chapters.

39
00:08:22.540 --> 00:08:29.489
Don Schmincke don@sagaleadership.com: And so, I mean, God, probably the amount of time it took to write the book was maybe

40
00:08:29.510 --> 00:08:32.210
Jason Wick: 30 or 40 min.

41
00:08:32.250 --> 00:08:52.769
Don Schmincke don@sagaleadership.com: And I want, like, you can keep going, but I need to interrupt. Say that one more time for those people who are in the car and aren't really listening when this comes out. Say it one more time. Right. The whole book took about 30 or 40 minutes. The human was the slow factor.

42
00:08:53.430 --> 00:09:02.549
Don Schmincke don@sagaleadership.com: Yes. It took the human, you know, 2 weeks to figure out what to do with all this information. So I just said, okay, I'm just gonna

43
00:09:02.860 --> 00:09:09.539
Don Schmincke don@sagaleadership.com: just make a humorous attempt to get this out to the world. And I just took its dialogue. I didn't, really,

44
00:09:09.720 --> 00:09:28.330
Don Schmincke don@sagaleadership.com: I didn't do any editing. I thought, you know, I'm not gonna edit it. I'm just gonna, like, say, here it is. I front-ended it and I back-ended the book with my voice, you know, what I wrote. And I basically say, hey, here's why I'm doing this, and it's an experiment. And there's a section where it says, okay, from here on out, everything is written by ChatGPT,

45
00:09:28.490 --> 00:09:42.509
Don Schmincke don@sagaleadership.com: nothing is written by a human. And then I wrap it up. And the whole point was, like, look, it's still an infant, you know, let's not be too critical. But the point was to see

46
00:09:43.470 --> 00:09:52.509
Don Schmincke don@sagaleadership.com: how, and maybe awareness isn't the right word, but how it could see us and tell us what to do differently to improve ourselves.

47
00:09:52.720 --> 00:10:00.810
Don Schmincke don@sagaleadership.com: And it was fun, you know. So I launched it, and it became a best seller on Amazon for about

48
00:10:00.830 --> 00:10:25.509
Don Schmincke don@sagaleadership.com: 30 microseconds or something, so that's how fast things move in the AI world now. And I could have, like, done a normal launch, because I'm an author, I've got a number of books out there already, and I could have promoted it, had more of a, you know, a launch strategy. But I didn't. I just said, let's just get it out there and, yeah, have some fun with it. And I was going to do a version 2, because

49
00:10:25.820 --> 00:10:41.519
Don Schmincke don@sagaleadership.com: I guess a month or 2 after that, and I forgot how long ago it was, it had evolved. And then I heard that it passed the bar, you know, so apparently it went from an infant to, you know, maybe an

50
00:10:41.520 --> 00:11:04.500
Don Schmincke don@sagaleadership.com: early-stage adolescent or an early college student pretty quickly. So I thought I should do a version 2, you know, and I thought about that. But I thought, you know, with AI you'd have to do a new version every month, right, and probably re-release the book every 30 days, because it's just moving. So I thought, let's just let it sit. But maybe I'll go back to it in a few months and see, okay,

51
00:11:04.530 --> 00:11:27.559
Don Schmincke don@sagaleadership.com: how would you upgrade this book? You know, it's a fun experiment, and everybody helped, I think, promote it and spread it around. I made up testimonials on the back cover from, like, you know, Google and Tesla, just, you know, stuff. But it was all in good cheer. And hopefully,

52
00:11:27.560 --> 00:11:38.450
Jason Wick: you know, we started something. I love it. No, and I I really appreciate. You know you said early on how much you love learning. And and you said something

53
00:11:38.450 --> 00:11:59.149
Jason Wick: quipped something here along the line, like, we're still doing that with Saga Leadership, your group. And, you know, we're still doing that, we're still failing or making mistakes, or something, right? And I appreciate this thread, because you're talking about, you know, you just had to run this experiment. I really love it, right? And, boy,

54
00:11:59.260 --> 00:12:10.080
Jason Wick: sometimes we just really like to hold on to making something perfect, don't we? And and I love to hear you're just like I'm going for this. It's an experiment. Let's see what happens. And I'm going to share it with the world.

55
00:12:10.210 --> 00:12:16.199
Jason Wick: Now, the question you said you were exploring, kind of, is: how could the AI

56
00:12:16.320 --> 00:12:21.459
Jason Wick: see us, right? What could it tell us about ourselves?

57
00:12:21.590 --> 00:12:24.850
Jason Wick: Can you tell me a little bit about what it saw?

58
00:12:25.440 --> 00:12:44.849
Don Schmincke don@sagaleadership.com: It's interesting. I mean, some of the topics were maybe not surprising, because, you know, it's essentially read, like, everything we've published as humans. So it's basically absorbed the entire web and all our literature. So I was just curious as to

59
00:12:45.070 --> 00:12:47.010
Don Schmincke don@sagaleadership.com: what it would pick out

60
00:12:47.070 --> 00:13:06.739
Don Schmincke don@sagaleadership.com: to answer my questions. And some of the things were very obvious. Some of the things weren't, though. I mean, the typical chapters it came up with were, like, you know, you've got to understand yourself, you've got to set goals, build positive habits, you know, continuously learn, find balance. Those are all great things.

61
00:13:06.740 --> 00:13:27.859
Don Schmincke don@sagaleadership.com: And normally what we would expect in any sort of self-help personal development book. I mean, right? They've got 10,000 personal development books, and they're all saying the same thing around these areas. But it picked up things that I thought were interesting, like Chapter 4 was: improve your mental health.

62
00:13:28.470 --> 00:13:33.270
Don Schmincke don@sagaleadership.com: And I thought, that's interesting, because in a lot of personal development

63
00:13:33.300 --> 00:13:39.560
Don Schmincke don@sagaleadership.com: books, and we have books on mental health, but

64
00:13:39.590 --> 00:13:55.329
Don Schmincke don@sagaleadership.com: it doesn't come into our personal development work. And I thought it was interesting for it to be there. Like, you know, anything you're going to do to develop yourself and achieve a higher potential in your life has to include looking at your mental health.

65
00:13:55.370 --> 00:13:59.739
Don Schmincke don@sagaleadership.com: And today, we're getting more and more comfortable talking about that

66
00:14:00.040 --> 00:14:20.519
Don Schmincke don@sagaleadership.com: because we're seeing a lot of celebrities, we're seeing people coming out and saying, hey, this is an issue. We're getting more familiar with addiction, we're becoming more supportive. And so it was interesting that it pulled that out. And I thought, maybe that's sort of something that should be in all of our mainstream books on development.

67
00:14:20.700 --> 00:14:26.680
Don Schmincke don@sagaleadership.com: The other area was really around building strong relationships.

68
00:14:27.340 --> 00:14:45.660
Don Schmincke don@sagaleadership.com: And there's a lot of books out there on relationships, but as a general self-help personal development book, I thought, man, how many books actually talk about that? And then I began to wonder, could it be seeing the world in terms of conflict, like

69
00:14:45.660 --> 00:15:00.949
Don Schmincke don@sagaleadership.com: we've got wars, you know, breaking out everywhere, and we still haven't figured out how to live together. I mean, it's just amazing that we've developed as a species for this long, and we still don't know how to live together. I think that should be like the first

70
00:15:00.950 --> 00:15:17.819
Don Schmincke don@sagaleadership.com: thing we figure out already, right? I mean, hey, guys, let's go to a meeting and figure this out, because I think we know enough now to live together. It's like, no. So there's something about building relationships that I thought it thought was

71
00:15:18.000 --> 00:15:40.219
Don Schmincke don@sagaleadership.com: was good, I mean, pulling that out. So I really enjoyed doing the experiment. And of course, a lot of what it said was somewhat obvious, because a lot of people are saying the same thing. So normally it's all that. But it was those couple of things that I thought were interesting, that it would add those to your standard list

72
00:15:40.340 --> 00:15:44.820
Don Schmincke don@sagaleadership.com: of things, you know, relationships. What if we did that better.

73
00:15:44.970 --> 00:15:49.629
Jason Wick: Yeah, improving your mental health and building strong relationships. No, very cool.

74
00:15:49.670 --> 00:15:55.410
Jason Wick: And so we're talking about this opportunity, maybe we'll say, here, where we have

75
00:15:55.870 --> 00:16:13.240
Jason Wick: artificial intelligence, in the case of your experiment, in the case of your book, using ChatGPT, to really maximize something, right? To utilize this technology to our advantage. So in this case, you're talking about how could we,

76
00:16:13.670 --> 00:16:28.790
Jason Wick: how can we unleash our potential, as your book title says? How can we upgrade ourselves, looking to the AI for suggestions? And you just said it's kind of the non-standard elements that it added to this list: improving your mental health, building strong relationships.

77
00:16:29.520 --> 00:16:36.180
Jason Wick: If you could kind of capture, I don't know, maybe a good practice of how an individual can

78
00:16:36.220 --> 00:16:47.260
Jason Wick: use ChatGPT, or use AI, to their advantage when it comes to developing their own growth, to unleashing their potential, what would you tell folks?

79
00:16:47.860 --> 00:16:52.970
Don Schmincke don@sagaleadership.com: I would say use it as a tool for speed.

80
00:16:53.600 --> 00:16:58.080
Don Schmincke don@sagaleadership.com: You know, use it as a tool for speed. And, 'cause I know there's a lot of

81
00:16:58.830 --> 00:17:20.760
Don Schmincke don@sagaleadership.com: fear out there, and some of it may be valid, like, could this be an extinction event? Which, I never thought about it being an extinction event; I thought of all the other reasons for us to go extinct, but this one caught me by surprise. And hopefully, you know, smarter minds than I are going to figure out how to deal with that.

82
00:17:20.970 --> 00:17:32.949
Don Schmincke don@sagaleadership.com: But I think this issue around, well, it's just going to make kids stupid, they're not going to have to learn anything, these are the same complaints and fears that I heard,

83
00:17:33.190 --> 00:17:43.779
Don Schmincke don@sagaleadership.com: and you may not be old enough, but some of the people listening to this may be old enough, these are the same complaints and fears that I heard when the calculator was invented.

84
00:17:44.350 --> 00:18:10.400
Jason Wick: Interesting. Oh, my goodness, okay, yes. Okay, tell us more. This is great. Go ahead. It was like a backlash: oh, we can't allow calculators in the classroom, and people aren't going to learn math anymore, and they're going to be stupid, because now they just type the numbers into their calculator. And it was this thing, and they banned calculators. So it was really interesting to see the same reaction to calculators. Now, of course,

85
00:18:10.400 --> 00:18:31.679
Don Schmincke don@sagaleadership.com: at the end of the day, did we learn? Yeah, we still learn math, and it's very easy to teach math and not have somebody cheat: you just give them a pencil and a paper and say, do this equation, right? And so, with that, you still have to figure out math until you learn it, and then it gets to a point where

86
00:18:31.940 --> 00:18:55.179
Don Schmincke don@sagaleadership.com: you're doing all this long division or calculus, and somebody says, how come you're not using a calculator? So it goes from "you can't use a calculator" to "why aren't you using a calculator for this? It's taking you an hour to do what you could get to in, like, 30 seconds." So I think with AI, part of the experience, I think, would be exactly that. It's going to be, you know, why don't you use that?

87
00:18:55.210 --> 00:19:00.049
Don Schmincke don@sagaleadership.com: Because, take this book as an example, and I'm thinking about some other research projects.

88
00:19:00.340 --> 00:19:18.859
Don Schmincke don@sagaleadership.com: It's like, okay, I could go to the Johns Hopkins library and spend 2 weeks going through all of the literature on self-help patterns, you know, the things that have been published, the past 10,000 books and so forth, and come back and digest that. Or,

89
00:19:18.890 --> 00:19:26.349
Don Schmincke don@sagaleadership.com: okay, that's 2 weeks of my life where I can send a grad student to do it. And they could come back and sit down with me for a day.

90
00:19:26.630 --> 00:19:53.409
Don Schmincke don@sagaleadership.com: and now it saves me 2 weeks. Now I'm down to a day, and with that speed I get all those days back in my research. Or I go to ChatGPT, and, you know, it'll probably do the whole thing in half an hour. Okay, now I've got choices. I can go for 2 weeks, or I can spend 30 minutes. I'll take the 30 minutes. Now, is it making me stupid? No, I'm using it like it's my grad student.

91
00:19:53.780 --> 00:20:13.040
Don Schmincke don@sagaleadership.com: You know, now, I'm not going to take everything the grad student comes back to me with, because they may not be seeing things that I see, or validating things that I'm going to validate. It's just a research source. And I've caught it making mistakes. I mean, what was I doing the other day? I forgot. But

92
00:20:13.100 --> 00:20:19.500
Don Schmincke don@sagaleadership.com: oh, yeah, it was something around the failure of the self-esteem movement

93
00:20:19.630 --> 00:20:36.420
Don Schmincke don@sagaleadership.com: that we started back, you know, and that's where everybody gets a participation award, nobody's allowed to lose. It's been the most catastrophic disaster of an experiment in human society. But so I went back to look at where some of the research founders had

94
00:20:36.430 --> 00:21:00.980
Don Schmincke don@sagaleadership.com: been complaining. And then there was something else, because I really hate this thing: we've got people coming up with medical research, or this research or that, or they hear something, and then every motivational speaker is saying, oh, this, oh, that. Like, I'll get an article, I'm going to put it on the blog, on the butterfly effect: oh, a butterfly flapping its wings somewhere south can cause a hurricane, you know. And it's like, what?

95
00:21:01.050 --> 00:21:23.210
Don Schmincke don@sagaleadership.com: I mean, what? And so nobody reads the research, nobody goes back and sees the articles. And I know that the scientist that came up with this was a meteorologist, and he says, I didn't mean that; it was more like, we don't know what's going to happen. So everybody preaching this stuff is wrong,

96
00:21:23.220 --> 00:21:44.770
Don Schmincke don@sagaleadership.com: because it was taken out of context. So I'm right back to find a quote, and of course ChatGPT comes back and says, yes, December 2002, on the Discovery Channel, he said it, and it was a great sentence. Now, of course, if I was a student who didn't know any better, I'd say, oh, I can just quote that. Well, no, no, you don't. So I actually went back to look at the publications. I couldn't find it.

97
00:21:44.920 --> 00:21:51.609
Don Schmincke don@sagaleadership.com: Hmm! And so I go back to ChatGPT and say, can you give me the source, your footnote, for this quote?

98
00:21:51.730 --> 00:22:00.599
Don Schmincke don@sagaleadership.com: Oh, I'm sorry, I misunderstood your request; that was never said, and I'm going to make note of that. In other words, it's learning, right,

99
00:22:00.600 --> 00:22:25.269
Don Schmincke don@sagaleadership.com: because the algorithm is guessing, and when it gets it wrong, it learns, and it's better. It's like the brain, it's like neural pathways: these paths seem to work well, we'll keep them; but this one didn't work so well, so we're not going to use that path anymore. So my point is that it's a great tool. But as with any tool, we shouldn't abdicate our knowledge and decisions to the tool.

100
00:22:26.050 --> 00:22:28.010
Don Schmincke don@sagaleadership.com: we should still use it

101
00:22:28.030 --> 00:22:36.389
Don Schmincke don@sagaleadership.com: as a whole, but not abdicate to it. So that's an example of, you know it's trying. But it's not going to be totally there.

102
00:22:36.780 --> 00:22:42.639
Jason Wick: That's great. yeah, I mean, thanks for that last bit of wisdom at the end about, you know, we still should

103
00:22:42.740 --> 00:23:09.619
Jason Wick: maintain our responsibility for using our brains and making our decisions. But I, okay, let's let's just recap what you brought in here, which is really powerful. Because again, we're gonna have people who are some folks are using chat, Gpt, or some have, you know, create an account and hardly use it. And others probably really just, you know, maybe they're at this point almost afraid to ask, because something sometimes it's just gone by, and they still haven't got on the train. So you've got the opportunity to

104
00:23:09.740 --> 00:23:14.729
Jason Wick: gather a tremendous amount of information very efficiently

105
00:23:14.800 --> 00:23:40.020
Jason Wick: right at your fingertips, and you gave the great example. You could go to the library and spend 2 weeks of your time. You could delegate the 2 weeks to a grad student, and maybe get one day's worth of your own time to get all that information. how accurately it comes through! Who knows? Or you could delegate it to to the machine and and get it that in 30 min. Now, that's like a massing information in a a short period of time.

106
00:23:40.020 --> 00:23:46.050
Jason Wick: Now, kind of going back to your book and your continued use, presumably, of the AI,

107
00:23:47.060 --> 00:23:48.140
How?

108
00:23:49.340 --> 00:23:56.690
Jason Wick: how good is it at providing insights, as opposed to just amassing information for you?

109
00:23:57.790 --> 00:24:01.169
Don Schmincke don@sagaleadership.com:  that's a good question. I

110
00:24:01.810 --> 00:24:02.839
Don Schmincke don@sagaleadership.com: I think

111
00:24:02.850 --> 00:24:12.099
Don Schmincke don@sagaleadership.com: no insights. Okay, wow. That's a powerful answer. Tell us more about that. I think the insight comes from a human interpreting

112
00:24:12.410 --> 00:24:13.740
Don Schmincke don@sagaleadership.com: what they're saying.

113
00:24:14.390 --> 00:24:17.270
Don Schmincke don@sagaleadership.com:  The machine.

114
00:24:17.350 --> 00:24:29.400
Don Schmincke don@sagaleadership.com: I don't think it's, like, having any aha epiphanies, you know. It's just an algorithm that's providing a very efficient way for us to access the knowledge. But

115
00:24:29.930 --> 00:24:47.779
Don Schmincke don@sagaleadership.com: this has always been the case, though; it's not new as a tool. I think Thomas Kuhn wrote a great book, The Structure of Scientific Revolutions, and I think it was required reading at Stanford. But the whole point was that a lot of revolutions in science had nothing to do with new data.

116
00:24:47.990 --> 00:24:53.019
Jason Wick: They had to do with observing the old data in a new way.

117
00:24:53.280 --> 00:25:01.990
Don Schmincke don@sagaleadership.com: you see. And so I think that's the same thing with AI. There may not be any insights, but some human may see what it produced

118
00:25:02.050 --> 00:25:14.799
Don Schmincke don@sagaleadership.com: and see it differently, and get an insight from it, or ask it different questions, saying, Well, what about this angle and and explore knowledge in a way which can lead to insight? But

119
00:25:15.300 --> 00:25:40.140
Don Schmincke don@sagaleadership.com: I don't think we're there where it's going to tap us on the head and say, hey, idiot, you're missing this point. Well, and thanks for answering the question in that way. I mean, it's consistent with what you just said a second ago about not abdicating our responsibility as the human in this equation. And I think that's a great thing to say a couple of times. So I love it. You're getting this information more efficiently, whatever you're trying to dive into and find

120
00:25:40.140 --> 00:25:56.319
Jason Wick: find the insight in; it's gonna provide it right at your fingertips, and now the human interprets that information, which is powerful. It saves a lot of time, and we can use more of our brain for that, and not as much for the tedious gathering of information, hopefully, right?

121
00:25:56.320 --> 00:26:03.679
Don Schmincke don@sagaleadership.com: Right, yeah. The danger comes when we start allowing it to be autonomous with

122
00:26:03.680 --> 00:26:15.149
Don Schmincke don@sagaleadership.com: other machines, you see. And that's a different realm of AI. Certainly, as a research tool, as my now very efficient grad student, no problem, I love it.

123
00:26:15.330 --> 00:26:42.849
Don Schmincke don@sagaleadership.com: But turning it over to a drone, I mean, they just published, well, they had to retract this statement, but it was like the drone in the simulation killed its operator. Oh, gosh, okay. Now, I haven't read the full particulars, but that wasn't entirely accurate. But the point is, I could see where, you know, if it's thinking, I have to achieve this mission,

124
00:26:42.850 --> 00:26:48.290
Don Schmincke don@sagaleadership.com: and I have to do whatever it takes to achieve this mission, but the drone operator says, stop,

125
00:26:48.540 --> 00:27:12.969
Don Schmincke don@sagaleadership.com: you know, does it say, oh, now you're trying to stop me from the mission, so now you're in my way, you know? So I mean, I can see little minor mistakes like that happen. But that's when we get into the Terminator part, right? So many movies on this; it's an old theme. But I'm hoping there are a lot of smart people that are putting together,

126
00:27:13.260 --> 00:27:40.909
Don Schmincke don@sagaleadership.com: you know, the rules. I mean, what was it, Isaac Asimov came up with the rules of robotics. I forgot the list, like, you know, don't kill a human or something, right? So I don't know, I'd have to go back, and it was like 30 or 40 years ago, maybe longer. So I don't know where we're going to go with it. I think, as a research tool, like I said, it's great; use it like you use a calculator. When you're going to automate machines, let's, you know, be careful with that too, because

127
00:27:40.950 --> 00:27:42.380
Don Schmincke don@sagaleadership.com: you know

128
00:27:43.220 --> 00:27:52.239
Don Schmincke don@sagaleadership.com: we can't abdicate common sense, you know, to a machine. And we've seen problems already when we're trying to automate too quickly.

129
00:27:52.740 --> 00:28:07.800
Jason Wick: No, some great caveats that you're outlining, Don. Thank you. And again, I didn't speak to it myself, but the calculator example is really a powerful one. I hope that resonates with a lot of folks, specifically as it

130
00:28:07.850 --> 00:28:30.960
Jason Wick: relates to the way you're framing how you think we can effectively use it as a tool, right? We're not delegating the thought to the calculator; we still know how to do basic operations in math and understand them. But again, it's efficient and it's quicker, the speed. And I think pointing out that parallel here is something that could really speak to a lot of people. So thanks for saying that.

131
00:28:30.980 --> 00:28:58.699
Don Schmincke don@sagaleadership.com: Is there anything, before we wrap things up here (it's been a really interesting conversation, I'm really enjoying it), is there anything else you felt like you wanted to mention to folks who are going to dip their toe, as it were, into using AI? Any other things you'd like to point out about how to effectively use it? Yeah, I would say, you know, have fun learning about it, have fun applying it. I mean, I think people use it for, like, their workout programs and things like that. And so

132
00:28:58.740 --> 00:29:01.729
Don Schmincke don@sagaleadership.com: you know, experiment. Be playful with it.

133
00:29:02.090 --> 00:29:12.940
Don Schmincke don@sagaleadership.com: Don't be too afraid of it. I think it's still learning as well, and the more that we get used to it, the less there will be, you know, a fear-based reaction,

134
00:29:13.260 --> 00:29:43.139
Jason Wick: and hopefully, like I said, the greater minds are going to take it seriously enough to, you know, prevent an extinction event. Well, before we finish things up here and let everybody hear, you know, where your services are located on the web and elsewhere, I ask every guest the same question, and I'd love to hear yours: what is something that you've learned recently, Don?

135
00:29:46.270 --> 00:30:01.750
Don Schmincke don@sagaleadership.com: So many more mistakes to make and so little time left. And I have this new book coming out in the fall, and it's called Winners and Losers, and it's for entrepreneurs,

136
00:30:02.000 --> 00:30:05.340
Don Schmincke don@sagaleadership.com: and the whole point was that we're teaching people how to be winners.

137
00:30:05.370 --> 00:30:19.460
Don Schmincke don@sagaleadership.com: But we're not teaching them how to be losers. Oh, gosh, that's great! And, you know, when I look at all these successful entrepreneurs, they write these books, but it's everything they did right. But when I talk to the entrepreneurs and go through their biographies,

138
00:30:19.460 --> 00:30:47.630
Don Schmincke don@sagaleadership.com: they had a history of errors, of mistakes, of miscalculations, and it was those mistakes and those things that they did wrong that propelled them to become more powerful. So I said, you know, we need to teach people to lose powerfully, because that creates great entrepreneurs and great leaders. So that's why I said so many more mistakes to make and so little time left, because that's how we grow. So, yeah, that book's coming out, and that's what I learned.

139
00:30:47.630 --> 00:30:57.600
Jason Wick: Love it. That's a great answer. Thank you, Don, and we'll look forward to that next book. Maybe we can get you on next season of the podcast in 2024. That would be super cool.

140
00:30:57.930 --> 00:31:08.640
Jason Wick: Awesome. Well, I've really enjoyed this discussion. For those who are interested in learning more about your work, the services you provide, or the books, where would you like to direct them?

141
00:31:08.680 --> 00:31:15.510
Don Schmincke don@sagaleadership.com: SagaLeadership.com. I stole it from the Vikings. But yeah, saga,

142
00:31:15.610 --> 00:31:33.139
Don Schmincke don@sagaleadership.com: sagaleadership.com, the saga as a whole, not the topic, the compelling saga. I'm trying to put everything there in terms of books and the services that we have for executives. And also I'm moving more into, like, entrepreneurs now, so we're gonna be expanding that. So that's a good place to start.

143
00:31:33.140 --> 00:31:50.850
Jason Wick: Beautiful. So Saga Leadership, S-A-G-A, for everyone out there to learn more about what Don Schmincke is up to. Don, thank you so much for being generous with your time today. I have enjoyed this conversation, a very unique one we've not had on the show, and have a great rest of your day. Great! Thank you, you too. Thanks for having me.