Created
December 20, 2023 20:43
[00:00:00 - 00:00:08] SPEAKER_02: Thanks for tuning in to the World XP Podcast. If you're enjoying the content, please
[00:00:08 - 00:00:12] SPEAKER_02: drop a like and let us know your thoughts below in the comments. Also, please consider supporting
[00:00:12 - 00:00:18] SPEAKER_02: our podcast via the link below. It really helps us out. Bill, welcome to the World XP Podcast. How are
[00:00:18 - 00:00:28] SPEAKER_03: you, man? I'm doing all right. How you doing today? Not too bad, not too bad. This delay is going to be crazy.
[00:00:28 - 00:00:33] SPEAKER_01: We were just talking offline. There's gonna be a five-second... I know, it's killing me right now.
[00:00:33 - 00:00:38] SPEAKER_03: That's all right. So let's do a quick intro. I'm gonna try to preempt what you're saying
[00:00:38 - 00:00:43] SPEAKER_01: and answer ahead of time. This might be the most telepathic one or the worst one ever.
[00:00:43 - 00:00:46] SPEAKER_03: No in between.
[00:00:47 - 00:00:49] SPEAKER_01: No in between.
[00:00:53 - 00:00:53] SPEAKER_03: So let's start with who you are and what you do,
[00:00:55 - 00:00:56] SPEAKER_01: and then we'll jump into the question from the last guest,
[00:00:57 - 00:00:59] SPEAKER_01: and then we'll go from there.
[00:01:01 - 00:01:01] SPEAKER_03: Yeah, sure, that works, man.
[00:01:02 - 00:01:02] SPEAKER_00: So my name is Philip Howard.
[00:01:15 - 00:01:16] SPEAKER_03: I work as an IT and data consultant. What I do is I work with businesses to help them build platforms that give them a better understanding of what's going on internally, like sales, whatever they do, by connecting all their data points.
[00:01:24 - 00:01:24] SPEAKER_01: Well, yes, you also use AI, though. That's the big thing, right? So a lot of data stuff, but AI as well.
[00:01:25 - 00:01:26] SPEAKER_03: Yeah, kind of buried the lede, didn't I?
[00:01:26 - 00:01:27] SPEAKER_01: Yeah, you did.
[00:01:27 - 00:01:28] SPEAKER_03: You buried it a little bit.
[00:01:28 - 00:01:28] SPEAKER_03: That's all right.
[00:01:29 - 00:01:29] SPEAKER_03: That's all right.
[00:01:31 - 00:01:33] SPEAKER_01: So, yeah.
[00:01:34 - 00:01:34] SPEAKER_03: All right.
[00:01:35 - 00:01:35] SPEAKER_02: So let me try that again.
[00:01:39 - 00:01:39] SPEAKER_01: So the cool part about this, because I usually make that as fast as possible:
[00:01:43 - 00:01:43] SPEAKER_01: I've been to many a Thanksgiving where I start talking about what I do and people's eyes go different directions.
[00:01:47 - 00:01:49] SPEAKER_01: But the whole world runs on data, right? Literally everything runs on data. Data is what makes things smart. That's what makes people smart.
[00:01:49 - 00:01:51] SPEAKER_01: You go to school, you're learning data. You go to your,
[00:01:51 - 00:01:53] SPEAKER_01: right, your science class, you're reading the data from
[00:01:53 - 00:01:55] SPEAKER_01: people from past experiments and stuff, right? I can go on about
[00:01:55 - 00:01:59] SPEAKER_01: this, and you don't want me to. But what this all builds up to
[00:01:59 - 00:02:02] SPEAKER_01: is the more stuff you know, the more stuff you can do. And the
[00:02:02 - 00:02:06] SPEAKER_01: same thing with machines, especially as far as we go with AI now, because
[00:02:06 - 00:02:09] SPEAKER_01: once upon a time, a computer would only do what you told it
[00:02:09 - 00:02:14] SPEAKER_01: to. It was a running joke that computers are stupid. What we
[00:02:14 - 00:02:18] SPEAKER_01: do when we teach people to program is we try to
[00:02:18 - 00:02:20] SPEAKER_01: get them to wrap their head around that idea. You tell them,
[00:02:20 - 00:02:21] SPEAKER_01: alright, give me instructions on how to make a peanut butter and
[00:02:21 - 00:02:24] SPEAKER_01: jelly sandwich. And they say, okay, well, you take some
[00:02:24 - 00:02:27] SPEAKER_01: peanut butter, you put it between two slices of bread, and you put it together.
[00:02:27 - 00:02:31] SPEAKER_01: And, you know, you go into that literally: you take a jar of peanut butter and you slap it
[00:02:31 - 00:02:36] SPEAKER_01: in with the jar, you put it between two slices of bread. It's stupid. Computers are stupid.
[00:02:36 - 00:02:47] SPEAKER_01: But now with AI, because of how they're trained and the way they work with all of the data that we have now, it gives the facade of intelligence, of creativity,
[00:02:47 - 00:02:54] SPEAKER_01: of being able to come up with solutions to things that it was not given beforehand.
[00:02:55 - 00:02:59] SPEAKER_01: So yeah, I like working with data because this is what it leads to.
[00:03:00 - 00:03:06] SPEAKER_03: Fair enough. Okay. So you said a lot of interesting things, but I'm gonna ask you this now, because I'll
[00:03:06 - 00:03:12] SPEAKER_03: forget if I don't. A question from Cody, who was the last guest, who unfortunately his episode got
[00:03:12 - 00:03:18] SPEAKER_03: taken down for reasons that I don't understand, but that's okay. His question is: what's something
[00:03:18 - 00:03:23] SPEAKER_03: your profession has made you realize your perspective on something in life was completely
[00:03:23 - 00:03:28] SPEAKER_03: wrong? So you thought something about something in your life, and then you started working in your field and you were
[00:03:28 - 00:03:43] SPEAKER_01: like, that is not the case at all. I thought that the more money a company had,
[00:03:43 - 00:03:55] SPEAKER_01: the bigger it was, the more advanced it was, the bigger the name, the longer it's been around, the more advanced stuff they would do, the more they'd have it together, the cooler, better tech they would have, the smarter the people.
[00:03:57 - 00:04:05] SPEAKER_01: I'm going to get in trouble here. I work in consulting, so I'm going to say stuff, and people who are working with me will be like, hey, were you talking about me? Like, no, dude, it wasn't you.
[00:04:05 - 00:04:05] SPEAKER_01: I swear.
[00:04:06 - 00:04:09] SPEAKER_01: No, but the truth is, you know what the entire world runs on?
[00:04:10 - 00:04:10] SPEAKER_01: Excel.
[00:04:10 - 00:04:11] SPEAKER_01: It still works on Excel.
[00:04:11 - 00:04:19] SPEAKER_01: We have millions and millions and millions of dollars in advanced databases and software and technology, and people are still using spreadsheets.
[00:04:19 - 00:04:23] SPEAKER_01: You don't want to know how many of your bank accounts are run on spreadsheets.
[00:04:31 - 00:04:31] SPEAKER_01: The world is not as complicated...
[00:04:34 - 00:04:34] SPEAKER_01: It's extremely complicated, but it's not as complicated as people like to believe.
[00:04:35 - 00:04:36] SPEAKER_01: That's all sci-fi stuff.
[00:04:38 - 00:04:51] SPEAKER_01: People are just people, and everything's run by people. So my profession has taught me: don't make something out to be what
[00:04:51 - 00:04:51] SPEAKER_01: it's not.
[00:04:52 - 00:04:56] SPEAKER_01: Don't make things bigger than what they are.
[00:04:57 - 00:04:57] SPEAKER_01: Yeah.
[00:04:58 - 00:04:58] SPEAKER_03: 100%.
[00:04:58 - 00:05:01] SPEAKER_01: That's... I don't know if that's a good answer or not, but that's what I'm going with.
[00:05:02 - 00:05:06] SPEAKER_01: That's a very good answer, because we see this all the time,
[00:05:06 - 00:05:12] SPEAKER_01: like you said, these big systems and organizations and companies. At the end of the day, they're
[00:05:12 - 00:05:21] SPEAKER_03: all run by a person who's, generally speaking, just like everybody else. Man, this delay is rough.
[00:05:21 - 00:05:26] SPEAKER_01: Is it really? Do you want to take five?
[00:05:26 - 00:05:28] SPEAKER_03: Let's try and reboot and start again.
[00:05:28 - 00:05:31] SPEAKER_03: And we're back, possibly, maybe.
[00:05:31 - 00:05:33] SPEAKER_03: We'll see how the delay is now.
[00:05:33 - 00:05:35] SPEAKER_01: We made a couple of changes.
[00:05:38 - 00:05:38] SPEAKER_01: Where were we?
[00:05:38 - 00:05:39] SPEAKER_01: Oh, yeah.
[00:05:39 - 00:05:40] SPEAKER_01: Big organizations, at the end of the day,
[00:05:40 - 00:05:44] SPEAKER_01: are all run by a human being who's just bones and muscles and whatever.
[00:05:51 - 00:05:57] SPEAKER_01: And that's a very, very good point, because we think of these organizations as... and people joke about it as well, like, oh, this organization is so slow. And it's like, dude, you're a part of
[00:05:57 - 00:06:02] SPEAKER_03: that organization. But people differentiate themselves from the organization. We almost think
[00:06:02 - 00:06:09] SPEAKER_03: of them as not groups of people. It's kind of strange, similar to how you could say countries and their governments are not the same, even though
[00:06:09 - 00:06:15] SPEAKER_03: the people live there. Yeah, so it's a weird kind of indictment on the state of things at the moment,
[00:06:15 - 00:06:20] SPEAKER_03: that things have become so big that we view organizations as their own entities.
[00:06:21 - 00:06:30] SPEAKER_01: It's funny, you bring up a much more interesting philosophical point than I did, because I was just making a joke about Excel. But what I actually have learned from being a
[00:06:30 - 00:06:35] SPEAKER_01: consultant, getting that kind of inside peek into how businesses operate, and who makes the
[00:06:35 - 00:06:42] SPEAKER_01: deals and how decisions are made, is that the most important thing you can learn is that people like
[00:06:42 - 00:06:50] SPEAKER_01: working with people they like. You know, businesses are run by people. The fact that businesses are seen as their own entity, like the way you described,
[00:06:50 - 00:06:55] SPEAKER_01: and absolutely, legally speaking, that's exactly what they are, is bonkers. And that's what
[00:06:55 - 00:07:00] SPEAKER_01: keeps people from trying new things, trying to break the rules, because they think that
[00:07:00 - 00:07:04] SPEAKER_01: this is the way it's supposed to run. You know, there's no way that I can get into this company,
[00:07:04 - 00:07:07] SPEAKER_01: there's no way that I can challenge this company to do something different.
[00:07:07 - 00:07:10] SPEAKER_01: But I mean, it's just a collection of people.
[00:07:11 - 00:07:15] SPEAKER_01: You can go up, and that's why the joke is that business deals are all made on golf courses
[00:07:15 - 00:07:19] SPEAKER_01: or in bars or in God knows where else, because you're just talking to the people that run
[00:07:19 - 00:07:20] SPEAKER_01: the show.
[00:07:20 - 00:07:22] SPEAKER_01: They're just people, just like you and me.
[00:07:22 - 00:07:24] SPEAKER_02: So it's amazing.
[00:07:24 - 00:07:26] SPEAKER_02: It's an amazing revelation when you see that.
[00:07:27 - 00:07:45] SPEAKER_03: And then the people skill on top of that is figuring out which people have the authority, or the pull, to make something happen, because oftentimes those people are not the people with the titles. It's some guy who... maybe the president really trusts some other guy, and you figure out
[00:07:45 - 00:07:49] SPEAKER_03: who the other guy is that can convince the president, because it's hard to talk to the
[00:07:49 - 00:07:53] SPEAKER_03: president, because they're so busy all the time doing whatever. It's like figuring that stuff out,
[00:07:53 - 00:07:58] SPEAKER_03: even when you're within your own company. Like, hey, I think we should implement blank process.
[00:07:59 - 00:08:03] SPEAKER_03: Figuring out who the right person to go talk to is a very valuable skill, because
[00:08:03 - 00:08:05] SPEAKER_03: sometimes they're
[00:08:05 - 00:08:10] SPEAKER_03: the people that you think they are, but often they're not as well. Honestly, I would compare
[00:08:10 - 00:08:15] SPEAKER_01: that to trying to game the stock market. Like, all right, is it gonna go up? Is it gonna go down? Trying
[00:08:15 - 00:08:22] SPEAKER_01: to pick that out is near impossible. What I think works really well, and I am not the model
[00:08:22 - 00:08:25] SPEAKER_00: for this... honestly, if I'm going to shill for my wife,
[00:08:26 - 00:08:27] SPEAKER_00: she's incredible at this.
[00:08:28 - 00:08:31] SPEAKER_00: But just be likable and be open to talk to people,
[00:08:32 - 00:08:34] SPEAKER_00: talk to everybody, talk to anybody, listen to people.
[00:08:35 - 00:08:36] SPEAKER_00: People love being listened to.
[00:08:36 - 00:08:39] SPEAKER_00: And you'd be amazed at just making a habit
[00:08:39 - 00:08:40] SPEAKER_00: of talking to everybody you can
[00:08:40 - 00:08:42] SPEAKER_00: and just figuring out what's going on in a day,
[00:08:42 - 00:08:43] SPEAKER_00: who they are, what they like, what they do.
[00:08:44 - 00:08:50] SPEAKER_01: You never know who you're going to bump into who knows who, and things will suddenly
[00:08:50 - 00:08:54] SPEAKER_01: start happening, because they know you, and this person knows you, who knows this person. And if
[00:08:54 - 00:08:57] SPEAKER_01: you just make a habit of doing that, it's not going to happen overnight, but
[00:08:57 - 00:09:08] SPEAKER_01: if that just becomes your personality, you'd be amazed at the network that you've built, without even being as predatory as
[00:09:08 - 00:09:16] SPEAKER_03: sales. Yeah, yeah, no, I fully agree. I have friends that I've kind of mentioned that to, like,
[00:09:16 - 00:09:23] SPEAKER_03: first impressions are so important. And it's not even, like, gaming the system, kind of in
[00:09:23 - 00:09:47] SPEAKER_03: the way that you mentioned, but exactly like you mentioned: to not be a predator, but just to listen and actually take a genuine interest in somebody for like two minutes or five minutes or whatever. And that could make all the difference between you getting a job or making a deal
[00:09:47 - 00:09:49] SPEAKER_00: or doing whatever, all of these things.
[00:09:49 - 00:09:50] SPEAKER_00: And I think it's super important.
[00:09:51 - 00:09:53] SPEAKER_03: Just, I mean, even if you're just in line
[00:09:53 - 00:09:54] SPEAKER_03: at the grocery store or whatever,
[00:09:55 - 00:09:57] SPEAKER_03: and somebody says something to you,
[00:09:57 - 00:10:00] SPEAKER_03: like, it's like little things can go such a long way.
[00:10:00 - 00:10:01] SPEAKER_03: You don't even realize.
[00:10:01 - 00:10:03] SPEAKER_03: And to your point about it not happening overnight,
[00:10:03 - 00:10:09] SPEAKER_03: that's also entirely correct. And then it'll start to snowball eventually, because your
[00:10:09 - 00:10:14] SPEAKER_03: little network will build, and somebody will be like, oh yeah, this guy's application ran across
[00:10:14 - 00:10:18] SPEAKER_03: my desk, do you know who that guy is? And they're like, no, but I know somebody who does. And then they'll
[00:10:18 - 00:10:32] SPEAKER_03: call that other person, and it's like, oh, that guy's really nice, I met him a couple times at blank. And it snowballs for you. Because all it takes is one bad one to ruin all of it, and so you
[00:10:32 - 00:10:38] SPEAKER_00: have to build up enough cushion so that if something bad happens, you've got enough
[00:10:38 - 00:10:50] SPEAKER_03: built up in the bank that it's okay, you had a bad day that day. Exactly. I literally just had something to
[00:10:50 - 00:10:56] SPEAKER_00: say and it went right out of my mind, so I'm blaming that on 7:30 at night. Fair enough.
[00:10:56 - 00:11:06] SPEAKER_03: This is gonna be a fun hour. No, it'll be good. So what we talked about before, when we were on the phone, for those listening, is
[00:11:06 - 00:11:12] SPEAKER_01: real heavy on the AI piece. And that's something... I haven't had an AI person on
[00:11:12 - 00:11:19] SPEAKER_03: the show before, and I'm trying to branch out and talk to all sorts of different people who are
[00:11:19 - 00:11:30] SPEAKER_01: doing all sorts of different things. And so obviously AI, I think, really hit the lexicon kind of with ChatGPT. Like, people knew
[00:11:30 - 00:11:35] SPEAKER_03: what AI... like, had heard the term, but it really just popped into everybody's life with ChatGPT.
[00:11:35 - 00:11:40] SPEAKER_01: So I don't know, like, what's been your experience? Like, how did you get into an AI
[00:11:40 - 00:11:46] SPEAKER_03: sort of field? Was that something that you had been wanting to do? Like, how did
[00:11:46 - 00:11:52] SPEAKER_03: you end up in it, and what is your interpretation of what is going on with it, and its, I'll say,
[00:11:52 - 00:12:09] SPEAKER_01: evolution into the mainstream at the moment? So that is a very, very big question. Probably a bunch of places I can go. So the world was super
[00:12:09 - 00:12:15] SPEAKER_03: crazy about AI, I mean, even back in, like, what was it, 2019, I think. That's when the first big
[00:12:15 - 00:12:19] SPEAKER_01: hype was coming out, when people were doing all kinds of machine learning and all sorts of
[00:12:19 - 00:12:24] SPEAKER_02: cool stuff like that, where data science was the field to get into. You know, you wanted to make
[00:12:24 - 00:12:28] SPEAKER_01: lots of money, you wanted to be the super smart guy in the room who's going to be working on really cool
[00:12:28 - 00:12:32] SPEAKER_01: stuff, you had to be a data scientist. I don't think anybody really knew what that meant.
[00:12:32 - 00:12:37] SPEAKER_01: That's kind of like where... I mean, you just went to school and you learned how to do
[00:12:37 - 00:12:41] SPEAKER_01: statistics and a bunch of other stuff, and then on top of that you learned programming, because
[00:12:41 - 00:12:48] SPEAKER_01: that's where all the ML stuff was happening. And God bless the ones who did, because they built out all the stuff that made it way easier for guys
[00:12:48 - 00:12:55] SPEAKER_01: like me to do it. But when it really came back to life, you're right, it came out with ChatGPT,
[00:12:55 - 00:12:59] SPEAKER_01: when OpenAI released that and made it mainstream, where people could just talk to
[00:13:00 - 00:13:06] SPEAKER_01: something and it could be creative. I mean, that just blew people's minds. It still blows my mind
[00:13:06 - 00:13:11] SPEAKER_01: on a regular basis. And that was actually my first intro into it. I guess they
[00:13:11 - 00:13:16] SPEAKER_01: didn't even need a marketing department. I mean, the word of mouth just went wild. I mean, you saw
[00:13:16 - 00:13:20] SPEAKER_01: the stuff on, like, Instagram and TikTok, people talking about it, like, this wrote my kid's essay.
[00:13:20 - 00:13:29] SPEAKER_01: And of course I rolled my eyes a little bit, because I remember the first hype. I'm like, all right, yeah, sure. I go to ChatGPT, and I start playing around with it. And like,
[00:13:29 - 00:13:35] SPEAKER_01: my mind is blown. I don't remember the first... I gotta look at my history here. My ChatGPT history
[00:13:35 - 00:13:40] SPEAKER_01: is wild. I think the first thing I did was I had it write, like, a kid's book about
[00:13:40 - 00:13:46] SPEAKER_01: my wife, my kid, me, and their nanny, just to kind of show what it did. And
[00:13:46 - 00:13:51] SPEAKER_01: it was beautiful. It was beautiful. It was touching and sweet and to the point.
[00:13:51 - 00:13:54] SPEAKER_01: Like, it covered... you know, maybe it would have been like three or four pages long, right?
[00:13:54 - 00:14:01] SPEAKER_01: And it was very sweet, and I mean, it was crazy. And then from there, I wanted to see... I did
[00:14:01 - 00:14:05] SPEAKER_01: like a deep dive into all the random stuff you could do with it, because
[00:14:12 - 00:14:13] SPEAKER_01: being able to make it dance... the fact that you could have so much control over it
[00:14:18 - 00:14:21] SPEAKER_01: made it so much more fun. So that's where things like prompt engineering came in, where you can give it specific instructions on, like, the persona you wanted it to take. Maybe you wanted it to talk like
[00:14:22 - 00:14:25] SPEAKER_01: Mr. T or freaking William Shatner
[00:14:25 - 00:14:34] SPEAKER_01: or something, I don't know. And you could have it write poems about World War II in the
[00:14:34 - 00:14:38] SPEAKER_01: voice of Martha Stewart or something. I mean, I don't know. You could also do cool stuff
[00:14:38 - 00:14:44] SPEAKER_01: like that. Or if you wanted to turn the tables on it and have it ask you questions, you could
[00:14:44 - 00:14:45] SPEAKER_01: do that too. I think I
[00:14:45 - 00:14:50] SPEAKER_01: had it do a... my wife and I were talking about one day maybe building a house. It's one of those,
[00:14:50 - 00:14:54] SPEAKER_01: like, moonshot ideas that, you know, we want to do. And I'm like, hey, I don't know anything about
[00:14:54 - 00:15:00] SPEAKER_01: building a house. You are an architect and design consultant. Talk me through the process,
[00:15:00 - 00:15:08] SPEAKER_01: starting from the beginning stages, and ask me, you know, all of my preferences, and wait for my response, and then build off of that. And one by one, it would ask
[00:15:08 - 00:15:13] SPEAKER_01: me a question, like, how many rooms do I want? How big do I want it? I would give it an answer,
[00:15:13 - 00:15:20] SPEAKER_01: and it would ask me, like, everything from the style of the home, the layout, what region I'm
[00:15:20 - 00:15:28] SPEAKER_01: gonna live in, because that would play into how I would want it insulated, or what type of windows I would want, right? The design of the kitchen, the hardware,
[00:15:28 - 00:15:37] SPEAKER_01: you know, all that stuff. It sparks the imagination in a way that nothing has in a very,
[00:15:37 - 00:15:47] SPEAKER_01: very long time. And it makes people wonder what's possible. And when you start looking at it
[00:15:47 - 00:15:50] SPEAKER_01: in terms of what it can do for you as an individual, and then
[00:15:50 - 00:15:54] SPEAKER_01: what it can do for companies, and what it can do for schools, like,
[00:15:54 - 00:15:58] SPEAKER_01: what it could do for education... I mean, everybody
[00:15:58 - 00:16:04] SPEAKER_01: can benefit from this. So naturally, I was just... I
[00:16:04 - 00:16:05] SPEAKER_01: was in it. I was enthralled.
[00:16:05 - 00:16:07] SPEAKER_01: I couldn't get away from it.
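[Editor's note] The "turn the tables" pattern described above, giving the model a persona and telling it to interview you one question at a time, boils down to message bookkeeping against any chat-style API. The sketch below is illustrative, not the exact prompt from the episode: the system-prompt wording, `start_interview`, and `record_turn` are hypothetical names, and the actual network call to a model is left out.

```python
# Sketch of an interview-style prompt: assign a persona in the system message,
# instruct the model to ask ONE question per turn, and keep the growing
# conversation history so each new question builds on previous answers.

SYSTEM_PROMPT = (
    "You are an architect and design consultant. Talk me through the process "
    "of building a house, starting from the beginning stages. Ask me about my "
    "preferences ONE question at a time, wait for my answer, then build on it."
)

def start_interview():
    """Return the initial message history for a chat-style completion API."""
    return [{"role": "system", "content": SYSTEM_PROMPT}]

def record_turn(history, model_question, user_answer):
    """Append one question/answer round so the next call sees full context."""
    history.append({"role": "assistant", "content": model_question})
    history.append({"role": "user", "content": user_answer})
    return history

# Usage: each round, send `history` to a chat completions endpoint, display
# the model's question, and record the user's reply before the next round.
history = start_interview()
history = record_turn(history, "How many rooms do you want?", "Four bedrooms.")
history = record_turn(history, "What region will you live in?", "New England.")
```

The key design choice is that the full history goes back to the model every round; that is what lets a later question (window types, insulation) depend on an earlier answer (region).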
[00:16:09 - 00:16:13] SPEAKER_01: So my experience with it, after getting my fill...
[00:16:13 - 00:16:15] SPEAKER_01: I mean, I say getting my fill of ChatGPT
[00:16:15 - 00:16:16] SPEAKER_01: as if I'm done with it;
[00:16:16 - 00:16:18] SPEAKER_01: it's literally open on my screen right now.
[00:16:20 - 00:16:24] SPEAKER_01: ...is, as a consultant now, of course,
[00:16:24 - 00:16:26] SPEAKER_01: every single company hears about this disruptive technology.
[00:16:27 - 00:16:31] SPEAKER_01: They're like, hey, I don't know what this is or what we can do with it, but I know we need it.
[00:16:31 - 00:16:32] SPEAKER_01: We need it or we're going to be left behind.
[00:16:33 - 00:16:40] SPEAKER_01: So companies are just spending crazy, crazy, crazy dollars trying to figure out exactly how it can be used in their business.
[00:16:40 - 00:16:47] SPEAKER_01: Because when it first came out, I mean, it was cool, but it felt like... maybe a toy is the wrong
[00:16:47 - 00:16:53] SPEAKER_01: word, but how does it have any immediate impact on my business? And so now everybody's scrambling
[00:16:53 - 00:17:00] SPEAKER_01: to come up with all of these really cool applications and services that can somehow work
[00:17:00 - 00:17:10] SPEAKER_01: with what I'm doing in business and make me more money. Sadly, that's the core of my job: I help businesses make more money. So it seems kind of mercenary, but it affords me the
[00:17:10 - 00:17:15] SPEAKER_01: opportunity to learn all sorts of things about what's coming out, what companies are making,
[00:17:15 - 00:17:21] SPEAKER_01: between Google and Meta and OpenAI and all the open source work that's going on.
[00:17:22 - 00:17:26] SPEAKER_01: So I guess I would count myself fortunate, because I'm in a position
[00:17:26 - 00:17:31] SPEAKER_01: where I can do that. I definitely didn't go out looking for this. So it's not like I was searching
[00:17:31 - 00:17:36] SPEAKER_01: for a job in AI, because, I mean, that didn't even exist. For that matter, I didn't even look for a
[00:17:36 - 00:17:43] SPEAKER_01: job in data, but that's a whole other conversation.
[00:17:43 - 00:17:44] SPEAKER_01: But I'm here and I love doing it.
[00:17:46 - 00:17:46] SPEAKER_03: Yeah, fair enough.
[00:17:51 - 00:17:51] SPEAKER_03: I think a toy is the right word. When you say, you know, you weren't sure,
[00:17:54 - 00:17:57] SPEAKER_03: I think it is the right word. When it first came out, people were like, oh, this is this thing that I can mess around with.
[00:17:57 - 00:18:08] SPEAKER_03: Another thing that I found interesting about what you said was the uses and applications for it at first,
[00:18:09 - 00:18:15] SPEAKER_03: similar to a lot of things when they first come out. I think information is moving so fast now
[00:18:15 - 00:18:26] SPEAKER_03: that companies feel like they have to take more risk than before. If this had come out 20 years ago... I feel like there was more
[00:18:26 - 00:18:33] SPEAKER_03: embracing of this than, let's say, when email came out back in the early
[00:18:33 - 00:18:39] SPEAKER_03: 2000s. At least... I wasn't that old, but from what I hear, people
[00:18:39 - 00:18:47] SPEAKER_03: were like, why do you need email? You can just send mail. What does email do? And now no company can do anything without email.
[00:18:48 - 00:18:55] SPEAKER_03: And I feel like people were much quicker to maybe take the leap immediately and say, I don't know what this is,
[00:18:55 - 00:18:57] SPEAKER_03: and I don't really know how to use it,
[00:18:57 - 00:19:09] SPEAKER_03: but I'm going to try and figure out how to use it, rather than say, that's stupid. And I wonder... I've kind of seen this, or maybe
[00:19:09 - 00:19:17] SPEAKER_03: I've made this up, but there's this pattern that I've seen of, like, people who...
[00:19:17 - 00:19:27] SPEAKER_03: like, the gap between... or the ability to, like, take off with a business is much faster now than it used to be,
[00:19:27 - 00:19:32] SPEAKER_03: and also the ability to just absolutely plummet and tank can happen. Like, things
[00:19:32 - 00:19:38] SPEAKER_03: are happening much faster. Is that a fair assessment? Have you seen kind of a
[00:19:38 - 00:19:45] SPEAKER_03: similar situation with people jumping in to embrace AI, and it both working and maybe not working?
[00:19:45 - 00:19:48] SPEAKER_01: I don't know if AI can tank a company yet.
[00:19:50 - 00:19:51] SPEAKER_01: So there's a few things.
[00:19:52 - 00:19:53] SPEAKER_01: I'll get to them one by one here,
[00:19:53 - 00:19:54] SPEAKER_01: because those are all really good questions.
[00:19:54 - 00:19:57] SPEAKER_01: First, with the companies adopting AI:
[00:19:58 - 00:20:01] SPEAKER_01: as a rule, companies tend to be pretty change-averse
[00:20:01 - 00:20:02] SPEAKER_01: because they're risk-averse.
[00:20:04 - 00:20:05] SPEAKER_01: If what they've got is
[00:20:05 - 00:20:10] SPEAKER_01: working and they're making money off of it, if they mess with a winning solution, they risk
[00:20:11 - 00:20:17] SPEAKER_01: losing a lot of money if they're wrong. But there have been enough lessons learned over time,
[00:20:18 - 00:20:23] SPEAKER_01: a lot of lessons learned about the dangers of ignoring disruptive technology. So if you're smart
[00:20:23 - 00:20:29] SPEAKER_01: in what you're doing, if a business is smart, they've always got somebody keeping an eye on the horizon saying,
[00:20:29 - 00:20:33] SPEAKER_01: okay, what's coming down the road? What can really change things? You know, what's going to make us
[00:20:33 - 00:20:43] SPEAKER_01: have to change to keep up? And that speaks a lot. The fact that everybody is jumping on the AI train
[00:20:43 - 00:20:50] SPEAKER_01: right now... a lot of it is hype, but a lot of it is not. And the fact that everybody is jumping on kind of speaks to how disruptive this is.
[00:20:50 - 00:20:59] SPEAKER_01: Because you're right, everybody feels like they need to have it. They don't know what it is yet. They don't know how they're going to use it. They don't know how it's going to make them money or benefit them yet.
[00:20:59 - 00:21:10] SPEAKER_01: And that's why they bring in literally anybody who claims to have any amount of knowledge on it, which, to be perfectly frank, isn't much, because it's changing so quickly.
[00:21:11 - 00:21:15] SPEAKER_01: Our experience is everything we've learned and done and built over the last year.
[00:21:16 - 00:21:19] SPEAKER_01: I mean, that's nothing in terms of new technology, right?
[00:21:22 - 00:21:27] SPEAKER_01: But companies are just asking anybody with any amount of experience to come in and tell us, hey, how is this going to help us?
[00:21:27 - 00:21:32] SPEAKER_01: Because we need it before we get left behind, because it is changing so fast.
[00:21:33 - 00:21:39] SPEAKER_01: And the abilities of the models today are improving so quickly.
[00:21:40 - 00:21:48] SPEAKER_01: All the big guns out there that are making these models are coming out with newer, bigger, better models with all-new capabilities.
[00:21:48 - 00:21:54] SPEAKER_01: It's insane what's possible now compared to what was possible just a year ago.
[00:21:56 - 00:21:59] SPEAKER_01: So, yeah, companies are trying to jump on this.
[00:22:00 - 00:22:10] SPEAKER_01: The other question where it is easy for people to jump in and kind of start new businesses and, you know, these things will tank businesses. | |
[00:22:10 - 00:22:28] SPEAKER_01: That's also true, and not just because of AI. Especially with tech, the barrier to entry is much, much, much less than it was, let's say 10 years ago. Um, cause I'm not old enough to speak to like, like 30, 40 years ago, but. | |
[00:22:30 - 00:22:32] SPEAKER_01: A lot of that has to do with, with the cloud. | |
[00:22:32 - 00:22:33] SPEAKER_01: Everything is coming out now. | |
[00:22:33 - 00:22:37] SPEAKER_01: Um, so you've got your three main, I think I'm gonna piss
[00:22:37 - 00:22:40] SPEAKER_01: off some of the platforms here, but the platforms that I mess
[00:22:40 - 00:22:43] SPEAKER_01: with here: you've got Azure, you've got AWS, and you've got Google.
[00:22:44 - 00:22:46] SPEAKER_01: And then you've got,
[00:22:46 - 00:22:49] SPEAKER_01: you know, a whole bunch of different other platforms that are built on top of them. | |
[00:22:52 - 00:22:56] SPEAKER_01: The short of that, if anybody listening is unfamiliar or isn't in the computer
[00:22:56 - 00:23:02] SPEAKER_01: space: you've got all of these services that are offered that make it easier to build new
[00:23:02 - 00:23:05] SPEAKER_01: applications. If you've got some great idea about how you're going to build an app | |
[00:23:05 - 00:23:06] SPEAKER_01: or how you're going to, | |
[00:23:07 - 00:23:09] SPEAKER_01: you know, you want to start a business | |
[00:23:09 - 00:23:11] SPEAKER_00: and you want to have like a database | |
[00:23:11 - 00:23:12] SPEAKER_00: and store your data | |
[00:23:12 - 00:23:13] SPEAKER_00: and you want to have something a little more sophisticated | |
[00:23:13 - 00:23:15] SPEAKER_00: than just writing stuff down on an Excel sheet. | |
[00:23:16 - 00:23:20] SPEAKER_01: You can do that for much less of a cost | |
[00:23:20 - 00:23:21] SPEAKER_01: than you would back in the day. | |
[00:23:21 - 00:23:22] SPEAKER_01: Back in the day, you'd have to like | |
[00:23:22 - 00:23:24] SPEAKER_01: buy these gigantic servers, right? | |
[00:23:28 - 00:23:31] SPEAKER_01: You'd have to set up a special room and provide security and make sure that power is going to it right. | |
[00:23:31 - 00:23:34] SPEAKER_01: And God forbid that somebody flips the wrong light switch | |
[00:23:34 - 00:23:35] SPEAKER_01: and everything shuts down. | |
[00:23:38 - 00:23:39] SPEAKER_01: It's not like that anymore. | |
[00:23:39 - 00:23:42] SPEAKER_01: So it's much easier to get into it than it was before. | |
[00:23:43 - 00:23:50] SPEAKER_01: Now, combine that with AI. AI is the great equalizer. | |
[00:23:53 - 00:23:58] SPEAKER_01: Anybody, the only thing I can compare this to is like the internet or Google, where | |
[00:23:58 - 00:24:05] SPEAKER_01: anybody with any amount of desire to learn, like, I just want to know more. I want to be better. I want to try something new. | |
[00:24:07 - 00:24:08] SPEAKER_01: All they have to do is ask. | |
[00:24:13 - 00:24:14] SPEAKER_01: I can go to ChatGPT right now.
[00:24:16 - 00:24:16] SPEAKER_01: And in fact, I have.
[00:24:17 - 00:24:19] SPEAKER_01: Earlier today and yesterday, I was doing my own little pet project here
[00:24:19 - 00:24:22] SPEAKER_01: where for whatever reason, | |
[00:24:22 - 00:24:27] SPEAKER_01: I wanted to understand how this encryption algorithm worked. | |
[00:24:28 - 00:24:30] SPEAKER_01: It's called SHA-256, if you know about it. | |
[00:24:31 - 00:24:33] SPEAKER_01: Nothing new. | |
[00:24:33 - 00:24:35] SPEAKER_01: And I just wanted to learn about it. | |
[00:24:35 - 00:24:39] SPEAKER_01: And I asked it, okay, you are a professor of encryption. | |
[00:24:40 - 00:24:45] SPEAKER_01: And tutor me from base principles | |
[00:24:45 - 00:24:47] SPEAKER_01: on how the SHA-256 encryption algorithm works. | |
[00:24:48 - 00:24:49] SPEAKER_01: And it gives me a whole outline | |
[00:24:49 - 00:24:51] SPEAKER_01: and then I ask it to dive deeper into each step. | |
[00:24:52 - 00:24:53] SPEAKER_01: And then it does that. | |
[00:24:53 - 00:24:54] SPEAKER_01: And then I say, okay, well, I don't quite understand this. | |
[00:24:54 - 00:24:55] SPEAKER_01: Can you provide me an example? | |
[00:24:56 - 00:24:57] SPEAKER_01: And it builds out an example | |
[00:24:57 - 00:24:58] SPEAKER_01: and shows me how all the calculations are done | |
[00:24:58 - 00:25:00] SPEAKER_01: and say, okay, what if you did it this way? | |
[00:25:00 - 00:25:04] SPEAKER_01: Does this happen every time, just like this?
[00:25:04 - 00:25:05] SPEAKER_01: Well, no, actually, you would do it that way.
[00:25:05 - 00:25:14] SPEAKER_01: It's like talking to a personalized tutor who listens to you,
[00:25:14 - 00:25:20] SPEAKER_01: who's patient, who can adjust on the fly, who doesn't get frustrated with you. | |
[00:25:21 - 00:25:26] SPEAKER_01: So you can learn anything you want. Now, there are some risks that I'm happy to talk | |
[00:25:26 - 00:25:36] SPEAKER_01: about in a bit here. But you can learn anything you want, you can create, you can create basic, | |
[00:25:36 - 00:25:42] SPEAKER_01: again, God, I feel like I have to attach, like little fine print here, just like be very careful | |
[00:25:42 - 00:25:45] SPEAKER_01: about this. But you can create like basic | |
[00:25:45 - 00:25:53] SPEAKER_01: contracts, or you can create marketing material, or you can create your own your own brand, like | |
[00:25:53 - 00:25:56] SPEAKER_01: your profile, you know, you can talk through it about like brand management, if you wanted to | |
[00:25:56 - 00:26:03] SPEAKER_01: start a company, you can talk through potential risks to starting a business.
[00:26:03 - 00:26:06] SPEAKER_01: Let's say you want to start a
[00:26:06 - 00:26:09] SPEAKER_01: restaurant. What are all the things you'd have to consider? It would give you ideas, and you can
[00:26:09 - 00:26:16] SPEAKER_01: bounce ideas, you can dig deeper into those points. You're not limited anymore. If you wanted
[00:26:16 - 00:26:25] SPEAKER_01: to try, you can. It's an incredible shift in the power of who knows what.
[00:26:26 - 00:26:29] SPEAKER_01: And I'm really looking forward to seeing what happens with that.
[00:26:29 - 00:26:32] SPEAKER_01: But yeah, it is easier to start your own thing. | |
[00:26:32 - 00:26:34] SPEAKER_01: You can work on your own. | |
[00:26:34 - 00:26:37] SPEAKER_01: You can jump into it with everybody else. | |
[00:26:37 - 00:26:40] SPEAKER_01: And for that reason, that's exactly why companies | |
[00:26:40 - 00:26:42] SPEAKER_01: who aren't embracing this are going to fall. | |
[00:26:42 - 00:26:44] SPEAKER_01: You just, you have to learn about it. | |
[00:26:44 - 00:26:46] SPEAKER_01: Even if you're not building things with it, | |
[00:26:46 - 00:26:48] SPEAKER_01: at least understand it, be literate. | |
[00:26:48 - 00:26:51] SPEAKER_01: It's no different than when computers first came out, | |
[00:26:51 - 00:26:52] SPEAKER_01: when the internet first came out. | |
[00:26:52 - 00:26:54] SPEAKER_01: You have to be literate and understand it
[00:26:54 - 00:26:55] SPEAKER_01: and be a part of it. | |
[00:26:55 - 00:26:56] SPEAKER_01: Otherwise you're going to fall behind. | |
[00:26:57 - 00:26:58] SPEAKER_03: Yeah, that makes total sense. | |
[00:27:00 - 00:27:02] SPEAKER_03: The education thing is interesting | |
[00:27:02 - 00:27:06] SPEAKER_03: because I don't, did you go to college? | |
[00:27:06 - 00:27:07] SPEAKER_03: I did. Yeah. | |
[00:27:07 - 00:27:13] SPEAKER_03: Okay. I didn't, I couldn't remember because I knew that you were doing military stuff. And | |
[00:27:13 - 00:27:19] SPEAKER_03: oftentimes that takes the place of school for some people. But so you've obviously
[00:27:19 - 00:27:25] SPEAKER_00: seen, I would assume the stuff about college being not worth the money anymore and people should go to | |
[00:27:25 - 00:27:29] SPEAKER_00: trade school or whatever, because then you're not a hundred thousand dollars in debt or whatever | |
[00:27:29 - 00:27:36] SPEAKER_03: the number is. You've seen all that, right? So yeah, this is my question, and there's no
[00:27:36 - 00:27:40] SPEAKER_00: answer, I don't think, but a thought that I've had is, so I totally agree with all
[00:27:40 - 00:27:45] SPEAKER_03: the things that I just laid out. The thing also, though, is I feel like we're entering
[00:27:45 - 00:27:50] SPEAKER_03: a bit of a wild west period, because one of the things that colleges were able to provide
[00:27:50 - 00:27:56] SPEAKER_03: was a kind of certification that everyone could trust, like an accreditation. It's like, you
[00:27:56 - 00:28:08] SPEAKER_03: went here, we know that you learned at least, like, blank things. And now you've got people, maybe rightly, saying, I became an electrician
[00:28:08 - 00:28:14] SPEAKER_03: on YouTube, and I took this one test on the internet, and now I'm certified because of some
[00:28:14 - 00:28:18] SPEAKER_03: website. And then it's like, do you want that guy to come to your house and
[00:28:18 - 00:28:23] SPEAKER_03: mess with your electrical system? It's like, well, I don't really know what I want.
[00:28:25 - 00:28:25] SPEAKER_00: You know what I mean? | |
[00:28:33 - 00:28:33] SPEAKER_00: And so ChatGPT, and AI generally, I should stop calling AI ChatGPT, because it's not that.
[00:28:34 - 00:28:34] SPEAKER_00: AI is its own thing.
[00:28:39 - 00:28:39] SPEAKER_00: But AI is going to become kind of, like you said, the great equalizer. | |
[00:28:46 - 00:28:54] SPEAKER_00: And people are going to have access to all this information and knowledge without necessarily the accreditation. And I am fully behind someone putting the universities in their place, right? But I wonder what would
[00:28:54 - 00:29:02] SPEAKER_03: take that part of it away. Like, that piece of it is the reason, at least to me, why they
[00:29:02 - 00:29:05] SPEAKER_03: haven't just tanked completely yet. And I
[00:29:05 - 00:29:10] SPEAKER_03: wonder how society is going to, like, fiddle through that, because there's going to be
[00:29:10 - 00:29:14] SPEAKER_03: some growing pains for sure. Like, you're going to have people advertising
[00:29:14 - 00:29:21] SPEAKER_03: as though, I'm an expert in this. Like, how do you know? Well, I just put it into AI and it taught
[00:29:21 - 00:29:28] SPEAKER_03: me. It's like, well, I don't really know that you know that. Like, you can put it on a resume, and you can answer all the right questions in the interview, because you prepared,
[00:29:28 - 00:29:32] SPEAKER_03: you really studied for it. And then as soon as you get on the job, you don't actually know what
[00:29:32 - 00:29:39] SPEAKER_03: you're doing. And I mean, that's okay, right? Like, you can train people. But what
[00:29:39 - 00:29:42] SPEAKER_00: are your thoughts on that, as society as a whole kind of going through that? It feels like a
[00:29:42 - 00:29:50] SPEAKER_00: transition process and a bit of growing pains for society as a whole. I mean, obviously there's other things going on in
[00:29:50 - 00:29:55] SPEAKER_00: society that make life not the easiest at the moment, but this one in particular feels like a
[00:29:55 - 00:30:04] SPEAKER_01: big one as well. So yeah, you're gonna hear me jump back and forth across both sides of the fence
[00:30:04 - 00:30:07] SPEAKER_01: a lot on this, and it might frustrate you, but bear with me.
[00:30:07 - 00:30:11] SPEAKER_03: Speculate. Speculate all you want, that's what this is for, right? People just having a conversation.
[00:30:14 - 00:30:20] SPEAKER_01: First off, knowledge absolutely shouldn't be gatekept by a degree. | |
[00:30:21 - 00:30:23] SPEAKER_03: It's just a degree. I've met some very dumb people with degrees. It's just a fact.
[00:30:27 - 00:30:31] SPEAKER_00: Having a degree doesn't make you smart. It means you can finish something, which I think, when people require that
[00:30:31 - 00:30:34] SPEAKER_00: you have a degree, I think that's really what they're looking for. They know you can show up | |
[00:30:34 - 00:30:39] SPEAKER_00: on time. They know you can commit to something and you can finish it. But still, there's no | |
[00:30:39 - 00:30:49] SPEAKER_01: guarantees there. And on top of that, I guess I would say, as much fun as I had at college
[00:30:49 - 00:30:54] SPEAKER_01: learning the things that I did, my story is not much different than anybody else's when I say that
[00:30:54 - 00:30:59] SPEAKER_01: I am not using at all what I learned in college, in terms of my degree. I learned how to problem
[00:30:59 - 00:31:08] SPEAKER_01: solve, which I use every single day. But my degree was in biophysics.
[00:31:08 - 00:31:11] SPEAKER_01: And I did that just because I took a physics class | |
[00:31:11 - 00:31:11] SPEAKER_01: and I enjoyed it. | |
[00:31:11 - 00:31:12] SPEAKER_00: I was like, oh, this is really cool. | |
[00:31:13 - 00:31:16] SPEAKER_01: No, I don't use physics at all.
[00:31:16 - 00:31:17] SPEAKER_01: Why would I? | |
[00:31:20 - 00:31:24] SPEAKER_01: But I've learned more outside of school | |
[00:31:24 - 00:31:27] SPEAKER_01: than I have in college. | |
[00:31:27 - 00:31:30] SPEAKER_01: And in my own opinion, that's the way it should be. | |
[00:31:30 - 00:31:33] SPEAKER_01: I've been out of school for, all right, carry the two. | |
[00:31:34 - 00:31:40] SPEAKER_01: I will say, I don't know, like, close to 10? God, how long has it been?
[00:31:41 - 00:31:46] SPEAKER_01: Let's see, is it 20? Well, we'll just say 10 years for easy math, and I don't feel like
[00:31:46 - 00:31:51] SPEAKER_01: remembering right now, but it's been a while, man. If I stopped learning the day I got out of school,
[00:31:51 - 00:31:59] SPEAKER_01: then I'd be dumb. I guess if most people felt that way, the world would be dumb, so take from
[00:31:59 - 00:32:07] SPEAKER_01: that what you will, if we're making comments about the state of society right now.
[00:32:07 - 00:32:17] SPEAKER_01: But you should always be learning, you should always be picking up new things. It doesn't have to be studying, it's just having interest. What's new? How does it work? Be curious. | |
[00:32:17 - 00:32:28] SPEAKER_01: Having curiosity makes you an expert eventually, you know. So, with the university thing out of the way,
[00:32:28 - 00:32:35] SPEAKER_01: well, not quite yet, my opinion of what's going to happen when knowledge becomes this
[00:32:35 - 00:32:42] SPEAKER_01: accessible, when creation becomes this accessible, is that universities will probably go back to
[00:32:42 - 00:32:48] SPEAKER_01: what they used to be, which is a place for people to expand our existing knowledge.
[00:32:48 - 00:32:50] SPEAKER_01: That's what all the PhDs do, right? | |
[00:32:50 - 00:32:52] SPEAKER_01: I mean, that's the whole reason they work at the university. | |
[00:32:52 - 00:32:53] SPEAKER_01: It's actually not to be a teacher.
[00:32:53 - 00:32:56] SPEAKER_01: That's part of their contract. | |
[00:32:57 - 00:33:00] SPEAKER_01: You get to use our facilities and experiment and learn | |
[00:33:00 - 00:33:06] SPEAKER_01: and philosophize and think in exchange for teaching us. | |
[00:33:06 - 00:33:07] SPEAKER_01: And we'll also give you a paycheck. |