
Unlock the Secrets of Certified Scrum Product Ownership in the Age of AI

Presented by Bob Schatz

Unlock Your Potential: Thriving as a Product Owner in the AI Era!

What You’ll Discover:

  • Key skills for successful AI product ownership
  • Insights into becoming an AI-focused Certified Scrum Product Owner
  • The transformative impact of AI on product ownership

Join Dr. Bob in an upcoming course to earn your Certified Scrum Product Owner certification: https://hyperdriveagile.com/training/courses/cspoai

Transcription:
Speaker 4
(0:13) Welcome, everyone, to a special webinar. Today we are combining forces with our friends at the Silicon Valley Project AI Meetup to bring you a little bonus talk this month.

(0:21) Normally, we only do a talk once a month, but you get an extra treat today.

(0:26) It seems like everywhere you turn, there’s more news about AI being incorporated more and more into our everyday world.

(0:30) And today, Dr. Bob will share insights, bless you, on how AI is changing the landscape of product management.
(0:38) If you’d like to continue to learn more from Bob, join us in our upcoming Certified Scrum Product Owner class for AI projects.
(0:45) Our session this month starts on June 20th.

(0:48) After our talk today, I’ll send out an email with more details on how to join, including a promo code for $100 off, and that promo code is going to be good for 24 hours. (0:57) It’ll expire tomorrow at midnight, so be sure to sign up right away. (1:01) And as an added bonus, folks who sign up within the 24-hour period will be entered into a drawing to win a complimentary Certified ScrumMaster class, so you can come back and take the CSM with Bob again.

(1:13) And then another thing is the email will also include details on how to score a free coaching session with Bob as well. (1:19) So Bob on all angles. (1:22) Be careful what you win.

(1:25) Be on the lookout for that. (1:27) Great to see everyone. (1:28) I will turn it to Rob from Silicon Valley AI Meetup for a few words.


Speaker 2
(1:33) Awesome. (1:34) Thank you so much.
(1:38) Thank you, everyone, for joining us.

(1:40) We have an action-packed session ahead, full of insights on product management and navigating how AI will impact your careers.
(1:50) I’m Rob Franklin, CSP and webinar co-sponsor with Hyperdrive.
(1:55) Before we dive in, I wanted to just share some exciting updates.

(1:59) Silicon Valley Project AI is going to have its in-person meetup taking place in July. (2:06) Sign up now to be the first to hear the details. (2:09) We’re going to have two incredible speakers.
(2:11) The first one will be on models you may not know but should, and achieving GPT-4 performance 10x faster and 25x cheaper.
(2:25) And the second will be on how to identify and use the jobs-to-be-done framework for developing meaningful AI features.

(2:33) We are looking for sponsorship support and, in particular, a hosting venue.
(2:38) So please reach out or DM me if you have one or would like to discuss sponsoring.

(2:43) Our AI Quick Bytes newsletter is your go-to resource for concise, powerful insights into AI strategy, trends, tools, and learnings.
(2:53) And if you’re interested in launching an AI innovation as a service program or need a fractional PMO at your company, visit our Journey to Agility website.

(3:04) I’m going to drop all these links into the chat. (3:08) And get ready for a great webinar. (3:12) I highly advise turning off all your distractions and focusing on unlocking the secrets of certified Scrum product ownership in the age of AI with Dr. Bob Schatz, CST. Enjoy.


Speaker 1
(3:27) All right.


Speaker 2
(3:28) Thank you. (3:28) Back to you, Shariann. (3:30) Nope.


Speaker 1
(3:31) Go ahead, Shariann.


Speaker 4
(3:32) I think I’ll pass it to Stacey. (3:35) Stacey, do you want to say a few words, or should we get the show on the road?
(3:38) No, no. (3:38) Go for it, Stacey.


Speaker 3
(3:39) I like to hear her. (3:40) All right. (3:41) Thanks, everyone.
(3:41) So everybody, this is so awesome. (3:44) We talked about product management. (3:46) We’re talking about AI.

(3:47) Back about 15 years ago, I was working for a company called Xerox PARC.

(3:51) And those of you who are familiar with Xerox PARC, we created a lot of interesting inventions. (3:56) We did a lot of AI back then, and a lot of AI that was really dedicated to DOD purposes and publishing.

(4:04) Fast forward 15 years, AI is everywhere, as you guys can tell. (4:07) But a lot of the underlying nature of AI still hasn’t changed as far as how it can affect the future of a product. (4:14) The things that Dr. Bob’s going to cover today are things that government agencies have been thinking about for over a decade, like how you productize this, how you integrate this, and what it can mean for your products as you evolve them.

(4:29) So I’m very excited to hear what Dr. Bob has to say. (4:33) For those of you who don’t know Dr. Bob, he’s been one of our premier trainers. (4:37) He’s been doing agile training for years, and he’s a product expert.

(4:42) So he takes all of that, combines it with his doctorate in organizational management, and brings it together with his great skill in being able to teach and coach people. (4:53) Welcome, Dr. Bob.


Speaker 1
(4:55) Thanks, Stacey. (4:56) Thank you very much. (4:58) And thanks, everybody, for attending.


(5:02) And it was just a few months ago that I approached Stacey with an idea to start poking around and helping people do product management with AI as the target. (5:19) There was a lot of stuff going on, as many of you know, with using things like ChatGPT and GPT-everything to do all of our jobs.

(5:29) And, of course, the fear is there. (5:31) But to me, the bigger challenge that I was seeing, right, those are things we’re going to all learn to use and probably have already. (5:39) The other side of it is how do these things get created, right? (5:42) And that’s where I was starting to see more of the struggle when something comes out of the darkness and into the light again for the company that’s done that.

(5:53) And companies start seeing dollar signs on what it could do for them in cost savings and making money. (5:59) And there’s all kinds of statistics out there. (6:00) I’ll show you some.

(6:02) But I thought it would be interesting to start to combine these practices and take what we know about agile practices and their use on these types of projects and combine it with the AI challenges that are there and see what we can put together. (6:18) And so I did a lot of hard work in trying to put it together. (6:20) So hopefully this is helpful for everybody.

(6:23) And, you know, again, we’re trying to drive, you know, a lot more details into a CSPO course.(6:30) And we’ll talk more about that later. (6:31) But I appreciate everybody attending.

(6:33) And hopefully this is helpful for everybody. (6:34) And we’ll have a question-and-answer session towards the end here so we can see some of your experiences and hear from some of you, and I really look forward to that. (6:44) So hopefully you enjoy this.

(6:47) So before we get started, I was going to give you my background. (6:51) But I thought, like, who wants to hear all that? (6:54) So while I’m talking about myself, which I’m very good at, I thought maybe you could do a little poll question.

(7:01) So if you want to put your iPhone up there or whatever you have, an Android device, and poke that little thing there and see what you’re going to come up with. (7:10) I’ll show you the results in a few minutes.

(7:13) So my experience has been 40 years in software and systems development, and I was everything from a developer to a C-level executive. (7:25) As far as AI goes, I’ve had a lot of clients over the years that have been dabbling in that and producing things that fit that mold, and I always tried to help them modify what they were doing with that.

(7:38) Personally, my first AI experience was back in 1981 in college. (7:45) I took an artificial intelligence class.

(7:47) I thought that would be pretty cool.

(7:48) We wound up just creating a monopoly simulator, which was all programmatic and deterministic.

(7:54) So I’m not sure; it was mostly random stuff, but not anything like what we’re seeing today. (7:58) But that’s where I got my start with this stuff.
(8:02) So in my experience, I’ve been in a lot of different environments, worked in the aerospace industry, and flew satellites for a living for a while.
(8:17) I got into a startup in the extremely fun world of pharmaceutical regulatory affairs and getting drugs approved by the FDA.
(8:29) We actually came up with a way of using a brand-new technology back in the early to mid-90s called PDF.
(8:37) Many of you may have heard of it by now.
(8:40) And using that to publish massive documents for drug approvals for the FDA, and then we went global.

(8:48) So that’s where I learned a lot about agile practices before anybody had the word agile in mind.
(8:56) And it was during those time periods that a lot of these practices evolved.
(9:02) We had to build things in a very short amount of time.
(9:06) We needed a great team of people.
(9:07) And the idea of how to go from there to some other point, which we thought was big, but then you wound up going somewhere else.
(9:16) But it was kind of interesting.
(9:18) And so I watched these practices evolve.

(9:21) And after our startup had been very successful and got sold off, I decided to move on and took a job at a company called Primavera Systems, which is now part of Oracle, and got heavily into project management, which was kind of ironic because I was very good at managing projects, but I have a hatred for project management software.
(9:45) So I thought that would be a great job to take, leading an organization that developed project management software.
(9:53) And after a year of trying to do my best to fix things, decided that we should try something really crazy, this thing called Scrum, and that’s how I got involved with Scrum early on.

(10:06) I worked with Ken Schwaber and Mike Cohn and a lot of the popular names in there.
(10:13) Because unbeknownst to us, we became one of the early adopters of Scrum at an enterprise level.
(10:21) This was not at a point where massive corporations were doing it.
(10:23) This was a 20-year-old legacy product with 150 people.
(10:27) So at the time, that was pretty big for Scrum, because apparently nobody had really tried to tackle that.

(10:34) So that’s how I got into Scrum and Extreme Programming, got my group to start doing that, and started to create what I guess was an early showcase for people who were thinking that maybe this wouldn’t work where they were, just trying to show them how it actually works.

(10:51) So I’ve been involved with that for quite some time.
(10:54) For the last 18 years, I’ve been teaching, coaching, advising, guiding, and anything else I need to do to help people that want to move forward in their practices.
(11:08) Agility in general is important to me, making sure that organizations are set up to serve their customers in better ways and do a better job of taking care of their people while they’re doing it.

(11:22) So I try to help organizations in that light.
(11:26) There’s a bunch of organizations here I’ve worked with in some capacity, trying to help them out.
(11:32) As Stacey said, I’m a certified Scrum trainer.
(11:34) I’ve been doing that for a long time.
(11:37) I think I started that back in 2006.
(11:42) And for education, I have an undergrad in computer science, a master’s in organizational dynamics, and I did my doctorate in management.
(11:48) I actually focused on the whole dynamic of radical change, how that happens, and all the people aspects of that.

(11:56) So if you’re ever interested in that, you can always contact me and I’ll tell you more about that sometime.
(12:02) So anyway, who cares about all that?
(12:04) It’s crazy. (12:05) All right.
(12:06) What are we trying to tackle here?

(12:09) So there are three things I wanted to do in this session with the time we have.

(12:16) One is really just to start to look at how Scrum and agile approaches in general fit in with AI product development, which is very different in itself.
(12:27) So sort of merging these two aspects and other things that have evolved that would help as well.
(12:34) And the product owner role in these becomes extremely important also.
(12:38) One of the things I was really considering not too long ago, when I was trying to focus, is what’s probably more important in these projects: the product owner or the Scrum Master. As I was looking at the different parts of this, all the things that are happening, and the different experiences companies are having, I kind of landed on the product owner role.

(13:01) Companies are really having problems with setting business objectives, understanding what problem they’re actually trying to solve, and all the operational things that go with deploying AI.
(13:12) So I tried to focus on that.
(13:14) Maybe sometime in the future here, we’ll start to go to the Scrum Master side. (13:20) I know a lot of people are teaching courses on how to use ChatGPT to do your job better and all that stuff.
(13:27) And that’s great.
(13:28) I’ve done some of that myself and certainly use those things on a regular basis.
(13:33) But this product owner role becomes really critical in product success.
(13:38) And so we’re going to kind of focus on that, give you some insight into that, and then talk about this CSPO that’s really got a focus on AI and data product development.

(13:51) So I tried to put a little spin on it while still meeting the criteria for what the Scrum Alliance likes for the CSPO, of course.
(13:59) So it wasn’t really that hard to mesh the two and just get a different focus on it. (14:04) So helping people do the translation instead of trying to have to work it out themselves. (14:11) All right.
(14:11) So those are the things we’ll try to cover.

(14:13) And let’s see if this actually works. (14:17) Oh, let’s see.
(14:18) I got to do this. (14:19) Let’s see if this works. (14:22) I don’t know if this is right.
(14:24) I don’t really use this a lot. (14:26) Oh, I got it on the other place.
(14:28) All right.
(14:29) Well, let me share this real quick so you can all see what turned out of this.
(14:39) Here we go. (14:40) We’ll do this real quick.

(14:42) In case you’re wondering, it seems like 50% of the people, about 50% are camera off people. (14:50) Got the blurs. (14:53) No one’s pedaling. (14:55) I get a lot of that in classes. (14:56) People are always pedaling. (14:58) I’m really happy they are pedaling. (14:59) I’m kind of jealous they’re pedaling. (15:01) But it looks like you’re actually doing stuff. (15:03) And always the Golden Gate Bridge background, a classic. (15:08) So just that seemed like a minor one. (15:10) So it looks like everybody’s on camera. (15:12) So it makes me wonder, because I started wondering about this at the beginning of the pandemic when people were having a lot of video meetings and not turning video on. (15:22) And I was like, then why use video conferencing? (15:24) So I never really understood that with people. (15:26) I like to keep my camera on, but that’s just me. (15:30) All right. (15:31) That was kind of fun. (15:33) Interesting. (15:35) So there we go. (15:36) All right. (15:36) So let’s get back to the show. (15:42) All right.

(15:43) So the first thing I wanted to tackle is a question that comes up quite a bit. (15:48) Because if you’re on LinkedIn any more than five minutes, usually there’s an “Agile is dead” post. (15:56) And, you know, for somebody that’s been involved with Agile practices for like 25 years, that’s extremely concerning. (16:04) But I just wanted to let everybody know that the reports of Agile’s death have been grossly exaggerated. (16:10) It is not dead.

(16:12) This wave, AI and its popularity, you know, if you think about, well, how are you going to go about developing these types of products? (16:24) It takes a lot of teamwork. (16:26) It takes a lot of adaptability because things are really fluid, more than most things that we usually develop. (16:35) And the fluidity of these projects does not end when it gets released. (16:40) That is just the beginning of the fun.

(16:43) So really, when you think about it, the only way to really go about this is using some kind of Agile practice, right? (16:49) Now, what I believe, and, you know, everybody’s got opinions on these things, but what I think is the dead part is a lot of the hype, which is good. (17:01) I mean, this is one of the things that people used to ask me many years ago when these practices started and we were just starting to teach people these classes.

(17:12) They used to ask, like, what do you think the future is going to be?

(17:15) I was like, well, I hope the future is where we’re never talking about things like Scrum. (17:19) We’re just doing things like that. (17:21) We’re learning how to build things in different ways. (17:23) And then we don’t talk about it anymore. (17:26) So I guess we’re getting there in some way. (17:29) So, you know, all the hype, you know, the dogma about like, oh, you got to follow this. (17:33) You got to do this.

(17:34) And of course, there’s all kinds of framework debates out there. (17:37) But, you know, I think if you look at how things are shifting and what companies are doing, I think they’re giving us the message of what they’re looking for. (17:47) You know, usually you can see in what people are recruiting for and trying to get people in the companies, you know, what they’re asking for is usually a good telltale.

(17:57) But it’s interesting because, like I said from the start, agile practices of any kind were always about trying to deliver value in complex environments. (18:08) We did not in the beginning really think much about this stuff. (18:13) There wasn’t a lot of well, there weren’t any posts because there was no social media, really. (18:18) So we didn’t have to worry about that. (18:20) So it was all just a bunch of people that were interested in trying to build better products.

(18:25) And so I’m hoping that, you know, that is what we’re on our way back to. (18:31) And then, of course, you know, there’s a lot of discussion about, you know, mostly bashing about certifications. (18:38) And, you know, I get it. (18:40) You know, I get what that’s all about. (18:41) But, you know, the more you see new things coming like AI training and, you know, different things you’re going to be learning. (18:50) I mean, certifications are stronger than they ever were.

(18:55) I’m guessing, in my opinion, it’s because people are much more fluid in their jobs, like they’re moving around more. (19:02) And, you know, your reputation and your skills are coming through things like LinkedIn, where you have to post up like, oh, here’s what I’ve been through is what I’ve learned. (19:13) And, you know, so a lot of it’s coming through there.

(19:16) So people want to have something. (19:18) So there’s a lot of value in that, in a way. (19:22) So, you know, I think the value is still there. (19:25) You know, which one you go to and all that stuff is, you know, something you have to figure out. (19:29) But, you know, again, you know, people are asking for these things in job requirements and things like that. (19:37) So that’s usually what I’m looking at: is this still important for people? (19:42) And beyond the certification, you know, just getting the education. (19:45) And that’s always what it’s all about.

(19:47) Certifications never prove that you can do something. (19:50) They just prove that you were somehow involved in some kind of education event. (19:56) And anybody that’s taken my courses knows that, you know, when I certify people, I’m like, hey, you know, let’s not get all crazy because the real test is what happens when you get back to work. (20:06) And you got to put things in the practice. (20:08) So always about more of the application of things than anything else. (20:12) So I think, you know, you can read those things and understand what people are talking about. (20:17) But I think most of it is clickbait myself. (20:19) But, you know, what do I know?

(20:22) Okay. (20:25) So I’m not going to go through a whole spiel about, you know, what artificial intelligence is and go into the nitty gritty detail. (20:32) But I thought this would be a little help just when you’re hearing words. (20:35) For those of you who are not aware, I know a lot of people are, you know, but some people aren’t. (20:41) And just when you’re hearing things, there are different layers of what this artificial intelligence is really going after.

(20:49) Obviously, the big hype right now is generative AI, mostly through things like ChatGPT, Microsoft Copilot, and other tools like that. (21:01) And they got very popular when OpenAI opened up their solution to the world, you know, put a little chat window on it. (21:11) And you were able to go in there, and it almost felt like you were cheating. (21:16) Like you started asking for things like, hey, generate the things I do in my job. (21:20) And all of a sudden they start popping out.
(21:21) And that’s always pretty cool.

(21:23) Right. (21:23) So I’m sure that’s what got people excited. (21:26) And that opened up a whole world of things we can now learn and get deeper and deeper into; a lot of good stuff out there. (21:34) So, again, that is just kind of a reference. (21:37) And the other thing that I like to focus on: there are really different patterns of AI. (21:43) So when you go about talking about, you know, different solutions, there are different categories of things. (21:50) Some are pretty easy.

(21:52) And that’s what everybody’s getting into: more of the conversation, human interaction, chatbots, virtual assistant type things, which, you know, are the popular things today. (22:03) But there’s a lot of other things that, you know, we’ve actually all been using. (22:07) I mean, every time you, you know, look at your mobile device and it unlocks because it sees your face, it takes a lot of effort to do that. (22:17) So that’s more on the recognition side. (22:19) And then you have things like fraud detection and stuff like that. (22:24) Or, you know, we have patterns like autonomous systems, such as vehicles. (22:28) That’s really the hardest stuff; it takes the most data and the most computing power.

(22:32) So there’s different levels of solution that people are going after. (22:37) So it’s important that as you’re looking at a project like, hey, what pattern are we using, we’re going for, what’s the solution look like? (22:46) And then that starts to help develop some of the questions about what are we going to really need in order to make that happen. (22:53) So, again, just a little bit of a reference for that stuff.

(22:59) All right. (22:59) So as far as, you know, what projects look like, I mean, there’s a lot of good statistics out there. (23:06) Different sources of people saying things that are always very extreme.

(23:11) But, you know, obviously people are looking for what this can add to the economy. (23:17) Anytime you get numbers that look like that, that’s going to draw some interest. (23:22) And businesses, companies are trying to figure out some approach. (23:27) Right. (23:27) I’m sure many of you have been hearing from your executives, like, hey, we need an AI story or something. (23:34) And just the popularity of ChatGPT and how fast it, you know, accumulated users. (23:41) It was, to date, the fastest growth of users ever. (23:47) They were at about 100 million users in 60 days.

(23:50) I think the last time I looked, a couple of days ago, it was like two billion hits a day or something, some ridiculous number. (24:01) And just in comparison, Facebook took four and a half years to get up to that, of course with a very different business model. (24:07) And Instagram took two and a half years.

(24:09) So, you know, the point of that is it seems like an acceleration. (24:14) And, you know, what businesses can expect in revenue increases. (24:20) Again, these are things that everybody looks at and starts salivating over, like, oh, we can make more money. (24:26) Let’s go do this. (24:28) So there’s all the excitement that gets involved when numbers pop up.

(24:32) But it’s also interesting to know that a large percentage of projects that are taken on never really make it to production. (24:40) A lot of what’s going on, you know, is pilots and prototypes and stuff like that. (24:47) And there’s a big difference between doing those things and going into production with something.

(24:54) And one of the really big issues that I’ve seen for many years is this: (25:01) Data. (25:02) Data is a real problem. (25:03) We have a lot of data debt that’s out there. (25:06) A lot of quality problems, quantity problems, access, security, privacy. (25:14) There’s a lot of things that we have to deal with within data.
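As a practical aside on that data-debt point: the kind of audit a team typically runs before training anything can be quite small. A minimal sketch, assuming pandas is available; the file-free example frame, column names, and checks are hypothetical placeholders, not something from the webinar:

```python
# Minimal, illustrative data-quality audit; columns and values are made up.
import pandas as pd

def audit(df: pd.DataFrame) -> dict:
    """Report a few of the 'data debt' issues mentioned above."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
        # Columns stored as plain objects often hide mixed types or bad parsing.
        "object_columns": [c for c in df.columns if df[c].dtype == "object"],
    }

if __name__ == "__main__":
    # Hypothetical frame standing in for real production data.
    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "signup_date": ["2024-01-03", None, "2024-02-30", "2024-03-11"],
        "spend": [120.0, 85.5, 85.5, None],
    })
    print(audit(df))
```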

(25:19) And then once you get to that point where you have the data, then it’s really the whole problem of model confidence. (25:26) It’s like, how confident are you in the model? (25:28) This is a very predictive type of thing. (25:32) So probabilistic. (25:33) So, you know, you’re not going to get 100 percent where it’s always right.

(25:37) Right. (25:37) It’s not programmatic. (25:38) So that becomes an issue, too. (25:40) So, you know, it’s really just screaming at us that, with all the hype, there are still a lot of problems that we have to deal with. (25:48) And those are going to be right in front of everybody very shortly. (25:53) So we’re going to have to really deal with that. (25:56) And it takes a lot of effort to get these things corrected, too.

(26:00) So all this debt we have from the past of collecting data, big data, massive computing: you know, we’ve become hoarders of data. (26:09) And on one side that’s very good, because organizations have this asset of data from all the things they’ve done in their businesses, which is very valuable to them. (26:23) But it’s also not in great shape. (26:25) So that means you have to get it out. (26:26) You got to prep it. (26:28) You got to clean it. (26:29) And there’s different tools and approaches for that. (26:31) So it’s kind of interesting. (26:32) So anyway, yeah, money. (26:34) There’s a lot of money involved here, both in what you can do and what it takes to get into it. (26:39) So it’s a big investment. (26:42) So the question is, what is really different in these AI projects?

(26:49) Why is it any different? (26:51) And you can play the kids’ activity there while I’m talking about this stuff. (26:56) So the first thing is understanding that AI development is not software development. (27:01) So you can’t just use the same approaches we use for software development and say, well, how hard could this be?

(27:06) It’s just different. (27:09) You have to have a really different mindset on this. (27:12) You know, AI is all probabilistic, whereas programming is deterministic. (27:18) Right. (27:18) You set the rules, you program it. (27:20) It does it. (27:21) It works or it doesn’t work. (27:22) If it doesn’t work, you fix it and you deliver it. (27:25) And that’s what it does. (27:26) You give, you know, you give it some input. (27:29) It always gives you the same output. (27:33) With AI, it doesn’t.

(27:33) You can have different data in the same algorithm give you different answers. (27:38) You can have the same data in different algorithms give you different outputs. (27:44) So you have to really deal with probabilities, and data dependencies are unpredictable. (27:52) So you’re getting data from usually many different sources. (27:56) A lot of those we don’t control. (27:59) So there’s all kinds of data dependencies there. (28:01) You have scope that is rapidly evolving, because you’re learning as you go. (28:07) There’s no way of really just specifying this up front. (28:10) So you can’t just say, oh, here’s the order. (28:12) Go fill the order. (28:14) And, you know, again, you’ll hear it over and over.

(28:16) It’s just that everything’s about data. (28:18) The data is the much bigger problem here. (28:22) Code is actually a very small amount of the work. (28:25) So it’s almost like the opposite of what we’ve done before. (28:28) And so we have to deal with that.
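To ground the deterministic-versus-probabilistic contrast above, here is a tiny, self-contained sketch; it is purely illustrative (the rule, threshold, and probability are invented), but it shows why identical inputs can produce different answers once a model samples instead of following fixed rules:

```python
# Illustrative only: a toy contrast between deterministic code and a
# probabilistic "model". The threshold and probability are made up.
import random

def rule_based_fraud_check(amount: float) -> bool:
    """Deterministic: the same input always yields the same output."""
    return amount > 1000  # fixed business rule

def model_fraud_check(amount: float, p_fraud: float = 0.3) -> bool:
    """Probabilistic: the 'model' samples, so identical inputs can disagree."""
    return random.random() < p_fraud

if __name__ == "__main__":
    for _ in range(3):
        print("rule:", rule_based_fraud_check(1500.0),
              "model:", model_fraud_check(1500.0))
```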

(28:30) And data changes. (28:31) It’s never stable. (28:33) So it’s always changing, especially if you’re doing something and getting real-time data out there. (28:38) It’s always changing. (28:39) You have drift in the data, where the data starts drifting through changes. (28:44) The models start drifting because now they’re getting different data in. (28:47) So, you know, what you thought you had is no longer what you have. (28:51) So that opens up a lot of operational issues: hallucinations. (28:57) Some of these things just start going berserk. (28:59) And every few days you’ll hear one of the big players talk about how something happened.
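As a practical aside on that drift point: teams typically catch it by comparing the distribution the model was trained on against what production is sending now. A minimal sketch, assuming NumPy and SciPy are available; the data is simulated and the alert threshold is a hypothetical choice, not a standard:

```python
# Minimal drift check: compare a training-time feature distribution with
# recent production data using a two-sample Kolmogorov-Smirnov test.
# The data here is simulated; a real pipeline would pull it from telemetry.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)  # what the model learned on
prod_feature = rng.normal(loc=0.4, scale=1.2, size=5_000)   # what production now sends

stat, p_value = ks_2samp(train_feature, prod_feature)
if p_value < 0.01:  # threshold is a judgment call
    print(f"Possible data drift (KS statistic={stat:.3f}, p={p_value:.2e})")
else:
    print("No significant drift detected")
```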

(29:06) And here it goes. (29:07) The last funny one I heard was the city of New York had something for small businesses. (29:13) And it started giving them some advice that was illegal. (29:18) And they were faced with a decision like, hey, do we keep this up or shut it down? (29:22) And they decided, let’s keep it up. (29:23) It’s still learning. (29:24) So I don’t know what it’s telling people to do. (29:27) But that was kind of weird. (29:28) And of course, there’s the creepy factor, where it starts to get a little too real.

(29:34) You know, the recent ChatGPT-4o, and, well, they had the whole thing with Scarlett Johansson’s voice. (29:43) And that was interesting and weird. (29:47) But, you know, sometimes when these things start talking to you, especially at the rate that you would talk to a human, like the speed, it starts to get a little weird.

(29:56) Like when you were talking to Siri and it had a slight delay, it was like, all right, I’m talking to a digital system. (30:02) But then it starts having these animated conversations with you. (30:08) That’s when it starts getting a little strange. (30:10) Some people like that. (30:11) Other people are just creeped out by it.

(30:13) So, you know, there’s always that stuff you have to worry about. (30:16) And the skills and composition of teams are very different.

(30:21) So, again, you know, with software programming being the sort of minor part of this, you have much more of the data side: data engineers, data scientists. (30:30) Right. (30:30) It’s a lot of data quality. (30:32) So you have a lot of those things going on. (30:35) Certainly, the hype about this is huge and the fears are huge.

(30:41) Everybody’s like, oh, it’s going to take our jobs and we won’t have stuff to do. (30:46) You know, and of course, that gets propagated by people saying, like, oh, you know, this thing is going to do all the menial work for you. (30:53) So you’re freed up to be more creative. (30:58) That sounds cool, except I don’t know who’s going to tell you to be that creative. (31:01) But I mean, if you still have a job, that’s great.

(31:04) I like a digital assistant myself. (31:06) But the expectations are very high. (31:09) They’re usually unrealistic. (31:11) And you have a fast-moving landscape of tools and platforms; everything’s changing very quickly. (31:17) And again, the dynamics of the project: there’s a lot more work in operations when the thing is live and in production than when you have it in a development environment. (31:27) So there’s a lot of shifting of cost and effort and different challenges to deal with. (31:33) So those are definitely different.

(31:35) And I don’t know who picked up the five differences. (31:39) But if you didn’t, that’s kind of embarrassing. (31:43) It’s I mean, it’s pretty easy. (31:44) I think like here’s one. (31:47) Two. (31:49) I don’t know if anybody’s impressed by this. (31:52) And I think the tail. (31:53) All right. (31:53) There we go. (31:54) There we go.

(31:54) In case you didn’t get that right. (31:56) That’s one for the kids. (31:58) All right.

(32:00) While there are a lot of things that are different in these projects, there are also a lot of things that are the same. (32:04) And these are some of the big issues that we’ve seen in companies trying to deal with this. (32:11) And it has to do with, you know, first identifying the problem. (32:15) That is always a problem in whatever project you’re taking on. (32:19) People start solving problems before they even know what the problem is. (32:21) You get all excited and you want to jump right into it. (32:24) But a lot of times, you know, projects fail because they started off and nobody really knew what problem they were actually trying to solve. (32:30) So it becomes really important here, because of that dynamic nature, to do that.

(32:34) Also understanding the outcome goals, like what are we actually trying to achieve? (32:39) These things are not there just to have them; they’re there to drive an outcome. (32:43) And that outcome is usually about making money, or saving money, or saving time, or improving customer satisfaction. (32:50) Like, what is it trying to do? (32:53) And so we need to figure out what that looks like. (32:56) You still have to break problems down. (32:58) You can’t tackle massive things. (33:00) So it’s always about breaking things down. (33:02) It has to be done with quality. (33:04) It has to be done with a lot of feedback. (33:08) Those things are still very critical. (33:09) And those are things that people continue to have problems with, even on the software development side. (33:15) So that’s still a need. (33:17) So these are the things that are all the same. (33:19) And you definitely need support from leadership, not just investment, but really the support. (33:25) This is organizational change.

(33:27) This is one of those transformation changes that, you know, everybody wants to think is the same as what we had with agile. (33:33) Like, oh, how hard can that be? (33:35) And then we see, you know, 30 years of history. (33:38) It’s pretty hard, actually, to make those changes. (33:40) So this is, again, one of those next-wave things that’s going to hit us that we really have to be prepared for. (33:45) And, of course, you always need a great team of people, but they really have to have a strong desire to experiment and learn, and not just be sitting there like, somebody tell us what to do. (33:55) All right. (33:56) It’s just not going to work like that. (33:58) So we’re going to need a lot of that stuff. (34:00) So, again, that’s a lot of the product owner role pieces coming into that as well. (34:05) So that’s what stays the same.

(34:08) So Scrum actually does a really good job here because it kind of fits with what we try to do. (34:14) And again, just get away from the dogma of it. (34:16) And, you know, all the debates that happen, you know, that’s not really what it’s for. (34:22) It wasn’t put there for people to debate about it. (34:25) It was put there like, hey, let’s put people on a framework where they can work in these very dynamic environments and get stuff done. (34:31) So the iterative nature of it fits very well with what AI needs in terms of an approach.

(34:40) The teams, it really is going to take a really good self-organized, accountable, cross-functional team. (34:47) So even more important here with skills that may not have previously been on Scrum teams. (34:55) So you can have Scrum teams in your organization that have been doing software development.

(34:59) Now you’re taking people who may not have been on those teams. (35:03) They may have been sort of like, hey, just keep those people over here. (35:06) And now those people are becoming the center of attention. (35:09) So they may not be ready for this, but that’s why this is good: to start getting them involved, like, hey, here’s a problem. (35:15) We get to work together.

(35:16) We got to solve it. (35:18) And we have to solve it by getting good feedback from customers, keeping the customer involved, and making sure that we’re getting good telemetry on the production side so that we can see and monitor where these things are going. (35:31) You know, being probabilistic, these things can get out of control as much as they can drive a lot of value. (35:38) So these have to be monitored. (35:40) So it kind of fits in nicely with that. (35:42) There’s really nothing in there that prevents it.

(35:43) We can modify some of the things in Scrum, because we’re going to have to deal with some challenges that these practices tend to have sort of locked down. (35:54) But now you’re trying to apply it to something different. (35:57) So you’ve got a level of variability that’s well beyond what most people are used to. (36:04) The whole idea of user stories and epics: it’s really hard to define that, because you may not have a human in the loop there. (36:14) It might just be data products. (36:16) So that has to change.

(36:17) I’ve written some stuff about that, about how to sort of approach that a different way. (36:21) Remember, user stories were put in place to build empathy for users, because we were starting to build, you know, Internet apps, mobile apps. (36:30) And we didn’t have a history of being what you would call empathetic. (36:34) So user stories were a way to approach that, and they served us really well. (36:39) But, you know, I’ve dealt with a lot of my customers who, you know, were building things that didn’t involve a user. (36:46) And so they were asking about user stories, or they started making up crazy user stories about themselves. (36:53) And that just wasn’t necessary.

(36:54) So different approaches for that. (36:57) You have a lot more dependencies, data dependencies. (37:00) Mostly that’s where those things are coming from. (37:03) So you can’t just be, oh, we’re just going to do stuff ourselves and, you know, circle the wagons and get something done. (37:09) So that’s a little tough.

(37:10) Understanding what we’re actually trying to achieve in a sprint, and whether the time box of the sprint is that important. (37:20) It can be important for getting feedback and providing updates on where we are, but maybe not necessarily having that product increment at the end of every sprint; that is something that probably will change. (37:33) I’ve seen it change. (37:34) I’ve helped people change that and not worry about it. (37:37) What about what the book says? (37:38) Don’t worry about that.

(37:39) The book wasn’t written for this kind of stuff. (37:42) So let’s just adapt it to what you need and figure that out. (37:46) And, of course, the definition of done, both from a procedural and a functional perspective, is very different. (37:52) So we got to take a look at that. (37:54) And we talked about the team composition being very different as well. (37:57) So there are some changes there. (38:00) And these changes kind of remind me of when we first started with Scrum many years ago, when we didn’t have all these more rigid rules that people sense today. (38:12) It was much more loose in the beginning, because we were dealing with things we didn’t know how to deal with. (38:18) So it’s almost like what we’re trying to experience here is actually going back a little bit. (38:23) So it’s kind of interesting that we maybe learn more about Scrum by having to adapt it outside of what we now consider the norm, and just start to understand what it’s really all about. (38:35) So that’s kind of interesting as well.

(38:38) And just remember that the goal is not to have an AI product or an AI story. (38:46) I’ve heard that a million times. (38:49) The goal is never to implement Scrum. (38:51) That is not a goal. (38:52) The goal is not moving to Agile. (38:54) It is about delivering value and product owners, product managers, whatever. (38:59) It could be data owners. (39:00) Sometimes we have changes in titles. (39:02) These are the value drivers that are going to make this happen. (39:06) So that’s why this becomes really important.

(39:08) So you really have to start thinking about transformational change. (39:11) This is not just, oh, let’s try this out. (39:13) This is really why you need a lot of executive support, because you have to have a big-picture view of what you’re going to attempt. (39:19) But then start to think, like, OK, we’ve got to start simple and then we’ll have to keep iterating. (39:25) It’s not just, build this. (39:27) And this reminds me of when mobile started and every one of my customers was like, we need to have a mobile app. (39:32) And I’m like, no, that’s not the answer.

(39:34) The answer is not just having mobile, because then we have one and it’s a one-star app. (39:38) Nobody’s going to be downloading it. (39:39) So it has to drive some value. (39:41) It has to have something there. (39:43) It can’t just be about having it. (39:45) So that’s sort of another part of the product owner role.

(39:49) Learning how to set this goal and trying to work with the organization on how to break this down into something a little more achievable. (39:57) So the product owner role is really in that position. Like I said, you know, when you look at the Scrum roles, it doesn’t really change that much. (40:05) And if it does, that’s fine. (40:06) We can adapt these things. (40:07) They weren’t meant to be locked in stone.

(40:10) Remember, Scrum is not owned by any entity. (40:16) It did not come down from the heavens. (40:18) It is not a government standard. (40:20) It didn’t emerge out of the volcanic rock. (40:24) It is just a bunch of goofy ideas people put together and said, hey, why don’t we try this? (40:29) And, you know, we tried to put a frame around it and see if people could use it.

(40:32) So the product owner role, you know, when you look at it, it doesn’t really change all that much. (40:37) What they’re doing and their expertise probably will change and it will need to. (40:42) And, you know, we’ll talk about that coming up here.

(40:45) So, you know, we know the challenges: unclear objectives, data quality and governance issues, high expectations, you know, having the right team, collaborating across the organization. (40:57) That’s going to affect a lot more people when it starts to get to regulatory and legal. (41:03) This is really heavily involved with that, because if you put something out there, some chat thing, and it starts going berserk, you know, you’re going to have some legal issues. (41:11) And also you’re putting your stuff potentially in the public domain.

(41:16) So anything that somebody at your company might be putting into ChatGPT, unless you have some private version, is going out into the public domain. (41:25) Somebody puts a document up there or something, and, hey, guess what? (41:28) You signed up. (41:29) You just let it go into the data that’s training this thing. (41:33) So it may wind up in somebody else’s response.

(41:35) So that’s kind of interesting. (41:37) A lot of infrastructure challenges in terms of, you know, what we need to support that. (41:42) Different options are available: self-hosting, or, you know, using one of the big players to do all the heavy lifting for you, but then you have to pay them. (41:52) And the ethical and societal implications of what we do are also pretty tough. (41:58) Yeah. (41:58) And somebody mentioned security and compliance; that’s also pretty big. (42:02) I have that on the next one.

(42:04) So when you look at product owners and the things they need to do, there are some things that are kind of common, like, you know, when you have to establish and communicate the context of a project and make sure that, you know, we are getting people aligned on that. (42:23) And that has to do with, you know, creating vision statements, setting goals, using outcome goals or OKRs. (42:31) I know Hyperdrive does a lot of stuff with OKRs, and any of that stuff is really good. (42:36) But you’ve got to have time-based, definitely measurable outcomes. (42:41) Like, what are we actually trying to do? (42:43) We need to increase sales by 30 percent. (42:46) We want to increase customer retention. (42:49) You know, whatever you’re trying to accomplish, the AI project has to be a means to the end, not the end in itself. (42:56) Knowing the patterns, understanding the data and infrastructure landscape, starting to understand the whole tradeoff, where in software it’s scope, cost, time.

(43:07) In these things, you have to worry about accuracy, performance, and cost. (43:12) And there’s always a tradeoff, right? (43:14) Is 85 percent accuracy good enough, or do you need 95 percent? (43:19) And are you willing to pay the exponential costs to get there? (43:24) You know, that’s always a tradeoff. (43:26) So you got to really think about that. (43:27) And of course, the operations side as well. (43:29) And then as far as, you know, how you go about these things, using things like Scrum or Kanban or any of these agile practices is good. (43:41) You know, it’s always been recommended.
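As a purely hypothetical back-of-the-envelope on that accuracy-versus-cost tradeoff (the curve, constants, and dollar figures below are invented for illustration; real curves come from your own experiments with labeling, training runs, and model size):

```python
# Hypothetical illustration of an exponential cost curve for accuracy targets.
import math

def estimated_cost(target_accuracy: float, base_cost: float = 10_000.0) -> float:
    """Made-up curve: cost grows roughly exponentially as accuracy rises past 85%."""
    return base_cost * math.exp(8 * (target_accuracy - 0.85))

for acc in (0.85, 0.90, 0.95, 0.99):
    print(f"target accuracy {acc:.0%}: ~${estimated_cost(acc):,.0f}")
```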

(43:43) There are these data methods out there, like CRISP-DM and CPM-AI. (43:48) Microsoft has TDSP. (43:51) These are all sort of frameworks, like methodologies, for data product development. (43:58) And these are things that you can use within the context of Scrum.

(44:02) So you just wrap Scrum around it, as Scrum was meant to do, and you have something there. (44:07) Then, of course, you have your Gen AI stuff: ChatGPT, ChatPRD, Copilot, Gemini, whatever you’d like to use. (44:14) And those things are evolving every five minutes.

(44:17) You know, any kind of team collaboration tools, data analysis, and prompt engineering, of course; anything with Gen AI is all about prompt engineering, which is basically word coding. (44:30) I mean, you’re writing code, except you’re putting it in English, basically. (44:35) So, you know, anything you’ve learned there is going to be helpful, and we should all be coming up to speed on that. (44:40) And of course, the ethical parts: the bias, fairness, laws and regulations, security, privacy, and always have a human in the loop. (44:49) Right. (44:49) We’re not anywhere near the point where you don’t want that.

(44:53) You want a human in the loop checking things, you know, especially with customers who might be using a chatbot: making sure, like, hey, things are responding the right way and it’s not going off into the universe, and always being ready to have a human jump in there if you need to, things like that. (45:11) So it’s a lot of checking still, because it’s not quite there yet. (45:15) And we got a little ways to go. (45:17) So those are some of the skills, and these are things that we’re going to cover in the CSPO course. On top of the Scrum stuff, we’ll try to see how all this kind of fits in with what we’re doing and, you know, wrap that together into a nice package. (45:32) So at least there’s some awareness. I don’t think in two days anybody’s gonna be an expert in any of these things; I don’t think it’s possible for anybody to actually have expertise in all these things, but we’re gonna at least attempt to hit it so that there’s awareness. (45:47) Some of these things we’ll get into a little more than others, but it’s good to have a general awareness of that as you’re trying to figure out how to go about running these things.
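To ground the "word coding" and human-in-the-loop points above, here is a minimal illustrative sketch of a prompted chatbot reply gated by a crude review check. It assumes the OpenAI Python SDK v1-style client; the model name, prompt, and keyword list are made-up placeholders, not the course's approach:

```python
# Illustrative only: a prompt-engineered support reply with a crude
# human-in-the-loop gate. Model, prompt, and keywords are assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

SYSTEM_PROMPT = (
    "You are a support assistant for a small-business licensing site. "
    "Answer briefly and never give legal advice; instead, suggest contacting "
    "a licensed professional."
)

def draft_reply(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

def needs_human_review(reply: str) -> bool:
    """Crude keyword gate; real systems use classifiers and sampled audits."""
    risky = ("legal", "guarantee", "refund", "fire your")
    return any(word in reply.lower() for word in risky)

if __name__ == "__main__":
    answer = draft_reply("Can I take a cut of my employees' tips?")
    if needs_human_review(answer):
        print("HOLD FOR HUMAN REVIEW:\n", answer)
    else:
        print(answer)
```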

(45:56) So, the course itself: you know, just to give you a little intro, it’s, you know, just trying to get your AI project expertise up. (46:04) So, you know, try to get past the hype and into some of the specifics of how to make it practical and pragmatic, so that, you know, you don’t get too wrapped up in all the excitement of it; it is exciting, but you know you also have to deliver value. (46:21) It’s a two-day course; it’s virtual, it’s online. (46:24) And it’s live, so you’ll have me, and, you know, we’ll try to make it interactive, with discussion and exercises to practice some of this stuff. (46:32) And of course you get to meet other people. (46:35) And as anybody that’s taken my class knows, anytime you’re a student of mine, you’ve got access to me forever, until I don’t exist. (46:43) And maybe there’ll be an AI version of me in the future, and then I will exist. (46:48) I’m pretty much hoping for that.

(46:52) And I know Sherry and Hyperdrive put some good incentives together today, and we have two classes coming up: one is June 20 and 21, and the other is July 15 and 16, so we’re going to run those two. (47:05) Hopefully I’ll get to see some of you in those classes; it will be kind of fun. (47:09) And we’ll start to explore this a little more. (47:12) So there we go. (47:14) All right. (47:15) So I thought I’d open it up for questions. (47:18) Anybody have questions? I think there are some questions in the chat here. (47:22) I don’t know if I can get them all, but let’s see what we got here. (47:25) So: I really wonder if the way Scrum is widely used, understood, and practiced is the best choice for data science. (47:34) Um, you know, I don’t think… (47:36) I mean, Maru had that question. (47:40) You know, out of the box, I don’t want to say anything’s the best approach, because I like to understand the situation first. (47:46) I will say this: I think agile approaches in general are the only approach you really should be using for this. (47:55) I do, seriously. (47:57) Is Scrum a good one?

(47:58) Yeah, it always has been; it’s been a core of what we’ve done. You know, the time boxes are a good checkpoint. (48:04) But to your point, agile and Scrum practices are also not done very well everywhere. (48:12) And, you know, that is evident in the half of the people that complain about it and the half that love it. (48:18) That seems to be the course of the world today. (48:22) So, I try to focus on making clients successful. (48:26) And if it’s not the right one, I would be the last one pushing it.

(48:29) I try to keep people like, hey, here’s an option for you. (48:32) It’s a tool in a toolbox. (48:35) You know, just because you have a hammer sitting in your toolbox doesn’t mean you should pick it up all the time, right? (48:40) If a job doesn’t require a hammer, then pick up something else that’s in there. (48:43) But don’t have one tool in that toolbox. (48:45) Make sure you have a bunch of tools.

(48:47) So, I think that would be my advice there. (48:52) Let’s see what else. (48:54) Super glad to see CRISP-DM. (48:56) Yeah, CRISP-DM, that’s been around forever. (48:58) So, that’s been around for a long time. (49:00) That was more for the data mining projects, which was the DM part. (49:05) But now they’ve adapted it; like, CPM-AI added some newer, you know, content to that and put some more operational stuff in there. (49:16) So, it’s kind of cool. (49:17) But that was good.

(49:18) Is there a difference between the CSPO cert and the CSPO AI cert? (49:22) Not as far as the certification itself.

(49:25) So, the Scrum Alliance does not have any specific AI certification. (49:33) Since I’ve been in this longer than the Scrum Alliance has, they tend to lag a little behind sometimes. (49:39) So, maybe in the future somewhere, they’ll do that. (49:44) But for now, the certification from the Scrum Alliance is the same.

(49:50) As an instructor, I have the flexibility to put different examples and context in there. (49:57) So, it’ll be the core of the CSPO, and then there’ll be a lot of AI-specific material in there. (50:04) So, that’s at least my approach. (50:08) Let’s see: how much, and how long is each session? (50:12) The “how much” part, I’ll leave that. (50:13) Oh, Stacey Sherian answered that. (50:16) Oh, she’s got all that kind of stuff. (50:17) Okay. (50:18) Typically, what type of opportunities are offered post-completion? (50:25) I’m not sure what that is referring to. (50:29) That might be the course itself. (50:31) I’ll let somebody else answer that. (50:32) All right. (50:33) How about I open it up to questions?

(50:34) We have a little bit of time here. (50:37) Anybody have any questions, thoughts, or experiences? (50:40) Anything you’ve run into? (50:42) This is new to a lot of us, so I don’t think anybody’s a savant. (50:48) I’m sure there are experts in the world. (50:50) But I think we’re all on the learning curve on this at some level. (50:57) Don’t be shy. (50:58) We’re all friends here.


Speaker 2
(51:04) Will you be having some fun games, Bob, like the penguins during the course?


Speaker 1
(51:11) I probably won’t have penguins during the course, because I like to get a little more serious when it comes to producing a product. (51:17) But yeah, that may be some of the reason that sometimes we have problems with Scrum in companies. (51:25) I think people were focused too much on the penguins. (51:31) Fortunately, massive corporations do not run on childhood games. (51:35) But yes, we’ll have some fun for sure. (51:38) It’ll be definitely interesting, engaging, and practical. (51:42) I tried to make it practical, so we’ll definitely have that.


Speaker 3

(51:44) Hey, Bob, maybe you could quickly summarize. (51:48) What would be a difference between your standard CSPO class and one with this AI component? (51:55) What would one expect to see?


Speaker 1
(51:57) I think the difference is going to be – I’m going to focus on – you’re going to get some basic AI landscape stuff. (52:05) We’ll talk about different issues that are involved. (52:07) How do you set up a project that is an AI-type project, so setting the goals. (52:14) The business justification, and then asking some really important questions about data sourcing, data cleaning, what the whole process looks like from that product perspective. (52:26) Then the whole idea of setting expectations on when you go to run it, you’re going to be running sprints. (52:33) It’s not going to be a very typical, oh, we’re at the end of the sprint.

(52:36) Let’s have a demo of all the fun things we’ve built. (52:39) Sometimes you will have that. (52:40) Other times it will be a little looser, like, well, we’ve identified these problems, and this is what we’re working through.

(52:47) It may be more of a decoupling of the time boxes and the increments. (52:55) Really trying to address some of those specifics. (52:58) Then the exercises, we’ll work on projects that are more examples of where you would use AI, using the patterns, and saying, okay, here you came up with an idea. (53:11) What patterns are they? (53:12) We’ll look at that and try to look at those things. (53:16) It will be, as far as Scrum and learning Scrum, it’s that, but in the context of these types of projects.

(53:24) That’s the difference. (53:27) That is kind of a big difference. (53:30) Putting it together, there’s a lot of different pieces to that.

(53:35) That was kind of interesting. (53:37) It kind of made me relook at Scrum again to really understand it, which I thought I already understood. (53:43) When you go to put it in a different context, it makes you really look at each piece and go, wait a second, what would you do there?

(53:49) Why is that different? (53:51) I’m really going to try to focus on that, so that if you are working on an AI initiative or a project, or you want to, you’ll have a different perspective. (54:01) You won’t be just like, oh, I’m a product owner that got certified, but I have no idea what to do with AI projects.

(54:09) I don’t want to see people getting stuck like that. (54:11) I’ve seen people get stuck like that on data analytics projects and BI data warehousing. (54:18) I’ve been through all that.

(54:20) I had to help a lot of people translate what they were doing to that kind of work. (54:25) I feel instead of putting that burden on the people, I want to put it on myself to say, hey, let me help you figure out how this looks. (54:35) That was the idea.


Speaker 3

(54:37) That’s great. (54:38) There’s a question. (54:39) I know we’re at the top of the hour, but there’s a question from Simone. (54:43) She writes, what if you already have a CSPO certification? (54:46) Which is best?


Speaker 1

(54:48) Yeah. (54:50) If you already have one, I mean, this is an interesting opportunity. (54:55) If you wanted to, I don’t know how long we’ve had it, but you don’t have to take the class to renew a certification.

(55:06) But if you wanted to get a sort of refresher, or if you wanted to sort of delve into how you would take that and twist it into something more AI-focused, you could do that. (55:20) To my knowledge, there is no A-CSPO AI course that I know of. (55:27) So will there be in the future?

(55:30) I don’t know. (55:31) That I would have to ask ChatGPT. (55:34) Maybe it could give us a prediction, but I don’t know.

(55:37) Right now, there’s none. (55:39) So this is kind of unique. (55:41) I know there are people teaching CSM classes and CSPO classes, and they’re starting to use, like, oh, let’s use ChatGPT to do something, which is fine.

(55:50) I mean, that’s all good. (55:51) But, again, to me, the bigger challenge here is how do you set these projects up and run them? (55:59) Not how do I generate a PRD by asking ChatGPT to do it?

(56:03) You know, you can do that. (56:06) And the PDUs, yeah, the same thing. (56:08) The PDUs for PMI, that’s the same as what you would get in a CSPO course.

(56:15) Sorry, is it 16 or is that what they get? (56:19) Do you know? (56:19) 16.

(56:20) 16. (56:21) Yeah, 16 PDUs for this? (56:22) Yeah.

(56:23) So you’ll get that. (56:24) So if you’re getting your PMI credits in there, you can do it that way.


Speaker 3

(56:29) That’s excellent. (56:29) Well, thank you, Dr. Bob, for showing up and giving us this talk. (56:34) Thanks, everybody, for taking the time. (56:36) And from Hyperdrive, have a great day, and we’ll chat with you soon.


Speaker 1

(56:40) Yeah, thank you, everybody. (56:42) Hope to see you soon. (56:42) Bye-bye.
