Scroll down for how to subscribe, transcript, and more
Luke is the author of the books Align Remotely: How to achieve together, when everyone is working from home and Launch Tomorrow: Take Your Product, Startup, or Business From Idea to Launch in One Day.
He's the host of the highly rated “Managing Remote Teams” podcast. He comes from a product management background and has a BA in Economics and English from the University of Pennsylvania.
He's joining us on the podcast from Poland.
Today, we discuss topics and questions including:
- Background question — How did you get introduced to Agile, Lean Startup, things like that?
- The “fuzzy side of innovation” — time wasted 20-30 years ago?
- Doing the wrong things righter?
- Tampering – and increasing variation
- Processes for creating software?
- Reading about “Lean Manufacturing” — how does that resonate with you and relate to your work?
- How easy is it to estimate “story points”?
- Lean Thinking – batch vs flow… physical flow vs. work flow — Adaptations to the flow of software?
- Takt time – how to translate this in terms of required software, requirements, points
- How did you learn about Process Behavior Charts?
- Why did that resonate with you?
- How do you incorporate PBCs into your work?
- Counting physical products vs. story points (something more esoteric)?
- Landing pages – product or service that doesn't exist yet
- What to test BEFORE a landing page?
- How to make a good decision with limited data points?
- What's so powerful about testing an idea as a hypothesis?
The podcast is sponsored by Stiles Associates, now in their 30th year of business. They are the go-to Lean recruiting firm serving the manufacturing, private equity, and healthcare industries. Learn more.
This podcast is part of the #LeanCommunicators network.
Video of the Episode:
Thanks for listening or watching!
Automated Transcript (Not Guaranteed to be Defect Free)
Welcome to the Lean Blog Podcast. Visit our website at www.leanblog.org. Now here's your host, Mark Graban.
Mark Graban (13s):
Hi, it's Mark Graban. Welcome to the podcast. It's episode 452 for August 2nd, 2022. My guest today is Luke Szymer. You'll learn more about him in a minute. We're gonna be talking about some things we don't always or often talk about here on the podcast: the world of software, Agile, and Lean Startup. We're not gonna take a deep dive into all of that, but it's really about, you know, entrepreneurship and innovation. And, you know, I think one of the main reasons Luke and I connected here is that he's been an adopter of the Process Behavior Charts methodology that I wrote about in my book Measures of Success. We're gonna talk about his application of that method to different metrics in software.
Mark Graban (54s):
And I hope this is of interest to people working in different industries and fields. So if you wanna find links for more information, you can look in the show notes or go to leanblog.org/452. Well, hi everybody again. Welcome to the podcast. My guest today is Luke Szymer. He is the founder of Launch Tomorrow. He helps new technology products get to market faster, even if you're working remotely and his website is www.launchtomorrow.com. Luke is the author of the books Align Remotely and Launch Tomorrow. And he's the host of the highly rated podcast Managing Remote Teams.
Mark Graban (1m 35s):
So look for that podcast wherever you're listening to this one. He comes from a product management background. He has a BA in Economics and English from the University of Pennsylvania. He's coming to us today from Poland. So while it's morning for me, good afternoon to you, Luke. How are you today?
Luke Szymer (1m 52s):
Very good. Very good. I'm excited to speak to you today, and yeah, it's definitely quite a warm day here. So yeah.
Mark Graban (2m 3s):
Well, thank you for joining us. You know, there's a lot to cover today. I think there's an interesting intersection between the work you do and some interests of mine around entrepreneurship and metrics and process behavior charts. And so that's music to my ears when we started having some conversation about process behavior charts. But first off, in terms of backgrounds, I usually like to ask guests a bit of an origin story question: how did you first get introduced to methods related to improvement, whether you want to frame that as Agile or Lean Startup or other methods, or maybe it's an amalgamation of all of those?
Mark Graban (2m 46s):
Tell us a little bit about how you got started with this type of work.
Luke Szymer (2m 50s):
Yeah, I kind of fell into it out of an interest in especially early-stage innovation, but in general, new product development. I think that the so-called fuzzy side of innovation, especially 20, 30 years ago when I was getting started, was always cited as this kind of black hole where lots of time would be wasted, basically. And so I think as I got more and more into product specifically, I was digging into exactly how teams produce features, pieces of software.
Luke Szymer (3m 42s):
And then also, from the other side, what features are worth producing in the first place? And those are the constant questions of a software product manager, certainly. Yeah. So,
Mark Graban (3m 55s):
Yeah. So there's this question of, like, do you wanna do the wrong things righter? Building the wrong features, or building the wrong product, or building the wrong company in a better way. Right. There's those key questions, right,
Luke Szymer (4m 10s):
Exactly. Exactly. Yeah, kind of minimizing that as much as possible. I mean, most of my time when I drifted into product was in financial technology, in a hedge fund environment. And when we were looking at our internal processes (this goes to the topic you mentioned of process behavior charts), when we were looking at our processes of how we were making software, I think the way that was really useful to kind of dig into what was actually going on was looking at how the team managed to finish things over time.
Luke Szymer (4m 55s):
And that's kind of where we started exploring that. And I found your book, I think, originally via Eric Ries, and went
Mark Graban (5m 9s):
To, yeah, he was kind enough to endorse the book. Yeah,
Luke Szymer (5m 12s):
Yeah, yeah. And absolutely loved it, as I've mentioned to you before. And yeah, the way that I kind of immediately, or what felt like immediately, got the tool is that it's quite similar to relative strength indicators used when modeling how currencies move. I mean, this is kind of typical financial technology geek stuff, I guess.
Mark Graban (5m 48s):
So tell us about that a little bit. You're looking at fluctuations in currencies and trying to sort out signal versus noise.
Luke Szymer (5m 58s):
It's kind of using the same mental framework, but in a slightly different way. So basically it's a way of forecasting what's going to be happening with exchange rates, but with the assumption that they revert to a mean. So, in fact, the quote-unquote expected behavior is that they're going to go back to the mean, and then they're only gonna fluctuate within standard deviation bounds. So it's a slightly different tool, and yes, it's arguably much more of a random process than some of the stuff you'd use a PBC for.
Luke Szymer (6m 41s):
But yeah, I already had some sense numerically of how that would work out when I saw it. And I guess the main place where they came in handy is that, you know, on a couple of different projects we had quite significant delivery pressure around a certain date, because that date was tied to three other departments that had to coordinate a whole bunch of things with something being done by some date, and then a lot of focus on exactly how quickly work was being done and when it's going to be ready.
Luke Szymer (7m 25s):
And I think the risk, which I think you articulated brilliantly, was that if everyone's super focused on exactly how the team is doing, every little minor fluctuation up and down suddenly makes everybody nervous. Right. And yeah, so it became this very easy way for me to be able to calm down various stakeholders: okay, it fell, but you know, this is more or less to be expected. There's always gonna be some natural fluctuation up and down.
Luke Szymer (8m 5s):
Right. And I wouldn't be too concerned at this point. Yeah.
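The calming effect Luke describes comes from knowing the natural limits of the metric. As a rough sketch (with made-up weekly story-point totals, not data from any real team), an XmR-style process behavior chart derives those limits from the mean and the average moving range:

```python
# Process Behavior Chart (XmR) limits for a series of weekly completed
# story points: mean +/- 2.66 * average moving range.
def pbc_limits(values):
    """Return (mean, lower natural process limit, upper limit)."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

weekly_points = [28, 31, 25, 34, 29, 27, 33, 30]  # illustrative data
mean, lnpl, unpl = pbc_limits(weekly_points)
# A dip to 25 stays inside the limits, so it is routine noise,
# not a signal worth alarming stakeholders over.
print(f"mean={mean:.1f}, limits=({lnpl:.1f}, {unpl:.1f})")
```

A point outside the limits, by contrast, would be the kind of signal worth a conversation.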
Mark Graban (8m 13s):
There's a point, I didn't really make this point too much in Measures of Success, but it's an old W. Edwards Deming point. He would use the term tampering: when managers overreact and ask for explanations of the small fluctuations, the noise, the common cause variation, it tends to increase variation because of the responses to that management, that micromanagement. And there's an irony there: people would say, well, I don't like the variation, but then they take actions that end up increasing variation. Yeah. I'm wondering, have you seen that dynamic when it comes to some of these measures on productivity or on-time delivery?
Luke Szymer (8m 55s):
Yeah. I can definitely kind of intuitively sense how that dynamic could play out. I mean, if nothing else, if there is too much variation in the eyes of a senior stakeholder, then you've got a lot more meetings, you've got a lot more discussions and brainstorms and all of that. And that's time not actually spent doing the work; it's time talking about the work, which would, if anything, reduce productivity.
Mark Graban (9m 27s):
Well, I could see the cycle of, yeah, all those extra meetings are slowing down work, and then they see there's a problem. And then somebody has a mandate, like no meetings for the next two weeks, and then productivity might soar, right?
Luke Szymer (9m 38s):
Yeah. Yeah. So there is a bunch of kind of interdependent things going on there, I think, which very much could affect especially downward variability. So yeah, definitely on that front.
Mark Graban (9m 54s):
Yeah. There's, I mean, we talk about systems and, you know, back to currencies for a minute, I mean, it seems like these exchange rates are outcomes of a system and that system might be stable for a period of time and sort of like a metric, a performance measure in an organization with the process behavior chart, it would tend to be fluctuating around an average, and we could predict that that would continue to be true, but then here's the catch, unless the system changes.
Mark Graban (10m 24s):
And, and you might not be able to predict the system change. So like with currencies, if there were some sort of, you know, world event or economic shock or something, then that tendency to fluctuate around a stable average might no longer be true. And so I wonder if, if those charts in whatever format would allow you to detect that signal sooner in a way that would be like financially advantageous, if you realize, okay, that assumption's no longer true, I should either buy more of that. Or am I expecting it to shift upward or shift downward? Would I buy more of that currency? Or, or I don't know what decisions you would make, but it seems like there there's, there's a similar trap where knowing it's been stable and predictable for a period of time, doesn't guarantee that that'll,
Luke Szymer (11m 12s):
It's gonna continue
Luke Szymer (11m 15s):
Yeah. I mean, the thing that I remember from speaking with various people (this is mostly from serving hedge funds) is they had this kind of catch-all term of when all correlations go to one. So basically all market assets
Luke Szymer (11m 32s):
Go in one direction, and usually what happens is that at that point, almost all of the mathematical tools they have are useless. So it's almost like they have models to understand the stable state, and then they need completely different models to understand what happens in these extreme, let's say black swan type events, and model that. A common tool, for example, is scenario analysis. So you replay a portfolio, exactly how it would've performed during, say, 9/11, or during different world events which would affect markets.
Luke Szymer (12m 18s):
And there, typically, it's not about the currency jumping outside of the one standard deviation; it's about completely throwing that out the window. So yeah, it's a very different way of thinking at that point.
Mark Graban (12m 35s):
Yeah. So I wanna talk a little bit about a phrase you used earlier: processes for creating software. And, probably not for this audience, but some people might cringe at that thought: what do you mean, a process for creating software? So when it comes to process and software, as you would read about, let's say, lean manufacturing, I'm curious what resonated with you. How did you see that relating to software development or, you know, entrepreneurship?
Luke Szymer (13m 15s):
So I think the biggest shift that's really helped in that context: the main unit of measure is something that we call a story point in the software world, which is kind of a measure of complexity. So basically the amount of mental effort needed to create something, because unlike manufacturing, there's no manufacturing cost or distribution cost. Once it's built, that's pretty much all the cost you're going to have. So it's all about managing that, and then also making sure that you're building things which are valuable, or that are most immediately valuable.
Luke Szymer (13m 59s):
So, yeah, going back to what I did find most interesting: I really liked, in Womack's book Lean Thinking, this description of different ways of organizing a bicycle factory. Either you think about planning out a whole order as this big multi-stage project, or you just come up with a way of measuring and reorganizing the work around, you know, minimizing the amount of time it takes to create one bike, and then doing it that way. And I think from a process perspective, it's kind of a mind twister, but it's extremely helpful at making, one, the work visible, and two, acting as an early warning signal if something's going wrong. And, you know, it works just as well remotely as it does in an office.
Luke Szymer (15m 5s):
And it obviously takes a little bit of effort to reorganize the work that way, but that's one thing that's super helpful. And then the other bit that you kind of got for free after organizing things that way was something that wasn't in the traditional, let's say agile, well, now traditional agile way of thinking. Takt time, I think, is a super interesting concept, where you quantify demand in terms of a rate, as opposed to a final outcome.
Luke Szymer (15m 36s):
Particularly when it comes to, you know, managing, even if it's relative to internal stakeholder expectations. If they're expecting you to create something really complicated in two months, it's one thing to say that; it's another thing to say, you know, you're expecting us to go at the rate of 170 story points a week, and our current rate is 30. So at that point, it's much, much more quantitatively precise. I mean, the other question is whether it's accurate.
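Luke's point about restating a deadline as a rate can be sketched in a few lines of arithmetic. The backlog size and weeks remaining below are illustrative numbers, chosen only to reproduce the 170-versus-30 contrast he mentions:

```python
# Express a delivery expectation as a required rate (a takt-time analogue
# for story points) and compare it with the team's observed throughput.
def required_rate(total_points, weeks):
    """Story points per week needed to hit the date."""
    return total_points / weeks

backlog_points = 1360  # illustrative: remaining scope in story points
weeks_left = 8
needed = required_rate(backlog_points, weeks_left)  # 170.0 points/week
observed = 30.0        # team's current weekly throughput
print(f"needed {needed:.0f}/week vs observed {observed:.0f}/week, "
      f"gap factor {needed / observed:.1f}x")
```

Framed this way, the conversation shifts from "can you hit the date?" to a concrete, inspectable gap between required and observed rates.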
Mark Graban (16m 14s):
Well, I was gonna ask a follow-up and explore that, because this is where I would wonder about the translation. So, like, let's say with cars, that's easily countable. It's very discrete, it's very repetitive. You'd say the market demand is: every 60 seconds a new car is purchased. That's very straightforward. I would guess customers don't talk in terms of story points. They wouldn't say, I want a two-week takt time on new features. Then again, they might say that, I don't know. But it seems like, you know, the story points, it's both an abstraction and an estimate.
Mark Graban (16m 58s):
And I wonder, you know, are there different ways where that can get off track?
Luke Szymer (17m 3s):
A story point is a unit of measure in the same way a kilogram or a meter is, or something like that. Sorry, a pound or a mile.
Mark Graban (17m 12s):
That's okay. We can speak metric here. It's okay.
Luke Szymer (17m 18s):
And then in those units of measure, you can estimate, you can then measure how much was actually delivered, you can plan, you can express a takt time. So it's used in different ways. I think, actually, the calculation in terms of takt time is very similar, cuz it's about breaking up the overall demand at the end into some kind of a rate, right? So the overall demand is the same. It's just a question of a rate, and then you can express it in story points as opposed to cars. Yeah. And yeah, it's numerical.
Mark Graban (17m 56s):
Yeah. Yeah. But again, it seems like there's this estimate element. Like, back when I started my career in the automotive industry, when work is very discrete and very repetitive, General Motors at the time had these work estimation tools that were very, very precise, to be able to say, even before going and testing something in practice on the manufacturing shop floor, okay, here's a job that could be done in 55 seconds, based on how much motion and how much turning and how much weight was being lifted. But then when we get into the more abstract realm, we're not dealing with things that can be directly measured, like mass. So I wonder, and maybe this is the cynic in me, thinking of the dysfunctions: is there incentive for people to overstate the story points?
Mark Graban (18m 50s):
Well, you want me to follow this user story and develop this new feature? Well, I think it's a large number of story points. And then some manager might say, well, no, no, no, no, I think that's actually fewer story points. So how would you know, or how would you assign that story point number?
Luke Szymer (19m 9s):
So there's a standard tool called planning poker. Essentially, you're using the dynamics of poker. And this is kind of backed up in software engineering research, that this very much does help reduce the type of gamesmanship you're talking about after the fact. The key is to try and expose that up front when you're planning. And usually the way it works is that when you've got a particular unit of work, you know, a story, a task, whatever you wanna call it, the team who's actually going to do it discusses it amongst themselves. And then, in order to not influence one another, they vote with a number to estimate roughly how many story points they think it is.
Luke Szymer (19m 56s):
What level of complexity it is relative to all the other work being discussed, let's say, that day, but also relative to everything they've done in the past. And yes, it's very abstract. It's definitely not a bicycle or a car. But really, when we do have these planning estimation sessions, I'd say 70% of the time, you'd be surprised, all of the developers vote the same thing, despite not being influenced by one another.
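The simultaneous-reveal step of planning poker can be sketched in a few lines; the vote values and the simple spread rule below are illustrative assumptions, not a description of any specific team's process:

```python
# Planning-poker style vote check: reveal simultaneous estimates and flag
# whether the team converged or should discuss the spread.
def review_votes(votes):
    """Return (consensus_value_or_None, spread) for one round of votes."""
    spread = max(votes) - min(votes)
    consensus = votes[0] if spread == 0 else None
    return consensus, spread

# Small items tend to converge; larger ones expose differing assumptions.
print(review_votes([3, 3, 3, 3]))   # -> (3, 0): everyone agrees
print(review_votes([5, 8, 8, 13]))  # -> (None, 8): talk about the variance
```

As Luke says next, the point of a wide spread is not the numbers themselves but the discussion it triggers.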
Luke Szymer (20m 33s):
And it's just uncanny. Yeah. Especially with bigger things, cuz with little ones, okay, it's little, everyone agrees it's little. But once it's big, that's where you can have the potential differences. And in fact, the whole point of planning poker isn't so much the voting; it's the discussions, to talk about the variance, to figure out overall as a team what the right estimate is. And the keys are that it's done before you start the work, first of all, and second of all, the estimation isn't done by the managers. It's done by the people doing the work, in a, you know, relatively low-pressure, ideally low-pressure situation.
Mark Graban (21m 14s):
Yeah. Right. It seems like you would want to eliminate fear and other dysfunctions that would lead to gamesmanship or, you know, people sandbagging, or whatever term you use. But yeah, I'm reminded of different times when you would try to use a little bit of wisdom of the crowd when looking for estimates. Like, I've seen discussion around asking people, even in a healthcare setting, without going out and measuring, which would maybe be better. But let's say there are certain tasks that don't happen very often, so there's not an opportunity to go actually measure it. And you would ask people, how long does it take to do X? You might remove an outlier at each end of those estimates, because sometimes people are just bad at remembering or estimating. Maybe, you know, eliminate the outliers.
Mark Graban (22m 2s):
And they say, well, okay, there's kind of some consensus there. That might be good enough until you can go test that assumption or test that hypothesis in reality.
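The outlier-trimming idea Mark describes amounts to a trimmed mean; the task durations below are invented for illustration:

```python
# Trimmed-mean estimate: drop the highest and lowest answer before
# averaging, as a cheap guard against bad memories and outliers.
def trimmed_estimate(estimates):
    """Mean of the estimates with one value removed from each end."""
    if len(estimates) < 3:
        raise ValueError("need at least 3 estimates to trim both ends")
    trimmed = sorted(estimates)[1:-1]
    return sum(trimmed) / len(trimmed)

# "How long does task X take?" answers in minutes (illustrative):
answers = [20, 25, 30, 28, 90]    # the 90 is someone misremembering
print(trimmed_estimate(answers))  # averages 25, 28, and 30
```

The result is only "good enough until you can go measure," which matches how Mark frames it.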
Luke Szymer (22m 13s):
Yeah, exactly. It's enough to be useful. It doesn't need to be precise down to the decimal point; as long as it's in the right order of magnitude, it's enough to get going. And then there's also a cost to spending more time estimating to get more precision. So that's the other side. You want enough estimation for it to be useful, but not so much that you spend a week estimating and planning, whereas you could just be doing things: prototyping, building something, that kind of thing. So yes.
Mark Graban (22m 52s):
Yeah. You mentioned the book Lean Thinking. It seems like there's a bit of a parallel when you talk about the design of a factory. Womack and others would write about a departmental factory layout, a functional layout: all the welding machines are in one area, all of the cutting machines are in another. And so then things would almost by necessity kind of burp along through the system in batches, right? Because of these long distances, one-piece flow wouldn't make sense to somebody who has to carry a bicycle frame a long distance. You're gonna accumulate a large metal basket full of frames and then move the basket.
Mark Graban (23m 37s):
But then, you know, I think the one key insight of Lean Thinking is to have a flow-based layout, or have production cells, or even just have whatever the sequence of operations would be: one cutter, one grinder, one welder. And then you could have one-piece flow because the machines are literally right next to each other. Yeah. It seems like then there are parallels to software, where people can think about reducing batches. And the functional departments in software might be, back in the day, collecting requirements, building the software, testing the software. Like, people used to just think, well, it had to be a large batch process.
Mark Graban (24m 25s):
Right. So I'm curious about your perspectives on some of that evolution of moving toward flow within the design of a software system.
Luke Szymer (24m 35s):
Yeah, absolutely. I mean, I think the overall change that I've seen since the nineties is that before, it was very much functional, long-term stages of, like you say, first collecting requirements, and then going through various stages, and then finally, at the end, something goes out to the client. Now, I think the move has been very much towards having a cross-functional team. So everything needed to produce one unit of software is within one team, and then it's just a question of how they cooperate amongst each other. And in practice, the way that I've managed to, let's say, work on something like waste is mostly, you know, quantifying the time between stages.
Luke Szymer (25m 25s):
So basically, at the team level, how long does a particular story or task stay in a waiting state between each, let's say, not-quite-functional area anymore, because it's specific people or specific groups of people within a team that are gonna be handling something. And you very much can apply that in a software context. I mean, I had one project where we went down from an overall cycle time of like three and a half weeks down to 37 hours. Of course, that was after doing a whole bunch of improvements: identifying changes in infrastructure, changes in all kinds of tooling, writing code that helps write documentation, all kinds of different, more or less insane ways of speeding things up.
Luke Szymer (26m 20s):
And it really did help quite a bit. And I think the fact that it was kind of an observable, collectible measure made it helpful. And yeah, I think it definitely shocked everyone when we first started that the amount of time the tasks would be in a waiting state was multiple times what the actual work time was. And in that context, sitting on someone's neck, you know, why did you spend so much time developing this, is kind of irrelevant, because it's not gonna be done anyway, because of all these process inefficiencies. So that's kind of the direction that it's gone, I'd say.
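The wait-versus-work measurement Luke describes is often summarized as flow efficiency: work time divided by total elapsed time. A rough sketch, with an invented task history rather than data from his project:

```python
# Flow efficiency from timestamped state transitions: how much of a
# task's cycle time was actual work vs waiting between stages.
from datetime import datetime

def flow_efficiency(transitions):
    """transitions: list of (timestamp, state) pairs, state 'work' or
    'wait', ending with a final ('done') marker; returns work / total."""
    work = wait = 0.0
    for (t0, state), (t1, _) in zip(transitions, transitions[1:]):
        hours = (t1 - t0).total_seconds() / 3600
        if state == "work":
            work += hours
        else:
            wait += hours
    return work / (work + wait)

d = datetime.fromisoformat          # illustrative timeline
history = [
    (d("2022-06-01T09:00"), "work"),  # development: 8h
    (d("2022-06-01T17:00"), "wait"),  # queued for review: 40h
    (d("2022-06-03T09:00"), "work"),  # code review: 2h
    (d("2022-06-03T11:00"), "wait"),  # queued for deploy: 70h
    (d("2022-06-06T09:00"), "done"),
]
print(f"flow efficiency: {flow_efficiency(history):.0%}")
```

Here 10 hours of work sit inside 120 hours of elapsed time, the kind of wait-dominated ratio Luke says shocked everyone.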
Mark Graban (27m 10s):
Yeah. And one other thing I was gonna ask you: I'm curious if this has moved in a better direction. You mentioned, when we first started talking here, what I thought was an interesting phrase I hadn't heard: the fuzzy side of innovation. Thinking back 20, 30 years ago, do you think, through different methods, including what we've learned from, you know, Steve and Eric Ries and others, our organizations are wasting less time when it comes to innovation? Is it less fuzzy because we're better at testing hypotheses instead of making assumptions? It might be hard to generalize, but what do you think?
Luke Szymer (27m 56s):
Yeah, I mean, I think, again, it depends a lot on the individual companies. But on the whole, there's a whole grab bag of analytical tooling to help break down something that would before be largely an intuitive process amongst a group of people, kind of a skunkworks environment: you know, lock them in a dark room, tell them to go figure it out, at least in a corporate context. Startups pretty much do that themselves, or in a garage, right? The location is slightly different, but it's the same idea.
Luke Szymer (28m 37s):
And yeah, I think, basically, applying a more quantitative approach and a feedback loop approach, that's what's certainly changed for me relative to, you know, the stuff that I would've read in college around innovation in the late nineties, for example.
Mark Graban (28m 55s):
Yeah. Yeah. So then there's innovation in how we go about innovation. And hopefully these new practices, or even new ideas that are being developed now, are framed as some sort of hypothesis. You know, we have a theory. And I think this is a really fundamental mindset, just even bringing it back to the work that I'm most involved in around continuous improvement. There's a huge difference in mindset between saying, ah, I have an idea, I know this is a good idea, I know it's going to work, like, that mindset can get you in a lot of trouble, even with the smallest of changes in a workplace, as opposed to saying, I've got an idea, what needs to be true for this to work?
Mark Graban (29m 42s):
What assumptions am I making? How do I test this idea? It might not be a good idea. You know, kind of the honest recognition of some of those things. I think maybe that's sometimes easier said than done. People could go and study Lean Startup and different methods and still fall into the I-know-my-idea-is-a-good-one trap.
Luke Szymer (30m 4s):
Yeah. There's some kind of individual personality thing going on there when it happens, I think. And to some extent, there is a little bit of a, let's say, structured thinking slash numerical skills component, particularly for the more, kind of, experiment construction part. Some people get it right away. I mean, I've run workshops, like open workshops, on this topic, and half of the people there just obviously immediately get it. And then there's some that are just really struggling with certain things, simply because they don't feel confident enough in their math skills.
Luke Szymer (30m 50s):
So there's a bit of that. Yeah. I mean, my favorite one is landing page testing, I think. So with landing page testing, essentially what you're doing is putting up a webpage which describes a product or a service which either doesn't exist yet or, you know, you're planning to release; you've got, for example, a release date or something like that, and you structure the interaction that way. Either at the level of a startup, where the whole company doesn't exist and it's kind of a one-product company, or as a kind of off-brand exercise for a larger company, where they just wanna see how the market would react.
Luke Szymer (31m 47s):
And essentially what you're doing is creating a kind of structured way to test numerically what the market structure is, what the demand is for something. And there's a couple of parts there. So one thing is what's on the actual landing page: what the thing is, what the value proposition is. But it's also what segment you're going after, you know, what channels you use to reach them. I mean, there's different subparts there. And essentially you're trying to get this initial match between a group of people and a thing they actually really want, before you do anything else, or at least relatively early on in the process, or, for example, in parallel as you're working on it.
Luke Szymer (32m 41s):
I know in healthcare it's definitely a lot more complicated. I've worked with medical device startups, and it's one thing to test demand; it's another thing when you're talking about things that are clearly tied to people's health. So I think you just need to be really careful to make sure that you aren't doing something that's going to hurt someone, and that it's okay from a regulatory perspective. Right. But then, even before you do a full landing page test, there are a number of different types of tests you can run.
Luke Szymer (33m 28s):
So, for example, a really common issue, especially with tech startup founders (I work a lot with engineering-type founders), is that they've come up with some widget they think is amazing, but because they're so close to it, it's difficult for them to articulate it in a way that's easy for someone else to understand. And if someone doesn't understand what the product being pitched is, they're definitely not going to buy it. So I think this is more of a necessary condition.
Luke Szymer (34m 13s):
It's not a sufficient condition. They have to understand it in order to want to buy it. But if you're trying to enter a market and your marketing communication is completely unclear, then clearly you're going to get a false signal. It might still be a great idea, but if you can't articulate it, then who cares? And there are a number of ways to do that. The best is directly with customers: just pull up the webpage on a tablet and get feedback from them, that kind of thing. Or there are different tools, originally meant for more of a UX context, that can quantitatively measure how visually clear the layout of a particular landing page is,
Luke Szymer (35m 6s):
or look at the attractiveness of the headline, something like that. And this stuff is important not only in the context of testing an idea, but also when you're creating something new. It's not just that zero-to-one act of creating something. Especially for a startup, the landing page is the 21st-century equivalent of a business card, right? You're testing your identity to some extent, and getting it clear and understandable for a particular group. There's a lot of testing you can do around that.
Luke Szymer (35m 49s):
Mark Graban (35m 49s):
Well, I wonder if anyone ever did this with business cards: A/B testing, or testing and evaluating different alternatives. I don't know if people ever tested it, like, "I have two different business cards, and I'm going to hand them out to people and see which card leads to the most follow-up calls." You could do that in real life, maybe. Right?
Luke Szymer (36m 10s):
It sounds like something an old-school direct marketer would do, to be honest. Yeah. Or they'd have a couple of versions, go to a big event, put a different phone number on each one, and then see which one gets called, that kind of thing.
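The tracking scheme Luke describes (a distinct phone number on each card version, so callbacks can be attributed to a version) can be sketched as a simple comparison of callback rates. All the counts below are invented for illustration.

```python
# Sketch of the old-school direct-marketing A/B test described above:
# each business card version carries its own phone number, so every
# follow-up call can be attributed to a version. Figures are invented.

def callback_rate(calls, handed_out):
    """Fraction of handed-out cards that led to a follow-up call."""
    return calls / handed_out

versions = {
    "A": {"handed_out": 120, "calls": 9},   # card with phone number 1
    "B": {"handed_out": 130, "calls": 18},  # card with phone number 2
}

rates = {name: callback_rate(v["calls"], v["handed_out"])
         for name, v in versions.items()}
winner = max(rates, key=rates.get)

for name, rate in sorted(rates.items()):
    print(f"Version {name}: {rate:.1%} callback rate")
print(f"Leading version: {winner}")
```

With small counts like these, the "winner" is only suggestive; the small-sample caveats discussed later in the conversation apply here too.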
Mark Graban (36m 29s):
Yeah. So it seems like there's that trap, again, this difference between saying "I know this landing page is good, I know that headline is good" versus "I have a hypothesis, let's test something," or "let's test a couple of things in parallel." And hopefully you have enough data points to make a good decision. But I know one thing that you've worked on and thought about is: how do we make good decisions if we have a small number of data points?
Luke Szymer (36m 59s):
Yeah. So this is, I think, the classic problem with numerical tools in early-stage innovation, because you can't necessarily afford a large-scale corporate survey or something like that to get a full picture. But I think the two numbers that I find the most useful are five and 30. Five is a rule of thumb for qualitative exploration of ideas: if you go and do five interviews with different people in a market, you'll definitely learn something every single time.
Luke Szymer (37m 51s):
Every time I've done that, or worked with teams that have, that's always been the case, relative to what you knew before you did the interviews. And the actual core idea of something that's super valuable is somewhere in that qualitative data set of things you've heard. Then, when you're ready to start thinking about testing, it's a question of what's operationally feasible. The smallest sample size where you can start reasoning statistically is probably around 30. So you can go and have that many interviews with people on the street, if you're doing a consumer product or something like that.
Luke Szymer (38m 44s):
So that gives you something. And the good thing about a somewhat smaller sample size is that you can run a lot more experiments with the same number of potential observations. So yes, it's not as precise and definitive, but on the flip side, you can have a much greater variety of experiments and much more robust learning when you're entering a new market.
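A quick way to see what "around 30" buys you, and why it suits coarse go/no-go learning rather than precision, is to compute a rough confidence interval on an observed rate. This sketch uses the normal-approximation (Wald) interval, which is a simplification the source does not specify; the counts are invented.

```python
# Sketch: how much a rate estimated from ~30 observations can tell you.
# Uses the normal-approximation (Wald) interval; illustrative only.
import math

def wald_interval(successes, n, z=1.96):
    """Rough 95% confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Suppose 9 of 30 street interviews say they'd buy. The interval is
# wide, so a sample of 30 separates "promising" from "dead on arrival"
# but won't pin the rate down precisely.
lo, hi = wald_interval(9, 30)
print(f"estimate=30%, rough 95% CI ≈ ({lo:.0%}, {hi:.0%})")
```

The interval spans roughly 14% to 46%: enough to rule ideas in or out, which is all an early-stage experiment usually needs.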
Mark Graban (39m 12s):
Yeah. And it seems like when you're evaluating, let's say, a landing page response rate over time, there's time series data, there's a metric. You could apply a Process Behavior Chart to that, to avoid the situation of saying, "Well, the response rate fell from 14% to 12%, let's go do a root cause analysis," when that number might just be fluctuating around an average.
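The Process Behavior Chart Mark mentions can be sketched as an XmR (individuals and moving range) chart in a few lines. The 2.66 scaling constant is the standard XmR factor; the daily rates below are made-up illustrative data, not real campaign numbers.

```python
# Sketch: XmR (individuals) Process Behavior Chart limits for a daily
# landing-page response rate. Illustrative data only.

def xmr_limits(values):
    """Return (mean, lower_limit, upper_limit) for an XmR chart."""
    n = len(values)
    mean = sum(values) / n
    # Average moving range between consecutive points
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

daily_rates = [0.14, 0.12, 0.15, 0.13, 0.11, 0.14, 0.12, 0.13]
mean, lo, hi = xmr_limits(daily_rates)

# A drop from 14% to 12% is only a "signal" worth root-causing if it
# falls outside the natural process limits.
signals = [r for r in daily_rates if r < lo or r > hi]
print(f"mean={mean:.3f}, limits=({lo:.3f}, {hi:.3f}), signals={signals}")
```

For this data the limits come out roughly 7% to 19%, so a day at 12% is routine variation, exactly the overreaction the chart is meant to prevent.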
Luke Szymer (39m 40s):
Yeah, definitely. I think at this stage you're very much getting into the realm of traditional conversion rate optimization, which at this point is an established way of looking at it from a marketing perspective in an established company. Whereas the startup view of landing pages is a little bit different, in that you aren't optimizing a sales process that already exists, which is what CRO does. You're trying to nail what it is you want to offer in the first place. So that's a slightly different way of thinking about it.
Mark Graban (40m 14s):
Yeah. But it seems like there would be a flow of different hypotheses baked into even putting a landing page out there. First, can people find the page? Are people interested in learning more? Would they signal, "Yes, I would actually buy this"? (Saying "I would buy it" versus actually buying it is different.) And then hopefully you've already sorted out the question of whether you can deliver it. You were talking earlier about healthcare, and I was starting to think: if Elizabeth Holmes had started with a landing page for Theranos, a lot of people would have said, "Yeah, bring it on, I want all these lab tests done with a drop of blood." But that wasn't the issue. She never delivered; the company never delivered.
Luke Szymer (40m 57s):
What she was saying. Right. Yeah, exactly.
Mark Graban (41m 1s):
Well, Luke, we're running a little short on time here, so there's a lot more we could discuss, but I think we'll have to leave it at this today. I do want to mention again your most recent book, Launch Tomorrow: Take Your Product, Startup, or Business From Idea to Launch in One Day. If you want to learn more about landing pages, minimum viable products, measuring and evaluating things, and testing ideas as hypotheses, there's a lot to explore there. So again, the website for Luke's company is launchtomorrow.com, and the podcast is Managing Remote Teams.
Mark Graban (41m 46s):
Now, a question on the podcast: is that something you had been exploring and talking about well before the pandemic, or was this a pandemic adjustment?
Luke Szymer (41m 55s):
It was admittedly a pandemic adjustment, yeah. When I saw a lot of the content showing up online around remote work, the initial reaction was obviously the first-order stuff, like what types of tools you need and all of that. But having worked with remote software teams for quite a while, I felt that the real questions were probably three or four levels deeper. That was what I wanted to explore in the podcast, basically.
Mark Graban (42m 33s):
So, Managing Remote Teams; people should check that out. Because I'm sure, Luke, you're working with people every day, and in how many different time zones?
Luke Szymer (42m 43s):
At the moment? Not very many. But I've worked with people across 13 time zones in the past, and there are ways of doing it. But if you can avoid it, obviously that's better.
Mark Graban (42m 60s):
There's remote, and then there's globally remote, like, how...
Luke Szymer (43m 3s):
Remote distributed. Yeah.
Mark Graban (43m 6s):
Well, Luke, thank you for the discussion today. Thanks for sharing some of your thoughts and perspectives around innovation and software. I think there are transferable lessons in all directions. And I appreciate you sharing with me privately, and talking a little bit here today, about your use of Process Behavior Charts in that realm. So I do appreciate you doing that, sharing that, and being here as a guest today.
Luke Szymer (43m 34s):
Thank you, Mark.
Announcer (43m 35s):
Thanks for listening. This has been the Lean Blog Podcast. For lean news and commentary updated daily, visit www.leanblog.org. If you have any questions or comments about this podcast, email Mark at email@example.com.