Can you create an Evidence-Guided Roadmap? | Itamar Gilad

In episode 36 of Talking Roadmaps, Phil Hornby interviews Itamar Gilad, a product management expert. They discuss creating evidence-guided roadmaps, exploring Itamar's GIST Framework and The Confidence Meter. Itamar shares insights from his book, "Evidence-Guided: Creating High-Impact Products in the Face of Uncertainty," and his experience at Google and Microsoft. The conversation delves into practical strategies for product managers to navigate uncertainty and make informed decisions.

Itamar is a coach, author and speaker specializing in evidence-guided product management and product strategy. For over two decades he held senior product management and engineering roles at Google, Microsoft and a number of startups. At Google Itamar worked at YouTube and led parts of Gmail.

Itamar is the author of the book Evidence-Guided: Creating High-Impact Products in the Face of Uncertainty. He also publishes a popular product management newsletter and is the creator of a number of product management methodologies, including the GIST Framework and the Confidence Meter.

Itamar is based in Barcelona, Spain.

Have a watch and if you enjoy the video don’t forget to subscribe to the channel, like it and maybe sign up to our mailing list!

Here is an audio-only version if that’s your preferred medium - and you can access it through your favourite podcasting platform if you prefer (Apple, Spotify, Amazon).

In the next episode we are talking to Gabrielle Bufrem, Product Leadership Coach. So watch out for Season 1 - Episode 37!

  • - Welcome to Talking Roadmaps, the channel where we talk about the good, the bad, and the ugly of roadmapping. Today I'm joined by Itamar Gilad. Itamar, can you introduce yourself?

    - So, I'm a product coach. I work with product teams on building high impact products and before that, I used to be a product manager for about 15 years in startups in Israel where I'm from and in some bigger companies like Microsoft and Google. And recently, I'm also an author. I published a book.

    - You made this one?

    - Oh, exactly. You have it.

    - If you're enjoying the channel, subscribe, hit the bell and give us a like. Maybe you could tell our listeners a little bit about the book and the kind of fundamental premise of it.

    - So it's called "Evidence-Guided" and it basically tries to help product people such as ourselves move the organisation a little bit away from opinions and judgement and all this good stuff, which is important, into a slightly more evidence-guided approach, which is a little bit more like how science works, and law, and medicine, where we combine human judgement with evidence. And the aim is to break that hard transition (and it is hard) into multiple layers and multiple manageable sections. So that's what the book is about.

    - Yeah, and I love the term evidence, as opposed to data, which is what people too often talk about, right? Evidence is, I guess, a softer, woollier word, but it includes that judgement and the qualitative elements as well as the quantitative ones. I really liked the title.

    - I think that's a really good point. We're in a situation where we're inundated with data. We have a lot of data, we have analytics, we do user research, but data is not evidence. Data is just facts and numbers. What we really need to drill into and find is those facts and numbers that either support our hypotheses or refute them. And that's much rarer, much harder to do, and requires some judgement. So that's why evidence is something that tells the story while data is not. And people tend to confuse the two. So that's why I chose evidence.

    - Yeah, I'm often heard quoting that data tells you nothing; data needs to be interpreted, put into context, and a story told around it. Which means it sounds like evidence is a bit like roadmaps, 'cause we often do a lot of storytelling there. So maybe we'll start out with a nice softball: what's the purpose of a roadmap? And feel free to weave in some of your concepts from the GIST framework.

    - All right. First off, I find the word "roadmap" very vague and ambiguous, and many people interpret it in different ways. I'm sure you've encountered this numerous times 'cause you're an expert in the field and have had a lot of people speak about it. I think on the one hand, there's the very classic roadmap of releases, a longer timeline, sometimes for a year. I've had clients that had to plan for multiple years. Now it's more popular to do it for a quarter and then to be increasingly more vague about what's further in the future. But the core premise is that there are some releases we're committing to and they're coming at a certain date. Feel free to interject if you disagree that that's one interpretation of a roadmap.

    - And so we're talking about releases there. So releases on a date. So that's an-

    - I'm saying that it's a spectrum. That's one interpretation on the far end of the spectrum. Another interpretation that I'm more inclined to support is that our roadmap has to be more about outcomes: what we're trying to achieve along a timeline. In fact, in the book, in chapter six, I give an example of what this might look like. And I think there's a gamut in between. I think in some cases, it's okay actually to show output, to show launches. And in other cases, it's not, and it's actually pretty destructive and counterproductive to do this.

    - Interesting, you used another word there that's often overloaded. You said roadmaps are vague and overloaded. Outputs, or in fact, no, outcomes are often overloaded. Now are we talking about customer outcomes or are we talking about product outcomes? Are we talking about kind of changes in behaviour? Are we talking about moving a metric, or are we talking about a job to be done?

    - Great question. Honestly, outcomes is also very confusing these days. I would say this: for me, outcomes are anything that is tied to the goals, like when we construct our goals. And that's the top layer of my model, which is, by the way, called GIST: goals, ideas, steps, and tasks. Those are the four layers, the four big areas of change we need to adopt or embrace in order to move to an evidence-guided model. So in the goals, we want to say in a qualitative way what we're trying to achieve, that's the objective. But we also want to say in a more quantitative way how we measure progress. And anything that indicates progress might be a good outcome. So it could be changes in customer behaviour, it could be a reduction in page load time, it could be anything we can measure and point to and say we achieved that, we actually are moving in the right direction. And this distinction between product outcomes, customer outcomes, business outcomes is maybe useful. But for me, it's all a mixed bag. These are all outcomes. They're all good. They all just need to be picked. We really need to have a model to say: these are the most important ones at company level, these are the most important at team level, and these are the few, the very few, we're picking as our goals. And from there on, we can create a roadmap of outcomes.

    - I guess there might be a play on that. You talked about goals being on that. Does that influence the audience that we're showing this thing to then?

    - If you use an outcome roadmap, I think it's super important. At the company level, I would say it's a yearly thing and it kind of needs to reflect how we interpret the company strategy this year. What are the three, four most important objectives that we need to capture? And then we can try to lay them on a timeline and say, "By end of Q2, we want to be deployed in three emerging markets." That's an outcome. "By end of Q3, we need to resolve our security issues 'cause this is killing our business right now. And by end of the year, here's yet another major objective and set of outcomes." Attached to that are the two most important outcomes that I suggest focusing on, which I call the top-level metrics: the North Star metric, which measures how much value we deliver to the market, and the top business metric, which is, out of the many business metrics we can come up with, revenue, profit, market share, etc., the most important thing we need to grow this year. And we can put targets on those as well for the year. If you want, we can delve into this later, or you choose.

    - Take us there now. I mean, what you just described, I almost started to think about as OKR zero. You know, companies have OKR one, two, three quite often, I find. Having that North Star metric, you know, the most important product goal, and the most important business goal as the key results of an OKR zero, that's almost like a foundational thing. That's kind of the way I think about those two things. And then the others can kind of ladder down, almost as KPIs in a KPI tree.

    - I think I'm going to steal this name. It's a great branding for this OKR. Just to explain, and I agree: the top, top objective of the company is the mission. What are we trying to do? What value are we trying to create, and for whom? You know, the famous examples are to organise the world's information and make it universally accessible, to connect professionals and make them more productive (that's LinkedIn, by the way; the first one was Google), or to spread ideas (that's TED). I like these missions because they're very much about what value we create. Just declaring, you know, let's be the number one company in our sector or the leading provider of X is not very inspiring and not very original and not actually helping anyone, 'cause it doesn't deliver information. So you have this objective. But then you need to measure whether or not you're moving in the right direction. And those are those two top metrics. So the top business metric is like saying, this year, we really want to hit 250 million in revenue. Why? Because we need to secure the next round of investment or because we want to go for an IPO, whatever it is. This is the target, guys, this is the most important business metric. It doesn't mean we don't care about profit, but if you need to make a trade-off, here it is. And that's kind of the idea. We need to be concise and clear. Companies try to be very vague about all of this. And then there's the North Star metric, which is even more interesting for product organisations, which is about how we measure how much value we are creating. I can give some examples. WhatsApp always measured the number of messages sent, 'cause every message sent is a little increment of value to the sender and to the receiver: it's free, it's rich media, you can send it from anywhere in the world. Compared to SMS, it's value. So if in year one there were a billion messages on the platform, and in year two, 2 billion, we can roughly say we doubled. And you can do the same kind of exercise for multi-sided marketplaces, for analytics tools. Another great example comes from Amplitude, the analytics tool. They measure weekly learning users, which is the number of weekly active users that found an insight in the tool, an insight that was so important that they shared it with at least two other people who consumed it. And that for me is the essence of the North Star metric. It finds the core value that your users are looking for, in this case insights, not just fancy graphs and data, and measures how much of that you are producing.

    - Feel free to steal that term, 'cause I was inspired to give it that name when reading an early draft of your book, so.

    - By the way, if you are writing a book right now, Phil is your guy. You need to send it to Phil. You will get super actionable and really valuable feedback and nice support later on. So thank you, Phil.

    - Appreciate it, Itamar. But yeah, it's funny, 'cause I was working with a company on their OKRs and I was thinking, we've got this gap: we've just got the business metric, we're missing that North Star metric. And it just started to explain the challenge we were going through, of everything being one-sided.

    - I would say if you have these two metrics defined (and it's a pretty hard discussion, it takes a lot of iterations) and the mission, you're already off to a wonderful start for your roadmap, because you already have something to lay down. Even if you don't know what you're going to do at all, even if you don't have OKRs, you just say we need to see this growth this year in these two metrics, let's project what's possible and let's push ourselves to be a little bit more ambitious than that. For a lot of startups, that's all they need, honestly. But then you can drill down further and break this into submetrics. And that's again what I explain in the chapter about goals: how to create a metrics tree. In the metrics tree, you just find submetrics, or input metrics as they're sometimes called, that support your North Star metric and your top business metric, and then metrics that support those, and so on, until you have a layer of metrics that are much more actionable, much quicker to move. And those you can assign to product teams and they can own them consistently. You know, the onboarding team will own the percentage of successful onboards and the search team the click-through rate on the top 10 results, or whatever it is. But it all connects to these two top metrics, and that really enables teams to measure the impact of their ideas and also enables them to create goals without having, you know, the management tell them what the goals are. And again, that all connects to this roadmap that we can then create on a company level and even on a team level.
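
The metrics tree Itamar describes can be pictured as a simple hierarchy: a North Star metric at the top, with input metrics underneath that individual teams own. Here is a minimal Python sketch of that idea; the metric names and team assignments echo the examples in the conversation, but the structure itself is illustrative and not taken from the book.

```python
# A minimal sketch of a metrics tree: a North Star metric decomposed into
# input metrics that product teams can own. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Metric:
    name: str
    owner: Optional[str] = None              # e.g. the team that owns this input metric
    children: List["Metric"] = field(default_factory=list)

    def add(self, child: "Metric") -> "Metric":
        self.children.append(child)
        return child


# Hypothetical tree in the spirit of the WhatsApp example above.
north_star = Metric("weekly messages sent")
north_star.add(Metric("% of successful onboards", owner="onboarding team"))
north_star.add(Metric("click-through rate on top 10 search results", owner="search team"))


def print_tree(metric: Metric, depth: int = 0) -> None:
    owner = f"  (owned by: {metric.owner})" if metric.owner else ""
    print("  " * depth + metric.name + owner)
    for child in metric.children:
        print_tree(child, depth + 1)


print_tree(north_star)
```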

    - So we've kinda already started talking around vision and strategy or mission and kind of strategy objectives and how they influence things. So we've kind of got into some of my later questions. But one thing that occurred to me as you were talking there, an outcome-based roadmap that is about say the KRs that we're going to move or the lower level metrics, input metrics, that feels very internally focused. Is that something you would think about sharing with an external, with a customer for example?

    - That's a good question. I realise that the roadmap is perceived as a sales tool in many organisations, especially large B2B. So you share with customers in advance either the entire roadmap or at least the relevant features that are coming. It helps them prepare. It also helps answer the question of when we're getting what. So it's obviously very important for a lot of organisations. There's a challenge there, in that it sometimes motivates the customers to start demanding things on the roadmap. It becomes part of the contract, part of the negotiation, because they know you're going to show them the roadmaps and they know they need to demand it. Why is this bad? For one thing, they're not necessarily going to use the thing they demanded. And on the other hand, even if you build it and you give it to them, they might demand something else. And these are just two eventualities. And we might end up in a situation where we're just building one-offs for the customers, just to satisfy them and then show them on the roadmap. So I would suggest don't start from there. Start from the goals. The goals are based on customer needs at the end of the day as well. And then when a customer says, "I need this feature," first check whether or not we're trying to address that need, or try to understand what the need is and see if the need is answered by another feature or another objective that we have. Let me make it concrete with an example. So a customer comes and says, "I need this security feature. I need this new form of authentication." And the temptation is just to go and say, "Yeah, let's put it on the roadmap." But I would say let's have a discussion, with the product manager talking to the customer, or a more senior product manager depending on the seniority of the person asking, and try to drill into the need. And then you might have a discussion like, "You know, we're not sure we will produce this exact feature, but we are definitely trying to address the problem that you're raising and here are some of our ideas, we don't know which of those. And if you want to be an early tester as well, like join our early adopter programme, that would be great." And sometimes that answers their demand. Sometimes it doesn't, but at least it moves us in the direction of us being in control of our destiny rather than the customers.

    - Yeah, 'cause as you were talking, I was thinking, well, okay, so we've talked about the goals. Now we're kind of transitioning into ideas and steps, almost, if we're thinking about GIST, 'cause we might communicate: these are some of the ideas for how we might address that goal and these are the steps we're gonna take, in terms of the experiments, if I remember rightly, to kind of validate. So we might have a prototype that we use with that early adopter customer around one of those ideas to see, does it solve the need, for example. And I guess we could roadmap those; closer in timeframe, we'd have some level of certainty.

    - So you explained this brilliantly and it's all correct, but I think your audience might need a definition of what ideas and what steps are. So let me take a stab at it. Ideas are the features or products we might want to launch, or business initiatives. Anything we might build or work towards is an idea. It's not an opportunity, it's not a customer need, it's a concrete way to address those things. What we used to call solutions back in the day. And most of the roadmaps that I see actually include ideas: solutions on a timeline, saying we'll launch this or that. And I would argue that that may have worked well in a world that was very predictable, where we could predict which ideas are the best for the next year or the next six months or even the next three months. But that is not the case in a lot of companies. And we have a lot of statistics that come from A/B experiments that show that the vast majority of ideas actually don't do what we expect. They don't move the metrics we expect, they don't create any positive value for the customers or the business. And it's usually more than half, depending on the type of company and the type of product, which means if we do the classic roadmap, we're taking huge bets and we're creating a lot of waste. And honestly, I think that was the situation for most of my career, working for 20 years as an engineer and a product manager. And it's the situation I see with most of my clients. So the more modern version of this, and we both know the people behind some of these ideas, is to discover the product. What does discovery mean? First, you need to do some research: you need to understand the customers, you need to understand the market, you need to understand the technology, and you need to do this on an ongoing basis. The research generates, first of all, opportunities and needs, but also ideas and goals. And then we have this bag of ideas that come from all over, sometimes from the stakeholders, sometimes from the customers. And then we need to kind of decide which ones we should invest in. And that's the other part of discovery. We need to test them. We need to evaluate which ones are most promising and then test the ones that seem most promising. And that's the core of GIST as well. It's a discovery method essentially, or a discovery framework. And the question is how to package this into a classic roadmap, or a roadmap at all. And I think that's kind of where you were hinting, and I have an idea, but I want to hear your thoughts before.

    - Yeah, I mean, I guess interestingly, I've started using the term explore even before discover. So we've gotta go and find those problems, those needs, then we evaluate whether we're going to act on them, whether we can address them, whether they're valuable. And then we go and figure out: can we solve them, can we create a solution for them? So I kind of frame exploration as finding it, tied in with the strategic context, and discovery as then evaluating that problem space, whether we can address it, then following on to delivery.

    - A very old idea, right? We had the double diamond from design thinking, we had build, measure, learn, Marty Cagan's product discovery. It all kind of talks about the same principles.

    - Where I've leaned these days is, as I get further out in time, the further away I am, I'm generally looking at the objectives, with maybe some really high-level ideas of the problems I might solve. As I get into the nearer term, I'm looking at the problems or the opportunities I might address, very much tied into what I believe are gonna be my next goals, because obviously goals evolve. And in my shorter term, I've got that hierarchy of: here's my goal, here's the opportunity or the need I'm gonna address, maybe a job to be done or a user outcome. And then I've got some ideas of the solutions, but I'm still evaluating them, or, if I've done enough work evaluating them already, I'm now delivering them.

    - I think that describes for me what an outcome roadmap needs to look like. So actually, most of the timeline is populated by objectives and outcomes and when we want to address them. In the short term, we start to speak about ideas, and I think here the key concept is confidence. I stole this from Sean Ellis, who invented ICE. Before, there was just impact and ease evaluation, but Sean added confidence to create what's called ICE scoring. And for me, confidence is the answer to the question: how sure are we that this idea is actually going to create the impact that we expect, at the cost or ease that we expect? And it's based on evidence. What evidence do we have? And the trick is to bring the ideas from concepts (we may be believing in them a lot, our judgement may tell us this is a brilliant idea) to a place where the evidence tells us this idea is validated enough to invest in, to switch to delivery, it's okay. And we want, in every quarter, to be in possession of some ideas that are just being evaluated and tested in discovery and some ideas that are already ready for delivery. And those, I'm perfectly fine with putting on the roadmap as outputs, as in: by May 15, we're going to launch this, you know, the padlock feature for security or the new onboarding wizard. Perfectly fine. If you tested it and you have sufficient confidence that this does what you expect, the right thing now is to launch it, and it's okay to tell the customers, or whoever you want to tell, that we're gonna launch it. The problem is most of the existing roadmaps are not in this state. They're in early-confidence states. They're mostly based on opinions and what I call anecdotal evidence, which is usually either the CEO talked to a few customers or we are copying this feature from the leading competitor, assuming that they know something we don't. And that's a disastrous pattern. We're talking about these things as if they're a sure thing, but they're actually low confidence.
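
The ICE scoring Itamar mentions can be made concrete with a small sketch. The impact and ease numbers below are invented, and multiplying the three scores is one common formulation rather than a quote from his materials; the point is simply that confidence is a separate input driven by evidence.

```python
# A rough sketch of ICE (Impact, Confidence, Ease) scoring for an idea bank.
# The example numbers are made up, and multiplying the three scores is one
# common way to combine them, not necessarily the exact formula Itamar uses.
from dataclasses import dataclass


@dataclass
class Idea:
    name: str
    impact: float      # expected effect on the goal metric, 0-10 (judgement)
    confidence: float  # how much evidence backs that expectation, 0-10
    ease: float        # how cheap/quick it is to build, 0-10 (judgement)

    @property
    def ice_score(self) -> float:
        return self.impact * self.confidence * self.ease


ideas = [
    Idea("new onboarding wizard", impact=8, confidence=0.5, ease=4),   # anecdotal evidence only
    Idea("settings page redesign", impact=3, confidence=6.0, ease=9),  # backed by test results
]

for idea in sorted(ideas, key=lambda i: i.ice_score, reverse=True):
    print(f"{idea.name}: ICE = {idea.ice_score:.1f}")
```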

    - Absolutely. And I think, if I remember rightly, that's where steps come into the GIST framework. They're about the sequence or series of activities or experiments you're gonna run to level up that confidence. Is that right?

    - It is correct. The ideas layer is just about collecting ideas in idea banks and ranking them using ICE. And by the way, for confidence, I created a special tool called the confidence meter, which is kind of a mapping table if you like, but I put it on a wheel so it's a little bit more pleasing to look at. And it works like a thermometer. It says if you only have opinions, even if it's the opinion of senior people in the company, you have very low confidence, around 0.1 out of 10. If you have some data but it's anecdotal, you have 0.5 out of 10. If you have data that comes from surveys and from smoke tests and from other things, that's slightly higher. If you have evidence that's actually coming from tests, and there are various forms of tests, that puts you higher and higher and closer. And again, the trick is to know when you've tested enough. You don't need to go through all of it. If you have a very cheap and low-risk idea, you're just reorganising the order of the settings or doing a redesign of the settings page, it's enough that the expert opinion, the designer, says this is a good feature, do it. You don't have to test it that thoroughly. You don't need to reach that level of confidence. But if the feature is a little bit risky, we might lose subscribers because of it or lose money. Even if it's very small and easy to launch, put it into an A/B experiment, absolutely. Reach the right level of confidence. If it's big, expensive and risky, absolutely test it in multiple rounds. And each of these rounds is called a step in my book; it's an experiment by another name. But I don't love the word "experiment" 'cause I like how statisticians think of experiments: an experiment is only a test where you also have a control element that tells you whether you're measuring noise or the real effect of your change. So in the chapter about steps, I give a wide gamut of things, from just evaluating things on paper, discussing with an expert, you know, interviewing customers and so on, moving to more expensive things like, you know, Wizard of Oz tests or smoke tests, fake door tests, and then to even more expensive things like an MVP and alpha and then beta, dogfood, labs. There's a bunch of these things explained, up to experiments, and then the release itself can be a step, and each one of those gives us more and more confidence. So we need to be good at picking where to start, in the cheap stuff, and then, if that shows supporting evidence, go to a slightly more expensive thing and then another, until we feel we're done and then switch to delivery.
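
The confidence meter itself is essentially a lookup from the strongest class of evidence you hold to a score out of 10. In the sketch below, only the 0.1 (opinions) and 0.5 (anecdotal data) values come from the conversation; the remaining numbers are placeholders, so treat the published scale on Itamar's site as the reference.

```python
# A sketch of the confidence meter as a mapping from evidence type to a 0-10 score.
# Only the 0.1 (opinions) and 0.5 (anecdotal evidence) values are quoted in the
# interview; the other numbers are illustrative placeholders.
CONFIDENCE_METER = {
    "opinions": 0.1,                         # even the opinions of senior people
    "anecdotal evidence": 0.5,               # e.g. the CEO spoke to a few customers
    "surveys and smoke tests": 1.0,          # placeholder
    "early test results (MVP/alpha)": 3.0,   # placeholder
    "A/B experiment results": 6.5,           # placeholder
    "launch results": 10.0,                  # placeholder
}


def confidence(evidence_gathered: list) -> float:
    """Confidence is driven by the strongest class of evidence gathered so far."""
    return max((CONFIDENCE_METER.get(e, 0.0) for e in evidence_gathered), default=0.0)


print(confidence(["opinions", "anecdotal evidence"]))  # 0.5 -> still far too low to commit on
```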

    - And I wonder if we couldn't put those steps onto a roadmap. I've definitely had conversations with some previous guests around roadmapping your discovery, so you can almost imagine capturing that as a way of showing: this is what we're working on to gain that confidence.

    - Yeah. Did you find a solution to do this? 'Cause I would love to use it.

    - I think it's as simple as choosing which items appear on your roadmap, right? It doesn't have to be the solution, it can be those different types of items. Be it a step of we are going to run an A/B test here or we're gonna run a concierge test here, we're gonna go all the way to a Wizard of Oz test, maybe within a theme or a swim lane perhaps. I'm thinking out loud.

    - I'm with you and I agree with the need. Here's the challenge. I think that the steps are very dynamic. You test an idea, okay, you know what the first step is. But then you learn something completely new that sends you back to the whiteboard. You need to think of a completely new next step, or maybe you pivot the idea, or maybe you dump this idea completely and you switch to another idea. That's part of discovery. So this whole part is very, very dynamic. It's really hard to commit to it and say we'll do that and then that and that and that. I do offer a project management tool called the GIST board, that's in the task layer of GIST, to manage this stuff. And it's a very dynamic thing that each product team should manage and should reflect to managers and to stakeholders: tell them, listen, here are the goals on the left, here are the ideas we're currently working on. And the goals are just the key results, just the outcomes, which are usually no more than four per team. Here are the few ideas we're working on right now and here are the next few steps. And it's really important, every time you complete a step and you have data, you analyse it, you produce evidence, and you make a decision, to communicate that as well, 'cause it's very, very important for them to understand how you're operating, so it's not a black box. And sometimes they will interject and say, you know, I think you may be being too negative about this result, or maybe this idea is not as strong as you think, but it makes it a very transparent process. Putting that on a roadmap: this might be an extension of the roadmap. This might be like, when we look at the roadmap about this goal, here's a link to where this team is working on it. Here's the GIST board for the onboarding team, here's the GIST board for the search team, etc.
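
To picture the GIST board Itamar describes, here is a minimal sketch of one team's board: a handful of outcome goals, the few ideas currently in play, and the next steps for each. The shape is inferred from the description above and every entry is hypothetical, not taken from his template.

```python
# A minimal sketch of a GIST board for one product team, inferred from the
# description above: the team's goals (key results), the few ideas currently
# being worked on, and the next steps for each idea. All entries are made up.
gist_board = {
    "goals": [
        "raise % of successful onboards from 40% to 55%",
        "cut median time-to-first-message to under 2 minutes",
    ],
    "ideas": {
        "new onboarding wizard": {
            "ice": {"impact": 8, "confidence": 0.5, "ease": 4},
            "next_steps": ["interview five recent sign-ups", "fake-door test of the wizard entry point"],
        },
        "progress checklist on the home screen": {
            "ice": {"impact": 5, "confidence": 1.0, "ease": 7},
            "next_steps": ["paper-prototype review with the designer"],
        },
    },
}

for idea, details in gist_board["ideas"].items():
    print(f"{idea}: next -> " + "; ".join(details["next_steps"]))
```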

    - It could be a double click through.

    - Exactly. What you may want to consider is how much time you want to devote to search, sorry, to research, because that's sometimes the first step. Once you have a new goal and you don't know enough, you're going to say, now we're going to spend three weeks on research, and you might put this as a block in your roadmap if you really want to show the work. Then there's another block, which is discovery, which is like: from the research we get a bunch of ideas, but now we're going to run through them. How long is discovery going to run? Again, something you might want to put on the roadmap, but it's really hard to predict. And then from there, there will be some ideas that will move into delivery. And those are really hard to predict in advance while you're still in research and discovery. But once you have one, by all means, put it on the roadmap and tell everyone that this is coming.

    - That hints at one of the things that I find to be a common challenge with roadmaps. There's a perception that it's this fixed-in-stone item, and I consider it, like the GIST board, to be very much a living, breathing, evolving document. I might publish plan A today, but I'm already working on plan B and it comes out tomorrow, because we're living, we're learning, and that's the nature of product and how we deliver value today. And so it may just be an expectation-setting thing.

    - No, I think that's a much, much better interpretation of how a roadmap should operate. You know as well as I do how much time people spend planning at the end of the year and every quarter to produce these rolling roadmaps, and how attached they are to them after they've made this investment. So I would much rather say: don't try to create the perfect roadmap. Try to create the perfect goals, as far as we know, and then be dynamic, just like you suggested. But the CFO might not like it because they want to do budgeting for the year, the CTO may not like it because they want to do headcount, and the sales team may not like it because they want to know and prepare for what's available. So there are these two forces pulling on the roadmap, from the product side and from the business/resource side.

    - Yeah, and that's a really interesting one. So the sales one is often an interesting conversation. The most recent approach that I've adopted is saying, "So will you guarantee you will win this particular exact customer on this exact date? If you can't guarantee that, why are you holding me to a different level?" I can help you move an outcome the same way as you are gonna move your sales and hit your sales target; and if I hit my outcome and you hit your sales target, we're both doing our job. But exactly how we get there, which customers, which features, they're gonna be variable, and we should be on a level playing field in that way.

    - I think you point to a very good observation, that there's this discrepancy. We operate with our business side on outcomes: marketing outcomes, sales outcomes. And they know they need to hit certain numbers by a certain time. Whether or not those are the ideal outcomes, or the process of creating them is perfect, that's a different discussion to have. Whether or not it motivates them, right? Yet another discussion. But with the product organisation, we switch to outputs. It's like: deliver on this set of things for us and we trust this will lead to the business outcomes that we care about. But "we did the validation already"; Marty Cagan kind of pointed to this. The delivery mode is like the business team already figured out what we need to do, we validated the idea, we don't need to test it, don't need to think too much, just deliver. So that's a very different way we're talking to these two halves of the organisation and setting goals, and the roadmap is right in the middle of it, and it's kind of the contract that the product organisation is signing, in a sense.

    - Well, a previous guest called it consensual reality. We're all agreeing that this is the reality in the future, even though we know it probably won't be.

    - Yeah, exactly. That's a better name than a contract maybe.

    - But what's interesting there is also, I have definitely observed a shift in my time in product for it to be more about the thing we build, whereas product was almost 75% on the business side when I started, and now it seems to be probably 75% on the tech side. But I see things like the recent Airbnb activities as a rebalancing of that, to kind of pull back so we're maybe 50-50 on that line. What's your thoughts there?

    - I had the same experience. When I was a young engineer in the mid '90s, we had no agile, we had no product managers, there were hardly any designers, there was no user research, or at least I didn't have access to it. So the role of the developer, the engineer, was really big. Like, we had to focus on the users as well. We needed to understand the business. I talked to salespeople sometimes. My manager, the engineering manager, was kind of a product manager in a sense. So the discussion, at least on the engineering side, was not just how do we build the highest quality, the best design, etc., but how do we achieve the goals, the business goals, the customer goals. We had to be aware of these things. We had a lot of context. And I think by kind of shifting to this agile world and introducing these new roles of UX designer and researcher and data analyst and PM, we kind of reduced the horizon, the scope, the responsibility of our engineering teams to just delivery, in a sense. I don't think that they love it necessarily, but they got used to it. And there are entire generations that are brought up in this agile mindset that all they need to do is burn through story points, move tickets to the done state, that's their job. As Marty Cagan pointed out, this is a huge misuse of your engineering resources. If all you use them for is delivery, you're wasting a lot of innovation power inside your organisation. So that's why I think a lot of companies are trying to move away from this model.

    - Interesting. So when I started in product, officially, it was towards the late end of the '90s, although I'd been doing a chunk of it as an engineer, like yourself. And we were very much more on the business side. But what I've observed over the last 15-plus years is there are so many more product managers as well. They're becoming more operational and more in the weeds. Like, what I used to do as a product manager is now what a head or a director of product would do, and there's a whole team of product managers working for them. And I think that's come in with the advent of agile, and I don't see it as necessarily a bad thing. But there's a more operational layer of product managers that are in the weeds a bit more, doing more experimentation, quite rightly so, to kind of make better decisions. But we used to own much more strategic conversations as product managers, and now it's at the director or head-of level that those conversations are happening. I think some of the education and stuff that's out there hasn't necessarily kept up with that change.

    - First off, if those extra product managers are doing experimentation, I'll be very pleased. In my experience, that's not what they're actually hired for. They're hired to write tickets and to create perfectly prioritised backlogs, basically to feed the agile machine, to make sure that it's at 100% capacity and just churning through the work. And that's, again, a huge misinterpretation of the role of product management. And I think, yes, especially in Europe, this came about because a lot of companies discovered, first and foremost, Scrum; they didn't have product managers before that. They had classic project management, waterfall, etc. Then they adopted Scrum. And then this concept of a product owner emerged and they needed to fulfil the role. In some places, they just repurposed their project managers to become product owners. In other places, they started hiring these new skills. But the concept of the more strategic product manager, who is thinking about the customers and is an expert on the market, the usage, the business and all, appeared just at a much higher level. It's like the senior person does all of this and these other tactical people are just about moving the delivery forward. If you presented this model to anyone in America, I think, and in a lot of companies, they would still be shocked by it. Back in the day, if anyone with the title product manager were presented with this, they would say, "No way, I quit. This is not what a product manager is supposed to do." But today, it's the reality.

    - Yeah, and I guess my product management journey did start working for a US-based company, so that probably influences that perspective. So okay, I'm gonna bring us back a little bit to roadmapping. It's been really interesting, but let's go for some high-level stuff. If you think about roadmapping as an overall thing, what do you consider best practice when we talk about roadmaps?

    - I'm going to be a little bit sly about this answer and say I don't think of roadmapping as a separate thing. I think of a much larger kind of shift in the way we plan and execute. So we talked about part of the shift. First, there needs to be a mission, there needs to be a strategy. We didn't talk a lot about strategy, but for me, strategy is about defining a theory of how we will achieve the mission, how we will grow, how we will capture new opportunities and what opportunities those are and in which markets, how we will create this moat or defend our core business, etc. And that's another hugely lacking item in the menu of a lot of companies today. If you ask what the strategy is, even among the executives, you get six different versions if you ask six different people. And without the mission, without a strategy, it's really hard to define these top key metrics and to create this objective zero, OKR zero as you call it, which creates this tremendous focus in the minds of people. And they really need to understand what it is that the company's trying to achieve. Do an experiment: go to any random company and ask people in the corridor, what's the mission? What's the goal? What are we trying to achieve this year? And they will struggle to answer you, because they get this barrage of objectives and key results and it doesn't connect. It's just that every division, every department needs to have its own and it's all then aggregated to the top. And that's the OKRs. If you're managing to do this, the roadmap part becomes a bit easier, because first off, you know what the three or four things you want to achieve this year are. And it's not satisfying customer A with this feature, it's about achieving something very significant that moves us in the direction of the strategy and the mission. And then the hard discussions about priorities are done. And then it's up to the teams, I think, to propose ideas, to say: based on the information you gave us and based on the OKRs we created, here's how we think we can move towards these goals on our level. These are team-level ideas. And then we have this discussion that we mentioned: how much of this do you want to expose in the roadmap? You definitely want to expose the ideas that are already validated, which means you need to have worked on them before. You might want to expose the big boxes, the research and the discovery part, and when they start and when they end. But that's as far as I would go. Sometimes there are bigger ideas, like ideas that span multiple teams or multiple companies. And for those, I would suggest don't start big. Google had a saying, "Think big but start small." So say you found a huge strategic opportunity: you validated it, you tested the market, you already put a kind of advance team on it to validate the opportunity, to size up the market, to say, "Yeah, there's a bunch of customers with a real need here." And then you started to test ideas. This initial testing should be small, with a core team that is very small and very concise, just trying to find ideas that show potential for product-market fit, not to find product-market fit, just to show the potential. And once you have one of those, you can start putting more resources on it and say, "All right, build me a version that is good enough to do an early adopter programme. Or build me a version that will get enough customers to sign up so we know that we have product-market fit." And that's their mission.
And if that is achieved, then you can start throwing more resources at it and building the thing, reaching product-market fit and scaling. So it's naturally kind of growing and spreading across the organisation, sometimes becoming a huge project. But that's the right way to do a big bet: not to start big and then trickle down a bunch of mini-projects that are actually connected to this massive project, so that then you need project management. That's a terrible way to start a big idea. So I'd much rather you do the opposite.

    - Love it. And interestingly enough, I make a differentiation between roadmapping and the roadmap artefact itself. So I consider that if it's taking someone more than 30 minutes, to a maximum of an hour, once a quarter to update this artefact, then they're not doing the work that is the roadmapping, which is fundamentally understanding the direction of travel of the team and the business, which is kind of what you talked about there. 'Cause that's just capturing the understanding. We've gotta be working together as a team, building that understanding, figuring out what the right direction is, and then we capture it and communicate it. But that's the small part of the task.

    - Yeah, yeah. I mean, I think that's why I have a challenge with the word "roadmap." Advanced people like you and Janna and Bruce are coming up with these much more modern interpretations of roadmaps that are much closer to my vision as well of how we need to plan and execute. But you're still using the word "roadmap" and so that brings up connotations from back in the day, you know. So I think some of the objections sometimes are not towards what you are actually coming up with, but towards the older versions of this thing.

    - It's a loaded term. Absolutely. And I've had that debate: should we change the word? And what I got to, though, is that it's just like MVP, which was a useful term, and then sales got hold of it and it became something else that was unuseful again. So I've given up on trying to come up with new terms and instead, I'm trying to embrace the common term and say let's do it better.

    - I just prefixed it with outcome and I call it an outcome roadmap, although there might be some outputs in it and just to indicate that it's something different. But I don't think that's solved the terminology problem necessarily.

    - Anti-patterns. If you had to choose one big anti-pattern around roadmapping, what would it be?

    - Just one. That's the challenge. So the whole premise of my book is balance your opinions with evidence. And that is because I sat through, I don't know how many, hours of roadmapping discussions or prioritisation discussions where the key inputs were the opinions of individuals, very smart, very experienced individuals, full respect to them, but still opinions; and opinions of groups; and opinions of the highest paid person in the room. And all of those were biased and we didn't know they were biased. We felt we were being completely objective and doing the right thing. But the problem we were trying to solve was so ambiguous, so complex, so lacking in information, and we tried to solve it under a time constraint. And back in the day, no one had read "Thinking, Fast and Slow" or any of those cognitive, sorry, cognitive psychology books. So we didn't know that we were actually using the Scotsman fallacy or narrow framing or any of these hundreds and hundreds of cognitive biases. But we were relying on these things to convince ourselves internally that we picked the right idea, that we know what's in the future. And you would think that if a bunch of smart people sit in a room, they will balance each other and they will come up with an objective result. But actually, research shows that groups sometimes come up with worse decisions than individuals, and worse ideas, because of groupthink, because of politics, because of all these other things, or because when the senior leader speaks, people don't want to contradict her necessarily. So it's all very human. And if that's the way you're constructing your roadmap today or doing the roadmapping process, power to you, but chances are you're wasting at least 50% of your engineering resources and probably much, much more. And you're also wasting a lot of time of very busy people in this process. It is very cumbersome. So try injecting evidence. Try to start asking the question: how much evidence do we have that this idea is going to be a game changer? "You seem very convinced, but what's the evidence? Oh, you spoke to three customers, and what did they say exactly? Was there a user researcher in the room? Or was it just kind of a casual conversation?" And then you can pull out my tool if you like, the confidence meter, and say, "Well, according to this strange Israeli guy with an accent, it's 0.5 out of 10."

    - Or if we look at a lot of them, 0.05 even 'cause I remember how low those numbers go.

    - And that's the way, I don't know how many companies are using the tool today. It's a very popular tool, for this particular reason: to say, "We need to test further. While you're very convinced in your opinions, the evidence is pretty low still. Look at this tool." So by the way, commercial plug: you can download all of the tools, the confidence meter, the GIST board, and a lot of the other information, from my website. So I'll leave you a link at the end.

    - Funnily enough, I've taken the confidence meter into a lot of meetings with clients and coaching conversations for that exact same reason, 'cause I found it, I think I first saw it, just before you were on stage at Mind the Product last year, and kind of thought, this just makes so much sense.

    - Thank you. Yeah, it's a very popular thing right now, but I see it as a part of a bigger framework and that's why I created the GIST model and the book that I just published.

    - Whose advice on roadmapping do you listen to?

    - You mean out of the experts, out of the people out there speaking?

    - Whoever. Could be experts, could be practitioners, could be people in other industries. I don't, yeah, wherever you listen.

    - That's a good question. I mean, I'm a big fan, it's no secret, of the writings of Marty Cagan. I was very influenced by the lean series of books. And I think if you actually read the SVPG books, the Marty Cagan books, and some of the lean books (Lean Startup, Lean Enterprise, Lean Analytics, Running Lean), you'll have all the advice you need to reframe your thinking about these things. And there are many other good books out there. You can read Teresa Torres's "Continuous Discovery Habits." And hopefully, you can read my book, which is also in line with those things. They all, I think, take a step back and say, it's not just a roadmap, it's this bigger thing. So I'm dodging the question, I know, but that's how I think about it.

    - You know what? That's the point, we wanna hear everyone's views, and Teresa and Marty are both previous guests, so people can go back and listen to their thoughts as well. Is there anything else about roadmapping I should have asked you that I haven't?

    - I think it was pretty exhaustive, so I think we're good on roadmaps as far as I'm concerned.

    - So Itamar, it's been wonderful having you here today. I'd just like to give you an opportunity to pitch yourself, pitch your services for how people can get in touch with you. Obviously, we'll put some links down below as well.

    - Right. So first off, I didn't say it yet, but it's a pleasure to speak to this community and I hope that you found some things valuable. If you are interested in learning more, visit my book site evidenceguided.com, just all one word, or my resources page, which is itamargilad.com/resources, and then you can download a lot of the things we mentioned. And from there you can find me. I'm very easy to find. So yeah, it's pretty clear.

    - It's been wonderful having you here today. Yeah, I think we'll sign it off there. Thanks, Itamar.

    - Thank you.

Phil Hornby

Co-host of Talking Roadmaps

Passionate product professional. Helping entrepreneurial product teams to be successful. Coach. Trainer. Facilitator.

https://www.linkedin.com/in/philhornby/