EP64: What early stage SaaS companies miss about product discovery

In Demand: How to Grow Your SaaS and Stay In Demand

May 15 2026 | 00:55:05


Hosted By

Asia Orangio and Kim Talarczyk

Show Notes

In this episode of In Demand, Asia and Kim dive into product management.

They cover problem space versus solution space and why so many founders and product teams build features based on assumptions, customer requests, or competitive pressure without doing the discovery work necessary to understand what users are actually trying to accomplish.

This episode is a masterclass on how to avoid building features nobody uses and how to create products that customers genuinely adopt, value, and pay for.

Got a question you’d like Asia to unpack on the podcast? Record a voicemail here.


Chapters

  • (00:01:00) - Product discovery and idea validation.
  • (00:03:15) - Why understanding the problem deeply creates better product decisions.
  • (00:06:05) - What product discovery actually means in practice.
  • (00:14:40) - Observational research and watching users interact with products.
  • (00:16:15) - User stories versus hypothetical behavior.
  • (00:19:50) - An example of the user stories you might look for on a SaaS reporting function.
  • (00:23:45) - How support tickets can help lead you to the areas you should start working on.
  • (00:29:00) - Castos example: uncovering integration problems through user stories.
  • (00:37:45) - Teresa Torres and the Continuous Discovery approach.
  • (00:44:30) - How discovery creates clearer product strategy and positioning.

Episode Transcript

[00:00:05] Speaker A: What's up, founders? And welcome to the In Demand podcast, where we talk all about how to troubleshoot growth for your PLG SaaS. I'm your co-host, Asia Orangio, the CEO and founder of DemandMaven. [00:00:15] Speaker B: And I'm Kim Talarczyk, client services manager at DemandMaven, where we help SaaS companies reach their toughest growth milestones. [00:00:22] Speaker A: All right, Kim, let's get into it. So this is an interesting topic because we've talked about it a lot across various episodes, but I don't think we've ever just broken down the fundamentals of this topic and dug a little deeper. And this might not be a long one, but it's top of mind for me because I am in the process now of transforming the product management process for a client, and it's so much easier said than done. But the process that we're trying to implement is product discovery and idea validation, which are two different concepts. I like to think about product management in a couple of ways, but the framework that I like the most is: there's problem space and there's solution space. This is an old concept, it's not like a new concept. And actually, let me see, I actually don't even know where it comes from. Do you, by any chance? It's not [00:01:29] Speaker B: Teresa Torres, is it? [00:01:31] Speaker A: No, actually, I don't think so. Just double checking here. Let's see. Ah, Newell and Simon, 1972. There's also Eric Ries and The Lean Startup, so he popularized the distinction. Dan Olsen's The Lean Product Playbook. Oh, that's a good one. It formally separates the market problem from the product solution. And then, yeah, there are a bunch of UX research frameworks that reference it, but it's one of the core foundational beliefs behind the idea of product management, which basically is: the problem space is understanding what problems exist within a particular market.
And because of my jobs-to-be-done training and your jobs-to-be-done training, you can kind of think about problems as: what are the jobs to be done? So what are the things that people are trying to accomplish? There's an underbelly of a problem sometimes. Sometimes it's more of a desire. Sometimes the problem framed a certain way doesn't sound like a problem. So for example, like, I want ice cream. Maybe the problem is, I have no ice cream. But of course there are other problems that sound more like business-y type problems, like, you know, my tasks are spread across disparate systems. Or maybe there's a problem like, I don't know what questions I should be asking when I pull reports, or, you know, blah, blah, blah. Those are the types of problems that exist in a problem space. And the more time that you spend in the problem space, the better. To a degree, of course; you want to get out of analysis paralysis mode and actually execute solutions. But the problem space is important because the more that we understand the context of a particular problem within a particular market, within a particular segment, the better chance we have at defining the solution space. So a million years ago now, I took a number of strategy courses and strategy certifications. And I'm trying to think of the big one that I did. I'm on my computer now. Let me just look it up. Playing to Win: How Strategy Really Works. Pretty sure it's this book. It outlines one of the core foundations of strategy and strategic thinking, which is really understanding where to play and how to win. And there's another foundational concept related to this that's kind of like problem space and solution space. But basically, the more that you really deeply understand a problem, the better you're going to be at presenting different solutions.
And then design thinking taught me that there are many different solutions that you could produce, but there are likely a smaller handful of ideas. Ideas and solutions are relatively infinite, but there's likely a handful of things that are actually going to, finger quotes, work. Working meaning it's adopted, it's applicable, it sticks, it's something that people find very highly valuable, and it has relatively lasting staying power. So those are the two core foundational pieces to me when it comes to my core philosophies behind product management and how I approach it. And I think what I'm observing in the market right now is how many teams skip right over problem space thinking and skip right into solution space thinking without really questioning anything about what they're building, shipping, producing, or even prioritizing. And I think this is really what I want to spend time talking about today. Yeah, I think we can actually get into the topic. [00:05:20] Speaker B: A good topic. [00:05:22] Speaker A: It's a tricky one too, because, you know, no one wants to sit and think, I'm not doing enough. And that's not what I'm saying. I think it's more like, especially if you're a founder or CEO listening, and you are also the head of product right now, the goal isn't to do more. I mean, for me it's actually to do less, because I'm lazy as hell. Like, I don't like working a ton, a ton, a ton. I don't like working 80-hour weeks. I want to be as efficient as possible. And to me the best way to do that is, measure twice and cut once is how I like to think about it. I'd rather measure twice and cut once than cut a million times and then be like, I didn't get what I wanted out of this. This wasn't the right cut at the end of the day. I think the way that we do that, though, is spending a bit more time in the problem space.
And there's a number of ways that this looks to me. The first is discovery. So we've talked about discovery before, and product discovery is ultimately the process of understanding and uncovering user stories that surface problems, problems that, depending on how you solution, your product could solve. And this is where innovation work comes into play. Because as I mentioned earlier, the ideas for what could be developed are relatively infinite. There's no shortage of ideas, but there's likely a handful that are truly in alignment with all parties involved. It's something that your product could realistically do. It would make sense to do it. It would maintain or improve product-market fit. It would be something that customers are willing to value and pay for, and it would also satisfy their needs. It's kind of like a win-win-win across all boards. There's likely only a handful of those ideas, but there are infinite things that you could do. So innovation kind of gets into this infinite space, which I think is interesting. So take IDEO, for example, and how they approach innovation. IDEO, for those who aren't familiar, is like the really cool, kitschy consulting firm that would design products within challenging contexts. The thing that I feel like they designed, like the IKEA cart or something. They designed a shopping cart that's really unique, but it's like the perfect fit within the context and the confines of where that shopping cart is, who it's for, and what that cart needs to carry. I feel like they designed something like that. IDEO is famous for having really interesting product design solutions, but it's based off of their methodology around design thinking, around problem space versus solution space, and around how to think about product strategy as a whole.
So when it comes to our SaaS products, I would say we don't need to necessarily be IDEO, unless you are trying to build a decacorn and raise a bajillion dollars. Yeah, like, you might need to do some big-thinking innovation. So when I think about AI, for example, AI has actually been around for a long time. It just only really became mainstream in the last two to three years, I feel like, where the average person could access an LLM. But LLMs have been around forever. But the way that you get to an LLM is by that big-thinking innovation of: what are the problems that we can solve, and what are the different solutions that we can use to solve those problems? And think really big. But in SaaS PLG world, we don't need to necessarily think that big all the time. Nine times out of ten, it's probably so much simpler than that. And I think that's the irony. I think it's so simple that we think we can just kind of skip the whole problem space discovery process. And discovery to me doesn't have to be hundreds of interviews. Traditionally speaking, the best way to do discovery, in my opinion, is 15-to-20-minute interviews. You get five to ten of those, and you pack the front half of your week with them. You knock all of those out Monday, and the next day is ideating around the solution space. So maybe the first day is conducting those interviews, really getting clear about the different user stories that you've identified, and identifying which problems are now most appealing to solve. And then from there we get into solution space, and from there it's, okay, what are the different ways that we can solve this problem? I'll give an example, but I'll pause here because I feel like you wrote something interesting down. [00:10:33] Speaker B: Yeah, I did. Because, so, these are examples where the SaaS product already exists.
Are the founders thinking, in this process, we think people might be having these problems, or we're hearing this? And so they're doing these interviews with certain things in mind, and they're doing these interviews with specific questions of, okay, navigate to that. Like, they're putting the user in a specific context. Or is this ever just sort of open-ended: hey, we need to make more money, what other things can we do to our product to solve people's problems? [00:11:17] Speaker A: I think it's a two-parter. So I think the first is, in most early, underdeveloped product management functions, where let's say the founder is doing it, and maybe the founder is not a super experienced product leader. Meaning, you know, you've had several product management roles, and blah, blah, blah. A lot of founders and CEOs are first-time CEOs. At least a lot of them don't have super deep, robust product backgrounds, which is understandable. I think what's actually happening is the discovery actually isn't happening at all. And a lot of it is based off of snippets of feedback from customers and taking those recommendations at face value: I want the report to have this filter. Okay, great, I can build that. Or, I want to be able to send text messages, not just emails. Okay, I think I can build that, and maybe upcharge or whatever. And I think a good portion of that is fine. That's safe. Some of those are going to be safe bets. And what's interesting is, with that one feature, there might be four platforms that are similar, in different markets, solving different problems, technically for different people, and text messaging might be something that all of those platforms add, and it's going to make sense for some of those and not at all for others. And so, you know, you can't point at one feature set and be like, yeah, that sucks, don't ever build that.
Because it's just so highly contextual and dependent. I think what's hard about that, though, is I think if you have your product management process really dialed in, you can tell for yourself if that's worth your time or not. And you can make better decisions around, is that going to support growth or retention or not. I think that's part one. I think part two is, if they are doing discovery, the discovery is more like, again, face-value interviews where they're asking the customer: what do you struggle with? What don't you like about the product? What do you hate, or whatever. And it's funny, because when I think about our process for research in the jobs-to-be-done world, we actually do ask those questions, because I like to hear, you know, are there challenges with the product? Because a lot of it makes the customer feel good to give their feedback. And it also tells us information about their experience, especially if they're asking for features that already exist. We learn a lot just from context, asking those questions. But I think a lot of teams, their approach to discovery is just kind of asking the customer what they want. And we've already had the episode about being friction-vocal versus friction-aware and product-aware. Customers aren't great product managers. They just can't tell you what to build. And it's just so rare that your context is going to give you customers that are super awesome at telling you what to build. It's just so rare. Maybe if you are building tech for developers or tech for product managers, like if you're Notion, for example, you probably eat up feedback left and right, because the people who are using your product are actual product thinkers. But it's just so rare to have that context. So I think it's two parts.
What it looks like when it's great is you're not just asking the customer what they want. What you're doing is one of two things. You're doing an observational study, meaning you are observing them accomplish a task and you're asking them questions along the way about what they're trying to do and why they're trying to do it. So we might consider this to be like a UX interview. And when we work on activation, that's one of the first things that we do: a UX interview. Just watching them sign up is enough to, one, make you want to scratch your eyes out, because you're like, oh my God, I can't believe people are struggling this hard with it. We're in the process now, actually, of doing pricing interviews where we're having users navigate. They're qualified prospects. They haven't looked at our product yet, but they're going to through the process of doing the interview. And they're using the pricing page to evaluate, to make decisions. And it is so cringe, because you're just like, man, I thought this design was awesome, but looking at people navigate this, it's trash and we need to completely redesign it. We're also using Claude to design quick, fast adjustments. So that's the cool part: there's almost no excuse now to not prototype and test and experiment, because it's just so fast now that you can build things and you can learn faster. But anyway, I digress. UX interviews are one of the fastest ways to see if something is garbage. And then you're like, damn. But the other way. So that's observational study. The other way is extracting what we call user stories. And that's the formal or more technical term. But basically what it is, is you're interviewing people based on actualized behavior and not fictitious or hypothetical behavior. And this is the key difference.
And this has been actually kind of hard for my brain sometimes to wrap around because in jobs world, hypotheticals are okay. Like we try to get some specifics and hypotheticals. So for example, hypothetical would be like, how often do you go to the grocery store? Oh, well, I probably go like usually it's two to three times a week. That's a very generalized hypothetical statement. Usually two to three times a week. That's very different than when was the last time you went to the grocery store? And when I think about, okay, when was the last time I went to the grocery store? I'm vaguely remembering the day. I want to say it was like last Wednesday, maybe last Thursday. And from here you've got to get the person to get very specific about what actually happened and not what they think or like hypothetically usually happens. So I can pinpoint the last time I went to the grocery store. I can say, okay, I think it was either like Wednesday or Thursday. It's okay if I don't remember exactly when. But then it's like, okay, but like, what did you do? What did I pick up? Literally trying to remember my grocery order. Oh, I picked up like a sub sandwich because I had to play tennis. And I think I picked up some sushi, like a poke bowl, I think, yeah. And then some milk and like, like non dairy milk. So like I think it was like almond milk or something. And that's funny. I can feel my brain wanting to be like. But usually when I go to the store I do this and it's like, no, no, we don't care about the. Usually we care about the okay, but like, what did you actually do? So then if I were interviewing myself, which I am literally right now, and be like, okay, did I drive? Did I, like, how did I get there and why was I going? How did I know that I needed to go to the store? When did I know? And I think for me, you know, like, I can, I can, I'm thinking about from like a researcher's perspective. 
Oh, well, you know, I remember, like, Tuesday, opening up the fridge and being like, all right, I gotta play tennis on Thursday. I'm gonna want to get food so I can fuel myself enough to play. But I remember thinking that on Tuesday. I also had to play Tuesday. That's also what triggered me thinking about the tennis. And, well, me being a researcher and having research experience, I know how I would answer those questions now. But, you know, when you're surfacing user stories, what you're really listening for is real behavior. You're trying to get away from generalized statements and get to, what was the actual thing that you did? And asking contextualized questions around that so you can understand: okay, well, what's the context? How did you know to do that? What triggered that for you? And then also, too, was there anything stressful or frustrating or challenging about that experience? And then unpacking that. And user stories. I know the grocery store is one example. That's a real-life example, but think of it in a product context. So if I'm thinking about the products that we're working on right now. And this actually goes back to the other part of your question that I didn't answer, which I'm realizing now. But say we kind of already know that we want to improve the reporting section of our product. And we know that that's kind of a pain point for people. It's not as advanced as it could be. The user stories that you might try to extract would be: when do people use reporting, and how do they know to do that, and when do they do that? And it also could be tied to other things. So, like, why are you using reporting? Is it for financial performance? Is it for understanding what classes to offer, or how many teachers you need to hire, or what students maybe you need to sell packages to, or whatever.
That's all really specific to a particular product and the reporting features within that specific product. But that's the type of questioning that I'm starting to think about when I think about that section of the platform. And then it goes further, and it goes deeper. So maybe there's a particular area that we've identified. Let's say, for example, we've learned over interviews and studies that our target audience, our ICP, is primarily using reporting for financial purposes. So it's really to help finance make financial decisions. That's the jobs-to-be-done, so to speak, of that feature. What does their financial process look like, both in and out of the platform? Are they using a spreadsheet? How are they tracking their finances? When was the last time that they did it, and what did that look like? Where were they? Were they in a cafe? Were they at home? Were they on the computer? Were they on the couch? And what tabs do they have open? Do they have Xero open? Do they have a spreadsheet open? QuickBooks? And what are they actually doing? Are they matching transactions? Are they calculating numbers? Is someone else doing that for them, and they're running with it, going over it together on a call? And then when they use the reporting in the product, how are they using that in conjunction with this? When do they open that up and look at it? And all of these are questions that might sound very like, okay, Asia, that's so tedious. Why would we need to know any of that? But the reality is that, again, if you don't understand the problem space, what solution are you realistically going to define for them that's going to actually fit their context? What's going to happen? And this is what happens across every team.
Most, I would say, underdeveloped product management functions, or more like early-stage product management functions, just skip over all of that, and they're like, I'm just gonna add this feature to reporting and we're just gonna see if people use it and if they like it. And this to me is no different than, I'm just gonna build a whole product and just see if people want it. When I hear that, that's what I hear. I don't think there's anything wrong with being exploratory if you have the insight to back up why you are exploring a particular thing and you're spending the resources on it. But if we don't have the insight to justify the resource spend, it's no different to me than, I'm just going to build a product and just see if people want it. It's the same thing, it's just in a feature, right? [00:23:06] Speaker B: Very rarely, I think, is it: if you build it, they will come. Rare, probably rare instances, right? [00:23:14] Speaker A: Product-wise, that should be the title of the episode. If you build it, they will not come. With the caveat that that's if you don't understand what the potential problems are. So by doing those user interviews, by capturing user stories, you will start noticing patterns, particularly if you're focused on a particular area. Which, to your question earlier, which I did not answer and will answer now: you also asked, do people already know what section or portion of the product to focus on, or can it be more open? And I think both are actually true. I think you absolutely can keep it more exploratory, especially if you're not sure or if you don't have a whole lot of data. But my guess is you actually do have data about where the problem areas in the product are. Like, if you were to analyze all of your support tickets. Everyone uses AI now for pretty much everything.
You could probably export all of your support tickets from Jira or wherever, and ask Claude or whatever: hey, can you tell me the top themes or top areas where most tickets seem to arise, or most requests? Maybe it's not a support ticket, but you're getting a ton of requests at the help email address or whatever. And AI can probably spit out something like, hey, here's where most people struggle, or where most people have improvement requests, or where they have more desire around these things. But all that to say, you can keep it more high-level and exploratory. Maybe you just don't know where to focus, and so you can do more high-level interviews. They might not be user story interviews, because user stories are very specific. You do kind of have to know where you're focused, to a degree. There's also another way that I like to do exploratory work, and it's part of the discovery process: if we already know what our top functional jobs are for the product. Functional jobs being, if I'm a bank, it would be save money, spend money, raise money. Raising money would be in the category of acquiring credit, like a credit card or a line of credit, or an investment. Then there's also maybe invest money, so maybe I want to also be able to grow my money in some kind of way. And for each of those jobs to be done, you map out tactically what all of the steps are to be able to save money, whether it's in your product context or not. So if you are in the banking or financial context, for that save-money job to be done, there are steps people have to take. They have to open up an account, they have to secure all of their details, they have to get a card, then they have to go to a place to actually spend the money. There's a process that they go through.
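The support-ticket triage described here — export the tickets, ask an LLM for the top themes — can be approximated with a crude keyword pass before (or alongside) handing the raw text to Claude or a similar model. A minimal sketch, where the tickets, theme names, and keyword buckets are all invented for illustration:

```python
from collections import Counter

def top_ticket_themes(tickets, keyword_buckets, n=3):
    """Count how many tickets mention each theme's keywords."""
    counts = Counter()
    for text in tickets:
        lowered = text.lower()
        for theme, keywords in keyword_buckets.items():
            # A ticket counts toward a theme at most once.
            if any(k in lowered for k in keywords):
                counts[theme] += 1
    return counts.most_common(n)

# Hypothetical tickets and themes, purely for illustration.
tickets = [
    "The YouTube integration disconnected again",
    "Report filters don't save",
    "Can't reconnect the WordPress integration",
    "Exported report is missing columns",
    "Integration keeps failing after publish",
]
buckets = {
    "integrations": ["integration", "disconnect", "reconnect"],
    "reporting": ["report", "filter", "column"],
}
print(top_ticket_themes(tickets, buckets))
# → [('integrations', 3), ('reporting', 2)]
```

An LLM would do this without hand-built keyword buckets, but the output shape is the same: a ranked list of problem areas telling you where discovery interviews should focus first.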
If you were to take all of your functional jobs, and if you have a big product, you probably have a lot of them, and map out all the steps that someone has to go through just to execute each thing. So, in the studio management software context, for example, if you're a yoga studio owner, you want your software to help you book classes and students, clients, appointments, whatever. You want something to help you schedule them, you want to be able to take payments, you want to be able to market to students, you want to be able to remind students so that they come to their classes. You want to be able to grow your studio at some point. You want to be able to maybe hire other teachers or professionals. There are a lot of jobs. You want to be able to make business decisions, or make financial decisions, or know which classes to cut and which ones to add. There are a ton of jobs for one platform like that. Host your website? Great. What are all the steps to do that? And whether your product fits into it or not, you can see visually: oh yeah, they're going to use our product here, here, here and here for this job, and they're going to use it here, here, here and here for that job. What are all of the cards left, or what are all the steps left? That could be product opportunity. It could also not be. But that's another way to discover, or even just visually see from an exploratory perspective, what your options are. And then of course the other way to do it is to be very specific and focused and targeted: okay, yeah, now we know reporting is a problem. Or, okay, we know our email marketing is a problem. Or, we know our automations are confusing. Or, yep, we know our website templates are overwhelming; that's something we should fix. So then we collect user stories and we discover what those problems are.
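The job-step mapping exercise above — list each functional job, list the steps a user takes to execute it, and see which steps your product covers — can be sketched as a tiny data exercise. The jobs, steps, and coverage set below are hypothetical, just to show the "steps left over" view:

```python
# Hypothetical job-step map for a studio-management product; the jobs,
# steps, and coverage flags are invented to illustrate the exercise.
jobs = {
    "book classes": ["list schedule", "pick a slot", "confirm booking", "send confirmation"],
    "take payments": ["create invoice", "collect card", "reconcile payout"],
}
covered = {  # steps the product currently handles
    "list schedule", "pick a slot", "confirm booking",
    "create invoice", "collect card",
}

def coverage_gaps(jobs, covered):
    """Return, per job, the steps the product does not yet handle."""
    return {job: [s for s in steps if s not in covered]
            for job, steps in jobs.items()}

print(coverage_gaps(jobs, covered))
# → {'book classes': ['send confirmation'], 'take payments': ['reconcile payout']}
```

Each leftover step is a candidate product opportunity, not a mandate — exactly the "could be, could also not be" framing from the episode.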
We don't assume. And I think the assumptions are where a lot of teams and products get into trouble. I think it's also what creates mediocrity, in my opinion, in what we end up shipping. And people might not ever say this feature sucked, but I think they show it when they don't use it. So you ship something and 10% of your customer base is using it. There are some products where that would be a win, but if you're small, like, if you're less than 5 or 10 million, I would be concerned. I would be like, 10% of people? That's crazy. I want to see at least 50 or 40. I would even take 30 or 40. But 10, 10 is insane. Ten percent. You shipped a new feature and 10% of people use it. We didn't do enough discovery or validation. [00:29:01] Speaker B: So what does that user story look like? [00:29:05] Speaker A: The user story? I'll give you a more specific example. So I'm trying to think of a product that I bought recently, like an actual SaaS. Actually, I could use Castos as an example, because, I mean, I didn't buy it recently. I've been a customer for a long time. But I remember going through the process of buying Castos and actually using it. One of Castos's claims to fame, and I'm sure there probably are other platforms that do this now, but at the time, I remember it being like, that's awesome. When you publish a new podcast on Castos, because of the integration with WordPress, it could automatically publish the page for you on your actual website. And I remember being like, that's pretty cool. I actually never used it for that function, but I just remember being like, oh, I like that a lot. I like that I could do that if I wanted to. The other thing I liked a lot about it was it would integrate with YouTube and would automatically publish my episode to YouTube without me really having to do much. It's automatically synced.
So it'll distribute across all the different podcast platforms. Great. But it'll also distribute across YouTube, which, at the time, I was really excited about, YouTube and having a YouTube channel. And I remember being like, oh, that's really cool. We should totally do that. That sounds awesome. One of the problem areas for me was integrations. It would disconnect a lot. And I was also having issues with the file type. It depended on the video and the audio file. I think it was, if you gave it an MP3 and not an M4A or something, it wouldn't automatically publish to YouTube. And I remember that being so annoying. So if I were the researcher researching this problem, because, I don't know, maybe Castos can see that there's a percentage of failed integration publishes on YouTube. We should investigate that. That seems like an annoying thing; let's just go and see. And if they were interviewing me, certainly I could just tell them, hey, the integration seems to break sometimes. But maybe they're like, no, no, no, no. We're not just going to fix the integration. We need to rework our whole integration strategy in the platform. This is super speculation; I'm just giving this as an example. Castos, for context, is a podcasting platform, in case I didn't mention that. But let's just say that Castos was like, we're gonna totally redesign how we think about integrations, into something, I don't know, really cool and sparkly. So one way to approach it would be, the researcher is like, hey, what sucks about integrations? And I'd just be like, it keeps disconnecting. And that's all I'd really say. That's one way. So the researcher is like, okay, we should just fix that.
Yeah, like, you could totally do that. That's an acceptable way to surface problems. The problem in this world is extremely focused: integration broken, fix it. And that is a way to approach discovery. A different way, so, like, let's say Castos again had that bigger vision of, like, let's overhaul how we think about integrations. The story might sound closer to: when was the last time you distributed the podcast on another channel? And I might say, well, I don't do it manually. Everything I do is through integrations because I'm lazy and also very forgetful. The ADHD kicks in and I just don't remember. And that might be very interesting to the researcher. They might be like, okay, tell me more about that. What does that mean? And I might give this entire story about how most of my systems are connected. You know, I use Zapier in these scenarios. Or 'za-PEER.' I think they say 'za-PEER.' And I don't think that's better. I think 'ZAP-ee-er' is better. I agree. [00:33:00] Speaker B: I just call it 'ZAP-ee-er' just because. [00:33:02] Speaker A: I like it better. 'Za-PEER' sounds weird. It sounds like rapier, and I don't like that. Anyway, I think it's 'za-PEER,' but it's not better. It should be 'ZAP-ee-er.' Anyway, so, like, I use. You call it. [00:33:15] Speaker B: You say zap. [00:33:15] Speaker A: You literally call it a zap. Yeah, exactly. Exactly. This is the brand hill I will die on. [00:33:21] Speaker B: Someone that I went to college with and graduated with is the current CMO of Zapier. [00:33:28] Speaker A: Wow, fancy. [00:33:29] Speaker B: He's not the CMO. He might be. He's a big deal. Maybe he's, like, the head of or the VP of. [00:33:37] Speaker A: Yeah, maybe VP or head. Yeah, that's amazing. Wow, small world. Anyways, maybe the interviewer also prompts me to get really specific about, okay, but what about Castos specifically, and, like, your podcast episodes?
Like, when are you distributing those? I'm like, technically, Castos does it all for me. It distributes to Spotify, it distributes to blah, blah, blah. And I probably would talk about YouTube. But I think what makes a really good user story is getting really specific on, like, well, when did I do it, and what did I do, and why did I do it? So in my case, that would be like, okay, I remember last Tuesday, a podcast episode was supposed to be out. I noticed that it didn't go out. I open up YouTube, I check it, and I notice, oh, not only did it not publish, but there's like a phantom thing there. Sometimes that would happen. It wouldn't publish to the public, but there would be, like, an unlisted phantom object in YouTube, and it's the episode, but it's not the content. Like, the episode stuff is there, but the audio file isn't. And they're like, oh, that's really interesting. How did you know that that happened? Oh, well, I went to the back end of YouTube and I checked that, and I checked that. And it's like, okay, where were you? Did you do all this, like, in one sitting? And I'm like, oh, yeah, honestly, I was on the couch in front of my TV and I just noticed it on my phone, and I checked my phone, and then I went to my desktop. Ah, okay, that's interesting. Those are things that are interesting from a context perspective because it's like, oh, you know, users might notice things from their mobile devices if something didn't go well. That's interesting context. Now, it's funny, like, my little product brain is whirring now because I'm like, oh, it'd be really cool if, every time a podcast episode published successfully, there was almost like a little dashboard: YouTube check, Spotify check, Apple check, for every single episode published. Now, this is solution space.
I've heard problem space, and I've heard a few problems. My brain jumps, of course, to solution space, and I'm like, oh, that's an interesting idea. I'm just gonna write that down, but I'm not gonna hold on to it tightly because I don't know yet. You talk to 20 of me's and you might find that different people have different experiences with integrations and also different experiences with the channels that I've mentioned, like YouTube, Spotify, et cetera. And, you know, I'm just one person, but you talk to 20 of me or 10 of me, and you might discover there are very key struggle moments or problems across these experiences and across these, like, user stories, where jumping to solution space could be really interesting if you did XYZ. I think the problem part, though, there are a number of problems, but the first is we never really collect user stories. We kind of just ask customers, like, what do you want to fix? Well, the integration's broken. Fix it. Okay. It's like, okay, cool, done, fixed. Or it's, you know, user error. Like, oh, Asia, you uploaded the wrong file. And I'm like, oh, it doesn't take that type of file. That's kind of dumb. All right, that's fine. We'll just do M4As or whatever. I can't remember what it is. I think they might have actually fixed that, though. Probably because people were like, that's weird that you can't just take a different file. But it could be a number of things, though. And I think the second part of the issue, you know, one, we're not collecting user stories, really. Not real, true user stories. I think the second part is we're not collecting enough of them. So we talk to two people or three people. Hey, what's broken about integrations? This, this, and this. Okay, we'll fix that. And it becomes so much more transactional, and also you're really putting the effort on the customer to kind of tell you, again, what to build.
And I think customers, if something in particular is broken and there's a bug or there's some type of issue, that's easy, I think, to point out and to identify. It's much harder to surface problem stories, because in order for you to do that, you've got to get really hyper-focused and specific on a particular thing, and you have to be extremely sensitive to what's general and hypothetical versus what's actual and specific. The best book, the one I recommend left and right to everybody (I've been recommending it a lot lately, and I feel like Teresa Torres and I should be friends), is Continuous Discovery Habits. It's quite literally my favorite resource for digging deep into what actual user stories that surface problem spaces sound and look like. And I'm actually taking one of her courses right now, and it's about, like, how to do better user story interviews. And it's funny, I actually tested a little bit of this yesterday in my interview. Was it the one that you attended? Yes, I think it was. So I noticed. You noticed? Okay, yeah. I didn't realize that that's what it was. I was, like, executing a little bit of my training, my new training. But no, it really does hit different. So, for context, we're doing pricing interviews for a client, and it was actually inspired by the CEO. In one of the interviews that we did, he was really curious about the budgeting process and, like, the financial process, because it's correlated to reporting in this particular product. And I was like, I like that line of questioning, because, you know, we can kind of low-key surface user stories, but also this would actually help product. And I was really motivated to spend five to 10 minutes just asking a little bit about that. And, like, user stories don't take forever. That's the thing. They don't take a long time. You just have to be diligent about securing them.
But anyway, so I remember I spent like five to ten minutes just digging into her process, the candidate's process, around budgeting and reporting and finances, and specifically, like, decision-making around what's in budget versus not in budget. So this is really more from, like, the sales perspective, because one of the things that's a little bit of a black box is, like, when our prospects buy our software, what's the decision-making process they're going through? Because they're not, like, super biz-savvy folks, which means they have a process, we just don't know it. And, like, for the CEO of a company, you know, we have all of our spreadsheets and all of our, like, calculations. But, like, this type of buyer, they're not very sophisticated buyers, and they're not coming from, like, business degrees or marketing degrees. So they're having to educate themselves in a particular way and make decisions, and it's not going to be as sophisticated as us. But, like, it's important that we understand it, because that's more of, like, a positioning and sales and marketing thing. And then, on the flip side, the reporting ties back into that budgeting story in the product. So if they ever do become a customer, or really a customer of any product, not just ours, they're using that reporting to inform their budgets. And I was digging deep, user-story-wise, into, like, okay, but how? So I was asking questions like, okay, when was the last time you did it? Why did you do it? How did you know that you needed to do that? And customers and prospects, people are going to be like, oh, well, generally I... And I'm like, no, no, no, I want to know exactly what you did. What specifically did you do? [00:40:52] Speaker B: What was really interesting about that, too, was that the answers were actually so much different than I expected. If you hadn't dug deep, I'd be like, oh, right, she has a budget.
But then when you dove deep, it's like, uh, she has all these sort of different things she's pulling up. And the budget actually really isn't a budget. Like, me and you might say it's a budget. It's sort of a number she comes up with every week, and then she does it weekly. She's sitting down looking at things weekly. So you can't really assume. [00:41:22] Speaker A: No. [00:41:23] Speaker B: What these users are doing. [00:41:24] Speaker A: We learned that. Yeah, that's such a good point. Like, we learned that budget, to me and you and our team, probably means something very different than what it means to her. So to her, in her context, it was more like, really, she's cash flow managing, she's not really budgeting. [00:41:42] Speaker B: Right. [00:41:42] Speaker A: And that's different. So really it's, I'm reacting to my cash flow, not, I'm setting a budget. So, like, you know, to me, setting a budget means I'm going to look at my average amount of expenses, and I'm going to categorize transactions in a particular way, and I'm going to set a certain amount for each of those categories that I'm not going to exceed if I can help it. And when I track and manage things, what I'm really doing is allocating those transactions to those buckets. So I think about, like, YNAB, You Need a Budget. When I hear budget, that's what I hear. But it was so fascinating, because that's not at all what she was doing. Like, she had a spreadsheet, and she was really tracking her cash flow. So she's managing her cash flow, and basically she's saying, okay, I have this much cash flow, or I have this much profit, I am now going to use that profit to spend it on this. And that's what I interpreted. I'm curious. We actually haven't talked about that interview yet, but that's how I interpreted what she was doing. And I was like, oh, she's not really actually budgeting, she's doing this. And that was fascinating.
And we could have dug deeper. But these user stories, 10 to 20 minutes. You don't need an hour, you just need, like, 10 or 20 minutes, and you need, like, 10 of those. And I bet you, if we had talked to 10 more people and just focused on user stories, we would have uncovered more patterns. And also, too, we would have been able to tie it back to, like, the budgeting and reporting conversation. Because turns out, you know, she needs reporting for a different reason than what we assumed. And also, there are, like, three to four other reports that she probably needs to support her process. And also, it's possible that this product that I'm talking about does budgeting and financial stuff. Like, actual. Like, yeah, you can use QuickBooks, your accountant can use QuickBooks, and, you know, we can integrate with QuickBooks and pull in everything. But wouldn't it be awesome if you could just log into this one platform and it's already synced up to QuickBooks? Your accountant's going to do whatever, but you could just see your cash flow and your projections, and, based off that, you could really do both. You could do both cash flow management and also, like, budgeting, and it could help you make better decisions. And if you're running at a loss, what if, like, the platform suggested things to do to fix your loss? Like, that's solution space. All about solution space. But it's rooted in, you know, that one user story that I did. Imagine if I did 20 of them, or if I had done 20 of them. Now that solution space starts to prioritize. Like, it gets clearer. A little bit of, like, okay, there are a million things we could do, but actually, these three to four would be gangbusters. You know, that would be awesome. [00:44:35] Speaker B: And then that's also probably the beginning of, you know, obviously, once you build that, then it's product marketing education, because, like, what you might call it is going to be different than what the user might call it.
Now you know what language they're speaking, because you are having these conversations. [00:44:57] Speaker A: So they call it budget. They're using the word budgeting. But it'd be debatable if we'd use that same language, or if we'd be like, actually, what you're looking for is cash flow management, and we can do that for you or whatever. Like, your accounting platform can do it. That's fine. We can pull in that data so you can at least see it in one place. You don't have to have three tabs open, where it's like your spreadsheet, QuickBooks, and this. You could just have this, and, like, leave QuickBooks for your accountant, because that's what they're going to want to see anyway, or whatever. Like, there are so many different things it could be. But again, that's one user story, and those are, like, four or five ideas from that one user story. But what's important is that you get more than one user story. And I think there's that, and then there's also just, like, collecting the user story to begin with. I think most teams aren't doing that. I'm gonna say most, you know, most early-stage teams, especially in scenarios where, like, there's no real product manager, but there is a head of product, and usually it's the founder. But the way that we can mature that practice is by incorporating these processes. And it doesn't have to be weeks, it could happen in literally hours. I think slowness is a choice. Slowness is a choice to me. I think you can be slow by choice. You can also be fast by choice. And I feel like it depends on what's important to you and, like, what are the values in the quarter, in the year, in the company. But it doesn't have to be slow. It can literally be in hours. We've conducted pricing interviews in hours. Like, from zero to five interviews booked takes a day, if that. Yeah, it doesn't have to be slow.
But anyway, so that's really, yeah, what I kind of wanted to unpack a little bit more today, especially since it's so top of mind, because, again, I'm literally implementing these processes in some of my different client organizations right now. And to me, there's a topic of frequency and length. Again, I think, like, 5 to 10 user story interviews, 10 to 20 minutes long, TBD on if you have to incentivize or not. I would say for these, I wouldn't, because they're so short, and ultimately it is in the service of building something that's going to be valuable to them. So I feel like there is a trade-off there that I think is valuable. But all that to say, it doesn't have to be every week. I do think, like, if you're using, excuse me, if you're using, like, the sprint methodology, part of the sprint should include a week of discovery, or a few days at least of discovery, within your core target areas. The way that you strategically prioritize, like, your core target, like, product focus areas should also be based on some amount of data. And chances are, if you're listening to this and you're at this point of the podcast episode, you probably already kind of know what your top three to four areas of opportunity are in your product. And it's very possible that, like, you have a lot of really great ideas for what to do, but they need to be validated using discovery and using the collection of user stories. So if you already have the idea, for example, of building AI features, let's spend a week talking to 10 customers, collecting their user stories around the specific business problems that you're hoping to solve using AI. So don't pitch them the AI solution. Again, you're understanding problems. So if you're kind of like, I think I want to launch AI features related to reporting, okay, great, don't pitch them AI features.
Instead, talk to them about how they're currently using reporting and what challenges exist there. Because the more you understand the problem space, the more you'll be able to be like, oh, the AI feature should do this, and it should be here, and it should look like this, and it should accomplish these things. If you do it the opposite way, what's going to happen is you're going to build the feature. You're going to be like, I built this AI reporting feature. I made this. Do you want it? And then they're going to be like, no. [00:49:20] Speaker B: Or they're going to be like, okay, cool, sure. But then it won't apply to everything they're doing. Or, yeah, it'll just be something that they're like, oh, you just added something for me. Oh, okay, I guess. [00:49:31] Speaker A: You're honestly exactly right. It's funny. That's actually my bias showing. I'm assuming customers would tell you no, but we've seen it a million times before that they actually don't. They actually are like, that's a great idea. And then they never use it. And then you think, but clearly it wasn't that good of an idea. [00:49:48] Speaker B: Pat myself on the back. This is. [00:49:50] Speaker A: Yeah, I'm so smart. I'm so awesome. And then, like, nobody uses it. Yeah, no, you're absolutely right. They actually wouldn't even do that. They would be like, wow, thank you so much. This is so great. And then never use it again. Extremely common, actually. Yeah, but that's how we avoid that. And that's how we avoid the 10% adoption rate. Like, let's do some user stories and let's do some discovery before we kick off a sprint. Or even, like, if we are going to kick off a sprint, before we predetermine what the sprint is, maybe it's, like, a week of discovery, just to kind of make sure that, like, okay. Or maybe, actually, it's okay. I take that back.
Maybe, actually, you know, when you kick off your sprint, because sprint methodology is still pretty common, I feel like, in product management teams. But when you kick off your sprint, maybe you already know your core areas of focus, but maybe you don't know exactly what the solutions are until you do the user discovery, and that's what informs your solution space. And what's kind of cool about that methodology, too, is you might already know your core areas of focus, but you might actually say, we could put energy here, but based off of these user stories, it's kind of maybe telling us to put energy over here. So you might actually even do a live reprioritization of something, which I love, because one of my best friends, she made a joke that, like, I will edit myself live. She's like, oh, she edits herself. Wow. And I'm like, yeah. I don't know, it's important to me to be able to pivot if I'm like, that's the wrong path, we're not going there. And I do feel like sometimes we commit to sprints that are like, should we build this? When this over here is, like, clearly, obviously in pain and, like, sucking and having a hard time. Like, this thing over here is dying. I don't know, have you seen that meme of the kid in the pool? There are, like, two kids at the pool, and the mom is, like, with the little girl, and she's like, oh my gosh, yay. And then the son is in the corner of the pool, like, drowning. Like, that's a lot of our teams. This award-winning, growth-creating part of the product is getting no love, but the new shiny thing, probably AI, is getting all the attention. And, like, meanwhile, all of your customers are like, I just need the report to do this. Or, you know, whatever. I'm using reporting as an example, but it could really be anything in your product.
And I think that's part of why maybe founders don't have as much confidence when they just listen to customers on what to build: because they're not doing the user stories, and they're not doing the discovery part. Which, if they did that part, they'd uncover, I think, a deeper pattern, which is more applicable to more people, and then they can go to solution space. And then we get into, we don't have time today to talk about, like, prototyping and testing, but there's a number of things that we can do there if we're really uncertain about something, before we commit a full sprint or, like, dev cycle to building a thing that people aren't going to fundamentally use, which sucks a lot. And I've been there. Not me, I'm not a developer. Not me personally, I haven't built something that people don't use, necessarily. But I know what it's like to be on the growth and marketing side, where the product team might prioritize something that doesn't net the impact that we were hoping to see. But it's because we just didn't do enough discovery. And it's possible that maybe we were close to building the right thing, and if we had just, like, eased it over into this area over here, or adjusted the way that people interacted with it or whatever, it would be more adopted. Because I think the other thing, too, is you don't want to run the risk of launching something that is disappointing, and then people never use it again, even after you've made improvements. That's the other risk. It's like, you launched something underbaked, and people didn't use it, and they were disappointed. They were like, oh, this is going to help me, and then they just ignored it, even after you fixed it or whatever. That's the other risk, too. So anyway, I digress. But cool. Okay, well, I took an hour anyway. I was like, this is gonna be short. No, lies. I lied. I'm sorry, everybody, but no.
Problem space and solution space, discovery interviews, user stories: do those. Please, please, please. And it doesn't have to be a million, you just need, like, 10, 20 if you're feeling frisky, and they don't have to be long. Like, 15 to 20 minutes is all you really need. And ideally, you do have them relatively focused on a product area. If you have support tickets, you probably already have data on what areas to focus on. Pump that shit through LLMs and Claude and have it tell you where to put your energy, high level, and really validate and test assumptions related to what customers ask for. Because some of it is obvious and very intuitive-based. Some of it's, like, feature parity shit that you should just have, it's fair. But some of it, there's a deeper story, and, like, we got to figure that part out. So that's my quick recap. Thank you all for listening, and thank you, Kim. [00:55:02] Speaker B: Thanks everyone.
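(Editor's sketch of the support-ticket triage Asia describes: tally tickets by product area to see where to focus discovery. The product areas, keywords, and sample tickets below are invented for illustration, and in practice you'd hand the exported tickets to an LLM rather than keyword-match, but the tag-then-tally shape is the same.)

```python
from collections import Counter

# Hypothetical mapping of product areas to trigger words. In a real pass an
# LLM would do this bucketing; simple keyword matching stands in for it here.
AREA_KEYWORDS = {
    "integrations": ["integration", "disconnect", "sync", "youtube", "zapier"],
    "reporting": ["report", "dashboard", "export", "metric"],
    "billing": ["invoice", "charge", "refund", "plan"],
}

def tag_ticket(text):
    """Return every product area whose keywords appear in the ticket text."""
    lowered = text.lower()
    return [area for area, words in AREA_KEYWORDS.items()
            if any(w in lowered for w in words)]

def top_focus_areas(tickets, n=3):
    """Tally tagged tickets and return the n most-reported product areas."""
    counts = Counter(area for t in tickets for area in tag_ticket(t))
    return counts.most_common(n)

# Made-up tickets for illustration only.
tickets = [
    "The YouTube integration keeps disconnecting after publish",
    "My episode didn't sync to Spotify again",
    "Can I export this report as CSV?",
    "Integration broke, phantom unlisted video on YouTube",
]
print(top_focus_areas(tickets))
```

The point of the tally isn't the answer itself; it's picking the one or two areas where you then go collect 10 to 20 real user stories.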
