Conversation with Ben Newell & Tom Randle

Conversation on customer feedback with Ben Newell, VP Product, and Tom Randle, VP Operations and PM, at Geckoboard


Key Points

  • How to identify which pieces of feedback are relevant
  • How to build a Jobs to be Done (JTBD) understanding for an existing product
  • How to define a clear strategy out of JTBD
  • How to combine quantitative and qualitative data
  • How to prioritize identified opportunities
  • User testing and feature flagging when building and launching new features
  • Using metrics tied to the company’s strategy to measure success
  • Tools to gather feedback from customers
  • How to plan and organize customer feedback sessions and their results

Transcript

Daniel:

Can you tell me a bit about yourselves and your background in product management?

Ben:

Yeah.

Tom:

Yeah sure. I’ll start. I have a weird background. I did a degree in mechanical engineering. I decided that I didn’t really want to be an engineer, and that the kind of places that were available to work weren’t that exciting to me in the UK. When I finished my degree, I was like, I’ve always been into computing, computers, design. I got a job as a UX designer at a software company in Cambridge, as a graduate with no prior experience at all. Within a couple of weeks of joining, I was the sole designer on the project. I had a lot of freedom and responsibility very early on. The products were quite technical, so I ended up, from the off, doing a lot of feature spec’ing that sometimes would be done by a product manager.

I was there for a couple of years as a designer. On one of the big projects I worked on, there was no product manager. The previous one got fired and there was no one there. I ended up doing a lot more of that. At that stage, I began to realize that ultimately, I’d rather be a product manager than a designer. It took a few years to make the full switch. Sometimes I was working on smaller projects, smaller teams, where I was doing probably what some people would call product management anyway, before making the full switch about two and a half years ago now.

Daniel:

Awesome. How about you, Ben?

Ben:

Similarly-

Daniel:

Also mechanical engineering?

Ben:

No. Never going to do that again. My background, I studied theoretical physics at uni, had a joint Maths/Physics degree, because they are the coolest subjects that all the cool people are into. Then, I was finishing that and I had no idea what I was going to do. A lot of jobs were like, “Go work in a bank.” I was like, “That sounds rubbish.” I didn’t have a clue. I ended up going into teaching. I was a Maths teacher for a year. That’s all I will say.

Then, I got a job at a London software company, kind of as a grad, 'cause I was only one year out anyway, to rotate through product, QA and development; that was the idea. I started in product, didn’t really know what it was when I started, had no idea what was going on. I ended up staying for longer than I was meant to anyway and decided, actually, I’m not going to do the rotation, I’ll just stay in that. I’ve done it ever since. I moved from that to here about two years ago, where Tom and I have shaped the direction of product, I suppose.

Daniel:

That’s awesome. You’re both product managers at Geckoboard. This topic of customer feedback comes up everywhere. “You should gather customer feedback,” “you should use customer feedback,” but sometimes it’s difficult to know what to do with it. What is customer feedback to you, and how relevant is it to your work as PMs at Geckoboard?

Ben:

Massively.

Tom:

Yeah, it’s extremely important. We talk to customers one way or another, every day. It impacts everything we do.

Daniel:

Right. What kinds of feedback do you get? Let’s start maybe from, you have a running product and you have a lot of quantitative and qualitative data that’s coming your way. Let’s start maybe with feature requests. Maybe someone comes to you and says, “I would like to have this feature”, or, “I have this idea”, or something like that. What happens? What’s your process to handle that kind of thing?

Tom:

It’s quite easy for us, because we have a very clear strategy at the moment and project themes we are working on. While we get this continual stream of feedback, it’s very easy to see whether something is relevant to what we’re working on right now. If it is, then we sometimes interleave it immediately: if we think it coincides with what we think we should be doing anyway and positively reinforces that, or if it’s something we completely missed and forgot about and it’s a no-brainer, then that’s really easy. The other stuff, the stuff that we’re not planning on working on because it doesn’t match that broader strategy: we’re aware of it, it’s in the back of our minds, but we don’t act upon it at all.

Daniel:

Right. You have a very clear strategy at the moment. What does that look like? How did you get there? What makes you have a very clear picture of where you’re going and what your target is?

Ben:

We spent quite a lot of time, probably about 18 months ago now, doing some very intense, very customer-focused research. We had a lot of data, like customer feedback coming in, internally, externally, but not really a very clear sense of being able to say, “Yes, this is important and why”, and, “No, this isn’t and why”, because the features coming in were sensible features. Someone could say, “A dashboard might as well have this feature. Why is it that yours doesn’t have this feature?” We had a sense that, obviously, just building all these requests wasn’t going to lead to our success. We did a lot of research, we spoke to a lot of customers: new customers, existing customers, happy ones, ones who had cancelled and left us, to understand the whole spectrum of people and what they’re gaining out of the product.

We use a methodology called “Jobs to be Done”, which is, you’re familiar with it, a way of thinking about why people buy the things that they do. Ultimately, people just want to make certain types of progress in their lives; your product or service is just a way in which they do that. If you understand that job, it tells you how you can improve the product to be more suitable for it. With that research in mind, we managed to say, “Look, we have these jobs. These are all the jobs that people hire our product to do. Therefore, our product is deficient in these ways, in these other things.”

Tom:

Identifying those deficiencies is a really important step: here’s the job people are trying to do, where are we missing on that, where are our problems. At that stage, we drew on all the research, the opportunities and challenges we have, looking at all that data and really trying to understand what the hell is going on.

Ben:

For us, it’s about the core as well. A lot of feature requests tend to say, “Can you add X? Can you add this?” What we’re doing is more, actually going, “Hang on. There’s this core that a lot of people get a lot of value from.” Yet that core is still deficient in many ways. That’s where we put the biggest effort. That’s where we impact a lot of people. Some of them are not going to be telling you that they need this stuff to happen. You do the research so that you understand that and you say-

Daniel:

When you say that you find out about this core, that’s the core of the things that people are hiring your product to do. Did you figure your strategy back from that, or do you say, “These are the jobs that people are hiring the product for, but these are not the jobs that we would like to be solving”? It’s a chicken-and-egg situation here. You already had a product. You already had customers. There is already a part of the equation solved there, but how do you approach it?

Tom:

When we did our Jobs to be Done work, we identified some jobs that we do that we don’t want to do, or that we’re not very good at. That really helped us, because we were immediately able to cut out a whole lot of requests and things we could have done, because we identified those jobs as being things we’d only ever be kind of average or run-of-the-mill at, and there would always be competitors or products that are better at that. That helped us carve off a big swathe of it.

Ben:

With almost any product, except for the incredibly constrained ones, people will find a way to use it for all sorts of different things. This goes back to that problem I mentioned at the start, which is, lots of people asked for lots of different things that would help them out. If it turns out that those features support jobs that actually the rest of your product isn’t very good at, and probably isn’t in the cards, you can basically lay out the jobs and you can say, “Look. Here’s what they all are.”

You can be as honest as you can about your strengths and weaknesses as a product and a company against them and say, “If we were to focus in this way and in this area, we think that we can make these ones better. That’s going to mean that we’re not going to get any better at three, four and five over here.” Let’s say that. Let’s say that to the whole company. Let’s be very clear, so that when anyone in sales, in support, in development, in marketing is thinking about those other ones, they know, “Yes. That job, that’s not something we do.”

Daniel:

Right. When you talked about jobs and what you’re good at and what you’re not, there are two things there. First of all, understanding the job, and second of all, understanding whether you’re good at it or not. Let’s start with the first part. How do you understand what the job is? What kind of questions, what kind of research do you do to find out the job that people are doing? When should you be trying to ask those questions?

Ben:

All the time.

Tom:

If you don’t know what it is now, then you should go and do it.

Ben:

I’d say, even though we’ve had the jobs written down and shared for a long time, in all the customer calls we do now, unless it’s a specific usability or user research call on a specific feature, we’ll still ask a light version of that line of questioning. There is a whole approach to this, which you can read about online. There’s a Udemy course about how to do it, how to run a different type of interview. For the jobs part, it’s less about, “What do you like about the product? What don’t you like about the product?”, and more about, “Why did you end up buying this product? What caused you to have this need?” You can also get a lot out of just examining how people use it. That functional side: what do they do with it, do they put it up on the wall, what metrics do they put on it, and you get that. Then you can dig into the interviews and some of the other stuff going on in their lives around that. Why did that become important to them? Why is that something that is even relevant to them?

Daniel:

Right.

Tom:

The thing I found hardest about that process was factoring it down and grouping it. Defining those categories wasn’t straightforward to begin with. Once we had them, it was really obvious and easy, but it was quite, “How many jobs do you have? Does this quite fit in this one, or this one?” Coping with that took a little bit of, not ages, but it did take a bit of brain-power to try and-

Ben:

We had real trouble, didn’t we, with like ten, 15, 20 things that each seemed like one, and we managed to group them, and then go, “It seems like these are related by this. They have certain nuances within them, but it feels like this is a theme, and this is a class of jobs, and this is another class of jobs.”

Tom:

One of the reasons we went for the Jobs to be Done framework was an observation from our data of what kind of customers we had as well. There was a huge variety in what customers we had. We’ve got everything from sushi shops to manufacturing plants, and SaaS companies, and blue chips, and tiny one-man bands. That made it very hard for us to do any of the classic persona stuff or market segments, that kind of thing. There wasn’t a lot of commonality. We had buckets of people, and there were loads of them. There wasn’t just one or two, there were lots. What we did notice, once we started asking these questions, is that a lot of them, even if they worked in completely different fields, had a lot of commonality in what they were actually trying to do. I’m not sure whether that would apply to all products, but for us in particular, it really jibed with us and really-

Ben:

When your product is quite horizontal, it’s very non-specific about who is meant to be using it. A lot of products have sprung up now that are this-for-marketers or this-for-DevOps. It’s very obvious who’s going to use them and generally quite obvious why. But with a product that is a bit more broad, like ours, the approach lends itself well.

Daniel:

Right. The other part of that was, how do you understand what you’re good at and what you’re not good at? How do you measure that? Is it your own view of how well you’re fulfilling some value proposition, or is it something that comes from a customer saying, “This sucks”, or, “This is good”?

Tom:

A few things. Some of it is just judgement. You can tell, looking at our product, what is good and what’s not. Also, as we’re so familiar with it, we know where the deficiencies are if someone’s trying to do a certain thing, and we know, “Christ, if you’re doing that with us, you’re missing this, this, this and this. This is really not a good fit. You’re putting yourselves through hell. Why are you doing that?” That’s one aspect of it. Then the other is, we could look at the data, so we could see who was more likely to churn and who were the more successful customers in terms of usage. Also, when we were interviewing them, the strength of their opinion and how positive they were about it: people tended to be more positive, more happy, more satisfied if they were doing the jobs that we’re best at.

Daniel:

All right. Now you have a pretty well-defined strategy and everything that comes to you is more or less clear. You can decide whether something applies or doesn’t apply to your product right now. There are multiple things that might be going into the product. There are things that customers are asking for, there are bug fixes, there are things that you believe might improve something, or something bad that you’re doing in some area. How do you balance all these things, and how informed is it by what you learned about your customers? Do you find out about new jobs that you would like to be able to do? What’s your orientation there?

Tom:

There’s a step before that. We’ve got our Jobs to be Done and we’ve identified their deficiencies. Then, what we try and do is work out what strands of work, what themes we can work on, that will best move that forward. We have a product- and company-led decision about what the project is that is going to have the biggest impact on our MRR, or be the best strategic thing that we can do to grow us longer term. There is a little bit more artistic licence with that, and judgement. It’s a feeling that’s built up from all the feedback we have around the stuff we’re working on.

We identified one of our biggest problems as being the difficulty of getting your data in. We’re a dashboarding product. If you can’t get your data in, then you’re in real trouble. You can’t build the dashboard you want. We identified that as one of the big areas we needed to look into. We looked at the very specific problems we had with it. The platform that we currently use to roll out more integrations, to let people get their data in, was not really fit for purpose. We didn’t want to continue building off that. We knew that it was limiting how good a solution we could offer. We identified that and realized we didn’t want to carry on in that vein.

We wanted to try and break out of that. We looked at the different ways we could do that. One is to have a very vertical approach and pick a certain thing that we want to integrate with and do that really well. The other is to have a better, more generic approach. Through a lot of discussion and thought, and being hyper-aware of what our customers were experiencing and having problems with today, we were able to come up with ideas for the things we needed to build that would help us move away from that. That was heavily informed by research, but ultimately it’s a judgement call from the company and from the team. What is the best thing we can do to help? Given these are the problems, given this is the feedback that we’re getting, what is our idea for what we can build?

Daniel:

Right. At this stage, do you use other methods, maybe quantitative or qualitative methods, to try to find out more, or to try to test some of your assumptions?

Tom:

We do the whole hypothesis, Lean Startup kind of thing. We’ve got a hypothesis that this is a good area to work on. One of the other problems we had is, because of the nature of the product, it’s quite hard to market, because there’s a whole array of different types of people. While it doesn’t really affect the Jobs to be Done, it does affect how we’re able to reach people. That was another factor that we wanted to include in deciding what to work on next. The big thing that we’ve recently been working on, we partially chose because we knew that it would probably be easier to market. That was a really important deciding factor.

Once we thought that was probably what we were going to do, we went and tried to validate it. We did some more in-depth research with that specific segment of customers, did more specific research into what we could potentially do technically and what design and approach we could take. You have the hypothesis, you validate it to a point where we’re like, “Yes, we can kick this off”, and then we begin on that. We get into the process of continual feedback on the thing we’re building, and doing betas and stuff like that.

Daniel:

Right. That’s the step that I wanted to move into now. As you start to build it, you also want to keep gathering input into what you’re doing. What’s your process there? So you decide that you’re moving forward with something and then, what happens?

Tom:

We’ve got a continuous stream of calls we’re having anyway with people, and we’re potentially trying to recruit more of those people and talk to more of them. Our approach is, we start to design and build something, and we try and put those designs and the things we’re building out in front of people. Typically, in the past, it has taken us maybe a month, six weeks, to get to the MVP that we can actually beta. Beta is the wrong word, but put in front of people and actually get them using. We try to get to that as soon as possible really, because of the nature of the product. It’s quite hard for users to judge the value of it unless they’re using their real data. If you just show them, “Here’s a chart. It’s not your data. Sorry. How does that work for you?”, that doesn’t really give us the feedback we’re interested in at that point.

By this point, we’ve done enough research that we’re willing to give it a reasonable length of time before we say, “No. This isn’t something that we want to carry on with.” We’re willing to commit a couple of months by that stage to actually giving it a proper chance. We’re not going to go, “We haven’t had a single user this week”, say this isn’t something they want, and give up. We kind of plough on a bit just to-

Ben:

It’s because we’re working with an existing product, and because a lot of the improvements we do are core improvements. We know that there’s a market for it. We know that there are people who want to do these things. Generally, a lot of the work we’ve done recently is less speculative in terms of, is anyone actually going to use this, is this actually a viable product or feature, which is a very big problem when you’re starting out, initially, when you’re very small. A lot of what we’re doing is taking a product that exists and trying to help it move up to the next level and really tap into those things.

Tom:

Our biggest problem is very rarely “is what we’re building useful?”, because it will be useful to some of our customers. We’re always confident in that. What it is is, “Is this the best use of our resources?”, “Is this going to create the company-”

Ben:

It’s substantially better in a way that we have both said.

Tom:

You can do back-of-the-envelope stuff to say, this is a pretty decent market, it feels reasonable that we can grow this, and look at our existing data to try and see: if we double the conversion rate of this, pull in twice as many users, how much money is that going to make us? The whole opportunity cost, that kind of thing, is impossible to really have a good feel for. Again, there’s a bit of faith that we’ve chosen the right thing, and measuring as we go, but it’s very hard.
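
(Editor’s note: as a concrete illustration of the back-of-the-envelope sums Tom describes, here is a minimal Python sketch. Every number and name in it is a made-up assumption for illustration, not Geckoboard’s data.)

```python
# A made-up back-of-the-envelope sizing, along the lines Tom describes.
signups_per_month = 400   # assumed top-of-funnel volume
trial_to_paid = 0.05      # assumed trial-to-paid conversion rate
arpa = 99.0               # assumed average revenue per account, per month

def added_mrr(signup_multiplier: float, conversion_multiplier: float) -> float:
    """Extra new-business MRR per month versus today, under a given lift."""
    today = signups_per_month * trial_to_paid * arpa
    lifted = (signups_per_month * signup_multiplier
              * trial_to_paid * conversion_multiplier * arpa)
    return lifted - today

# "If we double the conversion rate ... pull in twice as many users,
# how much money is that going to make us?"
print(added_mrr(1.0, 2.0))  # double conversion: +1980.0 per month
print(added_mrr(2.0, 1.0))  # double top of funnel: +1980.0 per month
```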

Daniel:

That’s something I wanted to talk about, which is, if you have multiple competing priorities- Maybe taking a step back from what we’re talking about. You have multiple competing priorities, multiple things that you would like to work on. You say, all three or all four of these are good things for us. How do you size them? You were talking about trying to figure out, there’s a market here, trying to measure it somehow, but how does that tie into the strategy and also the measuring itself?

Ben:

Within the strategy, certainly, because our strategy predominantly consists of this set of themes, we have some idea of priority within those themes as well. As I said earlier, when you’re measuring those deficiencies between where you want to be and where you are, the ones with the bigger gap, or the ones that are perhaps more fundamental to the product experience, are going to stand out. Maybe there are four themes and each of them has a bunch of potential candidate projects within it, but if this theme is the most important, we’re going to start there. Then it’s like, within that theme, there are all these different projects that we can do, which one’s the most important?

At that point, that’s what Tom was saying about back-of-the-envelope calculations, gut feel, trying to draw on the fact that, as we said, we talk to customers every day. No, we don’t necessarily formalize and record everything in a searchable manner, but you can get a gut feel for what people are saying, where the pain points are. And then further research, going back in and looking deeper at the product ourselves, and saying, “We’ve got a couple of candidate projects.” We can make an assessment of how this thing plays as it exists today versus what we would like it to be, and why we think there’s a bit of a gap for improvement.

And also concerns for the wider team. Again, for marketing, the key is: does this feature, as well as appealing very well to the people we have today, actually help us market, get the word out further? Does it help to draw in more of the people for whom this is going to be a useful and valuable product?

Daniel:

Right. That brings me back to what we were talking about, which is, you were building the thing, you were making it available as a beta, and trying to figure out if there’s enough usage, and deciding if you’re going to invest even more in it and bring it to the full product. Besides these quantitative measures, are there other measurements that you take? What kind of testing do you do on usability, in that kind of pre-build phase?

Tom:

We do quite a bit of remote usability testing, where we set up a Zoom conference call, get them to share the screen with us, and sign up for a free trial and give it a go.

Ben:

We use feature flagging to optionally- Sometimes we test stuff that’s been shipped already, because with our Kanban it’s very easy for us, we can learn and then make a fix or change it. If it’s a feature that’s half ready or somewhere in development, we do sometimes flick the feature on for that customer’s account while they’re sharing. Makes it really easy for us.
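
(Editor’s note: for readers unfamiliar with the technique Ben mentions, here is a minimal Python sketch of per-account feature flagging. The flag and account names are hypothetical, and a real system, whatever Geckoboard actually uses, would back this with a database or a flagging service rather than an in-memory dict.)

```python
# Minimal sketch of per-account feature flags: flip a half-built feature on
# for a single account, e.g. just before a usability call. Names are made up.
enabled_accounts = {}   # flag name -> set of account ids it is enabled for

def enable(flag: str, account_id: str) -> None:
    """Turn a flag on for one account only."""
    enabled_accounts.setdefault(flag, set()).add(account_id)

def is_enabled(flag: str, account_id: str) -> bool:
    return account_id in enabled_accounts.get(flag, set())

enable("new-widget-editor", "acct_42")          # the customer on the Zoom call
assert is_enabled("new-widget-editor", "acct_42")
assert not is_enabled("new-widget-editor", "acct_7")   # everyone else unchanged

# In the request path, the flag picks the code path:
# if is_enabled("new-widget-editor", account.id): render_new()
# else: render_old()
```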

Daniel:

Right. What’s your experience with remote usability testing? Does it get you input as good as in-person? You’re missing the visual cues of watching someone use a product, but-

Tom:

I started as a UX designer, and I did usability testing a lot back then. All our customers were in America, so we had to do it remotely, there was no choice. I’ve done it in the flesh as well. I worked for a company where, on the first Tuesday of every month, we brought in six users and did user testing all day. To be honest, I don’t think there’s much difference. You get results as good via a webcam and screen sharing as you do in the flesh. In fact, sometimes I’ve actually preferred it, because it’s a bit easier to have more people in the room. I think the technology has moved on a lot. Back in the day, I remember all sorts of horrible issues where people’s firewalls would make it impossible to share their screen, disasters like that. Nowadays, with our customers, that just isn’t a problem. It always works and it’s fine.

I don’t think we do quite as many as we’d like to. We went from a stage of doing none for a little while to redoing them. They do take up a lot of time. It’s something that we’re quite keen, maybe one day, to get the design team to run a bit more, with us not always leading them. We get tremendous value out of them, as with the calls as well. I actually find them probably equally valuable: seeing someone use it and getting usability issues. Our overriding focus at the moment is growth. We’re getting more useful feedback from how satisfied people are with our overall functionality than from discovering how usable it is at the moment. The development team probably gets more from the usability, but we get more from the general usage.

Ben:

All the while, in the back of our minds is, “What’s coming next?” We’re in the midst of a big project right now. Two months ago, or before Christmas, it was kind of “that’s all we’re focusing on”, and we wound down all of our general calls, really heads down on that. As that has progressed, we put the calls back on, because we started to think, “Something’s coming up next.” Three months ago, we had an idea of what that is, but that’s changed like ten times between then and now. As you get closer to that point of solidifying, you want to fill yourself back up again.

Daniel:

You mentioned that, for instance, you’re now looking at growth, and that’s maybe one of your biggest targets as a company. How does that translate into metrics? What are you looking at? You talked about satisfaction, how people are feeling about the product. Can you tell me a bit about metrics that aren’t necessarily the final metric that you’re trying to shoot for, but things that might be upstream and help with the end goal that you’re looking for?

Tom:

We tried quite a lot of different things. We’re slightly unlucky in that our volumes are relatively low. We don’t get tens of thousands of sign-ups a month or anything like that. They’re quite small. They’re also quite spread out, because we’ve got a lot of different integrations. The thing we’re working on might only affect 20% of current customers. We want to grow, so that that’s a bigger segment of our customers. That prohibits us from really being able to do things like A/B testing in the product. It’s not worth it. We’d have to run tests for months to get any kind of significance on even quite big things. That’s not really anything we can do.

The metric that we actually really care about, the one that we’re trying to drive, is MRR contribution, or number of subscribers, or number of people who’ve successfully used this thing. All the while, we’re looking at all the other metrics that support that, so the conversion rates, the-

Ben:

Top of funnel.

Tom:

Top of funnel is actually something we’re really keen on. We’re trying to make it more than just marketing’s responsibility, because ultimately, with what we’re building, we’re not just trying to raise the conversion rate; we’re trying to build something marketable, that we can shout about, and get many more people in. Actually, when you’re a startup, your growth doesn’t come from slightly increasing the conversion rate, or even significantly increasing the conversion rate. It comes from getting a lot more people in at the top. That’s been a big emphasis in our most recent project.
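
(Editor’s note: Tom’s earlier point, that at low volume a test would have to run for months to reach significance, is easy to sanity-check with a standard rule of thumb: roughly 16·p(1−p)/δ² users per test arm for 80% power at a 5% significance level. The sketch below uses made-up numbers, not Geckoboard’s.)

```python
# Why low volume rules out in-product A/B testing: a rough sample-size check
# using the common approximation n ≈ 16 * p * (1 - p) / delta^2 per arm
# (80% power, 5% two-sided significance). All rates below are assumptions.

def users_per_arm(baseline_rate: float, absolute_lift: float) -> int:
    p = baseline_rate
    return round(16 * p * (1 - p) / absolute_lift ** 2)

baseline = 0.05           # assumed 5% conversion rate
lift = 0.01               # detecting a one-point lift, "quite a big thing"
signups_per_month = 400   # assumed volume, split across two arms

n = users_per_arm(baseline, lift)
months = 2 * n / signups_per_month
print(f"{n} users per arm, about {months:.0f} months of sign-ups")
# -> 7600 users per arm, about 38 months of sign-ups: years, not weeks
```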

Daniel:

Right. Do you have any way for people to reach you with unsolicited feedback?

Tom:

This is going to turn into a giant Intercom advert now. We use Intercom and we really like it. We pester our customers all the time and it’s great. We get really interesting feedback from that. It’s probably my favorite kind of feedback actually, because often it comes at the time someone has the problem and they go, “Oy”. It’s really, really great. It also helps some customers to build a slight relationship with us, and we can understand a little bit more about what they’re doing. Eventually, that leads to having calls and having a bit of a stronger idea of what they do and why, that kind of thing.

Daniel:

Do you find that they’re open to having a dialogue and answering questions?

Tom:

We get reasonably good response rates, in the order of tens of percent.

Ben:

It’s about 20, 30 percent.

Tom:

Most of it is triggered. Someone does something, like they use a new thing for the first time and we go, “How did you find that?”, or they use the thing we’re replacing and we go, “We’ve built a new one. Try it out.” Maybe you attract a certain type of personality. A little bit like with all user research. Obviously, you end up-

Ben:

Selection bias.

Tom:

Yeah. There is definitely a selection bias of sorts. We see such a variety in what people say and the way they say it.

Ben:

It’s because it’s right there and it’s so lightweight for them to fire something off, compared to, say, an e-mail. You’re more likely to get a more balanced range. Somebody might not go to their e-mails and go, “It’s important for me to reply to this.” But they’ve just done a thing and they’ve seen someone say, “How was the thing?” They might go, “Actually, I had a problem with that. I’m going to tell you about it.” They might never write back to you, and we definitely don’t reply to all of them, but we’ve received something and we’ve heard what they have to say.
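
(Editor’s note: here is a minimal Python sketch of the triggered approach Tom and Ben describe, firing a one-off feedback prompt from a product event. The event names and the send_message helper are hypothetical stand-ins, not Intercom’s actual API.)

```python
# Minimal sketch of event-triggered feedback prompts, in the spirit of
# "they use a new thing for the first time and we go, 'How did you find that?'".
seen = set()   # (user_id, event) pairs already prompted, to avoid repeats

PROMPTS = {
    "first_widget_created": "How did you find setting that up?",
    "used_legacy_feature": "We've built a new version of this. Try it out?",
}

def send_message(user_id: str, text: str) -> None:
    print(f"[to {user_id}] {text}")   # stand-in for a real in-app messaging call

def on_event(user_id: str, event: str) -> None:
    """Fire the matching prompt the first time a user triggers an event."""
    if event in PROMPTS and (user_id, event) not in seen:
        seen.add((user_id, event))
        send_message(user_id, PROMPTS[event])

on_event("u_1", "first_widget_created")   # prompt fires
on_event("u_1", "first_widget_created")   # already seen, stays silent
```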

Tom:

The other thing I quite like about it is that it means we see a little bit more of a cross-section of the support tickets. A lot of people use Intercom instead of our Zendesk. That comes to us first. Sometimes we’ll answer it, particularly if it’s about a beta and it’s not that kind of support response. We think that-

Ben:

If you know what the answer is, it’s easy.

Tom:

Or it’s easy to forward it on to our Support team.

Daniel:

You’re actually using both Intercom and Zendesk, right?

Tom:

Basically, we use Intercom. The product team uses it for in-app messaging. Everybody uses it for drip e-mails and that kind of thing. Our proper support, with tickets that have a minimum one-hour turnaround and stuff like that, staffed by a professional team, is separate.

Daniel:

Do you have any customer segment that you feel is harder to get to? Maybe you need to talk to them, but they’re too busy to reply to you, yet you need their input somehow. Is there something that you’ve tried to do there?

Tom:

I don’t think so. At the moment, we’ve made it easier for ourselves, because we’re trying to go somewhere more marketable. We’ve been working on a big Salesforce feature recently. That’s been quite easy, because we’ve got very easily identifiable job titles, and they’re also quite open to talking about this kind of thing. They get sold a lot of products already and they’re always on the lookout for things. It’s been a bit easier.

Sometimes, it can be a little bit harder to reach- There’s a stakeholder who doesn’t use us, but they instigate the need. They’ve identified that they’ve got the problem that we solve and sent someone out to find a solution like us. We often end up talking to the implementer, the person who’s been tasked with solving that. It’s harder for us to reach the overriding stakeholder sometimes, because we are a team product. You put your dashboard on a TV screen in the office. We tend to talk to the person that’s responsible for setting that up, but we don’t necessarily talk to the people that consume it, or, again, as I said, the overriding stakeholder or someone who may have asked for it to be created.

Daniel:

Is there a way for you to overcome that, or is it something that comes maybe later on as they’re trying to use the product? Maybe the person that bought it is not the person setting up the dashboard.

Tom:

It’s not so-

Ben:

Sometimes it gets forwarded as well. Sometimes, if we do a mailshot for a research recruitment thing, you’ll see it in the forward list of whoever replies: “I’m sending this to you, because you’re the one who set it up.”

Tom:

Sometimes, we’ve done it less recently, but we did for a while, we sit in on a few more sales calls as well. On those, quite often, particularly in bigger companies, there’d be a team of people responsible for this, so we’d get a glimpse of that. Quite often, it doesn’t really hold us back at this stage. We are often quite happy to have that second-hand information from the person we are talking to, and they can say, “Our team didn’t mind this”, or, “Their boss wanted this.”

Ben:

We probably have less trouble reaching particular customer segments, because of the way we view our customers through the jobs lens. The thing we probably have a harder time getting is particular types of feedback. It’s very easy for us to talk to someone about an integration, because it’s a very functional part of the process. It’s very binary. It can or it can’t do X. I could or I couldn’t get my data in. It’s much harder to reach out to people to talk about things at the other end, at the presentation and appearance layer. For many people, there isn’t that same kind of yes/no, I did it or I didn’t. Some people do have it for very specific issues, but trying to understand what people’s needs are beyond that, we can’t really go to them for that.

That’s the kind of thing you have to build up through observation, sometimes indirect observation as well. We get people to share their boards with us, for example, and then we can see things. They might not say anything about them, but you can see these things on there. Then you go, “I can see that you’ve done this.” That’s sometimes about trying to prove or disprove hypotheses that we have internally about some of that stuff. That’s the stuff that people don’t tell you, but you feel there’s a problem there and that you can make it better somehow.

Tom:

Quite often, when a customer shares their dashboard with us and we see it, we observe things that could be a lot better, lots of room for improvement, and we have product ideas based on that as well. Often, the customers are completely unaware that they have that problem or that it could be better. They’re completely satisfied, but we’re like, “That isn’t best practice. This could be a lot better. It needs to be a lot clearer.” Unfortunately, that is a bit harder for us, because we don’t have access to all our customers’ data and dashboards. We can’t just go through mining it and stuff like that. It’s always a bit of a treat when a customer does share theirs with us, so that we can have a look. As we focus more on that kind of thing, we’ll probably try and work out some ways that we can look at this stuff anonymized, and maybe get an idea of what kind of common layouts people are doing, that kind of thing, so we can refer to it.

Daniel:

Awesome. We’ve already touched on this, but I wanted to try and explore it a little bit further. You talked about sitting in on sales calls, doing maybe some lightweight support, and trying to fit into the overall picture for your company. The issue of internal alignment, what are we doing and why are we doing this, is something that pops up in many product management conversations. You guys seem to be doing very well there. What are you doing to make sure that everyone’s on the same page in terms of process, in terms of information?

Tom:

We come from a culture where we are very much product-led in terms of what we build. There isn’t this overriding friction of, “Oh, we’ve been asked to build this, we don’t think this is a good idea.”

Ben:

We are a product company, so to speak.

Tom:

That’s kind of been ingrained in the culture. That’s been quite an easy ride. We have lots of natural points where we work with everyone anyway. With support, if they have a more general question, it comes through us. We help with prioritizing all the bugs, no matter what they are.

Ben:

Sales, we help with, “Can this be done?”, or, “Is this coming up?”, or, “Here’s what our big prospect would want in order to become a customer.”

Tom:

Or, “Is there a workaround for X, Y, Z?” All those natural everyday conversations we have with everyone anyway. Marketing, obviously, they need to know what we’re doing, so we have to talk to them about that, and we enjoy doing that. That’s just like-

Ben:

As you said, one of the things we’ve been doing recently is working on the marketability of the feature, trying to really consider, when we’re doing features, how we’re actually going to market these things, and involving them more and more in selling this most recent project we’ve done. Just having so much customer contact does give us a certain level of trust, that what we say isn’t completely insane.

Daniel:

Yeah. Does it revolve around informal conversations and doing meetings together and talking, or is it more than that?

Tom:

We have things like all hands meetings where our CEO will give an update and he’ll always include the product thinking at the time, projects we’re working on and what we’re thinking about next.

Ben:

How it fits into the bigger picture.

Tom:

Then we have more informal ones. We have a show-and-tell every two weeks where we show what we’re building. Again, we use that to reinforce any changes in product direction. We’re quite lucky that we’ve got two themes, and they’ve stayed the same two themes; we’ve been saying the same two themes for a year and a half, and I’ll probably be saying the same two themes for another year and a half. It’s not like we’re continually changing. The message is quite clear. That helps. We’ve got a couple of other things too. The customer service team actually comes to our engineering stand-ups in the morning. Sometimes some of Marketing do too. We’re a small company, less than 30 people. We have lunch with everyone. Things come up in conversation. We’re always trying to share and keep everyone abreast.

Ben:

It gets reinforced a lot in these conversations. You might be talking about one thing, but then your feedback will be, “But our jobs are this. I feel like what we’ve done here doesn’t quite match with that.” There are all these chances to reinforce what we are doing, why we are doing it, what we believe is important. Particularly with a product like ours, where it’s very easy to view it as a tool or a category. It’s a dashboard. It’s helpful to keep making that clear. It’s not just a dashboard, because that can mean very, very many different things to people. It’s about building a product which is great for these jobs.

Daniel:

Great. Keeping on the topic of tools and processes, do you have anything set up to try to organize your research and the feedback that you’re getting? Any personal productivity tricks on this kind of thing?

Tom:

You mean tools and things we need to make it easy?

Daniel:

Yeah.

Tom:

We use Calendly, which basically is hooked up to Ben’s calendar and shows when he and the meeting room are free.

Ben:

If I’m free, Tom’s free.

Tom:

Pretty much. We’ve only got one meeting room, which makes life a lot easier. We use Intercom and we send out e-mails. We can get e-mails from our database and work out what kind of profile of people we want to talk to, send an e-mail, and say, “Here’s a link.” If someone via Intercom has said something particularly interesting or relevant, we might do that too. Sometimes we rely on contacts that various people in the company have, particularly for reaching brand new people who would be interested in the product but haven’t tried it yet. That’s quite useful. Calendly helps us find a slot. We use Zoom for the actual call, so you can see them. It does screen sharing really well and it has recording of all sorts as well, so it records the screen and the video.

Daniel:

Right. Maybe something that comes in one day isn’t really important, as it’s just one person asking for something, but sometimes, a few months from now, someone comes in with another similar thing. Then it might be something that you have to start paying attention to, because it’s come up two or three times over time. When things come your way, do you have anything to archive them and then go back to them?

Tom:

We dump stuff in Trello. We’ve got millions and millions of boards in Trello. The thing is, we’ve got so much stuff we want to do. We’ve got way more we want to do than we can ever do. The odd customer request here or there isn’t really going to change that. We’ve got stuff that we know is absolutely critical that we want to do. We both have very good memories, and we tend to remember when this stuff’s come up before. We don’t worry about keeping a repository of it.

Ben:

They just get stale.

Tom:

Yeah. Things come and go as well. We might have done something that makes a whole part of the application obsolete. We don’t worry about having a permanent record of that. Actually, having it disposable helps keep us sane and means we don’t have this giant burden of thousands of things to consider. There’s enough obvious stuff to do, so that isn’t our problem. Our problem is, which of the obvious things are we going to do first?

Ben:

The thing is more like, if you then want to try to close the loop with someone you’ve done something for. We’ve managed to do that for the very ad hoc feedback on the stuff we’re working on. With the Salesforce project, if someone messaged us saying, “Would you have this visualization?”, and three weeks later we do, that’s really easy, because we know who asked, so we tie it in and we come back and message them. If it’s someone we spoke to like six, eight months ago, we will never find that in the video. Last time I tried to edit video highlights into a reel, it literally took longer than real time, listening to every single one of them to find the bit that you want. Until AI gets that good, it’s not a winnable problem.

Tom:

We do treat stuff that we’re likely to do very differently. We will tag that up, we will put it on the Trello card, so that we can draw on those people and use it as an example when we talk to the team about it. But for stuff we don’t think we’re going to do anyway, we don’t worry too much about it. If it comes up all the time, then eventually, maybe the fourth or fifth time, we’ll go, “That’s happening quite a lot.”

Ben:

Let’s start keeping tabs on this. Start keeping a record for when someone says it again. 😀

Tom:

A lot of the feedback we do nothing with. Because we’re using Intercom, and the time people want to give you feedback is when something’s new, we’re quite lucky that most of the feedback is relevant to stuff we’re working on today. The stuff that we bother looking at regularly is very focused on the now.

Ben:

Part of the reason for that is that we only do triggered Intercom messages; we don’t have it persistently available at all times. We’ll fire a message about a specific thing. We’re far more likely to get the other feedback, about the other stuff, in a call with someone who’ll then say, “I was using this”, and then you’ll be like, “We haven’t thought about that for ages.”

Tom:

The other thing is, the customer success team, if they notice a trend, they’ll start pestering us as well. They go, “Hey guys, this is coming up quite a lot. Can you think about it?” It’s quite a nice filter for us, because we don’t have to see every single one. Once they’ve identified a pattern, it comes through.

Daniel:

Right. Do you have any bad experiences, things not to do with regard to how you handle and use customer feedback? Overall don’ts for product managers to avoid when they’re figuring out the customer feedback thing?

Ben:

It’s what we were talking about just then, saying, “Don’t worry about keeping a record of everything.” Part of that is, what you do is guided by the feedback; the feedback doesn’t tell you what to do. It’s very easy, particularly when you’re much earlier on than we are. We have the luxury of being in the position that we’re in, which is that the overall value proposition is quite well defined. For us, it’s about building upon that and making it much more than it is. When you’re still figuring that out, you’ve got to be very, very strong on what you think that is, and use the feedback to guide it, rather than go, “No, it seems like everyone’s saying this. We need to go there.” Of course, in those areas, there is a point where you’re going to go, “I’m completely wrong.”

Keeping a big repository of everything pushes you much further down that route. It’s that idle-hands problem. We could do something. Why don’t we do this thing that someone asked for? Then you drag it in and you do it, and you’ve not thought about it. The two main things you do are: you do the research to validate the jobs your product is hired to do, to guide the strategy of what you think you need to do. Then you use the feedback in terms of how you do that, and how the things you make support that or don’t support that. It never says, “Build me a boat.”

Tom:

All the classics, like: don’t listen to what they’re asking for, but check what the problem is. Quite often, we get quite obscure things that we really have to clarify. Someone will say, “Can you add this?”, and it actually turns out the problem is much more easily fixed by something else, or better fixed by something else. Obviously, as I was saying, you don’t want to be reactionary. You want to listen, you need to be aware of it, but you certainly don’t want to build based just on what the feedback is. Particularly if you’ve got quite good confidence that what you’re building is right.

Ben:

Try to be helpful when you say no.

Tom:

Yeah. The only times that we are reactionary are when we’re building something and we’ve made an oversight. We’ve not realized that something can happen, particularly because we’re dealing with quite complicated environments and datasets all the time. We might not be aware that some kind of data isn’t typed, or that they’ve got a certain type of Salesforce account, or something like that, or some-

Ben:

Date functions in Google Spreadsheets.

Tom:

Yeah, some kind of weird thing like that. We’re like, “Oh my God, we had no idea.” It’s because of what we’re doing. It’s not a bug, it’s just an oversight. Those we tend to try and prioritize. As another example of something we’ve prioritized based on feedback recently: we had customers hitting their API limit. They were really good customers. They were enjoying the beta so much that they maxed it out. These guys are pretty important. Maybe we should prioritize stopping those guys hitting their API limit, so they can keep using it and be really successful. That, again, we had on our roadmap. We just thought, let’s bring it forward, because we want these really strong customers to be happy.

We would never go, “Oh my God, they need this kind of chart”, or, “They need to do this”, and act upon that. Sorry, I keep rattling on. We did find that when we first started here, the focus was quite hard for us to keep. The way we’ve solved that is by having these projects and having the grander strategy and the Jobs to be Done. The importance of having focus in what you build is absolutely critical. Your productivity will be so much higher if you’re not flitting between lots of different things. That’s absolutely the number one thing really.

Daniel:

Right. That’s a great note to end this with. It’s been really nice talking to you. Thanks again for your time and for taking part in the project.

Tom:

It was really good fun. It’s quite interesting to think about it really, because it’s not something we sit down and reflect on very often.

We’re off to visit a customer now.

Daniel:

Okay. Great. I won’t keep you from that. Thank you.

Tom:

Cool. Excellent.

Daniel:

Bye.

Tom:

Cheers.

Daniel:

Cheers. Bye.