[Full Podcast transcript at end of page]

We’ve worked with dozens of large companies trying to build innovation into their culture. Many talk about experimentation. Few know how to actually do it.

That’s why we were excited to sit down with Leandro Balbinot, CTO at Whole Foods and VP of Technology at Amazon, for our latest episode. Leandro has held senior roles at Kraft Heinz and McDonald’s—and now sits at the intersection of tech and operations in one of the most complex organizations in the world.

His insights on experimentation, failure, and scaling innovation were some of the sharpest we’ve heard.

Innovation Is a Tactic, Not a Destination

We opened the episode by asking Leandro how he decides when a problem calls for innovation versus just solving it with a simple fix.

His answer was direct: not every problem needs innovation. Sometimes a product tweak is enough. The key is knowing when to optimize—and when to rethink the model entirely.

Innovation teams waste too much time applying “big ideas” to problems that don’t require them. Leandro reminded us that great operators know when to hold back and when to swing big.

The Best Experiments Are Fast, Cheap, and Clear

When we asked how he prioritizes experiments, Leandro didn’t hesitate:

  • The faster and cheaper, the better.
  • If you can’t get real data from it, don’t run it.
  • If it doesn’t teach you something important, skip it.

It’s a framework we deeply believe in—and one that too many large companies ignore.

Gut Feeling Is Data Augmentation

One of the best lines from the episode.

Leandro doesn’t see gut instinct as a replacement for data—it’s a complement. A way to fill in gaps when data is unclear or incomplete. It’s what helps you sense when a metric is misleading, even if it’s “green.”

This is the nuance most organizations miss: you need both instinct and information to make good bets.

You Don’t Earn Rollout Support Until You Earn Trust

Innovation doesn’t scale without organizational buy-in. And according to Leandro, you don’t get that buy-in by pitching a bold vision. You get it by doing the hard work upfront:

→ Showing your process

→ Backing it with evidence

→ Writing a clear, detailed plan

It’s not just about the idea—it’s about demonstrating that you understand the business and can execute within it.

Amazon Moves Fast Because the Systems Are Built for Speed

Leandro made it clear: experimentation doesn’t work without the right infrastructure.

You can’t run weekly experiments in a company that ships code twice a year. You can’t run A/B tests if your tools don’t support them. The technical architecture, team model, and decision-making structure all have to support velocity.

Amazon didn’t stumble into speed—it built for it.

Bottom Line

If you want to innovate inside a big organization, you need to stop romanticizing ideas and start operationalizing how you test, learn, and scale.

Leandro’s mindset is a masterclass in that.

- Ben and Marcus

Transcript

Ben Yoskovitz  00:00

How do you think about doing things that are sort of maybe closer to the core, more incremental in scope, versus bigger bets, riskier bets, perhaps maybe things that are a little bit further away from the core business?

 

Leandro Balbinot  00:12

Innovation, for me, is a tactic to actually solve problems, right? I mean, sometimes you don't need innovation to solve a problem, and continuous improvement is very important. Sometimes you just need, like, to do some tweaks on your product to actually get to that improvement. And that's really important, and you should continue to do that. That's really critical. But sometimes the problem requires really, like, thinking out of the box and finding a solution that's different than what exists. That's where innovation kicks in, right? So I think it depends, case by case. For me, it's like what we do in science. I mean, AI is not the solution for every problem, right? AI is just one way of solving a problem. And we tend to believe that it can solve everything, right? Maybe one day, but, I mean, it's definitely a great solution for problems, but maybe some problems don't need it, just a simple solution that solves for that.

 

Ben Yoskovitz  00:59

Leandro, thank you for being with us today. We met back in, I think it was late 2018, when Highline Beta started working with you at Kraft Heinz. Back then, Kraft Heinz had a growth innovation group called Evolve, whose job was to build new businesses beyond the company's core. There was a focus on technology, software, and services, which is not Kraft Heinz's core business, and also a few food-related ideas. Then you joined McDonald's as Senior VP of Global Technology and Digital, and we worked with you there as well, in a different capacity, this time helping innovation teams move faster with the right playbooks, methodologies, and scorecards, more of an org design project. Today, you're the CTO at Whole Foods and VP of Technology at Amazon. We have not worked together there, and I'm really excited to learn more about what you're doing and how things operate there. You've worked at some of the biggest, most well-known brands in the world. So my first question: how has the approach to innovation changed or stayed the same across Kraft Heinz, McDonald's, and Amazon?

 

Leandro Balbinot  02:09

First of all, Ben and Marcus, thanks for having me here. Very happy to see you again, Ben. As you said, I've been dealing with innovation in all these companies, and I have a different snapshot in time of each one, so maybe they have evolved differently from what my testimony will say. But I can say there's clearly an evolution that's happened in my career, and also in these companies, in terms of how you treat innovation. I see innovation clearly as a way of experimenting with things, as a tactic for solving problems, of course, and I think these companies have been learning how to do that. From the early times, when people were just creating chief innovation officers and trying to separate innovation from the core of the business, which is clearly a mistake, to the evolution you see now, with more people really embracing failure, embracing experimentation. That's actually what we do here at Amazon pretty well. These companies have been evolving, and I hope they are more capable of doing that these days; we started to plant a seed there. But the main difference is really understanding innovation as a way to experiment with solutions for problems: really living with a problem you have and solving it in a way that only innovation can. And it includes a trust part, so you need to earn trust in terms of believing in failure, believing that you're learning from that failure, and believing that you're fast enough to pivot and change the solution to something that actually works. Companies I have worked at, like Kraft Heinz and McDonald's, were not really very comfortable with failure at that time, right? I think some companies confuse experimentation with pilots and rollouts. 
When you have a pilot with dates, go-live dates, and stuff like that, you're not experimenting, right? Because experimentation is prone to failure. You cannot have a roadmap for success for something that's prone to failure. So you need to understand the difference between these two things, and I think these companies have been learning that. Amazon is definitely a good example of how this can be successful, and you see the evolution happening. Blending technology with business is also important, like having single-threaded ownership, which is something I consider very important for innovation, and these companies are moving toward that model, from what I know. So that's the evolution, and definitely at Amazon, I think we are at the top of that.

 

Marcus Daniels  04:42

And I think it's really interesting, because you have CPG, QSR, technology, and grocery, that sort of range of categories. I mean, we can geek out a little bit here. Maybe go a bit deeper into some of the unique aspects you've experienced in the operating models between them.

 

Leandro Balbinot  04:58

Yeah, that's a good point, because, honestly, CPG and retail overall normally have this sense of being conservative and really down to earth, because when you deal with customers every day, you have to be careful about what you experiment with and how you experiment. So there is this kind of rejection of failure that's natural, right? We cannot fail in front of the customer. But the operating model is different. Amazon is a different way of approaching retail: approaching it innovation- and technology-first, which is definitely not the case for the other companies, which treat technology first as a necessary evil, and then later as a way of maybe helping them grow, but never really the core of the business. So that's the difference in operating model, for sure. And I still see a lot of CPG and retail companies not embracing technology and innovation as they should, because they're afraid of how much it costs and how much it could damage their operations and brand, right?

 

Ben Yoskovitz  05:58

So you've mentioned failure. Now, I don't know how many times you've said the word failure so far, Leandro, but quite a few times, which is interesting. You're embracing it, which is good. How do you help a company? Maybe Amazon already had the right mindset, and you can speak to that. But how do you help other companies? What advice would you provide to people in organizations where failure is a bad word and they're not embracing it, and therefore, by extension, not embracing experimentation? How do we help companies handle that process and get better at running experiments and innovating by accepting the reality that sometimes they'll fail?

 

Leandro Balbinot  06:41

I think that's a very good question. First of all, failure cannot stand alone. It has to come together with a kind of framework, or way of working, or operating model, right? Meaning that failure means you have to react fast and pivot fast. What we cannot allow, when we accept failure, is to let the failure last for long, where you invest a lot of money and then don't react to it, don't learn from that failure. So these companies have to guarantee that when you have a model like that, you have fast decision making, fast pivoting, and enough data to really decide where to go with that failure and what you learned from it. If you don't have this fast shipping model, where you're showing the customer a different version of your solution fast enough, the failure is going to become just a really expensive failure, and you're not going to learn anything from it, right? So being able to pivot and react really fast is more important than the failure itself. And that's why I believe in models like the single-threaded ownership model, where you create small teams, like two-pizza teams with engineers, some product managers, and people who understand the business, and they're able to really make decisions on their own. They're the owners of their own business, meaning that inside those parameters of investment and time, they can make whatever decision they want, right? And actually move fast, instead of having these committees where you involve six or seven different functions and try to decide what to do with that failure, which is very expensive, as we know, right? So you really need to guarantee these companies can be fast enough to react and pivot when needed during experimentation.

 

Marcus Daniels  08:24

And a lot of this is happening kind of at the core, or near the edge of the core. What is your advice for balancing that need and drive for experimentation with the ability to have operational consistency?

 

Leandro Balbinot  08:37

Yeah, I mean, things like A/B testing, and the ability to really test with a portion of your customers or users, are pretty important. Otherwise, you cannot experiment on your core business fully, right? We have this concept of a one-way door decision versus a two-way door. If you experiment on your whole population of customers or users, it's like a one-way door: honestly, it's very costly to roll back, right? So you need a way of testing on a percentage of your customers and growing in a gradual way. At McDonald's, we were doing that quite well. Every experimentation we did on the digital side was very successful. With the new mobile app and everything we were launching, we were able to go with a very small portion of the population using that app, and pivot in time to avoid really bad changes in the operations. And I think that's really key. Don't test in your core without having a way to roll it back.

 

Ben Yoskovitz  09:40

Right. And how are you prioritizing experiments, Leandro? I mean, you can do 1,000 things, maybe even more than 1,000 things, at any given point in time. So how do you look at prioritization to decide what the best experiments to run are, or, you know, the biggest wins we think we're going to have?

 

Leandro Balbinot  10:01

I think the first thing is, the best experiment is the cheapest and fastest one, right? The one I can put in front of the customer as soon as possible and see how it's working. So that's the first prioritization. The second one is whether you'll be able to acquire enough data to really decide if that's the right solution or not. Sometimes the experiment is great, but I don't have a way to test if it's really effective, so maybe that's not a good experiment to go ahead with, right? So there are a few things that inform how you prioritize: experiments that are faster and cheaper, where it's easy to collect data and decide if it's the right solution or not, and where you can actually test the market fit as well. I think those are the three main parameters to decide.

 

Ben Yoskovitz  10:46

And what would you say, Leandro: can you put a time on fast, or a number on cheap? Are we talking a week? Are we talking six months? I'm sure it varies experiment by experiment, and it varies if it's digital or physical. But how long are these experiments in your mind? How much time do you give something before you say, we don't have enough data, we don't have enough conviction?

 

Leandro Balbinot  11:12

I think if it's purely digital, I would really be concerned if an experiment, after two sprints, cannot generate some kind of response from the customer already, right? That could be two months in some cases, or two weeks; it depends on how quick your sprints, your shipments, are. It has to be really fast, in my view, for digital. For physical, as you know, it depends, right? Sometimes you have to involve hardware creation and stuff like that, which takes time. Or you can do like we were doing in the Kraft Heinz days: sometimes you just emulate it, pretend to have the hardware even if you don't, just to be able to get the response from the customer, right? Fake it. That's also possible, but it has to be as soon as possible, definitely. Any experiment that says the minimal lovable product, or the minimal viable product, is like six months away, I start doubting if that's the right experiment, honestly.

 

Marcus Daniels  12:04

And have you seen any variance between the Amazon and Whole Foods ways of doing things? Has it been a great mesh in how things are done together, just from your perspective?

 

Leandro Balbinot  12:17

I think it's great when you can combine the knowledge of the business, and the way Whole Foods runs the grocery business, with the way Amazon runs innovation and experimentation. Being able to combine these two things is great, because one important prerequisite for good experimentation is understanding the business well. You have to know what you're testing and for what: what's the problem you're solving, right? And between Whole Foods and Amazon, we can combine these two things. We know very well what the problems are at Whole Foods, and from Amazon, we can come with great experimentations to test the solutions for those problems.

 

Marcus Daniels  12:53

Right. I spent a lot of time actually working in the QSR innovation space for Tim Hortons; I did some work there over the years. And I thought there was always an interesting balance between looking at innovation as bets, but also having speed: speed in the sense of how you validate and do experimentation. Have you seen a different level of speed translated into the way Amazon operates, versus, obviously, the QSR side of things?

 

Leandro Balbinot  13:24

Oh, yeah. The idea of shipments every week, or stuff like that, is not really common in QSR, or even CPG, organizations, right? It's hard to get to that, and you have to have the right model in place to attain it. I remember when I joined McDonald's, sometimes a shipment would take six months to a year to happen, and you cannot experiment in that model, right? You have to solve that first. You have to be able to ship every week, or at least every month, to be able to experiment. So there are some prerequisites you have to achieve. But these companies are evolving, and I think they are getting there. When you get there, then you can start properly innovating and experimenting. And the ability to do A/B testing is also key, even for physical. You should be able to test in one store, or at least on one POS in each store, for instance, instead of throwing it at the whole organization and breaking the operations.

 

Ben Yoskovitz  14:18

Yeah, I remember, I remember at McDonald's, and I'm sure they still have test stores, you know. And there was one in Chicago. We never got a chance to go to it, but I really wanted to go to it because it was one of those, you know, testing menu items, testing different, you know, technology inside of the store. But I still think the cycle time is just always a challenge in that physical environment.

 

Leandro Balbinot  14:39

Yeah, we call them seed stores here, which is kind of the same thing. But that's very important, because one part of the fast decision making in innovation is that you should be able to influence metrics and change the way you calculate metrics. And if you have a store where people are measured by those metrics, it's very hard to make changes and decisions, right? So you have to have some flexibility to say, you know what, you should not evaluate that specific seed store the same way, because it's going to be impacted by my experimentations all the time, right? So let's change the way we validate the performance. That's pretty critical, right?

 

Ben Yoskovitz  15:14

Because the experiments can actually affect the performance of the people and the metrics they're trying to hit, and you could be doing something that's having a negative impact on sales or throughput of inventory, or whatever it is, and then the people who are being measured against that are stuck because you're experimenting

 

Leandro Balbinot  15:31

Exactly. And if you are innovating, you have to be able to pivot fast. You cannot be blocked by, oh, I cannot do that, because it's going to come back on people through performance management and remuneration, right?

 

Marcus Daniels  15:44

So a nice balance between meticulously structured and maybe a little bit of focused chaos, you'd say?

 

Leandro Balbinot  15:49

Exactly, yeah.

 

Ben Yoskovitz  15:51

So, Leandro, you mentioned data a couple of times. I know you're a data person because you're a tech person. How do you think about experiments? You mentioned hitting certain metrics. So how do you balance, or what is the balance between, using the data from an experiment and your gut, or maybe not your gut exclusively, but the team's gut, the instincts, versus the data? How are you balancing that?

 

Leandro Balbinot  16:17

I think there are two things. First, I treat gut feeling as data augmentation, in a way, right? There are some patterns you don't find in the data directly, but if you have experience, you may find them just because of your experience. You know what's behind that data, right? That's what I call gut feeling. You have to start with the data; I don't believe in gut feeling starting from scratch. It has to be data augmentation, in my view. So, yeah, many times I don't have enough data to really know what the patterns are, what the trends are, and I cannot know exactly what to do with that experiment. But the gut feeling helps me say, you know what, I've seen that before. I've seen that trend and pattern happen. I think if you do that, you're going to actually achieve it. So it's important as a complement, right? And I also think gut feeling is important to validate whether the metric is correct, because a metric is not reality, right? A metric is a representation of reality. And sometimes we mix them up, and we see people chasing a metric, not the reality, right?

 

Ben Yoskovitz  17:14

Just to hit it, just to hit a certain target, even though the target may not actually achieve what you want to achieve.

 

Leandro Balbinot  17:20

Yeah, we need to separate gut feeling from desperation, right? Because sometimes I have to make it work, because my money is running away. I mean, I have to do this, right? And then it's kind of the tendency of humans to say, you know what, let's be positive about that, even if my gut feeling tells me it's not going the right way, right?

 

Marcus Daniels  17:37

A bit of a wrestling match between gut and the data, right? And I think also, success is not just the ROI, it's also the learning cycles, the learning velocity, I would say.

 

Leandro Balbinot  17:47

Yes, more important, even, sometimes. Yeah, I agree.

 

Ben Yoskovitz  17:50

And how do you think about problem identification? You mentioned starting with data. So you can start with data on the results side of an experiment, and then say, I'm not sure I have enough of it, or I'm not sure it was the right data, and use your instincts to decide whether to keep going with an experiment or not. What about on the problem identification side? Because you said before that you know what the problems are at Whole Foods. Are you using data there to identify problems? And what about gut there, when somebody who's in a store, or working on the digital Whole Foods product or an Amazon product, says, I'm pretty sure the problem is X? How do you start in that problem identification space?

 

Leandro Balbinot  18:33

Yeah, I think it's much more risky to use gut feeling in problem identification, because it's very connected to bias, right? It could be, I think I know what the problem is, because I've seen it before, but without looking at the data. So I prefer to really try to acquire data on the problem identification in some way: surveys, monitoring what's happening in production, or talking to customers directly and asking them. Of course, you can still use some gut feeling and some experience. But I think it's much more risky to use it on problem identification than on the experimentation analysis part, right?

 

Ben Yoskovitz  19:10

So you want real conviction and evidence, let's say, on the problem before you would go and decide to try to solve it through an experiment, and maybe a little bit more gut on the back end of that, to say, well, this experiment, you know, we're not just going to kill it instantly because we didn't hit a certain number.

 

Leandro Balbinot  19:28

Yeah. Having said that, sometimes you have to use gut feeling in problem identification as well, because the problem is so important for you to solve that you have to really, as we call it here at Amazon, disagree and commit, right? We don't have the right data to decide exactly what to do, so let's follow one thesis. The thesis that seems most realistic is that one, so let's go with that thesis and test it. But then I think it's more important than ever to test and get feedback as soon as possible in that case, right?

 

Marcus Daniels  19:55

And I think you mentioned as well that the gut feeling also gets fueled a lot by the business side, and understanding the business context gives you a good sense of when you should move forward. You know, I always love to dive deeper into just how you kill experiments. And specifically, when you think about Amazon and AWS: if they had waited for AWS to be profitable, it would have been a terrible mistake. So I'd love to learn a bit more about your perspective on killing experiments.

 

Leandro Balbinot  20:26

I think in that case, when you start with the customer, focus on the customer, your goal is not necessarily to be profitable, right? Your goal is really to guarantee the customer likes and understands what you're offering, and then buys it. That's the most important thing. Of course, you have to have a path to profitability somehow, and I think the Amazon flywheel is pretty important in that case. You keep evolving to get to a point where profitability becomes part of it, and you can continue to invest and escalate, right? Scale, sorry. And I think the way we do it here is pretty much the right way. Defining the metrics for innovation is pretty important: what you're trying to achieve, what the value is of what you're trying to achieve, and how you can use that to confirm that it's the right solution for that problem. Sometimes you have the wrong metric, and that happens more often than you'd like. Sometimes all the metrics are green, and you say, I'm not sure I should continue this experimentation, even though all the metrics are green. So what's wrong? Probably the metrics are wrong; you're not measuring the things you should measure, right? That's pretty important.

 

Ben Yoskovitz  21:33

Got it. And maybe you can share a little bit of your experience and how you're thinking about incremental versus more growth-oriented or disruptive innovation. There are lots of things, and it could be experimentation on both sides. Maybe the focus is that everything's an experiment, no matter what; maybe that's part of the answer. But how do you think about doing things that are sort of maybe closer to the core, more incremental in scope, versus bigger bets, riskier bets, perhaps things that are a little bit further away from the core business?

 

Leandro Balbinot  22:09

Yeah, as I said before, innovation, for me, is a tactic to actually solve problems, right? Sometimes you don't need innovation to solve a problem, and continuous improvement is very important. Sometimes you just need to do some tweaks on your product to actually get to that improvement. And that's really important, and you should continue to do that. That's really critical. But sometimes the problem requires really thinking out of the box and finding a solution that's different than what exists. That's where innovation kicks in, right? So I think it depends, case by case. For me, it's like what we do in science. I mean, AI is not a solution for every problem, right? AI is just one way of solving a problem, and...

 

Marcus Daniels  22:45

Are we sure it's not? It's a bit early?

 

Leandro Balbinot  22:50

Yeah. I mean, we tend to believe that it can solve everything, right? Maybe one day. But, I mean, it's definitely a great solution for problems, but maybe some problems don't need it; just a simple solution that solves for that. And again, faster and cheaper is always more important, in my view.

 

Marcus Daniels  23:05

And do you think about getting there using different approaches, you know, between building, buying, and partnering? What is your perspective now?

 

Leandro Balbinot  23:13

I believe in time to value, right? Sometimes you need to deliver what the business needs to solve a problem, and sometimes the MLP or the MVP is basically a 3P (third-party) solution that's already on the shelf. Just implement that and start using it. It's not a one-way door; it doesn't mean you have to stick to that 3P solution forever, right? You can use it for now to answer the immediate needs of the business, and in parallel, you build something that's better and differentiates you from the other competitors. Or sometimes, honestly, if it's a commodity thing where it doesn't really matter whether you're different from others, you just keep using the 3P forever, right? So I think it's totally possible.

 

Ben Yoskovitz  23:55

So so Leandro, you've used the term MVP, minimum viable product and MLP, minimum, lovable product a couple times, I'm super curious. Does are you using these terms inside of Amazon? Do you have a clear definition of what you what those terms mean? Is there one of those that you prefer I get into this argument with? I'm not suggesting we're about to get into an argument, but I get into this argument. We might, but I get into this argument all the time with people who say the term MVP is done, it's dead, it's it's nobody understands what it means anymore. So then they went to MLP minimum lovable product, and people were like, just because it's lovable doesn't mean it's viable. And, you know, they're just running around in circles. So maybe just from your perspective, maybe the Amazon perspective as well. What do these terms mean to you? Are you using them inside the organization in a structured way? Yeah,

 

Leandro Balbinot  24:51

use minimal lovable progress hours here. So I'm just trying using MVP just to translate to actually, probably what some other companies do. But I mean, if you. You have a customer focus. And I definitely Amazon is the company from all the comes of work that has more of that definitely like, really, like, focus on customer first. MLP is a necessary thing, right? You really need to build something that's lovable for the customer. The customer have to, even if it's basic, and have the minimum requirements on that. It has to be lovable. Customer really has to like it and really like adopt it. That's, for me, a very important part of this. Otherwise, you cannot get the engagement from the customer. You don't get the data that you need to kind of validate if that's the right thing or not. So that's for me, if you launch something that you don't get to the customer to use it. I mean, what's the point, right? I mean, what's the point of actually launching so are

 

Ben Yoskovitz  25:38

So are you using MLP? Is that part of the Amazon vernacular, where people are saying, let's build an MLP? That's super interesting, okay.

 

Marcus Daniels  25:47

And I think, in a lot of the organizations we work with, everybody has that same level of orientation. But maybe we can go a bit deeper: what does lovable really mean to you?

 

Leandro Balbinot  25:57

Honestly, I never checked whether the way I think about lovable is the way Amazon thinks about lovable; that's my opinion. But I believe very much that it should always be focused on the customer: see what the customer needs and how the customer is going to love what you're launching, right, or shipping. That's what lovable means. In that case, maybe I can have an MVP that barely generates the metrics I wanted to validate. But if the customer is not adopting it, and the audience, the population of customers using it, is so small, I cannot get any statistical reference on what I'm launching, right?

 

Ben Yoskovitz  26:32

You're convinced. I'm still clinging, because I'm old, to the term MVP, because I always felt like the V, for viable, meant viable in the eyes of the customer. That's how I always interpreted it. Not viable like I could make money off it, or viable like it technically worked. That wasn't the point of it. It was always, you know, customer first: validate with them, prove that they want it. That's what an MVP is to me. But I get it.

 

Leandro Balbinot  27:01

Yeah, and that's the risk of that, because some companies could take MVP as: if it compiles and doesn't raise any errors, and I can launch and ship, then that's viable already.

 

Ben Yoskovitz  27:10

That's true. That's true. We built a thing, nobody uses it, but we'll still define that as an MVP. And that's where I think that terminology has gotten messy over the years.

 

Marcus Daniels  27:21

Well, it's pretty much overdone in the startup ecosystem, and then looking into corporate innovation, everybody's grasped, gravitated, I should say, around that term. I'm curious, just thinking about the next area: scaling innovations. You think about Amazon's ability to just move fast and experiment quickly, but then also shift things into commercial environments. Maybe we can talk a bit about that element.

 

Leandro Balbinot  27:48

Yeah, scalability, for me, is part of the experimentation, and you need to always be thinking about the next step, right? You're gonna understand the results you achieved, and then you're gonna decide how to scale. You always have to keep scaling your product, right? Either you pivot, or you scale. You cannot stay where you are. There are only two options, in my view: you pivot, or you start scaling it, and then you start defining pilots and rollouts, right? For me, it's part of the plan. When you've proven that your solution is good enough and it's solving the problems of the customer in a good way, you should start thinking about scaling and rolling out. What you cannot have, and I saw this many times in the past, not at Amazon, is that product that's not good enough to be scalable or rolled out, but not terrible enough to kill, right? This thing keeps dragging on forever, and that's a really bad place to be, right, where it becomes a zombie, yeah?

 

Marcus Daniels  28:47

Or maybe like a half-dead houseplant.

 

Ben Yoskovitz  28:49

Yeah, exactly. So is there, in your mind, a systematic way of thinking about pilots and scale? So you know, if something hits this threshold, we'll take it to two stores, or we'll take it to 10 users, and then, 10 is a small number, from 10 to 100 users. Do you have a systematic approach that you try to consistently follow, or is it really case by case, project by project or experiment by experiment?

 

Leandro Balbinot  29:19

I think it's case by case. It's different for digital versus physical as well. When it's physical, you need to guarantee you have the right supply chain, the right distribution for that, the right ways of delivering what's needed at scale, right? So there are different ways of measuring that and seeing whether you are ready or not. Sometimes the product is great, but you cannot find a national distributor or a national supplier that can provide it to you at scale, and then you cannot go ahead, right? So there are limitations that are different for physical versus digital. For digital, I think it should be easier to identify whether you're ready to scale or not.

 

Ben Yoskovitz  29:50

Do you think of it, as you're scaling something digital or physical, do you still think of that as an experiment? Or do you feel like you've done the experimentation, you've gotten the validation that you need from quantitative and qualitative data, and now you operationalize, and that's a different mindset for scale? Or is it still sort of the same mindset of, let's keep innovating, let's keep experimenting, even while we're scaling?

 

Leandro Balbinot  30:15

I think if you are still trying to identify whether that solution is the right solution for that customer, it's still an experiment. You know, sometimes you even need to scale to experiment better, right? I need 10 stores to really confirm that's the right solution. So that's still experimentation, and you should be able to roll it back, right? It's different from when you say, yeah, that's the right solution, let's now scale and roll out, when you know exactly what the version is and you can start shipping it to all the stores.

 

Marcus Daniels  30:43

Or, Ben, are you trying to really balance out agility and structured execution? Is that what you're kind of hitting at?

 

Ben Yoskovitz  30:51

Yeah, sort of the difference between the somewhat Wild West of, we're just running experiments, we're trying to learn, we're trying to validate something. Okay, we've got it, let's call that the MLP for a moment. We validated this works. Now we have to operationalize it. We have to take it from 100 users to a million users, or one store or two stores to 50 stores. And is the skill set, the organization, the systems that we use to do that different? Or is the mindset sort of the same along the way, which is lots of rapid little experiments to validate that this thing is creating value, and now it's still a bunch of experiments, just bigger experiments at a bigger scale?

 

Leandro Balbinot  31:39

The way I see it, there are not two different change management processes or two different operational processes; there's just one, whether it's for innovation or not. You have to get back to the core and streamline this into your processes, right? So, yeah, if you're ready to roll out, you should use the same structure that you use to roll out anything in a company to roll out an innovation or new products or new solutions. Of course, sometimes you have some new requirements in terms of how to deploy it physically, right? But it should get back to the mainstream of the company. It cannot be a separate team, right? From an operational perspective, you cannot run separate operations for that.

 

Marcus Daniels  32:17

Yeah, from our experience, we see a lot of leaders get pushback. So what are your hacks to get that kind of big rollout support?

 

Leandro Balbinot  32:25

I think earning trust is one of the leadership principles of Amazon, and I believe in that very much. You need to earn trust first, because you're going to have to embrace failure, you're going to have to talk about investments, you're going to talk about changing the way operations run, and that requires trust. That's something you need to acquire early on in the process, by showing that you are measuring properly, you're taking it the right way, you understand the business. That's the way to do it, right? If you don't earn the trust early on, it's very hard to actually achieve whatever you want to achieve in your innovation.

 

Marcus Daniels  32:56

And is the trust coming from the data, or is it also from just the ability to manage those relationships?

 

Leandro Balbinot  33:03

It's data. It's communication. It's writing a document that explains very well how you were thinking, how you want to evolve that, with all the details needed for it to evolve, and what it means in five years' time, how this thing is going to be running, right? So it's really creating trust in the plan itself and in the data that you're using to build the plan.

 

Ben Yoskovitz  33:20

Did you think, Leandro, when you first joined Amazon and Whole Foods, about small wins first as a way of earning trust? If I can just join here and do something, you know, not go swing for the fences on some crazy new project. Try to get some small wins first, demonstrate the ability to execute, deliver, learn the system, the culture and everything else. Is that one of the ways you thought about it?

 

Leandro Balbinot  33:47

Totally. I mean, if you cannot earn trust through small wins and showing that you can evolve quickly, it's very hard to say, oh, trust me, in five years we're gonna get to perfection, right? That doesn't exist. So you really have to be able to show and demonstrate small wins through data.

 

Marcus Daniels  34:07

How has the time horizon changed for you? I mean, you've mentioned five-year kind of horizons twice now. You know, when you're thinking about getting to that point, when you're seeing momentum and still trying to get that buy-in with executives, has that shrunk now? And do you think it'll shrink even more in this new wave of AI being applied?

 

Leandro Balbinot  34:26

I mean, if you have a bold plan, and you know where you want to get, you need to show the horizon, right? Where you're going to get with that. So that's the first start. And of course, you need to show how you're going to be winning and evolving every month, right? So let's say the end game is a three-year plan or a five-year plan. But you have to show what your monthly milestones are to get there.

 

Ben Yoskovitz  34:49

But have the three- or five-year plans ever been accurate?

 

Leandro Balbinot  34:53

They are not accurate by definition. But I think part of it is to show that you understand how to adapt the plan to what you want. The principles and the mandate should not change, right? You have a thesis, and you have to keep trying to prove that thesis, right? You cannot be completely changing everything every month, right? There has to be some rationale for what you're trying to achieve, right? What the goal is.

 

Ben Yoskovitz  35:17

Yeah, I think the parallel for me there is with early-stage startups, more along the lines of what we were doing with you at Kraft Heinz, with Evolve, building net new businesses, where you have to put up sort of three-year or five-year projections, and it's based on nothing but assumptions. But if you don't demonstrate the thought process behind it, it suggests you don't really understand what you're doing. You don't really know the problem, you don't really know the solution, you don't really know the levers you can pull to acquire more users or charge more money. I don't think I've ever seen a three-to-five-year plan from an early-stage company that actually materializes, but it's the thought exercise of doing it that tells me you know what you're doing, or that you actually don't know what you're doing and you're just winging it completely.

 

Leandro Balbinot  36:05

Yeah. I mean, the trust is more important than the plan, in a way, right? You need to earn the trust that you know what you're trying to do, what the plan is, and how to measure whether you're getting there or not. Then you can pivot at any time, right? I mean, to your point, the only five-year plans that materialize completely are the ones that were already bad at the beginning. Okay, so they're gonna fail anyway, and yeah, they materialize.

 

Marcus Daniels  36:32

And I think what you're saying there, too, is that it's a lot of just understanding the thinking; that's how you're building the trust, right? Really articulating the assumptions, adapting them, those things become important. Often what we're seeing is this period of time where folks are wandering a bit, and you can wander for a while, but as long as you can articulate how these experiments are giving you learning cycles and you're applying them, you start building more of that trust to go for those bigger wins. What do you see, even with these experiments and building trust, as the balance between experiments that are going on the offense versus the ones that are more defensive?

 

Leandro Balbinot  37:13

I mean, if I understood correctly what you want to know. First of all, let's come back to the wandering part. When you're deciding on a solution, or a thesis, you have to wander a bit. You have to really talk to people; a whiteboarding session with a bunch of really smart people is really great, and I think it's the way of actually defining potential solutions and theses that you want to test. And sometimes they are really about growth, about winning in the market and conquering market share, and sometimes it's just to defend yourself against something coming from the competitors, something you clearly see is a threat for you, right? You really need to be faster on that, and maybe experiment even faster, because you know there is a clear threat coming at you, right? You have more urgency, definitely, in that case, when it's defensive.

 

Ben Yoskovitz  38:05

Interesting, makes sense. So, Leandro, we haven't talked tech very much. You are a tech person, CTO, technical guy, and you mentioned AI already, so let's talk a little bit about AI. How are you thinking about the use of it, the implementation of it, the experimentation around it? What can you tell us about how you're really thinking about the applications of AI for what you're working on?

 

Leandro Balbinot  38:33

I think there are the obvious use cases, first of all. There are many problems that can easily be solved by AI, and we are doing that a lot here, in many ways: supply chain, customer experience, customer service, internal things like code generation and ticket resolution. All of that is definitely the obvious use case. The challenge, and we are trying to exercise this more and more, is: what are the problems we couldn't solve before that AI can now let me solve? Things that I thought were impossible, where I have this bias that I cannot even try to solve them because there was no solution. Now there are solutions. So it's thinking differently about how to solve problems, resetting that a bit and saying, okay, now I have different tools that I can actually use to solve that problem; let's look at this problem differently, right? I think that's the key differentiation AI can create, more than just the obvious use cases. And I think we are finding this in many ways, in many cases that are pretty interesting and could really solve important problems for our customer.

 

Marcus Daniels  39:35

Right. So would you say that AI is pushing it beyond the core a bit? Thinking about how a lot of the work you've been doing has been on the core, or the edge of the core, and AI, as you describe it, is allowing you to reimagine more areas to pursue?

 

Leandro Balbinot  39:51

Yeah, when you have experience, you are tied to some constraints that you know exist, right? And I think AI breaks those constraints: maybe these constraints don't exist anymore, or they can actually be expanded or reduced, right? What I'm talking about is really leaving the gut feeling aside a bit and trusting the model to generate some solution that may be surprising for you.

 

Ben Yoskovitz  40:16

Right. Any hesitancy? I mean, Amazon's a tech company, so I wouldn't imagine there is, but Whole Foods is not necessarily. Any nervousness around the use of AI, or its implications for resourcing, or other things along those lines?

 

Leandro Balbinot  40:34

I mean, first of all, I believe we should always focus on the customer. And many times you need to have a human in the loop. When we use an AI model solution, we need to guarantee the response, or the solution, is actually supporting and delivering what the customer needs, right? That's definitely the case. So I don't believe it can solve every problem without a human in the loop, in many ways. Our focus is more on guaranteeing we can actually solve the problems of the customer first, and getting more productivity, rather than thinking about eliminating the human in the loop. So really, the focus is on guaranteeing we get the productivity we need, and the customer satisfaction and customer experience that we need. That's more important, and more the focus that we're having in AI, for sure.

 

Marcus Daniels  41:17

So an opportunity to create more value for the customer?

 

Leandro Balbinot  41:21

Exactly, exactly. Yeah.

 

Ben Yoskovitz  41:23

And I like the idea of what you said there, Leandro, around AI maybe unlocking problems, or the solutions to problems, that up to this point we had decided we couldn't solve, or we'd given up on to some extent, or felt like we would just never be able to prioritize because the tech wasn't available to us. And that takes a different way of thinking, perhaps, than the other stuff, which is just efficiency and productivity, super important, and which feels like an obvious fit where AI is plugging in everywhere. Then there's the net new: what if we could do this, and what if we could do that? So does that change a little bit how the teams think about what problems they even want to go chase and prioritize?

 

Leandro Balbinot  42:06

Yes, definitely. I think AI helps in every single step of the process, starting with idea generation. Maybe there'll be some ideas that are generated by AI, not by the team, right? And the team just tailors them and then proves a thesis out of that. So starting with AI in idea generation and going through the whole process, I think you can think about every single step differently now with AI, right?

 

Ben Yoskovitz  42:31

Yeah, makes sense. Let me ask you another tech question, then. How do you decide whether a certain technology is a fad or not, worth the investment or not worth the investment? And the thing that comes to my mind is the metaverse. I'm not saying the metaverse is not worth exploring; there are lots of people doing stuff in the metaverse, and I'm not trying to crap on all the people who believe in it. I just feel like there was a point in time, a couple of years ago, when that's where a lot of investment was going. And then it diminished quite quickly when the use cases didn't become as evident. And then AI emerged, and ChatGPT launched, and that changed a lot of things for people. So how do you think about those R&D kinds of tech bets that you're making, and what has real application and what might not?

 

Leandro Balbinot  43:21

I think it depends on where you are. If you're really innovating completely, like a very new science project that requires a lot of R&D process, that's different from what I'm going to say. My answer is more based on reality and what I need to solve the customer's problems today, right, instead of in five years' time. I think it's very easy to identify whether a technology is a solution for the problems you're having now or not, right? Can this technology really help with that? If yes, how, and how are we going to experiment with it? If I cannot do that, then it's the wrong technology, definitely. And every technology should be considered, right? RFID, for instance, is a technology that's been used for, I don't know, 30 years, and sometimes it comes back to the table. So maybe that's the solution for that problem, right? Sometimes it will be AI, and sometimes it's going to be something brand new. It depends on what the problem is and how you want to solve it.

 

Marcus Daniels  44:16

I was going to say, often it seems like it's just really following the developer communities and what progress they're making, because sometimes it takes a while for the solutions to really catch up to solving the problems for the customer.

 

Leandro Balbinot  44:27

And the scalability, right? I mean, again, even if I can experiment with and test the technology, if I cannot scale it, why would I even be considering it, right? I need to scale it. That definitely should be a prerequisite, right?

 

Ben Yoskovitz  44:38

Well, actually, that brings up a good point. Then, when you are doing early-stage experiments on newer technologies, how quickly do you try to examine the scalability of that technology?

 

Leandro Balbinot  44:51

It has to be really early, because, at least in my reality, it doesn't make any sense to prove out a technology or a solution if I cannot scale it in the next three or four years. Why would I even be looking into it right now? I need to solve the customer's problems today, right? Not tomorrow, not in three years' time. Of course, that's different for R&D teams that are looking for solutions for the future and trying to be more innovative there. That's a different kind of perspective, I would say.

 

Ben Yoskovitz  45:22

Leandro, maybe I'll ask you one last question. In your arena, tech and innovation, is there something that you believe is true that most other people would not believe is true?

 

Leandro Balbinot  45:36

No, I don't think so. I believe very much in some things that are true, but some people, I would say, are not as trusting of these things as I am. Like, I believe very much in experimentation. I believe very much in failing quick, pivoting, and trying to find a different solution for the problem. I believe in the STO model, where we have the ability to make decisions really fast and have ownership of the metrics. It doesn't matter if you're tech or business, you have the same metrics, basically the success metrics for that process or that product. That's what I believe. And some people believe less in that. I think they talk about it, but they don't believe it and they don't apply it. So I like to apply it, because I believe very much in it.

 

Marcus Daniels  46:20

But you love executing, so that makes a lot of sense. Yeah.

 

Ben Yoskovitz  46:23

I was just gonna say, talking about things is a lot easier than applying things.

 

Leandro Balbinot  46:27

Oh yeah, I'm very much in favor of applying things and scaling things and rolling things out.

 

Ben Yoskovitz  46:32

Leandro, thank you for your time. We really appreciate it. It was great to catch up and have a chat. Thank you so much.

 

Leandro Balbinot  46:39

Thanks for the conversation. I appreciate it. Thank you very much.

More from Beyond The Core

Collaboration drives growth.
Conversation drives solutions.

We always enjoy conversations about innovation and startup building so please get in touch.

Let's Talk