
Making sense of data

Ep. 200 | Aug 03
With Neil Hoyne

Too much data, not enough heart

Neil Hoyne is an author, researcher, and marketing executive working at Google as Chief Measurement Strategist. He is also a senior fellow at The Wharton School and an advisor for CapitalG.

Neil specializes in building strong customer relationships by analyzing data like customer lifetime value (CLV) and consumer psychology.

In this informative conversation, Neil and Chris discuss the power and pitfalls of relying on data to make business decisions. They start by defining what “data-driven” means and how awkward current marketing tactics are when you think about it.

It’s easy to get caught up in data analytics, which is why we like Neil’s advice: you don’t have to be the best; you simply need to be better than the alternatives.

If you are confused by your customer data, this episode will help dispel some of that analytical haze.

Episode Transcript

neil:

I'm surprised as to how many companies, how many entrepreneurs, when they're confronted with the problem, it's like, all right, a bear is chasing you. And they're like, "All right, what we need to do is we need to sit here and think about what's the strategy for outrunning the bear." And then they sit there for eight, 12, 16 months. They're like, "We need to collect more data to solve this problem." There's too many companies that are sitting there being like, "We want to be deliberate and think about this." It's like, no, just start running. Just move faster than the other campers and you win.

chris:

I'm excited to talk to my guest today. His name is Neil Hoyne. He's the Chief Measurement Strategist at Google. And he's got degrees from Purdue University and UCLA. He's a senior fellow at Wharton. If that wasn't impressive enough for you, he also wrote this book. It's called Converted: The Data-Driven Way to Win Customers' Hearts. Neil, welcome to the show.

neil:

Yeah. Thank you so much for having me, Chris.

chris:

I just want to dive right into it. I'm skimming through the things that I can pick up, and the first thing that you say is that a real person is on the other side of the transaction. And you say this book will teach you how to win their hearts. Can you explain what that means?

neil:

In the simplest sense, the more we use digital, the more we use technology, the more we become disconnected from the people on the other side. And you see this with people that just rely on dashboards and data. As I like to say, there's a lot of head in there, a lot of thinking and logic, not a lot of heart. And so it's not uncommon to see people, whether they're entrepreneurs or marketers at large companies, looking at dashboards and just being like, "Look how many page views we had." Even on social media, "Look how many likes we have." And you get disconnected from the story of the people behind it. And the greatest problem that we've had with marketing over the past 25 years at least, probably you can make the case 100 years, has been that it's often difficult for businesses to understand or believe whether it's working the way it's supposed to. Would somebody have bought my product anyway if I didn't market? Or if somebody's building a brand, does that do anything different?
And because of it, there's just been that focus entirely on metrics. And this is just taking it a step further to say, look, we're going to look at who those people are, who those relationships are, and how to build something real from them.

chris:

If we understand this ... Okay. So if we're looking at the analytics and we're not understanding the person behind it, what are we supposed to be doing? Because maybe I'm guilty of this myself. I see that something's performing or it's hitting a certain demographic. How do I see the person behind the data?

neil:

For a lot of companies, honestly, it's just giving them a simple focus. Now, I joke with people here that when they look at analytics and data, there's a certain desire for perfection. I want perfect information. I want perfect decision making. I want to reduce risk to zero. The business world doesn't work that way. The business world works on being better than other people. Now, I joke about this with my wife. I'm not sure she appreciates the joke, but I say to her, "Look, I am probably not the best guy in the world. I'm just better than the other people you've had." It's like, "He's the best one I've found so far." And it's very similar with businesses. Look, somebody doesn't have to have a perfect product or a perfect advertising campaign.
They simply need to be better than the other alternatives. So if the alternatives are companies who are looking at you purely from a dollars-and-cents perspective, purely from a transaction side, like, "Did you buy my product? Yes or no?" And you have another company say, "Look, we care about this relationship. We care about the loyalty you have with us." Then all of a sudden they're more attractive. And so when we talk about understanding a little bit more about those people, it's, and this is really what the book covers, a collection of steps, things you can do as a marketer, as a business owner, as an entrepreneur to at least get one step closer to having a real relationship with those people, a real connection, and being able to harvest some of those results even if you're not perfect.

chris:

I can see why your wife would not like to hear that statement from you.

neil:

There were a few jokes. And she'll laugh at it. And I say, "I married up." Eventually she's going to figure it out. There's a certain imposter syndrome, as there is at Google with everyone believing they snuck in. It's sometimes the case in relationships. You're like, "You're really too good for me. When are you going to figure that out?" And thankfully it hasn't come yet, which is fantastic. So I continue, just as I tell her: I try to be 1% better. I try to just be 1% better than everybody else, to be like, well, this works. And that's the same thing for companies.

chris:

Okay, let's continue on that thread. Let's go deeper into the whole relationship thing. And you draw the comparison that this is like dating: you don't want to be too intense or direct and scare them off, and you're trying to court someone. So tell me how ... Because that sounds very human, that idea that this might be a long-term marriage or courtship, and what we all want to build is a long-term relationship with our customers. How are you looking at this through the lens of data and marketing?

neil:

There's a joke early on in the book where I say marketing today is very similar to marketers walking into a bar and proposing to the first person that they see. The original version of the book wasn't proposing. But in this case, proposing to the first person they see. And you think about how awkward that would be if somebody did it. And you would say, well, why would anybody do this? And you'd say, well, look, if that marketer is purely looking at their motivations, their incentives, maybe their goal is to get married that evening. There's really not an incentive to pursue anything else. Because nothing else matters. Nothing else is counted. Just like with marketers: you only care about sales, immediate transactions. It doesn't matter if they love you or if they want to come back and have a longer conversation.
Your boss is going to ask you how many people said yes. And so this is how marketers approach it by and large, is they go out and they propose to everybody, and then they go back at the end of the night and it becomes a game of volume. Did I propose to 100 people, 1,000 people? Eventually, at least in data, 2% or 3% of people, I don't know who they are, will say yes. And then the question is, well, how much of an opportunity do you have to improve if that's your goal, if that's your approach? And you may be able to change what you say, or maybe where you go, but there's only so many levers you can pull on. Now, when we look at this from a very human lens, you're like, this is absurd. This is not the way that people behave.
And marketers are slowly realizing you're right. This is not the way customers buy either. Even simple retail purchases often take three or four interactions on a website, three or four conversations, before they're willing to commit. Yet marketers are still focused on no, I want everyone to commit right now. And then they struggle and they're like, "Well, how come there's not a good relationship there?" And so sometimes it's just a story, because if you go the opposite way and you say ... Here's what marketers are doing today. Marketers are going out and they're trying to get everyone to buy right away. You're like, "Hey, that doesn't sound that bad. Why would that be bad?" And it's like, "Well, what if you thought relationships worked this way?" And you're like, "Wow, that would be awkward." Or I draw another comparison in the book.
If you met somebody and then as soon as you leave that bar, they start calling you up and sending texts to be like, "Hey, did you think about that date? Did you think about that proposal? What about now?" And then they keep following you. You're like, "No, no, no. I'm at my parents' house now." "Hey, what about ..." And that's the same thing we see with online advertising: if people start showing you messages as soon as you leave the website, display advertising, purchase intent, their interest in your products and your brand actually decreases. And again, just like in the real world, the advice at least earlier on was that you generally want to wait two or three days before you call somebody. It's very similar to advertising. If you wait two or three days after your last conversation before you start sending them ads, it actually improves their responsiveness. And so there's a lot we can learn from both sides. It's just being able to put them within that context, so that they make sense and we pull ourselves out of that purely data-driven mindset that we've had for the past decade or two.

chris:

Okay. I think I understand what you're saying. And I'm listening also for our typical audience member, which is either a solopreneur or an entrepreneur with less than five employees. And they vary in their creative practice and execution of this. I'm on board. I hear what you're saying. It sounds to me like people are trying to close too fast. And then they're just looking at the transaction and they're forgetting that there's a human being.

neil:

That's it.

chris:

And I think in today's age, we have so many sophisticated tools to track people and follow them everywhere. We can do email nurture sequences, and we're just bombarding them with stuff. First of all, it's annoying. Second of all, it comes across as a little bit desperate.

neil:

It does.

chris:

I think that neediness repels people. Okay. So I understand the concept. And if we continue to draw from the analogy of how you behave in the real world, if you're really attracted to someone, what do you need to do? And most of us, I guess, aren't looking for just that one-night stand; we're actually looking to build deep, meaningful relationships with people. How do we begin to change this? Because I'm on board now. I get it.

neil:

Yeah. You're good. You're good. It's like, well, it makes a lot of sense when you think about it that way. So the basic tenets of it: number one is exactly what you said. The long-term relationship, the importance of it. The second part of it is also to remember that progress in this area does not mean perfection. It does not mean going out and everybody you talk to loves you. It does not mean that every person you meet, you will have a great long-term, fulfilling relationship with. In fact, when we look at customers across businesses, it doesn't matter the size, it doesn't matter the industry, you always see this separation where a very small portion of customers will contribute a lot of value and a large portion of customers will contribute almost nothing at all. And there's nothing necessarily bad about that.
You need those people in your life. And I compare it to how you will have partners, friends, family members that you couldn't imagine having a life without. And then you'll have the more transactional relationships. Someone comes in at that moment ... Let's say a server at a restaurant, an Uber driver that brings you back from the airport. That interaction was cordial. And there may have been some value exchanged, but you're not going to see them again. You're not going to go to them for advice. And so it's just prioritizing to say, out of the customers you do have, out of the relationships you have, which ones have the most potential? Who are they? And this is where this idea comes from. And again, the name of the metric, I feel, is terrible. I wish they had a better name for it. But what we're really talking about here is looking at a metric like customer lifetime value. Well, what is customer lifetime value?
If you're new to it, it's an individual-level prediction of how valuable every relationship will be to you. That's exactly what it is. The same way that you might be able to look at someone individually and say, "I really connect with this person. I love this person. I'm so glad they're in my life." This is just doing that, except scaling it across hundreds, thousands, in some cases millions of customers, to tell you what those differences are objectively. It doesn't require a lot of data. It simply requires an understanding of previous interactions: how much people spent and when. And then over time, you start to get that picture to say, these are the people I really need to pay attention to and these are the people I need to avoid. And so this gives you an understanding: here's all your customers, here's who you should focus on. And everything that happens after that point builds from that story, which is to say, if there's people you really get along with, then businesses, and we can go into this, start asking themselves why. What makes these people special?
How do they like to engage with me? What products are they interested in? How often do they come back? When? How did I meet these people? And at the same time, who are the people I want to stay away from? Just like we have those interactions in our own lives where we can look at people and be like, I don't really connect with these types of people. They're not my tribe. You start to see the exact same thing in a business. You start to see some people where you say, "Look, they'll only come in when I offer them something at a deep discount. Or if they come in, they waste a lot of my time." And this is just making it a little bit objective. So it's taking that heart part of ours that wants to be human and connect with people, and bridging it with the mind to say, we can use some of this data and just put it into a perspective that's really useful.
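
To make the idea concrete, here is a minimal sketch of the kind of calculation Neil is describing, assuming nothing more than a transaction log. The column names, numbers, and the scoring heuristic are invented for illustration; the real CLV models he points readers to (such as BG/NBD) fit repeat-purchase behavior probabilistically rather than using a simple heuristic.

```python
import pandas as pd

# Toy transaction log: one row per purchase (schema is an assumption).
tx = pd.DataFrame({
    "customer": ["ann", "ann", "bob", "cat", "cat", "cat"],
    "date": pd.to_datetime(["2023-01-05", "2023-03-10", "2023-02-01",
                            "2023-01-20", "2023-02-15", "2023-03-01"]),
    "amount": [40.0, 60.0, 25.0, 80.0, 90.0, 70.0],
})

today = tx["date"].max()

# Classic recency / frequency / monetary summary per customer:
# how much people spent, how often, and when.
rfm = tx.groupby("customer").agg(
    recency_days=("date", lambda d: (today - d.max()).days),
    frequency=("date", "count"),
    monetary=("amount", "mean"),
)

# Naive stand-in for a lifetime-value prediction: frequent, recent,
# high-spend customers score highest. A real CLV model replaces this line.
rfm["future_value_score"] = (
    rfm["frequency"] * rfm["monetary"] / (1 + rfm["recency_days"])
)

# "These are the people I really need to pay attention to."
print(rfm.sort_values("future_value_score", ascending=False))
```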

chris:

Neil, I love what you're saying. Completely resonate with this. I need help in understanding the data part. So if you don't mind, I may ask you some really boring questions.

neil:

Go ahead, go ahead. Nothing's boring.

chris:

I know the company you work at, Google, you guys are the masters of data. And I'm feeling everything you're saying. And then I'm just scratching my head thinking, okay, let's take for example a company like ours, where the way we finance what we do is people buying digital products mostly, or joining our coaching community. So if I were to look at customer lifetime value, I could filter it by who has spent the most. I can also look at who is the most engaged. I could see some of those things. But how do I do this beyond, say, building a story and reverse engineering who these people are and how they came into our ecosystem? Are there tools and things that I can use to find out more, or do I need to ask them, "How did you find out about us?" and do this on a one-to-one level until I have enough data points?

neil:

Let's fill in the missing piece. Great question.

chris:

Thank you.

neil:

So number one is the models that we're talking about here, the way we calculate it ... Because people start thinking you're predicting relationships. This is the future and this requires lots of data and lots of expensive shit. No, it doesn't. And in fact, one of the things that I was deliberate about when writing the book was we dropped that chapter out because we didn't want it to be a requirement. The value is in the strategy, in thinking about the problem, what people naturally do. The models are already established. They've been there for decades. I just needed to connect the dots. And what I ended up with in the book was a really great company down in LA, Retina AI. Their CEO, Emad, fantastic guy. And I said, "Emad, you build these models all day long. I have all these book readers that are going to want to calculate it for their small business and their startup. Can you help them?"
And they were actually pretty generous. And they said, "Hey, yeah. You go through the book website, you click a button, we'll model your data for you. So you don't need to worry about it. We're not going to charge you. We're not going to ask you for credit cards. There's no upsell, there's no subscription." Their value is just in helping people use that data in the way we'll talk about. The models are commonplace; really it's about connecting the dots and showing people. And I joke about this: if you go to Google and type "how to calculate lifetime value," the first three pages are wrong. And they'll run you in circles and you won't see anything out of it. So this is the easiest way. I said, "Look, I'll give you all the math. I'll give you the models if you're that type and you want to see how I'm really building this. I'll give you all the research so you can see it."
But if you're busy and you're like, "I just want to see the numbers. I want to see who I should pay attention to," then we have some of these partnerships, like Retina, where they said, "Hey, we'll just do it for you for free. We're not keeping your data. We're going to delete your data after 48 hours." A genuinely good group of people helping others through that first step. Now, what do you get out of this? Your data goes in, and the data comes out very much like a spreadsheet. In one column, you have everybody's name that you sent. And in the second column, here's how valuable they're going to be in the future. So we know what they did in the past, but does that spending continue?
Well, we'll get that prediction. And so you have these two columns in the spreadsheet, but a lot of people have questions. Is it engagement that matters, or the fact that they engage with us frequently? Or is it the types of products they engage with us on? Or is it how we acquired those customers? These are legitimate questions. Just as we would have in real life: is it people that I met here or there, or do I get along better with people with these hobbies or those hobbies? And so just picture that spreadsheet and add a third column. For some business, it'll be: here's how I was introduced to that person. A fourth column: do they participate in this product? A fifth column: what product did they buy initially? And what companies do is they group their customers and say, how does that behavior change based on where they came from?
And they start to see differences. They start to see, well, these relationships that came from this channel were really valuable; this channel, not as much. Or maybe this campaign. Or we find really high-value customers sometimes buy really cheap products early on. They're just getting acquainted with us. Or, it's not uncommon to see that while we want people to buy right away, our best customers actually want to be deliberate. They want to spend more time on our website learning about our products before they raise their hand and want to be known to us. And so now we get to say we're not just combining all the data. We're starting with a hypothesis. What do we think is a driver of these relationships? And then we add it to the spreadsheet, we group these people together, and we say, does lifetime value differ based on how these people are behaving?
Now, I will tell you, sometimes it doesn't. I work with some companies and they say, "Well, let's take a look at people that are using our mobile app versus not." And they're like, "Well, they're spending the same amount. They're staying around the same amount of time. Nothing's changing." But even that's insightful, to say, look, you have this mobile app, you have a permanent place on their device, and you're not really building a relationship from it. You're simply giving them an alternative channel to converting over the web. And sometimes those are meaningful discussions as well. But this is the way that we start to bring together those dimensions: who you get along with best, how they're behaving with your business, and different actions you could take to improve that value.
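
As a sketch of that "add a column, then group" step (the schema and numbers below are assumptions for illustration, not from the episode): once each customer has a predicted lifetime value, comparing the average by acquisition channel, first product, or any other hypothesis column is a one-line grouping.

```python
import pandas as pd

# Assumed output of a CLV model, one row per customer, plus the
# hypothesis columns you add (how acquired, first product bought).
customers = pd.DataFrame({
    "customer": ["ann", "bob", "cat", "dan", "eve", "fay"],
    "predicted_ltv": [420.0, 35.0, 610.0, 50.0, 380.0, 45.0],
    "channel": ["search", "social", "search", "social", "email", "social"],
    "first_product": ["course", "ebook", "course", "ebook", "course", "ebook"],
})

# One hypothesis at a time: does lifetime value differ by channel?
print(customers.groupby("channel")["predicted_ltv"].agg(["mean", "count"]))

# Same question for the first product customers bought.
print(customers.groupby("first_product")["predicted_ltv"].mean())
```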

chris:

The spreadsheet that you're talking about, with the sample columns like how much money they've spent, engagement, however you want to define that, or the initial products they purchased, et cetera: are those a pretty good starting point for most companies? Or do you have to look at it per business, where for this business those columns aren't going to matter and for that business you need totally different kinds of things?

neil:

I say that the goal is always, as we said, to be better, not perfect. For every column that you add, I say it really starts with two things. If you have a legitimate hypothesis about the data, how the data matters to you, and you have the ability to act on it, then it's a fine data set to bring in. But I say this because oftentimes I work with companies where they say the starting point is to get all of our data together, everything that we have, and clean everything up. And I say, "No." I say, "Give me the data you have available and just tell me that if I see something different, you have a course of action that you can take in your business to adjust." If you know what advertising channel people are coming from, and you have the ability to spend more or less on an advertising channel, or if people are talking to different customer service reps on the phone and you find their lifetime value differs based on that call or the representative they talk to, then that matters. Or if you find people from different regions behave differently and you can move your marketing and your focus to those regions, then great, you can do it.
But if you get into one of those pictures to be like, well, let's see how a conference from two years ago behaved. If you can't go back to that conference, if you can't reach that audience, then what are we looking at this data for? And so where companies start is not even the data at all. It's to sit there with almost a blank sheet of paper and say, "What do you think leads to great relationships in your business? What data would you use to prove that?" And then that's what you bring in and test. And you say, how closely does that align with the hypothesis? Some companies are surprised.

chris:

I love that opening question. You start with a hypothesis. You're looking for some kind of pattern. A prediction. Like we love these kinds of customers. What is common about them? And we start there and we don't have to have all the data points, as you said. Just get going and it can change. It can evolve. You can add more data points until you can start to see some kind of pattern. Right?

neil:

That's exactly it. What I tell people, and this is my favorite one: there's this story of a camper in a forest, and a bear is chasing you. And you think, well, what are you going to do to get away from the bear? And you can't run faster than the bear. And then you realize you simply need to run faster than the other campers. That's the solution. And I'm surprised as to how many companies, how many entrepreneurs, when they're confronted with the problem, it's like, all right, a bear is chasing you. And they're like, "All right, what we need to do is we need to sit here and think about what's the strategy for outrunning the bear." And then they sit there for eight, 12, 16 months. They're like, "We need to collect more data to solve this problem." And it's like, no, no, no. Just start running and you're fine. Especially in this economy, there's too many companies that are sitting there being like, "We want to be deliberate and think about this." It's like, "No. Just move faster than the other campers and you win."
My world is online advertising. Online advertising, as you know, is an auction. You win by having better information, being able to make better decisions than your competitors. You don't win by having perfect information. You just need to be slightly better. And when you break this down, really all we're doing is going one step at a time, asking, who are these valuable people? And then one hypothesis at a time: what makes them different? The products they buy, how they engage.
It allows you to make a slightly better decision very quickly that advances your business and builds trust in the approach you're taking. And then you can keep doing it. As opposed to, I see some of these strategies, like we're going to do machine learning and it's going to take two or three years to implement. That is risk. You can't do anything for two to three years. You have all this revenue you're passing up, hoping for this godsend of a solution to come. And I'm like, no, just take your spreadsheet, add a couple of attributes, and just start doing something to meet slightly better people.

chris:

Can you share an example or two of when somebody starts to look at their data, figures out who their best customers are, who they want to work with and looking at the data, they discover something? And then what they do with that discovery so that they can increase the value of the relationship, the length of the relationship, and ultimately build the best customer for them.

neil:

Well, let's start with that example I mentioned earlier about the mobile app. I have worked with a number of companies where they're convinced, and they're like, "Our mobile app is what's driving deeper loyalty and engagement." And you look at the data and you say, no, these people are spending the same as non-mobile-app users. And then they've been able to go into their app and say, well, what exactly is the strategy of your application? Now, travel is a great case. A couple of years ago, there were a number of travel apps where the entire focus was getting you to book again. They weren't talking about loyalty programs. They didn't have any value-added services. Hotels especially didn't have digital key cards. It was all you book or you don't. And it was just a choice. Well, this app allows me to book faster. And they were pretty surprised when lifetime value was the same. Nobody was really booking more.
But their goals were, well, we want to drive sales. Otherwise, our app doesn't get any credit. And now you give them this lens and you say, "No, if you add this loyalty program information, people aren't booking again right away, but it's bringing them back." And that's not necessarily a bad thing. You see it on other websites. One of my favorite examples of something someone learned, and this is actually some academic research that I talk about in the book, was just around gift giving. Oftentimes, especially during the holiday season, you see that option to give something as a gift. And they found in the data that people who, on their very first purchase, were buying a product as a gift for somebody else ended up having a higher lifetime value themselves for the next 12 months.
I guess it has something to do with, if you're giving somebody a gift, it's a reflection of your taste. These are brands that I love, and you feel more close and connected to that brand, so you spend more yourself. And there have also been cases with the advertising channel customers come in on, where there are differences based on where customers are acquired that don't translate to how much they spend on their first purchase. But they do come back and spend 20, 30, 40% more. And for those cases, what those companies have done is they've simply said, "Look, we're going to spend a little bit more time and emphasis on these channels because we can now prove their longer-term value." As opposed to resetting that clock every time someone comes in and buys once.
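
A hedged sketch of how you might check a pattern like the gift-giving one on your own data (the table and numbers are invented for illustration): flag the first-purchase attribute, then compare the next 12 months of spend across the two cohorts.

```python
import pandas as pd

# Assumed per-customer table: was the first order a gift, and how much
# did the customer spend in the 12 months after that first order?
first_orders = pd.DataFrame({
    "customer": ["ann", "bob", "cat", "dan", "eve", "fay"],
    "first_order_was_gift": [True, False, True, False, False, True],
    "next_12mo_spend": [310.0, 120.0, 280.0, 95.0, 140.0, 330.0],
})

cohorts = first_orders.groupby("first_order_was_gift")["next_12mo_spend"]
print(cohorts.agg(["mean", "count"]))

# Relative difference between gift-givers and everyone else.
means = cohorts.mean()
uplift = means[True] / means[False] - 1
print(f"Gift-giver uplift: {uplift:.0%}")
```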

chris:

You just reminded me of a conversation that I have with my chief operating officer. His name is Ben. And Ben is the master of the data set. And I'm not so much, because the numbers make my head hurt. And he'll come up with some hypothesis based on the data he's seen. And this is where it gets real interesting with us, because the data is the data, but the human interpreting the data creates a narrative, and that supports the strategy that they want to take. And I'll look at the data and say, "I have a completely different conclusion based on the exact same data that you just told me right now." So when data turns into humans interpreting it and then making educated guesses, how do we know which path to take? Or do we do both paths, or do we split test? Do you have any advice for people who look at the data and start to form their own narrative? You must get this a lot.

neil:

I get this a lot, but I understand the nature of the question, and I'll give you a simple example. Let's say I gather everybody from your side, we're all sitting in a room, and I say, "All right, here's what we're going to do. I'm going to give you an opportunity." And let's say it's a market growth opportunity or a new product you could launch. And what I'll do just for fun, to get things started, is give you the overview of the opportunity. So imagine I slide across the table a sheet of bullet points, five or six: here's the opportunity, here's the market size, here's what we may price it at, here's how we build it. And I'm going to ask you, Chris, "What do you think?" And you're going to have your opinion.
I may send it over. You said Ben? I'm going to send it over to Ben. I'm going to be like, "All right, what do you think?" And we do this across a couple hundred people maybe, and we're going to get some people that look at the opportunity and just say, "Look, I love this." And other people that look at this and say, "I'm staying away from this." And generally what everyone will say is, "We have disagreements here. We don't have a conclusion. What we need is more data. We need to understand this. Is this a real opportunity or not?" And that's fair enough. I'm a data guy. I can empathize with people like that. So over the next two months, I'm going to send you more and more data. But this is an experiment, and I love to play around with things. So if you were on that side where you said you love this new product, you think it's awesome, I'm going to give you a whole bunch of data that tells you why it's garbage.
And if you think that the opportunity is terrible, I'm going to give you a whole bunch of data to say this is an opportunity you can't miss out on. And then we're going to get back together after two months and I'm going to say, "How do you feel about this opportunity now?" And nearly everyone will keep their original decision.

chris:

Whoa.

neil:

The only thing that changes is that most people will feel more confident in their original decision because they had time to look at that data. And so when people come to me and they say, "The solution is more data and more analysis," I look at it and I say, "Is it really? Is that really going to change your point of view?" And here's where the answer comes into it a little. These researchers were looking at this experiment and they asked, what could we do to actually change somebody's point of view? As it turns out, it wasn't more data. It was simply reminding people, when they went into the problem, that being human beings, we are biased towards our initial impressions, even if they're not publicly declared, and we have a very difficult time taking in and processing new information.
That simple awareness. I wasn't giving them advanced techniques, like we're going to do meditation in between. No. Just: this is what it is. That made people more open-minded and willing to accept new information and new data. And when they did that, with that simple disclaimer, I think it was then, after that point, that a majority of people changed their decision to match what the data showed them. And so when I look at that, I say those disagreements are very human. It's just important for us to recognize that they exist so that we can make, again, slightly better decisions overall than what we had. And I generally find that's how companies come in: if I go in there and I tell them just to look at the data, the data will help you decide, those disagreements are rampant, because people see that data through different lenses. If I tell them, look, you're going to bring in a whole bunch of biases, then people deliberately try to be more open-minded, and it's easier to find a consensus.

chris:

You just make them aware of how they might be approaching this. It's almost like you saying some of you might be racist, and then you have this discussion like, okay, I'll be a lot more sensitive in listening to this. Right?

neil:

Yeah. And that actually is a known bias in terms of surveys that we talk about briefly in the book: if somebody asks you your age, your gender, where you grew up at the beginning of a survey, it'll actually change the answers that you give. One study, I believe it was done over at Harvard, actually found, I think it was with elementary or middle school children, they gave them a math exam and reminded them of their gender at the beginning. They said that the girls actually performed worse on the test, just because they were anchored to what they believe are these social differences that never otherwise manifest themselves. But we do fall into those identities and those classifications subconsciously. And so that's why you have to be careful to remember ...
That's why they always say to put those questions at the end of a survey: because you don't want to bias people by their peer groups, by their experiences. Those should be left out. But they are very human and they do impact everything. An interesting thing about brands and brand awareness, around eating habits: there have been studies that say, look, if you show people ads for McDonald's, their eating habits will change later in the day. Now, it doesn't necessarily mean that they're going to go to McDonald's. It actually means they found that people will have less healthy eating habits later in the day because they were conditioned by seeing those McDonald's ads. And you start to think, well, how is my brain shifted by it? And the larger macro question in that case was: if simply seeing an advertisement for a particular brand can change how we respond for the rest of our day, what happens when you're driving home and you go past that big fast food restaurant? Does that change the way you behave later in the day?
And those researchers will make the claim that yes, in fact it does.

chris:

Wow. Okay. This part isn't so intuitive. If you didn't tell me this, I wouldn't have even guessed that this is the reality, where the sequence of the questions changes the way people behave, and putting the demographic information at the end will help them answer in a different way. That is just crazy to me. And as you're talking about seeing ads and affecting your behavior: I just flew back from Las Vegas. And as I'm going from security to the terminal where I'm supposed to be, I'm walking past a bunch of fast food places, and all of a sudden it starts to increase my appetite. And I'm going to confess, I did not make the best ... I had fried chicken. I did not make the best choice for my body.

neil:

I eat McDonald's at the airport. Always.

chris:

Is it the only time you eat McDonald's?

neil:

I recently introduced my two- and five-year-olds to Happy Meals. And so a little bit more frequently now, because they get really amped up about it. But with a lot of these things, I think what it is, is just that we like to believe we make objective decisions about our health, about how we answer questions. If you ask somebody, "Well, how do you make a decision about what you're going to eat?", they're going to think about it from a very objective perspective. "Well, I want to take good care of my body. This is what I have a taste for." They don't necessarily think about all the different factors. Even some of the questions reminding people about their financial circumstances.
I think there was one study showing that asking people how much money they have in their bank account, or how much credit card debt they have, will change the way they spend immediately after, whether they become more aggressive or more conservative. Reminding people about when they're going to die, or about other people dying, makes people more aggressive with how they spend, more risk-taking. These things happen all the time. And again, going back even to that original question around data-driven decision making, it's not necessarily that we want to change people and become very robotic about how we make decisions. But oftentimes people will make better decisions in personal life and in business if they're aware of some of the factors that are driving these changes. And it doesn't have to be everything. You don't have to uncover every nuance about human behavior. Again, success with yourself, with your business, is just knowing a little bit more than where the mass market sits.

Greg:

Time for a quick break, but we'll be right back.
Welcome back to our conversation.

chris:

I'd love to get your perspective on this. As I'm listening to you, I'm checking in with myself to say, okay, am I guilty of some of these things? How am I doing this? And so as a designer, as a person who likes some of the science and data, I make a lot of decisions intuitively. I have, I think, a pretty good pulse on what our customers want, because I'm having conversations with them online all the time, nonstop. And so I use that as part of my data points, and they're pretty soft. And then here's the example. We use a program called TubeBuddy. And TubeBuddy says, if you feed me the title of your video on YouTube, we will score it and tell you whether it will do well, and we'll make suggestions using some algorithm, machine learning. I don't know how they do it.
And it kicks out a title that scores much higher than the one that was put in. And I look at it and I'm like, I don't think any human is actually going to type that in. And we've done some experiments with this, because my team will just use that as word from God. Here's the Bible. It says use this. And they put that in, and the videos do not perform. And I give it about two days to collect enough data points; we pretty much know whether a video is or is not going to work within the first 48 hours. And then I go back and I change the title using my intuition, knowing what I know about our audience. I change the title to something that feels more human. And surprise, surprise, 75% of the time it outperforms the original title based on data. Help me. Is this a massive form of cognitive bias?

neil:

No. As you're telling this story, I'm like, I can't wait to see what it comes up with. Like "old data-driven analyst yells at machine learning models" will be the name of this video. So look, I know I have a negative bias on my side towards a lot of machine learning lately, because I've seen too many companies and entrepreneurs say, if I want to raise money, the easiest way is to claim something's using AI or machine learning. And so there's often a difference between what these tools are capable of versus the reality. The other part we look at is very much how human beings can be biased. We like to think computers and data are impartial. Machine learning models are biased based on the data that they've already captured.
And if the data that they've captured is biased in some way, in terms of the age of the data or the types of videos that they're collecting, then you're going to see it. You're also going to see just that propensity. Unless the only data source they were pulling from was top-tier performers, I would almost suggest that they run the risk of being biased towards the average. And where you're seeing success is your intuition saying, look, this is something that's not reflected in your data set that could be better. And there's that joke about machine learning: if you were a machine learning algorithm and all your friends jumped off a cliff, would you jump too? And the machine learning algorithm goes, we'll all go. That's what everybody's doing. That's what I will do.
And so there has to be that purposeful effort. People often ask me, "Well, what's the standard for machine learning or AI in your work?" And where I generally go with it is to say, oftentimes it's impossible for me to understand everything that goes into the models. I carry a healthy amount of skepticism with them. But I look at them as being almost like a friend or advisor that comes over. It'd be like, Chris, if you reach out to me and you're like, "Hey, we're naming our YouTube videos. What do you think we should do?" I'd say, "Well, I think you should do this, this and this." And you go out and test it and you come back and you're like, "Hey Neil, your recommendations kind of suck."
It's like, all right, well, don't listen to me anymore. But you need to have that level, as opposed to being, "Hey, Neil has a really good background, so I'm just going to trust everything that he does right away." And even if I have really good recommendations, let's just say for a moment that I'm so popular that I'm advising 500 people, many of them competitors, and I'm giving them all great advice. Well, now the status quo has risen, and you're not going to see any benefits. So again, your question is, how do I run a little bit faster than that bear? How do I run faster than the other campers? Which is to say, I need something better. And so what I'm seeing here is, look, intuition can work well. It can inspire machine learning, and AI can help scale that. It can help improve it. It can learn how you behave. So the interesting algorithm for me would be, I'd say, "Chris, if you're really good at naming YouTube videos, I would take your YouTube naming and I would build a machine learning model out of that. Because that dataset seems to be performing better than whatever this other program's using."
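
As a rough illustration of that last suggestion, under assumptions (the titles, labels, and model choice below are invented for the sketch, not Neil's method or TubeBuddy's): given a history of your own titles labeled by how the videos performed, a simple text classifier can score new candidates against your own naming style rather than a generic corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Assumed training data: your past titles, labeled 1 if the video
# beat your channel's median views in its first 48 hours.
titles = [
    "How I price design work without losing clients",
    "Top 10 SEO keyword hacks for 2022",
    "The uncomfortable truth about charging more",
    "Best free tools list ranked and reviewed",
]
performed_well = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: deliberately simple,
# trained only on *your* naming history.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(titles, performed_well)

# Score candidate titles before publishing.
candidates = ["Why your best clients want to wait",
              "Ultimate 2023 growth hack checklist"]
for title, p in zip(candidates, model.predict_proba(candidates)[:, 1]):
    print(f"{p:.2f}  {title}")
```

In practice you would want far more than four labeled titles; the point is only that the training data comes from the naming instinct that already outperforms the generic scorer.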

chris:

I love your Silicon Valley nerdy humor. If machine learning bots are running off the cliff, would you?

neil:

The book that I wrote originally started with me just telling stories of failures, because I don't think there's enough of them. But I just remember one time I was meeting with this company. They didn't turn into a big one, but you know how they have those direct-to-consumer mattress companies that send you mattresses in a box?

chris:

Yep.

neil:

And I sat down with them and I'm like, "All right, let me understand your business model here. Mattresses, you stuff them in a box, you drop ship them. So you're a mattress company." And they're like, "No, no, no. We're a sleep technology company." "What does that mean?" And they're like, "Oh, well, we have a mobile app that connects to your ..." I'm like, I don't need this. I need a comfortable bed. But if you want to make money, if you want to get investors, you need to have that big vision. That comes with the Valley.
There was another time, there was a payroll company. It was digitized payroll for small businesses. And I asked, "Well, what's your business model?" They're like, "We are the connection between people and possibilities in their life." "What does that mean?" And they're like, "Well, we are the enabler of people getting access to capital, therefore allowing them to ..." I'm like, "No. You write checks to people and you pay taxes. Let's just really simplify the concept." And the same thing happens across the board with machine learning, where it's like, look at all these things we can do. And I don't say it because I want to tear machine learning down.
I just worry that we enter into an era where people make bad decisions because they see what other companies are doing and they say, "I need to position my company in this way. I need to use machine learning in this way. I need to name my videos in this way." And they lose that ability. I don't call it intuition as much as I call it critical thinking to challenge and to say this data is not all encompassing. There are other questions. There's ways you need to break beyond. And that's not a bad thing. That's not saying you're not data driven. It's just saying you're taking in multiple perspectives from different people and you're building systems to objectively judge who you should listen to.

chris:

Okay. My older brother, his name is Arthur. If he were listening to this, he'd start scratching his skin right now, because I was recently having a conversation with him and he said companies in the future who employ machine learning will have an unfair competitive advantage. And he points out Amazon and Netflix as two who are using machine learning. And so he told me, think about how you can use machine learning. And you were saying, if you want a Silicon Valley investor, just mention AI or machine learning in your business model and you'll get some money. So I need to rethink how I'm talking about the company these days.
So machine learning can be helpful. And I love the way that you describe it where if I have an instinct on how to write titles for videos that perform well, we should build a model around the decisions that I'm making, what I'm looking at so that it can scale that process. And I love that explanation. There is a time and place for machine learning and you have some skepticism towards it. Can you point to us another use case where machine learning can actually be very beneficial for a business?

neil:

I would actually agree with your brother. I think that if you have a critical lens towards it, and you're able to look at it objectively ... Look, Google as a company uses a lot of machine learning and automation today. These are the keywords you should advertise on. This is the target. This is how you should set the bids. They're using vast amounts of data and incredible engineers to build those recommendations. And I've looked at the underlying performance data. It works really well. And therefore I have a lot of trust in those models. Self-driving cars, I think, are getting there. Even basic things: recommendation engines being able to go through millions of products and make recommendations. Now, I know Netflix at one time, I don't know if this is still true, was using machine learning to recommend hit shows.
I don't know if they're still doing this. I end up watching dating shows sometimes on Netflix with my wife, and I'm not sure. I look at that and I'm like, what algorithm pumped this out that said, all right, we're going to get that dude from 98 Degrees and a bunch of unhappy couples, and we're going to smash them together in a house? I'm like, how'd you get this? What dataset brought this in? And what I say is that what we should look for with it, and I think this is where the agreement is, is that just because someone's using machine learning or AI, it shouldn't suddenly be a gold stamp of approval to say this technology is so awesome and so advanced that it's perfect, that we should use it, and that it's going to outperform our existing techniques.
We have to validate it. And that's where I look at it. When I see companies where what they'll do, and this is often the pitch, and this is where I draw a line, is they'll say, "We're using machine learning." I say, "Can you explain it to me?" "No, it's proprietary. It's black box." "Well, how can I trust the results of it?" "Well, you just have to ..." "Can I test it? Can I evaluate it?" If I don't have those criteria in place, it's simply, "No, come jump off the cliff with us. Trust me, this parachute will work." Then I challenge it. And I think what you've seen here in this scenario that you have is to say, look, I tried a product. Machine learning had a lot of potential. That's why I looked at this tool to see what it would recommend.
And my results have been the opposite. It's not working, at least for my business. And the question is to say, because I believe this technology has potential, how would I build it? What would I use differently? What would I do to change it? And that's the nice part: we are far from this being an established and mature area, so these hypotheses very much could play out. You could end up building it, Chris. This could be your idea. You'll come back in two years and be like, "I sold it for a billion dollars. A better YouTube video renamer." And I challenge almost all companies with that. I say, don't trust this technology blindly, but have faith that it will work. It will play a large role in what we're doing. Just make sure you're critical of it, because that will drive the advancement more than blind investment and blind hope.

chris:

I want to be clear here, in case I'm offending anybody, potential sponsors of the show maybe: for my team, whose job is mostly to edit the videos and do some motion graphics to make the videos as engaging as possible, they have limited amounts of data, because what they do is they'll cut the video, they'll upload it, and they'll call it whatever the algorithm tells them to call it. And the algorithm's going to outperform their first or second attempt at titling it, because they have fairly finite data points to look at. Whereas I'm on the opposite end, and I'll use my data intelligence, I don't know what you call it in the industry, but I'm watching videos all the time from other people and myself, and talking to people and seeing ... Because they literally tell us in the comments, you should call the video this. And if you read enough of those, you start to see a pattern. Because they literally tell you.

neil:

Well, we'll call that unstructured data to make everybody happy.

chris:

Okay. Thank you.

neil:

It's like, I have experiences, I have comments. I didn't put it in a table. Maybe it's just unstructured. It's still valid.

chris:

Right. So basically it can outperform depending on your data set and how much interest and attention you're paying to a particular problem. But if you're using a lot of unstructured data and you're able to intuitively shape it, then you might outperform what the machine can tell you to do. But this leads me to a different question.

neil:

Go ahead.

chris:

I think it's human nature to resist change. We want data as long as it supports the narrative of doing what it is that we do. And so we're all walking around with degrees of bias, some more, some less. And so when the machine or the data or the data analyst tells us we should do X versus Y, and that's different from what we wanted to do, the first thing we're going to do is try to poke holes in the data, to say it doesn't work. And what we probably should do is just stop for a second. We should have some skepticism, but we should try to act on the data and the recommendations. And if we're a little reluctant, how long do we run a test before we know whether it's working or not? Because I think the length of time and our commitment to it also determine whether it can be successful, so that we don't just revert back to previous patterns of behavior and decision making.

neil:

When we talk about experiments, let's at least talk broadly about design, and that can include the length, the number of tests that we run, the sample size, and so on. And what I generally do is try to look at the entire set of options. So a lot of the tests we run, we will run geo-based tests, which take six to eight months for us to run. And I had a result one time from a large retailer, and I brought it over to some of the professors I work with over at Penn. And I said, "Hey, what do you think about it?" I was proud of it. This was a difficult test and we had to run it. And I was like, "What do you think about it? Could this integrate into a paper somehow, into research?"
And they laughed at me and they're like, "We really wouldn't accept that test." I was like, "What do you mean? It took us months to do this." And they're like, "Generally we need to run a dozen or so tests, three or four years' worth of work. We need to be able to show it has repeatability." And I said, "Well, that's great, but I could not tell any company that they have to wait three to four years before I can find an answer." But what it reveals is that there are different types of methodologies based on the burden of proof that you're putting on somebody to validate something. So in this case, you could say, look, if you want a complete, perfect answer, it will take three to four years, and the cost is three to four years of experimentation.
Plus waiting for this to work. In other cases, there are easier methodologies: you could do something in six to eight months, or you could do a pre-post test in one to two. And the discussions I really have within the organization are cultural. I say, how much proof do we need before we make a decision? In which case you have to consider: how big of a decision are we making? Can we reverse course if we need to? And how much evidence do we need to see as a group that something works? And I think that last one is key. I think oftentimes where disagreements arise in companies is that they don't have firm criteria for how they're going to decide whether something's working. Experiments are rarely black and white. There's always some gray area somebody's going to bring up, like, "Hey, this happened in the market during our test," or, "Maybe this test wasn't large enough," or, "The result has some error on both sides."
And so what I generally say, for everybody around the table, is let's get agreement here. What is the minimum amount of data we need until we're comfortable with the decision we're going to take, knowing that decision will carry risk? In other cases, they say, look, this is a million-dollar thing. We need to be certain about what we're doing. And they make different decisions, and there's no right or wrong answer to it. You'd be surprised how much better companies are able to make decisions when, before that test is run, they decide what the burden of proof is, how they're going to establish it as an organization, and what they're going to do based on the results of that test. Because if they can't get past that point, it doesn't matter running a test. If you say, well, we're just going to try this out, and nobody agrees that if this test shows us this, we're going to buy this product, no one does it. They just hope, well, we'll look at the data and then we'll make a decision as to how we feel. And that I think is the worst case, because then you're just going entirely based on intuition.
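
A minimal sketch of that pre-agreed burden of proof, with all thresholds and numbers assumed for illustration: commit to a minimum sample size and an evidence level before the test runs, and the decision becomes mechanical once the data arrives.

```python
from math import sqrt
from statistics import NormalDist

# Pre-registered criteria, agreed on BEFORE the test runs.
MIN_SAMPLE_PER_ARM = 1000
ALPHA = 0.05  # required evidence level

# Observed results after the test (assumed numbers).
conv_a, n_a = 52, 1200   # control: conversions, visitors
conv_b, n_b = 78, 1180   # variant

def two_proportion_p_value(x1, n1, x2, n2):
    """One-sided z-test that the variant's rate exceeds the control's."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    return 1 - NormalDist().cdf(z)

if min(n_a, n_b) < MIN_SAMPLE_PER_ARM:
    print("Keep running: agreed sample size not reached yet.")
else:
    p = two_proportion_p_value(conv_a, n_a, conv_b, n_b)
    decision = "ship the variant" if p < ALPHA else "keep the control"
    print(f"p = {p:.4f} -> {decision}")
```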

chris:

I really love the way that you frame questions. I guess you know a thing or two about this. Starting out beforehand, getting consensus, so that when we see the data, we can commit to it because it's fulfilled the requirements we set for ourselves. It's like you're designing the perimeter, how far we can explore, the edges, the boundaries. That way we can look at the data and come back and say we don't have sufficient evidence yet. But once we do have it, then we need to be brave enough, I think, as entrepreneurs, to act on it and give it time to see if it works.

neil:

That's all it is. That's all. But the reason I bring it up is because, and I'll put this delicately, a lot of companies don't establish those criteria. And because of it, they may put irrational constraints on what they're trying to do. Oftentimes, just getting people to ask, "Look, how are we making a decision on this data?" forces companies to confront the fact that they don't know. They don't have any criteria. They don't have any consensus. And that is sometimes a more valuable problem to tackle than whatever the experiment is focused on: we have no criteria in this organization for how we make a decision, so why are we running all these tests? And you do it in advance, while it's still an abstract problem, by asking, look, how much evidence do we need before we take this action?
And I tell people, sometimes we work in the academic research space where I can give you 60,000 pages that say this is the right thing to do for your business. And companies say, "Well, we're going to test it first." So I say, "So you need 60,001 pages of data. That's the threshold." And it's like, "Yeah, you're right. That doesn't make a lot of sense." But until you objectively get out there and say, "This is the criterion; you need this amount of evidence to go do something," they don't realize how much they're slowing down their organization and their decision making.

chris:

I like the way you framed that question to make them rethink their needs for more testing. Well done.

neil:

Thanks.

chris:

There are two more questions I need to ask you before we run out of time. The first one is about questions, actually, because you say in the book that you're going to teach us how to ask the right questions so that we can anticipate our customers' needs. And you seem to be very good at asking the right questions. Is there anything else you can shed light on in terms of how we can discover what our customers' needs are?

neil:

I'll go to a slightly deeper level. The impetus behind that section was to say that oftentimes people treat data collection as a question of what's available. Do we have this metric? Do we have that data? They remove themselves from the possibility that they can simply ask customers. And that process seems to have the most friction in a lot of organizations, especially retailers. We want to ask our customers more things: How did you get to our site? What do you think? And the difficulty companies have in doing that is just incredible. "Oh, we don't want to add more friction to the checkout process," or "This is our annual survey." And what happens is that curiosity dies. Or, if anything, it's marginally replaced by asking, "Well, do we have a data set that says how they found us? Well, we have some rough referral data in our ..." No. Ask more questions. Be curious. Rotate those questions.
It doesn't have to be part of a large survey. If you want to know when to ask questions, ask on the thank-you page. It turns out the height of trust with most consumers is right after they give you money. Ask them a question instead of just saying, "Here's your order number." Rotate those questions, learn from them, be curious. That's really what that chapter is about. If you and I are sitting here having a conversation, and you're just talking and I don't ask you a single thing, I'm just listening, that's not engaging. Whereas if I'm sitting there saying, "No, wait, what are you saying? Explain that a little bit more," that's a conversation. Have that conversation with your customers. The only condition I put on it, as we do with experiments, is: know what you're going to do with the answer.
So don't just ask, "What do you think of my ads? What do you think of my products?" No. Know how those answers are going to change your future behavior. "Do you want to go skiing sometime?" "Why? Are we going skiing?" "No, I was just asking." That's a ridiculous question. Instead: How much are you spending with competitors? What was the feature that really brought you here? How long did you think about these products? All of those things let that curiosity carry forward and give you new opportunities you might otherwise miss.
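Neil's thank-you-page advice is simple enough to sketch as well. Everything below is hypothetical: the question pool, the acts_on labels, and the record_answer stub are invented. What the code encodes is his one condition, that every question is tied in advance to the action its answer would change.

```python
# A minimal sketch of rotating one post-purchase question per order.
import zlib

QUESTION_POOL = [
    {"id": "referral",   "text": "How did you first hear about us?",
     "acts_on": "reallocate channel budget"},
    {"id": "competitor", "text": "Which other brands did you consider?",
     "acts_on": "sharpen positioning and pricing"},
    {"id": "feature",    "text": "Which feature made you buy today?",
     "acts_on": "lead the next campaign with that feature"},
]

def pick_question(order_id: str) -> dict:
    # Deterministic rotation: each order gets exactly one lightweight
    # question, and the pool gets covered across orders without
    # lengthening any single checkout.
    return QUESTION_POOL[zlib.crc32(order_id.encode()) % len(QUESTION_POOL)]

def record_answer(order_id: str, question_id: str, answer: str) -> None:
    # Stub: in practice this would write to a CRM or analytics store.
    print(f"{order_id}: {question_id} -> {answer!r}")

q = pick_question("order-1042")
print(q["text"])                   # shown on the thank-you page
record_answer("order-1042", q["id"], "a friend recommended you")
```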

chris:

Those are some great questions. Super actionable. Now I have to call Ben and say, "Ben, are we asking these questions on the thank-you page? Because we need more data." That was awesome. Thank you.

neil:

The time you do it.

chris:

Okay. This is the last question here.

neil:

Sure.

chris:

You touched on this briefly at the beginning of our conversation, about social media. A lot of people who listen to this podcast are very excited about trying to grow their influence and authority on social media. They have expertise, but they can't reach an audience, and we have limited insights from the different social platforms. So if I'm trying to grow my presence and my authority on, say, something like Instagram, how can I use data to understand what's working and what's not, so that I can make content that's going to be more attractive to my audience?

neil:

So if you thought your brother was going to write you a note about machine learning, here comes the answer that gets everybody writing notes to me.

chris:

We're all ... Okay, we're ready. And Neil's email address is ... Okay.

neil:

Here's the way I approach it: social media is attractive because it gives us rules for conversation, and it gives us feedback. If you and I have a conversation, or if I deliver a message to a thousand people, say at a conference, and there's no feedback mechanism, maybe I can't see them, maybe I can't hear from them, there's no survey, then I don't know how I did. Social media overcomes that by rating your conversation. If you do well, they promote it. You get more likes. People are effectively asking, "Do you comply with the rules of the community?" So when I look at success on social media from that lens, what it really comes down to is discovering how closely you can conform to the rules and best practices of that particular group. What's successful on Twitter will not necessarily be successful on LinkedIn.
And when people look at viral content, it's content that conforms to all of the rules of what the community expects. So from that lens, when people ask, "Well, how do I become more successful on social media?", what they're really asking is how closely they can comply with the rules the platform puts out, almost prioritizing those over their own brand and ambitions. The way you present yourself, the way you present your products, the way you write copy all have to comply. And I don't really think that's an interesting problem to solve, because when I look at it, I say, well, say you have a million people following you. You see these articles come up every now and then about influencers with all these followers, and then nobody takes action on what they said.
I think there was one famous musician, I won't name names, who had something like 20 or 30 million followers and then wrote a book. And only about 15,000 of those followers actually bought a copy, despite all the promotion. And I say, is that really worthwhile? Let's tie it back to the larger thing we're talking about, and we see how all of this is connected. If you have a CRM database of a million emails and you send out an email and nobody replies, nobody buys your product, how do you feel? Do you feel great because you sent out a million emails, or do you feel terrible because you have a million emails and nobody cares about the message you're sending? So I go back to that question. Oftentimes social media rewards quantity more than quality.
I think it's a rough proxy, and I think we get distracted by it. My challenge would be to look at success on social media the way I do with any other channel. If you put out a message, if you put out a request, do people respond to it, beyond just tapping arbitrary buttons that say they're engaging? And what a lot of people find, and this is the discouraging part about social media, is that when they do put out these requests, they say, "Hey, I have 50,000 followers. I say buy this thing, listen to this thing," and they get a very low conversion rate. And it's like, well, look at all these people, and nobody's interacting, nobody actually bought the thing I put out there. And I say that's fine.
But what that reveals is that there's a different set of rules. There's one set of rules for volume and activity, and another set of rules for good relationships. Those rules are a little harder to discover, but I say that's the game you actually want to play. So if I'm in the influencer game, I don't care how many people follow me. I care how many people act. And so that's what I prioritize. Learn about the people who take action, who click on something, who go to a website. Ask them questions about why they did, and then figure out what content relates to them. I'll give you one quick example. There's a case study of Electronic Arts, the video game company, where they put out 16 different display ads promoting a mobile game.
And the one that brought the most downloads, the most volume, was the one that was least attractive to the people who would actually spend money in the game. When you think about mobile gaming, 98% of revenue will come from 0.2% of the customer base. So for them, the success of their work came down to customers who would barely register if they only looked at what drove downloads. I say the same with social media. Whatever is impactful for likes or retweets or whatever metric you're looking at is likely going to be different from what moves the people who are going to support your enterprise, who are going to spend, buy, and want to be connected to your brand. That's the end of my TED Talk there. Social media from a guy that doesn't tweet.
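Neil's Electronic Arts example reduces to a one-screen comparison. The numbers below are invented for illustration (the case study's actual figures aren't given here); the sketch only shows how the volume winner and the value winner can be different ads once revenue, not downloads, is the yardstick.

```python
# A minimal sketch: rank creatives by the value of the customers they
# attract, not by raw volume. All figures are hypothetical.
ads = [
    # (ad_id, downloads, revenue attributed to those downloads)
    ("ad_A", 120_000,  9_000),   # volume winner, few spenders
    ("ad_B",  35_000, 61_000),   # fewer installs, attracts the spenders
    ("ad_C",  80_000, 22_000),
]

by_volume = max(ads, key=lambda a: a[1])
by_value  = max(ads, key=lambda a: a[2])

print(f"Most downloads: {by_volume[0]}")  # what the dashboard celebrates
print(f"Most revenue:   {by_value[0]}")   # what actually pays the bills
for ad_id, downloads, revenue in ads:
    print(f"{ad_id}: ${revenue / downloads:.2f} revenue per download")
```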

chris:

All right, team. We will edit that part and we'll separate that. And we'll include Neil's email address at the bottom so that you-

neil:

If you have hate mail, go ahead and send it to me.

chris:

If you have bones to pick with him.

neil:

Dear Neil.

chris:

Okay. I was nervous listening to that, like, oh my God. Because I'm at like 800,000 followers on Instagram. How do you engage an audience?

neil:

No, no. You know the rules. And the thing is, look, if you have 800,000 followers and you put out a message saying, "I have a new product," and they buy your product, and there's that exchange of value, then I say you've found the best of both worlds. But all too often, it's that "if you build it, they will come" mentality: I'm going to capture a whole bunch of customers, and then they will see how amazing I am, and they will love me. Look at the daily deal space, where restaurants said, "I'm going to give away my products at 80% off, and all these customers will come in and give me their email addresses." And then none of them came back. So I always look at that connection. It's not simply the volume, how many people raise their hand; it's how many people are willing to follow you to the next step. That, I look at as success. And those rules are harder to identify, because you need somebody to take action, to buy something, as opposed to clicking a little button to share. But I think they're all the more important for anybody pursuing that channel.

chris:

That could be the beginning of a whole other conversation you and I could have, but as it is, we're out of time. My guest today has been Neil Hoyne. He's written the book Converted. If you've enjoyed this conversation as much as I have, you definitely need to go out and pick it up. Neil, thanks very much for jumping on the podcast today.

neil:

Hey Chris, my pleasure. Thank you for having me.
My name is Neil Hoyne and you're listening to The Futur.

Greg:

Thanks for joining us this time. If you haven't already, subscribe to our show on your favorite podcasting app and get a new insightful episode from us every week. The Futur podcast is hosted by Chris Do and produced by me, Greg Gunn. Thank you to Anthony Barro for editing and mixing this episode. And thank you to Adam Sanborne for our intro music.
If you enjoyed this episode, then do us a favor by rating and reviewing our show on Apple Podcasts. It'll help us grow the show and make future episodes that much better. Have a question for Chris or me? Head over to thefutur.com/heychris and ask away. We read every submission and we just might answer yours in a later episode. If you'd like to support the show and invest in yourself while you're at it, visit thefutur.com. You'll find video courses, digital products, and a bunch of helpful resources about design and creative business. Thanks again for listening and we'll see you next time.
