How Hex found product-market fit: Barry McCardel on data collaboration


(upbeat instrumental music) - Welcome to the Startup Field Guide, where we learn from successful founders of unicorn startups how their companies truly found product-market fit. I'm your host, Sandhya Hegde, and today we'll be diving into the story of Hex. Launched in 2019, Hex is a modern collaborative platform for data teams. Hex has over 500 paying customers, primarily data scientists and analytics engineers who really love this product. They can use SQL, Python, R, or even just drag and drop some templates.

They can do everything from building simple dashboards to complex applications on top of their data that anyone else can interact with and build on. The love from this user community is so strong that despite Hex not technically being a unicorn yet, I just had to get Barry McCardel on this podcast to ask him everything about how his team has built Hex. Barry, welcome to Field Guide. - Thank you, I'm honored to be a pre-unicorn exception.

Thank you for having me. - Well, it's 2023, not 2021, so. - That's right. (both talking)

Generous these days. (both laughing) - The bar is now that you do have 500 paying customers, not what silly VCs like us have valued the company at. - Yeah, yeah, all right, cool.

I'm glad to be here, so. - All right, so Barry, going all the way back to 2011, you started your career officially as an Excel jockey at PwC. What was that like? And help us connect the dots from there to you saying, "Okay, I'm going to start this company called Hex."

How did it happen? What was the original insight behind Hex? - So I had no idea what I wanted to do in undergrad. I was at Northwestern. There were a lot of things to explore and try. I actually spent most of my time producing concerts and speakers. So not a super clear through line, but I did get really involved in, I took this class in social network analysis. This was like 2009, 2010.

So social networks as a concept, and the idea of that being a thing to analyze in network science, were really taking off. - Right. - And it was really cool. And I actually wound up spending time in a research lab there called SONIC, where we did a bunch of, no one really called it data science at the time, that hadn't gotten super in vogue, but basically data science on network data. And so these could be networks like you think of a social network, like a Facebook, or we studied even things like research collaboration networks and healthcare networks.

And so there was a lot of interesting stuff there. I just kind of got my eyes opened to this whole world of data stuff. I had always been kind of a nerd and it was really cool. I was starting to write some stuff in R, I was doing a bunch of stuff in spreadsheets. We had built some of our own software in this lab.

I still had no idea what I wanted to do after college. And consulting sounded pretty cool 'cause as far as I understood it, it was like, "Hey, you get to go in, learn a bunch about these businesses and dive into data and try to generate some insights." And so I went to a small firm that got bought by PwC, and I wound up working on a lot of airline projects actually, randomly. It was really cool. I went really deep in how airlines work, and there's all sorts of interesting data stuff, from network scheduling and airport planning to wifi pricing for airplanes. And I was always the person on my teams that wanted to dive into the data, and I love building a nice slide deck, but I was always the Excel guy 'cause I loved it.

And I was setting up little Access databases, illicit Access databases on PC towers I purloined and set up at our client site. I was going really deep on that, and I was building these whole data apps basically in Excel, with dropdowns and buttons and VBA. It was the real dark arts of Excel. And I was shipping releases of these models. I was like, this is version 12.1 of the wifi pricing model, and distributing it on thumb drives, you know.

It was very cute kind of looking back, and I-- - I mean, Excel is still like the best no-code development tool ever. - Yeah, it's great. - It's so powerful and amazing. - The first blog post on our site actually is this piece called Long Live Code.

And it's kind of a little meditation on why Excel is so popular, and I think it gets right this thing that we try to get right as well, what I like to say is a low floor and a high ceiling: you can go into Excel and just build a shopping list. You don't have to do anything technical really, but you can also build these insane data apps like I was building, with calculations and regressions and all this. It's very accessible, and then you can just ramp and ramp and ramp in the same UI.

And so, fast-forwarding a little bit, I had a friend from college who was working at Palantir, and I was describing what I did and he was like, "Hey, you're doing admirable work in Excel, but we're a company where that's like all we do, and we build software for this." And I found that really appealing. So I spent about five years there. I got to work on and solve all sorts of interesting problems using data, working on the very cutting edge of data at a very interesting time.

Big data and data science were becoming very hot and in vogue. And I met a bunch of wonderful people, including both of my co-founders. I left, went to a healthcare startup in New York, and I was puzzled that despite having a pretty modern data stack, with a data warehouse and dbt and a BI tool, our data team was still doing a lot of work in one-off SQL scratch pads and-- - Right. - Jupyter Notebooks

floating around and scripts, and a lot of stuff winding back up in spreadsheets. It was kind of funny to be back building data apps in Excel. I was like, whoa, we're still doing this. And so I actually started this journey of Hex as a buyer. I was shopping for something. I was like, certainly someone has built this thing.

Certainly this exists. I had been in the Palantir bubble for a long time, where we kind of just built our own stuff and weren't really aware of what was on the market, and I couldn't find it. And so I was going around to all these friends on data teams and I was like, "Hey, what do y'all use to solve this problem?" And no one had anything good, and most people were like, "Hey, this is a problem for us too, but when you find something, let us know." - Right. - And so it took a while to turn the corner from being a buyer to a builder.

But I got together with Glen and Caitlin, who were two people I loved working with at Palantir, and we kind of took the leap 'cause we were like, "Hey, if no one's solving this, maybe we are the people to do it." And so that was the very end of 2019. So we're about three years and four months or something into that journey now. - Makes sense. And I'm curious, when you said, "Who is solving this problem?" how would you articulate the heart of the problem? Because one thing I see is, given how crowded the analytics ecosystem of vendors is, often when I meet a new founder starting up in the space, I have to tell them, "No, first you have to start really small. I'm going to ask you what the big vision is, but you have to start really small and do one thing incredibly well if you want a chance at standing out." Otherwise, no matter how good your product is, there's just so much noise in the ecosystem that it's going to get-- - Totally.

Yeah, it's really tough. - Enveloped in it. So what was that one amazing thing you really wanted Hex to nail when you started? - We had a lot of stuff in mind, and I think you're touching on something very important. We had a big vision, we had a lot that we knew we could do, but trying to stay really focused early was hard, though it was worth it. We focused on one thing early on, which was sharing and communicating work. We had this sense that one of the most painful things for a data scientist, who might be doing their work in something like a Jupyter Notebook or a Python notebook or scripts, was actually taking that and turning it into something other people could interact with.

And there were some things out there, some open source projects that were like, "Hey, you can wire up a UI and publish a thing." But nothing we felt was as easy and intuitive as we felt should exist. And so the first version of Hex was actually very, very light on the editing front. In fact, the first thing we built was basically the ability to take a Python notebook you already had, drag and drop it in, do some basic parameterization of it, like adding input parameters, and then publish it. And it turns out that even just by itself was a quite appealing thing that delivered a quantum of value for some early people. - Right.

- And we actually had some reluctance, I think, to let our vision and focus blow up too much. We were open even at the time to the idea that maybe that is the four corners of what we do, maybe there is just this sharing layer. But as we thought about the market, what we realized was that that sort of last mile was probably not big enough to fulfill the depth of the vision we had, much less build a venture-scale business.

- Right. - And actually there are some analogs in other markets. If you look at the design market, there was a bunch of companies for a while that were sort of like, "Well, we're gonna be the last mile. You're gonna do your work in Sketch, but we're the last mile thing that you use to share and communicate your mockups." And those were real companies. Like Abstract, Wake I think was one. There were others. - InVision.

- Yeah, InVision. And their lunches, breakfasts, and dinners all got eaten by Figma, which was the place where people wanted to spend their time. And so we felt like that last mile was a little precarious, even though we did have some conviction that that might be the place to start. And so it all sounds very rational and linear right now.

There was a lot of angst for a few months of, "Wow, where do we start?" And I think you do have to pick a beachhead thing that you're like, "Oh, we're gonna be good at this," and then provide a quantum of value and get into an iteration loop with customers. I think that's actually the most important thing early on. We call it commitment engineering loops. - Right. - Getting into this give-get cycle with early users and customers is the thing early on, I've come to believe. - Got it.

Could you share some numbers around it? One thing I'm curious about is, how long after you started the company did you have something in the hands of an end user trying it on their own? Just the kind of sharing and collaboration functionality. - We started full-time in December 2019, and then I remember we had this trip to New York about three months later, so it was the end of February or something, and we had built a first prototype. We'd gone just heads-down and built this thing out.

And I remember this moment, it was so surreal. We went and visited one of our early design partner customers, just friends I had known on a data team at a company in New York, who we had convinced to try this thing out. And we sat down in this conference room with them to get their feedback on the prototype. And they were like, "Yeah, it worked."

And I was like, "What do you mean?" They're like, "Yeah, yeah, it worked." They showed me, they demoed what they were doing. They had feature requests, and I was like, "What, it worked?" We tried to hide our surprise 'cause we had given-- and that was a fun moment, I think. I remember getting in the elevator and walking back out onto Spring Street. I kind of looked at Caitlin and Glen, and we looked at each other, and we were like, "Do we have users? Like, what?" So that was kind of a funny moment. - That's great. - And so that was three or four months. - To your point, getting that out three months in is super valuable, right? Because it just accelerates all the other features and ideas you want to work on for the rest of the year.

- I mentioned this before, and I can expand on it a little bit, but there's this concept called commitment engineering. It's not something I invented; some really smart people I used to work with said this all the time, but I'm becoming an evangelist for it now. Really early on, founders will often be like, okay, I've got my startup idea and I've got my product prototype, and I'm gonna go try to sell it now and convince people of this thing. I think early on the most important thing to do is be the person who really understands the problem, and try to find the people who feel that problem really acutely. And then you get into what I call this commitment engineering loop, where the first thing you're asking for is not, will you buy my software? With these first people, the first thing you're asking for is, "Hey, it sounds like you have this pain point.

Would you take a half-hour and do a call and just tell me about it?" - Right. - And if they say yes, great, you've started something. If they say no, well, this is clearly not a big enough pain point for them to take a half-hour and talk to you, and that tells you something. It's early signal even before you have a single line of code written. And then once you do have some code written, it's like, "Hey, will you take a half-hour and click around in a prototype with me?" And then you're testing that.

Are they willing to give their time? Are they willing to make a commitment to you that's non-monetary, where they're giving you something 'cause you're giving them something, right? It's a give-get loop. And you can ride that all the way through. You can be like, "Hey, great, well, if I came back with another version of this in three weeks that addressed feedback x, y, z, would you do a 45-minute session? Would you invite your boss or a colleague? Would you use this for real for a day?" You're just testing the whole way up, all the way up through, "Will you sign this contract?" You can validate your way up. And I think getting into those loops early, getting those feedback loops, and just finding a way to validate "Am I on the right track?" is so important, versus what you do see some founders do, which is a very common mistake: spend six months just building in a hole, and then you get out there and start validating by trying to sell it. Well, if you got some foundational assumptions wrong, you don't wanna find that out that far in. - Yeah, and the other thing I wanna point out to our listeners, especially about your approach, is you're not validating the idea by asking if someone likes it.

You are validating it by asking for more commitment, right? It's really-- - That's right, yes. - It's really easy to have this really eager founder showing you their baby. It's really easy to be like, "Yeah, I have that problem." - Looks cool. - Looks great. - I could see us using it maybe in a few years.

Yeah, yeah, right, and-- - Will you actually connect it to your data now? - Oh no, no, we've got a lot going on, not right now. - And it tells you something, right? - Yeah, exactly. So I love the focus on commitment as the validation, not the words being used in response to your demo, right? - Yeah, you have happy ears early, right? You wanna hear that this idea, this thing, is gonna take the world over, right? - Yeah, yeah, you have to remember that you are biased and someone else is trying to be nice to you. - Yeah, it's eternal, and I've had so many moments in my career, even at Hex, where that age-old wisdom of "you're the easiest one to fool" comes in. You can convince yourself of almost anything, and real intellectual honesty matters, even if it's painful in the moment when you hear that the thing you've just spent a bunch of time on is not it.

Like-- - Right, right. - It's important. It's hard, though. It's really hard. - Yeah, I'm curious, what was the pattern in early adopters who leaned in versus those who leaned out? I'm assuming not everybody was like, "Yeah, I'm going to try this right now."

Was there any underlying pattern that helped you crystallize who the early-adopter customer is? Who is that persona? What are we looking for when we reach out to people? - Totally. So data is a really big space. - Right. - And we discovered something I think is intuitive enough to say, but it took us some time to really crystallize it. The problem we were solving initially, our early value prop, was about this sharing and communication thing. And there were basically two classes of people we would talk to. We'd talk about the pain point we were solving, like sharing and fragmentation and sending around PDFs and screenshots of reports and all that.

And there were people who were nodding. I would describe the problem and I'd just be getting head nods. And then there were people who were like, "Yeah, I don't know, I don't really share." And it was weird.

The thing we discovered is, there are really two classes of people. There are people doing basically analytics, which, even if they call themselves data scientists, I think analytics is a big part. - Right. - It's this big part of the data world, which is basically you're trying to ask and answer questions to influence a decision. And then there were people doing what I would broadly characterize as ML engineering, which is, I'm iterating on a model that is gonna run to make a prediction somewhere. - Right. - And the former has a lot of problems around sharing and communication, 'cause you're trying to influence a decision.

For the latter, it was not a big part of their workflow, and they were asking us for very different things. And so I think early on we got confused, honestly, 'cause we weren't thinking clearly enough about this, which is kind of embarrassing 'cause we had so much experience in this space. We were jumbling those things up, taking feedback from both and putting 'em in the same hopper, where the people in the latter group were asking us for basically a completely different set of things. - Right. - And it took us, I think, a while to get comfortable just saying no, or ignoring and putting blinders on. And I think as a founder, again, you have happy ears, and you want every discovery call you do to be a great discovery call.

- Right. - And the person on the other side being like, "Yeah, I can't wait to use this." And I think as a founder you have to get really comfortable, in the first few minutes, if it's not clicking for the person on the other side, just being like, "Hey, you know what, I wanna respect your time." - Right.

- "I'll circle back if we go down this path." That's really powerful. I actually had a sales call earlier this week with a very large company that would be a great prospect for us as a logo, where literally in the first 10 minutes I was asking them a bunch of questions and then I was like, "Okay, it sounds like this is not right for you." And we wrapped up the call and we all moved on with our days. You gotta get comfy with that early on, having that focus.

- Right. Yeah, makes sense. I think one thing I really took away from what you were saying is that even though these two different people might have the same exact title of data scientist, from the outside looking in, you'd think, yes, this is the same customer persona, yada yada, right? - Well, data science is such a funny thing. This is an aside, but there's a lot of regret around data science even becoming a title, 'cause there are a lot of jokes around this, right? Like, data scientists are just data analysts who do the same thing in Python. And it's a little reductive.

'Cause I think a good data scientist is bringing a level of statistical rigor and insight into what they're doing. They're typically thinking about things like experimentation, sample sizes, forecasting. There are deeper techniques to bring to bear, and certainly many data scientists are doing that. But if you kind of survey data scientists, I think the majority are effectively doing analytics.

They're-- - Right. - They're asking and answering questions to influence a decision, versus, I am training a deep learning model. - Right.

- That we're gonna deploy behind an API endpoint to serve online predictions. That is a very different discipline, even if theoretically they're both querying data and charts are generated along the way. - Makes sense. So, going back to Feb 2020, which was also a momentous month in the history of the-- - Yeah, it was. - But-- - I didn't realize it would be my last trip to New York for a while, or trip anywhere for a while.

- But we'll get to that point later. So walk me through the rest of 2020. At what point did you say, okay, you know what, we no longer want people to upload notebooks into Hex, we want all the work to happen in Hex? What was that transition, and how many end users or customers did you have at the time when you started going deeper into the product roadmap? - Well, just a handful. We were reluctant to blow up our scope too much.

- Right. - I think the abundance of ideas we had actually made us a little paranoid, where we were like, "There's all this stuff we can do!" 'Cause remember, we'd spent like five years together at Palantir, and Palantir has built a huge number of different analytics and data surface areas. So we had seen plenty of data points in terms of all the things one could build, and Hex is not a reflection of any one thing at Palantir.

But we had a sense that there's a ton one could do in this space, and that focus is important. In fact, that was a lesson I took away from Palantir: focus is really important, 'cause there were moments where we did or didn't have that. It was, though, very clear to us when we were talking to these early customers. I remember one conversation in particular where all the feedback was on the editing experience, and we were like, "Well, would you ever just wanna keep editing this in Jupyter?" And the guy was like, "No, Jupyter sucks." And I was like, "Well, that's a little strong.

Tell me more about that, though." Because I was a big Jupyter fan for a long time. I still am. But he got into it, telling us about his whole workflow, and it was like, "Oh, okay, I can see from this workflow why you were looking for something else." We had created

a good experience for him on the last mile, and he wanted that goodness permeating further upstream. And so basically all the feedback we were getting was around that. And I had been a user of dev notebooks for a long time, probably 10 years into using some form of IPython or Python notebook by now. And I think we had always had this feeling of not wanting to build a notebook company, 'cause I had always felt like notebooks weren't a market.

And I still don't. I think notebooks aren't a job to be done or a problem to be solved. They're a format. It'd be like saying, I'm building a text editor. You're like, what kind of text are you editing? And so if we were gonna take this on, we wanted to make sure we were being focused and disciplined about what we thought this could look like. And we had a really clear picture of what an actually much better end-to-end workflow for these things could be.

It was just a lot to bite off. And I think it took us a little bit to get confident. There was a moment, I remember, this feeling of: we didn't choose this life, this chose us. The pull was there, we have all this knowledge, we have all this experience. All right, I guess we have to do this. And I'm obviously glad we did, 'cause I think we've been able to bring a lot to it.

But it took a lot of pull from users to almost drag us there. We weren't pushing it. And I think that's what made it feel authentic and ultimately built our confidence. - And what did the customer profile look like in your first year? Was it mostly smaller teams, smaller startups? Did you have larger companies already? - Yeah, all smaller companies. Maybe this is intuitive to everyone, but I think it's perhaps a little underappreciated: in data, being able to actually connect to your customer's data is just everything.

It's like the sun and the stars and the moon. You can get a little ways being like, "I'll just upload a CSV," but you are not gonna-- - Right. - You're not gonna get to any meaningful revenue on the back of CSV uploads. And so we had known that. In fact, we had worked in the most paranoid data environments in the world at Palantir, where getting access to customer data was often a matter of years of trust building.

That was very important to us early on. So we just focused on early startups and tech companies that are typically a little more permissive in that regard, where there weren't a lot of baroque processes. Now it's very different. We are working with large companies. We have a lot of public companies as customers. And we have built out a lot, whether it's SOC 2, private VPC deployments, or SSH data connections.

We built a ton around that. But early on, if we had to go and build all of that stuff just to iterate toward product-market fit, we would not be where we are. - Makes sense. So, two follow-up questions there. One, you started the company after the warehouse became strategic, right? Especially with companies like Snowflake, people were trying to standardize on this idea of, "Okay, we want all of our data unified in a warehouse so that we can figure out what to do with it later." Was that a tailwind for Hex? How did you experience that? - It's the tailwind for Hex.

- And two, I would say, given that that is something small startups don't necessarily do, right? A 10-person startup might not have done the warehouse, unified-data thing yet. So I'm curious about that tension, because you're talking to small companies but you wanna leverage this warehouse tailwind. How did that play out? - Yeah, well, first I think that that is the tailwind for Hex. I think this modern data stack, where you're bringing your data together, you have the Snowflakes of the world, the BigQuerys of the world, you've got dbt and Fivetran. When I started my career in data, let's say 10 years ago or longer, the idea of a single place to go find a bunch of clean, ready-for-analysis data about your business was preposterous. When I was in consulting in airlines, the way I got data delivered to me was, you literally go down the hall on the 13th floor to find Tom, who's got access to the MySQL thing, who can pull you an extract once a week in a CSV that I could then work from.

And then at Palantir, we had all these customers who'd have 18 different data lakes and a lot of different databases. And this was really before columnar cloud data warehouses were a thing. Fast-forward to now, and it's not a solved problem for every company in the world, but even really big Fortune 500 businesses, we talk to 'em and they're like, "Yep, we've got cloud data warehouses. We're starting to adopt modern ETL tools. Yep, got a warehouse with these analytics tables."

- Right. - And now the question is, "Well, what do we do with that?" - Right. - And how do we empower people, and how do we make that easier? The last few years, the story's been around data infrastructure. I think there are now all these new expressions of what you can go and do with that. Really small startups, yeah, they don't typically have that. And a lot of them, when they're using Hex, are maybe using Python to connect to an API or something to pull some data in.

But our sweet spot even early on was what I would call scaling tech companies, companies of 50 to 500 people, was what we used to talk about. And companies of those sizes are standing up data warehouses. And again, 10 years ago, the idea of a 50-person company having an enterprise data warehouse was like, "What, are you racking and stacking servers for Teradata?" Now, yeah, you just go and sign up for Snowflake and set up a Fivetran connector and all the data's there. It is just orders of magnitude simpler for people to get their data in that type of pattern. - So maybe pivoting a little bit to team building.

So you pretty much launched your MVP at the same time the world was going into lockdown. - Also raised a seed round when everyone thought the economy was going away, which was a very weird time. - Very, very short window, though.

- I chose the short window of panic to raise our first round. So that was an interesting experience. - What's been your approach to building a team? You have hired some people that, you know, I love very much. So I'm curious, how have you thought about building a team, and especially building a good, connected culture, while a lot of us have been remote and distributed? - Yeah, it was interesting.

There's a funny story. Our first employee started March 6th or 9th or something like that of 2020. And we had just moved into this new office in San Francisco. We were so excited. We had this office space, we had two rooms.

That was really nice. And he came in, and I had set up a monitor, and he's got his laptop. We were like, "Our first employee's here, yay!" And we had one day in the office with him, and then we were like, "Oh, we should probably work from home."

So, yeah, all of the early hiring, I think up through 15 people or more actually, was done fully remote and distributed. We wound up with people all over the US, and I think we've done a pretty good job on this, but I don't think I have a silver bullet.

In terms of a connected culture, we try to get together a bunch, and in general, especially when you're remote and you worry about permeating best practices, when I'm at my best I try to be really mindful and make sure I'm taking the time to highlight examples of what good looks like. I talk to some founders and they're like, "Oh, I've got all these people and it's tough to feel like we're all in the same culture. I know what I'm gonna do.

I'm gonna write a values manifesto and I'm gonna send it around." And it's like, "Okay." We have that, too. We have our handbook. It's public on the website. You can go check it out.

We have values in there. But I find that the most effective single thing you can do to calibrate a culture, and it's probably true in person or remote, but especially as you get bigger, when you do have people all over the place, is regular shout-outs from you, the founder, for what good looks like. Just earlier this week, we had a part of our product that was long neglected.

It was never the most urgent thing, but it was always a source of shame. It's like, this is not good. And one of our designers went and just redesigned the whole thing. She took the time to do it, and it's beautiful now. And I shouted that out to the company in Slack, in our product channel, because I wanted other people to see it. I wanted to give her props 'cause it was sweet, but the real motivation was, "Hey, I want other people to see that and be like, 'That's what good looks like here.'"

- Right. - "That's what I should aspire to, too." And when I'm at my best, I try to do that a lot. I find that to be probably the one most important thing.

- Yeah, it comes back to storytelling and the stories that motivate us and make sense out of what's happening in the world. I've always told every founder I've worked with, "You'll be surprised how powerful a tool storytelling is for building your business and how much difference that makes versus all the other things you're going to focus on, because it gives you leverage across every single activity you are going to do." - I think that's really, really insightful. I think storytelling is probably most of the job as a founder. If you really think about what you're doing, you're telling candidates stories, telling customers stories, you're telling investors stories, you're telling your team stories. And that sounds like stories like, fantasy stories. - Right, right.

- Like no, you're telling a narrative of the way you see the world, and different founders will do different amounts of extrapolation into the future, but-- - Right. - Yeah, I think that's very insightful. I think that's right. - So speaking of the future, I would suspect that once you have built this Hex product, you have a data team that has built a bunch of applications and dashboards. They have done all their work in SQL or Python, but now they have built a surface area that's accessible to the rest of the company. How are you seeing the Hex user profile change? You have tens of thousands of users, 500 paying customers.

What does the future of Hex look like in terms of the people who are a part of this community that you are betting on two, three years from now, versus today? - Yeah, I think the problem that we hear really consistently from customers is around fragmentation. I think people wind up doing work in a bunch of different places. Most of these tools aren't built for collaboration.

Most of them have no sense of organization or governance or permanence. So you do some really great work and then you leave the company, and it's like, "Where did that go?" And communication's really broken: a lot of insights live in charts and screenshots of charts and PDFs of decks in an email somewhere from three years ago. It's just not great. That is a big part of the story and the pain that we hear from our customers.

And so when you think about that, that cuts across a lot of different workflows and users and use cases. And so we were talking earlier about focus in the early days. I think it's actually just equally important now and something I try to put a lot of energy into of, there's a ton of directions that people pull Hex in. We have people build all sorts of crazy stuff.

Like on Twitter, I saw someone the other day built this 20 questions app using OpenAI's GPT-3 API. You can build all sorts of stuff, and that's awesome because it's a flexible tool. People will take it in all these cool directions. There's a pleasure and a joy and a pride that we take in building powerful tools for creative people. I think it's really fun to build those types of products. On the other hand, it's very stressful 'cause you wind up getting a lot of different types of feedback.

- Right. - We were talking earlier about different personas giving you different feedback. That's a problem today, too. - Right. And I think I've tried to get really good at both making decisions on what we're gonna focus on, and then making sure even our salespeople know that. When they get a request from a customer, I would rather have them tell 'em, "Nope, we are not focused on that," than, "Oh yeah, that's on the roadmap," or whatever people wind up saying.

- Right. - So for us, one of the biggest changes in the profile is the type of people who use it is different over time. Early on in a customer, we'll see the main people using Hex are the people with data in their job title. - Right. - Data scientists, data analysts, analytics engineers, people on the data team. And then they share it out with other people, and you wind up with this effect of, you have PMs and engineers and... In one case, I even talked to someone who's just on the sales team who uses Hex. They know how to write some SQL, and it's actually the best place to write SQL.

That is really gratifying, I think, to see that spread, and it's very cool. Again, it does cause tension sometimes, because the requests you're gonna get from an ML engineer are different than the requests you're gonna get from a product manager when they're using the product. - Right. - And knowing who to focus on, and how to have a UX and a UI that is gonna do a good job being that low floor, high ceiling, is a very interesting challenge.

I think it's a worthy challenge. It's something we enjoy. But I think if you're building in the data and analytics space, especially if you're building tools that have a lot of flexibility, you're gonna have that same challenge.

And so founders I think really have to be disciplined on saying no, and as their team grows, also making sure that you're communicating that down through the org 'cause you want everyone on the same page. - I think this is a great example of how I think particularly for enterprise software, since you have a more complex customer journey, product-market fit is not actually a milestone. It's, you hit it and then you have to keep it and grow it and make it stronger. And suddenly you have to think about okay, different parts of your product for different parts of the market. Especially if you're doing collaboration software, I think, it's really a constant work in progress. Not a milestone you hit and in a-- - In a space moving as fast as data, too.

- Yeah. - What would've had PMF five years ago might not today. - Right. - And as a founder, I think one of the hard parts is you're always trying to build the next feature, close the next deal. But you also have to hold in your head a sense of how things are changing.

With everything happening with AI right now, there's even, I think, a new strata on that, which is, I think just a bunch of foundational assumptions are changing as well. - Yeah, I'll take that as an invitation to put you on the spot about your AI strategy. - Sure, yes. - I was trying to hold back, but yeah, I think it's obviously fascinating.

The most fascinating thing for me is actually the code generation, right? Text-to-Python, text-to-SQL. Of all the things this technology does well, that seems to be a particular and unexpected strength. And so I'm curious how you're thinking about the future of Hex in that context. - Yeah, so that's a great question. Our number one mission is empowering people working with data. And we think that people are gonna stay involved in that for a while.

I believe, we believe, that data work is fundamentally creative. I know you don't think of a data scientist when you think of a creative; you might think of an artist or musician or whatever. But if you think about a lot of data work, it is creative. You're forming hypotheses, you're exploring ideas, you're telling stories, you're building beautiful charts, you're taking some risks.

It's a creative and stimulating endeavor. It also can be so frustrating and tedious, 'cause you're tracing down a missing parenthesis-- - Right. - Or fixing your Python dependencies. And I see AI in every domain as a chance to allow humans to focus on that creative, engaging, stimulating work that humans are uniquely capable of, and to partner with AI to abstract away a lot of that tedium. If you look at the way a software engineer uses GitHub Copilot, we know a lot of engineers on our team use Copilot.

When you talk to 'em about it, it's like, yeah, well, I don't have to worry about a lot of the BS that I had to before. - Right. - It takes care of boilerplate for me.

I am thinking at a higher level more consistently, and that lets me move faster and explore things in more depth. We see that exact same thing with people using our AI features, which we call Hex Magic: a set of magic tools that lets you generate, edit, debug, and explain code right in your workflow. It's built directly into the cells where you're writing SQL or Python. Very importantly, we are not building some black box thing where a business stakeholder is gonna roll up and be like, "Tell me an answer," and it's gonna spit out a perfectly formatted chart with explanations and built-out data pipelines behind it. In data especially, I think it can be dangerous to do that, because there are correct answers to things, and there are already enough problems with organizations getting to different answers for the same question.

I think you wanna make sure that if you're building something AI-assisted, at least right now, you're keeping humans in the loop. So we actually have a rule as we're building these features: we don't run the code. If you generate something in Hex, you hit the run button. And that's because we really see it as like, hey, this is here to help you iterate. And I think it's really interesting, too, to see the way people wind up using it. It's similar to other types of generative AI workflows.

If you've used ChatGPT, or if you've spent any time with an image generator, things like Midjourney or Stable Diffusion, you wind up working in this really iterative way. - Right. - It's not a zero-shot, one-shot thing where you go, "What's the answer to this question?" The users who really find success with this in our product, and the way I wind up using it when I use it, is I'll ask a basic thing.

I'll be like, "Number of customers broken down by tier, okay, add revenue." - Right. "Add this, build a chart, factor this code out."

It's an iterative process where you're partnering with it. That, I think, is what-- getting that UX right is really one of the most important things to building an AI-driven product. It is not hard, I promise you, it's not hard to build a quick demo, and there's like a million of them now in the data space of, ask a question and it will generate a SQL query.

Like half the YC batch was that, I think. But it's much harder to figure out, "Okay, what's the right UX for this?" And then also have the data to know how to prompt these things accurately. Being able to take in context the rest of the project, or the database schemas, or past queries, or past code, or past completions people have accepted or rejected. Getting all that right is also the real hard part. The models themselves are becoming commodities; it's how you package that up and build the right prompts and the right UX to allow humans to partner with this that I think will matter in this next generation of productivity tools. - Yeah, and how do we create a future that's not an even worse governance nightmare than we already have today, right? - That's right.

- This could easily become our version of news misinformation. Okay, here's the image in my mind, right? There's a head of sales and a head of marketing, both pointing to their ChatGPT chart saying, "No, no, no." - And they prompted it differently. The salesperson's saying, "Why is the bottleneck top of funnel?" and the marketing person's saying, "Why is the bottleneck sales execution?" Yeah, you could probably tell a story for either of those things.

So that's why I think the data team-- - The sales team has not converted all my amazing MQLs - Yeah, (laughs), tell me why the VP of Sales-- Yeah, yeah, that's right. (Sandhya laughing) - Also-- - That is why I'm a believer in the data team and the continued value of data teams. Even if you can have a bot that can generate some valid SQL, that assumes then that data teams are SQL, just SQL monkeys, and I just don't think that's what they're there for.

And I think that will actually become clearer in the same way that developers, they're not there just to bang out TypeScript. They're there to be creative and think and architect something. So it's a very similar thing I think we're gonna see in every domain. - Makes sense. Maybe let's wrap up on a quick last question. What would be your advice to new founders in 2023 thinking about building data startups? What would be your-- - Don't (laughs).

- You're not allowed to say that. You can't just say don't. - I think that we are at a point in data where you should assume there's exactly zero pure greenfield.

- Right. I've talked to some founders and they're hunting around for the patch of grass that no one's ever stood on. - Right. - And it's funny, 'cause even with successful companies, the youngins these days don't even realize there were generations of data warehouse companies before Snowflake. There were generations of ETL companies before, like dbt and Fivetran. There are some old, old ones. Does anyone remember Ab Initio? There are no new ideas, I think, and that's okay.

I think one thing you learn is that ideas are cheap. Execution rules the day. And so I would think really carefully about what are the areas where you and your team have a unique license to execute. - Right.

- And what are the areas where you have unique insight and passion? And I think data was very hot the last few years. There was a lot of VC dollars, and I'm certainly the beneficiary of that. You could call me a hypocrite for saying this, in fact.

But I think just being like, "Oh, I wanna start a startup in data." You gotta realize that there's like 80 other people with the same idea that have come before and are gonna come after. And you have to really be honest with yourself about where you have unique insight or execution ability. That would be my advice. - Very good advice, and thank you so much for joining our show, Barry. - Thanks for having me.

- I enjoyed this conversation so much. Thank you for coming. - Thank you. This was great. And hope to join maybe sometime soon or down the road.

(upbeat instrumental music) - [Sandhya] You've been listening to the Startup Field Guide with Sandhya, an Unusual Ventures Podcast. Stay connected with us by subscribing to the show in your favorite podcast player. If you liked what you heard, please rate our show, and help us reach more aspiring founders with lessons on how to find product-market fit.

Thanks for listening. Until next time. (shimmering music fading)

2023-05-15 04:31
