
Talent wins games, but teamwork wins championships.


Welcome to A-Players, the podcast where we tell you how to target, hire, retain and train top performers for your team.


If you're looking for applied ML, look for people who are excited by applied ML and want to do applied ML. If you're looking to push the state of the art with research, and that's your team, then go for people who are interested in research, interested in pushing the state of the art, and who have demonstrated that already. Don't be pulled by shiny things. Be very thoughtful about what you need and then go after that, because I've seen it fail if you don't pay attention to that specifically.


I am Robin Choy, CEO at HireSweet. We are building sourcing automation software that helps tech companies hire the best talent. Add me on LinkedIn if you want to keep an eye on this.


Hey, everyone. Today we're having Mandar, who will be telling us about hiring a team of A-players, and especially about hiring A-teams, with a machine learning and artificial intelligence focus. Welcome on A-Players, Mandar. Can you tell us more about yourself, your background, and about your thesis on hiring A-teams and teams of A-players?


Hi Robin. Hi everybody. Very excited to be discussing how to build A-teams for machine learning. In terms of my background, I've spent a couple of decades in different product teams across my career, starting at Sun Microsystems, where we built the Java platform, the language that hopefully most of you still love and adore.


I was at Adobe for three years, building a brand new product for designers. And I've spent my last almost-decade now at Workday, where the first three years I spent migrating our platforms, and the last six years building out our ML effort, where we took our small team from zero to shipping multiple ML services with different customers, where our customers are getting value from the services we have built, shipped and scaled.


And the specific part I want to call out is, this is probably a coincidence, but all these years in my last decade, I've been hiring every single year, some years 50 percent, some years 100 percent team growth, while we were also trying to build and ship products. So hiring and building a team, I've gotten a lot of practice around it, and I've learned a lot from not doing it right in the beginning and then honing specific practices, to where it's now both art and science.


So I'm very happy to talk about that subject.


And what I liked when we prepared this episode is that you really seem to consider hiring as a core component of your job, even in a large company like Workday, where you could expect the recruiting teams to do the work for you. Actually, for you it's the opposite: you have to really work with the recruiting teams, and you have to own the recruiting yourself as the hiring manager, right?


Definitely. I've never had success just depending on the recruiting team to build out my teams, and there's a very simple reason behind it.


I believe the people with the skill set that we're trying to hire are in our network, not necessarily in the recruiting team's network. So unless you're very intentional about accessing your network, building relationships, and continuing to grow your existing relationships, you won't be able to attract, hire and bring in the right people, because they sit in your network, and you can't expect your recruiting teams to source these candidates, and specifically passive candidates.


Yeah, and here we're talking about machine learning and data science, but that's also the case in almost all engineering teams, right?


Yep, that is definitely the case in any engineering team that you're trying to hire for.


OK, so the first message is: if you're a hiring manager, then you have to be involved in recruiting; you have to think about it and be strategic about it. Another thing we discussed is that, because of the mostly recent nature of machine learning jobs, there are a lot of new roles and new skills to learn. So it's even harder for the recruiting teams, for the talent teams, to understand what kind of profiles you exactly need.


And I know that's even more the case in this new field of machine learning. So how do you tackle this, and what's your role as a hiring manager in helping the talent teams and defining exactly the role of the person you're looking for?


So when we began our journey of building out these small teams and building the products, in the industry, outside of some of the big players who had been doing machine learning for years, most of the rest of the industry was still trying to figure out what kind of roles we need to be able to build machine learning teams. And when we started our journey, over time we realized there are at least, I would say, three different flavors that we kind of differentiated when it came to data science or machine learning.


The first is the basic data scientist, which for us stayed more within analytics: people that knew data science and could do analytics, serving some of our internal machine learning needs. For example, we built an internal product where our marketing team that sells the products wanted to get a better understanding of which leads they should pursue for selling our products. So we built an internal product that gave predictions around which leads would be the right ones for marketing to pursue.


Now, if you're doing internal data science projects like these, the skill set required is mainly just data science, and there isn't as much of an engineering element within that role. So we have a profile that is just data science skills, related to our internal projects and internal ML pieces. Then the other profile that we started looking for and realized we needed, which is maybe for the machine learning products, was people that have a data science background and also an engineering background, to be able to build and deploy maintainable, scalable machine learning services and products.


So the other job profile that we ended up developing a career framework around, we called it machine learning engineer. I know different companies are using different terms, but this profile essentially requires you to have data science skills and also engineering skills. And this is a very hard profile to fill a role for. The way we ended up building out talent for this particular profile, which we required for our ML products, is we either hired people who had a really deep data science background and had done some coding, and we trained them by bringing them onto the team.


And we had solid engineers on the team who helped mentor and grow those people to be engineers. We also went at it from the other side: some of our existing software engineers were interested in machine learning and in developing the data science and math skills to move to the ML track. So again, we helped these people gain the skills, and also the experience with their data science counterparts, so they could become MLEs. So we kind of found people in both pools and helped them develop the other side of the skill set, to develop more and more machine learning engineers within the team.


And it is very hard to find MLEs in the market who have solid experience building and shipping products. So for us, we had to train most of our MLEs in house. Now it's becoming easier, because more and more companies have experience building machine learning products, but in the initial years you couldn't find experienced MLEs in the market; there were only a few of them moving around. So we ended up building out our expertise by training both data scientists and engineers to be MLEs.


OK, so you mentioned there were three different flavors. The first one is basic data science, the second one is machine learning engineers. What is the third one?


The third one is probably not even machine learning. I know a lot of people claim to be data scientists or machine learning people, but it's mainly just basic analytics: understanding data, being able to visualize data, being able to pull out insights and do analytics on it. There are a lot of people that fit in that category. And I mention it because, if you're hiring, you need to be aware of that: are these real data scientists?


These are just people that are good at looking at data, exploring data, pulling out insights, building analytics and visualizations out of it. That is a role which I wouldn't consider something we can leverage for building products. But we do see a lot of people with this background who claim to be doing data science or ML, and it's not the kind that, at least for us, is useful.


I would say, though, you could look at these people, and if they're eager and willing, help them grow the skills; these people need data science, machine learning and engineering skills, if they're willing.


But it'll just be a longer road for those people.


It's more what you would call data analysts. So whenever I look at a profile, I just ignore the titles that people use, and I focus on: OK, tell me what you did, describe to me the type of projects you did. That generally helps you understand: OK, do they have the experience I am looking for, for what my team is trying to do? So I ignore the titles. And like I was saying, we call them machine learning engineers, MLEs, but I've heard different terms used by different teams.


That's another reason I just don't look at titles. I look at the type of work they have done, and then dig into the details: OK, what did it involve? Did you end up deploying it? What technology stack did you use? Was it in the public cloud or a private cloud? Did you end up building monitoring for it? Going deep into their work is the only way I've found to map whether what they have done matches what my team is trying to do.


And what I like from this approach, and that's also the focus for this episode, is you're basically saying we can't only hire A-players as individuals; we also need to assemble a team of people who will progress together, and in the end they will together be an A-team. Each one of them may not be the best person, but they still train on one specific aspect, because of the ever-changing nature of these roles, right?




Yes, definitely. And just to reiterate that point, when I'm hiring, I'm not looking to hire one person. I'm looking at my entire team. I'm looking at what business problem we are tasked to solve, and then figuring out, based on who I have today, what's the next best skill set I need to make my team whole and make my team an A-team. And then there's whenever someone leaves; that's the other thing that's very common, and I'm sure my fellow leaders will relate to this.


The demand and supply for this skill set, the ratio is really off, and what that means is churn on your team. I have accepted that as just a fact, which means, and this is one of the things we discussed, I'm always hiring, whether I have headcount or not. I'm always hiring, because the kind of churn you see specifically in this domain is higher, because there is a lot more demand for this skill set than there is supply of these employees.


So I'm always hiring. And whenever someone leaves, I take a look at my team and try to figure out: OK, now that I have a slot at this point in time, who is the next best person we can bring in to make the team whole? And then you go, with your intentional hiring process, to bring in that next best person to make the team whole.


And how do you do that exactly? Is there any framework or recipe that you use again and again?


These seem like very basic things, but I am very diligent about using an intentional process for my hiring whenever I'm trying to fill a slot. One of the biggest things I realized is that if I don't follow this, my own biases take over, my own ways of stereotyping people take over, and I might not hire the best person for the job. I'll cover some of the specifics that I use. When I'm hiring, first and foremost is laying down what capabilities you are looking for.


For the next hire, sitting down and actually going through that process for yourself before you even look at resumes and candidates, that's very critical. And when I sit down and figure out what capabilities: we all want a lot of things in the person we're bringing in, but you'll never find that unicorn. So another thing I started doing, in my own thinking and also in the job descriptions I put out, is separating out my must-haves and my nice-to-haves. People have probably heard of the research and statistics around this.


For example, when you look at gender: men apply for job positions if they qualify for even 60 percent of the listed requirements, while women apply only when they meet 100 percent of the listed requirements.


So if you want applicants from a non-majority pool or minority pool, it's a good idea to keep your must-haves to the bare minimum that you consider for the role, so your applicant pool is wider, and put everything that is truly nice to have into the nice-to-haves. So I started doing that when thinking about what capabilities and skill sets I'm looking for, and my job descriptions also reflect that: I try to keep the must-haves a minimal set.


And the nice-to-haves are where all the other stuff goes. For example, the CS degree or a PhD in stats, for me that goes in the nice-to-haves, because I have a lot of people that don't have that and they're doing amazing on the team; they're self-taught, they haven't gone and done the specific degrees.


One person had even come from an agriculture degree. So I know for a fact that some of the highest performing people I've had don't have a CS degree, or don't have a PhD in stats necessarily. So I look at all the different skills or capabilities that I'm trying to hire for and am very strict about keeping the must-haves to a minimum. You can just think of your best performers in the past, and that'll help you decide whether a particular thing should be a must-have or a nice-to-have. That has definitely helped me.


So that's one big one I started doing. The other one is training your people in how to interview well, and training your people to do experience-based interviewing, the stuff we were talking about earlier, which is asking people to demonstrate the specific thing you're looking for. Don't ask them trick questions; you're not trying to make them fail. You're trying to have a conversation with them, to see whether they've demonstrated what you're looking for in the past.


For example, for my senior roles, we look for people that have shipped services end to end. So they were responsible for figuring out the right metrics, for doing the modeling, training and validation, for deploying the service, for doing the necessary monitoring, and for the experimentation associated with trying out different models. So if I'm looking for someone who has experience doing this, my question to them is going to be: have you had an experience in the past where you built, deployed and monitored a machine learning service end to end? And then let them speak.


So that's the pattern to use for anything you're trying to assess: for the capability or skill you're looking for, ask them for an example where they have demonstrated that particular thing. That generally helps in getting really specific details about whether someone demonstrated it or not. And ask follow-up questions to make sure they're not spitting out things from something they might have read, but have actually done the work. One of the questions that one of my colleagues came up with, which I personally like, is: tell me about the most interesting signals you found for this particular model to perform well, that were counterintuitive.


So ask questions that tell you whether somebody's skill set and knowledge is deep, or whether they're just surfing on the surface of it. So, building up that list of questions and helping your team interview well. We also repeat the same set of questions with all the candidates. Again, to root out bias, that's another specific thing: if somebody is testing them on ML, somebody is testing them on engineering, somebody is testing them on whether they fit the attributes we look for in people, then there is a set of questions, regardless of who's interviewing for that particular area, that we use for all candidates, so that we have a really apples-to-apples comparison


between the candidates that we interviewed for that particular position. So these are some of the ones that I can think of; as we talk, more might come to mind.


Do you also have a coding interview, whiteboard exercises, or is it only asking questions about what people have done in the past?


So when we used to bring candidates in house, yes. But I am not a fan of a whiteboard for coding interviews, because that's not how you work. Whiteboarding is a perfectly good way to assess system design chops, or ML system design, just because that's how you would do it with your team, where you ask: what are the main blocks for my ML system, and how do I think about it? So for that one, a whiteboard is good. But for how good they are at programming?


My preferred method is programming. So if you have the time, set up an example code base with a common problem that anybody can relate to, and ask them to build out one specific sub-piece within that module. Give them enough help, with somebody checking in every hour or so to give them assistance, because that's what you do in a team, right? If somebody is stuck, they can ask questions so they can move forward, as this person is already intimidated by interviewing.


So give them something where they can work the way they normally would, and then be a good teammate, checking in on them during the interview. The most important piece for me is having things like: ask them to write at least one or two unit tests, ask them to debug a failing test within that module. I know it takes time to actually do this, because you have to consider having a laptop for them, setting it up with these pieces, and also giving them options for languages.


So you'll have to consider all these aspects. But it's an investment worth making, so that what you're assessing is as close as possible to what a real day would look like, and how they do when programming in a real setting. So my preferred method is that. It's a lot of work, but I do think it's a much better reflection of what they might do day to day.
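A minimal sketch of what such an interview module could look like, assuming a plain-Python setup. The problem, the function name, and the tests are all hypothetical illustrations, not from the episode: a small, relatable piece of code the candidate can extend, plus unit tests they can be asked to write, run, or debug.

```python
# Hypothetical interview exercise module (illustrative only).
# The candidate is asked to implement or extend a small sub-piece
# like this, then write a unit test or two against it.

def rolling_mean(values, window):
    """Mean of each consecutive `window`-sized slice of `values`."""
    if window <= 0:
        raise ValueError("window must be positive")
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window + 1)
    ]

def test_rolling_mean():
    # The kind of unit test a candidate might be asked to write.
    assert rolling_mean([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
    # Edge case: window larger than the input yields no slices.
    assert rolling_mean([1, 2], 3) == []

test_rolling_mean()
```

For the "debug a failing test" part described above, the interviewer could hand over a copy with a deliberately seeded bug (say, an off-by-one in the slice bounds) and ask the candidate to find it by running the tests.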


And do you do this yourself? Do you work with the person taking the interview and help them through a pair-coding exercise, or do you have someone on the team do it?


So hiring is a team event for me. Generally, let's say our senior MLEs would work with other engineers who are interested in being part of setting up hiring processes. They work as a SWAT team to figure out what problem to choose, and how we deal with someone not being familiar with this language or that language. So the team generally comes up with this, and then we discuss it: OK, is this going to work, or what problems might come into play?


And then we tweak it as we go, based on the reactions that candidates have.


OK, OK. So you talked about defining the role and writing the job description, how you try to keep the required skills to the bare minimum, and how you run the interview. Then the person reaches the end of the process and you extend an offer. Do you have any advice on conversion, on improving conversion, making sure the person will sign your offer and not another company's? Knowing that, again, you're based in the Bay Area, one of the most competitive places on Earth for tech talent.


So conversion is a huge part as well. Do you have any advice on this?


For conversion, for me, speed is critical. That's one factor. So moving through the process fast, giving them the best experience they can have, and being very respectful, of course; I've seen that work well. Not just for senior roles, but for senior roles especially, I definitely make sure I do this before we even go into the interview process: helping them understand what this team is trying to do, what the vision is, what role they will play in realizing that vision, and getting them excited about what we are doing, or not.


Right. If you have a conversation with a senior person who is considering your team, you're talking about the vision, about what this team is tasked to do. And if they're not excited about it, that's completely fine, and they might not want to move forward, which is a good thing for both parties, so that you're not wasting anybody's time. But set the stage and get them excited, and hopefully they do get excited by what the team is doing.


And then we move forward with the process, move through it as fast as we can and respectfully through the entire process. And then, if it goes to the offer stage, go in with your best foot forward and see if it materializes.


OK. So I expect you're currently hiring a lot and applying those processes today. And you've probably given a lot of advice to other machine learning managers and executives trying to build their teams as well. What are the top three pieces of advice you give those people?


We already discussed some of them. One of them is: hiring is a team event. Definitely engage and involve your team in forming the process, figuring out what exactly are the capabilities you're looking for, and also leveraging your network and your team's network to get as diverse a pool as you can into your pipeline. The second one, I would say, is: always be hiring, whether you have a headcount open or not.


The best way to build a diverse team, from what I've seen, is already having a warm pipeline of candidates that you know from your network and from your team members' networks, before you actually open up that req.


So always be hiring; having some people that you're keeping warm is key. And the third one, which we discussed multiple things around, is: be very intentional with everything you do when it comes to hiring. And as a leader, if you're not from an underrepresented minority pool, then have somebody on your team, or somebody in your network who is from a minority, walk through your process and give their feedback on the blind spots that you have, so that you can tweak it.


And even if you don't know what might be problematic, just leverage people you know who are from diverse backgrounds, so that you can have a really good hiring process.


Yeah, understood. Well, thanks a lot. It's already been 30 minutes. A great discussion, a lot of advice, and, what I liked, a lot of very practical advice for people building engineering teams especially. Thanks a lot, it was great. Do you have any final words, any final advice?


The one thing to say which is specific to ML is: make sure you're hiring for what you need, and don't get pulled away by shiny things. What I mean is, there are some teams that are trying to do applied ML, and I've seen them not be intentional with what they need, going after people with big names, who are potentially talent magnets, who have done a lot, who have pushed the state of the art. But you're trying to hire them for your team.


But is that what you really need? I've seen some of those hires fail, because the person you're bringing in wants to do something else. They want to push the state of the art, but that's not the task for that particular team, and the mismatch can get everybody in trouble. So with ML, that's something to specifically watch for: if you're looking for applied ML, look for people who are excited by applied ML and who want to do applied ML.


If you're looking to push the state of the art with research, and that's your team, then yes, go for people who are interested in research, who are interested in pushing the state of the art, and who have demonstrated it already. So don't be pulled by shiny things, be very thoughtful about what you need, and then go after that, because I've seen it fail if you don't pay attention to that specifically.


And it also comes back to hiring A-players, but also hiring A-teams; the two don't necessarily need the same things. It was a great discussion.


Thanks a lot for coming on, Mandar.


Yeah, it was fun. Thank you, Robin. Thanks for listening! That's the end. If you're still with us, it's probably that you enjoyed A-Players. A-Players is brought to you by myself and HireSweet. We are building sourcing automation software, and we already help tech companies hire the best talent. To know more about us, go to www.hiresweet.com, or you can add me on LinkedIn.


I'm pretty responsive and always happy to chat. The more subscribers, the better guests we'll host. You want to help? You can do a lot in less than ten seconds: please subscribe to the podcast, leave us a nice rating or review, and share the podcast around you. That really, really helps. Thanks a lot and talk to you soon.