Hey, everyone. Thank you for joining us for today's webinar. I'm Josh Jones, and today's topic is making sense of talent metrics for smarter hiring. I'm thrilled that you're joining us for today's session. Over the next hour, we're gonna explore how leading experts are cutting through the noise of recruiting data to focus on the metrics that actually drive smarter hiring. If you're feeling overwhelmed by the metrics maze, from the AI-generated resumes to the burned-out recruiting teams and the constant push to justify your tech stack, you're in the right place. Today's conversation will help you move past surface-level analytics to uncover insights that truly matter for performance, engagement, and retention. We are joined by an awesome lineup of speakers. We've got the one, the only, Jim Durbin. We've got Sarah Kumar and Damian Glancy. Together, they're gonna help you rethink what recruiting metrics should look like in twenty twenty five and beyond. Before we get started, just a few housekeeping notes. There is a chat feature on the right-hand side of your screen. Please use that to engage with each other and with the panelists. If you have questions, though, please use the Q&A tab so that we can find them quickly. Today's webinar is being recorded, and you will receive a recording of today's broadcast by the end of the day. This session has been made possible by our good friends over at PageUp. PageUp believes the most powerful talent acquisition technology is built on one simple principle: human connection. As the chosen talent acquisition partner for the world's most trusted brands, PageUp delivers a world-class customer experience by building deep, lasting partnerships. That commitment is reflected in PageUp's intelligent talent acquisition platform, an intuitive, AI-powered system that's easy to use, adaptable to your unique hiring needs, and always innovating.
They strip away complexity so talent teams can focus on what matters: creating the strong human connections that forge a resilient workforce. Jim, Sarah, Damian, thank you so much for joining us today. Jim, I'm gonna hand things over to you. I'll be watching from the audience side. If you need anything at all, just holler. Sure. Well, welcome, everybody. Excited to be here talking about a pretty important topic: metrics. I know it doesn't sound like the most exciting thing in the world, but partly that's because we tend not to have real discussions about it. So let's actually get into that discussion. We know metrics are vital for us. You know, what do they always say? You can't manage what you don't measure, even though that's the wrong quote. The truth is that we have responsibilities to talk to our executives, we have information that we need to communicate to our employees, and, of course, there's a candidate experience side. So we're gonna cover three main categories here, starting off with the idea of vanity metrics. The question is, in the split between good metrics and bad, we've always chased things like time to fill or quality of hire. Those sound good, but are they causing us some hidden problems? Are the metrics that we're using to judge how we're doing causing us problems, not just in how we do our work, but in how we're perceived by executives? So we're gonna start with Sarah. Sarah Kumar is a senior recruiting manager for the Consumer Group at T-Mobile. Welcome, Sarah. She is responsible, if I'm right, you hire sixty thousand people a year. Twenty. Twenty. Oh, well, there's our first typo of twenty twenty six. Let's start with it. So, in your experience, let me ask you an easy one: what do you think the worst metrics are? What are the vanity metrics that mislead instead of guide you? Twenty thousand. Is she freezing a bit?
I may need to log in from a different station because my connection isn't stable. If you wait just a second, I can provide a phone number for you. It might be easier to... Guys, no problem. While she does that, I can pivot over to Damian. So, Sarah, when she comes back, when she gets on the phone, is gonna talk to us about the vanity metrics that are in place. Damian, for you: Damian Glancy is the senior vice president of engineering at PageUp, today's sponsor. He's coming to us live from Ireland. Isn't that right, Damian? That's correct. Yes. Very wet outside. So I know we're jumping right into it. We're moving over, but I've got a question for the back end, right? We're talking about what vanity metrics are, and she'll come back. Sure. Can we trust the data we're pulling from our reports? And what kind of mistakes do we tend to make in the way we create these reports? Is there anything you look at that makes you wonder why anyone would ask for it? Like, from the dashboard side, what are we doing wrong, and what do you normally see that we could be doing better? Yeah. So straight into quite a technical question, Jim, quite a technical topic. And I guess I'd answer "can we trust the data" by saying it depends on the plumbing. If you've got a very simple single system, the data is obviously going to be a lot more trustworthy than if you've got a complex web of HRIS applications, which I think most of us probably do. Most reporting issues stem not from bad intent, but from gaps in the data and how it's being interpreted. For example, different systems, like ATSs and CRMs and career sites, might have something as simple as different timestamps. If those systems aren't even synchronized on time, that alone can be a challenge for what you can trust. Right.
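To make the timestamp point concrete: here is a minimal sketch, with invented system names and times, of how the same "application submitted" event can land on different report days when each system keeps its own local clock, and how normalizing to UTC first resolves it.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical example: one event as three systems might record it.
# An ATS on UTC, a CRM on US Eastern time, a career site on Irish time.
events = {
    "ats":         datetime(2025, 11, 4, 2, 15, tzinfo=timezone.utc),
    "crm":         datetime(2025, 11, 3, 21, 15, tzinfo=ZoneInfo("America/New_York")),
    "career_site": datetime(2025, 11, 4, 2, 15, tzinfo=ZoneInfo("Europe/Dublin")),
}

# Naive date-based reporting buckets these into two different days...
local_days = {name: ts.date().isoformat() for name, ts in events.items()}

# ...while normalizing to UTC first shows they are the same moment.
utc_days = {name: ts.astimezone(timezone.utc).date().isoformat()
            for name, ts in events.items()}

print(local_days)  # the CRM's local date disagrees with the others
print(utc_days)    # all three agree once normalized
```

This is exactly the "my day is off by six hours" trap: trend lines built on local dates drift apart even though the underlying events are identical.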
I've seen that, where I get things coming in and it's showing, oh, what happened last night? But they're set on a server in a different country that's six hours ahead, so my day is off by six hours. Yeah. It makes a big difference because, obviously, between twelve and six, not a lot's going on in the US. Yeah. And it can skew your trend data as well. I mean, that's just a simple one, but to get a little more detailed: when you're thinking about connecting different systems together, there's often a focus on just getting technical connectivity. Is the data flowing? Can I log in? Can I connect at an API level? But you need to think about things from a business perspective, a business definition perspective, as well. Something even as simple as "application submitted." What does that mean? It could mean a very different thing on your career site than in your ATS or your CRM. So you need some common definitions as you integrate these systems, or at least an understanding of those differences, because, again, you might be measuring the wrong thing. Without some form of normalization, I describe it as comparing apples to slightly different apples; they can all sound close but have meaningful differences. In general, though, recruiting data tends to be quite transactional in nature, and I think that does help with trust. I think the definitions point is very important, because I remember we were running source of hire. My boss ran it, and they built a big Power BI dashboard and showed source of hire. I said, I don't understand, why is it a dollar per hire from Indeed and twenty thousand dollars per hire from LinkedIn? And the answer is, well, for your LinkedIn candidates, no one's actually submitting an application.
They were actually being tracked at the offer stage: offer started versus offer given versus offer accepted. We had twenty different codes, from offer, offer accepted, start date, completed. There are so many in there that unless you added up all twenty of those and drilled down, you weren't getting the information you thought you were getting. So, yeah, the definitions matter that much, and that's a matter of connection with your TA team. Speaking of the TA team, Sarah, do we have you back? Can you guys hear me? Oh, it sounds fantastic. It sounds fantastic. Sorry about that. Well, that's the key. That's actually a pretty important part of TA. Everybody expects things to be perfect, but the truth is sometimes things go wrong, and how you fix it matters more than how things went wrong. So welcome back. We'll pivot back to the question. Damian, you still gave a great answer; he was talking to us about the back end, the problems with definitions, whether your servers are aligned, things like that. So let's go back to the original question for you, which is vanity metrics. What are the bad metrics that are possibly misleading us or causing more problems than they're fixing? You know, I love this question, because I don't know that there are metrics we're utilizing today in corporate that are bad, but there certainly are those that are misleading. When I think specifically about the work that I support, which is primarily high volume, I think about interview pass rates. Our recruiters are so heavily tied to: my interview pass rates are too low, and my hiring managers, my interviewers, are being too picky. So you mean passing on to the manager, or once the manager's done it? Hire rates. Yeah. Once the... because in high volume, typically, there's a team of interviewers making the decisions. And recruiters get really stuck on, well, they're not passing enough people.
They're making me screen too many people. They're being too picky. And my response to the recruiters is always: on the other side of that story is a leader who's saying recruiting isn't giving me good enough talent. So I think that's one metric we get hung up on, interview pass rate. It can be too high, it can be too low, or it can be an indicator of a miscommunication between talent acquisition and leaders. And really, don't get so stuck on the number that you're using it to define whether or not you're successful. Right. Because when you tell the boss, hey, this is how we did, and the next quarter you don't have the same number, they're like, why are you changing your metrics? What about volume to hire? Is that an issue? The sheer number of applications has exploded because of AI applications and the economy. Is that something we have to be careful about too? Because we've measured ourselves on it for years. Every bullet point on my resume talks about how many more resumes I got for less budget. Talk to me about that a little bit, maybe historically, and then what's different in the last year. Yeah. I love that because, really, as hiring teams, we've seen a massive change in candidate activity in the last five years. We went through the pandemic, where nobody was applying... well, we had a lot of applicants for a little while, and then no applicants at all. And, really, I don't know about other companies, but at T-Mobile, we spent a lot of time sourcing in twenty twenty four. It was a hard year for us to get the right candidates. You were doing a lot of sourcing, I think. All roles. High volume too. Yes. And we continue to source; there are still locations that we need to source for. But in twenty twenty five, we finally got it kinda figured out. We got a rhythm. And to your point, suddenly all of these corporate roles are getting a lot of applicants.
And some of our roles will get, gosh, almost two thousand applicants in twenty-four hours. Those are the corporate positions; typically, they get the higher volumes of candidates per role. So you're right. We had to really pivot our focus in twenty twenty six and move away from candidate pipeline and talent attraction toward attracting the right talent. So when you're looking at your metrics as a recruiter, or even when you're looking at purchasing media, look at the sources that are richer. You mentioned LinkedIn earlier, and why is my cost per hire at LinkedIn so much higher? Well, the way I look at it is, you take all of the applicants that came from that source and check the percentage of those applicants that were hired, and that tells you the richness of that applicant pool. And, really, where recruiters need to be focusing in twenty twenty six is on the richness of their applicant pools. Is that the talent density that everybody's talking about, or is that just one of those fake buzzwords? Yes, the buzzword would be talent density. Yes. To insert something between where Sarah's going and where Damian's going: I remember sitting down with somebody who told me that one out of every four candidates he got from LinkedIn was hired. And I started laughing. I said, no, it's not. He goes, why are you laughing? I said, because it's not physically possible. In what multiverse is it even possible for that to actually happen? He goes, let me show you. So he pulled up his ATS, and he showed me: twenty-five percent of everybody coming from LinkedIn was, in fact, hired. I said, the problem is that that's what your data says, but I know what's happening. You have a recruiter who's going into LinkedIn and talking to people, and when they become a candidate, that's when the recruiter puts them into the ATS. So it's not one in four candidates from LinkedIn; it's the way you're inputting data.
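Sarah's "richness" calculation is simple arithmetic: hires from a source divided by applicants from that source. A quick sketch with made-up figures (the source names and counts here are invented for illustration) shows why a cheap, high-volume source can still be the thinnest pool.

```python
# Richness of a source = hires from that source / applicants from it.
sources = {
    # source: (applicants, hires) -- illustrative numbers only
    "job_board_a": (4000, 8),
    "job_board_b": (600, 6),
    "referrals":   (150, 9),
}

richness = {name: hires / applicants
            for name, (applicants, hires) in sources.items()}

for name, rate in sorted(richness.items(), key=lambda kv: kv[1]):
    print(f"{name}: {rate:.2%} of applicants hired")
```

Here the source with 4,000 applicants and 8 hires (0.20%) is far thinner than the one with 600 applicants and 6 hires (1.00%), even though its cost per applicant probably looks better on a media dashboard.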
One in four candidates that you get from LinkedIn is hired, which is significantly different, because he went out and bought one of those three-year, long-term LinkedIn Recruiter licenses, spent all that money, and then they integrated. That number is no longer twenty-five percent; it's point o o one. So it's dangerous for us not to understand that and not to communicate back and forth. If you're asking someone on the engineering side to build your reports or pull your data, they're just gonna do exactly what you said, so some of those clarifying questions become difficult. But it's important for us to know. It's on us to understand that a report's only as good as the data, and how we input it is just as important as how it works on the back end. And then, of course, Damian, pivoting back to what you mentioned, the challenge of making sure the systems are in place. Damian, do you have any tips on how to ask for reports or how to make sure we're tracking that data? And from the engineering side, are we just asking for ridiculous things? Of course. And that's okay, though. It's a ridiculously challenging problem. I mean, I came originally from banking, and banking is complex as well, and regulated, overregulated, but it's very transactional in nature. You can't get away from the fact, even with the source-of-hire conversation you were just mentioning there, Jim, that we are trying to track human behavior here, and it's complicated. I think it can be easy to just try to melt it down into a dashboard, but in reality there are gaps in what is even in your system to be measured, never mind being able to surface it accurately in a dashboard. I think what makes us attracted to this industry is the human dimension, the H in HR.
And so, from a software engineering perspective, that's also the interesting part: you're trying to create insights into human behavior, which I think is interesting. So, things we see when trying to create those insights, and I'll use engineering terms here, so I'm happy to expand on them: over-aggregation is something that we see. So, for example... Over-aggregation? What's that? Like teams collapsing distinct statuses, say in review or rejected or withdrawn, into, say, not hired. A req closes, and all of a sudden, three hundred not hired. Yeah. And all of that. There were only three hundred five people, so those three hundred outcomes weren't really distinguished. Right. Got it. Yeah. And the fidelity is gone. Did we explain that right, or is it only clear in my head? Is that clear? Over-aggregation. Let's slow down and let Damian finish, because I'm thinking that for those listeners on this webinar who are maybe unfamiliar with over-aggregation, it would really help to just let him run through the entirety of it, and then we can sum it up in recruiter speak when he's finished. Sure. Sure. I mean, I think this is one of the interesting aspects too, hearing this problem approached from both the practitioner and the engineering perspective. We're both trying to work together even though we come from very different sides of the equation. So, yeah, what can often happen is that there's aggregation happening in lower-level systems like ATSs. The data gets passed up to higher-level systems like dashboards or CRMs, and then the fidelity is lost. And if you keep feeding it forward... essentially, the more systems the data passes through (it doesn't have to be this way technically, but in practice), the more fidelity you lose in general.
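The over-aggregation Damian describes can be sketched in a few lines. The status counts below are invented, sized to match the 305-person req from the discussion, and the lossy mapping is the point: once distinct statuses collapse into "not hired," downstream systems can never recover the detail.

```python
from collections import Counter

# Hypothetical req: granular statuses as an upstream system (e.g. an ATS)
# might hold them, for 305 applicants in total.
granular = (["hired"] * 5 + ["in_review"] * 40 +
            ["rejected"] * 220 + ["withdrawn"] * 40)

def collapse(status):
    # Lossy mapping a downstream dashboard might apply:
    # every non-hire outcome lands in the same bucket.
    return "hired" if status == "hired" else "not_hired"

upstream = Counter(granular)
downstream = Counter(collapse(s) for s in granular)

print(upstream)    # four distinct outcomes for 305 people
print(downstream)  # two buckets; "how many withdrew?" is now unanswerable
```

Run the collapsed data through one more system with its own bucketing and the loss compounds, which is Damian's "sampling audio multiple times" analogy in miniature.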
And the more systems you pass it through, the more inaccurate it becomes. It's a little bit like sampling audio multiple times. It's just playing telephone. Yeah. It introduces noise. And it depends: if you're doing transactional recruiting, it's probably okay. If you're using the data to make strategic decisions, it can be very challenging when it goes wrong. I wanted to touch back on something that Jim said earlier about source tracking, and I think it's really relevant for us to talk about on this call, if there are individuals here today who are newer to recruiting or recruiting leadership: source tagging, and partnering with your agency if you have one. Many corporations use agencies for their marketing, their employer brand marketing. Make sure that your agency is properly source tagging anything that you're paying for, and if they're a good agency, even things that you're not, and that your career site doesn't override that. We had a challenge here once where our career site, as candidates were applying, was allowing candidates to override the source tags that came from LinkedIn. Whether it was an advertisement that we had paid for or a recruiter who reached out to you directly, because those would be two different sources, you could go in as a candidate, while filling out your application, and say, I was referred. Well, now we just lost all that data, to Damian's point. So you want a good career site that keeps that reporting accurate all the way through, feeding into an ATS that also knows how to read that data and report it back to you. That's really critical if you wanna leverage these metrics. I mean, I probably spent half of my career on that problem, Sarah. It is the essence of why I have done what I've done: trying to figure out the actual original source of hire.
If you work for a large brand, think of all those forms that ask, where did you hear about us? Imagine somebody applying to Coca-Cola or Pepsi. Where did you hear about Pepsi? It's impossible to actually get that data realistically; everybody has known about it since they were five. And so, yeah, it's a really challenging problem. It even links into attribution, and job boards do well out of this, because very often they might be the conduit through which the application came. They get the application attribution, but perhaps the candidate had been on the career site, perhaps they'd been at a hiring event. Graduate recruiting, for example, is very often long tail, and it can be quite difficult to measure where that source of candidates originally came from. Yeah. And in fact, it depends on your ATS too, and this is why there's no easy answer. The answer is you have to know where the data comes from and what your system is like. I work with a client where the very first source whoever entered it recorded sticks: so if you got someone from Indeed or a referral or the career site five years ago, that name's there, and that person's still in the system. What if we got them in from Indeed or LinkedIn today? It's gonna show the original source, not that. So in some cases, you want the recruiters to override it. But more than that, this is why it's important for you to know what your recruiters are doing. A lot of times, you can't just count on those dashboards. What are they actually doing? Because you can hide a lot of hires. We had one where the dashboard showed twenty hires from a source in a quarter, but then I ran an email report of the actual hires, and it was thirty-four. That's pretty significant for high-end roles. And it's simply that the original source data claimed one thing, but the most recent showed another. But then you have to train your recruiters and manage it, so that's very difficult to get to.
Sarah, Damian mentioned this before, so let's pivot to results. Let's answer the easy question then. Yeah. Seeing as we can't always trust our data, and it's difficult but important to do, how can teams measure recruiter burnout or workload capacity through data? Or do you use your intuition? What's a signal that you've got an overwhelmed team that might be masked by positive dashboard results? Yeah. I think about this a lot. Two of the things that I hear a lot from the recruiting managers that I support are fills and req volume. On one of our high-volume teams, the recruiters have a very tactical style of recruiting, just out of necessity. It's how we support that group, and they'll sometimes hold up to a hundred openings at a time. On the flip side of that, I have professional teams that fill maybe five a month, others that fill thirty a month, and other nonprofessional roles that are high volume. So you hear fill rate on the other side of that coin. And what will happen is the productivity and the burnout really stem from the difficulty of the position, and you can't always know the difficulty of the position based off just one number. There's balance there. We have a pretty complex dashboard here that weights all of our job codes. Different job codes have different values; they are weighted from one to twelve, the easiest being a one, the hardest being a twelve. The weights come back in and are measured against the job code and how many of those positions the recruiter filled that month. And that's been pretty helpful. But when you aren't careful about really digging into your recruiting productivity, your recruiters will burn out. And three things will happen when your recruiters start burning out. The first is the obvious, right? Your culture drops.
The second is mistakes start happening that are not their fault. The mistakes are happening because they're overworked; they're going too fast. And the third is your quality of hire drops, because we're just rushing through. So it's really important to make sure that you're managing burnout and productivity on your team. I love that idea of coding it for ease, but here's the question. Obviously, you can tell the difference between hiring someone in a call center and hiring a senior software developer. What do you do when you're hiring in Atlanta, which is a relatively easy market? You never have a problem with quality there; it's about sorting and speed. Versus something like Savannah, or Kansas City versus Chicago. In terms of your recruiters, do your codes take into account locations? And is there even an objective measure of that? Because how do you know if Kansas City is easy versus Overland Park? You can break it down to the ZIP code: this place is difficult, this place four miles away is easy. Is there any way to break that down, or is it more that the manager kinda knows and weighs that difference? Yeah. That's a great question, and I don't know. Can you hear me okay? Yes. Okay. Your voice sounds great. You can't get that nuanced. My voice is good? I'm gonna go off camera. Yep. We can still hear you. You cannot get that nuanced at a large corporation. Good. Good. Did you hear my answer? Yes. You can't get that nuanced at a large corporation. Alright. I'm gonna dial in. I can jump in here while Sarah... Yeah, because that's a pretty important thing. How do you judge that from the reports? Again, I think a practitioner, of course, will know their team better than any software engineer or report will. But you can build reports, and at PageUp we have done so, to try and understand recruiter burnout based on behavior, based on volume and traction.
The reports track things like screening load per person, how much time recruiters are burning on admin work, their email patterns, or emails going unanswered. And I think the thing is to acknowledge that this is not about laziness. This is about velocity and burnout, and about spotting a team that needs help. Then you either reduce the load, or you hire more people, or you automate the busywork that's stealing ten to fifteen hours of recruiters' time, which, of course, is what those of us in software are increasingly trying to do. And maybe I'll be the first to mention it: AI has some potential to assist there as well. The task of scheduling interviews, for example, compared to five or ten years ago, is getting to a much healthier place with automation. If you think about the difficulty of it: Expedia or Hotels.com fill thousands of hotel rooms automatically every day, and increasingly that's true with interviewing as well, although human behavior and complexity do make it challenging. So, yeah, the point is to try not to make it intuition but a data-driven decision, and not to assign blame, but just to say: here is a team that is that successful. Because that's the thing about burnout. You're essentially measuring success and seeing the impacts of that success in negative outcomes, negative numbers. A team will, for a while, be able to overcome the burnout by overworking, but eventually that runs out of energy and steam. Yeah. Burnout's basically future success. Just to put a cap on that category: what Sarah was saying is a pretty important nuance. The answer is there are a lot of answers, but more importantly, your metrics aren't going to be simple. You have to know them.
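The weighted job-code dashboard Sarah described is one concrete version of a metric that isn't simple. A minimal sketch of the idea, with invented job codes and weights (T-Mobile's actual codes and scale are not public; only the one-to-twelve weighting comes from the discussion):

```python
# Each job code carries a difficulty weight from 1 (easiest) to 12 (hardest).
# A recruiter's monthly score is the weight-adjusted sum of fills, not a
# raw fill count. Codes and weights below are hypothetical.
weights = {"retail_assoc": 1, "store_mgr": 4, "sr_engineer": 12}

def workload_score(fills):
    """fills: mapping of job code -> positions filled this month."""
    return sum(weights[code] * n for code, n in fills.items())

high_volume = workload_score({"retail_assoc": 30})  # 30 easy fills
professional = workload_score({"sr_engineer": 5})   # 5 hard fills

print(high_volume, professional)  # 30 vs 60
```

The raw counts say the high-volume recruiter did six times the work; the weighted scores say the professional recruiter carried the heavier load. That gap is exactly the burnout signal a plain fill-count dashboard masks.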
You have to understand them and just be consistent, and you can look for trends. But this is why it's interesting: as a manager, you have to decide what you're gonna pick and how it works. Don't assume any metric is the end-all be-all. I think that's the challenge sometimes when we talk about data: oh, this is how it is, because that's what the report says. It's more complex than that. But if you get way out into the weeds, you're not gonna learn anything. In fact, the noise is gonna get worse. But since you talked about AI... Sarah, are you back? Can you hear us now? I am. Can you hear me? Absolutely. But now we're moving on to Damian's question. I gotta hold my phone the old-fashioned way because I didn't bring my earbuds. That's okay, because the next one's for Damian. I was talking about AI. So let's pivot away from just talking about metrics to talk about something that's happening to us: how do you decode quality in a high-volume, AI-saturated world? Certainly, we've been using AI to sort, to filter, to make ourselves smarter, to build job descriptions, to communicate with people. But let's think about it from the candidate side, and we have to first understand that when we talk about AI candidates or AI saturation, there are two kinds. There's the AI resume, which is augmenting a resume, writing a resume, following up; someone grabs your job description, throws it in, and now they write the perfect resume. But there's also the AI application, which is just taking the regular resume and, you know, it's not really AI, it's automation: pop, pop, pop, pop, pop. So you've got a problem with the volume of the wrong resumes, and you have the challenge of people rewriting their resumes, not because it's better, not because it helps them, but just to get past the ATS. So now you've got a fake candidate with a bad resume that you can't stop in your filter.
And we know this is a problem because at the innovation summit last week, we were hearing about this: not just that TA teams are overwhelmed on the apply side, but the number of fake candidates that get through to the last interview before anyone realizes this person doesn't really know what they're doing; they've been using AI to answer. So it's not a maybe. It's scary on both fronts, the quality of the resumes and the volume. So a question, Damian: are there data red flags that indicate the brand messaging is off, that you're tracking the wrong applicants, or that these are all fake? Have you been approached about how to deal with this influx and identify the full-on nonsense resumes? Oh, yeah. I mean, the thing is, this is a growing problem, and it's growing at a rapid rate. It's a bigger problem in November twenty twenty five than it was in November last year, though it's been around for a while. I will say, just like you mentioned, we all use AI in our jobs in some form or another. If you've used a spell check on a resume in the last twenty years, you've used AI. Nowadays, what we're really talking about is a new form of AI called generative AI, as opposed to, say, the machine learning that spell checkers and grammar checkers use. And there's nothing wrong with that. I can't spell; I've been using a spell checker for years, and so I've lost the skill of spelling. It's also not as important as it once was. So here's the question with GenAI: are people losing skills that they will not need, or skills that they will need? That is a challenge. But, yeah, we can see it in the data. There's a couple of things.
And this feeds in... I focus on recruitment marketing, and I gave a talk at Unleash recently about, you know, ask for more... sorry, ask for less to convert more. Just ask for less information initially, because one of the weird things that's happening is, if you have a big application form with twenty fields, that has always been very unfriendly to humans. What has changed is that it has now become trivial for bots and trivial for AI to fill out. So I think that's very interesting: an AI will just love twenty-five fields to fill out, and it will do it. One of the things I think you can do to combat it is just ask for less and have a more involved, engaging flow. Ask for three or four pieces of information, send an email after, ask for some more information, and engage people that way. They'll also feel like they're making progress in their application, even though you're not involving recruiters in any of this. But to get back to the core question, Jim: high application starts with low completion rates is one pattern we see, though, since you also asked about brand messaging, that one can also mean your messaging is attracting curiosity and not commitment. In terms of flagging AI specifically: application completion times that are suspiciously fast, say the application submit time drops from eight minutes to two minutes for a bunch of candidates; maybe an awful lot of applications coming from a location where you don't really hire; and looking at first-touch response patterns.
If you can track the application back, then depending on your career site technology, say PageUp, a recruiter will be able to see the behavior of that candidate across their interactions with the career site and get a sense of whether they're really engaged or not. Real candidates take twelve to twenty-four hours to reply to an email. Bots take seconds, with almost no human lag. So yes, we are building detection into our platforms to try and flag this. We have to be careful, though, because a very enthusiastic candidate could get falsely flagged when they're just somebody who really wants the job. I think detection tools can work quite well at volume, answering what percentage of applications we got this week or this month were AI-generated, and do it today with a high level of accuracy. At the individual level, we have to be very careful, because they can get it wrong. Those are great takeaways, especially at volume, where you can see how fast your applications occurred. Then it's a matter of communicating that back up the chain. Sarah, how are your teams running into this, and how are you distinguishing between volume and quality, specifically in the sourcing stage, since you were talking about that? In terms of looking at the right resume, how can your team tell if it's a good one? Are you running into those quality problems, and how do you teach your recruiters to determine whether someone's real or not? You know, it's rare that we run into someone completely faking a resume with AI when they're not qualified for the job.
We can tell who the mass applicants using bots are, because when we're doing a resume audit, the application tells you in the corner how many active applications this person has. So a recruiter opens the application, looks in the corner, and sees this person has applied to a hundred and thirty-five jobs, or four thousand jobs if we have four thousand posted. That is not a legitimate application. And that's a great tip for applicants on the job market: be strategic and thoughtful about what you're applying to. I argue, and we talked about this when we were preparing for today's webinar, that nowadays we don't want people who aren't using AI to build their resumes. We want people who know how to use AI. The times we live in require that professionals can use the tools available to us. So you should be using AI to tailor your resume. You shouldn't be using AI to lie. Quite frankly, before AI, you could hire a resume writer, or go online and steal somebody else's resume. There were other ways to fudge your way to an interview and then tank in the interview. So as far as volume goes and differentiating who's a good candidate and who's not, AI should be helping us with that. It should be helping to tailor the resume to tell the story of that person's work history better. I think that's fantastic, because it's true. Do we really care? They've always done it. I think the problem now is that so many people are able to do it, and that bothers us because it crowds out the people who were thoughtful in their application. But it's not new. It's just volume. It's the scale that bothers us, not the individual. The volume. And those people aren't applying to jobs they're qualified for.
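Taken together, the signals described above (suspiciously fast form completion, near-instant email replies, applications from locations you don't hire in, and a very high count of active applications) lend themselves to simple volume-level heuristics. Here is a minimal sketch in Python; the field names and thresholds are illustrative assumptions, not anything the panelists' platforms actually use, and as the discussion stresses, a single flagged application should never be auto-rejected on its own.

```python
from dataclasses import dataclass

# Illustrative thresholds only (assumptions, not vendor defaults): humans
# typically take minutes to finish a form and hours to answer an email,
# while bots take seconds; mass applicants carry huge active-application counts.
MIN_HUMAN_COMPLETION_SECS = 120
MIN_HUMAN_REPLY_SECS = 60
MAX_REASONABLE_ACTIVE_APPS = 50

@dataclass
class Application:
    completion_secs: float      # time from opening the form to submitting it
    reply_lag_secs: float       # time to reply to the first outreach email
    location: str               # self-reported candidate location
    active_applications: int    # open applications this candidate has in the ATS

def bot_risk_flags(app: Application, hiring_locations: set[str]) -> list[str]:
    """Return heuristic red flags for one application; empty list = no signal.

    These work at volume (e.g. "what share of this week's applications look
    automated?"); at the individual level an eager human can trip them too.
    """
    flags = []
    if app.completion_secs < MIN_HUMAN_COMPLETION_SECS:
        flags.append("suspiciously fast form completion")
    if app.reply_lag_secs < MIN_HUMAN_REPLY_SECS:
        flags.append("near-instant email reply")
    if app.location not in hiring_locations:
        flags.append("location outside hiring footprint")
    if app.active_applications > MAX_REASONABLE_ACTIVE_APPS:
        flags.append("mass applicant: very high active application count")
    return flags
```

Aggregated over a week of applications, the share with at least one flag gives the kind of percentage estimate Damian says is feasible today, while leaving individual decisions to a human.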
I would argue, when we see these applicants come through behind the scenes, we sometimes have fun with them, right? We get an application that has nothing to do with the job, and the recruiters chat about it, because it's nice to talk about things at work. That's typically what we see: somebody applying for, say, a VP role when they're early in their career. They just graduated, they just got their degree, and they're like, yes, I would like to run T-Mobile. That sounds great. I mean, TikTok tells you what we see: you miss a hundred percent of the shots you don't take. If you're willing to hire me as a VP straight out of school, why shouldn't I at least ask? Right? That is the mentality. They actually do that. Yeah. We see a lot of that. It's funny. We've made it so easy to apply, and that's the whole thing, removing friction. The most common search on Indeed is jobs near me. They search jobs near me, put a filter on, I want to make seventy thousand dollars, or a hundred thousand. A million. And there's a quick apply. Click, click, click. Why wouldn't they? It becomes a game. So we have created these conditions for ourselves, and now we're like, wait a minute, why are they acting as independent entities? So really it's a question of how we build back that truth and that trust. Do you have any tools or techniques for maintaining authenticity in the candidate pipeline, if that's the right way to put it? Especially in high volume, do you really need three years of experience for a lot of those roles? A lot of it is more about whether they're conscientious, whether they show up, whether they can take training. It depends on the kind of role. Are these call center roles or retail roles that you primarily fill?
Both. The majority of what we fill at T-Mobile are retail roles, with call center roles as the next bucket. We do about thirteen thousand retail roles a year and about seven thousand call center roles. Now, these aren't all external, and I want to be thoughtful about this: a lot of these are internal, what we call promotions or lateral moves into higher-paying roles that we facilitate on the recruiting side. So you screen and support the hiring decision for those roles. I'm assuming you're asking about authenticity in the external candidate rather than the internal. Yeah. I mean, the key is: are you going to show up for the job? Because ghosting has been a huge problem. We're seeing that with a lot of these frictionless experiences. My brother did that. He applied for a role, he was hired, and they wanted him to show up Monday at three. He has a full-time job; he was looking for part-time. So now they think he ghosted them, and he was just like, I never talked to anybody. He never got to share that, so they hired someone who could never come work. Those are some of the challenges we've created for ourselves. How is T-Mobile handling this for your managers? Ghosting has to be an issue; it's true for everybody in high volume. How are you ensuring these candidates are saying, I do have a job, I'm interested, I'm going to show? Is that the human touch, or is there something from a process point of view that we can be doing? I think it's both. Where we really run into challenges with ghosting is the interview. And again, it comes back to the location, the labor market there, and how competitive our roles are. But you're right: the human touch helps, but it doesn't solve it completely, because your recruiter can call that candidate the day before the interview and say, I'm so excited to see you tomorrow.
The candidate can say, yeah, I'm super excited, can't wait, and still ghost you. Maybe they got called for another interview, or they got a job offer, or they got cold feet. Who knows the reason? So I definitely think the human touch is helpful. What we've also found very successful in improving the show rate, for both interviews and first days, is sending text messages and emails about the different benefits we offer and our culture. We've found success in saying, hey, we're really excited for you to come in and work for us and take advantage of our money coach (every employee at T-Mobile gets a free money coach), and we're excited for you to have access to the free tuition we offer every single employee. We do these little trickles of what matters. Prior to the interview, we also send tips and tricks on how to ace the interview, plus the things we want them to be excited about, to make sure we're bringing in people who are the right culture fit for T-Mobile: talking about our ERGs, our employee resource groups, for example. I have my Indigenous Peoples Network pin on today. So really, we try to make sure they're getting excited about working here. We're also building a sizzle reel right now that shows a day in the life, which we're hopeful will be very useful. That's textbook best practice: staying in touch with people instead of saying, okay, I finally made the hire, they'll show. How do you stay in touch and keep them engaged? That's an issue we don't think about; we always think about our metrics and our terms. But let's pivot a little to talk about candidate experience, because the truth is, with layoffs occurring and all the weird stuff happening in the world, candidates are distrustful. A lot of them are being ghosted on their side.
A recruiter will talk about something, they'll make a promise, and a lot of candidates are still applying for jobs and never hearing back about what happened. So if they're thoughtful and smart and apply to four or five jobs and nothing happens, we've taught them. And they're now being taught by people on LinkedIn and elsewhere: hey, go out and apply to a thousand, it's a numbers game. But that's just a rush to low quality. So I think sometimes we forget that we're in a two-sided market. Since our metrics are focused on our work and our responsibilities, how we commit to executives and how we manage our people, we forget that the candidate gets a say too. And right now, I think they're not too happy with their experience. In all fairness, some of this can be laid at the feet of their own actions. If you apply for a thousand jobs, you didn't get rejected from a thousand jobs; you just clicked apply a bunch of times. But we created the conditions that allowed them to do that. So it's not a matter of blame; it's how we regain their trust, and not just for our individual companies, because it's a market-wide problem. If we're the only one doing well and a hundred other companies are doing poorly, we're still impacted. So how do we find out what really matters to them? Have you been able to link candidate experience feedback to any actual process improvements in the last few years? Anything you've learned where you said, this is true, we changed it, and then you saw a difference? I love this question, and candidate experience is such a huge focus for us. Ultimately, by the end of twenty twenty six, I would love for people to say, we can't touch T-Mobile's game. They're too good at the candidate experience. We can imitate some of it, but we can't do all of it.
First and foremost, when we talk about metrics that matter: you should be surveying your candidates and finding out how you're doing. We have a recruiter effectiveness index here at T-Mobile that measures a variety of data points. It tells us how the recruiters are doing, and within that we also measure other factors that are outside the recruiter's control. One area where we saw significant improvement recently: a year ago we implemented a new AI tool that helps candidates apply for high-volume roles, schedule interviews, and ask general questions about the job. The tool also lets them connect directly with either the recruiter or the hiring manager, depending on who's running the job, because at T-Mobile we do have a few job codes that are run by the manager. When we did that, our technology score for the application process shot up to ninety-six percent satisfaction. That's great. So the candidates really loved the ability to engage, ask questions, and be supported through the application process. Were there any specific data points that came out of that which could tell you when a candidate was truly engaged? Obviously satisfaction just says, at least you're paying attention, at least you care; I think a lot of it comes from, oh my god, someone actually cares about me as a person, and obviously that lifts your satisfaction scores. But how do you tell if they're truly engaged? Is there anything in those results that says, someone was listening: I sent this, someone read it, listened to it, repeated it? Was there any data that would show someone is engaged and explain those satisfaction results? That explains that a candidate is engaged, or an interviewer? The candidate. The job seeker.
If that score shoots up so high, is there any data that shows why it happened? Well, really, what we look back at is the scoring and what changed, scoring before versus scoring after. That's how we track it against our tool usage and our tech stack. The candidates can also leave verbatims. I rely far too heavily on verbatims, and I'm sure many people do. It doesn't matter what your scores are; the first thing you want to do is scroll to the bottom of that survey result and read all the comments. Then you look at the two bad comments, and you're going to solve for those two bad comments. So I would say that's probably part of the engagement. Another piece that's indicative of engagement is the fact that they stopped to take the survey in the first place. We might see twenty thousand survey respondents, depending on the role, against a million applications. So the responses are, percentage-wise, very low compared to our total overall volume. In that case, maybe you'd say the candidates aren't that engaged. I don't know. Maybe we can talk about your side of it. When we look at a lot of the tech she's talking about, it's about usage; that's what tech is built on. The question for you is: how are texting, email, and chatbot interactions being analyzed to optimize for candidate sentiment and engagement? What do you see from the tech side of the business? To support Sarah's point, a lot of the time candidate engagement and sentiment, which are very closely aligned, come from multichannel interaction. Somebody who's engaged is engaged in all the channels.
They're engaged over email, on SMS, on the web, whatever channels they're using. So we track all of this, billions and billions of data points, on how people interact with what we call content, but it's real things: emails and messages. We've all got emails we like to reread, whether work or personal. If you're getting close to an offer from a place you really want to work, you'll open that email again and again, read it, and get a feeling about it. Being able to actually track that and report on it gives you good early insight into engagement levels, or into whether they've actually read the job and seen it. But to your direct question: candidate communication probably started out very human, before the Internet got involved. With Internet technology and the early-day ATSs, interviewing and communication became quite transactional: your interview is confirmed; thank you for attending the interview. It's now moving back to conversational, and social and chatbot-like experiences are bringing that to the front again. So I would say to the listeners: every communication point, every SMS, every reply, every email open, every chatbot message is an opportunity to get your branding across. Job alerts should look like newsletters, not like your database threw up on the screen and vomited out the jobs. But they also all contain sentiment data, tone and engagement-level data, that you should be reading. For SMS, think of the difference between thanks as a response and please remove me. Are you actually analyzing that?
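The "thanks" versus "please remove me" distinction can be made concrete. Here is a deliberately toy sketch with hard-coded keyword lists (my assumption for illustration); the production platforms Damian describes use trained NLP models for this, not phrase matching.

```python
# Toy illustration of tagging candidate SMS replies by tone.
# Keyword lists are illustrative assumptions; real systems use NLP models.
POSITIVE_PHRASES = {"thanks", "thank you", "great", "excited", "looking forward"}
NEGATIVE_PHRASES = {"remove me", "unsubscribe", "not interested"}

def tag_sms_reply(text: str) -> str:
    """Classify a candidate SMS reply as positive, negative, or neutral."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in NEGATIVE_PHRASES):
        return "negative"   # e.g. "please remove me"
    if any(phrase in lowered for phrase in POSITIVE_PHRASES):
        return "positive"   # e.g. "thanks!"
    return "neutral"
```

Even at this crude level, rolling the tags up across thousands of replies gives the strategic view the panel describes, while individual tags stay too noisy to act on alone.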
For email, measure how quickly candidates open. Be sensible about which time zone the candidate is in: are you emailing them at human times, or just splurging it all out at twelve midnight, whenever your server runs, as many systems are guilty of doing? High open rates with low click rates could signal trust issues, or perhaps just unclear calls to action in your emails, and it will take somebody going through that data, maybe doing A/B testing, to figure out which it is. In terms of candidate experience, chatbots are probably the most sophisticated, including the ones Sarah alluded to, because they inherently use NLP, natural language processing, to converse. That inherently includes sentiment analysis on message content, to detect positive, neutral, or negative tones, and again, that can feed in. The negative tone might not necessarily be about the potential employer or the job; it could just be that the chatbot is being stupid or silly or unable to answer the questions. But a good sentiment analysis, taken in the context of everything, should be able to handle that. I'm in the recruitment marketing area, and increasingly CRM and recruitment marketing are becoming one and the same; it's becoming the candidate touch point. This type of insight across your channels should become available, not just at the transactional level, where the recruiter makes a determination about whether a particular candidate is engaged, but also rolled up at a more strategic level. Right.
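The open-rate versus click-rate pattern just described can be expressed as a simple triage rule. The thresholds below are illustrative assumptions, a starting point for the A/B testing Damian suggests, not industry benchmarks.

```python
def engagement_signal(opens: int, clicks: int, sends: int) -> str:
    """Roughly bucket an email campaign by open rate and click-through rate.

    High opens with low clicks can signal trust issues or an unclear call
    to action; thresholds here are illustrative, not benchmarks.
    """
    if sends == 0:
        return "no data"
    open_rate = opens / sends
    ctr = clicks / opens if opens else 0.0
    if open_rate > 0.5 and ctr < 0.1:
        return "high opens, low clicks: check your call to action"
    if open_rate < 0.2:
        return "low opens: check subject lines and send times"
    return "healthy engagement"
```

Running this per campaign variant is one cheap way to decide which A/B arm to keep before investing in deeper sentiment tooling.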
You don't want your recruiters saying, we were going to make an offer to this person, but they haven't been sitting on the chatbot long enough. But do that over fifty thousand candidates, and all of a sudden you might see trends emerge that help you fix your systems. Exactly. You do have to be very careful about making any kind of fuzzy inference. I'm in the European Union, where the EU AI Act is coming down and the regulatory situation is becoming real. It's like the new GDPR, and rightly so; I support the legislation. I think it will give everybody confidence to move with AI knowing it will be safe to do so and equitable for candidates. But just because a system can do something doesn't necessarily mean it should. I blinked three times, and therefore that means something. Or, you've got books behind you, Jim; does that mean you're a better candidate than me? I've got a light switch behind me for some reason. So yes, you've got to be careful about the way you read the signals. That's so great, and it actually ties back to some of the vanity metrics, because things are moving quickly. Software makes everything move faster, and there's more volume, but we've always dealt with these kinds of problems. That question we always ask at the end of an interview, what do you know about us, what questions do you have for me: a lot of candidates will fail it. But go back twenty-five years, and the answers were much different then, because you had to ask somebody. You couldn't just jump on the Internet and read a whole employer brand, watch a video, dig in, or ask AI to walk you through it. So it's not even just us; it's making sure the hiring managers and interviewers don't carry these vestigial tails of interviewing from how it was when I started working.
So really, it becomes about communicating that data back up, not just to recruiters and candidates, but to the executives themselves, which pivots to the final question: how do you build a business case with this? That's one of the things you could really start to see at the Innovation Summit last week. Everybody's starting to realize we don't live in a bubble. It's not about our metrics; it's how we interact. We're part of an ecosystem. We've always known we can't make hiring managers do things, but we can share data with them and try to improve processes; they're part of it. So the candidates, us, and the hiring managers actually make a three-sided market, or more, sometimes. The question now is: knowing which vanity metrics are bad, looking at how we measure, thinking about our software and AI, how do we start talking about all that up the chain? How do we tell a compelling story to leadership that's backed by data but also pushes toward how we're fixing our process? This goes for either of you: what makes a data story persuasive to executive stakeholders who are not recruiting experts? Is it really all just about the numbers, or is there a place for telling a story? Jim, I think this question is fantastic, but before we run out of time, I wanted to answer a couple of questions that came through in the chat. Are you okay with that? Yeah, let's do those first, of course. It might help tie things together. So the first question came from Anne Mason, or Masson. Hi, Anne. She asks: are you surveying all candidates, even those who did not get the job, or just those who have been successful? At T-Mobile, we survey all of the candidates. We use Qualtrics, and it allows us to identify whether the candidate had been moved forward or not at the time they completed the survey.
But I'm firm on this, and it's an unpopular, divisive opinion: I want every freaking candidate to have a great experience. If you get declined, I want you to want to come back, or to say, I understand why. That's my take on it. Like I said, it's divisive. I have a team of forty recruiters, and they don't all love it. The hiring managers don't love it either, especially when we're talking about their interview results and they say, well, this is a bad candidate who can't move forward, so do we even care? Yes. We care a lot. But you're a consumer-facing brand, so that's pretty important, and it's another thing to communicate to the executives. If you do a bad job with that, people will not go to T-Mobile. If it's a horrible experience, it impacts the brand, an impact that was quantified years ago. So it's much bigger than just us; we're part of the whole corporation. It's pretty important to know what's going on, not just with the people you hired, but with everybody. How do they feel about us? How do we feel as a brand? Pretty important. Again, we go back to those million applications, right? Then we have a question from Zebra Butler, or Subra: for trickle texts, are you using a program for that, or are your recruiters sending them from cell phones? Great question. We used to send them from cell phones. We now use our tool, which you can see on our career site, to send those texts. Then, once we make the hire, depending on who the leader is, the leader will send messages out personally after hire. For example, in our call centers you go immediately to a trainer, and the trainer sends those notes out. For our retail hires, the retail store managers continue that connection prior to the start date, to use examples from our high-volume positions. And I think that's it, Jim. Yep, those are the two. So how do you communicate all this to your executives?
How do you tell that story to make sure they don't just cut your budget or fire sixty percent of your staff, as can happen when you're making these claims? Yeah, that's a great question, and it never ends. One: as recruiters, it's really important to put together your business reviews every quarter. Get those in front of your client groups early so they can use them for their own quarterly business reviews. Talk to them about what you did to support them and what worked. It's also extremely important for recruiting leaders to be strategic. You need to be looking around corners and advising your business on what we should be doing to be successful in whatever the hiring outcomes are. Often in my world we hire for projects, so it's important to come in early and say, this is what we recommend for this project, and then track it all the way through. What was our attrition before we started the project? For the people we hired during the project, what was their retention? Did it work? What were our candidate satisfaction rates? What's our fill rate? How staffed are we? Then at the end, you're telling your story and showing your success rates against the strategy you developed. And you have to be ready to pivot. If you're going to present to leaders, they want to know your problem statement, the impact it has, what the solution is going to be, and what that outcome will look like for the business. It really brings me back to recruiter speak, the SBO format: what was your situation, what's the behavior you're recommending, and what's the outcome you're forecasting will come from it? SBO. Got it. Damian, final question for you: how can TA leaders bridge the gap between the operational metrics they have and business-level outcomes?
Is there anything you can offer to assist people like Sarah, anything you're seeing or would suggest? I don't have any huge insight here, Jim. I guess: make sure we can get the data out to TA so they can report on it, and that we're flexible enough to bridge those metrics, to help TA frame them in a business context. Maybe missing a TA goal has a revenue impact, or means we missed a product launch, for example. I'm thinking less as a vendor here and more about our own business. I'm a hiring manager myself, and if I don't have enough resources, that has real business implications. So to bridge the gap, I think it's about helping make those slides and decks for the likes of Sarah to tell the business story, not just that our time to hire has dropped one point four percent. To an exec who doesn't know anything about TA, that's just a number. They don't even know if it's good or bad; it might come across as, well, that sounds like it might be cheaper. But frame it as something a CFO understands: our revenue is at risk. And I think you've got to see a TA system as a business prediction engine in some ways. What's happening in TA today is an indicator, with a lag, of what's going to happen to the business in the future. Josh, any final thoughts in the last minute? Anybody want to toss something else out there? Fantastic discussion. I cannot thank the three of you enough. And thanks for the shout-out to the Recruiting Innovation Summit as well.
We are hard at work planning the next one, so stay tuned for information about the twenty twenty six Recruiting Innovation Summit. We hope to see a lot of you there. A huge thank you to our friends at PageUp for making this important discussion possible. To all of you, pay attention to your inboxes, because we will be sending out the recording shortly. And if you have any last-minute questions, we do have maybe... oh, we just hit time. Sorry. Jim, thank you so much for moderating this discussion. Sarah, Damian, thank you so much for lending your voices to this topic. Until next time, everyone. Have a great day. Pay attention to our upcoming webinars, and we'll see you real soon.
Watch now: Making sense of talent metrics for smarter hiring
Recruiting generates no shortage of data. But burnt-out teams, pipelines flooded with AI-generated resumes, and constant pressure to prove ROI have talent leaders asking the same question: what should we actually be measuring?
The problem isn’t a lack of metrics. Most teams are simply tracking the wrong ones. According to Gartner’s Future of Work Trends 2025, organizations fixated on activity numbers are missing what truly drives performance: employee well-being, team capacity, and candidate quality. More dashboards won’t fix that. Better data storytelling will.
Join us for a live, candid conversation with industry experts who’ve solved this puzzle. Learn which metrics reveal burnout before it happens, how to spot low-quality and AI-generated resumes in your pipeline, and, most importantly, how to build a business case leadership can’t ignore. Walk away with practical tools you can implement immediately, not another system to manage.
What you’ll learn
- The hidden metrics revealing team capacity and burnout signals
- How to spot low-quality and AI-generated resumes in your data
- Candidate engagement signals that actually predict successful hires
- The data story that turns recruiting insights into leadership buy-in
Get practical, immediately-actionable insights you can implement in your recruiting process today.
Meet your speakers:
Damien Glancy, SVP of Engineering, PageUp
Sarah Kummer, Senior Recruiting Manager, T-Mobile
Jim Durbin, Managing Principal, Respondable
Fresh insights for HR
Stay up to date with HR trends, tips and more when you sign up for our industry newsletter
