Resources

People Analytics at Pinterest | Trevor Fry | Data Science Hangout

video
May 14, 2025
54:41

image: thumbnail.jpg

Transcript

This transcript was generated automatically and may contain errors.

Hey there, welcome to the Posit Data Science Hangout. I'm Libby Herron, and this is a recording of our weekly community call that happens every Thursday at 12 p.m. U.S. Eastern Time. If you are not joining us live, you're missing out on the amazing chat that goes on. So find the link in the description where you can add our call to your calendar and come hang out with the most supportive, friendly, and funny data community you'll ever experience.

Can't wait to see you there. I am so excited to be joined by our featured leader today. We have Trevor Fry, Lead Data Analyst on the People Insights and Analytics team at Pinterest. Trevor, I would love it if you could tell us a little bit about yourself, how you got into data science, what you do, what you do for fun. Maybe not all of those at the same time. Who is Trevor Fry, and what does he do?

Yeah, yeah. Well, first of all, thanks for having me here today. And before I jump into anything, I just want to give a quick disclaimer that I am, you know, expressing my own personal views here today and not necessarily representing Pinterest. So just to get that out of the way. Yeah, my name is Trevor. I live just outside of Washington, D.C. with my wife and our dog who just turned one year old yesterday. And we really like doing stuff outdoors, hiking, traveling. I grew up in Spokane, Washington, on the West Coast. So I'm still fairly new to living on the East Coast, and there's a lot to explore and see.

As far as what I do, like you said, Libby, I'm a lead data analyst at Pinterest, more informally referred to as a People Analytics partner. So what that means is I am on the People Insights and Analytics team, but I'm also like embedded in the business in the sense that I work very closely with our HR business partners and other cross-functional partners across the People team to support the employees that sit in our engineering and product organizations. So I do analytics and insights to support the engineers and the data scientists and the people who make Pinterest a thing.

That really involves a wide range of topics: analyzing our org structure, headcount growth, employee representation. I work with hiring and recruiting to look at different strengths, weaknesses, opportunities, or patterns that we might see in the types of candidates we're getting or the people we're hiring. Throughout my career, I've worked with compensation and finance pretty often, just to align between our different functions. I've also spent a lot of time working with learning and development, whether that's being a thought partner on what we want to include in our learning or training content, or doing analytics to really assess the impact that learning might have: is it actually working, is it having the intended effect? Performance metrics: how we analyze who our top performers are, whether there are patterns in how those performance ratings shake out, what drives high performers, what drives low performers. A lot of really open-ended questions.

And certainly employee feedback: surveys, focus groups, any sort of information we can get from employees, and making sense of that. Jeez, I could probably go on. Goal setting, helping define KPIs. But I would say the work really varies both in breadth and depth. People often ask me, oh, what does your day-to-day look like? And to be honest, even before Pinterest, my day-to-day was never the same thing every day. It's always a little bit different; there's always something new, some ad hoc request that comes up. So it could be anything from high-level reporting, just to help leaders get a sense of our baseline or the current state, all the way to the other end of the spectrum, where it's more of a multi-team, cross-functional deep dive where we're triangulating the results of numerous explanatory analyses to try to make sense of something or to provide some solid recommendations on how the business can move forward. So it really does range quite a bit.

That sounds nice though. It sounds like you never get bored. It also kind of sounds like you do the job of five or six people. So hopefully you've got a big team.

We actually have a solid team. I wouldn't say we have a big team. But the challenge, I think, in people analytics is the balancing act. In my space here and in previous roles, there's always sort of a push and pull between prioritized projects and these ad hoc needs or requests that come up. And I think that is one of the more unique challenges of the people analytics field. But I really enjoy working through that ambiguity. And throughout my career, I've had great support from managers and leadership to help prioritize and make that balance, so I've never really felt like I was put in a corner or a tough place by that. But yes, it is always something new, and I don't ever feel like I'm getting bored in this field.

Bridging the data science practice gap

Okay, hey, guys. Hi. Yeah, my question is, your bio on the Posit page mentions bridging the data science practice gap. And so just what are some primary examples of that gap that you've experienced?

Oh, great question to start things off. Yeah, when I was in grad school, I feel like that was always a topic of conversation: how do we bridge that science practice gap? Because in the world of research and science, we have a lot of information, a lot of results and takeaways, but translating that in a meaningful way back to the business or business leaders can definitely be challenging. I will say one of the things that I've found really helpful, and definitely a resource I would recommend: there's a book called Storytelling with Data by Cole Nussbaumer Knaflic. She used to be at Google, I believe, and has since really spun off a whole business around the book that she wrote. But it's a really incredible book in terms of how we communicate the findings that we have, the meaning behind the data that we're working with, to our stakeholders in a way that resonates with them. Not just in terms of how we communicate it, but how we present it visually. The book does an amazing job of giving examples of how to really highlight what you're trying to pinpoint with your charts or your tables or how you're presenting data. I have read that book several times now, and it's something I always go back to. But I think that's sort of one piece of it. I think bridging that science practice gap is always going to be a challenge. I don't think there's necessarily a solution to it. It's just something that we always have to keep in the back of our minds and be prepared for.

Trevor's background and journey into people analytics

And you have a pretty solid background in research. I don't think that we have mentioned specifically your educational background, but do you want to do a quick rundown on that? I know that you've told me it's a long story, there's too much to talk about. Give us a brief view.

Yeah, I'll try to give the elevator pitch here and keep it concise, because my journey into the field of people analytics has definitely not been a linear one. I started off in school studying psychology. I really love psychology. I ended up getting into a grad program to get my master's degree in experimental psych, and at the time I was preparing to go into academia. That just kind of seemed like the natural next step for the route that I was on. Coincidentally, I was working in the tutoring center at my college campus, and that happened to be my first experience doing more applied research. I had a pretty administrative role as administrative assistant for the person who ran the tutoring program. That person happened to be pursuing their PhD in education. And so one day they came to me and said, you know, I've been doing research on tutoring. And basically what he told me is that as he had reviewed the literature, there were all these peer-reviewed articles about how tutoring works, but almost all of it was done at the elementary school level. There was a handful of articles published on high school students, but at that time there was really no peer-reviewed literature out there with evidence to say that tutoring works in post-secondary or college-level education.

And so he was like, with the data you're working with, could we see if our tutoring program actually works? Which was really meaningful because almost every university in the country has a tutoring program. So it's a big investment, and not knowing whether it works seemed like a really cool opportunity. So long story short, we did a bunch of analyses and were able to demonstrate that our tutoring program did have an effect on people's grades and their grade improvements. And that became the first paper that I actually got published. And from there, I was like, I don't think I want to be a professor anymore. I want to use these research skills in a more meaningful way. And I really enjoyed that opportunity to work with real-world data as opposed to just being in a lab (though I loved being in labs). But that's when I learned about industrial organizational psychology. And so from there, I went into grad school for IO psychology. I ended up working at a small consulting firm while I was in school, which gave me exposure to a lot of different contracts. And I really feel it helped me become more agile in how I use my skills and how I could apply them to the client work that we were doing.

I also got really into studying team effectiveness. I played sports growing up, so that was a topic I really gravitated towards. In IO psychology, the study of team and group dynamics and team effectiveness is really a hot topic, so that resonated with me. And that sort of launched me into the next stages of my career, working with the Navy and the Army Research Institute, doing a lot of mixed methods research, a combination of qualitative and quantitative research, which is fascinating. I thought I knew what research was, and then I learned about qualitative and it's like a completely different world. We were studying multi-team systems and how ships and planes communicate and coordinate in a contested environment, and what can help facilitate or enable that.

From there, I ended up taking a job with the Army Research Institute, studying team effectiveness, but ultimately came to learn that I really wanted to do more advanced analytics. And so that's when I transitioned out of government and defense work and took a job at Nordstrom as a workforce scientist, which was actually a data science role. Yeah, really cool title, which I still love. But that's when I really got into the field of people analytics and started to hone my skills even more with people data and that HR focus, really focusing on business outcomes as opposed to more foundational research. Long-winded story there, but that's sort of my journey into the field of people analytics.

Nuances of people analytics vs. traditional data science

So yes, as Libby said, I'm interested in like what are the nuances that you would say distinguish data science at Pinterest as well as the data science that seems to be more, I guess, people-facing in your role versus more traditional data science and what advice would you have for people who are trying to get more experience or hands-on experience in that type of data?

Yeah, good question. So I will say my answer is going to be a little bit biased, because I don't consider myself a traditional data scientist; that's not my education or my background per se. I do think the way I approach problems in people analytics is probably not that much different from how a data scientist might approach problems in their space. I think where the nuance probably comes in is just the nature of people data being a bit more sensitive. We have to be very careful. And this isn't a Pinterest thing; anywhere you're working in people analytics, you have to be very careful with anonymizing data. Within our HR records there's always going to be a lot of personally identifying information, so how we consume that, how we make sure it's protected and not shared in some inappropriate way, that's definitely an important nuance.
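To make the anonymization point concrete, here is a minimal sketch of small-group suppression, a common people-analytics safeguard: any breakdown smaller than a reporting floor gets masked so individuals can't be re-identified. The threshold, field names, and data are illustrative assumptions, not any company's actual policy.

```python
# Minimal sketch of small-group suppression for people data:
# any breakdown with fewer than MIN_GROUP_SIZE employees is
# masked so individuals can't be re-identified.
from collections import Counter

MIN_GROUP_SIZE = 5  # illustrative reporting floor; set per your policy


def suppressed_counts(records, group_key):
    """Count employees per group, masking groups below the floor."""
    counts = Counter(r[group_key] for r in records)
    return {
        group: (n if n >= MIN_GROUP_SIZE else "<suppressed>")
        for group, n in counts.items()
    }


employees = [{"dept": "Eng"}] * 12 + [{"dept": "Design"}] * 3
print(suppressed_counts(employees, "dept"))
# {'Eng': 12, 'Design': '<suppressed>'}
```

The same masking step would typically sit at the very end of a reporting pipeline, after any joins, so that no small cell slips through in an intermediate table.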

I would say there might be other nuance in how we manage data when it comes to some of the finer details, like how we handle missing data or how we work with nested models, because in an organization almost everything is nested. We have people working on teams, working in departments, working in larger organizations. And so how we account for that might be different from how you would do it if you were working with, say, some sort of visual machine learning algorithm. I will also say that in people analytics, one of the most important things is being able to explain the output or the result to our stakeholders, and so how we approach that might be a bit different, just in how we communicate it. Again, I'll go back to Cole's book on storytelling with data as a reference there. I don't think that's limited to just people analytics; I think anybody who's working with data will find value in that book.
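One hedged illustration of why that nesting matters: before fitting a multilevel model, a common first check is the intraclass correlation, ICC(1), which estimates how much of the variance in a score sits between teams rather than within them. This uses the textbook one-way ANOVA formula for balanced groups; the team scores below are invented.

```python
# ICC(1) from a one-way ANOVA decomposition: how much variance in a
# score sits between teams vs. within them (balanced groups assumed).
def icc1(groups):
    """ICC(1) for a list of equal-sized groups of scores."""
    k = len(groups[0])                       # members per team
    n = len(groups)                          # number of teams
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum(
        (x - m) ** 2 for g, m in zip(groups, means) for x in g
    ) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)


# Three teams of engagement scores; the third team clearly differs.
teams = [[4, 5, 4], [4, 4, 5], [2, 1, 2]]
print(round(icc1(teams), 2))  # → 0.87
```

A high ICC like this signals that observations within a team are not independent, so a model that ignores team membership would understate its standard errors.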

I think that was a great answer. It sounds like people analytics has maybe a lot more of the qualitative mixed in by default because human beings are complex, right? And then I wanted to say there was a great thing, Trevor, the last time I spoke to you that you said to me, you said in order to successfully drive the message home, you have to speak the language that the audience is speaking. And you said that in the military, when you were working with the military, you even needed to like get down to the level of explaining what data means. Like what is data? What does Trevor mean when Trevor's saying data? And I think that that's such a great call out. Like you have to get down to the bits and pieces of what you're talking about and make sure that the other person's understanding the same thing, right?

Yeah, yeah, absolutely. Being able to speak the language of your audiences is absolutely critical. I had no idea how many acronyms I was going to learn when I started working in the military space, but I still remember a lot of those today, and sometimes they just come out automatically. I think even in grad school, as a consultant, that's something I picked up on too. We had a lot of different contracts from government, private industry, some military. And so from one day to the next, the level of detail and the exact language and terminology I would use would often shift, really just trying to align with that audience. It reduces confusion, we have to answer fewer questions, and ultimately we get their buy-in on either what we're finding or what we're recommending.

Being able to speak the language of your audiences is absolutely critical.

Tech stack and tools

I wanted to ask while we are sort of in this space, and we're talking about what data science is and how we do it. Usually by this time, we have asked a tech stack question, and I wanted to throw that in there. So what tools do you use to do data science, whether that's at Pinterest or anywhere else? Languages, IDEs, environments that you work in. Do you use Databricks? Do you use Snowflake? Do you use any tools that we would recognize?

Yeah. I have worked with Snowflake and Microsoft SQL Server Management Studio. I mean, throughout my career, I've had plenty of times where I just received Excel spreadsheets full of data too. For me, my preference for almost as long as I can remember now is RStudio. At a certain point in grad school, I committed myself to learning R. In psychology, the universities I went to, and I think most, typically teach statistics in SPSS, which is expensive and clunky and sort of limited in what you can do with it, at least in my experience. But R seemed like the holy grail of being able to do anything and everything, or build functions and build your own algorithms. So RStudio is where I spend the majority of my time doing analytics.

I prefer it for a lot of different reasons. I have had use cases in projects where I've worked in Python. I don't think I'm fluent in Python, but I'm comfortable with it. But ultimately, for me, in the space that I work in, applied statistics, I just feel like the packages in R are much more robust and fit my needs much better. I do use others, like Tableau. I was resistant to it at first, but I've sort of grown to like it; building dashboards in Tableau has definitely been helpful, especially in some cases. I still use Excel and Google Sheets for some things. It's kind of unavoidable. But yeah, really, I spend most of my time in RStudio. We have Posit Cloud at Pinterest now, which connects directly to our data sources. That's been a big improvement in terms of efficiency, since I'm not having to work with flat files.

Yeah, that's been a huge lift off of our backs. And also, I'm not a security expert, but I know that Posit Cloud is a much more secure environment than downloading flat files onto your laptop.

Third-party data and benchmarking

Yeah, I guess my question: it's interesting that you also worked in defense. Ironically, one of the software projects I work with is GRASS, which is geospatial software originally from the Army Corps of Engineers. But I'm kind of curious about what role third-party data plays in your analysis, for example data from data brokers and things like that. And is geospatial data important for the kind of analysis that you do?

Good question. To be perfectly transparent, I don't think I've ever really worked with geospatial data much in people analytics, aside from maybe understanding our organization's population in terms of its geographic footprint, like where people are sitting. I mean, I've done a lot of analyses comparing remote, hybrid, and in-office employees, but I know that's not really getting at what you're talking about with geospatial data. Yeah, that's an interesting question; I'm trying to think about how we might use that in our space. I will say, in terms of third-party data brokers, I haven't really worked with any in people analytics.

Maybe the exception might be benchmarking data. There are vendors that provide that. Gartner, I know, is a big one that a lot of people use, and I know Aon offers some different benchmarking services. I'm a little biased against benchmarking; I don't feel like it's ever really an apples-to-apples comparison, and I think leaders often assume that it is. So that's something I often get hung up on. I will say, I think some of the better benchmarking data is government provided, like the JOLTS survey (Job Openings and Labor Turnover Survey) that the Bureau of Labor Statistics puts out. It is a little lagged, in that it's always a month or two behind the current date, but I think it's a more objective and more robust benchmark than some of the private ones that people pay for. That's my opinion.
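Part of what makes JOLTS usable as a benchmark is that BLS publishes it as a rate: the month's separations divided by employment, times 100. Computing internal attrition the same way keeps the comparison as close to apples-to-apples as benchmarking gets. The figures in this sketch are made up, not real JOLTS numbers.

```python
# JOLTS-style rate: events in the month per 100 employees.
# Using the same formula internally makes benchmark comparisons fairer.
def monthly_rate(events, headcount):
    """Events per 100 employees for the month."""
    return 100 * events / headcount


internal = monthly_rate(events=12, headcount=800)  # 12 leavers of 800
benchmark = 2.1  # hypothetical industry separations rate, per 100
print(f"internal {internal:.1f} vs benchmark {benchmark:.1f}")
# internal 1.5 vs benchmark 2.1
```

Even then, the caveat above stands: industry mix, role mix, and geography can make a "matched" benchmark misleading, so the rate is a signal rather than a verdict.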

Building a data-driven culture

Yeah, so thanks for being here, Trevor. I guess my question is a little bit more on the abstract side, but high level: what strategies would you suggest for someone who wants to motivate a more data-driven culture in the HR department, and make the case for people analytics? For example, at the company that I work for, we have survey data, Workday data, things like that, but we don't necessarily run analysis on top of that data. So what strategies or recommendations would you have to sell the idea that, hey, applying some data analytics workflows could help provide more insight into, like you mentioned before, employee attrition or productivity, and overall improve business outcomes?

Yeah, that's a great question, and it's a question I read a lot about. Developing a more data-literate culture and more data-literate leadership is something I read articles about at least a couple times a week, I feel like. And throughout my career, that's just been an ongoing conversation. My advice, if you've got data and you're trying to figure out how to get buy-in to start leveraging it, would be: start small. Figure out what gaps your stakeholders are facing. Is there a problem that they're struggling to solve, to understand?

Okay, let me give a more specific example: survey data. You say you've got survey data. For me, as an IO psychologist, that is the goldmine of data sources. If I could work in employee survey data all day, I probably would, because I think it's just such a rich resource and such good information that we get from our employees, whether that's through the scores or the comments that they leave. But I've often found leaders asking questions like, well, what drives, fill in the blank: What drives engagement? What drives job satisfaction? What's driving our sentiment on manager support?

If you can find those types of questions and provide some insight into them, I think that's how you can start to build that buy-in. And then, in my experience, once somebody gets a taste for it, kind of having that moment with the data that you're presenting them, they're going to start coming back for more. They're going to be like, hey, you showed us what drove engagement; what's driving burnout? What else can we do with this? And from there, you get the ball rolling and you start to develop trust with your stakeholders. Then you can start to think about the more long-term questions: okay, how do we build a workflow to support this in a more consistent or ongoing way? But to build that trust and to get that buy-in, you do kind of have to start out with something bite-sized to get things started. That would be my best advice.

To build that trust and to get that buy in, you do kind of have to start out with something bite-sized to get things started.
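As a sketch of what that first bite-sized "what drives engagement" pass can look like: correlate each survey item with the engagement score and rank items by strength. The Pearson correlation is hand-rolled here, and the survey responses are invented; a real analysis would want many more respondents, significance tests, and care about causal claims.

```python
# Simple key-driver sketch: rank survey items by their correlation
# with an overall engagement score. Data below are invented.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


engagement = [5, 4, 2, 5, 3, 1]          # one score per respondent
items = {
    "manager_support": [5, 4, 2, 4, 3, 2],
    "compensation":    [3, 4, 3, 4, 3, 4],
}
drivers = sorted(
    ((pearson(scores, engagement), item) for item, scores in items.items()),
    reverse=True,
)
for r, item in drivers:
    print(f"{item}: r = {r:.2f}")
```

With these toy numbers, manager_support tracks engagement strongly while compensation barely moves with it, which is exactly the kind of ranked, digestible finding that gets a stakeholder asking for the next analysis.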

Heartbeat analysis and survey data

I think that's a valid concern, and yeah, we could probably talk for hours. I learned about an analysis a few years ago; the Society for Industrial and Organizational Psychology has a conference every year, and I saw a presentation there on something called heartbeat analysis. Real quick: if you think about when people are taking a survey, sometimes they just go down the line and, if it's on a scale of one to five, they click five for everything, but maybe there's one item that they really feel negative about. So it's fives on everything, but then there's a one on this one item. In heartbeat analysis, you basically normalize everybody's scores and see which items they're scoring higher or lower than their normal score, and you count those as upvotes and downvotes. It's an interesting and different way to interpret the data. In one of my previous roles, we used that and it got a lot of traction with leadership; it really seemed to resonate with them. Because when you fill out a survey, or when you look at a lot of survey data, you realize that for a lot of people, four is the highest they're going to score anything. They're not going to give anything a five. So if they give everything a four and then one thing a five, you're like, oh wow, they really felt strongly about that one.
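Sketching one interpretation of the heartbeat idea described above: center each respondent's ratings on their own mean, then count items scored clearly above (an upvote) or below (a downvote) that personal baseline. The 0.5 threshold, item names, and data are illustrative assumptions; the actual conference method may differ in detail.

```python
# One reading of "heartbeat analysis": votes are deviations from each
# respondent's own average rating, not from the raw 1-5 scale.
from collections import defaultdict

THRESHOLD = 0.5  # how far from the personal mean counts as a "beat"


def heartbeat(responses):
    """responses: list of {item: rating} dicts, one dict per person."""
    votes = defaultdict(lambda: {"up": 0, "down": 0})
    for person in responses:
        baseline = sum(person.values()) / len(person)
        for item, score in person.items():
            if score - baseline >= THRESHOLD:
                votes[item]["up"] += 1
            elif baseline - score >= THRESHOLD:
                votes[item]["down"] += 1
    return dict(votes)


# First respondent is the "all fours, one five" pattern from the story.
responses = [
    {"manager": 4, "pay": 4, "growth": 5, "tools": 4},
    {"manager": 5, "pay": 1, "growth": 5, "tools": 5},
]
print(heartbeat(responses))
```

With these inputs, "growth" collects two upvotes and "pay" draws a downvote: the first respondent's lone five registers as a signal even though it never dips below four on the raw scale.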

The future of people analytics and AI

Good question. Yeah. I mean, I think like most spaces, AI has been the topic of conversation the last couple of years, and I think as a field we're still trying to figure out how to use it effectively and efficiently. I'm trying to think of something beyond just AI, because that seems to be what everybody's talking about now.

Well, I don't work in this space, but I have friends who do. IO psychology really focuses on assessments, like pre-hire assessments and how we select the right people for the right job; that's really what the field was built on. And over the last few years, I know there have been a lot of projects, though I don't know anybody who's really gotten this off the ground yet, to do interviewing in more automated ways, where you're basically interviewing with some sort of AI chatbot or entity, and the algorithms score you based on your answers and your behaviors and things like that. I think that's been sort of fraught with bias, and that's why we haven't really seen it more broadly. But I know there's a lot of interest in being able to do that and to get it right, which is obviously the hardest part. So I would say we'll probably start seeing more of that as gen AI continues to advance.

I do wonder about some of the recent research. I know personality assessments used to be a lot more common for pre-hire assessments, and it seems like that's shifting, in terms of who's using them and how those assessments are designed. We see a lot more forced-choice personality measures now compared to some of the more traditional behaviorally anchored rating scales. That's a good question. It's hard to predict what's going to happen next, even when we work with all the data.

Yeah, even when we're right there with it. You've mentioned AI; it's the upcoming thing, it's on everybody's mind. Do you want to talk a little bit about what you're excited about specifically? Like how are you starting to integrate it, and what are you excited about integrating LLMs into your workflow to tackle?

Yeah, definitely. I work with them a lot. For me, again, I'll lean back on the employee survey stuff, because that's really one of my favorite things to work with. I think one of the most challenging things in people analytics, maybe not the most challenging, but definitely the most time-consuming thing we do, is analyzing all of the comments that come in on the employee surveys. Oftentimes a survey closes and we have maybe a week or two to analyze all the data and get decks out to leadership to share those findings. And so when you've got 20, 30, 40,000 comments to sift through, that becomes pretty overwhelming for such a short timeframe. So as these AI models have come out, I immediately was like, oh, this could really help make our comment analysis process much more efficient.

I think one of the challenges, though, as I'm working with the AI, is that I really don't need a creative writing copilot; I need a scientist copilot. I need the AI to get it right, and I need it to be reliable. If I give the AI a batch of comments and ask it a question, I should get the same response that my counterpart, or anybody else on my team, would get. So I've been exploring more of the local models using Ollama, where I can set temperatures and do a little more tuning, and I feel like that's been generally successful. Getting the exact same response every time is still a challenge, but I do feel like one of the big successes for people analytics is going to be being able to process all of that unstructured data in a much more efficient way.
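For concreteness, here is the kind of reproducibility setup being described, sketched against Ollama's REST API, which accepts sampling options such as temperature and seed. The model name, prompt wording, and seed value are placeholder assumptions, and as noted above, pinning them still doesn't guarantee byte-identical output on every machine. The snippet only builds the request body; it doesn't send it.

```python
# Build a deterministic-as-possible Ollama /api/generate payload for
# batch comment classification. Nothing is sent; we just construct it.
import json


def build_request(comments, model="llama3"):
    """Assemble a request body with sampling pinned for repeatability."""
    prompt = (
        "Classify each employee comment as POSITIVE, NEGATIVE, or NEUTRAL. "
        "Reply with one label per line.\n\n"
        + "\n".join(f"{i + 1}. {c}" for i, c in enumerate(comments))
    )
    return {
        "model": model,                               # placeholder name
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": 0, "seed": 42},    # pin the sampling
    }


payload = build_request(["Great manager.", "Tooling is slow."])
print(json.dumps(payload["options"]))
```

With a local Ollama server running, you would POST this to http://localhost:11434/api/generate; two analysts sending the identical payload should then get answers as close to identical as the model and hardware allow.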

Again, I feel like right now we get an output and then we have to go back and check, basically do it ourselves, to see how close the output actually was. But I think that's definitely a ripe opportunity for people analytics with AI. There's also being able to just give it a table of data and ask it a question about it. I'd really like to see how that advances. Maybe, to the previous question about building data literacy, it could let HR business partners or other people outside of the people analytics team self-serve and do some of those analyses on their own to create those insights. I think that would be really beneficial for organizations as well.

Measuring the ROI of data teams

So I'm interested in this because I'm in a kind of similar role, you know, being a consultant within your organization, being kind of a just-in-time data guy or data team. I think the people that we work most closely with value our input and see our value, but how do you express the return on investment of having that data team or those data consultants upwards within the organization? What's the value of having a just-in-time data team?

That's a tough question. A really good question. I think it's really challenging. This isn't something that we've done at Pinterest, but in my previous roles I have had cases where there was a really strong desire to see the ROI of everything, and leaders who really wanted to put a dollar sign on every single program we had, to say: is this costing us or is this saving us money? And I think with people programs, it's incredibly challenging to do. In order to do it, you're going to have to align on accepting some assumptions. For me, that was the biggest blocker. We would come up with, okay, here's a way we can model the cost of something, but it felt like you were never really covering all the bases. It's a signal, it's the best we've got; based on the data we have, this is what we think. But there was always a leader to say, oh, well, we're not accounting for this, we're not accounting for that. And it was very, very challenging to ever really come to an end point of, yes, this is the ROI of this program.

With people data, I think there are some things you can lean on. Cost of turnover is something we can get pretty close to, I think, or maybe time to productivity; you could put sort of a monetary value on that. So it was really about identifying what we can associate a cost to, and then, from there, how we can make sense of that in some way: how are these things related? It was fun in the sense that it was very ambiguous and no one knew what the right answer was. But again, it was a big challenge to work with leaders and get them bought into, yes, this is the ROI of this training or this onboarding system or whatever it happened to be.
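A back-of-the-envelope sketch of the turnover-cost anchor described here. Every input is an assumption that leadership has to accept up front, which is exactly the alignment problem mentioned above; the replacement-cost percentage and ramp-up figures below are illustrative placeholders, not a standard.

```python
# Rough turnover-cost model: replacement cost plus lost productivity
# while the backfill ramps up. All parameters are negotiable assumptions.
def turnover_cost(leavers, avg_salary, replace_pct=0.5, ramp_months=3):
    """Estimated annual cost of turnover under stated assumptions."""
    replacement = leavers * avg_salary * replace_pct
    # Assume the backfill runs at ~50% productivity during ramp-up.
    lost_output = leavers * (avg_salary / 12) * ramp_months * 0.5
    return replacement + lost_output


# 20 leavers at a $120k average salary (illustrative figures):
print(f"${turnover_cost(20, 120_000):,.0f}")  # → $1,500,000
```

The value of a sketch like this is less the number itself than making the assumptions explicit: each parameter is a line item a skeptical leader can challenge, adjust, and eventually sign off on.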

Qualitative vs. quantitative data in people analytics

We've mentioned qualitative data, or qualitative analysis, a few times. Can you give a quick rundown on what qualitative versus quantitative data is in people analytics? Yeah. In people analytics, it's primarily going to be open-ended comments, whether from a survey or some other form, plus interviews or focus groups. I've had projects where we've done interviews, facilitated focus groups, and transcribed everything. In grad school, I was taught how to do qualitative research in a more traditional way: not necessarily using NLP, but literally going into the text, highlighting, making codes, and basically doing topic modeling from scratch.

So that's some of the qualitative data in people analytics. There might be other behavioral-type observations, but even then we're really getting into the quantitative world. I do remember when I first started at the Army, I thought I knew what qualitative research was because I had studied mixed methods, and I quickly learned that what I thought was qualitative was still quantitative. I would think, okay, let's look at our topic modeling, and whatever the most frequent topic that came up is, that's our key topic. And it's like, no, that's quantitative. If you're counting anything, if you're ranking anything, you're not doing qualitative anymore. So I would say in people analytics we do more of that mixed-methods type of work: we're organizing qualitative data and then analyzing it in both qualitative and quantitative ways.
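The mixed-methods point above, that the moment you count or rank codes you have crossed into quantitative analysis, can be shown with a tiny sketch. The comments and code labels here are invented for illustration:

```python
from collections import Counter

# Hypothetical open-ended survey comments, each manually tagged with
# qualitative codes (themes), the way you would when coding text by hand.
coded_comments = [
    ("Love my team, but meetings eat the whole day", ["collaboration", "meeting load"]),
    ("Not sure how promotions work here",            ["career growth"]),
    ("Too many meetings, no focus time",             ["meeting load"]),
    ("My manager gives great feedback",              ["manager support"]),
    ("Unclear what the next level looks like",       ["career growth"]),
]

# The coding itself is the qualitative step; tallying and ranking the
# codes is the quantitative step of a mixed-methods analysis.
code_counts = Counter(code for _, codes in coded_comments for code in codes)
for code, n in code_counts.most_common():
    print(f"{code}: {n}")
```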

Common models and methods

What's the best approach to measure something sensitive, like incentives, where you can't really set up A/B testing? What kinds of methods or models would you use there? And I think someone had asked a question about modeling, maybe Rohan as well: common methods that people analytics teams are using. Do you use structural equation modeling, SEM? Maybe a quick rundown on that before we say goodbye.

Yeah, I wish I had more time to answer this. Structural equation modeling is one of my favorite things to go to. You can do a lot with SEM, especially in terms of looking at mediators and moderators of different relationships between your predictors and your outcomes, and being able to control for multicollinearity in predictors and outcomes. One package I use a lot, going back to the topic of understanding drivers in survey analysis (like what drives engagement), is for relative weights analysis. There's a package in R called RWA that's a really powerful tool for this. I won't go into the details of how it works with three minutes left, but essentially it's a way of using multiple regression that allows you to control for multicollinearity in your predictors, so it gives us a way of rank-ordering the most important features for whatever outcome we're looking at. That's a model I've leaned on quite a bit over the last several years of my career, and I feel like I've been able to land it pretty well with stakeholders and get buy-in on what it means.
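Relative weights analysis (Johnson's method) works by orthogonalizing the predictors so that correlated drivers can be compared fairly. A minimal Python sketch of the two-predictor case is below; the square root of a 2x2 correlation matrix has a closed form, which keeps it dependency-free. All correlations are invented for illustration, and the R package mentioned handles the general multi-predictor case.

```python
import math

def relative_weights_2(r12, r1y, r2y):
    """Johnson's relative weights for two standardized predictors.

    r12: correlation between the two predictors
    r1y, r2y: each predictor's correlation with the outcome
    Returns the raw weights, which partition the model's R^2.
    """
    # Lambda = square root of the 2x2 predictor correlation matrix
    a = (math.sqrt(1 + r12) + math.sqrt(1 - r12)) / 2
    b = (math.sqrt(1 + r12) - math.sqrt(1 - r12)) / 2
    # Regress the outcome on the orthogonalized predictors:
    # beta = Lambda^-1 @ r_xy
    det = a * a - b * b
    beta1 = (a * r1y - b * r2y) / det
    beta2 = (a * r2y - b * r1y) / det
    # Raw weight j = sum_k Lambda[j,k]^2 * beta[k]^2
    eps1 = a * a * beta1 ** 2 + b * b * beta2 ** 2
    eps2 = b * b * beta1 ** 2 + a * a * beta2 ** 2
    return eps1, eps2

# Invented survey correlations: "manager support" and "career growth"
# correlate with each other (0.5) and with engagement (0.6, 0.4).
e1, e2 = relative_weights_2(r12=0.5, r1y=0.6, r2y=0.4)
r2 = e1 + e2  # the raw weights sum to the regression R^2
print(f"R^2 = {r2:.3f}")                  # -> R^2 = 0.373
print(f"manager support: {e1 / r2:.0%}")  # -> manager support: 77%
print(f"career growth:   {e2 / r2:.0%}")  # -> career growth:   23%
```

This is the rank-ordering described above: because the weights sum to R^2, each driver's share can be reported as "percent of explained variance," which tends to land well with stakeholders.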

When it comes to validating what we're measuring, confirmatory factor analysis is, I think, the gold standard. Sorry I don't have more time to really dive into this; those are what's top of mind, what I lean on most often. I will say Keith McNulty has a book on regression; I think one of the previous guests recommended it. Alexis Fink from Meta wrote the foreword, and she refers to regression as the Swiss army knife of people analytics. I always keep that in the back of my mind. It's such a great tool to lean on: if we don't have another approach in mind, regression is always a great place to start.


Fantastic. All right, resources are in the chat; Rachel linked Keith McNulty's book, which is great. I am so glad to hear that somebody out there is using CFA. It makes me feel better about spending so long learning it in undergrad. Thank you so much, Trevor, for spending time with us, everybody. It was so wonderful to have you here. I am going to stick around for another minute in case you would like to save the chat: in the upper right-hand corner of the chat, you can click the little three dots and save it, so you've got all those resources. And I wanted to let everybody know that next week we have Kara Thompson. I am so, so excited. Kara Thompson is an amazing data visualization person who runs her own business called Building Stories with Data, and I'm so excited for everybody to meet her. So thank you. Thank you, Trevor. Thank you all. Thanks for having me. Yeah, happy Friday, everybody. Happy birthday to Rachel. We'll see you on Thursday.