10 min read

What talent assessments are and how you can assess yourself

Stefan van Tulder, founder of Talent Data Labs, sheds light on the world of professional assessments, how they are currently misused, and how people can take control of their own assessments.

September 29, 2020
Yuma Heymans
September 7, 2022

Stefan van Tulder is one of the thought leaders within the talent assessment industry.

He is the founder of Talent Data Labs (TDL), which helps companies become prediction experts in Talent Analytics. Through a combination of statistical methods, AI, machine learning, robotic process automation, and data extraction technologies, TDL helps companies generate, analyse, and use data for predictions and people analytics algorithms.

He is also co-founder of Career Analytics, a platform that helps people navigate the future of work and close their skills gaps by enabling them to assess themselves.

Here's the full conversation I had with Stefan on the current state of talent assessment and what the future of talent assessments holds:

Full conversation on the current state and future of talent assessments

Below is the transcript of the full conversation with Stefan van Tulder.

Stefan van Tulder 00:00 - What talent assessments are

There's this fancy word, psychographics, which is just a visualization of how you, as a person, show up in data in the world. Now, for me, there are two key differentiators here. There are a lot of things that call themselves assessments, but an assessment should create a profile that actually predicts some behaviour, right? How you interact in daily life needs to be predictable from that assessment. Otherwise, it's more of a social tool, what we call a horoscope-category tool. And the only thing we're interested in is creating data that we can actually help people with, by being able to understand what it predicts. So that's sort of the baseline for an assessment.

Stefan van Tulder 1:05 - History of talent assessments

Assessments have been evolving over time, which was the second part of your question.

That's actually very interesting, because the most basic form goes back roughly 120 years: assessments were done with pen and paper. You filled in a form, and there was a scoring key at the bottom. From an administration point of view, assessments have not really changed since then.

The most important ones, which everyone probably knows, are personality and logical reasoning tests. Those are also the two categories with the most dominant providers, and they basically allow you to measure a certain level of intellectual capacity and the ability to handle complexity.

Now, those tests haven't been reinvented over and over and over again. But if you believe the very strong research by Schmidt and Hunter, two names that constantly evaluate validity, meaning predictive power, how well a test predicts future events, then basically the only things that really matter are personality and logical tests.

And we can dive deeper into personality if you want later, but I'll leave it here for a bit.

Yuma Heymans 2:56 - How do you know if someone has the right level of intelligence and personality for the job?

You're saying personality and intelligence are the most important predictors of success and happiness, and that they work. So how can we break down those two factors? How do you know how intelligent someone is, and how do you know when someone has the right personality for the right job?

Stefan van Tulder 3:52 - How you measure intelligence

Yeah, I think it's a really complex question, in terms of how you know when someone is something, and the best we can do is estimate it, right? So I'm just going to answer how we best estimate that, because this is social science, and one thing that's very important to know is that social science is not like rocket science. You don't have to land on an exact spot on the Moon or Mars or wherever we're aiming to go in the future.

In social science, you're looking at society at large, and you should be able to influence the movement of the group.

And if you feel like you're being subjected to an assessment as a candidate, for example when you think, 'I don't know what it's used for', that's a flaw of the market. A lot of companies providing assessments try to hide their knowledge, their databases, and their data with all of their norms. They try to hide that from users and from companies, so that people won't be able to cheat on these tests, copy their work, or take their scores to a cheaper company.

So that's basically a flaw that we find annoying as well. 

I was also rejected so many times for large corporate firms.

So I think one of the important things you asked here is how we can figure out whether we're getting to a good description of someone who is going to succeed.

We look at personality and IQ, but we also look at things like culture; we ourselves work with culture and look at its validity and predictive power. But culture is just not as powerful a predictor as intelligence and personality are.

Intellect is something that allows you to deal with complexity in organizations and situations, but also with really strange and abstract things like mathematical formulas. Now, if you want to create an intelligence test, what you want to do is ask a lot of questions, increasing in difficulty.

So you can start with two plus two. But later you want to require logical gaps and logical jumps as well. And since mathematics is just one part, you also want to do something else.

So we build something that's usually called logical reasoning, or analogies. You've seen this: a matrix with two or three or four pictures or frames, with one empty space in the grid, and you have to figure out which option completes it, or what's missing.

Now, that gets the brain working with limited information, and ruling things out is a task that requires heavy processing power. So usually, as the test goes on, we remove more and more information, forcing the brain to predict more.

We assume that people who are least able to answer the questions are the least intelligent, and we assume that there are certain levels that allow you to do certain intelligent tasks. And if you're at the lowest level of intelligence, you're still perfectly able to think; we're not saying you can't think or can't do anything. It just becomes more difficult for you to absorb a lot of complexity and different information.
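To make the idea of items of increasing difficulty concrete, here is a minimal, purely illustrative sketch in Python (not Talent Data Labs' actual method): it estimates a candidate's ability from their answers using a simple one-parameter logistic (Rasch-style) model. The items, difficulty values, and responses are all made up for the example.

```python
import math

# Illustrative items of increasing difficulty (hypothetical values).
items = [
    {"id": "q1", "difficulty": -2.0},  # very easy, e.g. "2 + 2 = ?"
    {"id": "q2", "difficulty": -0.5},
    {"id": "q3", "difficulty": 0.5},   # matrix analogy with one frame missing
    {"id": "q4", "difficulty": 1.5},
    {"id": "q5", "difficulty": 2.5},   # most of the information removed
]
responses = {"q1": 1, "q2": 1, "q3": 1, "q4": 0, "q5": 0}  # 1 = correct

def p_correct(ability: float, difficulty: float) -> float:
    """Probability of a correct answer under a one-parameter logistic model."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def estimate_ability(items, responses, steps=200, lr=0.1) -> float:
    """Crude maximum-likelihood ability estimate via gradient ascent."""
    theta = 0.0
    for _ in range(steps):
        grad = sum(responses[it["id"]] - p_correct(theta, it["difficulty"])
                   for it in items)
        theta += lr * grad
    return theta

print(f"Estimated ability: {estimate_ability(items, responses):.2f}")
```

The point of the sketch is the shape of the approach: hard items answered correctly push the ability estimate up, easy items answered incorrectly pull it down.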

Yuma Heymans 11:36 - How to handle assessments for different roles?

Let's deep dive a little bit into that, because I can imagine there are different assessments for different jobs, and different things that you'd like to test for. So let's take the example of an enterprise salesperson who has to deliver in terms of having great conversations with clients and understanding them, but also has to understand the complexity within organizations, the dependencies, the different systems in use, etc. So probably intelligence is also a good predictor of the future performance of that individual.

But I can also imagine there are different ways of testing a salesperson versus testing intelligence for a data scientist. How do assessments differ for different roles? And do you have some examples?

Stefan van Tulder 12:37 - Why assessments shouldn't be generalized for roles

So the question you're asking is perfect, in terms of how things differ for different roles, right. And the answer that's been given by the market has predominantly been: a data scientist versus a salesperson looks like ABC. The intelligence level required for a data scientist is 80%, and salespeople don't have to be as smart as the data scientists, so 60% intelligence and an extroverted personality, and such and such.

But do we really believe that the typical data scientist, compared to a salesperson, basically has to look radically different? Or will there be data scientists who would be good salespeople?

The market looks at it like: “We have the biggest database on salespeople, we've assessed 2,000 salespeople, we have a million salespeople that we've met and looked at, so we know the profile that's most likely to succeed.”

They think that having a huge knowledge base works, and that it makes sense. Right? And it does make sense, but only if your company and your environment are generic. But now imagine one of those people goes to a startup and another one works for an NGO or whatever. Do you still hire the same sales profile? Well, the market says yes: sales is sales and data scientists are data scientists.

I say, that's nonsense. It doesn't make sense to me on a population level. 

There are some traits that are more likely to make you a good data scientist versus more likely to make you a good salesperson. But as soon as people end up in a highly specific environment, like a startup, or even a department that does something unusual compared to the rest of the company, those employees are going to be required to do different things in a different way, and are therefore typically different people.

Personally I say, and all my research shows the same: take a company like Microsoft. It's funny, but the work is not going to be the same for two different sales positions at Microsoft. It depends on the product you're selling, the kind of company you work with, etc.

You don't want rocket scientists who are unable to handle complexity, so model the local environment and predict locally.

There are no key identifying factors, but there are things that matter if you look at the population at large.
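As an illustration of what "predict locally" could look like in code, rather than reusing a generic market profile, here is a hypothetical Python sketch: it compares the trait scores of a team's own top performers against the rest of that same team. The Employee fields, the outcome label, and all numbers are invented for the example and are not taken from Talent Data Labs.

```python
from dataclasses import dataclass

@dataclass
class Employee:
    conscientiousness: float  # 0-1 personality facet score (hypothetical)
    reasoning: float          # 0-1 logical-reasoning score (hypothetical)
    top_performer: bool       # local outcome label, e.g. from performance reviews

# Your own department's data, not a vendor's "million salespeople" norm group.
team = [
    Employee(0.8, 0.6, True),
    Employee(0.4, 0.9, True),
    Employee(0.7, 0.3, False),
    Employee(0.3, 0.4, False),
]

def local_profile(team):
    """Average trait gap between this team's top performers and the rest."""
    top = [e for e in team if e.top_performer]
    rest = [e for e in team if not e.top_performer]
    avg = lambda xs: sum(xs) / len(xs)
    return {
        "conscientiousness_gap": avg([e.conscientiousness for e in top])
                                 - avg([e.conscientiousness for e in rest]),
        "reasoning_gap": avg([e.reasoning for e in top])
                         - avg([e.reasoning for e in rest]),
    }

print(local_profile(team))
```

In practice you would want far more people and proper statistics before acting on such gaps; the point is only that the reference group is your own environment, not a vendor's norm database.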

Yuma Heymans 20:00 - How do companies apply this?

How do companies practically apply that? Because it can sound like a theoretical framework, and that might be where it all starts, especially if you approach it scientifically. But when you want to collect that data, where do you start? And do you have some examples?

Stefan van Tulder 20:52 - Applying assessments

Less than 1% of organizations are able to make the right predictions about their employees, like who is most likely to be promoted. Just imagine that you're in an organization and you have to pitch this idea to someone, going into conversations with people to make these assessments a reality within the organization, and before you know it you have a three-year plan on your shoulders, because there is so much to put in place. That's not a plan likely to be executed. But you don't have to make that kind of long-term investment anymore. There's a lot of software available.

With a platform like ours, the organization can provide the data, like performance reviews or engagement reviews, and then the platform makes sense of that data and you can do your analytics yourself. 

Doing an analysis like this, about the differences in personalities and capacities within a team for example, usually skyrockets performance, engagement, etc.

You just need to put in the effort to analyse the environment and spend some time looking at some extra data to see what you're measuring, right? It doesn't have to be just performance. It can also be other outcomes, whatever is most important for you.
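As a hypothetical example of the kind of simple analysis this enables, here is a small Python sketch (Python 3.10+, invented data, not the Talent Data Labs or Career Analytics platform): it joins assessment scores to an engagement survey the organization already has and checks how strongly the two relate in that specific team.

```python
import statistics

# Hypothetical assessment scores (e.g. one personality trait) and an
# engagement survey the organization already runs, keyed by person.
assessment = {"anna": 0.82, "bo": 0.55, "cem": 0.71, "dee": 0.40}  # 0-1 trait score
engagement = {"anna": 4.5, "bo": 3.1, "cem": 4.0, "dee": 2.8}      # survey, 1-5

people = sorted(assessment)
x = [assessment[p] for p in people]
y = [engagement[p] for p in people]

# Pearson correlation: does this trait track engagement in *this* team?
r = statistics.correlation(x, y)
print(f"Correlation between trait score and engagement: {r:.2f}")
```

With only a handful of people this number is obviously noisy; the pattern that matters is combining your own assessment data with an outcome you already track.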

But it sometimes just sounds like a lot of work, and that's why nobody gets started. So we've tried to unlock the basics in the platform, so people only have to do the simple things.

The market's approach is to ask the exact question you asked me, with all due respect to that fair question: “What does a salesperson for large customers look like? What does an enterprise salesperson look like?” And then to invite a bunch of candidates to assess based on their role, and hire the ones who match the top performers. That's not predicting anything, that's accepting the market as it is. And that's unacceptable to me. I want to build something that can help you as an individual, as a candidate, to express yourself and learn from an environment.

The answer is: it's complicated, but it's fixable. You have to work for it, but it's fixable.

And how do you start collecting the data? You may be surprised how many organizations have entire databases on their employees but just don't have access to them. Engagement surveys, for instance.

In an ideal situation, you keep it super simple, even something basic, you know, micro studies, try to test it with one department. We have a platform where you can actually give us the data as well. So use the available tools. 

Yuma Heymans 34:53 - What are you changing in the assessment industry?

As a parting question, what do you see changing in the coming decades? And what is it you personally want to change in this whole talent assessment space?

Stefan van Tulder 35:49 - The future of assessments, user driven

Thanks for the question, but we're not expecting anything. We have given up on this becoming a priority.

There's too much nonsense out there. Companies are doing the same thing over and over again, only with different shells. 

Users are never getting any feedback, they’re always left in the dark.

Career Analytics is our platform where you can assess yourself. You'll learn what you are brilliant at doing and how you can orient yourself toward your next career.

We will just make sure that people really get to better understand who they are and where they fit.
