Tuesday, September 8, 2015

Is hiring all in the brain?


Applying brain-imaging science to hiring sounds cool and cutting edge – exactly the sort of thing we ought to be doing in the 21st century.  High-profile startups argue that neuroscientific recruitment and selection is more strongly predictive, more reliable, less biased and easier to implement than traditional methods.  

But can brain imaging research really be applied practically in the workplace?  And can it solve America’s multi-billion-dollar mis-hire problem?  

Let’s take a look at the evidence…

Hiring today: a $$$ no-brainer

One thing everyone can agree on: hiring today needs some serious fixing.  Almost half of new hires don’t last even eighteen months in the job.  The costs of these thousands of mis-hires – and they happen even at highly regarded companies – are frightening.  

No other core business process would be allowed to get away with these kinds of inefficiencies.  It’s time to get hiring to work.

First, though, we have to understand why hiring isn’t delivering great results.  It is unlikely to be because of under-investment – hiring costs continue to rise year on year and were up by 7% in 2014.  Much more likely is that too much time and money is being spent on activities that do not accurately predict performance in the job.

Hiring, after all, is making a prediction – choosing in advance the best performer from a pool of potential candidates.  There has been a lot of research into which factors predict success at work, and many common selection inputs such as the résumé, the traditional interview and a candidate’s years of experience have been shown to be at best weakly reliable pointers to future success.  

Of course, it is not only recruiters who are getting things wrong.  Not every job candidate has great self-knowledge.  It’s easy to get seduced by the idea of being a leader, for example, even if you rarely demonstrate the characteristics and abilities that typify great leadership in action.  And if we know ourselves only partially, we hardly know jobs at all.  Most of us have limited and distorted ideas about what different jobs and organizations are really like, about which factors reliably drive success and which are less relevant.

Neuroscientists are convinced they can do better.  They point to the scientific research behind their approaches and to the ease and accuracy of assessment.  Let’s take a look at what they do:

The case for neuroscientific hiring

We may not yet have the technology to create the World’s First Bionic Man, but using fMRI (functional magnetic resonance imaging) and other new approaches we can see with increasing clarity what’s popping in our brains.  

Researchers have peered inside people’s heads while they play games, make decisions, solve problems and experience emotions.  They have also looked at the interaction of brain processes with physical responses – how people’s faces or pulse rates change when experiencing specific emotions, for example.  From these experiments they have found evidence for four broad categories of neurological activity: mental processing speed and accuracy; memory; executive control; and perception and social cognition. 

Companies have taken these research findings, and the experiments which reveal them, and used them to assess job and career fit.  Candidates take anything from two to twelve assessment exercises that feel like 1980s videogames or lab experiments.  They get feedback, which is sometimes quantitative (“You solved the problem faster than 60% of your peers”) and more often qualitative (“You’re risk-averse”; “You’re a quick thinker”).  Results are matched against jobs and careers, based on profiles determined by an employer and/or by data from similar jobs.  Typically, an individual is given career recommendations and perhaps development suggestions, while an employer is presented with the profiles and contact details of candidates who match well to the job.
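To make that matching step concrete, here’s a minimal sketch of how a trait-profile match might be computed.  To be clear, this is my own illustration with made-up trait names and scores, not any vendor’s actual algorithm – it simply compares a candidate’s trait vector against a job profile using cosine similarity:

    import math

    # Hypothetical trait scores (0-100) produced by assessment games.
    # Trait names and the scoring scale are illustrative assumptions.
    candidate = {"processing_speed": 72, "memory": 58,
                 "executive_control": 81, "social_cognition": 64}

    # Target profile for a role - e.g. set by the employer, or derived
    # from data on high performers in similar jobs.
    job_profile = {"processing_speed": 80, "memory": 50,
                   "executive_control": 85, "social_cognition": 60}

    def match_score(cand, job):
        """Cosine similarity of two trait vectors (1.0 = identical direction)."""
        traits = sorted(job)
        dot = sum(cand[t] * job[t] for t in traits)
        norm_c = math.sqrt(sum(cand[t] ** 2 for t in traits))
        norm_j = math.sqrt(sum(job[t] ** 2 for t in traits))
        return dot / (norm_c * norm_j)

    print(f"Match: {match_score(candidate, job_profile):.3f}")  # ~0.996

A real system would presumably use richer features and a similarity measure tuned to the employer’s data, but the principle – reduce candidate and job to comparable vectors, then rank by similarity – is the same.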

Neuroscience companies claim that this approach solves key problems with the hiring process: assessments that rely on candidates’ limited and often inaccurate self-knowledge, assessments that can be gamed by savvy candidates, and assessments that are subject to bias on the part of recruiters and hiring managers.  By using brain games, they say, they can get at the real truth about a candidate – helping individuals see past their own prejudices to find the right career, and helping employers identify the right, high-performing new hire. 

Does it work?

Before we look at the specific claims made for neuroscientific hiring, I have a more fundamental question: does what is revealed by the neuroscience tests genuinely predict performance at work?

The evidence is mixed.  When we compare neuroscience data to other research linking ways of thinking to work outputs, we find some overlap, but also some differences.  Cognitive ability – mental processing speed and accuracy – is a big area of neuro-investigation and has been shown to be highly predictive of future work success.  But the evidence is much less clear when it comes to other factors.  We can measure someone’s short-term memory capability quite accurately, for example, but it’s much less clear how important short-term memory is for success at work, or success in particular kinds of work, or how it operates outside the calm, one-on-one conditions of a laboratory experiment.  It may well be that short-term memory ability really does distinguish the best from the rest, but so far nobody has proved it.

There is also the question of whether the tests measure what the researchers think they measure.  It’s a fascinating idea that we can peer into someone’s brain and see what they are thinking – including which celebrities they obsess over – but the reality of brain imaging is a little more complex.   

Take emotional states.  Recent meta-analyses have found no reliable evidence that the brains of people experiencing an emotion all react in the same parts, or in the same way.  Brain scans of people experiencing fear, for instance, show different patterns and intensities of activity.  Back in 1995 Daniel Goleman coined the catchy term “amygdala hijack” to describe an overwhelming fight-or-flight response, but more recent research has suggested that he got it wrong. 

Only a quarter of studies since 2009 showed an increase in amygdala activity during fear, and many studies showed amygdala activity increasing during non-emotional thoughts and experiences.  Even more significantly, individuals whose amygdalae have been destroyed can often still experience full emotional lives.  The seductive idea that we can measure activity in the amygdala and thereby discover the intensity of someone’s terror is just not true.  People may experience the fight-or-flight response, but that experience does not happen only – or perhaps at all – in their amygdalae.

These variations in the experimental data exist because the brain is extraordinarily adaptive.  Different parts of our cerebral cortex can take on different functions well into adulthood.  The reality – so far as we currently understand it – is that the brain is a bunch of multi-purpose networks that come together in a variety of ways to make our minds and bodies work.  Mapping those networks and pathways is work still to be done, and perhaps needs more advanced imaging technology to be feasible.

It’s not just brain imagery that has often been misinterpreted, but all kinds of other neuroscience data derived from physical responses.  We may kid ourselves we can recognize lying and emotions by micro-analyzing facial expressions, but the evidence just isn’t there.  Big-data comparisons of facial analysis research studies find no consistent emotional facial expressions – different people experiencing the same emotion will show a range of expressions (and a range of other physical responses, such as changes in heart rate). 

Given the uncertainties around the research on which these games and tests are based, and the lack of solid evidence linking their results to actual successful performance in specific jobs, these assessments cannot be shown to get around candidate self-ignorance or lying, or to be less subject to bias than other recruiting methods.  You may believe they are better, but at present there is simply no evidence either way.

What really works

When it comes to something as important as hiring, we really should trust the evidence and not our prejudices.  There have been decades of research into predicting work performance, and the right combination of assessments can get you correlations with future performance of over 0.7 – an outstandingly high predictive value, which some selection experts believe can be pushed above 0.8 for certain roles when supplemented with the right mix of focused interviewing and other techniques.
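To give a feel for what a combined validity of 0.7 means, here’s a toy sketch – entirely synthetic data and illustrative weights, not results from the actual studies – that combines three assessment scores into a single predictor with linear regression and measures its correlation with later performance:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500  # synthetic "past hires"

    # Three hypothetical standardized assessment scores per hire.
    cognitive = rng.normal(size=n)
    work_sample = rng.normal(size=n)
    interview = rng.normal(size=n)

    # Synthetic "true" performance: a weighted mix of the assessments
    # plus noise.  The weights are illustrative assumptions only.
    performance = (0.5 * cognitive + 0.4 * work_sample +
                   0.3 * interview + 0.6 * rng.normal(size=n))

    # Fit a linear combination of the assessments to past performance.
    X = np.column_stack([cognitive, work_sample, interview, np.ones(n)])
    coef, *_ = np.linalg.lstsq(X, performance, rcond=None)
    predicted = X @ coef

    # Correlation between the combined predictor and actual performance.
    r = np.corrcoef(predicted, performance)[0, 1]
    print(f"combined validity r = {r:.2f}")  # ~0.76 with these weights

The point of the exercise: no single assessment in this sketch predicts performance as well as the weighted combination does (the strongest alone manages only about 0.54), which is exactly why the research favours combining several well-validated methods.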

Of course, assessments don’t sound quite as cool as neuroscience games (and brain games are a genuine consumer market, competing with the likes of Candy Crush – never mind that earlier claims about boosting brainpower or protecting against Alzheimer’s have been proved false).  But would you rather have a fun recruiting approach, or get genuinely business-transforming results? 


I know which I’d want used when it came to my career.