What is collective intelligence and why should you use it?

Technology has not only brought businesses opportunities they never thought possible; it has also given birth to a vast collection of confusing and overused buzzwords.

Collective Intelligence (CI) is one that you’re likely to hear much more of. Here’s what it means, and how it might change the way you approach business decision making.

Collective intelligence refers to the combination of machine intelligence (AI, data, machine learning, etc.) with human intelligence (emotions, thoughts, experiences, etc.). In short, from a decision-making point of view, it’s the ‘wisdom of crowds’, with robots.

For businesses, that essentially means linking all the sources of information and insight – whether it be data sets, customer experience or judgements of stakeholders – into a ‘collective brain’ that can outperform any of its individual components.

The idea of using collective sources of information to make decisions is nothing new; it’s just that over the last decade the development of technology has increased the number of intelligence sources, and our ability to tap into them.

Practical uses range from simple citizen science to larger-scale data intelligence.

For the last decade Lego has mobilised thousands of enthusiastic fans to develop product ideas. NASA runs regular challenges, sometimes offering financial rewards, to encourage the public to help innovate anything from computer code to space suits.

Siemens “works on the assumption that the collective intelligence of its engineers will be smarter than its managers” and has adopted a systematic approach that enables its engineers to vote on where the organisation should spend its internal development budget.

But that doesn’t mean it is the domain of globe-spanning organisations. For businesses of all sizes, collective intelligence is about working out what information you need to be successful, and identifying whether any sources are currently being overlooked in day-to-day operations.

“In some ways this is obvious, but it’s often not the way that businesses think,” says Geoff Mulgan, author of Big Mind and CEO of innovation charity Nesta, which established the Centre for Collective Intelligence to explore and develop research into CI.

If you break down the conventional “components of intelligence” that are available to businesses – observational data, live models, analysis and prediction, memory, empathy, motor coordination, creativity, judgement and wisdom – technology has had a much bigger impact on some of these components than others. This means that when it comes to making decisions, organisations often tend to be over-reliant on certain sources, most notably data.

“A healthy organisation should have a reasonable balance of these things when it makes decisions,” says Mulgan. “It’s about tapping into those resources of intelligence that they might normally be overlooking.”

Our ability to gather intelligence at scale is being used to look for solutions to some of the world’s biggest problems. Take climate change, for example. Taxes, regulations and new business models only provide part of the solution, as you also need to work out how to change the lifestyle choices of everyday people.

Nesta has also been using CI to study how labour markets might develop over time, combining information from over 4 million job adverts with the perceptions of experts, jobseekers and AI, to forecast skills shortages and demand over the next ten years.

In the age of misinformation, the inability to distinguish facts from perceptions is the biggest threat to the widespread development of collective intelligence, as are the polarising effects of political events like Brexit that stop people talking to each other.

Simply mobilising the crowd or data streams is not enough. Mulgan warns that collective intelligence requires “careful design, curation and orchestration” – this may differ from business to business.

Source: https://www.managementtoday.co.uk/collective-intelligence-why-use-it/future-business/article/1594636

Technology and employee wellbeing – what does the future hold?

By Thomsons Online Benefits, 30 Aug 2019

Employee wellbeing is not a new phenomenon – in fact, it has been a hot topic for some time. But employers are only now truly focusing on tangible actions to support their employees’ wellbeing.

Our new research reveals that the most important priorities for HR teams in 2019 are to support employee health and wellbeing (52%) and attract and retain key talent (56%). So it is no surprise that employers are implementing new technology to keep their employees healthy and happy. Employers today recognise that technology to support employee wellbeing, such as apps and wearables, not only improves the employee experience but also helps to achieve business objectives. Examples range from implementing virtual GP services to offering onsite health checks, wearables with pedometers, and even allowing employees to work remotely. Healthier, happier employees are more likely to stay at the company longer, take advantage of the benefits on offer and take less sick leave.

When it comes to wearables, there has been a significant rise in the use of everyday devices such as Fitbits and Apple Watches – and these are set to grow exponentially to 1.1 billion worldwide by 2022. Wearables enable individuals to track their fitness progress instantly and work towards wellbeing goals. Putting on company-wide step challenges, team sports days and exercise classes all help employees improve their physical wellbeing. Some employers are even collecting data from their employees’ wearables to positively impact their working environment.

However, workplace tech currently lags behind the on-demand, intuitive technology experience employees are used to outside of work, which can negatively affect an employee’s perception of their job role or employer. Furthermore, the ethics surrounding the collection of employee wellbeing data from wearables can be a barrier. Our research – Innovation generation – the big HR tech disconnect 2019/20, a global survey of over 380 HR and reward professionals working in multinational organisations – found only 33% of employers are currently collecting data from employee wearables, and only 46% are using this data to inform benefits decisions.

So what does the future hold for employee wellbeing? We will see a continued focus on technology and data collection to create an engaging and healthy working environment. Our research predicts that by 2020 80% of employers will collect data from building sensors to measure footfall and desk time, and that by 2022 a staggering 81% of employers will collect data from employee wearables.

There will also be an increased emphasis on other areas of wellbeing. Employers are turning to a wide range of tools to support mental wellbeing such as EAPs, mental health first aiders and mindfulness apps. They will also need to prioritise social wellbeing to ensure employees feel a sense of belonging at work. Examples include putting on breakfast talks, quiz nights and community environmental projects. For these initiatives to work effectively, employers need to marry them up with technology to deliver flexibility, choice and ultimately an engaging employee experience. For example, employers can use technology to let employees know the different ways they can get in touch with their mental health first aiders.

Employers who are best supporting their employees’ wellbeing are looking at the bigger picture. They know that wellbeing technology cannot be operated as a standalone entity – for example, a virtual GP service for employees needs to be connected to a provider, GPs, the NHS and individual employees. Many have implemented a best-of-breed ecosystem, comprising integrated technology and leading software apps, with the ability to ‘plug and play’ different tools. As well as enabling third-party supplier connectivity to automate processes and save time, an ecosystem gives employers the ability to trial new wellbeing technology more quickly. In turn, it empowers HR teams to track their impact on both wellbeing and engagement scores, providing a good return on investment and ensuring they are able to continuously improve their benefits offering.

It is important that employers who are looking to adopt wellbeing technology (or make use of existing technology for employee wellbeing) connect this to their wider business goals and consider the end result – whether that be tackling employee engagement during a time of growing digital disruption, attracting and retaining key talent, or simply supporting their employees’ health and wellbeing. With this in mind, wellbeing technology needs to be easy to use and personalised, making it easier for employees to access the support that is most relevant to them. There is no point offering workout classes onsite if most employees are field-based or work remotely. Instead, why not provide easy access to online workouts for employees on the move?

Finally, connectivity is key. To successfully implement wellbeing technology, employers need a best-of-breed ecosystem that encourages open and simple connectivity and enables HR teams to successfully trial new technology. After all, a better employer experience delivers a better employee experience, with multiple opportunities to engage employees with their benefits and improve their overall wellbeing.

Source: https://www.personneltoday.com/hr/thomsons-tech-and-wellbeing-future-hold/

Poor recruitment processes can damage brands

Research into recruitment experiences has underlined the potential damage caused to brands by negative processes. These might include failure to provide feedback and to acknowledge receipt of the application.

The research by Reed UK also found major discrepancies between candidate expectations and companies’ handling of the process. More than two-thirds of applicants (69%) expect feedback if unsuccessful at interview stage, yet only 8% consistently receive it. About one third (35%) rarely or never receive any comments on their application.

Reed’s study claimed that 73% of job seekers would be less likely to use the products or services of a company with which they had a poor hiring experience. This was particularly true of younger candidates (under 35s), of whom 50% said they would share their negative experiences with others.

Three-quarters of job seekers expect to receive confirmation that their application has been received, rising to 84% for those earning £50,000 or more. A significant proportion (39%) expected to receive this confirmation within 24 hours and 68% expected it within three days.

Given success at the screening stage, 73% of applicants expected to be interviewed within a fortnight of making their application while nearly all candidates (91%) believed that businesses should be in a position to offer them a job within three weeks of the initial application.

Neil Millett, research and events manager at Reed, linked the high expectations of applicants with the rise of consumer use of online services, whether it was in banking or retail.

He added: “But it’s bad news for brands who fail to meet these expectations, particularly with younger jobseekers, as they’re more inclined to share their negative experiences with family and friends. Additionally, the majority of those we surveyed would even be less likely to use the brand’s services in the future. Clearly, this highlights the importance of companies having a sleek hiring process.”

Source: https://www.personneltoday.com/hr/poor-recruitment-processes-can-damage-brands/

What do people mean when they call HR soft and fluffy?

We already have our seat at the table. We’ve forged an unbreakable chain between employee, service, and profit. We ditched analysis for analytics and correlational for predictive. Our soft got harder and our fluffy more edgy. Less pie in the sky, more ROI.

So why hasn’t all this been enough?

A good starting point is to work out what people mean when they call us soft and fluffy. What do you think it actually means? Ever given it much thought? Me neither, before now. It seems that calling HR soft and fluffy is, paradoxically, a pretty soft and fluffy thing to do. It’s a vague accusation that has multiple and distinct meanings.

Soft and fluffy is deployed to suggest that HR is not sufficiently business-focused – yet I’ve heard exactly the same criticism levelled at almost every function and at every level. So it can’t mostly still be about that. An obvious way of dealing with this is always to start with specific and real business problems or opportunities rather than with practices or techniques. When we focus too much on our practices and techniques, all people can see of HR is what it does rather than why it does it.

Soft and fluffy is also used to describe HR people rather than the function as a whole. You know the stereotype: we’re more concerned with being liked than the success of the business. If anyone in any role in any function is focused more on being liked than getting stuff done then this will be counter-productive. At the same time being nasty doesn’t mean you’re helping the business. This particular image problem for HR, if that’s what it is, is part of a broader view of business success as necessarily requiring macho, ass-kicking, take-no-prisoners ruthlessness.

A third meaning of soft and fluffy is to refer to the human stuff we deal with in HR. This is a challenge I’m not sure historically we’ve dealt with too well. I’ve never understood why people think that human thoughts, feelings and behaviours are so intangible and mysterious they cannot be understood, so there’s almost no point in even trying. This leads to the even stranger idea that data comes in two varieties: hard and soft. Anything to do with people is soft even if measured in a reliable and valid way. Anything to do with numbers is hard even if measured in the dodgiest way imaginable.

HR data (with the possible exception of things like absence) is typically regarded as intrinsically soft and hence not to be taken seriously. Whereas other data, such as that used in finance, is regarded as hard and hence meaningful. I think we are partly to blame for perpetuating the idea that what we deal with is soft, because we are nowhere close to being sufficiently concerned with the reliability of the data we collect and the measures we use. This is something we could do much better.

The last meaning is somewhat related to the third. We hit candy-floss levels of soft and fluffy whenever it appears that we don’t know what we’re talking about. I don’t think other organisational functions are as relaxed as HR about what things are called and what they mean. When push comes to shove, and it often does, we get frustrated if people try to pin us down and ask what the terms we throw about actually mean. And because we don’t really know we say it doesn’t matter, and anyway we all know what it means. Right?

The single most important move we can make to rebuff the fluff once and for all is to take seriously the meanings and definitions of the terms we use as a profession. So remember that even if vagueness is getting you out of a hole right now, it’s likely harming the longer-term reputation of HR. If we go around telling people that it really doesn’t matter what we call the things we do, this sounds awfully close to saying the things we actually do don’t matter either.

Experience Doesn’t Predict a New Hire’s Success

Most organizations think that experience is important, even for entry-level jobs. Unfortunately, the evidence doesn’t support the idea that applicants with more experience will be better or longer-tenured employees than those with less.

How did the studies measure performance?

It varied, but typically in two ways: either supervisor evaluations—such as annual reviews—or more-objective, quantifiable metrics, such as sales or, in one paper on sewing-machine operators, parts produced.

What types of jobs and industries are we talking about here?

The ones most represented were protective services (police, firefighters) and then sales and customer service jobs. Study participants mainly worked in frontline positions, though some were managers. None were at the senior executive level. But we captured 15 of the 23 job families listed by the U.S. Labor Department’s Occupational Information Network, so we felt it was a pretty good representation of the U.S. economy.

Why on earth wouldn’t people with experience—especially directly relevant experience—outperform those without it?

My coauthors—John Arnold of Florida State University, Rachel Frieder of the University of North Florida, and Philip Roth of Clemson University—and I have speculated about that. One possibility is that many measures of experience are pretty basic: the number of jobs you’ve held, tenure at your previous employers, years of total work, whether or not you’ve previously worked in a similar role. Those metrics tell us whether a candidate possesses experience but not about the quality or significance of that experience, which would probably have more bearing on performance. One of the basic premises in our area of research is that past behavior predicts future behavior. But prehire experience isn’t a measure of behavior. The person might have failed or stagnated in previous jobs. So we should take experience into account but maybe do a better job of delving into prehire performance. We also want to know whether candidates have learned from their prior experiences. People aren’t always good at that; they might forget things that have gone wrong or explain them away. And, last, we need to consider that experience in one organization might not help—and might even hurt—performance in another if they don’t operate the same way or have similar cultures.

Don’t interviews and reference checks help employers figure all that out?

Yes, especially when you ask behavioral questions like “How have you previously handled difficult clients? Tell me about a specific situation, what you did, and what the outcome was.” But not all employers evaluate candidates that way. And it’s possible that applicants who could answer well have already been screened out due to their lack of traditional work experience.

What factors beyond experience should we consider?

Well, another reason employers look for hires with experience is that they think previous jobs have helped those people build up knowledge and skills. They might even think that candidates who have done certain types of work have particularly desirable personality traits. But we’d recommend focusing on the knowledge, skills, and traits directly rather than using experience or even education as a proxy.

Are there any scenarios in which experience matters?

We did identify a couple of situations in which it does seem to have more of a benefit. First, we found a smaller set of studies within our data set that looked at prehire experience and performance on the job after three months, two years, and five years. Although the relationship was weak at the two- and five-year marks, it was stronger at three months, so experience appears to have helped some people as they were getting started. Maybe it’s because they were accustomed to employment and organizational life and could hit the ground running. Or perhaps managers gave the employees who came in with experience better ratings at first. But over time employees’ prehire experience became less and less important to performing their current job.

Second, again in a smaller number of studies, we saw measures of experience more at the task level. So, for example, instead of asking pilots or truck drivers how many years they’d worked in those jobs, employers would ask how many hours they’d logged flying or driving. Those metrics were better predictors of future performance.

Is it realistic to think that HR departments and hiring managers will stop screening for experience?

You can understand why so many organizations do it: Experience is easy to assess. Have you worked in sales for three years? Have you managed people before? It’s either a yes or a no. Past performance and existing knowledge and skills are more difficult to figure out, especially if all you have is an application or a résumé. But today, when everyone is complaining about the skills shortage and the war for talent, companies can’t afford to knock out candidates who would do really well but don’t have the experience that someone has chosen to put in the job description. You want to expand the pool of people you’re considering.

Are there any other simple screens we could use instead?

Probably, but they would vary by organization and job. The key thing is evidence of correlation with job performance. Let’s say there’s a role you need to fill in the sales department and you see over time that people who majored in marketing tend to stay longer and get better customer reviews than those who studied other subjects. That could be a viable screen. For another job it could be having some certification; the data might show that employees who have it outperform their peers, so you look for it when hiring. Companies could consider using other screening tools, too, like job-relevant tests. The problem is that most organizations don’t take those steps. They use data to make decisions about products, marketing, and finance, but they don’t use it to make decisions about people—at least not effectively.
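The core idea above – keeping a screening criterion only if it demonstrably correlates with job performance – can be sketched in a few lines. This is a minimal, illustrative example, not anything from the study: the marketing-major screen is the article's hypothetical, and all the numbers are invented.

```python
# Illustrative sketch: does a candidate screen correlate with performance?
# All data below is hypothetical; in practice you'd pull real hire records.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical hires: 1 = majored in marketing, 0 = other subject
majored_in_marketing = [1, 1, 0, 0, 1, 0, 1, 0]
# Hypothetical performance scores (e.g. average customer-review ratings)
performance = [4.2, 3.9, 3.1, 3.4, 4.0, 2.8, 3.8, 3.3]

r = pearson(majored_in_marketing, performance)
print(f"correlation between screen and performance: {r:.2f}")
```

With a binary screen, this is a point-biserial correlation; a strongly positive value (on real data, and with far more hires than shown here) would be the kind of evidence that justifies keeping the screen, while a value near zero would suggest dropping it.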

Does experience within an organization matter?

We didn’t look at posthire experience, but other research suggests that there is a link between how long someone has been in the job or working at a company and how well they perform. It’s not a superstrong relationship, but it’s something companies might consider when deciding on promotions and transfers. Is experience more important for managers? That’s something we’re looking at now. If, say, a sales rep wants to be a sales manager, how much does his or her experience in that lower-level job predict success in the more senior one?

Source: https://hbr.org/2019/09/experience-doesnt-predict-a-new-hires-success