Jim Colvin, CEO of virtual reality (VR) training simulator and solutions specialist Serious Labs, looks to the future of VR training, alongside operator efficiency and safety.

How has the Covid-19 pandemic changed training for the long-term, particularly in the use of VR?

The pandemic certainly made us look at the whole concept of training differently, and at our products differently.

We’ve learned over this last year that our VR training simulators are not just training tools to replace a different form of training, but rather measurement tools that quantify the risks the individual operator is bringing to the job site. They are also instructional tools which teach that operator how to change those risky behaviors. We have done this by utilizing telematics.

Telematics is not just a way to measure aspects of equipment such as MEWPs; it can also be applied to human-machine interaction.

“Operator Telematics” refers to the measurement of different human variables to help assess how an operator is using the equipment. That data can then be analyzed, and the operator’s movements can be refined and corrected for proper usage of the machine.

The MEWP simulator objectively measures over 25 data points of operator telematics, so the trainer is going to be able to provide highly detailed information to help the user increase their proficiency quickly.

We had that ‘Aha’ moment during the pandemic, and because of this we feel we are going to help the access world to evolve. We are starting to understand operator learning and training on a whole new level, and who knows where that will lead. We will definitely be able to help companies train better, but maybe we will gain insight into equipment design and what makes equipment less prone to accidents on job sites, or maybe we will help the industry view training in a whole new light.

It remains to be seen, but we are emerging from the pandemic feeling optimistic about the capabilities of VR simulation training.

With operators now able to receive PAL+ training via VR simulator, what are the next steps to providing full PAL Card training via VR, or other types of training?

I think the short answer is that it’s up to IPAF, but that is our goal. We believe that’s coming.

To me, on every job site the employer or supervisor has an obligation to make sure the operator is familiar with the equipment – the model they’re operating that day – knows how to do a pre-use inspection, and can then carry out a site assessment. We train how to do all of that right inside the simulator.

Because that’s always part of the job at the job site itself, the stretch goal is that you can get, in a sense, a recognized certification of operator proficiency on a simulator. That report card will be just as valid as, if not more valid than, a perfunctory test done in a parking lot or on level ground, away from any obstructions, because you can measure so much more within the simulator and the measurement is objective.

It’s happening in North America too: organizations and bodies are recognizing the simulator as a replacement for the practical evaluation on the real machine because of the data it generates, its objectivity, its complexity, and the real testing of actual skills and behaviors at height, rather than just what can be observed in 10 minutes in a parking lot.

What are the future advantages of using simulators rather than real equipment onsite?

Right now, simulators provide a great solution onsite for many reasons: contractors and rental yards may not have space onsite for physical training equipment, they may be located in remote areas, the day-rate cost of renting equipment for training may be prohibitive, or the weather may be inclement. Also, operator telematics provides a level of quantitative data on operator performance that far exceeds what can be provided through in-person training.

In the future, one exciting possibility is that you can have an objective measurement of proficiency that’s associated with somebody’s record, whether that’s a digital PAL card or something else. When someone shows up on a job site, that record will be visible to them and can help determine things like work assignments or additional training needed. For example, if you’ve got a very talented operator, you can assign them to challenging tasks, and if you’ve got someone who may need more training, you know exactly where and how to help them improve. Just think about what that would accomplish for safety and how many accidents we could prevent.

What are the next developments in your VR MEWP training programmes and systems?

With the MEWP itself, I would say the next major development is getting global recognition from the certification bodies to enable folks to be tested right on the simulator. To certify someone is the next big step.

We recently announced that Energy Safety Canada recognized our test on the simulator, and next up is having the American National Standards Institute (ANSI) recognize us in the United States.

ANSI is much like the British Standards Institution (BSI) or the International Organization for Standardization (ISO) and has the ability to accredit training for meeting specific requirements. ANSI recognizing our product would mean another national body verifying the legitimacy and the value of our products.

Beyond the MEWP itself, we’ve learned it’s much more about the data generated by the person operating it. Analysing that data, identifying the trends from a big-picture perspective, and figuring out how to take a macro view so that we can improve the industry as a whole – that’s where we are headed, and that’s what we are really excited about.

What are the challenges in developing the latest and new technologies?

One technology challenge where we are leading the solution is cybersickness. Traditional simulators have had limited utility in replacing ‘seat time’ because it was common for the operator to feel unwell after only minutes in the simulated environment, particularly when operating a piece of equipment or a vehicle that moves. If the motion the operator’s eyes are experiencing is not physically replicated through force feedback, cybersickness is the result.

Our proprietary motion base has to replicate that motion, or trick our bodies into making the vestibular connection between what we see and what we feel. We are working with university researchers and experts from around the world on this, and we have become part of the team that is leading this science and solving this problem in VR and simulation in general.

What will be possible in five to 20 years’ time?

Right now, there is a concerted movement towards autonomous vehicles assembling modular buildings – using different types of materials, prefabricating parts of them, and assembling them onsite. That is definitely in our future.

However, wherever the trade requires dexterity and precise motor skills, you need a tradesperson, and that person will likely need to work at height. As such, making MEWP operators safer and more efficient is the goal of applying new technology like VR MEWP simulators.

So that takes it back to the training side. If there’s a human in it, then that person needs to be trained. And if they’re going to be trained, the best way to do it is by doing.

For theory, the classroom has slowly been sharing its role with e-learning. The pandemic only accelerated the adoption of e-learning as classroom training was virtually non-existent. However, as more training has become available in VR, from soft-skills training to onboarding, learning by experiencing is emerging as the training of choice. Learning methods are rapidly evolving and the new generation of industrial labour is leading the way.

As for 20 years, well, have you seen the movie Ready Player One? I mean, that’s what’s possible. It’s about creating complete worlds and environments that people can operate within that contain the kind of fidelity and experiential detail that the real world does.

In VR, the headset – in whatever form it takes – will almost be a replacement for our normal set of eyes, for how we optically perceive the physical, natural world around us. We are not going to change physiologically for millions of years, so everything we create in the artificial world has to bear that in mind. Our eyes aren’t going to change, but the headset will. It still has to deliver a full experience; it still has to be able to create a completely immersive environment.

Some people are concerned about what this means for industries like construction, and what it means for the labour force. But think of the automated loom being invented in the early 1800s to make textiles. People said, “That’s going to put all of these guys out of business.” Well, it didn’t. Instead, it reduced the cost of a bolt of fabric so low that people could have several outfits, which meant the weaver was making way more and hiring seamstresses and people like that. So business actually increased.

I think this transition over to more technologies will be like that – it’s the types of jobs that are going to change. The one thing you always have to bear in mind is that human labour used to be a physical thing, whether you’re removing rocks or picking things up, and then steam and ultimately internal combustion replaced that. Jobs didn’t go away – they evolved.

Could VR units transition from training to remote operation of MEWPs? And what else is possible?

Absolutely, and that can be done almost now. It would probably start with cranes, where you might have a control panel in your living room in Liverpool and be operating a crane in Malaysia. There’s going to be a point where artificial intelligence will be able to calculate and program the lift with little or no human intervention.

But we’re still a long way away from that. So I think the next logical step will be remote operation for certain types of equipment. And that could ultimately be done with a boom lift. The challenge with remote operation of a boom lift is that you still need the person doing the work up there; they’re just not running the machine. And you don’t want your operator doing that from their living room in Liverpool.

It really comes down to whether the technology can perform the physical task more efficiently and economically than a human being can.

The dexterity of the human hand is very hard to replicate, and that’s why robotics has only gotten as far as it has. However, robots are constantly improving, so their capabilities will continue to replace human physical activity wherever it makes sense.

Anything’s possible right now. It’s really about the utility of it. Does it make money? Does it save money? And hopefully the world will continue to examine and include whether it has a societal benefit too. Then, that will be what drives innovation.

What will the future construction site look like from a VR point of view?

There are amazing possibilities here. Based on some technologies we’re already seeing emerge, the crew will be able to visualize the site and the different stages of what they’re building to align with the plan. They’ll be able to assess their skills and train up on any heavy equipment they’ll be using with realistic simulators like the ones we offer. VR will also let them practice and refresh on-the-job skills relating to their trades, even if these don’t involve heavy equipment. Some operators might control autonomous machines from off-site using VR. The whole thing will be interlinked with operator telematics and other data that lets managers track progress and find efficiencies.