
Imagining the workshop of the future Part 2

In my last post we looked at how we can start creating a workshop of the future using existing technology, but where do we go from here? Conversational interfaces are one of the most exciting technologies exploding onto the scene at the moment. You can talk to your phone to do everything from setting alarms to sending messages, getting directions or searching the web. These conversation-based assistants - be they Siri on iPhones, Google Now on Android devices, Google Home or Alexa from Amazon - let us get things done without having to physically touch our devices, and this feels like a logical next step for our workshop of the future. Imagine being able to call across the workshop 'Hey Alexa, what defects are there on truck 49?' and have the defect details read out to you - no need to down tools, climb out of an inspection pit or walk across the workshop. Once a defect is fixed, a simple shout of 'OK Google, mark the defect on trailer T56 as repaired' would move the defect to the repaired column.
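
To make that idea a little more concrete, here's a minimal sketch of the kind of logic a voice assistant skill could hand those requests off to. Everything here is invented for illustration - the defect-board lookup, the vehicle names and the function names are all hypothetical - and a real skill would be built with something like the Alexa Skills Kit or Actions on Google and call out to the workshop's defect-board service.

```python
# Minimal sketch of the handlers a voice assistant skill might call once the
# platform has recognised the speech and extracted the vehicle name.
# get_open_defects() stands in for a hypothetical defect-board API.

from typing import List


def get_open_defects(vehicle_id: str) -> List[str]:
    """Hypothetical lookup against the workshop's defect board."""
    # In a real skill this would call the defect-board service over HTTPS.
    defect_board = {
        "truck 49": ["worn brake pads on axle 2", "cracked nearside mirror"],
        "trailer T56": ["faulty tail light"],
    }
    return defect_board.get(vehicle_id, [])


def handle_defect_query(vehicle_id: str) -> str:
    """Build the spoken response for 'what defects are there on <vehicle>?'."""
    defects = get_open_defects(vehicle_id)
    if not defects:
        return f"There are no open defects on {vehicle_id}."
    listed = "; ".join(defects)
    return f"{vehicle_id} has {len(defects)} open defects: {listed}."


def handle_mark_repaired(vehicle_id: str, defect: str) -> str:
    """Build the spoken response for 'mark the defect on <vehicle> as repaired'."""
    # A real implementation would move the defect card to the 'repaired' column here.
    return f"OK, I've marked '{defect}' on {vehicle_id} as repaired."


if __name__ == "__main__":
    print(handle_defect_query("truck 49"))
    print(handle_mark_repaired("trailer T56", "faulty tail light"))
```

The heavy lifting - speech recognition and intent parsing - is done by the assistant platform; all the skill has to do is turn the recognised vehicle name into a spoken answer.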

Jumping even further into the future, there are technologies in their infancy that can turn what you see into a part real, part digital world. Augmented or mixed reality - either via a phone or using smart glasses such as Microsoft HoloLens or the rumoured AR glasses from Apple - would allow our reality to be overlaid with information, such as having the defect board 'pinned' to the side of the trailer we're working on. This could be extended further, allowing live chats with expert mechanics anywhere in the world who can see what we see and guide us as we service or repair machinery. This is a great way to help educate junior mechanics or provide specific help from the machine manufacturer. These technologies are still very immature, but give it five more years and we'll see them come into much more mainstream use.

Finally, let's think more about safety - something always on our mind here. Even the most expert eye can fail to see hazards in the workplace, so what if AI could help? Wouldn't it be great to have cameras that monitor our workplace and use artificial intelligence to check for and alert us to the dangers around us in real time? This is actually coming - Microsoft demoed this very thing at their recent //Build/ conference. Their solution analysed a live video feed and compared the objects it saw against a set of pre-defined safety rules. In their example, a heavy jackhammer was leaning up against a bench - if it fell it could cause a nasty injury, or even break bones. The rules in the system spotted that the jackhammer was in an unsafe position and sent an alert to a phone, warning the person nearby so they could lay the hammer down somewhere safe. You can see the demo of this below.
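
To give a feel for how that rule-checking step might hang together, here's a rough sketch - not Microsoft's actual implementation - that takes object detections from a video frame and flags anything that breaks a pre-defined safety rule. The Detection and SafetyRule shapes, and the jackhammer rule itself, are invented for illustration; in a real system the detections would come from a computer-vision model running over the live feed, and the alerts would be pushed to a nearby phone.

```python
# Rough sketch of the rule-checking step: take object detections from a video
# frame (however they are produced) and flag anything that breaks a predefined
# safety rule. Detection and alert delivery are stubbed out here.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Detection:
    label: str        # e.g. "jackhammer", "bench", "person"
    upright: bool     # orientation inferred from the bounding box
    leaning_on: str   # label of the object it appears to rest against, if any


@dataclass
class SafetyRule:
    description: str
    violated_by: Callable[[Detection], bool]


# Example rule matching the scenario above: a heavy tool propped upright
# against a bench is in an unsafe position.
RULES = [
    SafetyRule(
        description="Heavy tools must not be left leaning upright against a bench",
        violated_by=lambda d: d.label == "jackhammer" and d.upright and d.leaning_on == "bench",
    ),
]


def check_frame(detections: List[Detection]) -> List[str]:
    """Return an alert message for every rule violation found in this frame."""
    alerts = []
    for det in detections:
        for rule in RULES:
            if rule.violated_by(det):
                alerts.append(f"Unsafe: {det.label} - {rule.description}")
    return alerts


if __name__ == "__main__":
    # Pretend an object detector spotted a jackhammer propped against a bench.
    frame = [Detection(label="jackhammer", upright=True, leaning_on="bench")]
    for alert in check_frame(frame):
        print(alert)  # in the demo, an alert like this was pushed to a phone
```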
 


This vision of the future is pretty exciting - using technologies like the defect board to streamline defect management, taking advantage of touchscreens and conversation-based technologies to bring that board into the workshop and make it easily accessible to mechanics, and adding AI hazard tracking so everyone can get through their day-to-day work without worry of injury. We at EROAD are excited to be a part of this journey.

 

Jim Bennett is a mobile developer at EROAD, as well as an internationally recognised thought leader in the mobile developer space. He is a Microsoft MVP, Xamarin MVP, technology blogger, author of Xamarin in Action from Manning Publications, and a regular speaker at meet-ups and technology conferences around the world such as Xamarin Evolve, Microsoft Ignite and NDC. You can find him online at https://jimbobbennett.io or on Twitter at @JimBobBennett.

 
