From our Lab

Virtual Content Triggers, anyone?

As the winter weather rolls in and the nights get longer and longer up here in the North, I end up spending more and more time in front of my various screens, searching for ways to use the endless stream of new technologies coming out.

As we all know, the best way to test out any new tech is to apply it to a real-world test case. So when I came across Apple's sample code for scanning and detecting 3D objects with the latest ARKit, I wanted to run a quick test to determine how feasible it would be to 1) scan an object and 2) use that scan for object detection and display some virtual information to the user in Augmented Reality.

I decided to combine this tech with my love of motorcycles because, to me, any time spent tinkering with a motorcycle in a garage is time well spent. I ended up using a Harley-Davidson speedometer I had lying around the garage as my test subject.

I started by downloading the sample scanning code, compiling it in Xcode, and deploying it to my iPad. The surprising result: it worked on the first try! So far so good.

The video below shows the process (sped up a little). Overall it was fairly seamless despite my environment not having the best lighting or background. It took less than 2 minutes to create a good scan on my first try.
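
For anyone curious what the sample scanning app is actually doing, it boils down to two ARKit calls: run an ARObjectScanningConfiguration while you walk around the object, then ask the session to turn the scanned volume into an ARReferenceObject you can export. The sketch below is just that skeleton, not Apple's full sample; the sceneView, bounding-box values and output URL are placeholders that the sample app derives from the bounding box you adjust on screen during the scan.

```swift
import ARKit

// A minimal sketch of the two scanning steps. The parameters here are
// placeholders; Apple's sample app computes the bounding-box transform and
// extent from what the user adjusts on screen.

// Step 1: run the dedicated object-scanning configuration while walking
// around the object with the camera.
func startScanning(sceneView: ARSCNView) {
    let configuration = ARObjectScanningConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration, options: .resetTracking)
}

// Step 2: once enough of the object has been captured, turn the scanned
// volume into an ARReferenceObject and save it as an .arobject file.
func exportScan(sceneView: ARSCNView,
                boundingBoxTransform: simd_float4x4,
                boundingBoxExtent: simd_float3,
                outputURL: URL) {
    sceneView.session.createReferenceObject(
        transform: boundingBoxTransform,
        center: simd_float3(0, 0, 0),
        extent: boundingBoxExtent) { referenceObject, error in
        guard let referenceObject = referenceObject else {
            print("Scan failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        // The exported .arobject file is what the detection app consumes later.
        try? referenceObject.export(to: outputURL, previewImage: nil)
    }
}
```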

I followed a simple Apple tutorial and built a Swift ARKit project that included both the previous speedometer scan and a new sprite object. The sprite object holds some viewable content for display purposes and could really be anything from 2D drawings to catalog information or even video; I was just trying it out as a container to see how it would react. My final app and results are shown in the video below, running on my iPad.
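
For context, here is roughly what that detection app boils down to (a hedged sketch, not my actual project): the exported scan goes into the Xcode asset catalog as an AR resource group, the world-tracking configuration is told to detect it, and ARKit hands back an ARObjectAnchor when it spots the speedometer. The "ScannedObjects" group and "speedo-specs" image names are made up, and instead of my sprite container this sketch just textures a small plane with an image to keep things short.

```swift
import UIKit
import ARKit
import SceneKit

// A rough sketch of the detection side. "ScannedObjects" and "speedo-specs"
// are made-up asset names: the first is an AR resource group holding the
// exported .arobject scan, the second an image to display next to it.
class DetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // Ask ARKit to look for the scanned reference objects bundled with the app.
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionObjects =
            ARReferenceObject.referenceObjects(inGroupNamed: "ScannedObjects", bundle: nil) ?? []
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the scanned objects in the camera feed.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        // Attach a small textured plane beside the detected object; the content
        // could just as easily be a drawing, a catalog page or a video texture.
        let plane = SCNPlane(width: 0.15, height: 0.1)
        plane.firstMaterial?.diffuse.contents = UIImage(named: "speedo-specs")
        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(objectAnchor.referenceObject.extent.x + 0.05, 0, 0)
        node.addChildNode(planeNode)
    }
}
```

The nice part is that anything you attach to that anchor's node stays registered to the physical speedometer as you move the device around it.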

The test worked very well and was surprisingly easy: point your iPad or iPhone at the object and related information is displayed beside it. The next tests I run will be on larger objects, to see how the scan sizes and recognition of larger objects hold up. I don’t see why they wouldn’t.

What I am really interested in now is finding test cases within our Substation Design industry. Imagine scenarios where a user could point the iPad or iPhone at a substation component and have the detailed drawings displayed beside it. Maybe the red-lined markups of past work? Maybe a field checklist or other manufacturing details? IoT information? Are there any equipment vendors out there that would like to combine this type of catalog data with our Substation Design Suite and Utility Content to pull models and specs?

So here’s a call to action for all Substation Design folks, SDS users or members of the SDS Industry Consortium out there. If anyone is interested in working with me on a Substation Industry test case around this technology, please feel free to message me.