Testing out some different manipulations of objects in a HoloLens app. Unfortunately, I can’t show the app in which this is used, as it’s for a client that wouldn’t appreciate me showing it on my personal channel. We’ve built an IoT-integrated system for troubleshooting industrial manufacturing equipment through the HoloLens. Natural, intuitive input gestures are an important part of making the app fast to learn. This is really basic, and I’m kind of at a loss for why I haven’t seen more of this type of manipulation in other HoloLens apps. I’m not even using the latest Windows Holographic version; this is the original version. Not using MRTK here. I find it buggy and annoying, and I’ve found it a lot easier to build software that supports different platforms if I stick to the basic, core IO elements rather than any ostensibly “easy-mode” libraries. I’m also not using the built-in gesture recognizer. Again, I’m trying to support lots of different platforms, and most of them don’t have a gesture recognizer, so it’s easier to support them all if I build my own. And incidentally, it also means you can track more than one hand at a time on the HoloLens!
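
For anyone curious what rolling your own looks like, here’s a minimal sketch (not the recognizer from the app) built on Unity’s UnityEngine.XR.WSA.Input.InteractionManager, which reports each tracked hand as a separate source; that per-source reporting is what makes multi-hand tracking possible. Exact type and event names vary a bit between Unity versions.

```csharp
// Minimal sketch of per-hand tracking on HoloLens, assuming Unity's
// UnityEngine.XR.WSA.Input API (names differ slightly across Unity versions).
// Not the recognizer from the app -- just enough to show that each hand
// arrives as its own source and can be tracked independently.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class MultiHandTracker : MonoBehaviour
{
    // One entry per tracked hand, keyed by the source id the system assigns.
    private readonly Dictionary<uint, Vector3> hands = new Dictionary<uint, Vector3>();

    private void OnEnable()
    {
        InteractionManager.InteractionSourceUpdated += OnSourceUpdated;
        InteractionManager.InteractionSourceLost += OnSourceLost;
        InteractionManager.InteractionSourcePressed += OnSourcePressed;
    }

    private void OnDisable()
    {
        InteractionManager.InteractionSourceUpdated -= OnSourceUpdated;
        InteractionManager.InteractionSourceLost -= OnSourceLost;
        InteractionManager.InteractionSourcePressed -= OnSourcePressed;
    }

    private void OnSourceUpdated(InteractionSourceUpdatedEventArgs args)
    {
        if (args.state.source.kind != InteractionSourceKind.Hand)
            return;

        // Each hand keeps its own latest position; two hands mean two entries.
        if (args.state.sourcePose.TryGetPosition(out Vector3 position))
            hands[args.state.source.id] = position;
    }

    private void OnSourceLost(InteractionSourceLostEventArgs args)
    {
        hands.Remove(args.state.source.id);
    }

    private void OnSourcePressed(InteractionSourcePressedEventArgs args)
    {
        // The "press" is the air-tap/pinch; a custom recognizer would start a
        // drag, scale, or rotate manipulation from here, per source id.
        Debug.Log($"Hand {args.state.source.id} pinched; {hands.Count} hand(s) tracked.");
    }
}
```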

Video

The clouds finally broke and I got a chance to test in full sunlight. I need to tweak the shadow strength and the ambient/directional ratio, but it’s looking really good. Shadows go a long way toward making the image believable.
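
For reference, the knobs in question are roughly these; the values below are placeholders, not the ones from the test.

```csharp
// Illustrative Unity lighting knobs for the sun/shadow balance described above.
// The actual values from the test aren't recorded here; these are placeholders.
using UnityEngine;
using UnityEngine.Rendering;

public class SunlightTuning : MonoBehaviour
{
    public Light sun; // the directional light standing in for the sun

    private void Start()
    {
        sun.shadowStrength = 0.8f;   // how dark the cast shadows are (0..1)
        sun.intensity = 1.2f;        // directional contribution

        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.4f, 0.42f, 0.45f); // ambient fill
    }
}
```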

Video

And now for something completely different. I took a photo of the garage across the street, uploaded it from my phone to Vuforia, and then put a traffic animation in front of it. It’s hard to see here because the bright, low sun made it difficult to record my phone’s screen, but it works pretty well. There are some issues with rotations again, though it might improve with more even sunlight and better image preparation. I also took a stab at light estimation: I just guessed at the direction of the sun and set it statically as a directional light, and it worked really well. At least for outdoor scenes, time of day and latitude should be sufficient to estimate lighting.
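
To make that concrete, here’s a rough sketch of how time of day and latitude could drive a static directional light, using the standard approximate solar-position formulas. The test above just used a hand-guessed direction; the latitude, day, and hour below are placeholders.

```csharp
// Rough sketch: aim a directional light using an approximate solar position
// computed from latitude, day of year, and local solar time. Accuracy is
// back-of-the-envelope, not astronomical. Assumes world +Z points north.
using UnityEngine;

public class ApproximateSunlight : MonoBehaviour
{
    public Light sun;                    // directional light standing in for the sun
    public float latitudeDegrees = 40f;  // placeholder latitude
    public int dayOfYear = 172;          // placeholder: near the June solstice
    public float localSolarHour = 15f;   // placeholder: mid-afternoon

    private void Start()
    {
        // Solar declination (degrees), simple cosine approximation.
        float declination = -23.45f * Mathf.Cos(Mathf.Deg2Rad * 360f / 365f * (dayOfYear + 10));

        // Hour angle: 15 degrees per hour away from solar noon.
        float hourAngle = 15f * (localSolarHour - 12f);

        float latRad = latitudeDegrees * Mathf.Deg2Rad;
        float decRad = declination * Mathf.Deg2Rad;
        float haRad = hourAngle * Mathf.Deg2Rad;

        // Elevation of the sun above the horizon.
        float sinElevation = Mathf.Sin(latRad) * Mathf.Sin(decRad)
                           + Mathf.Cos(latRad) * Mathf.Cos(decRad) * Mathf.Cos(haRad);
        float elevation = Mathf.Asin(sinElevation) * Mathf.Rad2Deg;

        // Azimuth measured clockwise from north; flip to the western half after noon.
        float cosAzimuth = (Mathf.Sin(decRad) - sinElevation * Mathf.Sin(latRad))
                         / (Mathf.Cos(Mathf.Asin(sinElevation)) * Mathf.Cos(latRad));
        float azimuth = Mathf.Acos(Mathf.Clamp(cosAzimuth, -1f, 1f)) * Mathf.Rad2Deg;
        if (hourAngle > 0f) azimuth = 360f - azimuth;

        // The light travels away from the sun, so yaw is the sun's bearing + 180.
        sun.transform.rotation = Quaternion.Euler(elevation, azimuth + 180f, 0f);
    }
}
```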

Video

This is a similar setup to the last test: 24-foot range, marker about 3 feet from the camera. The marker is smaller this time, but rated at 5 stars. It’s also at an angle to the camera rather than parallel to the plane of view, and I have Vuforia extended tracking turned on. The AR graphics are the left basement wall, the table on which the marker sits, a table halfway to the back wall, and a chest that sits in front of the back wall. When the camera is close and can see the marker clearly, extended tracking does not engage. There is a similar jitter to the previous test, though it is not as pronounced; setting the marker at an angle has definitely improved rotation tracking. Much more interesting, however, is that backing away from the marker far enough for extended tracking to kick in makes the graphics pretty rock solid, with very little jitter. Unfortunately, we can also see that there is a modeling error in my layout: the box of cards does not sit perfectly straight up and down. It has an ever-so-slight forward lean that is not reflected in my scene geometry, and that error multiplies over distance. A calibration step could correct for this.
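
For a sense of scale, even a one-degree lean works out to roughly five inches of offset at 24 feet (24 ft × tan 1° ≈ 0.42 ft), which is plenty to notice. A calibration step could be as simple as sighting the drift at a known distance and counter-rotating the content root; here is a hypothetical sketch, with names that are not from the actual project.

```csharp
// Hypothetical calibration sketch: measure how far the virtual geometry has
// drifted from its real counterpart at a known distance from the marker,
// convert that into a lean angle, and rotate the content root to cancel it.
// None of these names come from the actual project.
using UnityEngine;

public class LeanCalibration : MonoBehaviour
{
    public Transform marker;       // the image target the layout is anchored to
    public Transform contentRoot;  // parent of all AR geometry placed off the marker

    // observedDriftMeters: how far the virtual back wall sits from the real one,
    // measured (or eyeballed) at distanceFromMarkerMeters away from the marker.
    public void ApplyCorrection(float observedDriftMeters, float distanceFromMarkerMeters)
    {
        // A small modeling error multiplies over distance: drift ~= distance * tan(lean).
        float leanDegrees = Mathf.Atan2(observedDriftMeters, distanceFromMarkerMeters) * Mathf.Rad2Deg;

        // Counter-rotate the content root about the marker so nearby geometry stays
        // put while the far geometry swings back into place. The sign depends on
        // which way the lean was measured.
        contentRoot.RotateAround(marker.position, marker.right, -leanDegrees);
    }
}
```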

Video

