Integration with A-Frame and AR.js


#1

Hi I’m about to start developing with my Aryzon headset, and I was considering using AR.js and A-Frame.

Does anyone know if it’s possible to use web technologies instead of Unity?

Thanks

Adam


#2

Awesome, :clap: glad to see someone interested in web development. There are some things to take into consideration. As of now there is only Unity support. Getting tracking to work is just the first step; other steps are getting the correct lens distortion model, virtual camera positions, and camera projection matrices. I’m happy to work with you on this.


#3

Hi Atomki and Maarten,

I have the same question. I’m part of a group of WebXR enthusiasts (‘WebXR NL’), and would love to use Aryzon for WebAR. Have you made any progress with this so far? I have mainly been using A-Frame & AR.js in my WebAR projects so far.

I would love to hear from you,

Kind regards!

Dominique


#4

Hi Dominique!

We sadly have not been able to shift our focus away from Unity development, and to be honest we are hoping for others to pick this up. I hadn’t heard of WebXR NL before; do you think there are people in this group with the time, skill and enthusiasm to develop this? I have some ideas on how to keep it simple.


#5

Hi Maarten,

Thanks for your quick reply! I am going to ask in the group and of course I am curious to hear about the ideas you have! It would be great if we could get this to work.

Kind regards,
Dominique


#6

I’m not very up to date with how far WebXR has developed over the last year. I do see an ARKit / ARCore demo by Google which looks very promising.

Rendering-wise, 3D AR is not very different from VR: there are two virtual planes, one per eye, onto which the virtual objects are rendered. The only tricky thing with AR is that it needs to overlap with the real world. Because tracking is done with the phone’s camera, any tracking engine will give the pose of the camera sensor instead of the eyes of the user. The only difference from basic VR is that we need to compensate for this shift (and render a black background, of course).

There must be a great mobile VR implementation already; I think A-Frame can do this, right? I’m not sure how this all works in A-Frame, but I would think you could compensate for the shift by applying the tracking transform to an object that is the parent of the VR rendering setup, with the shift compensation applied inside it. These X, Y, Z (shift) values are different for every phone and every headset, which is why we let users calibrate. However, you should be able to get an OK experience without calibration by using an average value.
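The shift compensation described above could be sketched in plain JavaScript along these lines. The quaternion math is effectively what a parent/child entity setup in A-Frame would do for you, and the shift values used here are made-up placeholders (the real ones come from calibration):

```javascript
// Sketch: compensating for the camera-sensor-to-eyes shift.
// The shift vector below is a made-up placeholder; real values are
// per phone + headset combination and come from Aryzon calibration.

// Rotate vector v = [x, y, z] by quaternion q = [x, y, z, w].
function rotateByQuaternion(v, q) {
  const [vx, vy, vz] = v;
  const [qx, qy, qz, qw] = q;
  // t = 2 * cross(q.xyz, v)
  const tx = 2 * (qy * vz - qz * vy);
  const ty = 2 * (qz * vx - qx * vz);
  const tz = 2 * (qx * vy - qy * vx);
  // v' = v + w * t + cross(q.xyz, t)
  return [
    vx + qw * tx + (qy * tz - qz * ty),
    vy + qw * ty + (qz * tx - qx * tz),
    vz + qw * tz + (qx * ty - qy * tx),
  ];
}

// The shift is defined in camera space, so rotate it into world space
// using the tracked camera orientation, then translate.
function eyeRigPosition(cameraPosition, cameraRotation, cameraToEyesShift) {
  const worldShift = rotateByQuaternion(cameraToEyesShift, cameraRotation);
  return cameraPosition.map((c, i) => c + worldShift[i]);
}

// Example: identity rotation, so the shift applies unrotated.
const rig = eyeRigPosition([0, 0, 0], [0, 0, 0, 1], [0, -0.06, 0.03]);
```

In an A-Frame scene this would translate to putting the tracked pose on a parent entity and a fixed, calibration-derived offset on a child entity that holds the stereo camera rig.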

Another step is to apply the correct camera projection matrix to each eye’s camera. I can help find the correct matrix / settings; our SDK calculates it at runtime, and I can export the values for a specific headset.
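A per-eye off-axis projection matrix could be built like the following sketch (column-major, the element order three.js and therefore A-Frame use). The frustum extents here are placeholders standing in for the values the Aryzon SDK would export:

```javascript
// Sketch: a glFrustum-style off-axis perspective projection matrix,
// column-major to match WebGL / three.js Matrix4.elements ordering.
// The frustum extents are placeholders, not real Aryzon values.
function offAxisProjection(left, right, bottom, top, near, far) {
  const x = (2 * near) / (right - left);
  const y = (2 * near) / (top - bottom);
  const a = (right + left) / (right - left);
  const b = (top + bottom) / (top - bottom);
  const c = -(far + near) / (far - near);
  const d = -(2 * far * near) / (far - near);
  return [
    x, 0, 0, 0,
    0, y, 0, 0,
    a, b, c, -1,
    0, 0, d, 0,
  ];
}

// Example: a slightly asymmetric frustum for a left eye (placeholder
// values); the asymmetry is what makes it "off-axis".
const leftEye = offAxisProjection(-0.05, 0.04, -0.045, 0.045, 0.1, 1000);
```

With headset-specific frustum values exported from the SDK, this matrix could be assigned directly to each eye’s three.js camera.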

In the end you can call our calibration server with the calibration code and some more values (I will explain if necessary, there is no documentation) and retrieve a JSON response containing the shift and headset values.
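Consuming such a response might look roughly like the sketch below. Since the actual schema is undocumented, the field names and example payload here are purely hypothetical:

```javascript
// Sketch: parsing a calibration response into the values the rendering
// setup needs. Field names and the example payload are hypothetical;
// the real schema is undocumented and would come from the Aryzon team.
function parseCalibration(json) {
  const data = JSON.parse(json);
  return {
    // Camera-to-eyes shift in meters (hypothetical field names).
    shift: [data.xShift, data.yShift, data.zShift],
    // Headset identifier (hypothetical field name).
    headset: data.headset,
  };
}

// Hypothetical example payload.
const example =
  '{"xShift":0.0,"yShift":-0.06,"zShift":0.03,"headset":"aryzon-2"}';
const cal = parseCalibration(example);
```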

To get to the first result might actually not be that hard :slight_smile: