Would Aryzon be possible with OpenCV?

Hello everyone,

The goal is to turn my favorite game, StarCraft 2, into a 3D table game (picture 1).
The game itself has no 3D support whatsoever, but I found a way to turn games like LoL and StarCraft (you name it) into 3D using Unity with some complicated tricks. (As far as I know, this is new and hasn't been done before.) For this to work I had to build a standalone app, for obvious performance reasons: StarCraft runs on your PC, not your phone.

Aryzon currently focuses primarily on SDKs like ARKit/ARCore/Vuforia.
I managed to get OpenCV working in Unity (see pictures) and would love to use the hologram shader from the Aryzon SDK to show only the object and nothing else when I use the cheap Aryzon headset. How does the transparent part of Aryzon work? I don't want any background; I only want to display the GameObject.
I added a picture of my Unity setup to show how my image tracking is done using OpenCV.

Thank you for reading my post this far. I would love to share the finished product, as it is something new, and I hope it makes an impact on the StarCraft/AR community.

TL;DR: how do I implement the Aryzon hologram shader?

Details

Vuforia and the other AR SDKs don't support standalone builds, so I had to write my own software to calibrate my camera and implement head tracking and image tracking.
I don't make use of the accelerometer or any kind of gyroscope yet. I can use both the webcam of my PC and any webcam on any phone, but you do need to use my calibration software to get better tracking performance.

Me:

A Dutch student with a passion for machine learning and AR.

Pictures

The camera texture is already separated from the object, as you can see in the game preview of the first in-scene picture.
I made these at around 5-6 AM, so the quality is a little lacking. :sunglasses:
You have to pretend the Aruco board is the map of League of Legends / StarCraft with the units on top. (I got this working, but I removed the parts that show how I actually display the in-game map.)
I will share it when it's more 'presentable', as it is currently still in development.

Hi Zelos, cool project! Welcome to the community :slight_smile:

We don't really use a special shader for this; it's just a camera rendering a black background. There might be a hologram shader in the SDK somewhere causing confusion, but you wouldn't need it for this to work.
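
In Unity terms that's roughly the following (a minimal sketch, assuming you attach it to the rendering camera):

```csharp
using UnityEngine;

// Minimal sketch: clear the camera to solid black so only the virtual
// object is rendered. On the reflective Aryzon headset, black pixels
// appear transparent, so no special shader is needed.
[RequireComponent(typeof(Camera))]
public class BlackBackground : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
    }
}
```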

There are a couple of things you can do. If you're able to simply not render the camera feed from OpenCV (it probably renders to a render texture?), you might be done already. Otherwise, a workaround is to make sure the Aryzon cameras render after (on top of) your OpenCV camera.
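
If you go the render-order route, camera depth controls the draw order, roughly like this (the camera references are placeholders for your own scene objects):

```csharp
using UnityEngine;

// Sketch: a camera with a higher depth value renders later,
// i.e. on top of cameras with lower depth values.
public class RenderOrder : MonoBehaviour
{
    public Camera openCvCamera;  // renders the webcam feed
    public Camera aryzonCamera;  // renders the stereoscopic view

    void Start()
    {
        openCvCamera.depth = 0f; // drawn first
        aryzonCamera.depth = 1f; // drawn last, on top
    }
}
```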

You will still need the Aryzon GameObject to do the stereoscopic rendering; this is what renders the black background. Then you will need to hook the Aryzon object up to your OpenCV camera transform, just like we show in the videos with Vuforia or ARCore/ARKit. You can use the 'other' field for that. If the camera stays static and only the virtual Aruco board moves, this isn't even necessary.
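
A rough sketch of what that hookup amounts to: copy the pose your OpenCV tracking produces onto the transform the Aryzon rig should follow (the names here are placeholders, not the SDK's actual API; the 'other' field itself is set in the Inspector):

```csharp
using UnityEngine;

// Hypothetical sketch: mirror a tracked pose onto this transform.
// "trackedPose" stands in for whatever transform your OpenCV
// solvePnP/ArUco result drives.
public class FollowTrackedPose : MonoBehaviour
{
    public Transform trackedPose; // driven by your OpenCV tracking

    void LateUpdate()
    {
        transform.SetPositionAndRotation(trackedPose.position, trackedPose.rotation);
    }
}
```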

By the way, Unity has Input.gyro; this might help you out with the gyroscope part.
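
A minimal sketch of reading it (the gyroscope has to be enabled before the readings are valid):

```csharp
using UnityEngine;

// Sketch: read the device attitude from Unity's built-in gyroscope.
public class GyroReader : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Device attitude as a quaternion; the axes may need remapping
        // to match Unity's left-handed coordinate system.
        Quaternion attitude = Input.gyro.attitude;
        Debug.Log(attitude.eulerAngles);
    }
}
```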

You must have done this already, but make sure you calibrate your phone camera as well. It's easiest to use meters as your scene units; if it's hard to change your calibration process to use meters, you can tell the Aryzon object your own units.
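
For example, if your calibration outputs millimeters, a simple conversion keeps the scene in meters (a sketch with placeholder names):

```csharp
using UnityEngine;

// Sketch: convert a tracked position from millimeters to meters
// before applying it in the scene.
public static class Units
{
    const float MillimetersToMeters = 0.001f;

    public static Vector3 ToMeters(Vector3 positionMm)
    {
        return positionMm * MillimetersToMeters;
    }
}
```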

Interesting to see this work out; keep us posted!

I hadn't looked at the images properly; I see that the Aruco marker moves, not the camera. That makes it easier: just put the Aryzon object at position and rotation 0.
Furthermore, there is a plane rendering the OpenCV camera feed; just disable this plane or make sure it's outside the range of the Aryzon cameras.
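
Both options in a sketch (the object names are placeholders):

```csharp
using UnityEngine;

// Sketch: hide the plane that shows the OpenCV feed, either by
// disabling it or by hiding it from the Aryzon cameras via layers.
public class HideCameraFeedPlane : MonoBehaviour
{
    public GameObject feedPlane; // the plane rendering the webcam texture
    public Camera aryzonCamera;  // one of the stereoscopic cameras

    void Start()
    {
        // Option 1: simply turn the plane off.
        feedPlane.SetActive(false);

        // Option 2: move it to a layer (assumed to exist in your project)
        // that the Aryzon camera does not render.
        // feedPlane.layer = LayerMask.NameToLayer("CameraFeed");
        // aryzonCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("CameraFeed"));
    }
}
```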

I know I probably shouldn't say this since a lot of work has gone into your OpenCV setup, but I think they do: you can configure Vuforia to work with your webcam.

Dear Maarten,

Thank you for all the information!!
I am going to mess around with all the settings... like I always do with SDKs.

It's hard to make it look 'real' when you build everything from scratch.
I'll occasionally update this thread with some progress, just for fun.

> Hi Zelos, cool project! Welcome to the community :slight_smile:

:sunglasses: Thank you

Current status: I learned the SC2 editor and the mathematics to create the illusion in-game, and got it working! I'm sending keys with System.Windows.Forms and it's really slow, but it's working (see the sketch below).
Goal: improve response time by calibrating and optimizing the webcam. Maybe look into anaglyph shaders?
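
For reference, the key sending looks roughly like this (a sketch; it assumes a reference to the System.Windows.Forms assembly, and SendKeys is known to be slow; the native SendInput API is a common faster alternative):

```csharp
using System.Windows.Forms;

// Sketch: send a key press to the focused window (StarCraft 2).
// SendWait blocks until the target has processed the key, which is
// part of why this approach is slow.
public static class KeySender
{
    public static void Press(string key)
    {
        SendKeys.SendWait(key); // e.g. Press("a")
    }
}
```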

Note: I'm running in the editor, streaming at low quality, and on a weak PC, so the graphics aren't there yet.


Looks like you’re having fun haha :grin:

So if I understand correctly, you send key points detected by the phone to the computer, compute the pose and render on the computer, and then send the result back to be displayed on the phone. Is that correct? Just a thought: maybe it's easier and more stable to use ARCore/ARKit as the tracking engine and send pose data to the computer. ARCore/ARKit make use of some clever sensor fusion and calibrated cameras to make tracking very stable. Have you tried ARCore/ARKit, or do you have a specific reason to use OpenCV?
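
In that setup, the phone only needs to stream its pose. A hedged sketch of what that could look like on the phone side (the address, port, and message format are assumptions, and Camera.main stands in for the tracked AR camera):

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: stream the tracked camera pose to the PC over UDP.
public class PoseSender : MonoBehaviour
{
    UdpClient client;

    void Start()
    {
        client = new UdpClient();
        client.Connect("192.168.1.2", 9000); // placeholder PC address/port
    }

    void Update()
    {
        Transform t = Camera.main.transform; // tracked pose on the phone
        string msg = $"{t.position.x},{t.position.y},{t.position.z}," +
                     $"{t.rotation.x},{t.rotation.y},{t.rotation.z},{t.rotation.w}";
        byte[] data = Encoding.UTF8.GetBytes(msg);
        client.Send(data, data.Length);
    }

    void OnDestroy()
    {
        client?.Close();
    }
}
```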

Yes, that is all correct.

I have some time again to work on this project and will try to implement ARKit. I thought having everything run on my PC would improve performance; that's why I used OpenCV...

I am also reading about FOV in AR, because I think that is something I'll have to figure out.

The Aryzon SDK determines the size of the screen to be able to render the stereoscopic images correctly. This way the images have the correct size (FOV) and are placed at the correct distance (the horizontal distance between the cameras). Setting your game window in Unity to the same physical size as your phone might be enough (yes, physical, as in real-world dimensions, not pixels). You can edit SDK settings through AryzonSettings.Phone / AryzonSettings.Calibration; for instance, you can check the width of the screen with AryzonSettings.Phone.screenWidth.
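
For example (a sketch; the settings fields are the ones named above, but the exact namespace and types come from the SDK and may differ):

```csharp
using UnityEngine;

// Sketch: log the physical screen width the SDK uses, to verify it
// matches your game window's real-world size.
public class LogScreenWidth : MonoBehaviour
{
    void Start()
    {
        Debug.Log(AryzonSettings.Phone.screenWidth);
    }
}
```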