Multiplayer AR: synchronize worldspace Hi,
for our studies we are building a Jenga AR multiplayer game. Singleplayer is already working fine, and a "Windows simulation" with multiplayer also works.
But when bringing the two together I run into a problem.
Here is the scenario:
Player-H (host): starts g...
Shader.SetFloat does not work Hi,
I'm trying to set a uniform variable for one shader used for multiple materials.
In unity editor the result is what I want.
I call Shader.SetFloat("variable", value) and the result is achieved.
Inside the HoloLens nothing happens.
Do you have any solutions or workarounds?
Is this a bug?
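For comparison, a minimal sketch of driving a shader uniform from script. The property name `_MyValue` is a placeholder; note that the static, all-materials API is `Shader.SetGlobalFloat`, while per-material values go through `Material.SetFloat`:

```csharp
using UnityEngine;

public class SetShaderValue : MonoBehaviour
{
    [Range(0f, 1f)] public float value = 0.5f;

    void Update()
    {
        // Global uniform: every material whose shader declares _MyValue sees it.
        Shader.SetGlobalFloat("_MyValue", value);

        // Per-material alternative (uncomment and attach to an object with a Renderer):
        // GetComponent<Renderer>().material.SetFloat("_MyValue", value);
    }
}
```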
Play external ambisonic audio in 360 video player Hi,
is there a way to load an ambisonic WAV file from disk (without having to import it as an asset and mark it as Ambisonic in the Inspector) and play it back with Resonance Audio?
Thanks and regards
Get Name of Found Image Target Hi! I've been looking all around the ITrackableEventHandler documentation and related forum posts, but can't seem to figure it out. I want to be able to get the name of the target that has been found.
I read somewhere that you have to put the trackable event handler on the Image Target in Unity, but I'm dynamically loading a database of Image Targets, and ideally I would be able to handle the image target behaviour outside of specific image targets... is this possible?
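For reference, a sketch of the usual pattern: a handler that registers with whatever `TrackableBehaviour` it sits on and reads `TrackableName`, so the same script can be attached at runtime to any dynamically loaded target. Names follow the classic Vuforia `ITrackableEventHandler` interface; adapt to your SDK version:

```csharp
using UnityEngine;
using Vuforia;

public class TargetNameHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        // Works on any Image Target this script is attached to, including
        // targets instantiated from a dynamically loaded database.
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            Debug.Log("Found target: " + mTrackableBehaviour.TrackableName);
        }
    }
}
```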
There seems to be a problem with the ARPointCloud's Points property.
Running on an iOS 12 beta iPhone 8 with ARKit 2, the Vector3 array returned by Points has far fewer entries than the reported point count.
Logging the lengths of both, I get values like:
"GetPoints: pointCount 803
GetPoints: verts 200"
which clearly don't match up.
Here's the original code with some logging inserted:
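The original snippet isn't preserved here; the logging described would look roughly like this (`ARPointCloud`, `Points`, and `pointCount` are taken from the post itself; the surrounding method is an illustrative assumption about the Unity ARKit plugin):

```csharp
// Illustrative sketch, not the poster's original code.
void GetPoints(ARPointCloud pointCloud)
{
    Vector3[] verts = pointCloud.Points;
    Debug.Log("GetPoints: pointCount " + pointCloud.pointCount);
    Debug.Log("GetPoints: verts " + verts.Length);
    // On the iOS 12 beta the two lengths disagree (e.g. 803 vs. 200).
}
```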
How do I set up multiple cameras (VR camera & non-VR camera)? Hi there, I'm trying to set up multiple cameras, and I'm using MRTK (Mixed Reality Toolkit).
One camera is the HMD camera (Samsung Odyssey), and the other is a non-VR camera (just a plain Unity camera, not a Mixed Reality camera).
I want to output the view of the second camera (the Unity camera) inside the first one (the HMD).
I tried setting Inspector values and modifying the depth value, but the HMD camera does not show the other camera.
How can I output the view of another camera in an HMD camera?
have a nice day
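One common approach, sketched below on the assumption that showing the second view on a quad parented to the HMD camera is acceptable: render the non-VR camera into a RenderTexture and display that texture in front of the HMD (names and resolution are illustrative):

```csharp
using UnityEngine;

public class PictureInPicture : MonoBehaviour
{
    public Camera secondaryCamera;   // the plain Unity camera
    public Renderer screenQuad;      // a quad parented to the HMD camera

    void Start()
    {
        var rt = new RenderTexture(1024, 576, 16);

        // Keep the second camera out of the stereo/VR render path.
        secondaryCamera.stereoTargetEye = StereoTargetEyeMask.None;

        // Redirect its output into the texture and show it on the quad.
        secondaryCamera.targetTexture = rt;
        screenQuad.material.mainTexture = rt;
    }
}
```

Because the secondary camera renders to a texture instead of the display, camera depth ordering no longer matters; the quad is just ordinary geometry the HMD camera sees.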
Oculus Go with Linear Color Space looks darker than expected. We have set the Graphics API to OpenGL ES 3.0 and the Color Space to Linear. Our build looks darker than expected (compared with the Editor view and a Cardboard build on iOS).
Target: Oculus Go
Using the Vuforia API Hello, I am trying to understand how to use the Vuforia API, but the information available on their site is quite daunting and overwhelming for me.
Can you please suggest some tutorials or blogs through which I can begin my learning? I really get ...
Turn Roll a Ball into an AR game Hey there,
I am new to the Unity world, and I have just finished the Roll a Ball tutorial to get a better grip on the interface and basic functions.
I wanted to know whether it is then possible to build on this and export it as an AR game to an Androi...
Lens Distortion Effect in VR I am creating an app in which the view inside a VR headset should look incorrectly distorted. For this reason, I would like to apply a lens distortion effect on top of the one that is built in to make the view look correct. I found the lens distortion effect in the post processing stack, but it is disabled in XR (this makes sense assuming you want the lens distortion to be correct, but I don't). Does...
So far there aren't many clues I can find in my beta testers' reports.
The command-line log file doesn't really give much data about this crash.
Crashes happen to certain HTC Vive users, though not all.
Could someone please point me in the right direction to fix this crash on Steam?
Is there a way to save the detected surface and the chosen anchor? I'm trying to make a scene that detects a surface and lets the user choose an anchor to spawn an object. This information will be saved and used in other scenes, so the user won't have to search for a new surface and anchor in every scene.
Thank you in advance
Testing Oculus Go within Unity Editor I am testing the waters with VR and doing a few experimental things to help get used to working with VR. I have the Oculus Go and everything is working fine; however, to test my demo I always have to create an .apk build and deploy it using adb install, etc.
I've seen many people on YouTube testing inside the Unity Editor by hitting the Play button. This does not work for me; is there a specific setup for it? It would really help and save me some time.
Help with Gaze Click in Unity, Google VR Hey
I am trying to make a VR experience in Unity using Google VR where the user loads up the program and is put into a 3D room. I have done this by taking 360 videos of a room that leads into a hallway and moves down the hallway into other rooms. Wh...