Lightweight XR Rig – Haptic feedback? I've got a VR project running in Unity 2018.2.6, using the same XR rig (with the TrackedPoseDriver components) that gets created when you make a new Lightweight VR project.
I'm trying to fire haptic feedback events on the controllers. I'm unable to use the SteamVR plugin, because it seems to be completely unusable with the Lightweight pipeline. (Every time I try using it, the entire editor crashes after rendering a single frame. This happens even on an empty project.)
I also noticed there's an OpenVR package available in the package...
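For anyone hitting this later: on newer Unity versions (2019.1+) the XR input system exposes haptics directly, with no SteamVR plugin required. This API does not exist in 2018.2, so this is a sketch of the newer approach rather than a fix for that exact version:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: fires a haptic pulse on the right-hand controller.
// Assumes Unity 2019.1+, where InputDevices.GetDeviceAtXRNode and
// SendHapticImpulse are available.
public class HapticPulse : MonoBehaviour
{
    public void Pulse(float amplitude = 0.5f, float duration = 0.1f)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        HapticCapabilities caps;
        // Not every runtime supports impulse haptics, so check first.
        if (device.TryGetHapticCapabilities(out caps) && caps.supportsImpulse)
        {
            device.SendHapticImpulse(0u, amplitude, duration);
        }
    }
}
```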
GestureRecognizer.OnHoldStarted delay – When using Unity's built-in gesture recognizer (UnityEngine.XR.WSA.Input.GestureRecognizer) on the HoloLens, we're encountering a strange delay in the triggering of the gesture depending on what is being grabbed.
When pinching while looking at a simple-ish UI panel, our gesture is captured and the drag method is triggered almost instantaneously. When pinching over a mesh, the grab gesture must be held for nearly 3 seconds before the system recognizes it.
The problem I'm facing now is that I've no idea how to add physics to it, so the player falls to the floor, or how to add touchpad locomotion to it. I've searched all over the internet and there's a serious shortage of information on how to do this. Any help is appreciated.
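A minimal way to get both is to put a Rigidbody and a CapsuleCollider on the rig root, then drive its velocity from the touchpad. This is a hypothetical sketch: "TouchpadX" and "TouchpadY" are placeholder Input Manager axis names that you would need to map to your controller's touchpad yourself.

```csharp
using UnityEngine;

// Sketch: physics body + simple touchpad locomotion for an XR rig.
// "TouchpadX"/"TouchpadY" are placeholder axis names (map them in the
// Input Manager); exact axis numbers depend on your controller.
[RequireComponent(typeof(Rigidbody), typeof(CapsuleCollider))]
public class RigLocomotion : MonoBehaviour
{
    public Transform head;   // the rig's camera, used for move direction
    public float speed = 2f;

    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        body.freezeRotation = true; // stop physics from tipping the rig over
    }

    void FixedUpdate()
    {
        Vector2 pad = new Vector2(Input.GetAxis("TouchpadX"),
                                  Input.GetAxis("TouchpadY"));
        // Move relative to gaze direction, flattened onto the ground plane.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(head.right,   Vector3.up).normalized;
        Vector3 move = (forward * pad.y + right * pad.x) * speed;
        body.velocity = new Vector3(move.x, body.velocity.y, move.z);
    }
}
```

Gravity on the Rigidbody makes the player fall to the floor; the capsule keeps them from falling through it.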
Accessing ARCore and phone camera settings – Hey guys, is there any way to adjust camera settings like exposure within Unity? I don't want the phone camera to adjust light/brightness by itself; it's really bad for my app and I'd like to have control over it myself. I know it can be done outside Unity, but I can't write plugins. If there is no way, how can I request it from the Unity team?
PS: also focusing in the area that i tapped is another useful...
I am making an AR application for the HoloLens. I have created a thin rectangular GameObject (a cube), and I need to assign it a material that is transparent but also conceals what is behind it.
The idea is to project things onto a wall that has shelves: if I look at a shelf from above, I should not see what is below it, but if I look at it from the front, I should. So wherever a shelf theoretically sits, I will place this material.
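A common approach for this is a "depth mask" material: a shader that writes to the depth buffer but outputs no color, rendered just before ordinary geometry, so the surface stays invisible while everything behind it is occluded. A minimal sketch:

```shaderlab
Shader "Custom/DepthMask"
{
    SubShader
    {
        // Render before regular geometry so the depth buffer is filled first.
        Tags { "Queue" = "Geometry-10" }
        ColorMask 0   // write no color: the surface itself is invisible
        ZWrite On     // ...but it still occludes whatever is behind it
        Pass { }
    }
}
```

Assign this shader to a material on the shelf-shaped cube; objects behind it will be hidden while the real-world wall remains visible through it.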
Multiple SDKs in one project – Hi,
Is it possible to have multiple VR/AR SDKs in a single Unity project? For example, can I have a Unity project that contains the ARKit and ARCore SDKs for AR, and the Google Cardboard, Oculus Gear VR/Rift, Vive, and Daydream SDKs for VR in the same project ...
Vuzix M300 speech and touchpad in Unity3D – Hi,
Is there any way to include speech recognition and touchpad support for the Vuzix M300 using Unity? The code that has been shared is for Android Studio development; I'm asking about Unity3D.
Random crashes with Unity 2018 and Oculus Utilities 1.28 – Hi there. I have a large project with many scenes; each scene is a standalone game. The project was created for Gear VR with the Oculus SDK and Unity 5.6.4p4.
Recently I upgraded the project to 2018.2.4f1 and Oculus Utilities to 1.28, and in the Editor it's working per...
I don't know how to tell whether my Oculus Rift game is being run through Steam or through the Oculus runtime. What is the proper way to check which platform my game is being deployed on?
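One way to distinguish the two at runtime is to check which XR device Unity actually loaded; a sketch using the legacy `UnityEngine.XR.XRSettings` API:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class RuntimeCheck : MonoBehaviour
{
    void Start()
    {
        // Typically "OpenVR" when launched through SteamVR,
        // "Oculus" when running on the native Oculus runtime.
        string device = XRSettings.loadedDeviceName;
        Debug.Log("Running on XR device: " + device);
    }
}
```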
VR MI Hello!
Can you help me?
I want to create an app for MI Virtual Reality, but I cannot register on the Chinese website https://dev.mi.com/developer/selectBindType?userId=1735740835
Do I really need to have a Chinese bank card to create applications? Are ther...
1) The included post-processing stack doesn't show while I'm in Play mode. What is the reason for this?
2) In the editor / play mode, the game is very jittery, and the left eye flickers about once a second. When I make a build, it's buttery smooth. I assume there is some editor overhead, but is there something that can be disabled to make it...
Here's another thread for this issue. There are multiple old ones in the forums, but they don't seem to be watched anymore or are marked closed. So here's a new one in the hope this finally gets solved.
- I'm using Unity 2018.2.6f1 and ArKit 1.5
- I build the ArKit 1.5 ARKitRemote scene with "Developer Build" checked
- I start the build on my iPhone X and the FaceAnchorScene in Unity on my Macbook (connected directly via Lightning to USB-C cable with wifi...
...but what actually happens when the app launches is a totally black screen for 3–5 seconds, and then sometimes the splash image flashes up briefly before we enter the game. Other times, it goes right into the game (after 3–5 seconds of black).
Have I set the logo up wrong? Or do I need to do something like, start with a trivial entry scene that Unity can load more easily, and then load the "real" game...
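The "trivial entry scene" idea can be sketched like this: a near-empty bootstrap scene that Unity can show immediately, which then loads the heavy main scene in the background. "Main" is a placeholder for your actual scene name:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical bootstrap: lives in a near-empty first scene (just the logo),
// so the heavy scene loads asynchronously instead of blocking on black.
// "Main" is a placeholder scene name; it must be in Build Settings.
public class Bootstrap : MonoBehaviour
{
    IEnumerator Start()
    {
        AsyncOperation load = SceneManager.LoadSceneAsync("Main");
        load.allowSceneActivation = false;
        // Progress stops at 0.9 until activation is allowed.
        while (load.progress < 0.9f)
            yield return null;
        load.allowSceneActivation = true;
    }
}
```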