We're the developers of a free beta app that provides 6DoF head and eye tracking to control BeamNG's in-game camera with real-life head movements. The app is, incidentally, named Eyeware Beam. Go figure.

iOS Download | Setup Guides | Tutorial Video

We built the software's proprietary computer vision algorithms and machine perception AI in-house, so it does not depend on ARKit. It generates an accurate six degrees of freedom (6DoF) head pose and eye tracking signal comparable to expensive dedicated devices such as Tobii or TrackIR. The app works without additional hardware for the millions of people who already own an iPhone and a PC: once connected to a Windows PC over Wi-Fi or USB, it turns a Face ID-supported iPhone or iPad (one with a built-in TrueDepth camera) into a precise, multi-purpose head and eye tracking device. Android support is not currently available.

To get started, you need a Windows 10 gaming PC and an iPhone or iPad that supports Face ID, running iOS 13 or later.

We also released an SDK so developers can access the API and build additional integrations.
Software Development Kit (SDK): Download
Documentation: Here

Questions? Reply here, on the Discord server, or email Help@Eyeware.tech.
Thanks for bringing this up - nice software. However, why do I need the iPhone? Visual algorithms are a lot better today and can detect head movement pretty well without any "TrueDepth" camera to begin with. For example: https://learnopencv.com/head-pose-estimation-using-opencv-and-dlib/

That being said, we will not link to your .lib for this integration, and never will. We are creating software for so many people to enjoy that we need to see what's going on to ensure our players' privacy and our runtime behavior. I.e., if we link to your closed-source library, we cannot guarantee that you won't start tracking all our users instantly and sending info to your servers. We did not integrate closed-source black-box SDKs in the past, and we will not do so in the future.

Feel free to create a local API on the computer with pipes, RESTful interfaces or the like, and provide a free and open library to interface with them, though.

Cheers, Thomas
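To make the "local API" suggestion concrete, here is a minimal sketch of what such an interface could look like: a loopback-only HTTP endpoint that serves the latest head pose as JSON, which any client (including an open-source game-side library) could poll. The URL path `/pose` and the field names (`x`/`y`/`z` in cm, `yaw`/`pitch`/`roll` in degrees) are purely illustrative assumptions, not Eyeware's actual schema.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical wire format: a single JSON object with a 6DoF pose.
# Field names and units are illustrative, not a real Eyeware schema.
latest_pose = {"x": 0.0, "y": 0.0, "z": 65.0, "yaw": 0.0, "pitch": 0.0, "roll": 0.0}

class PoseHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/pose":
            body = json.dumps(latest_pose).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def serve(port=0):
    # Bind to loopback only, so the pose feed never leaves the machine.
    server = HTTPServer(("127.0.0.1", port), PoseHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A consumer would simply poll `http://127.0.0.1:<port>/pose` each frame; because everything is inspectable plain JSON over loopback, there is no opaque binary to trust.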
In addition to what @tdev said, you can take a look at lua/ge/extensions/core/cameraModes/trackir.lua for an example of how you could cleanly inject your own head tracking system into the game's camera pipeline: the "runningOrder" parameter controls how early or late in the pipeline your camera transformation gets applied. The aforementioned restrictions (e.g. no LIB linking) still apply, though, so you would need to communicate with your SDK some other way.
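For readers unfamiliar with what such a camera mode does, here is a minimal, game-agnostic sketch of the underlying math: rotating a camera-forward vector by a head-pose yaw and pitch. The axis conventions and degree units are assumptions for illustration; this is not BeamNG's actual camera API.

```python
import math

def rotate_yaw_pitch(vec, yaw_deg, pitch_deg):
    """Rotate a camera-forward vector by head yaw (about the vertical Y axis),
    then pitch (about the lateral X axis). Illustrative conventions only."""
    x, y, z = vec
    # Yaw: rotate in the X-Z plane.
    cy, sy = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    x, z = cy * x + sy * z, -sy * x + cy * z
    # Pitch: rotate in the Y-Z plane.
    cp, sp = math.cos(math.radians(pitch_deg)), math.sin(math.radians(pitch_deg))
    y, z = cp * y - sp * z, sp * y + cp * z
    return (x, y, z)
```

A camera mode slotted into the pipeline would apply a transform like this to the camera's orientation each frame, using the head pose received from the tracker.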