An action webcam strapped with bow ribbons to my XR glasses while I grin madly into the smartphone cam. A bunch of wires is also strapped to the glasses.
Video: How to get 6DOF with older 3DOF XR glasses using Breezy and OpenTrack

Breezy can now turn a 3DOF (degrees of freedom) device into a 6DOF device by augmenting the missing positional data from a webcam. Spoiler! It is not the cam strapped to my face – that one is just for the demo, which you can watch here, on PeerTube or YouTube.

The cam I used for this task is sitting on my monitor. How does this work? Well, not with magic! This requires a somewhat decent webcam – really, anything from the last decade should suffice – and OpenTrack, of course.

OpenTrack is a head-tracking application with multiple tracker plugins. One of its plugins is the Neuralnet Tracker, an AI-powered extension that comes with a bunch of different head pose models to choose from. With a webcam connected it can run the detection model locally with very low latency – so it’s usually blazing fast on most systems!

This alone is already 6DOF and is used a lot for gaming – so what does Breezy do with this? Simple! It reads the forwarded data via a UDP listener – a very quick way to transmit data on a local network or system – and complements its own rotational data with the missing positional data.

With this a Breezy user still gets the rotational data from the XR glasses’ very sensitive IMU (short for Inertial Measurement Unit, btw) and the less critical positional data sent from OpenTrack.
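The UDP side is simple enough to sketch. Here is a minimal listener, assuming OpenTrack’s “UDP over network” output format of one 48-byte packet containing six little-endian doubles (x, y, z in cm, then yaw, pitch, roll in degrees) – check your OpenTrack version if in doubt:

```python
import socket
import struct

OPENTRACK_FMT = "<6d"  # six little-endian doubles = 48 bytes per packet

def open_listener(host: str = "127.0.0.1", port: int = 4242) -> socket.socket:
    """Bind a UDP socket on OpenTrack's default output port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock

def parse_pose(packet: bytes) -> tuple:
    """Unpack one packet into (x, y, z, yaw, pitch, roll)."""
    return struct.unpack(OPENTRACK_FMT, packet)

# A consumer like Breezy would keep its own IMU rotation and only
# take the x/y/z position from each received packet:
#   x, y, z, *_ = parse_pose(sock.recvfrom(48)[0])
```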

This works only while the webcam can still see the user, of course. So sadly no walking around while using this.

And the best thing? It can also send the data back! This means that the very same combined values can be forwarded – e.g. to a computer game – benefiting from the best available data sources for rotation and position.

That’s not the main use case, of course, and only of importance for some nerds like myself. This is mostly relevant for the productivity features of Breezy, because sometimes a text may be too small to read with the glasses on. We no longer have to increase the font size – we can now simply lean in! That is a feature usually only available with glasses that come with little cameras of their own, so they can have native 6DOF support. And when I say native, I mean that such glasses usually also outsource exactly this calculation to the connected computer. It’s my understanding that this requires a lot of computation power, which is something many XR users with more modern devices complain about.

Well, not so much with OpenTrack and the Neuralnet Tracker, which utilizes the ONNX Runtime under the hood. That’s a high-performance, cross-platform engine to power exactly such models locally. The runtime automatically makes use of the best available hardware acceleration, if there is any.

Overall I’m rather hyped about this feature – especially because I’ve been using the OpenTrack output option of Breezy for quite some time now to get a VR-like experience with stereoscopic 3D rendering in Side-By-Side mode. I can now keep using my older XR glasses and still enjoy this more modern 6DOF feature. This is rather expensive hardware, after all.

And all that on Linux PC!

Breezy xr_driver: https://github.com/wheaney/breezy-desktop by https://www.youtube.com/@WayneHeaney

Official Announcement XR desktop with 6DoF + multiple displays: https://www.youtube.com/watch?v=eFLmjpjF-rA

Music “Life’s Worth Dying For” CC BY-SA 3.0 “LostDrone”. Licensed to the public under https://creativecommons.org/licenses/by-sa/3.0/ Verify at https://soundcloud.com/lostdrone/rock-lostdrone-lifes-worth-dying-for-free-download-and-creative-commons-license

So what happens when sheer stubbornness, a glorified button box, Ace Combat and the Unreal Engine scripting system meet? Pure magic. I got the game to spew out a constant stream of telemetry data and events in search of more immersion in my VF-1 inspired home cockpit. The approach is the very same one I used for X4: Foundations before: side-load the LuaSocket library, get a network connection established and start dumping extracted game data to it. This is highly experimental and the result of hacking away for the last ~4 nights. This video demonstrates the results:

https://makertube.net/w/cbXJAveVgVTGVEi58akVTA / https://www.youtube.com/watch?v=50J-gjkgJxE

To be perfectly clear: I am aware that Ace Combat is not a “flight sim”, not really worthy of an API, and I know that DCS or BMS do it better, in greater detail and even with realism. This is not the point. I started working on this just for fun and to satisfy my own curiosity – to see *if I can make it*. This may be hard to believe, but chipping rocks together until the computer does what I want is “quality time” for me 🤓

You may have noticed that I’m a Macross fan and that my SimPit is heavily inspired by a VF-1 Valkyrie and that I usually use a modded VF-1 plane in AC as well. This is my personal substitute for the lack of any decent Macross / Robotech game since Macross VOXP.

This said, I usually fly Space Pew Pew games with this cockpit, so everything you see going on is designed for _space_ and not for flight sim. This is also why I sometimes talk about “ships” or being “docked”. This wording is found everywhere in my plumbing pipeline for telemetry. All games I play that support this send their data through it. The idea is that I don’t have to rewrite half of the connected systems for every game, so I transform the data into a unified format beforehand.
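That unification step boils down to one mapping function per game. A toy sketch – all field names here are invented for illustration and are not the actual pipeline’s schema:

```python
# Map per-game telemetry onto one unified schema so downstream consumers
# (HUD, blinkenlights, ...) never see game-specific field names.
# All field names below are hypothetical examples.
def normalise(game: str, raw: dict) -> dict:
    if game == "x4":
        return {"speed": raw["velocity"],
                "hull": raw["hullPercent"],
                "docked": raw["isDocked"]}
    if game == "ace_combat":
        return {"speed": raw["airspeed"],
                "hull": raw["health"],
                "docked": False}  # a plane never "docks"
    raise ValueError(f"no mapping for game {game!r}")
```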

You can read more about this on the dedicated project website https://simpit.dev (and here, of course). I will soon update it with some more details for Ace Combat. If this looks like something you’d like to try, let me know – I’d love to connect. I’m active on various social media.

List of menu key bindings from a PC game demonstrating various bound buttons with an ungodly long menu entry for each option

Looks like Project Wingman has a broken Input.ini parser, resulting in my mappings being gone on restart. The problem is that some special characters, like a comma, break the INI format used by their controls implementation [/Script/Engine.InputSettings].

Here’s an example of what the game writes to AppData/Local/ProjectWingman/Saved/Config/WindowsNoEditor/Input.ini:


AxisMappings=(AxisName="Pitch Axis",Scale=-1.000000,Key=Joystick_ThrustMaster,IncF-16FlightControlSystem_2_Axis1)

The name for the key is something homebrew the game produces based on the controller type (Joystick_ or Gamepad_) and the HID device descriptor name. This example mapped fine ingame but breaks on reload of the game, resulting in only ThrustMaster for each mapped control – and that joystick cannot be found, of course.

The “fix” is to manually edit the file and add quotation marks for the key:

AxisMappings=(AxisName="Pitch Axis",Scale=-1.000000,Key="Joystick_ThrustMaster,IncF-16FlightControlSystem_2_Axis1")

Now the game finds the proper joystick and all controls are mapped to something like ThrustMaster,IncF-16FlightControlSystem_2_Axis1 again, as expected.

Needless to say, the file should probably be write-protected after that – or at least saved again under a different name, because any change to the controls will overwrite this fix again. This problem probably also happens with other special characters, like the © sign that some vendors are known to use.
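Since the game rewrites the file on every controls change, the manual fix lends itself to a small script. A sketch – the regex is my own best-effort reading of the [/Script/Engine.InputSettings] line format, not something the game documents:

```python
import re

# Wrap any unquoted Key= value that contains a comma in quotation marks,
# e.g. Key=Joystick_Foo,Bar)  ->  Key="Joystick_Foo,Bar")
# Already-quoted keys are left alone (negative lookahead).
_KEY_RE = re.compile(r'Key=(?!")([^)]*,[^)]*)\)')

def quote_keys(line: str) -> str:
    return _KEY_RE.sub(r'Key="\1")', line)
```

Running every line of Input.ini through `quote_keys` before the game starts would re-apply the fix automatically.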

This is Project Wingman mission 01 Black Flag played on a Linux PC with Proton Experimental, OpenTrack with the Neuralnet Tracker plugin and my DIY HOTAS / rudder system based on Arduino Pro Micros replacing the original electronics in my Thrustmaster FLCS/Cougar gear:

Pick your poison: https://makertube.net/w/8MyoVSzDfwMuQR6bCqtbie / https://www.youtube.com/watch?v=dq0sihlgW_Y

I got Project Wingman on a sale months ago and finally gave it a try. As an Ace Combat player I felt right at home. My initial experiment was with the XR glasses and woah, that feels good in 3D and all, but today I remembered that old plasma TV in the basement. Got it second hand a year ago, dirt cheap. Today I brought it upstairs to try it with the ViperPit and now I’m not sure what’s more awesome.

Well, that is if I feel like burning ~470W on top for that thing but hey this is for very specific gaming sessions only anyway 🤷

Guess I’ll spend more time in the ViperPit again 😀

Played the (closed) XWVM Alpha with my VF-1 inspired SimPit. I’m simply in awe that I can replay missions from X-Wing (or TIE Fighter) with more modern graphics and modern interface devices again. I spent _so many_ hours playing these games as a kid.

This is the heavily cut VOD of the live stream over at @bekopharm@live.famkos.net (pick your poison):

https://makertube.net/w/r1LRrqDWnhw4wRk92uNfzo /
https://www.youtube.com/watch?v=9T2jxqT_5sU

This time I play with the native Linux version and my X52 Pro joystick (which means I actually have a chance of hitting stuff too). The following missions were played:

Historical Mission 2 / Wingmen Are Important
Historical Mission 3 / Satellites Near Coruscant
Historical Mission 4 / Beating The Odds
OP 1: Destroy Imperial Convoy (Uncut)
OP 2: Reconnaissance Mission (Uncut)
OP 3: Fly Point During Evacuation (Uncut)
OP 4: Protect Medical Frigate (Uncut)

XWVM is not an official product from Lucasfilm Ltd. or Disney. It is not endorsed or authorized by either. It is a fan recreation of the game engine used to play X-Wing and TIE Fighter for the sake of accessibility and requires the original game assets to work.

Kudos to the XWVM team, they are doing a stellar job here.

The dedicated project website for the Macross inspired SimPit is https://simpit.dev

This uses my X4-SimPit extension for X4: Foundations, which sends ship telemetry via a socket to my node-red plumbing pipeline, which in turn forwards the data to WebSockets, Socket.IO and MQTT. Various subscribers listen for new messages to run blinkenlights and my HUD app. I’m using the well-known message format also used by Elite Dangerous, so it’s compatible with that game as well.
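For the curious: what travels over those transports is just tagged JSON. A toy sketch of such an envelope – the field names here are hypothetical examples, the actual messages follow the Elite Dangerous format:

```python
import json

# Build one tagged JSON message for the MQTT/WebSocket/Socket.IO subscribers.
# The envelope shape is an illustrative assumption, not the exact schema.
def make_event(event: str, **fields) -> str:
    return json.dumps({"event": event, **fields})
```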

Pick your poison: https://makertube.net/w/nUoG2ZPeAW1QhT3A2BXRrM / https://www.youtube.com/watch?v=wp1PkVhH9cc

Oh yeah… and on Linux PC 🤓

Let me know what you think!

X4-SimPit code (pending changes) is here: https://github.com/bekopharm/x4-simpit
The cockpit panel has a dedicated project page here: https://simpit.dev/

Played the (closed) XWVM Alpha with my ViperPit and with XR glasses. I’m simply in awe that I can replay missions from X-Wing (or TIE Fighter) with more modern graphics and modern interface devices again. I spent _so many_ hours playing this as a kid.

This is the heavily cut VOD of the live stream over at https://live.famkos.net (pick your poison):

https://makertube.net/w/hW6cJeqBY42YoryJL1gRg5 /
https://www.youtube.com/watch?v=8at4P5rf-gE

I go over the input settings and show its capability to connect various joystick devices, demo the Proving Grounds and showcase missions 1+2. In the end I go over various settings for the XWVM engine and how the machine hardly sweats displaying the gorgeous cockpit.

XWVM is not an official product from Lucasfilm Ltd. or Disney. It is not endorsed or authorized by either. It is a fan recreation of the game engine used to play X-Wing and TIE Fighter for the sake of accessibility and requires the original game assets to work.

The game was played with Viture Pro XR glasses running in Side-By-Side mode thanks to ReShade on a Linux PC.

Kudos to the XWVM team, they are doing a stellar job here.

So I was asked if my head-tracking approach – feeding the IMU data from my Viture Pro into OpenTrack – plus SBS (side-by-side) mode with ReShade would also work with Star Citizen.

Guess it does 🤷

Pick your poison to watch the video: https://www.youtube.com/watch?v=rWUC2Y3TRh4 / https://makertube.net/w/8L7gVN8NnLvjhQCPGNmd6W

I start Star Citizen via Lutris (and not with Steam), which requires slightly different settings once ReShade is installed:

Enable Gamescope: ON
Output Resolution: "3840x1080"
Game Resolution: "3840x2160" (set this also ingame!)
Custom Settings: "--scaler stretch"

Can this get you banned? Who knows 🤷 Jury is still out on this. Do I care? Nope. I won’t miss my puny starter pack.

YMMV.

The proof of concept code to read the IMU data can be found at https://github.com/bekopharm/xr_to_opentrack (pending changes).

It works with the Breezy GNOME xr_driver: https://github.com/wheaney/breezy-desktop (the Vulkan one probably works too, but that’s untested). It should also be compatible with other glasses that have IMU support in Breezy.

There is an unlisted SBS version of this video linked in the description. You will need XR glasses that do FULL SBS though to watch it!

Until now I used OpenTrack with my DIY IR tracker or the Neuralnet tracker. I knew that my XR glasses provide IMU data though, and the xr_driver of the Breezy Desktop project allows accessing that data via IPC on a Linux PC. So I did what Linux users do: I wrote a script to access the IMU data and forward it via UDP to OpenTrack:

Pick your poison to watch the video: https://www.youtube.com/watch?v=njuumLUvqrM / https://makertube.net/w/2bNyxJhdyydTeFq17onikv

This reminded me that I also wrote a proof of concept implementing the FaceTrackNoIR (or OpenTrack) protocol in FreeSpace 2 Open on Linux PC ( https://makertube.net/w/7VtfAjW7EiAUS5aiPwG7if ), so I gave it a spin to test the data bridge. That was smooth sailing!

The mod is Diaspora: Shattered Armistice, still awesome today: http://diaspora.hard-light.net/ (Warning: this may fuel a desire to re-watch the BSG series 😀).

The bridge code can be found at https://github.com/bekopharm/xr_to_opentrack (pending changes).
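The forwarding direction is the mirror image of an OpenTrack listener: pack the IMU rotation into the six-double packet that OpenTrack’s “UDP over network” input expects, with the positions zeroed since a 3DOF IMU has none. A minimal sketch, assuming OpenTrack’s default port 4242:

```python
import socket
import struct

def send_pose(sock: socket.socket, yaw: float, pitch: float, roll: float,
              addr=("127.0.0.1", 4242)) -> None:
    """Send one pose packet: x, y, z (zeroed), then yaw, pitch, roll in degrees."""
    packet = struct.pack("<6d", 0.0, 0.0, 0.0, yaw, pitch, roll)
    sock.sendto(packet, addr)

# Usage: loop over incoming IMU samples and call send_pose() for each one.
```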

It works with the Breezy GNOME xr_driver: https://github.com/wheaney/breezy-desktop (the Vulkan one probably works too, but that’s untested). It should also be compatible with other glasses that have IMU support in Breezy.

Update: hodasemi wrote a Rust connector based on this idea that works without Breezy: https://github.com/hodasemi/xr_to_opentrack_rs – it comes with a systemd service file so it can run in the background. Once installed, the only step left is to fire up OpenTrack 🤘

So bear with me if I mix something up – this is all news to me and I’m still flabbergasted. I got myself some XR glasses a while ago, mostly for watching movies and perhaps some gaming on the Steam Deck.

Now I learned, like ~3 days ago, about the “SBS” (Side-By-Side) mode that the glasses support. I tried this with the game Elite Dangerous first, since it has an SBS mode built in too, and was mind-blown. My current favourite time sink is Ace Combat though, so I started digging.

Turns out there is this ReShade tool that can forcefully enable such a mode for basically any game with the right shader. Several exist, but the first one I found, “SuperDepth3D.fx”, seems to do the trick. Enabling it splits the 1920×1024 image in half into two slightly different viewports, one for each eye. There are many options to fine-tune this and I’m still fiddling to find the perfect settings, but the results look great already.

My glasses do Full SBS though and have a resolution of 3840×1024. I read somewhere that wide-screen is possible with more DLL shenanigans in Ace Combat 7 too, but I run the game on a Linux PC anyway, where we utilise a tool named “gamescope”. This basically allows configuring a virtual display for each game and overriding the game resolution in various ways. It also has a stretch option, which is exactly what I needed to get the “compressed” SBS view from 1920 to 3840, where the aspect ratio fits again. BTW: it also has FSR built in, so any upscaling looks good enough too. I’m not entirely sure, but I think there’s a similar tool on Windows called “Virtual Desktop”?
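For reference, a hypothetical standalone invocation matching that setup could look like this – flag names as of recent gamescope releases, so check `gamescope --help` on your version:

```shell
# Game renders the "compressed" 1920x1024 SBS image (-w/-h),
# gamescope stretches it to the glasses' full 3840x1024 (-W/-H).
gamescope -w 1920 -h 1024 -W 3840 -H 1024 --scaler stretch -- ./the-game
```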

Anyway, I had already managed to get my head tracker working by mapping its output to a virtual gamepad on the look-around axes. I also found a mod that enables a wider FOV. Imagine my stupid grinning when everything fell into place: Full SBS with head tracking, a more sane FOV and yes, I jumped through all the hoops to get the HOTAS and rudder pedals of my old ViperPit working (which is a different story, because my devices are so old that I had to upgrade them to USB first, which involved some Arduinos, programming and soldering). I guess that makes me a member of multiple niches at once 🤓

And since I’m aware that nobody can “see” what I’m talking about without having XR glasses or a VR headset (or a DIY VR box for smartphones) of their own, have an Anaglyph 3D render as well. This requires just some old-school two-coloured (red and cyan) glasses, often made of paper, that many people still have around somewhere, to get an idea.

The colour of the sky? It’s perfect. A deep dark blue.

Update: There is now video footage: https://www.youtube.com/watch?v=NckLvP1HBGw