I usually play #FlyDangerous on Linux PC. I switched to Proton because I was eager to see some upcoming changes, like #headtracker support, on the public_beta branch. And while this works, I was once more flabbergasted by how complicated it is to set my desired display resolution of 5760×1200. I’m using a multihead setup with several displays and, as usual, the game engine would not let me _simply_ set that. Not even in windowed mode (I mean, I get that this won’t work in fullscreen).
There are several ways to work around this, especially with Proton, but I was looking for the prefs file I know from Linux. In the end I found it in the file compatdata/1781750/pfx/user.reg (that’s like the Windows registry, but as a plain text file read by Wine), where the values are stored as dwords under [Software\\StarGoat\\FlyDangerous]. In hex.
"Screenmanager Resolution Height_h2627697771"=dword:000004b0
"Screenmanager Resolution Width_h182942802"=dword:00001680
"Screenmanager Resolution Use Native_h1405027254"=dword:00000000
So 1680 and 04b0 are hex for 5760 and 1200. And sure enough, on the next game start I get _my_ desired resolution:
Sadly, when I change settings in the game this gets overwritten again – so keep a backup around and drop it back in. This could even be wrapped in a script – let’s see how long it takes until this gets on my nerves and I automate it.
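If I ever do automate it, it would probably look something like this minimal Node/TypeScript sketch (untested, path and resolution hard-coded to my setup – the key hashes are the ones from my user.reg above, yours may differ):

```typescript
// patch-res.ts – hypothetical helper to re-apply my resolution to user.reg
import { readFileSync, writeFileSync } from "node:fs";

const REG = "compatdata/1781750/pfx/user.reg"; // adjust to your steamapps/compatdata path
const WIDTH = 5760;
const HEIGHT = 1200;

// Wine stores the values as 8-digit hex dwords, e.g. 5760 -> dword:00001680
const dword = (n: number) => "dword:" + n.toString(16).padStart(8, "0");

let reg = readFileSync(REG, "utf8");
reg = reg
  .replace(/("Screenmanager Resolution Width_h\d+"=)dword:[0-9a-f]+/i, `$1${dword(WIDTH)}`)
  .replace(/("Screenmanager Resolution Height_h\d+"=)dword:[0-9a-f]+/i, `$1${dword(HEIGHT)}`);
writeFileSync(REG, reg);
console.log(`patched ${REG} to ${WIDTH}x${HEIGHT}`);
```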
For the interested: This is how the same thing looks on the native version in the file ~/.config/unity3d/StarGoat/FlyDangerous/prefs
It has been a while since the last update. I focused my attention on the MFDs (multi-function displays). This part hadn’t received much attention yet and I was caught between two options: learn yet another fancy framework, like Raylib, that would do OpenGL ES 2.0 without X11 on the Raspberry Pi – or just throw the might of my CoffeeLake at it and go with ReactJS, since most of the data was already available via NodeRED anyway. Also… ARWES is just so cool 🤩
I went with ReactJS and ARWES again, simply because I have some experience with this by now thanks to the Streaming Overlay I wrote with it. Hooking it up to NodeRED was just a matter of installing SocketIO to transport the messages. It’s all a very hacky mess but it gets the job done.
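For the curious, the glue code is tiny. Something along these lines (a sketch, not my exact code – the URL, port and event name are placeholders for whatever you configure on the NodeRED side):

```typescript
// useTelemetry.ts – hypothetical React hook that feeds NodeRED messages into the MFD
import { useEffect, useState } from "react";
import { io } from "socket.io-client";

export function useTelemetry(url = "http://nodered.local:1880") {
  const [data, setData] = useState<Record<string, unknown>>({});

  useEffect(() => {
    const socket = io(url);
    // "telemetry" is simply the event name picked on the NodeRED side
    socket.on("telemetry", (msg: Record<string, unknown>) =>
      setData((prev) => ({ ...prev, ...msg }))
    );
    return () => { socket.disconnect(); };
  }, [url]);

  return data; // feed this into the ARWES components
}
```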
While sifting through the available data I noticed that I don’t get velocity values from Elite. That’s not so important in space but it’s _kinda_ interesting for me in planetary flight, to satisfy the flight sim gamer in me as well. I noticed though that I do get timestamped latitude, longitude and altitude values – so it should be possible to “simply” calculate this, right? Right?
This was when I dived into the rabbit hole of calculating velocity and heading on planetary objects using a spherical coordinate system, and while I didn’t nail exactly how Elite does it, the result is close enough. The game provides the required data to go crazy here – most importantly the radius of the current object. In _theory_ I could start writing some primitive AFS (Auto Flight System) routines now, which I’m totally going to explore at some point in the future just because 🤓
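In case you want to fall down the same hole: the core of it is just the haversine distance plus the initial-bearing formula on a sphere of (body radius + altitude). A rough TypeScript sketch of the idea – assuming radius and altitude in metres and timestamps in seconds, and certainly not exactly what Elite does internally:

```typescript
// velocity.ts – rough ground speed & heading from two timestamped position samples
type Sample = { t: number; lat: number; lon: number; alt: number }; // t in s, lat/lon in deg, alt in m

const rad = (d: number) => (d * Math.PI) / 180;
const deg = (r: number) => (r * 180) / Math.PI;

export function velocity(a: Sample, b: Sample, bodyRadius: number) {
  const dt = b.t - a.t; // assumed > 0
  const la1 = rad(a.lat);
  const la2 = rad(b.lat);
  const dLat = la2 - la1;
  const dLon = rad(b.lon - a.lon);

  // haversine great-circle distance on a sphere of (body radius + altitude)
  const h = Math.sin(dLat / 2) ** 2 + Math.cos(la1) * Math.cos(la2) * Math.sin(dLon / 2) ** 2;
  const ground = 2 * (bodyRadius + b.alt) * Math.asin(Math.sqrt(h));

  // initial bearing from sample a to sample b, 0° = north, clockwise
  const y = Math.sin(dLon) * Math.cos(la2);
  const x = Math.cos(la1) * Math.sin(la2) - Math.sin(la1) * Math.cos(la2) * Math.cos(dLon);
  const heading = (deg(Math.atan2(y, x)) + 360) % 360;

  const vertical = (b.alt - a.alt) / dt;            // climb/sink rate in m/s
  const speed = Math.hypot(ground / dt, vertical);  // total speed – Pythagoras strikes again

  return { speed, groundSpeed: ground / dt, vertical, heading };
}
```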
After spending way too much time with this and the Pythagorean theorem (Yes mum, a game made me do maths. MATHS! 🤯) I settled on some calculations and data for my current ship on the right and targeted ship data on the left. This is sort of tricky because many game events update different parts of the data, so timestamps have to be kept in mind and a game-specific parsing strategy is required (sketched below). See the last part of the demonstration video to get an idea of how this looks.
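That parsing strategy boils down to “never let an old event overwrite newer data”. Conceptually something like this (a simplified sketch with made-up names, not my actual NodeRED flow):

```typescript
// merge.ts – keep only the freshest value per field when events disagree
type Stamped = { value: unknown; timestamp: number };
type Snapshot = Record<string, Stamped>;

// Each game event only carries a subset of fields; merge it into the current
// snapshot, but skip fields for which we already have newer data.
export function merge(
  snapshot: Snapshot,
  event: { timestamp: number; fields: Record<string, unknown> }
): Snapshot {
  const next = { ...snapshot };
  for (const [key, value] of Object.entries(event.fields)) {
    const known = next[key];
    if (!known || known.timestamp <= event.timestamp) {
      next[key] = { value, timestamp: event.timestamp };
    }
  }
  return next;
}
```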
Another point to tick off my list was getting head tracking to work in Elite (again). Now, this is very Linux PC specific, so you may tune out for this paragraph. On Linux PC I’d usually compile Opentrack with the Wine Glue, patch in my appdata dir for Proton and hope that it’s still ABI compliant enough to Just Work™. Alas, recent Proton is sandboxed within pressure-vessel and the usual approach of memory mapping simply no longer works, if I got the gist of this right.
So my _current_ strategy is to download the Windows build of Opentrack, drop it into the game folder and chain-load its EXE with the game, where the Opentrack EXE listens on UDP while my native Opentrack build sends via UDP. A task not made easy by Proton, but it is possible. The following snippet may give you some pointers:
Why run Opentrack twice? The native build performs a lot better with my webcam, and every frame really counts here. Reading data via UDP is not much of a burden for Proton. This also saves me the trouble of fiddling with the Wine Glue, a painful compile process nobody should endure, involving the installation of many, many additional 32-bit libraries. Hilarious, but it works.
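If you want to verify that the pose packets actually arrive before blaming Proton, a quick listener helps. This sketch is based on my understanding of Opentrack’s “UDP over network” output – default port 4242 and a payload of six little-endian doubles (x, y, z, yaw, pitch, roll); stop the receiving Opentrack first so the port is free, and double-check the field order against the Opentrack source:

```typescript
// udp-peek.ts – peek at the pose packets the native Opentrack sends out
import { createSocket } from "node:dgram";

const sock = createSocket("udp4");
sock.on("message", (buf) => {
  if (buf.length < 48) return; // not a full pose packet
  const [x, y, z, yaw, pitch, roll] = Array.from({ length: 6 }, (_, i) =>
    buf.readDoubleLE(i * 8)
  );
  console.log(
    `pos ${x.toFixed(1)} ${y.toFixed(1)} ${z.toFixed(1)} | yaw ${yaw.toFixed(1)} pitch ${pitch.toFixed(1)} roll ${roll.toFixed(1)}`
  );
});
sock.bind(4242); // Opentrack's default UDP port
```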
Seems to work nicely with my DIY headtracker on Linux PC too. Sadly I got quite a few frame-drops due to recording (and probably the multi-head setup too). It works way better without all the cameras and a live-stream going on, but I think it’s enough to get a good impression. Botched emergency landing included xD
Warning: This may fuel a desire to re-watch the BSG series 😀
I like space and science fiction. Diving into epic stories set in some distant future has amazed me since elementary school.
I’m also a gamer. And a tinkerer. It’s in the family.
I keep wondering: How can I improve the immersion of my games without going full VR?
I used a triple screen set-up before. It consisted of different models of varying height and size. When one screen finally broke down I purchased 3 refurbished screens of the same brand and model. What a difference!
The kids love it too. Of course. Means less stick time for me. Anyway.
The thing is: “The” reference product for a headtracker is the TrackIR system. Price as of today: 220 EUR. Ouch! That’s like a cheap VR, right? And it’s Windows only. No thanks.
So I checked what’s in this thing. Apparently a cheap camera, some infra-red LEDs, and a filter that lets only infra-red light pass. And software, of course.
Since this is for Linux I get to pick my poison for the software part, and I quickly settled on Opentrack. Onwards to the hardware part. I abused my mobile phone for testing, sending its gyroscope data via wifi to my PC, and while it worked it also _sucked_. Both the phone and the wifi, I mean.
Head tracking is awesome. And I knew I wanted it. So I started prototyping. For this I went with a simple design that I eventually implemented on cardboard. It looks hilarious but it gets the job done.
The focus was on long battery life so I wouldn’t have to replace the rechargeables in the middle of a session. To get this right I checked against the camera I was going to use. As you can see in the video above, the LEDs start out way too bright; by trying various resistors I could get this down to 33 mA per LED and still get a decent detection rate with Opentrack.
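(For the curious, that resistor value falls out of Ohm’s law. Assuming two NiMH cells at roughly 2.4 V, an IR LED forward voltage of around 1.3 V and one resistor per LED: R = (2.4 V − 1.3 V) / 0.033 A ≈ 33 Ω. Your LEDs and cells may differ, so measure rather than trust my numbers.)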
Speaking about the camera. That’s nothing special. It’s a dead cheap 480p Logitech QuickCam Communicate STX that I got from a discounter a decade ago. It was so cheap it doesn’t even _have_ an infra-red filter that I’d have to remove first.
I used tape to attach the salvaged camera cover of a dead G20 controller. That’s a Wii Remote knock-off that does basically the same thing as a headtracker. Various other foils can be used for this as well, as long as they let infra-red pass. The idea is to reduce or remove all light but infra-red.
The trick is to also turn off auto exposure and fiddle with the contrast and sharpness until you get a decent frame rate and a clear infra-red signal from the LEDs.
When I was satisfied with my meter readouts, and my highly professional scribbles, I started working on the prototype while streaming the whole process on the Discord channel of the awesome Fly Dangerous project. If you like racing with a space ship, give it a shot.
The prototype is made of cardboard, which doubles as electrical insulation. The rest is tape and hook-and-loop fastener to attach the headtracker to my headphones. No magic here. The whole contraption is powered by two 1.2V rechargeables. I opted for a micro switch and an additional LED as a power indicator, which I dimmed down even more. After all, I can’t see infra-red, so this seemed like a good idea to me. Spoiler: it was.
So how does it play? Over the next weeks I tried basically every game supporting head tracking that I could get my hands on. Please keep in mind that I usually play with the lights off but turned on the studio lights for demo purposes. The tracker still works just fine.
I quickly found out that each game needs its own profile for fine-tuned settings. Good thing that Opentrack has me covered on this. First, my beloved X4 using Wine and the TrackIR protocol.
Sadly I came to the conclusion that my GPU is no longer up for the task and Wine would cost me too many frames. I switched Opentrack to emulate a joystick instead and mapped it to camera movements in the native X4 version. It’s not exactly the same but it’s okay-ish. I have an idea how to hack this properly into X4 using an extension and a UDP server but that’s a topic for another day.
Anyway, the same principle works with X Rebirth too, making me even happier. While dated, it still has its charm and the verse still feels a lot more alive compared to X4. It’s also not taxing my GPU that much.
Now for something different. When Opentrack listed a “protocol” named FlightGear I became very curious. I installed this free and open source flight simulator and crashed my first Cessna into the ground minutes later. By now I’m confident that I can crash a Cessna just about anywhere. I’m not fond of flying in real life but avionics sure are a fascinating topic.
This was the moment a Steam sale happened and I bagged various flight sims, Kerbal Space Program and House Of The Dying Sun. All with TrackIR support.
Little did I know what a gem I had bagged with House Of The Dying Sun, by the way. Sadly it’s also very short, but I enjoyed every minute of it and will probably play it again. The art, sound and music remind me a lot of Battlestar Galactica. Easy win 😀
So yeah, this is my current gaming set-up. I built myself a head tracker for 5 EUR. On Linux PC.
I also may have fallen into the rabbit hole called “simpit”.