A lot has happened since my last update on the simpit – under its hood. Function-wise not much changed, so the older demonstration video is still better for a quick demo. Still, I assembled a new video from clips of the first evening with the new hardware:

Quick trip from Armstrong Orbital over to the huge crater on HIP 117029-4 and back

So what changed? I got rid of the CY-822A USB joystick controller which, while good, was also limiting – especially in the number of inputs and how they would react. The Raspberry Pi that I used to drive the status indicators is also gone. All of this is replaced by a single Arduino Mega connected via serial over USB.

A custom joystick daemon written in Rust listens for data from the Arduino and feeds the flags of Elite Dangerous back to drive the blinkenlights. I also extended the source to add some rotary encoders (with push-button function) and I’m very happy with the result. That makes a whopping 48 buttons and 6 axes (where 2 axes make one hat). And it feels _so good_ to have e.g. self destruct or eject cargo safe under a protective cover now 😀
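For context: Elite Dangerous exposes those flags as a bitmask in the Status.json file it writes next to its journal, and decoding them is all the LED logic really does. The daemon itself is Rust, but here is the idea as a small TypeScript sketch – the bit values are the ones documented in Frontier’s Journal manual (double-check them there), the rest is illustrative:

import { readFileSync } from "fs";

// A few of the flag bits from Frontier's Journal manual
const FLAGS: Record<string, number> = {
  Docked: 1 << 0,
  LandingGearDown: 1 << 2,
  ShieldsUp: 1 << 3,
  Supercruise: 1 << 4,
  HardpointsDeployed: 1 << 6,
};

// Status.json lives in the game's journal directory (inside the Proton prefix on Linux)
const status = JSON.parse(readFileSync("Status.json", "utf8"));

for (const [name, bit] of Object.entries(FLAGS)) {
  // the real daemon pushes this state over serial to the Arduino instead of logging it
  console.log(name, (status.Flags & bit) !== 0 ? "ON" : "off");
}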

The panel also got an external PSU with enough amperage to drive as many LEDs as I could ever imagine, so I no longer abuse a phone charger for that or risk frying the PCB / USB port.

With all that in place I streamlined my pre-flight checklist quite a lot, because way less hardware is involved and most of it is automated by now. It wasn’t all fun ’n’ giggles tho, and while the new hard- and software “just worked” elsewhere, it was Elite that gave me a hard time actually using most of the new buttons.

Getting all the precious buttons into Elite as well (okay, limited to 32 thanks to an old DirectInput library, but who is counting at this point – I’ll simply map the rest to keyboard macros instead)

Turns out Elite had no idea about the device and model identifiers reported by the joystick daemon, and that the kernel assumed a gamepad – because the daemon declared e.g. ButtonNorth via the more recent xinput system – really didn’t help, since that severely limited the number of buttons readable via xinput! In the end I set its identifier to a “vJoy” device. I found that one in Elite’s DeviceMappings.xml, and since this could be basically anything I gave it a try (and removed all “offending” magic gamepad buttons from the code). Sure enough, Elite started accepting the inputs as expected, and from there it was smooth sailing – I even got the hat working.

Oh, and for everyone wondering what exactly they are seeing on the “MFD” when I’m playing Elite: that’s basically a website using the ARWES FUI framework. Find a quick demo video here. Without the cardboard covering up parts of the screen it looks basically like this:

I also started doodling a version 2 – now that I have an idea of what I really want.

Plans for another simpit based on a Valkyrie Cockpit

The last update has been a while. I focused my attention on the MFDs (multi-function displays). This part hadn’t gotten much attention yet, and I was caught between the difficult choice of learning yet another fancy framework, like Raylib, that would do OpenGL ES 2.0 without X11 on the Raspberry Pi – or just throwing the might of my Coffee Lake at it and going with ReactJS, since most of the data was already available via Node-RED anyway. Also… ARWES is just so cool 🤩

I went with ReactJS and ARWES again, simply because I have some experience with this by now thanks to the streaming overlay I wrote with it. Hooking it up to Node-RED was just a matter of installing Socket.IO to transport the messages. It’s all a very hacky mess but it gets the job done.
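For illustration, the React side of that plumbing can be as small as one hook. A minimal sketch with socket.io-client – the event name “status”, the port and the payload are placeholders for whatever your Node-RED flow actually emits:

import { useEffect, useState } from "react";
import { io } from "socket.io-client";

// Subscribe to whatever the Node-RED flow pushes over Socket.IO
export function useGameStatus(url = "http://localhost:1880") {
  const [status, setStatus] = useState<unknown>(null);

  useEffect(() => {
    const socket = io(url);
    socket.on("status", setStatus); // event name depends on your flow
    return () => { socket.disconnect(); };
  }, [url]);

  return status;
}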

Video demonstration of my simulated cockpit made from cardboard on a budget mainly used to play Elite Dangerous in early 2022. This is work in progress.

While sifting through the available data I noticed that I don’t get velocity values from Elite. That’s not so important in space, but _kinda_ interesting for me in planetary flight, to satisfy the flight sim gamer in me as well. I noticed tho that I do get timestamped latitude, longitude and altitude values – so it should be possible to “simply” calculate this, right? Right?

This was when I dived into the rabbit hole of calculating velocity and heading on planetary objects using a spherical coordinate system, and while I didn’t nail exactly how Elite does it, the result is close enough. The game provides the required data to go crazy here – most importantly the radius of the current object. In _theory_ I could start writing some primitive AFS (Auto Flight System) routines now, which I’m totally going to explore at some point in the future, just because 🤓

Checking my maths – yes, altitude is added to the mix, so velocity is mostly correct as long as no rapid course changes are made
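For anyone wanting to replicate this: with two timestamped latitude / longitude / altitude samples and the body’s radius, speed is the great-circle (haversine) distance at flight altitude combined with the altitude change via Pythagoras, and heading is the forward azimuth. A condensed TypeScript sketch of the idea – the names and the Sample shape are mine, not Elite’s:

// One sample from the game: time in seconds, angles in degrees, altitude/radius in metres
interface Sample { t: number; lat: number; lon: number; alt: number; }

const rad = (deg: number) => (deg * Math.PI) / 180;

function speedAndHeading(a: Sample, b: Sample, radius: number) {
  const p1 = rad(a.lat), p2 = rad(b.lat);
  const dp = rad(b.lat - a.lat), dl = rad(b.lon - a.lon);

  // Haversine: great-circle distance across the (assumed spherical) surface
  const h = Math.sin(dp / 2) ** 2 + Math.cos(p1) * Math.cos(p2) * Math.sin(dl / 2) ** 2;
  const ground = 2 * (radius + a.alt) * Math.asin(Math.sqrt(h));

  // Pythagoras: add the climb/descent component
  const dist = Math.hypot(ground, b.alt - a.alt);

  // Forward azimuth = initial bearing from a to b
  const y = Math.sin(dl) * Math.cos(p2);
  const x = Math.cos(p1) * Math.sin(p2) - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
  const heading = ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;

  return { speed: dist / (b.t - a.t), heading }; // m/s and degrees
}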

After spending way too much time with this and the Pythagorean theorem (yes Mum, a game made me do maths. MATHS! 🤯) I settled on some calculations, with data for my current ship on the right and targeted-ship data on the left. This is sort of tricky because many game events update different parts of the data, so timestamps have to be kept in mind and a game-specific parsing strategy is required. See the last part of the demonstration video to get an idea how this looks.
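The merging itself is unspectacular – the trick is only ever letting newer data overwrite older data, per field. Roughly, as a sketch (the event shape is simplified; real journal events are richer):

type GameEvent = { timestamp: string } & Record<string, unknown>;

// Per-field state with the timestamp it was last seen at
const state: Record<string, { value: unknown; seen: number }> = {};

// Merge one event: newest timestamp wins for every field it carries
function merge(event: GameEvent) {
  const seen = Date.parse(event.timestamp);
  for (const [key, value] of Object.entries(event)) {
    if (key === "timestamp") continue;
    if (!state[key] || state[key].seen <= seen) {
      state[key] = { value, seen };
    }
  }
}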

Improving situational awareness by putting the video feed of my wingman / gunner on the central MFD.

Another point to tick off my list was getting head tracking to work in Elite (again). Now, this is very Linux PC specific, so you may tune out for this paragraph. On Linux PC I’d usually compile Opentrack with the Wine glue, patch it into my appdata dir for Proton and hope that it’s still ABI compatible enough to Just Work™. Alas, recent Proton is sandboxed within pressure-vessel, and the usual approach of memory mapping simply no longer works, if I got the gist of this right.

So my _current_ strategy is to download the Windows build of Opentrack, drop it into the game folder and chain-load the EXE with the game, where the Opentrack EXE listens on UDP while my native Opentrack binary sends via UDP. A task not made easy by Proton, but it is possible. The following snippet may give you some pointers:

#!/bin/bash
# Point Proton at Elite's compat prefix (Steam appid 359320) and the Steam install
export STEAM_COMPAT_DATA_PATH=/games/steam/steamapps/compatdata/359320
export STEAM_COMPAT_CLIENT_INSTALL_PATH="$HOME/.steam/steam"
# The proton script is Python; "run" starts an arbitrary EXE inside that prefix
python3 "/games/steam/steamapps/common/Proton - Experimental/proton" run opentrack.exe

Why run Opentrack twice? The native build performs a lot better with my webcam, and every frame really counts here. Reading data via UDP is not much of a burden for Proton. This also saves me the trouble of fiddling with the Wine glue – a painful compile process nobody should endure, involving the installation of many, many additional 32-bit libraries. Hilarious, but it works.

Did some programming on my “MFDs” last night. They start coming to life with proper game data from Elite 😁 All duct tape and JS plumbing. Sorry for the shaky cam. Couldn’t be arsed with the tripod at 1:30 am.

Short demo video of the panels loading up

Here is a close-up picture without all the shaking:

The animations are made possible with ARWES.dev – a library designed to create futuristic user interfaces (FUIs) fast.

Here is another awesome example using ARWES: https://www.myxouz.com/2021/12/lady-of-shalott-first-version-of-our-home-dashboard/ 👍

FUI is short for Futuristic User Interface. ARWES.dev is a JS framework to create such a FUI. Unlike the ones animated for cinematic purposes, it can actually be used. I do this for my streaming overlay and I’m going to use it for my simpit as well.

I love everything about this project, right up to its name “Lady of Shalott”. The idea of doing some sort of extended home automation to query various daily streams is one of the reasons I dig Microsub and ActivityPub so much, and doing something similar for a dashboard has crossed my mind too. This is a topic I’m not doing much on at the moment. I mean, I have very good ideas of what I want here, but I’ve already got enough projects going, so this is on hold.

Anyway, make sure to take a closer look at the project. Myxouz offers some (unlisted) YouTube videos as well, so you can get an idea of the dashboard in action, and also describes the background techniques in use.

If you’re a developer consider going wild with ARWES as well 🙃

Video demo of using React and Arwes for a stream overlay

Can’t get enough of Diaspora: Shattered Armistice (on Linux PC). Of course with my DIY head tracker 😃 And while we’re at it, we demonstrate https://arwes.dev/ – because nobody said that a sci-fi interface made with React can’t be used as a stream overlay too 👍

This is a Work In Progress and a DEMO.

Made to learn some React, get some freakin cool UI _and_ save me some bucks on a stream overlay 😀