An action webcam strapped with bow ribbons to my XR glasses while I grin madly into the smartphone cam. A bunch of wires is also strapped to the glasses.
Video: How to get 6DOF with older 3DOF XR glasses using Breezy and OpenTrack

Breezy can now turn a 3DOF (three degrees of freedom) device into a 6DOF device by adding the missing positional data from a webcam. Spoiler! It is not the cam strapped to my face – that one is just for the demo, which you can watch here, on PeerTube or YouTube.

The cam I used for this task is sitting on my monitor. How does this work? Well, not with magic! It requires a somewhat decent webcam – really, anything from the last decade should suffice – and OpenTrack, of course.

OpenTrack is a head-tracking application with multiple tracker plugins. One of its plugins is the Neuralnet Tracker, an AI-powered extension that comes with a bunch of different head pose models to choose from. With a webcam connected, it runs the detection model locally with very low latency – usually blazing fast on most systems!

This alone is already 6DOF and is used a lot for gaming – so what does Breezy do with this? Simple! It reads the forwarded data via a UDP listener – a very quick way to transmit data on a local network or system – and complements its own rotational data with the missing positional data.

With this, a Breezy user still gets the rotational data from the XR glasses’ very sensitive IMU – that is short for Inertial Measurement Unit, btw – and the less critical positional data sent from OpenTrack.

Of course, this only works while the webcam can still see the user. So sadly, no walking around while using this.

And the best thing? It can also send the data back! This means that the very same combined values can be forwarded – e.g. to a computer game – benefiting from the best available data sources for rotation and position.

That’s not the main use case, of course, and only of importance for some nerds like myself. This is mostly relevant for the productivity features of Breezy, because sometimes a text may be too small to read with the glasses on. We no longer have to increase the font size – we can simply lean in! That is a feature usually only available with glasses that come with little cameras of their own, so they can have native 6DOF support. And when I say native, I mean that such glasses usually also outsource exactly this calculation to the connected computer. It’s my understanding that this requires a lot of computation power, which is something many XR users with more modern devices complain about.

Well, not so much with OpenTrack and the Neuralnet tracker, which utilizes the ONNX Runtime under the hood. That’s a high-performance, cross-platform engine to run exactly such models locally. The runtime automatically makes use of the best available hardware acceleration, if there is any.

Overall I’m rather hyped about this feature – especially because I’ve been using the OpenTrack output option of Breezy for quite some time now to get a VR-like experience with stereoscopic 3D rendering in side-by-side mode. I can now keep using my older XR glasses and still enjoy this more modern 6DOF feature. XR glasses are rather expensive hardware, after all.

And all that on Linux PC!

Breezy xr_driver: https://github.com/wheaney/breezy-desktop by https://www.youtube.com/@WayneHeaney

Official Announcement XR desktop with 6DoF + multiple displays: https://www.youtube.com/watch?v=eFLmjpjF-rA

Music “Life’s Worth Dying For” CC BY-SA 3.0 “LostDrone”. Licensed to the public under https://creativecommons.org/licenses/by-sa/3.0/ Verify at https://soundcloud.com/lostdrone/rock-lostdrone-lifes-worth-dying-for-free-download-and-creative-commons-license

Finally replaced the old display with a new touch display in my VF-1 inspired home cockpit panel.

The old display was salvaged from a laptop years ago, and while it worked fine, it also had a very bad viewing angle. I also got really tired of its glaring reflections, so I experimented with an anti-glare foil. This reduced the reflections a lot (worth every cent) but couldn’t help with the bad viewing angle, of course. By now I had an idea how this could look though, so I decided to buy into a replacement kit.

The new display is the N173HCE-E31, a 17.3" panel with a resolution of 1920x1080. The touch controller registers as a USB HID pointer/mouse by ILITEK and basically sits on top of the display. The kit included a PCB that was advertised as a VS-RTD2556HC-V2 controller by VSDISPLAY but came without any data sheet, and I have no idea who really made it.

Thing is, this PCB runs very hot and the input voltage isn’t explicitly stated anywhere. An attached image suggested using a USB PD power supply without 20V, so I went looking for its datasheet to check whether I was just holding it wrong. Picture me surprised: VSDISPLAY does not list this particular configuration in its datasheets. I contacted them via mail and they confirmed that this one is not theirs. Theirs is apparently also strictly 5V/12V, so that matches the picture I get.

Mine is equipped with the IC RTD2556VD, which does not match the list of supported ICs. Theirs has 2556TE_R20.1 printed on the PCB; mine has 2555TF_R30.1. It’s like 99% similar but routed differently. It also mentions E470791 JPX-D, which seems to point to the PCB manufacturer Dongguan Jingweixin Circuit Co Ltd, but that is where my Google-fu left me. I also found the very same pictures in other offers, each stating a completely different controller model 🤷

Anyway. I tried different configurations, and while it works with 5V at ~2A, I feel way more comfortable with 12V at ~0.8A at full brightness + blue colour. I also attached a passive cooling block I had lying around and slapped a fan on top. Now it’s only “comfortably” warm to the touch after running for an hour.

Sadly I do not have any device with DP Alt Mode providing more than 5V, and the PCB will always switch down to 5V the moment the USB-C port dedicated to the display signal is used as well, even when a proper USB PD power supply is attached to its dedicated power connector. I could only keep it at 12V with my VITURE USB-C XR charging adapter, which can indeed provide 12V and more via USB-C while still allowing DP Alt Mode + USB2. There went my plans to have only a single cable for everything – DP, PD and the USB2 lanes for the ILITEK pointer – because I really do not want to block this adapter all the time.

So now I have a dedicated USB PD power supply at 12V connected, an HDMI connection for the display and an additional USB2 cable for the touch panel pointer – and on top of that the little fan, which I simply connected to the micro USB2 socket on the PCB to provide it with 5V.

This also means that my Linux PC cannot know that touch panel pointer and display belong together. As a result, all touch panel inputs were all over the place and not limited to a single display. Apparently KDE has an option in its graphical settings where this can be easily configured. GNOME does not [yet?] have such an option in its graphical settings. There is, however, a way to enforce the mapping of the touch panel in GNOME too! And while the real manufacturer of the new display’s controller is still a mystery to me, I found the following snippet in my monitor configuration $HOME/.config/monitors.xml after plugging the controller in:

<monitorspec>
    <connector>HDMI-2</connector>
    <vendor>RTK</vendor>
    <product>0x2555</product>
    <serial>0x20230705</serial>
</monitorspec>

The touch panel is, according to lsusb, connected as ID 222a:0001 ILI Technology Corp. Multi-Touch Screen. Armed with that knowledge I can limit its input with gsettings to this specific display:

gsettings set org.gnome.desktop.peripherals.touchscreen:/org/gnome/desktop/peripherals/touchscreens/222a:0001/ output "['RTK', '0x2555', '0x20230705']"

Works like a charm, but what a mess. I still wish I had a data sheet for this, so if you know more, kindly drop me a comment!

The last thing to fix was the already mentioned reflective glare. For this I went with a screen protector by BROTECT (that name still makes me laugh), which promises, besides scratch protection, an anti-glare effect without limiting the viewing angles (some foils do this to enhance privacy).

Attaching the foil was straightforward. The trick is to make sure that not a single dust particle is around during the process. To help with this I used an air humidifier to raise the humidity in the room before I even started. After that I removed the protective cover from the display and started applying the foil with the provided mounting card (yay, cardboard again). This was the very moment one of my curious cats decided to investigate my actions and jumped onto the table, almost giving me a heart attack. The last thing I needed was cat hair all over the place, and indeed, after a lot of hissing, I had to make good use of the also-provided adhesive sticker to catch all dust particles in every last corner. Cats!

The end result is like night and day. I no longer see any light sources or myself clearly reflected on the display. The touch panel still accepts inputs just fine and the colours look very bright from any angle, especially with HDR enabled. This will also ease cleaning, because the cockpit panel collects dust like crazy due to the gradient of the panel. I usually use a vacuum cleaner for this and the foil will help a lot to avoid scratches.

Replacing the old display was also a task of its own. The old screws didn’t fit, of course, so I kinda had to build little adapters from leftover angle and wood pieces. Very ugly but good enough – this is just a toy after all 🤓

Ah yes and now that I have a touch panel I also have to rewrite my HUD app, of course 🙃

Visited the Black Forest open-air museum Vogtsbauernhof in 77793 Gutach, Germany. This is a huge area with several very old farm buildings that have been carefully de- and reconstructed on this site. There are often also tours, demonstrations or hands-on activities showing how people used to live in the Black Forest area, but as a half-timber nutter I’m mostly interested in the buildings. These are from various periods, starting as early as 1407 (though that one is an often-refurbished exception).

Shooting any sort of picture with my smartphone camera was very hard because the insides of the buildings are unbelievably dark and hardly lit. I guess that drives home a point in itself. Anyway, I did my best to improve the photos somewhat with Darktable (not that I have any idea what I’m doing there). Also, the galleries won’t syndicate, so you’ll have to check the source for the pictures.

Here are some of the pictures that are very dark in reality.

We also looked at plenty of farm equipment, of course. In fact one of the farms is still operated and has livestock around.

One of the most interesting things I found was a “mini house” that is basically a fridge for milk. It utilizes a water stream to keep the milk inside cold.

Some of these buildings simply look gorgeous from the outside, but my kids were pretty certain that they’d not actually want to live in such a building. Besides the darkness inside, one could always feel how air makes its way in. There’s always a slight breeze, which is probably nice in the summertime but not so much in the wintertime – especially with only a few places around to heat the buildings.

One of their major show pieces is probably the Schlössle von Effringen, which is basically a mini castle that has been remade again and again, dating back to roughly the 11th century.

This is also where I shot most of my pictures. Sadly we were running out of time, so I’ll have to revisit to look at some things a little closer. And maybe leave the kids at home too. They don’t really have the required patience.

I also rather enjoyed the various kitchens, which were almost all “smoke kitchens” – meaning the smoke from the kitchen fire was used to preserve food. This is a very medieval thing to do. Sadly, the pictures I took don’t do them justice, since they are basically black holes that cameras struggle hard with.

One of the things I enjoyed most were the models of various buildings. I took many photos of these for inspiration. Maybe I’ll get around to recreating such buildings in the game Rising World one day.

And last some unsorted photos in portrait mode (ugh, it happens, mkay?)

The website of the museum is https://www.vogtsbauernhof.de/en, with plenty more pictures and a 360° tour. Can recommend. Alas, bring food along – the restaurants next to the site are rather expensive. The museum is worth every cent though, IMHO.

In October we booked tickets for a ride with this historic steam train. Sadly that didn’t work out as expected. Something broke for the signal operator, and the train was not permitted to enter the station and later could not leave it. We basically sat in some waggon for hours until it was announced that they wouldn’t make the round trip that day. One way only. That would have left us stranded, so we finally left the train. We did get a full refund for the tickets though, so we will try this another time.

Video: Historic steam train “Fiery Elias” leaving the station Korntal (Stuttgart, Germany) in 2025

The locomotive, designated Lok 50 2273, has apparently been in service since 1942, with various reconstructions along the… uh… rails? Its history can be reviewed at https://eisenbahn-museumsfahrzeuge.de/index.php/deutschland/staatsbahnfahrzeuge/dampflokomotiven/baureihe-50/50-2273

Historic steam train “Feuriger Elias”: https://www.ges-ev.de/museumsverkehr/kw/kw.htm

So I dunno if you know what a vacuum fluorescent display (https://en.wikipedia.org/wiki/Vacuum_fluorescent_display) is, but I’m a sucker for these – at least virtually.

Games like perfected the look and this is where I want to go with my HUD app for my / home cockpit too.

Screenshot from the game Rebel Galaxy Outlaw with its very colourful cockpits full of VFD-like displays.

The segment displays are heavily inspired by the project augmented-ui (https://augmented-ui.com/), from which I’ll borrow some more elements. I learned the neat fake scan lines from there too. And yes, the 8-segment display works by shifting bits under the hood 🤓 This isn’t really needed for an app, but I have plans to add some real segment displays eventually (I do have a whole box full of these!), so I wanted to know how to implement this anyway.
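For illustration, this is roughly how a segment display driven by bit shifting works: each digit is stored as a bitmask with one bit per segment, and the renderer shifts through the bits to decide which segments light up. A simplified 7-segment sketch (encodings and names are mine, not the app’s actual code):

```javascript
// One bit per segment, order a..g (bit 0 = segment a).
const DIGITS = [
  0b0111111, // 0: a b c d e f
  0b0000110, // 1: b c
  0b1011011, // 2: a b d e g
  0b1001111, // 3: a b c d g
  0b1100110, // 4: b c f g
  0b1101101, // 5: a c d f g
  0b1111101, // 6: a c d e f g
  0b0000111, // 7: a b c
  0b1111111, // 8: all segments
  0b1101111, // 9: a b c d f g
];

// Shift through the mask to get an on/off flag per segment.
function segmentsFor(digit) {
  const mask = DIGITS[digit];
  const on = [];
  for (let bit = 0; bit < 7; bit++) {
    on.push(((mask >> bit) & 1) === 1);
  }
  return on; // [a, b, c, d, e, f, g]
}
```

The same masks could later drive real hardware, e.g. shifted out to a shift register one bit at a time.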

Video from an earlier stage of development demoing the scan line effect.

The bars are configurable via parameters for size, count, percentage, colours and thresholds 😁 I also added a 5% random chance to shift the hue a little bit because, just as in real life, nothing is perfect.
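The hue shift boils down to a biased coin flip per update; something along these lines (the ±10° range and the injectable rng are my own choices, not the actual implementation):

```javascript
// With a 5% chance, nudge the hue by up to ±10 degrees.
// rng is injectable so the behaviour can be tested deterministically.
function jitterHue(hue, rng = Math.random) {
  if (rng() >= 0.05) return hue; // 95%: leave the hue untouched
  const shift = (rng() * 2 - 1) * 10; // map 0..1 to -10..+10 degrees
  return (hue + shift + 360) % 360; // wrap around the colour wheel
}
```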

A colourful button box surrounding a display that shows various data from the game running in the background, stretched over several of its own displays for immersion.

And yes they are fully themed so switching the colour theme also affects the virtual VFDs.

I’m also going to replace the older horizontal bars, which look way too boring in comparison.

It’s still very early but I hope to get some rad animations going too. See https://www.hudsandguis.com/home/2022/retro-digital-dashboards to get an idea in which direction this is going 🤓

See the dedicated project page https://SimPit.dev for more details on this inspired panel.

Played (closed) Alpha with my inspired . I’m simply in awe that I can replay missions from (or ) with more modern graphics and modern interface devices again. I spent _so many_ hours playing these games as a kid.

This is the heavily cut VOD of the live stream over at @bekopharm@live.famkos.net (pick your poison):

https://makertube.net/w/r1LRrqDWnhw4wRk92uNfzo /
https://www.youtube.com/watch?v=9T2jxqT_5sU

This time I play with the native Linux version and my X52 Pro joystick (which means I actually have a chance of hitting stuff too). The following missions were played:

Historical Mission 2 / Wingmen Are Important
Historical Mission 3 / Satellites Near Coruscant
Historical Mission 4 / Beating The Odds
OP 1: Destroy Imperial Convoy (Uncut)
OP 2: Reconnaissance Mission (Uncut)
OP 3: Fly Point During Evacuation (Uncut)
OP 4: Protect Medical Frigate (Uncut)

XWVM is not an official product from Lucasfilm Ltd. or Disney. It is not endorsed or authorized by either. It is a fan recreation of the game engine used to play X-Wing and TIE Fighter for the sake of accessibility and requires the original game assets to work.

Kudos to the XWVM team, they are doing a stellar job here.

The dedicated project website for the Macross inspired SimPit is https://simpit.dev

I gave in and changed my event forwarding method in node-red for the Elite Dangerous Journal. This file is updated on various in-game events, but in a way that makes it difficult to get only the new events since the last update. Another problem is that it’s not really a valid JSON file: it has one JSON object per line but is not a valid JSON array. This is why it has to be parsed line by line and merged by event type (name) again to get the latest data for each event type per dump. Each event has its own timestamp set by the game. The latest timestamp is now saved in the flow context, so node-red keeps the value in the “global” memory of the current flow:

msg.payload.event = "Journal";

// flow context keeps state between invocations of this function node
const lastJournalTimestamp = flow.get("lastJournalTimestamp");
let newJournalTimestamp = lastJournalTimestamp;

Object.keys(msg.payload).forEach((key) => {
  if (msg.payload[key].timestamp) {
    const keyTimestamp = new Date(msg.payload[key].timestamp).getTime();

    if (!lastJournalTimestamp || lastJournalTimestamp < keyTimestamp) {
      // this entry is new - keep it. MULTIPLE events may have the
      //  same timestamp so wait with reassigning so we don't skip
      //  em or get the latest a 2nd time if nothing else changes.

      // update the next latest timestamp if this is newer
      if (!newJournalTimestamp || newJournalTimestamp < keyTimestamp) {
        newJournalTimestamp = keyTimestamp;
      }
    } else {
      // lastJournalTimestamp is newer, skip this
      msg.payload[key] = null;
    }
  }
});

// make sure a valid timestamp is stored for the next run
flow.set("lastJournalTimestamp", newJournalTimestamp || new Date().getTime());

// remove all nulled events from the payload
msg.payload = Object.fromEntries(
  Object.entries(msg.payload).filter(([_, p]) => p !== null)
);

msg.payload.timestamp = new Date(flow.get("lastJournalTimestamp"));

return { payload: msg.payload };

So I now keep track of the last read timestamp and reject every event that is older, keeping the Journal dump smaller. This way I don’t have to keep track of the “latest” event to drag data from. Refuelling, for example, can happen from a whopping 4 (or more) different events, and it’s painful to compare them all to check which one is the latest just to keep track of the real current fuel levels for each tank.

The downside is that I no longer get a full set of data for the current session if I have to reload my HUD app. This could be mitigated by using MQTT though, where I could simply persist each event topic. That is already implemented, and I can choose between SocketIO and MQTT in my app anyway.

This uses my X4-SimPit extension for X4: Foundations, which sends ship telemetry via a socket to my node-red plumbing pipeline, which in turn forwards the data to WebSockets, SocketIO and MQTT. Various subscribers listen for new messages to run blinkenlights and my HUD app. I’m using the well-known message format also used by Elite Dangerous, so it’s compatible with that game as well.

Pick your poison: https://makertube.net/w/nUoG2ZPeAW1QhT3A2BXRrM / https://www.youtube.com/watch?v=wp1PkVhH9cc

Oh yeah… and on Linux PC 🤓

Let me know what you think!

X4-SimPit code (pending changes) is here: https://github.com/bekopharm/x4-simpit
The cockpit panel has a dedicated project page here: https://simpit.dev/

Played (closed) Alpha with my ViperPit and with glasses. I’m simply in awe that I can replay missions from (or ) with more modern graphics and modern interface devices again. I spent _so many_ hours playing this as a kid.

This is the heavily cut VOD of the live stream over at https://live.famkos.net (pick your poison):

https://makertube.net/w/hW6cJeqBY42YoryJL1gRg5 /
https://www.youtube.com/watch?v=8at4P5rf-gE

I go over the input settings and show its capabilities to connect various joystick devices, demo the Proving Grounds and showcase missions 1+2. In the end I go over various settings for the XWVM engine and show how the machine hardly sweats displaying the gorgeous cockpit.

XWVM is not an official product from Lucasfilm Ltd. or Disney. It is not endorsed or authorized by either. It is a fan recreation of the game engine used to play X-Wing and TIE Fighter for the sake of accessibility and requires the original game assets to work.

The game was played with Pro XR running in Side-By-Side mode thanks to ReShade on a Linux PC.

Kudos to the XWVM team, they are doing a stellar job here.

Until now I used OpenTrack with my DIY IR tracker or the Neuralnet tracker. I knew that my XR glasses provide IMU data though, and the xr_driver of the Breezy Desktop project allows accessing this data via IPC on a Linux PC. So I did what Linux users do: I wrote a script to access the IMU data and forward it via UDP to OpenTrack:

Pick your poison to watch the video: https://www.youtube.com/watch?v=njuumLUvqrM / https://makertube.net/w/2bNyxJhdyydTeFq17onikv

This reminded me that I also wrote a proof of concept to implement the FaceTrackNoIR (or OpenTrack) protocol into FreeSpace 2 Open on Linux PC ( https://makertube.net/w/7VtfAjW7EiAUS5aiPwG7if ) so I gave it a spin to test the data bridge. That was smooth sailing!

The mod is Diaspora: Shattered Armistice, still awesome today: http://diaspora.hard-light.net/ (Warning: This may fuel a desire to re-watch the BSG series again 😀).

The bridge code can be found at https://github.com/bekopharm/xr_to_opentrack (pending changes).

It works with the Breezy GNOME xr_driver: https://github.com/wheaney/breezy-desktop (the Vulkan one probably works too, but that’s untested). It should also be compatible with other glasses that have IMU support in Breezy.

Update: hodasemi wrote a Rust connector based on this idea that works without Breezy: https://github.com/hodasemi/xr_to_opentrack_rs – it comes with a systemd service file so it can run in the background. Once installed, the only step left is to fire up OpenTrack 🤘