This is a project I kept postponing for years, but when I eventually got my hands on all the required parts I no longer had an excuse and finally built the first one. It’s a portable music player for children that does not require internet access. It plays selections of pre-installed music or audio books via RFID cards, which can come in all shapes and can even be integrated into toys. There are also 3 to 5 playback controls in the form of huge arcade buttons. Ideal especially for our middle one, who has to stay in a hospital for most of the week.

And while this box is still missing proper decorations and button decals, it’s fully functional and portable. Also hey, kids ain’t stupid – they find the proper button without decals too. Even the baby figured out where to put the RFID cards for the music to change 😉

The legwork for this was done by @xfjx@chaos.social, and the project is described in great detail at https://www.voss.earth/tonuino/ – I did, however, not order the offered PCB and just soldered everything onto a generic maker board to keep the costs down. Just like the arcade buttons, which I had left over from another project, I also have a bunch of such boards lying around. The speaker was salvaged from an old entertainment system that broke down long ago, and the box
 ah well, I guess it speaks for itself. Can’t say I was happy with the drilling, but the box was just perfect for our purpose.

First we built a test setup after salvaging all the needed hardware. The Arduino parts are off the shelf, nothing special here. I had to improvise a little on the wiring due to missing wires. To install the software itself I opted for the older branch that just needs the Arduino IDE. There is a more modern version using PlatformIO, but something about it does not like my VSCode and I never managed to compile it successfully.

I eventually got the idea of how the RFID cards work and how they are trained to the system, and also did some tests, like measuring its maximum power usage. It has a passive speaker, and cranked up to the maximum it would draw 0.09 A at most – at regular volume it sat at a comfortable ~0.06 A. Which is pretty fine. This would run for days with a decent power bank that could be dropped right into the box later if no external PSU is used.
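
A quick back-of-envelope check (assuming a 10,000 mAh power bank and ignoring conversion losses – both my assumptions): 10,000 mAh / 60 mA ≈ 166 hours, so several days of continuous playback easily before a recharge is due.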

Next was preparing the box. Luckily I had just the right drill for the buttons, but making the holes was a pain in the neck. This had to be done very slowly because the hard plastic would easily rip and splinter. In the end I opted for a very massive USB connector, because the micro-USB one I used first already broke on the third use. That one was probably a little too cheap. The replacement is way more sturdy, which is kinda what I want for the children anyway. Everything the box needs to operate, like an old phone charger, a very long USB cable, and the RFID cards, fits inside the box for transport.

So one of the remaining questions was what to put on its internal SD card. Some of their favourite music, of course. What else though? Easy. We have a public audio library at https://www.ardaudiothek.de/ offering a lot of stories and podcasts, even for children. Downloading them one by one manually was cumbersome though. Luckily @1337core@chaos.social had just released the first version of his Audiothek Downloader at https://github.com/Leetcore/audiothek-downloader, so within minutes I had more gigabytes than the SD card could hold. The only issue is that the SD card needs the audio files enumerated, so I did some quick scripting to rename the downloaded files. I also had no use for the downloaded cover images. It’s not beautiful, but it gets the job done:

#!/bin/bash

folder=$1
oldpwd=`pwd`

if [[ -z $folder ]]; then
    echo "Missig paramater id"
    exit
fi

folder="output/${folder}"

if [[ ! -d ${folder} ]]; then
    echo "Missig folder ${folder}"
    exit
fi

cd "$folder"
shopt -s extglob
for filename in +([0-9])_*.*; do
    [ -e "${filename}" ] || continue
    oldfile=${filename}
    # remove including the first underscore to get the index
    index=${filename%%_*}
    index=${index##+(0)}
    index=${index:-0}   # an all-zero index would otherwise end up empty

    # pad the number with zeros
    newfile=`printf %03d ${index}`
    # combine new index with old filename, remove up and including first underscore
    newfile=${newfile}_${filename#*_}
    if [[ ! -f ${newfile} ]]; then
        mv -v "${oldfile}" "${newfile}"   
    fi
done

declare -i n=1
declare -i i=1

for filename in *.mp3; do
    [ -e "$filename" ] || continue

    target_dir=`printf %02d ${i}`
    if [[ ! -d ${target_dir} ]]; then
        mkdir ${target_dir}
    fi
    
    target_file=`printf %03d ${n}`

    if [[ ! -f "${target_dir}/${target_file}" ]]; then
        mv -v "${filename}" "${target_dir}/${target_file}.mp3"
    fi
    n+=1

    if (( n > 255 )); then
        n=1
        i+=1 
    fi

done

cd "$oldpwd"
exit 0 

This goes into e.g. a file named to-tonUINO.sh in the root folder of the Audiothek Downloader, where it can be executed after downloading a category. For example:

python3 audiothek.py --url 'https://www.ardaudiothek.de/sendung/big-baeaeaem-wissen-ohne-filter/96510766/'
./to-tonUINO.sh 96510766
Renamed '24_Warum_mobben_wir_andere.mp3' -> '024_Warum_mobben_wir_andere.mp3'
Renamed '25_Warum_bekommen_MÀdchen_bessere_Noten.mp3' -> '025_Warum_bekommen_MÀdchen_bessere_Noten.mp3'
Renamed '26_Wie_sieht_die_Schule_der_Zukunft_aus.mp3' -> '026_Wie_sieht_die_Schule_der_Zukunft_aus.mp3'
Renamed '024_Warum_mobben_wir_andere.mp3' -> '01/001.mp3'
Renamed '025_Warum_bekommen_MÀdchen_bessere_Noten.mp3' -> '01/002.mp3'
Renamed '026_Wie_sieht_die_Schule_der_Zukunft_aus.mp3' -> '01/003.mp3'

The resulting folder(s) can be renamed, depending on what is already on the SD card, and moved over to the card. It also makes sense to set the RFID card to audiobook mode, so the TonUINO saves the listener’s position and does not start from the beginning again.

Now it’s up to the children to do some decorations. Our oldest wants her version built into a box that looks like a book. Hope we can get that one done soon too.

A lot happened since my last update on the simpit – under its hood. Function-wise not much changed, so the older demonstration video is still better for a quick demo. Still, I assembled a new video from clips of the first evening with the new hardware:

Quick trip from Armstrong Orbital over to the huge crater on HIP 117029-4 and back

So what changed? I got rid of the CY-822A USB joystick controller, which, while good, was also limiting – especially in the number of inputs and how they would react. The Raspberry Pi that I used to drive the status indicators is also gone. All of this is replaced by a single Arduino Mega connected via serial over USB.

A custom joystick daemon written in Rust listens for data from the Arduino and feeds back the status flags of Elite Dangerous to drive the blinkenlights. I also extended the source to add some rotary encoders (with push-button function) and I’m very happy with the result. That makes a whopping 48 buttons and 6 axes (where 2 axes make up one hat). And it feels _so good_ to have e.g. self destruct or eject cargo safe under a protective cover now 😀
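
The feedback side is less magic than it sounds: Elite writes a Status.json with a “Flags” bitfield next to its journal files (under the Proton prefix on Linux). The sketch below is not my Rust daemon, just a minimal Python illustration of the idea – the path and the bit positions are assumptions from memory, so double-check them against the journal documentation of your game version.

import json
import time
from pathlib import Path

# Assumed location (Windows default); under Proton it sits inside the prefix.
STATUS = Path.home() / "Saved Games/Frontier Developments/Elite Dangerous/Status.json"

# Bit positions as I remember them from the journal docs - verify before use.
INDICATORS = {
    "landing_gear": 1 << 2,   # landing gear down
    "hardpoints":   1 << 6,   # hardpoints deployed
    "lights":       1 << 8,   # ship lights on
}

while True:
    try:
        flags = json.loads(STATUS.read_text()).get("Flags", 0)
    except (OSError, ValueError):
        flags = 0  # the game rewrites the file, so it may be missing or empty for a moment
    states = {name: bool(flags & bit) for name, bit in INDICATORS.items()}
    print(states)  # a real daemon would push these states to the Arduino over serial
    time.sleep(1)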

The panel also got an external PSU with enough amperage to drive as many LEDs as I could possibly imagine, so I no longer abuse a phone charger for that or risk frying the PCB or the USB port.

With all that in place I streamlined my pre-flight checklist quite a lot, because way less hardware is involved and most of it is automated by now. It wasn’t all fun and giggles though: while the new hard- and software “just worked”, it was Elite that gave me a hard time actually using most of the new buttons.

Getting all the precious buttons into Elite as well (okay, limited to 32 thanks to an old dinput library, but who is counting at this point – I’ll simply set the rest to keyboard macros instead)

Turns out Elite had no idea about the device and model identifiers reported by the joystick daemon, and the fact that the kernel assumed a gamepad – based on the daemon declaring e.g. ButtonNorth via the more recent xinput system – really didn’t help, because that severely limited the number of buttons readable via xinput! In the end I set its identifier to a “vJoy” device. I found that one in the DeviceMappings.xml of Elite, and since the identifier could be basically anything I gave it a try (and removed all “offending” magic gamepad buttons from the code). Sure enough, Elite started accepting the inputs as expected, and from there it was smooth sailing – I even got the hat working.
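
In uinput terms the fix boils down to what the virtual device announces about itself. Here is a hypothetical Python sketch of just that part (my daemon is written in Rust): only classic joystick buttons, no gamepad buttons like BTN_NORTH, and “vJoy” as the device name so Elite finds a match in its DeviceMappings.xml. The vendor and product IDs below are arbitrary placeholders.

from evdev import UInput, AbsInfo, ecodes as e

# 16 classic joystick buttons (BTN_TRIGGER .. BTN_DEAD) and no BTN_NORTH & friends,
# so the device is not classified as a gamepad. Needs write access to /dev/uinput.
buttons = list(range(e.BTN_TRIGGER, e.BTN_DEAD + 1))
capabilities = {
    e.EV_KEY: buttons,
    e.EV_ABS: [(e.ABS_X, AbsInfo(0, -127, 127, 0, 0, 0)),
               (e.ABS_Y, AbsInfo(0, -127, 127, 0, 0, 0))],
}
ui = UInput(capabilities, name="vJoy", vendor=0x1234, product=0x0001)
ui.write(e.EV_KEY, e.BTN_TRIGGER, 1)  # press button 1 on the virtual stick
ui.syn()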

Oh and for everyone who is wondering what exactly they are seeing on the “MFD” when I’m playing Elite: that’s basically a website using the ARWES FUI framework. Find a quick demo video here. Without the cardboard covering up parts of the screen it looks basically like this:

I also started doodles for a version 2 – now that I have an idea what I really want.

Plans for another based on a Valkyrie Cockpit

You probably heard about this before: an Arduino can be made into an excellent DIY joystick. Most examples use a Leonardo or a Micro for this, for a very good reason: these come with a chip that is recognised as HID (Human Interface Device) hardware on any modern operating system.

This is not the case with a Mega. That one has other perks, but HID it is not. It does show up as a USB device, and a ttyUSB is raised where serial communication with the Arduino can be initiated. I’m also aware that some people flash the built-in programmer of the Mega so it starts operating like the others (which obviously removes the built-in programmer). I’m on a Linux PC though, so I thought it was basically a job of tricking the system into recognising it as a joystick and calling it a day – and OMG was I wrong!

How it’s not done

My train of thought was like this: Linux still supports plenty of old serial joysticks, so how complicated can it be to send some bits an existing driver recognises? Old hardware like this is usually glued to its driver with the tool inputattach from the Linux Console Project. It basically initialises a joystick on some serial connection and hands it off to a fitting kernel driver. This way even non-USB, or better said non-HID, hardware is mapped to a kernel driver, which in return sets up the joystick subsystem and manages the communication with the stick over the serial connection.

Turns out I’m not the first one with that idea, and apparently someone made it work by connecting an old PlayStation controller and a Wii Classic Controller to an Arduino and faking a Stinger device without the use of HID. So kudos to Jarno Lehtinen and his Linux-Arduino-Serial-Joystick repo – you sure did send me down a rabbit hole of horror and amazement. I couldn’t even get inputattach to wait for that magic string to be sent with anything other than 9600 baud and aligned stars! I also had to throw socat into this horrible mix, because the Arduino would insist on rebooting on init, so a timeout was guaranteed! In case you wonder how I did this:

socat -r left.raw -R right.raw pipe:/dev/ttyUSB0 PTY,link=/dev/ttyUSB1,rawer
# and xxd to show me the debug juice
tail -f left.raw | xxd -c4
# and on yet another terminal
inputattach --baud 9600 --stinger /dev/ttyUSB1

This also meant that I had to tear everything down to reprogram the Arduino. Anyway, in the end I could finally get through that init phase, where the Stinger-related code in inputattach waits for the magic key after sending “ E5E5”, and finally load the Stinger kernel driver – communication in both directions confirmed!

    // "\r\n0600520058C272";
    byte byteResponse[] = {0x0D, 0x0A, 0x30, 0x36, 0x30, 0x30, 0x35, 0x32, 0x30, 0x30, 0x35, 0x38, 0x43, 0x32, 0x37, 0x32};
    if (Serial.availableForWrite() >= sizeof(byteResponse))
    {
      Serial.write(byteResponse, sizeof(byteResponse));
    }

At this point I had a pipe to prevent the timeout caused by the resetting Arduino, the _only_ working baud rate (9600) I could figure out with the Mega, and a loaded driver that was recognised as a joystick, sat put and did
 absolutely nothing. Null. Nada. Not a single bit made it to the driver, and I could not figure out why. My guess is that it needs a change of the baud rate to the original 1200 (?) of the Stinger, but I have no idea if this is true. I could also not find out how the stream is controlled, and since the driver always reads two bytes at a time and interprets them, there is a fair chance it was simply one byte off all the time. Speculation though – I simply didn’t grasp the stinger.c source, so this is all just a theory. I do not want to admit how much time I sunk into this, and I was pretty frustrated at this point. Reading some stupid serial? Not like this! Too many hoops!

So I threw it all in the bin 🚼

How it’s probably done

Say hi to /dev/uinput, where you can basically raise virtual devices, like a joystick, without [much?] pain. I’m not the first one, of course, and funny enough the reason behind it is very similar to mine. Read more in Virtual joystick on Linux by Gwilym Kuiper, where this is all explained in great detail. The referenced code at https://github.com/gwilymk/arduino-joystick sure did help me to get started, and even without ever having touched Rust before I was able to quickly adjust it to my needs, double the possible buttons, and get it up and running on my Linux PC in just a few hours. Cheers mate (also Jarno Lehtinen – you taught me a lot that day :D) đŸ•č

So here it is: a Mega acting as a joystick without HID, over a serial connection, driven by a userspace daemon written in Rust (meaning no kernel driver is required) that provides a virtual joystick as a uinput device on the “modern” event system. Heck, it’s even recognised in Wine!
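
To make the idea more tangible, here is a minimal sketch of such a daemon in Python rather than Rust – the wire protocol (one status byte per button), port and baud rate are made up for illustration; the real protocol is whatever you implement on the Arduino side.

import serial
from evdev import UInput, ecodes as e

PORT = "/dev/ttyUSB0"   # the serial device the Mega shows up as
BUTTONS = [e.BTN_TRIGGER, e.BTN_THUMB, e.BTN_THUMB2, e.BTN_TOP]

ui = UInput({e.EV_KEY: BUTTONS}, name="vJoy")
with serial.Serial(PORT, 115200, timeout=1) as link:
    while True:
        frame = link.read(len(BUTTONS))     # one byte per button: 0 or 1
        if len(frame) != len(BUTTONS):
            continue                        # timeout or partial frame, try again
        for code, state in zip(BUTTONS, frame):
            ui.write(e.EV_KEY, code, 1 if state else 0)
        ui.syn()                            # commit the whole batch as one event frame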

What a journey to begin with. Now I need a back-channel for my blinky lights so I get my Raspberry Pi back from simpit duty 🙃

This is a brief description of how to mod a CY-822A USB joystick controller into accepting analogue input. I’ve done this modification with two of my PCBs by now and worked with both for an extended period of time without any problems. To achieve this, two things have to be done – at your own risk!

sdl2-jstest detected 5 axes

The PCB comes with 5 analogue axes according to my Linux PC and sdl2-jstest, and while I’m not sure where the 5th one is located, a tiny modification will allow us to use at least 4 of them.

Locate the central lane and simply scratch off the track with a sharp knife at the 3 indicated positions.

Locate the resistors R1 – R8 on the front; they make up the 4 possible connections for analogue input with the use of potentiometers. There are 2 resistors of ~10k on the PCB that have to go. These 2 resistors hold 4 of the 5 axes perfectly still in the centre, because the middle lane is bridged on the backside. This is where the conductor path has to be interrupted: locate the central lane and simply scratch off the track with a sharp knife. Also clean all the holes of R1 to R8 so you can solder in some new pins for easier access. Use a multimeter to make sure that none of the 4 central soldering points are connected with each other any more. The upper and lower ones stay connected (plus and GND).

The wire bridge at J1 makes the board boot in analogue mode

Next we want to remove the zero-ohm resistor at J1 and add a wire bridge instead. Look for a resistor with a single black ring next to R2, remove it, and solder in the wire bridge next to R1. This is basically a jumper setting, but with a bridge. I’ve no idea why the designer went with a zero-ohm resistor and not with a bridge. My only guess is that this was cheaper for the assembly machines.

Anyway, this will make the board boot in analogue mode, so we do not have to use a mode switch on every power-on. This serves two purposes: the axes are now read from the inputs and actually sent as joystick events over USB, while the former digital joystick connector (5 pins) is now mapped to Up/Down/Left/Right buttons – so no extra buttons are needed here any more (but they can still be added, of course).

Any potentiometer should do – mine are 100k – 200k. YMMV.

Now it’s time to connect potentiometers as analogue inputs. This is pretty straightforward. Just make sure that the central connector goes to the centre pin of each axis. Swap the upper and the lower pin if the direction is not as desired.

Please note that any axis that has no input attached will report _a lot_ of jitter, making your game/app go nuts. That is what the former resistors at R1 and R2 were there for.

The last update has been a while. I focused my attention on the MFDs (multi-function displays). This part hadn’t got much attention yet, and I was caught between the difficult choice of learning yet another fancy framework, like Raylib, that would do OpenGL ES 2.0 without X11 on the Raspberry – or just throwing the might of my Coffee Lake at it and going with ReactJS, since most of the data was already available via NodeRED anyway. Also
 ARWES is just so cool đŸ€©

I went with ReactJS and ARWES again, simply because I have some experience with this by now thanks to my Streaming Overlay that I wrote with it. Hooking it up to NodeRED was just a matter of installing SocketIO to transport the messages. It’s all a very hacky mess, but it gets the job done.

Video demonstration of my simulated cockpit made from cardboard on a budget mainly used to play Elite Dangerous in early 2022. This is work in progress.

While sifting through the available data I noticed that I don’t get velocity values from Elite. That’s not so important in space, but _kinda_ interesting in planetary flight, to satisfy the flight sim gamer in me as well. I noticed though that I do get timestamped latitude, longitude and altitude values, so it should be possible to “simply” calculate this, right? Right?

This was when I dove into the rabbit hole of calculating velocity and heading on planetary objects using a spherical coordinate system, and while I didn’t nail exactly how Elite does it, the result is close enough. The game provides the required data to go crazy here – most importantly the radius of the current body. In _theory_ I could start writing some primitive AFS (Auto Flight System) routines now, which I’m totally going to explore at some point in the future, just because đŸ€“
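
The maths is less scary than it sounds. Roughly what I do, sketched here in Python (not my actual NodeRED flow): take two timestamped samples, compute the great-circle distance on a sphere of the body’s radius, mix in the altitude change, and divide by the elapsed time; the heading is the initial bearing between the two points.

import math

def ground_speed(p1, p2, radius_m):
    # p1, p2: dicts with 'lat'/'lon' in degrees, 'alt' in metres, 't' in seconds
    lat1, lon1 = math.radians(p1["lat"]), math.radians(p1["lon"])
    lat2, lon2 = math.radians(p2["lat"]), math.radians(p2["lon"])
    dlat, dlon = lat2 - lat1, lon2 - lon1

    # great-circle (haversine) distance along the surface of the body
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    surface = 2 * radius_m * math.asin(math.sqrt(a))

    # Pythagoras to mix in the climb/descent, then divide by elapsed time
    climb = p2["alt"] - p1["alt"]
    dt = p2["t"] - p1["t"]
    return math.hypot(surface, climb) / dt if dt > 0 else 0.0

def heading(p1, p2):
    # initial bearing from sample 1 towards sample 2, in degrees
    lat1, lon1 = math.radians(p1["lat"]), math.radians(p1["lon"])
    lat2, lon2 = math.radians(p2["lat"]), math.radians(p2["lon"])
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360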

Checking my maths – yes, altitude is added to the mix so velocity is mostly correct as long as no rapid course changes are made

After spending way too much time with this and the Pythagorean theorem (yes mum, a game made me do maths. MATHS! đŸ€Ż) I settled on some calculations, with data for my current ship on the right and targeted-ship data on the left. This is sort of tricky because many game events update different parts of the data, so timestamps have to be kept in mind and a game-specific parsing strategy is required. See the last part of the demonstration video to get an idea of how this looks.

Improving situational awareness by putting the video feed of wingman / gunner on the central MFD.

Another point to tick off my list was getting the head tracking to work in Elite (again). Now this is very Linux specific, so you may tune out on this paragraph. On Linux PC I’d usually compile Opentrack with the Wine glue, patch it into my appdata dir for Proton and hope that it’s still ABI compatible to Just Workℱ. Alas, recent Proton is sandboxed within pressure-vessel, and the usual approach of memory mapping simply no longer works, if I got the gist of this right.

So my _current_ strategy is to download the Windows build of Opentrack, drop it into the game folder, and chain-load the EXE with the game: the Opentrack EXE listens on UDP while my native Opentrack binary sends via UDP. A task not made easy by Proton, but it is possible. The following snippet may give you some pointers:

#!/bin/bash
export STEAM_COMPAT_DATA_PATH=/games/steam/steamapps/compatdata/359320
export STEAM_COMPAT_CLIENT_INSTALL_PATH="$HOME/.steam/steam"
python3 /games/steam/steamapps/common/Proton\ -\ Experimental/proton run opentrack.exe

Why run Opentrack twice? The native build performs a lot better with my webcam, and every frame really counts here. Reading data via UDP is not much of a burden for Proton. This also saves me the trouble of fiddling with the Wine glue, a painful compile process nobody should endure, involving the installation of many, many additional 32-bit libraries. Hilarious, but it works.

When you have a water-cooled computer, the last thing you want to see is a message from your CPU driver telling you that it is forcefully clamping down power, because this means your CPU is throttling.

Refilling an AIO be quiet water cooling system

This may mean that there is not enough water in your cooler and the CPU starts to overheat. Time for maintenance and a refill!

What we need for this is an external power supply for the pump, a hose, an adapter, some sort of funnel, and of course water for the refill.

The pump is powered by 12 V, and this model should not be undervolted. Seriously, this one will break when undervolted. 12 V is basically what an IKEA LED light strip power supply provides. You can also use another PC power supply, but this is what I have around anyway for a fan providing fresh air for my office.

The hose I used for the pipe is a used DN9 RAUFILAM-E from REHAU that I found in the barn. It costs about ~1 EUR per meter but you can of course buy a “special” hose from your usual PC parts mafia for 10 EUR per meter.

Distilled water should be fine. You can also purchase whatever overpriced holy water you may prefer. I won’t care.

I’ll spare you the details of how to remove the water cooler, because I assume you assembled it before, and if not there are plenty of other tutorials on the net.

This pump was completely dry after I opened the lid with a screwdriver. I prepared this contraption, held in place with a wire attached to the chandelier – so don’t do this at home. Then I started the pump and added water to the hose until it was filled.

Now is the time to move and shake the radiator to get all the air out. Do not shake the pump! This takes some time. You’re done when no more bubbles appear.

Don’t forget the seal and screw in the lid while there is still water standing in the hole. Do not use force on the lid, because it breaks easily.

Make sure everything is clean and tidy (and dry) in the end, because you don’t want water in your PC. This also helps to spot any leaks that may spring. Don’t forget the heat-sink compound later, or this will be a very short adventure.

When done let the pump run for a little bit to see if any water leaks out.

When everything is assembled again make sure to run a load test and see if any leak springs.

And this is it. The system is able to hold 32°C again, where it could hardly stay below 80°C before.

And as usual, if you break your system you get to keep the pieces.

I’m not responsible for your mistakes.

I like space and science fiction. Diving into epic stories set in some distant future has amazed me since elementary school.

I’m also a gamer. And a tinkerer. It’s in the family.

I keep wondering: How can I improve the immersion of my games without going full VR?

DIY Headtracker for gaming (on Linux PC)

I used a triple screen set-up before. It consisted of different models in height and size. When one screen finally broke down I purchased 3 refurbished screens of the same brand and model. What a difference!

The kids love it too. Of course. Means less stick time for me. Anyway.

This is when I started to read about head tracking and went on a quest to get it working for the game X4 – as a bonus on Linux PC, my preferred system for gaming as well.

The thing is: “The” reference product for a headtracker is the TrackIR system. Price as of today: 220 EUR. Ouch! That’s like a cheap VR, right? And it’s Windows only. No thanks.

So I checked what’s in this thing. Apparently a cheap camera, some infra-red LEDs, and a filter that lets only infra-red light pass. And software, of course.

Since this is for Linux I get to pick my poison for the software part, and I settled on Opentrack fast. Onwards to the hardware part. I abused my mobile phone for testing, sending its gyroscope data via WiFi to my PC, and while it worked it also _sucked_. Both the phone and the WiFi, I mean.

Head tracking is awesome. And I knew I wanted it. So I started prototyping. For this I went with a simple design that I eventually implemented in cardboard. It looks hilarious, but it gets the job done.

The focus was on a long battery life, so I wouldn’t have to replace the rechargeables in the middle of a session. To get this right I checked with the camera that I was going to use. See the video above – this is way too bright, and by trying various resistors I could get it down to 33 mA per LED and still get a decent detection rate with Opentrack.
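
As a rough sanity check (with assumed values, not measurements from my build): two 1.2 V rechargeables give about 2.4 V, and with an IR LED forward voltage of roughly 1.3 V there is about 1.1 V left across the resistor, so R = 1.1 V / 0.033 A ≈ 33 Ω per LED for the 33 mA mentioned above – a larger resistor dims the LED further and stretches the battery life.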

Speaking about the camera. That’s nothing special. It’s a dead cheap 480p Logitech QuickCam Communicate STX that I got from a discounter a decade ago. It was so cheap it doesn’t even _have_ an infra-red filter that I’d have to remove first.

I used tape to attach the salvaged camera cover of a dead G20 controller. That’s a Wii Remote knock-off that does basically the same thing as a head tracker. Various other foils can be used for this as well, as long as they let infra-red pass. The idea is to reduce or remove all light but infra-red.

The trick is to also turn off auto exposure and fiddle with the contrast and sharpness until you get a decent frame rate and a clearly visible infra-red light source from the LEDs.

When I was satisfied with my meter readouts, and my highly professional scribbles, I started working on the prototype while streaming the whole process on the Discord channel of the awesome Fly Dangerous project. If you like racing with a space ship give it a shot.

The prototype is made of cardboard, which doubles as electrical insulation. The rest is tape and hook-and-loop fastener to attach the head tracker to my headphones. No magic here. The whole contraption is powered by two 1.2 V rechargeables. I opted for a micro switch and an additional LED as a power indicator, which I dimmed down even more. After all, I cannot see infra-red, so this seemed like a good idea to me. Spoiler: it is.

So how does it play? Over the next weeks I tried basically every game supporting head tracking that I could get my hands on. Please keep in mind that I usually play with the lights off but turned on the studio lights for demo purposes. The tracker still works just fine.

I quickly found out that each game needs its own profile for fine-tuned settings. Good thing that Opentrack has me covered on this. First, my beloved X4 using Wine and the TrackIR protocol.

Sadly I came to the conclusion that my GPU is no longer up to the task and Wine would cost me too many frames. I switched Opentrack to emulate a joystick instead and mapped it to camera movements in the native X4 version. It’s not exactly the same, but it’s okay-ish. I have an idea of how to hack this properly into X4 using an extension and a UDP server, but that’s a topic for another day.

Anyway, the same principle works with X Rebirth too, making me even happier. While dated, it still has its charm, and the verse still feels a lot more alive compared to X4. It’s also not taxing my GPU that much.

Now for something different. When I saw Opentrack list a “protocol” named FlightGear I became very curious. I installed this free and open-source flight simulator and crashed my first Cessna into the ground minutes later. By now I’m confident that I can crash a Cessna just about anywhere. I’m not fond of flying in real life, but avionics sure are a fascinating topic.

This was the moment a Steam sale happened and I bagged various flight sims, Kerbal Space Program and House Of The Dying Sun. All with TrackIR support.

Little did I know what gem I bagged with House Of The Dying Sun, by the way. Sadly it’s also very short, but I enjoyed every minute of it and will probably play it again. The art, sound and music remind me a lot of Battlestar Galactica. Easy win 😀

So yeah, this is my current gaming set-up. I built myself a head tracker for 5 EUR. On Linux PC.

I also may have fallen into the rabbit hole called “simpit”.