Okay, recap from the last making session of the button box for my #simPit. What I achieved this time: the LCD controller now has a place inside the contraption while the LCD controller buttons are screwed on the outside. This is mostly because I have no buttons lying around to use instead. I did note down the pin-out of the connector though, so I can change this any time. It also has a funny LED with 3 pins: GND, Red and Green. Uncertain what to make of this.
The button box itself was put on stilts for the extra room required. The LCD itself is prepared to be added to the button box, but I have to remove part of the former hinges because they are way too sturdy to be cut with a simple cutter knife. I’ll probably need a grinder or a saw, and that’s work I will not do at my computer but in the basement.
Next was preparing the ICP in the centre. For this I created a new box of cardboard that is attached to the button box with tape that also acts as a hinge, so I can “open” it to work on the switches that go there. I also noticed that my knobs for the rotary encoders and potentiometers are way too big. I could compensate for this with more height, but I don’t want the ICP to dwarf the button box itself 🤔 So… mebbe I’ll use smaller knobs. On the topic of knobs: damn, the costs for decent sized knobs are insane. Like ~8 EUR for one knob! So… perhaps I’ll go for spray painted wood here. I don’t know yet 🤷
Yeah, and I got tired of calling it just “button box”. It’s not. It’s a “glorified button box mock-up with a gorram LCD made from cardboard”. So I decided that it needs a name and as a nerd I came up with… Primary Buffer Panel! Firefly fans will know. Others may educate themselves via https://www.youtube.com/watch?v=mY59BYSrxn0 and have a laugh.
It’s also a running gag in the family. My van tended to lose all kinds of parts while driving, and when Serenity aired my dad yelled at this very specific scene: “Hah, just like your car!” 🤣
I’m making heavy use of Hetzner storage boxes, a rather slow network storage solution perfectly suited for backup tasks. Sadly they come with hiccups, sometimes resulting in a failed automount status that the machine may not recover from on its own. That’s so common that I got tired of checking monitoring or logging in to each individual machine. #Systemd and #bash to my aid!
for i in {bob,alice,steam,punk,younameit}; do echo "$i:"; systemctl --host "$i" status storagebox.automount; done
So how are my storagebox mounts today? Ah yes, yes, seems like two need a little pep talk.
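The pep talk is the same loop with restart instead of status. A minimal sketch, assuming the unit really is called storagebox.automount everywhere (host names are just examples):
for i in {bob,alice}; do echo "$i:"; systemctl --host "$i" restart storagebox.automount; done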
I track my working hours with timewarrior. That’s a CLI program that has a lot of nifty features and can also be hooked up with taskwarrior to automate time tracking when a task is started. Taskwarrior, on the other hand, gets my to-do list from various bug trackers as sources, like Redmine and Jira, using bugwarrior-pull.
With this set-up I have my to-dos and my tracked hours available on the terminal, where I spend most of the day anyway. This is more or less comfortable for me, but there is a huge drawback.
At the end of the month I’ll need some numbers and, as it goes, each company or customer has its own time tracking system (or even wants a CSV export!), so I have to backfill the real systems each month. That’s very tedious work, especially when time has to be logged on specific tickets or customers and booking is done in quarter hours, so I have to do some quick math in my head all the time.
In theory timewarrior has me covered on this because it has a summary view that is basically fine, but it cannot be grepped or sorted for certain tickets because it displays the date for each day only once.
Another nifty feature is the timewarrior export function that results in a JSON. The result is somewhat limited though, since it will for example not include durations, and there is, as far as I know, no way to change this.
Demo of how the various views and exports of timewarrior look (and yes, not _everything_ I work on is tracked here :P)
This is where jq (a lightweight and flexible command-line JSON processor) comes in. This little tool is seriously underrated. It allows me to change the format and the values of the JSON export on the fly: it calculates time durations, reformats start and end dates so the import functions of spreadsheet programs, like LibreOffice Calc, can read the values as dates (we all know that Excel reads anything as a date already), and shows me the tracked time in quarter hours for each entry. Backfilling another time tracking system is now a no-brainer in most cases. The result can also easily be sorted, or grepped for a ticket’s id to find all times for a specific ticket.
timew export :week | jq -r '
  ["id", "start", "end", "duration", "quarter_hours", "description"],
  (.[] |
    # make sure .end is set (may be empty for currently active tracked time)
    .end = (.end // (now | strftime("%Y%m%dT%H%M%SZ"))) |
    .duration = ((.end | strptime("%Y%m%dT%H%M%SZ") | mktime) - (.start | strptime("%Y%m%dT%H%M%SZ") | mktime)) |
    # round the duration up to quarter hours
    .quarter_hours = (.duration / 3600 / 0.25 | ceil * 0.25) |
    [
      .id,
      # urks, localtimes are a mess in jq, ymmv - as long as it is consistently off I do not care tho
      (.start | strptime("%Y%m%dT%H%M%SZ") | mktime | todateiso8601),
      (.end | strptime("%Y%m%dT%H%M%SZ") | mktime | todateiso8601),
      (.duration | strftime("%T")),
      .quarter_hours,
      (.tags | join(", "))
    ]
  ) |
  @csv'
The resulting CSV file can now be imported into most spreadsheet software or read manually in a more comfortable way.
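Finding every entry for one ticket is now a simple grep away. The ticket id and file name here are made-up examples, assuming the jq filter from above was saved to a file:
timew export :month | jq -r -f quarter_hours.jq | grep 'PROJ-1234'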
Timezones are still an issue. There are so many open tickets on jq on this that I don’t even bother.
When you have a water cooled computer the last thing you want to see is a message from your CPU driver telling you that it forcefully clamps down power because this means your CPU is throttling down.
Refilling an AIO be quiet water cooling system
This may mean that there is not enough water in your cooler and the CPU starts to overheat. Time for maintenance and a refill!
What we need for this is an external power supply for the pump, a pipe, an adapter, some sort of filter hopper, and of course water for the refill.
The pump is powered by 12V and this model should not be undervolted. Seriously, this one will break when undervolted. 12V is basically what an IKEA LED light strip power supply will provide. You can also use another PC power supply, but this is what I have around anyway for a fan providing fresh air for my office.
The hose I used for the pipe is a used DN9 RAUFILAM-E from REHAU that I found in the barn. It costs about ~1 EUR per meter but you can of course buy a “special” hose from your usual PC parts mafia for 10 EUR per meter.
Distilled water should be fine. You can also purchase whatever overpriced holy water you may prefer. I won’t care.
I’ll spare you the details of how to remove the water cooler, because I assume you assembled it before, and if not there are plenty of other tutorials on the net.
This pump was completely dry after opening the lid with a screwdriver. I prepared this contraption, held in place with a wire attached to the chandelier. So don’t do this at home. Now I started the pump and added water to the pipe until the pipe was filled.
Now is the time to move and shake the radiator to get all the air out. Do not shake the pump! This takes some time. You’re done when no more bubbles appear.
Don’t forget the seal and screw in the lid while there is still water standing in the hole. Do not use force on the lid, because it breaks easily.
Make sure everything is clean and tidy (and dry) in the end because you don’t want water in your PC. This also helps to see any leaks that may spring. Don’t forget the heat sink compound later or this will be a very short adventure.
When done let the pump run for a little bit to see if any water leaks out.
When everything is assembled again make sure to run a load test and see if any leak springs.
And this is it. The system is able to hold 32°C again, where it was hardly able to stay below 80°C before.
And as usual, if you break your system you get to keep the pieces.
Re-Visited Campus Galli in 88605 Meßkirch / Germany mostly for the new barn that is almost finished by now. My last visit was in 2019 so it was really time to see how much changed (despite the gorram pandemic). This time I took so many pictures that my battery drained.
Visitors aren’t allowed inside of the barn yet since it will be under construction until the end of the month. That was perfectly fine for me because catching the impression of the almost finished building is what I was after:
This cart also caught my attention so I checked it out closer. Spoiler: It doesn’t come with free rust proofer:
I consider myself lucky with the weather situation by the way. I could see a lot of systems that prevent flooding of the area in action – or not.
The orchard changed a lot since my last visit. The entrance for example is now completed.
Many trees were cut down for the constructions going on. Wood is needed everywhere and for everything on the site, and some areas are getting quite airy.
The wooden church also got some changes. Most importantly the bell tower next to it, and also a new porch. Couldn’t get enough of it.
All the other buildings required on a medieval construction site are also still there. Some show a lot of wear by now and constant ongoing repairs are required.
The masons seem to be busy with a new arch. No idea where it will go tho 🤔 Their space doubles as a place to dry wooden shingles in the attic.
This time I also managed to get pictures of some of the livestock!
This was a great day. I didn’t poke my nose outside much over the last year and I really missed excursions like this.
I also recorded some small video snippets, so I may eventually come around to creating a small video later too 🙂
Video of Diaspora: Shattered Armistice (on Linux PC)
Seems to work nicely with my DIY headtracker on Linux PC too. Sadly I got quite some frame-drops due to recording (and probably multi-head too). It works way better without all the cameras and a live-stream going on, but I think it’s enough to get a good impression. Botched emergency landing included xD
Warning: This may fuel a desire to re-watch the BSG series again 😀
I like space and science fiction. Diving into epic stories set in some distant future has amazed me since elementary school.
I’m also a gamer. And a tinkerer. It’s in the family.
I keep wondering: How can I improve the immersion of my games without going full VR?
DIY Headtracker for gaming (on Linux PC)
I used a triple screen set-up before. It consisted of different models of different heights and sizes. When one screen finally broke down I purchased 3 refurbished screens of the same brand and model. What a difference!
The kids love it too. Of course. Means less stick time for me. Anyway.
This is when I started to read about head tracking and went on a quest to get this working for the game X4. As a bonus on Linux PC, my preferred system also for gaming.
The thing is: “The” reference product for a headtracker is the TrackIR system. Price as of today: 220 EUR. Ouch! That’s like a cheap VR, right? And it’s Windows only. No thanks.
So I checked what’s in this thing. Apparently a cheap camera, some infra-red LEDs, and a filter allowing only infra-red light through. And software, of course.
Since this is for Linux I get to pick my poison for the software part, and I quickly settled on Opentrack. Onwards to the hardware part. I abused my mobile phone for testing, sending its gyroscope data via Wi-Fi to my PC, and while it worked, it also _sucked_. Both the phone and the Wi-Fi, I mean.
Head tracking is awesome. And I knew I wanted it. So I started prototyping. For this I went with a simple design that I eventually implemented in cardboard. It looks hilarious but it gets the job done.
The focus was on long battery life so I wouldn’t have to replace the rechargeables in the middle of a session. To get this right I checked with the camera that I was going to use. See the video above: this is way too bright, and by trying various resistors I could get it down to 33mA per LED and still get a decent detection rate with Opentrack.
Speaking about the camera. That’s nothing special. It’s a dead cheap 480p Logitech QuickCam Communicate STX that I got from a discounter a decade ago. It was so cheap it doesn’t even _have_ an infra-red filter that I’d have to remove first.
I used tape to attach the salvaged camera cover of a dead G20 controller. That’s a Wii Remote knock-off that does basically the same thing as a headtracker. Various other foils can be used for this as well, as long as they let infra-red pass. The idea is to reduce or remove all light but infra-red.
The trick is to also turn off auto exposure and fiddle with the contrast and sharpness until a decent frame rate and a clear infra-red source from the LEDs can be seen.
When I was satisfied with my meter readouts, and my highly professional scribbles, I started working on the prototype while streaming the whole process on the Discord channel of the awesome Fly Dangerous project. If you like racing with a space ship give it a shot.
The prototype is made of cardboard that doubles as electrical insulation. The rest is tape and hook-and-loop fastener to attach the headtracker to my headphones. No magic here. The whole contraption is powered by two 1.2V rechargeables. I opted for a micro switch and an additional LED as a power indicator, which I dimmed down even more. After all, I can not see infra-red, so this seemed like a good idea to me. Spoiler: It is.
So how does it play? Over the next weeks I tried basically any game supporting head tracking that I could get my hands on. Please keep in mind that I usually play with the lights off but turned on the studio lights for demo purposes. The tracker still works just fine.
I quickly found out that each game needs its own profile for fine-tuned settings. Good thing that Opentrack has me covered on this. First, my beloved X4 using Wine and the TrackIR protocol.
Sadly I came to the conclusion that my GPU is no longer up for the task and Wine would cost me too many frames. I switched Opentrack to emulate a joystick instead and mapped it to camera movements in the native X4 version. It’s not exactly the same but it’s okay-ish. I have an idea how to hack this properly into X4 using an extension and a UDP server but that’s a topic for another day.
Anyway, the same principle works with X Rebirth too, making me even happier. While dated, it still has its charm and the verse still feels a lot more alive compared to X4. It’s also not taxing my GPU that much.
Now for something different. When I saw Opentrack list a “protocol” named FlightGear I became very curious. I installed this free and open source flight simulator and crashed my first Cessna into the ground minutes later. By now I’m confident that I can crash a Cessna just about anywhere. I’m not fond of flying in real life but avionics sure are a fascinating topic.
This was the moment a Steam sale happened and I bagged various flight sims, Kerbal Space Program and House Of The Dying Sun. All with TrackIR support.
Little did I know what a gem I bagged with House Of The Dying Sun, by the way. Sadly it’s also very short, but I enjoyed every minute of it and will probably play it again. The art, sound and music remind me a lot of Battlestar Galactica. Easy win 😀
So yeah, this is my current gaming set-up. I built myself a head tracker for 5 EUR. On Linux PC.
I also may have fallen into the rabbit hole called “simpit”.
Racing in space sims is a thing and usually done by a small and sometimes hidden community within games like Elite Dangerous or Star Citizen. What happens when this is not enough for players, or when they simply don’t want to waste time grinding the required resources in-game, is demonstrated by @jayleefaulkner at https://github.com/jukibom/FlyDangerous
I was delighted to find Linux PC binaries over at https://jukibom.itch.io/fly-dangerous as well – and this is a very early alpha even. Guess someone pestered the dev about Linux already, and so I had to give this a spin, of course.
Unpacking the FlyDangerous-0.2.2b-linux.zip was a no-brainer and the game itself started just fine. I do however have a very specific triple head display setup, and the game started on the wrong display in fullscreen mode and could not be persuaded to go into windowed mode so that I could reach the buttons to change the display resolution, via hotkey or by tweaking the config at ~/.config/unity3d/StarGoat/FlyDangerous/prefs. It insisted on staying on defaults, so once more wmctrl to my aid:
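A minimal sketch of the call; the window title is an assumption and the geometry is from my setup, so adjust as needed:
# give the game a moment to create its window, then stretch it over all three displays
sleep 5; wmctrl -r "Fly Dangerous" -e 0,0,0,5760,1200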
Fly Dangerous in all its glory over multiple screens
After this was sorted out I noticed that Unity detected “some” joystick with only an X and a Y axis. This doesn’t do my X52 Professional H.O.T.A.S. justice. Usually I’d use the controller mapping in Steam now, but this isn’t a Steam game, and while it can be added as a foreign game to Steam, I can not set any controller configs for it there. So I started looking around.
Apparently (modern) Unity uses SDL2 and the modern evdev input system to detect joysticks on Linux PC, and this can be indirectly configured / overridden using the environment variable SDL_GAMECONTROLLERCONFIG to provide additional input configurations like the ones listed in https://github.com/libsdl-org/SDL/blob/main/src/joystick/SDL_gamecontrollerdb.h (which somehow does not know of the X52 H.O.T.A.S.?). Luckily there are various tools to create such a mapping – for example Gamepad Tool at https://generalarcade.com/gamepadtool/ – which sure looks familiar from the re-mapping I’m used to by… Steam! AntiMicroX should also work for this but I didn’t try.
Gamepad Tool with my configured X52 Pro for Fly Dangerous
And since I’m on Linux and totally lazy I threw all this in the script file flydangerous.sh to start the game:
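Something along these lines, as a sketch only: the mapping file, install path and binary name are placeholders, so adjust them to your own setup:
#!/usr/bin/env bash
# flydangerous.sh - sketch of a launcher
# mapping line created with Gamepad Tool, saved to a text file (path is an example)
export SDL_GAMECONTROLLERCONFIG="$(cat "$HOME/.config/sdl/x52.txt")"
cd "$HOME/games/FlyDangerous" || exit 1   # wherever the zip was unpacked
./FlyDangerous.x86_64 &                   # the binary name may differ per release
# give Unity a moment to create its window, then stretch it over all displays
sleep 10
wmctrl -r "Fly Dangerous" -e 0,0,0,5760,1200
wait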
I love gaming over multiple monitors. It’s my current choice for work and games – especially simulations. Having several monitors attached to one computer (or graphics card) is not a big deal in 2021 any more. The framebuffer in recent graphics cards is insanely huge compared to some years ago, when one really had to think twice about the possible resolution when e.g. connecting a projector to a laptop (good old SiS 630 anyone?).
xfce4-display-settings for my refurbished “new” set of displays
This couldn’t be easier nowadays. Even mixing the integrated graphics of a recent Intel CPU with a dedicated NVIDIA or AMD graphics card usually does “just work”. Some driver specific mode may have to be set, but that’s it. The workspace easily expands over multiple displays and windows can be moved around freely.
Games do not see one huge desktop but individual displays
There is however a catch. Games tend to read the primary display only, and the maximum resolution offered usually comes from the readout of this very first display – or worse – the first display connected. This sucks especially when the monitors have different resolutions, as has been the case for me for several years, because I didn’t just purchase a set but collected discarded monitors over time. This can often be worked around by temporarily disabling the “false” ones or by forcing window mode.
This results in hacky scripts involving xrandr, wmctrl and xdotool. This is for example how I hammered X4: Foundations into shape _after_ it started, because it would only allow me to select a single display. Set to window mode it can be freely scaled, but that comes with disturbing window decorations, so with this the X4 window gets positioned to 0x0, expanded to 5770×1210 and the window decorations purged:
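A sketch of the kind of script I mean; the window title is an assumption (check with wmctrl -l) and the geometry is my framebuffer, not yours:
# find the game window and grab its id
WID=$(xdotool search --name "X4" | head -n1)
# move it to 0x0 and stretch it over the whole framebuffer
wmctrl -i -r "$WID" -e 0,0,0,5770,1210
# purge the window decorations via Motif hints
xprop -id "$WID" -f _MOTIF_WM_HINTS 32c -set _MOTIF_WM_HINTS "0x2, 0x0, 0x0, 0x0, 0x0"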
That’s a pain to find out and the fun really stops when it comes to Proton or some games that would not allow resizing over their maximum detected resolution – like for example Everspace.
How about a virtual monitor?
So the idea was to introduce a completely virtual monitor to the system with the resolution of choice. VNC servers do that all the time, so it must be possible. The usual approach won’t work in this case though: when loading the dummy driver, the real displays can usually no longer be used, and the drivers for AMD and NVIDIA do not really offer such a feature at all.
It is perfectly possible to define virtual monitors with a recent xrandr, but they have to be mapped to an existing output (a real port). One can use an unused port (as in: no monitor connected) for this, add a Modeline and even force it “online” like so: echo on >/sys/kernel/debug/dri/0/HDMI-A-1/force
I was delighted to see the display showing up briefly but the AMD driver made short work of my soaring hopes by forcing it off again in an instant. So close and yet so far. This would require some hardware hacking by creating a dummy plug for the port. That’s basically some resistors in the right place making the computer think a display is connected. I hear they can also be purchased and this may be a way for others.
Others seem to have had success by compiling the experimental DisplayLink driver that seems to offer (virtual) monitors but I really didn’t feel like fiddling with something even more alien that will probably break on the next kernel update again.
Intel to my aid!
What worked for me in the end was to use the Intel driver and its VirtualHeads feature. The caveat is that one probably needs an Intel CPU for this to work and has to create an X11 config file. If this is done without also adding the driver that is actually in use, people will experience nothing but black screens on reboot. This may be a show stopper for inexperienced Linux users who don’t know how to recover from a broken X11 config (yet :D). For me this is amdgpu, so my file /etc/X11/xorg.conf.d/20-intel-virtual-and-amd.conf has to look like this:
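A sketch of such a file: the identifiers are arbitrary and the second section must name the driver that is actually in use (amdgpu in my case):
Section "Device"
    Identifier "intelvirtual"
    Driver     "intel"
    Option     "VirtualHeads" "1"
EndSection

Section "Device"
    Identifier "amd"
    Driver     "amdgpu"
EndSection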
Triple-check that your driver is used in there, or you will end up with a broken config and no way to log in to a graphical window manager. When in doubt, start e.g. a new session to your liking on the next display server, from which you can switch back with the key combination ctrl+alt+F[1-0]: startx /usr/bin/startxfce4 :2
Once started a new provider shows up and the new output “VIRTUAL1” is available: xrandr --listproviders
Now that we have a virtual monitor we need a Modeline for it. The target is usually the size of the current screen (the whole framebuffer), which can be calculated (e.g. the sum of all monitor widths times the height and Hz of one monitor) or obtained by asking the system: xrandr | grep Screen
gtf (or cvt) helps obtaining the Modeline: gtf 5760 1200 60
Now all information needed to finally set up the virtual display is there. I’m creating the virtual display on top of the three real displays because I also want to see what’s drawn on it. That’s not strictly required though, and in fact most graphical tools to configure the monitor location will even refuse this – because this use case is simply not considered or supported. Gnome for example really didn’t like this. XFCE4 didn’t care. Ymmv:
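A sketch of the commands with my numbers; the output name and resolution may differ on your box:
# grab the Modeline that gtf printed and strip the quotes around the mode name
MODELINE=$(gtf 5760 1200 60 | sed -n 's/^ *Modeline *//p' | tr -d '"')
MODENAME=${MODELINE%% *}
xrandr --newmode $MODELINE        # unquoted on purpose: the line is a list of arguments
xrandr --addmode VIRTUAL1 "$MODENAME"
# place the virtual output on top of the real displays (same origin, 0x0)
xrandr --output VIRTUAL1 --mode "$MODENAME" --pos 0x0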
The virtual monitor becomes visible in display settings (xfce4-display-settings here)
And it works!
After a lot of research and fiddling (and breaking my X config several times) I finally found a working solution. Games let me select the virtual monitor, or at least see my primary with my “maximum” resolution. Sometimes this still requires window mode, but I couldn’t care less – the decorations are optional. And it works like a charm! Here is a small selection of the games I play most at the moment:
Elite Dangerous: Odyssey
Everspace
Satisfactory
X4: Foundations
No Man’s Sky
Alien: Isolation
This is how my set-up looks:
Now I have another problem. With this, my usual 1080p gaming resolution is no more, and my graphics card is simply not up to the job any more 🤣
At least gaming itself is easy as pie on Linux in 2021. Complex display set-ups? Not so much.