So BOINC didn’t show my AMD GPU when started via systemd. I have an AMD card and all the ROCm (#ROCr) stuff installed. It only listed the Intel iGPU on startup:

[---] OpenCL: Intel GPU 0: Intel(R) UHD Graphics 630 (driver version 23.35.27191.9, device version OpenCL 3.0 NEO, 25561MB, 2556>
[---] libc:  version 2.37

This works fine, however, when I run boinc manually as a user (or clinfo, for that matter), but not via systemctl start boinc-client, so I guessed it was some permission issue. journalctl had the context I was looking for and threw this in the middle of the boinc-client startup:

audit[305157]: AVC avc:  denied  { read write } for  pid=305157 comm="boinc" name="kfd" dev="devtmpfs" ino=532 scontext=system_u:system_r:boinc_t:s0 tcontext=system_u:object_r:hsa_device_t:s0 tclass=chr_file permissive=0

This is SELinux’s charming way of telling me that it blocked read and write access to /dev/kfd (the main compute interface shared by all GPUs, according to the ROCm manual) for the boinc process. Nice. What most users do now is grumble and disable SELinux, which is kind of a bad idea. The more advanced user does this and calls it a day:

sudo ausearch -c 'boinc' --raw | audit2allow -M boinc
sudo semodule -i boinc.pp

This basically prepares an override policy based on any rejected boinc activity, which in my case looks like this:

module boinc 1.0;

require {
	type hsa_device_t;
	type random_device_t;
	type boinc_t;
	class chr_file { ioctl map open read write };
}

#============= boinc_t ==============
allow boinc_t hsa_device_t:chr_file { ioctl map open read write };
allow boinc_t random_device_t:chr_file write;

Not today though. It left me befuddled with the following output:

libsemanage.semanage_direct_install_info: Overriding boinc module at lower priority 100 with module at priority 400.
Failed to resolve typeattributeset statement at /var/lib/selinux/targeted/tmp/modules/400/boinc/cil:3
Failed to resolve AST
semodule:  Failed!

…and I have no idea why. I also found nothing on Google Search. So, to not be DenverCoder9 (https://xkcd.com/979/) in the future, here is what I found out so far:

sudo cat /var/lib/selinux/targeted/tmp/modules/400/boinc/cil | bunzip2 
(typeattributeset cil_gen_require hsa_device_t)
(typeattributeset cil_gen_require random_device_t)
(typeattributeset cil_gen_require boinc_t)
(allow boinc_t hsa_device_t (chr_file (ioctl map open read write)))
(allow boinc_t random_device_t (chr_file (write)))

Apparently it can’t resolve the required typeattributeset boinc_t – which is kind of odd as it exists (see sudo semodule -X 100 --cil -E boinc and the resulting CIL file). Frankly, this is where SELinux lost me too. I found the man page for boinc_selinux, which is not really known on my Fedora system here, so I may be missing something. It suggests enabling permissive mode for boinc_t (instead of dropping SELinux altogether):

Note: semanage permissive -a boinc_t

can be used to make the process type boinc_t permissive. Permissive process types are not denied access by SELinux. AVC messages will still be generated.

https://linux.die.net/man/8/boinc_selinux
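
So that’s what I ran for now. For the record – and assuming the usual policycoreutils tooling on Fedora – checking that the domain really is permissive and undoing it again later would look like this:

# list all permissive domains; boinc_t should show up
sudo semanage permissive -l
# drop the permissive marking again once a proper policy fix exists
sudo semanage permissive -d boinc_t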

And sure enough on the next restart my AMD GPU became available:

[---] OpenCL: AMD/ATI GPU 0: AMD Radeon RX 6700 XT (driver version 3558.0 (HSA1.1,LC), device version OpenCL 2.0, 12272MB, 12272>
[---] OpenCL: Intel GPU 0: Intel(R) UHD Graphics 630 (driver version 23.35.27191.9, device version OpenCL 3.0 NEO, 25561MB, 2556>
[---] libc:  version 2.37

Happy number crunching. Mebbe some fix for SELinux crosses my path in the future so I can update this with the proper solution.

I just set https://simpit.dev/ live.

Primary Buffer Panel – The Simpit On A PC For More Immersion In Pew Pew

A glorified joystick controller with an LCD (‘MFD’) and plenty of RGB.

Best viewed WITH an ad-blocker (thanks @stefan)

I’m kinda blind by now after hacking away on this page for days so I’d appreciate feedback.

Especially if something is broken.

`gamescope` is slowly becoming the hammer for all of my gaming or recording issues on Linux PC.

Doesn’t capture in OBS via obs-vkcapture? Gamescope.
Gets ideas about screen layout? Gamescope.
Has no built-in FSR? Gamescope.
Doesn’t show up in the list for screenshare? Gamescope.
Does this post need a hashtag? Gamescope.
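
For reference, this is roughly how I launch a game through it from Steam’s launch options – the resolutions are just examples and the FSR flag name differs between gamescope versions, so check gamescope --help:

# render at 1706x960, upscale with FSR to a 1440p fullscreen window
gamescope -w 1706 -h 960 -W 2560 -H 1440 -F fsr -f -- %command%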

TIL: “Skype For Business” is not really possible on Linux PC in 2023. There is no native build. Its “Web App” requires a plugin that comes with an MSI installer. Not that it really matters, since it’s superseded by “Teams”. In theory.

So if you need it anyway, you’re not really looking for “Skype” – which may sound similar but has nothing to do with “Skype For Business” – but for “Lync”. Did I say “Lync”? I meant “Office Communicator”. And if you do, you’re in luck. There is a Pidgin plugin called “Sipe” which does that. In theory. I couldn’t get it to sign in because my “Office 365” account, which supersedes the BPOS (Microsoft Business Productivity Online Suite), has no more “Office Communicator” or “Lync” or “Skype for Business”. It does have “Teams” and, to add insult to injury, also “Skype”.

There is however an Android app “Skype for Business” and, lo and behold, it’s on API level 30 from 2020, so chances are good that it works on most recent devices (it will request all permissions though and refuse to start without them). And while I still have no account for this, or the possibility to create one, since Microsoft simply redirects me to “Teams”, I can now open an invitation link in a browser, which in turn opens the app again, where I now get the previously unavailable option to _join as guest_.

Stay tuned to find out whether it picks up the microphone too, as there is no speech indicator and no echo chamber to test this. At least video seemed to work fine.

(And hell no, I will not even try that with a Google Chrome Browser EXE in Wine)

Here are the humble beginnings[1] of a working example to read the ship status of X4: Foundations in a format very similar to the Status File of Elite Dangerous.

Both games are quite similar and by using a “well established” format it should be possible to use this with existing companion apps – like my own Primary Buffer Panel.

It uses the “Named Pipe API” of “sn_mod_support_apis” – on Linux PC 😁 This was not supported by the mod so far but I made it work.

Well, at least on my machine 🤓

And yes, the pipe server works with some minor adjustments for other _existing_ apps as well. Here is a demo with a data feed directly from X4: Foundations – it does not use that format though, since it is not really needed there, so I had to make some small adjustments in its connection routine, but that was like 2 lines of code 🤷

[1] TBF the humble beginnings were back in 2021 (https://beko.famkos.net/2021/05/01/getting-into-x4-foundations-modding-on-linux/) but I kinda let it slide to tinker and build my Primary Buffer Panel (https://beko.famkos.net/category/simpit/) first. Other games made it easier to retrieve game data and I did learn a lot during that time but it was X4 that started it all.

FYI: Should update within 24h https://youtu.be/RIGkmqzJdfQ

It’s a 6dof racing game with a flight model similar to .

It does have a native Linux version but the world gen lib is single-threaded on Linux, so Proton may be a better choice for now.

It also features access to full ship telemetry and head tracking including OpenTrack via UDP thanks to my nagging 🤪

No idea about VR. Apparently it works but I can’t test that.

Some time ago I needed an aarch64 virtual machine and, while I’m not entirely sure any more why that was, I did seem to have an inspirational moment and made a template of it. Here is what the config for such a VM on Proxmox _may_ look like:

agent: 1
arch: aarch64
bios: ovmf
boot: cdn
bootdisk: scsi0
cores: 2
efidisk0: misfits-btrfs:501/vm-501-disk-0.raw,size=64M
ipconfig0: ip=192.168.2.251/32,gw=192.168.2.1
memory: 1024
name: arm-test2
nameserver: 192.168.2.1
net0: virtio=96:79:F4:02:A1:6B,bridge=vmbr2
numa: 0
ostype: l26
scsi0: misfits-btrfs:501/vm-501-disk-1.raw,size=8G
scsi1: local:iso/debian-10.6.0-arm64-netinst.iso,media=cdrom
scsi2: misfits-btrfs:501/vm-501-cloudinit.raw,media=cdrom,size=4M
scsihw: virtio-scsi-pci
serial0: socket
smbios1: uuid=63fe535c-1507-4528-8dee-2bd2d59b57f8
sockets: 2
vga: serial0

It makes sense to install the package cloud-init in the guest so some settings can be pushed from outside of the machine.
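
The cloud-init related parts of the config above (ipconfig0, nameserver, the cloudinit disk) can be set from the host with qm. A rough sketch for VM 501 – user name and SSH key file are just examples:

qm set 501 --ciuser debian --sshkeys ~/.ssh/id_rsa.pub
qm set 501 --ipconfig0 ip=192.168.2.251/32,gw=192.168.2.1 --nameserver 192.168.2.1
# show what cloud-init will actually hand to the guest
qm cloudinit dump 501 user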

…and yes, it’s just as slow as expected from an ARM VM 🤓

I’m also not entirely sure if this is really officially supported by Proxmox (just like btrfs 🤷) but the machine was doing its job without an issue for years and I did just replay the template on VE 7.4, so I guess it’s fine 🤷
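
Replaying the template is basically just a clone. A sketch, with the new VM id and name being examples:

# full clone of the template VM 501 into a fresh VM 502
qm clone 501 502 --name arm-test3 --full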

I want more control over what my microphone picks up on screen share in video conferences or during streaming, but I don’t want to buy a hardware mixer. I also want to be able to disable the microphone with a hotkey, but it doesn’t have any physical switch. To achieve all this I utilise PipeWire to run a bunch of virtual devices that I can control via pavucontrol and obs later. Video conferences get this as “default device” so they don’t get a chance to mess up my audio setup (looking at you, Teams). The steps are the same for PulseAudio if you don’t have PipeWire (yet).

#!/bin/sh
# setup virtual device intended for monitoring
pactl load-module module-null-sink sink_name="BekoBlaster" device.icon_name="audio-card-analog" node.nick="BekoBlaster" node.description="BekoBlaster-16" sink_properties=device.description="BekoBlaster-16"
# setup virtual MIC so intended monitoring device can be recorded from as MIC
pactl load-module module-remap-source master="BekoBlaster.monitor" node.nick="BekoMic" device.icon_name="audio-input-microphone" source_name="BekoMic-16" source_properties=device.description="BekoMic-16"
# IMPORTANT:
# RUN `pavucontrol` => Select Tab Record => Set BekoMic-16 input to "Monitor of BekoBlaster-16"

The 16 is not important. It’s just my kind of humour, as my first Linux PC had a SoundBlaster16 😛 It’s also a distinctive enough pattern so I don’t mix this up with the zoo of real microphones and audio sinks attached to my computer.

This is already sufficient so that everything played on the device BekoBlaster-16 can be recorded from the BekoMic-16 again, which I then select as input microphone for the browser (video conferences) or Discord. This can be done with pavucontrol – or later in obs.
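
If you would rather script that than click through pavucontrol, making the virtual mic the default source works too – apps that just follow the “default” input will then pick it up (names as defined in the script above):

# make the remapped monitor the default input for new applications
pactl set-default-source BekoMic-16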

This isn’t enough, of course. When e.g. playing music (or streaming a game) I also want to hear the sound myself. For this I create an additional null sink and a combined sink. With this approach I can later fine-tune in obs what gets recorded to which audio track (where audio track 1 is the one used for streaming) and what ends up on the BekoBlaster-16, which acts as my monitor and, thanks to the remapped source, also as virtual mic.

# setup virtual device for games (or whatever OBS should record)
pactl load-module module-null-sink sink_name="OBS-Blaster" device.icon_name="audio-card-analog" node.nick="OBS-Blaster" node.description="OBS-Blaster" sink_properties=device.description="OBS-Blaster"
# OPTIONAL setup a combined sink so I can enjoy game sound while OBS gets a copy
pactl load-module module-combine-sink slaves="OBS-Blaster,bluez_output.10_4F_A8_84_18_01.a2dp-sink" node.nick="OBS-Blaster-AND-Headphones" node.description="OBS-Blaster-AND-Headphones" sink_properties=device.description="OBS-Blaster-AND-Headphones"
# Important tools to manipulate: `pw-cli list-objects`, `pw-cli destroy $id`, `pactl list short | grep module`, `pactl unload-module $id`

With this (and my headset connected) it starts to get crowded in my device list.

As you can hear err… hopefully see: the sink OBS-Blaster-AND-Headphones is now selected for playing music, which results in the music being played on the next virtual sink OBS-Blaster and on my h.ear (MDR-100ABN) headphones. The same could be done with the BekoBlaster-16, of course, but bear with me. We still don’t have any real microphone added to the mix and, while this can be done with PipeWire or PulseAudio alone too, I usually need this with video included, so obs it is.
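
Before heading into obs: selecting the combined sink can be scripted as well. Note that module-combine-sink was loaded without a sink_name above, so the actual sink name may simply be “combined” – check first:

# find the exact name of the combined sink
pactl list short sinks
# then route default playback to it (OBS copy + headphones)
pactl set-default-sink combined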

Here the most important setting is the monitoring device – the BekoBlaster-16 from the beginning – which can then be used as microphone in e.g. Discord again.

Next is the set-up of the mixer where I’m interested in 4 devices only:

  • The BekoMic-16 without monitor (it is the monitor, so that would result in an echo chamber) and optionally track 5 for recording (so I’ll know later how the mix sounded – but this is never used for video editing).
  • The desktop audio without monitor, so random system sounds (or other Discord voices!) don’t make it to any stream. It can be recorded on its own track though, in case I fcked up or need a reference during editing.
  • The Mic/Aux, which represents the real microphone used. It is echoed on the monitor microphone, on track 1 (sent to my streaming server) and on track 2, so I have a separate microphone track to work with in post edit.
  • The OBS-Blaster, which usually represents the game I’m playing. It is echoed on the monitor microphone, on track 1 (sent to my streaming server) and on track 4, so I have a separate game/music track to work with in post edit.

This way I can control in great detail what ends up on Discord / in a video conference / on the game stream, while I get the full power of obs scenes (where I also do my greenscreen mixing), can mute microphones as I see fit and have some material to work with later when I decide to make a video on stuff. Here I did set up Discord to read from the virtual BekoMic-16 and output to my headphones only (where no recording in OBS is done) – perfect for most Discord / video conference sessions.

Don’t mind the flipped video preview. That’s perfectly fine and will look right for the viewers later. This is, by the way, the virtual camera feature of obs backed by the v4l2loopback kernel driver, which I also read from in video conferences instead of the real webcam. This way I can also control exactly what the webcam shows – zoom / crop included.
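
The loopback device has to exist before obs can start its virtual camera. A minimal sketch of loading it by hand – device number and label are arbitrary examples, and distro packages may handle this for you:

# exclusive_caps=1 makes browsers accept the loopback node as a webcam
sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Virtual Camera" exclusive_caps=1
# verify the new /dev/video10 node exists
v4l2-ctl --list-devices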

The whole mess looks like this visualised in helvum, a patchbay for PipeWire.

Most of this explains itself. The WEBRTC VoiceEngine is the recording of Discord. Other devices may float around but are not used at the moment of this snapshot.

More on this and proper documentation: https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/Virtual-Devices

You probably heard about this before: an Arduino can be made into an excellent DIY joystick. Most examples use a Leonardo or Micro for this, for a very good reason: those come with a chip that is recognized as HID (Human Interface Device) hardware on any modern operating system.

This is not the case with a Mega. This one has other perks, but HID it is not. It does show up as a USB device and a ttyUSB is raised where serial communication with the Arduino can be initiated. I’m also aware that some flash the built-in programmer of the Mega so it starts operating like the others (which obviously removes the built-in programmer). I’m on Linux PC though, so I thought it’s basically a job of tricking the system into recognizing it as a joystick and calling it a day – and OMG was I wrong!

How it’s not done

My train of thought was like this: Linux still supports plenty of old serial joysticks, so how complicated can it be to send some bits an existing driver recognizes? Old hardware like this is usually glued to the driver with the tool inputattach of the Linux Console Project. It basically initialises a joystick on some serial connection and hands it off to a fitting kernel driver. This way even non-USB – or let’s better say non-HID – hardware is mapped to a kernel driver, which in turn sets up the joystick subsystem and manages the communication with the stick via a serial connection.

Turns out I’m not the first one with that idea, and apparently someone made it work by connecting an old PlayStation controller and a Wii Classic Controller to an Arduino and faking a Stinger device without the use of HID. So kudos to Jarno Lehtinen here and his Linux-Arduino-Serial-Joystick repo – you sure did send me down a rabbit hole of horror and amazement. I couldn’t even get inputattach to wait for that magic string to be sent with anything other than 9600 baud and aligned stars! I also had to throw socat into this horrible mix because the Arduino would insist on rebooting on init, so a timeout was guaranteed! In case you wonder how I did this:

socat -r left.raw -R right.raw pipe:/dev/ttyUSB0 PTY,link=/dev/ttyUSB1,rawer
# and xdd to show me the debug juice
tail -f left.raw | xxd -c4
# and on yet another terminal
inputattach --baud 9600 --stinger /dev/ttyUSB1

This also meant that I had to tear everything down to reprogram the Arduino. Anyway, in the end I could finally get through that init phase where the Stinger-related code in inputattach waits for the magic key after sending “ E5E5”, to finally load the Stinger kernel driver – communication in both directions confirmed!

    // Stinger init handshake: the response inputattach expects after it sent the magic " E5E5"
    // "\r\n0600520058C272";
    byte byteResponse[] = {0x0D, 0x0A, 0x30, 0x36, 0x30, 0x30, 0x35, 0x32, 0x30, 0x30, 0x35, 0x38, 0x43, 0x32, 0x37, 0x32};
    if (Serial.availableForWrite() >= sizeof(byteResponse))
    {
      Serial.write(byteResponse, sizeof(byteResponse));
    }

At this point I had a pipe to prevent the timeout due to the resetting Arduino, the _only_ working baud rate of 9600 I could figure out with the Mega, and a loaded driver that was recognized as a joystick – and it sat there and did… absolutely nothing. Null. Nada. Not a single bit made it to the driver and I could not figure out why. My guess is it needs a change of the baud rate to the original 1200 (?) of the Stinger, but I have no idea if this is true. I could also not find out how the stream is framed, and since the driver reads two bytes at a time and interprets them, there is a fair chance that it would simply be one byte off all the time. Speculation though – I simply didn’t grasp the stinger.c source, so this is all just a theory. I do not want to admit how much time I sunk into this and I was pretty frustrated at this point. Reading some stupid serial? Not like this! Too many hoops!

So I threw it all in the bin 🚮

How it’s probably done

Say hi to /dev/uinput, where you can basically raise virtual devices, like a joystick, without [much?] pain. I’m not the first one, of course, and funnily enough the reason behind it is very similar to mine. Read more in Virtual joystick on Linux by Gwilym Kuiper, where this is all explained in great detail. The referenced code at https://github.com/gwilymk/arduino-joystick sure did help me to get started, and even without ever having touched Rust before I was able to quickly adjust it for my needs, doubling the possible buttons, and get it up and running in just a few hours on my Linux PC. Cheers mate (also Jarno Lehtinen – you taught me a lot that day :D) 🕹️
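
The only system-side preparation I’d expect for this route is making /dev/uinput accessible to the user running the daemon. A sketch, assuming the common input-group approach – group name and rule file are examples:

# make sure the uinput module is loaded (it usually is)
sudo modprobe uinput
# let members of the input group open /dev/uinput
echo 'KERNEL=="uinput", MODE="0660", GROUP="input", OPTIONS+="static_node=uinput"' | sudo tee /etc/udev/rules.d/99-uinput.rules
sudo udevadm control --reload-rules && sudo udevadm trigger
sudo usermod -aG input $USER   # takes effect after the next login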

So here it is: a Mega acting as a joystick without HID, over a serial connection, driven by a userspace daemon written in Rust (meaning no kernel driver required) that provides a virtual uinput joystick on the “modern” event system. Heck, it’s even recognized in Wine!
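
To double-check that the virtual stick actually shows up and moves, the usual evdev tools do the trick – the device paths are examples, pick the right one from the listing:

# list input devices and their names
cat /proc/bus/input/devices
# watch raw events from the virtual device (interactive device picker)
sudo evtest
# or the classic joystick API view, if a js node got created
jstest /dev/input/js0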

What a journey to begin with. Now I need a back-channel for my blinky lights so I get my Raspberry Pi back from simpit duty 🙃