Markus

Geek. Speaks 🇬🇧 and đŸ‡©đŸ‡Ș.

Markus boosted:
2024-03-23

I’m using a ConBee II Zigbee stick by Dresden Elektronik but switched from their own Phoscon/deCONZ software to the Home Assistant ZHA integration.

While I haven’t had any issues with my few Zigbee devices, today I noticed that there’s newer firmware available for my ConBee stick. Usually, the firmware update is handled by the deCONZ software – which I’m not using anymore. There’s also a dedicated firmware update tool for Windows – which I’m also not using anymore.

However, their update instructions mention using Docker on Home Assistant – which sounds exactly like the thing I need.

The instructions are compiled in the Home Assistant Forums and boil down to:

  1. Disable anything that uses the ConBee II. The forum post says deCONZ; in my case this means the ZHA integration: Settings –> Devices & Services –> Zigbee Home Automation –> Disable
  2. Make sure you’ve enabled SSH access to the host via port 22222, or use the Advanced SSH & Web Terminal add-on with safe mode disabled, to connect to the base operating system
  3. Run this command:
    docker run -it --rm --entrypoint "/firmware-update.sh" --privileged --cap-add=ALL -v /dev:/dev -v /lib/modules:/lib/modules -v /sys:/sys deconzcommunity/deconz
  4. Follow the instructions. When it asks for the filename to use or download, you can find all available firmware files here. Only type the filename, not the whole URL – in my case this was deCONZ_ConBeeII_0x26780700.bin.GCF. Don’t panic during the firmware update: one step only ran up to 43% and then aborted, but it was automatically retried and finished successfully on the second try. If you don’t get a green “success” message, check all steps and try again. Also check the troubleshooting instructions.
  5. Once you’re done, delete the Docker image to free up the space:
    docker image rm deconzcommunity/deconz
  6. Now you can enable the ZHA integration again.

https://blog.mbirth.uk/2024/03/23/conbee-ii-firmware-update-via-home-assistant-zha.html

#firmware #hassio #homeAssistant #zigbee

Markus boosted:
2024-03-20

Back when I bought my Nintendo Switch, I made sure to buy an unpatched one – turning all the different boxes in the store upside down to read the serial number on the bottom and compare it with the list I had on my phone.

However, apart from some experiments with SX OS and later with memloader, hactoolnet and NHSE, I never made real use of that, as I was pretty happy with everything as-is and even subscribed to the Nintendo Online service – even though I never play online and never really played the classic console games made available via the subscription.

When it was time to renew the subscription, I realised that I was only using it as a glorified savegame backup, and that I could have that much more easily and for free by hacking my Switch.

So I followed the “Rentry-Guide” (with a few small adjustments here and there) to install the Atmosphere OS and it all went smoothly.

However, I wanted to have my legitimately bought (via the Nintendo eShop) games available in the CFW. While they didn’t show up as properly installed at first (showing the “Cloud” icon), I found out that you need to copy them from the /Nintendo folder on the SD card to /EmuMMC/RAW1/Nintendo – so the emuNAND can find them.

This made them at least show up as “properly installed”, i.e. as if they should work fine. However, when trying to launch a (legitimately bought!) game, an Error 2155-8007 appeared.

All documentation I could find about this pointed to various causes like wrong/missing SigPatches, a bad clone of the sysNAND to emuNAND, bad game data, bad user profile, etc. But I’ve verified everything and was sure it was none of that. And I also didn’t want to risk getting my Switch banned by letting it connect to Ninty’s servers. So I made a post in the GBAtemp forums to ask for further ideas.

After a bit of fruitless discussion, I came to the conclusion that it must be something with the so-called “ticket”. The ticket is basically the metadata for a file, linking its content to your Switch’s user and your online profile. After cloning, I’ve unlinked the user profile in emuNAND from the Nintendo Online service using linkalho. This seems to have made the Switch want to re-authorise the existing tickets. (Even though I’m pretty sure I’ve tried cloning the sysNAND without unlinking the account afterwards, too, and it didn’t work either.)

It took me some experimenting before I found the solution: using nxdumptool, I was able to dump the tickets of all the games’ files. Each base game, update and DLC has its own ticket. After dumping them, I used DBI to install those dumped tickets again, which made sure they were tailored to my current setup. And lo and behold – those games finally launched.

https://blog.mbirth.uk/2024/03/20/making-your-nintendo-switch-seaworthy.html

#firmware #nintendo #switch

Markus boosted:
2024-03-19

In April 2023, Reddit announced that it would start charging for commercial use of its API. This also affected 3rd-party apps, e.g. the famous iOS Reddit client “Apollo”.

While there was hope at first that it wouldn’t get too pricey, and the other changes looked manageable, that hope was crushed when Reddit’s new pricing scheme was published. Because of this, many developers of 3rd-party Reddit clients decided to shut down their apps.

On Reddit itself, this spawned the so-called APIcalypse and the search for alternatives.

Many people found a new home in either Lemmy or kbin, or went on to completely different realms. And of those, many decided to completely wipe their Reddit account as a sign of protest. And since deleting content on Reddit only sets a flag in their database while the actual text is kept, there was a desire to overwrite all old comments before deleting them.

One tool to achieve this is the PowerDeleteSuite – a browser extension that collects all available posts and comments by remote-controlling your browser session and going through your Reddit profile.

There’s only one issue: Reddit only associates the latest 1,000 comments and posts with your profile. While you might be able to find a few older items using the different filters available on your Reddit profile, with 10 years of Reddit history, as in my case, you will barely scratch the surface.

I was able to find many more comments of mine via a Google search for site:reddit.com "mbirth avatar" that didn’t show up on my Reddit profile but were still in their system. (The added “avatar” string makes sure to only return actual comments from me and not just mentions of my username.) A data export finally revealed that of the 9,000 comments I made over those 10 years, over 7,000 were still there and publicly accessible. So I started looking for a way to wipe them in an automated fashion.
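
The export makes it easy to verify this yourself. Here’s a minimal Python sketch of counting the rows in the export’s comments.csv – the data below is a toy stand-in, and the real file has more columns, but counting works the same:

```python
import csv
import io

# Toy stand-in for the comments.csv from Reddit's data export.
export = io.StringIO(
    "id,permalink,body\n"
    "abc,https://reddit.com/...,hello\n"
    "def,https://reddit.com/...,world\n"
)

count = sum(1 for _ in csv.DictReader(export))
print(count)  # → 2
```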

Other users of the PowerDeleteSuite noticed the same problem and created this issue on GitHub. User @confluence turned the posted snippet into a working script and published it as a gist – which I used as a starting point. After adding the deletion of comments to it, I couldn’t help myself and fleshed it out with 2FA support, optional skipping of entries (if you want to resume work without having to re-check all previous entries) and a progress bar including a time-remaining estimate (thanks to the wonderful Rich library).
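
Rich handles the progress display itself, but a time-remaining estimate boils down to simple rate arithmetic. A rough sketch (not the actual reddit-cleaner code) of how such an estimate can be computed:

```python
import time

def eta_seconds(start, done, total, now=None):
    """Estimate seconds remaining from the processing rate so far."""
    if done == 0:
        return None  # no rate information yet
    if now is None:
        now = time.monotonic()
    rate = done / (now - start)   # items per second
    return (total - done) / rate

# 250 of 1000 items done after 50 seconds -> 5 items/s -> 150 s remaining
print(eta_seconds(0.0, 250, 1000, now=50.0))  # → 150.0
```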

I’ve published my version of the script and instructions here: https://github.com/mbirth/reddit-cleaner.

https://blog.mbirth.uk/2024/03/19/leaving-reddit-and-taking-all-my-contributions-with-me.html

#internet #online

Markus boosted:
Adrianna PiƄska (@confluency@hachyderm.io)
2024-03-19

Hey, if you haven't nuked your Reddit comments yet and you want to (and you have more than 1000), @mbirth expanded my quick and dirty script into a proper tool with more output and options. github.com/mbirth/reddit-clean

Markus (@mbirth)
2024-03-01

@nielso Mounting it near the projector and using it natively isn’t an option? These days you can also use other Macs/MacBooks as an AirPlay target if no Apple TV is at hand.

At least as a stopgap solution it’s certainly usable.

Markus (@mbirth)
2024-02-29

@benbloodworth@mstdn.party Watch Dolby Atmos movies with them!

Markus (@mbirth)
2024-02-23

@nileane @macstories Definitely not a "unique approach". Just google "Krona Sunlight" (released in 2015) or its successor "Sky Halo" – long-time favourites on Android Wear watches all around the world.

Markus (@mbirth)
2024-02-19

@katzenjens Yes, then rather like that. Although over here this is actually still optional – like in Germany before the 2013 reform.

Markus (@mbirth)
2024-02-16

Markus (@mbirth)
2024-02-01

@katzenjens @masek All I know is that Excel sometimes imports "just like that" – with the default settings, which never give the result you want. And sometimes the dialog does appear where you can configure everything.

But fully automatic AND correct would be news to me, too.

Markus (@mbirth)
2024-02-01

@sebiturbo @nielso You don’t even have to check it yourself – just ask Google:

forum.rme-audio.de/viewtopic.p

Apple didn’t remove FireWire. They just no longer ship the FireWire audio driver with the OS.

Doesn’t matter, though – RME has their own: rme-audio.de/rme-macos.html (mentioned in the forum thread linked above)

Markus (@mbirth)
2024-02-01

@bkastl Telekom was, after all, the same outfit that ripped out fibre again in favour of copper everywhere in the 90s.

Markus (@mbirth)
2024-01-25

My own was an HP Spectre x360. That was my gateway drug into the world.

An HP notebook running macOS.

Markus (@mbirth)
2024-01-14

@thisismissem @wonkothesane It encrypts everything that leaves your device(s) – albeit with a hardcoded password at the moment. Proper E2EE will come in the very next update, but it has been possible to enable it manually for quite a while now. See forum.diariumapp.com/d/22-set-

Markus (@mbirth)
2024-01-14

@wonkothesane @thisismissem Try Diarium (diariumapp.com) – a one-time purchase, with apps for iOS and macOS – so you can type on a proper keyboard if you like.

Markus (@mbirth)
2024-01-12

@katzenjens There’s a reason the film "Idiocracy" is nowadays regarded as a documentary rather than fiction.

Markus boosted:
2024-01-12

Markus boosted:
2024-01-08

Modern heads for Sonicare toothbrushes come with a small chip that records not only the type of head, but also how many seconds you’ve used it – so the head unit can remind you that it’s time to swap the head for a new one.

This isn’t purely for monetary reasons, especially as the unit doesn’t enforce the limit. It’s been proven that the abrasives in toothpaste work both ways, so to get the best performance from your brush head, you have to change it regularly.

To be fair, there are also toothpastes with very little abrasion so your heads would be able to do a good job for much longer. And if you don’t want your toothbrush to nag you, there’s now a way to reset the counter.

Back in May 2023, Cyrill KĂŒnzi spent some time analysing the communication between head unit and brush and posted it on Hacker News. Aaron Christophel picked this up, decompiled the firmware and re-created the password generation algorithm. And, finally, Nico Jeschke provided this convenient website to generate your password.

There was also a recent post on Hacker News about the whole process.

What you’ll need

First, you’ll need wak dev’s NFC Tools app to work with the NFC chip. The app is available for both iOS and Android.

Now you need to find your brush head’s UID. For this, use the app to read the NFC chip of your brush head via the “Read / Edit memory” option under the “Other” menu. This will give you a long list of addresses and their data. Your UID is in the first two lines:

In the screenshot above, the UID would be 04:03:47:7A:22:70:81. (The last byte in the first address is ignored, as this is the BCC0 value and not part of the UID.)

Now we need the product code. This is either printed on the metallic rim at the base of the head:

Or you can find it ASCII-encoded at addresses 21 to 23 (ignore the first 2 bytes):
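
Both values can also be extracted programmatically from a raw dump. A small Python sketch – the dump bytes below are hypothetical, chosen to reproduce the example UID, and the product code “HX6064” is likewise made up for illustration:

```python
def uid_from_pages(page0: bytes, page1: bytes) -> str:
    """NTAG-style chips spread the 7-byte UID across the first two
    4-byte pages; the 4th byte of page 0 is the BCC0 check byte."""
    uid = page0[:3] + page1[:4]
    return ":".join(f"{b:02X}" for b in uid)

def product_code(raw: bytes) -> str:
    """The product code pages hold ASCII; skip the first 2 bytes."""
    return raw[2:].decode("ascii")

print(uid_from_pages(bytes.fromhex("040347C8"), bytes.fromhex("7A227081")))
# → 04:03:47:7A:22:70:81
print(product_code(b"\x00\x00HX6064"))  # → HX6064
```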

Now you can enter everything into the SonicareGenerator website and receive the unique reset command for this brush head:

Back in the NFC Tools app, you can now go to “Advanced NFC commands” (also found under the “Other” menu), enter your specific command into the “Data” field and send it to the brush head to reset the timer value.

https://blog.mbirth.uk/2024/01/08/resetting-your-sonicare-brush-head.html

#firmware #philips #sonicare

Markus boosted:
2024-01-07

In December 2011, I bought a Philips Sonicare DiamondClean Smart 9400 toothbrush set, product code HX9917/89. This included an HX992B handset, which offers a Bluetooth connection to their app and also has an LED in the bottom that shows when you’re applying too much pressure while brushing.

The handset transfers your brushing data to their app on your smartphone, which can also feed this data into Apple Health. If your phone is not in range, the handset stores a number of brushing sessions and transfers them later when your phone is nearby. While this worked most of the time, there were some occasions where I had to re-pair the handset to the app. Until, not even a year later, even that didn’t work anymore.

I tweeted my issue to Philips support, and after verifying that I had tried all their troubleshooting steps, they arranged for an exchange device. However, my model wasn’t available in their store at the time, so the support agent suggested the HX9911/09, which was the only one available in their store that came with the nice glass stand and a travel charging case. So I accepted.

What I didn’t realise at that moment was that this replacement model was the DiamondClean 9000 instead of the 9400 – and had an RRP of 219,- € whereas my original model went for 279,- €. Also, this one doesn’t have the LED in the base of the handset, as it comes with the HX991B handset instead. It also shows up with a simple 4-quadrant timer in the app, whereas the better model showed a set of teeth and marked where you were supposed to brush at that moment.

But what’s done is done and this one brushes my teeth exactly the same – just with a little less bling.

Well, even the new handset wasn’t free of issues. About two months later, it stopped syncing as well – albeit with a slightly different symptom: when connected to my phone, the app would show “Syncing
” but the progress would stay at 3% forever.

When I tried the same with an old Android phone, it miraculously worked. So it seems to be some issue with their iOS app or the way Bluetooth is implemented with it.

So it was back to contacting their support on Twitter. This time, though, they seem to have found an engineer with actual knowledge of the device, and they gave me a secret code to reset the handset: 10-6.

How do you enter this into the handset, you ask? Well, that’s easy:

  1. Put the handset onto the (powered) charger
  2. Push and hold the “MODE” (lower) button, then press the “POWER” button 10 times and let go of both buttons
  3. Push and hold the “MODE” (lower) button, then press the “POWER” button 6 times and let go of both buttons
  4. You should hear 3 beeps

This fixed my issue and, even better, it never came back. The only issue I occasionally have is that the handset becomes slow to respond to button presses when the internal memory holds too many records. But after syncing them to my phone, it goes back to normal operation for several weeks.

I only wish the support agent had found this engineer when I still had the superior HX992B handset as I’m sure this would’ve fixed that one for me as well.

https://blog.mbirth.uk/2024/01/07/resetting-your-toothbrush.html

#hardware #philips #sonicare #support

Markus boosted:
2024-01-06

Google introduced its location tracking service Latitude in February 2009. A year later, in March 2010, I bought my first Android phone: the Google Nexus One. A few days later, I enabled Google Latitude on it, and my first check-in was on 31 March 2010 at 16:20:48 CEST. And apart from the times I was without a smartphone, I never stopped logging my location.

Back then, this was a godsend for, e.g., matching photos taken with a proper camera to their location, as phone cameras of the time were of pretty bad quality and digital cameras didn’t have GPS built in. Even in 2023, I used this several times to finally assign a location tag to some still-untagged photos from over 10 years ago. It’s also great for work timesheets, as you can easily see when you arrived at a site and when you left again.
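
Matching a photo to a position is essentially a nearest-timestamp lookup in the track log. A small illustration with toy data (real records would come from the history database):

```python
import bisect

def nearest_fix(track, when):
    """track: list of (unix_ts, lat, lon) sorted by time;
    returns the fix whose timestamp is closest to `when`."""
    times = [t for t, _, _ in track]
    i = bisect.bisect_left(times, when)
    candidates = track[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda fix: abs(fix[0] - when))

track = [(100, 52.5, 13.4), (200, 51.5, -0.1), (300, 48.9, 2.3)]
print(nearest_fix(track, 170))  # → (200, 51.5, -0.1)
```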

Google Latitude was sunset in 2013 but the location tracking feature lived on in Google Maps as Location History, or later Timeline.

php-owntracks-recorder

In 2018, I was looking for a way to take more control over this pretty personal data. After a short search, I found and forked tomyvi‘s php-owntracks-recorder – a simple PHP GUI that shows a map of where you were at what time and that works with location data reported via the free OwnTracks app. The OwnTracks app was one of the very few that only needed a low single-digit percentage of precious battery juice per day.

I added support for SQLite3 and various OpenStreetMap-based maps, and I was able to import all my historic data from Google as well. My ca. 500 MiB Location History.json resulted in a 200 MiB SQLite3 database. Thanks to WAL mode, adding a new entry was still fast enough, and I was pretty happy with the overall result.
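
Enabling WAL is a one-liner, and the journal mode sticks to the database file once set. A quick check (the path here is just a throwaway temp file):

```python
import os
import sqlite3
import tempfile

# WAL keeps readers working while a writer appends, so inserts into a
# large (~200 MiB) history database stay fast.
path = os.path.join(tempfile.mkdtemp(), "locations.db3")
conn = sqlite3.connect(path)
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
print(mode)  # → wal
conn.close()
```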

PhoneTrack

Five years later, I figured there must be a similar app that’s still in active development and more of a “standard” than what I was using. My research ended with me choosing PhoneTrack, as I already had a small NextCloud installation on my webspace. PhoneTrack can work with data packets from OwnTracks as well, so apart from changing the URL and adjusting the device ID, nothing else was needed.
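
For context, an OwnTracks HTTP-mode location packet is a small JSON object, which is why pointing the app at a different URL is all that’s needed. A sketch of the payload shape (the values are made up; see the OwnTracks docs for the full field list):

```python
import json

packet = {
    "_type": "location",   # marks this as a location record
    "lat": 51.5007,
    "lon": -0.1246,
    "tst": 1704585600,     # unix timestamp of the fix
    "acc": 12,             # accuracy in metres
    "batt": 85,            # battery level in percent
}
body = json.dumps(packet)  # what the app POSTs to the configured URL
```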

As I used NextCloud with SQLite3 as well, moving the data over was as easy as connecting to the NextCloud DB, attaching the previous database to the session and running a pretty straightforward INSERT..SELECT command:

ATTACH '/tmp/owntracks.db3' AS ot;
INSERT INTO oc_phonetrack_points
  (deviceid, lat, lon, timestamp, accuracy, altitude, batterylevel, useragent, speed, bearing)
SELECT
  1, latitude, longitude, epoch, accuracy, altitude, battery_level,
  "OwnTracks", velocity, heading
FROM ot.locations
WHERE tracker_id="mb";
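
The migration can be reproduced end-to-end with Python’s sqlite3 module. A self-contained toy version, with heavily simplified schemas standing in for the real PhoneTrack/OwnTracks tables:

```python
import os
import sqlite3
import tempfile

d = tempfile.mkdtemp()
src = os.path.join(d, "owntracks.db3")  # stand-in for the old database
dst = os.path.join(d, "owncloud.db")    # stand-in for the NextCloud DB

# Build a tiny "old" database with one location record.
with sqlite3.connect(src) as c:
    c.execute("CREATE TABLE locations (tracker_id TEXT, latitude REAL,"
              " longitude REAL, epoch INTEGER)")
    c.execute("INSERT INTO locations VALUES ('mb', 52.52, 13.405, 1704585600)")

# Attach it to the "new" database and copy the rows across.
with sqlite3.connect(dst) as c:
    c.execute("CREATE TABLE oc_phonetrack_points (deviceid INTEGER,"
              " lat REAL, lon REAL, timestamp INTEGER)")
    c.execute(f"ATTACH '{src}' AS ot")
    c.execute("INSERT INTO oc_phonetrack_points (deviceid, lat, lon, timestamp)"
              " SELECT 1, latitude, longitude, epoch"
              " FROM ot.locations WHERE tracker_id='mb'")
    n = c.execute("SELECT COUNT(*) FROM oc_phonetrack_points").fetchone()[0]
    print(n)  # → 1
```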

Now, after processing an incoming location record from OwnTracks, PhoneTrack gathers the latest locations of your NextCloud friends and sends them back for OwnTracks to display. However, with my 13 years of location history, the database took several seconds to return a result for my freshest record.

Since I had switched to an iPhone in the meantime, this was often enough to exceed the time iOS allots for background tasks, so the records would pile up in the queue on my phone. I then had to open the app and let them get synced to PhoneTrack. This wasn’t ideal, so I disabled the code that returns the friends list in my installation. It worked great after that.

Traccar

Upon reading a recent discussion on Hacker News, I was reminded of Traccar again. As I’m pretty deep in the Apple ecosystem and perfectly happy with their native apps and iCloud, I never used my NextCloud installation. And I figured if I could replace PhoneTrack, I could get rid of my NextCloud installation and thus have one huge do-it-all app less to take care of.

I had dismissed Traccar before because it is a Java application, and I usually preferred “traditional” PHP apps I can just plop onto my UberSpace server. Now that I’m using Docker everywhere I can, I welcomed the fact that Traccar can be installed via Docker these days.

So I pulled the default traccar.xml from GitHub, modified it for the desired database and with a simple Docker Compose file like this, Traccar was up and running in seconds:

version: "3.8"
services:
  db:
    image: mariadb:latest
    environment:
      - MARIADB_AUTO_UPGRADE=1
      - MYSQL_RANDOM_ROOT_PASSWORD=1
      - MYSQL_DATABASE=traccar
      - MYSQL_USER=traccar
      - MYSQL_PASSWORD=traccar
    volumes:
      - /opt/docker/traccar/mysql:/var/lib/mysql:Z
      - /opt/docker/traccar/mysql-conf:/etc/mysql/conf.d:ro
    restart: unless-stopped
  server:
    image: traccar/traccar:latest
    restart: unless-stopped
    depends_on:
      - db
    volumes:
      - /opt/docker/traccar/conf/traccar.xml:/opt/traccar/conf/traccar.xml:ro
      - /opt/docker/traccar/logs:/opt/traccar/logs:rw
    ports:
      - 8082:8082/tcp
      - 5144:5144/tcp
      #- 5000-5150:5000-5150/tcp
      #- 5000-5150:5000-5150/udp
    labels:
      traefik.enable: "true"
      traefik.http.routers.traccar.rule: Host(`traccar.mydomain.tld`)
      traefik.http.routers.traccar.entrypoints: websecure
      traefik.http.routers.traccar.tls: "true"
      traefik.http.routers.traccar.tls.certresolver: le
      traefik.http.routers.traccar.service: traccar
      traefik.http.services.traccar.loadbalancer.server.port: "8082"
      traefik.http.routers.traccar-ot.rule: Host(`traccar.mydomain.tld`)
      traefik.http.routers.traccar-ot.entrypoints: traccar
      traefik.http.routers.traccar-ot.tls: "true"
      traefik.http.routers.traccar-ot.tls.certresolver: le
      traefik.http.routers.traccar-ot.service: traccar-ot
      traefik.http.services.traccar-ot.loadbalancer.server.port: "5144"

(I had to create the traccar entrypoint on my Traefik, to allow incoming logs from OwnTracks to port 5144 of the Traccar server. Traefik is also doing the HTTPS for the incoming OwnTracks connection.)

Import PhoneTrack history to Traccar

Now that Traccar was happily running, I wanted to import my history from PhoneTrack into it. Luckily, I wasn’t the first with that idea. So it was as easy as exporting all PhoneTrack data to a CSV file:

sqlite3 --csv owncloud.db "select * from oc_phonetrack_points" > oc_phonetrack_points.csv

And then creating a MySQL query to import that CSV into Traccar’s database. As I wanted to keep metadata, especially the battery level, I had to use some trickery to get a valid JSON structure for Traccar:

LOAD DATA LOCAL INFILE '/tmp/oc_phonetrack_points.csv'
INTO TABLE tc_positions
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(
  @id, @deviceid, @lat, @lon, @timestamp, @accuracy, @satellites,
  @altitude, @batterylevel, @useragent, @speed, @bearing
)
SET
  protocol="owntracks",
  deviceid=1,
  servertime=FROM_UNIXTIME(@timestamp),
  devicetime=FROM_UNIXTIME(@timestamp),
  fixtime=FROM_UNIXTIME(@timestamp),
  valid=1,
  latitude=@lat,
  longitude=@lon,
  altitude=@altitude,
  speed=@speed,
  course=@bearing,
  accuracy=@accuracy,
  attributes=JSON_COMPACT(
    JSON_MERGE_PATCH(
      IF(LENGTH(@batterylevel)>0, CONCAT("{\"batteryLevel\":", FLOOR(@batterylevel), "}"), "{}"),
      IF(LENGTH(@bearing)>0, "{\"motion\":true}", "{}")
    )
  );

And one restart of Traccar later, all my old records were there. Time to decommission my NextCloud installation.

https://blog.mbirth.uk/2024/01/06/tracing-my-steps-logging-where-ive-been.html

#google #gps #latitude #location #tracking
