libheif1 >= 1.18 on Debian 12 bookworm (installing packages from backports in Debian stable)

Add this line to /etc/apt/sources.list:
deb http://deb.debian.org/debian bookworm-backports main
Run:
# apt-get update
# apt install -t bookworm-backports libheif1
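To check that the backported build is the one that got installed, you can ask apt afterwards:
# apt policy libheif1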

This solves the issues encountered when decoding HEIF (.heic) files produced by iPhones running iOS 18, which caused, for instance, image rendering in Nextcloud to fail.

The Debian Backports instructions (backports.debian.org) have more details on using backports.


BackupPC - binary garbage in XferLOG.z / XferLOG.z getting huge

I run a BackupPC instance that is still on Debian 10 / buster. The latest rsync package for Debian 10 has version number 3.1.3-6.

I recently noticed infrequent issues when backing up hosts that are on either Debian 11 / bullseye or Ubuntu 22.04 (both of which ship rsync 3.2.3). The symptoms are as follows:

- backups take much longer than usual
- XferLOG.z starts "normally", but after a certain point contains a lot of binary garbage and grows much bigger than usual (hundreds of MB, in one case even 12 GB)
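
To quickly check which rsync version a client ships (the host name below is a placeholder):
ssh myclient rsync --version | head -n1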

After investigating and looking for information online, I came across these bug reports, which contain the explanation as well as a workaround: Debian Bug report #969463 and BackupPC issue #369.

=> The issue is caused by a combination of a change in default behaviour introduced with rsync 3.2.3 and a bug in File::RsyncP.

The solution that works for me (pending an upgrade to Debian 11 and BackupPC 4) is to add the following line to the per-host config of each affected host:
$Conf{RsyncArgsExtra} = ['--no-msgs2stderr'];
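
On Debian, the per-host config files usually live under /etc/backuppc/, named after the host; the path and host name below are assumptions, so adjust them to your setup:
# echo "\$Conf{RsyncArgsExtra} = ['--no-msgs2stderr'];" >> /etc/backuppc/myhost.pl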


Find large files in a BackupPC transfer log (sort files in XferLOG by size)

When your backups suddenly take much longer to complete, it could be because large files that were previously excluded from the backup have been renamed or moved somewhere else. To identify those files, or simply to sort the list of files by size, I use the following bash snippet:
cd /tmp/
BackupPC_zcat /var/lib/backuppc/pc/mypc/XferLOG.99.z > xlog
for S in $(sed -e 's/^[^/]*\/[0-9]*[ ]*//' xlog | cut -d' ' -f1 |
           egrep '^[0-9]+$' | egrep '[0-9]{9,}' | sort -n | uniq); do
    fgrep " $S " xlog
done
This will show the list of files bigger than 99999999 bytes (i.e. 100 MB and up). Remove "| egrep '[0-9]{9,}'" to list all files instead.

Note: on Debian, BackupPC_zcat is in /usr/share/backuppc/bin/. I've added a symlink in /usr/local/bin/ so that I don't need to look for it every time.
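For example:
# ln -s /usr/share/backuppc/bin/BackupPC_zcat /usr/local/bin/BackupPC_zcat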


Grafana PNG export on headless Debian server (phantomjs / render fails with "404 page not found")

If it still fails after you've set "root_url" to the correct value in grafana.ini, you might want to check whether you can run phantomjs from the command line.
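
A simple test is enough (on an affected headless machine, any invocation crashes right away):
phantomjs --version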

If you get "QXcbConnection: Could not connect to display / PhantomJS has crashed", then the explanation is here: Debian Bug #817277.

To fix it, I installed xvfb (apt-get install xvfb), and edited /usr/bin/phantomjs so that the last line now looks like this:
exec "/usr/bin/xvfb-run" --server-args="-screen 0 640x480x16" "/usr/lib/phantomjs/phantomjs" "$@"