Starting from the bottom:
maybe handle user data (e.g. /home) and the system (/usr, /var, /etc) with separate tools (I'm thinking git vs rsync)
Right approach! In fact, most distros, Debian included, stick to common standards for their directory layout; that allows you to end up with the same system if you keep the apt-installed software in sync, the configuration in /etc selectively in sync (things like your host name, storage configuration such as /etc/fstab, your power settings and so on are configured there, and you might want to sync some, but not all of it), and /var/lib, which holds mutable but persistent state (from containers to editor plugins).
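For the "selectively in sync" part, rsync's --files-from is one way to do it; a minimal sketch, assuming the other machine is reachable as `workstation` (placeholder name) and that you've already decided which files are safe to share:

```bash
# etc-sync.list: the parts of /etc you decided are worth sharing (illustrative picks).
# Machine-specific files like /etc/fstab or /etc/hostname would stay out of this list.
cat > /root/etc-sync.list <<'EOF'
apt/sources.list
environment
default/keyboard
EOF

# Pull only those paths from the other machine (hypothetical host "workstation");
# paths in the list are relative to the source directory, /etc/.
rsync -av --files-from=/root/etc-sync.list workstation:/etc/ /etc/
```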
If you have some non-apt-installed software in /usr/local, you need to look at that individually; generally, /usr/local/etc and /usr/local/var/lib might be worth syncing.
I don't know whether everything in $HOME (excluding ~/.cache, and, similar to /etc, treating ~/.config selectively) is appropriately handled by git – actually, no, it's not. But rsync, restic etc. are suitable methods for synchronization.
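For the rsync route, a minimal sketch (user name and host are placeholders; note this is one-way, so for bidirectional sync you'd want something like the syncthing mentioned at the end):

```bash
# Pull $HOME from the other machine, skipping caches and other churn;
# --delete makes the local copy match exactly, so add excludes as you find them.
rsync -av --delete \
    --exclude='.cache/' \
    --exclude='.local/share/Trash/' \
    workstation:/home/user/ /home/user/
```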
near-realtime sync ("hot" sync) vs sync on boot/shutdown ("cold" sync - probably safer?)
dangerous for things outside of /home/, because the services of your system will already be accessing those files; and you don't want to randomly install stuff during bootup or shutdown either (especially not the latter, because how would you even notice if something went wrong?).
So, I'd go with: "hot"-sync user data, but leave software/system sync to dedicated reboots. Split system sync into two jobs: making sure you have the same packages installed as on the other machine, which is a job for apt-get, and syncing /etc and /var/lib, which is a job for rsync.
I'd probably just create a file (e.g., /var/lib/syncnextboot) and have a systemd unit activated at boot that checks for the existence of said file and exits successfully if it isn't there. Otherwise, it fetches the package list from wherever, runs apt-get to install that exact list of packages, then rsyncs over the right parts of /etc and of /var/lib, and only after all of that has succeeded does it delete the sentinel file and exit successfully; otherwise it just exits unsuccessfully (your system is then in what systemctl would call a "degraded" state, but that's not a problem beyond not being synced).
The /home rsyncing can be done opportunistically, as a service that depends on network-online.target, is launched once the machine is online, and checks for reachability of the coordination server.
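For instance, a small script run by such a service (or from a timer), with the coordination server's name being a placeholder:

```bash
#!/bin/sh
# Opportunistic $HOME sync: bail out quietly if the coordination server isn't reachable.
SERVER=syncserver.example          # hypothetical coordination server

ping -c1 -W2 "$SERVER" >/dev/null 2>&1 || exit 0

# Run as your user; pushes the home directory minus caches.
rsync -av --exclude='.cache/' "$HOME/" "$SERVER:/backups/$(id -un)/"
```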
This all sounds like you might like layered, immutable-base operating systems, which make stronger guarantees on the ability to take all changes to a system and transport them anywhere.
In the Fedora world, such an OS would be Fedora Silverblue. I'm not sure whether there's an equivalent Debian-based OS. Wikipedia suggests "Endless OS", but I'm always wary of distros that I haven't heard of; they typically have problems that no one has heard of before, too ;) Still, you could give it a try if all the software you need comes with it or is available as a Flatpak. Or, if you're not married to Debian, try Silverblue, which, unlike Endless, not only lets you install software via Flatpak, but also lets you define software layers in which you can install "normally" packaged software (i.e., installed via dnf/rpm, Fedora's equivalents of apt-get/dpkg).
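For reference, layering a package on Silverblue versus installing a Flatpak looks roughly like this (the package and app names are just examples):

```bash
# Layer a regular RPM package on top of the immutable base image;
# the change takes effect after the next reboot.
rpm-ostree install vim

# Flatpak apps live outside the base image entirely.
flatpak install flathub org.mozilla.firefox
```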
different hardware configuration might lead to some stuff being different (kernel version and/or modules, /etc/modprobe.d, /etc/modules-load.d and so on)
mostly not a concern; usually solved by simply excluding, on the workstation, the packages that bring in the laptop's graphics card drivers, and vice versa.
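E.g., when generating the package list on one machine, filter out the hardware-specific packages before applying it on the other (the name patterns are just examples; adjust to what's actually installed):

```bash
# List manually-installed packages, minus hardware-specific ones.
apt-mark showmanual | grep -v -E 'nvidia|firmware|xserver-xorg-video' > packages.txt
```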
not everything in the system state might be reflected by the file system (unless I sync a "cold state", e.g. during boot and shutdown)
But that's also not state that makes sense to replicate, so, don't care!
conflict management
Hard because hard. I'd advocate for having the self-discipline to not uninstall software using apt on one device while installing it on another. The rest might be fairly rsyncable.
how to handle installed packages? (i.e. via "manifest files" + redo installs/removals vs sync the actual package contents in the file system + /var/lib/dpkg)
Yes! Core problem! Luckily, as linked to above, keeping the packages in sync is the core solution to the core problem, and apt-get and dpkg make that easy.
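One way to do the manifest approach is the classic dpkg selections round-trip (alternatively, the apt-mark showmanual list shown above):

```bash
# On the source machine: dump the full selection state (install/deinstall flags).
dpkg --get-selections > selections.txt

# On the target machine: replay it and let apt act on the result.
# (You may need dpkg's "available" database populated first:
#  apt-cache dumpavail | dpkg --merge-avail)
dpkg --set-selections < selections.txt
apt-get dselect-upgrade -y
```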
apt
.syncthing
useful for some of your $HOME data.