The claim that Apple might lose developers may turn out to be true (or the opposite, for what it's worth), but parts of your reasoning are wrong.
You write that you test your "Linux scripts" using a Parallels VM. Linux has run on ARM for quite a while, and Apple has already demonstrated virtualizing Linux on macOS Big Sur. Docker will be possible, and VMs like Parallels and VMware too. Most user-space things in Linux don't care about the CPU architecture. So if you're not a kernel driver developer for x86 Linux, you should see no problems. If you're a kernel driver developer for ARM Linux… well, even better. Some time ago there were certain projects, like MongoDB, that only ran on x86 – but even that is now available on ARM. And ARM server machines are coming fast now.
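To illustrate the point: a typical user-space "Linux script" doesn't need to know the CPU architecture at all, and the rare one that does can just branch on `uname -m`. A minimal sketch (POSIX sh only; the ARM64/x86_64 labels are the common values reported by `uname -m` on those platforms):

```shell
#!/bin/sh
# Most scripts need no architecture check at all; this sketch shows
# the one-liner branch for the rare script that does (e.g. to pick a
# prebuilt binary). uname -m reports the machine architecture.
arch="$(uname -m)"
case "$arch" in
    x86_64)        label="x86_64" ;;
    aarch64|arm64) label="ARM64" ;;   # Linux says aarch64, macOS says arm64
    *)             label="$arch" ;;
esac
echo "running on $label"
```

The same script runs unchanged in an x86 VM today and in an ARM VM on an Apple Silicon Mac tomorrow – which is exactly why the "my Linux scripts break" worry is overstated.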
Boot Camp might be dead, but running Windows apps on a Mac is not, yet. While it might not be possible to virtualize x86 Windows on an ARM Mac, you can run it emulated. That won't offer the full performance of the machine, but it should be enough to test websites in Windows-only browsers. Windows-only browsers… well, Firefox and Chrome run natively on the Mac. The new Edge is based on Chromium and has a native Mac version too. So we're talking about the old Edge (a small percentage of usage) and really old browsers like IE11 and older. So even this problem is getting smaller as time goes on.
iOS and Android both run on ARM – with Macs being ARM, it will be possible to virtualize those OSes on a Mac instead of emulating them.
Windows may have been the dominant OS of the past – nowadays the mobile OSes are the dominant ones. Smartphones are the "real" personal computers, in the hands of far more people than Windows ever reached.
So – most of your points are non-issues, or could even turn out to be improvements for developers.