I run Debian 12 and gratefully use the mxrepo to get some packages that are not present in the Debian repos.
sources.list excerpt:
Code:
deb [signed-by=…MX23-archive-keyring.gpg] http://mxrepo.com/mx/repo/ bookworm main
Since this afternoon (European time) I have also been getting the "Clearsigned file isn't valid" error message.
Code:
#LC_ALL=C apt update
Hit:1 http://security.debian.org/debian-security bookworm-security InRelease
Hit:2 http://ftp.de.debian.org/debian bookworm InRelease
Hit:3 http://ftp.de.debian.org/debian bookworm-updates InRelease
Hit:4 http://ftp.de.debian.org/debian bookworm-backports InRelease
Get:5 http://mxrepo.com/mx/repo bookworm InRelease
Err:5 http://mxrepo.com/mx/repo bookworm InRelease
Clearsigned file isn't valid, got 'NOSPLIT' (does the network require authentication?)
Reading package lists... Done
E: Failed to fetch http://mxrepo.com/mx/repo/dists/bookworm/InRelease Clearsigned file isn't valid, got 'NOSPLIT' (does the network require authentication?)
E: The repository 'http://mxrepo.com/mx/repo bookworm InRelease' is no longer signed.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
When I request /mx/repo/dists/bookworm/InRelease via web browser, wget, or curl, I get a 301 redirect to https and, after following the redirect, the requested file.
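For reference, this is easy to check from the command line; a minimal sketch (any recent curl should do):
Code:
# Show response headers and follow the 301 to https (-s silent, -I HEAD, -L follow redirects)
curl -sIL http://mxrepo.com/mx/repo/dists/bookworm/InRelease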
When the file is requested by the apt-cacher-ng proxy, the server responds with some odd HTML content that looks a bit like a domain-parking page.
I think this might be due to apt-cacher-ng sending the Host HTTP header second rather than first (the first header is the User-Agent).
AFAICT this order is unusual but not forbidden: the latest HTTP RFC says a client SHOULD, not MUST, send Host as the first header field ( https://www.rfc-editor.org/rfc/rfc9110#section-7.2-4 ).
I attach two Wireshark TCP-stream ASCII dumps. One was captured from an original Apt-Cacher-NG request (I replaced some content with "…redacted…" to remove possible identifiers and reduce bloat).
The other is from my attempt to mimic the behavior with wget/curl, as far as I got; neither lets me send the User-Agent as the first header. A raw-request sketch is below.
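Since wget/curl won't reorder headers, one way to reproduce the unusual order is a raw request via netcat. A minimal sketch, assuming nc is available; the User-Agent value here is a placeholder, the exact string should be taken from the capture:
Code:
# Send User-Agent before Host, mimicking what apt-cacher-ng appears to do
printf 'GET /mx/repo/dists/bookworm/InRelease HTTP/1.1\r\nUser-Agent: Debian Apt-Cacher-NG\r\nHost: mxrepo.com\r\nConnection: close\r\n\r\n' \
  | nc mxrepo.com 80 | head -n 20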
Any ideas how to solve this issue?
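As a stopgap while debugging, I could probably bypass the proxy for this one host with apt's per-host proxy override (a sketch, placed in a file under /etc/apt/apt.conf.d/), but I'd prefer to understand the root cause:
Code:
# Stopgap sketch: have apt fetch from mxrepo.com directly, skipping apt-cacher-ng
Acquire::http::Proxy::mxrepo.com "DIRECT";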