I want to reversibly transcode arbitrary information (a digital signature), initially $n$ bits, into symbols in an alphabet with $s$ symbols, with little space loss. In my application‡ $s=45$. Thus I can e.g. transcode $n=192$ bits into $m=\lceil n/\log_2s\rceil=\lceil34.961\ldots\rceil=35$ symbols with less than $0.12\%$ density loss.
I have working C code for this using textbook base conversion, including the necessary arbitrary-precision arithmetic on an integer up to $s^m\approx2^n$. It requires $\mathcal O(n)$ bits of temporary space for that integer, and $\mathcal O(n^2)$ time. For the beauty of it, I'm in search of (not exclusively):
- a method achieving the same density (or at least $m=35$ for $n=192$, $s=45$) with a simpler symbols-to-bits conversion algorithm, even if somewhat more temporary space is needed to run it, and even if the bits-to-symbols conversion remains complex.
- a proof (or disproof) that for odd $s$, any method with density as good as base conversion requires $n$ bits of temporary space in the decoder, on top of the $m$-symbol read-only input.
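For context, the textbook method can be sketched as follows; this is a minimal illustration of bits-to-symbols conversion for $n=192$, $s=45$ (the names and layout are illustrative, not the author's actual code):

```c
#include <stdint.h>
#include <string.h>

#define NBYTES 24   /* n = 192 bits */
#define NSYMS  35   /* m = ceil(192 / log2(45)) */
#define BASE   45

/* Textbook base conversion: view bits[] as a big-endian integer and
   repeatedly divide it by BASE, emitting one remainder per symbol.
   Each long-division pass is O(n) work; NSYMS passes give O(n^2) time,
   and the working copy buf[] is the O(n) bits of temporary space. */
static void bits_to_symbols(const uint8_t bits[NBYTES], uint8_t syms[NSYMS])
{
    uint8_t buf[NBYTES];
    memcpy(buf, bits, NBYTES);
    for (int i = 0; i < NSYMS; i++) {
        unsigned rem = 0;
        for (int j = 0; j < NBYTES; j++) {       /* long division by BASE */
            unsigned cur = (rem << 8) | buf[j];
            buf[j] = (uint8_t)(cur / BASE);
            rem = cur % BASE;
        }
        syms[i] = (uint8_t)rem;  /* least-significant base-45 digit first */
    }
}
```

The decoder is the mirror image (multiply-accumulate over the symbols), which is why both directions need the full-width integer as scratch space.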
Update: I do not need compatibility with a pre-existing encoding. I could live without the decoder detecting that a combination of $m$ symbols among $s$ cannot be generated by proper encoding of $n$ bits.
‡ Encoding on "QR codes" using the "Alphanumeric mode", which is more compatible with various deployed readers than the "Byte mode". $s=45$ is such that $2\log_2(s)\approx10.9837\lesssim11$, allowing the QR code standard to store $2$ symbols in $11$ bits with less than $0.15\%$ space loss.
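The pairing behind that $11$-bit figure can be illustrated like this (a sketch under my reading of the QR Alphanumeric mode; `pack_pair`/`unpack_pair` are made-up names, not standard API):

```c
#include <stdint.h>

/* A pair (a, b) of base-45 symbols, each in 0..44, maps to the
   11-bit value 45*a + b; the maximum is 45*44 + 44 = 2024 < 2^11. */
static uint16_t pack_pair(unsigned a, unsigned b)
{
    return (uint16_t)(45u * a + b);
}

static void unpack_pair(uint16_t v, unsigned *a, unsigned *b)
{
    *a = v / 45u;
    *b = v % 45u;
}
```

This is just two-digit base conversion, so it costs the $11/(2\log_2 45)-1\approx0.148\%$ loss mentioned above.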
Update: The problem with Byte mode is that existing readers (physical devices, apps on a mobile phone) are harder to interface with other devices/software when their output is arbitrary bytes, e.g. a digital signature. Issues include 7-bit serial mode; XOFF (0x13) halting serial communications; binary mode being intentionally disabled to block denial of service by the previous method; NUL (0x00) terminating a C string; filters limiting strings to valid UTF-8.