by eesmith on 4/22/2024, 7:48:15 PM
You've got about 2,000 octets of data to work with; see https://stackoverflow.com/questions/417142/what-is-the-maxim... .
That link, while old, has a comment from last year pointing out 'While modern browsers will support longer URLs, search engines do not so the headline figure remains "under 2000 chars"'.
Sure, the real limit will be smaller than that. But "http:a.tv" is a valid start, so you've got nearly 2,000 more mostly arbitrary characters to play with. Call it 250 possible octet values ^ 1990 positions ~ 8E+4771 as a lower bound.
The real number is of course much larger than that. The link I gave points out other limits, like Cloudflare's 32 kB URI limit at https://developers.cloudflare.com/support/troubleshooting/ht... .
255 ^ (32*1024) ~ 3E+78857
I think that's a good upper bound.
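A quick sanity check on both figures in Python, working in log10 so the numbers stay printable (the 250/1990 and 255/32k values are the rough assumptions above, not exact counts of legal URL characters):

    import math

    def approx(alphabet_size, length):
        # Format alphabet_size ** length like 8E+4771 without building the full integer.
        exponent = length * math.log10(alphabet_size)
        mantissa = 10 ** (exponent - int(exponent))
        return f"{mantissa:.0f}E+{int(exponent)}"

    print(approx(250, 1990))       # ~8E+4771  (under-2000-chars practical limit)
    print(approx(255, 32 * 1024))  # ~3E+78857 (Cloudflare's 32 kB URI limit)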
by malfist on 4/23/2024, 12:43:36 PM
The RFC does not set a limit on the length of URLs.
So the answer is: ∞
I wonder what the total number of valid URLs is, including protocol, hostname, path, query parameters, etc. A sort of Drake Equation for URLs. This came up while wondering what the cardinality of a single repo on github.com is.
I assume it's more complex than [number of valid characters]^[browser character limit]. Networking equipment, browsers, archaic protocols, etc. all place limitations/requirements on this number.
Feel free to simplify it (e.g. a single domain) or make it more complex as necessary. For consistency's sake, let's assume I'm using Chromium stable on 2024-01-01 00:00:01 UTC and the request actually goes out onto the internet (so other hardware touches it).
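For what it's worth, here's what the naive [number of valid characters]^[browser character limit] estimate comes out to, assuming the ~85 characters that RFC 3986 allows to appear literally in a URL (unreserved + reserved, plus '%') and a ~2,000-character practical length limit; both figures are simplifications, not Chromium's actual behavior:

    import math

    valid_chars = 85     # RFC 3986 unreserved + reserved characters, plus '%'
    length_limit = 2000  # commonly cited practical limit; a simplification

    # Work in log10 rather than computing the full integer.
    exponent = length_limit * math.log10(valid_chars)
    print(f"roughly 10^{exponent:.0f} URLs")  # roughly 10^3859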