Proposal: Default scheme-less URLs to HTTPS

It's 2017. Most sites that I visit now support HTTPS, and even redirect to it from insecure HTTP. What does this change? I have one suggestion: Software that autolinks bare domain names as URLs should default to https:// instead of http://.

<rant>I should first mention that I'm actually not a big fan of messaging apps taking messages like "yeah I hate medium.com because of the persistent sharing dickbars" and turning "medium.com" into a link. It often doesn't match user intent -- most of the links I share and see shared are not to the root of a site, so this doesn't save me much convenience (maybe your experience is different), and I frequently refer to sites by their domain without wanting to suggest visiting them. (Sometimes that is very much not what I want, e.g. in discussions of shock sites.) Worse, there are now so many top-level domains that random dot-separated words become links. Some clever person bought the Moldovan domain readme.md to capture all the spurious links created whenever developers mention README markdown files in chat.</rant>

But, given that this feature isn't going to go away...

I think it's time to re-evaluate how "example.com" is turned into a URL. Traditionally you'd slap an http:// onto the front, because there was a good chance that would work and a poor chance that https:// would work. The tradeoffs are different now, though. Thanks to Let's Encrypt, vast swaths of the web have been secured at the transport layer. In the past year I've contacted several people to politely ask them to enable HTTPS, and within hours to weeks they had done it, often saying "oh hey thanks for the nudge, I'd been meaning to". The web has changed; SSL has changed. Most of the autolinked http links I click now redirect to https, or go to a site where my HTTPS Everywhere browser extension automatically redirects me to https.
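To make the change concrete, here's a minimal sketch of what I'm proposing (Python, with a deliberately naive domain regex; real autolinkers handle paths, ports, IDN, and the full TLD list -- the names here are mine, not any particular app's):

```python
import re

# Deliberately naive: matches word.word(.word...) with a 2+ letter final label.
BARE_DOMAIN = re.compile(r"\b((?:[a-z0-9-]+\.)+[a-z]{2,})\b", re.IGNORECASE)

def autolink(text: str, default_scheme: str = "https") -> str:
    """Turn bare domain names into links, defaulting to the given scheme."""
    def linkify(m: re.Match) -> str:
        domain = m.group(1)
        return f'<a href="{default_scheme}://{domain}">{domain}</a>'
    return BARE_DOMAIN.sub(linkify, text)

print(autolink("yeah I hate medium.com"))
# yeah I hate <a href="https://medium.com">medium.com</a>
```

The entire proposal is the default value of default_scheme; everything else stays exactly as it is today.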

When I click an http link to an https-supporting site, there's a window of time where a malicious ISP such as Verizon can attach tracking cookies, or a compromised café router can inject malware or redirect to a phishing site, or my existing cookies on the site can be snooped. That all happens before the redirect. Maybe the site has HSTS to force client-side redirects to HTTPS, but that only kicks in after the first visit (unless the site is on browsers' HSTS preload lists).
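You can watch that window yourself. A quick sketch using Python's `requests` library (example.com stands in for any https-supporting site; the exact redirect chain depends on the server):

```python
import requests

# Simulate what an http:// autolink does: start in the clear, follow redirects.
resp = requests.get("http://example.com/", allow_redirects=True)

for hop in resp.history:
    # Any hop requested over plain http was visible to (and modifiable by)
    # whoever sits on the path: the ISP, the café router, etc.
    print(hop.status_code, hop.url)

print("landed on:", resp.url)
# HSTS only helps on *later* visits, and only once this header has been seen:
print("HSTS:", resp.headers.get("Strict-Transport-Security"))
```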

On the other hand, when I click an https link to a site that only supports http... I get a load failure. If I'm savvy, I edit the "s" out; if I'm not, I complain to the person who sent the message, or eventually remember hearing that this is a thing and edit the "s" out myself. Maybe I do a web search. But these sites are getting kinda rare now!

So either way, there's a chance of bad things happening. The software has to guess. Is it better to have a chance of silently insecure communication, or a much smaller chance of a visible error? Remember, too, that the balance is ever moving towards secure sites, while deployed software often goes without updates: a tradeoff made in software now may be outdated in a year yet remain in place for decades. I think it's pretty clear: we should be looking to the future, and failing safe.

