rust-url: Regression in `set_scheme` for 2.1.1 vs 2.1.0

Cargo’s internal test suite is unfortunately failing after the update to url 2.1.1. One thing we’ve narrowed down so far is that the behavior of this program changed between 2.1.0 and 2.1.1:

use url::Url;

fn main() {
    let mut url = Url::parse("git://github.com/foo/bar").unwrap();
    println!("{:?}", url.set_scheme("https"));
}

On 2.1.0 this printed Ok(()), but on 2.1.1 it now prints Err(()).

Is this an intended change or perhaps an accidental bug? If it’s intended, is there a way we can get this working?
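
For context, the change in 2.1.1 appears to align set_scheme with the URL standard’s scheme setter, which refuses to switch a URL between a “special” scheme (http, https, ws, wss, ftp, gopher, file) and a non-special one such as git. A minimal probe of that hypothesis, with assertions reflecting the behavior observed on 2.1.1:

use url::Url;

fn main() {
    // Special -> special still succeeds on 2.1.1...
    let mut http = Url::parse("http://example.com/").unwrap();
    assert_eq!(http.set_scheme("https"), Ok(()));

    // ...but non-special -> special is now rejected.
    let mut git = Url::parse("git://github.com/foo/bar").unwrap();
    assert_eq!(git.set_scheme("https"), Err(()));
}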

Most upvoted comments

  1. What was the motivation for this change?
  2. Should such a breaking change be part of a patch release?
  3. The documentation only says set_scheme fails if:
    • The new scheme is not in [a-zA-Z][a-zA-Z0-9+.-]+
    • This URL is cannot-be-a-base and the new scheme is one of http, https, ws, wss, ftp, or gopher

    Neither condition seems to hold here (see the check below). If this is intended behavior, should the documentation be updated?
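
A quick check, using the example from the issue body, showing that neither documented failure condition holds and yet set_scheme still fails on 2.1.1:

use url::Url;

fn main() {
    let mut url = Url::parse("git://github.com/foo/bar").unwrap();

    // The URL has a host and a path, so it is not cannot-be-a-base.
    assert!(!url.cannot_be_a_base());

    // "https" matches [a-zA-Z][a-zA-Z0-9+.-]+, yet this still fails:
    assert_eq!(url.set_scheme("https"), Err(()));
}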

Unfortunately, the semver spec is not particularly precise about what “backwards incompatible” means, and almost by definition any bug fix changes behavior, so you clearly can’t say that every behavior change requires a new major release.

In the Rust ecosystem, RFC 1105 is basically canonical, and something based on it (perhaps brought up to date a bit) is planned for the Cargo book.

Semver does have some other applicable guidance.

I wonder if there should be a more lenient API for URLs (perhaps a separate crate that rust-url uses) where methods like set_scheme infallibly set the appropriate component, and parsing accepts weird cases like the one in #581.
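
As a rough sketch of what such a lenient setter might look like (the helper name with_scheme is made up for illustration): it sidesteps set_scheme entirely by rewriting the serialized URL and reparsing, so it accepts any scheme the parser itself accepts. It also works as a stopgap for the Cargo case above, at the cost of a reparse.

use url::Url;

// Hypothetical lenient setter: rewrite the serialized URL and reparse
// instead of going through set_scheme's spec-mandated checks.
fn with_scheme(url: &Url, scheme: &str) -> Result<Url, url::ParseError> {
    // Keep everything after the original scheme, e.g. "://github.com/foo/bar".
    let rest = &url.as_str()[url.scheme().len()..];
    Url::parse(&format!("{}{}", scheme, rest))
}

fn main() {
    let git = Url::parse("git://github.com/foo/bar").unwrap();
    let https = with_scheme(&git, "https").unwrap();
    assert_eq!(https.as_str(), "https://github.com/foo/bar");
}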

I just wandered into this madness today and I’m inclined to agree. It’s particularly baffling to me that it fails silently; that seems like the wrong behaviour. I also agree that bringing this up with the spec folks is likely a good idea. Personally, this being my first day looking at the spec, there’s a lot not to like about it, and its apparently particular focus on web use cases, as opposed to the fully general URLs that were originally envisioned, hurts the spec, IMO.