So, serde seems to be downloading and running a binary on the system without informing the user and without any user consent. Does anyone have any background information on why this is, and how this is supposed to be a good idea?

dtolnay seems like a smart guy, so I assume there is a reason for this, but it doesn’t feel ok at all.

  • Mechanize · 1 year ago

    It seems it was done to marginally improve serde_derive build times? And just on x86_64-unknown-linux-gnu?

    It feels like a pretty weird course of action. Even if I can understand his point of view, his official stance of “my way or the highway” seems a bit stronger than needed, especially considering the number of problems - both moral and practical - that this modification raises.

    I don’t know. If he really feels so strongly about it, the only real option would be a hard fork, but a project of that magnitude, so deeply integrated into the ecosystem, is really not easy to either manage or substitute.

    Overall it kind of leaves a sour taste, even if - I repeat - I understand it is his time and his decision to make.

    • manpacket@lemmyrs.org · 1 year ago

      It seems it was done to marginally improve serde_derive build times? And just on x86_64-unknown-linux-gnu?

      Indeed. If you use Nix, then instead of compiling in 8 seconds it fails to compile almost instantly.

    • tatterdemalion@programming.dev · 1 year ago

      The same feature is planned for Windows and macOS. https://github.com/serde-rs/serde/pull/2523#pullrequestreview-1583726636

      The build time improvements are marginal at best in a production environment where hundreds of crates are built. This decision demonstrates a strange inversion of priorities and smells of premature optimization to me. It’s so odd to see even further optimizations building on this “serde helper process” pattern.
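
      To make the “serde helper process” pattern concrete, here is a rough sketch of how a proc macro (in a proc-macro crate) can hand expansion off to an external executable. This is only an illustration of the general idea, with a made-up helper name and protocol, not serde_derive’s actual code:

      // Hypothetical sketch of the "helper process" pattern (not serde's real code):
      // pipe the input tokens to an external binary as text and parse whatever
      // comes back on stdout as the generated code.
      use proc_macro::TokenStream;
      use std::io::Write;
      use std::process::{Command, Stdio};

      #[proc_macro_derive(Example)]
      pub fn derive_example(input: TokenStream) -> TokenStream {
          // "helper-binary" is a placeholder for an executable shipped with the crate.
          let mut child = Command::new("helper-binary")
              .stdin(Stdio::piped())
              .stdout(Stdio::piped())
              .spawn()
              .expect("failed to start helper");

          // Send the item being derived to the helper as source text.
          child
              .stdin
              .take()
              .expect("stdin was piped")
              .write_all(input.to_string().as_bytes())
              .expect("failed to write to helper");

          // Whatever the helper prints becomes the macro expansion.
          let output = child.wait_with_output().expect("helper did not run");
          String::from_utf8(output.stdout)
              .expect("helper output was not UTF-8")
              .parse()
              .expect("helper output was not valid Rust tokens")
      }

      Nothing in that flow ever asks the user anything; the helper simply runs as part of the build, which is the part people here object to.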

  • BB_C@programming.dev · 1 year ago

    I hate that I’m linking to Reddit, but I’m just reminded of this.

    Some of us knew where all the obsession with dependencies’ compile times would lead, and triggered the alarm sirens, if half-jerkingly, years ago.

    Compile times, and more specifically dependencies’ compile times, are and have always been the most overblown problem in Rust. We would have some sort of public sccache repositories or something similar by now if it were that big of a problem.

    And yes, I’m aware that proc-macro crates in particular present unique challenges in that field. But that shouldn’t change the general stance towards the supposed “problem”, and it should certainly not trigger the kind of obsession that leads to such a horrible “solution” as this serde one.

    • Mechanize · 1 year ago

      I hate that I’m linking to Reddit, but I’m just reminded of this.

      OT, but remember you can always use an archived link instead of a live one.

  • Barbacamanitu@lemmy.world · 1 year ago

    I get why the binary is there, but there really should be a simple way to force compilation instead of downloading a precompiled binary.

    Serde is incredible though, so it can get away with basically anything it wants.

    • manpacket@lemmyrs.org · 1 year ago

      Serde is incredible though

      Sure. A fork of it can be incredible too. In fact, the only difference could be a traditional approach to building the derive macro. All it takes is for people to switch.

  • Vorpal@lemmyrs.org · 1 year ago

    I saw some other crate doing something similar but using wasm; the idea is to sandbox the binary used as a proc macro, so that seems a bit better. Can’t seem to find it any more.

    EDIT: Found it https://lib.rs/crates/watt

  • Lucky@programming.dev · 1 year ago

    The second comment explains a lot. There is a build script that generates the binary, which they are using to reduce the overall build time. They mention this results from a limitation in Cargo and that this is a workaround.

    It seems like you could build it all from scratch if needed with a bit of effort.

  • TehPers@beehaw.org · 1 year ago

    I’m a bit confused: proc macros could always execute arbitrary code on developer machines. As long as the source for the precompiled binary is available (which seems to be the case here), how is this any different from what any other proc macro is doing?

    Edit: I should add that any package, macro or not, can also do so in a build.rs script.
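
    To make that concrete: a build.rs is an ordinary Rust program that Cargo compiles and runs before building the crate that ships it. A contrived example (not taken from any real crate):

    // build.rs: compiled and executed by Cargo before the crate itself is built.
    // Contrived example; nothing stops a real one from doing far more than this.
    use std::process::Command;

    fn main() {
        // Arbitrary code: read the environment, touch the filesystem, run programs.
        let user = std::env::var("USER").unwrap_or_default();
        println!("cargo:warning=build.rs is running as {user}");

        // Spawning other executables is just as easy (result ignored if it fails).
        let _ = Command::new("uname").arg("-a").status();
    }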

    • BatmanAoD@programming.dev · 1 year ago

      One problem is that the build isn’t easily reproducible: there are a few comments in that issue thread from someone trying to reproduce it and failing.

      • TehPers@beehaw.org · 1 year ago

        That seems like it could be an issue, but not the issue being raised by the post. The original post was talking about executing binary code on a user’s machine without consent. The thing is, this is how a lot of Rust packages work. Any package can have a build.rs that runs arbitrary code on a developer’s machine (it gets compiled into a binary automatically by Cargo). Any proc macro is arbitrary code that gets compiled into a binary and executed on a developer’s machine. In fact, any library, regardless of whether there’s a build.rs or it’s a proc macro, can have malicious code in it that gets executed when a developer calls a specific method.

        None of this is new. When done maliciously, it’s called a supply-chain attack. All packages can do this. This is part of why there’s been interest in executing some of this code in WASM runtimes within the compiler, so that developers can explicitly control the level of impact those packages can have on a developer’s machine. That being said, WASM doesn’t solve the fact that any package can just have malicious code in it that gets executed during runtime. This is why people should vet their packages themselves (when it’s important, at least) to ensure that this won’t happen to them.
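
        As a contrived illustration of that (made up, not anything from serde): a derive macro is itself compiled code that rustc loads and runs, so it can do things like this while your project is merely being compiled:

        // lib.rs of a proc-macro crate: a made-up derive that performs a side effect
        // (writing a file) while rustc expands it, then emits no code at all.
        use proc_macro::TokenStream;

        #[proc_macro_derive(Sneaky)]
        pub fn derive_sneaky(_input: TokenStream) -> TokenStream {
            // Arbitrary code running inside the compiler process on the dev machine.
            let _ = std::fs::write("/tmp/i-ran-during-your-build", "hello from a macro\n");

            // Expand to nothing; the project still compiles normally.
            TokenStream::new()
        }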

        • BatmanAoD@programming.dev · 1 year ago

          If the executable were easily reproducible from the source code, then yes, downloading a precompiled binary would be akin to executing code in build.rs or a proc macro. The fact that it’s not makes these very different, because it makes your suggestion of “vet[ting] their packages themselves” impossible.

          • TehPers@beehaw.org · 1 year ago

            Maybe I’m missing something, but I’m not seeing where in serde we’re downloading a precompiled binary. I see a script we can execute ourselves in the repository and an alternative serde_derive that uses that executable (after we compile it), but not where the actual published package has the executable.

            It’s possible I’m missing something here though.

            • BB_C@programming.dev · 1 year ago

              bsdtar tfv <(curl -sL https://static.crates.io/crates/serde_derive/serde_derive-1.0.183.crate)
              

              There, you will see that this file exists:

              -rwxr-xr-x  0 0      0      690320 Jul 24  2006 serde_derive-1.0.183/serde_derive-x86_64-unknown-linux-gnu
              

              Yes, that’s a pre-built binary in the crate source release. It’s that bad.

              • TehPers@beehaw.org · 1 year ago

                Looks like I missed that; I was checking locally, but I must have been looking at an outdated version of the package. I’d feel better about it if it were compiled on the user’s machine, which is the impression I was getting.

      • TehPers@beehaw.org · 1 year ago

        I’m not sure I follow what that link has to do with this, though. serde is open source; anyone can go compile it themselves. In fact, from what I can tell, to get the precompiled version of serde_derive, you need to compile it yourself anyway. Compiling these proc macros to binaries before executing the code isn’t new; this is what Cargo does with all proc macros.

        Also, I might be misreading the source here, but it looks like the executable needs to be manually compiled by the user on their own (by running the precompiled/build.sh script), and they need to manually add the precompiled variant of serde_derive as a dependency instead of using the version that’s on crates.io. Am I missing something here? Is this automatically used by the published version of serde somewhere?

        • manpacket@lemmyrs.org · 1 year ago

          No, serde_derive contains the binary, and if you are on Linux it will try to run it without asking the user. In fact, there’s no way to make it so it won’t run.

      • lolcatnip@reddthat.com · 1 year ago

        You can read the source of build.rs and the proc macros executed during a build, but do you? Does anyone do that every time they add a new dependency?

        • manpacket@lemmyrs.org · 1 year ago

          When adding a new dependency I almost always go over the source code to see what kind of performance to expect. If a build.rs is there, checking it takes a single click, so yes to that too. Derive macros less frequently, but you have to do it when the documentation is nonexistent.