Archived link

Polyfill.js is a popular open-source library used to support older browsers, and over 100K sites embed it via the cdn.polyfill.io domain; notable users include JSTOR, Intuit, and the World Economic Forum. In February this year, however, a Chinese company bought the domain and the GitHub account. Since then, the domain has been caught injecting malware onto mobile devices via any site that embeds cdn.polyfill.io. Complaints were quickly removed (archive here) from the GitHub repository.

  • originalucifer@moist.catsweat.com · +205/-34 · 4 days ago

    nah. over 100k sites ignored dependency risks, even after the original owners warned them this exact thing would happen.

    the real story is 100k sites not being run appropriately.

    • douglasg14b@lemmy.world · +113/-6 · 4 days ago (edited)

      That’s not how systemic problems work.

      This is probably one of the most security ignorant takes on here.

      People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes. And ignoring predictable outcomes to take some high ground doesn’t carry far.

      The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

      Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.

      • oce 🐆@jlai.lu · +8/-7 · 3 days ago

        Ok, people will always fuck up, so what do you do?

        The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

        All the organizations (including public ones) getting hit with ransomware and having data stolen: is that because the consequences are not that bad? Is it not gross negligence?

        • douglasg14b@lemmy.world · +1 · 1 day ago (edited)

          I’m not sure if this is just a rhetorical question or a real one?

          Because I didn’t claim it isn’t negligence. It is negligent; however, it is not a problem solvable by just pointing fingers. It’s a problem that’s solvable through stricter regulation and compliance.

          Cyber security is almost exactly the same as safety in other industries. It takes the same mindset, it manifests in the same ways under the same conditions, it tends to only be resolved and enforced through regulations…etc

          And we all know that safety is not something solvable by pointing fingers and saying “Well, Joe Schmo shouldn’t have had his hand in there then.” You develop processes to avoid predictable outcomes.

          That’s the key phrase here: predictable outcomes. These are predictable situations with predictable consequences.


          The comment above mine is effectively victim blaming, it’s just dismissing the problem entirely instead of looking at solutions for it. Just like an industry worker being harmed on the job because of the negligence of their job site, there are an incredibly large number of websites compromised due to the negligence of our industry.

          Just like the job site worker who doesn’t understand the complex mechanics of the machine they are using to perform their work, the website owner or maintainer does not understand the complex mechanics of the dependency chains their services or sites rely on.

          Just like a job site worker may not have a good understanding of risk and risk mitigation, a software engineer does not have a good understanding of cybersecurity risk and risk mitigation.

          In a job site this is up to a regulatory body to define, utilizing the expertise of many, and to enforce this in job sites. On job sites workers will go through regular training and exercises that educate them about safety on their site. For software engineers there is no regulatory body that performs enforcement. And for the most part software engineers do not go through regular training that informs them of cybersecurity safety.

          • oce 🐆@jlai.lu · +1 · 1 day ago (edited)

            I’m not blaming the single person who made a mistake; I’m blaming the negligence of the companies that cut corners for profit, so most of them.

            Your first comment read as if the organizations where this happens didn’t face bad consequences. Your new comment explains what you meant better, and I agree.

      • rottingleaf@lemmy.zip · +3/-10 · 3 days ago

        People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes.

        So what does that say about us moving away from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there, please. And no phone numbers.

        The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

        Boeing - we know where you’re goeing.

        Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.

        There’s one industry which kinda started like this, with proper HIG and standard key combinations and proven usability with screenreaders or by people with color blindness, autism, ADHD, whatever.

        Then came people talking in a tone similar to, sorry, yours in the “People will ALWAYS fuck up” part, saying that people want nice, dynamic, usable websites with lots of cool new features; that people are social; that they want girls with real photos, names, and phone numbers on their forums, which BTW should be called social nets.

        By the way, we already had that with Flash and Java applets, and some of what I remember was still cooler than modern websites of the “web application” paradigm are now. And we had personal webpages with real names and contacts and photos. And there were tools that made building them easy.

        These people just hated the existing culture with its individualism and depth: web applications should be able to own you rather than be just another kind of embedded content, personal webpages should all be the same, and of course normies wouldn’t come as guests into the nerdspace. No, they had those new social nets as their space, looking down on the nerds and freaks of my kind.

        Now - well, try using today’s web as a person impaired in any way.

        And those normies can’t really use it either; they too feel impaired, they just won’t admit it.

        • efstajas@lemmy.world · +4/-1 · 3 days ago

          So what does it say about us diverting from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there please.

          Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy. Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS. Also tables for layout fucking sucked in every possible way; for the dev, for the user, and for accessibility.

          people want nice, dynamic, usable websites with lots of cool new features, people are social

          That’s right, they do and they are.

          By the way, we already had that with Flash and Java applets, some things of what I remember were still cooler than modern websites of the “web application” paradigm are now.

          Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.

          And we had personal webpages with real names and contacts and photos. And there were tools allowing to make them easily.

          There are vastly more usable and simple tools for making your own personal websites today!

          • rottingleaf@lemmy.zip · +1/-3 · 3 days ago

            Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy.

            I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

            Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS.

            I disagree. Geminispace is very usable without scripts.

            That’s right, they do and they are.

            Well, then it appears they don’t care for what I need, so I don’t care for what they need. If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.

            And those industry rules I was answering about are about making a thing work for both, even if being less functional.

            Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.

            Sorry, but either you still make an argument or this isn’t worth much.

            For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.

            The security issues with Flash and Java applets weren’t much different from those in the other parts of a web browser back then.

            There are vastly more usable and simple tools for making your own personal websites today!

            I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.

            • JackbyDev@programming.dev · +2 · 3 days ago

              I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

              ??? Please don’t make weird blanket statements like this.

            • efstajas@lemmy.world · +3/-1 · 3 days ago (edited)

              I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

              But why? What’s bad about this?

              I disagree. Geminispace is very usable without scripts

              That’s great, I’m not saying that it’s impossible to make usable apps without JS. I’m saying that the capabilities of websites would be greatly reduced without JS being a thing. Sure, a forum can be served as fully static pages. But the web can support many more advanced use-cases than that.

              If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.

              So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.

              For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.

              Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’ve ended up after decades of evolving needs that had to be met.

              That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them. There were massive cross-platform compatibility problems, and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes. Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.

              Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around them, and can actually keep up with the plethora of different client devices we have today. And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.

              I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.

              The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.

              Besides — there’s nothing really preventing those old-school solutions from working today. If they’re so much better than modern offerings, why didn’t they survive?

              • rottingleaf@lemmy.zip · +1/-4 · 3 days ago

                But why? What’s bad about this?

                What I said, literally.

                But the web can support many more advanced use-cases than that.

                Which can be done with something embeddable, and not by breaking a hypertext system.

                So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.

                If those people don’t consider mine, then I don’t consider theirs. If I must consider theirs, they must consider mine.

                Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’re at after decades of evolving needs needing to be met.

                That says nothing; it’s a market/evolution argument. Something changes tomorrow and that will be the result of evolution too. Somebody uses a different system and that’s it for them.

                That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them.

                And today’s web browsers are as open as Microsoft’s OOXML. De facto proprietary.

                There were massive cross-platform compatibility problems,

                For Flash? Are you sure? I don’t remember such.

                and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes.

                Nothing was. Doesn’t tell us anything.

                Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.

                Yes, but an applet’s problems there wouldn’t spread to the HTML page embedding it. Unlike now.

                Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around, and can actually keep up with the plethora of different client devices we have today.

                I’ve already said how it’s similar to OOXML. Only MS documented their then-proprietary standard of their proprietary program and made it open, while Chromium is itself open, but somehow that doesn’t make things better.

                And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.

                That’s similar to the Apple walled-garden arguments. It’s valuable in areas other than security because it separates power between the browser developer and the plugin developer. And fighting monoculture is also good for security.

                Also people still use plugins, still separately updated, which still get compromised.

                Also plugins can be properly sandboxed.

                The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.

                Sorry, I still do feel that burden of proof, because for a static site like in 2002 I’d just export a page from OpenOffice, edit some links, and upload it.

    • ShaunaTheDead@fedia.io · +52 · 4 days ago

      One place I worked at recently was still using Node version 8. Running npm install would give me a mini heart attack… like 400+ critical vulnerabilities, and several thousand vulnerabilities overall.

      • corsicanguppy@lemmy.ca · +31/-1 · 4 days ago

        Running npm install would give me a mini heart attack

        It should; but more because it installs things right off the net with no validation. Consistency of code product is not the only thing you’re tossing.

        • LordCrom@lemmy.world · +16 · 4 days ago

          How else would you get LPAD? Expect me to write 2 lines of code when I could just import a 100 MB library to do it for me?
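
For reference, the joke lands because the function being alluded to (the infamous left-pad npm package) really is about two lines; a sketch:

```javascript
// Left-pad without the dependency. Modern JS even ships this
// built in as String.prototype.padStart.
const leftPad = (value, length, pad = ' ') => String(value).padStart(length, pad);

console.log(leftPad(5, 3, '0')); // "005"
```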

        • AA5B@lemmy.world · +6 · 4 days ago

          You need to get up to date from three years ago. NodeJS 16.20, or thereabouts, enabled dependency auditing by default.

          I’m still fighting my engineers to get current enough to use this (but we do have a proxy artifact server that also attempts to keep downloads clean, and a dependency scanner).

      • unalivejoy@lemm.ee · +6 · 4 days ago

        If you’re on RHEL 8+, you can install the latest version of node with dnf.

        dnf install nodejs will likely install node 8 :(. Use dnf module install nodejs:20 to install the latest version.

    • Optional@lemmy.world · +44/-3 · 4 days ago

      the real story is 100k sites not being run appropriately.

      Same as it ever was. Same as it ever was. Same as it ever was.

          • ScreaminOctopus@sh.itjust.works · +4 · 3 days ago

            Finding new ways webshits fuck up the most basic development principles boggles my mind. It’s like they intentionally stay ignorant.

            • nilloc@discuss.tchncs.de · +1 · 3 days ago

              We know, but we don’t have time to change. We have another site waiting to get slammed out as soon as the one we’re working on, which was underfunded and on a ridiculous timeline, goes live.

              There’s still a fair bit of “my nephew makes websites, it can’t be that [hard, expensive, time-consuming], oh and by the way, we need a way to edit every word and image on the site that both our intern and barely literate CEO can understand, even though we’re literally never going to edit anything ever.”

        • nyan@lemmy.cafe · +2 · 3 days ago

          They’re widely variable. PyPI gets into about as much trouble as npm, but I haven’t heard of a successful attack on CPAN in years (although that may be because no one cares about Perl anymore).

      • Warl0k3@lemmy.world · +66/-1 · 4 days ago (edited)

        I don’t think we have to choose. “Maintain your websites so you don’t get taken advantage of” and “Here’s an example of a major-world-power-affiliated group exploiting that thing you didn’t do” are both pretty important stories.

      • themurphy@lemmy.ml · +33/-1 · 4 days ago

        The malware thing still deserves a headline. They just argue it’s stupid that so many sites even have to use the library to begin with.

        • KairuByte@lemmy.dbzer0.com · +2/-1 · 3 days ago

          Have to use? No one has to use any library. It’s convenience, and in this case it’s literally so they don’t have to write code for older browser versions.

          The issue here isn’t that anyone has to use it; it’s the way it was used that’s the problem: directly linking to the current version of the code hosted by a third party instead of hosting a copy yourself.