also i hope the curl joke is correct i don’t use it that much

also i hope there are itysl fans here

    • De Lancre@lemmy.world · 7 points · 22 hours ago

      Now you have syntax highlighting and JSON parsing. Total install size: ~10MB. Total startup time: instant. Total RAM usage: negligible. Total feelings of superiority: immeasurable.

      As someone who often makes curl API requests and also came up with the idea of piping them into jq, I felt superior just by reading this. Thank you.
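      The superiority in question is just curl's output piped into jq. A minimal sketch; the inline JSON below stands in for a response from a hypothetical API, where you'd normally write `curl -s <url> | jq ...`:

```shell
# jq filters and pretty-prints JSON from any source.
# The echoed string stands in for `curl -s https://api.example.com/users`:
echo '[{"name":"ada","id":1},{"name":"grace","id":2}]' | jq -r '.[].name'
# prints:
# ada
# grace
```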

      • krashmo@lemmy.world · 52 points · 2 days ago

        I really like that the link at the bottom of the page says “check out more stuff you should be using” but that URL takes you to a page where the curl article is the only link.

        “more coming soon. Or not. I don’t owe you shit”

        A majestic site indeed. Bravo sir.

    • azimir@lemmy.ml · 4 points · 1 day ago

      I feel bad that I read that page in Chrome. I’m a failure of a techie.

      Tomorrow I must atone by teaching more students terminal commands. Maybe making web API calls with cURL. Or grabbing some eviloverlord.com quotes?

    • Hadriscus@jlai.lu · 6 points · 1 day ago

      the hostility is hilarious 😂

      especially since I had no idea what any of it means 😂 and yet it manages to sound so obvious

      • luciferofastora@feddit.org · 4 points · 14 hours ago

        The low-tech version of it is that there is a certain technical way of asking computers for stuff, usually across the Internet. To do that as a human, for instance to build a system that can ask or answer these questions, there are a few different tools.

        One is Postman, which is a fancy, graphical tool that essentially loads a full browser for interacting with parts of the question, sending it and displaying the answer. Of course, it’s a commercial product, so some nice features are locked behind paywalls. The full browser also requires more memory and more time to load.

        Another is cURL (usually invoked as curl), which is a command line tool, meaning you enter the various parts of the question as text. For more complex questions it’s a little less convenient to remember how to specify and adjust those parts, because instead of a nice table where you enter them, you have to type everything out. But you can use various other features of the command line to make that easier for yourself, and in some cases such tools have advantages over graphical ones, allowing you to do things the other can’t (or at least not easily).
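        As a hedged illustration of "entering the parts of the question as text": the endpoint and payload below are made up, and the file:// fetch is only there so the example runs without a network.

```shell
# Self-contained demo: fetch a local file through curl's file:// handler.
printf '{"status": "ok"}' > /tmp/answer.json
curl -s file:///tmp/answer.json
# prints {"status": "ok"}

# Against a real (hypothetical) web API, the parts of the question
# become flags: -X picks the method, -H adds headers, -d carries the body.
# curl -X POST https://api.example.com/orders \
#   -H 'Content-Type: application/json' \
#   -d '{"item": "coffee", "qty": 2}'
```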

        The whole page is a rant about people using the heavier and commercial graphical tool instead of the lighter and freely available command line tool. It’s a tongue-in-cheek continuation of an old argument in some software developer circles, where you will have people who prefer to use certain graphical tools and others that not just prefer command line tools (or generally text-based code), but also feel like everyone should just use them instead. For some, that’s just friendly banter. For others, it’s a deeply ideological conviction.

        Personally, I’d suggest that people use whatever works for them and their team. For example, I have little choice but to use a mostly graphical tool because I work together with people who don’t have the time to learn the text-based options. They might be a great tool for me, but if my work is unusable to others, that makes it harder to work together (and accordingly means that I’d be stuck doing everything myself, which I frankly don’t have the time for).

        • Hadriscus@jlai.lu · 2 points · 9 hours ago

          Dope, thanks for the ELI5, really appreciate it. I did look up curl and Postman after the fact and was able to understand some of the context. What eludes me are the specifics of the commands in question, but I can’t expect to understand those without having studied networking, I suppose. That’s fine. Cheers!!

    • marighost@piefed.social (OP) · 3 points · 1 day ago

      The few times I do use curl, all I can think of is this sketch. Glad there are some TR fans here too 😁

  • synae[he/him]@lemmy.dbzer0.com · 72 up / 1 down · 2 days ago

    Whenever someone at work says they have trouble with a web service or API, I’m like “Idk, I can curl it just fine. What does your request look like?” And you would not believe how many developers get confused by this question. It’s so goddamn frustrating.

    • Cousin Mose@lemmy.hogru.ch · 7 points · 2 days ago

      I get the same confusion when I prove someone wrong using a universal curl example. The same guy that parses JSON by hand (rather than use a library) can’t remember how to fucking use curl.

      • luciferofastora@feddit.org · 2 points · 14 hours ago

        The same guy that parses JSON by hand (rather than use a library)

        Of course I know him. He’s me.

        Or a past version of me anyway that was too dumb for its own good and also tried to do datetime stuff and several other complex things. Daft moron, that guy. Glad I don’t have to work with such idiots any-

        Looks at coworkers

        Glad I can now look down on such-

        Looks at own work

        You know what, never mind.

    • funkless_eck@sh.itjust.works · 8 points · 1 day ago

      the sketch is about reselling hair serum to bald men, he controls them because he threatens to withhold the serum unless he can dictate their haircuts.

      “You plant the seeds, you get to look at the trees!”

  • pelya@lemmy.world · 28 points · 2 days ago

    Recursively dumping all data from a server was always a wget thing; it will create a nice directory structure for you and will also convert links in webpages to point to your local file system.
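    A sketch of the wget invocation being described, for reference. The URL is a placeholder; the flags are standard wget options:

```shell
# Mirror a site into a local directory tree:
#   --mirror           recursive download with timestamping
#   --convert-links    rewrite links in saved pages to point at local copies
#   --page-requisites  also fetch the images/CSS a page needs to render
#   --no-parent        never ascend above the starting directory
wget --mirror --convert-links --page-requisites --no-parent https://example.com/docs/
```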

  • MonkderVierte@lemmy.zip · 8 points · 1 day ago

    aria2c -c -x 10 -k 5M rocks. Faster and more persistent than Firefox’s built-in downloader, while still working on most servers.

      • luciferofastora@feddit.org · 5 points · edited · 13 hours ago

        Excerpted from the manual linked in the other response by @funkless_eck@sh.itjust.works, for those who don’t want to dig through it:

        aria2 is a utility for downloading files. The supported protocols are HTTP(S), FTP, BitTorrent, and Metalink. aria2 can download a file from multiple sources/protocols and tries to utilize your maximum download bandwidth.

        Basically, a powerful and versatile file downloader. It includes options to distribute downloads across multiple connections, download a bunch of files at once, and automatically retry failed downloads, plus support for authorization, cookies, and a lot of other fancy features.

        @MonkderVierte@lemmy.zip has already explained what the specific flags in this case do:

        I think (a while ago)

        • -c is with a temp file to pick up again on failure
        • -x 10: # of connections
        • -k 5M: block size or something

        Specifically, -x is the maximum number of connections per server, relevant mostly if multiple downloads are queued at once. -k defines the minimum size per downloaded block: in this case, if the file is smaller than 10MiB it won’t be split, because the parts would each be smaller than 5MiB. If it is larger, aria2c will split it across multiple connections, up to a default of 5 (unless a different value is specified with -s).

      • MonkderVierte@lemmy.zip · 3 points · 1 day ago

        I think (a while ago)

        • -c is with a temp file to pick up again on failure
        • -x 10: # of connections
        • -k 5M: block size or something