• albigu@lemmygrad.ml · 1 year ago

    The original Google document was so vaguely written that I just assumed the smartasses were pretending they had invented SSL certificates. Only now did I get that it’s for the server to verify the client, not the other way around, and that’s on a whole new level of absurd. Boy, I sure love the idea of having anti-cheat for my browser. The comments on GitHub are fun to read, at the very least.

    That, coupled with Manifest V3 crippling ad blockers, is way more reason than one needs to forget about Google as anything more than the video hosting service behind Invidious.

    From GitHub user kescherCode:

    I believe that whenever possible, we shall implement this spec into our own websites - but reversed. Whenever the attestation passes, let users not access your site. When it fails or this proposed API is unavailable - let users access your site.

    Edit: the mad lad actually started implementing it here.
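
    For anyone curious what that inversion might look like in practice, here is a rough, hypothetical sketch in TypeScript, assuming the navigator.getEnvironmentIntegrity() call described in the WEI explainer (the API was never standardized or shipped, so the exact names and behavior are guesses):

    ```typescript
    // Hypothetical "reversed WEI" gate, per kescherCode's suggestion.
    // Assumes the navigator.getEnvironmentIntegrity() surface sketched in the
    // WEI explainer; the proposal never shipped, so this is illustrative only.
    declare global {
      interface Navigator {
        // Proposed (never standardized) attestation call, typed here as an assumption.
        getEnvironmentIntegrity?: (contentBinding: string) => Promise<unknown>;
      }
    }

    async function gateVisitors(): Promise<void> {
      const attest = navigator.getEnvironmentIntegrity;
      if (typeof attest !== "function") {
        return; // API unavailable: let the visitor in.
      }
      try {
        await attest.call(navigator, "example-content-binding");
        // Attestation passed: per the reversed spec, turn the visitor away.
        document.body.textContent =
          "Your browser passed Web Environment Integrity attestation. Access denied.";
      } catch {
        // Attestation failed or was refused: welcome.
      }
    }

    void gateVisitors();

    export {};
    ```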

  • doppelgangmember@lemmy.world · 1 year ago
    Just made the switch back to DuckDuckGo

    The Google results are atrocious. Fkin’ seven “sponsored” results. Almost two pages of fluff.

    • Łumało [he/him]@lemmygrad.ml (OP) · 1 year ago
      DuckDuckGo is only a metasearch engine; they don’t have their own index. They use Bing’s index and thus have very similar results, which are more often than not absolute dogshit: SEO-manipulated, completely unhelpful, and something you have to sift through.

      I would know, I use it on the daily and am currently looking for an alternative.

      SearX looks the most promising, but I haven’t really dived into it. Well… ever.

      And DuckDuckGo still has me somewhat won over, because html.duckduckgo.com exists, along with something I can only call peak web design (no joke, I truly love it): lite.duckduckgo.com
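
      If you want to poke at those endpoints programmatically, here is a tiny hypothetical sketch, assuming the ?q= query parameter that the site’s own HTML form submits (DuckDuckGo may rate-limit or CAPTCHA non-browser clients, so treat it as illustrative only):

      ```typescript
      // Hypothetical: fetch the lite DuckDuckGo frontend directly.
      // Assumes the ?q= parameter used by the site's own search form.
      async function liteSearch(query: string): Promise<string> {
        const url = `https://lite.duckduckgo.com/lite/?q=${encodeURIComponent(query)}`;
        const res = await fetch(url);
        if (!res.ok) {
          throw new Error(`lite.duckduckgo.com responded with ${res.status}`);
        }
        // Returns the plain, table-based HTML page (the "peak web design" in question).
        return res.text();
      }

      // Example: liteSearch("web environment integrity").then((html) => console.log(html.length));
      ```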

      • Prologue7642@lemmygrad.ml · 1 year ago
        SearX, or even better, SearXNG, is probably the best option right now. But it is still just a metasearch engine; unfortunately, there is currently no usable FOSS option with its own index.

    • Bloops@lemmygrad.ml · 1 year ago
      Unless they’ve reversed it, DuckDuckGo censors Russian news now. That’s why I moved to Brave, but it lacks many features, so honestly I’m currently without any satisfactory search engine.

    • albigu@lemmygrad.ml · 1 year ago
      I dropped Google almost entirely late last year, when every search of mine returned a dozen “Untitled” fake pages full of spam text and possibly malware, apparently exploiting some WordPress bug. How can the main product of a trillion-dollar company fail so badly against such basic search-engine manipulation?

      • Walter Water-Walker@lemmygrad.ml · 1 year ago
        Because it’s not “failing”; the Internet is fragmenting into large factions. Step 1 has been capitalism commodifying information, resulting in most information taking one of two forms: siloed off into private databases, or SEO sites designed to funnel people into those silos. Step 2 is now taking the form of political factions as the world becomes multipolar again (China, the USA and Russia, essentially). Large superpowers push their dominance in the form of information, each claiming they’re right so you don’t need to worry about all that other stuff.

        The concept of the Internet being global is dead. The concept of the Internet being content created by regular people for regular people is dead. Search engines are just showing us this with their results, which is why sites like DuckDuckGo aren’t even enough at this point. We are, in effect, just entering a “dark age” of the Internet, where we can expect whole sections to simply not be accessible to us anymore.

        To build an entirely new search engine, with broad indexing, that shows us real, relevant results again, you’d need massive amounts of capital. You can’t get that capital from private investment because there’s no profit in doing this. You can’t get it from public funds because it’s not in the interest of governments to go against their chosen hegemony. You’d have to grass-roots the effort, which would be a very precarious endeavor that, even if it launched (which would be a miracle), would be difficult to maintain. Even models like Wikimedia aren’t actually sustainable. Wikipedia ultimately caters to the Western world, as its moderators and editors largely share the same hegemonic viewpoint. And Wikimedia doesn’t have to index the entire web, which, at minimum, requires substantially more compute power than they currently use.

        EDIT: It’s also worth mentioning at this point that a project to “fix the Internet” is probably low on the list of things to “fix” right now. Any leftists taking on the challenge should be aware that they’re spending gobs of resources and time on something only the upper tier of individuals in this world care about. Time and money would be better spent on humanitarian aid as people are displaced by climate change, or on funding revolutions in areas of the world that are ready for them. By no means am I suggesting we should do a Google rewrite. I’m simply pointing out that everything we should have predicted capitalism would do to information is coming to pass, and while things are shitty, they’ll definitely get shittier and we shouldn’t be surprised. Nor should we really care much, IMO, because the core issue is just the contradictions within capitalism manifesting themselves in the Internet. Trying to fix the Internet is working backwards and, frankly, will probably be a fruitless endeavor. We Marxian-read leftists should know better.