• 39 Posts
  • 5.22K Comments
Joined 2 years ago
Cake day: March 22, 2024

  • Davinci works better in Linux. Vapoursynth mostly works better in Linux.

    RAW photo editing is already horrible in Windows if you’re trying to do HDR. To be fair, it’s horrible in Linux too. As much as I hate it, they can’t touch Apple there.

    See this post I just made: https://lemmy.world/post/41751454/21613633

    iOS will render HDR JPEG-XL, AVIF, and tiled HEIFs straight out of a camera, no problem. Heck, it will even display RAWs in the photo app. But it’s a struggle on Windows and Linux.


    And if by “professional use” you mean “Adobe,” I view that in the same way as still being on Twitter. At this point, subjecting yourself to Adobe on Windows is something you should do through gritted teeth.


  • You’d gain HDR!

    Windows is nearly effortless to maintain if you only use it for entertainment.

    Maybe we just have different priorities, but right now, I’d be miserable and wasting so much time if I were stuck on Linux only, even though I use Linux like 90% of the time. Some media and some games just won’t look right.

    And to emphasize, it would take sooo much time to massage this issue on Linux. Dual booting saves me a ton of maintenance and tweaking.


  • Here’s an interesting test:

    Say what you will about Safari and iOS, but it rocks with image format support. HDR JPEG XL and AVIF render correctly, and look like the original HEIF file from the camera.

    Helium (a Chrome fork) is on the left, Firefox on the right, running CachyOS Linux with KDE on a TV, HDR enabled from AMD output.

    Firefox fails miserably :(

    Chrome sorta gets the AVIF right, though it seems to lose some dynamic range with the sun.



  • Completely disagree.

    Even setting gaming aside, I’ve started taking family/fun photos in HDR instead of JPEG, and on things that can render them (like smartphones and TVs), they are gorgeous. I can’t imagine going back now.

    I took this on a walk this week, completely unedited:

    HDR HEIF

    If your browser doesn’t render that, here’s my best attempt at an AVIF conversion:

    HDR AVIF

    And JPEG-XL:

    HDR JXL

    On my iPhone or a TV, the sun is so bright it makes you squint, and as yellow-orange as real life. The bridge in shadow is dark but clear. It looks just like I’m standing there, with my eyes adjusting to different parts of the picture.

    I love this! It feels like a cold, cozy memory.

    Now if I crush it down to an SDR JPEG:

    It just doesn’t *look* right. The sun is a paper-white blob, and washed out. And while this is technically not the fault of encoding it as SDR, simply being an 8-bit JPEG crushed all the shadows into blocky grey blobs.


    …This is the kicker with HDR. It’s not that it doesn’t look incredible; it does. The software/display support is just not there.

    I bet most browsers viewing this post aren’t rendering those images right, if at all.

    Lemmy, a brand new platform, doesn’t even support JXL, HEIF, or AVIF! It doesn’t support any HDR format at all; I had to embed them from catbox.


  • Paid mods are a whole thing on top. You get stuff like hardcoded incompatibilities just to sabotage other mods, obfuscated code to sabotage compatibility, weird interpersonal spats, and what I can only describe as “Discord development cults.”

    But all that’s beside the point. If the publisher sanctions paid mods in some kind of marketplace, okay. But you can’t sneak under their nose and then complain when they notice you.



  • Dual booting is not bad!

    What I do is share an NTFS partition between Windows and Linux for bulk data. If they’re DRM-free, you can literally run the same games off the same drive.

    Something goes wrong? I can just delete the Windows partition and start over in 30 minutes, hardly losing anything. It’s so much better as a “disposable” OS.

    I also use two EFI partitions (the default Windows one and a new one for Linux) so there is zero possibility of the OSes interacting.

    To be blunt, I would never do banking on Windows if I could do it on Linux. It’s just too much of a risk.
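
    For reference, the shared-partition setup looks something like this in /etc/fstab (a sketch; the UUID and mount point are placeholders, and it assumes the in-kernel ntfs3 driver available since Linux 5.15, with ntfs-3g as the fallback filesystem type on older kernels):

```
# Shared bulk-data partition between Windows and Linux.
# The UUID is a placeholder: find the real one with `lsblk -f` or `blkid`.
# On kernels older than 5.15, use "ntfs-3g" as the filesystem type instead.
UUID=XXXX-XXXX  /mnt/shared  ntfs3  defaults,uid=1000,gid=1000,noatime  0  0
```

    One caveat worth knowing: disable Windows Fast Startup (and full hibernation) first, otherwise Windows leaves the NTFS volume marked dirty and Linux will refuse to mount it read-write.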



  • I’m on a Sony OLED with a 3090. I game some, and color grade photos/videos in HDR.

    …And I can’t get HDR to look right in KDE, even with the display running off my AMD IGP. It has basically zero options for me to tweak.

    So I use Windows for that.


    Honestly, it’s hard enough on Windows. It’s a coin flip whether apps work or not, and the TV needs adjustments for some, lest they crush blacks or blow out highlights/colors. Many games, specifically, need configurable mods to look right.

    One of my saddest video workflows is transcoding on Linux, and downloading the result to my iPhone to see if it looks right.






  • My issue is those “smaller communities” for my niches withered away, lost in the depths of SEO and attention machines.

    I’m not innocent there. I stopped participating in many of them in favor of Discord and Reddit, which, in hindsight, I feel sick about. But the draw of phone pings and algorithms and critical mass is very powerful, and that temptation didn’t exist back then.


  • I agree. It honestly makes me mad that people get in such a huff over using generative models for fiction; they’re just another generation of storytelling tools.

    The issue is blurring fiction and reality.

    This isn’t just a problem with AI. See: influencers, tabloids, and “news” that sell caricatures of reality.

    But AI makes it too, too easy to distribute fakeness in spaces that are supposed to be real. That is very dangerous. And this is what it ended up being used for.

    Nowadays I know much better how to verify information that’s important to me. A picture of a dog licking a cat and making her purr will always be emotionally positive for me, because a) it doesn’t matter outside of my satisfaction, just like a well-told story.

    …I think I’ve used generative models enough to get desensitized to the “feel good” bit. I guess I felt like you once, but having peeked behind the curtain, the feeling has gone away.

    But if they make you feel good, good. That’s what art’s supposed to do.


  • It’s still a fantasy though. People aren’t in control of their phones/feeds.

    Heck, we can’t even get the world to support JPEG-XL or HEIF or anything, much less take RAW pictures.

    And Twitter has convinced me there is absolutely no line these services can cross to get people to quit.


  • Sure. But I can make my own AI image of a cute dog, and where’s the satisfaction in that?


    Hence, I think it cracks open a bigger issue than AI: the ‘illusion’ of authenticity on social media. Our squishy brains doomscroll with the fantasy that the stuff is real, and candid, and honest, and gems we found…

    But that’s never really been true.

    It’s largely staged content designed to go viral and make someone a buck. Or sell something. And it’s served by billion dollar algorithms designed to model and hijack your brain.


    My hot take: people are upset that slop smashed that illusion with a hammer. Social media has been addictive fakeness for years; it’s just glaringly obvious now.


  • brucethemoose@lemmy.world to Memes@sopuli.xyz · Cant Decide 🤖 (edited, a day ago)

    I have a business idea:

    Vintage social media.

    Only media that verifiably existed on the internet before 2021 is allowed. That’s still billions of cute animal photos and videos.


    EDIT:

    And a sister project: RAW-only social media. Only photos/videos uploaded as raw sensor data (which even phones can capture now) are allowed. Metadata is stripped, and they’re post-processed by the site.

    Why? RAWs are technically possible to fake, but difficult enough to deter lazy slop spam. As a bonus, they can’t be heavily edited either; they’re unprocessed, unglamorous slices of reality. And they can be served in HDR with modern compression, as a cherry on top.

    …Now I just need a few billion dollars to host it, and about a trillion to survive anticompetitive attacks.
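
    The metadata-stripping part is the easy bit, at least for JPEG-family containers. Here’s a minimal Python sketch (the function name and the JPEG-only scope are my own simplifications; real RAW formats would each need their own handling) that walks the JPEG segment list and drops the APP1 (EXIF/XMP) and comment segments while keeping the image data intact:

```python
def strip_metadata(jpeg: bytes) -> bytes:
    """Drop APP1 (EXIF/XMP) and COM segments from a JPEG byte stream."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            # Malformed or already in entropy data: copy the rest verbatim.
            out += jpeg[i:]
            break
        marker = jpeg[i + 1]
        if marker == 0xD9:  # EOI: end of image
            out += jpeg[i:i + 2]
            break
        if 0xD0 <= marker <= 0xD7 or marker == 0x01:  # standalone markers
            out += jpeg[i:i + 2]
            i += 2
            continue
        # All other segments carry a 2-byte big-endian length (which
        # includes the length field itself, but not the marker).
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + seg_len]
        # Drop APP1 (0xE1, EXIF/XMP) and comments (0xFE); keep the rest.
        if marker not in (0xE1, 0xFE):
            out += segment
        i += 2 + seg_len
        if marker == 0xDA:  # SOS: entropy-coded data follows to the end
            out += jpeg[i:]
            break
    return bytes(out)
```

    A real implementation would also handle APP0 thumbnails, multi-segment EXIF, and, of course, every RAW container under the sun. But the principle is the same: keep the sensor data, throw away everything else.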