• رضا@lemmy.world · ↑147 · 1 day ago

    OP is not updating his Arch system regularly.

    His previous update must have been at least two hours old.

  • Buffalox@lemmy.world · ↑36 · 1 day ago

    22.8 GiB install size!?
    WTF?

    I must admit I don’t recall the size of my own installation, but that seems HUGE!
    Anyway, congratulations on getting it trimmed. 😋

      • Buffalox@lemmy.world · ↑4 · 4 hours ago

        Which distro has Factorio as part of the standard package system?
        Seems like a nice way to save €32,-.

    • rustydrd@sh.itjust.works · ↑16 · 1 day ago

      Larger than my entire root partition (currently at 21GB), but that’s because I made the fatal mistake of limiting the partition to 25GB when I set it up. So I have to keep it trim, and I envy OP deep down.

      • Buffalox@lemmy.world · ↑4 · 24 hours ago

        Haha, I did that once too, because I wanted a separate home partition that I could just reassign to my new install when upgrading.

    • silenium_dev@feddit.org · ↑32 · 1 day ago

      If you’re doing anything with GPU compute (Blender, AI, simulations, etc.), just ROCm, CUDA or oneAPI alone will take up half of that.

    • cole@lemdro.id · ↑4 · 1 day ago

      lol, mine is like 76GB. I have been running the same install for going on 9 years now.

    • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org · ↑2 · 1 day ago

      I probably got something like that. I am not really into minimal installs; kde-applications-meta and plasma-meta are what I go with. Absolutely everything.

      I just wish I could safely use KDE Discover for updates. That would probably work with “apply updates on reboot”, which sounds like the safest option. But for some reason packagekit-qt6, which would (probably) make this possible, is not recommended.

      Preferably I’d go with something like KDE Neon or Kubuntu. I just really like KDE. But there’s just no sweet spot for me. Arch gives me new packages with all the bugs; each update feels scary, what will I discover. Based on my Timeshift notes, the last point without major bugs was the 31st of October.

      Something like Linux Mint was stable, but I was missing some newer packages, and even drivers when my laptop was new. And major version upgrades also feel scary, although I don’t even know how they work. This is where Arch makes more sense to me: Linux as a desktop OS is really just a huge bunch of packages working together, and they slowly get updated. When packaged into an entire OS, how do you even define a version?

      • Buffalox@lemmy.world · ↑1 · 1 day ago

        I also use KDE, and it is far from minimal, but as I recall my system is only half that even after a full system upgrade!
        Some say creative software takes up a lot of room, but Blender, for instance, is only ½ a gig.

        But maybe my system is bigger than I remember, because even at 40 gig it’s nearly irrelevant compared to the size of an SSD today, and with 1 gigabit internet the upgrades are fast anyway.

        IDK if there’s a way to see the size of my actual Linux install, not counting 3rd-party media or games?
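
        One rough way to get that number is to add up what pacman reports for every installed package. This is only a sketch and assumes English-language pacman output; it also only counts what pacman manages, so Steam games and other stuff in /home are left out:

            pacman -Qi | awk '/^Installed Size/ {
                if ($5 == "KiB") s += $4 / 1024;
                else if ($5 == "MiB") s += $4;
                else if ($5 == "GiB") s += $4 * 1024;
            } END { printf "%.1f GiB of installed packages\n", s / 1024 }'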

  • MangoPenguin@lemmy.blahaj.zone · ↑14 ↓3 · 1 day ago

    A 6GB download is wild. Is it re-downloading the entire package for each one that needs an update? Shouldn’t it be more efficient to download only the changes and patch the existing files?

    At this point it seems like my desktop Linux install needs as much space and bandwidth as Windows does.

    • KubeRoot@discuss.tchncs.de · ↑3 · 10 hours ago

      Shouldn’t it be more efficient to download only the changes and patch the existing files?

      As people mentioned, that becomes problematic with a distro like Arch. With the busier packages, and if you don’t update very often, you could easily be jumping 5-6 versions in one update. That means you’d need to apply the diffs in order, and all of those diffs would need to stay available.
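
      For illustration, with a generic binary-delta tool such as xdelta3, catching up across several versions would mean fetching and applying every intermediate delta in order (the package and delta file names here are made up):

          # hypothetical delta chain from foo 1.0 to foo 1.3
          xdelta3 -d -s foo-1.0.pkg.tar.zst delta-1.0-1.1.vcdiff foo-1.1.pkg.tar.zst
          xdelta3 -d -s foo-1.1.pkg.tar.zst delta-1.1-1.2.vcdiff foo-1.2.pkg.tar.zst
          xdelta3 -d -s foo-1.2.pkg.tar.zst delta-1.2-1.3.vcdiff foo-1.3.pkg.tar.zst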

      This actually poses two issues, and the first one is that software usually isn’t built for this kind of binary stability - anything compiled/autogenerated might change a lot with a small source change, and even just compressing data files will mess it up. Because of that, a diff/delta might end up not saving much space, and going through multiple of them could end up bigger than just a direct download of the files.

      And the second issue is mirrors: they need to store and serve a lot of data, and they’re not controlled by the distribution. Presumably to save on space, they quickly remove older package versions, and when I say older, I mean potentially less than a week old. For diffs/deltas to work, the mirrors would not only have to store the full package files they already do (for any new installs), but also deltas going N days back, and those would only help people who update more often than every N days.

    • Ephera@lemmy.ml · ↑5 · 15 hours ago

      This doesn’t work too well for rolling releases, because users will quickly get several version jumps behind.

      For example, let’s say libbanana is currently at version 1.2.1, but then releases 1.2.2, which you ship as a distro right away, but then a few days later, they’ve already released 1.2.3, which you ship, too.
      Now Agnes comes home at the weekend and runs package updates on her system, which is still on libbanana v1.2.1. At that point, she would need the diffs 1.2.1→1.2.2 and then 1.2.2→1.2.3 separately, which may have overlaps in which files changed.

      In principle, you could additionally provide the diff 1.2.1→1.2.3, but if Greg updates only every other weekend, and libbanana celebrates the 1.3.0 release by then, then you will also need the diffs 1.2.1→1.3.0, 1.2.2→1.3.0 and 1.2.3→1.3.0. So, this strategy quickly explodes with the number of different diffs you might need.
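
      A quick sketch of that blow-up, counting one delta per (older, newer) pair of the hypothetical libbanana versions above:

          versions=(1.2.1 1.2.2 1.2.3 1.3.0)
          count=0
          for ((i = 0; i < ${#versions[@]}; i++)); do
            for ((j = i + 1; j < ${#versions[@]}; j++)); do
              echo "delta ${versions[i]} -> ${versions[j]}"
              count=$((count + 1))
            done
          done
          echo "$count deltas vs ${#versions[@]} full packages"   # prints 6 vs 4, and it grows quadratically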

      At that point, just not bothering with diffs and making users always download the new package version in full is generally preferred.

      • MangoPenguin@lemmy.blahaj.zone · ↑2 · 2 hours ago

        Interesting, so it wouldn’t work like rsync, which compares the new files to the old ones and transfers only the parts that have changed?

        • Ephera@lemmy.ml · ↑1 · 2 hours ago

          Hmm, good question. I know of one such implementation, Delta RPM, and it works the way I described.
          But I’m not sure whether they just designed it that way to fit into the current architecture, where all the mirrors and such were set up to deal with whole package files.

          I could imagine that doing it rsync-style would be really terrible for server load, since you can’t really cache things at that point…

    • Ricaz@lemmy.dbzer0.com · ↑1 · 11 hours ago

      No, that’s not how compiling works. And yes, 6GB is wild. If I don’t update for a month, the download might be 2GB, and the net size change is still smaller than that.

      I don’t think I could get close to my Windows installation even if I installed literally every single package…

    • Olap@lemmy.world · ↑16 ↓1 · 1 day ago

      Patching means rebuilding. And packagers don’t really publish diffs. So it’s use all your bandwidth instead!

      • definitemaybe@lemmy.ca · ↑29 · 1 day ago

        Which is WAY more economical.

        Rebuilding packages takes a lot of compute. Downloading mostly requires just flashing some very small lights very quickly.

        • cmnybo@discuss.tchncs.de · ↑7 ↓1 · 24 hours ago

          If you have multiple computers, you can always set up a caching proxy so you only have to download the packages once.
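
          A minimal sketch of one common way to do that on Arch, assuming darkhttpd as the file server and a placeholder LAN address: share one machine’s pacman cache over HTTP and list it as the first mirror on the other machines.

              # on the machine that already has the packages downloaded
              darkhttpd /var/cache/pacman/pkg --port 8080

              # on the other machines, at the top of /etc/pacman.d/mirrorlist
              Server = http://192.168.1.10:8080

              # cached packages then come over the LAN; anything missing there
              # (including the repo databases) falls back to the next mirror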

          • SmoochyPit@lemmy.ca · ↑1 · 19 hours ago

            That reminds me of Chaotic AUR, though it’s an online public repo. It automatically builds popular AUR packages and lets you download the binaries.

            It sometimes builds against outdated libraries/dependencies though, so for pre-release software I’ve sometimes still had to download and compile it locally. Also, you can’t apply your own patches or pin an old commit like you can with normal AUR packages.

            I’ve found it’s better to use Arch Linux’s official packages when I can, though, since they always publish binaries built with the same latest-release dependencies. I haven’t had dependency version issues with that, as long as I’ve avoided partial upgrades.

          • Ephera@lemmy.ml · ↑2 · 16 hours ago

            openSUSE Leap does have differential package updates. Pretty sure I once saw it on one of the Red-Hat-likes, too.

            But yeah, it makes the most sense for slow-moving, versioned releases with corporate backing.