I’m writing a program that wraps around dd to try to warn you if you are doing anything stupid. I have thus been giving the man page a good read. While doing this, I noticed that dd supports sizes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This got me wondering: what’s the largest storage operation you guys have done? I’ve taken a couple of images of hard drives that were a single terabyte large, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
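
For context, here’s a rough sketch of the kind of check I have in mind (the safedd name and the single whole-disk test are placeholders, not the actual program):

    #!/usr/bin/env bash
    # safedd: hypothetical dd wrapper that warns before writing to a whole disk.
    # The argument parsing is deliberately naive; a real wrapper would handle
    # every dd operand and many more failure modes.
    set -euo pipefail

    target=""
    for arg in "$@"; do
        case "$arg" in
            of=*) target="${arg#of=}" ;;
        esac
    done

    # Warn if the output looks like an entire disk rather than a partition or a file.
    if [[ "$target" =~ ^/dev/(sd[a-z]+|nvme[0-9]+n[0-9]+)$ ]]; then
        read -rp "Output $target looks like a whole disk. Continue? [y/N] " reply
        [[ "$reply" == [yY] ]] || exit 1
    fi

    exec dd "$@"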

  • Davel23@fedia.io

    Not that big by today’s standards, but I once downloaded the Windows 98 beta CD from a friend over dialup, 33.6k at best. Took about a week as I recall.

    • absGeekNZ@lemmy.nz

      Yep, downloaded XP over a 33.6k modem, but I’m in NZ so 33.6 was more advertising than reality; it took weeks.

    • 50MYT@aussie.zone

      In similar fashion, I downloaded Dude, Where’s My Car? over dialup, using what was at the time the latest tech: a download manager that would split the file into 2MB chunks and download them in order.

      It took like 4 days.

    • pete_the_cat@lemmy.world

      I remember downloading the scene from American Pie where Shannon Elizabeth strips naked over our 33.6k link. It took like an hour, at an amazing resolution of like 240p, for a two-minute clip 😂

  • freijon@lemmings.world

    I’m currently backing up my /dev folder to my unlimited cloud storage. The backup of the file /dev/random has been running for two weeks.

  • fuckwit_mcbumcrumble@lemmy.dbzer0.com

    Entire drive/array backups will probably be by far the largest file transfer anyone ever does. The biggest I’ve done was a measly 20TB over the internet, which took forever.

    Outside of that, the largest “file” I’ve copied was just over 1TB; it was a SQL backup of our main databases at work.

      • Taleya@aussie.zone

        A small DCP (digital cinema package) is around 500GB. But that’s like basic film shizz: 2D, 5.1 audio. For comparison, the 3D Deadpool 2 teaser was 10GB.

        Aspera’s commonly used for transmission due to the way it multiplexes. It’s the same protocol behind Netflix and other streamers, although we don’t have to worry about preloading chunks.

        My laughter is mostly because we’re transmitting to a couple thousand clients at once, so even with a small DCP that’s around a PB dropped without blinking.

        • potajito@lemmy.dbzer0.com

          Ahhh, thanks for the reply! Makes sense! We also use Aspera here at work (videogames) but don’t move that amount, not even close.

        • daq@lemmy.sdf.org

          I used to work in the same industry. We transferred several PBs from West US to Australia using Aspera via thick AWS pipes. Awesome software.

          • Taleya@aussie.zone

            Hahahah did you enjoy Australian Internet? It’s wonderfully archaic

            (MPS, Delux, Gofilex or Qubewire?)

  • Decency8401@discuss.tchncs.de

    A few years back I worked at a home. They organised the whole data structure but needed to move to another provider. My colleagues and I moved roughly 15.4 TB. I don’t know how long it took because, honestly, we didn’t have much to do while the data was moving, so we just used the downtime for some nerd time. Nerd time in the sense that we started gaming and held a mini LAN party with our Raspberry Pis and Banana Pis.

    Surprisingly, the data contained information on lots of long-dead people, which is quite scary because it wasn’t being deleted.

  • Avid Amoeba@lemmy.ca

    ~15TB over the internet via a 30Mbps uplink without any special considerations. Syncthing handled any and all network and power interruptions. I did a few power cable pulls myself.

    • Pasta Dental@sh.itjust.works

      I think it’s crazy that not that long ago 30Mbps was still pretty good; we now have 1Gbps+ at residential addresses, and it’s fairly common too.

      • Confused_Emus@lemmy.dbzer0.com

        I’ve got symmetrical gigabit in my apartment, with the option to upgrade to 5 or 8. I’d have to upgrade my equipment to use those speeds, but it’s nice to know I have the option.

      • Avid Amoeba@lemmy.ca

        Yeah, I also moved from 30Mb upload to 700Mb recently and it’s just insane. It’s also insane thinking I had a symmetric gigabit connection in Eastern Europe in the 2000s for fairly cheap. It was Ethernet though, not fiber. Patch cables and switches all the way to the central office. 🫠

        Most people in Canada today get at most 50Mb upload even on the most expensive connection tiers, which run on DOCSIS 3.x. Only over the last few years has fiber begun to become more common, but it’s still fairly uncommon, as it’s the most expensive tier where it’s available at all.

        • Pasta Dental@sh.itjust.works

          We might pay for some of the most expensive internet in the world in Canada, but at least we can’t fault them for providing an unstable or underperforming service. Downloading llama models is where 1Gbps really shines: you see a 7GB model? It’s done before you’re even back from the toilet. Crazy times.

  • Yeahboiiii@lemm.ee

    Largest one I ever did was around 4.something TB. New off-site backup server at a friend’s place. Took me 4 months due to data limits and an upload speed that maxed out at 3MB/s.

  • HappyTimeHarry@lemm.ee

    I downloaded that 200GB leak from National Public Data the other day. Maybe not the biggest total, but certainly the largest single text file I’ve ever messed with.

  • psmgx@lemmy.world

    Currently pushing about 3-5 TB of images to AI/ML scanning per day. Max we’ve seen through the system is about 8 TB.

    Individual file? Probably 660 GB of backups before a migration at a previous job.

  • JerkyChew@lemmy.one

    My Chia crypto farm at its peak had about 1.5 PB of plots; each plot was, I think, about 100ish gigs? I’d plot them on a dedicated machine and then move them to storage for farming. I think I’d move around 10TB per night.

    It was done with a combination of PowerShell and Bash scripts on Windows, Linux, and the built-in Windows Subsystem for Linux.
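
    The Bash half of that could be as simple as something like the sketch below; the paths, the *.plot glob, and the rsync flags are assumptions, not the actual scripts.

        #!/usr/bin/env bash
        # Hypothetical nightly plot mover: copy finished plots from the fast
        # plotting scratch disk to the farming array, then free the scratch space.
        set -euo pipefail

        PLOT_DIR=/mnt/plotting   # placeholder scratch disk
        FARM_DIR=/mnt/farm       # placeholder storage array

        for plot in "$PLOT_DIR"/*.plot; do
            [ -e "$plot" ] || continue   # no finished plots yet
            # --remove-source-files deletes each plot only after a successful copy
            rsync --remove-source-files --progress "$plot" "$FARM_DIR"/
        done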

  • seaQueue@lemmy.world

    I routinely do 1-4TB images of SSDs before making major changes to the disk. I run fstrim on all partitions and pipe the dd output through zstd before writing to disk, and the images shrink to roughly the actually-used size or a bit smaller. The largest backup ever was probably ~20TB cloned from one array to another over 40/56GbE; the deltas after that were tiny by comparison.
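
    In rough strokes the pipeline has this shape; the device name, mount point, and zstd level are placeholders:

        # Discard unused blocks on all mounted filesystems first; trimmed blocks
        # typically read back as zeros and compress away to almost nothing.
        sudo fstrim --all

        # Image the whole SSD and compress on the fly; -T0 lets zstd use all cores.
        sudo dd if=/dev/nvme0n1 bs=1M status=progress \
            | zstd -T0 -o /mnt/backup/nvme0n1.img.zst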

  • weker01@sh.itjust.works

    I once moved ~5TB of research data over the internet. It took days and unfortunately it also turned out that the data was junk :/

  • boredsquirrel@slrpnk.net

    Local file transfer?

    I cloned a 1TB+ system a couple of times.

    As the Anaconda installer of Fedora Atomic is broken (yes, ironic), I keep one system originally meant for tweaking as my “zygote” and just clone, resize, balance, and rebase it for new systems.
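
    Roughly, that works out to something like the following; the device names, mount point, and ostree ref are placeholders:

        # Clone the "zygote" disk onto the new machine's disk.
        dd if=/dev/sda of=/dev/sdb bs=4M status=progress

        # After growing the root partition to fill the new disk (e.g. with parted),
        # grow the Btrfs filesystem and rebalance it across the new space.
        btrfs filesystem resize max /mnt/newroot
        btrfs balance start /mnt/newroot

        # From the booted clone, point it at the desired Fedora Atomic image.
        rpm-ostree rebase fedora:fedora/40/x86_64/silverblue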

    Remote? A 10GB MicroWin 11 LTSC IoT ISO, the least garbage that OS can get.

    Also, some leaked stuff, 50GB over BitTorrent.