I’m writing a program that wraps dd to try to warn you if you’re doing anything stupid, so I’ve been giving the man page a good read. While doing this, I noticed that dd supports units all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.
This got me wondering about the largest storage operation you guys have done. I’ve taken a couple of images of hard drives that were a single terabyte each, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
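The core check I have in mind is refusing to run when of= points at a block device that still has something mounted on it. A minimal sketch of that idea (simplified, not the actual program; the mount-detection approach here is just one way to do it):

```bash
#!/usr/bin/env bash
# Sketch of a dd wrapper: refuse to run if of= points at a block device
# that currently has mounted filesystems. Illustrative only.
set -euo pipefail

target=""
for arg in "$@"; do
    case "$arg" in
        of=*) target="${arg#of=}" ;;
    esac
done

if [[ -n "$target" && -b "$target" ]]; then
    # findmnt catches the device itself; lsblk catches mounted partitions on it
    if findmnt --source "$target" >/dev/null 2>&1 || \
       lsblk -nro MOUNTPOINT "$target" 2>/dev/null | grep -q .; then
        echo "refusing: $target appears to be mounted" >&2
        exit 1
    fi
fi

exec dd "$@"
```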
Not that big by today’s standards, but I once downloaded the Windows 98 beta CD from a friend over dialup, 33.6k at best. Took about a week as I recall.
Yep, downloaded XP over a 33.6k modem, but I’m in NZ so 33.6 was more advertising than reality; it took weeks.
In similar fashion, I downloaded Dude, Where’s My Car? over dialup, using what was at the time the latest tech: a file download system that would split the file into 2MB chunks and download them in order.
It took like 4 days.
I remember downloading the scene from American Pie where Shannon Elizabeth strips naked over our 33.6 link, and it took like an hour, at an amazing resolution of like 240p, for a two-minute clip 😂
Totally worth it.
And then you busted after 15 seconds?
I’m currently backing up my /dev folder to my unlimited cloud storage. The backup of the file
/dev/random
has been running for two weeks now.
I’m guessing this is a joke, right?
Entire drive/array backups will probably be by far the largest file transfer anyone ever does. The biggest I’ve done was a measly 20TB over the internet which took forever.
Outside of that, the largest “file” I’ve copied was just over 1TB, which was a SQL backup file for our main databases at work.
I work in cinema content so hysterical laughter
Interesting! Could you give some numbers? And what do you use to move the files? If you can disclose obvs
A small DCP is around 500GB. But that’s like basic film shizz: 2D, 5.1 audio. For comparison, the 3D Deadpool 2 teaser was 10GB.
Aspera’s commonly used for transmission due to the way it multiplexes. It’s the same protocol behind Netflix and other streamers, although we don’t have to worry about preloading chunks.
My laughter is mostly because we’re transmitting to a couple thousand clients at once, so even with a small DCP that’s around a PB dropped without blinking.
Ahhh, thanks for the reply! Makes sense! We also use Aspera here at work (videogames) but don’t move that amount, not even close.
I used to work in the same industry. We transferred several PBs from West US to Australia using Aspera via thick AWS pipes. Awesome software.
Hahahah did you enjoy Australian Internet? It’s wonderfully archaic
(MPS, Delux, Gofilex or Qubewire?)
@data1701d Downloading Forza Horizon 5 on Steam, at around 120GB, is the largest web download I can remember. On the LAN, I’ve migrated my old FreeBSD NAS to my new one, which was a roughly 35TB transfer over NFS.
How long did that 35TB take? 12 hours or so?
A few years back I worked at a home. They organised the whole data structure but needed to move to another provider. My colleagues and I moved roughly 15.4 TB. I don’t know how long it took because, honestly, we didn’t have much to do while the data was moving, so we just used the downtime for some nerd time. Nerd time in the sense that we started gaming and had a mini LAN party with our Raspberry Pis and Banana Pis.
Surprisingly, the data contained information on lots of long-dead people, which is quite scary because it wasn’t being deleted.
a .png of your mom’s width
~15TB over the internet via 30Mbps uplink without any special considerations. Syncthing handled any and all network and power interruptions. I did a few power cable pulls myself.
I think it’s crazy that not that long ago 30Mbps was still pretty good; we now have 1Gbps+ at residential addresses, and it’s fairly common too.
I’ve got symmetrical gigabit in my apartment, with the option to upgrade to 5 or 8. I’d have to upgrade my equipment to use those speeds, but it’s nice to know I have the option.
Yeah, I also moved from 30Mb upload to 700Mb recently and it’s just insane. It’s also insane thinking I had a symmetric gigabit connection in Eastern Europe in the 2000s for fairly cheap. It was Ethernet though, not fiber. Patch cables and switches all the way to the central office. 🫠
Most people in Canada today have 50Mb upload even at the most expensive connection tiers, on DOCSIS 3.x. Only over the last few years has fiber begun becoming more common, but it’s still fairly uncommon, as it’s the most expensive connection tier, if it’s available at all.
We might pay for some of the most expensive internet in the world in Canada, but at least we can’t fault them for providing an unstable or underperforming service. Downloading llama models is where 1Gbps really shines: you see a 7GB model? It’s done before you’re even back from the toilet. Crazy times.
Largest one I ever did was around 4.something TB. New off-site backup server at a friend’s place. Took me 4 months due to data limits and an upload speed that maxed out at 3MB/s.
I downloaded that 200GB leak from National Public Data the other day. Maybe not the biggest total, but certainly the largest single text file I’ve ever messed with.
Currently pushing about 3-5 TB of images to AI/ML scanning per day. Max we’ve seen through the system is about 8 TB.
Individual file? Probably 660 GB of backups before a migration at a previous job.
My Chia crypto farm at its peak had about 1.5 PB of plots, each plot was I think about 100ish gigs? I’d plot them on a dedicated machine and then move them to storage for farming. I think I’d move around 10TB per night.
It was done with a combination of PowerShell and bash scripts on Windows, Linux, and the built-in Windows Subsystem for Linux.
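Conceptually, the nightly move was something like this (a rough sketch with made-up paths and hostnames, not the actual scripts):

```bash
#!/usr/bin/env bash
# Sketch of a nightly plot mover: ship finished plots from the plotting
# machine to the farming storage, freeing the staging drive as each
# transfer completes. Paths and the "farm01" host are illustrative.
SRC=/plots/staging
DEST=farm01:/mnt/farm/plots

for plot in "$SRC"/*.plot; do
    [ -e "$plot" ] || continue
    rsync -av --remove-source-files "$plot" "$DEST"/
done
```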
I routinely do 1-4TB images of SSDs before making major changes to the disk. I run fstrim on all partitions and pipe dd’s output through zstd before writing to disk, and the images shrink to the actually-used size or a bit smaller. The largest backup ever was probably ~20TB cloned from one array to another over 40/56GbE; the deltas after that were tiny by comparison.
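Something along these lines, with made-up device and output paths:

```bash
# Trim first so unused blocks read back as zeros, then image and compress.
fstrim -av                                   # trim all mounted filesystems
dd if=/dev/nvme0n1 bs=1M status=progress \
  | zstd -T0 -o /mnt/backup/nvme0n1.img.zst  # long zero runs compress to almost nothing
```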
I once moved ~5TB of research data over the internet. It took days and unfortunately it also turned out that the data was junk :/
Local file transfer?
I cloned a 1TB+ system a couple of times.
As the Anaconda installer of Fedora Atomic is broken (yes, ironic), I have one system originally meant for tweaking as my “zygote”, and I just clone, resize, balance, and rebase that for new systems.
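Roughly what that dance looks like, assuming the default Fedora Atomic layout of btrfs plus rpm-ostree (device names, partition numbers, and the ostree ref below are placeholders, not my exact setup):

```bash
# Raw clone of the "zygote" disk onto the new system's disk
dd if=/dev/sdX of=/dev/sdY bs=4M status=progress conv=fsync

# Grow the partition first (parted/growpart), then the filesystem
mount /dev/sdY3 /mnt                 # the cloned btrfs root
btrfs filesystem resize max /mnt     # grow the FS into the larger disk
btrfs balance start /mnt             # redistribute chunks after the resize
umount /mnt

# Then, booted into the clone, point it at the desired variant
rpm-ostree rebase fedora:fedora/41/x86_64/silverblue
```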
Remote? A 10GB MicroWin 11 LTSC IoT ISO, the least garbage that OS can get.
Also, some leaked stuff, 50GB over BitTorrent.