Welcome to the Linux Foundation Forum!

rsync is very slow between Linux and an additional secure HDD



I'm trying to copy data from a Linux machine to an extra attached secure HDD, which I first mounted on the system. I have around 150GB of data to transfer, including a tar file of around 132GB. I first tried the cp command, but it broke frequently, so I switched to rsync. The rsync command doesn't break, but it is very slow: it has been running continuously for the last three days and has transferred only about 30% of the overall data.

Can somebody help me speed up the rsync process?


Dilip G


  • mfillpot
    mfillpot Posts: 2,177
    The issue may be the transfer speed of the external disk or the cable in use. An internal 7200RPM+ SATA3 HDD would offer the best transfer speeds without getting too expensive.

    Can you share the connection and disk information so we can see which is the cause? I expect that you are running a 5400 RPM HDD on USB2 or USB1.
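A sketch of commands that can surface that connection and disk information (the `/dev/sdb` device name is a placeholder, not taken from the post):

```shell
# Identify the disk type and its connection. ROTA=1 marks a
# rotational (spinning) disk; TRAN shows the transport (usb, sata, ...).
lsblk -d -o NAME,SIZE,ROTA,TRAN,MODEL

# On an external drive, lsusb shows which USB speed was negotiated
# (guarded because the usbutils package may not be installed).
command -v lsusb >/dev/null && lsusb

# Rough sequential read benchmark; needs root, and /dev/sdb is a
# placeholder for the actual device:
#   sudo hdparm -t /dev/sdb
true
```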
  • rechil_colin
    Generally with a large amount of data there may be something tar can't handle for whatever reason, or the process gets interrupted. If this is a filesystem migration (guessing), it is necessary to do an initial bulk copy before the actual task starts. For the initial step: (cd /src; tar cf - .) | (cd /dst; tar xpf -)
    Or you can try to make it faster with: rsync -aPhW --protocol=28. You can even use the cpio command. See here for more info:
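The tar pipe and rsync suggestions above can be sketched end to end on scratch directories (the mktemp paths stand in for the real source and destination mounts):

```shell
# Demonstrate the tar-pipe copy on two scratch directories
# (mktemp stands in for the real source and destination mounts).
src=$(mktemp -d) && dst=$(mktemp -d)
echo "sample" > "$src/file.txt"

# Stream the tree through tar; -p on extract preserves permissions.
(cd "$src" && tar cf - .) | (cd "$dst" && tar xpf -)

# rsync alternative: -a archive mode, -P progress, -h human-readable,
# -W whole files. Note rsync already defaults to whole-file copying
# for local paths, so --protocol=28 is not where the speedup comes from.
command -v rsync >/dev/null && rsync -aPhW "$src/" "$dst/"

ls "$dst"
```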
  • marc
    marc Posts: 647
    It shouldn't take that long to copy that amount of data. Are you sure the hard disk is correctly connected and/or configured?

    What does "iotop" tell you?

  • woboyle
    woboyle Posts: 501
    You say this is a secure drive, so I assume it is encrypted. If so, that is likely the cause of the slowness of the data transfer. Encryption (especially with modern strong encryption routines) can be quite slow, especially when it has to be done in software. My work laptop uses full-disc encryption, which is noticeably slower in copying large data sets than it would be if it weren't encrypted.
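As a rough check of this point, you can time software encryption directly; a sketch using openssl (the 64MB size and aes-256-cbc cipher are arbitrary choices, not from the post):

```shell
# Time software AES on this machine to get a feel for the encryption
# overhead (64 MB of zeros; size and cipher are arbitrary choices).
f=$(mktemp)
dd if=/dev/zero of="$f" bs=1M count=64 2>/dev/null

time openssl enc -aes-256-cbc -pbkdf2 -pass pass:bench -in "$f" -out "$f.enc"

# dm-crypt users can benchmark the kernel ciphers instead:
#   cryptsetup benchmark        # (needs cryptsetup installed)

ls -lh "$f.enc"
```

Comparing that time against a plain cp of the same file shows how much of the slowness is the cipher rather than the disk.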
  • Dilip7597
    Dilip7597 Posts: 2
    Hi Guys,

    Thanks for your replies, but I got the job done by an alternate method: I ran tar together with the split command, which created 20GB part files of the overall directory, and then I transferred them with cp.
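    The tar+split approach can be sketched at small scale (1MB parts here stand in for the 20GB parts in the post; file names are placeholders):

```shell
# Reproduce the tar+split approach: pack a directory, split the
# archive stream into fixed-size parts, then reassemble with cat
# and unpack on the destination side.
src=$(mktemp -d)
echo "payload" > "$src/data.txt"

# Pack and split into 1 MB parts (the post used 20 GB parts: -b 20G).
tar cf - -C "$src" . | split -b 1M - archive.part.

# Reassemble the parts and unpack.
out=$(mktemp -d)
cat archive.part.* | tar xf - -C "$out"

cat "$out/data.txt"   # payload
rm -f archive.part.*
```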

    Thanks for your help guys.

    Dilip Gupta

