While working on a recent project, I had a situation where we needed to copy files from Amazon AWS to a Rackspace server.
Per the app architecture, we have multiple private elastic instances on AWS, and we wanted to copy data from a private instance to the AWS public server and then on to Rackspace. As I am no Unix expert, it took me some time to figure out how to do it.
I knew I had to use a command like `scp` or `rsync`, but things were not working because I did not have root access on Amazon. After googling and trying various commands and options, I finally got it working. Here are my notes.
scp -i <key-pair-file-path> <remote-source user@ip>:<directory> <destination-local-path>
`-i` indicates the identity file (the key-pair file)
`<remote-source user@ip>:<directory>` – the source from which we want to copy the file/folder
`<destination-local-path>` – the destination where we want to copy it; it can be a local or remote location
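For example, pulling a directory down from an instance to the local machine might look like the sketch below. All the names here (`my-key.pem`, `ec2-user`, `203.0.113.10`, `/var/www/app`) are placeholders, and the `echo` makes the transfer a dry run so the sketch works without a live server; remove the `echo` to actually copy.

```shell
# Placeholder names throughout: my-key.pem, ec2-user, 203.0.113.10, /var/www/app.
# scp refuses an identity file that other users can read, so lock it down first:
touch my-key.pem          # stand-in for the downloaded key-pair file
chmod 400 my-key.pem

# Copy a remote directory (-r = recursive) to ./backup; the echo makes this
# a dry run -- drop it to perform the real transfer:
echo scp -i my-key.pem -r ec2-user@203.0.113.10:/var/www/app ./backup
```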
If we have to copy a file from the local machine to a remote server and only a specific port of that server is open to us, we can use the `-P port-number` option of `scp` (note the capital `-P`; lowercase `-p` preserves timestamps instead):
scp -P 8888 mycert.certSigningRequest firstname.lastname@example.org:/home/xyz/mycert.certSigningRequest
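The `rsync` mentioned earlier works here too. It has no `-i` flag of its own, so the key pair is handed to the underlying `ssh` transport via `-e`. A sketch with the same placeholder names as above (again an `echo` dry run):

```shell
# -a preserves permissions/timestamps, -v is verbose, -z compresses in transit.
# The identity file goes to ssh, which rsync uses as its transport:
echo rsync -avz -e "ssh -i my-key.pem" \
    ec2-user@203.0.113.10:/var/www/app/ ./backup/
```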
1. To get a root prompt on the Amazon server, log in first with your identity key pair and then type `sudo su`; this drops you straight into a root shell.
2. To access the AWS public/private servers we need the identity key-pair file, so we use the `scp -i` option, where `-i` points at the identity file path.
3. To copy data from the AWS public server to Rackspace or any other public server, use the AWS root account; as mentioned above that is easy, just type `sudo su`, and from there you can transfer files/folders from AWS to any other remote server.