s3cmd: Amazon S3 command-line tool - command reference


What is s3cmd?
s3cmd is a command-line Amazon S3 client that can be used in scripts, backup cron jobs, etc. It is your best choice if you want to get up to speed with S3 quickly. It requires Python 2.4 or newer and a few fairly common Python modules.

Simple s3cmd HowTo
------------------
1) Register for Amazon AWS / S3
Go to http://aws.amazon.com/s3, click the "Sign up for web service" button in the right column and work through the registration. You will have to supply your credit card details in order to allow Amazon to charge you for S3 usage. At the end you should have your Access and Secret Keys.

2) Run "s3cmd --configure"
You will be asked for the two keys - copy and paste them from your confirmation email or from your Amazon account page. Be careful when copying them! They are case sensitive and must be entered accurately or you'll keep getting errors about invalid signatures or similar.

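The interactive session looks roughly like this - treat it as a sketch, since the exact prompts vary between s3cmd versions:

~$ s3cmd --configure
Access Key: <paste your Access Key here>
Secret Key: <paste your Secret Key here>
...
Test access with supplied credentials? [Y/n] Y
Save settings? [y/N] y
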
3) Run "s3cmd ls" to list all your buckets.
As you have only just started using S3 there are no buckets owned by you yet, so the output will be empty:

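~$ s3cmd ls
~$

Nothing to see - the account doesn't own any buckets at this point.
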
4) Make a bucket with "s3cmd mb s3://my-new-bucket-name"
As mentioned above, bucket names must be unique amongst _all_ users of S3. That means simple names like "test" or "asdf" are already taken and you must make up something more original. To demonstrate as many features as possible let's create a FQDN-named bucket s3://public.s3tools.org:

~$ s3cmd mb s3://public.s3tools.org
Bucket 's3://public.s3tools.org' created

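If you would rather have the bucket created in Europe than in the default US location, "s3cmd mb" also takes a --bucket-location option - a sketch, since the accepted location values depend on your s3cmd version (see "s3cmd mb --help"):

~$ s3cmd mb --bucket-location=EU s3://public.s3tools.org
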
5) List your buckets again with "s3cmd ls"
Now you should see your freshly created bucket:

~$ s3cmd ls
2009-01-28 12:34  s3://public.s3tools.org

6) List the contents of the bucket:

~$ s3cmd ls s3://public.s3tools.org
~$

It's empty, indeed.

7) Upload a single file into the bucket:

~$ s3cmd put some-file.xml s3://public.s3tools.org/somefile.xml
some-file.xml -> s3://public.s3tools.org/somefile.xml  [1 of 1]
 123456 of 123456   100% in    2s    51.75 kB/s  done

Upload a two-directory tree into the bucket's virtual 'directory':

~$ s3cmd put --recursive dir1 dir2 s3://public.s3tools.org/somewhere/
File 'dir1/file1-1.txt' stored as 's3://public.s3tools.org/somewhere/dir1/file1-1.txt' [1 of 5]
File 'dir1/file1-2.txt' stored as 's3://public.s3tools.org/somewhere/dir1/file1-2.txt' [2 of 5]
File 'dir1/file1-3.log' stored as 's3://public.s3tools.org/somewhere/dir1/file1-3.log' [3 of 5]
File 'dir2/file2-1.bin' stored as 's3://public.s3tools.org/somewhere/dir2/file2-1.bin' [4 of 5]
File 'dir2/file2-2.txt' stored as 's3://public.s3tools.org/somewhere/dir2/file2-2.txt' [5 of 5]

As you can see, we didn't have to create the /somewhere 'directory'. In fact it's only a filename prefix, not a real directory, and it doesn't have to be created in any way beforehand.

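Uploaded objects are private by default. If you want a file to be downloadable by anyone over plain HTTP, "s3cmd put" accepts an --acl-public option (command only - the "Public URL" line it prints differs between versions):

~$ s3cmd put --acl-public some-file.xml s3://public.s3tools.org/somefile.xml
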
8) Now list the bucket contents again:

~$ s3cmd ls s3://public.s3tools.org
                       DIR   s3://public.s3tools.org/somewhere/
2009-02-10 05:10    123456   s3://public.s3tools.org/somefile.xml

Use --recursive (or -r) to list all the remote files:

~$ s3cmd ls --recursive s3://public.s3tools.org
2009-02-10 05:10    123456   s3://public.s3tools.org/somefile.xml
2009-02-10 05:13        18   s3://public.s3tools.org/somewhere/dir1/file1-1.txt
2009-02-10 05:13         8   s3://public.s3tools.org/somewhere/dir1/file1-2.txt
2009-02-10 05:13        16   s3://public.s3tools.org/somewhere/dir1/file1-3.log
2009-02-10 05:13        11   s3://public.s3tools.org/somewhere/dir2/file2-1.bin
2009-02-10 05:13         8   s3://public.s3tools.org/somewhere/dir2/file2-2.txt

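Related: "s3cmd du" reports how much space a bucket consumes (command only - output format varies between versions):

~$ s3cmd du s3://public.s3tools.org
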
9) Retrieve one of the files back and verify that it hasn't been corrupted:

~$ s3cmd get s3://public.s3tools.org/somefile.xml some-file-2.xml
s3://public.s3tools.org/somefile.xml -> some-file-2.xml  [1 of 1]
 123456 of 123456   100% in    3s    35.75 kB/s  done

~$ md5sum some-file.xml some-file-2.xml
39bcb6992e461b269b95b3bda303addf  some-file.xml
39bcb6992e461b269b95b3bda303addf  some-file-2.xml

The checksum of the original file matches that of the retrieved one. Looks like it worked :-)

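You can also check the MD5 sum on the S3 side without downloading anything: "s3cmd ls --list-md5" includes it in the listing and "s3cmd info" shows it for a single object (commands only; output omitted as it varies between versions):

~$ s3cmd ls --list-md5 s3://public.s3tools.org
~$ s3cmd info s3://public.s3tools.org/somefile.xml
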
To retrieve a whole 'directory tree' from S3 use recursive get:

~$ s3cmd get --recursive s3://public.s3tools.org/somewhere
File s3://public.s3tools.org/somewhere/dir1/file1-1.txt saved as './somewhere/dir1/file1-1.txt'
File s3://public.s3tools.org/somewhere/dir1/file1-2.txt saved as './somewhere/dir1/file1-2.txt'
File s3://public.s3tools.org/somewhere/dir1/file1-3.log saved as './somewhere/dir1/file1-3.log'
File s3://public.s3tools.org/somewhere/dir2/file2-1.bin saved as './somewhere/dir2/file2-1.bin'
File s3://public.s3tools.org/somewhere/dir2/file2-2.txt saved as './somewhere/dir2/file2-2.txt'

Since the destination directory wasn't specified, s3cmd saved the directory structure in the current working directory ('.').

There is an important difference between:
get s3://public.s3tools.org/somewhere
and
get s3://public.s3tools.org/somewhere/
(note the trailing slash)
s3cmd always uses the last path part, i.e. the word after the last slash, for naming files.

In the case of s3://.../somewhere the last path part is 'somewhere' and therefore the recursive get names the local files as somewhere/dir1, somewhere/dir2, etc.

On the other hand in s3://.../somewhere/ the last path part is empty and s3cmd will only create 'dir1' and 'dir2' without the 'somewhere/' prefix:

~$ s3cmd get --recursive s3://public.s3tools.org/somewhere/ /tmp
File s3://public.s3tools.org/somewhere/dir1/file1-1.txt saved as '/tmp/dir1/file1-1.txt'
File s3://public.s3tools.org/somewhere/dir1/file1-2.txt saved as '/tmp/dir1/file1-2.txt'
File s3://public.s3tools.org/somewhere/dir1/file1-3.log saved as '/tmp/dir1/file1-3.log'
File s3://public.s3tools.org/somewhere/dir2/file2-1.bin saved as '/tmp/dir2/file2-1.bin'
File s3://public.s3tools.org/somewhere/dir2/file2-2.txt saved as '/tmp/dir2/file2-2.txt'

See? It's /tmp/dir1 and not /tmp/somewhere/dir1 as it was in the previous example.

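If you back up the same tree repeatedly, "s3cmd sync" is worth knowing about: it transfers only the files that have changed. A minimal sketch - the --dry-run flag makes it report what it would transfer without actually doing anything:

~$ s3cmd sync --dry-run dir1 s3://public.s3tools.org/somewhere/
~$ s3cmd sync dir1 s3://public.s3tools.org/somewhere/
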
10) Clean up - delete the remote files and remove the bucket:

Remove everything under s3://public.s3tools.org/somewhere/

~$ s3cmd del --recursive s3://public.s3tools.org/somewhere/
File s3://public.s3tools.org/somewhere/dir1/file1-1.txt deleted
File s3://public.s3tools.org/somewhere/dir1/file1-2.txt deleted
...

Now try to remove the bucket:

~$ s3cmd rb s3://public.s3tools.org
ERROR: S3 error: 409 (BucketNotEmpty): The bucket you tried to delete is not empty

Ouch, we forgot about s3://public.s3tools.org/somefile.xml.
We can force the bucket removal anyway:

~$ s3cmd rb --force s3://public.s3tools.org/
WARNING: Bucket is not empty. Removing all the objects from it first. This may take some time...
File s3://public.s3tools.org/somefile.xml deleted
Bucket 's3://public.s3tools.org/' removed

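Alternatively, we could have deleted the forgotten file individually and then removed the now-empty bucket with a plain rb (a sketch of the gentler route):

~$ s3cmd del s3://public.s3tools.org/somefile.xml
~$ s3cmd rb s3://public.s3tools.org
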
Hints
-----
The basic usage is as simple as described in the previous section.

You can increase the level of verbosity with the -v option, and if you're really keen to know what the program does under its bonnet, run it with -d to see all the 'debugging' output.

After configuring it with --configure, all the available options are written out to your ~/.s3cfg file. It's a text file, ready to be modified in your favourite text editor.

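An illustrative excerpt of a freshly generated ~/.s3cfg (just a few of the options the real file contains):

[default]
access_key = <your Access Key>
secret_key = <your Secret Key>
use_https = False
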
For more information refer to:
* S3cmd / S3tools homepage at http://s3tools.org
* Amazon S3 homepage at http://aws.amazon.com/s3
