Copy files from one S3 bucket to another

I have an S3 bucket that contains several hundred files in a folder. I needed to copy those files into a different folder in another bucket. Sounds simple enough, but I was unable to find a simple way to do this through the AWS Console. I found a number of Stack Overflow articles that talked about using Sync, or downloading the files and re-uploading them. None of which sounded particularly appealing.

In the end I just wrote this bash one-liner (which I can probably optimise further by not repeating the sourcebucket / sourcefolder three times):
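The Gist itself isn't embedded in this text, so here's a hypothetical reconstruction based on the description that follows — the bucket and folder names are placeholders, and this is my sketch rather than the author's actual Gist:

```shell
# Hypothetical reconstruction (not the original Gist): list the source
# folder, extract the s3:// URL column, drop the first line (the DIR
# entry yields an empty field), rewrite each URL into an "s3cmd cp"
# command, and pipe the generated commands to bash.
s3cmd ls s3://sourcebucket/sourcefolder/ \
  | awk '{ print $4 }' \
  | tail -n +2 \
  | sed 's!s3://sourcebucket/sourcefolder/\(.*\)!s3cmd cp s3://sourcebucket/sourcefolder/\1 s3://destbucket/destfolder/\1!' \
  | bash
```

Note the sourcebucket/sourcefolder repeated three times; using `&` (the whole match) in the sed replacement would remove one of those repetitions.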

This just uses s3cmd to list all the files in the bucket/folder I wish to copy from. The output of that is piped to awk, which I use to extract the s3 URL of each file. I then use tail to remove the first line, which I don't need, and finally sed to build up an 's3cmd cp' command which copies each file from its original location to my new location.

If anyone can suggest a better way that doesn't require me having to download the source files … I'd love to hear it.

SVN: Useful bash commands

Here’s a couple of useful bash commands I’ve been using recently when working with Subversion:

The first helps me with an annoyance I have with externals. Normally doing an "svn st --ignore-externals" still lists all the externals, even though I'm not actually interested in seeing them when I want to know what I've changed locally. For example, in the output below I only really want to know that I've changed 'development-tenants.xml'; I'm not really interested in the rest.

Nadeems-Computer:zephyr-trunk nadeemshabir$ svn st --ignore-externals
X      lib/arc
X      lib/moriarty
X      lib/simpleSAMLphp
X      3rdPartyDevelopmentTools/svnant-1.0.0
X      3rdPartyDevelopmentTools/PHPUnit
X      3rdPartyDevelopmentTools/selenium-server-1.0-beta-1
X      3rdPartyDevelopmentTools/selenium-core-0.8.3
M      developmentdata/development-tenants.xml

To address this, the first command is an alias I've created that shows me all the files I've changed/added/removed locally, but specifically doesn't list anything related to any externals:

alias whatschanged='svn st --ignore-externals | grep -v "^X "'
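To see what the filter does, here's a quick sketch using a trimmed-down version of the status output above as sample data:

```shell
# Feed a sample of "svn st --ignore-externals" output through the same
# grep filter the alias uses: lines starting with "X " (externals) are
# dropped, leaving only the genuine local modification.
sample='X      lib/arc
X      lib/moriarty
M      developmentdata/development-tenants.xml'
printf '%s\n' "$sample" | grep -v "^X "
# prints only the "M      developmentdata/development-tenants.xml" line
```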

The second bash command deals with the fact that I often have a large number of files I want to add to subversion all at once. This command takes all un-added files and adds them to subversion …

svn st | grep '^? ' | awk '{ print $2 }' | xargs svn add
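One caveat worth noting (my addition, not from the original post): awk splits on whitespace, so a filename containing spaces would be truncated to its first word. A variant that copes with spaces — assuming GNU xargs for the -d flag — could look like:

```shell
# Strip the leading "?" status column with sed rather than taking
# awk's second field, and tell xargs to split on newlines only, so
# paths containing spaces survive intact. Requires GNU xargs for -d.
svn st | grep '^?' | sed 's/^?[[:space:]]*//' | xargs -d '\n' svn add
```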

Hope you find them useful.

Cool Bash One-Liner: Post files to Platform Store

As part of some small prototyping activity I had to convert a whole load of data into RDF. My problem was that the files I had generated were scattered around in a very hierarchical directory structure, but all I wanted to do was find them and post them to a Platform store. I really didn't want to have to post them one at a time manually. I knew I could do it using a bash script but my scripting was a bit rusty … so I asked Rob, and he showed me how to do this …

find . -name "*-issue.xml.rdf" | sed -e 's!^\./\([0-9]*-issue.xml.rdf\)!curl -v -d @\1 -H content-type:application/rdf+xml!' | bash

Cool, huh? 🙂

For the uninitiated: the find locates all the files I want to post, which in my case end with -issue.xml.rdf. The sed search-and-replace matches each filename and replaces it with a curl command, inserting the filename as the @\1 parameter. Finally, the generated curl commands are piped to bash, which executes each generated line.
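A trick I'd add (my suggestion, not from the original): before trusting a generated one-liner like this, drop the trailing `| bash` and inspect the commands it would run:

```shell
# Dry run: print the generated curl commands instead of executing
# them. (As in the original, the target store URL is omitted here.)
find . -name "*-issue.xml.rdf" \
  | sed -e 's!^\./\([0-9]*-issue.xml.rdf\)!curl -v -d @\1 -H content-type:application/rdf+xml!'
```

Once the output looks right, append `| bash` to actually execute it.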