Playing around with StarCluster

I've been playing around with StarCluster (documentation here) – today I'm figuring out how to mount volumes and upload data to StarCluster instances. I updated the SC config file by uncommenting the volume section at the end, and then ran this command:
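For reference, here's a rough sketch of what the uncommented config sections look like – the volume ID is a placeholder and the exact keys/values are modeled on StarCluster's documented INI format, not copied from my actual file:

```ini
# Sketch of the relevant StarCluster config sections (placeholder values)
[volume gom_data]
VOLUME_ID = vol-XXXXXXXX       ; filled in once the volume exists
MOUNT_PATH = /gom_data         ; where the volume appears on cluster nodes

[cluster my.qiime.cluster]
KEYNAME = mykey                ; placeholder key name
NODE_INSTANCE_TYPE = m1.small
CLUSTER_SIZE = 1
VOLUMES = gom_data             ; attaches the volume to every node at boot
```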

starcluster createvolume 50 us-east-1c

This uses the volume settings in the config file to create a 50 GB EBS volume in the us-east-1c availability zone. SC also boots up a host instance, and you're supposed to be able to upload your data and "explore" the volume by logging in with this command:

starcluster sshnode volumecreator volhost-us-east-1c

I managed to log in but had no idea where I was in the file system (inside the volume? inside the instance? there was no mount path like the one I listed in the config file, as I had expected). So I just terminated the volume-creator instance (this leaves your EBS volume intact, as you can verify on the Amazon AWS console):
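In hindsight, a couple of generic Linux commands would have answered the "where am I?" question from inside the volume host. A quick sketch (the /mnt mount point is an assumption based on StarCluster's defaults, and the device name varies by instance):

```shell
# Run these after logging in with `starcluster sshnode ...`.
# The fresh EBS volume shows up as an extra block device (e.g. /dev/xvdz --
# the exact name is an assumption and varies by instance type).
df -h                          # mounted filesystems -- look for the ~50 GB one
mount | grep -i mnt || true    # StarCluster mounts new volumes under /mnt (assumption)
```

If the volume doesn't appear in `df -h`, it's attached but not yet mounted, which would explain not seeing the config file's mount path.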

starcluster terminate volumecreator

The next step is to boot up a cluster using the cluster command and upload my data. I booted up a small instance:

starcluster start -c my.qiime.cluster gom.illumina

Putting data on the attached volume is SUPER easy – StarCluster handles it all. You just need to type this:

starcluster put gom.illumina /Users/hollybik/Dropbox/QIIME/qiime_parameters_18Sopenref_GOMamazon.txt /gom_data/

The breakdown of this command is:

starcluster put cluster_name local_filepath remote_filepath
