<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://techwiki.co.uk/index.php?action=history&amp;feed=atom&amp;title=RClone_-_Amazon_S3</id>
	<title>RClone - Amazon S3 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://techwiki.co.uk/index.php?action=history&amp;feed=atom&amp;title=RClone_-_Amazon_S3"/>
	<link rel="alternate" type="text/html" href="http://techwiki.co.uk/index.php?title=RClone_-_Amazon_S3&amp;action=history"/>
	<updated>2026-05-11T16:33:12Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.34.0</generator>
	<entry>
		<id>http://techwiki.co.uk/index.php?title=RClone_-_Amazon_S3&amp;diff=128&amp;oldid=prev</id>
		<title>Adam.birds: Created page with &quot;=Amazon S3=  ==Setup==  Paths are specified as '''remote:bucket''' (or '''remote:''' for the '''lsd''' command.) You may put subdirectories in too, eg '''remote:bucket/path/to...&quot;</title>
		<link rel="alternate" type="text/html" href="http://techwiki.co.uk/index.php?title=RClone_-_Amazon_S3&amp;diff=128&amp;oldid=prev"/>
		<updated>2016-04-30T15:01:50Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;=Amazon S3=  ==Setup==  Paths are specified as &amp;#039;&amp;#039;&amp;#039;remote:bucket&amp;#039;&amp;#039;&amp;#039; (or &amp;#039;&amp;#039;&amp;#039;remote:&amp;#039;&amp;#039;&amp;#039; for the &amp;#039;&amp;#039;&amp;#039;lsd&amp;#039;&amp;#039;&amp;#039; command.) You may put subdirectories in too, eg &amp;#039;&amp;#039;&amp;#039;remote:bucket/path/to...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;=Amazon S3=&lt;br /&gt;
&lt;br /&gt;
==Setup==&lt;br /&gt;
&lt;br /&gt;
Paths are specified as '''remote:bucket''' (or '''remote:''' for the '''lsd''' command). You may put subdirectories in too, e.g. '''remote:bucket/path/to/dir'''.&lt;br /&gt;
&lt;br /&gt;
Here is an example of making an s3 configuration. First run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone config&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will guide you through an interactive setup process:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
No remotes found - make a new one&lt;br /&gt;
n) New remote&lt;br /&gt;
s) Set configuration password&lt;br /&gt;
n/s&amp;gt; n&lt;br /&gt;
name&amp;gt; remote&lt;br /&gt;
Type of storage to configure.&lt;br /&gt;
Choose a number from below, or type in your own value&lt;br /&gt;
 1 / Amazon Cloud Drive&lt;br /&gt;
   \ &amp;quot;amazon cloud drive&amp;quot;&lt;br /&gt;
 2 / Amazon S3 (also Dreamhost, Ceph)&lt;br /&gt;
   \ &amp;quot;s3&amp;quot;&lt;br /&gt;
 3 / Backblaze B2&lt;br /&gt;
   \ &amp;quot;b2&amp;quot;&lt;br /&gt;
 4 / Dropbox&lt;br /&gt;
   \ &amp;quot;dropbox&amp;quot;&lt;br /&gt;
 5 / Google Cloud Storage (this is not Google Drive)&lt;br /&gt;
   \ &amp;quot;google cloud storage&amp;quot;&lt;br /&gt;
 6 / Google Drive&lt;br /&gt;
   \ &amp;quot;drive&amp;quot;&lt;br /&gt;
 7 / Hubic&lt;br /&gt;
   \ &amp;quot;hubic&amp;quot;&lt;br /&gt;
 8 / Local Disk&lt;br /&gt;
   \ &amp;quot;local&amp;quot;&lt;br /&gt;
 9 / Microsoft OneDrive&lt;br /&gt;
   \ &amp;quot;onedrive&amp;quot;&lt;br /&gt;
10 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)&lt;br /&gt;
   \ &amp;quot;swift&amp;quot;&lt;br /&gt;
11 / Yandex Disk&lt;br /&gt;
   \ &amp;quot;yandex&amp;quot;&lt;br /&gt;
Storage&amp;gt; 2&lt;br /&gt;
Get AWS credentials from runtime (environment variables or EC2 meta data if no env vars). Only applies if access_key_id and secret_access_key is blank.&lt;br /&gt;
Choose a number from below, or type in your own value&lt;br /&gt;
 1 / Enter AWS credentials in the next step&lt;br /&gt;
   \ &amp;quot;false&amp;quot;&lt;br /&gt;
 2 / Get AWS credentials from the environment (env vars or IAM)&lt;br /&gt;
   \ &amp;quot;true&amp;quot;&lt;br /&gt;
env_auth&amp;gt; 1&lt;br /&gt;
AWS Access Key ID - leave blank for anonymous access or runtime credentials.&lt;br /&gt;
access_key_id&amp;gt; access_key&lt;br /&gt;
AWS Secret Access Key (password) - leave blank for anonymous access or runtime credentials.&lt;br /&gt;
secret_access_key&amp;gt; secret_key&lt;br /&gt;
Region to connect to.&lt;br /&gt;
Choose a number from below, or type in your own value&lt;br /&gt;
   / The default endpoint - a good choice if you are unsure.&lt;br /&gt;
 1 | US Region, Northern Virginia or Pacific Northwest.&lt;br /&gt;
   | Leave location constraint empty.&lt;br /&gt;
   \ &amp;quot;us-east-1&amp;quot;&lt;br /&gt;
   / US West (Oregon) Region&lt;br /&gt;
 2 | Needs location constraint us-west-2.&lt;br /&gt;
   \ &amp;quot;us-west-2&amp;quot;&lt;br /&gt;
   / US West (Northern California) Region&lt;br /&gt;
 3 | Needs location constraint us-west-1.&lt;br /&gt;
   \ &amp;quot;us-west-1&amp;quot;&lt;br /&gt;
   / EU (Ireland) Region&lt;br /&gt;
 4 | Needs location constraint EU or eu-west-1.&lt;br /&gt;
   \ &amp;quot;eu-west-1&amp;quot;&lt;br /&gt;
   / EU (Frankfurt) Region&lt;br /&gt;
 5 | Needs location constraint eu-central-1.&lt;br /&gt;
   \ &amp;quot;eu-central-1&amp;quot;&lt;br /&gt;
   / Asia Pacific (Singapore) Region&lt;br /&gt;
 6 | Needs location constraint ap-southeast-1.&lt;br /&gt;
   \ &amp;quot;ap-southeast-1&amp;quot;&lt;br /&gt;
   / Asia Pacific (Sydney) Region&lt;br /&gt;
 7 | Needs location constraint ap-southeast-2.&lt;br /&gt;
   \ &amp;quot;ap-southeast-2&amp;quot;&lt;br /&gt;
   / Asia Pacific (Tokyo) Region&lt;br /&gt;
 8 | Needs location constraint ap-northeast-1.&lt;br /&gt;
   \ &amp;quot;ap-northeast-1&amp;quot;&lt;br /&gt;
   / South America (Sao Paulo) Region&lt;br /&gt;
 9 | Needs location constraint sa-east-1.&lt;br /&gt;
   \ &amp;quot;sa-east-1&amp;quot;&lt;br /&gt;
   / If using an S3 clone that only understands v2 signatures&lt;br /&gt;
10 | eg Ceph/Dreamhost&lt;br /&gt;
   | set this and make sure you set the endpoint.&lt;br /&gt;
   \ &amp;quot;other-v2-signature&amp;quot;&lt;br /&gt;
   / If using an S3 clone that understands v4 signatures set this&lt;br /&gt;
11 | and make sure you set the endpoint.&lt;br /&gt;
   \ &amp;quot;other-v4-signature&amp;quot;&lt;br /&gt;
region&amp;gt; 1&lt;br /&gt;
Endpoint for S3 API.&lt;br /&gt;
Leave blank if using AWS to use the default endpoint for the region.&lt;br /&gt;
Specify if using an S3 clone such as Ceph.&lt;br /&gt;
endpoint&amp;gt; &lt;br /&gt;
Location constraint - must be set to match the Region. Used when creating buckets only.&lt;br /&gt;
Choose a number from below, or type in your own value&lt;br /&gt;
 1 / Empty for US Region, Northern Virginia or Pacific Northwest.&lt;br /&gt;
   \ &amp;quot;&amp;quot;&lt;br /&gt;
 2 / US West (Oregon) Region.&lt;br /&gt;
   \ &amp;quot;us-west-2&amp;quot;&lt;br /&gt;
 3 / US West (Northern California) Region.&lt;br /&gt;
   \ &amp;quot;us-west-1&amp;quot;&lt;br /&gt;
 4 / EU (Ireland) Region.&lt;br /&gt;
   \ &amp;quot;eu-west-1&amp;quot;&lt;br /&gt;
 5 / EU Region.&lt;br /&gt;
   \ &amp;quot;EU&amp;quot;&lt;br /&gt;
 6 / Asia Pacific (Singapore) Region.&lt;br /&gt;
   \ &amp;quot;ap-southeast-1&amp;quot;&lt;br /&gt;
 7 / Asia Pacific (Sydney) Region.&lt;br /&gt;
   \ &amp;quot;ap-southeast-2&amp;quot;&lt;br /&gt;
 8 / Asia Pacific (Tokyo) Region.&lt;br /&gt;
   \ &amp;quot;ap-northeast-1&amp;quot;&lt;br /&gt;
 9 / South America (Sao Paulo) Region.&lt;br /&gt;
   \ &amp;quot;sa-east-1&amp;quot;&lt;br /&gt;
location_constraint&amp;gt; 1&lt;br /&gt;
Remote config&lt;br /&gt;
--------------------&lt;br /&gt;
[remote]&lt;br /&gt;
env_auth = false&lt;br /&gt;
access_key_id = access_key&lt;br /&gt;
secret_access_key = secret_key&lt;br /&gt;
region = us-east-1&lt;br /&gt;
endpoint = &lt;br /&gt;
location_constraint = &lt;br /&gt;
--------------------&lt;br /&gt;
y) Yes this is OK&lt;br /&gt;
e) Edit this remote&lt;br /&gt;
d) Delete this remote&lt;br /&gt;
y/e/d&amp;gt; y&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Usage==&lt;br /&gt;
&lt;br /&gt;
This remote is called '''remote''' and can now be used like this:&lt;br /&gt;
&lt;br /&gt;
See all buckets&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone lsd remote:&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Make a new bucket&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone mkdir remote:bucket&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
List the contents of a bucket&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone ls remote:bucket&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Sync '''/home/local/directory''' to the remote bucket, deleting any excess files in the bucket.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone sync /home/local/directory remote:bucket&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Modified Time==&lt;br /&gt;
&lt;br /&gt;
The modified time is stored as metadata on the object, in the '''X-Amz-Meta-Mtime''' key, as a floating point number of seconds since the epoch, accurate to 1 ns.&lt;br /&gt;
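&lt;br /&gt;
As a minimal sketch (illustrative only, not rclone's actual code), encoding and decoding a metadata value of that shape could look like this in Python:&lt;br /&gt;
&lt;br /&gt;
```python
import os

# Encode a file's modification time as a decimal string of seconds
# since the epoch with nanosecond precision, the same shape as the
# X-Amz-Meta-Mtime value described above. (Illustrative sketch only.)
def encode_mtime(path):
    ns = os.stat(path).st_mtime_ns
    return "{}.{:09d}".format(ns // 1_000_000_000, ns % 1_000_000_000)

def decode_mtime(value):
    # Parse the decimal string back into integer nanoseconds.
    secs, _, frac = value.partition(".")
    frac = (frac + "000000000")[:9]
    return int(secs) * 1_000_000_000 + int(frac)
```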
&lt;br /&gt;
==Multipart uploads==&lt;br /&gt;
&lt;br /&gt;
rclone supports multipart uploads with S3, which means that it can upload files bigger than 5 GB. Note that files uploaded with multipart upload don't have an MD5SUM.&lt;br /&gt;
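&lt;br /&gt;
To make the 5 GB limit concrete, here is a small arithmetic sketch; the part size below is a hypothetical example value, not rclone's actual default:&lt;br /&gt;
&lt;br /&gt;
```python
# A single S3 PUT is limited to 5 GB, so bigger files must be split
# into parts. The part size here is a hypothetical example value.
PART_SIZE = 256 * 1024**2  # 256 MiB, assumed for illustration

def parts_needed(file_size, part_size=PART_SIZE):
    # Ceiling division: how many parts cover the whole file.
    return -(-file_size // part_size)

print(parts_needed(12 * 1024**3))  # parts for a 12 GiB file; prints 48
```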
&lt;br /&gt;
==Buckets and Regions==&lt;br /&gt;
&lt;br /&gt;
With Amazon S3 you can list buckets ('''rclone lsd''') using any region, but you can only access the content of a bucket from the region it was created in. If you attempt to access a bucket from the wrong region, you will get an error, '''incorrect region, the bucket is not in 'XXX' region'''.&lt;br /&gt;
&lt;br /&gt;
==Authentication==&lt;br /&gt;
&lt;br /&gt;
There are two ways to supply '''rclone''' with a set of AWS credentials. In order of precedence:&lt;br /&gt;
&lt;br /&gt;
*Directly in the rclone configuration file (as configured by '''rclone config''')&lt;br /&gt;
**set '''access_key_id''' and '''secret_access_key'''&lt;br /&gt;
*Runtime configuration:&lt;br /&gt;
**set '''env_auth''' to '''true''' in the config file&lt;br /&gt;
**Exporting the following environment variables before running '''rclone'''&lt;br /&gt;
***Access Key ID: '''AWS_ACCESS_KEY_ID''' or '''AWS_ACCESS_KEY'''&lt;br /&gt;
***Secret Access Key: '''AWS_SECRET_ACCESS_KEY''' or '''AWS_SECRET_KEY'''&lt;br /&gt;
**Running '''rclone''' on an EC2 instance with an IAM role&lt;br /&gt;
&lt;br /&gt;
If none of these options ends up providing '''rclone''' with AWS credentials then S3 interaction will be unauthenticated (see below).&lt;br /&gt;
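&lt;br /&gt;
The precedence above can be sketched as follows (a simplified illustration, not rclone's actual code):&lt;br /&gt;
&lt;br /&gt;
```python
import os

# Sketch of the credential lookup order described above: config file
# keys first, then environment variables, then runtime credentials
# (e.g. an EC2 IAM role). Illustrative only.
def resolve_credentials(config):
    # 1. Keys set directly in the rclone config file win.
    if config.get("access_key_id") and config.get("secret_access_key"):
        return (config["access_key_id"], config["secret_access_key"])
    # 2. Otherwise, with env_auth = true, fall back to the environment.
    if config.get("env_auth") == "true":
        key = os.environ.get("AWS_ACCESS_KEY_ID") or os.environ.get("AWS_ACCESS_KEY")
        secret = os.environ.get("AWS_SECRET_ACCESS_KEY") or os.environ.get("AWS_SECRET_KEY")
        if key and secret:
            return (key, secret)
        return "runtime"  # e.g. IAM role on an EC2 instance
    # 3. No credentials at all: anonymous access.
    return None
```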
&lt;br /&gt;
==Anonymous access to public buckets==&lt;br /&gt;
&lt;br /&gt;
If you want to use rclone to access a public bucket, configure with a blank '''access_key_id''' and '''secret_access_key'''. For example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
No remotes found - make a new one&lt;br /&gt;
n) New remote&lt;br /&gt;
q) Quit config&lt;br /&gt;
n/q&amp;gt; n&lt;br /&gt;
name&amp;gt; anons3&lt;br /&gt;
What type of source is it?&lt;br /&gt;
Choose a number from below&lt;br /&gt;
 1) amazon cloud drive&lt;br /&gt;
 2) b2&lt;br /&gt;
 3) drive&lt;br /&gt;
 4) dropbox&lt;br /&gt;
 5) google cloud storage&lt;br /&gt;
 6) swift&lt;br /&gt;
 7) hubic&lt;br /&gt;
 8) local&lt;br /&gt;
 9) onedrive&lt;br /&gt;
10) s3&lt;br /&gt;
11) yandex&lt;br /&gt;
type&amp;gt; 10&lt;br /&gt;
Get AWS credentials from runtime (environment variables or EC2 meta data if no env vars). Only applies if access_key_id and secret_access_key is blank.&lt;br /&gt;
Choose a number from below, or type in your own value&lt;br /&gt;
 * Enter AWS credentials in the next step&lt;br /&gt;
 1) false&lt;br /&gt;
 * Get AWS credentials from the environment (env vars or IAM)&lt;br /&gt;
 2) true&lt;br /&gt;
env_auth&amp;gt; 1&lt;br /&gt;
AWS Access Key ID - leave blank for anonymous access or runtime credentials.&lt;br /&gt;
access_key_id&amp;gt;&lt;br /&gt;
AWS Secret Access Key (password) - leave blank for anonymous access or runtime credentials.&lt;br /&gt;
secret_access_key&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then use it as normal with the name of the public bucket, e.g.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rclone lsd anons3:1000genomes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You will be able to list and copy data but not upload it.&lt;br /&gt;
&lt;br /&gt;
==Ceph==&lt;br /&gt;
&lt;br /&gt;
Ceph is an object storage system which presents an Amazon S3 interface.&lt;br /&gt;
&lt;br /&gt;
To use rclone with Ceph, you need to set the following parameters in the config.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
access_key_id = Whatever&lt;br /&gt;
secret_access_key = Whatever&lt;br /&gt;
endpoint = https://ceph.endpoint.goes.here/&lt;br /&gt;
region = other-v2-signature&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note also that Ceph sometimes puts '''/''' in the passwords it gives users. If you read the secret access key using the command line tools you will get a JSON blob with the '''/''' escaped as '''\/'''. Make sure you write only '''/''' (not '''\/''') in the secret access key in the rclone config file.&lt;br /&gt;
&lt;br /&gt;
For example, the dump from Ceph looks something like this (irrelevant keys removed).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
{&lt;br /&gt;
    &amp;quot;user_id&amp;quot;: &amp;quot;xxx&amp;quot;,&lt;br /&gt;
    &amp;quot;display_name&amp;quot;: &amp;quot;xxxx&amp;quot;,&lt;br /&gt;
    &amp;quot;keys&amp;quot;: [&lt;br /&gt;
        {&lt;br /&gt;
            &amp;quot;user&amp;quot;: &amp;quot;xxx&amp;quot;,&lt;br /&gt;
            &amp;quot;access_key&amp;quot;: &amp;quot;xxxxxx&amp;quot;,&lt;br /&gt;
            &amp;quot;secret_key&amp;quot;: &amp;quot;xxxxxx\/xxxx&amp;quot;&lt;br /&gt;
        }&lt;br /&gt;
    ]&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Because this is a JSON dump, it encodes the '''/''' as '''\/''', so if you use the secret key as '''xxxxxx/xxxx''' it will work fine.&lt;br /&gt;
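&lt;br /&gt;
A quick way to check this JSON escaping behaviour for yourself:&lt;br /&gt;
&lt;br /&gt;
```python
import json

# "\/" in a JSON document is simply an escaped "/": parsing the dump
# yields the real secret key with a plain "/" in it.
dump = '{"secret_key": "xxxxxx\\/xxxx"}'
secret = json.loads(dump)["secret_key"]
print(secret)  # prints xxxxxx/xxxx
```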
&lt;br /&gt;
[[Category:RClone]]&lt;br /&gt;
[[Category:Linux]]&lt;br /&gt;
[[Category:Contents]]&lt;/div&gt;</summary>
		<author><name>Adam.birds</name></author>
		
	</entry>
</feed>