Change the Key noun to Object in the storage package.
This involves changes in all documentation, renaming a module
and a test module, and updating the calls and variable names in
a regression test module.

Fixes googleapis#544.
dhermes committed Jan 20, 2015
1 parent 1bfa469 commit b5ceb5f
Showing 21 changed files with 824 additions and 818 deletions.
6 changes: 3 additions & 3 deletions README.rst
@@ -95,9 +95,9 @@ to Cloud Storage using this Client Library.
import gcloud.storage
bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
# Then do other things...
key = bucket.get_key('/remote/path/to/file.txt')
print key.get_contents_as_string()
key.set_contents_from_string('New contents!')
object_ = bucket.get_object('/remote/path/to/file.txt')
print object_.get_contents_as_string()
object_.set_contents_from_string('New contents!')
bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')
Contributing
67 changes: 31 additions & 36 deletions docs/_components/storage-getting-started.rst
@@ -4,7 +4,7 @@ Getting started with Cloud Storage
This tutorial focuses on using ``gcloud`` to access
Google Cloud Storage.
We'll go through the basic concepts,
how to operate on buckets and keys,
how to operate on buckets and objects,
and how to handle access control,
among other things.

@@ -113,33 +113,28 @@ by recognizing forward-slashes (``/``)
so if you want to group data into "directories",
you can do that.

The fundamental container for a file in Cloud Storage
is called an Object,
however ``gcloud`` uses the term ``Key``
to avoid confusion between ``object`` and ``Object``.

If you want to set some data,
you just create a ``Key`` inside your bucket
and store your data inside the key::
you just create an ``Object`` inside your bucket
and store your data inside it::

>>> key = bucket.new_key('greeting.txt')
>>> key.set_contents_from_string('Hello world!')
>>> object_ = bucket.new_object('greeting.txt')
>>> object_.set_contents_from_string('Hello world!')

:func:`new_key <gcloud.storage.bucket.Bucket.new_key>`
creates a :class:`Key <gcloud.storage.key.Key>` object locally
:func:`new_object <gcloud.storage.bucket.Bucket.new_object>`
creates an :class:`Object <gcloud.storage.object_.Object>` instance locally
and
:func:`set_contents_from_string <gcloud.storage.key.Key.set_contents_from_string>`
allows you to put a string into the key.
:func:`set_contents_from_string <gcloud.storage.object_.Object.set_contents_from_string>`
allows you to put a string into the object.
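The two-step create-then-store flow described above can be sketched with a minimal in-memory stand-in (``FakeBucket`` and ``FakeObject`` are hypothetical illustrations, not the real ``gcloud.storage`` classes):

```python
class FakeObject:
    """Hypothetical stand-in for a storage object handle."""
    def __init__(self, bucket, name):
        self.bucket = bucket
        self.name = name

    def set_contents_from_string(self, data):
        # Storing data is what actually "creates" the object remotely.
        self.bucket.contents[self.name] = data


class FakeBucket:
    """Hypothetical stand-in for a storage bucket."""
    def __init__(self):
        self.contents = {}

    def new_object(self, name):
        # Creates a local handle only; nothing is stored yet.
        return FakeObject(self, name)


bucket = FakeBucket()
obj = bucket.new_object('greeting.txt')
obj.set_contents_from_string('Hello world!')
print(bucket.contents['greeting.txt'])  # Hello world!
```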

Now we can test if it worked::

>>> key = bucket.get_key('greeting.txt')
>>> print key.get_contents_as_string()
>>> object_ = bucket.get_object('greeting.txt')
>>> print object_.get_contents_as_string()
Hello world!

What if you want to save the contents to a file?

>>> key.get_contents_to_filename('greetings.txt')
>>> object_.get_contents_to_filename('greetings.txt')

Then you can look at the file in a terminal::

@@ -149,32 +144,32 @@ Then you can look at the file in a terminal::
And what about when you're not dealing with text?
That's pretty simple too::

>>> key = bucket.new_key('kitten.jpg')
>>> key.set_contents_from_filename('kitten.jpg')
>>> object_ = bucket.new_object('kitten.jpg')
>>> object_.set_contents_from_filename('kitten.jpg')

And to test whether it worked?

>>> key = bucket.get_key('kitten.jpg')
>>> key.get_contents_to_filename('kitten2.jpg')
>>> object_ = bucket.get_object('kitten.jpg')
>>> object_.get_contents_to_filename('kitten2.jpg')

and check if they are the same in a terminal::

$ diff kitten.jpg kitten2.jpg

Notice that we're using
:func:`get_key <gcloud.storage.bucket.Bucket.get_key>`
to retrieve a key we know exists remotely.
If the key doesn't exist, it will return ``None``.
:func:`get_object <gcloud.storage.bucket.Bucket.get_object>`
to retrieve an object we know exists remotely.
If the object doesn't exist, it will return ``None``.

.. note:: ``get_key`` is **not** retrieving the entire object's data.
.. note:: ``get_object`` is **not** retrieving the entire object's data.

If you want to "get-or-create" the key
If you want to "get-or-create" the object
(that is, overwrite it if it already exists),
you can use :func:`new_key <gcloud.storage.bucket.Bucket.new_key>`.
However, keep in mind, the key is not created
you can use :func:`new_object <gcloud.storage.bucket.Bucket.new_object>`.
However, keep in mind, the object is not created
until you store some data inside of it.
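The lookup semantics described above can be illustrated with a hypothetical in-memory sketch (not the real gcloud classes): ``get_object`` returns ``None`` for a missing name, while ``new_object`` hands back a local placeholder that only exists remotely once data is stored.

```python
class FakeBucket:
    """Hypothetical sketch of get_object vs. new_object semantics."""
    def __init__(self):
        self.contents = {}

    def get_object(self, name):
        # Returns None when the object does not exist remotely.
        return name if name in self.contents else None

    def new_object(self, name):
        # A local placeholder; nothing exists remotely until data is stored.
        return name


bucket = FakeBucket()
assert bucket.get_object('missing.txt') is None
placeholder = bucket.new_object('missing.txt')   # local placeholder only
assert bucket.get_object('missing.txt') is None  # still no data stored
bucket.contents[placeholder] = 'data'            # storing data creates it
assert bucket.get_object('missing.txt') == 'missing.txt'
```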

If you want to check whether a key exists,
If you want to check whether an object exists,
you can use the ``in`` operator in Python::

>>> print 'kitten.jpg' in bucket
@@ -191,17 +186,17 @@ to retrieve the bucket object::

>>> bucket = connection.get_bucket('my-bucket')

If you want to get all the keys in the bucket,
If you want to get all the objects in the bucket,
you can use
:func:`get_all_keys <gcloud.storage.bucket.Bucket.get_all_keys>`::
:func:`get_all_objects <gcloud.storage.bucket.Bucket.get_all_objects>`::

>>> keys = bucket.get_all_keys()
>>> objects = bucket.get_all_objects()

However, if you're looking to iterate through the keys,
However, if you're looking to iterate through the objects,
you can use the bucket itself as an iterator::

>>> for key in bucket:
... print key
>>> for object_ in bucket:
... print object_
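Both idioms above — the ``in`` membership test and using the bucket as an iterator — can be modeled by defining ``__contains__`` and ``__iter__`` on a bucket class. This is a hypothetical sketch of the pattern, not the library's actual implementation:

```python
class FakeBucket:
    """Hypothetical sketch of a bucket supporting 'in' and iteration."""
    def __init__(self, names):
        self._names = list(names)

    def __contains__(self, name):
        # Supports: 'kitten.jpg' in bucket
        return name in self._names

    def __iter__(self):
        # Supports: for object_ in bucket: ...
        return iter(self._names)


bucket = FakeBucket(['greeting.txt', 'kitten.jpg'])
assert 'kitten.jpg' in bucket
assert list(bucket) == ['greeting.txt', 'kitten.jpg']
```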

Deleting a bucket
-----------------
@@ -234,7 +229,7 @@ Managing access control
-----------------------

Cloud Storage provides fine-grained access control
for both buckets and keys.
for both buckets and objects.
`gcloud` tries to simplify access control
by working with entities and "grants".
On any ACL,
18 changes: 9 additions & 9 deletions docs/_components/storage-quickstart.rst
@@ -53,22 +53,22 @@ and instantiating the demo connection::
>>> connection = demo.get_connection()

Once you have the connection,
you can create buckets and keys::
you can create buckets and objects::

>>> connection.get_all_buckets()
[<Bucket: ...>, ...]
>>> bucket = connection.create_bucket('my-new-bucket')
>>> print bucket
<Bucket: my-new-bucket>
>>> key = bucket.new_key('my-test-file.txt')
>>> print key
<Key: my-new-bucket, my-test-file.txt>
>>> key = key.set_contents_from_string('this is test content!')
>>> print key.get_contents_as_string()
>>> object_ = bucket.new_object('my-test-file.txt')
>>> print object_
<Object: my-new-bucket, my-test-file.txt>
>>> object_ = object_.set_contents_from_string('this is test content!')
>>> print object_.get_contents_as_string()
'this is test content!'
>>> print bucket.get_all_keys()
[<Key: my-new-bucket, my-test-file.txt>]
>>> key.delete()
>>> print bucket.get_all_objects()
[<Object: my-new-bucket, my-test-file.txt>]
>>> object_.delete()
>>> bucket.delete()

.. note::
6 changes: 3 additions & 3 deletions docs/index.rst
@@ -11,7 +11,7 @@
datastore-batches
storage-api
storage-buckets
storage-keys
storage-objects
storage-acl


@@ -48,5 +48,5 @@ Cloud Storage
from gcloud import storage
bucket = storage.get_bucket('<your-bucket-name>', '<your-project-id>')
key = bucket.new_key('my-test-file.txt')
key = key.upload_contents_from_string('this is test content!')
object_ = bucket.new_object('my-test-file.txt')
object_ = object_.upload_contents_from_string('this is test content!')
2 changes: 1 addition & 1 deletion docs/storage-api.rst
@@ -1,6 +1,6 @@
.. toctree::
:maxdepth: 0
:hidden:
:hidden:

Storage
-------
7 changes: 0 additions & 7 deletions docs/storage-keys.rst

This file was deleted.

7 changes: 7 additions & 0 deletions docs/storage-objects.rst
@@ -0,0 +1,7 @@
Objects
~~~~~~~

.. automodule:: gcloud.storage.object_
:members:
:undoc-members:
:show-inheritance:
8 changes: 4 additions & 4 deletions gcloud/storage/__init__.py
@@ -19,9 +19,9 @@
>>> import gcloud.storage
>>> bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
>>> # Then do other things...
>>> key = bucket.get_key('/remote/path/to/file.txt')
>>> print key.get_contents_as_string()
>>> key.set_contents_from_string('New contents!')
>>> object_ = bucket.get_object('/remote/path/to/file.txt')
>>> print object_.get_contents_as_string()
>>> object_.set_contents_from_string('New contents!')
>>> bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')
The main concepts with this API are:
@@ -32,7 +32,7 @@
- :class:`gcloud.storage.bucket.Bucket` which represents a particular
bucket (akin to a mounted disk on a computer).
- :class:`gcloud.storage.key.Key` which represents a pointer to a
- :class:`gcloud.storage.object_.Object` which represents a pointer to a
particular entity in Cloud Storage (akin to a file path on a remote
machine).
"""
8 changes: 4 additions & 4 deletions gcloud/storage/_helpers.py
@@ -79,11 +79,11 @@ def batch(self):
... bucket.enable_versioning()
... bucket.disable_website()
or for a key::
or for an object::
>>> with key.batch:
... key.content_type = 'image/jpeg'
... key.content_encoding = 'gzip'
>>> with object_.batch:
... object_.content_type = 'image/jpeg'
... object_.content_encoding = 'gzip'
Updates will be aggregated and sent as a single call to
:meth:`_patch_properties` IFF the ``with`` block exits without
30 changes: 16 additions & 14 deletions gcloud/storage/acl.py
@@ -491,15 +491,15 @@ class DefaultObjectACL(BucketACL):


class ObjectACL(ACL):
"""An ACL specifically for a key."""
"""An ACL specifically for a Cloud Storage Object.
def __init__(self, key):
"""
:type key: :class:`gcloud.storage.key.Key`
:param key: The key that this ACL corresponds to.
"""
:type object_: :class:`gcloud.storage.object_.Object`
:param object_: The object that this ACL corresponds to.
"""

def __init__(self, object_):
super(ObjectACL, self).__init__()
self.key = key
self.object_ = object_

def reload(self):
"""Reload the ACL data from Cloud Storage.
@@ -509,16 +509,17 @@ def reload(self):
"""
self.entities.clear()

url_path = '%s/acl' % self.key.path
found = self.key.connection.api_request(method='GET', path=url_path)
url_path = '%s/acl' % self.object_.path
found = self.object_.connection.api_request(method='GET',
path=url_path)
self.loaded = True
for entry in found['items']:
self.add_entity(self.entity_from_dict(entry))

return self

def save(self, acl=None):
"""Save the ACL data for this key.
"""Save the ACL data for this object.
:type acl: :class:`gcloud.storage.acl.ACL`
:param acl: The ACL object to save. If left blank, this will
@@ -531,8 +532,9 @@ def save(self, acl=None):
save_to_backend = True

if save_to_backend:
result = self.key.connection.api_request(
method='PATCH', path=self.key.path, data={'acl': list(acl)},
result = self.object_.connection.api_request(
method='PATCH', path=self.object_.path,
data={'acl': list(acl)},
query_params={'projection': 'full'})
self.entities.clear()
for entry in result['acl']:
@@ -542,11 +544,11 @@ def clear(self):
return self

def clear(self):
"""Remove all ACL rules from the key.
"""Remove all ACL rules from the object.
Note that this won't actually remove *ALL* the rules, but it
will remove all the non-default rules. In short, you'll still
have access to a key that you created even after you clear ACL
have access to an object that you created even after you clear ACL
rules with this method.
"""
return self.save([])
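The ``clear`` semantics above — removing the non-default rules by delegating to ``save`` with an empty rule list — can be illustrated with a hypothetical sketch (not the real ``ObjectACL`` class):

```python
class FakeObjectACL:
    """Hypothetical sketch: clear() is just save([])."""
    def __init__(self, rules):
        self.rules = list(rules)

    def save(self, acl=None):
        if acl is None:
            acl = self.rules
        self.rules = list(acl)
        return self

    def clear(self):
        # Saves an empty rule list; server-side default (e.g. owner)
        # grants would still apply, so access is not fully revoked.
        return self.save([])


acl = FakeObjectACL([('user-alice', 'READER'), ('user-bob', 'OWNER')])
acl.clear()
assert acl.rules == []
```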
