Merging in storage samples. #23

Merged 1 commit on May 21, 2015
27 changes: 27 additions & 0 deletions storage/README.md
@@ -0,0 +1,27 @@
## Python Samples for Google Cloud Storage

Two samples:

1. ``list_objects.py`` lists objects in a bucket.
2. ``compose_objects.py`` composes objects together to create another.

See the docstring in each sample for usage, or run a sample with ``--help`` to print its help text.

### Setup

Before running the samples, you'll need the Google Cloud SDK in order to set up authentication.

1. Install the [Google Cloud SDK](https://cloud.google.com/sdk/), including the [gcloud tool](https://cloud.google.com/sdk/gcloud/), and [gcloud app component](https://cloud.google.com/sdk/gcloud-app).
2. Set up the gcloud tool.

```
gcloud components update app
gcloud auth login
gcloud config set project <your-app-id>
```

You will also need to install the dependencies using [pip](https://pypi.python.org/pypi/pip):

```
pip install -r requirements.txt
```
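
Both samples expose a ``main(argv)`` function, so they can also be driven from another Python script instead of the shell. A minimal sketch, assuming the repository root is on the import path and that ``my-bucket`` and the file names are placeholders rather than anything defined in this PR:

```
# Hypothetical driver script; assumes it runs from the repository root so the
# storage/ package is importable, and that `gcloud auth login` and
# `gcloud config set project ...` have already been run.
from storage import compose_objects, list_objects

# Each sample's main() parses argv[1:], so the first element is ignored.
list_objects.main(['list_objects.py', 'my-bucket'])
compose_objects.main([
    'compose_objects.py', 'my-bucket', 'destination.txt',
    'file1.txt', 'file2.txt',
])
```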
Empty file added storage/__init__.py
Empty file.
106 changes: 106 additions & 0 deletions storage/compose_objects.py
@@ -0,0 +1,106 @@
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# [START all]
"""Command-line sample application for composing objects using the Cloud
Storage API.

Before running, authenticate with the Google Cloud SDK by running:
$ gcloud auth login

Create at least two sample files:
$ echo "File 1" > file1.txt
$ echo "File 2" > file2.txt

Example invocation:
$ python compose_objects.py my-bucket destination.txt file1.txt file2.txt

Usage:
$ python compose_objects.py <your-bucket> <destination-file-name> \
<source-1> [... <source-n>]

You can also get help on all the command-line flags the program understands
by running:
$ python compose_objects.py --help

"""

import argparse
import sys
import json

from apiclient import discovery
from oauth2client.client import GoogleCredentials

# Parser for command-line arguments.
parser = argparse.ArgumentParser(
    description=__doc__,
    formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('bucket')
parser.add_argument('destination', help='Destination file name')
parser.add_argument('sources', nargs='+', help='Source files to compose')


def main(argv):
    # Parse the command-line flags.
    args = parser.parse_args(argv[1:])

    # Get the application default credentials. When running locally, these are
    # available after running `gcloud auth login`. When running on compute
    # engine, these are available from the environment.
    credentials = GoogleCredentials.get_application_default()

    # Construct the service object for interacting with the Cloud Storage API.
    service = discovery.build('storage', 'v1', credentials=credentials)

    # Upload the source files.
    for filename in args.sources:
        req = service.objects().insert(
            media_body=filename,
            name=filename,
            bucket=args.bucket)
        resp = req.execute()
        print '> Uploaded source file %s' % filename
        print json.dumps(resp, indent=2)

    # Construct a request to compose the source files into the destination.
    compose_req_body = {
        'sourceObjects': [{'name': filename} for filename in args.sources],
        'destination': {
            'contentType': 'text/plain',  # required
        }
    }
    req = service.objects().compose(
        destinationBucket=args.bucket,
        destinationObject=args.destination,
        body=compose_req_body)
    resp = req.execute()
    print '> Composed files into %s' % args.destination
    print json.dumps(resp, indent=2)

    # Download and print the composed object.
    req = service.objects().get_media(
        bucket=args.bucket,
        object=args.destination)

    res = req.execute()
    print '> Composed file contents:'
    print res


if __name__ == '__main__':
    main(sys.argv)
# [END all]
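
The heart of this sample is the body passed to `objects().compose()`: a list of `sourceObjects` entries plus the `destination` metadata, whose `contentType` the API requires. A small helper that builds the same structure, shown only as a sketch and not part of this PR:

```
def build_compose_body(source_names, content_type='text/plain'):
    """Build a request body for storage.objects.compose.

    `source_names` are names of objects that already exist in the bucket.
    """
    return {
        'sourceObjects': [{'name': name} for name in source_names],
        'destination': {
            'contentType': content_type,  # required by the API
        },
    }

# Produces the same body the sample constructs inline:
compose_req_body = build_compose_body(['file1.txt', 'file2.txt'])
```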
78 changes: 78 additions & 0 deletions storage/list_objects.py
@@ -0,0 +1,78 @@
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# [START all]
"""Command-line sample application for listing all objects
in a bucket using the Cloud Storage API.

Before running, authenticate with the Google Cloud SDK by running:
$ gcloud auth login

Usage:
$ python list_objects.py <your-bucket>

You can also get help on all the command-line flags the program understands
by running:
$ python list_objects.py --help

"""

import argparse
import sys
import json

from apiclient import discovery
from oauth2client.client import GoogleCredentials


# Parser for command-line arguments.
parser = argparse.ArgumentParser(
    description=__doc__,
    formatter_class=argparse.RawDescriptionHelpFormatter)
parser.add_argument('bucket')


def main(argv):
    # Parse the command-line flags.
    args = parser.parse_args(argv[1:])

    # Get the application default credentials. When running locally, these are
    # available after running `gcloud auth login`. When running on compute
    # engine, these are available from the environment.
    credentials = GoogleCredentials.get_application_default()

    # Construct the service object for interacting with the Cloud Storage API.
    service = discovery.build('storage', 'v1', credentials=credentials)

    # Make a request to buckets.get to retrieve information about the bucket.
    req = service.buckets().get(bucket=args.bucket)
    resp = req.execute()
    print json.dumps(resp, indent=2)

    # Create a request to objects.list to retrieve a list of objects.
    fields_to_return = \
        'nextPageToken,items(name,size,contentType,metadata(my-key))'
    req = service.objects().list(bucket=args.bucket, fields=fields_to_return)

    # If you have too many items to list in one request, list_next() will
    # automatically handle paging with the pageToken.
    while req is not None:
        resp = req.execute()
        print json.dumps(resp, indent=2)
        req = service.objects().list_next(req, resp)

if __name__ == '__main__':
    main(sys.argv)
# [END all]
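
The `while req is not None` loop is the client library's paging idiom: `list_next()` returns a new request while a `nextPageToken` is present and `None` once all pages have been consumed. A variant that collects object names instead of printing whole responses, sketched here under the assumption that `service` was built the same way as in the sample above:

```
def list_object_names(service, bucket):
    """Return the names of all objects in `bucket`, following page tokens."""
    names = []
    req = service.objects().list(
        bucket=bucket, fields='nextPageToken,items(name)')
    while req is not None:
        resp = req.execute()
        # 'items' is missing from the response when a page is empty.
        names.extend(item['name'] for item in resp.get('items', []))
        req = service.objects().list_next(req, resp)
    return names
```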
1 change: 1 addition & 0 deletions storage/requirements.txt
@@ -0,0 +1 @@
google-api-python-client>=1.4.0
24 changes: 24 additions & 0 deletions storage/tests/test_list_objects.py
@@ -0,0 +1,24 @@
# Copyright 2015, Google, Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from storage.list_objects import main
from tests import CloudBaseTest


class TestListObjects(CloudBaseTest):
    def test_main(self):
        args = [
            'ignored_command_name',
            self.constants['bucketName']
        ]
        # The test passes if main() runs to completion without raising.
        main(args)
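
This PR only adds a test for `list_objects.py`; a companion test for `compose_objects.py` would have to create its source files first. A sketch of how that could look with the same base class (entirely hypothetical, not part of this change):

```
import os
import tempfile

from storage.compose_objects import main
from tests import CloudBaseTest


class TestComposeObjects(CloudBaseTest):
    def test_main(self):
        # The sample uploads local files by name, so create them in a
        # temporary working directory first.
        tmp_dir = tempfile.mkdtemp()
        old_cwd = os.getcwd()
        os.chdir(tmp_dir)
        try:
            for name in ('file1.txt', 'file2.txt'):
                with open(name, 'w') as source_file:
                    source_file.write('Contents of %s\n' % name)

            args = [
                'ignored_command_name',
                self.constants['bucketName'],
                'destination.txt',
                'file1.txt',
                'file2.txt',
            ]
            # The test passes if main() runs to completion without raising.
            main(args)
        finally:
            os.chdir(old_cwd)
```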
57 changes: 57 additions & 0 deletions tests/__init__.py
@@ -0,0 +1,57 @@
# Copyright 2015, Google, Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
Common testing utilities between samples
"""

import json
import os
import unittest


BUCKET_NAME_ENV = 'TEST_BUCKET_NAME'
PROJECT_ID_ENV = 'TEST_PROJECT_ID'
RESOURCE_PATH = os.path.join(os.getcwd(), 'resources')


class CloudBaseTest(unittest.TestCase):

    def setUp(self):
        # A hack to prevent get_application_default from going GAE route.
        self._server_software_org = os.environ.get('SERVER_SOFTWARE')
        os.environ['SERVER_SOFTWARE'] = ''

        # Constants from environment
        test_bucket_name = os.environ.get(BUCKET_NAME_ENV, '')
        test_project_id = os.environ.get(PROJECT_ID_ENV, '')
        if not test_project_id or not test_bucket_name:
            raise Exception('You need to define an env var "%s" and "%s" to '
                            'run the test.'
                            % (PROJECT_ID_ENV, BUCKET_NAME_ENV))

        # Constants from resources/constants.json
        with open(
                os.path.join(RESOURCE_PATH, 'constants.json'),
                'r') as constants_file:
            self.constants = json.load(constants_file)

        self.constants['projectId'] = test_project_id
        self.constants['bucketName'] = test_bucket_name
        self.constants['cloudStorageInputURI'] = (
            self.constants['cloudStorageInputURI'] % test_bucket_name)
        self.constants['cloudStorageOutputURI'] = (
            self.constants['cloudStorageOutputURI'] % test_bucket_name)

    def tearDown(self):
        os.environ['SERVER_SOFTWARE'] = self._server_software_org
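
`setUp()` reads `resources/constants.json` relative to the current working directory and interpolates the bucket name into the two `cloudStorage*URI` values. The exact contents of that file are not shown in this PR; a hypothetical minimal version could be generated like this (the `gs://` URI patterns are assumptions, only the key names come from the code above):

```
import json
import os

# Assumed minimal shape of resources/constants.json; the '%s' placeholders
# are filled with TEST_BUCKET_NAME by CloudBaseTest.setUp().
constants = {
    'cloudStorageInputURI': 'gs://%s/input',
    'cloudStorageOutputURI': 'gs://%s/output',
}

if not os.path.isdir('resources'):
    os.makedirs('resources')
with open(os.path.join('resources', 'constants.json'), 'w') as f:
    json.dump(constants, f, indent=2)
```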