Merge pull request #2657 from dhermes/docstring-doctest-prework
Removing explicit (and implicit) doctest blocks from Sphinx docs.
dhermes authored Nov 1, 2016
2 parents (c56e6e7 + 372290f), commit 1f8a79d
Showing 14 changed files with 241 additions and 152 deletions.
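All of the hunks apply the same mechanical rewrite: each Sphinx doctest block, whether explicit (a ``.. doctest::`` directive) or implicit (a ``::`` literal block of ``>>>`` lines that the doctest builder also collects), becomes a plain ``.. code-block:: python`` directive, so the examples render the same but are no longer picked up as runnable tests. A minimal before/after sketch of the pattern, abbreviated from the client.py hunk below:

Before (implicit doctest block):

    Using query to search a datastore::

        >>> from google.cloud import datastore
        >>> client = datastore.Client()

After (plain code block):

    Using query to search a datastore:

    .. code-block:: python

        >>> from google.cloud import datastore
        >>> client = datastore.Client()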
datastore/google/cloud/datastore/client.py (36 changes: 21 additions & 15 deletions)

@@ -447,29 +447,35 @@ def query(self, **kwargs):
 Passes our ``project``.

-Using query to search a datastore::
-
-    >>> from google.cloud import datastore
-    >>> client = datastore.Client()
-    >>> query = client.query(kind='MyKind')
-    >>> query.add_filter('property', '=', 'val')
+Using query to search a datastore:
+
+.. code-block:: python
+
+    >>> from google.cloud import datastore
+    >>> client = datastore.Client()
+    >>> query = client.query(kind='MyKind')
+    >>> query.add_filter('property', '=', 'val')

 Using the query iterator's
 :meth:`~google.cloud.datastore.query.Iterator.next_page` method:

-    >>> query_iter = query.fetch()
-    >>> entities, more_results, cursor = query_iter.next_page()
-    >>> entities
-    [<list of Entity unmarshalled from protobuf>]
-    >>> more_results
-    <boolean of more results>
-    >>> cursor
-    <string containing cursor where fetch stopped>
+.. code-block:: python
+
+    >>> query_iter = query.fetch()
+    >>> entities, more_results, cursor = query_iter.next_page()
+    >>> entities
+    [<list of Entity unmarshalled from protobuf>]
+    >>> more_results
+    <boolean of more results>
+    >>> cursor
+    <string containing cursor where fetch stopped>

 Under the hood this is doing:

-    >>> connection.run_query('project', query.to_protobuf())
-    [<list of Entity Protobufs>], cursor, more_results, skipped_results
+.. code-block:: python
+
+    >>> connection.run_query('project', query.to_protobuf())
+    [<list of Entity Protobufs>], cursor, more_results, skipped_results

 :type kwargs: dict
 :param kwargs: Parameters for initializing and instance of
datastore/google/cloud/datastore/connection.py (18 changes: 11 additions & 7 deletions)

@@ -474,16 +474,20 @@ def lookup(self, project, key_pbs,
 as output). It is used under the hood in
 :meth:`Client.get() <.datastore.client.Client.get>`:

-    >>> from google.cloud import datastore
-    >>> client = datastore.Client(project='project')
-    >>> key = client.key('MyKind', 1234)
-    >>> client.get(key)
-    [<Entity object>]
+.. code-block:: python
+
+    >>> from google.cloud import datastore
+    >>> client = datastore.Client(project='project')
+    >>> key = client.key('MyKind', 1234)
+    >>> client.get(key)
+    [<Entity object>]

 Using a :class:`Connection` directly:

-    >>> connection.lookup('project', [key.to_protobuf()])
-    [<Entity protobuf>]
+.. code-block:: python
+
+    >>> connection.lookup('project', [key.to_protobuf()])
+    [<Entity protobuf>]

 :type project: str
 :param project: The project to look up the keys in.
datastore/google/cloud/datastore/entity.py (21 changes: 14 additions & 7 deletions)

@@ -37,7 +37,10 @@ class Entity(dict):
 This means you could take an existing entity and change the key
 to duplicate the object.

-Use :func:`google.cloud.datastore.get` to retrieve an existing entity.
+Use :meth:`~google.cloud.datastore.client.Client.get` to retrieve an
+existing entity:
+
+.. code-block:: python

     >>> from google.cloud import datastore
     >>> client = datastore.Client()

@@ -47,16 +50,20 @@ class Entity(dict):
 You can the set values on the entity just like you would on any
 other dictionary.

-    >>> entity['age'] = 20
-    >>> entity['name'] = 'JJ'
-    >>> entity
-    <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
+.. code-block:: python
+
+    >>> entity['age'] = 20
+    >>> entity['name'] = 'JJ'
+    >>> entity
+    <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>

 And you can convert an entity to a regular Python dictionary with the
 ``dict`` builtin:

-    >>> dict(entity)
-    {'age': 20, 'name': 'JJ'}
+.. code-block:: python
+
+    >>> dict(entity)
+    {'age': 20, 'name': 'JJ'}

 .. note::
datastore/google/cloud/datastore/key.py (6 changes: 6 additions & 0 deletions)

@@ -25,20 +25,26 @@ class Key(object):
 To create a basic key:

+.. code-block:: python
+
     >>> Key('EntityKind', 1234)
     <Key[{'kind': 'EntityKind', 'id': 1234}]>
     >>> Key('EntityKind', 'foo')
     <Key[{'kind': 'EntityKind', 'name': 'foo'}]>

 To create a key with a parent:

+.. code-block:: python
+
     >>> Key('Parent', 'foo', 'Child', 1234)
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>
     >>> Key('Child', 1234, parent=parent_key)
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>

 To create a partial key:

+.. code-block:: python
+
     >>> Key('Parent', 'foo', 'Child')
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child'}]>
datastore/google/cloud/datastore/transaction.py (31 changes: 22 additions & 9 deletions)

@@ -25,22 +25,27 @@ class Transaction(Batch):
 For example, the following snippet of code will put the two ``save``
 operations (either ``insert`` or ``upsert``) into the same
-mutation, and execute those within a transaction::
+mutation, and execute those within a transaction:
+
+.. code-block:: python

     >>> from google.cloud import datastore
     >>> client = datastore.Client()
     >>> with client.transaction():
     ...     client.put_multi([entity1, entity2])

-Because it derives from :class:`Batch <.datastore.batch.Batch>`,
-:class:`Transaction` also provides :meth:`put` and :meth:`delete` methods::
+Because it derives from :class:`~google.cloud.datastore.batch.Batch`,
+:class:`Transaction` also provides :meth:`put` and :meth:`delete` methods:
+
+.. code-block:: python

     >>> with client.transaction() as xact:
     ...     xact.put(entity1)
     ...     xact.delete(entity2.key)

 By default, the transaction is rolled back if the transaction block
-exits with an error::
+exits with an error:
+
+.. code-block:: python

     >>> with client.transaction():
     ...     do_some_work()

@@ -49,9 +54,13 @@ class Transaction(Batch):
 If the transaction block exists without an exception, it will commit
 by default.

-.. warning:: Inside a transaction, automatically assigned IDs for
+.. warning::
+
+    Inside a transaction, automatically assigned IDs for
     entities will not be available at save time! That means, if you
-    try::
+    try:
+
+    .. code-block:: python

        >>> with client.transaction():
        ...     entity = datastore.Entity(key=client.key('Thing'))

@@ -61,7 +70,9 @@ class Transaction(Batch):
 committed.

 Once you exit the transaction (or call :meth:`commit`), the
-automatically generated ID will be assigned to the entity::
+automatically generated ID will be assigned to the entity:
+
+.. code-block:: python

     >>> with client.transaction():
     ...     entity = datastore.Entity(key=client.key('Thing'))

@@ -73,7 +84,9 @@ class Transaction(Batch):
     False

 If you don't want to use the context manager you can initialize a
-transaction manually::
+transaction manually:
+
+.. code-block:: python

     >>> transaction = client.transaction()
     >>> transaction.begin()
docs/bigquery-usage.rst (32 changes: 16 additions & 16 deletions)

@@ -20,7 +20,7 @@ Authentication / Configuration
 :envvar:`GOOGLE_CLOUD_PROJECT` environment variables, create an instance of
 :class:`Client <google.cloud.bigquery.client.Client>`.

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()

@@ -39,7 +39,7 @@ To override the project inferred from the environment, pass an explicit
 ``project`` to the constructor, or to either of the alternative
 ``classmethod`` factories:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client(project='PROJECT_ID')

@@ -101,7 +101,7 @@ Patch metadata for a dataset:

 Replace the ACL for a dataset, and update all writeable fields:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()

@@ -231,7 +231,7 @@ Querying data (asynchronous)

 Background a query, loading the results into a table:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()

@@ -262,7 +262,7 @@ Background a query, loading the results into a table:

 Then, begin executing the job on the server:

-.. doctest::
+.. code-block:: python

    >>> job.begin() # API call
    >>> job.created

@@ -272,7 +272,7 @@ Then, begin executing the job on the server:
 Poll until the job is complete:

-.. doctest::
+.. code-block:: python

    >>> import time
    >>> retry_count = 100

@@ -287,7 +287,7 @@ Poll until the job is complete:
 Retrieve the results:

-.. doctest::
+.. code-block:: python

    >>> results = job.results()
    >>> rows, total_count, token = query.fetch_data() # API requet

@@ -306,7 +306,7 @@ Start a job loading data asynchronously from a set of CSV files, located on
 Google Cloud Storage, appending rows into an existing table. First, create
 the job locally:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> from google.cloud.bigquery import SchemaField

@@ -337,7 +337,7 @@ the job locally:
 Then, begin executing the job on the server:

-.. doctest::
+.. code-block:: python

    >>> job.begin() # API call
    >>> job.created

@@ -347,7 +347,7 @@ Then, begin executing the job on the server:
 Poll until the job is complete:

-.. doctest::
+.. code-block:: python

    >>> import time
    >>> retry_count = 100

@@ -367,7 +367,7 @@ Exporting data (async)
 Start a job exporting a table's data asynchronously to a set of CSV files,
 located on Google Cloud Storage. First, create the job locally:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()

@@ -395,7 +395,7 @@ located on Google Cloud Storage. First, create the job locally:
 Then, begin executing the job on the server:

-.. doctest::
+.. code-block:: python

    >>> job.begin() # API call
    >>> job.created

@@ -405,7 +405,7 @@ Then, begin executing the job on the server:
 Poll until the job is complete:

-.. doctest::
+.. code-block:: python

    >>> import time
    >>> retry_count = 100

@@ -424,7 +424,7 @@ Copy tables (async)
 First, create the job locally:

-.. doctest::
+.. code-block:: python

    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()

@@ -449,7 +449,7 @@ First, create the job locally:
 Then, begin executing the job on the server:

-.. doctest::
+.. code-block:: python

    >>> job.begin() # API call
    >>> job.created

@@ -459,7 +459,7 @@ Then, begin executing the job on the server:
 Poll until the job is complete:

-.. doctest::
+.. code-block:: python

    >>> import time
    >>> retry_count = 100
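A side note on the polling examples: several of these hunks cut off right after ``>>> retry_count = 100``, because the collapsed diff context hides the rest of the block. The docs of this era continue with a retry loop along the lines of the sketch below; this is an illustration of the pattern, not the verbatim hidden lines, and the exact state string and sleep interval are assumptions:

.. code-block:: python

   >>> import time
   >>> retry_count = 100
   >>> while retry_count > 0 and job.state != 'DONE':
   ...     retry_count -= 1
   ...     time.sleep(10)
   ...     job.reload()  # API call: refresh the job's server-side state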