
Pin pandas-gbq to latest version 0.4.1 #85

Closed
pyup-bot wants to merge 2 commits from the pyup-pin-pandas-gbq-0.4.1 branch

Conversation


@pyup-bot (Collaborator) commented Apr 6, 2018

This PR pins pandas-gbq to the latest release 0.4.1.
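
For reference, a pin like this typically lands as a single exact-version line in the project's dependency file (the file name, e.g. `requirements.txt`, is an assumption here and depends on how this repository declares its dependencies):

```
pandas-gbq==0.4.1
```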

Changelog

0.4.0

- Fix bug in `read_gbq` when building a dataframe with integer columns on Windows. Explicitly use 64-bit integers when converting from BQ types. (119)
- Fix bug in `read_gbq` when querying for an array of floats (123)
- Fix bug in `read_gbq` with the configuration argument. Updates `read_gbq` to account for a breaking change in how google-cloud-python 0.32.0+ represents query configuration in the API. (152)
- Fix bug in `to_gbq` where seconds were discarded in timestamp columns. (148)
- Fix bug in `to_gbq` when supplying a user-defined schema (150)
- **Deprecate** the `verbose` parameter in `read_gbq` and `to_gbq`. Messages use the logging module instead of printing progress directly to standard output; a sketch of enabling that output follows this list. (12)
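
Because progress messages now go through the `logging` module, code that previously relied on `verbose=True` can opt back into that output roughly as follows (a minimal sketch; the logger name `pandas_gbq` and the project id are illustrative assumptions, not taken from this PR):

```python
import logging

import pandas_gbq

# pandas-gbq 0.4.0+ emits progress via the logging module instead of printing,
# so attach a handler and lower the level to see the messages again.
logging.basicConfig(level=logging.INFO)
logging.getLogger("pandas_gbq").setLevel(logging.DEBUG)  # assumed logger name

# Ordinary query; progress now appears through the configured logger.
df = pandas_gbq.read_gbq("SELECT 1 AS x", project_id="my-project")
```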

0.3.1

[PyPI release](https://pypi.org/project/pandas-gbq/0.3.1/), [Conda Forge release](https://anaconda.org/conda-forge/pandas-gbq/files?version=0.3.1)

- Fix an issue where Unicode couldn't be uploaded in Python 2 ([issue 106](https://github.com/pydata/pandas-gbq/issues/106))
- Add support for a passed schema in `to_gbq` instead of inferring the schema from the passed `DataFrame` with `DataFrame.dtypes` ([issue 46](https://github.com/pydata/pandas-gbq/issues/46)); see the sketch after this list
- Fix an issue where a dataframe containing both integer and floating point columns could not be uploaded with `to_gbq` ([issue 116](https://github.com/pydata/pandas-gbq/issues/116))
- `to_gbq` now uses `to_csv` to avoid manually looping over rows in a dataframe (should result in faster table uploads) ([issue 96](https://github.com/pydata/pandas-gbq/issues/96))
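
A rough sketch of the schema-passing feature from issue 46; the keyword name `table_schema` and the destination table / project id below are assumptions for illustration:

```python
import pandas as pd
import pandas_gbq

df = pd.DataFrame({"name": ["a", "b"], "value": [1.0, 2.5]})

# Supply an explicit BigQuery schema rather than letting pandas-gbq infer one
# from DataFrame.dtypes. Keyword name and identifiers are illustrative.
pandas_gbq.to_gbq(
    df,
    destination_table="my_dataset.my_table",
    project_id="my-project",
    table_schema=[
        {"name": "name", "type": "STRING"},
        {"name": "value", "type": "FLOAT"},
    ],
)
```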

0.3.0

[PyPI release](https://pypi.org/project/pandas-gbq/0.3.0/), [Conda Forge release](https://anaconda.org/conda-forge/pandas-gbq/files?version=0.3.0)

- Use the [`google-cloud-bigquery`](https://googlecloudplatform.github.io/google-cloud-python/latest/bigquery/usage.html) library for API calls. The `google-cloud-bigquery` package is a new dependency, and the dependencies on `google-api-python-client` and `httplib2` are removed. See the [installation guide](https://pandas-gbq.readthedocs.io/en/latest/install.html#dependencies) for more details. (93)
- Structs and arrays are now named properly (23), and BigQuery functions like `array_agg` no longer run into errors during type conversion (22); a query sketch follows this list.
- `to_gbq` now uses a load job instead of the streaming API. The `StreamingInsertError` class is removed, as it is no longer used by `to_gbq`. (7, 75)
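
To illustrate the array handling above, a query that aggregates into an ARRAY column should round-trip through `read_gbq` without type-conversion errors (a sketch; the public dataset, project id, and exact DataFrame representation of repeated fields are illustrative assumptions):

```python
import pandas_gbq

# ARRAY_AGG produces a repeated field; pandas-gbq 0.3.0+ converts it without
# raising during type conversion. Identifiers below are placeholders.
query = """
    SELECT word, ARRAY_AGG(corpus) AS corpora
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    LIMIT 5
"""
df = pandas_gbq.read_gbq(query, project_id="my-project", dialect="standard")
print(df.head())
```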

@tnir (Owner) left a comment

looks too old...

@tnir tnir self-assigned this Mar 31, 2019
@tnir tnir closed this Mar 31, 2019
@tnir tnir deleted the pyup-pin-pandas-gbq-0.4.1 branch March 31, 2019 23:27