Releases · googleapis/python-bigquery-pandas
Version 0.5.0
- Project ID parameter is optional in `read_gbq` and `to_gbq` when it can be inferred from the environment. Note: you must still pass in a project ID when using user-based authentication. (#103)
- Add a progress bar for `to_gbq`, via the optional `tqdm` dependency. (#162)
- Add a `location` parameter to `read_gbq` and `to_gbq` so that pandas-gbq can work with datasets in the Tokyo region. (#177) (See the sketch after this list.)
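A minimal sketch of how these 0.5.0 additions fit together, assuming application-default credentials are available so the project ID can be inferred, `tqdm` is installed for the upload progress bar, and the dataset and table names (`my_dataset.usa_names`, `my_dataset.name_totals`) are placeholders:

```python
import pandas_gbq

# Project ID omitted: inferred from the environment (#103).
# `location` targets a dataset in the Tokyo region (#177).
df = pandas_gbq.read_gbq(
    "SELECT name, SUM(number) AS total "
    "FROM `my_dataset.usa_names` GROUP BY name",
    dialect="standard",
    location="asia-northeast1",
)

# With tqdm installed, the upload shows a progress bar (#162).
pandas_gbq.to_gbq(
    df,
    "my_dataset.name_totals",
    location="asia-northeast1",
    if_exists="replace",
)
```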
Version 0.4.1
- Only show the `verbose` deprecation warning if the Pandas version does not populate it. (#157)
Version 0.4.0
PyPI release, Conda Forge release
- Fix bug in `read_gbq` when building a dataframe with integer columns on Windows. Explicitly use 64-bit integers when converting from BigQuery types. (#119)
- Fix bug in `read_gbq` when querying for an array of floats. (#123)
- Fix bug in `read_gbq` with the `configuration` argument. Updates `read_gbq` to account for a breaking change in the way google-cloud-python version 0.32.0+ handles the query configuration API representation. (#152)
- Fix bug in `to_gbq` where seconds were discarded in timestamp columns. (#148)
- Fix bug in `to_gbq` when supplying a user-defined schema. (#150)
- Deprecate the `verbose` parameter in `read_gbq` and `to_gbq`. Messages use the logging module instead of printing progress directly to standard output. (#12) (See the logging sketch after this list.)
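With `verbose` deprecated in favor of the logging module, progress messages can be surfaced by configuring a logger. A minimal sketch, assuming the messages are emitted on a logger named `pandas_gbq` (that logger name is an assumption, not stated in these notes):

```python
import logging
import sys

# Route pandas-gbq progress messages to stdout now that `verbose`
# is deprecated (#12). The "pandas_gbq" logger name is assumed.
logger = logging.getLogger("pandas_gbq")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(stream=sys.stdout))
```

Subsequent `read_gbq` and `to_gbq` calls then emit progress through this handler instead of printing it directly.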
Version 0.3.1
PyPI release, Conda Forge release
- Fix an issue where Unicode couldn't be uploaded in Python 2 (issue 106)
- Add support for a passed schema in `to_gbq` instead of inferring the schema from the passed `DataFrame` with `DataFrame.dtypes` (issue 46; see the sketch after this list).
- Fix an issue where a dataframe containing both integer and floating point columns could not be uploaded with `to_gbq`. (issue 116)
- `to_gbq` now uses `to_csv` to avoid manually looping over rows in a dataframe (should result in faster table uploads). (issue 96)
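A minimal sketch of supplying an explicit schema rather than relying on `DataFrame.dtypes` inference, assuming the schema is passed through the `table_schema` keyword as a list of name/type mappings; the destination table and project ID are placeholders:

```python
import pandas as pd
import pandas_gbq

df = pd.DataFrame({"name": ["alice", "bob"], "score": [9.5, 7.0]})

pandas_gbq.to_gbq(
    df,
    "my_dataset.scores",          # placeholder destination table
    project_id="my-project",      # placeholder project ID
    if_exists="replace",
    # Explicit schema instead of DataFrame.dtypes inference (issue 46).
    table_schema=[
        {"name": "name", "type": "STRING"},
        {"name": "score", "type": "FLOAT"},
    ],
)
```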
Version 0.3.0
PyPI release, Conda Forge release
- Use the `google-cloud-bigquery` library for API calls. The `google-cloud-bigquery` package is a new dependency, and the dependencies on `google-api-python-client` and `httplib2` are removed. See the installation guide for more details. (#93)
- Structs and arrays are now named properly (#23), and BigQuery functions like `array_agg` no longer run into errors during type conversion (#22; see the sketch after this list).
- `to_gbq` now uses a load job instead of the streaming API. Remove the `StreamingInsertError` class, as it is no longer used by `to_gbq`. (#7, #75)
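A minimal sketch of a query that exercises the struct and array handling mentioned above; the project ID is a placeholder and the query is only illustrative:

```python
import pandas_gbq

# ARRAY_AGG over STRUCT values: with 0.3.0, nested fields keep their
# names and the aggregation no longer trips up type conversion (#22, #23).
df = pandas_gbq.read_gbq(
    """
    SELECT
      ARRAY_AGG(STRUCT(x AS value, CAST(x AS STRING) AS label)) AS items
    FROM UNNEST([1, 2, 3]) AS x
    """,
    project_id="my-project",  # placeholder project ID
    dialect="standard",
)
print(df["items"].iloc[0])  # a list of {'value': ..., 'label': ...} records
```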