Merged
65 commits
9cd7554
deps!: BigQuery Storage and pyarrow are required dependencies (#776)
plamut Jul 27, 2021
9319eb1
chore: merge recent changes from master (#823)
plamut Jul 28, 2021
e26d879
chore: sync v3 with master (#851)
plamut Aug 5, 2021
66014c3
chore: merge changes from master (#872)
tswast Aug 12, 2021
dcd78c7
fix!: use nullable `Int64` and `boolean` dtypes in `to_dataframe` (#786)
tswast Aug 16, 2021
60e73fe
chore: sync v3 with master branch (#880)
tswast Aug 16, 2021
2689df4
feat: Destination tables are no-longer removed by create_job (#891)
Aug 23, 2021
eed311e
chore: Simplify create_job slightly (#893)
Aug 23, 2021
2cb1c21
chore: sync v3 branch with main (#947)
plamut Sep 9, 2021
a7842b6
chore!: remove google.cloud.bigquery_v2 code (#855)
plamut Sep 27, 2021
b0cbfef
chore: sync v3 branch with main (#996)
plamut Sep 30, 2021
71dde11
feat: add a static copy of legacy proto-based types (#1000)
plamut Oct 6, 2021
deec8e7
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 6, 2021
beaadc8
🦉 Updates from OwlBot
gcf-owl-bot[bot] Oct 6, 2021
750c808
chore: remove unnecessary replacement from owlbot
tswast Oct 6, 2021
15c4055
Merge remote-tracking branch 'upstream/sync-v3' into sync-v3
tswast Oct 6, 2021
6bfbb7d
🦉 Updates from OwlBot
gcf-owl-bot[bot] Oct 6, 2021
72255a6
Apply suggestions from code review
tswast Oct 6, 2021
8a3b1ad
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 7, 2021
0d0aedb
Merge remote-tracking branch 'upstream/sync-v3' into sync-v3
tswast Oct 7, 2021
1661262
🦉 Updates from OwlBot
gcf-owl-bot[bot] Oct 7, 2021
2c90edc
chore: remove unused _PYARROW_BAD_VERSIONS
tswast Oct 7, 2021
aa3c7d2
Merge remote-tracking branch 'upstream/sync-v3' into sync-v3
tswast Oct 7, 2021
7852c5c
🦉 Updates from OwlBot
gcf-owl-bot[bot] Oct 7, 2021
ed9b6cf
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 8, 2021
294990a
Merge remote-tracking branch 'upstream/sync-v3' into sync-v3
tswast Oct 8, 2021
d448d0e
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 11, 2021
50753cc
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 14, 2021
c67377a
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Oct 26, 2021
6706678
Merge pull request #1004 from googleapis/sync-v3
plamut Oct 28, 2021
40c92c3
chore: cleanup intersphinx links (#1035)
tswast Nov 1, 2021
23d1187
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Nov 4, 2021
61e3d57
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Nov 4, 2021
7162f98
Merge pull request #1049 from googleapis/sync-v3
tswast Nov 5, 2021
12c2272
Merge branch 'main' into sync-v3-with-main
plamut Nov 9, 2021
859a65d
Fix type hints and discovered bugs
plamut Nov 9, 2021
42d3db6
Merge pull request #1055 from plamut/sync-v3-with-main
tswast Nov 10, 2021
3d1af95
feat!: Use pandas custom data types for BigQuery DATE and TIME column…
Nov 10, 2021
070729f
process: mark the package as type-checked (#1058)
plamut Nov 11, 2021
3cae066
feat: default to DATETIME type when loading timezone-naive datetimes …
plamut Nov 16, 2021
86fd253
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Nov 16, 2021
dad555d
chore: release 3.0.0b1 (pre-release)
tswast Nov 16, 2021
9fd8eb9
Merge pull request #1065 from googleapis/sync-v3
tswast Nov 16, 2021
3b3ebff
feat: add `api_method` parameter to `Client.query` to select `INSERT`…
tswast Dec 2, 2021
7e3721e
fix: improve type annotations for mypy validation (#1081)
plamut Dec 14, 2021
2b76944
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Dec 15, 2021
950b24e
chore: add type annotations for mypy
tswast Dec 15, 2021
e888c71
chore: revert test for when pyarrow is not installed
tswast Dec 15, 2021
011f160
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Dec 16, 2021
08a9bcc
Merge pull request #1088 from googleapis/sync-v3
tswast Dec 16, 2021
aea8d55
chore: sync main into v3 branch
tswast Jan 13, 2022
dd40c24
test: fix pandas tests with new bqstorage client (#1113)
tswast Jan 19, 2022
727a18d
Merge branch 'v3' into sync-v3
tswast Jan 19, 2022
c58ba76
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Jan 19, 2022
b67b255
Merge pull request #1109 from googleapis/sync-v3
tswast Jan 20, 2022
5f50242
feat: use `StandardSqlField` class for `Model.feature_columns` and `M…
tswast Jan 28, 2022
fec1ae6
Merge branch 'upstream/main' into sync-v3
tswast Mar 25, 2022
b4f4847
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Mar 25, 2022
dedb2ea
docs: add type annotations to job samples
tswast Mar 25, 2022
0279fa9
chore: blacken with black 22.3.0
tswast Mar 29, 2022
35d2d70
Merge remote-tracking branch 'upstream/main' into sync-v3
tswast Mar 29, 2022
9d256d5
Merge pull request #1175 from googleapis/sync-v3
tswast Mar 29, 2022
af0ecb0
docs: Add migration guide from version 2.x to 3.x (#1027)
plamut Mar 29, 2022
f69bae7
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] Mar 29, 2022
8bd4d39
Merge branch 'v3' of https://github.com/googleapis/python-bigquery in…
gcf-owl-bot[bot] Mar 29, 2022
2 changes: 1 addition & 1 deletion .github/.OwlBot.lock.yaml
@@ -1,3 +1,3 @@
docker:
image: gcr.io/cloud-devrel-public-resources/owlbot-python:latest
digest: sha256:87eee22d276554e4e52863ec9b1cb6a7245815dfae20439712bf644348215a5a
digest: sha256:4ee57a76a176ede9087c14330c625a71553cf9c72828b2c0ca12f5338171ba60
4 changes: 4 additions & 0 deletions .github/sync-repo-settings.yaml
@@ -1,9 +1,12 @@
# https://github.com/googleapis/repo-automation-bots/tree/main/packages/sync-repo-settings
# Allow merge commits to sync main and v3 with fewer conflicts.
mergeCommitAllowed: true
# Rules for main branch protection
branchProtectionRules:
# Identifies the protection rule pattern. Name of the branch to be protected.
# Defaults to `main`
- pattern: main
requiresLinearHistory: true
requiresCodeOwnerReviews: true
requiresStrictStatusChecks: true
requiredStatusCheckContexts:
@@ -15,6 +18,7 @@ branchProtectionRules:
- 'Samples - Python 3.7'
- 'Samples - Python 3.8'
- pattern: v3
requiresLinearHistory: false
requiresCodeOwnerReviews: true
requiresStrictStatusChecks: true
requiredStatusCheckContexts:
19 changes: 19 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,25 @@
[1]: https://pypi.org/project/google-cloud-bigquery/#history


### [2.28.1](https://www.github.com/googleapis/python-bigquery/compare/v2.28.0...v2.28.1) (2021-10-07)


### Bug Fixes

* support ARRAY data type when loading from DataFrame with Parquet ([#980](https://www.github.com/googleapis/python-bigquery/issues/980)) ([1e59083](https://www.github.com/googleapis/python-bigquery/commit/1e5908302d36e15442013af6f46b1c20af28255e))

## [2.28.0](https://www.github.com/googleapis/python-bigquery/compare/v2.27.1...v2.28.0) (2021-09-30)


### Features

* add `AvroOptions` to configure AVRO external data ([#994](https://www.github.com/googleapis/python-bigquery/issues/994)) ([1a9431d](https://www.github.com/googleapis/python-bigquery/commit/1a9431d9e02eeb99e4712b61c623f9cca80134a6))


### Documentation

* link to stable pandas docs ([#990](https://www.github.com/googleapis/python-bigquery/issues/990)) ([ea50e80](https://www.github.com/googleapis/python-bigquery/commit/ea50e8031fc035b3772a338bc00982de263cefad))

### [2.27.1](https://www.github.com/googleapis/python-bigquery/compare/v2.27.0...v2.27.1) (2021-09-27)


2 changes: 0 additions & 2 deletions docs/conf.py
@@ -364,8 +364,6 @@
"google-auth": ("https://googleapis.dev/python/google-auth/latest/", None),
"google.api_core": ("https://googleapis.dev/python/google-api-core/latest/", None,),
"grpc": ("https://grpc.github.io/grpc/python/", None),
"pandas": ("http://pandas.pydata.org/pandas-docs/stable/", None),
"geopandas": ("https://geopandas.org/", None),
"proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None),
"protobuf": ("https://googleapis.dev/python/protobuf/latest/", None),
"pandas": ("http://pandas.pydata.org/pandas-docs/stable/", None),
7 changes: 1 addition & 6 deletions docs/reference.rst
Expand Up @@ -67,7 +67,6 @@ Job-Related Types
job.SourceFormat
job.WriteDisposition
job.SchemaUpdateOption
job.TransactionInfo

.. toctree::
:maxdepth: 2
@@ -141,11 +140,7 @@ Query
.. toctree::
:maxdepth: 2

query.ArrayQueryParameter
query.ScalarQueryParameter
query.ScalarQueryParameterType
query.StructQueryParameter
query.UDFResource
query


Retries
68 changes: 0 additions & 68 deletions google/cloud/bigquery/_helpers.py
@@ -96,78 +96,10 @@ def installed_version(self) -> packaging.version.Version:

return self._installed_version


class PyarrowVersions:
"""Version comparisons for pyarrow package."""

# https://github.com/googleapis/python-bigquery/issues/781#issuecomment-883497414
_PYARROW_BAD_VERSIONS = frozenset([packaging.version.Version("2.0.0")])

def __init__(self):
self._installed_version = None

@property
def installed_version(self) -> packaging.version.Version:
"""Return the parsed version of pyarrow."""
if self._installed_version is None:
import pyarrow

self._installed_version = packaging.version.parse(
# Use 0.0.0, since it is earlier than any released version.
# Legacy versions also have the same property, but
# creating a LegacyVersion has been deprecated.
# https://github.com/pypa/packaging/issues/321
getattr(pyarrow, "__version__", "0.0.0")
)

return self._installed_version

@property
def is_bad_version(self) -> bool:
return self.installed_version in self._PYARROW_BAD_VERSIONS

@property
def use_compliant_nested_type(self) -> bool:
return self.installed_version.major >= 4

def try_import(self, raise_if_error: bool = False) -> Any:
"""Verify that a recent enough version of pyarrow extra is
installed.

The function assumes that pyarrow extra is installed, and should thus
be used in places where this assumption holds.

Because `pip` can install an outdated version of this extra despite the
constraints in `setup.py`, the calling code can use this helper to
verify the version compatibility at runtime.

Returns:
The ``pyarrow`` module or ``None``.

Raises:
LegacyPyarrowError:
If the pyarrow package is outdated and ``raise_if_error`` is ``True``.
"""
try:
import pyarrow
except ImportError as exc: # pragma: NO COVER
if raise_if_error:
raise LegacyPyarrowError(
f"pyarrow package not found. Install pyarrow version >= {_MIN_PYARROW_VERSION}."
) from exc
return None

if self.installed_version < _MIN_PYARROW_VERSION:
if raise_if_error:
msg = (
"Dependency pyarrow is outdated, please upgrade "
f"it to version >= {_MIN_PYARROW_VERSION} (version found: {self.installed_version})."
)
raise LegacyPyarrowError(msg)
return None

return pyarrow


BQ_STORAGE_VERSIONS = BQStorageVersions()
PYARROW_VERSIONS = PyarrowVersions()
10 changes: 9 additions & 1 deletion google/cloud/bigquery/_pandas_helpers.py
@@ -71,7 +71,7 @@ def _to_wkb(v):
from google.cloud.bigquery import schema


pyarrow = _helpers.PYARROW_VERSIONS.try_import()
_LOGGER = logging.getLogger(__name__)

_PROGRESS_INTERVAL = 0.2 # Maximum time between download status checks, in seconds.

@@ -593,6 +593,14 @@ def dataframe_to_parquet(

This argument is ignored for ``pyarrow`` versions earlier than ``4.0.0``.
"""
import pyarrow.parquet

kwargs = (
{"use_compliant_nested_type": parquet_use_compliant_nested_type}
if _helpers.PYARROW_VERSIONS.use_compliant_nested_type
else {}
)

bq_schema = schema._to_schema_fields(bq_schema)
arrow_table = dataframe_to_arrow(dataframe, bq_schema)
pyarrow.parquet.write_table(
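Note on the `dataframe_to_parquet` hunk above: `pyarrow.parquet` is now imported directly (pyarrow is a required dependency in v3), and `use_compliant_nested_type` is only forwarded when the installed pyarrow supports it. A minimal sketch of that version-gated keyword pattern, using a hypothetical `write_parquet` helper rather than the library's own code:

```python
# Minimal sketch of the version-gated keyword pattern, not the library's API.
import packaging.version
import pyarrow
import pyarrow.parquet


def write_parquet(arrow_table, filepath, use_compliant_nested_type=True):
    # Only pyarrow >= 4.0.0 accepts use_compliant_nested_type, so build the
    # keyword dict conditionally and splat it into write_table.
    installed = packaging.version.parse(pyarrow.__version__)
    kwargs = (
        {"use_compliant_nested_type": use_compliant_nested_type}
        if installed >= packaging.version.Version("4.0.0")
        else {}
    )
    pyarrow.parquet.write_table(arrow_table, filepath, **kwargs)
```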
2 changes: 0 additions & 2 deletions google/cloud/bigquery/client.py
@@ -92,8 +92,6 @@
from google.cloud.bigquery.format_options import ParquetOptions
from google.cloud.bigquery import _helpers

pyarrow = _helpers.PYARROW_VERSIONS.try_import()


_DEFAULT_CHUNKSIZE = 100 * 1024 * 1024 # 100 MB
_MAX_MULTIPART_SIZE = 5 * 1024 * 1024
2 changes: 1 addition & 1 deletion google/cloud/bigquery/job/base.py
@@ -19,7 +19,7 @@
import http
import threading
import typing
from typing import Dict, Optional
from typing import Dict, Optional, Sequence

from google.api_core import exceptions
import google.api_core.future.polling
8 changes: 2 additions & 6 deletions google/cloud/bigquery/table.py
@@ -180,10 +180,8 @@ class TableReference(_TableBase):
https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#tablereference

Args:
dataset_ref:
A pointer to the dataset
table_id:
The ID of the table
dataset_ref: A pointer to the dataset
table_id: The ID of the table
"""

_PROPERTY_TO_API_FIELD = {
@@ -1690,8 +1688,6 @@ def to_arrow(
"""
self._maybe_warn_max_results(bqstorage_client)

self._maybe_warn_max_results(bqstorage_client)

if not self._validate_bqstorage(bqstorage_client, create_bqstorage_client):
create_bqstorage_client = False
bqstorage_client = None
2 changes: 1 addition & 1 deletion google/cloud/bigquery/version.py
@@ -12,4 +12,4 @@
# See the License for the specific language governing permissions and
# limitations under the License.

__version__ = "2.27.1"
__version__ = "2.28.1"
1 change: 1 addition & 0 deletions google/cloud/bigquery_v2/types/model.py
@@ -951,6 +951,7 @@ class TrainingRun(proto.Message):

class TrainingOptions(proto.Message):
r"""Options used in model training.

Attributes:
max_iterations (int):
The maximum number of iterations in training.
6 changes: 0 additions & 6 deletions owlbot.py
@@ -32,8 +32,6 @@
intersphinx_dependencies={
"pandas": "http://pandas.pydata.org/pandas-docs/stable/",
"geopandas": "https://geopandas.org/",
"proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None),
"protobuf": ("https://googleapis.dev/python/protobuf/latest/", None),
},
)

@@ -56,10 +54,6 @@
],
)

# Remove unneeded intersphinx links, the library does not use any proto-generated code.
s.replace("docs/conf.py", r'\s+"(proto-plus|protobuf)":.*$', "")


# ----------------------------------------------------------------------------
# Samples templates
# ----------------------------------------------------------------------------
6 changes: 5 additions & 1 deletion samples/geography/noxfile.py
@@ -87,7 +87,7 @@ def get_pytest_env_vars() -> Dict[str, str]:

# DO NOT EDIT - automatically generated.
# All versions used to test samples.
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9", "3.10"]

# Any default versions that should be ignored.
IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]
@@ -98,6 +98,10 @@ def get_pytest_env_vars() -> Dict[str, str]:
"True",
"true",
)

# Error if a python version is missing
nox.options.error_on_missing_interpreters = True

#
# Style Checks
#
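For context on the noxfile hunk above (the same change appears again under `samples/snippets/noxfile.py`): Python 3.10 is added to the tested versions and nox is told to fail, rather than silently skip, when an interpreter is missing. A minimal, hypothetical sketch of a noxfile using those options, not the generated sample noxfile itself:

```python
# noxfile.py - minimal sketch under the assumptions above.
import nox

ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9", "3.10"]

# Fail the run if any requested interpreter is absent instead of skipping it.
nox.options.error_on_missing_interpreters = True


@nox.session(python=ALL_VERSIONS)
def tests(session):
    session.install("pytest")
    session.run("pytest", "--quiet")
```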
8 changes: 5 additions & 3 deletions samples/geography/requirements.txt
@@ -1,5 +1,5 @@
attrs==21.2.0
cachetools==4.2.2
cachetools==4.2.4
certifi==2021.5.30
cffi==1.14.6
charset-normalizer==2.0.6
@@ -11,7 +11,7 @@ Fiona==1.8.20
geojson==2.5.0
geopandas==0.9.0
google-api-core==2.0.1
google-auth==2.2.0
google-auth==2.2.1
google-cloud-bigquery==2.27.1
google-cloud-bigquery-storage==2.9.0
google-cloud-core==2.0.0
@@ -29,6 +29,8 @@ numpy==1.21.2; python_version > "3.6"
packaging==21.0
pandas==1.1.5; python_version < '3.7'
pandas==1.3.2; python_version >= '3.7'
proto-plus==1.19.2
protobuf==3.18.0
pyarrow==5.0.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
@@ -46,4 +48,4 @@ six==1.16.0
typing-extensions==3.10.0.2
typing-inspect==0.7.1
urllib3==1.26.7
zipp==3.5.0
zipp==3.6.0
6 changes: 5 additions & 1 deletion samples/snippets/noxfile.py
@@ -87,7 +87,7 @@ def get_pytest_env_vars() -> Dict[str, str]:

# DO NOT EDIT - automatically generated.
# All versions used to test samples.
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9"]
ALL_VERSIONS = ["3.6", "3.7", "3.8", "3.9", "3.10"]

# Any default versions that should be ignored.
IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"]
@@ -98,6 +98,10 @@ def get_pytest_env_vars() -> Dict[str, str]:
"True",
"true",
)

# Error if a python version is missing
nox.options.error_on_missing_interpreters = True

#
# Style Checks
#
1 change: 1 addition & 0 deletions tests/system/test_pandas.py
@@ -24,6 +24,7 @@
import google.api_core.retry
import pkg_resources
import pytest
import numpy

from google.cloud import bigquery
from google.cloud import bigquery_storage
1 change: 1 addition & 0 deletions tests/unit/job/test_query.py
@@ -26,6 +26,7 @@

from google.cloud.bigquery.client import _LIST_ROWS_FROM_QUERY_RESULTS_FIELDS
import google.cloud.bigquery.query
from google.cloud.bigquery.table import _EmptyRowIterator

from ..helpers import make_connection

3 changes: 0 additions & 3 deletions tests/unit/job/test_query_pandas.py
@@ -47,9 +47,6 @@
pandas = pytest.importorskip("pandas")


pyarrow = _helpers.PYARROW_VERSIONS.try_import()


@pytest.fixture
def table_read_options_kwarg():
# Create a BigQuery Storage table read options object with pyarrow compression