From 89c126ceca327fcf9f344dace691522e7351dde7 Mon Sep 17 00:00:00 2001
From: Anthonios Partheniou
Date: Wed, 16 Jun 2021 12:44:04 -0400
Subject: [PATCH 01/12] fix(deps): add packaging requirement (#368)

---
 setup.py                    | 1 +
 testing/constraints-3.6.txt | 1 +
 2 files changed, 2 insertions(+)

diff --git a/setup.py b/setup.py
index deb1bd5963..d8becf5f2c 100644
--- a/setup.py
+++ b/setup.py
@@ -34,6 +34,7 @@
     "grpc-google-iam-v1 >= 0.12.3, < 0.13dev",
     "proto-plus >= 1.11.0",
     "sqlparse >= 0.3.0",
+    "packaging >= 14.3",
 ]
 extras = {
     "tracing": [
diff --git a/testing/constraints-3.6.txt b/testing/constraints-3.6.txt
index f3d4031bf4..b3a4b8b6cc 100644
--- a/testing/constraints-3.6.txt
+++ b/testing/constraints-3.6.txt
@@ -14,3 +14,4 @@ sqlparse==0.3.0
 opentelemetry-api==1.1.0
 opentelemetry-sdk==1.1.0
 opentelemetry-instrumentation==0.20b0
+packaging==14.3

From 11853a511bd4858eb92192e4d39d7c2f5786e143 Mon Sep 17 00:00:00 2001
From: WhiteSource Renovate
Date: Wed, 16 Jun 2021 18:44:05 +0200
Subject: [PATCH 02/12] chore(deps): update dependency google-cloud-spanner to
 v3.5.0 (#367)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

[![WhiteSource Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com)

This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [google-cloud-spanner](https://togithub.com/googleapis/python-spanner) | `==3.4.0` -> `==3.5.0` | [![age](https://badges.renovateapi.com/packages/pypi/google-cloud-spanner/3.5.0/age-slim)](https://docs.renovatebot.com/merge-confidence/) | [![adoption](https://badges.renovateapi.com/packages/pypi/google-cloud-spanner/3.5.0/adoption-slim)](https://docs.renovatebot.com/merge-confidence/) | [![passing](https://badges.renovateapi.com/packages/pypi/google-cloud-spanner/3.5.0/compatibility-slim/3.4.0)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://badges.renovateapi.com/packages/pypi/google-cloud-spanner/3.5.0/confidence-slim/3.4.0)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

googleapis/python-spanner

### [`v3.5.0`](https://togithub.com/googleapis/python-spanner/blob/master/CHANGELOG.md#350-httpswwwgithubcomgoogleapispython-spannercomparev340v350-2021-06-11)

[Compare Source](https://togithub.com/googleapis/python-spanner/compare/v3.4.0...v3.5.0)

##### Features

- add decimal validation for numeric precision and scale supported by Spanner ([#340](https://www.github.com/googleapis/python-spanner/issues/340)) ([aa36c5e](https://www.github.com/googleapis/python-spanner/commit/aa36c5ecf5b0decc6c5c3316cc5bc6b6981d9bf9))
- add progress field to UpdateDatabaseDdlMetadata ([#361](https://www.github.com/googleapis/python-spanner/issues/361)) ([1c03dcc](https://www.github.com/googleapis/python-spanner/commit/1c03dcc182fc96a2ca85b23da99cbcaebfb3fe09))
- add query statistics package support ([#129](https://www.github.com/googleapis/python-spanner/issues/129)) ([6598dea](https://www.github.com/googleapis/python-spanner/commit/6598deade66c8887514a1a6571fffb1bd7b16fd0))

##### Bug Fixes

- an Aborted exception isn't properly retried ([#345](https://www.github.com/googleapis/python-spanner/issues/345)) ([e69e6ab](https://www.github.com/googleapis/python-spanner/commit/e69e6ab5cffd02bc9af6c08dbe9b5f229847d86d))
- correctly classify select statements that begin with brackets ([#351](https://www.github.com/googleapis/python-spanner/issues/351)) ([d526acc](https://www.github.com/googleapis/python-spanner/commit/d526acca4795ebf34867ab4a256413a728fccd93))
- update to support the open-telemetry status code spec change ([#358](https://www.github.com/googleapis/python-spanner/issues/358)) ([0f894f1](https://www.github.com/googleapis/python-spanner/commit/0f894f12622cfa6e38b838eb91e49f256d8d857d))
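The first feature listed above — decimal validation against the NUMERIC precision and scale Spanner supports — can be illustrated with a small standalone check. This is a hedged sketch, not the library's actual helper from #340: the name `check_numeric` is invented here, and it assumes a finite `Decimal`; the limits (38 significant digits, at most 9 after the decimal point) are Spanner's documented NUMERIC bounds.

```python
from decimal import Decimal

# Spanner NUMERIC bounds: precision 38, scale 9,
# i.e. at most 29 digits before the decimal point and 9 after.
MAX_PRECISION = 38
MAX_SCALE = 9


def check_numeric(value):
    """Raise ValueError if `value` cannot fit in a Spanner NUMERIC column.

    Hypothetical helper for illustration only; assumes `value` is a
    finite Decimal (no NaN/Infinity handling).
    """
    _, digits, exponent = value.as_tuple()
    # Digits after the decimal point.
    scale = -exponent if exponent < 0 else 0
    # Digits before the decimal point, counting trailing zeros
    # implied by a positive exponent (e.g. Decimal("1E+3") == 1000).
    whole_digits = len(digits) - scale + (exponent if exponent > 0 else 0)
    if scale > MAX_SCALE:
        raise ValueError("max scale for a NUMERIC is %d; got %d" % (MAX_SCALE, scale))
    if whole_digits > MAX_PRECISION - MAX_SCALE:
        raise ValueError(
            "NUMERIC allows at most %d digits before the decimal point; got %d"
            % (MAX_PRECISION - MAX_SCALE, whole_digits)
        )
```

Validating up front like this turns a server-side `InvalidArgument` into a local error before any RPC is made.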
---

### Configuration

📅 **Schedule**: At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update again.

---

- [ ] If you want to rebase/retry this PR, check this box.

---

This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#github/googleapis/python-spanner).
---
 samples/samples/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/samples/samples/requirements.txt b/samples/samples/requirements.txt
index 542b2aaf54..305cd0b7e5 100644
--- a/samples/samples/requirements.txt
+++ b/samples/samples/requirements.txt
@@ -1,2 +1,2 @@
-google-cloud-spanner==3.4.0
+google-cloud-spanner==3.5.0
 futures==3.3.0; python_version < "3"

From 113505c58dc52509973f4199330a8983e3c5d848 Mon Sep 17 00:00:00 2001
From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com>
Date: Thu, 17 Jun 2021 11:40:48 +1000
Subject: [PATCH 03/12] feat(spanner): add processing_units to Instance
 resource (#364)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* feat(spanner): add processing_units to Instance resource

PiperOrigin-RevId: 378758342

Source-Link: https://github.com/googleapis/googleapis/commit/d8698715e4f5b7c45505dadd679255987c260180
Source-Link: https://github.com/googleapis/googleapis-gen/commit/54cfa763144ff2bf631518a6e872055493b583ae

* 🦉 Updates from OwlBot

Co-authored-by: Owl Bot
---
 .../types/spanner_instance_admin.py                        | 6 ++++++
 .../gapic/spanner_admin_instance_v1/test_instance_admin.py | 4 ++++
 2 files changed, 10 insertions(+)

diff --git a/google/cloud/spanner_admin_instance_v1/types/spanner_instance_admin.py b/google/cloud/spanner_admin_instance_v1/types/spanner_instance_admin.py
index db885f8469..d8cef6ea2b 100644
--- a/google/cloud/spanner_admin_instance_v1/types/spanner_instance_admin.py
+++ b/google/cloud/spanner_admin_instance_v1/types/spanner_instance_admin.py
@@ -128,6 +128,11 @@ class Instance(proto.Message):
             See `the documentation `__
             for more information about nodes.
+        processing_units (int):
+            The number of processing units allocated to this instance.
+            At most one of processing_units or node_count should be
+            present in the message. This may be zero in API responses
+            for instances that are not yet in state ``READY``.
         state (google.cloud.spanner_admin_instance_v1.types.Instance.State):
             Output only. The current instance state. For
             [CreateInstance][google.spanner.admin.instance.v1.InstanceAdmin.CreateInstance],
@@ -177,6 +182,7 @@ class State(proto.Enum):
     config = proto.Field(proto.STRING, number=2,)
     display_name = proto.Field(proto.STRING, number=3,)
     node_count = proto.Field(proto.INT32, number=5,)
+    processing_units = proto.Field(proto.INT32, number=9,)
     state = proto.Field(proto.ENUM, number=6, enum=State,)
     labels = proto.MapField(proto.STRING, proto.STRING, number=7,)
     endpoint_uris = proto.RepeatedField(proto.STRING, number=8,)
diff --git a/tests/unit/gapic/spanner_admin_instance_v1/test_instance_admin.py b/tests/unit/gapic/spanner_admin_instance_v1/test_instance_admin.py
index b36c820cf5..038f4b0e9a 100644
--- a/tests/unit/gapic/spanner_admin_instance_v1/test_instance_admin.py
+++ b/tests/unit/gapic/spanner_admin_instance_v1/test_instance_admin.py
@@ -1488,6 +1488,7 @@ def test_get_instance(
         config="config_value",
         display_name="display_name_value",
         node_count=1070,
+        processing_units=1743,
         state=spanner_instance_admin.Instance.State.CREATING,
         endpoint_uris=["endpoint_uris_value"],
     )
@@ -1504,6 +1505,7 @@
     assert response.config == "config_value"
     assert response.display_name == "display_name_value"
     assert response.node_count == 1070
+    assert response.processing_units == 1743
     assert response.state == spanner_instance_admin.Instance.State.CREATING
     assert response.endpoint_uris == ["endpoint_uris_value"]
@@ -1549,6 +1551,7 @@ async def test_get_instance_async(
         config="config_value",
         display_name="display_name_value",
         node_count=1070,
+        processing_units=1743,
         state=spanner_instance_admin.Instance.State.CREATING,
         endpoint_uris=["endpoint_uris_value"],
     )
@@ -1566,6 +1569,7 @@
     assert response.config == "config_value"
     assert response.display_name == "display_name_value"
     assert response.node_count == 1070
+    assert response.processing_units == 1743
     assert response.state == spanner_instance_admin.Instance.State.CREATING
     assert response.endpoint_uris == ["endpoint_uris_value"]

From cb6196fcb1fc8cc16ecbff0ff35616fa54cb09b3 Mon Sep 17 00:00:00 2001
From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com>
Date: Thu, 17 Jun 2021 10:52:20 +0000
Subject: [PATCH 04/12] chore: new owl bot post processor docker image (#371)

Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:58c7342b0bccf85028100adaa3d856cb4a871c22ca9c01960d996e66c40548ce
---
 .github/.OwlBot.lock.yaml |  5 ++---
 docs/conf.py              | 12 ++++++------
 2 files changed, 8 insertions(+), 9 deletions(-)

diff --git a/.github/.OwlBot.lock.yaml b/.github/.OwlBot.lock.yaml
index 43adabe6a5..ea06d395ea 100644
--- a/.github/.OwlBot.lock.yaml
+++ b/.github/.OwlBot.lock.yaml
@@ -1,4 +1,3 @@
 docker:
-  digest: sha256:c66ba3c8d7bc8566f47df841f98cd0097b28fff0b1864c86f5817f4c8c3e8600
-  image: gcr.io/repo-automation-bots/owlbot-python:latest
-
+  image: gcr.io/repo-automation-bots/owlbot-python:latest
+  digest: sha256:58c7342b0bccf85028100adaa3d856cb4a871c22ca9c01960d996e66c40548ce
diff --git a/docs/conf.py b/docs/conf.py
index 9703f9705e..1d4a1c0b91 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -80,9 +80,9 @@
 master_doc = "index"

 # General information about the project.
-project = u"google-cloud-spanner"
-copyright = u"2019, Google"
-author = u"Google APIs"
+project = "google-cloud-spanner"
+copyright = "2019, Google"
+author = "Google APIs"

 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the
@@ -281,7 +281,7 @@
     (
         master_doc,
         "google-cloud-spanner.tex",
-        u"google-cloud-spanner Documentation",
+        "google-cloud-spanner Documentation",
         author,
         "manual",
     )
@@ -316,7 +316,7 @@
    (
         master_doc,
         "google-cloud-spanner",
-        u"google-cloud-spanner Documentation",
+        "google-cloud-spanner Documentation",
         [author],
         1,
     )
@@ -335,7 +335,7 @@
     (
         master_doc,
         "google-cloud-spanner",
-        u"google-cloud-spanner Documentation",
+        "google-cloud-spanner Documentation",
         author,
         "google-cloud-spanner",
         "google-cloud-spanner Library",

From b7b3c383abcca99dcbae6d92b27c49ca6707010a Mon Sep 17 00:00:00 2001
From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com>
Date: Sat, 19 Jun 2021 01:36:18 +0000
Subject: [PATCH 05/12] docs: omit mention of Python 2.7 in 'CONTRIBUTING.rst'
 (#1127) (#374)

Closes #1126

Source-Link: https://github.com/googleapis/synthtool/commit/b91f129527853d5b756146a0b5044481fb4e09a8
Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:b6169fc6a5207b11800a7c002d0c5c2bc6d82697185ca12e666f44031468cfcd
---
 .github/.OwlBot.lock.yaml | 2 +-
 CONTRIBUTING.rst          | 7 ++-----
 2 files changed, 3 insertions(+), 6 deletions(-)

diff --git a/.github/.OwlBot.lock.yaml b/.github/.OwlBot.lock.yaml
index ea06d395ea..cc49c6a3df 100644
--- a/.github/.OwlBot.lock.yaml
+++ b/.github/.OwlBot.lock.yaml
@@ -1,3 +1,3 @@
 docker:
   image: gcr.io/repo-automation-bots/owlbot-python:latest
-  digest: sha256:58c7342b0bccf85028100adaa3d856cb4a871c22ca9c01960d996e66c40548ce
+  digest: sha256:b6169fc6a5207b11800a7c002d0c5c2bc6d82697185ca12e666f44031468cfcd
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 17ee397e34..3df455e996 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -69,7 +69,6 @@ We use `nox `__ to instrument our tests.

 - To test your changes, run unit tests with ``nox``::

-    $ nox -s unit-2.7
     $ nox -s unit-3.8
     $ ...

@@ -144,7 +143,6 @@ Running System Tests
   # Run all system tests
   $ nox -s system-3.8
-  $ nox -s system-2.7

   # Run a single system test
   $ nox -s system-3.8 -- -k
@@ -152,9 +150,8 @@ Running System Tests

 .. note::

-      System tests are only configured to run under Python 2.7 and
-      Python 3.8. For expediency, we do not run them in older versions
-      of Python 3.
+      System tests are only configured to run under Python 3.8.
+      For expediency, we do not run them in older versions of Python 3.

    This alone will not run the tests. You'll need to change some local
    auth settings and change some configuration in your project to

From ed9e124aa74e44778104e45eae1e577978d6b866 Mon Sep 17 00:00:00 2001
From: Ilya Gurov
Date: Tue, 22 Jun 2021 05:22:04 +0300
Subject: [PATCH 06/12] fix(db_api): use sqlparse to split DDL statements
 (#372)

Instead of the simple `str.split(";")` method, use the smarter `sqlparse`
package to split DDL statements executed in the form:

```python
cursor.execute("""
    ddl_statement1;
    ddl_statement2;
    ddl_statement3;
""")
```
---
 google/cloud/spanner_dbapi/cursor.py      |  7 +++++--
 google/cloud/spanner_dbapi/parse_utils.py |  2 +-
 tests/unit/spanner_dbapi/test_cursor.py   | 14 +++++++++++++-
 3 files changed, 19 insertions(+), 4 deletions(-)

diff --git a/google/cloud/spanner_dbapi/cursor.py b/google/cloud/spanner_dbapi/cursor.py
index 3569bab605..689ba8cf66 100644
--- a/google/cloud/spanner_dbapi/cursor.py
+++ b/google/cloud/spanner_dbapi/cursor.py
@@ -14,6 +14,8 @@

 """Database cursor for Google Cloud Spanner DB-API."""

+import sqlparse
+
 from google.api_core.exceptions import Aborted
 from google.api_core.exceptions import AlreadyExists
 from google.api_core.exceptions import FailedPrecondition
@@ -174,9 +176,10 @@ def execute(self, sql, args=None):
         try:
             classification = parse_utils.classify_stmt(sql)
             if classification == parse_utils.STMT_DDL:
-                for ddl in sql.split(";"):
-                    ddl = ddl.strip()
+                for ddl in sqlparse.split(sql):
                     if ddl:
+                        if ddl[-1] == ";":
+                            ddl = ddl[:-1]
                         self.connection._ddl_statements.append(ddl)
                 if self.connection.autocommit:
                     self.connection.run_prior_DDL_statements()
diff --git a/google/cloud/spanner_dbapi/parse_utils.py b/google/cloud/spanner_dbapi/parse_utils.py
index aa0e12d75d..d967330cea 100644
--- a/google/cloud/spanner_dbapi/parse_utils.py
+++ b/google/cloud/spanner_dbapi/parse_utils.py
@@ -199,7 +199,7 @@ def classify_stmt(query):

 def parse_insert(insert_sql, params):
     """
-    Parse an INSERT statement an generate a list of tuples of the form:
+    Parse an INSERT statement and generate a list of tuples of the form:
     [
         (SQL, params_per_row1),
         (SQL, params_per_row2),
diff --git a/tests/unit/spanner_dbapi/test_cursor.py b/tests/unit/spanner_dbapi/test_cursor.py
index 57a3375e49..789ca06695 100644
--- a/tests/unit/spanner_dbapi/test_cursor.py
+++ b/tests/unit/spanner_dbapi/test_cursor.py
@@ -941,6 +941,13 @@ def test_ddls_with_semicolon(self):
         EXP_DDLS = [
             "CREATE TABLE table_name (row_id INT64) PRIMARY KEY ()",
             "DROP INDEX index_name",
+            (
+                "CREATE TABLE papers ("
+                "\n id INT64,"
+                "\n authors ARRAY,"
+                '\n author_list STRING(MAX) AS (ARRAY_TO_STRING(authors, ";")) stored'
+                ") PRIMARY KEY (id)"
+            ),
             "DROP TABLE table_name",
         ]

@@ -956,7 +963,12 @@ def test_ddls_with_semicolon(self):
             cursor.execute(
                 "CREATE TABLE table_name (row_id INT64) PRIMARY KEY ();"
                 "DROP INDEX index_name;\n"
-                "DROP TABLE table_name;"
+                "CREATE TABLE papers ("
+                "\n id INT64,"
+                "\n authors ARRAY,"
+                '\n author_list STRING(MAX) AS (ARRAY_TO_STRING(authors, ";")) stored'
+                ") PRIMARY KEY (id);"
+                "DROP TABLE table_name;",
             )

         self.assertEqual(connection._ddl_statements, EXP_DDLS)

From c1ee8c2685a794f9f89329e16f7c461e135114af Mon Sep 17 00:00:00 2001
From: larkee <31196561+larkee@users.noreply.github.com>
Date: Tue, 22 Jun 2021 16:52:02 +1200
Subject: [PATCH 07/12] feat: update query stats samples (#373)

---
 samples/samples/snippets.py | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/samples/samples/snippets.py b/samples/samples/snippets.py
index 10fc6413c2..18af239b5b 100644
--- a/samples/samples/snippets.py
+++ b/samples/samples/snippets.py
@@ -1703,7 +1703,10 @@ def query_data_with_query_options(instance_id, database_id):
     with database.snapshot() as snapshot:
         results = snapshot.execute_sql(
             "SELECT VenueId, VenueName, LastUpdateTime FROM Venues",
-            query_options={"optimizer_version": "1"},
+            query_options={
+                "optimizer_version": "1",
+                "optimizer_statistics_package": "latest"
+            },
         )

     for row in results:
@@ -1716,7 +1719,11 @@ def create_client_with_query_options(instance_id, database_id):
     # [START spanner_create_client_with_query_options]
     # instance_id = "your-spanner-instance"
     # database_id = "your-spanner-db-id"
-    spanner_client = spanner.Client(query_options={"optimizer_version": "1"})
+    spanner_client = spanner.Client(
+        query_options={
+            "optimizer_version": "1",
+            "optimizer_statistics_package": "auto_20191128_14_47_22UTC"
+        })
     instance = spanner_client.instance(instance_id)
     database = instance.database(database_id)

From 51533b812b68004eafeb402641b974e76bf9a837 Mon Sep 17 00:00:00 2001
From: Zoe
Date: Tue, 22 Jun 2021 15:55:41 +1000
Subject: [PATCH 08/12] feat: add RPC priority support (#324)

* feat: add RPC priority support

* Review changes

* Review changes

* Update google/cloud/spanner_v1/database.py

Co-authored-by: larkee <31196561+larkee@users.noreply.github.com>

* Update google/cloud/spanner_v1/database.py

Co-authored-by: larkee <31196561+larkee@users.noreply.github.com>

* Update session.py

* update import

Co-authored-by: larkee <31196561+larkee@users.noreply.github.com>
---
 google/cloud/spanner_v1/__init__.py    |  2 ++
 google/cloud/spanner_v1/batch.py       | 15 ++++++++-
 google/cloud/spanner_v1/database.py    | 46 ++++++++++++++++++++++----
 google/cloud/spanner_v1/session.py     | 20 +++++++++--
 google/cloud/spanner_v1/snapshot.py    | 25 ++++++++++++++
 google/cloud/spanner_v1/transaction.py | 41 +++++++++++++++++++++--
 tests/system/test_system.py            |  5 +++
 tests/unit/test_batch.py               |  9 +++--
 tests/unit/test_database.py            | 22 ++++++++++--
 tests/unit/test_session.py             |  3 ++
 tests/unit/test_snapshot.py            | 13 ++++++++
 tests/unit/test_transaction.py         | 28 +++++++++++++---
 12 files changed, 209 insertions(+), 20 deletions(-)

diff --git a/google/cloud/spanner_v1/__init__.py b/google/cloud/spanner_v1/__init__.py
index 7c9e9d70fe..4ece165503 100644
--- a/google/cloud/spanner_v1/__init__.py
+++ b/google/cloud/spanner_v1/__init__.py
@@ -28,6 +28,7 @@
 from .types.query_plan import PlanNode
 from .types.query_plan import QueryPlan
 from .types.result_set import PartialResultSet
+from .types import RequestOptions
 from .types.result_set import ResultSet
 from .types.result_set import ResultSetMetadata
 from .types.result_set import ResultSetStats
@@ -119,6 +120,7 @@
     "PlanNode",
     "QueryPlan",
     "ReadRequest",
+    "RequestOptions",
     "ResultSet",
     "ResultSetMetadata",
     "ResultSetStats",
diff --git a/google/cloud/spanner_v1/batch.py b/google/cloud/spanner_v1/batch.py
index 9a79507886..d1774ed36d 100644
--- a/google/cloud/spanner_v1/batch.py
+++ b/google/cloud/spanner_v1/batch.py
@@ -23,6 +23,7 @@
 from google.cloud.spanner_v1._helpers import _make_list_value_pbs
 from google.cloud.spanner_v1._helpers import _metadata_with_prefix
 from google.cloud.spanner_v1._opentelemetry_tracing import trace_call
+from google.cloud.spanner_v1 import RequestOptions

 # pylint: enable=ungrouped-imports

@@ -138,13 +139,20 @@ def _check_state(self):
         if self.committed is not None:
             raise ValueError("Batch already committed")

-    def commit(self, return_commit_stats=False):
+    def commit(self, return_commit_stats=False, request_options=None):
         """Commit mutations to the database.
         :type return_commit_stats: bool
         :param return_commit_stats:
           If true, the response will return commit stats which can be
           accessed though commit_stats.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :rtype: datetime
         :returns: timestamp of the committed changes.
         """
@@ -154,11 +162,16 @@ def commit(self, return_commit_stats=False):
         metadata = _metadata_with_prefix(database.name)
         txn_options = TransactionOptions(read_write=TransactionOptions.ReadWrite())
         trace_attributes = {"num_mutations": len(self._mutations)}
+
+        if type(request_options) == dict:
+            request_options = RequestOptions(request_options)
+
         request = CommitRequest(
             session=self._session.name,
             mutations=self._mutations,
             single_use_transaction=txn_options,
             return_commit_stats=return_commit_stats,
+            request_options=request_options,
         )
         with trace_call("CloudSpanner.Commit", self._session, trace_attributes):
             response = api.commit(request=request, metadata=metadata,)
diff --git a/google/cloud/spanner_v1/database.py b/google/cloud/spanner_v1/database.py
index 5eb688d9c6..fae983f334 100644
--- a/google/cloud/spanner_v1/database.py
+++ b/google/cloud/spanner_v1/database.py
@@ -58,10 +58,10 @@
     TransactionOptions,
 )
 from google.cloud.spanner_v1.table import Table
+from google.cloud.spanner_v1 import RequestOptions

 # pylint: enable=ungrouped-imports

-
 SPANNER_DATA_SCOPE = "https://www.googleapis.com/auth/spanner.data"

@@ -454,7 +454,12 @@ def drop(self):
         api.drop_database(database=self.name, metadata=metadata)

     def execute_partitioned_dml(
-        self, dml, params=None, param_types=None, query_options=None
+        self,
+        dml,
+        params=None,
+        param_types=None,
+        query_options=None,
+        request_options=None,
     ):
         """Execute a partitionable DML statement.
@@ -478,12 +483,22 @@ def execute_partitioned_dml(
             If a dict is provided, it must be of the same form as the protobuf
             message :class:`~google.cloud.spanner_v1.types.QueryOptions`

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :rtype: int
         :returns: Count of rows affected by the DML statement.
         """
         query_options = _merge_query_options(
             self._instance._client._query_options, query_options
         )
+        if type(request_options) == dict:
+            request_options = RequestOptions(request_options)
+
         if params is not None:
             from google.cloud.spanner_v1.transaction import Transaction

@@ -517,6 +532,7 @@ def execute_pdml():
                 params=params_pb,
                 param_types=param_types,
                 query_options=query_options,
+                request_options=request_options,
             )
             method = functools.partial(
                 api.execute_streaming_sql, metadata=metadata,
@@ -561,16 +577,23 @@
         """
         return SnapshotCheckout(self, **kw)

-    def batch(self):
+    def batch(self, request_options=None):
         """Return an object which wraps a batch.

         The wrapper *must* be used as a context manager, with the batch
         as the value returned by the wrapper.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for the commit request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :rtype: :class:`~google.cloud.spanner_v1.database.BatchCheckout`
         :returns: new wrapper
         """
-        return BatchCheckout(self)
+        return BatchCheckout(self, request_options)

     def batch_snapshot(self, read_timestamp=None, exact_staleness=None):
         """Return an object which wraps a batch read / query.
@@ -756,11 +779,19 @@ class BatchCheckout(object):

     :type database: :class:`~google.cloud.spanner_v1.database.Database`
     :param database: database to use
+
+    :type request_options:
+        :class:`google.cloud.spanner_v1.types.RequestOptions`
+    :param request_options:
+            (Optional) Common options for the commit request.
+            If a dict is provided, it must be of the same form as the protobuf
+            message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
     """

-    def __init__(self, database):
+    def __init__(self, database, request_options=None):
         self._database = database
         self._session = self._batch = None
+        self._request_options = request_options

     def __enter__(self):
         """Begin ``with`` block."""
@@ -772,7 +803,10 @@ def __exit__(self, exc_type, exc_val, exc_tb):
         """End ``with`` block."""
         try:
             if exc_type is None:
-                self._batch.commit(return_commit_stats=self._database.log_commit_stats)
+                self._batch.commit(
+                    return_commit_stats=self._database.log_commit_stats,
+                    request_options=self._request_options,
+                )
         finally:
             if self._database.log_commit_stats and self._batch.commit_stats:
                 self._database.logger.info(
diff --git a/google/cloud/spanner_v1/session.py b/google/cloud/spanner_v1/session.py
index 1321308ace..84b65429d6 100644
--- a/google/cloud/spanner_v1/session.py
+++ b/google/cloud/spanner_v1/session.py
@@ -230,6 +230,7 @@ def execute_sql(
         param_types=None,
         query_mode=None,
         query_options=None,
+        request_options=None,
         retry=google.api_core.gapic_v1.method.DEFAULT,
         timeout=google.api_core.gapic_v1.method.DEFAULT,
     ):
@@ -258,6 +259,13 @@ def execute_sql(
             or :class:`dict`
         :param query_options: (Optional) Options that are provided for query plan stability.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :type retry: :class:`~google.api_core.retry.Retry`
         :param retry: (Optional) The retry settings for this request.
@@ -273,6 +281,7 @@
             param_types,
             query_mode,
             query_options=query_options,
+            request_options=request_options,
             retry=retry,
             timeout=timeout,
         )
@@ -319,9 +328,12 @@ def run_in_transaction(self, func, *args, **kw):

         :type kw: dict
         :param kw: (Optional) keyword arguments to be passed to ``func``.
-                   If passed, "timeout_secs" will be removed and used to
+                   If passed:
+                   "timeout_secs" will be removed and used to
                    override the default retry timeout which defines maximum timestamp
                    to continue retrying the transaction.
+                   "commit_request_options" will be removed and used to set the
+                   request options for the commit request.

         :rtype: Any
         :returns: The return value of ``func``.
@@ -330,6 +342,7 @@
                  reraises any non-ABORT exceptions raised by ``func``.
         """
         deadline = time.time() + kw.pop("timeout_secs", DEFAULT_RETRY_TIMEOUT_SECS)
+        commit_request_options = kw.pop("commit_request_options", None)
         attempts = 0

         while True:
@@ -355,7 +368,10 @@
                     raise

             try:
-                txn.commit(return_commit_stats=self._database.log_commit_stats)
+                txn.commit(
+                    return_commit_stats=self._database.log_commit_stats,
+                    request_options=commit_request_options,
+                )
             except Aborted as exc:
                 del self._transaction
                 _delay_until_retry(exc, deadline, attempts)
diff --git a/google/cloud/spanner_v1/snapshot.py b/google/cloud/spanner_v1/snapshot.py
index f926d7836d..eccd8720e1 100644
--- a/google/cloud/spanner_v1/snapshot.py
+++ b/google/cloud/spanner_v1/snapshot.py
@@ -34,6 +34,7 @@
 from google.cloud.spanner_v1._helpers import _SessionWrapper
 from google.cloud.spanner_v1._opentelemetry_tracing import trace_call
 from google.cloud.spanner_v1.streamed import StreamedResultSet
+from google.cloud.spanner_v1 import RequestOptions

 _STREAM_RESUMPTION_INTERNAL_ERROR_MESSAGES = (
     "RST_STREAM",
@@ -124,6 +125,7 @@ def read(
         index="",
         limit=0,
         partition=None,
+        request_options=None,
         *,
         retry=gapic_v1.method.DEFAULT,
         timeout=gapic_v1.method.DEFAULT,
@@ -152,6 +154,13 @@ def read(
             from :meth:`partition_read`.    Incompatible with
             ``limit``.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :type retry: :class:`~google.api_core.retry.Retry`
         :param retry: (Optional) The retry settings for this request.

@@ -176,6 +185,9 @@ def read(
         metadata = _metadata_with_prefix(database.name)
         transaction = self._make_txn_selector()

+        if type(request_options) == dict:
+            request_options = RequestOptions(request_options)
+
         request = ReadRequest(
             session=self._session.name,
             table=table,
@@ -185,6 +197,7 @@ def read(
             index=index,
             limit=limit,
             partition_token=partition,
+            request_options=request_options,
         )
         restart = functools.partial(
             api.streaming_read,
@@ -217,6 +230,7 @@ def execute_sql(
         param_types=None,
         query_mode=None,
         query_options=None,
+        request_options=None,
         partition=None,
         retry=gapic_v1.method.DEFAULT,
         timeout=gapic_v1.method.DEFAULT,
@@ -249,6 +263,13 @@ def execute_sql(
             If a dict is provided, it must be of the same form as the protobuf
             message :class:`~google.cloud.spanner_v1.types.QueryOptions`

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :type partition: bytes
         :param partition: (Optional) one of the partition tokens returned
             from :meth:`partition_query`.
@@ -291,6 +312,9 @@ def execute_sql(
         default_query_options = database._instance._client._query_options
         query_options = _merge_query_options(default_query_options, query_options)

+        if type(request_options) == dict:
+            request_options = RequestOptions(request_options)
+
         request = ExecuteSqlRequest(
             session=self._session.name,
             sql=sql,
@@ -301,6 +325,7 @@ def execute_sql(
             partition_token=partition,
             seqno=self._execute_sql_count,
             query_options=query_options,
+            request_options=request_options,
         )
         restart = functools.partial(
             api.execute_streaming_sql,
diff --git a/google/cloud/spanner_v1/transaction.py b/google/cloud/spanner_v1/transaction.py
index 4c99b26a09..fce14eb60d 100644
--- a/google/cloud/spanner_v1/transaction.py
+++ b/google/cloud/spanner_v1/transaction.py
@@ -29,6 +29,7 @@
 from google.cloud.spanner_v1.snapshot import _SnapshotBase
 from google.cloud.spanner_v1.batch import _BatchBase
 from google.cloud.spanner_v1._opentelemetry_tracing import trace_call
+from google.cloud.spanner_v1 import RequestOptions
 from google.api_core import gapic_v1

@@ -122,13 +123,20 @@ def rollback(self):
         self.rolled_back = True
         del self._session._transaction

-    def commit(self, return_commit_stats=False):
+    def commit(self, return_commit_stats=False, request_options=None):
         """Commit mutations to the database.

         :type return_commit_stats: bool
         :param return_commit_stats:
           If true, the response will return commit stats which can be
           accessed though commit_stats.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :rtype: datetime
         :returns: timestamp of the committed changes.
         :raises ValueError: if there are no mutations to commit.
@@ -139,11 +147,16 @@ def commit(self, return_commit_stats=False):
         api = database.spanner_api
         metadata = _metadata_with_prefix(database.name)
         trace_attributes = {"num_mutations": len(self._mutations)}
+
+        if type(request_options) == dict:
+            request_options = RequestOptions(request_options)
+
         request = CommitRequest(
             session=self._session.name,
             mutations=self._mutations,
             transaction_id=self._transaction_id,
             return_commit_stats=return_commit_stats,
+            request_options=request_options,
         )
         with trace_call("CloudSpanner.Commit", self._session, trace_attributes):
             response = api.commit(request=request, metadata=metadata,)
@@ -192,6 +205,7 @@ def execute_update(
         param_types=None,
         query_mode=None,
         query_options=None,
+        request_options=None,
         *,
         retry=gapic_v1.method.DEFAULT,
         timeout=gapic_v1.method.DEFAULT,
@@ -221,6 +235,13 @@ def execute_update(
             or :class:`dict`
         :param query_options: (Optional) Options that are provided for query plan stability.

+        :type request_options:
+            :class:`google.cloud.spanner_v1.types.RequestOptions`
+        :param request_options:
+                (Optional) Common options for this request.
+                If a dict is provided, it must be of the same form as the protobuf
+                message :class:`~google.cloud.spanner_v1.types.RequestOptions`.
+
         :type retry: :class:`~google.api_core.retry.Retry`
         :param retry: (Optional) The retry settings for this request.
@@ -246,7 +267,11 @@ def execute_update( default_query_options = database._instance._client._query_options query_options = _merge_query_options(default_query_options, query_options) + if type(request_options) == dict: + request_options = RequestOptions(request_options) + trace_attributes = {"db.statement": dml} + request = ExecuteSqlRequest( session=self._session.name, sql=dml, @@ -256,6 +281,7 @@ def execute_update( query_mode=query_mode, query_options=query_options, seqno=seqno, + request_options=request_options, ) with trace_call( "CloudSpanner.ReadWriteTransaction", self._session, trace_attributes @@ -265,7 +291,7 @@ def execute_update( ) return response.stats.row_count_exact - def batch_update(self, statements): + def batch_update(self, statements, request_options=None): """Perform a batch of DML statements via an ``ExecuteBatchDml`` request. :type statements: @@ -279,6 +305,13 @@ def batch_update(self, statements): must also be passed, as a dict mapping names to the type of value passed in 'params'. + :type request_options: + :class:`google.cloud.spanner_v1.types.RequestOptions` + :param request_options: + (Optional) Common options for this request. + If a dict is provided, it must be of the same form as the protobuf + message :class:`~google.cloud.spanner_v1.types.RequestOptions`. 
+ :rtype: Tuple(status, Sequence[int]) :returns: @@ -310,6 +343,9 @@ def batch_update(self, statements): self._execute_sql_count + 1, ) + if type(request_options) == dict: + request_options = RequestOptions(request_options) + trace_attributes = { # Get just the queries from the DML statement batch "db.statement": ";".join([statement.sql for statement in parsed]) @@ -319,6 +355,7 @@ def batch_update(self, statements): transaction=transaction, statements=parsed, seqno=seqno, + request_options=request_options, ) with trace_call("CloudSpanner.DMLTransaction", self._session, trace_attributes): response = api.execute_batch_dml(request=request, metadata=metadata) diff --git a/tests/system/test_system.py b/tests/system/test_system.py index 7c1c0d6f64..8471cfc4c2 100644 --- a/tests/system/test_system.py +++ b/tests/system/test_system.py @@ -43,11 +43,13 @@ from google.cloud.spanner_v1.instance import Backup from google.cloud.spanner_v1.instance import Instance from google.cloud.spanner_v1.table import Table +from google.cloud.spanner_v1 import RequestOptions from test_utils.retry import RetryErrors from test_utils.retry import RetryInstanceState from test_utils.retry import RetryResult from test_utils.system import unique_resource_id + from tests._fixtures import DDL_STATEMENTS from tests._fixtures import EMULATOR_DDL_STATEMENTS from tests._helpers import OpenTelemetryBase, HAS_OPENTELEMETRY_INSTALLED @@ -1821,6 +1823,9 @@ def _setup_table(txn): update_statement, params={"email": nonesuch, "target": target}, param_types={"email": param_types.STRING, "target": param_types.STRING}, + request_options=RequestOptions( + priority=RequestOptions.Priority.PRIORITY_MEDIUM + ), ) self.assertEqual(row_count, 1) diff --git a/tests/unit/test_batch.py b/tests/unit/test_batch.py index 3112f17ecf..f7915814a3 100644 --- a/tests/unit/test_batch.py +++ b/tests/unit/test_batch.py @@ -232,12 +232,13 @@ def test_commit_ok(self): self.assertEqual(committed, now) self.assertEqual(batch.committed, 
committed) - (session, mutations, single_use_txn, metadata) = api._committed + (session, mutations, single_use_txn, metadata, request_options) = api._committed self.assertEqual(session, self.SESSION_NAME) self.assertEqual(mutations, batch._mutations) self.assertIsInstance(single_use_txn, TransactionOptions) self.assertTrue(type(single_use_txn).pb(single_use_txn).HasField("read_write")) self.assertEqual(metadata, [("google-cloud-resource-prefix", database.name)]) + self.assertEqual(request_options, None) self.assertSpanAttributes( "CloudSpanner.Commit", attributes=dict(BASE_ATTRIBUTES, num_mutations=1) @@ -280,12 +281,13 @@ def test_context_mgr_success(self): self.assertEqual(batch.committed, now) - (session, mutations, single_use_txn, metadata) = api._committed + (session, mutations, single_use_txn, metadata, request_options) = api._committed self.assertEqual(session, self.SESSION_NAME) self.assertEqual(mutations, batch._mutations) self.assertIsInstance(single_use_txn, TransactionOptions) self.assertTrue(type(single_use_txn).pb(single_use_txn).HasField("read_write")) self.assertEqual(metadata, [("google-cloud-resource-prefix", database.name)]) + self.assertEqual(request_options, None) self.assertSpanAttributes( "CloudSpanner.Commit", attributes=dict(BASE_ATTRIBUTES, num_mutations=1) @@ -339,7 +341,7 @@ def __init__(self, **kwargs): self.__dict__.update(**kwargs) def commit( - self, request=None, metadata=None, + self, request=None, metadata=None, request_options=None, ): from google.api_core.exceptions import Unknown @@ -349,6 +351,7 @@ def commit( request.mutations, request.single_use_transaction, metadata, + request_options, ) if self._rpc_error: raise Unknown("error") diff --git a/tests/unit/test_database.py b/tests/unit/test_database.py index c71bab2581..05e6f2b422 100644 --- a/tests/unit/test_database.py +++ b/tests/unit/test_database.py @@ -21,6 +21,8 @@ from google.cloud.spanner_v1.param_types import INT64 from google.api_core.retry import Retry +from 
google.cloud.spanner_v1 import RequestOptions + DML_WO_PARAM = """ DELETE FROM citizens """ @@ -902,7 +904,13 @@ def test_drop_success(self): ) def _execute_partitioned_dml_helper( - self, dml, params=None, param_types=None, query_options=None, retried=False + self, + dml, + params=None, + param_types=None, + query_options=None, + request_options=None, + retried=False, ): from google.api_core.exceptions import Aborted from google.api_core.retry import Retry @@ -949,7 +957,7 @@ def _execute_partitioned_dml_helper( api.execute_streaming_sql.return_value = iterator row_count = database.execute_partitioned_dml( - dml, params, param_types, query_options + dml, params, param_types, query_options, request_options ) self.assertEqual(row_count, 2) @@ -989,6 +997,7 @@ def _execute_partitioned_dml_helper( params=expected_params, param_types=param_types, query_options=expected_query_options, + request_options=request_options, ) api.execute_streaming_sql.assert_any_call( @@ -1006,6 +1015,7 @@ def _execute_partitioned_dml_helper( params=expected_params, param_types=param_types, query_options=expected_query_options, + request_options=request_options, ) api.execute_streaming_sql.assert_called_with( request=expected_request, @@ -1035,6 +1045,14 @@ def test_execute_partitioned_dml_w_query_options(self): query_options=ExecuteSqlRequest.QueryOptions(optimizer_version="3"), ) + def test_execute_partitioned_dml_w_request_options(self): + self._execute_partitioned_dml_helper( + dml=DML_W_PARAM, + request_options=RequestOptions( + priority=RequestOptions.Priority.PRIORITY_MEDIUM + ), + ) + def test_execute_partitioned_dml_wo_params_retry_aborted(self): self._execute_partitioned_dml_helper(dml=DML_WO_PARAM, retried=True) diff --git a/tests/unit/test_session.py b/tests/unit/test_session.py index 9c2e9dce3c..4daabdf952 100644 --- a/tests/unit/test_session.py +++ b/tests/unit/test_session.py @@ -550,6 +550,7 @@ def test_execute_sql_defaults(self): None, None, query_options=None, + 
request_options=None, timeout=google.api_core.gapic_v1.method.DEFAULT, retry=google.api_core.gapic_v1.method.DEFAULT, ) @@ -579,6 +580,7 @@ def test_execute_sql_non_default_retry(self): param_types, "PLAN", query_options=None, + request_options=None, timeout=None, retry=None, ) @@ -606,6 +608,7 @@ def test_execute_sql_explicit(self): param_types, "PLAN", query_options=None, + request_options=None, timeout=google.api_core.gapic_v1.method.DEFAULT, retry=google.api_core.gapic_v1.method.DEFAULT, ) diff --git a/tests/unit/test_snapshot.py b/tests/unit/test_snapshot.py index bbc1753474..627b18d910 100644 --- a/tests/unit/test_snapshot.py +++ b/tests/unit/test_snapshot.py @@ -15,6 +15,8 @@ from google.api_core import gapic_v1 import mock + +from google.cloud.spanner_v1 import RequestOptions from tests._helpers import ( OpenTelemetryBase, StatusCode, @@ -590,6 +592,7 @@ def _execute_sql_helper( partition=None, sql_count=0, query_options=None, + request_options=None, timeout=gapic_v1.method.DEFAULT, retry=gapic_v1.method.DEFAULT, ): @@ -649,6 +652,7 @@ def _execute_sql_helper( PARAM_TYPES, query_mode=MODE, query_options=query_options, + request_options=request_options, partition=partition, retry=retry, timeout=timeout, @@ -695,6 +699,7 @@ def _execute_sql_helper( param_types=PARAM_TYPES, query_mode=MODE, query_options=expected_query_options, + request_options=request_options, partition_token=partition, seqno=sql_count, ) @@ -747,6 +752,14 @@ def test_execute_sql_w_query_options(self): query_options=ExecuteSqlRequest.QueryOptions(optimizer_version="3"), ) + def test_execute_sql_w_request_options(self): + self._execute_sql_helper( + multi_use=False, + request_options=RequestOptions( + priority=RequestOptions.Priority.PRIORITY_MEDIUM + ), + ) + def _partition_read_helper( self, multi_use, diff --git a/tests/unit/test_transaction.py b/tests/unit/test_transaction.py index 99f986d99e..d87821fa4a 100644 --- a/tests/unit/test_transaction.py +++ b/tests/unit/test_transaction.py @@ 
-14,12 +14,15 @@ import mock -from tests._helpers import OpenTelemetryBase, StatusCode + +from google.cloud.spanner_v1 import RequestOptions from google.cloud.spanner_v1 import Type from google.cloud.spanner_v1 import TypeCode from google.api_core.retry import Retry from google.api_core import gapic_v1 +from tests._helpers import OpenTelemetryBase, StatusCode + TABLE_NAME = "citizens" COLUMNS = ["email", "first_name", "last_name", "age"] VALUES = [ @@ -416,6 +419,7 @@ def _execute_update_helper( self, count=0, query_options=None, + request_options=None, retry=gapic_v1.method.DEFAULT, timeout=gapic_v1.method.DEFAULT, ): @@ -447,6 +451,7 @@ def _execute_update_helper( PARAM_TYPES, query_mode=MODE, query_options=query_options, + request_options=request_options, retry=retry, timeout=timeout, ) @@ -472,6 +477,7 @@ def _execute_update_helper( param_types=PARAM_TYPES, query_mode=MODE, query_options=expected_query_options, + request_options=request_options, seqno=count, ) api.execute_sql.assert_called_once_with( @@ -518,6 +524,13 @@ def test_execute_update_w_query_options(self): query_options=ExecuteSqlRequest.QueryOptions(optimizer_version="3") ) + def test_execute_update_w_request_options(self): + self._execute_update_helper( + request_options=RequestOptions( + priority=RequestOptions.Priority.PRIORITY_MEDIUM + ) + ) + def test_batch_update_other_error(self): database = _Database() database.spanner_api = self._make_spanner_api() @@ -529,7 +542,7 @@ def test_batch_update_other_error(self): with self.assertRaises(RuntimeError): transaction.batch_update(statements=[DML_QUERY]) - def _batch_update_helper(self, error_after=None, count=0): + def _batch_update_helper(self, error_after=None, count=0, request_options=None): from google.rpc.status_pb2 import Status from google.protobuf.struct_pb2 import Struct from google.cloud.spanner_v1 import param_types @@ -576,7 +589,9 @@ def _batch_update_helper(self, error_after=None, count=0): transaction._transaction_id = 
self.TRANSACTION_ID transaction._execute_sql_count = count - status, row_counts = transaction.batch_update(dml_statements) + status, row_counts = transaction.batch_update( + dml_statements, request_options=request_options + ) self.assertEqual(status, expected_status) self.assertEqual(row_counts, expected_row_counts) @@ -602,6 +617,7 @@ def _batch_update_helper(self, error_after=None, count=0): transaction=expected_transaction, statements=expected_statements, seqno=count, + request_options=request_options, ) api.execute_batch_dml.assert_called_once_with( request=expected_request, @@ -611,7 +627,11 @@ def _batch_update_helper(self, error_after=None, count=0): self.assertEqual(transaction._execute_sql_count, count + 1) def test_batch_update_wo_errors(self): - self._batch_update_helper() + self._batch_update_helper( + request_options=RequestOptions( + priority=RequestOptions.Priority.PRIORITY_MEDIUM + ), + ) def test_batch_update_w_errors(self): self._batch_update_helper(error_after=2, count=1) From c58cab432918ca119f21889b390c8ffe3ad7abf2 Mon Sep 17 00:00:00 2001 From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com> Date: Tue, 22 Jun 2021 13:54:03 +0000 Subject: [PATCH 09/12] chore: update precommit hook pre-commit/pre-commit-hooks to v4 (#1083) (#375) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit [![WhiteSource Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com) This PR contains the following updates: | Package | Type | Update | Change | |---|---|---|---| | [pre-commit/pre-commit-hooks](https://togithub.com/pre-commit/pre-commit-hooks) | repository | major | `v3.4.0` -> `v4.0.1` | --- ### Release Notes
pre-commit/pre-commit-hooks ### [`v4.0.1`](https://togithub.com/pre-commit/pre-commit-hooks/releases/v4.0.1) [Compare Source](https://togithub.com/pre-commit/pre-commit-hooks/compare/v4.0.0...v4.0.1) ##### Fixes - `check-shebang-scripts-are-executable` fix entry point. - [#​602](https://togithub.com/pre-commit/pre-commit-hooks/issues/602) issue by [@​Person-93](https://togithub.com/Person-93). - [#​603](https://togithub.com/pre-commit/pre-commit-hooks/issues/603) PR by [@​scop](https://togithub.com/scop). ### [`v4.0.0`](https://togithub.com/pre-commit/pre-commit-hooks/releases/v4.0.0) [Compare Source](https://togithub.com/pre-commit/pre-commit-hooks/compare/v3.4.0...v4.0.0) ##### Features - `check-json`: report duplicate keys. - [#​558](https://togithub.com/pre-commit/pre-commit-hooks/issues/558) PR by [@​AdityaKhursale](https://togithub.com/AdityaKhursale). - [#​554](https://togithub.com/pre-commit/pre-commit-hooks/issues/554) issue by [@​adamchainz](https://togithub.com/adamchainz). - `no-commit-to-branch`: add `main` to default blocked branches. - [#​565](https://togithub.com/pre-commit/pre-commit-hooks/issues/565) PR by [@​ndevenish](https://togithub.com/ndevenish). - `check-case-conflict`: check conflicts in directory names as well. - [#​575](https://togithub.com/pre-commit/pre-commit-hooks/issues/575) PR by [@​slsyy](https://togithub.com/slsyy). - [#​70](https://togithub.com/pre-commit/pre-commit-hooks/issues/70) issue by [@​andyjack](https://togithub.com/andyjack). - `check-vcs-permalinks`: forbid other branch names. - [#​582](https://togithub.com/pre-commit/pre-commit-hooks/issues/582) PR by [@​jack1142](https://togithub.com/jack1142). - [#​581](https://togithub.com/pre-commit/pre-commit-hooks/issues/581) issue by [@​jack1142](https://togithub.com/jack1142). - `check-shebang-scripts-are-executable`: new hook which ensures shebang'd scripts are executable. 
- [#​545](https://togithub.com/pre-commit/pre-commit-hooks/issues/545) PR by [@​scop](https://togithub.com/scop). ##### Fixes - `check-executables-have-shebangs`: Short circuit shebang lookup on windows. - [#​544](https://togithub.com/pre-commit/pre-commit-hooks/issues/544) PR by [@​scop](https://togithub.com/scop). - `requirements-txt-fixer`: Fix comments which have indentation - [#​549](https://togithub.com/pre-commit/pre-commit-hooks/issues/549) PR by [@​greshilov](https://togithub.com/greshilov). - [#​548](https://togithub.com/pre-commit/pre-commit-hooks/issues/548) issue by [@​greshilov](https://togithub.com/greshilov). - `pretty-format-json`: write to stdout using UTF-8 encoding. - [#​571](https://togithub.com/pre-commit/pre-commit-hooks/issues/571) PR by [@​jack1142](https://togithub.com/jack1142). - [#​570](https://togithub.com/pre-commit/pre-commit-hooks/issues/570) issue by [@​jack1142](https://togithub.com/jack1142). - Use more inclusive language. - [#​599](https://togithub.com/pre-commit/pre-commit-hooks/issues/599) PR by [@​asottile](https://togithub.com/asottile). ##### Breaking changes - Remove deprecated hooks: `flake8`, `pyflakes`, `autopep8-wrapper`. - [#​597](https://togithub.com/pre-commit/pre-commit-hooks/issues/597) PR by [@​asottile](https://togithub.com/asottile).
--- ### Configuration 📅 **Schedule**: At any time (no schedule defined). 🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied. ♻️ **Rebasing**: Renovate will not automatically rebase this PR, because other commits have been found. 🔕 **Ignore**: Close this PR and you won't be reminded about this update again. --- - [ ] If you want to rebase/retry this PR, check this box. --- This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#github/googleapis/synthtool). Source-Link: https://github.com/googleapis/synthtool/commit/333fd90856f1454380514bc59fc0936cdaf1c202 Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:b8c131c558606d3cea6e18f8e87befbd448c1482319b0db3c5d5388fa6ea72e3 --- .github/.OwlBot.lock.yaml | 2 +- .pre-commit-config.yaml | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/.OwlBot.lock.yaml b/.github/.OwlBot.lock.yaml index cc49c6a3df..9602d54059 100644 --- a/.github/.OwlBot.lock.yaml +++ b/.github/.OwlBot.lock.yaml @@ -1,3 +1,3 @@ docker: image: gcr.io/repo-automation-bots/owlbot-python:latest - digest: sha256:b6169fc6a5207b11800a7c002d0c5c2bc6d82697185ca12e666f44031468cfcd + digest: sha256:b8c131c558606d3cea6e18f8e87befbd448c1482319b0db3c5d5388fa6ea72e3 diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 4f00c7cffc..62eb5a77d9 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -16,7 +16,7 @@ # See https://pre-commit.com/hooks.html for more hooks repos: - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v3.4.0 + rev: v4.0.1 hooks: - id: trailing-whitespace - id: end-of-file-fixer From b8b24e17a74c1296ca5de75798a1a32597691b53 Mon Sep 17 00:00:00 2001 From: larkee <31196561+larkee@users.noreply.github.com> Date: Thu, 24 Jun 2021 09:22:54 +1000 Subject: [PATCH 10/12] fix: classify batched DDL statements (#360) * fix: classify 
batched DDL statements * docs: add comment * style: fix lint Co-authored-by: larkee --- google/cloud/spanner_dbapi/cursor.py | 7 ++++++- test.py | 11 +++++++++++ tests/unit/spanner_dbapi/test_cursor.py | 14 +++++++++++++- 3 files changed, 30 insertions(+), 2 deletions(-) create mode 100644 test.py diff --git a/google/cloud/spanner_dbapi/cursor.py b/google/cloud/spanner_dbapi/cursor.py index 689ba8cf66..c5de13b370 100644 --- a/google/cloud/spanner_dbapi/cursor.py +++ b/google/cloud/spanner_dbapi/cursor.py @@ -176,11 +176,16 @@ def execute(self, sql, args=None): try: classification = parse_utils.classify_stmt(sql) if classification == parse_utils.STMT_DDL: + ddl_statements = [] for ddl in sqlparse.split(sql): if ddl: if ddl[-1] == ";": ddl = ddl[:-1] - self.connection._ddl_statements.append(ddl) + if parse_utils.classify_stmt(ddl) != parse_utils.STMT_DDL: + raise ValueError("Only DDL statements may be batched.") + ddl_statements.append(ddl) + # Only queue DDL statements if they are all correctly classified. 
+ self.connection._ddl_statements.extend(ddl_statements) if self.connection.autocommit: self.connection.run_prior_DDL_statements() return diff --git a/test.py b/test.py new file mode 100644 index 0000000000..6032524b04 --- /dev/null +++ b/test.py @@ -0,0 +1,11 @@ +from google.cloud import spanner +from google.cloud.spanner_v1 import RequestOptions + +client = spanner.Client() +instance = client.instance('test-instance') +database = instance.database('test-db') + +with database.snapshot() as snapshot: + results = snapshot.execute_sql("SELECT * FROM all_types LIMIT 10") + +database.drop() \ No newline at end of file diff --git a/tests/unit/spanner_dbapi/test_cursor.py b/tests/unit/spanner_dbapi/test_cursor.py index 789ca06695..5b1cf12138 100644 --- a/tests/unit/spanner_dbapi/test_cursor.py +++ b/tests/unit/spanner_dbapi/test_cursor.py @@ -171,13 +171,25 @@ def test_execute_statement(self): connection = self._make_connection(self.INSTANCE, mock.MagicMock()) cursor = self._make_one(connection) + with mock.patch( + "google.cloud.spanner_dbapi.parse_utils.classify_stmt", + side_effect=[parse_utils.STMT_DDL, parse_utils.STMT_INSERT], + ) as mock_classify_stmt: + sql = "sql" + with self.assertRaises(ValueError): + cursor.execute(sql=sql) + mock_classify_stmt.assert_called_with(sql) + self.assertEqual(mock_classify_stmt.call_count, 2) + self.assertEqual(cursor.connection._ddl_statements, []) + with mock.patch( "google.cloud.spanner_dbapi.parse_utils.classify_stmt", return_value=parse_utils.STMT_DDL, ) as mock_classify_stmt: sql = "sql" cursor.execute(sql=sql) - mock_classify_stmt.assert_called_once_with(sql) + mock_classify_stmt.assert_called_with(sql) + self.assertEqual(mock_classify_stmt.call_count, 2) self.assertEqual(cursor.connection._ddl_statements, [sql]) with mock.patch( From 44aa7cc79769b6b7870b9de7204094f816150a25 Mon Sep 17 00:00:00 2001 From: Zoe Date: Thu, 24 Jun 2021 09:26:35 +1000 Subject: [PATCH 11/12] feat: add support for low-cost instances (#313) * Add
LCI implementation * Update google/cloud/spanner_v1/instance.py Co-authored-by: larkee <31196561+larkee@users.noreply.github.com> * Fix docstring format * Update google/cloud/spanner_v1/instance.py Co-authored-by: larkee <31196561+larkee@users.noreply.github.com> Co-authored-by: larkee <31196561+larkee@users.noreply.github.com> --- google/cloud/spanner_v1/client.py | 9 +++- google/cloud/spanner_v1/instance.py | 83 +++++++++++++++++++++++++---- tests/system/test_system.py | 29 ++++++++++ tests/unit/test_client.py | 3 ++ tests/unit/test_instance.py | 71 +++++++++++++++++++++--- 5 files changed, 177 insertions(+), 18 deletions(-) diff --git a/google/cloud/spanner_v1/client.py b/google/cloud/spanner_v1/client.py index d5ccf39546..4d5fc1b69a 100644 --- a/google/cloud/spanner_v1/client.py +++ b/google/cloud/spanner_v1/client.py @@ -49,7 +49,6 @@ from google.cloud.client import ClientWithProject from google.cloud.spanner_v1 import __version__ from google.cloud.spanner_v1._helpers import _merge_query_options, _metadata_with_prefix -from google.cloud.spanner_v1.instance import DEFAULT_NODE_COUNT from google.cloud.spanner_v1.instance import Instance from google.cloud.spanner_v1 import ExecuteSqlRequest from google.cloud.spanner_admin_instance_v1 import ListInstanceConfigsRequest @@ -294,8 +293,9 @@ def instance( instance_id, configuration_name=None, display_name=None, - node_count=DEFAULT_NODE_COUNT, + node_count=None, labels=None, + processing_units=None, ): """Factory to create a instance associated with this client. @@ -320,6 +320,10 @@ def instance( :param node_count: (Optional) The number of nodes in the instance's cluster; used to set up the instance's cluster. + :type processing_units: int + :param processing_units: (Optional) The number of processing units + allocated to this instance. + :type labels: dict (str -> str) or None :param labels: (Optional) User-assigned labels for this instance. 
@@ -334,6 +338,7 @@ def instance( display_name, self._emulator_host, labels, + processing_units, ) def list_instances(self, filter_="", page_size=None): diff --git a/google/cloud/spanner_v1/instance.py b/google/cloud/spanner_v1/instance.py index 5a9cf95f5a..7f5539acf8 100644 --- a/google/cloud/spanner_v1/instance.py +++ b/google/cloud/spanner_v1/instance.py @@ -15,6 +15,7 @@ """User friendly container for Cloud Spanner Instance.""" import google.api_core.operation +from google.api_core.exceptions import InvalidArgument import re from google.cloud.spanner_admin_instance_v1 import Instance as InstancePB @@ -41,6 +42,7 @@ ) DEFAULT_NODE_COUNT = 1 +PROCESSING_UNITS_PER_NODE = 1000 _OPERATION_METADATA_MESSAGES = ( backup.Backup, @@ -95,6 +97,10 @@ class Instance(object): :type node_count: int :param node_count: (Optional) Number of nodes allocated to the instance. + :type processing_units: int + :param processing_units: (Optional) The number of processing units + allocated to this instance. + :type display_name: str :param display_name: (Optional) The display name for the instance in the Cloud Console UI. (Must be between 4 and 30 @@ -110,15 +116,29 @@ def __init__( instance_id, client, configuration_name=None, - node_count=DEFAULT_NODE_COUNT, + node_count=None, display_name=None, emulator_host=None, labels=None, + processing_units=None, ): self.instance_id = instance_id self._client = client self.configuration_name = configuration_name - self.node_count = node_count + if node_count is not None and processing_units is not None: + if processing_units != node_count * PROCESSING_UNITS_PER_NODE: + raise InvalidArgument( + "Only one of node count and processing units can be set." 
+ ) + if node_count is None and processing_units is None: + self._node_count = DEFAULT_NODE_COUNT + self._processing_units = DEFAULT_NODE_COUNT * PROCESSING_UNITS_PER_NODE + elif node_count is not None: + self._node_count = node_count + self._processing_units = node_count * PROCESSING_UNITS_PER_NODE + else: + self._processing_units = processing_units + self._node_count = processing_units // PROCESSING_UNITS_PER_NODE self.display_name = display_name or instance_id self.emulator_host = emulator_host if labels is None: @@ -134,7 +154,8 @@ def _update_from_pb(self, instance_pb): raise ValueError("Instance protobuf does not contain display_name") self.display_name = instance_pb.display_name self.configuration_name = instance_pb.config - self.node_count = instance_pb.node_count + self._node_count = instance_pb.node_count + self._processing_units = instance_pb.processing_units self.labels = instance_pb.labels @classmethod @@ -190,6 +211,44 @@ def name(self): """ return self._client.project_name + "/instances/" + self.instance_id + @property + def processing_units(self): + """Processing units used in requests. + + :rtype: int + :returns: The number of processing units allocated to this instance. + """ + return self._processing_units + + @processing_units.setter + def processing_units(self, value): + """Sets the processing units for requests. Affects node_count. + + :param value: The number of processing units allocated to this instance. + """ + self._processing_units = value + self._node_count = value // PROCESSING_UNITS_PER_NODE + + @property + def node_count(self): + """Node count used in requests. + + :rtype: int + :returns: + The number of nodes in the instance's cluster; + used to set up the instance's cluster. + """ + return self._node_count + + @node_count.setter + def node_count(self, value): + """Sets the node count for requests. Affects processing_units. + + :param value: The number of nodes in the instance's cluster. 
+ """ + self._node_count = value + self._processing_units = value * PROCESSING_UNITS_PER_NODE + def __eq__(self, other): if not isinstance(other, self.__class__): return NotImplemented @@ -218,7 +277,8 @@ def copy(self): self.instance_id, new_client, self.configuration_name, - node_count=self.node_count, + node_count=self._node_count, + processing_units=self._processing_units, display_name=self.display_name, ) @@ -250,7 +310,7 @@ def create(self): name=self.name, config=self.configuration_name, display_name=self.display_name, - node_count=self.node_count, + processing_units=self._processing_units, labels=self.labels, ) metadata = _metadata_with_prefix(self.name) @@ -306,8 +366,8 @@ def update(self): .. note:: - Updates the ``display_name``, ``node_count`` and ``labels``. To change those - values before updating, set them via + Updates the ``display_name``, ``node_count``, ``processing_units`` + and ``labels``. To change those values before updating, set them via .. code:: python @@ -325,10 +385,15 @@ def update(self): name=self.name, config=self.configuration_name, display_name=self.display_name, - node_count=self.node_count, + node_count=self._node_count, + processing_units=self._processing_units, labels=self.labels, ) - field_mask = FieldMask(paths=["config", "display_name", "node_count", "labels"]) + + # Always update only processing_units, not nodes + field_mask = FieldMask( + paths=["config", "display_name", "processing_units", "labels"] + ) metadata = _metadata_with_prefix(self.name) future = api.update_instance( diff --git a/tests/system/test_system.py b/tests/system/test_system.py index 8471cfc4c2..ad2b8a9178 100644 --- a/tests/system/test_system.py +++ b/tests/system/test_system.py @@ -229,6 +229,35 @@ def test_create_instance(self): self.assertEqual(instance, instance_alt) self.assertEqual(instance.display_name, instance_alt.display_name) + @unittest.skipIf(USE_EMULATOR, "Skipping LCI tests") + @unittest.skipUnless(CREATE_INSTANCE, "Skipping instance 
creation") + def test_create_instance_with_processing_nodes(self): + ALT_INSTANCE_ID = "new" + unique_resource_id("-") + PROCESSING_UNITS = 5000 + instance = Config.CLIENT.instance( + instance_id=ALT_INSTANCE_ID, + configuration_name=Config.INSTANCE_CONFIG.name, + processing_units=PROCESSING_UNITS, + ) + operation = instance.create() + # Make sure this instance gets deleted after the test case. + self.instances_to_delete.append(instance) + + # We want to make sure the operation completes. + operation.result( + SPANNER_OPERATION_TIMEOUT_IN_SECONDS + ) # raises on failure / timeout. + + # Create a new instance object and make sure it is the same. + instance_alt = Config.CLIENT.instance( + ALT_INSTANCE_ID, Config.INSTANCE_CONFIG.name + ) + instance_alt.reload() + + self.assertEqual(instance, instance_alt) + self.assertEqual(instance.display_name, instance_alt.display_name) + self.assertEqual(instance.processing_units, instance_alt.processing_units) + @unittest.skipIf(USE_EMULATOR, "Skipping updating instance") def test_update_instance(self): OLD_DISPLAY_NAME = Config.INSTANCE.display_name diff --git a/tests/unit/test_client.py b/tests/unit/test_client.py index d33d9cc08a..2777fbc9a0 100644 --- a/tests/unit/test_client.py +++ b/tests/unit/test_client.py @@ -37,6 +37,7 @@ class TestClient(unittest.TestCase): INSTANCE_NAME = "%s/instances/%s" % (PATH, INSTANCE_ID) DISPLAY_NAME = "display-name" NODE_COUNT = 5 + PROCESSING_UNITS = 5000 LABELS = {"test": "true"} TIMEOUT_SECONDS = 80 @@ -580,6 +581,7 @@ def test_list_instances(self): config=self.CONFIGURATION_NAME, display_name=self.DISPLAY_NAME, node_count=self.NODE_COUNT, + processing_units=self.PROCESSING_UNITS, ) ] ) @@ -597,6 +599,7 @@ def test_list_instances(self): self.assertEqual(instance.config, self.CONFIGURATION_NAME) self.assertEqual(instance.display_name, self.DISPLAY_NAME) self.assertEqual(instance.node_count, self.NODE_COUNT) + self.assertEqual(instance.processing_units, self.PROCESSING_UNITS)
expected_metadata = ( ("google-cloud-resource-prefix", client.project_name), diff --git a/tests/unit/test_instance.py b/tests/unit/test_instance.py index 2ed777b25b..c715fb2ee1 100644 --- a/tests/unit/test_instance.py +++ b/tests/unit/test_instance.py @@ -27,6 +27,7 @@ class TestInstance(unittest.TestCase): LOCATION = "projects/" + PROJECT + "/locations/" + CONFIG_NAME DISPLAY_NAME = "display_name" NODE_COUNT = 5 + PROCESSING_UNITS = 5000 OP_ID = 8915 OP_NAME = "operations/projects/%s/instances/%soperations/%d" % ( PROJECT, @@ -39,6 +40,7 @@ class TestInstance(unittest.TestCase): DATABASE_ID = "database_id" DATABASE_NAME = "%s/databases/%s" % (INSTANCE_NAME, DATABASE_ID) LABELS = {"test": "true"} + FIELD_MASK = ["config", "display_name", "processing_units", "labels"] def _getTargetClass(self): from google.cloud.spanner_v1.instance import Instance @@ -230,7 +232,7 @@ def test_create_already_exists(self): self.assertEqual(instance.name, self.INSTANCE_NAME) self.assertEqual(instance.config, self.CONFIG_NAME) self.assertEqual(instance.display_name, self.INSTANCE_ID) - self.assertEqual(instance.node_count, 1) + self.assertEqual(instance.processing_units, 1000) self.assertEqual(metadata, [("google-cloud-resource-prefix", instance.name)]) def test_create_success(self): @@ -258,7 +260,36 @@ def test_create_success(self): self.assertEqual(instance.name, self.INSTANCE_NAME) self.assertEqual(instance.config, self.CONFIG_NAME) self.assertEqual(instance.display_name, self.DISPLAY_NAME) - self.assertEqual(instance.node_count, self.NODE_COUNT) + self.assertEqual(instance.processing_units, self.PROCESSING_UNITS) + self.assertEqual(instance.labels, self.LABELS) + self.assertEqual(metadata, [("google-cloud-resource-prefix", instance.name)]) + + def test_create_with_processing_units(self): + op_future = _FauxOperationFuture() + client = _Client(self.PROJECT) + api = client.instance_admin_api = _FauxInstanceAdminAPI( + _create_instance_response=op_future + ) + instance = 
self._make_one( + self.INSTANCE_ID, + client, + configuration_name=self.CONFIG_NAME, + display_name=self.DISPLAY_NAME, + processing_units=self.PROCESSING_UNITS, + labels=self.LABELS, + ) + + future = instance.create() + + self.assertIs(future, op_future) + + (parent, instance_id, instance, metadata) = api._created_instance + self.assertEqual(parent, self.PARENT) + self.assertEqual(instance_id, self.INSTANCE_ID) + self.assertEqual(instance.name, self.INSTANCE_NAME) + self.assertEqual(instance.config, self.CONFIG_NAME) + self.assertEqual(instance.display_name, self.DISPLAY_NAME) + self.assertEqual(instance.processing_units, self.PROCESSING_UNITS) self.assertEqual(instance.labels, self.LABELS) self.assertEqual(metadata, [("google-cloud-resource-prefix", instance.name)]) @@ -389,9 +420,7 @@ def test_update_not_found(self): instance.update() instance, field_mask, metadata = api._updated_instance - self.assertEqual( - field_mask.paths, ["config", "display_name", "node_count", "labels"] - ) + self.assertEqual(field_mask.paths, self.FIELD_MASK) self.assertEqual(instance.name, self.INSTANCE_NAME) self.assertEqual(instance.config, self.CONFIG_NAME) self.assertEqual(instance.display_name, self.INSTANCE_ID) @@ -417,14 +446,42 @@ def test_update_success(self): self.assertIs(future, op_future) + instance, field_mask, metadata = api._updated_instance + self.assertEqual(field_mask.paths, self.FIELD_MASK) + self.assertEqual(instance.name, self.INSTANCE_NAME) + self.assertEqual(instance.config, self.CONFIG_NAME) + self.assertEqual(instance.display_name, self.DISPLAY_NAME) + self.assertEqual(instance.node_count, self.NODE_COUNT) + self.assertEqual(instance.labels, self.LABELS) + self.assertEqual(metadata, [("google-cloud-resource-prefix", instance.name)]) + + def test_update_success_with_processing_units(self): + op_future = _FauxOperationFuture() + client = _Client(self.PROJECT) + api = client.instance_admin_api = _FauxInstanceAdminAPI( + _update_instance_response=op_future + ) + 
instance = self._make_one( + self.INSTANCE_ID, + client, + configuration_name=self.CONFIG_NAME, + processing_units=self.PROCESSING_UNITS, + display_name=self.DISPLAY_NAME, + labels=self.LABELS, + ) + + future = instance.update() + + self.assertIs(future, op_future) + instance, field_mask, metadata = api._updated_instance self.assertEqual( - field_mask.paths, ["config", "display_name", "node_count", "labels"] + field_mask.paths, ["config", "display_name", "processing_units", "labels"] ) self.assertEqual(instance.name, self.INSTANCE_NAME) self.assertEqual(instance.config, self.CONFIG_NAME) self.assertEqual(instance.display_name, self.DISPLAY_NAME) - self.assertEqual(instance.node_count, self.NODE_COUNT) + self.assertEqual(instance.processing_units, self.PROCESSING_UNITS) self.assertEqual(instance.labels, self.LABELS) self.assertEqual(metadata, [("google-cloud-resource-prefix", instance.name)]) From 32a557674dfe0718ea922d9a2359b89a26a7baf8 Mon Sep 17 00:00:00 2001 From: "release-please[bot]" <55107282+release-please[bot]@users.noreply.github.com> Date: Thu, 24 Jun 2021 17:01:50 +1200 Subject: [PATCH 12/12] chore: release 3.6.0 (#370) * chore: release 3.6.0 * fix: add missing PR to CHANGELOG Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com> Co-authored-by: larkee --- CHANGELOG.md | 22 ++++++++++++++++++++++ setup.py | 2 +- 2 files changed, 23 insertions(+), 1 deletion(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index 24886db2ab..6e9caf08c6 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,6 +4,28 @@ [1]: https://pypi.org/project/google-cloud-spanner/#history +## [3.6.0](https://www.github.com/googleapis/python-spanner/compare/v3.5.0...v3.6.0) (2021-06-23) + + +### Features + +* add RPC priority support ([#324](https://www.github.com/googleapis/python-spanner/issues/324)) ([51533b8](https://www.github.com/googleapis/python-spanner/commit/51533b812b68004eafeb402641b974e76bf9a837)) +* add support for low-cost instances 
([#313](https://www.github.com/googleapis/python-spanner/issues/313)) ([44aa7cc](https://www.github.com/googleapis/python-spanner/commit/44aa7cc79769b6b7870b9de7204094f816150a25)) +* **spanner:** add processing_units to Instance resource ([#364](https://www.github.com/googleapis/python-spanner/issues/364)) ([113505c](https://www.github.com/googleapis/python-spanner/commit/113505c58dc52509973f4199330a8983e3c5d848)) +* update query stats samples ([#373](https://www.github.com/googleapis/python-spanner/issues/373)) ([c1ee8c2](https://www.github.com/googleapis/python-spanner/commit/c1ee8c2685a794f9f89329e16f7c461e135114af)) + + +### Bug Fixes + +* **db_api:** use sqlparse to split DDL statements ([#372](https://www.github.com/googleapis/python-spanner/issues/372)) ([ed9e124](https://github.com/googleapis/python-spanner/commit/ed9e124aa74e44778104e45eae1e577978d6b866)) +* **db_api:** classify batched DDL statements ([#360](https://www.github.com/googleapis/python-spanner/issues/360)) ([b8b24e1](https://www.github.com/googleapis/python-spanner/commit/b8b24e17a74c1296ca5de75798a1a32597691b53)) +* **deps:** add packaging requirement ([#368](https://www.github.com/googleapis/python-spanner/issues/368)) ([89c126c](https://www.github.com/googleapis/python-spanner/commit/89c126ceca327fcf9f344dace691522e7351dde7)) + + +### Documentation + +* omit mention of Python 2.7 in 'CONTRIBUTING.rst' ([#1127](https://www.github.com/googleapis/python-spanner/issues/1127)) ([#374](https://www.github.com/googleapis/python-spanner/issues/374)) ([b7b3c38](https://www.github.com/googleapis/python-spanner/commit/b7b3c383abcca99dcbae6d92b27c49ca6707010a)), closes [#1126](https://www.github.com/googleapis/python-spanner/issues/1126) + ## [3.5.0](https://www.github.com/googleapis/python-spanner/compare/v3.4.0...v3.5.0) (2021-06-11) diff --git a/setup.py b/setup.py index d8becf5f2c..c9e69d9271 100644 --- a/setup.py +++ b/setup.py @@ -22,7 +22,7 @@ name = "google-cloud-spanner" description = "Cloud 
Spanner API client library" -version = "3.5.0" +version = "3.6.0" # Should be one of: # 'Development Status :: 3 - Alpha' # 'Development Status :: 4 - Beta'
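A note on the pattern in the test patches above: the update field mask sent to the Instance Admin API lists `node_count` when the instance is sized by nodes and `processing_units` when it is sized by processing units (the new `FIELD_MASK` constant and the inline list in `test_update_success_with_processing_units` show the two shapes). The following standalone sketch illustrates that selection; `build_update_field_mask` is a hypothetical helper for this illustration, not code from this patch series or from `google-cloud-spanner` itself:

```python
def build_update_field_mask(node_count=None, processing_units=None):
    """Hypothetical helper: pick the capacity field for an instance update mask.

    Mirrors the masks asserted in the unit tests above; an instance is
    sized by exactly one of node_count / processing_units.
    """
    capacity = "processing_units" if processing_units is not None else "node_count"
    return ["config", "display_name", capacity, "labels"]
```

For example, `build_update_field_mask(processing_units=5000)` yields `["config", "display_name", "processing_units", "labels"]`, matching the `FIELD_MASK` constant added in `tests/unit/test_instance.py`.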