chore: transition the library to microgenerator (#56) · googleapis/google-cloud-python@ef36ecd · GitHub

Commit ef36ecd

plamut and busunkim96 authored

chore: transition the library to microgenerator (#56)

* chore: remove old GAPIC code
* Regenerate the library with microgenerator
* Fix docs toctree includes
* Update Python version compatibility in README
* Adjust samples
* Fix datatransfer shim unit test
* Reduce required coverage threshold
  The generated code tests do not cover all code paths after all...
* Simplify TransferConfig instantiation in sample
* Add UPGRADING guide
* Update UPGRADING.md (method name)

Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>
1 parent 3b0967e commit ef36ecd


60 files changed (+11686, -11930 lines)
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
#!/bin/bash
# Copyright 2020 Google LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

set -eo pipefail

function now { date +"%Y-%m-%d %H:%M:%S" | tr -d '\n' ;}
function msg { println "$*" >&2 ;}
function println { printf '%s\n' "$(now) $*" ;}


# Populates requested secrets set in SECRET_MANAGER_KEYS from service account:
# kokoro-trampoline@cloud-devrel-kokoro-resources.iam.gserviceaccount.com
SECRET_LOCATION="${KOKORO_GFILE_DIR}/secret_manager"
msg "Creating folder on disk for secrets: ${SECRET_LOCATION}"
mkdir -p ${SECRET_LOCATION}
for key in $(echo ${SECRET_MANAGER_KEYS} | sed "s/,/ /g")
do
  msg "Retrieving secret ${key}"
  docker run --entrypoint=gcloud \
    --volume=${KOKORO_GFILE_DIR}:${KOKORO_GFILE_DIR} \
    gcr.io/google.com/cloudsdktool/cloud-sdk \
    secrets versions access latest \
    --project cloud-devrel-kokoro-resources \
    --secret ${key} > \
    "${SECRET_LOCATION}/${key}"
  if [[ $? == 0 ]]; then
    msg "Secret written to ${SECRET_LOCATION}/${key}"
  else
    msg "Error retrieving secret ${key}"
  fi
done

packages/google-cloud-bigquery-datatransfer/.kokoro/release/common.cfg

Lines changed: 13 additions & 37 deletions
@@ -23,42 +23,18 @@ env_vars: {
   value: "github/python-bigquery-datatransfer/.kokoro/release.sh"
 }
 
-# Fetch the token needed for reporting release status to GitHub
-before_action {
-  fetch_keystore {
-    keystore_resource {
-      keystore_config_id: 73713
-      keyname: "yoshi-automation-github-key"
-    }
-  }
-}
-
-# Fetch PyPI password
-before_action {
-  fetch_keystore {
-    keystore_resource {
-      keystore_config_id: 73713
-      keyname: "google_cloud_pypi_password"
-    }
-  }
-}
-
-# Fetch magictoken to use with Magic Github Proxy
-before_action {
-  fetch_keystore {
-    keystore_resource {
-      keystore_config_id: 73713
-      keyname: "releasetool-magictoken"
-    }
-  }
+# Fetch PyPI password
+before_action {
+  fetch_keystore {
+    keystore_resource {
+      keystore_config_id: 73713
+      keyname: "google_cloud_pypi_password"
+    }
+  }
 }
 
-# Fetch api key to use with Magic Github Proxy
-before_action {
-  fetch_keystore {
-    keystore_resource {
-      keystore_config_id: 73713
-      keyname: "magic-github-proxy-api-key"
-    }
-  }
-}
+# Tokens needed to report release status back to GitHub
+env_vars: {
+  key: "SECRET_MANAGER_KEYS"
+  value: "releasetool-publish-reporter-app,releasetool-publish-reporter-googleapis-installation,releasetool-publish-reporter-pem"
+}

packages/google-cloud-bigquery-datatransfer/.kokoro/trampoline.sh

Lines changed: 10 additions & 5 deletions
@@ -15,9 +15,14 @@
 
 set -eo pipefail
 
-python3 "${KOKORO_GFILE_DIR}/trampoline_v1.py" || ret_code=$?
+# Always run the cleanup script, regardless of the success of bouncing into
+# the container.
+function cleanup() {
+    chmod +x ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
+    ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
+    echo "cleanup";
+}
+trap cleanup EXIT
 
-chmod +x ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
-${KOKORO_GFILE_DIR}/trampoline_cleanup.sh || true
-
-exit ${ret_code}
+$(dirname $0)/populate-secrets.sh # Secret Manager secrets.
+python3 "${KOKORO_GFILE_DIR}/trampoline_v1.py"

packages/google-cloud-bigquery-datatransfer/README.rst

Lines changed: 5 additions & 2 deletions
@@ -48,11 +48,14 @@ dependencies.
 
 Supported Python Versions
 ^^^^^^^^^^^^^^^^^^^^^^^^^
-Python >= 3.5
+Python >= 3.6
 
 Deprecated Python Versions
 ^^^^^^^^^^^^^^^^^^^^^^^^^^
-Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
+Python == 2.7.
+
+The last version of this library compatible with Python 2.7 is
+``google-cloud-bigquery-datatransfer==1.1.1``.
 
 
 Mac/Linux
Lines changed: 211 additions & 0 deletions
@@ -0,0 +1,211 @@
<!--
Copyright 2020 Google LLC

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->


# 2.0.0 Migration Guide

The 2.0 release of the `google-cloud-bigquery-datatransfer` client is a significant
upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python),
and includes substantial interface changes. Existing code written for earlier versions
of this library will likely require updates to use this version. This document
describes the changes that have been made, and what you need to do to update your usage.

If you experience issues or have questions, please file an
[issue](https://github.com/googleapis/python-bigquery-datatransfer/issues).


## Supported Python Versions

> **WARNING**: Breaking change

The 2.0.0 release requires Python 3.6+.


## Import Path

> **WARNING**: Breaking change

The library was moved into the `google.cloud.bigquery` namespace. Existing imports
need to be updated.

**Before:**
```py
from google.cloud import bigquery_datatransfer
from google.cloud import bigquery_datatransfer_v1
```

**After:**
```py
from google.cloud.bigquery import datatransfer
from google.cloud.bigquery import datatransfer_v1
```


## Method Calls

> **WARNING**: Breaking change

Methods that send requests to the backend expect request objects. We provide a script
that will convert most common use cases.

* Install the library

```sh
python3 -m pip install google-cloud-bigquery-datatransfer
```

* The script `fixup_datatransfer_v1_keywords.py` is shipped with the library. It expects
an input directory (with the code to convert) and an empty destination directory.

```sh
$ scripts/fixup_datatransfer_v1_keywords.py --input-directory .samples/ --output-directory samples/
```

**Before:**
```py
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

parent_project = "..."
transfer_config = {...}
authorization_code = "..."

response = client.create_transfer_config(
    parent_project, transfer_config, authorization_code=authorization_code
)
```


**After:**
```py
from google.cloud.bigquery import datatransfer

client = datatransfer.DataTransferServiceClient()

parent_project = "..."
transfer_config = {...}
authorization_code = "..."

response = client.create_transfer_config(
    request={
        "parent": parent_project,
        "transfer_config": transfer_config,
        "authorization_code": authorization_code,
    }
)
```

### More Details

In `google-cloud-bigquery-datatransfer<2.0.0`, parameters required by the API were positional
parameters and optional parameters were keyword parameters.

**Before:**
```py
def create_transfer_config(
    self,
    parent,
    transfer_config,
    authorization_code=None,
    version_info=None,
    service_account_name=None,
    retry=google.api_core.gapic_v1.method.DEFAULT,
    timeout=google.api_core.gapic_v1.method.DEFAULT,
    metadata=None,
):
```

In the `2.0.0` release, methods that interact with the backend have a single
positional parameter `request`. Method docstrings indicate whether a parameter is
required or optional.

Some methods have additional keyword-only parameters. The available parameters depend
on the [`google.api.method_signature` annotation](https://github.com/googleapis/python-bigquery-datatransfer/blob/master/google/cloud/bigquery_datatransfer_v1/proto/datatransfer.proto#L80)
specified by the API producer.


**After:**
```py
def create_transfer_config(
    self,
    request: datatransfer.CreateTransferConfigRequest = None,
    *,
    parent: str = None,
    transfer_config: transfer.TransferConfig = None,
    retry: retries.Retry = gapic_v1.method.DEFAULT,
    timeout: float = None,
    metadata: Sequence[Tuple[str, str]] = (),
) -> transfer.TransferConfig:
```

> **NOTE:** The `request` parameter and flattened keyword parameters for the API are
> mutually exclusive. Passing both will result in an error.


Both of these calls are valid:

```py
response = client.create_transfer_config(
    request={
        "parent": project_path,
        "transfer_config": {"foo": "bar"},
    }
)
```

```py
response = client.create_transfer_config(
    parent=project_path,
    transfer_config={"foo": "bar"},
)
```

This call is _invalid_ because it mixes `request` with a keyword argument `transfer_config`.
Executing this code will result in an error:

```py
response = client.create_transfer_config(
    request={"parent": project_path},
    transfer_config={"foo": "bar"},
)
```

> **NOTE:** The `request` parameter of some methods can also contain a richer set of
> options that are otherwise not available as explicit keyword-only parameters, thus
> these _must_ be passed through `request`.
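
The mutual-exclusion rule can be sketched in plain Python. This is an illustrative stand-in only (the function name mirrors the API method, but the body is not the library's generated code):

```python
def create_transfer_config(request=None, *, parent=None, transfer_config=None):
    # Illustrative sketch: mirrors the documented rule that `request`
    # and the flattened keyword parameters are mutually exclusive.
    flattened = {"parent": parent, "transfer_config": transfer_config}
    provided = {k: v for k, v in flattened.items() if v is not None}
    if request is not None and provided:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    # Otherwise build the effective request from whichever form was given.
    return dict(request) if request is not None else provided
```

Calling it with only `request` or only flattened keywords succeeds; mixing the two raises `ValueError`, which is the behavior the note above describes.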


## Removed Utility Methods

> **WARNING**: Breaking change

Most utility methods such as `project_path()` have been removed. The paths must
now be constructed manually:

```py
project_path = f"projects/{PROJECT_ID}"
```

The only two that remain are `transfer_config_path()` and `parse_transfer_config_path()`.
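
For removed helpers, building and parsing a path by hand is straightforward. A minimal stdlib-only sketch, assuming the conventional `projects/{project}/transferConfigs/{config}` layout (the function bodies here are illustrative, not the library's generated code):

```python
import re


def transfer_config_path(project: str, transfer_config: str) -> str:
    # Assemble the resource path by hand, as the removed helpers used to do.
    return f"projects/{project}/transferConfigs/{transfer_config}"


def parse_transfer_config_path(path: str) -> dict:
    # Invert the format above; returns {} when the path does not match.
    m = re.match(
        r"^projects/(?P<project>.+?)/transferConfigs/(?P<transfer_config>.+?)$",
        path,
    )
    return m.groupdict() if m else {}
```

The same pattern applies to any other resource path the old utility methods produced.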


## Removed `client_config` Parameter

The client can no longer be constructed with the `client_config` argument; this deprecated
argument has been removed. If you want to customize retry and timeout settings for a particular
method, you need to do it upon method invocation by passing the custom `timeout` and
`retry` arguments, respectively.
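
To make the per-call customization concrete, here is a minimal stdlib-only sketch of what a `retry`/`timeout` pair controls; it illustrates the concept only and is not google-api-core's actual `Retry` implementation:

```python
import time


def call_with_retry(func, *, retries: int = 3, delay: float = 0.0, timeout: float = 5.0):
    # Illustrative sketch: retry a call a bounded number of times while an
    # overall deadline (the per-call "timeout") has not yet passed.
    deadline = time.monotonic() + timeout
    last_exc = None
    for _ in range(retries):
        if time.monotonic() > deadline:
            break
        try:
            return func()
        except Exception as exc:  # a real policy would only retry transient errors
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

In the real library, the equivalent knobs are passed per invocation, e.g. `client.create_transfer_config(request=..., retry=..., timeout=...)`, rather than baked into the client at construction time.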
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
../UPGRADING.md
Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
Services for Google Cloud Bigquery Datatransfer v1 API
======================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.services.data_transfer_service
    :members:
    :inherited-members:
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
Types for Google Cloud Bigquery Datatransfer v1 API
===================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.types
    :members:

packages/google-cloud-bigquery-datatransfer/docs/gapic/v1/api.rst

Lines changed: 0 additions & 6 deletions
This file was deleted.

packages/google-cloud-bigquery-datatransfer/docs/gapic/v1/types.rst

Lines changed: 0 additions & 5 deletions
This file was deleted.

0 commit comments
