Closed
One big takeaway is that the connection should not be bound to `Blob` and `Bucket` (#728). If the `Bucket` does not explicitly have the batch set as its connection, the request happens outside the batch:
```python
>>> from gcloud import storage
>>> from gcloud.storage.batch import Batch
>>> storage._PROJECT_ENV_VAR_NAME = 'GCLOUD_TESTS_PROJECT_ID'
>>> storage.set_defaults()
>>> bucket_name = 'dsmlmsldfsacjnajdnkewee'
>>> connection = storage.get_default_connection()
>>> bucket = storage.Bucket(name=bucket_name, connection=connection)
>>> blob = storage.Blob('foo', bucket=bucket)
>>> blob._reload_properties()
<Blob: dsmlmsldfsacjnajdnkewee, foo>
>>> blob.content_type = 'foo/bar'
>>>
>>> with Batch() as batch:
...     blob.patch()
...
<Blob: dsmlmsldfsacjnajdnkewee, foo>
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "gcloud/storage/batch.py", line 165, in __exit__
    self.finish()
  File "gcloud/storage/batch.py", line 130, in finish
    raise ValueError("No deferred requests")
ValueError: No deferred requests
```
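A minimal sketch (illustrative stand-in classes, not the real gcloud internals) of why the request escapes the batch: the blob keeps a reference to the connection it was constructed with, so entering the batch context never reroutes its API calls, and the batch exits with nothing deferred.

```python
class Connection:
    """Stand-in for storage.Connection; records requests it performs."""
    def __init__(self):
        self.requests = []

    def api_request(self, method, path):
        self.requests.append((method, path))


class Batch(Connection):
    """Stand-in for storage.batch.Batch; defers requests until exit."""
    def __init__(self):
        super().__init__()
        self._deferred = []

    def api_request(self, method, path):
        self._deferred.append((method, path))

    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        self.finish()

    def finish(self):
        if not self._deferred:
            raise ValueError("No deferred requests")
        # (a real Batch would send the deferred requests here)


class Blob:
    """Stand-in blob: bound to one connection at construction time."""
    def __init__(self, name, connection):
        self.name = name
        self.connection = connection  # bound once, never rerouted

    def patch(self):
        self.connection.api_request('PATCH', '/b/bucket/o/%s' % self.name)


connection = Connection()
blob = Blob('foo', connection)
try:
    with Batch() as batch:
        blob.patch()              # goes to `connection`, not `batch`!
except ValueError as exc:
    print(exc)                    # No deferred requests
print(connection.requests)        # [('PATCH', '/b/bucket/o/foo')]
```

The patch runs immediately against the bound connection and the batch sees nothing, which is exactly the `ValueError` above.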
On the other hand, if we do the setup correctly, it leaves the blob in an undefined state (I referenced this some time ago):
```python
>>> from gcloud import storage
>>> from gcloud.storage.batch import Batch
>>> storage._PROJECT_ENV_VAR_NAME = 'GCLOUD_TESTS_PROJECT_ID'
>>> storage.set_defaults()
>>>
>>> with Batch() as batch:
...     bucket_name = 'dsmlmsldfsacjnajdnkewee'
...     bucket = storage.Bucket(name=bucket_name, connection=batch)
...     blob = storage.Blob('foo', bucket=bucket)
...     blob.content_type = 'foo/bar'
...     blob.patch()
...
<Blob: dsmlmsldfsacjnajdnkewee, foo>
>>> blob._properties
''
>>> blob.content_type
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "gcloud/storage/_helpers.py", line 163, in _getter
    return self.properties[fieldname]
  File "gcloud/storage/_helpers.py", line 64, in properties
    return self._properties.copy()
AttributeError: 'str' object has no attribute 'copy'
```
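A sketch of what the second transcript suggests is happening (names here are illustrative, not the real gcloud code): inside a batch the deferred `api_request` has no real response to return yet, so an empty placeholder string comes back, and the blob stores that string where a properties dict belongs. Every later property access then calls `.copy()` on a `str`.

```python
class Batch:
    """Stand-in batch whose api_request has no real response yet."""
    def api_request(self, method, path):
        # The request is deferred; nothing sensible to return yet.
        return ''


class Blob:
    """Stand-in blob that stores whatever the connection returns."""
    def __init__(self, name, connection):
        self.name = name
        self.connection = connection
        self._properties = {}

    def patch(self):
        # The placeholder response overwrites the properties dict.
        self._properties = self.connection.api_request(
            'PATCH', '/b/bucket/o/%s' % self.name)

    @property
    def content_type(self):
        # Mirrors _helpers.py: copies the dict, then indexes it.
        return self._properties.copy()['contentType']


blob = Blob('foo', Batch())
blob.patch()
print(repr(blob._properties))     # ''
try:
    blob.content_type
except AttributeError as exc:
    print(exc)                    # 'str' object has no attribute 'copy'
```

This is the "undefined state": the blob is neither usable nor obviously broken until the next property access blows up, which is another argument for not binding the connection to `Blob`/`Bucket` at construction time.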