Releases: octue/django-gcp
Refactoring after Task Emulator work
Allow Task Queue Emulation
Contents (#90)
IMPORTANT: There is 1 breaking change.
Enhancements
- 💥 BREAKING CHANGE: Update tasks to enable use of emulator
Operations
- Version bump
Refactoring
- Remove unused AppEngine functionality
- Update to new cloud scheduler import
Upgrade instructions
💥 Update tasks to enable use of emulator
The Task.enqueue() method no longer attempts to create a queue when a queue NotFound exception is raised. This avoids unintuitively obscuring which resources get created and when. You should define your queues explicitly, either with an IaC tool like Terraform (preferable) or manually in the GCP console (not preferable). If you need dynamic queues (rare), the suggested approach is to manage the queue creation process explicitly yourself, for example as sketched below.
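A minimal sketch of explicit queue creation using the google-cloud-tasks client; the project, location, and queue names below are placeholders, and in most cases a Terraform-managed queue remains preferable:

```python
from google.cloud import tasks_v2

# Sketch only: project, location and queue names are illustrative placeholders
client = tasks_v2.CloudTasksClient()
parent = "projects/my-project/locations/europe-west1"
queue = tasks_v2.Queue(name=f"{parent}/queues/my-dynamic-queue")

# Create the queue explicitly before enqueueing tasks to it
client.create_queue(parent=parent, queue=queue)
```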
Use unfold's field rendering
Add a `get_bucket_name` method to blob field utils
What's Changed
Full Changelog: 0.18.2...0.18.3
Add get_bucket classmethod to BlobFieldMixin
What's Changed
Full Changelog: 0.18.1...0.18.2
Fix signature mismatch bug due to max size bytes header
Summary
In the previous release, 0.18.0, the addition of the X-Goog-Content-Length-Range header was "fixed": before that release the header had been added incorrectly and was therefore always missing.
However, this header must be identical in both places where it appears:
- encoded into the signature generation, and
- set on the upload request.
The "fix" in 0.18.0 added the header to the signature generation but did not set it on the upload request, which caused errors in all uploads using the widget.
This release fixes that problem by clarifying the default value and ensuring that the correct header is set on upload. In the event that the requested max_size_bytes is 0, the upload size is unlimited.
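django-gcp handles this pairing internally for the widget; the sketch below simply illustrates it using the google-cloud-storage client directly. The bucket, object name and size limit are assumptions for the example:

```python
from datetime import timedelta

import requests
from google.cloud import storage

# Illustrative values only
max_size_bytes = 10 * 1024 * 1024
content_length_range = f"0,{max_size_bytes}"

blob = storage.Client().bucket("my-bucket").blob("uploads/example.bin")

# 1. The header is encoded into the signature...
signed_url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(hours=1),
    method="PUT",
    headers={"X-Goog-Content-Length-Range": content_length_range},
)

# 2. ...and the identical header is set on the upload request
requests.put(
    signed_url,
    data=b"file contents",
    headers={"X-Goog-Content-Length-Range": content_length_range},
)
```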
This release also adds a setting, GCP_STORAGE_BLOBFIELD_MAX_SIZE_BYTES, allowing you to set a maximum upload size for all your BlobFields. However, to avoid a breaking change, the default is set to 0 (unlimited).
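For example, in your Django settings (the 50 MB cap is illustrative):

```python
# settings.py
# Default is 0, meaning unlimited upload size; set a value to cap uploads
# for all BlobFields, e.g. 50 MB:
GCP_STORAGE_BLOBFIELD_MAX_SIZE_BYTES = 50 * 1024 * 1024
```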
Add uploaded_blob context manager, allow calling of model clean() with overridden settings
Contents (#84)
New features
- Add uploaded_blob context manager, useful for unit testing
Fixes
- Allow field cleaning to happen inside an overridden context
- Correct implementation of headers for limiting content length range
- Make upload_blob work with the default destination path helper
Operations
- Remove deprecated setting from devcontainer json
Refactoring
- Remove redundant context in test
Testing
- Test uploaded_blob context manager
Export BlobField from `django-gcp.storages`
Access ingress path
Contents (#80)
IMPORTANT: There is 1 breaking change.
New features
- Add update_attributes callback for setting blob metadata
Fixes
- Avoid save on unrefreshed object
Refactoring
- 💥 BREAKING CHANGE: Move upload_blob to the operations module where it belongs
Testing
- Fix race condition from transaction handler
Upgrade instructions
💥 Move upload_blob to the operations module where it belongs
If you were importing the upload_blob function from django_gcp.storage.blob_utils, import it directly from django_gcp.storage.operations or django_gcp.storage instead.
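For example:

```python
# Old import (no longer supported):
# from django_gcp.storage.blob_utils import upload_blob

# New import:
from django_gcp.storage.operations import upload_blob
# or equivalently:
# from django_gcp.storage import upload_blob
```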
Add complete method list to the blob field mixin
What's Changed
Full Changelog: 0.16.0...0.16.1