Commit fd1b62f

Preparing 2.7.0 release (#651)
* Testing 2.7.0 release
1 parent 3c8afae commit fd1b62f

20 files changed (+93, -96 lines)

.bumpversion.cfg

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 2.6.0
+current_version = 2.7.0
 commit = False
 tag = False
 tag_name = {new_version}
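The version strings in this commit are not edited by hand: the `bumpversion` tool reads `.bumpversion.cfg` and rewrites the current version string to the new one in every configured file. A minimal Python sketch of that search-and-replace step, with an illustrative file content (not read from the repository):

```python
def bump_version(text: str, current: str, new: str) -> str:
    """Replace every occurrence of the current version string, as a
    simplified model of what bumpversion does per configured file."""
    return text.replace(current, new)

# Illustrative config snippet; the real tool walks all files listed in
# .bumpversion.cfg and commits/tags according to the commit/tag flags.
cfg = "[bumpversion]\ncurrent_version = 2.6.0\ncommit = False\n"
print(bump_version(cfg, "2.6.0", "2.7.0"))
```

This is why a release commit like this one touches 20 files at once: every hard-coded `2.6.0` (badges, docstring URLs, `__version__`) is rewritten in a single step.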

CONTRIBUTING_COMMON_ERRORS.md

Lines changed: 3 additions & 3 deletions
@@ -13,9 +13,9 @@ Requirement already satisfied: pbr!=2.1.0,>=2.0.0 in ./.venv/lib/python3.7/site-
 Using legacy 'setup.py install' for python-Levenshtein, since package 'wheel' is not installed.
 Installing collected packages: awswrangler, python-Levenshtein
   Attempting uninstall: awswrangler
-    Found existing installation: awswrangler 2.6.0
-    Uninstalling awswrangler-2.6.0:
-      Successfully uninstalled awswrangler-2.6.0
+    Found existing installation: awswrangler 2.7.0
+    Uninstalling awswrangler-2.7.0:
+      Successfully uninstalled awswrangler-2.7.0
   Running setup.py develop for awswrangler
   Running setup.py install for python-Levenshtein ... error
     ERROR: Command errored out with exit status 1:

README.md

Lines changed: 30 additions & 30 deletions
@@ -8,7 +8,7 @@ Easy integration with Athena, Glue, Redshift, Timestream, QuickSight, Chime, Clo
 
 > An [AWS Professional Service](https://aws.amazon.com/professional-services/) open source initiative | [email protected]
 
-[![Release](https://img.shields.io/badge/release-2.6.0-brightgreen.svg)](https://pypi.org/project/awswrangler/)
+[![Release](https://img.shields.io/badge/release-2.7.0-brightgreen.svg)](https://pypi.org/project/awswrangler/)
 [![Python Version](https://img.shields.io/badge/python-3.6%20%7C%203.7%20%7C%203.8%20%7C%203.9-brightgreen.svg)](https://anaconda.org/conda-forge/awswrangler)
 [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
 [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
@@ -24,7 +24,7 @@ Easy integration with Athena, Glue, Redshift, Timestream, QuickSight, Chime, Clo
 | **[PyPi](https://pypi.org/project/awswrangler/)** | [![PyPI Downloads](https://pepy.tech/badge/awswrangler)](https://pypi.org/project/awswrangler/) | `pip install awswrangler` |
 | **[Conda](https://anaconda.org/conda-forge/awswrangler)** | [![Conda Downloads](https://img.shields.io/conda/dn/conda-forge/awswrangler.svg)](https://anaconda.org/conda-forge/awswrangler) | `conda install -c conda-forge awswrangler` |
 
-> ⚠️ **For platforms without PyArrow 3 support (e.g. [EMR](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#emr-cluster), [Glue PySpark Job](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#aws-glue-pyspark-jobs), MWAA):**<br>
+> ⚠️ **For platforms without PyArrow 3 support (e.g. [EMR](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#emr-cluster), [Glue PySpark Job](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#aws-glue-pyspark-jobs), MWAA):**<br>
 ➡️ `pip install pyarrow==2 awswrangler`
 
 Powered By [<img src="https://arrow.apache.org/img/arrow.png" width="200">](https://arrow.apache.org/powered_by/)
@@ -42,7 +42,7 @@ Powered By [<img src="https://arrow.apache.org/img/arrow.png" width="200">](http
 
 Installation command: `pip install awswrangler`
 
-> ⚠️ **For platforms without PyArrow 3 support (e.g. [EMR](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#emr-cluster), [Glue PySpark Job](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#aws-glue-pyspark-jobs), MWAA):**<br>
+> ⚠️ **For platforms without PyArrow 3 support (e.g. [EMR](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#emr-cluster), [Glue PySpark Job](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#aws-glue-pyspark-jobs), MWAA):**<br>
 ➡️`pip install pyarrow==2 awswrangler`
 
 ```py3
@@ -96,17 +96,17 @@ FROM "sampleDB"."sampleTable" ORDER BY time DESC LIMIT 3
 
 ## [Read The Docs](https://aws-data-wrangler.readthedocs.io/)
 
-- [**What is AWS Data Wrangler?**](https://aws-data-wrangler.readthedocs.io/en/2.6.0/what.html)
-- [**Install**](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html)
-  - [PyPi (pip)](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#pypi-pip)
-  - [Conda](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#conda)
-  - [AWS Lambda Layer](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#aws-lambda-layer)
-  - [AWS Glue Python Shell Jobs](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#aws-glue-python-shell-jobs)
-  - [AWS Glue PySpark Jobs](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#aws-glue-pyspark-jobs)
-  - [Amazon SageMaker Notebook](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#amazon-sagemaker-notebook)
-  - [Amazon SageMaker Notebook Lifecycle](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#amazon-sagemaker-notebook-lifecycle)
-  - [EMR](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#emr)
-  - [From source](https://aws-data-wrangler.readthedocs.io/en/2.6.0/install.html#from-source)
+- [**What is AWS Data Wrangler?**](https://aws-data-wrangler.readthedocs.io/en/2.7.0/what.html)
+- [**Install**](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html)
+  - [PyPi (pip)](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#pypi-pip)
+  - [Conda](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#conda)
+  - [AWS Lambda Layer](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#aws-lambda-layer)
+  - [AWS Glue Python Shell Jobs](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#aws-glue-python-shell-jobs)
+  - [AWS Glue PySpark Jobs](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#aws-glue-pyspark-jobs)
+  - [Amazon SageMaker Notebook](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#amazon-sagemaker-notebook)
+  - [Amazon SageMaker Notebook Lifecycle](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#amazon-sagemaker-notebook-lifecycle)
+  - [EMR](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#emr)
+  - [From source](https://aws-data-wrangler.readthedocs.io/en/2.7.0/install.html#from-source)
 - [**Tutorials**](https://github.com/awslabs/aws-data-wrangler/tree/main/tutorials)
   - [001 - Introduction](https://github.com/awslabs/aws-data-wrangler/blob/main/tutorials/001%20-%20Introduction.ipynb)
   - [002 - Sessions](https://github.com/awslabs/aws-data-wrangler/blob/main/tutorials/002%20-%20Sessions.ipynb)
@@ -136,22 +136,22 @@ FROM "sampleDB"."sampleTable" ORDER BY time DESC LIMIT 3
   - [026 - Amazon Timestream](https://github.com/awslabs/aws-data-wrangler/blob/main/tutorials/026%20-%20Amazon%20Timestream.ipynb)
   - [027 - Amazon Timestream 2](https://github.com/awslabs/aws-data-wrangler/blob/main/tutorials/027%20-%20Amazon%20Timestream%202.ipynb)
   - [028 - Amazon DynamoDB](https://github.com/awslabs/aws-data-wrangler/blob/main/tutorials/028%20-%20DynamoDB.ipynb)
-- [**API Reference**](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html)
-  - [Amazon S3](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-s3)
-  - [AWS Glue Catalog](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#aws-glue-catalog)
-  - [Amazon Athena](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-athena)
-  - [Amazon Redshift](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-redshift)
-  - [PostgreSQL](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#postgresql)
-  - [MySQL](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#mysql)
-  - [SQL Server](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#sqlserver)
-  - [DynamoDB](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#dynamodb)
-  - [Amazon Timestream](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-timestream)
-  - [Amazon EMR](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-emr)
-  - [Amazon CloudWatch Logs](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-cloudwatch-logs)
-  - [Amazon Chime](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-chime)
-  - [Amazon QuickSight](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#amazon-quicksight)
-  - [AWS STS](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#aws-sts)
-  - [AWS Secrets Manager](https://aws-data-wrangler.readthedocs.io/en/2.6.0/api.html#aws-secrets-manager)
+- [**API Reference**](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html)
+  - [Amazon S3](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-s3)
+  - [AWS Glue Catalog](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#aws-glue-catalog)
+  - [Amazon Athena](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-athena)
+  - [Amazon Redshift](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-redshift)
+  - [PostgreSQL](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#postgresql)
+  - [MySQL](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#mysql)
+  - [SQL Server](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#sqlserver)
+  - [DynamoDB](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#dynamodb)
+  - [Amazon Timestream](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-timestream)
+  - [Amazon EMR](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-emr)
+  - [Amazon CloudWatch Logs](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-cloudwatch-logs)
+  - [Amazon Chime](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-chime)
+  - [Amazon QuickSight](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#amazon-quicksight)
+  - [AWS STS](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#aws-sts)
+  - [AWS Secrets Manager](https://aws-data-wrangler.readthedocs.io/en/2.7.0/api.html#aws-secrets-manager)
 - [**License**](https://github.com/awslabs/aws-data-wrangler/blob/main/LICENSE.txt)
 - [**Contributing**](https://github.com/awslabs/aws-data-wrangler/blob/main/CONTRIBUTING.md)
 - [**Legacy Docs** (pre-1.0.0)](https://aws-data-wrangler.readthedocs.io/en/0.3.3/)

awswrangler/__metadata__.py

Lines changed: 1 addition & 1 deletion
@@ -7,5 +7,5 @@
 
 __title__: str = "awswrangler"
 __description__: str = "Pandas on AWS."
-__version__: str = "2.6.0"
+__version__: str = "2.7.0"
 __license__: str = "Apache License 2.0"

awswrangler/athena/_read.py

Lines changed: 8 additions & 8 deletions
@@ -595,11 +595,11 @@ def read_sql_query(
 
     **Related tutorial:**
 
-    - `Amazon Athena <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Amazon Athena <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/006%20-%20Amazon%20Athena.html>`_
-    - `Athena Cache <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Athena Cache <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/019%20-%20Athena%20Cache.html>`_
-    - `Global Configurations <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Global Configurations <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/021%20-%20Global%20Configurations.html>`_
 
     **There are two approaches to be defined through ctas_approach parameter:**
@@ -647,7 +647,7 @@ def read_sql_query(
     /athena.html#Athena.Client.get_query_execution>`_ .
 
     For a practical example check out the
-    `related tutorial <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    `related tutorial <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
     tutorials/024%20-%20Athena%20Query%20Metadata.html>`_!
 
@@ -863,11 +863,11 @@ def read_sql_table(
 
     **Related tutorial:**
 
-    - `Amazon Athena <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Amazon Athena <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/006%20-%20Amazon%20Athena.html>`_
-    - `Athena Cache <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Athena Cache <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/019%20-%20Athena%20Cache.html>`_
-    - `Global Configurations <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    - `Global Configurations <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
       tutorials/021%20-%20Global%20Configurations.html>`_
 
     **There are two approaches to be defined through ctas_approach parameter:**
@@ -912,7 +912,7 @@ def read_sql_table(
     /athena.html#Athena.Client.get_query_execution>`_ .
 
     For a practical example check out the
-    `related tutorial <https://aws-data-wrangler.readthedocs.io/en/2.6.0/
+    `related tutorial <https://aws-data-wrangler.readthedocs.io/en/2.7.0/
     tutorials/024%20-%20Athena%20Query%20Metadata.html>`_!

awswrangler/s3/_read_parquet.py

Lines changed: 1 addition & 1 deletion
@@ -697,7 +697,7 @@ def read_parquet_table(
         This function MUST return a bool, True to read the partition or False to ignore it.
         Ignored if `dataset=False`.
         E.g ``lambda x: True if x["year"] == "2020" and x["month"] == "1" else False``
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
     columns : List[str], optional
         Names of columns to read from the file(s).
     validate_schema:
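The `partition_filter` callable documented in this hunk receives each partition's values as a dict of strings and returns a bool; partitions mapping to False are skipped before any data is read. A standalone sketch of that predicate logic, using hypothetical partition values and no S3 access:

```python
# The same predicate shape as the docstring example: partition values
# arrive as strings, and the callable returns True to keep the partition.
partition_filter = lambda x: x["year"] == "2020" and x["month"] == "1"

# Hypothetical partitions of a dataset, for illustration only.
partitions = [
    {"year": "2020", "month": "1"},
    {"year": "2020", "month": "2"},
    {"year": "2019", "month": "1"},
]
kept = [p for p in partitions if partition_filter(p)]
print(kept)  # only the year=2020/month=1 partition survives
```

Note that the comparisons are against strings, not integers: partition values come from S3 path components, so `x["month"] == 1` would never match.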

awswrangler/s3/_read_text.py

Lines changed: 3 additions & 3 deletions
@@ -217,7 +217,7 @@ def read_csv(
         This function MUST return a bool, True to read the partition or False to ignore it.
         Ignored if `dataset=False`.
         E.g ``lambda x: True if x["year"] == "2020" and x["month"] == "1" else False``
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
     pandas_kwargs :
         KEYWORD arguments forwarded to pandas.read_csv(). You can NOT pass `pandas_kwargs` explicit, just add valid
         Pandas arguments in the function call and Wrangler will accept it.
@@ -359,7 +359,7 @@ def read_fwf(
         This function MUST return a bool, True to read the partition or False to ignore it.
         Ignored if `dataset=False`.
         E.g ``lambda x: True if x["year"] == "2020" and x["month"] == "1" else False``
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
     pandas_kwargs:
         KEYWORD arguments forwarded to pandas.read_fwf(). You can NOT pass `pandas_kwargs` explicit, just add valid
         Pandas arguments in the function call and Wrangler will accept it.
@@ -505,7 +505,7 @@ def read_json(
         This function MUST return a bool, True to read the partition or False to ignore it.
         Ignored if `dataset=False`.
         E.g ``lambda x: True if x["year"] == "2020" and x["month"] == "1" else False``
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/023%20-%20Flexible%20Partitions%20Filter.html
     pandas_kwargs:
         KEYWORD arguments forwarded to pandas.read_json(). You can NOT pass `pandas_kwargs` explicit, just add valid
         Pandas arguments in the function call and Wrangler will accept it.

awswrangler/s3/_write_parquet.py

Lines changed: 3 additions & 3 deletions
@@ -293,18 +293,18 @@ def to_parquet(  # pylint: disable=too-many-arguments,too-many-locals
     concurrent_partitioning: bool
         If True will increase the parallelism level during the partitions writing. It will decrease the
         writing time and increase the memory usage.
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/022%20-%20Writing%20Partitions%20Concurrently.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/022%20-%20Writing%20Partitions%20Concurrently.html
     mode: str, optional
         ``append`` (Default), ``overwrite``, ``overwrite_partitions``. Only takes effect if dataset=True.
         For details check the related tutorial:
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/stubs/awswrangler.s3.to_parquet.html#awswrangler.s3.to_parquet
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/stubs/awswrangler.s3.to_parquet.html#awswrangler.s3.to_parquet
     catalog_versioning : bool
         If True and `mode="overwrite"`, creates an archived version of the table catalog before updating it.
     schema_evolution : bool
         If True allows schema evolution (new or missing columns), otherwise a exception will be raised.
         (Only considered if dataset=True and mode in ("append", "overwrite_partitions"))
         Related tutorial:
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/014%20-%20Schema%20Evolution.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/014%20-%20Schema%20Evolution.html
     database : str, optional
         Glue/Athena catalog: Database name.
     table : str, optional
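The three `mode` values documented in this docstring differ in what is discarded before the new data lands. A pure-Python sketch of those semantics, modeling a dataset as a dict of partition prefix to rows (names and data are illustrative; the real implementation operates on S3 objects and the Glue catalog):

```python
def apply_mode(existing: dict, new: dict, mode: str) -> dict:
    """Illustrative semantics of to_parquet's dataset write modes."""
    if mode == "overwrite":
        # Delete the whole dataset, keep only the newly written data.
        return dict(new)
    if mode == "overwrite_partitions":
        # Replace only the partitions being written; others are untouched.
        merged = dict(existing)
        merged.update(new)
        return merged
    # "append" (default): new rows land alongside the existing ones.
    merged = {k: list(v) for k, v in existing.items()}
    for partition, rows in new.items():
        merged.setdefault(partition, []).extend(rows)
    return merged

existing = {"year=2020": ["row_a"], "year=2021": ["row_b"]}
new = {"year=2021": ["row_c"]}
print(apply_mode(existing, new, "overwrite_partitions"))
```

This also clarifies the `schema_evolution` note above: it only applies to `append` and `overwrite_partitions`, the two modes where new data must coexist with data written earlier.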

awswrangler/s3/_write_text.py

Lines changed: 2 additions & 2 deletions
@@ -175,11 +175,11 @@ def to_csv(  # pylint: disable=too-many-arguments,too-many-locals,too-many-state
     concurrent_partitioning: bool
         If True will increase the parallelism level during the partitions writing. It will decrease the
         writing time and increase the memory usage.
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/tutorials/022%20-%20Writing%20Partitions%20Concurrently.html
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/tutorials/022%20-%20Writing%20Partitions%20Concurrently.html
     mode : str, optional
         ``append`` (Default), ``overwrite``, ``overwrite_partitions``. Only takes effect if dataset=True.
         For details check the related tutorial:
-        https://aws-data-wrangler.readthedocs.io/en/2.6.0/stubs/awswrangler.s3.to_parquet.html#awswrangler.s3.to_parquet
+        https://aws-data-wrangler.readthedocs.io/en/2.7.0/stubs/awswrangler.s3.to_parquet.html#awswrangler.s3.to_parquet
     catalog_versioning : bool
         If True and `mode="overwrite"`, creates an archived version of the table catalog before updating it.
     database : str, optional

building/build-lambda-layers.sh

Lines changed: 0 additions & 3 deletions
@@ -16,7 +16,6 @@ docker run \
   --volume "$DIR_NAME":/aws-data-wrangler/ \
   --workdir /aws-data-wrangler/building/lambda \
   --rm \
-  -it \
   awswrangler-build-py36 \
   build-lambda-layer.sh "${VERSION}-py3.6" "ninja"
 
@@ -25,7 +24,6 @@ docker run \
   --volume "$DIR_NAME":/aws-data-wrangler/ \
   --workdir /aws-data-wrangler/building/lambda \
   --rm \
-  -it \
   awswrangler-build-py37 \
   build-lambda-layer.sh "${VERSION}-py3.7" "ninja"
 
@@ -34,6 +32,5 @@ docker run \
   --volume "$DIR_NAME":/aws-data-wrangler/ \
   --workdir /aws-data-wrangler/building/lambda \
   --rm \
-  -it \
   awswrangler-build-py38 \
   build-lambda-layer.sh "${VERSION}-py3.8" "ninja-build"

0 commit comments