CONTRIBUTING.md (1 addition, 1 deletion)
@@ -150,7 +150,7 @@ These instructions can only be followed by Pinecone employees with access to our
 Prerequisites:

 - You must be an employee with access to private Pinecone repositories
-- You must have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running. Our code generation script uses a dockerized version of the openapi CLI.
+- You must have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running. Our code generation script uses a dockerized version of the OpenAPI CLI.
 - You must have initialized the git submodules under codegen
pinecone/data/features/bulk_import.py (3 additions, 4 deletions)
@@ -51,7 +51,7 @@ def start_import(
 ) -> StartImportResponse:
     """Import data from a storage provider into an index. The uri must start with the scheme of a supported
     storage provider. For buckets that are not publicly readable, you will also need to separately configure
-    a storage integration and pass the integration name.
+    a storage integration and pass the integration id.

     Examples:
     >>> from pinecone import Pinecone
@@ -61,7 +61,7 @@ def start_import(

     Args:
         uri (str): The URI of the data to import. The URI must start with the scheme of a supported storage provider.
-        integration (Optional[str], optional): If your bucket requires authentication to access, you need to pass the name of your storage integration using this property. Defaults to None.
+        integration_id (Optional[str], optional): If your bucket requires authentication to access, you need to pass the id of your storage integration using this property. Defaults to None.
         error_mode: Defaults to "CONTINUE". If set to "CONTINUE", the import operation will continue even if some
             records fail to import. Pass "ABORT" to stop the import operation if any records fail to import.
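
For context, a minimal usage sketch of the renamed parameter, assuming start_import is called on an index client obtained from the standard Pinecone class (as the docstring's Examples section suggests); the index name, bucket URI, and integration id below are placeholders, not values from this change:

from pinecone import Pinecone

pc = Pinecone(api_key="YOUR_API_KEY")
index = pc.Index("example-index")  # placeholder index name

# For a bucket that is not publicly readable, pass the id of a storage
# integration configured in Pinecone (previously documented as the name).
response = index.start_import(
    uri="s3://example-bucket/imports/",        # placeholder URI; must use a supported scheme
    integration_id="example-integration-id",   # placeholder integration id
    error_mode="CONTINUE",                     # or "ABORT" to stop if any record fails
)
print(response)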