Commit cdd5e07

Update defaults in docstring (#934)

* ci: let `aiobotocore` handle `botocore` install

1 parent 51e3c80 commit cdd5e07

File tree

2 files changed: +3 -4 lines changed

.github/workflows/ci.yml (-1)

@@ -39,7 +39,6 @@ jobs:
       run: |
         pip install git+https://github.com/fsspec/filesystem_spec
         pip install --upgrade "aiobotocore${{ matrix.aiobotocore-version }}"
-        pip install --upgrade "botocore" --no-deps
         pip install . --no-deps
         pip list

s3fs/core.py (+3 -3)

@@ -213,7 +213,7 @@ class S3FileSystem(AsyncFileSystem):
         If RequesterPays buckets are supported.
     default_block_size: int (None)
         If given, the default block size value used for ``open()``, if no
-        specific value is given at all time. The built-in default is 5MB.
+        specific value is given at all time. The built-in default is 50MB.
     default_fill_cache : Bool (True)
         Whether to use cache filling with open by default. Refer to
         ``S3File.open``.
@@ -241,9 +241,9 @@ class S3FileSystem(AsyncFileSystem):
     session : aiobotocore AioSession object to be used for all connections.
         This session will be used inplace of creating a new session inside S3FileSystem.
         For example: aiobotocore.session.AioSession(profile='test_user')
-    max_concurrency : int (1)
+    max_concurrency : int (10)
         The maximum number of concurrent transfers to use per file for multipart
-        upload (``put()``) operations. Defaults to 1 (sequential). When used in
+        upload (``put()``) operations. Defaults to 10. When used in
         conjunction with ``S3FileSystem.put(batch_size=...)`` the maximum number of
         simultaneous connections is ``max_concurrency * batch_size``. We may extend
         this parameter to affect ``pipe()``, ``cat()`` and ``get()``. Increasing this
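The connection arithmetic described in the docstring above (maximum simultaneous connections is ``max_concurrency * batch_size``) can be sketched as a small standalone helper; the ``batch_size`` value of 16 below is a hypothetical illustration, not a documented default:

```python
def max_simultaneous_connections(max_concurrency: int, batch_size: int) -> int:
    """Upper bound on simultaneous connections during S3FileSystem.put(),
    per the docstring: up to max_concurrency multipart transfers per file,
    across batch_size files in flight at once."""
    return max_concurrency * batch_size


# With the new default max_concurrency=10 and a hypothetical batch_size of 16:
print(max_simultaneous_connections(10, 16))  # → 160
```

This is why the commit's change from a default of 1 to 10 matters: with batching enabled, the effective connection count scales multiplicatively, not additively.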
