Access Azure Datalake Gen1 with fsspec and dask
Filesystem interface to Azure-Datalake Gen1 and Gen2 Storage
To use the Gen1 filesystem:

import dask.dataframe as dd

storage_options = {'tenant_id': TENANT_ID, 'client_id': CLIENT_ID, 'client_secret': CLIENT_SECRET}

dd.read_csv('adl://{STORE_NAME}/{FOLDER}/*.csv', storage_options=storage_options)
To use the Gen2 filesystem, use the abfs or az protocol:

import dask.dataframe as dd

storage_options = {'account_name': ACCOUNT_NAME, 'account_key': ACCOUNT_KEY}

ddf = dd.read_csv('abfs://{CONTAINER}/{FOLDER}/*.csv', storage_options=storage_options)
ddf = dd.read_parquet('az://{CONTAINER}/folder.parquet', storage_options=storage_options)
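In both URL schemes above, the first path segment names the store (Gen1) or container (Gen2), while the account and credentials travel separately in storage_options. A minimal stdlib-only sketch of that URL layout (the helper name and sample values are hypothetical, not part of adlfs):

```python
# Sketch of adlfs-style URL layout: protocol, then store/container,
# then path parts. Credentials are NOT embedded in the URL; they are
# passed via storage_options. Helper name and values are hypothetical.
def make_url(protocol, container, *parts):
    """Join a protocol, container/store name, and path parts into an fsspec URL."""
    return f"{protocol}://" + "/".join([container, *parts])

print(make_url("abfs", "mycontainer", "folder", "*.csv"))
# abfs://mycontainer/folder/*.csv
print(make_url("adl", "mystore", "folder", "*.csv"))
# adl://mystore/folder/*.csv
```

The same string would be accepted by dask's read_csv/read_parquet, with the matching storage_options dict supplying the account name and key (Gen2) or tenant/client credentials (Gen1).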
- RPM
- python3-adlfs-2025.8.0-2.lbn42.noarch.rpm
- Summary
- Access Azure Datalake Gen1 with fsspec and dask
- URL
- https://pypi.org/project/adlfs
- Group
- Unspecified
- License
- ZPL
- Source
-
python-adlfs-2025.8.0-2.lbn42.src.rpm
- Checksum
- 327ddff54e7eda74fa5166811d54d1e6e27b241015f34ae84b1a3907395f76a2
- Build Date
- 2026/01/26 12:29:06
- Requires
-
python3.13dist(azure-core) >= 1.28
python3.13dist(azure-datalake-store) >= 0.0.53
python3.13dist(aiohttp) >= 3.7
python3.13dist(azure-storage-blob) >= 12.17
python3.13dist(fsspec) >= 2023.12
- Provides
-
python-adlfs = 2025.8.0-2.lbn42
python3-adlfs = 2025.8.0-2.lbn42
python3.13-adlfs = 2025.8.0-2.lbn42
python3.13dist(adlfs) = 2025.8
python3dist(adlfs) = 2025.8
- Obsoletes