azure-sdk-for-python: BlobServiceClient ImportError: cannot import name 'ParamSpec' from 'typing_extensions'

  • Package Name: azure.storage.blob
  • Package Version: 12.10.0
  • Operating System: Azure Databricks cluster, Databricks Runtime 10.3 ML
  • Python Version: 3.8

Describe the bug An ImportError from the typing_extensions package is raised when I run from azure.storage.blob import BlobServiceClient. I run the commands from a notebook inside the Azure Databricks cluster.

To Reproduce Steps to reproduce the behavior:

  1. Start up a VM with Python 3.8 and create a new notebook.
  2. Create a new cell and run !pip3 install azure-storage-blob
  3. Create a new cell and run from azure.storage.blob import BlobServiceClient (see the sketch after these steps)
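For reference, a sketch of the two cells above as they would be run inside the Databricks notebook:

# cell 1 -- install the SDK into the notebook environment
!pip3 install azure-storage-blob

# cell 2 -- the import that raises the ImportError
from azure.storage.blob import BlobServiceClient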

Expected behavior Be able to import and use the package without any import errors from typing_extensions or any other dependent package.

Screenshots Adding some screenshots of the stack trace


Additional context I fixed the problem in one notebook by doing

%rm -r /databricks/python3/lib/python3.8/site-packages/typing_extensions-4.1.1.dist-info /databricks/python3/lib/python3.8/site-packages/typing_extensions.py

This did not work in a second notebook where I tried it.
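A less destructive workaround that may be worth trying (an assumption on my part, not something confirmed in this thread) is to install a newer, notebook-scoped typing-extensions so the ParamSpec backport is present, instead of deleting files from the runtime image:

# cell 1 -- notebook-scoped install of a typing-extensions new enough to include ParamSpec (added in 3.10.0.0)
%pip install --upgrade typing-extensions

# cell 2 -- restart the Python process so the fresh copy is resolved ahead of the preinstalled one
dbutils.library.restartPython()

If dbutils.library.restartPython() is not available on your runtime, detaching and reattaching the notebook restarts the REPL with the same effect.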

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Reactions: 1
  • Comments: 17 (9 by maintainers)

Most upvoted comments

That seems to have resolved it. I can’t call reload(typing_extensions), though, because it seems to reset some things. But it is loading the Azure Python and pip packages now… awesome! Thank you!!

@xiaoyongzhu moving the chat from the PR to this issue:

@kristapratico it seems this PR introduces an import issue in the Databricks ML runtime (see this issue: linkedin/feathr#154). It also looks like ParamSpec was only added to the standard library in Python 3.10 (https://docs.python.org/3/library/typing.html#typing.ParamSpec), so maybe we should also have a way to relax this requirement?
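A minimal sketch of the kind of relaxed import this comment is suggesting (hypothetical, not azure-core's actual fix): prefer the standard-library ParamSpec on Python 3.10+, fall back to the typing_extensions backport, and degrade gracefully when neither is available.

try:
    from typing import ParamSpec  # standard library, Python 3.10+
except ImportError:
    try:
        from typing_extensions import ParamSpec  # backport, typing-extensions >= 3.10.0.0
    except ImportError:
        ParamSpec = None  # callers would have to skip precise decorator typing when this is None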

And it can be reproduced (in Databricks ML 10.2) by running the following in a notebook cell:

!pip install azure-core==1.23.1
from azure.core.tracing.decorator import distributed_trace

which throws this error:


---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<command-3599966980702568> in <module>
----> 1 from azure.core.tracing.decorator import distributed_trace

/databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py in import_patch(name, globals, locals, fromlist, level)
    160             # Import the desired module. If you’re seeing this while debugging a failed import,
    161             # look at preceding stack frames for relevant error information.
--> 162             original_result = python_builtin_import(name, globals, locals, fromlist, level)
    163 
    164             is_root_import = thread_local._nest_level == 1

/databricks/python/lib/python3.8/site-packages/azure/core/tracing/decorator.py in <module>
     29 
     30 from typing import Callable, Any, TypeVar, overload
---> 31 from typing_extensions import ParamSpec
     32 from .common import change_context, get_function_and_class_name
     33 from ..settings import settings

ImportError: cannot import name 'ParamSpec' from 'typing_extensions' (/databricks/python/lib/python3.8/site-packages/typing_extensions.py)

@xiaoyongzhu thanks for bringing this to my attention; since it is still an issue I’m going to re-open this. Can you provide more details on how you set up your Databricks environment? I had previously tried to reproduce this error in a Databricks notebook with no luck (see above). Are you running this in a notebook? Also, which version of typing-extensions is getting installed?
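For anyone debugging this, a quick diagnostic along these lines (a sketch, not something posted in the thread) answers both questions: which typing_extensions copy the interpreter actually resolves, and whether that copy already carries the ParamSpec backport.

import typing_extensions

print(typing_extensions.__file__)               # path of the copy Python is actually importing
print(hasattr(typing_extensions, "ParamSpec"))  # False means the copy predates typing-extensions 3.10.0.0

!pip show typing-extensions                     # reports the installed distribution version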

typing.ParamSpec was added in Python 3.10, but typing-extensions should backport it to as early as 3.6 (looks like you’re using 3.8). While this import works outside of Databricks, I’ll need to investigate why it’s a problem in the Databricks runtime and follow up with the service team.

@kristapratico thanks for the response, I found a workaround in the meantime. I will go ahead and file a support issue with Databricks!