mypy_boto3_builder: mypy-boto3-appflow 1.28.12 has inconsistent typing

Describe the bug

mypy-boto3-appflow 1.28.12 introduces inconsistent typing for input/output parameters. In Airflow, our tests started to fail in “canary” builds after that release, and it seems we cannot pass the response from describe_flow to update_flow without explicitly casting the types.

This code:

response = self.conn.describe_flow(flowName=flow_name)
....

self.conn.update_flow(
    flowName=response["flowName"],
    destinationFlowConfigList=response["destinationFlowConfigList"],
    sourceFlowConfig=response["sourceFlowConfig"],
    triggerConfig=response["triggerConfig"],
    description=response.get("description", "Flow description."),
    tasks=tasks,
)

When we pass the output of appflow's describe_flow to the input of update_flow, mypy complains:

airflow/providers/amazon/aws/hooks/appflow.py:113: error: Argument
"destinationFlowConfigList" to "update_flow" of "AppflowClient" has incompatible
type "List[DestinationFlowConfigOutputTypeDef]"; expected
"Sequence[DestinationFlowConfigTypeDef]"  [arg-type]

To Reproduce

Run mypy on the first snippet above with mypy-boto3-appflow 1.28.12 installed; you will see the mypy errors shown above.

An example can also be seen here: https://github.com/apache/airflow/actions/runs/5685555908/job/15411044776?pr=32882

Actual output

airflow/providers/amazon/aws/hooks/appflow.py:113: error: Argument
"destinationFlowConfigList" to "update_flow" of "AppflowClient" has incompatible
type "List[DestinationFlowConfigOutputTypeDef]"; expected
"Sequence[DestinationFlowConfigTypeDef]"  [arg-type]
                destinationFlowConfigList=response["destinationFlowConfigL...
                                          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~...
airflow/providers/amazon/aws/hooks/appflow.py:114: error: Argument
"sourceFlowConfig" to "update_flow" of "AppflowClient" has incompatible type
"SourceFlowConfigOutputTypeDef"; expected "SourceFlowConfigTypeDef"  [arg-type]
                sourceFlowConfig=response["sourceFlowConfig"],
                                 ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
airflow/providers/amazon/aws/hooks/appflow.py:115: error: Argument
"triggerConfig" to "update_flow" of "AppflowClient" has incompatible type
"TriggerConfigOutputTypeDef"; expected "TriggerConfigTypeDef"  [arg-type]
                triggerConfig=response["triggerConfig"],
                              ^~~~~~~~~~~~~~~~~~~~~~~~~
airflow/providers/amazon/aws/hooks/appflow.py:117: error: Argument "tasks" to
"update_flow" of "AppflowClient" has incompatible type
"List[TaskOutputTypeDef]"; expected "Sequence[TaskTypeDef]"  [arg-type]
                tasks=tasks,
                      ^~~~~

Expected output

No errors.

Additional context

It was working fine in 1.28.0. We are going to exclude 1.28.12 until this gets fixed (or until we know how to adapt our code other than by explicitly casting the output types to the input ones, which seems like a bad idea).
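
A hedged sketch of what that exclusion could look like in a dependency declaration (the exact file and constraint style are assumptions, not Airflow's actual pin):

# Hypothetical dependency list; the real constraint lives wherever the
# project declares its provider dependencies.
dependencies = [
    "mypy-boto3-appflow!=1.28.12",
]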

cc: @o-nikolas

About this issue

  • State: closed
  • Created a year ago
  • Comments: 15

Most upvoted comments

Thank you for testing this post-release. I am going to leave this issue open until mypy-boto3-appflow 1.28.16 is released next Monday.

Please let me know if you find any other issues; that was a huge improvement.

Indeed. It seems the issue is back even after the new release, 1.28.15: https://github.com/apache/airflow/actions/runs/5696388972/job/15441693475#step:6:161

Unfortunately, I cannot change it on my side: the problem is that both the producer (describe_flow) and the consumer (update_flow) of the data structures come from mypy-boto3-appflow, so I cannot change the consumer to accept the union. Both methods are in the mypy-boto3-appflow realm.

For now I am going to simply limit mypy-boto3-appflow to < 1.28 and hope it can get resolved somehow, possibly in mypy if you create an issue there. Honestly, though, I think it is a design flaw to have two types that are distinct from Python's point of view and expect one to be compatible with the other. That sounds a bit strange.
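
To illustrate the point, here is a simplified, self-contained sketch (not the actual mypy-boto3-appflow definitions) of why mypy treats the two shapes as incompatible: under PEP 589, a TypedDict whose keys are all required is not consistent with one whose keys are optional, so an "output" shape cannot be passed back where an "input" shape is expected.

from typing import TypedDict


class TaskTypeDef(TypedDict, total=False):
    # input shape: every key is optional
    sourceFields: list[str]
    taskType: str


class TaskOutputTypeDef(TypedDict):
    # output shape: every key is required
    sourceFields: list[str]
    taskType: str


def update_flow(task: TaskTypeDef) -> None:
    ...


out: TaskOutputTypeDef = {"sourceFields": ["field"], "taskType": "Map"}
update_flow(out)  # mypy: incompatible type "TaskOutputTypeDef"; expected "TaskTypeDef"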

BTW, once you release the new package, Airflow will run a canary build and automatically pull and test the latest version, so we will know immediately whether it has been fixed 😃

No issues with the new version on the following snippet:

import boto3

conn = boto3.client("appflow")

response = conn.describe_flow(flowName="flow_name")

conn.update_flow(
    flowName=response["flowName"],
    destinationFlowConfigList=response["destinationFlowConfigList"],
    sourceFlowConfig=response["sourceFlowConfig"],
    triggerConfig=response["triggerConfig"],
    description=response.get("description", "Flow description."),
    tasks=[
        {
            "sourceFields": ["field_name", "field_name2"],
            "taskType": "Map",
        }
    ],
)

I am going to prepare a new release and update boto3 packages later today.

Thanks. I am travelling and had no time to check it, but yes, that was precisely the snippet that caused the problems for us, so it looks good.

Thanks for the quick reaction.