python: `create_namespaced_horizontal_pod_autoscaler` throws ValueError: Invalid value for `conditions`, must not be `None`

File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/apis/autoscaling_v2beta1_api.py", line 60, in create_namespaced_horizontal_pod_autoscaler                                                                                                                                                
    (data) = self.create_namespaced_horizontal_pod_autoscaler_with_http_info(namespace, body, **kwargs)                                                       
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/apis/autoscaling_v2beta1_api.py", line 151, in create_namespaced_horizontal
_pod_autoscaler_with_http_info                                                                                                                                
    collection_formats=collection_formats)                                                                                                                    
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 334, in call_api                                      
    _return_http_data_only, collection_formats, _preload_content, _request_timeout)                                                                           
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 176, in __call_api                                    
    return_data = self.deserialize(response_data, response_type)                                                                                              
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 249, in deserialize                                   
    return self.__deserialize(data, response_type)                                                                                                            
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 289, in __deserialize                                 
    return self.__deserialize_model(data, klass)                                                                                                              
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 633, in __deserialize_model                           
    kwargs[attr] = self.__deserialize(value, attr_type)                                                                                                       
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 289, in __deserialize                                 
    return self.__deserialize_model(data, klass)                                                                                                              
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/api_client.py", line 635, in __deserialize_model                           
    instance = klass(**kwargs)                                                                                                                                
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/models/v2beta1_horizontal_pod_autoscaler_status.py", line 64, in __init__  
    self.conditions = conditions                                                                                                                              
  File "/virutalenv/lib/python3.6/site-packages/kubernetes/client/models/v2beta1_horizontal_pod_autoscaler_status.py", line 95, in conditions
    raise ValueError("Invalid value for `conditions`, must not be `None`")                                                                                    
ValueError: Invalid value for `conditions`, must not be `None`

Code to reproduce:

from kubernetes import client, config

# Load credentials (assumes a local kubeconfig; use load_incluster_config()
# when running inside a cluster).
config.load_kube_config()

# scaler_name, version1, version2, scalable_object, app_name, N1, N2,
# metric_name, target_type and target_value are placeholders defined elsewhere.
k8s_api = client.AutoscalingV2beta1Api()
k8s_api.create_namespaced_horizontal_pod_autoscaler(
    namespace='default',
    body=client.V2beta2HorizontalPodAutoscaler(
        api_version='autoscaling/v2beta2',
        kind='HorizontalPodAutoscaler',
        metadata=client.V1ObjectMeta(
            name=scaler_name
        ),
        spec=client.V2beta2HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V2beta2CrossVersionObjectReference(
                api_version=version1,
                kind=scalable_object,
                name=app_name
            ),
            min_replicas=N1,
            max_replicas=N2,
            metrics=[client.V2beta2MetricSpec(
                type='Object',
                object=client.V2beta2ObjectMetricSource(
                    metric=client.V2beta2MetricIdentifier(
                        name=metric_name
                    ),
                    described_object=client.V2beta2CrossVersionObjectReference(
                        api_version=version2,
                        kind=scalable_object,
                        name=app_name
                    ),
                    target=client.V2beta2MetricTarget(
                        type=target_type,
                        value=target_value
                    )
                )
            )]
        )
    )
)

This returns the exception I posted above, even though it does create the HPA. `status` is optional in `V2beta2HorizontalPodAutoscaler`, so why does this exception occur, and what is the workaround for it?
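From the traceback, the ValueError appears to be raised on the client side while deserializing the API server's response: immediately after creation the HPA's `status.conditions` is still null, but the generated `V2beta1HorizontalPodAutoscalerStatus` model treats `conditions` as required, so its setter rejects the value. The create request itself has already succeeded, which is why the HPA still shows up. A minimal sketch of one way to sidestep the deserialization, assuming the client version in use supports the generated methods' `_preload_content` keyword and reusing the `k8s_api` object and HPA body from above:

# Sketch only: `body` stands for the V2beta2HorizontalPodAutoscaler built above.
# With _preload_content=False the client returns the raw HTTP response instead
# of a model object, so the failing `conditions` validation is never reached.
raw_response = k8s_api.create_namespaced_horizontal_pod_autoscaler(
    namespace='default',
    body=body,
    _preload_content=False,
)

The HPA is created either way; this only changes what the Python call returns.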

About this issue

  • State: closed
  • Created 4 years ago
  • Reactions: 3
  • Comments: 15 (3 by maintainers)

Most upvoted comments

Still facing the issue with the Python client kubernetes-12.0.1 on Kubernetes v1.17. Any updates?

This issue can also be worked around by catching the ValueError and checking its message:

# k8s_api, body and logger come from the surrounding code
try:
    k8s_api.create_namespaced_horizontal_pod_autoscaler(namespace='default', body=body)
except ValueError as exception:
    if str(exception) == 'Invalid value for `conditions`, must not be `None`':
        logger.info('Skipping invalid \'conditions\' value...')
    else:
        raise exception
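Since the create request has already succeeded by the time deserialization fails, swallowing this specific ValueError should be safe; the new HPA can still be verified afterwards, for example with kubectl get hpa.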

fwiw, I'm also seeing this on k8s 1.17 on EKS with a HorizontalPodAutoscaler on autoscaling/v2beta2 (deployed from a YAML file via kubernetes.utils.create_from_dict).
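For context, a minimal sketch of that create_from_dict path with the same ValueError guard applied, assuming a local kubeconfig and a hypothetical hpa.yaml holding the autoscaling/v2beta2 manifest:

import yaml
from kubernetes import client, config, utils

config.load_kube_config()
api_client = client.ApiClient()

# hpa.yaml is a placeholder for the HorizontalPodAutoscaler manifest.
with open('hpa.yaml') as f:
    manifest = yaml.safe_load(f)

try:
    utils.create_from_dict(api_client, manifest, namespace='default')
except ValueError as exc:
    # Same client-side deserialization problem as above; the HPA is
    # still created on the API server.
    if 'must not be `None`' not in str(exc):
        raise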

/remove-lifecycle stale

Getting the issue with the current client version running on k8s 1.16 on EKS. Any status on the fix?