# istio: ECDS-based Wasm filter configuration causes proxy segmentation fault
## Bug Description

The istio-ingressgateway proxy crashed with a segmentation fault while I was testing a Wasm extension. Note that the crash is intermittent; I cannot reproduce it reliably with the same steps.
Test steps:

```shell
# Delete the Wasm EnvoyFilter
kubectl delete -f ./ev.yaml

# Send a request to istio-ingressgateway
curl http://x.x.x.x/test

# Reapply the Wasm EnvoyFilter
kubectl apply -f ./ev.yaml

# Send another request: the connection is closed and istio-ingressgateway restarts
curl http://x.x.x.x/test
```
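Since the crash is intermittent, looping the same delete/apply/probe sequence may help trigger it. This is only a sketch: the gateway address `x.x.x.x` and `./ev.yaml` are the placeholders from the steps above, and the loop count is arbitrary.

```shell
# Repro loop sketch: repeatedly remove and reapply the EnvoyFilter, probing
# the gateway after each change. Placeholders: x.x.x.x (gateway IP), ./ev.yaml.
for i in $(seq 1 50); do
  kubectl delete -f ./ev.yaml
  curl -s -o /dev/null http://x.x.x.x/test
  kubectl apply -f ./ev.yaml
  # When the proxy segfaults, curl reports the connection was reset/closed.
  curl -s -o /dev/null http://x.x.x.x/test || echo "request $i failed (possible crash)"
  # A crash shows up as an increased RESTARTS count on the gateway pods.
  kubectl -n istio-system get pods -l app=istio-ingressgateway
done
```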
ev.yaml:

```yaml
apiVersion: networking.istio.io/v1alpha3
kind: EnvoyFilter
metadata:
  name: ingressgateway-real-ip
  namespace: istio-system
spec:
  workloadSelector:
    labels:
      app: istio-ingressgateway
  configPatches:
  - applyTo: EXTENSION_CONFIG
    match:
      context: GATEWAY
    patch:
      operation: ADD
      value:
        name: realip
        typed_config:
          '@type': type.googleapis.com/udpa.type.v1.TypedStruct
          type_url: type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
          value:
            config:
              configuration:
                '@type': type.googleapis.com/google.protobuf.StringValue
                value: |
                  10.0.0.0/8,172.16.0.0/12,192.168.0.0/16
              vm_config:
                vm_id: realip
                code:
                  local:
                    filename: /wasm-filters/realip.wasm
                runtime: envoy.wasm.runtime.v8
  - applyTo: HTTP_FILTER
    match:
      context: GATEWAY
      listener:
        filterChain:
          filter:
            name: envoy.http_connection_manager
    patch:
      operation: INSERT_BEFORE
      value:
        name: realip
        config_discovery:
          config_source:
            ads: {}
            initial_fetch_timeout: 0s
          type_urls: [ "type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm" ]
```
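For context, the `config_discovery` stanza above is Envoy's `ExtensionConfigSource` (ECDS). That message also supports a `default_config` fallback and a nonzero `initial_fetch_timeout`, so the filter is not stuck waiting indefinitely on a subscription update. Whether either field avoids this crash is an assumption; the fragment below is only a sketch of the fields, not a verified fix.

```yaml
# Sketch only: config_discovery with a bounded fetch timeout instead of 0s
# (0s disables the timeout, i.e. wait forever). Field names are from the
# Envoy v3 ExtensionConfigSource API; the effect on this crash is untested.
config_discovery:
  config_source:
    ads: {}
    initial_fetch_timeout: 30s
  type_urls: [ "type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm" ]
```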
Last logs from the istio-ingressgateway pod:

```
2021-09-18T04:05:00.699261Z critical envoy backtrace Caught Segmentation fault, suspect faulting address 0x0
2021-09-18T04:05:00.699294Z critical envoy backtrace Backtrace (use tools/stack_decode.py to get line numbers):
2021-09-18T04:05:00.699302Z critical envoy backtrace Envoy version: 6a2597296110a21043af384282539739d996c0ad/1.18.4-dev/Clean/RELEASE/BoringSSL
2021-09-18T04:05:00.699582Z critical envoy backtrace #0: __restore_rt [0x7f9e73626980]
2021-09-18T04:05:00.709743Z critical envoy backtrace #1: Envoy::Filter::Http::FilterConfigSubscription::onConfigUpdate() [0x565162db0045]
2021-09-18T04:05:00.718959Z critical envoy backtrace #2: Envoy::Config::GrpcSubscriptionImpl::onConfigUpdate() [0x565162e36579]
2021-09-18T04:05:00.727856Z critical envoy backtrace #3: Envoy::Config::GrpcMuxImpl::onDiscoveryResponse() [0x565162e3bef4]
2021-09-18T04:05:00.736794Z critical envoy backtrace #4: Envoy::Grpc::AsyncStreamCallbacks<>::onReceiveMessageRaw() [0x565162b126a1]
2021-09-18T04:05:00.747694Z critical envoy backtrace #5: Envoy::Grpc::AsyncStreamImpl::onData() [0x565162e506d7]
2021-09-18T04:05:00.757142Z critical envoy backtrace #6: Envoy::Http::AsyncStreamImpl::encodeData() [0x565162e55ce2]
2021-09-18T04:05:00.767485Z critical envoy backtrace #7: Envoy::Router::UpstreamRequest::decodeData() [0x565162eccef1]
2021-09-18T04:05:00.780015Z critical envoy backtrace #8: Envoy::Http::ResponseDecoderWrapper::decodeData() [0x565162c9bf5b]
2021-09-18T04:05:00.791974Z critical envoy backtrace #9: Envoy::Http::Http2::ConnectionImpl::onFrameReceived() [0x565162e07bbe]
2021-09-18T04:05:00.803042Z critical envoy backtrace #10: Envoy::Http::Http2::ConnectionImpl::Http2Callbacks::Http2Callbacks()::$_17::__invoke() [0x565162e0fdc0]
2021-09-18T04:05:00.815230Z critical envoy backtrace #11: nghttp2_session_on_data_received [0x5651631472dc]
2021-09-18T04:05:00.826139Z critical envoy backtrace #12: nghttp2_session_mem_recv [0x565163149256]
2021-09-18T04:05:00.836016Z critical envoy backtrace #13: Envoy::Http::Http2::ConnectionImpl::dispatch() [0x565162e064dd]
2021-09-18T04:05:00.844793Z critical envoy backtrace #14: Envoy::Http::Http2::ConnectionImpl::dispatch() [0x565162e071a5]
2021-09-18T04:05:00.854912Z critical envoy backtrace #15: Envoy::Http::CodecClient::onData() [0x565162cfc540]
2021-09-18T04:05:00.863853Z critical envoy backtrace #16: Envoy::Http::CodecClient::CodecReadFilter::onData() [0x565162cfd6a5]
2021-09-18T04:05:00.872932Z critical envoy backtrace #17: Envoy::Network::FilterManagerImpl::onContinueReading() [0x565162f8ac8f]
2021-09-18T04:05:00.881297Z critical envoy backtrace #18: Envoy::Network::ConnectionImpl::onReadReady() [0x565162f81539]
2021-09-18T04:05:00.890250Z critical envoy backtrace #19: Envoy::Network::ConnectionImpl::onFileEvent() [0x565162f7f0ef]
2021-09-18T04:05:00.898716Z critical envoy backtrace #20: std::__1::__function::__func<>::operator()() [0x565162ad3ec1]
2021-09-18T04:05:00.908190Z critical envoy backtrace #21: Envoy::Event::FileEventImpl::assignEvents()::$_1::__invoke() [0x565162ad562c]
2021-09-18T04:05:00.918072Z critical envoy backtrace #22: event_process_active_single_queue [0x56516315e9d8]
2021-09-18T04:05:00.926834Z critical envoy backtrace #23: event_base_loop [0x56516315d3ae]
2021-09-18T04:05:00.935790Z critical envoy backtrace #24: Envoy::Server::InstanceImpl::run() [0x565162abaefe]
2021-09-18T04:05:00.944124Z critical envoy backtrace #25: Envoy::MainCommonBase::run() [0x565161338644]
2021-09-18T04:05:00.952473Z critical envoy backtrace #26: Envoy::MainCommon::main() [0x565161338e36]
2021-09-18T04:05:00.960819Z critical envoy backtrace #27: main [0x565161336e7c]
2021-09-18T04:05:00.960988Z critical envoy backtrace #28: __libc_start_main [0x7f9e73244bf7]
AsyncClient 0x5651650dbe00, stream_id_: 1449385254900746208
&stream_info_:
StreamInfoImpl 0x5651650dbfc0, protocol_: 1, response_code_: null, response_code_details_: null, health_check_request_: 0, route_name_:
Http2::ConnectionImpl 0x5651650e6390, max_headers_kb_: 60, max_headers_count_: 100, per_stream_buffer_limit_: 268435456, allow_metadata_: 0, stream_error_on_invalid_http_messaging_: 0, is_outbound_flood_monitored_control_frame_: 0, skip_encoding_empty_trailers_: 1, dispatching_: 1, raised_goaway_: 0, pending_deferred_reset_streams_.size(): 0
&protocol_constraints_:
ProtocolConstraints 0x5651650e63f0, outbound_frames_: 0, max_outbound_frames_: 10000, outbound_control_frames_: 0, max_outbound_control_frames_: 1000, consecutive_inbound_frames_with_empty_payload_: 0, max_consecutive_inbound_frames_with_empty_payload_: 1, opened_streams_: 1, inbound_priority_frames_: 0, max_inbound_priority_frames_per_stream_: 100, inbound_window_update_frames_: 14, outbound_data_frames_: 14, max_inbound_window_update_frames_per_data_frame_sent_: 10
Number of active streams: 1, current_stream_id_: 1 Dumping current stream:
stream:
ConnectionImpl::StreamImpl 0x5651650c7000, stream_id_: 1, unconsumed_bytes_: 0, read_disable_count_: 0, local_end_stream_sent_: 0, remote_end_stream_: 0, data_deferred_: 0, received_noninformational_headers_: 1, pending_receive_buffer_high_watermark_called_: 0, pending_send_buffer_high_watermark_called_: 0, reset_due_to_messaging_error_: 0, cookies_: pending_trailers_to_encode_: null
absl::get<ResponseHeaderMapPtr>(headers_or_trailers_): null
Dumping corresponding downstream request for upstream stream 1:
UpstreamRequest 0x5651650ec000
2021-09-18T04:05:00.965294Z error Epoch 0 exited with error: signal: segmentation fault
2021-09-18T04:05:00.965317Z info No more active epochs, terminating
```
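The backtrace frames above are raw addresses. The log itself suggests Envoy's `tools/stack_decode.py`; alternatively, they can be symbolized with binutils' `addr2line` against the exact Envoy binary that produced the crash. This is only a sketch: the binary path inside the proxy container is a guess, and for a position-independent build the logged runtime addresses may need the load base subtracted first.

```shell
# Symbolize one backtrace frame. Requires the exact Envoy binary from the
# crashing image (same build); the in-container path below is an assumption.
kubectl -n istio-system cp <gateway-pod>:/usr/local/bin/envoy ./envoy
addr2line -e ./envoy -f -C 0x565162db0045  # frame #1: FilterConfigSubscription::onConfigUpdate
```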
## Version

- Control plane version: 1.10.4
- Data plane version: 1.10.4
- Kubernetes version: 1.16
## Additional Information
No response
## Affected product area
- Docs
- Installation
- Networking
- Performance and Scalability
- Extensions and Telemetry
- Security
- Test and Release
- User Experience
- Developer Infrastructure
- Upgrade
- Multi Cluster
- Virtual Machine
- Control Plane Revisions
## Is this the right place to submit this?
- This is not a security vulnerability
- This is not a question about how to use Istio
## About this issue
- Original URL
- State: closed
- Created 3 years ago
- Comments: 15 (13 by maintainers)
Excerpted comments:

> Still present on Istio 1.13.7. Envoy version: `e0c6f64173cd0db9370e7ad8b1fbcee370a9f0f3/1.21.5/Clean/RELEASE/BoringSSL`

> When I update the Wasm filter configuration, all Istio gateway pods crash.