security: [BUG] java.io.OptionalDataException error in OpenSearch 2.0 rc

Describe the bug I am collecting data into OpenSearch with Logstash. OpenSearch throws a java.io.OptionalDataException error during normal operation.

This did not happen in our OpenSearch 1.3 environment.

The errors occur irregularly, and sometimes disappear after a certain period of time has passed.

Data is collected into different indexes depending on the data type, and some index collections keep working while an error occurs. The failures are not limited to a specific index; a variety of indexes are affected.

Host/Environment (please complete the following information):

  • OS: CentOS Linux release 7.9.2009
  • Logstash: 7.16.2
  • Version: OpenSearch runs in Docker; the images used are opensearchproject/opensearch:1.3.0 and opensearchproject/opensearch:2.0.0-rc1

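For reference, a comment below notes this pipeline uses Logstash with the OpenSearch output plugin; a minimal output block for such a setup might look like the following sketch (hosts, credentials, and index name are illustrative placeholders, not values from the report):

```conf
output {
  opensearch {
    hosts    => ["https://localhost:9200"]
    index    => "indexname-%{+YYYY.MM.dd}"
    user     => "admin"
    password => "admin"
    ssl      => true
    ssl_certificate_verification => false
  }
}
```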
Additional context [OpenSearch Log]

2022-05-18T23:44:42.965700981Z [2022-05-18T23:44:42,965][INFO ][o.o.j.s.JobSweeper       ] [91616f4bd951] Running full sweep
2022-05-18T23:49:42.966011507Z [2022-05-18T23:49:42,965][INFO ][o.o.j.s.JobSweeper       ] [91616f4bd951] Running full sweep
2022-05-18T23:49:43.072676353Z [2022-05-18T23:49:43,072][INFO ][o.o.a.t.CronTransportAction] [91616f4bd951] Start running AD hourly cron.
2022-05-18T23:49:43.072820561Z [2022-05-18T23:49:43,072][INFO ][o.o.a.t.ADTaskManager    ] [91616f4bd951] Start to maintain running historical tasks
2022-05-18T23:49:43.073109887Z [2022-05-18T23:49:43,073][INFO ][o.o.a.c.HourlyCron       ] [91616f4bd951] Hourly maintenance succeeds
2022-05-18T23:50:30.021645243Z [2022-05-18T23:50:30,012][WARN ][r.suppressed             ] [91616f4bd951] path: /indexname-*/_search, params: {index=indexname-*}
2022-05-18T23:50:30.021674167Z org.opensearch.action.search.SearchPhaseExecutionException: all shards failed
2022-05-18T23:50:30.021678359Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.onPhaseFailure(AbstractSearchAsyncAction.java:642) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021681538Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.executeNextPhase(AbstractSearchAsyncAction.java:360) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021684355Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.onPhaseDone(AbstractSearchAsyncAction.java:677) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021687051Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.onShardFailure(AbstractSearchAsyncAction.java:457) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021689946Z 	at org.opensearch.action.search.AbstractSearchAsyncAction$1.onFailure(AbstractSearchAsyncAction.java:291) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021692560Z 	at org.opensearch.action.ActionListenerResponseHandler.handleException(ActionListenerResponseHandler.java:72) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021695600Z 	at org.opensearch.transport.TransportService$6.handleException(TransportService.java:735) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021707976Z 	at org.opensearch.security.transport.SecurityInterceptor$RestoringTransportResponseHandler.handleException(SecurityInterceptor.java:318) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021711627Z 	at org.opensearch.transport.TransportService$ContextRestoreResponseHandler.handleException(TransportService.java:1350) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021714318Z 	at org.opensearch.transport.TransportService$DirectResponseChannel.processException(TransportService.java:1459) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021717064Z 	at org.opensearch.transport.TransportService$DirectResponseChannel.sendResponse(TransportService.java:1433) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021719769Z 	at org.opensearch.transport.TransportService.sendLocalRequest(TransportService.java:967) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021723529Z 	at org.opensearch.transport.TransportService$3.sendRequest(TransportService.java:147) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021726036Z 	at org.opensearch.transport.TransportService.sendRequestInternal(TransportService.java:869) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021728354Z 	at org.opensearch.security.transport.SecurityInterceptor.sendRequestDecorate(SecurityInterceptor.java:212) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021731148Z 	at org.opensearch.security.OpenSearchSecurityPlugin$7$2.sendRequest(OpenSearchSecurityPlugin.java:665) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021733902Z 	at org.opensearch.transport.TransportService.sendRequest(TransportService.java:756) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021736648Z 	at org.opensearch.transport.TransportService.sendChildRequest(TransportService.java:831) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021739309Z 	at org.opensearch.action.search.SearchTransportService.sendCanMatch(SearchTransportService.java:149) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021742052Z 	at org.opensearch.action.search.CanMatchPreFilterSearchPhase.executePhaseOnShard(CanMatchPreFilterSearchPhase.java:128) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021744771Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.lambda$performPhaseOnShard$3(AbstractSearchAsyncAction.java:278) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021747524Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.performPhaseOnShard(AbstractSearchAsyncAction.java:312) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021750314Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.run(AbstractSearchAsyncAction.java:249) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021753108Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.executePhase(AbstractSearchAsyncAction.java:415) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021755926Z 	at org.opensearch.action.search.AbstractSearchAsyncAction.start(AbstractSearchAsyncAction.java:215) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021758723Z 	at org.opensearch.action.search.TransportSearchAction.executeSearch(TransportSearchAction.java:994) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021764118Z 	at org.opensearch.action.search.TransportSearchAction.executeLocalSearch(TransportSearchAction.java:757) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021767219Z 	at org.opensearch.action.search.TransportSearchAction.lambda$executeRequest$3(TransportSearchAction.java:398) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021770070Z 	at org.opensearch.action.ActionListener$1.onResponse(ActionListener.java:78) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021772833Z 	at org.opensearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:136) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021775554Z 	at org.opensearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:101) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021778319Z 	at org.opensearch.action.search.TransportSearchAction.executeRequest(TransportSearchAction.java:487) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021781263Z 	at org.opensearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:277) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021784064Z 	at org.opensearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:120) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021786868Z 	at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:194) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021789576Z 	at org.opensearch.indexmanagement.rollup.actionfilter.FieldCapsFilter.apply(FieldCapsFilter.kt:118) [opensearch-index-management-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021792191Z 	at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:192) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021794631Z 	at org.opensearch.performanceanalyzer.action.PerformanceAnalyzerActionFilter.apply(PerformanceAnalyzerActionFilter.java:78) [opensearch-performance-analyzer-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021797236Z 	at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:192) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021800054Z 	at org.opensearch.security.filter.SecurityFilter.apply0(SecurityFilter.java:325) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021802755Z 	at org.opensearch.security.filter.SecurityFilter.apply(SecurityFilter.java:157) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021806267Z 	at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:192) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021809178Z 	at org.opensearch.action.support.TransportAction.execute(TransportAction.java:169) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021811919Z 	at org.opensearch.action.support.TransportAction.execute(TransportAction.java:97) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021814537Z 	at org.opensearch.client.node.NodeClient.executeLocally(NodeClient.java:108) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021819342Z 	at org.opensearch.rest.action.RestCancellableNodeClient.doExecute(RestCancellableNodeClient.java:104) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021822194Z 	at org.opensearch.client.support.AbstractClient.execute(AbstractClient.java:425) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021824698Z 	at org.opensearch.rest.action.search.RestSearchAction.lambda$prepareRequest$2(RestSearchAction.java:130) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021827375Z 	at org.opensearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:123) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021830341Z 	at org.opensearch.security.filter.SecurityRestFilter$1.handleRequest(SecurityRestFilter.java:128) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021832662Z 	at org.opensearch.rest.RestController.dispatchRequest(RestController.java:306) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021835200Z 	at org.opensearch.rest.RestController.tryAllHandlers(RestController.java:392) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021837529Z 	at org.opensearch.rest.RestController.dispatchRequest(RestController.java:235) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021840258Z 	at org.opensearch.security.ssl.http.netty.ValidatingDispatcher.dispatchRequest(ValidatingDispatcher.java:63) [opensearch-security-2.0.0.0-rc1.jar:2.0.0.0-rc1]
2022-05-18T23:50:30.021843001Z 	at org.opensearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:361) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021845584Z 	at org.opensearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:440) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021848392Z 	at org.opensearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:351) [opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021851677Z 	at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:55) [transport-netty4-client-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021854379Z 	at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:41) [transport-netty4-client-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021857188Z 	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021859832Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021862547Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021865145Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021868012Z 	at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:71) [transport-netty4-client-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.021873155Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021876009Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021878647Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021881288Z 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021884253Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021887006Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021890227Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021893150Z 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021895772Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021898247Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021900791Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021903611Z 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021906357Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021909121Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021911671Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021914332Z 	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021917065Z 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021919797Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021924273Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021927154Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021930364Z 	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021933214Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021935825Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021938494Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021941185Z 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021943772Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021946507Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021949096Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021951894Z 	at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1371) [netty-handler-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021954600Z 	at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1234) [netty-handler-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021957283Z 	at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1283) [netty-handler-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021959941Z 	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:510) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021962822Z 	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:449) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021965687Z 	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) [netty-codec-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021968924Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021971771Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021974631Z 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021979112Z 	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021982181Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021984750Z 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021987493Z 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021990170Z 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021992886Z 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021995570Z 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:623) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.021998068Z 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:586) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.022005132Z 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) [netty-transport-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.022007714Z 	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) [netty-common-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.022010101Z 	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.73.Final.jar:4.1.73.Final]
2022-05-18T23:50:30.022013231Z 	at java.lang.Thread.run(Thread.java:833) [?:?]
2022-05-18T23:50:30.022015580Z Caused by: org.opensearch.OpenSearchException: java.io.OptionalDataException
2022-05-18T23:50:30.022020352Z 	at org.opensearch.security.support.Base64Helper.deserializeObject(Base64Helper.java:185) ~[?:?]
2022-05-18T23:50:30.022023058Z 	at org.opensearch.security.transport.SecurityRequestHandler.messageReceivedDecorate(SecurityRequestHandler.java:155) ~[?:?]
2022-05-18T23:50:30.022025907Z 	at org.opensearch.security.ssl.transport.SecuritySSLRequestHandler.messageReceived(SecuritySSLRequestHandler.java:97) ~[?:?]
2022-05-18T23:50:30.022028556Z 	at org.opensearch.security.OpenSearchSecurityPlugin$7$1.messageReceived(OpenSearchSecurityPlugin.java:651) ~[?:?]
2022-05-18T23:50:30.022031217Z 	at org.opensearch.indexmanagement.rollup.interceptor.RollupInterceptor$interceptHandler$1.messageReceived(RollupInterceptor.kt:118) ~[?:?]
2022-05-18T23:50:30.022033939Z 	at org.opensearch.performanceanalyzer.transport.PerformanceAnalyzerTransportRequestHandler.messageReceived(PerformanceAnalyzerTransportRequestHandler.java:43) ~[?:?]
2022-05-18T23:50:30.022036620Z 	at org.opensearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:98) ~[opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.022042345Z 	at org.opensearch.transport.TransportService.sendLocalRequest(TransportService.java:931) ~[opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.022045459Z 	... 101 more
2022-05-18T23:50:30.022047895Z Caused by: java.io.OptionalDataException
2022-05-18T23:50:30.022050506Z 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1767) ~[?:?]
2022-05-18T23:50:30.022053172Z 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:514) ~[?:?]
2022-05-18T23:50:30.022055439Z 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:472) ~[?:?]
2022-05-18T23:50:30.022057905Z 	at java.util.HashSet.readObject(HashSet.java:345) ~[?:?]
2022-05-18T23:50:30.022060408Z 	at jdk.internal.reflect.GeneratedMethodAccessor36.invoke(Unknown Source) ~[?:?]
2022-05-18T23:50:30.022062944Z 	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
2022-05-18T23:50:30.022065516Z 	at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
2022-05-18T23:50:30.022068083Z 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1231) ~[?:?]
2022-05-18T23:50:30.022070808Z 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2434) ~[?:?]
2022-05-18T23:50:30.022073271Z 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2268) ~[?:?]
2022-05-18T23:50:30.022075787Z 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1744) ~[?:?]
2022-05-18T23:50:30.022078237Z 	at java.io.ObjectInputStream$FieldValues.<init>(ObjectInputStream.java:2617) ~[?:?]
2022-05-18T23:50:30.022081688Z 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2468) ~[?:?]
2022-05-18T23:50:30.022087904Z 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2268) ~[?:?]
2022-05-18T23:50:30.022090864Z 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1744) ~[?:?]
2022-05-18T23:50:30.022093895Z 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:514) ~[?:?]
2022-05-18T23:50:30.022096278Z 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:472) ~[?:?]
2022-05-18T23:50:30.022098911Z 	at org.opensearch.security.support.Base64Helper.deserializeObject(Base64Helper.java:183) ~[?:?]
2022-05-18T23:50:30.022101744Z 	at org.opensearch.security.transport.SecurityRequestHandler.messageReceivedDecorate(SecurityRequestHandler.java:155) ~[?:?]
2022-05-18T23:50:30.022104165Z 	at org.opensearch.security.ssl.transport.SecuritySSLRequestHandler.messageReceived(SecuritySSLRequestHandler.java:97) ~[?:?]
2022-05-18T23:50:30.022106691Z 	at org.opensearch.security.OpenSearchSecurityPlugin$7$1.messageReceived(OpenSearchSecurityPlugin.java:651) ~[?:?]
2022-05-18T23:50:30.022109481Z 	at org.opensearch.indexmanagement.rollup.interceptor.RollupInterceptor$interceptHandler$1.messageReceived(RollupInterceptor.kt:118) ~[?:?]
2022-05-18T23:50:30.022112185Z 	at org.opensearch.performanceanalyzer.transport.PerformanceAnalyzerTransportRequestHandler.messageReceived(PerformanceAnalyzerTransportRequestHandler.java:43) ~[?:?]
2022-05-18T23:50:30.022117725Z 	at org.opensearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:98) ~[opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.022119699Z 	at org.opensearch.transport.TransportService.sendLocalRequest(TransportService.java:931) ~[opensearch-2.0.0-rc1.jar:2.0.0-rc1]
2022-05-18T23:50:30.022121470Z 	... 101 more
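For context, java.io.OptionalDataException is thrown by ObjectInputStream.readObject() when it encounters unconsumed primitive data in the stream where it expects an object; the Base64Helper.deserializeObject frames above fail this way when the bytes being deserialized no longer match what was serialized. A minimal, self-contained illustration of the mechanism (not OpenSearch code; the class and method names are made up for the demo):

```java
import java.io.*;

public class OptionalDataDemo {
    // A writer/reader mismatch: writeObject emits extra primitive data
    // that the custom readObject then tries to read back as an object.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        int value = 7;

        private void writeObject(ObjectOutputStream out) throws IOException {
            out.defaultWriteObject();
            out.writeInt(42); // extra primitive data the reader does not expect
        }

        private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
            in.defaultReadObject();
            in.readObject(); // expects an object, finds primitive data -> OptionalDataException
        }
    }

    /** Round-trips a Payload and returns the name of the exception thrown, if any. */
    static String roundTrip() {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(new Payload());
            }
            try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                ois.readObject();
            }
            return "no exception";
        } catch (Exception e) {
            return e.getClass().getName();
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip()); // prints java.io.OptionalDataException
    }
}
```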

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Reactions: 8
  • Comments: 51 (5 by maintainers)

Most upvoted comments

Faced the same issue after upgrading from 1.3.

So, I’ve found how to reproduce this bug locally.

I used a VM (4 CPU, HDD); the cluster was created from the official docker-compose file.

  1. Create the cluster:
curl https://opensearch.org/samples/docker-compose.yml -O
docker-compose up
  2. Run the fill.sh script to send sample data to the index:
#!/bin/sh
while true
  do
    curl -XPOST -k -u admin:admin https://localhost:9200/test-index/_doc \
      -H "Content-Type: application/json" -d @test_data.json
  done
  3. Run an infinite loop against the purge-cache URL:
while true; do curl -k -u admin:admin -X DELETE https://localhost:9200/_plugins/_security/api/cache; done
  4. Run an infinite loop against the _nodes stats endpoint:
while true; do curl -s -k -u admin:admin https://localhost:9200/_nodes -o /dev/null; done

java.io.OptionalDataException will appear within minutes (on my env 😃).

The sh script and JSON payload can be found in my repo: https://github.com/denisvll/os_test

docker ps
CONTAINER ID   IMAGE                                            COMMAND                  CREATED         STATUS         PORTS                                                                                                      NAMES
f9f8f58d95bb   opensearchproject/opensearch:latest              "./opensearch-docker…"   3 minutes ago   Up 7 seconds   0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp, 0.0.0.0:9600->9600/tcp, :::9600->9600/tcp, 9650/tcp   opensearch-node1
5fd8e3c80116   opensearchproject/opensearch-dashboards:latest   "./opensearch-dashbo…"   3 minutes ago   Up 7 seconds   0.0.0.0:5601->5601/tcp, :::5601->5601/tcp                                                                  opensearch-dashboards
523714cde06c   opensearchproject/opensearch:latest              "./opensearch-docker…"   3 minutes ago   Up 7 seconds   9200/tcp, 9300/tcp, 9600/tcp, 9650/tcp                                                                     opensearch-node2
docker images
REPOSITORY                                TAG       IMAGE ID       CREATED       SIZE
opensearchproject/opensearch              latest    9a3e759bec77   2 weeks ago   854MB
opensearchproject/opensearch-dashboards   latest    7da32241e476   2 weeks ago   998MB

Right now everyone who is using OpenSearch 2.x (we are running OpenSearch 2.0.1) with the Security Plugin and Logstash 7.16.2 with the OpenSearch output plugin is probably losing data. And because it happens sporadically, it could go completely unnoticed.

I am also using the Bulk API with the elasticsearch-net client. I followed the advice above and changed the TTL in my opensearch.yml config to plugins.security.cache.ttl_minutes: 1440.

This decreased the frequency of errors drastically. I then implemented logic to detect the error and call the flush cache api in my app before retrying. It’s super hacky but it’s getting me by for now.

// Inside the loop over bulk-response items: on the serialization error,
// flush the security plugin's auth cache before retrying the batch.
if (item.Error.Reason == "java.io.OptionalDataException")
{
    try
    {
        var flushCacheResponse = await _simpleCacheClient.FlushAuthCacheAsync();
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "Error flushing auth cache.");
    }
}
public class SimpleCacheClient : ISimpleCacheClient
{
    private readonly HttpClient _http;
    private readonly ILogger<SimpleCacheClient> _logger;
    private readonly AppOptions _options;

    public SimpleCacheClient(
        HttpClient httpClient,
        IOptions<AppOptions> options,
        ILogger<SimpleCacheClient> logger)
    {
        _http = httpClient;
        _logger = logger;
        _options = options.Value;
    }

    public async Task<FlushCacheResponse> FlushAuthCacheAsync()
    {
        _logger.LogDebug("Begin {method}", nameof(FlushAuthCacheAsync));

        // Try each cluster node in turn until one accepts the cache flush.
        List<Exception> innerExceptions = null;
        foreach (var item in _options.ElasticClusterNodes)
        {
            try
            {
                var url = $"{item}/_opendistro/_security/api/cache";
                _logger.LogDebug("Attempting to flush cache at url {url}", url);
                var response = await _http.DeleteAsync(url);

                if (response.IsSuccessStatusCode)
                {
                    using var stream = await response.Content.ReadAsStreamAsync();
                    return JsonSerializer.Deserialize<FlushCacheResponse>(stream);
                }

                throw new HttpRequestException($"Error sending request: {response.ReasonPhrase}");
            }
            catch (Exception ex)
            {
                innerExceptions ??= new List<Exception>();
                innerExceptions.Add(ex);
            }
            await Task.Delay(1000);
        }

        throw new AggregateException("Unable to flush the cache from any of the available OpenSearch hosts.", innerExceptions);
    }
}

I’m still looking into this. Yeah, so far this seems like an issue in the security plugin, but I’m still not able to root-cause it. Will try to pull in a specialist on the security plugin.

Some information, based on what we’ve experienced:

We’ve recently updated Opensearch to the version 2.0.0. The problem seems to happen periodically (at least, in our case), every 60 minutes. After some further investigation, we detected that the problem seems to be associated with the Opensearch Security Plugin cache, and the frequency of the occurence of the problem seems to be directly associated with the TTL defined for the cache (currently defined in plugins.security.cache.ttl_minutes, with 60 minutes as a default value). Another evidence is that, once we flush/purge the cache, the problems seems to be solve temporarily, until the cache expires again.

I don’t have any more practical evidence/logs about it, but as a temporary workaround we’ve increased the cache’s TTL and started to flush the cache once a day.
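As a concrete sketch of that workaround (the 1440-minute value comes from later comments in this thread; tune it for your cluster), the TTL can be raised in opensearch.yml:

```yaml
# opensearch.yml — raise the security plugin cache TTL
# from the 60-minute default to 24 hours
plugins.security.cache.ttl_minutes: 1440
```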

Additional info: we use OpenSearch’s JavaScript client (v2.0.0) to save the documents in OpenSearch.

I can confirm that setting plugins.security.cache.ttl_minutes: 1440 in opensearch.yml solved my issue!

I tried to downgrade, but due to the Lucene upgrade to v9 the cluster failed to read the cluster state and indices metadata.

The same happens here. Multiple times a day Logstash fails to send logs to the OpenSearch cluster. It started happening after the upgrade to v2. Is downgrading back to v1.3.2 an option? Maybe some other workaround?

I can perhaps explain our very simple setup which raised the bug yesterday, but not anymore (so far ~11 hours since I purged the security plugin cache):

  • Docker single node (discovery.type=single-node), TLS disabled (plugins.security.ssl.http.enabled=false), but that shouldn’t matter
  • Data stream test-status (with ISM policy, but that shouldn’t matter) as index template and manually created (PUT _data_stream/test-status)
  • User with index permissions index on test-* as well as indices:admin/mapping/put on .ds-test-* (the latter was required at least in 2.0.1 because it was not properly propagated from the index permission on the data stream; perhaps it’s fixed with 2.1.0, didn’t check again)
  • recreated Docker container
  • Java 11 “raw” HTTP client, performing one bulk create request (POST test-status/_bulk) with one document every 30 seconds

This raised the mentioned OptionalDataException exactly 6 hours after recreating the Docker container, for a duration of exactly 1 hour. About 2 hours later I purged the security plugin cache, and the problem hasn’t happened again for ~11 hours now.
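The periodic bulk request from the setup above can be sketched as follows. The index name test-status and the one-document-every-30-seconds cadence come from this thread; the document fields and class name are illustrative assumptions, not the reporter’s actual code.

```java
// Sketch of the bulk "create" request body sent every 30 seconds in the
// repro above, as: POST test-status/_bulk  (Content-Type: application/x-ndjson)
public class BulkCreateSketch {

    // _bulk bodies are NDJSON: one action line, one source line,
    // and a mandatory trailing newline.
    static String bulkCreateBody(String timestamp, String status) {
        return "{\"create\":{}}\n"
             + "{\"@timestamp\":\"" + timestamp + "\",\"status\":\"" + status + "\"}\n";
    }

    public static void main(String[] args) {
        System.out.print(bulkCreateBody("2022-05-18T23:50:00Z", "ok"));
    }
}
```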

We are still having this problem with 2.1.0. Looks like it happens after each TTL timeout as mentioned here: https://github.com/opensearch-project/security/issues/1927.

Indeed, setting plugins.security.cache.ttl_minutes: 1440 in opensearch.yml solved the issue for me as well!

We are using 2.0.0, and are also using the Java client 2.0.0 (opensearch-rest-client and opensearch-rest-high-level-client), and have been experiencing the same issues as others on this thread, so I’m not sure it is a client incompatibility. As with others, it happens intermittently. We are running a 3-node cluster, and the problem may occur only once in a few days, or within hours.

Caused by: org.opensearch.OpenSearchException: java.io.OptionalDataException
        at org.opensearch.security.support.Base64Helper.deserializeObject(Base64Helper.java:185) ~[?:?]
       ...
Caused by: java.io.IOException
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1767) ~[?:?]

So I’m beginning to think that this may be caused by an incompatibility with the Elasticsearch versions of the clients. I implemented my own bulk index client and haven’t experienced these issues.

Originally I was using the elasticsearch-net client, version 7.12. I’m not using Logstash, but I imagine it uses the Elasticsearch client libraries internally and runs into the same issues.

I think we have just hit a point where the API has diverged enough that the Elasticsearch clients are no longer compatible with OpenSearch 2.x and later.

I can confirm that I can reproduce the issue; a few requests either fix it or trigger it again:

curl -X DELETE --key /usr/share/opensearch/config/admin-key.pem -ks --cert /usr/share/opensearch/config/admin.pem https://localhost:9200/_plugins/_security/api/cache

2.0.1 with Logstash OSS 7.16.2

My problem has appeared since upgrading from 1.3.2 to 2.0.0.

PS: if I back up the config, it is detected as version 1 and not 2; I don’t know why…

The assumption I wrote above is wrong. I removed all org.opensearch.OpenSearchSecurityException and the problem keeps happening.

Here are my current actions now:

  • increased cache.ttl_minutes to 6 hours
  • hourly client nodes restart
  • daily cache flush
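The daily cache flush from the list above can be automated; here is a crontab sketch reusing the admin-cert curl shown earlier in this thread (the certificate paths come from that comment, the 03:00 schedule is an assumption):

```
# Flush the security plugin cache every day at 03:00
0 3 * * * curl -X DELETE -ks --key /usr/share/opensearch/config/admin-key.pem --cert /usr/share/opensearch/config/admin.pem https://localhost:9200/_plugins/_security/api/cache
```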

@tlfeng, one important difference between the way you ingest data and Logstash in my case: Logstash uses the bulk API.

I hope it helps. Regards.

We managed to fix it by increasing the cache TTL of the security plugin, so maybe it will help someone.