roslyn: 'workspace/symbol' is slow to respond to cancellation

Version Used: VS 2019 int.master 16.6

Steps to Reproduce:

  1. Open a large solution with lots of projects (in this scenario, I used internal Editor.sln).
  2. Search for a symbol.

Expected Behavior: Any searches queued should be canceled by subsequent TYPECHARs.

Actual Behavior: I typed ‘CreateAsync’ into Ctrl+Q; it queues one search for ‘C’… and then attempts to cancel it on the next TYPECHAR, but Roslyn’s ‘workspace/symbol’ didn’t yield/respond to cancellation for ~6 seconds.

It’s not entirely clear to me what’s happening. I debugged through the search task scheduler, the LSP language client extension, and Roslyn, and found:

  • Cancellation token is canceled immediately on subsequent TYPECHAR.
  • Cancellation message makes it to WorkspaceSymbolAsync’s cancellationToken immediately on the server side.
  • There’s a six-second delay…
  • Then ‘workspace/symbol’ seems to return at that point with a canceled token (a sketch of the client-side path follows this list).
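To make the client half of that flow concrete, here is a minimal sketch. This is not the actual VS search or LSP client code; the class, wiring, and result type are assumptions. It shows how canceling the token passed to a StreamJsonRpc request sends the cancellation notification right away, while (as the trace below shows) the awaited call only completes once the server handler returns.

```csharp
// Minimal sketch (hypothetical wiring, not the VS LSP client): each TYPECHAR
// cancels the in-flight 'workspace/symbol' request and starts a new one.
using System;
using System.Threading;
using System.Threading.Tasks;
using StreamJsonRpc;

class WorkspaceSymbolClientSketch
{
    private readonly JsonRpc rpc;
    private CancellationTokenSource pendingSearch = new CancellationTokenSource();

    public WorkspaceSymbolClientSketch(JsonRpc rpc) => this.rpc = rpc;

    public async Task<object> SearchAsync(string query)
    {
        // Canceling the previous token makes StreamJsonRpc send its cancellation
        // notification ("$/cancelRequest" under the default strategy) immediately.
        pendingSearch.Cancel();
        pendingSearch = new CancellationTokenSource();

        try
        {
            // Result type elided; 'workspace/symbol' is the real LSP method name.
            return await rpc.InvokeWithCancellationAsync<object>(
                "workspace/symbol",
                new object[] { new { query } },
                pendingSearch.Token);
        }
        catch (OperationCanceledException)
        {
            // Per the trace below, this only fires once the server-side handler
            // has returned -- which is where the ~6 second gap shows up.
            return null;
        }
    }
}
```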


Sequence of events:

** User types "C" {3/19/2020 2:31:44 PM}
** Start search for "C" {3/19/2020 2:31:44 PM}
** User types "reateAsync" {3/19/2020 2:31:46 PM}
** Client cancel {3/19/2020 2:31:45 PM}
*** SERVER recv. cancel message {3/19/2020 2:31:45 PM}

[6 second delay]

** SERVER WorkspaceSymbolAsync Returns {3/19/2020 2:31:51 PM}
** Client returns {3/19/2020 2:31:52 PM}
** Start search for "CreateAsync" {3/19/2020 2:31:52PM}


About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 32 (32 by maintainers)

Most upvoted comments

Closing this - the cancellation issue is better with smaller batch sizes. However, the search is not as smooth as it was in 16.4/16.5. Will look into that separately.

@dibarbet FWIW, can you see if this change makes a positive impact: https://devdiv.visualstudio.com/DevDiv/_git/VS/pullrequest/239467

Any update on this? Perhaps it’s worth imposing a maximum batch size so we can’t have runaway serialization for large projects like Microsoft.VisualStudio.Platform.VSEditor.dll in the repro.

I have a change to batch per document rather than per project. Testing out to see if it seems any better right now. https://github.com/dotnet/roslyn/pull/42649
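As a rough illustration of the idea behind that change (hypothetical types and helper names, not the code in the linked PR): batching per document means the handler observes the cancellation token between small units of work, so a cancel takes effect after at most one document's worth of work instead of after an entire project graph has been processed and serialized.

```csharp
// Hypothetical illustration of per-document batching -- not the code from the
// linked PR. The point is that the token is checked between small units of work.
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class WorkspaceSymbolBatchingSketch
{
    // Placeholder types standing in for the real Roslyn/LSP types.
    public record Document(string FilePath);
    public record SymbolResult(string Name);

    public static async Task<List<SymbolResult>> SearchAsync(
        IEnumerable<Document> documents, string query, CancellationToken cancellationToken)
    {
        var results = new List<SymbolResult>();
        foreach (var document in documents)
        {
            // Checked once per document; with per-project batching a cancel only
            // took effect after a whole project's results had been produced.
            cancellationToken.ThrowIfCancellationRequested();
            results.AddRange(await SearchDocumentAsync(document, query, cancellationToken));
        }
        return results;
    }

    // Stub for the real per-document symbol search.
    private static Task<List<SymbolResult>> SearchDocumentAsync(
        Document document, string query, CancellationToken cancellationToken)
        => Task.FromResult(new List<SymbolResult>());
}
```

A maximum batch size, as suggested above, would additionally cap how much work any single batch can accumulate before the token is checked again.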

We’ll probably need to discuss MessagePack support in the VS LSP client; also tagging @tinaschrepfer.

Yes, switching from JSON to MessagePack is very easy – at the streamjsonrpc level. But two gotchas that may getcha:

  1. The types you’re passing back and forth that were serializable via Newtonsoft.Json attributes and patterns must now conform to the rules of MessagePack-CSharp instead.
  2. LSP is fundamentally a JSON protocol. The MessagePack encoding is conveniently very compatible with JSON (which is why our JSON-RPC library allows MessagePack as an alternate encoding), but you can’t expect to interop with any other LSP party if you use messagepack since they’ll all be encoding with JSON.

Solution: make your types serializable by both Newtonsoft.Json and MessagePack. Then initialize your RPC connection with StreamJsonRpc, choosing either the MessagePack or the JSON encoding based on whether you know the remote party can speak MessagePack. That way you’ll be fast when you can, but still interop with others when necessary.
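A minimal sketch of that setup, assuming StreamJsonRpc's JsonMessageFormatter and MessagePackFormatter plus MessagePack-CSharp's attribute model; the DTO and the remoteSupportsMessagePack flag are placeholders for whatever types and capability negotiation the client actually uses:

```csharp
// Minimal sketch: one DTO decorated for both serializers, and a per-connection
// choice of wire encoding at JsonRpc setup time.
using System.IO;
using MessagePack;
using Newtonsoft.Json;
using StreamJsonRpc;

// Serializable via MessagePack-CSharp ([MessagePackObject]/[Key]) and
// Newtonsoft.Json ([JsonProperty]) at the same time.
[MessagePackObject]
public class SymbolQuery
{
    [Key(0)]
    [JsonProperty("query")]
    public string Query { get; set; } = string.Empty;
}

public static class RpcSetupSketch
{
    public static JsonRpc Create(Stream sendingStream, Stream receivingStream, bool remoteSupportsMessagePack)
    {
        // Pick the wire encoding per connection; the JSON-RPC layer is unchanged.
        IJsonRpcMessageFormatter formatter = remoteSupportsMessagePack
            ? new MessagePackFormatter()
            : new JsonMessageFormatter();

        var handler = new HeaderDelimitedMessageHandler(sendingStream, receivingStream, formatter);
        var rpc = new JsonRpc(handler);
        rpc.StartListening();
        return rpc;
    }
}
```

The same JsonRpc surface is used either way; only the encoding negotiated for that connection changes.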

But I’m not sure it will help with this specific delay.

It definitely should. The delay is because we’re serializing out a huge project graph and sending that over, so we only finish once the full graph is serialized out. Having this be doc-level instead gives us much finer-grained cancellation.