azure-functions-durable-extension: Event raised before ContinueAsNew (w/ preserveUnprocessedEvents: true) is lost in the next execution of the eternal orchestration
Orchestration definition:
The eternal orchestration listens for two events concurrently. When one of them is received, it cancels the other wait, does some other work, and then calls `ContinueAsNew(null, preserveUnprocessedEvents: true)`.
If an event is raised while the orchestration is inside the `Delay` activity function, it is not received by `WaitForExternalEvent` in the next execution.
```csharp
[FunctionName(nameof(Delay))]
public async Task Delay([ActivityTrigger] IDurableActivityContext context, ILogger log, CancellationToken cancellationToken)
{
    log.LogInformation($"{nameof(EternalOrch)}: {nameof(Delay)}: 20 sec");
    await Task.Delay(TimeSpan.FromSeconds(20));
}

[FunctionName(nameof(EternalOrch))]
public static async Task EternalOrch([OrchestrationTrigger] IDurableOrchestrationContext context, ILogger log)
{
    log = context.CreateReplaySafeLogger(log);

    using (var cancellationTokenSource = new CancellationTokenSource())
    {
        log.LogInformation($"{nameof(EternalOrch)}: wait for any of event_0 / event_1");
        var winner = await Task.WhenAny(
            context.WaitForExternalEvent("event_0", TimeSpan.FromDays(1), cancellationTokenSource.Token),
            context.WaitForExternalEvent("event_1", TimeSpan.FromDays(1), cancellationTokenSource.Token)
        );

        log.LogInformation($"{nameof(EternalOrch)}: event received, cancel other tasks waiting for event");
        cancellationTokenSource.Cancel();

        // if winner is event_0, do something
        // else if winner is event_1, do something else
    }

    await context.CallActivityAsync(nameof(Delay), null);

    log.LogInformation($"{nameof(EternalOrch)}: continue as new");
    context.ContinueAsNew(null, preserveUnprocessedEvents: true);
}
```
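(For what it's worth, the winner check hinted at by the placeholder comments above would need the two event tasks captured in variables; a rough sketch of that variant is below. The `EternalOrchWithWinnerCheck` name and the task variables are illustrative only and not part of the repro.)

```csharp
[FunctionName(nameof(EternalOrchWithWinnerCheck))]
public static async Task EternalOrchWithWinnerCheck([OrchestrationTrigger] IDurableOrchestrationContext context, ILogger log)
{
    log = context.CreateReplaySafeLogger(log);

    using (var cancellationTokenSource = new CancellationTokenSource())
    {
        // Capture the two waits so the Task.WhenAny winner can be compared afterwards.
        var event0Task = context.WaitForExternalEvent("event_0", TimeSpan.FromDays(1), cancellationTokenSource.Token);
        var event1Task = context.WaitForExternalEvent("event_1", TimeSpan.FromDays(1), cancellationTokenSource.Token);

        var winner = await Task.WhenAny(event0Task, event1Task);
        cancellationTokenSource.Cancel();

        if (winner == event0Task)
        {
            // handle event_0
        }
        else
        {
            // handle event_1
        }
    }

    await context.CallActivityAsync(nameof(Delay), null);
    context.ContinueAsNew(null, preserveUnprocessedEvents: true);
}
```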
Client definition:
```csharp
[FunctionName(nameof(RunEternalOrch))]
public static async Task<IActionResult> RunEternalOrch(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req,
    [DurableClient] IDurableOrchestrationClient client,
    ILogger log)
{
    log.LogInformation($"Client: start {nameof(EternalOrch)}");
    var instanceId = await client.StartNewAsync(nameof(EternalOrch));

    var status = await client.GetStatusAsync(instanceId);
    log.LogInformation($"Client: orch status: {status.RuntimeStatus}");

    // wait till it's running
    while (status.RuntimeStatus != OrchestrationRuntimeStatus.Running)
    {
        await Task.Delay(TimeSpan.FromMilliseconds(100));
        status = await client.GetStatusAsync(instanceId);
    }

    log.LogInformation($"Client: orch status: {status.RuntimeStatus}");

    log.LogInformation("Client: raise event_0");
    await client.RaiseEventAsync(instanceId, "event_0");

    // delay 10 sec
    await Task.Delay(TimeSpan.FromSeconds(10));

    log.LogInformation("Client: raise event_1");
    await client.RaiseEventAsync(instanceId, "event_1");

    // delay 40 sec
    await Task.Delay(TimeSpan.FromSeconds(40));
    await client.TerminateAsync(instanceId, "just because");

    return new StatusCodeResult((int)HttpStatusCode.OK);
}
```
Logs:
Note that there is no `EternalOrch: event received` entry after `Client: raise event_1`:

```
[2021-01-30T02:03:46.434Z] Executing 'RunEternalOrch' (Reason='This function was programmatically called via the host APIs.', Id=a6bf4b26-9c67-44db-ae68-e42246972876)
[2021-01-30T02:03:46.448Z] Client: start EternalOrch
[2021-01-30T02:03:47.256Z] Client: orch status: Pending
[2021-01-30T02:03:47.334Z] Executing 'EternalOrch' (Reason='(null)', Id=828ac795-7b4c-44a0-aea6-6f0955458967)
[2021-01-30T02:03:47.347Z] EternalOrch: wait for any of event_0 / event_1
[2021-01-30T02:03:47.440Z] Executed 'EternalOrch' (Succeeded, Id=828ac795-7b4c-44a0-aea6-6f0955458967, Duration=99ms)
[2021-01-30T02:03:48.359Z] Client: orch status: Running
=> [2021-01-30T02:03:48.362Z] Client: raise event_0
[2021-01-30T02:03:49.011Z] Executing 'EternalOrch' (Reason='(null)', Id=d690ce9b-ac5d-4314-84e3-58e24e2d18a6)
=> [2021-01-30T02:03:49.022Z] EternalOrch: event received, cancel other tasks waiting for event
[2021-01-30T02:03:50.341Z] Executed 'EternalOrch' (Succeeded, Id=d690ce9b-ac5d-4314-84e3-58e24e2d18a6, Duration=1296ms)
[2021-01-30T02:03:51.908Z] Executing 'Delay' (Reason='(null)', Id=6b3a5b7f-f525-446e-85fa-109f6ff769ba)
[2021-01-30T02:03:51.912Z] EternalOrch: Delay: 20 sec
=> [2021-01-30T02:03:58.746Z] Client: raise event_1
[2021-01-30T02:03:59.203Z] Executing 'EternalOrch' (Reason='(null)', Id=e2f8bfd1-a57c-477e-b176-a28059daa42a)
[2021-01-30T02:03:59.774Z] Executed 'EternalOrch' (Succeeded, Id=e2f8bfd1-a57c-477e-b176-a28059daa42a, Duration=570ms)
[2021-01-30T02:04:11.929Z] Executed 'Delay' (Succeeded, Id=6b3a5b7f-f525-446e-85fa-109f6ff769ba, Duration=20634ms)
[2021-01-30T02:04:12.446Z] Executing 'EternalOrch' (Reason='(null)', Id=0d466ea3-696a-4983-80c8-12ec848d58eb)
[2021-01-30T02:04:14.278Z] EternalOrch: continue as new
[2021-01-30T02:04:14.639Z] Executed 'EternalOrch' (Succeeded, Id=0d466ea3-696a-4983-80c8-12ec848d58eb, Duration=2193ms)
[2021-01-30T02:04:14.658Z] Executing 'EternalOrch' (Reason='(null)', Id=1bbc4d03-258b-45bf-a629-ebab782e3dc9)
[2021-01-30T02:04:14.661Z] EternalOrch: wait for any of event_0 / event_1
[2021-01-30T02:04:14.665Z] Executed 'EternalOrch' (Succeeded, Id=1bbc4d03-258b-45bf-a629-ebab782e3dc9, Duration=7ms)
[2021-01-30T02:04:39.217Z] 0beb29998e2548a8acf5966aa98ee840: Function 'EternalOrch (Orchestrator)' was terminated. Reason: just because. State: Terminated. HubName: TestHubName. AppName: . SlotName: . ExtensionVersion: 2.4.0. SequenceNumber: 25.
[2021-01-30T02:04:39.220Z] Executed 'RunEternalOrch' (Succeeded, Id=a6bf4b26-9c67-44db-ae68-e42246972876, Duration=52824ms)
[2021-01-30T02:04:39.471Z] Executing 'EternalOrch' (Reason='(null)', Id=f28fc06c-8e2e-4064-b29c-eee62774db67)
[2021-01-30T02:04:39.476Z] Executed 'EternalOrch' (Succeeded, Id=f28fc06c-8e2e-4064-b29c-eee62774db67, Duration=5ms)
```
About this issue
- State: open
- Created 3 years ago
- Reactions: 1
- Comments: 15 (8 by maintainers)
@clurdish: We’ve added this as an item for discussion at our upcoming internal triage meeting next Tuesday. In the meantime, I don’t personally know of any good workarounds for this (though I’ll ask internally), but your reproducer seems like exactly what we were missing in order to properly diagnose this behavior. Thank you!
I’ll get back to you asap after discussing internally.
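One untested idea that comes to mind in the meantime (I haven’t validated it, and it assumes an event raised earlier in the same execution is still matched to a later `WaitForExternalEvent` on that name): drain any pending event with a short timeout just before calling `ContinueAsNew`, so a late event is handled in the current generation rather than relying on `preserveUnprocessedEvents`. The sentinel value and the two-second window below are arbitrary, illustrative choices:

```csharp
// Untested mitigation sketch: just before ContinueAsNew, listen briefly for any event
// that may have been raised while the activity was running. The defaultValue overload
// of WaitForExternalEvent<T> resolves to the given default instead of throwing on timeout.
// A sentinel is used because the repro raises events with no payload (null).
const string NotRaised = "__not_raised__";
var pendingEvent0 = context.WaitForExternalEvent<string>("event_0", TimeSpan.FromSeconds(2), NotRaised);
var pendingEvent1 = context.WaitForExternalEvent<string>("event_1", TimeSpan.FromSeconds(2), NotRaised);
await Task.WhenAll(pendingEvent0, pendingEvent1);

if (pendingEvent0.Result != NotRaised || pendingEvent1.Result != NotRaised)
{
    // A late event arrived during the activity: handle it here, or pass it to the
    // next generation via the ContinueAsNew input so it is not lost.
}

context.ContinueAsNew(null, preserveUnprocessedEvents: true);
```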
Hello @davidmrdavid, this is a Durable Functions app and I’m running it locally in Visual Studio (not on Azure).
Hi @nakranimohit0, thank you for reaching out. This ticket reminds me of a pending work item: I think there are some edge cases around `ContinueAsNew` that still need to be robustly validated. In this case, it makes intuitive sense to me that we should preserve your unprocessed `RaiseEvent` on the next iteration of your orchestrator. Does that seem right to you, @ConnorMcMahon?
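Condensed to its essence, the expectation as I read it (the `MinimalEternalOrch` name below is illustrative, and this reflects my understanding of the documented behavior rather than verified implementation details) is:

```csharp
[FunctionName(nameof(MinimalEternalOrch))]
public static async Task MinimalEternalOrch([OrchestrationTrigger] IDurableOrchestrationContext context)
{
    // Iteration N consumes one "go" event.
    await context.WaitForExternalEvent("go");

    // A second "go" raised while this activity runs has no listener in this generation,
    // so it ends up as an "unprocessed" event.
    await context.CallActivityAsync(nameof(Delay), null);

    // Expectation: with preserveUnprocessedEvents: true, that buffered "go" should be
    // re-delivered to the WaitForExternalEvent at the top of iteration N+1.
    // The repro above shows this not happening.
    context.ContinueAsNew(null, preserveUnprocessedEvents: true);
}
```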
After receiving confirmation that this is indeed a bug, we can look into prioritizing this for our next release sprint 😃