azure-functions-host: Azure Functions do not respect global Json.Net SerializerSettings
What problem would the feature you’re requesting solve? Please describe.
Note that this might be better suited for the Azure WebJobs repository; feel free to let me know / move it.
I have some JSON I’d like to be able to convert to an object using custom types, e.g. from NodaTime. I want to be able to put the JSON on a queue, and then have it automatically converted to the correct type.
I can do this in a regular app by configuring the global JsonConvert.DefaultSettings property. However, when running Azure Functions, this is not respected. It seems that Azure Functions uses its own JsonSerializer with its own settings via Microsoft.Azure.WebJobs.Host.Protocols.JsonSerialization, and thus ignores the user's own settings.
Let me demonstrate with a code example that does not work
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json;
using NodaTime;
using NodaTime.Serialization.JsonNet;

namespace AIP.BulkDataSimulator
{
    public class FooModel
    {
        public Instant Instant { get; set; }
    }

    public static class Reproduction
    {
        [FunctionName("Send")]
        public static async Task<ActionResult<string>> Send(
            [HttpTrigger(AuthorizationLevel.Function, "get")]
            HttpRequest req,
            [Queue("simulate-queue"), StorageAccount("BulkDataSimulationInternalStorage")]
            IAsyncCollector<string> simulateQueue
        )
        {
            // This would normally enable the Instant to be serialized as JSON
            JsonConvert.DefaultSettings = () => new JsonSerializerSettings()
                .ConfigureForNodaTime(DateTimeZoneProviders.Tzdb);

            var foo = new FooModel()
            {
                Instant = SystemClock.Instance.GetCurrentInstant()
            };

            await simulateQueue.AddAsync(JsonConvert.SerializeObject(foo));
            return "Added";
        }

        [FunctionName("Receive")]
        public static void Receive(
            [QueueTrigger("simulate-queue", Connection = "BulkDataSimulationInternalStorage")]
            FooModel foo
        )
        {
            // This would normally enable the Instant to be parsed as JSON
            JsonConvert.DefaultSettings = () => new JsonSerializerSettings()
                .ConfigureForNodaTime(DateTimeZoneProviders.Tzdb);

            Console.WriteLine(JsonConvert.SerializeObject(foo));
        }
    }
}
When the second function is hit, this triggers an error
System.Private.CoreLib: Exception while executing function: Receive. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'foo'. Microsoft.Azure.WebJobs.Extensions.Storage: Binding parameters to complex objects (such as 'FooModel') uses Json.NET serialization.
1. Bind the parameter type as 'string' instead of 'FooModel' to get the raw values and avoid JSON deserialization, or
2. Change the queue payload to be valid json. The JSON parser failed: Cannot convert value to NodaTime.Instant
Describe the solution you’d like
There are a few issues here. Of course the approach above, setting JsonConvert.DefaultSettings inside the function body, won't work - the settings are assigned after the object has already been parsed.
I would, however, expect that when using dependency injection and configuring the settings via a Startup class, the JsonSerializer settings would be respected. This is not the case.
Perhaps you could fall back to your own settings only when the user has not provided any globally?
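To make the expectation concrete, here is a minimal sketch of the kind of Startup-based configuration I would expect the host to honor. This is an assumption about how such a hook could look rather than something that works today: FunctionsStartup and the ConfigureForNodaTime extension are real, but the host currently ignores DefaultSettings when binding queue payloads.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Newtonsoft.Json;
using NodaTime;
using NodaTime.Serialization.JsonNet;

[assembly: FunctionsStartup(typeof(AIP.BulkDataSimulator.Startup))]

namespace AIP.BulkDataSimulator
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Wishful thinking: ideally the WebJobs host would pick these settings up
            // when it deserializes queue messages into POCO parameters. Today it does not.
            JsonConvert.DefaultSettings = () => new JsonSerializerSettings()
                .ConfigureForNodaTime(DateTimeZoneProviders.Tzdb);
        }
    }
}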
Describe alternatives you’ve considered
Use a custom JsonConverter
Json.Net seems to pick up custom JsonConverters when they are applied as attributes. However, the catch is that Json.Net automatically converts date-looking strings into actual DateTime values before a converter ever sees them, while NodaTime needs the original strings in order to parse them into NodaTime types. See here for more information. This means a custom JsonConverter is a workable solution for most cases, but not this one.
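To illustrate the catch, here is a rough sketch of the attribute-based approach. InstantJsonConverter is a hypothetical wrapper made up for this example; the fallback branch shows why it becomes lossy once Json.Net has already turned the string into a DateTime.

using System;
using Newtonsoft.Json;
using NodaTime;
using NodaTime.Text;

// Hypothetical wrapper converter, only for illustration.
public sealed class InstantJsonConverter : JsonConverter<Instant>
{
    public override void WriteJson(JsonWriter writer, Instant value, JsonSerializer serializer)
        => writer.WriteValue(InstantPattern.ExtendedIso.Format(value));

    public override Instant ReadJson(JsonReader reader, Type objectType, Instant existingValue,
        bool hasExistingValue, JsonSerializer serializer)
    {
        // The catch: with the default DateParseHandling, the reader has already turned the
        // ISO-8601 string into a DateTime before this converter runs, so the original text is
        // gone and we can only approximate the Instant from whatever DateTime we are handed.
        if (reader.Value is DateTime dt)
            return Instant.FromDateTimeUtc(DateTime.SpecifyKind(dt, DateTimeKind.Utc));

        return InstantPattern.ExtendedIso.Parse((string)reader.Value).Value;
    }
}

// The attribute would then be applied to the model from the reproduction above:
public class FooModel
{
    [JsonConverter(typeof(InstantJsonConverter))]
    public Instant Instant { get; set; }
}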
Receive it as a string
There is of course the possibility of receiving the parameter as a string. However, this means you lose out on a lot of other goodness, such as the automatic bindings that use values from the class. For perspective, our actual function signature looks something like this:
public static async Task RunAsync(
    [QueueTrigger("simulate-queue", Connection = "BulkDataSimulationInternalStorage")]
    SimulationItem simulationEvent,
    [Blob("avro-examples/{inputFile}", FileAccess.Read, Connection = "SampleDataStorage")]
    Stream simulationBlob,
    [Blob("ingest-container/{outputFile}", FileAccess.Write, Connection = "IngestStorage")]
    Stream ingestBlob,
    [EventHub("bulk", Connection = "IngestEventHub")]
    IAsyncCollector<byte[]> ingestEventHub,
    CancellationToken token,
    ILogger log
)
and I’d hate to do all that with dynamic bindings.
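For completeness, here is a minimal sketch of the string workaround, reusing FooModel from the reproduction above (the function name is made up): bind the raw payload as a string and deserialize it manually with NodaTime-aware settings. It keeps the trigger working, but gives up the other bindings discussed above.

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using NodaTime;
using NodaTime.Serialization.JsonNet;

namespace AIP.BulkDataSimulator
{
    public static class StringWorkaround
    {
        [FunctionName("ReceiveAsString")]
        public static void ReceiveAsString(
            [QueueTrigger("simulate-queue", Connection = "BulkDataSimulationInternalStorage")]
            string payload,
            ILogger log
        )
        {
            // Deserialize manually so the NodaTime converters are actually applied.
            var settings = new JsonSerializerSettings().ConfigureForNodaTime(DateTimeZoneProviders.Tzdb);
            var foo = JsonConvert.DeserializeObject<FooModel>(payload, settings);
            log.LogInformation("Received instant {Instant}", foo.Instant);
        }
    }
}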
About this issue
- State: open
- Created 4 years ago
- Reactions: 19
- Comments: 19 (1 by maintainers)
Same problem here. I need to set ReferenceLoopHandling.Ignore globally but can't find a way to do that. Has anyone found a solution?
Are there any solutions for this very common issue, or at least some info on whether this will be possible in future versions?
We were able to resolve this with the following solution:
<PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson" Version="3.1.21" />
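The comment above only shows the package reference, so the following is a guess at how that package is typically wired up in an in-process Functions Startup class; the MyFunctionApp namespace and the ReferenceLoopHandling example are assumptions. Note that this affects MVC-formatted HTTP responses, not the trigger-binding path from the original issue.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Newtonsoft.Json;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services
                .AddMvcCore()
                .AddNewtonsoftJson(options =>
                {
                    // Applies to MVC-style formatting of HTTP responses,
                    // not to queue-trigger POCO binding.
                    options.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
                });
        }
    }
}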
While @tpaulshippy has a working solution, it really isn't ideal. It would be nice to see an update on this problem.
I can't help but feel like pulling Newtonsoft in is taking a step backwards rather than a solution… The whole point of System.Text.Json was to replace Newtonsoft.
Bumping this. How can we effectively set the global settings of the Newtonsoft dependency bundled with Azure Functions to avoid "Self referencing loop detected for property"? Adding [JsonIgnore] to properties is not a durable solution.
December 2023 and we do not have a solution yet?
Bump! I have a custom converter for a HashSet of enum values that I would like to use.
It would be nice to have a straightforward approach in the Startup.Configure() method, as suggested by ChatGPT:
builder.Services.Configure<JsonSerializerOptions>(options => options.Converters.Add(new MyHashSetEnumConverter<MyEnumTypes>()));
Adding the attribute to the properties that should use it works everywhere else, save Azure Functions:
[JsonConverter(typeof(MyHashSetEnumConverter<MyEnumTypes>))]
Needless to say, this does not currently work. 😥
This needs to be addressed; it's one of the most basic features. Not only should we be able to set JSON serialization settings for 'MVC'-related stuff, we should also be able to set DefaultSettings.