Out of memory exception using Newtonsoft.Json package
The Newtonsoft.Json package is probably one of the most essential packages in .NET software development. For those of you who don't know what it does: it takes care of object serialization to JSON notation and deserialization from JSON notation. I have used this package in numerous projects since its inception and I can only say great things about it.
However, as part of one of our product’s GDPR compliance upgrades, I encountered an interesting undocumented feature. After object serialization, the w3wp.exe process running the application pool for our product started consuming 100% CPU capacity and hogged so much memory that we experienced an “Out of memory exception” in a matter of minutes.
Since our product still uses .NET 3.5 (we are planning an upgrade to 4.7.2 shortly), tasks and the parallel library are not available natively. We use Microsoft’s TaskParallelLibrary package to work around this framework deficiency. Hence, at first, I was dead sure that this library was the source of the issue, especially as we were doing the serialization in an asynchronous method. After removing the creation of the new task, I was surprised to find it was not the case.
The object we wanted to serialize was a more complex derivative of this:
public class OurObject
{
    #region Fields

    private int? _id;

    #endregion

    #region Properties

    public int Id
    {
        get
        {
            if (!_id.HasValue)
                throw new CustomException("some message");
            return _id.Value;
        }
        set { _id = value; }
    }

    #endregion
}
The easiest way to serialize an instance of this object would be to do something like:
var instance = new OurObject();
var json = JsonConvert.SerializeObject(instance);
Except this throws a CustomException, as the Id property is not set. The Newtonsoft.Json documentation and StackOverflow answers offer a solution using serializer settings:
var instance = new OurObject();
var json = JsonConvert.SerializeObject(
    instance,
    new JsonSerializerSettings
    {
        Error = (se, ev) => { ev.ErrorContext.Handled = true; }
    });
This works as expected. It ignores exceptions thrown by the serialized object instance. Yay!
Not so fast. When the above code is used as part of a web application, it will cause your application to hog all available CPU power and consume as much memory as possible. Promptly. Yikes! Surely not something you would want in a production-ready environment. Running a debugger revealed the issue with this solution: whenever the serialized object raised an exception, our serialization setting only marked it as handled, so the exception effectively went by unhandled. This in turn put stress on the server’s CPU and caused a memory leak the size of Mt. Everest.
The bad thing is that, at the time of writing, there is no option to tell the JSON serialization engine to actually handle all exceptions raised, rather than just mark them as handled. I guess what you could do is create new properties for each property causing you a headache and decorate them with JsonPropertyAttribute accordingly, but in our case that would mean changing every property in the object (and there were plenty). What I ended up doing was converting the object to a DataTable (we use it for ADO anyway) and serializing that. Worked like a charm, as sketched below.
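For illustration, here is a minimal sketch of that DataTable workaround. The OurObjectJson class and its ToDataTable helper are hypothetical (in our product the conversion already existed for the ADO layer); the idea is that any CustomException is caught at conversion time and an unset value is mapped to DBNull, so nothing throws inside the serializer itself:

using System;
using System.Data;
using Newtonsoft.Json;

public static class OurObjectJson
{
    // Hypothetical helper: copies the object into a DataTable so that
    // any CustomException is dealt with here, not inside Newtonsoft.Json.
    public static DataTable ToDataTable(OurObject instance)
    {
        var table = new DataTable("OurObject");
        table.Columns.Add("Id", typeof(int));

        var row = table.NewRow();
        try
        {
            row["Id"] = instance.Id;
        }
        catch (CustomException)
        {
            row["Id"] = DBNull.Value; // unset value maps to DBNull
        }
        table.Rows.Add(row);

        return table;
    }

    public static string Serialize(OurObject instance)
    {
        // Newtonsoft.Json has a built-in DataTable converter, so the
        // table serializes to a JSON array of row objects out of the box.
        return JsonConvert.SerializeObject(ToDataTable(instance));
    }
}

Serializing the DataTable this way produces a JSON array with one object per row, which in our case was close enough to the original shape to be usable.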
March 23rd, 2021 at 19:57
This is amazing, thank you! I think I have had the same issue!
I did not realise that marking the exception as handled still kept the exception in memory. Do you know what version you had this issue in please? And if it has been fixed?
March 24th, 2021 at 22:22
I am not sure about .NET Core or .NET 5, but it hasn’t been fixed in .NET Framework 4.8.2 thus far. I have opened an issue regarding this, but it was dismissed as a non-issue.