Dapr Event Publishing and Double types 👨‍💻
Dapr (not to be confused with Dapper) is a really useful piece of software to run in your Kubernetes clusters; it helps with everything from inter-service communication to event publishing.
However, I have come across a bit of an issue with the way it handles the rounding/accuracy of the double data type.
The Problem...
I had some geography points (lat/longs) that were being updated in a service, and this service was then publishing the updates onto the service bus for other services to consume. One day a user pointed out that some of the lat/long positions seemed to be slightly wrong; not massively, but noticeably.
I checked the logs for the incoming data and the points looked correct, but they were wrong by the time they ended up in the database. As I looked into it further I could see that the numbers were losing accuracy somewhere along the line.
I created a test case and pumped it through the system; I could see the data going into the event fine, but when it was consumed on the other side it had changed. Hmm, what's going on here?
As I couldn't see anything actually wrong, I headed over to the GitHub issues for the Dapr .NET SDK, and after a search yielded no results I asked the question.
It sounded like it was down to the way the type was being converted into JSON, but no one really had a solution, apart from 'use a string instead of a double'. I wasn't all that happy with that answer, so I came up with a different solution instead.
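For a bit of context, this is the kind of accuracy loss you get whenever a double is converted to text with fewer than 17 significant digits somewhere along the pipeline. A minimal sketch (not Dapr-specific, just illustrating the underlying floating-point behaviour):

```csharp
using System;
using System.Globalization;

class PrecisionDemo
{
    static void Main()
    {
        // The classic example: this sum is not exactly 0.3.
        double value = 0.1 + 0.2; // 0.30000000000000004

        // Formatting with only 15 significant digits rounds it to "0.3"...
        string lossy = value.ToString("G15", CultureInfo.InvariantCulture);

        // ...so parsing that string back produces a *different* double.
        double roundTripped = double.Parse(lossy, CultureInfo.InvariantCulture);
        Console.WriteLine(value == roundTripped); // False

        // "G17" always produces a string that parses back to the exact same double.
        string exact = value.ToString("G17", CultureInfo.InvariantCulture);
        Console.WriteLine(value == double.Parse(exact, CultureInfo.InvariantCulture)); // True
    }
}
```

That round-trip guarantee of "G17" is what the fix below leans on.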
Hacky Fix incoming...
So instead of using a string in my model to hold the data, I used a custom JsonConverter.
The idea is to pass the data through another process as it gets serialized/deserialized so we can change its format. To create a new converter you just need to derive from the JsonConverter&lt;T&gt; type and override its Read and Write methods. This is the converter I created:
using System;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;

public class JsonDoubleLatLongConverter : JsonConverter<double>
{
    public override double Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Our Write method emits the double as a string, so accept both forms.
        if (reader.TokenType == JsonTokenType.String)
        {
            return double.Parse(reader.GetString()!, CultureInfo.InvariantCulture);
        }

        return reader.GetDouble();
    }

    public override void Write(Utf8JsonWriter writer, double value, JsonSerializerOptions options)
    {
        // "G17" guarantees the string round-trips back to the exact same double.
        writer.WriteStringValue(value.ToString("G17", CultureInfo.InvariantCulture));
    }
}
So on write (serialize) we output our double as a string formatted with "G17", and on read (deserialize) we parse that string straight back into a double. Because "G17" is guaranteed to produce a string that round-trips to the exact same double value, no accuracy is lost.
So now to use this converter, we just need to add a new attribute to our model properties where we want it to be used:
[JsonConverter(typeof(JsonDoubleLatLongConverter))]
public double? LocationLatitude { get; set; }

[JsonConverter(typeof(JsonDoubleLatLongConverter))]
public double? LocationLongitude { get; set; }
So now when the events get published, the properties keep their correct types but no longer lose any accuracy.
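To sanity-check the whole thing, here's a quick round-trip through System.Text.Json. The LocationUpdated event model (and its sample values) are hypothetical, just for illustration; the converter is repeated so the sample compiles on its own:

```csharp
using System;
using System.Globalization;
using System.Text.Json;
using System.Text.Json.Serialization;

// The converter from the article, repeated so this sample is self-contained.
public class JsonDoubleLatLongConverter : JsonConverter<double>
{
    public override double Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options) =>
        reader.TokenType == JsonTokenType.String
            ? double.Parse(reader.GetString()!, CultureInfo.InvariantCulture)
            : reader.GetDouble();

    public override void Write(Utf8JsonWriter writer, double value, JsonSerializerOptions options) =>
        writer.WriteStringValue(value.ToString("G17", CultureInfo.InvariantCulture));
}

// A hypothetical event model; the names are just for illustration.
public class LocationUpdated
{
    [JsonConverter(typeof(JsonDoubleLatLongConverter))]
    public double? LocationLatitude { get; set; }

    [JsonConverter(typeof(JsonDoubleLatLongConverter))]
    public double? LocationLongitude { get; set; }
}

class RoundTripDemo
{
    static void Main()
    {
        var update = new LocationUpdated
        {
            LocationLatitude = 51.0 / 7.0,  // a value that needs all 17 digits
            LocationLongitude = 0.1 + 0.2   // the classic 0.30000000000000004
        };

        // The doubles travel as "G17" strings in the JSON payload.
        string json = JsonSerializer.Serialize(update);
        Console.WriteLine(json);

        // Reading them back gives bit-for-bit identical doubles.
        var copy = JsonSerializer.Deserialize<LocationUpdated>(json);
        Console.WriteLine(copy.LocationLatitude == update.LocationLatitude &&
                          copy.LocationLongitude == update.LocationLongitude); // True
    }
}
```

One thing worth noting: applying a JsonConverter&lt;double&gt; attribute to a double? property relies on the serializer wrapping the converter for nullable types, which System.Text.Json handles on .NET 5 and later.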