Serializing/Deserializing DateTime values to JSON with the DataContractJsonSerializer

February 3, 2009

 

At the moment, my team and I are working on an ASP.NET project that requires some significant user interaction and data exchange. We chose to standardize the data exchange format on JSON. The primary reason for this is to allow us to serialize business entity objects and exchange them between the client and server in identical structure. This allows us to utilize the entities on the client side, have the user operate on the data, and then send the serialized entities back to the server to be recreated as .NET objects. Hopefully the benefits of this are apparent; if not, well, that’s a topic for another time. 

In addition, we have a WCF service that utilizes the entities as well.  As a result, the entity classes are decorated with the DataContract attribute instead of the Serializable attribute. Since this is the approach we took with serialization, it led us to use the DataContractJsonSerializer for serializing to JSON.  There is a wonderful article that includes a class that uses generics to serialize .NET objects, and it can be found here.  Excellent information to be found there. 

The details of how to serialize an object to JSON are covered elsewhere in detail. What I wanted to cover here is a “gotcha” when your entity contains a DateTime field. When you serialize a DateTime to JSON with the DataContractJsonSerializer, you get the following format (assuming the field name is “Created”):

“{\”Created\”:\”\\/Date(1232739449000-0500)\\/\”}”
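As an aside, that wire format can be turned back into a JavaScript Date on the client. A hypothetical helper (the function name and regex are mine, not part of any Microsoft API); note the milliseconds are UTC-based, so the trailing offset can be ignored for the Date value itself:

```javascript
// Parse the WCF "\/Date(ms[+/-offset])\/" format into a JavaScript Date.
// After JSON parsing, the escaped slashes are plain slashes, so the raw
// value looks like "/Date(1232739449000-0500)/".
function parseWcfDate(value) {
  const match = /\/Date\((-?\d+)([+-]\d{4})?\)\//.exec(value);
  if (!match) {
    throw new Error("Not a WCF date string: " + value);
  }
  return new Date(Number(match[1])); // milliseconds since the Unix epoch, UTC
}

const created = parseWcfDate("/Date(1232739449000-0500)/");
```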

The \/Date()\/ is a Microsoft workaround for an issue with dates and JavaScript; you can find a discussion on this here if you’re interested. This is the seed of the “gotcha”, due to the escape characters. However, it’s only a gotcha when you send the string back to the server. When you serialize a JavaScript Date using the Sys.Serialization.JavaScriptSerializer object, you get the following string on the client side:

 
Note these are very similar. Awesome, great, we can deserialize our object, yay! However, when you send this over to the server, extra escape characters are created and the deserialization fails with a JSON format error on the DateTime. Inspecting the string passed to the server, we see the following:

“{\”Created\”:\”\\\”\\\\/Date(1233698640637)\\\\/\\\”\”}”

The extra escape characters cause problems when the DataContractJsonSerializer encounters them and tries to create a DateTime from the value; the value is interpreted as a string, not a date value. What we had to do was scrub the JSON text passed to the server method to remove the extra escape characters. Using the String.Replace() method, we replace “\\\”\\\\/D” with “\”\\\\/D” at the prefix and “)\\\\/\\\”” with “)\\\\/” at the suffix. So far this seems to work, and after executing these two replace calls, we get the following:

“{\”Created\”:\”\\/Date(1233698977441)\\/\”}”

Now that looks remarkably similar to the initial serialization result.  And when calling the deserialize method, we get our .NET entity object with all the values correctly set.
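The double-escape-and-scrub round trip can be sketched in plain JavaScript. This is only an illustration: JSON.stringify stands in for Sys.Serialization.JavaScriptSerializer, and my replace patterns are single-escaped approximations of the Replace() calls described above (the exact arguments depend on how many escaping layers your debugger display adds), so treat them as a sketch rather than the production fix:

```javascript
// The client serializer emits the date as a *quoted* JSON string...
const dateAsJson = '"\\/Date(1233698640637)\\/"'; // chars: "\/Date(...)\/"

// ...so embedding that whole string as a value and serializing again
// escapes its quotes and backslashes a second time.
const doubleEncoded = JSON.stringify({ Created: dateAsJson });
// doubleEncoded now holds: {"Created":"\"\\/Date(1233698640637)\\/\""}

// Scrub: strip the extra quote/backslash layer around the date token so the
// value reads \/Date(...)\/ again, which the server-side serializer expects.
const scrubbed = doubleEncoded
  .replace('\\"\\\\/D', '\\/D')   // prefix: \"\\/D  ->  \/D
  .replace(')\\\\/\\"', ')\\/');  // suffix: )\\/\"  ->  )\/
// scrubbed now holds: {"Created":"\/Date(1233698640637)\/"}
```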

Sharing Data Between Strategies

December 13, 2008

At work, we have a project that requires a fair amount of number crunching.  Crunching is performed on a daily basis and runs for a variable number of days.  There are many algorithms for the crunching and which algorithm is used can change on a daily basis. For some of the algorithms, the processing for one day requires processing future days regardless of whether the results are kept. 

The algorithm being both user-selectable and changeable on a daily basis essentially pointed us in the direction of the “Strategy Pattern”. No big whoop on the Strategy thing. What made this problem interesting is how to keep those future processed values around in case the next day used the same algorithm, thus eliminating the need to re-compute.

As long as the Algorithm Interface includes mechanisms to accept a repository type object for temporary storage, then the context object can manage the lifetime of the repository and share it among the concrete algorithm implementations.

We chose to utilize the Enterprise Library Caching Application Block and use the CacheManager object from the CacheFactory as a method parameter in each algorithm interface. The context object interacts with the cache and manages the lifetime of the CacheManager object. This gave us a generic caching object where we didn’t have to worry about types. Also, it was already built. 

Basically, the context is created and creates and maintains a reference to the CacheManager object. The CacheManager object is passed as a parameter to the implementation objects through the interface. Each concrete class now has full access to a cache. When the algorithm changes, the cache is flushed and the new algorithm obtains its own semi-private cache.
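A minimal sketch of that arrangement, in JavaScript rather than the .NET code we actually wrote (all names are illustrative, and a plain Map stands in for the CacheManager): the context owns the cache, hands it to whichever strategy is current, and flushes it when the strategy changes.

```javascript
// One concrete algorithm; expensive results are memoized in the shared cache.
class DoubleStrategy {
  compute(day, cache) {
    const key = "double:" + day;
    if (!cache.has(key)) {
      cache.set(key, day * 2); // the expensive crunching would go here
    }
    return cache.get(key);
  }
}

// The context manages the cache's lifetime and shares it across strategies.
class Context {
  constructor(strategy) {
    this.strategy = strategy;
    this.cache = new Map(); // stands in for the CacheManager
  }
  setStrategy(strategy) {
    this.cache.clear();     // algorithm changed: flush the semi-private cache
    this.strategy = strategy;
  }
  run(day) {
    return this.strategy.compute(day, this.cache);
  }
}

const context = new Context(new DoubleStrategy());
context.run(3); // 6, computed once; repeat calls for day 3 hit the cache
```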

Yay!  A sweet way to maintain and share data between strategies! 

Now… for the next challenge…

On the road to more knowledge…

November 12, 2008

Life in the corporate cube rarely provides opportunities to play with new tech. There’s always a deadline and little (ok, no) room for R&D. So, the only way I keep up and keep current is to read and play on my own time in the evenings. 

I firmly believe that the best way to learn is to use the tech.  Therefore, in an attempt to gain more knowledge and learn emerging tech, I am starting a hobby learning project.  This project will be a small application to manage an office lottery pool.  I manage our office lottery pool with Excel and it’s starting to get out of hand.

The idea is for the application to start as a WPF Windows application, with a possible Silverlight web component for players to view. I plan to use the ADO.NET Entity Framework for no better reason than to learn more about it. I plan to use LINQ to Entities along with EF.

First step is to get VS2008 SP1 and .NET Framework 3.5 SP1. The former is for the improved WPF designer support; the latter is to get the EF libraries.

Up next, gathering requirements…

Meta-data… The Good, The Bad, The Ugly

November 11, 2008

Data which describes other data. For example, a description of a database in terms of its structure and the relationship between the entities in it. (www.oxfordreference.com).

Anyone working with relational database systems will have used and/or manipulated meta-data. Every database contains a data dictionary: a repository of all the data that describes the database objects. For example, the entry for a table will include:

  • the table name
  • the names of attributes (columns/fields)
  • the data types of attributes (columns/fields)

The Good

Meta-data can be valuable when you need to enable variable content presentation, where the UI is driven by the meta-data. In this case, the data is likely to be table based, and the table header can be driven by meta-data describing such things as the titles, perhaps going so far as to describe styling. 
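For instance, a handful of column descriptors can drive a table header without the UI knowing anything about the underlying fields. A quick JavaScript sketch (the field names, titles, and style values are invented for illustration):

```javascript
// Column meta-data describing titles and styling for a table-based view.
const columnMeta = [
  { field: "orderId", title: "Order #", style: "numeric" },
  { field: "created", title: "Created", style: "date" }
];

// Build the header row purely from the meta-data.
function buildHeaderRow(meta) {
  return "<tr>" +
    meta.map(c => `<th class="${c.style}">${c.title}</th>`).join("") +
    "</tr>";
}
```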

The Bad

When requirements dictate that a more complex business workflow is necessary, the meta-data becomes measurably more complex. The work necessary to model a general meta-data description that accommodates business processes is not outweighed by the benefits. It is unrealistic to think that every conceivable process and use has been accounted for; invariably, requirements will be overlooked or simply not exist yet. 

The Ugly

When more complex business requirements emerge and highlight the shortcomings of the meta-data framework used within a system, you are faced with two avenues of recourse. The first: limit your process modeling to ensure compliance with the limitations of the meta-data. This means that the user interface is effectively crippled by a system layer intended to help the UI. The second avenue is to break from the framework and essentially ignore it. In this case, your system will not only require maintenance of two different approaches, but you will most likely introduce some data access redundancy.

What To Do?

In our current project, we are tasked with enhancing a version 1.0 system. The project was chunked into phases and we are taking over from the first phase. Unfortunately, none of the original designers are present in the current team (a post for another time). As you may have guessed, the first phase introduced a limited meta-data layer. The intent was to allow for user customization of the field labels, to define validation rules for each column and, finally, to support single-level table lookup fields. This worked fine in the first phase, where everything was essentially table-based screens. There is one exception where a business workflow process is required. However, to fit this into the meta-data, the UI workflow is awkward, slow and “clunky”.

Introducing The MetaDataAdapter

We wanted to keep the data access framework consistent, while allowing for domain models within the UI workflow.  We struggled with trying to incorporate the meta-data, but realized very quickly that we would need to sacrifice usability in order to make this work.  Exploring alternatives to the meta-data meant that we would have to roll our own data access layer.  This effort gave rise to the MetaDataAdapter. 

The idea behind the MetaDataAdapter is to wrap an object around the meta-data layer that converts between the business domain entities and the meta-data dataset.  Ideally we would have used a class based on .NET generics employing reflection to convert, however due to time constraints, we opted for a MetaDataAdapter class per business domain.
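In spirit, each adapter is just a two-way mapping. A JavaScript sketch of one per-domain adapter (our real classes are .NET, and every name here, from the class to the column names, is invented for illustration): a plain object with column-name keys stands in for the meta-data dataset row.

```javascript
// A per-domain adapter converting between a domain entity and the
// generic column/value rows of the meta-data layer.
class CustomerMetaDataAdapter {
  // meta-data row -> strongly shaped domain entity
  toEntity(row) {
    return { id: row["CUST_ID"], name: row["CUST_NAME"] };
  }
  // domain entity -> meta-data row for the existing data access framework
  toRow(entity) {
    return { CUST_ID: entity.id, CUST_NAME: entity.name };
  }
}
```

The generics-plus-reflection version we passed on would replace the hand-written mappings with a single class driven by the meta-data itself; the per-domain classes trade that elegance for speed of delivery.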

This approach provides us with the good, eliminates the ugly, and removes enough of the bad to make it palatable.  We are achieving code re-use, leaner objects and strongly typed business processing.  Good, good and good.

Hopefully as the project moves forward from design through development, we will not come to regret this decision.

Lest We Forget…

November 11, 2008
“If history repeats itself, and the unexpected always happens, how incapable must Man be of learning from experience.”
 
George Bernard Shaw

The realities of war are so far removed from our everyday lives, that it’s easy to marginalize the sacrifice that these men and women are making.  I can only hope that my children’s children’s children never have to experience this.  I know that the only way this will happen is if we learn from history and do our very best to see it not repeat.

“Those who cannot learn from history are doomed to repeat it.”

George Santayana

From a simple software developer in Ontario, Canada, thank you to all the men and women of our armed forces for their sacrifices.

Hello world!

October 30, 2008

I can’t leave a newly created blog open without a post.  So…Hello and welcome to Musings From A Cube. 

People will find random thoughts, meditations, contemplations and lessons learned on life in a cube and my experiences with Microsoft technology, specifically .NET.
