I'm working on a .NET Compact Framework application and need to boost performance. The app currently works offline by serializing objects to XML and storing them in a database. A profiling tool showed that this serialization was a significant overhead and was slowing the app down. I thought switching to binary serialization would improve performance, but since that isn't supported on the Compact Framework I looked at protobuf-net instead. Serialization seems quicker, but deserialization is much slower, and the app does more deserializing than serializing.
Should binary serialization be faster, and if so, what can I do to speed it up? Here's a snippet of how I'm using both XML and binary serialization:
XML serialization:
public string Serialize(T obj)
{
  XmlSerializer serializer = new XmlSerializer(typeof(T));
  using (MemoryStream stream = new MemoryStream())
  {
    // Write the object to the stream as UTF-8 XML
    XmlTextWriter writer = new XmlTextWriter(stream, Encoding.UTF8);
    serializer.Serialize(writer, obj);
    writer.Flush();

    byte[] buffer = stream.ToArray();
    return Encoding.UTF8.GetString(buffer, 0, buffer.Length);
  }
}
public T Deserialize(string xml)
{
  XmlSerializer serializer = new XmlSerializer(typeof(T));
  using (MemoryStream stream = new MemoryStream(Encoding.UTF8.GetBytes(xml)))
  {
    // Read the UTF-8 XML back into an object
    return (T)serializer.Deserialize(stream);
  }
}
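Both helpers are called the same way from the data layer; a simplified round trip looks like this (XmlHelper<T> and the Order class here are just stand-ins for my real type names):

// Illustrative only: XmlHelper<T> and Order stand in for my actual classes.
Order order = new Order();
order.Id = 1;

XmlHelper<Order> xmlHelper = new XmlHelper<Order>();
string xml = xmlHelper.Serialize(order);          // text stored in the database
Order roundTripped = xmlHelper.Deserialize(xml);  // read back when the app loads data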
Protobuf-net Binary serialization:
public byte[] Serialize(T obj)
{
  using (MemoryStream memoryStream = new MemoryStream())
  {
    // protobuf-net writes the object in its compact binary wire format
    Serializer.Serialize(memoryStream, obj);
    return memoryStream.ToArray();
  }
}
public T Deserialize(byte[] serializedType)
{
  using (MemoryStream memoryStream = new MemoryStream(serializedType))
  {
    // protobuf-net reads the binary data back into an object
    return Serializer.Deserialize<T>(memoryStream);
  }
}
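For completeness, the classes passed in as T are decorated for protobuf-net roughly like this (a cut-down example; the real classes have many more fields, and the same classes are used with the XmlSerializer):

// Simplified stand-in for one of the real data classes
[ProtoContract]
public class Order
{
  [ProtoMember(1)]
  public int Id { get; set; }

  [ProtoMember(2)]
  public string CustomerName { get; set; }

  [ProtoMember(3)]
  public DateTime CreatedOn { get; set; }
}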