As the OP stated in a comment, the XML field is required. Then, as D T said, you should store it in a separate table. Further, you can store this field after COMPRESSING it in C#. Here are some stats from tests I ran using BLToolKit and also EntityFramework.
I read a text file with 70,000 lines of 70 characters each, for a total length of 5,039,998 characters. Below are the stats I want to share. I tested only 35 records because, without compression, each row required around 9 MB of space in the database.

As you can see, there was an OutOfMemory exception; I didn't dig into that, since I just wanted to see the timings. What I actually did is this:
When you insert data into that table, COMPRESS the string before inserting it. I used varbinary(MAX) as the column's data type. When you fetch data, fetch it only when you actually want to process it, and don't forget to DECOMPRESS it. In SQL Server, fetching the compressed records took 0 seconds.
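Just to sketch that flow (the table name, column names and connection string here are made up, and it assumes the Zip/Unzip methods shown further down are available in the same class), the insert and fetch could look something like this:

using System.Data;
using System.Data.SqlClient;

// Sketch only: "Documents" table with an int "Id" and a varbinary(MAX) "XmlData" column are hypothetical.
public static void InsertCompressed(string connectionString, int id, string xml)
{
    byte[] compressed = Zip(xml); // compress before it ever reaches the database
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("INSERT INTO Documents (Id, XmlData) VALUES (@id, @data)", conn))
    {
        cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
        cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = compressed; // -1 = MAX
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

public static string FetchDecompressed(string connectionString, int id)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT XmlData FROM Documents WHERE Id = @id", conn))
    {
        cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
        conn.Open();
        byte[] compressed = (byte[])cmd.ExecuteScalar();
        return Unzip(compressed); // decompress only when you actually need to process it
    }
}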
Below is the code for compression and decompression. Reference For The Code
using System.IO;
using System.IO.Compression;
using System.Text;

// Copies a stream in 4 KB chunks (for .NET versions without Stream.CopyTo).
public static void CopyTo(Stream src, Stream dest)
{
    byte[] bytes = new byte[4096];
    int cnt;
    while ((cnt = src.Read(bytes, 0, bytes.Length)) != 0)
    {
        dest.Write(bytes, 0, cnt);
    }
}

// Compresses a string into a GZip byte array (this is what goes into the varbinary(MAX) column).
public static byte[] Zip(string str)
{
    var bytes = Encoding.UTF8.GetBytes(str);
    using (var msi = new MemoryStream(bytes))
    using (var mso = new MemoryStream())
    {
        using (var gs = new GZipStream(mso, CompressionMode.Compress))
        {
            //msi.CopyTo(gs);
            CopyTo(msi, gs);
        }
        return mso.ToArray();
    }
}

// Decompresses a GZip byte array back into the original string.
public static string Unzip(byte[] bytes)
{
    using (var msi = new MemoryStream(bytes))
    using (var mso = new MemoryStream())
    {
        using (var gs = new GZipStream(msi, CompressionMode.Decompress))
        {
            //gs.CopyTo(mso);
            CopyTo(gs, mso);
        }
        return Encoding.UTF8.GetString(mso.ToArray());
    }
}
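And a quick round trip to sanity-check the helpers (the sample string here is just made-up repetitive text; needs using System; and using System.Linq; in addition to the usings above):

// Round-trip check: compress a string, then decompress it back.
string original = string.Concat(Enumerable.Repeat("<item>value</item>", 1000));
byte[] compressed = Zip(original);
string restored = Unzip(compressed);

Console.WriteLine(original.Length);      // 18000 characters
Console.WriteLine(compressed.Length);    // far smaller, since repetitive text compresses well
Console.WriteLine(restored == original); // True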
Hope that helps you.