According to the JSON spec, the correct way to represent a null value is the literal null.
If that is the case, why does WCF return an empty response body instead of null? Is this a bug, or is this behaviour documented somewhere?
Complete repro example:
using System;
using System.ServiceModel;
using System.ServiceModel.Web;
[ServiceContract()]
public class Service1
{
    [OperationContract(), WebGet(ResponseFormat = WebMessageFormat.Json)]
    public string GetSomeString() { return "SomeString"; }

    // Returns null, which I would expect to be serialized as the JSON literal null
    [OperationContract(), WebGet(ResponseFormat = WebMessageFormat.Json)]
    public string GetNull() { return null; }
}
public class Host
{
    public static void Main()
    {
        // Very simple WCF server
        var host = new WebServiceHost(typeof(Service1), new Uri("http://localhost:8000/"));
        host.AddServiceEndpoint(typeof(Service1), new WebHttpBinding() {
            HostNameComparisonMode = HostNameComparisonMode.Exact
        }, "");
        host.Open();
        Console.WriteLine("Service is running, press enter to quit...");
        Console.ReadLine();
        host.Close();
    }
}
Expected result:
$ curl http://localhost:8000/GetSomeString && echo
"SomeString"
$ curl http://localhost:8000/GetNull && echo
null
$
Actual result:
$ curl http://localhost:8000/GetSomeString && echo
"SomeString"
$ curl http://localhost:8000/GetNull && echo
$
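For comparison, serializing a null string with DataContractJsonSerializer directly does seem to produce the literal null, so the empty body looks like it comes from the WCF response pipeline rather than from the serializer itself. A minimal sketch of that check (this comparison is my own, not taken from any documentation):

using System;
using System.IO;
using System.Runtime.Serialization.Json;
using System.Text;

public class SerializerCheck
{
    public static void Main()
    {
        var serializer = new DataContractJsonSerializer(typeof(string));

        using (var stream = new MemoryStream())
        {
            // Serializing a null graph appears to write the JSON literal: null
            serializer.WriteObject(stream, null);
            Console.WriteLine(Encoding.UTF8.GetString(stream.ToArray()));
        }

        using (var stream = new MemoryStream())
        {
            // Serializing a normal string for comparison writes: "SomeString"
            serializer.WriteObject(stream, "SomeString");
            Console.WriteLine(Encoding.UTF8.GetString(stream.ToArray()));
        }
    }
}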