I'm attempting to write an integration test proving that a TCP client correctly times out when the server is too slow to complete the connection. I have a FakeServer class that opens a Socket and listens for incoming connections:
public sealed class FakeServer : IDisposable
{
    ...

    public TimeSpan ConnectDelay { get; set; }

    public void Start()
    {
        this.CreateSocket();
        this.socket.Listen(int.MaxValue);
        this.socket.BeginAccept(this.OnSocketAccepted, null);
    }

    private void CreateSocket()
    {
        // Listen on all interfaces (0.0.0.0) on the class's well-known Port.
        var ip = new IPAddress(new byte[] { 0, 0, 0, 0 });
        var endPoint = new IPEndPoint(ip, Port);
        this.socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        this.socket.Bind(endPoint);
    }

    private void OnSocketAccepted(IAsyncResult asyncResult)
    {
        // Delay the accept in an attempt to simulate a slow server.
        Thread.Sleep(this.ConnectDelay);
        this.clientSocket = this.socket.EndAccept(asyncResult);
    }
}
Notice my attempt to delay the success of the connection via a call to Thread.Sleep(). Unfortunately, this does not work:
[Fact]
public void tcp_client_test()
{
    // Ask the fake server to wait 20 seconds before accepting.
    this.fakeServer.ConnectDelay = TimeSpan.FromSeconds(20);

    var tcpClient = new TcpClient();

    // Expected: this blocks (and eventually times out). Actual: it returns immediately.
    tcpClient.Connect("localhost", FakeServer.Port);
}
In the test above, the call to tcpClient.Connect() succeeds immediately, before the server-side OnSocketAccepted method is even called. I've looked around the APIs and I can't see any obvious way to inject server-side logic that must complete before the client's connection is established.
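As far as I can tell, the OS completes the TCP handshake and queues the connection in the listen backlog as soon as Listen() has been called, regardless of whether Accept is ever invoked, which a minimal sketch like the following would demonstrate (port 12345 is just an arbitrary placeholder, and the types come from System.Net / System.Net.Sockets as in the FakeServer above):
// Minimal sketch, not my real test: port 12345 is an arbitrary placeholder.
// No Accept/BeginAccept call is made at all, yet Connect() still succeeds,
// because the OS finishes the handshake and parks the connection in the
// listen backlog as soon as Listen() has been called.
var listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
listener.Bind(new IPEndPoint(IPAddress.Loopback, 12345));
listener.Listen(1);

var client = new TcpClient();
client.Connect("localhost", 12345); // returns immediately; client.Connected is true
If that's right, then no amount of sleeping inside OnSocketAccepted can slow down what the client observes as the connect.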
Is there any way for me to fake a slow server/connection using TcpClient and Socket?