
I am trying to build a 10 Gbps Ethernet channel between an FPGA and a Windows 7 PC. For a rough estimate, I measured the bandwidth usage of the link using the Networking tab of Windows Task Manager. Later I measured the throughput with Wireshark, computing it as the total number of bytes divided by the time taken, times eight: (N bytes / T seconds) * 8 bits per second.
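In code form, the calculation I am doing is simply the following (a minimal sketch; the example numbers are made up, and `total_bytes` / `elapsed_seconds` would come from Wireshark's capture summary):

```python
def throughput_gbps(total_bytes: int, elapsed_seconds: float) -> float:
    """Throughput in Gbit/s: (N bytes / T seconds) * 8."""
    return total_bytes * 8 / elapsed_seconds / 1e9

# Made-up example: 6.25 GB captured in 10 s -> 5.0 Gbps
print(throughput_gbps(6_250_000_000, 10.0))
```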

At lower line rates (<= 5 Gbps), the two measurements matched closely. However, as I increased the data rate above 5 Gbps, the graph in Task Manager rose accordingly, but the data rate obtained from Wireshark dropped to around 2.5 to 3 Gbps.

I can only guess that this is an OS-level problem. I understand that line rate and data rate are two different things altogether; please correct me if I am wrong.

My questions are:

  1. Where in the stack does Task Manager measure the line rate?
  2. Where in the stack does Wireshark capture the packets?

I read this post and understand that Wireshark captures packets between the NIC driver and the higher layers (the transport layer, I would guess), but I am not sure.
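To sanity-check Wireshark's numbers, I could count captured bytes myself at the same driver-level hook with something like this scapy sketch (an assumption on my part: it needs scapy plus the WinPcap/Npcap capture driver that Wireshark also uses, and the interface name "Ethernet 2" is a placeholder):

```python
from scapy.all import sniff

total = 0

def count_bytes(pkt):
    # Accumulate the on-wire length of every captured frame
    global total
    total += len(pkt)

# Capture for 10 seconds on the 10G interface (placeholder name)
sniff(iface="Ethernet 2", prn=count_bytes, timeout=10)
print(f"Captured {total * 8 / 1e9:.2f} Gbit in 10 s")
```

If this script sees the same ~2.5 to 3 Gbps, the loss would be in the capture path rather than in Wireshark itself.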

UPDATE

I have tried checking with Resource Monitor. It shows the number of bytes being transferred to a particular application (if I am not wrong). When I transfer data from the FPGA at a rate of 10 Gbps, Resource Monitor shows a transfer rate of about 1.26 gigabytes per second (roughly 10 gigabits per second!). This is confusing me even more.

Why is Wireshark missing these packets?

More Info and NIC Params:

I am using the UDP protocol, which could be one possible reason for packet loss at speeds above 5 Gbps. The packets from the FPGA are 16060 bytes long (16000 bytes of payload and 60 bytes of headers). I have configured the NIC as follows (a minimal receive test is sketched after this list):

  1. Receive buffer set to 60000 (max 65535).
  2. Jumbo frames enabled, 16128 bytes (maximum value).
  3. RSS enabled.
  4. Checksum offloading enabled for UDP and TCP.
  5. Number of RSS processors set to 6 (the PC has 6 physical cores).
  6. Interrupt coalescing set to 25 us (not sure about this one).
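Here is the kind of minimal UDP sink I could use to test the receive path with Wireshark out of the loop (a sketch only: port 5005 is a placeholder, Windows may silently cap the SO_RCVBUF request, and Python itself will likely bottleneck well below 10 Gbps, so this mainly checks whether the datagrams reach the socket at all):

```python
import socket
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Ask for a large kernel receive buffer; the OS may grant less
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 8 * 1024 * 1024)
sock.settimeout(1.0)
sock.bind(("0.0.0.0", 5005))  # placeholder port

received = 0
start = time.time()
while time.time() - start < 10:
    try:
        data, _ = sock.recvfrom(65535)
        received += len(data)
    except socket.timeout:
        continue

elapsed = time.time() - start
print(f"{received * 8 / elapsed / 1e9:.2f} Gbps of payload received")
```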

I am new to networking, so any help will be greatly appreciated.
