Let's say my server program has two threads (T1 and T2) running on separate cores. Both are serving RPCs coming in over the network from a single external client. The following sequence of operations occurs:
1. In-memory variable `foo` is initialized to zero
2. Client sends RPC, which happens to be served by T1, to set `foo` to 42
3. T1 writes the value to `foo`; the write is cached in its core's L1 (not main memory)
4. T1 sends ACK to client
5. Client sends RPC, which happens to be served by T2, to read `foo`
6. T2 reads `foo` from its cache or main memory and sees that it is zero
7. T2 replies to client saying `foo` is zero
This violates external consistency: the client has already been told its write succeeded, yet a later read returns the old value.
Can this actually occur, or is there an implicit flush of T1's cache when it performs the I/O of sending the ACK back to the client (step 4)?
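To make the scenario concrete, here is a minimal sketch of what the two handlers boil down to, assuming each one reduces to a single plain access to `foo`. The RPC and network layers are elided, the function names are made up, and nothing here enforces the real-world ordering (the client only issues the read after receiving the ACK); the point is just that no language-level synchronization connects T1's store to T2's load.

```cpp
#include <cstdio>
#include <thread>

// Shared in-memory variable from the scenario above.
// Deliberately a plain int: no std::atomic, no mutex.
int foo = 0;

// Roughly what T1's handler does (steps 3 and 4): store to foo, then "ACK".
// Hypothetical name; the real RPC machinery is elided.
void handle_set_rpc(int value) {
    foo = value;  // plain store: a data race with the load in handle_get_rpc
    std::printf("T1: ACK sent after setting foo\n");  // stands in for the ACK I/O
}

// Roughly what T2's handler does (steps 6 and 7): load foo and reply.
void handle_get_rpc() {
    int observed = foo;  // plain load; no synchronizes-with edge to T1's store
    std::printf("T2: replying that foo = %d\n", observed);
}

int main() {
    std::thread t1(handle_set_rpc, 42);
    std::thread t2(handle_get_rpc);
    t1.join();
    t2.join();
    return 0;
}
```

In the real system the client only sends the read RPC after it has received the ACK, which is exactly the ordering the external-consistency concern is about; the sketch just pins down that the program itself does nothing to synchronize the two accesses to `foo`.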