They're pseudo-random number generators, not truly random ones. This is often a good thing since it allows you to reproduce bugs more easily where "random" numbers are involved.
You can get truly random numbers, such as by reading /dev/random under Linux, but the normal generators that ship with C libraries generally aren't truly random.
The simplest ones are linear congruential generators, where:
n_(x+1) = (n_x * A + C) mod M
with suitably chosen values of A, C and M.
Wikipedia's page on LCGs gives some sample values used by various implementations. For example, the glibc one listed there has A = 1103515245, C = 12345 and M = 2^31, so it's a simple thing like:
static unsigned int seed = 1;

/* Remember the new seed; masking with 0x7fffffff keeps the state in
   the range 0 .. 2^31 - 1 (i.e., reduces it modulo M = 2^31). */
void srand (int newseed) {
    seed = (unsigned)newseed & 0x7fffffffU;
}

/* Advance the state using the glibc constants A = 1103515245 and
   C = 12345, again reducing modulo 2^31, and return it. */
int rand (void) {
    seed = (seed * 1103515245U + 12345U) & 0x7fffffffU;
    return (int)seed;
}
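To see that reproducibility in action, here's a hypothetical little test driver (assuming the srand/rand above are in the same source file; their signatures differ slightly from the standard <stdlib.h> ones, so this is only for playing with the snippet). Seeding with the same value always produces the same sequence:

#include <stdio.h>

/* Same seed in, same "random" numbers out, which is exactly what makes
   bugs involving rand() reproducible. */
int main (void) {
    int i;
    srand (42);
    for (i = 0; i < 5; i++)
        printf ("%d\n", rand ());
    return 0;
}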
Aside: the glibc implementation still has this generator in it (called the Type 0 generator), but it also has a fancier trinomial generator, which is (presumably) better.
There are also more complex generators (such as the Mersenne twister) that have a much longer period (the number of values generated before the sequence starts to repeat).
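To make the idea of a period concrete, here's a toy sketch of my own (not taken from any real library) that uses a deliberately tiny modulus so the whole cycle is visible; with A = 5, C = 3 and M = 16 the generator visits every value from 0 to 15 exactly once before repeating:

#include <stdio.h>

/* Toy LCG: n = (n * 5 + 3) mod 16. These small constants satisfy the
   usual full-period conditions, so the period is the full 16 values. */
int main (void) {
    unsigned int n = 1, start = 1, period = 0;
    do {
        printf ("%u ", n);
        n = (n * 5 + 3) % 16;
        period++;
    } while (n != start);
    printf ("\nperiod = %u\n", period);
    return 0;
}

For comparison, the Mersenne twister's period is 2^19937 - 1, so with a real generator you'll never actually see it wrap around.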
Any truly random generator must use a truly random input source, which is why /dev/random will sometimes block "waiting for entropy" while /dev/urandom won't.
"Truly" random sources may be affected by timing between keystrokes, data entered by users, the contents of network packets, disk I/O patterns, time taken for an ICMP response to come back over the network and all sorts of other wondrous, mostly non-deterministic things.
Unless you're heavily into crypto, normal random number generators should be just fine.