The naive implementation fails on average 64 times before finding a valid BigInteger within the specified range.
In the worst case, my implementation finds a result on the first try more than 50% of the time, so it needs fewer than one retry on average.
Also, unlike modular-arithmetic approaches (which bias the result whenever the source range is not a multiple of the target range), my implementation maintains a uniform distribution.
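To see that bias, here is a toy sketch (illustrative only, not part of the implementation below): reducing a uniform byte modulo 6 over-represents the residues 0 to 3, because 256 is not a multiple of 6.
var counts = new int[6];
for (var b = 0; b <= 255; b++)
    counts[b % 6]++; // tally each residue over all 256 equally likely byte values
// counts is { 43, 43, 43, 43, 42, 42 }: residues 0-3 occur 43 times, 4-5 only 42 times,
// so `randomByte % 6` is measurably non-uniform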
Explanation
We must generate a random BigInteger between min and max.
- If min > max, we swap min with max
- To simplify the implementation, we shift our range from [min, max] to [0, max - min]; this way we won't have to deal with the sign bit
- We count how many bytes max contains (bytes.Length)
- From the most significant bit, we count how many bits are 0 (zeroBits)
- We generate a random sequence of bytes.Length bytes
- We know that for our sequence to be ≤ max, at least zeroBits bits from the most significant bit must be 0, so we use a zeroBitsMask to clear them with a single bitwise & operation over the most significant byte; this saves a lot of time by reducing the chance of generating a number out of our range (see the worked example after this list)
- We check if the number we generated is > max, and if so we try again
- We shift the range back from [0, max - min] to [min, max] by adding min to our result
And we have our number. 
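As a worked example of the mask step (using max = 5, the same bound as the first test below):
byte mostSignificantByte = 5; // max = 5 -> bytes = { 0b00000101 }
var zeroBits = 0;
for (var i = 7; i >= 0; i--)
    if ((mostSignificantByte & (1 << i)) != 0) { zeroBits = 7 - i; break; }
var zeroBitsMask = (byte)(0b11111111 >> zeroBits); // zeroBits = 5, so the mask is 0b00000111
// masked random bytes are uniform over [0, 7]; only 6 and 7 exceed max,
// so each attempt succeeds with probability 6/8 = 75%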
Implementation
// requires: using System.Numerics; using System.Security.Cryptography;
public static BigInteger RandomInRange(RandomNumberGenerator rng, BigInteger min, BigInteger max)
{
    if (min > max)
    {
        var buff = min;
        min = max;
        max = buff;
    }
    // offset to set min = 0
    BigInteger offset = -min;
    min = 0;
    max += offset;
    var value = randomInRangeFromZeroToPositive(rng, max) - offset;
    return value;
}
private static BigInteger randomInRangeFromZeroToPositive(RandomNumberGenerator rng, BigInteger max)
{
    BigInteger value;
    var bytes = max.ToByteArray();
    // count how many bits of the most significant byte are 0
    // NOTE: sign bit is always 0 because `max` must always be positive
    byte zeroBitsMask = 0b00000000;
    var mostSignificantByte = bytes[bytes.Length - 1];
    // we try to set to 0 as many bits as there are in the most significant byte, starting from the left (most significant bits first)
    // NOTE: `i` starts from 7 because the sign bit is always 0
    for (var i = 7; i >= 0; i--)
    {
        // we keep iterating until we find the most significant non-0 bit
        if ((mostSignificantByte & (0b1 << i)) != 0)
        {
            var zeroBits = 7 - i;
            zeroBitsMask = (byte)(0b11111111 >> zeroBits);
            break;
        }
    }
    do
    {
        rng.GetBytes(bytes);
        // set most significant bits to 0 (because `value > max` if any of these bits is 1)
        bytes[bytes.Length - 1] &= zeroBitsMask;
        value = new BigInteger(bytes);
        // `value > max` at most 50% of the time, in which case the fastest way to keep the distribution uniform is to try again
    } while (value > max);
    return value;
}
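Because of the swap and the offset shift in RandomInRange, the bounds may be negative and may be passed in either order. A minimal usage sketch (assuming the two methods above live in a class named BigIntegerUtils, as in the test below):
using (var rng = RandomNumberGenerator.Create())
{
    var a = BigIntegerUtils.RandomInRange(rng, -10, 10);                    // uniform over [-10, 10]
    var b = BigIntegerUtils.RandomInRange(rng, BigInteger.Pow(10, 100), 0); // bounds swapped internally
}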
Test
// requires: using System; using System.Diagnostics; using System.Numerics; using System.Security.Cryptography;
using (var rng = RandomNumberGenerator.Create())
{
    BigInteger min = 0;
    BigInteger max = 5;
    var attempts = 10000000;
    var count = new int[(int)max + 1];
    var sw = Stopwatch.StartNew();
    for (var i = 0; i < attempts; i++)
    {
        var v = BigIntegerUtils.RandomInRange(rng, min, max);
        count[(int)v]++;
    }
    var time = sw.Elapsed;
    Console.WriteLine("Generated {0} big integers from {1} to {2} in {3}", attempts, min, max, time);
    Console.WriteLine("On average: {0} ms/integer or {1} integers/second", time.TotalMilliseconds / attempts, attempts / time.TotalSeconds);
    for (var i = 0; i <= max; i++)
        Console.WriteLine("{0} generated {1}% of the times ({2} times)", i, count[i] * 100d / attempts, count[i]);
}
Test output on my i7-6500U:
Generated 10000000 big integers from 0 to 5 in 00:00:09.5413677
On average: 0.00095413677 ms/integer or 1048067.77334449 integers/second
0 generated 16.66633% of the times (1666633 times)
1 generated 16.6717% of the times (1667170 times)
2 generated 16.66373% of the times (1666373 times)
3 generated 16.6666% of the times (1666660 times)
4 generated 16.68271% of the times (1668271 times)
5 generated 16.64893% of the times (1664893 times)
Another test output on my i7-6500U:
Generated 10000000 big integers from 0 to 10^100 in 00:00:17.5036570
On average: 0.0017503657 ms/integer or 571309.184132207 integers/second