The most efficient way, though not the idiomatic C++ one, would be:
   FILE* f = fopen(filename, "rb");
   // Determine file size
   fseek(f, 0, SEEK_END);
   size_t size = ftell(f);
   char* where = new char[size];
   rewind(f);
   // Read the whole file in one call
   fread(where, sizeof(char), size, f);
   fclose(f);
   delete[] where;
EDIT 2:
I've now also tested the std::filebuf variant. It looks like the best C++ approach, even though it is less a pure C++ approach than a thin wrapper over the C one. Either way, here is the chunk of code that runs almost as fast as plain C.
   std::ifstream file(filename, std::ios::binary);
   // Determine file size
   file.seekg(0, std::ios::end);
   std::streamsize size = file.tellg();
   file.seekg(0, std::ios::beg);
   std::streambuf* raw_buffer = file.rdbuf();
   char* block = new char[size];
   // Pull the whole file through the filebuf in one call
   raw_buffer->sgetn(block, size);
   delete[] block;
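For completeness, the same filebuf read can be written without manual `new`/`delete[]` by letting a `std::vector` own the buffer; this is a sketch, and `read_whole_file` is just an illustrative name, not something from the benchmark above:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read an entire file into a vector via the underlying filebuf.
// The vector frees its storage automatically, so no delete[] is needed.
std::vector<char> read_whole_file(const std::string& filename) {
    std::ifstream file(filename, std::ios::binary);
    // Determine file size by seeking to the end
    file.seekg(0, std::ios::end);
    std::streamsize size = file.tellg();
    file.seekg(0, std::ios::beg);

    std::vector<char> block(static_cast<std::size_t>(size));
    // One sgetn call pulls the whole file through the filebuf
    file.rdbuf()->sgetn(block.data(), size);
    return block;
}
```

The ownership change should not cost anything measurable here, since the single allocation and the single `sgetn` call are unchanged.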
I've done a quick benchmark, and the results are as follows. The test read a 65536K binary file opened in the appropriate binary mode (std::ios::binary for C++, "rb" for C).
[==========] Running 4 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 4 tests from IO
[ RUN      ] IO.C_Kotti
[       OK ] IO.C_Kotti (78 ms)
[ RUN      ] IO.CPP_Nikko
[       OK ] IO.CPP_Nikko (106 ms)
[ RUN      ] IO.CPP_Beckmann
[       OK ] IO.CPP_Beckmann (1891 ms)
[ RUN      ] IO.CPP_Neil
[       OK ] IO.CPP_Neil (234 ms)
[----------] 4 tests from IO (2309 ms total)
[----------] Global test environment tear-down
[==========] 4 tests from 1 test case ran. (2309 ms total)
[  PASSED  ] 4 tests.