Is it a legitimate optimisation to simply create a really HUGE source file that initialises a vector with hundreds of thousands of values manually, rather than parsing a text file with the same values into a vector?
Sorry, that could probably be worded better. The function that parses the text file is very slow due to C++'s stream reading being very slow (it takes about 6 minutes, as opposed to about 6 seconds in the C# version).
Would making a massive array-initialisation file be a legitimate solution? It doesn't seem elegant, but if it's faster then I suppose it's better?
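For illustration, the generated file might look something like this sketch (the name `kPathTable` is made up; the real file would be emitted by a small script from the same text data):

    // paths_generated.cpp -- hypothetical output of a code-generation step
    #include <vector>

    // One innermost list per path; the layout mirrors pathLookupVectors.
    const std::vector<std::vector<std::vector<int>>> kPathTable = {
        {
            {0, 5, 3, 12, 65, 87},
            {0, 31, 7},
            // ...hundreds of thousands more paths...
        },
        // ...one block per start node...
    };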
This is the file-reading code:
    // Parses the text path table into the engine.
    // (Relies on <fstream>, <sstream>, <string>, <vector> and <iterator>,
    // with using namespace std;)
    void Level::PopulatePathVectors(string pathTable)
    {
        // Read the file line by line; each line holds one path.
        ifstream myFile(pathTable);
        for (unsigned int i = 0; i < nodes.size(); i++)
        {
            pathLookupVectors.push_back(vector<vector<int>>());
            for (unsigned int j = 0; j < nodes.size(); j++)
            {
                string line;
                if (getline(myFile, line)) // enter if a line is read successfully
                {
                    // Tokenise the line into ints; extraction stops at the
                    // first non-numeric token.
                    stringstream ss(line);
                    istream_iterator<int> begin(ss), end;
                    pathLookupVectors[i].push_back(vector<int>(begin, end));
                }
            }
        }
        myFile.close();
    }
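In case the bottleneck is the per-line stringstream rather than the file itself, here is a sketch of a variant I could time against it. It assumes the same `nodes` and `pathLookupVectors` members and `using namespace std;` as above, parsing each line with `strtol` instead (the trailing non-numeric sentinel shown in the sample line below ends each line cleanly):

    #include <cstdlib> // strtol

    void Level::PopulatePathVectors(string pathTable)
    {
        ifstream myFile(pathTable);
        pathLookupVectors.reserve(nodes.size());
        for (unsigned int i = 0; i < nodes.size(); i++)
        {
            pathLookupVectors.push_back(vector<vector<int>>());
            pathLookupVectors[i].reserve(nodes.size());
            for (unsigned int j = 0; j < nodes.size(); j++)
            {
                string line;
                if (!getline(myFile, line))
                    break; // ran out of lines early
                vector<int> path;
                const char* p = line.c_str();
                char* endp;
                // strtol leaves endp == p when no digits are found, so the
                // trailing sentinel ends the loop.
                for (long v = strtol(p, &endp, 10); endp != p;
                     v = strtol(p, &endp, 10))
                {
                    path.push_back(static_cast<int>(v));
                    p = endp;
                }
                pathLookupVectors[i].push_back(path);
            }
        }
    }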
A sample line from the text file (which contains about half a million lines of similar format but varying length):
    0 5 3 12 65 87 n
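The trailing `n` effectively acts as an end-of-line sentinel: `istream_iterator<int>` stops at the first token that fails to parse as an int, so the line above becomes `{0, 5, 3, 12, 65, 87}`. A minimal standalone check of that behaviour:

    #include <iostream>
    #include <iterator>
    #include <sstream>
    #include <vector>

    int main()
    {
        std::stringstream ss("0 5 3 12 65 87 n");
        std::istream_iterator<int> begin(ss), end;
        std::vector<int> path(begin, end); // extraction stops at "n"
        for (int v : path)
            std::cout << v << ' ';         // prints: 0 5 3 12 65 87
        std::cout << '\n';
    }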