I'm trying to write a function that initializes a VAO at a specific index with some data. The problem is that when I take sizeof of the vertices parameter, it returns the wrong size.
Here is the data:
typedef struct {
    GLfloat XYZW[4];
    GLfloat RGBA[4];
} Vertex;
const Vertex Vertices2[] =
{
    { { 0.25f, 0.25f, 0.0f, 1.0f },{ 1.0f, 0.0f, 0.0f, 1.0f } },
    { { 0.75f, 0.25f, 0.0f, 1.0f },{ 0.0f, 1.0f, 0.0f, 1.0f } },
    { { 0.50f, 0.75f, 0.0f, 1.0f },{ 0.0f, 0.0f, 1.0f, 1.0f } }
};
const GLubyte Indices2[] = {....};
I call the function like this:
createArrayObjects(0, Vertices2, Indices2);
void createArrayObjects(int index, const Vertex vertices[], const GLubyte indices[]){
    cout << sizeof(vertices) << endl;  // ---> returns 4
    cout << sizeof(Vertices2) << endl; // ---> returns 96
...
}
If I use sizeof(Vertices2) to fill the VBO (roughly as in the sketch below), the program runs fine.
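The buffer-filling code isn't shown above, but it is presumably something along these lines (a minimal sketch with assumed handle names and setup, not the actual code):

// Sketch of the working case: uploading the vertex data using the
// compile-time size of the global array (vboID and the surrounding
// setup are assumptions; a current GL context is required).
GLuint vboID;
glGenBuffers(1, &vboID);
glBindBuffer(GL_ARRAY_BUFFER, vboID);
// sizeof(Vertices2) is 96 here: 3 vertices * 8 GLfloats * 4 bytes each.
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices2), Vertices2, GL_STATIC_DRAW);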
Without the correct size of the input vertices, I can't fill the VAO and VBO and visualize the data correctly.