#include <iostream>
#include <stdint.h>
class Test {
  public:
  Test(const int64_t & val) : val_(val) {
    std::cout << "initialized: " << val_ << std::endl;
  }
  void print() {std::cout << "reference val: " << val_ << std::endl;}
  private:
  const int64_t & val_;
};
int main() {
  long long int input_val = 1628020800000000000L;
  auto t = Test(input_val);
  std::cout << "input_val: " << input_val << std::endl; 
  t.print();
}
If you build without optimizations, you get the following:
g++ main.cpp -std=c++17
initialized: 1628020800000000000
input_val: 1628020800000000000
reference val: 1628020800000000000
If you build with optimizations, e.g. -O3, you get the following:
g++ main.cpp -std=c++17 -O3
initialized: 1628020800000000000
input_val: 1628020800000000000
reference val: 0
I'm guessing this difference is due to the casting/treatment of input_val being of type long long int, but I don't really understand why that's the case. I thought a long was at least 32 bits and a long long was at least twice as wide as a long. Since the reference is to a const int64_t, I thought there wouldn't be any casting issues.
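For what it's worth, a quick check of what int64_t actually is on the machine doing the build makes the type relationship visible. This is just a sketch, assuming a 64-bit Linux/glibc target where int64_t is typically a typedef for long rather than long long:

#include <iostream>
#include <stdint.h>
#include <type_traits>
int main() {
  // If the first line prints true and the second false, a long long argument
  // cannot bind directly to a const int64_t& without a conversion.
  std::cout << std::boolalpha
            << std::is_same_v<int64_t, long> << std::endl        // often true on 64-bit Linux
            << std::is_same_v<int64_t, long long> << std::endl;  // often false there
}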
I realize that if I switch the long long int to an int64_t this wouldn't be an issue, but I want to understand why it happens.
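For reference, the int64_t variant I mean only changes the declaration of input_val (the Test class stays exactly as above), so the const int64_t & parameter can bind to it directly:

int main() {
  int64_t input_val = 1628020800000000000L;  // now the same type as the reference member
  auto t = Test(input_val);
  std::cout << "input_val: " << input_val << std::endl;
  t.print();
}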
g++/gcc version:
g++ (GCC) 9.1.1 20190605 (Red Hat 9.1.1-2)
Copyright (C) 2019 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
EDIT:
The original version of this code had input_val (original static_val) as a global static variable. I changed the example to be simpler since the issue is not predicated on the variable being a global static var.