I created a view with two Image Views side by side to compare the outputs.
The left side is from ARFrame.capturedImage.
The right side is from ARSCNView.snapshot().
You can see that the left side is slightly brighter than the right, even though the camera is pointed at a plain white wall.
I need both sources to return the same pixel values for the same object, so that I can later copy certain pixels from one image to the other without obvious contours appearing due to the difference in brightness.
Is it perhaps due to how I am converting the CVPixelBuffer into a UIImage?
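For context, the conversion I have in mind is the usual Core Image route. A minimal sketch (the function name `uiImage(from:)` is mine; note that CIContext applies a working color space by default, which could be one source of the brightness difference):

```swift
import UIKit
import CoreImage
import CoreVideo

/// Converts an ARFrame's CVPixelBuffer into a UIImage via Core Image.
func uiImage(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

    // Passing NSNull() disables the working color space, so no extra
    // color conversion is applied during rendering. (Assumption: this
    // may matter for matching snapshot() output; worth experimenting.)
    let context = CIContext(options: [.workingColorSpace: NSNull()])

    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(pixelBuffer),
                      height: CVPixelBufferGetHeight(pixelBuffer))

    guard let cgImage = context.createCGImage(ciImage, from: rect) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Note that ARSCNView.snapshot() renders the scene (including lighting and any post-processing), while capturedImage is the raw camera buffer, so some difference may remain even with an identical color pipeline.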
