I'm trying to create image-snapshot tests for UIView. Unfortunately, my CI machines have a @1x pixel-to-point ratio while my local machine has @2x, so essentially I need to render a UIView on a @1x machine as it would look on a @2x machine.
My code looks like this:
```swift
let contentsScale: CGFloat = 2
view.contentScaleFactor = contentsScale
view.layer.contentsScale = contentsScale

let format = UIGraphicsImageRendererFormat()
format.scale = contentsScale

let renderer = UIGraphicsImageRenderer(size: view.bounds.size, format: format)
let image = renderer.image { ctx in
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
}
```
The problem is that by the time execution reaches CALayer.draw(in ctx: CGContext) inside drawHierarchy, view.contentScaleFactor and view.layer.contentsScale are back to 1 (or whatever UIScreen.main.scale is). It happens in this call stack:
* MyUIView.contentScaleFactor
* _UIDrawViewRectAfterCommit
* closure #1 in convertViewToImage
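That's how MyUIView.contentScaleFactor ends up at the top of the stack: I'm observing the property in my subclass, roughly like this:

```swift
class MyUIView: UIView {
    override var contentScaleFactor: CGFloat {
        didSet {
            // A breakpoint here fires from _UIDrawViewRectAfterCommit,
            // with the value reset to UIScreen.main.scale (1 on the CI machine).
            print("contentScaleFactor ->", contentScaleFactor)
        }
    }
}
```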
I also noticed a `; _moveViewToTemporaryWindow` comment in the assembly of the _UIDrawViewRectAfterCommit call, which I guess means the framework attaches my view to some temporary window, and that resets the scale. I tried setting the scale again in didMoveToWindow(), i.e. right before the drawing, but the view still comes out pixelated even though view.contentScaleFactor is correct while the layer renders.
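Concretely, the didMoveToWindow attempt looked roughly like this (the scale is hard-coded to 2 here just for the test):

```swift
override func didMoveToWindow() {
    super.didMoveToWindow()
    // Re-apply the desired scale after the framework moves the view
    // to its temporary window. The values stick (they read back as 2
    // inside CALayer.draw(in:)), but the output is still pixelated.
    contentScaleFactor = 2
    layer.contentsScale = 2
}
```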
I've seen people try to solve this by applying a scale transform to the CGContext, but that makes no sense to me: it only enlarges the output, the underlying rendering quality is not scaled.
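For reference, the CGContext approach I mean is something along these lines (rendering into a 1x context at twice the size and scaling the transform):

```swift
let format = UIGraphicsImageRendererFormat()
format.scale = 1
let doubled = CGSize(width: view.bounds.width * 2, height: view.bounds.height * 2)
let renderer = UIGraphicsImageRenderer(size: doubled, format: format)
let image = renderer.image { ctx in
    ctx.cgContext.scaleBy(x: 2, y: 2)
    // The layer contents were rasterized at 1x, so this just
    // enlarges existing pixels instead of drawing at 2x.
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
}
```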
So what am I missing? How do I render a UIView into an image at a desired scale?