_screen.brightness = _screen.brightness - 0.1;
This line of code gives me an unexpected result.
When I call NSLog(@"%.2f", _screen.brightness - 0.1);, it prints
-0.00. And when I test if (_screen.brightness == 0), it returns NO.
Why does this happen? Is there some conversion problem?
Here are the accessor methods in the class of the _screen object:
- (CGFloat)brightness {
    // Brightness is derived from the dimming view's alpha.
    return 1 - _dimmingView.alpha;
}

- (void)setBrightness:(CGFloat)brightness {
    // Silently ignore out-of-range values.
    if (brightness < self.minValue || brightness > self.maxValue) {
        return;
    }
    _dimmingView.alpha = 1 - brightness;
}
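As a workaround I've been comparing with a tolerance instead of ==. This is my own helper (the name and tolerance are mine, not from any API) — is this the right approach?

```c
#include <math.h>
#include <stdbool.h>

/* Hypothetical helper: treat two floating-point values as equal if
   they differ by less than a small tolerance. 1e-9 is far larger
   than the rounding error of 0.1 but far smaller than any
   brightness step I actually use. */
static bool nearly_equal(double a, double b, double tolerance) {
    return fabs(a - b) <= tolerance;
}
```

With this, nearly_equal(_screen.brightness, 0.0, 1e-9) returns true where _screen.brightness == 0 returns NO.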