While trying to answer mythz's question, "How can we create a generic Array Extension that sums Number types in Swift?", I ran into some really odd behavior.
Is this a bug, or am I crazy?
Working examples of converting the literal 0 to a floating-point value (both result in 0.0):
var zero_float1: Float = 0
var zero_float2: Double = 0
Keep in mind that Float32 is a typealias of Float, and Float64 is a typealias of Double.
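A quick sanity check of my own that the aliases name the very same types (so assignment across them needs no conversion at all):

let a: Float32 = 1.5
let b: Float = a       // compiles with no conversion: Float32 is Float
let c: Float64 = 2.5
let d: Double = c      // compiles with no conversion: Float64 is Double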
extension Array {
    func get_zero() -> T? {
        // T is the array's element type (Swift 1 syntax; later renamed Element)
        return 0 as? T    // conditional cast of the literal 0 to the element type
    }
}
This returns 0:
Array<Int>().get_zero()
but these return nil:
Array<Float>().get_zero()
Array<Double>().get_zero()
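If I spell out what I suspect the compiler is doing, the nils start to make sense: inside the generic method, T gives the literal no context, so 0 should get its default type, Int, and as? then becomes a runtime cast of an Int instance to T. A minimal sketch of that reading (the helper name get_zero_explicit is mine, and this illustrates my hypothesis, not confirmed compiler internals):

extension Array {
    func get_zero_explicit() -> T? {
        let zero: Int = 0    // what I believe happens to the bare literal 0
        return zero as? T    // runtime cast of an Int, so it succeeds only when T is Int
    }
}

If that reading is right, this should behave exactly like get_zero() above: 0 for Array<Int>, nil for the floating-point element types.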
OK, so maybe it really is because the literal is an integer? Let's try a float literal instead:
extension Array {
    func get_zero_float_literal() -> T? {
        return 0.0 as? T    // same idea, but with a float literal
    }
}
The following return nil:
Array<Float>().get_zero_float_literal()
Array<Int>().get_zero_float_literal()
Array<Float32>().get_zero_float_literal()
But these return 0.0 — whaaa?
Array<Float64>().get_zero_float_literal()
Array<Double>().get_zero_float_literal()
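That pattern fits the same literal-defaulting story: with T opaque, 0.0 should default to Double, so the runtime cast succeeds only when T is Double (and Float64, which is the very same type) and fails for Float, Float32, and Int. The defaults are easy to observe outside any generic code (a tiny check of my own; type(of:) assumes a Swift 3 or later toolchain, but a Swift 1 playground shows the same inferred types in its results pane):

let i = 0      // an integer literal defaults to Int
let d = 0.0    // a float literal defaults to Double
print(type(of: i))    // Int
print(type(of: d))    // Double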
My understanding is that with Array<Float>, you should be able to substitute Float wherever T appears. But it seems there is some caveat (or a bug?).
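If that is what is happening, the caveat is less about generics than about as? itself: with a concrete type, 0 as Float is a compile-time literal coercion, while 0 as? T inside the generic method is a runtime checked cast of a value that has already become an Int. A sketch of the difference (variable names are mine, and the nil result reflects the Swift of this era; much later Swift on Apple platforms can bridge numeric types through NSNumber):

let f = 0 as Float       // coercion: the literal is built as a Float, so f == 0.0

let boxed: Any = 0       // by this point the literal has become an Int
let maybe = boxed as? Float
// Runtime checked cast of an Int instance: nil here, matching get_zero()'s behavior.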