I was intrigued by this example from the Extensions chapter of The Swift Programming Language:
extension Double {
    var km: Double { return self * 1_000.0 }
    var m: Double { return self }
    var cm: Double { return self / 100.0 }
    var mm: Double { return self / 1_000.0 }
    var ft: Double { return self / 3.28084 }
}
let threeFeet = 3.ft
println("Three feet is \(threeFeet) meters")
// prints "Three feet is 0.914399970739201 metersSurely 3 would be an Int, so this would fail to compile because ft isn't defined for Ints? But it doesn't. It appears that in this case the type of that 3 literal is being inferred based on the instance property called on it. The compiler is saying, "he wants to call .ft on that number; the only numeric type with a .ft property is Double, so let's make it a Double."
Which you could imagine getting a bit dangerous; if I subsequently declare a .ft property on Int, the compiler will use that instead:
extension Double {
    var km: Double { return self * 1_000.0 }
    var m: Double { return self }
    var cm: Double { return self / 100.0 }
    var mm: Double { return self / 1_000.0 }
    var ft: Double { return self / 3.28084 }
}
extension Int {
    var ft: Int { return self * 2 } // because why not
}
let threeFeet = 3.ft
println("Three feet is \(threeFeet) meters")
// prints "Three feet is 6 meters