Created October 24, 2015 16:40
Error on line 23, pointing to the size() call: Cannot convert type 'Int' to expected argument type 'Int'
protocol Matrix {
    typealias ElementType
    static func dimensions() -> (rows: Int, cols: Int)
    static func size() -> Int
    init()
    subscript(row: Int, col: Int) -> ElementType { get set }
}

struct Matrix4<T: FloatingPointType> : Matrix {
    static func dimensions() -> (rows: Int, cols: Int) {
        return (4, 4)
    }

    static func size() -> Int {
        let dimensions = Matrix4.dimensions()
        return dimensions.rows * dimensions.cols
    }

    // The size() call in this default value is what triggers the spurious
    // "Cannot convert type 'Int' to expected argument type 'Int'" error.
    private var data: [T] = [T](count: Matrix4.size(), repeatedValue: T(0))

    init() {
        data = [T](count: Matrix4.size(), repeatedValue: T(0))
    }

    // The gist left these accessor bodies empty; row-major indexing
    // (element (row, col) at row * cols + col) is assumed here.
    subscript(row: Int, col: Int) -> T {
        get {
            return data[row * Matrix4.dimensions().cols + col]
        }
        set(value) {
            data[row * Matrix4.dimensions().cols + col] = value
        }
    }
}
This turned out to be a name lookup bug. You can work around it in Swift 2 by using "Matrix4<T>.size()" to explicitly provide the type binding, but it is now fixed for Swift 3 as of Swift.org commit b3ac017.
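In terms of the gist above, the workaround amounts to spelling the generic parameter on the failing property declaration. A one-line sketch in Swift 2 syntax, assuming the rest of Matrix4 is unchanged:

// Writing Matrix4<T> explicitly supplies the type binding the compiler
// fails to resolve on its own at property-initializer scope.
private var data: [T] = [T](count: Matrix4<T>.size(), repeatedValue: T(0))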
Often, changing the definition to an equivalent one yields a different error which sheds more light on the problem (or at least narrows down which subexpression is responsible). For example, changing the definition to:
yields:
With some further experimentation, it looks like Swift doesn’t like expressions involving T at this scope; by contrast, the init you wrote seems fine. Seems radar-worthy.
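Since the same expression is reportedly accepted inside the initializer, another Swift 2-era way around the bug is to drop the default value and populate the storage only in init. A minimal standalone sketch (the struct name Matrix4Alt is hypothetical, and the Matrix conformance is omitted for brevity):

struct Matrix4Alt<T: FloatingPointType> {
    static func size() -> Int {
        return 4 * 4
    }

    // No default value, so nothing involving T is evaluated at
    // property-initializer scope, where the lookup bug appears.
    private var data: [T]

    init() {
        // The equivalent expression type-checks fine in the init body.
        data = [T](count: Matrix4Alt.size(), repeatedValue: T(0))
    }
}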