I just can't explain what I think in 140 chars, so here is a longer text.
Note that my goal is not to say “I'm right” but to be sure we understand each other. (hey, if I'm right it's good for my ego, but if I'm not it's even better for my intellect).
So please tell me if you don't agree ;-)
// C
// the compiler does pointer arithmetic: the string literal decays to a
// char* and 3 is added to that address (adding two pointers is illegal)
"hey" + 3; // => some address, e.g. 31444870
// JS
// the interpreter converts 3 into a string, then concatenates the two strings
"hey" + 3; // => returns "hey3"
# Ruby
# the String#+ method checks the class of 3, then raises a TypeError
"hey" + 3 # => raises "TypeError: no implicit conversion of Fixnum into String"
The problem here is not the types themselves, it's the implicit conversions that some languages do and others don't.
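To make that concrete, here is a small sketch (my own, not from the snippets above): Ruby happily does the same concatenation once you ask for the conversion explicitly, so it's the implicitness, not the operation, that differs.

```ruby
# Ruby refuses the implicit conversion, but an explicit one is fine:
"hey" + 3.to_s # => "hey3"
3.to_s + "hey" # => "3hey"

# the programmer, not the language, decides how 3 becomes a string
```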
# Ruby
# The TypeError will be raised only when the bad pattern is
# encountered, so we could have stuff like this:
[
->(i){ 1 + i },
->(s){ "hey" + s },
].sample.(3) # will it work?
# I don't think it is possible to do this in Haskell
# Ruby
# a variable can refer to objects of many classes
a = 3
a = "a"
a = true
// C
// C refuses to put an int in a variable whose type is a struct
struct { int a; } s;
s = 1; // compilation error
“What on earth is a type?”
There is absolutely no such thing in the processor, so types are a concept we use to reason about our code while we are writing/reading it.
For me, adding a type is setting a constraint (“this variable has to behave like this”).
In Ruby, everything we put in a variable honours exactly 1 contract: we can send a message to it (and we have a shortcut to send one, the “o.meth(args)” syntax).
So we can have no constraint at all: whatever you give to a function, it will respect this contract.
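A small sketch of that single contract in action (the `shout` function is my own illustration): Ruby checks nothing about the argument's class, it only cares that the object answers the messages we send it.

```ruby
# No constraint up front: any object that responds to the messages
# we send (here, #to_s and then #upcase) will do.
def shout(o)
  o.to_s.upcase
end

shout("hey")  # => "HEY"
shout(3)      # => "3"
shout([1, 2]) # => "[1, 2]"
```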
You can force a function to accept only instances of some classes by adding checks at runtime (“raise "Oh no" unless o.is_a? String”), but it's just a runtime check, not a type check. I mean, is there really a difference between this last example and “raise "Oh no" unless i < 34”?
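To put the two checks side by side, a sketch (the helper names are mine): both are ordinary code, both blow up at call time with the same kind of error, and neither is resolved before the program runs.

```ruby
# a "class" check and a "value" check: both are plain runtime checks
def check_class(o)
  raise "Oh no" unless o.is_a?(String)
  o
end

def check_value(i)
  raise "Oh no" unless i < 34
  i
end

check_class("hey") # => "hey"
check_value(3)     # => 3

begin
  check_class(3)
rescue RuntimeError => e
  e.message # => "Oh no", the same failure mode as check_value(100)
end
```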
That's why I think:
- C is weakly typed, not because of the “"hey" + 3” but because of the casts (“(char *)3”).
- Ruby has only 1 type (Object), so it's equivalent to having no type at all.
- Every Ruby object has a class, which can't be changed. So Ruby is not weak in this regard.
Does this seem valid to you?
I can't end up with “"hey" + 3” in Ruby, but I can write “"hey" << " " << 3” thinking I am appending "3" to the string (it is the same mistake we made when doing “"hey" + 3” in JS)…

What I say is: this example is IMHO not a good one: it is only a check performed at runtime. It is a library thing, not a language one. Let me re-use the example I wrote in the gist: if I code a function which takes an hour and raises an error when receiving -1 or 25, that is also a runtime check, but it has nothing to do with types (well… if you code in some esoteric language like Idris it might :-)). I could even name the raised error “DependentTypeError”, it won't add this feature to Ruby.

In Ruby, “3” is an Integer. It is not and will never be an instance of another class. That's where Ruby is stronger than C. But here I'm talking about classes, not types. Are types and classes the same thing? If yes, then Ruby has strong types (as JS does, as far as I know).
(I tend to separate these 2 things when coding in Scala because of the interfaces and type aliases, but this separation makes no sense in Ruby…)
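Back to the “"hey" << " " << 3” trap mentioned above, a runnable sketch of what actually happens: `String#<<` doesn't raise on an Integer, it interprets it as a codepoint, which is arguably a better example of a surprising conversion than the `String#+` one.

```ruby
s = "hey" << " " << 3
# No TypeError: String#<< treats an Integer as a codepoint and
# appends that character (here ETX, 0x03), not the digit "3".
s       # => "hey \x03"
s.bytes # => [104, 101, 121, 32, 3]
```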