Long before I attempted to learn JavaScript, I made Flash games with ActionScript 3, an ECMAScript dialect. Admittedly, this has left me with some weird habits to kick when writing pure JavaScript. ActionScript 3 has JS-like syntax and semantics, but its strong, static typing often makes it feel like a closer cousin to Java than to JS. Now that I'm making the jump to gain a more complete understanding of JS, I find myself struggling to appreciate JavaScript's willingness to coerce values between types, especially when I make mistakes like this:

var stringIThoughtWasNumber = "10";

stringIThoughtWasNumber - 1

>> 9

It's a mistake, but the returned result is the one I'm expecting. So, naturally,

stringIThoughtWasNumber + 1

>> "101"
What's more,

stringIThoughtWasNumber - (-1)

>> 11

stringIThoughtWasNumber + (-1)

>> "10-1"

and so on. Adding a value to the string returns a string, and subtracting returns a number.

While these are probably old-hat issues to seasoned JS developers, I find it challenging to keep track of JavaScript gotchas when I don't understand why the built-in behaviors were designed the way they were. Thankfully, Paul pointed out that this particular behavior occurs because the + operator is overloaded: if either operand is a string, JavaScript coerces the other value to a string and returns the concatenated result. The - operator, by contrast, is defined only for numbers, so the string operand is converted to a number and the subtraction proceeds as expected (or, at least, as I expected when I wrote my buggy program). Thankfully, my cellular automata no longer refuse to propagate in a positive direction!
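To convince myself of the rule, I sketched the two behaviors side by side. The explicit conversions at the end (the Number() call and the unary + shorthand) aren't part of Paul's explanation, just the standard ways to turn the string into a number up front and avoid the surprise altogether:

```javascript
var stringIThoughtWasNumber = "10";

// "+" is overloaded: a string operand means concatenation,
// so the number 1 is coerced to the string "1".
stringIThoughtWasNumber + 1;       // "101"

// "-" is defined only for numbers, so the string "10" is
// coerced to the number 10 before subtracting.
stringIThoughtWasNumber - 1;       // 9

// Converting explicitly sidesteps the gotcha entirely:
var actuallyANumber = Number(stringIThoughtWasNumber);
actuallyANumber + 1;               // 11

// Unary "+" is a common shorthand for the same conversion:
+stringIThoughtWasNumber + 1;      // 11
```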