In Lua 5.3, numbers can represent both integers and floats, whereas in previous versions they were always floats (doubles). From what I read, this was done so bitwise operations could be supported in a more natural way.
Thinking about it, this change created an exception among Lua's types: while every other type represents essentially one thing, number now represents two (integer and float). And they are not really the same thing, even though both are, obviously, numbers.
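To make the "two things" concrete, the subtype is observable from Lua 5.3 code via the standard `math.type` function, and the bitwise operators only accept values with an exact integer representation:

```lua
print(math.type(3))      --> integer
print(math.type(3.0))    --> float
print(3 == 3.0)          --> true   (equal as numbers, different subtypes)
print(math.type(3 // 1)) --> integer (floor division of integers stays integer)
print(0xF0 | 0x0F)       --> 255    (bitwise ops operate on integers)
-- A float without an exact integer value is rejected:
-- print(1.5 & 1)        --> error: number has no integer representation
```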
So, my question: why weren't strings used for bitwise support, given that they are, after all, plain byte sequences? That way, number could have remained float-only.
And: why not split number into separate int and float types? Wouldn't that make code more predictable, maybe even more natural?