parseInt is meant for strings, so it converts the number to a string first. Once numbers get small enough, that string representation switches to scientific notation: 0.0000001 becomes "1e-7". parseInt then stops at the first character that isn't part of a valid integer, so it ignores the "e-7" and is left with 1.
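You can watch both steps happen in any console:

```javascript
// Step 1: parseInt coerces its argument to a string first.
console.log(String(0.0000001)); // "1e-7" (small numbers stringify in scientific notation)

// Step 2: parsing stops at the first character that isn't part of an
// integer (the "e"), so only the leading "1" survives.
console.log(parseInt(0.0000001)); // 1

// A slightly larger value stringifies as "0.000001", parsing stops at
// the ".", and you get 0 instead.
console.log(parseInt(0.000001)); // 0
```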
Ok some of these I understand but what the fuck. Why.
Edit: ok, I have a theory. == checks equality without casting to another type, so they're not equal. But < and > are numeric operations, so null gets cast to 0. <= and >= therefore also cast it to 0, and 0 is equal to 0, so it's true.
Greater than and less than will cast the type, so it becomes 0 > 0, which is false, of course. 0 >= 0 is true.
Now == will first compare types; they are different types, so it's false.
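For what it's worth, the spec backs the first half of this: the relational operators (<, >, <=, >=) run null through ToNumber, which gives 0, while == never coerces null to a number at all; null is only loosely equal to undefined. Runnable in any console:

```javascript
// Relational operators coerce null with ToNumber, so null behaves as 0:
console.log(null > 0);  // false  (0 > 0)
console.log(null >= 0); // true   (0 >= 0)

// == is specified differently: null loosely equals only undefined,
// it is never converted to a number here.
console.log(null == 0);         // false
console.log(null == undefined); // true
```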
Also, I'm a JavaScript dev, and if I ever see someone I work with use this kind of hack, I'm never working with them again unless they apologize a lot and wash their dirty typing hands with... acid? :-)
Edit: as several people have already pointed out, my answer is not accurate. The real explanation was mentioned by mycus.
I know it’s a joke, but it’s an old one and it doesn’t make a lot of sense in this day and age.
Why are you comparing null to numbers? Shouldn't you be validating your values first? Why are you using the "cast everything to whatever type it sees fit and compare" operator?
Other languages would simply fail. Once more, JavaScript's greatest sin is not throwing an exception when you ask it to do things that don't make sense.