When it comes down to it, all our data is just made of 1s and 0s. But data in the real world is obviously much more complex than that, so we impose extra meaning on top of it. We take it for granted that we can represent numbers, text, and date/time information in our programs... indeed, these are three of the core building blocks used in just about every data model imaginable.
So how is it that it's all so broken? Why is Stack Overflow full of questions about arithmetic being "broken" because someone used float or double? Is there any hope for things ever getting better?
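For a taste of what those questions look like, here is a minimal Java sketch of the classic surprise (the talk's own examples may differ, but the underlying behaviour is the same for any IEEE 754 double):

    public class FloatDemo {
        public static void main(String[] args) {
            // 0.1 and 0.2 have no exact binary representation,
            // so their sum isn't exactly 0.3 either.
            double sum = 0.1 + 0.2;
            System.out.println(sum);        // prints 0.30000000000000004
            System.out.println(sum == 0.3); // prints false
        }
    }

Nothing here is actually broken: binary floating point simply cannot represent most decimal fractions exactly, and the "weird" output is the correct result of well-defined rules.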
In this talk, I'll give some examples of what's wrong with the world, allocate blame to just about everyone involved, and give a few suggestions to avoid getting burned.
Jon has been a Microsoft MVP for more than 13 years and is currently a software engineer at Google in London. He is a Stack Overflow contributor and the author of C# in Depth.