String to Numeric and Vice-Versa in Java

How often do you write code that converts String input to a number? How often do you need to convert it back?
For example, consider the following code:
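The original snippet was not preserved here; a minimal sketch of the round trip being described (class name is hypothetical) might look like:

```java
public class IntRoundTrip {
    public static void main(String[] args) {
        // String -> int: parse the literal
        int n = Integer.parseInt("123123123");
        // int -> String: println converts the value back via String.valueOf
        System.out.println(n); // prints 123123123
    }
}
```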

There are two conversions here. First, the literal “123123123” is converted to an Integer, and then that Integer is converted back to a String (System.out.println calls its toString() method). So what do you expect the output to be? It should be the same as the original String argument. While this is true for the example above, it is not the case when operating with floating-point numbers. What do you expect the output of this code to be:
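Again, the original snippet is missing; a sketch of the floating-point variant (class name is an assumption) would be:

```java
public class FloatRoundTrip {
    public static void main(String[] args) {
        // String -> float: 123123123 cannot be represented exactly in a float
        float f = Float.parseFloat("123123123");
        // float -> String: println uses Float.toString
        System.out.println(f); // prints 1.2312312E8
    }
}
```

A float has only 24 bits of significand, so the nearest representable value to 123123123 is 123123120, and Float.toString renders it in scientific notation.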

Surprisingly or not (hopefully not), it is “1.2312312E8”.

Yes, that’s right! After converting a String to Float/Double and then back to String, the value is different from the one we started with. This is not a big deal unless you develop an application that handles numbers without considering this behavior.

Let’s say there is a common method that converts Strings to numbers. Something like this:
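The method itself is not shown in the source; a plausible shape, assuming the class and method names, would be:

```java
public final class NumberParser {
    // Declared to return Number so callers don't depend on the concrete type
    public static Number parse(String value) {
        return Integer.parseInt(value);
    }
}
```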

At some point in time a defect comes up that the system cannot work with floating point numbers. So the easy fix would be:
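Assuming the same hypothetical NumberParser, the one-line "fix" would swap the parse call:

```java
public final class NumberParser {
    // "Fix": parse everything as a float so fractional input no longer throws.
    // The return type is still Number, so existing callers compile unchanged.
    public static Number parse(String value) {
        return Float.parseFloat(value);
    }
}
```

The callers compile, but any code that turns the result back into a String now sees “1.2312312E8” instead of “123123123”.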

This looks clever at first sight: the method is declared to return Number, so its callers won’t break, and the fix is in one place only. However, once this fix is deployed, all the code that converts the number back to String will start to behave in an unexpected manner. There will be validation errors, wrong data display, and so on. For example, code that accepts a String argument that is expected to be an integer representation will start getting values like “1.2312312E8”.
As an example, see the program below.
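The original program was not preserved with this post. Below is a minimal reconstruction of the same failure mode; the class and method names are hypothetical, and its line numbers will not match the ones referenced in the original (24 and 35):

```java
public class StringNumberDemo {

    static Number parse(String value) {
        // Change to Float.parseFloat(value) to reproduce the failure
        return Integer.parseInt(value);
    }

    // Downstream validation that expects a plain integer representation
    static void validate(String value) {
        if (!value.matches("\\d+")) {
            throw new IllegalArgumentException("Not an integer: " + value);
        }
    }

    public static void main(String[] args) {
        Number n = parse(args[0]);
        String display = n.toString(); // back to String for display/validation
        System.out.println("Displaying: " + display);
        validate(display);             // throws once parse() returns a Float
        System.out.println("Validated: " + display);
    }
}
```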

Run it with the argument “123123123”. Now change the parsing calls on lines 24 and 35 from Integer.parseInt to Float.parseFloat and run it again without changing the argument.

Yes, I know that most of this is bad code and a lack of good engineering. However, keep in mind that as a system grows, it can get entangled, and simple changes can lead to massive failures.
