Forum Discussion
Handling non-detects (<0.123) in a database
mtarler thanks for the response, and sorry for the delay in my reply as well as for not making myself clear. My source data contains values with the less-than symbol, e.g. "<0.01", so when I sum or average this data it results in an error in the pivot table. I was hoping there is a way to avoid these calculation errors.
ultankcc so in your OP you mention a couple of options you were considering (replacing "<xxx" with a value = xxx/2 or = xxx), and you mention that you have a pivot table that is replacing them with "<LOC". So I have a number of questions.
Are you looking for advice on which way is best with respect to data analysis, on how to accomplish this replacement, or for alternative solutions?
With respect to the pivot table, there are ways to create custom data columns in a pivot table, but are you only using the pivot table to calculate your averages? If so, and you are just reporting that average, I could imagine a formula that calculates the average without the pivot table at all. I'm just trying to understand which direction to go.
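For instance, and this is just a sketch assuming your raw results sit in A2:A100, every non-detect is stored as text like "<0.01", and you want non-detects counted at half the reported limit (one of the options from your OP), a helper column could convert everything to numbers first:

B2:  =IF(ISNUMBER(A2), A2, VALUE(SUBSTITUTE(A2,"<",""))/2)

Copy that down, then =AVERAGE(B2:B100) gives the average directly, or you could point the pivot table at column B instead of column A so it never sees the text entries.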
You also mention duplicate values being a problem. What exactly do you mean by that? Either approach can be affected by duplicates, and it would be helpful to understand that issue up front rather than work up a solution for you only to have to address it later.
As for the data analysis viewpoint, maybe do it both ways, or all three, to give a min-max bracket around the average as a rough confidence interval (i.e. for "<LOC" assume min = 0, avg = LOC/2, max = LOC). It's a bit more work, but if it is critical to know the tolerance of your value, that might be best.
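If you go that route, the same helper-column idea extends to three columns. Again just a sketch, assuming the data is in A2:A100 and every non-detect looks like "<number":

B2 (min, non-detect = 0):      =IF(ISNUMBER(A2), A2, 0)
C2 (mid, non-detect = LOC/2):  =IF(ISNUMBER(A2), A2, VALUE(SUBSTITUTE(A2,"<",""))/2)
D2 (max, non-detect = LOC):    =IF(ISNUMBER(A2), A2, VALUE(SUBSTITUTE(A2,"<","")))

Then =AVERAGE(B2:B100), =AVERAGE(C2:C100) and =AVERAGE(D2:D100) give you the lower bound, the middle estimate and the upper bound on the true average.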