10-18-2022, 06:44 PM
So, I have a massive database of decimal values with 5 digits after the decimal. I have a routine that tries to find the highest and the lowest of these values. Here is the algorithm I am using, but for some reason it's giving me the Highest value as the Lowest and the Lowest as the Highest.
HL = DataBaseValue
If HL < 1 And HL < Low Then Low = HL
Low = (_Round(Low * 100000)) / 100000
If HL < 1 And HL > High Then High = HL
High = (_Round(High * 100000)) / 100000
The rounding is to avoid scientific notation and to be sure the result will be a 5-digit decimal value.
I can't see why this algorithm would give the High as Low and the Low as High.
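In case it helps, here is the rounding step by itself as a minimal sketch (assuming QB64, which is where _Round comes from; the sample value and variable names are just for illustration):

DIM HL AS DOUBLE, rounded AS DOUBLE
HL = 0.123456789                          ' stand-in for one DataBaseValue
rounded = _Round(HL * 100000) / 100000    ' scale up, round to the nearest whole number, scale back down
PRINT rounded                             ' stays a 5-decimal value, e.g. .12346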