Defect #12680

Estimated Time value is corrupted

Added by Konstantin K. about 11 years ago. Updated about 11 years ago.

Status: Confirmed
Priority: Low
Assignee: -
Category: Issues planning
Target version: -
Start date:
Due date:
% Done: 0%
Estimated time:
Resolution:
Affected version:

Description

Estimated Time value is corrupted.

Steps to reproduce:
1. Open an issue.
2. Set Estimated Time to '999999999999999999999999999999'.

Expected results:
Estimated Time is equal to '999999999999999999999999999999'.

Actual results:
A different, corrupted value is shown (see the attached screenshot, wrong-time.jpg).
Version used:
2.1.4.stable.10927


Files

wrong-time.jpg (17.2 KB) - Konstantin K., 2012-12-27 12:33

Related issues

Related to Redmine - Defect #12955: Misbehaviour in estimated time validation (Closed)

#1

Updated by Jean-Philippe Lang about 11 years ago

  • Status changed from New to Confirmed

Yes, this happens when we exceed the floating-point precision:

irb(main):001:0> "%02f" % 999999999999999999999999999999.0
=> "1000000000000000019884624838656.000000" 

A solution would be to validate the estimated time against a reasonable range of values (e.g. 0 - 1,000,000). What do you think?
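
For illustration only (not a patch from this ticket), such a range check could look roughly like this on the Issue model, assuming the standard Rails numericality validation and the estimated_hours attribute:

# Sketch only - the bounds and allow_nil behaviour are assumptions, not the actual Redmine change
class Issue < ActiveRecord::Base
  validates_numericality_of :estimated_hours,
    :greater_than_or_equal_to => 0,
    :less_than_or_equal_to    => 1_000_000,
    :allow_nil                => true
end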

#2

Updated by Daniel Felix about 11 years ago

Hi Jean-Philippe,

I proposed another solution in #12955.

What do you think about changing the datatype from float to decimal, which is much more accurate? We use decimal throughout our analytics because it converts more reliably, handles such big numbers without falling back to exponential notation, and is supported by all databases as well.

This would solve the problem without limiting the estimated time to some value like 1,000,000.
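
As a rough sketch of that idea (table/column names and the precision/scale are assumptions, not a migration from this ticket), the column type change would be something like:

# Sketch only - precision and scale are assumed values
class ChangeEstimatedHoursToDecimal < ActiveRecord::Migration
  def up
    change_column :issues, :estimated_hours, :decimal, :precision => 30, :scale => 2
  end

  def down
    change_column :issues, :estimated_hours, :float
  end
end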

#3

Updated by Jean-Philippe Lang about 11 years ago

A numeric column still has a limit. But the question is: do we really need to store estimated times greater than 1,000,000?

#4

Updated by Mischa The Evil about 11 years ago

Jean-Philippe Lang wrote:

[...] But the question is: do we really need to store estimated times greater than 1,000,000?

I don't think so. IMO a sane range validation is the most pragmatic way to work around these extreme edge-case issues.

#5

Updated by Daniel Felix about 11 years ago

But even with a limit of 1,000,000, floating-point errors could still show up in timelog reports. A sensible limit would be OK, but the datatype still isn't the best choice.
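
As a rough illustration of the kind of drift meant here (an irb sketch, not output from a Redmine report; the float result is approximate):

require 'bigdecimal'

# Naive float accumulation drifts after many additions:
Array.new(10_000, 0.1).inject(:+)                 # => approx. 1000.0000000001588, not exactly 1000

# BigDecimal keeps the decimal sum exact:
Array.new(10_000, BigDecimal("0.1")).inject(:+)   # => exactly 1000 (0.1e4)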

