Description
When trying to plot some date-based values, I noticed that the default tick labels show too much precision. I was plotting data points over a minutes-to-hours interval, and the default plot kept showing the times with microsecond precision. Given that the major ticks were minutes apart, this seems like overkill to me (and the extra digits quickly clutter the display, making the labels overlap).
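To reproduce, something along these lines is enough (the timestamps and values below are made up; any series of datetimes a few minutes apart shows the same labels):

import datetime
import matplotlib.pyplot as plt

# Ten points, five minutes apart, so the major ticks end up minutes apart.
times = [datetime.datetime(2013, 4, 1, 12, 0) + datetime.timedelta(minutes=5 * i)
         for i in range(10)]
values = list(range(10))

fig, ax = plt.subplots()
ax.plot(times, values)
fig.autofmt_xdate()  # rotate the labels; the clutter is still obvious
plt.show()

For me the x labels come out with full microsecond precision (e.g. 12:05:00.000000) instead of just 12:05.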
Diving into the code, it seems the default "scaled" member of the
AutoDateFormatter, which determines which format is used for a given
tick spacing, can be improved. Currently, it:
- Uses microsecond precision when the ticks are minutes or less apart.
- Uses second precision when the ticks are hours apart.
- Uses day, month, or year precision when the ticks are days or more
apart (these are fine, so I have lumped them together).
I think it should be like this (a sketch that applies this mapping by
hand follows the list):
- Use microsecond precision when the ticks are less than a second apart.
- Use second precision when the ticks are seconds apart.
- Use minute precision when the ticks are minutes or hours apart.
- Use day, month, or year precision when the ticks are days or more
apart (unchanged).
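For anyone wanting this behaviour right now, it can already be had by overriding the "scaled" dict on an explicit formatter, along these lines (mdates.SEC_PER_DAY and friends are the same module-level constants the diff below uses):

import matplotlib.dates as mdates
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
# ... plot date-based data on ax, as in the reproduction above ...

locator = mdates.AutoDateLocator()
formatter = mdates.AutoDateFormatter(locator)
# Replace the default mapping with the one proposed above.
formatter.scaled = {
    mdates.DAYS_PER_YEAR: '%Y',
    mdates.DAYS_PER_MONTH: '%b %Y',
    1.0: '%b %d %Y',
    1. / mdates.HOURS_PER_DAY: '%H:%M',
    1. / mdates.SEC_PER_DAY: '%H:%M:%S',
    1. / mdates.MUSECONDS_PER_DAY: '%H:%M:%S.%f',
}
ax.xaxis.set_major_locator(locator)
ax.xaxis.set_major_formatter(formatter)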
Note that there is no point in displaying only the hour when the ticks
are hours apart, since it would not be immediately clear that a time is
being displayed. Adding the (technically superfluous) :00 for the
minutes prevents this ambiguity, which is why minute precision should
also be used when the ticks are hours apart.
The above would be implemented by the following change to dates.py
(documentation not updated yet); a quick before/after of the resulting
labels follows the diff:
@@ -563,8 +573,9 @@
         self.scaled = {DAYS_PER_YEAR: '%Y',
                        DAYS_PER_MONTH: '%b %Y',
                        1.0: '%b %d %Y',
-                       1. / HOURS_PER_DAY: '%H:%M:%S',
-                       1. / (MINUTES_PER_DAY): '%H:%M:%S.%f'}
+                       1. / HOURS_PER_DAY: '%H:%M',
+                       1. / SEC_PER_DAY: '%H:%M:%S',
+                       1. / MUSECONDS_PER_DAY: '%H:%M:%S.%f'}

     def __call__(self, x, pos=None):
         locator_unit_scale = float(self._locator._get_unit())
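Just to make the effect on the labels concrete, here is what the old and new formats produce for a tick at 12:05 (plain strftime, no plotting involved):

import datetime

tick = datetime.datetime(2013, 4, 1, 12, 5)
print(tick.strftime('%H:%M:%S.%f'))  # current format for minute-spaced ticks: 12:05:00.000000
print(tick.strftime('%H:%M'))        # proposed format for minute-spaced ticks: 12:05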
Does this sound reasonable?