Category: Software Design
less complex (as in big-O) date algorithm?

I wrote a script in bash that takes the arguments YEAR, MONTH, DAY, "+" or "-", and NUM_DAYS, and either adds or subtracts NUM_DAYS to/from the date provided, returning the date that is NUM_DAYS before or after the initial date.

My question: is there an algorithm that can determine this in one operation? Right now, it cycles through the dates linearly based on NUM_DAYS, so I guess it is O(n). I am looking for one that is less, maybe O(1) or O(n log n).
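
A minimal sketch of the kind of day-at-a-time loop the question describes (the original script is not shown, so the variable handling and the "+"-only case here are my own assumptions). Each pass advances the date by one day, hence O(NUM_DAYS):

#!/bin/bash
# usage: ./adddays.sh YEAR MONTH DAY + NUM_DAYS   (sketch; "-" not handled)

days_in_month() {                 # days_in_month YEAR MONTH
    local y=$1 m=$2
    case $m in
        1|3|5|7|8|10|12) echo 31 ;;
        4|6|9|11)        echo 30 ;;
        2) if (( (y % 4 == 0 && y % 100 != 0) || y % 400 == 0 )); then
               echo 29
           else
               echo 28
           fi ;;
    esac
}

y=$1 m=$2 d=$3 n=$5
for (( i = 0; i < n; i++ )); do   # one iteration per day -> O(NUM_DAYS)
    (( d++ ))
    if (( d > $(days_in_month "$y" "$m") )); then
        d=1
        (( m++ ))
        if (( m > 12 )); then m=1; (( y++ )); fi
    fi
done
echo "$y $m $d"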

Thanks,
Brett

If you first determine whether the number of days will overflow your current month, then see whether the rest will overflow the current year, then check how many whole years fit in, then (coming back down again) whether the rest is greater than a year, then whether the rest is enough for a month, etc., you can make it run in O(1) operations at the cost of somewhat higher code complexity.

You have to check how long a year/month is, depending on the start date.
Checks that (in the worst case) have to be done:
one for the current year,
one for the n years in between,
one for the last year,
one for the current month,
one for the months up to the next year,
one for the months in the last year.
Actually the complexity rises as O(log(days)) up to the region of 400 years and then stays constant afterwards (if you don't include calendar changes in history).
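
For concreteness, here is a hypothetical, simplified sketch of that overflow/carry idea: it only jumps month by month, so it is O(months spanned) rather than the full constant-number-of-checks scheme with year-sized (and 400-year-sized) jumps, but it shows where the carries happen:

days_in_month() {                 # helper: length of a month, leap-year aware
    local y=$1 m=$2
    case $m in
        1|3|5|7|8|10|12) echo 31 ;;
        4|6|9|11)        echo 30 ;;
        2) if (( (y % 4 == 0 && y % 100 != 0) || y % 400 == 0 )); then
               echo 29
           else
               echo 28
           fi ;;
    esac
}

add_days() {                      # add_days YEAR MONTH DAY NUM_DAYS
    local y=$1 m=$2 d=$3 n=$4 left
    while (( n > 0 )); do
        left=$(( $(days_in_month "$y" "$m") - d ))   # days left in this month
        if (( n <= left )); then
            (( d += n )); n=0                        # fits inside this month
        else
            (( n -= left + 1 ))                      # carry into the next month
            d=1; (( m++ ))
            if (( m > 12 )); then m=1; (( y++ )); fi
        fi
    done
    echo "$y $m $d"
}

add_days 2023 12 30 35            # prints: 2024 2 3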

Hope this helps to keep you busy implementing for some time :)

O(n log n) is worse than O(n)

do it this way:

http://unix.about.com/library/weekly/aa070901g.htm
and here
http://unix.about.com/library/weekly/aa070901h.htm
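
Those pages are not quoted here, so treat the following as my own sketch of the standard constant-time technique they point toward: convert the civil date to a serial day count (the Julian Day Number, using the well-known Fliegel-Van Flandern integer formulas for the Gregorian calendar), add or subtract NUM_DAYS, and convert back. The whole thing is a fixed number of arithmetic operations, i.e. O(1):

#!/bin/bash

to_jdn() {                        # to_jdn YEAR MONTH DAY  -> prints the Julian Day Number
    local y=$1 m=$2 d=$3 a
    a=$(( (14 - m) / 12 ))
    y=$(( y + 4800 - a ))
    m=$(( m + 12*a - 3 ))
    echo $(( d + (153*m + 2)/5 + 365*y + y/4 - y/100 + y/400 - 32045 ))
}

from_jdn() {                      # from_jdn JDN  -> prints "YEAR MONTH DAY"
    local j=$1 a b c d e m
    a=$(( j + 32044 ))
    b=$(( (4*a + 3) / 146097 ))
    c=$(( a - 146097*b/4 ))
    d=$(( (4*c + 3) / 1461 ))
    e=$(( c - 1461*d/4 ))
    m=$(( (5*e + 2) / 153 ))
    echo $(( 100*b + d - 4800 + m/10 )) \
         $(( m + 3 - 12*(m/10) )) \
         $(( e - (153*m + 2)/5 + 1 ))
}

shift_date() {                    # shift_date YEAR MONTH DAY +|- NUM_DAYS
    local jdn
    jdn=$(to_jdn "$1" "$2" "$3")
    from_jdn $(( jdn $4 $5 ))     # "+" or "-" is spliced into the arithmetic
}

shift_date 2024 2 28 + 2          # prints: 2024 3 1
shift_date 2000 3 1 - 1           # prints: 2000 2 29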

martinus
