Today I've been thinking about mental (human-memory) algorithms for day-of-week computation. Many of them feel wrong because
- They require a lot of memorisation.
- They are restricted in scope (e.g. they make you memorise the century key).
- They require large-number arithmetic.
My method:
int dayofweek(int y, int m, int d) {  /* returns 0 = Sunday, ..., 6 = Saturday */
    static int t[] = {
        0, 5, 3, 1,  /* t[0..3]:  century offsets, indexed by (y % 400) / 100 */
        0, 3, 2, 5,  /* t[4..15]: month offsets, January..December            */
        0, 3, 5, 1,
        4, 6, 2, 4
    };
    int g = t[m+3] + d; y -= m < 3;  /* month offset + day; Jan/Feb count as the previous year */
    y %= 400;                        /* the Gregorian calendar repeats every 400 years */
    int f = t[y / 100] + (y % 100) + (y % 100) / 4;  /* century + two-digit year + its leap days */
    return (f + g) % 7;
}

Pros: Only 16 digits to remember, and their patterns make them easier still to memorise. Very little working memory is required: at first we hold y, m, d; after int g = t[m+3] + d; y -= m < 3; we only need to hold g and y. The reduction y %= 400 is easy to do in one's head, and the remaining computations operate on two-digit numbers. Once f is computed we only hold f and g, which are summed and taken modulo 7. The algorithm works for every foreseeable date.
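
For example, take 14 March 2025, which was a Friday (a worked check of my own): g = t[3+3] + 14 = 2 + 14 = 16, and y stays 2025 since m >= 3. Then 2025 % 400 = 25, so f = t[0] + 25 + 25/4 = 0 + 25 + 6 = 31, and finally (31 + 16) % 7 = 47 % 7 = 5, i.e. Friday under the 0 = Sunday convention.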

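As a quick sanity check, here is a minimal test harness (a sketch of my own, assuming the function above is compiled alongside it and returns 0 for Sunday):

#include <assert.h>

int dayofweek(int y, int m, int d);  /* the routine above */

int main(void) {
    assert(dayofweek(2000, 1,  1) == 6);  /* 2000-01-01 was a Saturday                   */
    assert(dayofweek(1969, 7, 20) == 0);  /* 1969-07-20 was a Sunday                     */
    assert(dayofweek(2025, 3, 14) == 5);  /* 2025-03-14 was a Friday                     */
    assert(dayofweek(2100, 2, 28) == 0);  /* 2100-02-28: Sunday; 2100 is not a leap year */
    return 0;
}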