The "measure" of each identity is the sum of the reciprocals of the logarithms to base 10 of all the cotangents appearing in the identity (following a suggestion in a 1938 paper by D. H. Lehmer) - the lower the measure, the better. When there is competition between eligible terms for inclusion in an identity, our software selects the term which makes the least contribution to the measure.
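As a concrete illustration (our own sketch, not the selection software itself), the measure falls directly out of the definition; Machin's classic identity π/4 = 4[5] - [239] makes a convenient test case:

```python
from math import log10

def lehmer_measure(cotangents):
    """Sum of reciprocals of the base-10 logarithms of the cotangent values."""
    return sum(1 / log10(c) for c in cotangents)

# Machin's identity pi/4 = 4[5] - [239] involves the cotangents 5 and 239.
# Each distinct cotangent contributes once, regardless of its coefficient.
print(round(lehmer_measure([5, 239]), 5))  # 1.85113
```

Note that the coefficients (4 and -1 here) play no part: only the set of cotangent values enters the measure.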
Until August 2006, we followed a convention that the contribution to the measure of an identity of any "half-integer" term should be the sum of the contributions of the substituted integers, i.e. for a term [N/2] the contribution was to be 1/Log(N) + 1/Log(N(N²+3)/2). This was in part due to our desire to exclude fractional cotangent terms from our identities, half-integer terms only being admitted because they could easily be "disguised" as pairs of integer terms (though this did not stop us recommending that, in any actual calculation, "half-integer cotangents should be evaluated in their original form"). The negative effect of this convention was its inherent bias against half-integer terms when "lower measure" is the criterion for inclusion. Now that half-integer terms have proved their worth and appear in our "best" identities, we have concluded that it makes more sense to use 1/Log(N/2) as the contribution to the measure of a half-integer term [N/2]. This has the immediate effect of reducing the measures of all (pre-August 2006) identities which include half-integer terms; in the longer term, all identities in our database must be reviewed to ensure that eligible half-integer terms previously rejected are now included - fortunately our existing software makes this not too difficult.
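The two conventions can be set side by side in a short sketch (the function names are ours). The "disguise" behind the old convention is the substitution [N/2] = 2[N] - [N(N²+3)/2] for odd N, which is easily checked numerically via arccot x = arctan(1/x):

```python
from math import atan, log10

def old_contribution(n):
    """Pre-August 2006 convention: [n/2] counted as its integer 'disguise',
    contributing 1/Log(n) + 1/Log(n(n^2+3)/2)."""
    return 1 / log10(n) + 1 / log10(n * (n * n + 3) // 2)

def new_contribution(n):
    """Current convention: [n/2] contributes 1/Log(n/2) directly."""
    return 1 / log10(n / 2)

# Check the disguise identity [n/2] = 2[n] - [n(n^2+3)/2] for an odd n:
n = 239
assert abs(2 * atan(1 / n) - atan(2 / (n * (n * n + 3))) - atan(2 / n)) < 1e-12

# For a large half-integer term such as [239/2], the new convention
# yields the smaller (better) contribution:
print(old_contribution(239))  # ~0.5668
print(new_contribution(239))  # ~0.4814
```

For large N the old convention plainly overstates the cost of [N/2], which is the bias described above; for small N (e.g. N = 5) the ordering can reverse, but such terms do not drive the "best" identities.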
Lehmer's measure is intended to be commensurate with the time required for serial evaluation, to a specified precision, of all the inverse cotangents in an identity (e.g. by applying Gregory's series). It has been pointed out that any computer program for evaluating an inverse cotangent may follow different paths depending on the cotangent value - e.g. multiplying by an operand longer than the "normal" word-length will usually involve more program steps. We have not attempted to apply any "weighting corrections" on this account. In practice, the larger the number of digits in an identity's leading term, the higher its measure is likely to be - had it been the other way round, the validity of this type of measure would be much more questionable.
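The connection with serial evaluation time can be made explicit: Gregory's series arccot n = 1/n - 1/(3n³) + 1/(5n⁵) - ... gains roughly 2·Log(n) decimal digits per term, so attaining P digits needs about P/(2·Log(n)) terms - proportional to 1/Log(n), exactly that cotangent's contribution to the measure. A crude term count (our own sketch, stopping once a term falls below 10⁻ᴾ) bears this out:

```python
from math import log10

def gregory_terms(n, digits):
    """Count the terms of arccot(n) = 1/n - 1/(3n^3) + 1/(5n^5) - ...
    whose magnitude is at least 10**-digits (a rough proxy for serial work)."""
    count = 0
    while (2 * count + 1) * n ** (2 * count + 1) <= 10 ** digits:
        count += 1
    return count

# Smaller cotangents need more terms, in rough proportion to 1/log10(n):
print(gregory_terms(5, 16), gregory_terms(239, 16))  # 10 3
```

The ratio 10/3 sits close to Log(239)/Log(5) ≈ 3.4, as the proportionality predicts.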
When D (the number of digits in an identity's leading term, assumed to contain the least cotangent value) and T (the number of terms in the identity) are high, it is quite possible that distributed computation (for instance, making use of up to T computers simultaneously) might be exploited in performing an evaluation, in which case the relative merits of two identities may be judged simply by their leading terms, which involve the smallest cotangent values.
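Under that assumption the serial sum becomes a bottleneck: with one processor per term, total time is governed by the slowest inverse cotangent, i.e. the leading term's contribution alone. A hypothetical comparison (the "parallel measure" name is ours, for illustration only), using Machin's two-term identity and Gauss's three-term identity π/4 = 12[18] + 8[57] - 5[239]:

```python
from math import log10

def serial_measure(cots):
    """Lehmer's measure: sum of 1/log10 over the cotangent values."""
    return sum(1 / log10(c) for c in cots)

def parallel_measure(cots):
    """Hypothetical bottleneck measure with one processor per term:
    the leading (smallest) cotangent dominates."""
    return max(1 / log10(c) for c in cots)

machin = [5, 239]        # Machin, 1706
gauss = [18, 57, 239]    # Gauss's three-term identity
print(serial_measure(machin), parallel_measure(machin))
print(serial_measure(gauss), parallel_measure(gauss))
```

Serially the two identities are close (about 1.85 against 1.79), but in parallel Gauss's identity wins decisively (about 0.80 against 1.43), purely on the strength of its larger leading cotangent.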
Lehmer made his proposal before the advent of electronic computers, and accordingly included suggestions reflecting the practical advantages, at that time, of using cotangent values involving powers of ten. In our present context, these suggestions have been ignored (though some historic examples of identities featuring such terms occur in our lists).
We also gratefully follow Lehmer in our use of [n] to represent arccot n.