Department of Anthropology, University of Illinois at Urbana-Champaign

Friday 10:45-11:00, Parlors

The optimality of a trait scoring system for age estimation is often defined in terms of lower observer error, which explains, for example, the preference for Demirjian et al.'s scoring system for dental formation over Moorrees et al.'s. But as Harris (2007) points out in the AJPA, observer error studies for ordinal categorical traits will always find that decreasing the number of stages reduces the error level. We argue for testing the distributional assumption in "transition analysis" instead of using observer error studies to assess optimality. We extend a previously described Lagrange multiplier goodness-of-fit test to consider all possible ways of collapsing ordered scores into smaller sets of stages, using as an example Todd scores from 422 males in the Terry Anatomical Collection and from 495 females in either the Terry or the Gilbert and McKern cast collection. To enumerate all possible ways of collapsing stages, the problem is framed in terms of partitioning integers and forming permutations in light of possible ties. Examining all 510 possible ways to collapse stages within males and females, we found that the probability of a revised system of Todd phases I, II, III, IV, V, VI, and VII–X (combined) arising from a lognormal transition model was 0.9962 for males and 0.7849 for females. In contrast, the scores from the original Todd ten-phase system and from the collapsed T2 six-stage system (Suchey and Katz, 1986) yielded very poor goodness-of-fit statistics under both normal and lognormal transition distributions.
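The count of 510 collapsings can be sketched as follows, under one plausible reading of the enumeration: a collapsing merges only adjacent Todd phases (preserving the ordinal structure), so each grouping of the 10 phases corresponds to a choice of cut points among the 9 boundaries, and excluding the two trivial cases (no merging, and merging everything into one stage) leaves 2^9 − 2 = 510. The function name and this reading are illustrative assumptions, not the authors' actual enumeration, which also handles possible ties.

```python
from itertools import combinations

def adjacent_collapsings(n_stages=10):
    """Enumerate groupings of n ordered stages formed by merging
    adjacent stages. Each grouping corresponds to a subset of the
    n-1 boundaries chosen as cut points. (Illustrative sketch.)"""
    cuts = range(1, n_stages)
    for k in range(n_stages):  # number of cut points kept
        for chosen in combinations(cuts, k):
            bounds = (0,) + chosen + (n_stages,)
            # each group is a run of consecutive original stages
            yield [tuple(range(a, b)) for a, b in zip(bounds, bounds[1:])]

groupings = list(adjacent_collapsings(10))
# drop the two trivial groupings: all 10 stages kept, or one merged stage
nontrivial = [g for g in groupings if 1 < len(g) < 10]
print(len(nontrivial))  # 510
```

The revised seven-stage system reported in the abstract (phases I–VI kept separate, VII–X combined) is one element of this enumeration: the grouping with cut points after each of phases I through VI.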

Data collection supported by NSF SBR-9727386 to LWK.