# Understanding CoPS scores

For each CoPS subtest, results are calculated automatically and are shown both for accuracy (black dots on the report) and speed (blue dots on the report). Of these, accuracy is usually the more important indicator. CoPS results for both accuracy and speed on each subtest are given as Standard Age Scores (SAS). Standard Age Scores, like IQ, are usually expressed with a mean of 100 and a standard deviation of 15. These scores compare the student’s performance with that of the norm-referenced group for the student’s age, in three-month age bands from 4:0 up to 7:11.

Any test score is only an estimate of the student’s ability, based on their performance on a particular day. Performance on any test can be affected by several factors. The CoPS report therefore provides confidence bands, which indicate the range within which a student’s true score is likely to lie. The dot on each subtest row within the table represents the student’s SAS and the horizontal line represents the 90% confidence band. The shaded area shows the average score range. A 90% confidence band is a deliberately cautious estimate: if the test were taken again, we would expect the score to fall within this range 90% of the time.
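The arithmetic behind a confidence band can be sketched as follows. This is a minimal illustration only: the standard error of measurement (SEM) used below is a hypothetical value, since the actual per-subtest SEMs used by CoPS are not given in this text.

```python
def confidence_band(sas, sem, confidence=0.90):
    """Return the (low, high) band around a Standard Age Score.

    sas -- the observed Standard Age Score (mean 100, SD 15)
    sem -- standard error of measurement (hypothetical value here;
           the real CoPS SEM per subtest is not quoted in this chapter)
    """
    # Two-sided normal multipliers: ~1.645 for 90%, ~1.960 for 95%
    z = {0.90: 1.645, 0.95: 1.960}[confidence]
    margin = z * sem
    return (sas - margin, sas + margin)

# An SAS of 88 with an assumed SEM of 4 gives a band of roughly 81 to 95,
# straddling both the 'Below average' and average ranges.
low, high = confidence_band(sas=88, sem=4.0)
```

Note how a borderline score's band can span more than one interpretive range, which is exactly why the chapter cautions against treating single scores as exact.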

## Accuracy scores

How low must a CoPS subtest result be before the teacher should be concerned about the student’s performance? Put another way: what is the critical cut-off point or threshold that can be used when deciding whether or not a given student is ‘at risk’? Unfortunately, this is not a question that can be answered in a straightforward fashion, because much depends on other factors. These include: (a) the particular CoPS subtest under consideration (some subtests are more highly predictive of later literacy difficulties than others), (b) whether the results of other CoPS subtests confirm or disconfirm the result being examined, and (c) the age of the student being tested.

## The Threshold of Concern

Traditionally, a score which falls below an SAS of 85 (i.e. more than one standard deviation below the mean) is by definition significantly below average and thus indicates an area of weakness requiring some intervention. However, as stated at the start of this chapter, any test score is only an estimate of the student’s ability, based on their performance on a particular day. As there is some error in any test score, test scores in the borderline range (i.e. just above SAS 85) could potentially represent ‘true scores’ that are within the ‘at risk’ range.

Therefore, the CoPS report identifies SAS scores of 88–94 as ‘Slightly below average’ and SAS scores of 75–87 as ‘Below average’. Action is recommended where SAS scores fall in either of these ranges, and the CoPS report will refer the tester to the Indications for Action table on the Downloads page, where appropriate. Where there is strong confirmation (e.g. a number of related subtests below an SAS of 88), the assessor can be confident that concern is appropriate.

## The Threshold of Risk

On the other hand, where a student is scoring below a Standard Age Score of 75 on any subtest (near or below two standard deviations below the mean), this generally indicates a serious difficulty and should always be treated as diagnostically significant. Usually this will be a strong indication that a student is at risk of later literacy and/or numeracy difficulties. Remediation by way of training will often be required as well as a differentiated approach to basic skills teaching. The CoPS report identifies SAS scores below 75 as being ‘Very low’ and will refer the tester to the Indications for Action table on the GL website. Again, where there is strong confirmation (e.g. a number of related subtests below SAS 75) then the assessor can be even more confident about the diagnosis.
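Pulling the two thresholds together, the report’s banding of accuracy scores can be sketched as a simple lookup. This is based only on the ranges quoted above; the label for scores of 95 and over is an assumption, not taken from the report.

```python
def sas_band(sas):
    """Classify a Standard Age Score using the bands quoted in the text.

    Below 75  -> 'Very low'               (near/below 2 SD below the mean)
    75-87     -> 'Below average'
    88-94     -> 'Slightly below average'
    95 and up -> 'Average or above'       (label assumed, not from the report)
    """
    if sas < 75:
        return "Very low"
    if sas <= 87:
        return "Below average"
    if sas <= 94:
        return "Slightly below average"
    return "Average or above"

bands = [sas_band(s) for s in (70, 80, 90, 100)]
```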

## Additional scores

The CoPS reports also provide Stanine scores (ST), National Percentile Ranks (NPR), T-Scores and Z-Scores:

- The Stanine places the student’s score on a scale of 1 (low) to 9 (high) and offers a broad overview of performance.
- The National Percentile Rank relates to the SAS and shows the percentage of students obtaining a certain score or below. An NPR of 50 is average, since 50% of students obtained an SAS of 100 or below. An NPR of 5 indicates that a student’s score is within the lowest 5% of the nationally representative sample, and an NPR of 95 means that a student’s score is within the highest 5% of the national sample.
- T-scores have a mean of 50 and a Standard Deviation (SD) of 10, so a T-score of 40 is one SD below the mean and a T-score of 60 is one SD above the mean. 68% of T-scores would fall within the 40–60 range, so a T-score below 40 would be considered below average and a T-score above 60 would be considered above average.
- Finally, Z-scores show us the student’s score in standard deviation units, with a mean of 0 and an SD of 1. So, a Z-score of -1.0 would indicate that the student’s score is one SD below the mean and a Z-score of +1.0 would indicate that the student’s score is one SD above the mean.
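
Since all of these score types describe positions on the same normal distribution, the conversions between them can be sketched directly. This is a rough illustration using the standard normal CDF; CoPS’s own published lookup tables may round differently.

```python
from math import erf, sqrt

def sas_to_z(sas):
    # SAS has a mean of 100 and an SD of 15
    return (sas - 100) / 15

def z_to_t(z):
    # T-scores have a mean of 50 and an SD of 10
    return 50 + 10 * z

def z_to_stanine(z):
    # Stanines have a mean of 5 and an SD of 2, clipped to the 1-9 scale
    return max(1, min(9, round(2 * z + 5)))

def z_to_percentile(z):
    # Percentage of a normal population scoring at or below this z
    return 100 * 0.5 * (1 + erf(z / sqrt(2)))

z = sas_to_z(85)          # one SD below the mean, i.e. -1.0
t = z_to_t(z)             # 40.0
st = z_to_stanine(z)      # stanine 3
npr = z_to_percentile(z)  # roughly 16
```

So a student with an SAS of 85 sits at a Z-score of -1.0, a T-score of 40, stanine 3, and roughly the 16th percentile, consistent with the descriptions above.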

The relationships between these different scores are shown in Figure 29.

**Figure 29. Relationship between scores**

## Differences between subtests

Some CoPS subtests are more highly predictive of later literacy difficulties than others. For example, **Races** and **Rhymes** (given at age 5 years) are the CoPS subtests which most consistently show the best correlation with literacy at 6 years 6 months and 8 years. After **Races** and **Rhymes**, **Wock** shows the next highest correlation, but higher at 6 years 6 months than at 8 years, which suggests that the importance of auditory discrimination in reading development (although still significant) decreases somewhat during that period. Although this is probably true of readers in general, auditory discrimination remains an important factor for poorer readers and most of those who are dyslexic.

The next highest correlations are produced by **Crayons** and **Rabbits**, with **Letters** having a higher correlation at 6 years 6 months than at 8 years. Again, this latter finding suggests that for most readers simple sequential memory for letter shapes declines in importance as a component of reading over that period, although it remains significant for poorer readers and many dyslexic students.

The associative (as opposed to sequential) memory tasks (**Toybox** and **Letter names**) showed the lowest (although still statistically significant) correlation with later reading ability. This differential predictive efficacy is probably due to quite different factors operating: **Toybox** is quite easy for most children in the 4 to 7 year old range, whereas **Letter names** is much more difficult. In fact, many dyslexic adults cannot do **Letter names** very well. Of course, the results have been standardised to permit comparison between different subtests and with the population of students of that age. Nevertheless, it was important to include these two subtests in the CoPS suite because otherwise there would not have been any measures of associative memory for the teacher to rely on.
Particularly in the case of a student who has difficulties with sequential memory – i.e. keeping those letters and sounds in the right order – it is important for the teacher to know whether associative memory is intact. If the student’s scores on **Toybox** and/or **Letter names** are satisfactory, then at least the teacher knows that the student should be able to cope with the memorisation of basic associations (e.g. between letters and sounds). Another reason for including **Toybox** in CoPS is that it has a high correlation with later numeracy skills.

## Speed scores

Speed scores are shown on the report by the blue dots. A high speed score is one in which the student completes the subtest more quickly than average (or attempts a higher number of items within the 90-second test phase on **Toybox**). Note that scores are not shown on the report if the speed score exceeds 5 SDs from the median speed score for that subtest.

Speed results can be useful to the teacher in a number of ways. Broadly, the teacher should look at:

- the overall pattern of speed results
- speed scores for individual subtests

## The overall pattern of speed results

The overall pattern of speed results from all the subtests for an individual student can tell the teacher whether the student is generally fast, average or slow at carrying out the CoPS subtests. However, speed results inevitably show wide variability between children and when interpreting CoPS, speed scores are not nearly as important as accuracy scores. Students with Attention Deficit Hyperactivity Disorder (ADHD) tend to be relatively fast and students with developmental co-ordination disorder (dyspraxia) tend to be rather slow. Fast speed, when associated with low accuracy, may indicate ADHD, but not necessarily. In such circumstances it is likely that the student has been rushing some of the tasks, or perhaps responding impulsively. Whenever there is a significant negative correlation between speed and accuracy (i.e. fast speed being linked with low accuracy, and slow or average speed being linked with average or high accuracy), the data should be regarded as suspicious.
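
One way to make the speed/accuracy check concrete is to compute the correlation across a student’s subtests. This is a sketch with made-up scores; CoPS itself does not expose such a calculation, and the -0.7 flagging threshold is an assumption for illustration.

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-subtest SAS pairs: fast-and-inaccurate on some subtests,
# slow-and-accurate on others -- the pattern the text says is suspect
speed_sas    = [115, 120, 118, 95, 92, 90]
accuracy_sas = [78, 74, 80, 98, 102, 105]

r = pearson(speed_sas, accuracy_sas)
suspect = r < -0.7  # strongly negative: rushing may have depressed accuracy
```

A profile flagged this way would prompt the teacher to consider retesting rather than taking the low accuracy scores at face value.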

## Speed scores for individual subtests

Observation of speed scores for individual subtests usually enables the teacher to check whether the student has approached the task carefully enough for the accuracy score to be relied upon. Conscientious use of the CoPS Comments Sheet when testing will also help the teacher to resolve cases where it appears that the student was unwell, inattentive, distracted, or poorly motivated. Obviously, if a student has a number of low accuracy scores coupled with high speed scores for those same subtests, it strongly suggests that the student has simply been doing the subtests too quickly. If he or she slowed down to a more reasonable speed, then the accuracy score might be within the average range. If the teacher suspects that this is what has happened, then it would be a legitimate reason for repeating the subtest(s) in question. On the other hand, if the student has a high speed score coupled with an average or above-average accuracy score for that particular subtest, then the teacher has no cause for worry.

It is important to appreciate that different students can all achieve similarly fast speeds, but for quite different reasons. Correspondingly, different students can all achieve fairly slow speeds, but for equally different reasons. Speed scores can sometimes reflect personality factors. Some students are by temperament slow, meticulous and careful, others are fast, impetuous and careless. Some students are slow and still fail to achieve high accuracy, and a few are surprisingly fast but achieve high accuracy throughout.

## Case studies showing fast response speeds

Occasionally, a student who is consistently a fast responder shows some low accuracy scores. In such a case, even though there is a big discrepancy between the speed scores (high) and the accuracy scores (low), the accuracy scores may still be relied upon, especially if there is good confirmation from other CoPS subtests.

An example of this is given in Figures 30a and 30b, which show CoPS scores for Adam, who is nearly six. He displayed consistently poor accuracy scores for the visual tests, but average or above-average scores for the auditory tests. All speed scores were high. Even though he was quite bright (WISC-V Full Scale IQ 123) and despite being in school for about eighteen months, he was making abysmal progress in reading and writing. Although he tried very hard, he could not remember letter shapes or visual word patterns very well. His father once commented, ‘Adam learns with his ears’. However, he had slight hyperactive tendencies and was orally extremely fluent, so his teacher had assumed that he just needed to settle down and concentrate better and then he would begin to learn without any special or individualised teaching. In fact, CoPS indicated that he was dyslexic (a diagnosis later confirmed in a full psychological assessment) and only when he received appropriate teaching using a structured phonic approach did he begin to make significant improvement. In Adam’s case, although he did obtain scores on some tests which showed a large discrepancy between accuracy and speed, high speed scores were normal for him and so did not diminish the validity of his accuracy scores.

**Figure 30a. Case study – Adam**

**Figure 30b. Case study – Adam**

Consider, on the other hand, the CoPS scores for Peter, shown in Figures 31a and 31b. In Peter’s report the normal process of interpretation is confounded by a high negative correlation between speed and accuracy: tasks with low accuracy scores have been attempted far too quickly, while those subtests with average accuracy scores have been attempted at speeds within the average range. This inconsistency is also apparent if one attempts to interpret Peter’s profile of accuracy scores. Thus, auditory discrimination (**Wock**) appears poor, but Peter has nevertheless managed an average performance on **Rhymes**, **Races** and **Letter names**, all of which demand good auditory discrimination and listening skills, which is clearly contradictory. Similarly, **Rabbits** gives a very poor score, suggesting visual sequential memory problems, but **Crayons** is satisfactory, which appears to contradict this view (although it must be acknowledged that **Rabbits** and **Crayons** assess somewhat different aspects of visual sequential memory, so this is not necessarily an inconsistent finding). However, some younger students, especially if they have attempted the computer games of older siblings, erroneously assume that the only approach to any computer game is to ‘shoot everything in sight as quickly as possible’. They tend to point and click without really thinking about what they are doing. The recommendation with Peter would therefore be to retest, explaining to him that he must think about the tasks carefully and must not rush them.

**Figure 31a. Case study – Peter**

**Figure 31b. Case study – Peter**