Types of report
Exact creates a report for each student, which contains various sections. These are outlined below.
Results profile
This shows the standard scores for the six measures (word recognition, reading comprehension accuracy, reading comprehension speed, spelling, typing speed and handwriting speed) in both graphical and tabular form. The key data needed for JCQ Form 8 may be extracted from these. The summary table of results also shows percentile scores for each test. Note that on the chart the average score range (standard score 85–115) is shaded grey. To aid speedy identification of areas of difficulty, the bars on the chart are coloured blue if the standard score is 85 or above (i.e. within the normal range or better), and yellow if below 85 (i.e. below the normal range, indicating that the result is a matter of concern). An example is shown in Figure 1.
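The colour-coding rule above can be summarised as a simple threshold check. The following is an illustrative sketch only, not part of Exact; the function and constant names are hypothetical.

```python
# Illustrative sketch of the chart's colour-coding rule; not Exact's own code.
AVERAGE_RANGE = (85, 115)  # the grey-shaded band on the chart

def bar_colour(standard_score: int) -> str:
    """Blue if the score is in the normal range or better, yellow below it."""
    return "blue" if standard_score >= 85 else "yellow"
```

For example, a standard score of 92 would appear as a blue bar, while 78 would be yellow, flagging an area of difficulty.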
Cautionary warning regarding dubious reading speeds
The program checks whether the student has devoted a reasonable amount of time to the reading comprehension passages. If a student completes the reading comprehension test in less than eight minutes, the results should be regarded as ‘doubtful’: it is unlikely that proper consideration has been given to the answers, so the scores will be unreliable and should not (on their own) be used as meaningful evidence for exam access arrangements. If a student completes the test in less than five minutes, the results should be regarded as ‘impossible’: the student has answered the comprehension passages so quickly that proper consideration of the answers cannot have been given, and hence the scores are not safe to be used as evidence for any purpose.
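The two time thresholds amount to a simple classification rule, sketched below for clarity. This is a hypothetical illustration of the logic described above, not Exact's actual implementation; the function name and labels are assumptions.

```python
def classify_reading_time(minutes: float) -> str:
    """Classify a reading comprehension completion time using the
    thresholds described above (illustrative sketch only)."""
    if minutes < 5:
        return "impossible"  # too fast for proper consideration; unsafe as evidence
    if minutes < 8:
        return "doubtful"    # unreliable; not meaningful evidence on its own
    return "acceptable"
```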
When the program detects doubtful or impossible performance, a warning is given in red underneath the summary table, and the bars relating to that performance are shown in coloured hatching rather than solid block colour. Since this outcome necessarily places limitations on the use that can be made of the results of the reading comprehension test, assessors may wish to repeat this test having provided appropriate guidance to the student regarding how the test should properly be attempted (see here for further advice on this matter). When re-testing, the alternate form of the test (A or B, as appropriate) should be employed (see here for guidance on this).
There is a space at the foot of the Results profile for the assessor’s comments, which can be typed into Testwise. As a rough guide, about 1,250 characters may be included in the comment. The report does not check the length of the text entered, so the comment may overflow the page if it is too long. Alternatively, comments may be typed separately and pasted in, or written directly on to the Exact printout.
Results breakdown
This gives a complete breakdown of all the test scores in several tables, including comparison of ability to read and spell regular and irregular words, and the complete passage as typed to dictation by the student. An example is shown in Figure 2. Results on this page are shown in the following principal formats: standard scores, confidence intervals, percentile scores and age equivalents. In addition, this page includes raw scores (or, in the case of word recognition, transformed scores – see here and here for an explanation regarding this) and (where appropriate) times.
When the program detects doubtful or impossible performance on the reading comprehension test a warning is given in red underneath the results breakdown for that test.
Checking the scores from the dictation tests
The raw scores for the dictation tests (i.e. the number of words typed and handwritten) are estimated by the computer, based on the typed text saved and the number of phrases listened to by the student in the handwriting task. In about 95% of cases these figures are sufficiently accurate to be used safely in the report. However, in a few cases where the student has not followed the instructions properly, the computer’s estimates can differ significantly from the true figures.
Administrators should therefore carry out a visual inspection of the number of words typed and handwritten. If a discrepancy is suspected, the administrator can manually count the words and enter the figures into Testwise; the manually entered figures will then replace the computer’s estimates in the report.
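The override behaviour reduces to a single rule: a manually entered count, when supplied, takes precedence over the computer's estimate. A minimal sketch of that rule, with hypothetical names:

```python
from typing import Optional

def word_count_for_report(estimated: int, manual: Optional[int] = None) -> int:
    """Return the manually entered word count when one has been supplied,
    otherwise fall back to the computer's estimate (illustrative only)."""
    return manual if manual is not None else estimated
```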
This page will also show the number and percentages of spelling errors in handwriting and typing if the raw data for these have been entered. The computer does not count or estimate the number of spelling errors made in the dictation tasks and hence, if this information is required on the report, the administrator must manually enter the relevant data. The procedure for this is shown on the Testwise help site. The program will then calculate the percentages of spelling errors and display these on the report. If the relevant data are not manually entered, the number and percentages of spelling errors in handwriting and typing will be shown as zeros on the report.
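The percentage shown is presumably the number of spelling errors relative to the number of words written in each dictation task; the manual does not state the exact formula, so the sketch below rests on that assumption and its names are hypothetical.

```python
def spelling_error_percentage(errors: int, words_written: int) -> float:
    """Spelling errors as a percentage of words written in a dictation task.
    Assumed basis of the report's calculation; not confirmed by the manual."""
    if words_written == 0:
        return 0.0  # avoid division by zero when no words were recorded
    return round(100 * errors / words_written, 1)
```

For instance, 3 errors in 60 handwritten words would be reported as 5.0%.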
This gives the student’s responses for all items in reading comprehension and spelling, which can be useful for diagnostic purposes. An example is shown in Figure 3. Note that because the spelling test is adaptive, not all items are administered; skipped items are shown as a dash but are credited to the score as if answered correctly.
Handwriting to dictation
The final section of the report is reserved for incorporating and displaying a scanned image of the student’s handwriting to dictation. The procedure for this is described on the Testwise help site. This facility is optional, and if an image is not available this page will remain blank. An example is shown in Figure 4.