Designing a whole-school literacy program

What does it take to create and maintain a successful school literacy program? I found out not so long ago when I was working at a small rural school with a timetabled literacy program.

It was originally designed to raise literacy and comprehension results, and was based on the work of Dr Carol Christensen – specifically her 'Reading Link' program.

Over five years the program had different leaders, each with different expertise. By 2012, it had become a mixture of literacy and English, exposing students to literacy skills and their application in decoding meaning. Although the perception was that it was adding to students' abilities, the impact was not substantial, and was difficult to measure.

At the end of my second year at the Victorian school, Principal Wendy Caramarco asked me to lead the literacy program. The initial brief was to maintain the existing program while facilitating the generation and recording of data. This would allow the school to use the subsequent 12 months to evaluate the program's effectiveness.

Understanding the school's needs

I spent a good amount of time sifting through teaching and learning research, resources, techniques and tools. I reflected on the available data, and the current program's pros and cons.

My analysis concluded:

  • Our literacy program was providing instruction that did not target the individual learning needs and potential of students. It didn't provide students with an understanding of what they were learning and why they should be learning it. It also did not make clear to students exactly what they were good at and where they needed to improve.
  • Our ability to collect, reflect on and share data with colleagues, students and parents was ad hoc. Assessments were accurate on an anecdotal level, but there was no hard evidence to justify the program or demonstrate the real value it was adding.
  • Data collection itself was patchy. This could have been due to poor handover from one coordinator to the next, non-centralised storage, the erratic nature of external one-off or annual testing (affected by student absences and psychological and physical health), inconsistent testing methods, and poor diagnostic assessments that provided a number, or an alignment to an age, year level or development rate, but did not pinpoint core needs.

It was clear that the program I implemented had to address these points. I set out to design a program that would be used by all teachers in the school and could:

  • Provide students with a diagnostic evaluation of their literacy skills.
  • Provide targeted strategies customised to each student's Zone of Proximal Development – a concept developed by Lev Vygotsky and published in English in 1978 – to address their learning needs while maintaining or strengthening their existing skills.
  • Provide students, teachers, and parents with accurate, validated and centralised data that could be owned by students and easily shared in confidence.
  • Utilise a centrally located data repository to show the benefit of the program to students, teachers and parents.

With some assistance from my wife, a primary teacher who specialises in literacy, I developed a multi-stage literacy program that consisted of five cycles.

Stage 1 – Diagnostic assessment cycle

The pre-assessment cycle was designed to gather diagnostic assessment data to identify the literacy skills each student had the greatest need for. The tools used included Comprehensive Assessment of Reading Strategies (CARS), Tests of Reading Comprehension (TORCH) and Words Their Way.

Stage 2 – Strategies cycle

This stage aimed to target each student's greatest area of need at their Zone of Proximal Development. This was achieved by developing a five-day literacy program that utilised three external tools (CARS, TORCH and Words Their Way) and two tools I developed myself – Best Practice Study Skills and The Seven Parts of Speech.

Best Practice Study Skills and The Seven Parts of Speech were highly scripted and structured lesson plans, designed to support staff and provide students with clear and explicit instruction and expectations. The lesson plans were distributed to literacy teachers on a weekly basis.

The understandings from these lessons were then to be made explicit in each student's other classes through the presentation and completion of their work and assessments.

Stage 3 – Mid-cycle assessment

The mid-cycle assessment was designed to monitor the short-term impact of the program. It did this by repeating the diagnostic assessments or increasing their complexity. Students were first given the assessment from Stage 1; if an improvement was seen, the student was then given an assessment at a level matching that improvement.

If a regression occurred, the teacher discussed it with the student to help them identify the cause or influence, and the reflection was recorded. A test at the regressed level was then allocated.

Stage 4 – Strategies cycle

Students returned to Stage 2 of the program and targeted their greatest areas of need.

Stage 5 – End of program assessment

Finally, we evaluated each student's progress and the value that the program had added to their learning. This final stage in the cycle tended to occur before the end of Term 2, and again two weeks before the reporting cycle in Term 4.

Tracking, sharing and communicating data

I then realised that a comprehensive diagnostic program was not enough by itself – it had to be supported by a data tracking system.

If the program was going to make an impact, the data generated from the cycles had to be fed back to each student, shared between teachers, and communicated to parents in a way that everyone could read and understand. For students and parents, this meant it had to show what they were learning and why and how they were improving. As John Hattie describes it: what am I doing; why am I doing it; and when will I know if I'm successful?

At the same time, it would hold students accountable for any regression or lack of engagement.

For teachers, it meant that the data needed to be collated, centralised, and communicated without the need for interpretation, so that they could understand each student's strengths and areas for improvement.

To allow for this, data was exported into a simple document – the 'Literacy Continuum'. This document became pivotal to the success of the program.

The Literacy Continuum was designed to collate and track each student's literacy assessment data, and to allow a teacher to discuss the results and possible actions for improvement with each student. It was a complete summary of the student's literacy competencies. It also provided a summary of each student's On Demand assessment results, NAPLAN results, and their English teacher's judgements.

A copy was to be provided to parents with each term report. Classroom teachers were also given access to each student's Literacy Continuum to assist with tailoring instruction and communicating with students.

Next steps

After gaining Principal Caramarco's support for the program, I recognised that, in practice, it would not be that easy to implement. Staff would need professional development and support to integrate it.

In the next article, Andrew Nicholls shares how he got fellow staff on board and outlines the program's impact on the college, its teachers, and students.

Is there a literacy program at your school? If so, are all teachers part of the program, rather than just literacy teachers?

Is the data shared with students to show what they are learning, why they are learning it, and how they are improving?

Do teachers use the data to understand each student's strengths and areas for improvement?