Literacy in SA surveyed in all 11 official languages

Researchers from the Centre for Evaluation & Assessment (CEA) at the University of Pretoria have completed the South African portion of the Progress in International Reading Literacy Study (PIRLS), which assesses the state of reading literacy in primary schools. The South African team had the unique logistical challenge of designing survey instruments and translating them into all 11 of South Africa’s official languages, resulting in more than 200 different survey instruments.

Education in South Africa is facing severe challenges, but the scale of the problem is not clear without detailed, rigorous research. For the team at the CEA, assessing the education system effectively meant designing assessments and questionnaires in all 11 official languages, reaching schools across the country, and mining vast quantities of data for useful answers. The small, dedicated team was more than up to the challenge.

Prof Sarah Howie, National Research Coordinator for PIRLS South Africa, understands the education system. Its well-being is very close to her heart as she and her team want to prevent “another lost generation” of South African learners.

The PIRLS SA study measured more than 1000 variables in the home, classroom and school environments, using the assessment instruments designed by the team. This process is extremely complex when one factors in the different languages spoken at home and in the classroom, the language spoken by the teacher, parents or guardians, and the fact that all these variables are interlinked.

For the 2016 survey specifically, Howie worked with a team that included Gabriel Mokoena, Mishack Tshele, Nelladee McLeod Palane, Karen Roux and Celeste Combrinck, each of whom played an important role in designing and implementing the survey, and processing the resulting data.

Mokoena provided logistical support that included the packaging of testing material, such as achievement booklets and questionnaires, and also worked as a fieldwork coordinator. One of his tasks was to contact the randomly selected schools to arrange times to test the learners.

During the project he had to overcome challenges like community protests, as well as the difficulties associated with accessing schools in remote rural areas. These schools were difficult to contact, as they often had no landlines or fax machines, or the correct contact numbers were not available.

Mokoena explains that data collection happened mainly around November, which is exam time. “We often had to convince schools to cooperate, especially with Annual National Assessments being conducted at the same time,” he says.

Mishack Tshele, a data manager at the CEA, was responsible for the huge amounts of data produced as a result of the PIRLS project. He was the link between the CEA and the International Association for the Evaluation of Educational Achievement (IEA) data processing centre in Hamburg for everything involving national and international data. Tshele was also responsible for making sure that the translated surveys were still correctly laid out and accurate. He says he personally ensured the correct layout of more than 200 survey instruments.

Celeste Combrinck, Project Coordinator for the PIRLS study and Acting Director of the CEA, relished the research challenges of getting the PIRLS study design and implementation just right.

Combrinck works in the area of psychometrics and, as part of the PIRLS team, used statistics to measure how well an instrument was functioning in the real world. She explains that she used principles of measurement similar to those used in the natural sciences.

“The major challenge in the social sciences is quantitative measurement, since the things we measure are usually qualitative in nature. Even aptitude or ability, like reading literacy, is qualitative,” she says. “Psychometrics is like trying to build a ruler, by using questions in a test.

"Measurement elevates a discipline to a science.”

While other countries participating in the PIRLS project may also encounter logistical hurdles, South Africa has the unique challenge of having many more languages to work with while still quantifying the data to make sure that, in Combrinck’s words, “all the points on the ruler are equal”.

Howie says that the researchers have risen admirably to the challenge of PIRLS 2016.

“The team at the CEA are fantastic. They are a small team - we were 13 when we were doing the fieldwork and every person had a job,” she says. “Each time we do this, I think it's impossible. Each time the team was different and each time the team rose to the occasion. We’ve been lucky to have such smart and driven people at the CEA working on the PIRLS project.”

The final PIRLS report was recently released both internationally and in South Africa.

CEA researchers will now embark on a new journey, focusing on producing knowledge and recommendations from all of the data they have carefully produced and published.

The PIRLS 2016 team at the Centre for Evaluation & Assessment completed the South African portion of the international PIRLS study.