
Improving educational outcomes in Bulgaria: lessons from data and research


by Tihomira Trifonova

It is well understood that low literacy has damaging consequences for both individuals and wider society. As this column notes, literacy is more than reading skills: it involves the ability to comprehend what you read and, with the help of acquired knowledge, to use and critically evaluate it. More broadly, functional literacy is the ability to cope with life’s circumstances, while widespread functional illiteracy makes a whole society dysfunctional. Hence the value of large-scale international assessments of educational achievements to inform national efforts to raise the quality of education – but also the value of research to interpret the results for effective policy and practice.

Large-scale international assessments of educational achievements measure functional literacy, and the results are intended to inform national efforts to improve the quality of education. In the latest two waves of one of these assessments – PISA (Programme for International Student Assessment) – the results for reading competence revealed an alarmingly large proportion of functionally illiterate students in Bulgaria.

Yet the national external assessment (NEA) at the same stage of schooling – the test organized by the country’s Ministry of Education – painted a different picture. Not only do the two assessments produce different results, but those results are also not comparable, so we cannot tell what educational reforms are needed or how to implement them.

In its reading literacy component, PISA assesses a set of competences showing what learners can make of a text as a piece of information and what they can do with it. This makes the assessment future-oriented: it measures functional literacy.

In the latest PISA test, conducted in 2018, Bulgarian students averaged 420 points. The figure has no substantive meaning on its own; it is relational, interpreted by comparison across all participants in the test. Each country’s average is set against the average for OECD countries, which in 2018 was 487 points. On this measure, Bulgarian students rank last among European countries.

Indeed, since the 2012 PISA wave, Bulgaria’s results have been steadily deteriorating. The more important indicator, however, is the share of students whose results fall below the critical level 2 – the functional literacy threshold – which in the latest test approaches 50%. At the same time, fewer than 5% of students score at the highest levels, 5 and 6.

This finding sparked a widespread public debate in which a variety of explanations were voiced. Some blamed the conceptually outdated educational system (meaning its content); others, the inadequate preparation of teachers (meaning the method); and still others, uninterested students and teachers (meaning the motivation).

The announced purpose of national testing is to inform reform of school curricula, plans, and content, and to readjust teaching methods so as to improve the quality of education. Yet when we approach the results for analytical purposes, we find low-quality data that is not fit for processing, owing to numerous omissions and inconsistencies.

For example, if we take the results for 2015, the year corresponding to the previous PISA wave, the announced average is 58.68 out of 100, a score of ‘good to very good’. But when we cleaned the data and checked it for consistency, we got an average of 34.10 out of 100, which sits in the lowest part of the ‘good’ score range.
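To give a flavour of what such a consistency check involves, here is a minimal sketch in Python. The file name, column name, and validity rules are hypothetical placeholders for illustration, not the actual NEA export or our exact procedure.

```python
# A rough sketch of a consistency check on NEA-style score data.
# File and column names are hypothetical; the real dataset differs.
import pandas as pd

raw = pd.read_csv("nea_2015_results.csv")        # hypothetical export

# Coerce scores to numbers; malformed entries become NaN
scores = pd.to_numeric(raw["score"], errors="coerce")

# Keep only records with a valid score on the 0-100 scale
clean = scores.dropna()
clean = clean[(clean >= 0) & (clean <= 100)]

print(f"Records in the raw file:      {len(scores)}")
print(f"Records passing the checks:   {len(clean)}")
print(f"Average score after cleaning: {clean.mean():.2f}")
```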

So what happens when we put the results of the two assessments next to each other? First, we see an NEA result that in the Bulgarian interpretation qualifies as fairly good, and a PISA result implying functional illiteracy of nearly half the cohort. Second, we see a negligible difference between the scores in the two latest PISA waves and a substantial difference between the respective NEA assessments.

One more dataset points to a real difference. Several years ago, the concept of an ‘innovative school’ was introduced in Bulgaria, allowing individual schools to design and implement new approaches, methods, and teaching and learning activities, and certifying them as innovative. Extracting the results of schools certified as innovative from the same NEA for five consecutive years reveals that they consistently performed better.
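The extraction itself is a simple grouping exercise, sketched below with hypothetical dataset and column names rather than the actual school registry.

```python
# A sketch of comparing innovative and other schools across NEA years.
# Dataset and column names are hypothetical placeholders.
import pandas as pd

nea = pd.read_csv("nea_by_school.csv")   # hypothetical: one row per school per year

# Mean score per year, split by innovative-school certification
comparison = (
    nea.groupby(["year", "is_innovative"])["mean_score"]
       .mean()
       .unstack("is_innovative")
)
print(comparison)
```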

What are the main conclusions we draw from this research?

First, the results of the international and national tests are not comparable because of the large discrepancy between the two in testing methods and content. This defeats the purpose of using them to inform reforms to improve the quality of education.

Second, the grading system used for the national results does not correspond to international standards, and thus cannot provide reasonable assurance that the measurement is adequate.

Third, while open data such as the PISA results are growing globally, making sense of them is not easy. Knowledge steers civil activity, and our ideas need proof before they can be considered knowledge; supplying that proof makes research a must.

There are three reasons why research is significant.

First, it is a means to understand various issues, and to increase public awareness of them. In the case of education, the problem is clearly systemic and cannot be fixed by civil initiatives alone. If properly targeted, civil initiatives can make a contribution, but they need information and orientation. This leads to the second reason.

Research is a means to find, gauge, and seize opportunities. Addressing functional illiteracy is a complicated task that can be approached in a variety of ways. But even before that, there are many questions that need to be answered: Where are the most serious deficits – in content, method, or motivation? Are numerical and linguistic deficits related? And are they related to how knowledge is delivered at school?

Opportunities are hidden in the answers to these questions, and to discover them we need supporting evidence. Such evidence can usually be found in the data, but when we turn to what is available, we realize that we first have to make the data usable. In our case, this means processing that reveals the relevant factors and illustrates correlations and possible dependencies. Along the way, it also exposes the gaps in the data.
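The sketch below shows the flavour of that processing: a correlation between language and mathematics scores, and a count of missing values per column. The column names are assumptions made for illustration, not the actual variables in the national data.

```python
# A sketch of the processing step: correlations and data gaps.
# Column names are hypothetical placeholders for illustration.
import pandas as pd

data = pd.read_csv("nea_cleaned.csv")    # hypothetical cleaned dataset

# Are linguistic and numerical results related across students?
corr = data["language_score"].corr(data["math_score"])
print(f"Language/maths score correlation: {corr:.2f}")

# Share of missing values per column points to the data gaps
print(data.isna().mean().sort_values(ascending=False))
```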

The third reason why research is significant is that it is a way to dispel misconceptions and support truths. Sometimes it sheds light on issues we did not know existed, and raises questions we had not realized needed asking.

The Center for Immigration and Integration (CII) is a civil organization that uses research in this way in its work with young people. We keep a close eye on all developments in our society, as well as globally, because it is our mission to address youth-related problems.

We know from experience that discovering the causes of a broadly defined problem requires research with a well-formulated question. For CII, functioning as a civil organization implies socially responsible activity. Social responsibility is not just an attitude – it also takes knowledge.

 

Tihomira Trifonova
Center for Immigration and Integration