Were the assessment tools appropriate?

by Paolo Mazzoli

With the sole exception of foreign-language teaching, from the beginning of school autonomy until now no real quality standards for the school service have been defined, neither by the State nor by the autonomous institutions for the matters within their competence. When the National Indications came out, some commentators raised the question of whether they could be used as a quality standard for learning, but this is clearly not the case.[1] A repertoire of quality standards must allow an objective measure of its actual achievement, also specifying the modalities and conditions under which this measure is to be carried out.

Read the special “The point about the evaluation of schools in Italy” published in full in issue 623 of Tuttoscuola

When we don’t have a benchmark for a service, theoretically we might not even be able to evaluate it. However, over time, the progressive evolution of national standardized tests and the implementation of the RAV system have somehow compensated for the lack of official standards.

Since the introduction of the national computer-based tests (CBT), the analysis of the results of the Invalsi tests has made it possible to define five progressive levels of competence in Italian and mathematics, in addition to the levels of competence derived from the CEFR for English. Given that, as mentioned, the frameworks of the Invalsi tests are strictly connected with the objectives set out in the National Indications and the Guidelines, we can consider the Invalsi levels as a first step toward identifying competence standards for two fundamental disciplines.

A similar argument can be made with reference to the RAV.

In fact, the RAV defines qualitative and quantitative indicators that concern both school outcomes and the processes that take place in school: think, for example, of the areas concerning the curriculum, inclusion, continuity, teacher training, and interaction with the territory and with families. All of these are aspects that define the quality of the school service, even though they do not constitute its main purpose, which remains that of developing suitable levels of learning in all students; and thanks to the RAV they can be evaluated, albeit qualitatively, with the rubrics proposed at the end of each area.

However, we must not forget that the RAV is a self-assessment tool (although a guided one), which is why the legislator who defined the purposes and procedures of the National Assessment System (SNV) provided for periodic visits by the external evaluation units (NEV). In this way, in the design of the 2013 Regulation, an extremely balanced device was built which, not by chance, was also appreciated internationally.[2]

In short, the lack of reference standards need not prevent a reliable assessment of effectiveness at all levels (national system, schools, individual teachers): not only do we now have national standardized tests of good quality, but also, thanks to the RAV, the test results, together with a considerable number of outcome and process indicators and descriptors, are placed in the context of each school. This makes it possible to go beyond the simple comparison of quantitative data toward an interpretation that yields a sufficiently accurate description of each school and, more importantly, the elaboration of improvement plans firmly anchored to timelines and concrete objectives.

Were the assessment tools appropriate?

It has often been said that the National Assessment System “is struggling to take off”, that the Invalsi tests are useless because they have not produced beneficial effects in our school system or, again, that the self-assessment report risks, year after year, exhausting its ability to encourage reflection and improvement, gradually becoming a less and less incisive ritual.

But is it really so? And, if so, are these critical assessments related to the quality of the tools or the quality of their implementation and use?

Let me say immediately that all the evidence I have managed to gather points in favor of the second hypothesis: we are dealing with instruments of good, or even excellent, quality that have been deployed with little determination and “handled” with discontinuity and little conviction.

I will try to list some facts that seem quite significant to me.

  • The Regulation establishing the National Evaluation System did not provide for any funding. It sounds incredible, but it is so: nothing for the elaboration of the RAV, nothing for the inspection contingent, nothing for the development of the RAV platform, and so on. In fact, what was achieved was done using part of the PON funds and the ordinary resources of Invalsi and Indire.
  • The autonomous and independent inspection contingent required for NEV visits has never been established. Only one ministerial decree was issued[3], which provided for assigning to the SNV a part of the few technical managers already in service[4]; it was followed by a directorial decree designating the first contingent[5] and, in December 2016, by a simple note from the head of the Department of Education[6] which reaffirmed that the permanent technical managers, as well as the temporary ones provided for by Law 107/2015, could be employed for the SNV and for the evaluation of school principals.
  • The Ministerial Directive on SNV priorities, which should be issued periodically, has not been issued since 2014[7]. This means that the Ministry is five years behind schedule.
  • The RAV indicators show some mismatches and inaccuracies. These defects are mainly due to changes made by the Ministry without prior testing.
  • With the exception of the national test at the end of the eighth grade, the Invalsi tests were included in a state law only in 2017, nine years after the introduction of the mandatory national test[8]. Not only that: the 2017 rule made mandatory the Invalsi test of the last year of lower secondary school and that of the last year of upper secondary school. Yet this requirement has always been waived through decree-laws issued even before the pandemic, and therefore without any real motivation. Proof of this is the most recent Ministerial Ordinance, which confirms, also for the school year 2021-2022, that the Invalsi test of the last year of upper secondary school is not a requirement for admission to the state examination and which, for the first time, establishes the same exception for admission to the final examination of the first cycle of education (eighth grade)[9].
  • To date, no government has used the data from the Invalsi tests and the RAV for its own policies. It is true that most ministers have made statements about the need to use standardized assessment data to implement strong policies in support of weaker territories and schools, but it is also true that these intentions have not been translated, at least so far, into medium- to long-term actions with appropriate investments and monitoring systems.

I could go on to list many other signs of the weak “take-up” of the SNV, and of the data it has produced, by national and regional authorities.

However, there are very encouraging signs in the opposite direction. I am referring mainly to the increasingly widespread dissemination, in schools, of evaluation practices focused on system evaluation. I do not have precise data, but I can testify that the number of requests for support in developing the ability to read a single school’s data (both standardized tests and the RAV) has grown greatly over time. In an increasing number of schools, groups of teachers have formed that have acquired not only a good ability to read the data but also to illustrate its meaning and implications for their schools. Above all, school principals have understood first that standardized assessment tools are the most reliable lever for inducing effective improvements in their schools, without getting lost in endless projects that often have no impact on students’ learning.

The following two figures show two slides taken from the presentation given to the teaching staff of the Istituto Comprensivo Pianello (Piacenza) by the reference teacher for assessment. Each of the two reports a priority from the school’s RAV along with the actual trend of the data that the school had committed to improving.

Finally, on the adequacy of the standardized assessment tools, in particular the national tests conducted by Invalsi, I would like to quote the opinion of Andreas Schleicher, head of the OECD PISA programme,[10] expressed in March 2019 in an article with the significant title “How Italy developed a state-of-the-art school assessment culture”.[11] Schleicher begins: “I first visited the National Assessment Institute (INVALSI) in Italy in 1989. It was then called CEDE and was a place where academics talked about educational research and contributed to international comparative studies. At the time, few would have thought that the Institute would build a comprehensive national assessment of the Italian school system. But two decades later, Italy did just that. The country’s state-of-the-art assessment culture provides comprehensive national diagnoses and verifies student performance at multiple subjects and grade levels in all Italian schools.”

In short: the tools for a school assessment capable of inducing documentable improvements are there, and they are quite good. What is needed is their relaunch, with resources and constant attention from the central administration, and a progressive commitment by schools and the experts who collaborate with them to learn to use these tools seriously and systematically. The examples presented in this dossier are a good illustration.

[1] For example, in the book by G. Allulli, F. Farinelli and A. Petrolino, “The self-assessment of the institute” (published by Guerini e Associati in 2013), it is explicitly stated (p. 23): “As for Italian students’ unfamiliarity with standardized tests, this too was an element to reflect on, as a sign of another not exactly positive, and certainly not immutable, peculiarity of our school system: the substantial absence of a precise framework of shared reference standards”.
[2] Interest in the self-assessment tools developed in Italy emerged, for example, at a recent international conference organized online by the journal Scuola Democratica, in which numerous European countries participated (“Reinventing Education”, June 2-5, 2021). References: https://www.scuolademocratica-conference.net/.
[3] Ministerial Decree 598/2015.
[4] At the end of 2019 there were 56 technical managers, against 160 (plus 1,600 part-time) in England, 480 in the Netherlands, and 275 in the Czech Republic (data from MIUR and Treellle, Quaderno 14/2017, updated to 2019 on the basis of retirement forecasts).
[5] DDG n. 1169/2015 of the Director General for School Regulations of the MIUR.
[6] Note no. 3367/2016 of the Head of the Department of Education of the MIUR.
[7] Directive 11, of 18 September 2014, referring to the triennium 2014-2015, 2015-2016, 2016-2017.
[8] The Invalsi test within the eighth-grade examination was introduced by Minister Fioroni with Decree-Law n. 147/2007.
[9] OO.MM. n. 64 and 65 of March 14, 2022.
[10] Director, Directorate of Education and Competences of the OECD-PISA.
[11] https://oecdedutoday.com/italy-national-school-assessment-test-program/

