
Are the PISA measures directly comparable between countries?

In cross-cultural studies with variables such as values, attitudes and habits, it is often assumed that differences in scores can be compared at face value. However, response styles such as acquiescence and extreme response style may affect answers, particularly on rating scales (Artelt et al., 2003; Fischer, 2004; Herk et al., 2004). It has been argued that cross-cultural questionnaire studies are always challenged by the issue of response biases, and different cultural explanations of response bias have been suggested (Bempechat et al., 2002).

Figure 6. Control strategies in mathematics: Percentages of the students who ‘Strongly agree’ or ‘Agree’ (Denmark, Finland, Iceland, Norway, Sweden). The items shown are:

When I study for a Mathematics test, I try to work out what are the most important parts to learn.

When I study Mathematics, I try to figure out which concepts I still have not understood properly.

When I study Mathematics, I make myself check to see if I remember the work I have already done.

When I cannot understand something in Mathematics, I always search for more information to clarify the problem.

When I study Mathematics, I start by working out exactly what I need to learn.

[Bar chart; the percentages for each country are not reproduced here.]

Lie and Turmo (2005) aggregated data from the PISA 2003 student questionnaire at country level and found strong negative correlations between the mean construct values and mean mathematics achievement. These findings contradicted what would have been theoretically expected. They also conducted a factor analysis of all the PISA 2003 constructs at country level, which showed that the first component alone could explain 66% of the variance. This was a rather surprising finding, given that the constructs are substantively very different (‘sense of belonging’, ‘elaboration strategies in mathematics’, ‘instrumental motivation’ etc.). However, all the constructs are measured using Likert scales. It was therefore suggested that the first factor in the principal component analysis might be interpreted as the general response tendency on Likert scales in the different countries.

Based on the first component from the principal component analysis, a meta-construct was generated as a linear combination of the original constructs. The countries’ scores on this meta-construct may be interpreted as a quantitative measure of the general response tendency in each country. The results showed that Tunisia, Brazil and Mexico were the countries with the strongest general tendency to agree with statements, while Japan and Korea had the weakest general agreement tendency.
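
To make this procedure concrete, the following is a minimal sketch (in Python, using invented placeholder numbers rather than real PISA data) of how such a country-level response-tendency score could be derived: aggregate the Likert-based construct scores to country means and take each country’s score on the first principal component as the meta-construct. The construct names and values are illustrative assumptions, not the actual PISA variables.

```python
# Illustrative sketch only: country-level "general agreement tendency" via PCA.
# The data below are invented placeholders, not real PISA questionnaire results.
import pandas as pd
from sklearn.decomposition import PCA

country_means = pd.DataFrame(
    {
        "sense_of_belonging": [0.31, -0.12, 0.05, 0.40],
        "elaboration_strategies": [0.22, -0.20, 0.10, 0.35],
        "instrumental_motivation": [0.28, -0.15, 0.02, 0.30],
    },
    index=["CountryA", "CountryB", "CountryC", "CountryD"],
)

# Standardise each construct across countries, then extract the first principal
# component; the component scores serve as the meta-construct.
standardised = (country_means - country_means.mean()) / country_means.std()
pca = PCA(n_components=1)
meta_construct = pd.Series(
    pca.fit_transform(standardised.values).ravel(), index=country_means.index
)

# Note: the sign of a principal component is arbitrary, so the scores may need
# to be flipped so that higher values mean a stronger tendency to agree.
print("Share of variance explained by the first component:",
      round(pca.explained_variance_ratio_[0], 2))
print(meta_construct.sort_values(ascending=False))
```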

A similar analysis was also conducted on the data from PISA 2000. The correlation between the meta-construct values in PISA 2000 and PISA 2003 for the countries that participated in both cycles was 0.89. In other words, the agreement tendencies seem to be reasonably consistent between the two PISA cycles. The results showed that Denmark has the strongest general agreement tendency among the Nordic countries, while Finland and Norway have the weakest tendencies.
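
As a small illustration of the kind of stability check described here, the two sets of country scores can be correlated directly; the country names and numbers below are made up for the sketch.

```python
# Hypothetical example: cross-cycle consistency of the meta-construct scores.
import pandas as pd

meta_2000 = pd.Series({"CountryA": 1.2, "CountryB": -0.4, "CountryC": -0.8})
meta_2003 = pd.Series({"CountryA": 1.0, "CountryB": -0.3, "CountryC": -0.7})

common = meta_2000.index.intersection(meta_2003.index)  # countries in both cycles
r = meta_2000[common].corr(meta_2003[common])           # Pearson correlation
print(f"Cross-cycle correlation: {r:.2f}")
```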

These findings indicate that direct comparisons of country mean values for the constructs based on Likert scales in PISA, like the learning strategies, should be made with caution. In this sense, the findings are in line with previous research on cross-country comparisons of questionnaire data (Flaskerud, 1988; Heine et al., 2002; Lee et al., 2002). However, in the Nordic countries the estimated correction factors are relatively moderate. This implies that the cultural bias when comparing the results from the Nordic countries, as in this article, is also relatively moderate.

However, in countries like Japan, Korea, Brazil and Mexico, interpretations of the mean construct values from an international perspective may change significantly if the rather large bias estimates are taken into consideration.
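
One plausible way to apply such a correction, sketched below as an assumption rather than as the procedure actually used in the PISA analyses, is to regress each construct’s country means on the meta-construct and work with the residuals (re-centred on the construct mean), so that the part of the country differences attributable to the general agreement tendency is removed.

```python
# A sketch of one possible response-style adjustment (an assumption, not
# necessarily the correction used by Lie and Turmo or the OECD).
import numpy as np
import pandas as pd

def response_style_adjusted(country_means: pd.DataFrame, meta: pd.Series) -> pd.DataFrame:
    """Remove the component of each construct's country means predicted by the
    meta-construct (general agreement tendency), keeping the construct mean level."""
    x = meta.loc[country_means.index].to_numpy(dtype=float)
    x_centred = x - x.mean()
    adjusted = {}
    for construct in country_means.columns:
        y = country_means[construct].to_numpy(dtype=float)
        slope = np.dot(x_centred, y - y.mean()) / np.dot(x_centred, x_centred)
        adjusted[construct] = y - slope * x_centred  # regression residuals + mean
    return pd.DataFrame(adjusted, index=country_means.index)
```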

Conclusion

What can we learn from the students’ self-reported strategy use in PISA? Most noticeable, perhaps, is that students in the Nordic countries report use of learning strategies below the OECD mean. However, country-specific general response tendencies seem to be present in the data from the PISA student questionnaire. The estimated correction factors in the Nordic countries are, however, relatively moderate. Analyses indicate that student response behaviour is fairly consistent across the Nordic countries, i.e. the general response tendencies on Likert scales are rather similar. However, for some of the learning strategies items there are rather large specific differences in the students’ responses between the Nordic countries.

The results in Finland are perhaps of particular interest. Finland is among the countries with the highest mathematical literacy score in PISA, only outperformed by Hong Kong among all the participating countries. However, the results show that Finnish students’ reported use of all three learning strategies is below the OECD mean, and also below that of students in most of the other Nordic countries. Finland has a particularly low value for control strategies in mathematics, which is rather surprising. Even after correcting for the general response tendency on Likert items in Finland, this finding still holds.

The research literature suggests that metacognition is essential for effective learning. It is therefore relevant to ask whether there are important aspects of Finnish students’ use of strategies in mathematics that are not captured by the measures in PISA. Knain and Turmo (2003) suggest that it is not so much the frequency of strategy use that identifies a student who can self-regulate his or her learning, but the fact that the student can flexibly adapt strategies to the situation. One of the key aims of international studies like PISA is illustrated by the slogan ‘learning from others’ (Shorrocks-Taylor & Jenkins, 2000). If other countries decide to use Finnish students as examples to learn from, which would be highly relevant given their high mathematical literacy level, a more elaborate description of their approaches to mathematics learning is needed.

The results for the individual items used to measure learning strategies also show some interesting features, not least among answers to some of the questions used to measure elaboration strategies. Only a quarter of Finnish students agree that they think about how the solution to a problem in mathematics might be applied to other interesting questions. This may reflect a rather abstract and ‘pure’ approach to mathematics teaching in Finland. On the other hand, more than 60% agree that they try to understand new concepts in mathematics by relating them to things they already know. Such findings underline the value of studying and discussing the results for learning strategies in PISA at the single-item level.

Regarding the within-country correlations between the learning strategies and mathematical literacy, interesting differences between the Nordic countries are found. For example, we have seen that the strongest correlation between memorisation strategies and mathematical literacy is found in Norway, while no relationship is found in Iceland. It is interesting to reflect on how differences in approaches to mathematics teaching in the Nordic countries might influence this relationship. It has been argued that the time spent on teaching fundamental skills in mathematics is rather low in Norwegian primary and lower secondary schools, and this was also evident in the Norwegian results in TIMSS 2003 (Grønmo et al., 2004). This highlights the importance of the students’ own emphasis on rehearsal strategies, as reported in the empirical results in PISA.

Boys report that they use memorisation strategies and elaboration strategies more than girls in all the Nordic countries. What these gender differences mean should definitely be studied in more depth. There may be a gender-biased response tendency, with girls tending to answer more modestly than boys. However, no differences between the genders in the use of control strategies are found in the Nordic countries. It is interesting to note that girls report significantly more use of these strategies than boys in 22 of the 30 OECD countries participating in PISA 2003 (OECD, 2004).

The gender difference in the use of memorisation learning strategies varies between the Nordic countries. The difference is especially large in Norway, followed by Denmark and Sweden. In Iceland, the gender difference is moderate. However, the gender difference in the use of elaboration strategies is relatively consistent across the Nordic countries, with boys reporting greater use of elaboration than girls. The practical realities behind these patterns should be studied further.

In summary, our analysis shows that there are a number of questions which remain unanswered regarding the students’ self-reports on learning strategies in PISA and how well the scores actually reflect the students’ use of learning strategies.

Therefore, the empirical results should be interpreted with caution. Validation of the questionnaire instrument using other research methods and approaches, as has been done by Samuelstuen (2005), for example, is needed. In Norway, Hopfenbeck will continue this line of research through her PhD work, which involves interviewing students about the PISA questionnaire. The goal of the interviews is to collect the students’ interpretations of the PISA items and their fundamental thoughts regarding their choice of strategies for solving particular problems. Individual interviews will be carried out immediately after the students complete the PISA 2006 test.

References:

Artelt, C., Baumert, J., Julius-McElvany, N., & Peschar, J. (2003). Learners for Life. Student Approaches to Learning. Results from PISA 2000. Paris: Organization for Economic Co-Operation and Development.

Baumert, J., Fend, H., O’Neil, H.F., & Peschar, J.L. (1998). Prepared for Life-Long Learning: Frame of Reference for the Measurement of Self-Regulated Learning as a Cross-Curricular Competence (CCC) in the PISA Project. Paris: Organization for Economic Co-operation and Development.

Baumert, J., Klieme, E., Neubrand, M., Prenzel, M., Schiefele, U., Schneider, W., Tilmann, K. J., & Weiss, M. (2000). Self-Regulated Learning as a Cross-Curricular Competence. Berlin: Max Planck Institut für Bildungsforschung.

Bempechat, J., Jimenez, N.V., & Boulay, B.A. (2002). Cultural-Cognitive Issues in Academic Achievement: New Directions for Cross-National Research. In Porter, A. C., & Gamoran, A. (Eds.), Methodological Advances in Cross-National Surveys of Educational Achievement (pp. 117-149). Washington: National Academy Press.

Chamot, A. U. (Ed.). (1999). The Learning Strategies Handbook. White Plains, NY: Longman.

Fischer, R. (2004). Standardization to account for cross-cultural response bias: A classification of score adjustment procedures and review of research in JCCP. Journal of Cross-Cultural Psychology, 35, 263-282.

Flaskerud, J.H. (1988). Is the Likert scale format culturally biased? Nursing Research, 37, 185-186.

Grønmo, L. S., Bergem, O.K., Kjærnsli, M., Lie, S., & Turmo, A. (2004). Hva i all verden har skjedd med realfagene? Norske elevers prestasjoner i matematikk og naturfag i TIMSS 2003. Oslo: Institutt for lærerutdanning og skoleutvikling, Det utdanningsvitenskapelige fakultet, UiO.

Heine, S.J., Lehman, D.R., Peng, K.P., & Greenholz, J. (2002). What’s wrong with cross-cultural comparisons of subjective Likert scales? The reference group effect. Journal of Personality and Social Psychology, 82, 903-918.

Herk, H., Poortinga, Y.H., & Verhallen, T.M.M. (2004). Response Styles in Rating Scales: Evidence of Method Bias in Data From Six EU Countries. Journal of Cross-Cultural Psychology, 35, 346-360.

Knain, E., & Turmo, A. (2003). Self-regulated learning. In Lie, S., Linnakylä, P., & Roe, A. (Eds.), Northern Lights on PISA: Unity and Diversity in the Nordic Countries in PISA 2000. Oslo: Department of Teacher Education and School Development, University of Oslo.

Lee, J. W., Jones, P. S., Mineyama, Y., & Zhang, W. E. (2002). Cultural Differences in Responses to a Likert Scale. Research in Nursing & Health, 25, 295-306.

Lie, S., & Turmo, A. (2005). Cross-country comparability of students’ self-reports. Paper to the PISA Technical Advisory Group Meeting, TAG(0505)11, Paris, France, May 9-11, 2005.

OECD (2004). PISA 2003: Learning for Tomorrow’s World. First Results from PISA 2003. Paris: OECD Publications.

OECD (2005). PISA 2003: Technical Report. Paris: OECD Publications.

Ramsden, P. (1992). Learning to Teach in Higher Education. London: Routledge Falmer.

Samuelstuen, M. (2005). Kognitiv og metakognitiv strategibruk med særlig henblikk på tekstlæring. Trondheim: Doktoravhandling for graden doctor rerum politicarum, Fakultet for samfunnsvitenskap og teknologiledelse, NTNU.

Shorrocks-Taylor, D., & Jenkins, E. W. (Eds.) (2000). Learning from Others. Dordrecht: Kluwer Academic Publishers.

Turmo, A. (2005). The relationship between the use of learning strategies and socioeconomic background in 15-year-olds. Journal of Nordic Educational Research, 25, 155-168.

Weinstein, C. E. (1988). Assessing Learning Strategies: The Design and Development of the LASSI. In Weinstein, C. E., Goetz, E. T., & Alexander, P. A. (Eds.), Learning and Study Strategies: Issues in Assessment, Instruction and Evaluation (pp. 26-39). San Diego: Academic Press.

Zimmerman, B. J. (1990). Self-regulated Learning and Academic Achievement: An Overview. Educational Psychologist, 25, 3-17.

Chapter 9

Nordic Minority Students’ Literacy