Just this week, two different teams of authors published papers showing that cross-country comparisons of test scores (PISA, TIMSS) should be taken with a huge grain of salt.
Paper 1: @gema_zamarro and coauthors show that various measures of student effort explain over 30% of the variation in PISA scores across countries.

https://www.journals.uchicago.edu/doi/full/10.1086/705799
Paper 2: @UriGneezy and coauthors show that paying US students to try harder during these exams substantially boosts their scores (but Chinese students appear to already be exerting maximum effort).

https://www.aeaweb.org/articles?id=10.1257/aeri.20180633
International rankings based on test scores thus conflate achievement and effort.

Both quantities matter, but you should be substantially more skeptical when someone says that American students know a lot less than their international peers.

We just don't know that.
Another recent paper in this vein (h/t @Stephen_Sawchuk):

Accounting for "non-serious" test-taking behavior substantially alters PISA scores.

https://www.nber.org/papers/w24930 
Last comment for now, before I sign off to do some real work:

I'm a big fan of standardized exams. They're the only way to generate the international comparisons I'd love to have. But these effort issues have lessened my certainty about where the US stands in the world.
One last thought because I couldn't resist.

If the OECD starts an initiative to get students to answer all the questions on these exams, it should be called:

"Minding Your PISA Q's"

That is all.