#PeerReview A fantastic way of defining a high-quality peer review: 'In this paper we explore extracts of reviews authors have received on their submissions which they regard as particularly harsh and suggest how authors may deal with such comments.'1/n
https://doi.org/10.1016/j.jeap.2020.100867
#PeerReview They examined 850 excerpts posted by authors on the publicly accessible @YourPaperSucks website, seeking to explore these critical comments and identify the features which authors find so discouraging or scathing that they chose to share them there. 2/n
#PeerReview In doing so they focused on three areas: 1) the evaluative focus of the comments, i.e. what the reviewers addressed; 2) the stance taken by the reviewer in the extract; and 3) the word forms which are particularly common in a given text: those which are key to it. 3/n
#PeerReview Ideally, they would have preferred to compare these harsh reviews with others that are less severe, but a sufficiently large corpus for this purpose is virtually impossible to collect. @PEERe_REVIEW 4/n
#PeerReview So instead, they statistically compared the word frequencies in their @YourPaperSucks corpus against the 16 million words of the academic genres of the British National Corpus, a collection of published academic written texts. 5/n
#PeerReview They independently scoured the corpus and gradually refined their groupings, adding a category relating to the competence of the authors, which did not figure in earlier studies. In this way they arrived at the following broad areas, which cover all the comments: 6/n
#PeerReview 1) Author competence: Concerning the abilities of the writers to conduct and present research suitable for publication.

The authors are amateurs.

Frankly, she knows nothing about invasion biology or the Great Lakes.
7/n
#PeerReview 2) Overall verdict: An overall comment on the submission as a whole: its originality, its contribution to knowledge, the acceptability of its claims, and whether it meets the standards for journal publication:

It cannot be reviewed and should be rejected outright.
8/n
#PeerReview 3) Quality of the arguments: Whether the submission is persuasive, coherent and lucid for disciplinary readers.

Most part of ‘methodology’ is useless, most paragraphs are irrelevant to the main topics.
9/n
#PeerReview 4) Structure and language: The overall structure, the length, and the use of grammatical and appropriate language.

The writing and data presentation are so bad that I had to leave work and go home early and spend time to wonder what life is about.
😳 10/n
#PeerReview 5) Research design: The clarity of the research questions, the nature of the data, how the research was conducted, and the appropriacy of the analysis.

The first problem is that the method - whatever it is and however it works - is insufficiently evaluated.
11/n
#PeerReview Based on these categories, they counted the frequency of comments in each category, as a percentage of all comments:
Author: 24.5%
Overall: 29.9%
Argument: 12.8%
Language: 11.3%
Design: 21.5%
12/n
#PeerReview Then they addressed the reviewers' use of explicit markers of epistemic and attitudinal stance in these reviews. Their framework encompasses three main components: 1) evidentiality, 2) affect and 3) presence.
13/n
#PeerReview
Evidentiality - the reviewer’s stated commitment to the reliability of statements and their potential impact on the reader, expressed through hedges and boosters.

Affect - a range of attitudes towards what is said, expressed through attitude markers.

14/n
#PeerReview
Presence - the extent to which a reviewer chooses to intrude into a text through the use of first-person pronouns and possessive determiners.
15/n
#PeerReview Using this framework, they identified 1,192 stance expressions in the corpus, averaging 1.4 cases per extract. This is an enormously high frequency compared with other genres and underlines the extremely evaluative nature of these texts,
16/n
#PeerReview where authors feel that reviewers have overstepped the mark in criticizing their work. Example:

I am, frankly, underwhelmed by the revisions.
😳17/n
#PeerReview Here is the distribution of these stance markers in the corpus:
Boosters: 33.9%
Self-mention: 28.4%
Attitude markers: 23.2%
Hedges: 14.4%
18/n
#PeerReview Conclusion: journals and editors should provide clear peer review guidelines, checklists, and standards. PhD courses should include peer review workshops, and most importantly, editors should run co-reviewer and mentorship programs. 19/n
You can follow @mehmanib.