I am watching folks argue over h-indexes in tenure & promotion on my timeline as if citational practices are devoid of misogynoir (#CiteASista), as if BIPOC scholars don't play a delicate placating game in everything they write (#BlackInTheIvory), and as if folks 1/
(sub)consciously know about the bias in *even* finding research by scholars of color when the algorithms are racist (#DistributedBlackness & #algorithmsofoppression). Do you think #GoogleScholar, #Ebsco, #JStor, etc. are devoid of such bias? 2/
If the algorithms censor/lower/push down BIPOC work, and people start there (e.g. Google Scholar) as often as they do in the in-house lib search functions, how do people get their h-indexes up?
Moreover, if you're qual and not quant, it takes longer to push stuff out.
STEM is great. I mean that. But I see STEM journal articles that are 3-4 pages. GIRL PLEASE. Many people's h-indexes would be higher (esp. in education and the social sciences) if they were churning out 3-page articles.
Context so matters.
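For reference, the h-index is just the largest h such that an author has h papers cited at least h times each, so the sheer number of papers you can push out caps it. A minimal sketch, assuming nothing more than a list of citation counts per paper (illustrative numbers only):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Ten modestly cited short articles beat three heavily cited long ones:
print(h_index([12, 9, 8, 7, 6, 6, 5, 5, 4, 3]))  # -> 6
print(h_index([40, 25, 18]))                     # -> 3 (capped by paper count)
```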
Also-- quantitatively measuring impact is a slippery slope. If the journal "isn't good enough" but the article has 30 citations within 2 years, what does that tell you about the work itself more than where it's published?
The publication process is fraught.
5/
Reviewer bias is real, and as long as there are no checks and balances in reviewership, we will continue to see people relegated to the margins of "tier 1" journals.
We do a REALLY BAD JOB of explaining all of this to folks in grad school. Learning this on the job is yikes! 6/