See also: Papers produced by the project
Latest articles on OA impact
This update service is no longer available following the closure of Connotea. It may be resumed if a suitable alternative service is found.
Last updated 25 June 2013; first posted 15 September 2004. If you have any additions, corrections or comments, please tweet @stevehit #oaimpact or email Steve Hitchcock.
What others say about this bibliography
"a brilliant source of articles" (on impact of Open Access material)
Christine Stohn,
Ex Libris Initiatives (3 September 2012)
"a major bibliography on this debate"
Alan M Johnson, Charting
a Course for a Successful Research Career: A Guide for Early Career
Researchers, 2nd edition (April 2011), ch. 9, Where to Publish, p50
"A great resource!"
Eloy Rodrigues, University of Minho, via
Twitter (July 17, 2009)
"incredible resource"
OA Librarian blog, Citation
Impact Bibliography Resource (December 7, 2005)
"excellent bibliography"
Heather Morrison,
The Imaginary Journal of Poetic Economics (December 08, 2005)
Peter Suber, American Scientist Open Access Forum (28 September 2005)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/4804.html
"This ongoing chronological
bibliography may be worth bookmarking and checking every few months. There's
very little annotation, but it's a good brief bibliography on a narrow - but
important - subject."
Walt Crawford, Cites & Insights (November 2004, p13) http://citesandinsights.info/civ4i13.pdf
Open Access Policies
This bibliography is cited in support of the following open access policies, statements and guidelines for authors:
Citations of this bibliography found by Google Scholar
Web pages that link to this bibliography found by Google
This chronological bibliography is intended to describe progress in reporting these studies; it also lists the Web tools available to measure impact. It is a focused bibliography on the relationship between impact and access. It does not attempt to cover citation impact or other related topics, such as open access more generally, although some key papers in these areas are listed as jump-off points for wider study.
Added 25 June 2013
Björn Brembs, Katherine Button and Marcus Munafò (2013)
Deep
impact: unintended consequences of journal rank
Front. Hum. Neurosci.,
7:291, published online: 24 June 2013. Also in arXiv.org > cs
> arXiv:1301.3748, v1, 16 Jan 2013 http://arxiv.org/abs/1301.3748
doi: 10.3389/fnhum.2013.00291
Abstract: Most researchers acknowledge an intrinsic hierarchy in the
scholarly journals (journal rank) that they submit their work to, and
adjust not only their submission but also their reading strategies
accordingly. On the other hand, much has been written about the
negative effects of institutionalizing journal rank as an impact
measure. So far, contributions to the debate concerning the limitations
of journal rank as a scientific impact assessment tool have either
lacked data, or relied on only a few studies. In this review, we
present the most recent and pertinent data on the consequences of our
current scholarly communication system with respect to various measures
of scientific quality (such as utility/citations, methodological
soundness, expert ratings or retractions). These data corroborate
previous hypotheses: using journal rank as an assessment tool is bad
scientific practice. Moreover, the data lead us to argue that any
journal rank (not only the currently-favored Impact Factor) would have
this negative impact. Therefore, we suggest that abandoning journals
altogether, in favor of a library-based scholarly communication system,
will ultimately be necessary. This new system will use modern
information technology to vastly improve the filter, sort and discovery
functions of the current journal system.
Added 25 June 2013
Valeria Aman (2013)
The potential of
preprints to accelerate scholarly communication - A bibliometric
analysis based on selected journals
arXiv.org > cs > arXiv:1306.4856, 20 Jun 2013
Master's Thesis. Abstract: This paper quantifies to what extent
preprints in arXiv accelerate scholarly communication. The following
subject fields were investigated up to the year 2012: High Energy
Physics (HEP), Mathematics, Astrophysics, Quantitative Biology, and
Library and Information Science (LIS). Publication and citation data
was downloaded from Scopus and matched with corresponding preprints in
arXiv. Furthermore, the INSPIRE HEP database was used to retrieve
citation data for papers related to HEP. The bibliometric analysis
deals with the growth in numbers of articles published having a
previous preprint in arXiv and the publication delay, which is defined
as the chronological distance between the deposit of a preprint in
arXiv and its formal journal publication. Likewise, the citation delay
is analyzed, which describes the time it takes until the first citation
of preprints, and articles, respectively. Total citation numbers are
compared for sets of articles with a previous preprint and those
without. The results show that in all fields but biology a significant
citation advantage exists in terms of speed and citation rates for
articles with a previous preprint version on arXiv.
Added 21 June 2013
Carsten Nieder, Astrid Dalhaug and Gro Aandahl (2013)
Correlation
between article download and citation figures for highly accessed
articles from five open access oncology journals
SpringerPlus,
2:261, 13 June 2013
doi:10.1186/2193-1801-2-261
Abstract
(provisional): Different approaches can be chosen to quantify the
impact and merits of scientific oncology publications. These include
source of publication (including journal reputation and impact factor),
whether or not articles are cited by others, and access/download
figures. When relying on citation counts, one needs to obtain access to
citation databases and has to consider that results differ from one
database to another. Accumulation of citations takes time and their
dynamics might differ from journal to journal and topic to topic.
Therefore, we wanted to evaluate the correlation between citation and
download figures, hypothesising that articles with fewer downloads also
accumulate fewer citations. Typically, publishers provide download
figures together with the article. We extracted and analysed the 50
most viewed articles from 5 different open access oncology journals.
For each of the 5 journals and also all journals combined, correlation
between number of accesses and citations was limited (r = 0.01-0.30).
Considerable variations were also observed when analyses were
restricted to specific article types such as reviews only (r = 0.21) or
case reports only (r = 0.53). Even if year of publication was taken
into account, high correlation coefficients were the exception from the
rule. In conclusion, downloads are not a universal surrogate for
citation figures.
Added 17 June 2013
Vincent Lariviere, Cassidy R. Sugimoto, Benoit Macaluso, Stasa
Milojevic, Blaise Cronin, Mike Thelwall (2013)
arXiv e-prints
and the journal of record: An analysis of roles and relationships
arXiv.org > cs > arXiv:1306.3261, 13 Jun 2013
Abstract: Since
its creation in 1991, arXiv has become central to the diffusion of
research in a number of fields. Combining data from the entirety of
arXiv and the Web of Science (WoS), this paper investigates (a) the
proportion of papers across all disciplines that are on arXiv and the
proportion of arXiv papers that are in the WoS, (b) elapsed time
between arXiv submission and journal publication, and (c) the aging
characteristics and scientific impact of arXiv e-prints and their
published version. It shows that the proportion of WoS papers found on
arXiv varies across the specialties of physics and mathematics, and
that only a few specialties make extensive use of the repository.
Elapsed time between arXiv submission and journal publication has
shortened but remains longer in mathematics than in physics. In
physics, mathematics, as well as in astronomy and astrophysics, arXiv
versions are cited more promptly and decay faster than WoS papers. The
arXiv versions of papers - both published and unpublished - have lower
citation rates than published papers, although there is almost no
difference in the impact of the arXiv versions of both published and
unpublished papers.
Added 17 June 2013
Pekka Olsbo (2013)
Does
Openness and Open Access Policy Relate to the Success of Universities?
17th International
Conference on Electronic
Publishing, Karlskrona, Sweden, June 13-14, 2013
full paper
http://elpub.scix.net/data/works/att/110_elpub2013.content.01124.pdf
Extended Abstract. Introduction: The cross reading and examination of
the report The
state of scientific research in Finland 2012 by The
Finnish Academy and
Ranking Web of Universities seem to show that there could be a
connection between the internet visibility, ranking and the relative
citation impact of universities in different countries. These
relationships can be traced back to the effectiveness of the open
access publishing, self-archiving and Open Access policies of the
countries and the universities. This paper focuses on internet
visibility of the University of Jyväskylä and eight European
countries and how the openness of universities has developed during
last two editions of the Ranking Web of Universities.
Added 17 June 2013
Mike Taylor (2013)
The
Challenges of Measuring Social Impact Using Altmetrics
Research Trends,
Issue 33, June 2013
Abstract:
Altmetrics gives us novel ways of detecting the use and consumption of
scholarly publishing beyond formal citation, and it is tempting to
treat these measurements as proxies for social impact. However,
altmetrics is still too shallow and too narrow, and needs to increase
its scope and reach before it can make a significant contribution to
computing relative values for social impact. Furthermore, in order to
go beyond limited comparisons of like-for-like and to become generally
useful, computation models must take into account different
socio-economic characteristics and legal frameworks. However, much of
the necessary work can be borrowed from other fields, and the author
concludes that with certain extensions and added sophistication
altmetrics will be a valuable element in calculating social reach and
impact.
Occasional series: may have missed ... Added 17 June 2013
Roudabeh Torabian, Alireza Heidari, Maryam Shahrifar, Esmail Khodadi,
Safar Ali Esmaeile Vardanjani (2012)
Added 28 May 2013
Mark J. McCabe, Christopher M. Snyder (2013)
The
Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on
Cites to Science Journals Across the Quality Spectrum
Social Science Research Network SSRN, May 25, 2013
Abstract: An open-access journal allows free online access to its
articles, obtaining revenue from fees charged to submitting authors.
Using panel data on science journals, we are able to circumvent some
problems plaguing previous studies of the impact of open access on
citations. We find that moving from paid to open access increases cites
by 8% on average in our sample, but the effect varies across the
quality of content. Open access increases cites to the best content
(top-ranked journals or articles in upper quintiles of citations within
a volume) but reduces cites to lower-quality content. We construct a
model to explain these findings in which being placed on a broad
open-access platform can increase the competition among articles for
readers' attention. We can find structural parameters allowing the
model to fit the quintile results quite closely.
Added 28 May 2013
San Francisco
Declaration on Research Assessment (DORA) (2013)
American Society for Cell Biology (ASCB), 17 May 2013
From DORA: There is a pressing need to improve the ways in which the output of
scientific research is evaluated by funding agencies, academic
institutions, and other parties. ... The Journal Impact Factor is
frequently used as the primary parameter with which to compare the
scientific output of individuals and institutions. The Journal Impact
Factor, as calculated by Thomson Reuters, was originally created as a
tool to help librarians identify journals to purchase, not as a measure
of the scientific quality of research in an article. With that in mind,
it is critical to understand that the Journal Impact Factor has a
number of well-documented deficiencies as a tool for research
assessment. These limitations include: A) citation distributions within
journals are highly skewed; B) the properties of the Journal Impact
Factor are field-specific: it is a composite of multiple, highly
diverse article types, including primary research papers and reviews;
C) Journal Impact Factors can be manipulated (or "gamed") by editorial
policy; and D) data used to calculate the Journal Impact Factors are
neither transparent nor openly available to the public. ... We make a number of recommendations for improving the way in which the quality of research output is evaluated. A number of themes run through these recommendations.
"Measuring the effect for physics or astronomy is easy. This link returns the number of articles published in the Astrophysical Journal in 2003 and their number of citations.
"This next link shows the number of these papers which are available OA in the arXiv, and their citations.
"The result is that 75% of the papers are in the arXiv, and they represent 90% of the citations, a 250% OA effect.
"By replacing ApJ with the mnemonic for any other physics or astronomy
journal one can repeat the measurement; for Nuclear Physics A (NuPhA) one gets
that 32% of the articles are in the arXiv, and they represent 78% of the
citations, a 740% OA effect."
From Michael Kurtz, American Scientist Open Access Forum, 28 September 2005 http://users.ecs.soton.ac.uk/harnad/Hypermail/Amsci/4807.html
Note, the database links are 'live', i.e. they return the current database
figures, not the exact figures on which Michael Kurtz would have based his
calculations, but the percentages quoted are unlikely to change dramatically,
in the short term at least.
Elucidation of calculation (by Stevan Harnad, figures valid on 22 July 2007)
For ApJ:
TOT: articles 2592 citations 70732
Arx: articles 1943 citations 62586 c/a 32.21 (rounded to 32)
Non: articles 649 citations 8146 c/a 12.55 (rounded to 13)
Then 32/13 = 2.5 (250%)
For NuPhA:
TOT: articles 1134 citations 4451
Arx: articles 344 citations 3225 c/a 9.375
Non: articles 790 citations 1226 c/a 1.552
Then 9.375/1.552 = 6.04 (600%)
Michael Kurtz comments: "The differences in (NuPhA: 740% to 600% effect) results are because the database has changed over the past two years since I did it. There is a systematic error in the calculations for Nuclear Physics A (Elsevier does not give us the references) so the results will be higher than the true value. Physical Review C (Nuclear Physics) has an OA advantage number of 221%, the systematic in this case is small and in the other direction."
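For readers who want to repeat the arithmetic above with current figures from the live database links, a minimal sketch in Python (the 22 July 2007 ApJ and NuPhA figures from Harnad's elucidation are reused here purely as placeholders; live queries will return different counts):

```python
def oa_citation_effect(arx_articles, arx_citations, non_articles, non_citations):
    """Ratio of citations-per-article for arXiv-deposited vs. non-deposited papers,
    following the elucidation by Stevan Harnad above."""
    arx_rate = arx_citations / arx_articles   # c/a for papers also in arXiv
    non_rate = non_citations / non_articles   # c/a for papers not in arXiv
    return arx_rate / non_rate                # e.g. ~2.5 means a "250% OA effect"

# Figures valid on 22 July 2007 (from the text above)
print(oa_citation_effect(1943, 62586, 649, 8146))  # ApJ:   ~2.57 (roughly a 250% effect)
print(oa_citation_effect(344, 3225, 790, 1226))    # NuPhA: ~6.04 (roughly a 600% effect)
```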
Reviews of OA impact studies
This is an important article. It's the first major study since the famous Lawrence paper documenting the proposition that OA increases impact. It's also the first to go beyond Lawrence in scope and method in order to answer doubts raised about his thesis. By confirming that OA increases impact, it gives authors the best of reasons to provide OA to their own work (21 June 2004). Broader collaborations have emerged to extend these findings (e.g. Brody et al. 2004).
Open access has become feasible because of the move towards online publication and dissemination. A new measure that becomes possible with online publication is the number of downloads or 'hits', opening a new line of investigation. Brody et al. have been prominent in showing there is a correlation between higher downloads and higher impact, particularly for high impact papers, holding out the promise not just for higher impact resulting from open access but for the ability to predict high impact papers much earlier, not waiting years for those citations to materialise (e.g. Brody and Harnad 2005). The effect can be verified with the Correlation Generator (below).
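To illustrate the kind of download-citation analysis such tools perform, here is a minimal sketch (the download and citation counts are invented placeholders, not data from Citebase or the Correlation Generator):

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical per-paper counts: early downloads vs. citations accrued later.
downloads = [120, 45, 300, 80, 15, 210, 60, 95]
citations = [14, 3, 35, 9, 1, 22, 5, 8]

r, p = pearsonr(downloads, citations)          # linear correlation
rho, p_rank = spearmanr(downloads, citations)  # rank correlation, robust to outliers
print(f"Pearson r = {r:.2f} (p = {p:.3f}), Spearman rho = {rho:.2f}")
```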
(Note. The latest listings might include preprints, or even pre-preprints. This area of study is effectively a work in progress, and as such the list is intended to raise awareness of the most recent results, even where these may not be definitive or final versions. Check back for definitive versions.)
Added 13 May 2013
Cassidy R. Sugimoto, Mike Thelwall, Vincent Larivière, Andrew Tsou,
Philippe Mongeon, Benoit Macaluso (2013)
Scientists
Popularizing Science: Characteristics and Impact of TED Talk Presenters
PLoS ONE,
8(4): e62403, April 30, 2013
doi:10.1371/journal.pone.0062403
Abstract: The TED (Technology, Entertainment, Design) conference and associated
website of recorded conference presentations (TED Talks) is a highly
successful disseminator of science-related videos, claiming over a
billion online views. Although hundreds of scientists have presented at
TED, little information is available regarding the presenters, their
academic credentials, and the impact of TED Talks on the general
population. This article uses bibliometric and webometric techniques to
gather data on the characteristics of TED presenters and videos and
analyze the relationship between these characteristics and the
subsequent impact of the videos. The results show that the presenters
were predominately male and non-academics. Male-authored videos were
more popular and more liked when viewed on YouTube. Videos by academic
presenters were more commented on than videos by others and were more
liked on YouTube, although there was little difference in how
frequently they were viewed. The majority of academic presenters were
senior faculty, males, from United States-based institutions, were
visible online, and were cited more frequently than average for their
field. However, giving a TED presentation appeared to have no impact on
the number of citations subsequently received by an academic,
suggesting that although TED popularizes research, it may not promote
the work of scientists within the academic community.
See also
Sara Grossman, Giving
a TED Talk? Expect More Visibility, but Not More Citations, Chronicle of Higher Education,
June 19, 2013
Added 13 May 2013
Xianwen Wang, Wenli Mao, Shenmeng Xu, Chunbo Zhang (2013)
Attention History
Over Time of Scientific Literature: Metrics of Nature Publications
arXiv.org > cs > arXiv:1304.7653, 29 Apr 2013
Abstract: In this study, we report findings about patterns of the Nature metrics
page views over time. Using the page views data of papers published on
Nature, we calculate from two perspectives. The first one is the time
before their page views reach 50%/80% of the total, and the second one
is the percentage of total page views in 7 days, 30 days, and 100 days
after publication. Papers are viewed most frequently within a short
time period after publication. Respectively, 62.16% and 100% of the
papers on Nature are viewed more than half of their total times in the
first week and month. 52.48% of the papers gain more than 80% of their
total views in the following month. Meanwhile, the page views number
reaches more than 52% of the total in the first week and more than 72%
in the first month. In addition, we find that readers' attention on
Open Access publications is more enduring. Using the usage data of a
newly published paper, we conduct regression analysis to predict the
future expected total usage count of the paper.
Added 13 May 2013
Teja Koler-Povh, Goran Turk and Primož Južnič (2013)
Does
the Open Access Business Model Have a Significant Impact on the
Citation of Publications? Case Study in the Field of Civil Engineering
Proceedings of the Fifth
Belgrade International Open Access Conference 2012, 26
April 2013
DOI: 10.5937/BIOAC-68
Original conference slide presentation, 18 May 2012
http://boac.ceon.rs/public/site/Koler-Povh_Juznic_Turk.pdf
From the abstract: we have chosen to analyze the publications in three
international journals in the field of civil engineering. All of them
have an ISI impact factor in the Civil engineering subject category in
the ISI/Web of science database (WOS). The articles were classified
into two groups - the OA publications and the non-OA publications. We
analyzed all the articles published in the same year and the number of
their citations until the end of February 2012, seeking to find out if
these two groups differ from each other. From the conclusion: It was found
that OA significantly influenced the citation counts for the
articles published in the Computers & Structures journal, which
is ranked in the first quarter - according to both databases. Only the GS
database showed a significant effect of OA on citations for the
articles published in the Journal of Computing in Civil Engineering.
Neither GS nor WOS databases indicate a significant effect of OA on the
citation counts of articles in the Automation in Construction journal.
These two journals are ranked in the second quarter among 88 journals
in the same subject category, civil engineering. The present results
indicate that more research is needed to give a final answer to the
principle question of the paper: does open access have a significant
impact on citations in the field of civil engineering. Some other
potentially influential factors will be tested as well.
Added 13 May 2013
Vincent Larivière, George A. Lozano, Yves Gingras (2013)
Are elite
journals declining?
arXiv.org > cs > arXiv:1304.6460, 24 Apr 2013
Previous work indicates that over the past 20 years, the highest quality work
have been published in an increasingly diverse and larger group of
journals. In this paper we examine whether this diversification has
also affected the handful of elite journals that are traditionally
considered to be the best. We examine citation patterns over the past
40 years of 7 long-standing traditionally elite journals and 6 journals
that have been increasing in importance over the past 20 years. To be
among the top 5% or 1% cited papers, papers now need about twice as
many citations as they did 40 years ago. Since the late 1980s and early
1990s elite journals have been publishing a decreasing proportion of
these top cited papers. This also applies to the two journals that are
typically considered as the top venues and often used as bibliometric
indicators of "excellence", Science and Nature. On the other hand,
several new and established journals are publishing an increasing
proportion of most cited papers. These changes bring new challenges and
opportunities for all parties. Journals can enact policies to increase
or maintain their relative position in the journal hierarchy.
Researchers now have the option to publish in more diverse venues
knowing that their work can still reach the same audiences. Finally,
evaluators and administrators need to know that although there will
always be a certain prestige associated with publishing in "elite"
journals, journal hierarchies are in constant flux so inclusion of
journals into this group is not permanent.
See also
Hadas Shema, Elite
journals: to hell in a handbasket? Scientific American blogs, May 2, 2013
Added 13 May 2013
Paola Bongiovani, Sandra Miguel, Nancy Diana Gómez (2013)
Acceso
abierto, impacto científico y la producción científica en dos
universidades argentinas en el campo de la medicina (Open
Access, scientific impact and the scientific production in two
Argentine universities in the field of medicine)
Revista Cubana de
Información en Ciencias de la Salud, Vol 24, No 2, 2013
Abstract: This paper studies the scientific production published by researchers at two Argentine universities (National University of La Plata and National University of Rosario) in the discipline of medicine.
Objective: to establish the volume and evolution of scientific
production published in open access journals and in subscription
journals that allow self-archiving in repositories.
Methods: The scientific production for both institutions was determined by taking a
sample from Scopus and covers the period 2006-2010. It applies a
methodology based on the analysis of the access models of journals used
by researchers to publish their articles established through searches
performed using Romeo-Sherpa, Dulcinea, DOAJ, SciELO, RedALyC and
PubMed Central. Additionally, the study explores the citation levels of
articles from both institutions according to access models of journals,
comparing impact indicators from average citation per article. Results:
The two institutions generally show similar patterns to those found at
national level, although UNR, following international trends in
Medicine, has a higher percentage of articles published in open access
journals. In both cases, about half of the production could be
deposited in repositories, being pre-print versions and the author's
post print mostly allowed by editors. Conclusions: From the perspective
of the impact levels achieved, the results indicate a higher level of
citation in subscription journals with self-archiving permissions, and
this is encouraging for the promotion and development of institutional
repositories in both universities.
Added 9 April 2013
Xin Shuai, Zhuoren Jiang, Xiaozhong Liu, Johan Bollen (2013)
A
Comparative Study of Academic impact and Wikipedia Ranking
Author preprint. Joint Conference on
Digital Libraries JCDL 2013, Indianapolis, IN, July 22-26,
2013, to be presented
Abstract: In addition to its broad popularity Wikipedia is also widely
used for scholarly purposes. Many Wikipedia pages pertain to academic
papers, scholars and topics providing a rich ecology for scholarly
uses. Although many recognize the scholarly potential of Wikipedia, as
a crowdsourced encyclopedia its authority and quality is questioned due
to the lack of rigorous peer-review and supervision. Scholarly
references and mentions on Wikipedia may thus shape the societal
impact of a certain scholarly communication item, but it is not clear
whether they shape actual academic impact. In this paper we compare
the impact of papers, scholars, and topics according to two different
measures, namely scholarly citations and Wikipedia mentions. Our
results show that academic and Wikipedia impact are positively
correlated. Papers, authors, and topics that are mentioned on Wikipedia
have higher academic impact than those that are not mentioned. Our findings
validate the hypothesis that Wikipedia can help assess the impact of
scholarly publications and underpin relevance indicators for scholarly
retrieval or recommendation systems.
Added 9 April 2013
Heather Piwowar, Todd J Vision (2013)
Data reuse and
the open data citation advantage
PeerJ PrePrints, v1, 4 Apr 2013
doi: 10.7287/peerj.preprints.1
From the Abstract: Previous studies have found that papers with
publicly available datasets receive a higher number of citations than
similar studies without available data. However, few previous analyses
have had the statistical power to control for the many variables known
to predict citation rate, which has led to uncertain estimates of the
citation boost. Furthermore, little is known about patterns in data
reuse over time and across datasets. METHOD AND RESULTS: Here, we look
at citation rates while controlling for many known citation predictors,
and investigate the variability of data reuse. In a multivariate
regression on 10,555 studies that created gene expression microarray
data, we found that studies that made data available in a public
repository received 9% (95% confidence interval: 5% to 13%) more
citations than similar studies for which the data was not made
available. Date of publication, journal impact factor, open access
status, number of authors, first and last author publication history,
corresponding author country, institution citation history, and study
topic were included as covariates. The citation boost varied with date
of dataset deposition: a citation boost was most clear for papers
published in 2004 and 2005, at about 30%. ... CONCLUSION: After
accounting for other factors affecting citation rate, we find a robust
citation benefit from open data, although a smaller one than previously
reported. We conclude there is a direct effect of third-party data
reuse that persists for years beyond the time when researchers have
published most of the papers reusing their own data.
Added 9 April 2013
Philip M. Davis (2013)
Public
accessibility of biomedical articles from PubMed Central reduces
journal readership - retrospective cohort analysis
The FASEB Journal,
Federation of American Societies for Experimental Biology, April 3, 2013
doi: 10.1096/fj.13-229922
Abstract: Does PubMed Central, a government-run digital archive of biomedical articles, compete with scientific society journals? A
longitudinal, retrospective cohort analysis of 13,223 articles (5999
treatment, 7224 control) published in 14 society-run biomedical
research journals in nutrition, experimental biology, physiology, and
radiology between February 2008 and January 2011 reveals a 21.4%
reduction in full-text hypertext markup language (HTML) article
downloads and a 13.8% reduction in portable document format (PDF)
article downloads from the journals' websites when U.S. National
Institutes of Health-sponsored articles (treatment) become freely
available from the PubMed Central repository. In addition, the effect
of PubMed Central on reducing PDF article downloads is increasing over
time, growing at a rate of 1.6% per year. There was no longitudinal
effect for full-text HTML downloads. While PubMed Central may be
providing complementary access to readers traditionally underserved by
scientific journals, the loss of article readership from the journal
website may weaken the ability of the journal to build communities of
interest around research papers, impede the communication of news and
events to scientific society members and journal readers, and reduce
the perceived value of the journal to institutional subscribers.
Added 28 March 2013
David J Solomon, Mikael Laakso, Bo-Christer Björk (2013)
A
longitudinal comparison of citation rates and growth among open access
journals
Author preprint, 27 March 2013.
In Journal
of Informetrics, accepted for publication
Abstract: The study documents the growth in the number of journals and
articles along with the increase in normalized citation rates of open
access (OA) journals listed in the Scopus bibliographic database
between 1999 and 2010. Longitudinal statistics on growth in
journals/articles and citation rates are broken down by funding model,
discipline, and whether the journal was launched or had converted to
OA. The data were retrieved from the web sites of SCIMago Journal and
Country Rank (journal /article counts), JournalM3trics (SNIP2 values),
Scopus (journal discipline) and Directory of Open Access Journals
(DOAJ) (OA and funding status). OA journals/articles have grown much
faster than subscription journals but still make up less than 12% of
the journals in Scopus. Two-year citation averages for journals funded
by article processing charges (APCs) have reached the same level as
subscription journals. Citation averages of OA journals funded by other
means continue to lag well behind OA journals funded by APCs and
subscription journals. We hypothesize this is less an issue of quality
than due to the fact that such journals are commonly published in
languages other than English and tend to be located outside the four
major publishing countries.
Added 4 March 2013, updated 13 May 2013
Jerome Vanclay (2013)
Factors
affecting citation rates in Environmental Science
ePublications@SCU, Southern Cross University, 2013. In Journal
of Informetrics, Vol 7, No 2, April 2013, 265-271
http://www.sciencedirect.com/science/article/pii/S1751157712000995
http://dx.doi.org/10.1016/j.joi.2012.11.009
Abstract: Analysis of 131 publications during 2006-07 by staff of the
School of Environmental Science and Management at Southern Cross
University reveals that the journal impact factor, article length and
type (i.e., article or review), and journal self-citations affect the
citations accrued to 2012. Authors seeking to be well cited should aim
to write comprehensive and substantial review articles, and submit them
to journals with a high impact factor which has previously carried
articles on the topic. Nonetheless, strategic placement of articles is
complementary to, and no substitute for careful crafting of good
quality research. Evidence remains equivocal regarding the contribution
of an author's prior publication success (h-index) and of open-access
journals.
Added 4 March 2013, updated 13 May 2013
Caitlin Rivers (2013)
Scholarly impact of open access journals
18 March 2013 (originally written on 26 January 2013)
Data sources for this work also available on Figshare
http://figshare.com/articles/Open_access_journal_impacts/154346
Extracts: I downloaded a list of open access journals from the
Directory of Open Access Journals (DOAJ). I also downloaded a
spreadsheet of 2011 impact data from Journal Metrics, an offshoot of
Scopus that assesses journal impact. Journal Metrics provides two
impact measures: Source Normalized Impact per Paper (SNIP) and SCImago
Journal Rank (SJR). ... In terms of impact, open access still lags
behind non-OA journals. The mean SNIP of non-OA journals in 2011 was
0.83 with a max of 41, while OA journals had a mean SNIP of .57 and a
max of less than 5. The mean SJR of non-OA was .64 (max of 36), and the
OA mean was .38 (max of 7.6).
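A minimal sketch of the comparison described in this extract, assuming a DOAJ journal list and a Journal Metrics export saved as CSV files (the file names and column labels below are assumptions, not the actual layout of the Figshare data):

```python
import pandas as pd

# Hypothetical inputs: DOAJ journal list and 2011 Journal Metrics export.
doaj = pd.read_csv("doaj_journals.csv")             # assumed column: "issn"
metrics = pd.read_csv("journal_metrics_2011.csv")   # assumed columns: "issn", "snip", "sjr"

# Flag journals that appear in DOAJ as open access.
metrics["is_oa"] = metrics["issn"].isin(doaj["issn"])

# Mean and max impact by access status, mirroring the figures quoted above.
summary = metrics.groupby("is_oa")[["snip", "sjr"]].agg(["mean", "max"])
print(summary)
```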
Added 19 March 2013, updated 13 May 2013
Mark J. McCabe, Christopher M. Snyder (2013)
Does
Online Availability Increase Citations? Theory and Evidence from a
Panel of Economics and Business Journals
Social Science Research Network (SSRN), 14 March 2013, also at
http://mccabe.people.si.umich.edu/McCabe_Snyder_Revised_3_2013.pdf. Submitted to Review of Economics and
Statistics.
Revised preprint. Abstract: Does online availability boost citations? The
answer has implications for issues ranging from the value of a citation
to the sustainability of open-access journals. Using panel data on
citations to economics and business journals, we show that the enormous
effects found in previous studies were an artifact of their failure to
control for article quality, disappearing once we add fixed effects as
controls. The absence of an aggregate effect masks heterogeneity across
platforms: JSTOR stands apart from others, boosting citations around
10%. We examine other sources of heterogeneity including whether JSTOR
increases cites from authors in developing more than developed
countries and increases cites to long-tail more than superstar
articles. Our theoretical analysis informs the econometric
specification and allows us to translate our results for citation
increases into welfare terms.
Added 4 March 2013
Richard C. Doty (2013)
Tenure-Track
Science Faculty and the 'Open Access Citation Effect'
Journal of Librarianship
and Scholarly Communication, 1 (3), Mar 2013
info:doi/10.7710/2162-3309.1052
From the Abstract: The observation that open access (OA) articles
receive more citations than subscription-based articles is known as the
OA citation effect (OACE). Implicit in many OACE studies is the belief
that authors are heavily invested in the number of citations their
articles receive. This study seeks to determine what influence the OACE
has on the decision-making process of tenure-track science faculty when
they consider where to submit a manuscript for publication. METHODS
Fifteen tenure-track faculty members in the Departments of Biology and
Chemistry at the University of North Carolina at Chapel Hill
participated in semi-structured interviews employing a variation of the
critical incident technique. RESULTS Seven of the fifteen faculty
members said they would consider making a future article
freely-available based on the OACE. Due to dramatically different
expectations with respect to the size of the OACE, however, only one of
them is likely to seriously consider the OACE when deciding where to
submit their next manuscript for publication. DISCUSSION Journal
reputation and audience, and the quality of the editorial and review
process are the most important factors in deciding where to submit a
manuscript for publication. Once a subset of journals has satisfied
these criteria, financial and access issues compete with the OACE in
making a final decision.
Added 19 March 2013
Mazda Farshad, Claudia Sidler, Christian Gerber (2013)
Association
of scientific and nonscientific factors to citation rates of articles
of renowned orthopedic journals
European Orthopaedics
and Traumatology, March 2013
info:doi/10.1007/s12570-013-0174-6
From the Abstract: This investigation studied associations of
scientific and nonscientific criteria with the citation frequency of
articles in two top-ranked international orthopedic journals. Methods:
The 100 most (mean, 88 citations/5 years for cases) and 100 least
(mean, two citations/5 years for controls) cited articles published
between 2000 and 2004 in the Journal of Bone and Joint Surgery and the
Bone & Joint Journal (formerly known as JBJS (Br)), two of the
most distributed general orthopedic journals, were identified. The
association of scientific and nonscientific factors on their citation
rate was quantified. Results: Randomized controlled trials, as well as
multicenter studies with large sample sizes, were significantly more
frequent in the high citation rate group. The unadjusted odds of a
highly cited article to be supported by industry were 2.8 (95 %
confidence interval 1.5, 5.6; p<0.05) if compared with a lowly
cited article.
Added 4 March 2013
Steffen Bernius, Matthias Hanauske, Berndt Dugall, and Wolfgang König
(2013)
Exploring
the Effects of a Transition to Open Access: Insights from a Simulation
Study
Goethe University Frankfurt, Faculty of Economics and Business
Administration, 2012. In Journal
of the American Society for Information Science and Technology,
Vol. 64, No. 4, 701-726, April 2013, online: 21 Feb 2013
DOI: 10.1002/asi.22772
Abstract: The Open Access (OA) movement, which postulates gratis and
unrestricted online access to publicly funded research findings, has
significantly gained momentum in recent years. The two ways of
achieving OA are self-archiving of scientific work by the authors
(Green OA) and publishing in OA journals (Gold OA). But there is still
no consensus which model should be supported in particular. The aim of
this simulation study is to discover mechanisms and predict
developments that may lead to specific outcomes of possible market
transformation scenarios. It contributes to theories related to OA by
substantiating the argument of a citation advantage of OA articles and
by visualizing the mechanisms of a journal system collapsing in the
long-term due to the continuation of the serials crisis. The practical
contribution of this research stems from the integration of all market
players: Decisions regarding potential financial support of OA models
can be aligned with our findings as well as the decision of a
publisher to migrate his journals to Gold OA. Our results indicate that
for scholarly communication in general, a transition to Green OA
combined with a certain level of subscription-based publishing and a
migration of few top journals is the most beneficial development.
Added 4 March 2013
Christopher Hassall (2013)
Going
green: self-archiving as a means for dissemination of research output
in ecology and evolution
Ideas in Ecology and Evolution, 5 (2), Feb 2013
info:doi/10.4033/iee.v5i2.4555
Abstract: There is a perception that is prevalent within the academic
community that access to information is being restricted by the large
publishing houses that dominate academic publishing. However,
self-archiving policies that are promoted by publishers provide a
method by which this restriction can be relaxed. In this paper I
outline the motivation behind self-archiving publications in terms of
increased impact (citations and downloads of articles), increased
access for the developing world, and decreased library costs. I then
describe the current state of self-archiving policies in 165 ecology
and evolution journals. I demonstrate that the majority (52%) of papers
published in 2011 could have been self-archived in a format close to
their final form. Journals with higher impacts tend to have more
restrictive policies on self-archiving, and publishers vary in the
extent to which they impose these restrictions. Finally, I provide a
guide to academics on how to take advantage of opportunities for
self-archiving using either institutional repositories or
freely-available online tools.
Added 13 May 2013
Sayed-Amir Marashi, Seyed Mohammad Amin Hosseini-Nami, Khadijeh
Alishah, Mahdieh Hadi, Ali Karimi, Saeedeh Hosseinian, Rouhallah
RamezaniFard, Reihaneh Sadat Mirhassani, Zhaleh Hosseini, Zahra Shojaie
(2013)
Impact
of Wikipedia on Citation Trends
EXCLI Journal
(Experimental and Clinical Sciences International online journal for
advances in science), Vol. 12, 15-19, January 15, 2013, Supplementary
Information (dataset, methodology)
Guest editorial, Abstract: It has been suggested that the "visibility" of an
article influences its citation count. More specifically, it is
believed that the social media can influence article citations. Here we
tested the hypothesis that inclusion of scholarly references in
Wikipedia affects the citation trends. To perform this analysis, we
introduced a citation "propensity" measure, which is inspired by the
concept of amino acid propensity for protein secondary structures. We
show that although citation counts generally increase during time,
the citation "propensity" does not increase after inclusion of a
reference in Wikipedia.
Added 4 December 2012
Christian Gumpenberger, Maria-Antonia Ovalle-Perandones, and Juan
Gorraiz (2012)
On
the impact of Gold Open Access journals
u:scholar, Universität Wien, November 2012. In Scientometrics
info:doi/10.1007/s11192-012-0902-7
Abstract: Gold Open Access (=Open Access publishing) is for many the
preferred route to achieve unrestricted and immediate access to
research output. However, true Gold Open Access journals are still
outnumbered by traditional journals. Moreover availability of Gold OA
journals differs from discipline to discipline and often leaves
scientists concerned about the impact of these existent titles. This
study identified the current set of Gold Open Access journals featuring
a Journal Impact Factor (JIF) by means of Ulrichsweb, Directory of Open
Access Journals and Journal Citation Reports (JCR). The results were
analyzed regarding disciplines, countries, quartiles of the JIF
distribution in JCR and publishers. Furthermore the temporal impact
evolution was studied for a Top 50 titles list (according to JIF) by
means of Journal Impact Factor, SJR and SNIP in the time interval
2000-2010. The identified top Gold Open Access journals proved to be
well-established and their impact is generally increasing for all the
analyzed indicators. The majority of JCR-indexed OA journals can be
assigned to Life Sciences and Medicine. The success-rate for JCR
inclusion differs from country to country and is often inversely
proportional to the number of national OA journal titles. Compiling a
list of JCR-indexed OA journals is a cumbersome task that can only be
achieved with non-Thomson Reuters data sources. A corresponding
automated feature to produce current lists on the fly would be
desirable in JCR in order to conveniently track the impact evolution of
Gold OA journals.
Added 4 December 2012
Ling-Ling Wu, Mu-Hsuan Huang, and Ching-Yi Chen (2012)
Citation
patterns of the pre-web and web-prevalent environments: The moderating
effects of domain knowledge
Journal of the American
Society for Information Science and Technology, 63 (11),
2182-94, November 2012
info:doi/10.1002/asi.22710
Abstract: The Internet has substantially increased the online
accessibility of scholarly publications and allowed researchers to
access relevant information efficiently across different journals and
databases. Because of online accessibility,
academic researchers tend to read more, and reading has become more
superficial, such that information overload
has become an important issue. Given this circumstance, how the
Internet affects knowledge transfer, or, more specifically, the
citation behavior of researchers, has become a recent focus of
interest. This study assesses the effects of the Internet on citation
patterns in terms of 4 characteristics of cited documents: topic
relevance, author status, journal prestige, and age of references. This
work hypothesizes that academic scholars cite more topically relevant
articles, more articles written by lower status authors, articles
published in less prestigious journals, and older articles with online
accessibility. The current study also hypothesizes that researcher
knowledge level moderates such Internet effects. We chose the IT and
Group subject area and collected 241 documents published in the
pre-web period (1991-1995) and 867 documents published in the
web-prevalent period (2006-2010) in the Web of Science database. The
references of these documents were analyzed to test the proposed
hypotheses, which are significantly supported by the empirical results.
Added 4 December 2012
V. Calcagno, E. Demoinet, K. Gollner, L. Guidi, D. Ruths, C. de
Mazancourt (2012)
Flows
of Research Manuscripts Among Scientific Journals Reveal Hidden
Submission Patterns
Science, 11
October 2012
info:doi/10.1126/science.1227833
Abstract: The study of science-making is a growing discipline that
builds largely on online publication and citation databases, while
prepublication processes remain hidden. Here, we report results from a
large-scale survey of the submission process, covering 923 scientific
journals from the biological sciences in years 2006-2008. Manuscript
flows among journals revealed a modular submission network, with
high-impact journals preferentially attracting submissions. However,
about 75% of published articles were submitted first to the journal
that would publish them, and high-impact journals published
proportionally more articles that had been resubmitted from another
journal. Submission history affected postpublication impact:
Resubmissions from other journals received significantly more citations
than first-intent submissions, and resubmissions between different
journal communities received significantly fewer citations.
See also
Philip Ball, Rejection
improves eventual impact of manuscripts, Nature News, 11
Oct 2012: Just had your paper rejected? Don't worry - that might boost
its ultimate citation tally. An excavation of scientific papers'
usually hidden prepublication trajectories from journal to journal has
found that papers published after having first been rejected elsewhere
receive significantly more citations on average than ones accepted on
first submission.
Ruth Williams, The
Benefits of Rejection, The
Scientist, 11 Oct 2012: A survey of the
prepublication histories of papers reveals that manuscripts that are
rejected then resubmitted are cited more often.
From Vincent Calcagno's research:
More
about submission flows, October 19, 2012: Mail contact to
obtain the raw data file
The
benefits of rejection, continued, October 23, 2012: On Figure
4A, "One result that attracts considerable attention in our article on
Submission Flows"
Added 4 December 2012, updated 17 June 2013
Mikael Laakso and Bo-Christer Björk (2012)
Delayed
Open Access - an overlooked high-impact category of openly available
scientific literature
HARIS (Hanken Research Information System), 10 October 2012. Journal of the American Society
for Information Science and Technology, published online 23 May 2013
http://onlinelibrary.wiley.com/doi/10.1002/asi.22856/abstract
DOI: 10.1002/asi.22856
Abstract: Delayed open access (OA) refers to scholarly articles in
subscription journals made available openly on the web directly through
the publisher at the expiry of a set embargo period. Though a
substantial number of journals have practiced delayed OA since they
started publishing e-versions, empirical studies concerning open access
have often overlooked this body of literature. This study provides
comprehensive quantitative measurements by identifying delayed OA
journals, collecting data concerning their publication volumes, embargo
lengths, and citation rates. Altogether 492 journals were identified,
publishing a combined total of 111,312 articles in 2011. 77.8% of these articles were made open access within 12 months from publication, with 85.4% becoming available within 24 months. A journal impact
factor analysis revealed that delayed OA journals have on average twice
as high average citation rates compared to closed subscription
journals, and three times as high as immediate OA journals. Overall the
results demonstrate that delayed OA journals constitute an important
segment of the openly available scholarly journal literature, both by
their sheer article volume as well as by including a substantial
proportion of high impact journals.
Added 4 December 2012
Philip Davis (2012)
The
Effect of Public Deposit of Scientific Articles on Readership
The Physiologist,
55 (5), 161-5, October 2012
Note, this link will download the whole journal issue, not just the
cited paper. Abstract: A longitudinal cohort analysis of 3,499 articles
published in 12 physiology journals reveals a 14% reduction in full
text article downloads when they are made publicly available from the
PubMed Central archive. The loss of article readership from the journal
website may weaken the ability of the publisher to build communities of
interest around the research article, impede the communication of news
and events with society members and reduce the perceived value of the
journal to institutional subscribers.
See also
Philip Davis, Is
PubMed Central Complementing or Competing with Journal Publishers?
Scholarly Kitchen, September 20, 2012
Added 4 December 2012
G Mahesh (2012)
Open
access and impact factors
Current Science,
103 (6), 610, 25 September 2012
Correspondence. Extract: The case-in-point is CSIR-NISCAIR journals.
The institute publishes 17 primary journals and on the First
International Open Access Day on 14 October 2008, NISCAIR made two of
its journals open access and by mid-2009 all its journals were
available in this mode. Going by the recently released Journal Citation
Reports, for the first time two CSIR-NISCAIR journals have crossed IF
1, and as shown in Figure 1, almost all journals have increased their
impact factors in 2011 over the previous years. It appears that the
increased 2011 IFs are a result of the journals having gone open access
from 2008 to 2009 onwards.
Added 4 December 2012
Filippo Radicchi (2012)
In science "there
is no bad publicity": Papers criticized in technical comments have high
scientific impact
arXiv.org > physics > arXiv:1209.4997, 22
September 2012
From the abstract: Technical comments are special types of scientific
publications whose aim is to correct or criticize previously published
papers. Often, comments are negatively perceived by the authors of the
criticized articles because believed to make the commented papers less
worthy or trusty to the eyes of the scientific community. Thus, there
is a tendency to think that criticized papers are predestined to have
low scientific impact. We show here that such belief is not supported
by empirical evidence. We consider thirteen major publication outlets
in science and perform a large-scale analysis of the citation patterns
of criticized publications. We find that commented papers have not only
average citation rates much higher than those of non commented
articles, but also unexpectedly over-populate the set of the most cited
publications within a journal. Since comments are published soon after
criticized papers, comments can be viewed as early indicators of the
future impact of criticized papers.
Added 14 September 2012
Sara Pérez Álvarez, Felipe P. Álvarez Arrieta, and Isidro F. Aguillo (2012)
EU
FP7 research in Open Access Repositories
Digital.CSIC, the Institutional Repository of the Spanish National
Research Council (CSIC), 21 Aug 2012. In 17th International Conference on
Science and Technology Indicators (STI), 5-8 September 2012, Montreal,
http://sticonference.org/Proceedings/vol1/Alvarez_EU_58.pdf
Abstract: Open access repositories are a reliable source of academic
items that can be used for testing the capabilities of the webometric
analysis. This paper deals with actions needed for extracting web
indicators from bibliographic records in open access repositories,
provides guidelines to support a further webometric study and presents
the results of a preliminary web impact evaluation carried out over a
sample of 1386 EU FP7 output papers available from the OpenAIRE
database. The European Commission project OpenAIRE aims, among other
objectives, to provide impact measures to assess the research
performance from repositories contents and, especially, of Special
Clause 39 project participants within EU FP7. Using URL citations,
title mentions and copies of titles as main web impact indicators, this
study suggests that, a priori, the implementation of the mandatory clause SC39 to encourage open access to European research may indeed have resulted in a greater and more immediate web visibility of these papers.
Added 4 December 2012
Melissa Terras (2012)
The
Impact of Social Media on the Dissemination of Research: Results of an
Experiment
Journal of Digital
Humanities, 1 (3), September 2012
Three collected blog posts, revised with a new introduction for this
journal presentation: The first, What Happens When You Tweet an
Open-Access Paper discusses the correlation between talking about an
individual paper online, and seeing its downloads increase. The second,
Is Blogging and Tweeting About Research Papers Worth It? The Verdict
discusses the overall effect of this process on all my papers,
highlighting what I think the benefits of open access are. In the final
post, When Was the Last Time You Asked How Your Published Research Was
Doing? I talk about the link between publishers and open access, and
how little we know about how often our research is accessed once it is
published.
Added 16 August 2012, updated 17 June 2013
Xianwen Wang, Zhi Wang, and Shenmeng Xu (2012)
Tracing
scientists' research trends realtimely
arXiv.org > cs > arXiv:1208.1349, 07 Aug 2012
In Scientometrics, 95 (2):717-729, 5 May 2013
doi: 10.1007/s11192-012-0884
From the Abstract: In this research, we propose a method to trace
scientists' research trends realtimely. By monitoring the downloads of
scientific articles in the journal of Scientometrics for 744 hours,
namely one month, we investigate the download statistics. Then we
aggregate the keywords in these downloaded research papers, and analyze
the trends of article downloading and keyword downloading.
Added 16 August 2012
Maged Boulos and Patricia Anderson (2012)
Preliminary
survey of leading general medicine journals' use of Facebook and Twitter
Journal of the Canadian
Health Libraries Association, 33 (02), 38-47, 01 Aug 2012
info:doi/10.5596/c2012-010
From the Abstract: Methods: We selected the top 25 general medicine
journals on the Thomson Reuters Journal Citation Report (JCR) list. We
surveyed their Facebook and Twitter presences and scanned their Web
sites for any Facebook and (or) Twitter features as of November 2011.
Results/Discussion: 20 of 25 journals had some sort of Facebook
presence, with 11 also having a Twitter presence. Total Likes across
all of the Facebook pages for journals with a Facebook presence were
321,997, of which 259,902 came from the New England Journal of
Medicine (NEJM) alone. The total numbers of Twitter Followers were
smaller by comparison when compiled across all surveyed journals.
Likes and Followers are not the equivalents of total accesses but
provide some proxy measure for impact and popularity. Those journals in
our sample making best use of the open sharing nature of social media
are closed-access; with the leading open access journals on the list
lagging behind by comparison.
See also this similar finding: Sandra L De Groote, Promoting
health sciences journal content with Web 2.0: A snapshot in time,
First Monday,
Vol 17, No 8, 6 August 2012: "Traditional journals were more likely to
use Web 2.0 technology than open access journals."
Added 16 August 2012
Bo-Christer Björk and David Solomon (2012)
Open
access versus subscription journals: a comparison of scientific impact
BMC Medicine,
10 (1), 17 Jul 2012
info:doi/10.1186/1741-7015-10-73
From the Abstract: In the past few years there has been an ongoing
debate as to whether the proliferation of open access (OA) publishing
would damage the peer review system and put the quality of scientific
journal publishing at risk. Our aim was to inform this debate by
comparing the scientific impact of OA journals with subscription
journals, controlling for journal age, the country of the publisher,
discipline and (for OA publishers) their business model. A total of 610
OA journals were compared with 7,609 subscription journals using Web of
Science citation data while an overlapping set of 1,327 OA journals
were compared with 11,124 subscription journals using Scopus data.
Overall, average citation rates, both unweighted and weighted for the
number of articles per journal, were about 30% higher for subscription
journals. However, after controlling for discipline (medicine and
health versus other), age of the journal (three time periods) and the
location of the publisher (four largest publishing countries versus
other countries) the differences largely disappeared in most
subcategories except for journals that had been launched prior to 1996.
OA journals that fund publishing with article processing charges (APCs)
are on average cited more than other OA journals. In medicine and
health, OA journals founded in the last 10 years are receiving about as
many citations as subscription journals launched during the same period.
See also
Open
access means business: Pay-to-publish approaching same impact factor as
subscription journals, Science Codex, 17 Jul 2012
David Solomon and Bo-Christer Björk, Opinion:
OA Coming of Age, The
Scientist, 06 Aug 2012
Added 16 August 2012
Bertil Dorch (2012)
On the
Citation Advantage of linking to data
hprints.org, Nordic Arts and Humanities and Social Sciences e-print
repository, 05 Jul 2012
Abstract: This paper presents some indications of the existence of a
Citation Advantage related to linked data, using astrophysics as a
case. Using simple measures, I find that the Citation Advantage
presently (at least since 2009) amounts to papers with links to
data receiving on average 50% more citations per paper per year
than papers without links to data. A similar study by other authors
showed a cumulative effect after several years amounting to 20%.
Hence, a Data Sharing Citation Advantage seems inevitable.
Added 4 December 2012
Adam Eyre-Walker (2012)
Can
we assess the quality and impact of science? (video)
YouTube, 02 July 2012
In International workshop on Evolution in the Time of Genomics - part
08, May 2012, Venice. Quoted extracts: "(F1000) assessors are highly
influenced by the impact factor of the journal and less by the actual
intrinsic quality of the paper"; "Without the information of what
journal that paper is published in you have very little power to tell
the ultimate impact of that paper"; "People tend to over-rate the
quality of science in high-impact factor journals and that is a very
important influence"; "The difference between the really high impact
journals and the medium quality journals is nothing like as dramatic as
we might think". There is a brief mention of open access impact only in
questions following the presentation, which assumes open access journals.
Added 27 June 2012
Sharon Mathelus, Ginny Pittman, and Jill Yablonski-Crepeau (2012)
Promotion of
research articles to the lay press: a summary of a three-year project
Learned Publishing,
Vol 25, No 3, July 2012, 207-212
Abstract: The promotion of scholarly journal articles to journalists
and bloggers via the dissemination of press releases generates a
positive impact on the number of citations that publicized journal
articles receive. Research by John Wiley & Sons, Inc. shows
that article-level publicity efforts and media coverage boost
downloads by an average of 1.8 times and were found to increase
citations by as much as 2.0-2.2 times in the articles analyzed in this
study. We evaluated scholarly journal articles published in nearly 100
Wiley journals, which were also covered in 296 press releases. The
results in this case study suggest a need for greater investment in
media support for scholarly journals publishing research that sparks
interest among a broad news audience, as it could increase citations.
Added 27 June 2012
M. Riera and E. Aibar (2012)
Does
open access publishing increase the impact of scientific articles? An
empirical study in the field of intensive care medicine
Medicina intensiva /
Sociedad Espanola de Medicina Intensiva y Unidades Coronarias, 07
June 2012. Via NCBI PubMed.gov
info:pmid/22683044 | info:doi/10.1016/j.medin.2012.04.002
METHODS: We evaluated a total of 161 articles (76% being non-open
access articles) published in Intensive Care Medicine in the year 2008.
Citation data were compared between the two groups up until April 30,
2011. Potentially confounding variables for citation counts were
adjusted for in a linear multiple regression model. RESULTS: The median
number (interquartile range) of citations of non-open access articles
was 8 (4-12) versus 9 (6-18) in the case of open access articles
(p=0.084). In the highest citation range (>8), the citation
count was 13 (10-16) and 18 (13-21) (p=0.008), respectively. The mean
follow-up was 37.5±3 months in both groups. In the 30-35 months after
publication, the average number (mean±standard deviation) of citations
per article per month of non-open access articles was 0.28±0.6 versus
0.38±0.7 in the case of open access articles (p=0.043). Independent
factors for citation advantage were the Hirsch index of the first
signing author (β=0.207; p=0.015) and open access status (β=3.618;
p=0.006). CONCLUSIONS: Open access publishing and the Hirsch index of
the first signing author increase the impact of scientific articles.
The open access advantage is greater for the more highly cited
articles, and appears in the 30-35 months after publication.
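Note: the kind of adjusted comparison described above - a linear multiple
regression of citation counts on open access status plus potential
confounders - can be sketched as follows. This is a minimal illustration in
Python; the data, column names and choice of confounders are hypothetical
and are not taken from the study.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per article published in 2008,
# with citations counted up to a fixed census date.
articles = pd.DataFrame({
    "citations":      [4, 12, 9, 18, 7, 25, 6, 14],
    "open_access":    [0, 1, 0, 1, 0, 1, 0, 1],     # 1 = open access article
    "h_index_author": [5, 8, 3, 12, 6, 15, 4, 9],   # first signing author's Hirsch index
    "n_authors":      [3, 6, 4, 8, 2, 7, 5, 6],     # example confounding variable
})

# Linear multiple regression of citation counts on OA status,
# adjusting for the potentially confounding variables.
model = smf.ols("citations ~ open_access + h_index_author + n_authors",
                data=articles).fit()
print(model.summary())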
Added 27 June 2012
Judit Bar-Ilan, Stefanie Haustein, Isabella Peters, Jason Priem, Hadas
Shema, Jens Terliesner (2012)
Beyond citations:
Scholars' visibility on the social Web
arXiv.org
> cs > arXiv:1205.5611, 25 May 2012. In 17th
International
Conference on Science and Technology Indicators, Montreal, 5-8 Sept.
2012, http://sticonference.org/Proceedings/vol1/Bar-Ilan_Beyond_98.pdf
Abstract: Traditionally, scholarly impact and visibility have
been measured by counting publications and citations in the scholarly
literature. However, increasingly scholars are also visible on the Web,
establishing presences in a growing variety of social ecosystems. But
how wide and established is this presence, and how do measures of
social Web impact relate to their more traditional counterparts? To
answer this, we sampled 57 presenters from the 2010 Leiden STI
Conference, gathering publication and citation counts as well as data
from the presenters' Web "footprints." We found Web presence widespread
and diverse: 84% of scholars had homepages, 70% were on LinkedIn, 23%
had public Google Scholar profiles, and 16% were on Twitter. For
sampled scholars' publications, social reference manager bookmarks were
compared to Scopus and Web of Science citations; we found that Mendeley
covers more than 80% of sampled articles, and that Mendeley bookmarks
are significantly correlated (r=.45) to Scopus citation counts.
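Note: the correlation reported above (Mendeley bookmarks vs. Scopus
citations, r=.45) is the kind of statistic computed in the minimal sketch
below (Python, hypothetical per-article counts). The abstract does not state
which correlation coefficient was used, so Spearman's rank correlation is
assumed here as a common choice for skewed count data.

from scipy.stats import spearmanr

# Hypothetical per-article counts (not the study's data)
mendeley_bookmarks = [3, 15, 0, 22, 7, 1, 40, 5, 12, 9]
scopus_citations   = [1, 20, 0, 35, 4, 2, 55, 3, 10, 8]

rho, p_value = spearmanr(mendeley_bookmarks, scopus_citations)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")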
Added 27 June 2012
George Lozano, Vincent Lariviere, and Yves Gingras (2012)
The weakening
relationship between the Impact Factor and papers' citations in the
digital age
arXiv.org > cs > arXiv:1205.4328, 19 May 2012. In
Journal of the American Society for Information Science and Technology,
Vol 63, No 11, 2140-2145, November 2012
http://onlinelibrary.wiley.com/doi/10.1002/asi.22731/abstract
From the Abstract: We compare the strength of the relationship between
journals' Impact Factors and the actual citations received by their
respective papers from 1902 to 2009. Throughout most of the 20th
century, papers' citation rates were increasingly linked to their
respective journals' Impact Factors. However, since 1990, the advent of
the digital age, the strength of the relation between Impact Factors
and paper citations has been decreasing. This decrease began sooner in
physics, a field that was quicker to make the transition into the
electronic domain. Furthermore, since 1990, the proportion of highly
cited papers coming from highly cited journals has been decreasing, and
accordingly, the proportion of highly cited papers not coming from
highly cited journals has also been increasing. Should this pattern
continue, it might bring an end to the use of the Impact Factor as a
way to evaluate the quality of journals, papers and researchers.
See also
George Lozano, The demise of the Impact Factor: The
strength of the relationship between citation rates and IF is down to
levels last seen 40 years ago, Impact of Social Sciences, 08
June 2012: Lozano discusses the recent paper that he co-authored.
Robinson Meyer, Thanks
to the Web, Even Scientists Are Reading for the Articles, The Atlantic, July
9, 2012
Study
reveals declining influence of high impact factor journals,
Université de Montréal News, 07 Nov 2012
Added 27 June 2012
Henk Moed (2012)
Does
open access publishing increase citation or download rates?
Research Trends,
No. 28, May 2012
The effect of "Open Access" (OA) on the visibility or impact of scientific
publications is one of the most important issues in the fields of
bibliometrics and information science. During the past 10 years
numerous empirical studies have been published that examine this issue
using various methodologies and viewpoints. Comprehensive reviews and
bibliographies are given amongst others by OPCIT, Davis and Walters and
Craig et al. The aim of this article is not to replicate or update
these thorough reviews. Rather, it aims to present the two main
methodologies that were applied in these OA-related studies and to
discuss their potential and limitations. The first method is
based on citation analyses; the second on usage analyses.
Added 27 June 2012
Brian Kelly and Jenny Delasalle (2012)
Can LinkedIn and
Academia.edu Enhance Access to Open Repositories?
Opus: Online Publications Store, University of Bath, May 2012.
In OR2012: 7th International Conference on Open Repositories, 9-13 July
2012, Edinburgh
From the Abstract: we are witnessing the increasing take-up of a range
of third-party services such as LinkedIn and Academia.edu which are being
used by researchers to publish information related to their
professional activities, including details of their research
publications. The paper provides evidence which suggests that personal
use of such services can increase the number of downloads by increasing
SEO (Search Engine Optimisation) rankings through inbound links from
highly ranked web sites. A survey of use of such services across
Russell Group universities shows the popularity of a number of social
media services. In the light of existing usage of these services this
paper proposes that institutional encouragement of their use by
researchers may generate increased accesses to institutional research
publications at little cost to the institution. This paper concludes by
describing further work which is planned in order to investigate the
SEO characteristics of institutional repositories.
Added 27 June 2012
Carlos Paiva, Joao da Silveira, and Bianca Ribeiro (2012)
Articles
with short titles describing the results are cited more often
Clinics, 67
(5), 509-13, May 2012. Via PubMed Central
info:doi/10.6061/clinics/2012(05)17
From the Abstract: OBJECTIVE: The aim of this study was to evaluate some
features of article titles from open access journals and to assess the
possible impact of these titles on predicting the number of article
views and citations. RESULTS: Short-titled articles had higher viewing
and citation rates than those with longer titles. Titles containing a
question mark, containing a reference to a specific geographical
region, and that used a colon or a hyphen were associated with a lower
number of citations. Articles with results-describing titles were cited
more often than those with methods-describing titles. After
multivariate analysis, only a low number of characters and title
typology remained as predictors of the number of citations.
Added 27 June 2012
Tom Rees, Katherine Ayling-Rouse, and Sheelah Smith (2012)
Accesses
versus citations: Why you need to measure both to assess publication
impact
8th Annual Meeting of
ISMPP (International Society for Medical Publication Professionals),
Baltimore, MD, April 23-25, 2012
Poster paper. Abstract: OBJECTIVE: Article accesses and citations provide 2
metrics to assess article impact. However, the relationship between the
2 is not constant or well understood. We investigated the relationship
between article accesses and citations in 3 general medicine journals
with different journal rankings. RESEARCH DESIGN AND METHODS: We
collected the numbers of article accesses and citations from a
representative selection of original research articles published in
2009 and 2010 in 3 peer-reviewed, international, online-only,
open-access journals: PLoS Medicine, BMC Medicine, and the
International Journal of General Medicine (IJGM) (SCImago journal
ranking 1.04, 0.49, and 0.06, respectively). RESULTS: The sample
included 104 articles (2 outliers were excluded). CONCLUSION: The
relationship between article accesses and citations varies, with the
highest ratio of citations:access for journals with the highest journal
ranking. For open-access journals with a low impact factor, overall
article reach may be higher than expected on the basis of citations.
Added 27 June 2012
Patrick Vandewalle (2012)
Code Sharing is Associated
with Research Impact in Image Processing
Reproducible Research Repository, EPFL, Lausanne, 23 Apr 2012.
IEEE Computing in
Science and Engineering, Vol 14, No 4, 42-47, July-Aug 2012
http://dx.doi.org/10.1109/MCSE.2012.63
Abstract:
In computational sciences such as image processing, the publication
itself is often not enough to allow other researchers to verify the
results by repeating the described experiments. In many cases,
supplementary material such as source code and measurement data are
required, or can at least be very helpful. Still, only approximately
10% of recently published papers in image processing have code
available online. One of the arguments for not making code available is
the extra time required to prepare the material. In this paper, we
claim that this additional time may be well spent, as the availability
of code for a publication is associated with an increase in the
expected number of citations. We show this with exploratory analyses of
the relationship between code availability and the number of citations
for image processing papers.
Note on open access citation impact on
results (p4): As can be seen in the open access citation studies (such
as the one by Lawrence), papers for which an online version is freely
available have an increased number of citations. I did not take this
into account in my analyses by adding the open access availability as
another variable. Articles that have code available generally also have
an online version of the article. The citation effect seen above is
therefore the combined effect of the open access availability of the
paper and the availability of code.
Added 23 April 2012
Henk Moed (2012)
The
Effect of Open Access upon Citation Impact
Editors' Update, Elsevier.com, 22 Mar 2012
Does Open Access publishing increase citation rates? From a
methodological point of view, the debate focuses on biases, control
groups, sampling, and the degree to which conclusions from case studies
can be generalized. This note does not give a complete overview of
studies that were published during the past decade but highlights key
events. An extended version of this paper will be published.
Added 23 April 2012
Jason Priem, Heather Piwowar, and Bradley Hemminger (2012)
Altmetrics in the
wild: Using social media to explore scholarly impact
arXiv.org > cs > arXiv:1203.4745, 20 Mar 2012
From the Abstract: In growing numbers, scholars are integrating social
media tools like blogs, Twitter, and Mendeley into their professional
communications. The online, public nature of these tools exposes and
reifies scholarly processes once hidden and ephemeral. Metrics based on
these activities could inform broader, faster measures of impact,
complementing traditional citation metrics. This study explores the
properties of these social media-based metrics or "altmetrics",
sampling 24,331 articles published by the Public Library of Science.
See also
Heather Piwowar, Altmetrics
shows that citations can't stand up to the full 31 flavours of research
impact, Impact of Social Sciences, 04 Apr 2012
Added 23 April 2012
Xin Shuai, Alberto Pepe, and Johan Bollen (2012)
How the
Scientific Community Reacts to Newly Submitted Preprints: Article
Downloads, Twitter Mentions, and Citations
arXiv.org > cs > arXiv:1202.2461, 11 Feb 2012.
PLoS ONE, 7(11): e47523, November 1, 2012
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0047523
doi:10.1371/journal.pone.0047523
Abstract:
We analyze the online response of the scientific community to the
preprint publication of scholarly articles. We employ a cohort of 4,606
scientific articles submitted to the preprint database arXiv.org
between October 2010 and April 2011. We study three forms of reactions
to these preprints: how they are downloaded on the arXiv.org site, how
they are mentioned on the social media site Twitter, and how they are
cited in the scholarly record. We perform two analyses. First, we
analyze the delay and time span of article downloads and Twitter
mentions following submission, to understand the temporal configuration
of these reactions and whether significant differences exist between
them. Second, we run correlation tests to investigate the relationship
between Twitter mentions and both article downloads and article
citations. We find that Twitter mentions follow rapidly after article
submission and that they are correlated with later article downloads
and later article citations, indicating that social media may be an
important factor in determining the scientific impact of an article.
Added 12 February 2012
Heekyung Kim (2012)
The
Effect of Free Access on the Diffusion of Scholarly Ideas
MIS Speaker's Series, University of Arizona, 24 January 2012
From the Abstract: By using a dataset from the Social Science Research
Network (SSRN), an open repository of research articles, and employing
a natural experiment that allows the estimation of the value of free
access separate from confounding factors such as early viewership and
quality differential, this study identifies the causal effect of free
access on the citation counts. The natural experiment in this study is
that a select group of published articles is posted on SSRN at a time
chosen by their authors' affiliated organizations or SSRN, not by their
authors. Using a difference-in-difference method and comparing the
citation profiles of the articles before and after the posting time on
SSRN against a group of control articles with similar characteristics,
I estimated the effect of the SSRN posting on citation counts. The
articles posted on SSRN receive more citations even prior to being
posted on SSRN, suggesting that they are of higher quality. Their
citation counts further increase after being posted, gaining an
additional 10-20% of citations. This gain is likely to be caused by the
free access that SSRN provides.
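Note: the difference-in-difference approach described above can be sketched
as below (Python; the panel data, column names and specification are
illustrative assumptions, not the author's code). The coefficient on the
interaction term estimates the additional citations gained after SSRN
posting, relative to the control articles.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: yearly citation counts per article, before and
# after the SSRN posting date (treated articles) or a matched date (controls).
panel = pd.DataFrame({
    "citations":      [2, 3, 5, 9, 1, 2, 2, 3],
    "posted_on_ssrn": [1, 1, 1, 1, 0, 0, 0, 0],  # 1 = article posted on SSRN
    "post_period":    [0, 0, 1, 1, 0, 0, 1, 1],  # 1 = observation after posting date
})

# Difference-in-differences: the interaction term estimates the
# citation gain attributable to free access via SSRN.
did = smf.ols("citations ~ posted_on_ssrn * post_period", data=panel).fit()
print(did.params["posted_on_ssrn:post_period"])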
Added 12 February 2012
Jingfeng Xia and Ying Liu (2012)
Usage
Patterns of Open Genomic Data
College &
Research Libraries, 09 January 2012
Pre-print. Abstract: This paper uses Genome Expression Omnibus (GEO), a
data repository in biomedical sciences, to examine the usage patterns
of open data repositories. It attempts to identify the degree of
recognition of data reuse value and understand how e-science has
impacted large-scale scholarship. By analyzing a list of 1,211
publications that cite GEO data to support their independent studies,
it discovers that free data can support a wealth of high quality
investigations, that the rate of open data use keeps growing over the
years, and that scholars in different countries show different rates of
complying with data sharing policies.
Added 12 February 2012
Jingfeng Xia and Katie Nakanishi (2012)
Self-Selection
and the Citation Advantage of Open Access Articles
Online Information Review,
36 (1), 2012
(Subscription access required.) From the Abstract: This research
examines the relationship between the open access availability of
journal papers in anthropology and their citation conditions. We apply
a statistical logistic regression model to explore this relationship,
and compare two groups of papers - those published in high-ranked
journals and those in low-ranked journals, based on journal impact
factor - to examine the likelihood that open access status is
correlated to scholarly impact. The results reveal that open access
papers in general receive more citations. Moreover this research finds
that 1) papers in high-ranked journals do not have a higher open access
rate, and 2) papers in lower-ranked journals have a greater rate of
citations if they are freely accessible. The findings are contrary to
the existing theory that the higher citation rate of open access papers
is caused by authors posting their best papers online.
Added 12 February 2012
Patricia Shields, Nandhini Rangarajan, and Lewis Stewart (2012)
Open
Access Digital Repository: Sharing Student Research with the World
Journal of Public
Affairs Education, 18 (1), 157-81, 2012
Note, this link will download the pdf of the full Winter 2012 journal
issue - go to page 157. From the Abstract: We study the impact of
content factors and search engine optimization factors on download
rates of capstone papers. We examined all 290 MPA capstone papers at
Texas State University which have been made available through an online
digital repository for public consumption. Results show strong support
for the impact of search engine factors on download rates. The
implications of high download rates of MPA capstone papers on public
administration research, practice, and education are discussed in this
paper.
Added 12 February 2012
Gunther Eysenbach (2011)
Can Tweets
Predict Citations? Metrics of Social Impact Based on Twitter and
Correlation with Traditional Metrics of Scientific Impact
Journal of Medical
Internet Research, 13 (4), 16 December 2011
info:doi/10.2196/jmir.2012
From the Abstract: Between July 2008 and November 2011, all tweets
containing links to articles in the Journal of Medical Internet
Research (JMIR) were mined. A total of 4208 tweets cited 286 distinct
JMIR articles. Highly tweeted articles were 11 times more likely to be
highly cited than less-tweeted articles (9/12 or 75% of highly tweeted
articles were highly cited, while only 3/43 or 7% of less-tweeted
articles were highly cited; rate ratio 0.75/0.07 = 10.75, 95%
confidence interval 3.4-33.6). Top-cited articles can be predicted
from top-tweeted articles with 93% specificity and 75% sensitivity.
Conclusions: Tweets can predict highly cited articles within the first
3 days of article publication. Social media activity either increases
citations or reflects the underlying qualities of the article that also
predict citations, but the true use of these metrics is to measure the
distinct concept of social impact. Social impact measures based on
tweets are proposed to complement traditional citation metrics. The
proposed twimpact factor may be a useful and timely metric to measure
uptake of research findings and to filter research findings resonating
with the public in real time.
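Note: the rate ratio and confidence interval quoted above can be reproduced
with a short calculation. The sketch below (Python) applies the standard
log-transform approximation for the confidence interval of a ratio of
proportions to the counts reported in the abstract.

import math

# Counts reported in the abstract
highly_tweeted_cited, highly_tweeted_total = 9, 12   # 75% highly cited
less_tweeted_cited, less_tweeted_total = 3, 43       # ~7% highly cited

p1 = highly_tweeted_cited / highly_tweeted_total
p2 = less_tweeted_cited / less_tweeted_total
rate_ratio = p1 / p2                                 # ~10.75

# Approximate 95% CI via the log-transform method for a risk ratio
se_log_rr = math.sqrt(
    1 / highly_tweeted_cited - 1 / highly_tweeted_total
    + 1 / less_tweeted_cited - 1 / less_tweeted_total
)
lower = math.exp(math.log(rate_ratio) - 1.96 * se_log_rr)  # ~3.4
upper = math.exp(math.log(rate_ratio) + 1.96 * se_log_rr)  # ~33.6
print(f"rate ratio = {rate_ratio:.2f}, 95% CI {lower:.1f}-{upper:.1f}")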
See also
Gunther Eysenbach, Correction:
Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter
and Correlation with Traditional Metrics of Scientific Impact, Journal of Medical Internet
Research, 14 (1), 04 January 2012
info:doi/10.2196/jmir.2041
A minor error in the references section in the originally published
version of the editorial by Eysenbach (J Med Internet Res
2011;13[4]:e123) on the relationship between citations and tweetations
has been corrected; in addition, references being part of the dataset
are no longer cited as references. The now corrected problem with the
references was a formatting/presentation problem only and had no
impact on the study findings.
Haydn Shaughnessy, How
Could Twitter Influence Science (And Why Scientists Are on Board),
Forbes, 15 Jan 2012
Added 12 February 2012
Hyeon-Eui Kim, Xiaoqian Jiang, Jihoon Kim, Lucila Ohno-Machado (2011)
Trends
in biomedical informatics: most cited topics from recent years
Journal of the American
Medical Informatics Association, 18 (Suppl 1), 01
December 2011
info:pmid/22180873 | info:doi/10.1136/amiajnl-2011-000706
Abstract: Biomedical informatics is a young, highly interdisciplinary
field that is evolving quickly. It is important to know which published
topics in generalist biomedical informatics journals elicit the most
interest from the scientific community, and whether this interest
changes over time, so that journals can better serve their readers. It
is also important to understand whether free access to biomedical
informatics articles impacts their citation rates in a significant way,
so authors can make informed decisions about unlock fees, and journal
owners and publishers understand the implications of open access. The
topics and JAMIA articles from years 2009 and 2010 that have been most
cited according to the Web of Science are described. To better
understand the effects of free access in article dissemination, the
number of citations per month after publication for articles published
in 2009 versus 2010 was compared, since there was a significant change
in free access to JAMIA articles between those years. Results suggest
that there is a positive association between free access and citation
rate for JAMIA articles.
Added 12 February 2012; updated 4 March 2013
Sears, J. R. (2011)
Data Sharing Effect on Article Citation Rate in Paleoceanography
American Geophysical Union, Fall Meeting 2011
Abstract: The validation of scientific results requires reproducible methods and data.
Often, however, data sets supporting research articles are not openly accessible and
interlinked. This analysis tests whether open sharing and linking of supporting data
through the PANGAEA data library measurably increases the citation rate of articles
published between 1993 and 2010 in the journal Paleoceanography as reported in the
Thomson Reuters Web of Science database. The 12.85% (171) of articles with publicly
available supporting data sets received 19.94% (8,056) of the aggregate citations (40,409).
Publicly available data were thus significantly (p=0.007, 95% confidence interval)
associated with about 35% more citations per article than the average of all articles
sampled over the 18-year study period (1,331), and the increase is fairly consistent over
time (14 of 18 years). This relationship between openly available, curated data and
increased citation rate may incentivize researchers to share their data.
See also
Michael Diepenbroek,
Data
Sharing Effect on Article Citation Rate in Paleoceanography,
KomFor blog, November 27, 2011. Includes data plot.
Added 25 November 2011
Henneken, E. and Accomazzi, A. (2011)
Linking to Data -
Effect on Citation Rates in Astronomy
arXiv.org > cs > arXiv:1111.3618, 15 Nov 2011. In Proceedings of ADASS XXI
(Astronomical Data Analysis Software & Systems), Paris, 6-10
November 2011
From the Abstract: Is there a difference in citation rates between
articles that were published with links to data and articles that were
not? In this presentation we will show this is indeed the case:
articles with links to data result in higher citation rates than
articles without such links.
See also
Added 06 July 2011
Linking
to Data - Effect on Citation Rates in Astronomy
Meters, Metrics and More, 03 Jun 2011
Extracts: Using the data holdings of the SAO/NASA
Astrophysics Data System, our analysis shows that articles with data
links are indeed cited more than articles without these links - for
this data set, articles with data links acquired 20% more citations
(compared to articles without these links).
Added 25 November 2011
Xia, J., Wilhoite, S. K. and Myers, R. L. (2011)
A
librarian-LIS faculty divide in open access practice
Journal of Documentation,
67 (5), 791-805 (2011)
info:doi/10.1108/00220411111164673
(Subscription
access required) From the Abstract: This paper measures the OA
availabilities and citations of scholarly articles from 20 top-ranked
LIS journals published in 2006.
Added 25 November 2011
Henneberger, S. (2011)
Entwicklung
einer Analysemethode für Institutional Repositories unter Verwendung
von Nutzungsdaten (Development of an analytical method for
Institutional Repositories using usage data)
Thesis, edoc-Server der Humboldt-Universität zu Berlin, 31 Oct 2011
From the English abstract: Download data are the subject of scientific
investigations, in which the concept of the Citation Impact is applied
to the rate of use of a publication and the so-called Download Impact
is formed. Analyzed with nonparametric methods, download data give
information about the visibility of electronic publications on the
Internet. These methods form the core of NoRA (Non-parametric
Repository Analysis). The analytical method NoRA was successfully
applied to data from Institutional Repositories of four universities.
In each case, groups of publications were identified that differed
significantly in their usage. Similarities in the results reveal
factors that influence the usage data, which have not been taken into
account previously. The presented results imply further applications of
NoRA but also raise doubts about the value of download data of single
publications.
Added 25 November 2011
Tarrant, D. (2011)
A Study of
Early Indication Citation Metrics
PhD thesis, ECS EPrints Repository, University of Southampton, 24 Oct
2011
From the Abstract: Each new citation establishes a large number of
co-citation relationships between that publication and older material
whose citation impact is already well established. By taking advantage
of this co-citation property, this thesis investigates the possibility
of developing a metric that can provide an earlier indicator of a
publication's citation impact. This thesis proposes a new family of
co-citation based impact measures, describes a system to evaluate their
effectiveness against a large citation database, and justifies the
results of this evaluation against an analysis of a diverse range of
research metrics.
Added 25 November 2011
Yuan, S. and Hua, W. (2011)
Scholarly
impact measurements of LIS open access journals: based on citations and
links
The Electronic Library,
29 (5), 682, 2011
(Subscription
access required) From the Abstract: The study selected 97 LIS OA
journals as a sample and measured their scholarly impact on the basis
of citations and links. The citation counts in WoS, coverage in LISA,
Web links, WIFs and Page Rank of the journals are retrieved and
calculated, and correlations between citation counts, links, pages,
WIFs, and Page Rank are also analyzed. The results indicate that LIS OA
journals have become a significant component of the scholarly
communication system.
Added 25 November 2011
Priem, J., Piwowar, H. and Hemminger, B. (2011)
Altmetrics
in the wild: An exploratory study of impact metrics based on social
media
Poster
at Metrics 2011:
Symposium on Informetric and Scientometric Research,
New Orleans, LA, 12 October
Extracts: As growing numbers of
scholars publicly read, bookmark, share, discuss, and rate using online
tools, these invisible impacts are beginning to be seen. Because
measurements of these new traces may inform alternatives to traditional
citation metrics, they have been dubbed altmetrics. The goal of this study
is to better understand the potential of altmetrics. We gathered
altmetrics for a large sample of scholarly articles - all 24,334
articles published by the Public Library of Science (PLoS) before
December 23, 2010.
Added 25 November 2011
Wang, M.-L. (2011)
The impact
of open access journals on library and information scientists' research
in Taiwan
Universiti Teknologi Mara Digital Repository, 07 Oct
2011. In Asia-Pacific
Conference On Library & Information Education &
Practice 2011 (A-LIEP2011), 22-24 June 2011, Malaysia
From
the Abstract: the objectives of the study are to explore the scholarly
productivity of LIS scholars in Taiwan, to find out what articles they
publish and OA articles as a percentage of all titles, and to calculate
the mean citation rate of open access articles and articles not freely
available online. To
determine whether a difference in research impact existed, two research
impact indicators were used, that is, open access articles as a
percentage of all published titles and mean citation rate of open
access articles and those not freely available online. Data on
published articles with citation counts by the LIS scholars in Taiwan
from 2000 to 2009 was collected from the ACI Database and Social
Science Citation Index Database. The study shows that for 72 LIS
scholars who were subjects of the investigation, 64 of them had
published 745 articles within the previous ten years: 679 articles in
Chinese and 66 articles in English; 499 of these were OA articles, and
264 were non-OA articles; OA articles constituted 66.98% of the total
number of academic articles. The mean citation rate of OA versus non-OA
articles was 1.29.
Added 25 November 2011
Chuanfu Chen, Yuan Yu, Qiong Tang, Kuei Chiu, Yan Rao, Xuan Huang and
Kai Sun (2011)
Assessing
the authority of free online scholarly information
Scientometrics, 02 Oct 2011
(Subscription
access required. Online preview.) From the Abstract: Using a modified
version of Jim Kapoun's five criteria for evaluating web pages as a
framework, this research selected 32 keywords from eight disciplines,
inputted them into three search engines (Google, Yahoo and AltaVista)
and used the Analytic Hierarchy Process to determine the weights. The first
batches of results (web pages) from keyword searching were selected as
evaluation samples (in the two search phases, the first 50 and 10
results were chosen, respectively), and a total of 3,134 samples were
evaluated for authority based on the evaluation framework. The results
show that the average authority value for free online scholarly
information is about 3.63 (out of five), which is in the fair level
(3 ≤ Z < 4) (Z is the value assigned to each sample). About 41%
of
all samples collected provide more authoritative scholarly information.
Different domain names, resource types, and disciplines of free online
scholarly information perform differently when scored in terms of
authority. In conclusion, the authority of free online scholarly
information has been unsatisfactory, and needs to be improved.
Added 25 November 2011
Davis, P. (2011)
Do
discounted journal access programs help researchers in sub-Saharan
Africa? A bibliometric analysis
eCommons@Cornell, 23 Sep 2011. In Learned Publishing,
Vol. 24, No. 4, October 2011, pp. 287-298
Abstract: Prior research has
suggested that providing free and discounted access to the scientific
literature to researchers in low-income countries increases article
production and citation. Using traditional bibliometric indicators for
institutions in sub-Saharan Africa, we analyze whether institutional
access to TEEAL (a digital collection of journal articles in
agriculture and allied subjects) increases: 1) article production; 2)
reference length; and 3) number of citations to journals included in
the TEEAL collection. Our analysis is based on nearly 20,000
articles - containing half a million references - published between 1988
and 2009 at 70 institutions in 11 African countries. We report that
access to TEEAL does not appear to result in higher article production,
although it does lead to longer reference lists (an additional 2.6
references per paper) and a greater frequency of citations to TEEAL
journals (an additional 0.4 references per paper), compared to
non-subscribing institutions. We discuss how traditional bibliometric
indicators may not provide a full picture of the effectiveness of free
and discounted literature programs.
Added 12 February 2012
Jeffrey Furman and Scott Stern (2011)
Climbing
atop the Shoulders of Giants: The Impact of Institutions on Cumulative
Research
American Economic Review,
101 (5), 1933-63, August 2011
From
the Abstract: This paper assesses the impact of a specific institution,
a biological resource center, whose objective is to certify and
disseminate knowledge. We disentangle the marginal impact of this
institution on cumulative research from the impact of selection, in
which the most important discoveries are endogenously linked to
research-enhancing institutions. Exploiting exogenous shifts of
biomaterials across institutional settings and employing a
difference-in-differences approach, we find that effective institutions
amplify the cumulative impact of individual scientific discoveries.
From the paper: Our empirical analysis focuses on whether articles
associated with materials exogenously shifted into a BRC receive a
boost in citations after their deposit into the BRC, controlling for
article-specific fixed effects and fixed effects for article age and
calendar year. Our setting allows us to evaluate both models that
include a control sample and models that rely exclusively on variation
in the timing and date of the treatment of the deposit of the
biomaterial into the BRC. Both approaches provide evidence for the
marginal impact of BRCs on subsequent knowledge; the post-deposit
citation boost is estimated to be between 57 percent and 135 percent
across different specifications. Empirical checks of our key
identification assumptions reinforce our overall findings. We find that
the marginal impact of BRC deposit is marginally higher for articles
published in less prestigious journals and that the citation boost is
concentrated in follow-on research articles involving more complex
subject matter. (Paper extract from copy at:
http://www.econ.tuwien.ac.at/hanappi/Lehre/Economic%20Policy/2012/Furman_2011.pdf)
See also these 2002
and 2006
papers of the same title by the same authors:
News articles on this paper:
Peter Dizikes, How
research goes viral,
MIT News, 10 January 2012: "When things become more open, it's
not simply that you get more research, but you get more diverse research,"
Stern notes.
Added 23 April 2012
Peter Ingwersen and Anita Elleby (2011)
Do
Open Access Working Papers Attract more Citations Compared to Printed
Journal Articles from the same Research Unit?
Royal
School of Information and Library Science, Copenhagen, (2011). In
Proceedings 13th
International Conference of the International Society
for Scientometrics & Informetrics, Durban, South
Africa, 4-7 July
2011
Abstract: This paper presents the results of an empirical case
study of the characteristics of citations received by 10 open access
non-peer reviewed working papers published by a prestigious
multidisciplinary, but basically social science research institute,
compared to 10 printed peer reviewed journal articles published in the
same year (2004) by the same institute and predominantly by the same
authors. The study analyzes the total amount of citations and citation
impact observed in Web of Science (WoS) and Google Scholar (GS)
received during the five-year period 2004-09 (February) by the two
publication types, the citation distributions over the individual
sample publications and observed years as well as over external,
institutional and personal self-citations. The institute concerned is
the Danish Institute for International Studies (DIIS), Copenhagen. The
results demonstrate that the open access working papers publicly
accessible through the DIIS e-archive became far less cited than the
corresponding sample of DIIS journal articles published in printed
form. However, highly cited working papers have higher impact than the
average of the lower half of cited articles. Citation time series show
identical distinct patterns for the articles in WoS and GS and working
papers in GS, more than doubling the amount of citations received
through the latter source.
See also Open
access working papers not good enough, ScienceNordic.com,
December 7, 2011
Added 25 November 2011
McGreal, R. and Chen, N.-S. (2011)
AUPress:
A Comparison of an Open Access University Press with Traditional Presses
Educational Technology
& Society, 14 (3), 231-9 (2011)
Abstract:
This study is a comparison of AUPress with three other traditional
(non-open access) Canadian university presses. The analysis is based on
the rankings that are correlated with book sales on Amazon.com and
Amazon.ca. Statistical methods include the sampling of the sales
ranking of randomly selected books from each press. The results of
one-way ANOVA analyses show that there is no significant difference in
the ranking of printed books sold by AUPress in comparison with
traditional university presses. However, AUPress can demonstrate
significantly larger readership for its books as evidenced by the
number of downloads of the open electronic versions.
Added 18 August 2011
Davis, P. and Walters, W. (2011)
The
impact of free access to the scientific literature: a review of recent
research
Journal of the Medical Library Association, 99 (3), 208-217, July 2011
doi: 10.3163/1536-5050.99.3.008
From the Abstract: The paper reviews recent studies that evaluate the
impact of free access (open access) on the behavior of scientists as
authors, readers, and citers in developed and developing nations. It
also examines the extent to which the biomedical literature is used by
the general public. Researchers report that their access to the
scientific literature is generally good and improving. For authors, the
access status of a journal is not an important consideration when
deciding where to publish. There is clear evidence that free access
increases the number of article downloads, although its impact on
article citations is not clear. Recent studies indicate that large
citation advantages are simply artifacts of the failure to adequately
control for confounding variables. The effect of free access on the
general public's use of the primary medical literature has not been
thoroughly evaluated.
Added 06 July 2011
Shafi, Sheikh Mohammad and Bhat, Mohammad Haneef (2011)
The
Impact of Open Access Contributions: Developed and Developing World
Perspectives
Proceedings of the 15th
International Conference on Electronic Publishing,
Istanbul, Turkey, 22 Jun 2011
From the Abstract: The study explores the research impact of Open
Access research articles across the globe with a view to testing the
hypothesis that OA research contributions emanating from developing
countries receive equal citations (and subsequently equal research
impact) to those from the developed world. The study covers 5639
research articles from 50 Open Access DOAJ-based Medical Sciences
journals covering the period from 2005 to 2006. The research articles
from the developed countries receive a higher number of citations
(and subsequently greater research impact) than those from the
developing world. The study may help pave the way for framing policies
and strategies to increase the impact of research in the developing
world.
Added 06 July 2011
Yan, K.-K. and Gerstein, M. (2011)
The
Spread of Scientific Information: Insights from the Web Usage
Statistics in PLoS Article-Level Metrics
PLoS ONE, 6 (5), 16 May 2011
info:doi/10.1371/journal.pone.0019917
From the Abstract: In this work, we focus on a community of scientists
and study, in particular, how the awareness of a scientific paper is
spread. Our work is based on the web usage statistics obtained from the
PLoS Article Level Metrics dataset compiled by PLoS. We found that the
spread of information displays two distinct decay regimes: a rapid
downfall in the first month after publication, and a gradual power law
decay afterwards. We identified these two regimes with two distinct
driving processes: a short-term behavior driven by the fame of a paper,
and a long-term behavior consistent with citation statistics.
Added 06 July 2011
Xia, J. (2011)
Positioning
Open Access Journals in a LIS Journal Ranking
College &
Research Libraries, 16 May 2011
From the Introduction: some OA journals have successfully built
reputations, attracting high-quality articles and sizable numbers of
citations. This research is an attempt to add selected OA journals to
the journal quality rankings using library and information science
(LIS) as an example.
Added 05 May 2011
Sandra Miguel, Zaida Chinchilla-Rodriguez, Félix de Moya-Anegón (2011)
Open
access and Scopus: A new approach to scientific visibility from the
standpoint of access
Journal of the American
Society for Information Science and Technology, published
online: 11 April 2011
From
the Abstract: This study shows a new approach to scientific visibility
from a systematic combination of four databases: Scopus, the Directory
of Open Access Journals, Rights Metadata for Open Archiving
(RoMEO)/Securing a Hybrid Environment for Research Preservation and
Access (SHERPA), and SciMago Journal Rank, and provides an overall,
global view of journals according to their formal OA status. The
results primarily relate to the number of journals, not to the number
of documents published in these journals, and show that in all the
disciplinary groups, the presence of green road journals widely
surpasses the percentage of gold road publications. The benefits of OA
for the visibility of journals are to be found on the green route but,
paradoxically, this advantage is not conferred by OA per se but rather
by the quality of the articles/journals themselves, regardless of their
mode of access.
Added 05 May 2011
Liu Xue-li, Fang Hong-ling and Wang Mei-ying (2011)
Correlation
between Download and Citation and Download-citation Deviation
Phenomenon for Some Papers in Chinese Medical Journals
Serials Review,
Vol. 37, No. 3, September 2011, 157-161, available online 7 April 2011
From the Abstract: The authors collected the numbers of citations and
downloads from 2005 to 2009 of papers in five Chinese general
ophthalmological journals published in 2005 from the Chinese Academic
Journals Full-text Database and the Chinese Citation Database in
Chinese National Knowledge Infrastructure (CNKI) to determine the
correlation between download and citation and the peak time of download
frequency (DF). The citations from 2000 to 2009 of papers published in
2000 were collected to determine the peak time of citation frequency
(CF) of medical papers. There is a highly positive correlation between
DF and CF (r = 4.91, P = 0.000).
Added 04 April 2011
Davis, P. (2011)
Open
access, readership, citations: a randomized controlled trial of
scientific journal publishing
The FASEB Journal
(Journal of the Federation of American Societies for Experimental
Biology), 30 Mar 2011
info:doi/10.1096/fj.11-183988
Abstract: Does free access to journal articles result in greater
diffusion of scientific knowledge? Using a randomized controlled trial
of open access publishing, involving 36 participating journals in the
sciences, social sciences, and humanities, we report on the effects of
free access on article downloads and citations. Articles placed in the
open access condition (n=712) received significantly more downloads and
reached a broader audience within the first year, yet were cited no
more frequently, nor earlier, than subscription-access control articles
(n=2533) within 3 yr. These results may be explained by social
stratification, a process that concentrates scientific authors at a
small number of elite research universities with excellent access to
the scientific literature. The real beneficiaries of open access
publishing may not be the research community but communities of
practice that consume, but rarely contribute to, the corpus of
literature.
See also this author's earlier dissertation.
News articles on this paper:
Corbyn, Z., Open
access articles not cited more, finds study, The Great
Beyond: Nature news blog, 01 Apr 2011: "We do leave open the
possibility that there is a real citation effect as a result of self
archiving but that we simply do not have the statistical power to
detect it," says Davis. (In comments appended to this article Davis says:
"the piece is not balanced. Zoe focuses on the weaknesses of the study and not
its strengths")
Wieder, B., Open
Access Does Not Equal More Citations, Study Finds, Wired
Campus, Chronicle of
Higher Education, April 1, 2011: Mr. Davis says he doesn't
see his study as a blow to open access - if anything, he thinks it calls
into question the wisdom of looking only at citation counts to measure
the impact of a journal article, particularly given the ease of
tracking article downloads online. "Twenty years ago, there was no way
of measuring readership," he says.
Added 04 April 2011
Donovan, J. M. and Watson, C. A. (2011)
Citation
Advantage of Open Access Legal Scholarship
Social Science Research Network (SSRN), 5 March 2011. In Law Library
Journal, 103 (4), Fall 2011, 553-573 http://www.aallnet.org/main-menu/Publications/llj/Vol-103/Fall-2011/2011-35.pdf.
Also in UKnowledge, University of Kentucky, 2011
http://works.bepress.com/james_donovan/64/
Abstract: To date, there have been no studies focusing exclusively on the impact
of open access on legal scholarship. We examine open access articles
from three journals at the University of Georgia School of Law and
confirm that legal scholarship freely available via open access
improves an article's research impact. Open access legal scholarship -
which today appears to account for almost half of the output of law
faculties - can expect to receive 50% more citations than non-open
access writings of similar age from the same venue.
Added 15 February 2011
Xu, L., Liu, J. and Fang, Q. (2011)
Analysis
on open access citation advantage: an empirical study based on Oxford
open journals
iConference '11,
Proceedings of the 2011 iConference, Seattle, 11
February 2011
Abstract: This study takes 12,354 original research articles which were
published in 93 Oxford Open journals in 2009 as a sample, and carries
out statistical analyses on the citation frequency that these articles
had received by July 2010 to validate 3 hypotheses: (1) there is a
citation advantage for open access articles (OACA) published in Oxford
Open journals over the non-OA ones; (2) OACA varies with discipline;
(3) there is some correlation between the impact factors (IFs) of Oxford
Open journals and the OACA of their open access articles. This study
discovers that there exists an OACA for open access articles, in this
case 138.87% higher than for non-OA ones; different subjects have different
OACAs, and Humanities journals in Oxford Open even have a negative
OACA; Oxford Open journals with lower IFs have stronger OACAs than
those with higher IFs.
Added 15 February 2011
McCabe, M. J. and Snyder, C. M. (2011)
Did
Online Access to Journals Change the Economics Literature?
Social Science Research Network, SSRN, January 23, 2011
Abstract: Does
online access boost citations? The answer has implications for issues
ranging from the value of a citation to the sustainability of
open-access journals. Using panel data on citations to economics and
business journals, we show that the enormous effects found in previous
studies were an artifact of their failure to control for article
quality, disappearing once we add fixed effects as controls. The
absence of an aggregate effect masks heterogeneity across platforms:
JSTOR boosts citations around 10%; ScienceDirect has no effect. We
examine other sources of heterogeneity including whether JSTOR benefits
"long-tail" or "superstar" articles more.
See also Kolowich, S., Questioning
the 'Citation Advantage', Inside Higher Education, February
10, 2011: investigates reaction to this paper. "While McCabe and Snyder actually studied the effect of
economics and business articles being available online as opposed to
just in print -- not the effect of articles being online and free --
they nevertheless believe that their findings cast doubt on the
supposed citation advantage of open-access articles, because, they
contend, their study measures whether ease of access affects citation
volume. ... contrary to its provocative assertion about the lack of
evidence that free online access performs better, their paper does not
address the citation advantage of free versus not free, Harnad says,
and therefore cannot convincingly refute studies that do."
Added 04 April 2011
Saadat, R. and Shabani, A. (2011)
Investigating
the Citations Received by Journals of Directory of Open Access Journals
from ISI Web of Science Articles
International Journal of
Information Science and Management, 9 (1), 57-74, January
2011
From the Abstract: In this research, the citations received by DOAJ's
journals from ISI Web of Science articles from 2003 to 2008 were
studied and compared. The citations received by the journals in five
fields (Arts & Humanities, Social Sciences, Pure Sciences,
Technology & Engineering, and Health & Medical
Sciences) as well as the difference among the citations received by
DOAJ's journals in the above-mentioned five fields were examined. The
research method is citation analysis and the research data have been
collected by means of Cited Reference Search in the ISI Web of Science.
The English-language journals in DOAJ were chosen, and no sampling was
used. Findings showed that out of 2953 journals, 321 journals (10.87%)
received citations, and the total citations received by these journals
were 19050 with the mean of 6.45 per journal; the journals in Pure
Sciences received most citations (10116 citations, equal to 53.1%), and
the ones in Arts & Humanities received the least citations (701
citations, equal to 3.68%).
Added 15 February 2011
Lee, K., Brownstein, J. S., Mills, R. G., Kohane, I. S. (2010)
Does
Collocation Inform the Impact of Collaboration?
PLoS ONE, 5
(12), e14279, 15 Dec 2010
From the Abstract: Despite the positive impact of emerging
communication technologies on scientific research, our results provide
striking evidence for the role of physical proximity as a predictor of
the impact of collaborations. From the Discussion: There have been
numerous articles that reported Open Access publications have a higher
chance of being cited. It may be that publications in Open Access
journals have higher citation rates, which may not necessarily be related
to collaboration and collocation. However, its impact on our results is
uncertain, as there is also a growing number of articles
reporting no evidence of an Open Access advantage in different disciplines.
Added 15 February 2011
Xia, J., Myers, R. L. and Wilhoite, S. K. (2010)
Multiple
open access availability and citation impact
Journal of Information
Science, 37 (1): 19-28, published online 10 Dec 2010
Abstract: This research examines the relationship between multiple open
access (OA) availability of journal articles and the citation advantage
by collecting data of OA copies and citation numbers in 20 top library
and information science journals. We discover a correlation between the
two variables; namely, multiple OA availability of an article has a
positive impact on its citation count. The statistical analysis reveals
that for every increase in the availability of OA articles, citation
numbers increase by 2.348.
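Note: the reported figure (citation numbers increasing by 2.348 for every
additional OA copy) reads as the slope of a simple least-squares fit of
citation counts on the number of OA copies; a minimal sketch with
hypothetical data is given below (Python; not the authors' data or code).

import numpy as np

# Hypothetical data: number of OA copies found per article, and its citations
oa_copies = np.array([0, 1, 1, 2, 3, 0, 2, 4, 1, 3])
citations = np.array([2, 5, 4, 8, 11, 1, 7, 13, 6, 9])

# Ordinary least-squares fit: slope = expected change in citations per extra OA copy
slope, intercept = np.polyfit(oa_copies, citations, 1)
print(f"each additional OA copy is associated with ~{slope:.2f} extra citations")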
Added 6 December 2010
Davis. P. (2010)
Does
Open Access Lead to Increased Readership and Citations? A Randomized
Controlled Trial of Articles Published in APS Journals
The Physiologist,
53 (6), December 2010
Extracts. Introduction: In order to isolate the effect of access on
readership and citations, we conducted a randomized controlled trial of
open access publishing on articles published electronically in 11 APS
journals. This report details the findings three years after the
commencement of the experiment. Discussion: The results of this
experiment suggest that providing free access to the scientific
literature may increase readership (as measured by article downloads)
and reach a larger potential audience (as measured by unique visitors),
but have no effect on article citations. These results are consistent
with an earlier report of the APS study after one year and the results
of other scientific journals after two years. The fact that we observe
an increase in readership and visitors for Open Access articles but no
citation advantage suggests, first, that scientific authors are adequately
served by the current APS model of information dissemination and,
second, that the additional readership is taking place outside this
core research community.
Added 18 August 2011
Düzyol, G., Taskin, Z. and Tonta, Y. (2010)
Mapping
the Intellectual Structure of Open Access Field Through Co-citations
E-LIS, 29 November 2010
In
IFLA Satellite Pre-conference: Open Access to Science Information
Trends, Models and Strategies for Libraries, 6-8 August 2010, Crete
(Greece)
also at
http://yunus.hacettepe.edu.tr/~tonta/yayinlar/tonta-duzyol-taskin-oa.pdf
From the Abstract: This paper maps the intellectual structure of open
access based on 281 articles that appeared in professional literature
on the topic between 2000 and 2010. Using bibliometric and co-citation
analyses, co-citation patterns of papers are visualized through a
number of co-citation maps. CiteSpace was used to analyze and visualize
co-citation maps. Maps show major areas of research, prominent
articles, major knowledge producers and journals in the field of open
access. The letter written by Steven Lawrence (Free online
availability substantially increases a paper's impact, 2001) appears
to be the most prominent source, as it was cited the most. The
preliminary findings show that open access is an emerging research
field. Findings of this study can be used to identify landmark papers
along with their impact in terms of providing different perspectives
and engendering new research areas.
Added 6 December 2010
Davis, P. (2010)
Access,
Readership, Citations: A Randomized Controlled Trial Of Scientific
Journal Publishing
eCommons@Cornell, 20 October 2010
From the abstract: This dissertation explores the relationship of Open
Access publishing with subsequent readership and citations. It reports
the findings of a randomized controlled trial involving 36 academic
journals produced by seven publishers in the sciences, social sciences
and humanities. At the time of this writing, all articles have aged at
least two years. Articles receiving the Open Access treatment received
significantly more readership (as measured by article downloads) and
reached a broader audience (as measured by unique visitors), yet were
cited no more frequently, nor earlier, than subscription-access control
articles. A pronounced increase in article downloads with no
commensurate increase in citations to Open Access treatment articles
may be explained through social stratification, a process which
concentrates scientific authors at elite, resource-rich institutions
with excellent access to the scientific literature. For this community,
access is essentially a non-issue.
Added 17 Feb 2010, updated 18 Oct 2010
Gargouri, Y., Hajjem, C., Lariviere, V., Gingras, Y., Brody, T., Carr,
L. and Harnad, S. (2010)
Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality
Research
PLoS ONE 5(10): e13636, October 18, 2010, doi:10.1371/journal.pone.0013636
Also in ECS EPrints, 10 Feb 2010, http://eprints.ecs.soton.ac.uk/18493/ (this version includes the paper and full
supplemental materials, including further analyses and responses to comments and feedback),
and in arXiv, arXiv:1001.0361v2 [cs.CY], 3 Jan 2010
Abstract
Background. Articles whose authors have supplemented subscription-based access to the
publisher's version by self-archiving their own final draft to make it accessible
free for all on the web (Open Access, OA) are cited significantly more than
articles in the same journal and year that have not been made OA. Some have
suggested that this OA Advantage may not be causal but just a self-selection bias,
because authors preferentially make higher-quality articles OA. To test this we
compared self-selective self-archiving with mandatory self-archiving for a sample of
27,197 articles published 2002-2006 in 1,984 journals.
Methodology/Principal Findings. The OA Advantage proved just as high for both.
Logistic regression analysis showed that the advantage is independent of other
correlates of citations (article age; journal impact factor; number of co-authors,
references or pages; field; article type; or country) and highest for the most
highly cited articles. The OA Advantage is real, independent and causal, but skewed.
Its size is indeed correlated with quality, just as citations themselves are (the
top 20% of articles receive about 80% of all citations).
Conclusions/Significance. The OA advantage is greater for the more citable articles,
not because of a quality bias from authors self-selecting what to make OA, but
because of a quality advantage, from users self-selecting what to use and cite,
freed by OA from the constraints of selective accessibility to subscribers only. It
is hoped that these findings will help motivate the adoption of OA self-archiving
mandates by universities, research institutions and research funders.
Added 6 Dec 2010
Comments below refer to the PLoS ONE publication
Howard, J., Is
There an Open-Access Citation Advantage? The Chronicle of
Higher Education, 19 Oct 2010:
Seeks to reignite the earlier debate around the preprint of the new
PLoS ONE paper. And succeeds - see reader responses attached to the
article, some extracts below.
Harnad, S., Correlation,
Causation, and the Weight of Evidence, Open Access
Archivangelism, 20 Oct 2010: One can only speculate on the
reasons why some might still wish to cling to the self-selection bias
hypothesis in the face of all the evidence to date. The straightforward
causal relationship is the default hypothesis, based on both
plausibility and the cumulative weight of the evidence. Hence the
burden of providing counter-evidence to refute it is now on the
advocates of the alternative.
sk_griffhoven, October 20, 2010: The authors were unable to control for
institutional effects in their model. While deposit mandates might be
responsible for the results they report, they might not, and I don't
see how mandates would outperform self-selection. Most importantly,
there is no basis for making a causal claim. I agree with Philip Davis:
the authors greatly overstate their results.
stevanharnad, October 21, 2010: The causal claim is not that mandated
OA out-performs self-selected OA, but that self-selected OA does *not*
out-perform mandated OA, hence OA is causal.
signofthefourwinds, October 21, 2010: It doesn't make sense that
researchers suddenly give up their habit of consulting certain
databases to go search in Google Scholar. What user behavior change
accounts for the OA advantage?
stevanharnad, October 22, 2010: The OA Advantage is not just, or
primarily, a convenience or laziness effect (though some of that no
doubt contributes to it too): It is not that scholars have become
sloppy, relying on google scholar instead of consulting more
established databases. It is that when their institution cannot afford
access to articles they need, they must make do with only those of them
that they can access for free online.
Patrick Chardenet, October 22, 2010: I don't think that the problem is
to know if Open Access reinforces or not the number of citations. It is
rather a question of knowing if the measurement of science by the
measurement of the number of citations has an interest for the
scientific development.
stevanharnad, October 22, 2010: In a nutshell, citations are not the
goal of research; the goal is that the research should be read, used
and built upon, in further research and applications. And citations are
a measure of that. But for research to be read, used and built upon, it
has to be accessible. That is why and how OA increases citations.
Fenner, M., New
in PLoS ONE: Citation rates of self-selected vs. mandated Open Access,
PLoS Blogs, Gobbledygook, 19 Oct 2010: "I feel that
the paper comes a little short. Yes, they did a very detailed analysis
of the citation behavior, and take into account important cofactors.
But the reader is left with the impression that mandatory
self-archiving of post-prints in institutional repositories is the only
reasonable Open Access strategy, and the introduction and discussion
accordingly leave out some important arguments." Notable for the
substantive discussion that follows between the blogger (Fenner) and
one of the principal authors of the paper (Harnad).
Harnad, S., Comparing
OA and Non-OA: Some Methodological Supplements, Open Access
Archivangelism, 19 Oct 2010. Responds
to Fenner: "If we have given "the
impression that mandatory self-archiving of post-prints in
institutional repositories is the only reasonable Open Access
strategy," then we have succeeded in conveying the implication of our
findings."
Comments below refer to the preprint
Davis, P., Does
a Citation Advantage Exist for Mandated Open Access Articles?
the scholarly kitchen, Jan 7, 2010; "Gargouri reports that
institutionally-mandated
OA papers received about
a 15% citation advantage over self-selected OA papers, which seems
somewhat counter-intuitive. If better articles tend to be
self-archived, their reasoning goes, we should expect that papers
deposited under institutional-wide mandates would under-perform
those
where the authors select which articles to archive. The
authors of
this paper deal, rather unscientifically, with this inconvenient truth
with a quick statistical dismissal that their finding "might be due
to chance or sampling error." In sum, this paper tests an interesting
testable hypothesis on whether mandatory self-archiving policies are
beneficial to their authors in terms of citations. Their
unorthodox methodology, however, results in some inconsistent and
counter-intuitive results that are not properly addressed in their
narrative."
The following were among the comments
added to the above blog by Davis.
Harnad, S., Jan 7, 2010: "Mandated OA Advantage? Yes, the fact that the
citation advantage of mandated OA was slightly greater than that of
self-selected OA is surprising, and if it proves reliable, it is
interesting and worthy of interpretation. We did not interpret it in
our paper, because it was the smallest effect, and our focus was on
testing the Self-Selection/Quality-Bias hypothesis, according to which
mandated OA should have little or no citation advantage at all, if
self-selection is a major contributor to the OA citation advantage."
Gaule,
P., Jan 7, 2010: "the paper does not appear to include controls for
institutions of the authors of the control sample. This is particularly
worrisome when comparing papers originating from CERN, which arguably
does cutting-edge physics, to the control papers. The key issue in this
paper seems to be interpreting the mandated open access versus
self-selected open access. The authors find and point out that the
mandates actually result in compliance of around 60%. However, they
have little to say on what is going on here and why papers end up in
the compliant group or not. I am not sure what conclusions can be
inferred from this comparison of two types of self-selection, at least
one of which is not well understood."
Harnad, S., Jan 8, 2010: "(2)
THE SPECIAL CASE OF CERN: With so few institutional mandates, it's not
yet possible to control for institutional quality. But CERN is indeed a
special case; when it is removed, however, it does not alter the
pattern of our results. (3) SELECTIVE COMPLIANCE? Mandate compliance is
not yet 100%, so some form of self-selection still remains a logical
possibility, but we think this is made extremely improbable when there
is no decline in the OA Advantage even when mandates quadruple the OA
rate from the spontaneous self-selective baseline of 15% to the current
mandated average of 60%."
Schneider, J. W., Jan 8, 2010: "the claim
of causality seems well beyond the mark. Neither former research nor
the current regression design permits any causal claims."
Harnad,
S., Jan 9, 2010: "CAUSALITY: We agree that causality is difficult to
demonstrate with correlational statistics. However, we note that the
hypothesis that (2a) making articles open access causes them to be more
citeable and the hypothesis that (2b) being more citeable causes
articles to be made open access are both causal hypotheses."
Harnad, S., Open
Access: Self-Selected, Mandated & Random; Answers &
Questions,
Open Access Archivangelism, February 8, 2010: "What follows is what we
hope will be found to be a conscientious and attentive series of
responses to questions raised by Phil Davis about our paper (currently
under refereeing) -- responses for which we did further analyses of our
data (not included in the draft under refereeing)."
Added 6 September 2010
Zawacki-Richter, O., Anderson, T. and Tuncay, N. (2010)
The
Growing Impact of Open Access Distance Education Journals: A
Bibliometric Analysis
The Journal of Distance
Education / Revue de l'Éducation à Distance, 24
(3), 2010
From
the Abstract: we examine 12 distance education journals (6 open and 6
published in closed format by commercial publishers). Using an online
survey completed by members of the editorial boards of these 12
journals and a systematic review of the number of citations per article
(N = 1,123) and per journal issue between 2003 and 2008, we examine the
impact, and perceived value of the 12 journals. We then compute
differences between open and closed journals. The results reveal that
the open access journals are not perceived by distance education
editors as significantly more or less prestigious than their closed
counterparts. The number of citations per journal and per article also
indicates little difference. However, we note a trend towards more
citations per article in open access journals. Articles in open access
journals are cited earlier than in non-open access journals.
Added 6 September 2010
Kim, J. (2010)
Faculty
self-archiving: Motivations and barriers
Journal of the American
Society for Information Science and Technology, 16
Jul 2010
info:doi/10.1002/asi.21336
This
paper is broader than open access impact, but one part of the
investigation looked at it.
From the Discussion: A few interviewees did
believe that self-archiving resulted in their research work being cited
more frequently, although 13 interviewees were unsure about the
positive relationship between self-archiving and the citation rate.
Professors even considered self-archiving to serve other purposes, for
example, to recruit graduate students, or to find collaborators,
instead of increasing the impact of research. In fact, five
interviewees expressed uncertainty regarding whether self-archiving
would improve professional recognition. Four other interviewees did not
expect self-archiving to increase academic recognition, as they
believed this related more to the quality of research itself, rather
than merely making it publicly accessible. These findings suggested
that the majority of faculty participants in this study were unaware of
the evidence of a citation advantage from OA previously identified by
several studies. Without being aware of this evidence, professors tend not to
expect a citation advantage from self-archiving; however, they see
benefits from the user side through self-archiving. This study shows
that faculty have diverse opinions about citation rates and academic
recognition related to self-archiving.
Added 6 September 2010
Strotmann, A. and Zhao, D. (2010)
Impact
of Open Access on stem cell research: An author co-citation analysis
76th IFLA General
Conference and Assembly, Gothenburg, Sweden, 22
Jun 2010
Abstract:
We explore the impact of Open Access (OA) on stem cell research through
a comparison of research reported in OA and in non-OA publications.
Using an author co-citation analysis method, we find that (a) OA and
non-OA publications cover similar major research areas in the stem cell
field, but (b) a more diverse range of basic and medical research is
reported in OA publications, while (c) biomedical technology areas
appear biased towards non-OA publications. From the Introduction: many
studies have investigated whether OA publication of research results
has a positive effect on the citation ranking of those publications ...
we approach the comparison between OA and non-OA publishing of research
results from a somewhat different perspective. We explore whether there
are substantial differences between the intellectual structure of a
research field when viewed from either the point of view of the OA
publications in that field or from that of its non-OA publications.
Added 6 September 2010
Jacques, T. S. and Sebire, N. J. (2010)
The
impact of article titles on citation hits: an analysis of general and
specialist medical journals
JRSM Short Reports,
1 (1), 2, 01 Jun 2010
info:doi/10.1258/shorts.2009.100020
More factors to consider in citation impact assessment.
From the Abstract:
We hypothesized that specific features of journal titles may be related
to citation rates. We reviewed the title characteristics of the 25 most
cited articles and the 25 least cited articles published in 2005 in
general and specialist medical journals including the Lancet, BMJ and
Journal of Clinical Pathology. The title length and construction were
correlated to the number of times the papers have been cited to May
2009. Results: The number of citations was positively correlated with
the length of the title, the presence of a colon in the title, and the
presence of an acronym. Factors that predicted poor citation included
reference to a specific country in the title. Conclusions: These data
suggest that the construction of an article title has a significant
impact on how frequently the paper is cited. We hypothesize that this may
be related to the way electronic searches of the literature are
undertaken.
Added 9 June 2010
Herb, U. (2010)
Alternative
Impact Measures for Open Access Documents? An examination how to
generate interoperable usage information from distributed open access
services
76th IFLA General
Conference and Assembly, Gothenburg, Sweden, August 2010,
paper available online 29 May 2010
From the Abstract: This contribution shows that the most common methods
of assessing the impact of scientific publications often discriminate
against open access publications and thereby reduce the attractiveness
of Open Access for scientists. Assuming that the motivation to use open
access publishing services (e.g. a journal or a repository) would
increase if these services conveyed some sort of reputation or impact
to the scientists, alternative models of impact are discussed. Prevailing
research results indicate that alternative metrics based on usage
information for electronic documents are suitable to complement or to
relativize citation-based indicators.
Added 9 June 2010
Giglia, E. (2010)
The
Impact Factor of Open Access journals: data and trends
DHANKEN, digital repository of HANKEN research, 27 May 2010.
In 14th International
Conference on Electronic Publishing, Helsinki, 16-18 June
2010. Slides in E-LIS, 21 June 2010 http://eprints.rclis.org/18669/
From the Abstract: The aim of this preliminary work, focused on Gold Open
Access, is to test the performance of Open Access journals with the
most traditional bibliometric indicator Impact Factor, to verify the
hypothesis that unrestricted access might turn into more citations and
therefore also good Impact Factor indices. Open Access journals are
relatively new actors in the publishing market, and gaining reputation
and visibility is a complex challenge. Some of them show impressive
Impact Factor trends since their first year of tracking.
Added 17 May 2010
Calver, M. C. and Bradley, J. S. (2010)
Patterns of Citations of Open Access and Non-Open Access Conservation Biology
Journal Papers and Book Chapters
Conservation Biology,
published online: 23 Apr 2010
From the abstract: We compared the number of citations of OA and non-OA
papers in six journals and four books published since 2000 to test
whether OA increases number of citations overall and increases
citations made by authors in developing countries. After controlling
for type of paper (e.g., review or research paper), length of paper,
authors' citation profiles, number of authors per paper, and whether
the author or the publisher released the paper in OA, OA had no
statistically significant influence on the overall number of citations
per journal paper. Journal papers were cited more frequently if the
authors had published highly cited papers previously, were members of
large teams of authors, or published relatively long papers, but papers
were not cited more frequently if they were published in an OA source.
Nevertheless, author-archived OA book chapters accrued up to eight
times more citations than chapters in the same book that were not
available through OA, perhaps because there is no online abstracting
service for book chapters. There was also little evidence that journal
papers or book chapters published in OA received more citations from
authors in developing countries relative to those journal papers or
book chapters not published in OA. For scholarly publications in
conservation biology, only book chapters had an OA citation advantage,
and OA did not increase the number of citations papers or chapters
received from authors in developing countries.
Added 4 March 2013
Pienta, Amy M., Alter, George C., Lyle, Jared A. (2010)
The
Enduring Value of Social Science Research: The Use and Reuse of Primary
Research Data
Deep Blue, University of Michigan, 22 Nov 2010. In The Organisation,
Economics and Policy of Scientific Research workshop, Torino, April
2010
http://www.carloalberto.org/files/brick_dime_strike_workshopagenda_april2010.pdf
Abstract: The goal of this paper is to examine the extent to which social science
research data are shared and assess whether data sharing affects
research productivity tied to the research data themselves. We
construct a database from administrative records containing information
about thousands of social science studies that have been conducted over
the last 40 years. Included in the database are descriptions of social
science data collections funded by the National Science Foundation and
the National Institutes of Health. A survey of the principal
investigators of a subset of these social science awards was also
conducted. We report that very few social science data collections are
preserved and disseminated by an archive or institutional repository.
Informal sharing of data in the social sciences is much more common.
The main analysis examines publication metrics that can be tied to the
research data collected with NSF and NIH funding - total publications,
primary publications (including PI), and secondary publications
(non-research team). Multivariate models of count of publications
suggest that data sharing, especially sharing data through an archive,
leads to many times more publications than not sharing data. This
finding is robust even when the models are adjusted for PI
characteristics, grant award features, and institutional
characteristics.
Added 6 September 2010
Habibzadeh, F. and Yadollahie, M. (2010)
Are
Shorter Article Titles More Attractive for Citations? Cross-sectional
Study of 22 Scientific Journals
Croatian Medical Journal,
51 (2), April 2010
Open
access is not the only factor affecting citation impact. Here is
another factor that has received rather less attention. From the
Abstract: Longer titles seem to be associated with higher citation
rates. This association is more pronounced for journals with high
impact factors. Editors who insist on brief and concise titles should
perhaps update the guidelines for authors of their journals and have
more flexibility regarding the length of the title.
Added 17 May 2010
Agerbæk, A. and Nielsen, K. (2010)
Factors
in Open Access which Influence the Impact Cycle
ScieCom info, Vol 6, No 1, 2010 (issue notice posted 22 March 2010)
Short paper illustrating journal publishing flowcharts for non-open
access (OA), gold OA and green OA, showing why, in principle, open
access might lead to higher citations due to wider and earlier
dissemination.
Added 17 May 2010
Wagner, A. B. (2010)
Open
Access Citation Advantage: An Annotated Bibliography
Issues in Science and Technology Librarianship, No. 60, Winter 2010
(issue notice posted 16 March 2010)
The bibliography is divided into three sections:
- Review articles [5 reviews]
- Studies showing an open access citation advantage (OACA) [39 articles]
- Studies showing either no OACA effect or ascribing OACA to factors
unrelated to OA publication [7 articles]
The following databases were searched ... results were cross-checked
against an extensive, more general bibliography (this bibliography). It
is interesting to note that no study has ever claimed that OA articles
were cited less than TA articles. The research question still being
debated is whether other factors explain the widely observed OACA (Open
Access Citation Advantage) rather than the mere fact an article is open
access.
Added 09 Mar 2010
Snijder, R. (2010)
The
profits of free books - an experiment to measure the impact of Open
Access publishing
Google sites, undated, but first spotted in the wild 23 February 2010.
In Learned Publishing, Vol. 23, No. 4, October 2010, 293-301
Abstract: to measure the impact of Open Access (OA) publishing of
academic books, an experiment was set up. During a period of nine
months three sets of books were disseminated through an institutional
repository, the Google Book Search program or both channels. A fourth
set was used as control group. Open Access publishing enhances
discovery and online consultation. No relation could be found between
OA publishing and citation rates. Contrary to expectations, OA
publishing does not stimulate or diminish sales figures. The Google Book
Search program is superior to the repository.
Added 09 Mar 2010
Swan, A. (2010)
The Open
Access citation advantage: Studies and results to date
ECS EPrints, 17 Feb 2010
Abstract: presents a summary of reported studies on the Open Access
citation advantage. There is a brief introduction to the main issues
involved in carrying out such studies, both methodological and
interpretive. The study listing provides some details of the coverage,
methodological approach and main conclusions of each study.
Added 17 Feb 2010
Giglia, E. (2010)
Più citazioni in
Open Access? Panorama della letteratura con uno studio sull'Impact
Factor delle riviste Open Access
E-LIS, 21 Jan 2010, also in CIBER 1999-2009, 2009 (Ledizioni),
pp. 125-145
From
the English abstract: This work aims to frame the international debate
on the citation advantage of articles published in Open Access, then to
present and discuss overall data on the Impact Factor of open access
journals, the result of an original study conducted in the Journal
Citation Reports (Thomson Reuters). The basic idea is to test the
performance of OA journals according to the traditional bibliometric
indicator, the Impact Factor, in order to test the hypothesis that
unrestricted access may involve a greater number of citations and,
therefore, also a good Impact Factor. The results seem to confirm this:
38.62% of the Open Access journals included in the "Journal Citation
Reports" are positioned in the first five percentiles when ranked by
Impact Factor. Using the Immediacy Index, the percentage is 37.16%,
while with the new five-year Impact Factor indicator - which, however,
applies to only 356 of the 479 titles - the percentage rises to 40.05%.
Added 17 Feb 2010
Davis, P. (2009)
Studies on
access: a review
arXiv:0912.3953v1 [cs.DL], 20 Dec 2009
Brief
abstract: A review of the empirical literature on access to scholarly
information. This review focuses on surveys of authors, article
download and citation analysis.
Added 17 Feb 2010
Ibanez, A., Larranaga, P. and Bielza, C. (2009)
Predicting
citation count of Bioinformatics papers within four years of publication
Bioinformatics, 25 (24), 3303-3309, 15 December 2009
info:pmid/19819886 | info:doi/10.1093/bioinformatics/btp585
From
the abstract: "The possibility of a journal having a tool capable of
predicting the citation count of an article within the first few years
after publication would pave the way for new assessment systems.
Results: This article presents a new approach based on building several
prediction models for the Bioinformatics journal. These models predict
the citation count of an article within 4 years after publication
(global models). To build these models, tokens found in the abstracts
of Bioinformatics papers have been used as predictive features, along
with other features like the journal sections and 2-week
post-publication periods." Comment: Bioinformatics is not an open
access journal, so these results are not based on data for open access
papers, but they may have parallels with methods for predicting
citations and impact based on usage of OA papers
(e.g. Brody et al., 2005).
Data on which the results are based can be found at
the authors'
site. Without
access to the full paper it is not clear what predictive features are
being applied to achieve the claimed successful results: "In these new
models, the average success rate for predictions using the naive Bayes
and logistic regression supervised classification methods was 89.4% and
91.5%, respectively, within the nine sections and for 4-year time
horizon."
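(Ed. For readers unfamiliar with this kind of text-based prediction, the toy Python/scikit-learn sketch below shows the general shape of a bag-of-words naive Bayes classifier for a binary "highly cited or not" label. The abstracts, labels and features are invented; this is not the authors' model or data.)

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Invented toy abstracts with invented labels (1 = ended up highly cited)
abstracts = [
    "novel algorithm for large-scale genome alignment with benchmarks",
    "database update release notes and minor maintenance details",
    "machine learning method improves protein structure prediction",
    "web server interface for an existing sequence analysis tool",
]
highly_cited = [1, 0, 1, 0]

vectorizer = CountVectorizer()              # bag-of-words token counts
X = vectorizer.fit_transform(abstracts)
model = MultinomialNB().fit(X, highly_cited)

new_abstract = ["fast algorithm for genome alignment at large scale"]
prob = model.predict_proba(vectorizer.transform(new_abstract))[0, 1]
print(f"predicted probability of being highly cited: {prob:.2f}")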
Added 10 Dec 2009
Rand, D. G. and Pfeiffer, T. (2009)
Systematic Differences in Impact across Publication Tracks at PNAS
PLoS ONE,
4(12): e8092, December 1, 2009
Investigates
citation counts for the three different publication tracks of the
Proceedings of the National Academy of Sciences (PNAS). Open access is used as a control factor in the
analysis; "To empirically investigate the impact of papers published
via each track, we inspect 2695 papers published between June 1, 2004
and April 26, 2005, covering PNAS Volume 101 Issue 22 through Volume
102 Issue 17. For each paper, we examine Thomson Reuters Web of Science
citation data as of October 2006 and May 2009, as well as page-view
counts as of October 2006. We also note the track through which each
paper was published, the topic classification of each paper, the date
of publication, and whether each article was published as open access
and/or as part of a special feature." In quantifying the size of the OA
effect to control, the paper found that "similar to previous
observations, Open Access papers receive approximately 25% more
citations than non-Open Access papers (Median 2006 [2009] citations:
Open access = 12.5 [38], non-Open access = 10 [30]; 75% percentile 2006
[2009] citations: Open access = 21 [61], non-Open access = 17 [50])."
Added 10 Dec 2009
Soong, S. (2009)
Measuring Citation Advantages of Open Accessibility
D-Lib Magazine,
15 (11/12), December 2009
Short
study of a small collection of papers deposited after publication in
the institutional repository of the Hong Kong University of Science and
Technology (HKUST). "A total of 50 archived journal articles that
already have 10 or more citation counts in Scopus were randomly
selected for inclusion in this study." Claims to present an
"easy-to-follow framework for citation impact analysis of open
accessibility. This framework allows for direct measurement and
comparison of citation rates before and after journal articles are made
openly available." The method compares the citation performance of the
same article over time pre- and post-open access rather than, as in other
studies of OA impact, comparing open access papers with non-open access
papers from the same source.
Added 10 Dec 2009
Poor, N. (2009)
Global Citation Patterns of Open Access Communication Studies Journals:
Pushing Beyond the Social Science Citation Index
International Journal of
Communication, Vol. 3, 2009
From the abstract: Connectivity and citations, as used by a large
number of scholars in different fields, are a common measure of the
health of a discipline. This paper shows the citation patterns for a
multinational sample of open access journals in Communication Studies.
Their citations are similar to those of the main communication
journals, but with more international citations. Differences in the
citation patterns are attributable to the international nature of the
sampled journals, not to their open access status. From the conclusion:
The citation pattern of these open access journals is the same as that
for non-open access journals, which is how it should be if open access
journals are going to be of the same quality as more established,
non-open access journals (recall that open access does require
peer-review). The journals in the sample are not in a separate citation
space, and they take part in the larger conversation of the field. As
such, this indicates, to a certain extent, the health of these journals
(they are not isolates in the citing direction), which, in turn, is a
decent indicator for the health of the field.
Added 17 Feb 2010
Akre, O., Barone-Adesi, F., Pettersson, A., Pearce, N., Merletti, F., and Richiardi, L. (2009)
Differences
in citation rates by country of origin for papers published in
top-ranked medical journals: do they reflect inequalities in access to
publication?
Journal of Epidemiology and Community Health, 24
Nov 2009
info:pmid/19934169 | info:doi/10.1136/jech.2009.088690
Investigates
the connection between citations and access, but not open access, by
country and income with reference to the country of publication. From
the abstract: "Methods: We obtained the number of citations and the
corresponding author's country for 4724 papers published between 1998
and 2002 in the British Medical Journal, the Lancet, the Journal of the
American Medical Association and the New England Journal of Medicine. Countries
were grouped according to the World Bank classification and geographic
location: low-middle income countries, European high-income countries,
non-European high-income countries, UK and USA. Conclusions: Papers
from different countries published in the same journal have different
citation rates."
Added 10 Dec 2009
Mertens, S. (2009)
Open Access: Unlimited Web Based Literature Searching
Dtsch Arztebl Int., 106(43): October 23, 2009, 710-712
Reviews the findings of most of the principal papers found in this
bibliography
Added 15 July 2009
Kousha, K. and Abdoli, M. (2009)
The
citation impact of Open Access Agricultural Research: a comparison between OA
and Non-OA publications (pdf 12pp)
World Library And Information Congress: 75th IFLA General Conference and
Council, 23-27 August 2009, Milan, Italy. Also in Online Information Review,
Vol. 34, No. 5, 2010, 772-785 http://dx.doi.org/10.1108/14684521011084618
Blogged summary, Open
Access enhances accessibility and citation impact, International
Association of Agricultural Information Specialists, 13 July 2009: "The results
showed that there is an obvious citation advantage for self-archived
agriculture articles as compared to non-OA articles." - "results indicate that
self-archived research articles published in the non-OA agriculture journals
could attract nearly two times more citations than their non-OA counterparts."
Added 15 Oct 2009
Lariviere, V. and Gingras, Y. (2009)
The impact factor's Matthew effect: a
natural experiment in bibliometrics
arXiv.org, arXiv:0908.3177v1 [physics.soc-ph], 21 Aug 2009, also in
Journal of the American Society For Information Science And Technology, 61 (2): 424-427, February 2010
Makes no mention of open access impact, but presents some interesting parallel
results on journal impact factors, in this case that publication in higher
impact journals can result in higher citations for a given paper.
Added 15 Oct 2009
Asif-ul Haque and Ginsparg, P. (2009)
Positional Effects on Citation and
Readership in arXiv
arXiv.org, arXiv:0907.4740v1 [cs.DL], 27 Jul 2009
in Journal of the American Society for Information Science and
Technology, Vol. 60 No. 11, 2203 - 2218, published online: 22 Jul 2009
Added 15 Oct 2009
Greyson, D., Morgan, S., Hanley, G. and Wahyuni, D. (2009)
Open access archiving and article
citations within health services and policy research
E-LIS, 14 Jul 2009, in Journal of the
Canadian Health Libraries Association (JCHLA) / Journal de l'Association des
bibliothèques de la santé du Canada (JABSC), 2009, vol. 30, no. 2,
51-58
From the abstract: This paper contributes to a growing body of research exploring
the “OA advantage” by employing an article-level analysis comparing
citation rates for articles drawn from the same, purposively selected journals.
We used a two-stage analytic approach designed to test whether OA is associated
with (1) the likelihood that an article is cited at all and (2) the total number
of citations that an article receives, conditional on being cited at least once.
Adjusting for potential confounders (number of authors, time since publication,
journal, and article subject), we found that OA archived articles were 60% more
likely to be cited at least once and, once cited, were cited 29% more than
non-OA articles.
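(Ed. The two-stage approach described above can be sketched generically: a logistic regression for whether an article is cited at all, then a count model for citations among articles cited at least once. The Python/statsmodels sketch below uses simulated data and invented variable names; it is not the authors' code or specification.)

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
oa = rng.integers(0, 2, n)        # 1 = an OA archived copy exists (simulated)
years = rng.integers(1, 6, n)     # years since publication (simulated)
X = sm.add_constant(np.column_stack([oa, years]))

# Simulate citation counts with a mild OA effect, purely for illustration
citations = rng.poisson(np.exp(-0.5 + 0.3 * oa + 0.4 * years))

# Stage 1: is the article cited at all? (logistic regression)
cited = (citations > 0).astype(int)
stage1 = sm.Logit(cited, X).fit(disp=False)

# Stage 2: citation count conditional on being cited at least once (Poisson GLM)
mask = citations > 0
stage2 = sm.GLM(citations[mask], X[mask], family=sm.families.Poisson()).fit()

print("stage 1 (log-odds of being cited):", stage1.params)
print("stage 2 (log citation rate among cited articles):", stage2.params)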
See also this poster (1pp) with
the same title, E-LIS, 14 Jul 2009, in Canadian Health Libraries Association/ Association
des bibliothèques de la santé du Canada (CHLA/ ABSC) Conference 2009,
Winnipeg, Manitoba (Canada), May 30 - June 3, 2009
Added 15 Oct 2009
Joint, N. (2009)
The Antaeus column: does
the “open access” advantage exist? A librarian's perspective
Library Review, Vol. 58, No. 7, 2009, 477-481
From the summary: Findings – The paper finds that many of the original
arguments for the benefits of open access have fallen by the wayside; but that,
in spite of this, there is good evidence that an “open access
advantage” does exist. The application of straightforward library
statistical counting measures which are traditionally used to evaluate user
benefits of mainstream services is just as effective an evaluation tool as more
sophisticated citation analysis methods.
Added 15 July 2009
Gentil-Beccot, A., Mele, S., Brooks, T. (2009)
Citing and Reading Behaviours in
High-Energy Physics. How a Community Stopped Worrying about Journals and
Learned to Love Repositories
arXiv.org, arXiv:0906.5418v1 [cs.DL], v1, 30 Jun 2009
From the abstract: The analysis of citation data demonstrates that free and
immediate online dissemination of preprints creates an immense citation
advantage in HEP, whereas publication in Open Access journals presents no
discernible advantage. In addition, the analysis of clickstreams in the leading
digital library of the field shows that HEP scientists seldom read journals,
preferring preprints instead.
Added 15 July 2009
Lansingh, V. C. and Carter, M. J. (2009)
Does
Open Access in Ophthalmology Affect How Articles are Subsequently Cited in
Research? (abstract only, subscription required)
Ophthalmology, 116(8):1425-1431, August 2009, available
online 22 June 2009
From the abstract: Examination of 480 articles in ophthalmology in the
experimental protocol and 415 articles in the control protocol. ... Four
subject areas were chosen to search the ophthalmology literature in the PubMed
database ... Searching started in December of 2003 and worked back in time to
the beginning of the year. The number of subsequent citations for equal numbers
of both open access (OA) and closed access (CA) (by subscription) articles was
quantified using the Scopus database and Google search engine. A control
protocol was also carried out to ascertain that the sampling method was not
systematically biased by matching 6 ophthalmology journals (3 OA, 3 CA) using
their impact factors, and employing the same search methodology to sample OA
and CA articles. The total number of citations was significantly higher for
open access articles compared to closed access articles for Scopus. However,
univariate general linear model (GLM) analysis showed that access was not a
significant factor that explained the citation data. Author number,
country/region of publication, subject area, language, and funding were the
variables that had the most effect and were statistically significant. Control
protocol results showed no significant difference between open and closed
access articles in regard to number of citations found by Scopus ... Unlike
other fields of science, open access thus far has not affected how
ophthalmology articles are cited in the literature.
Added 15 Oct 2009
Ostrowska, A. (2009)
Open Access Journals Quality
– How to Measure It?
INFORUM 2009: 15th Conference on Professional Information Resources,
Prague, May 27-29, 2009
Added 15 July 2009
Lin, S.-K. (2009)
Full Open Access Journals
Have Increased Impact Factors (editorial)
Molecules, 2009, 14(6):2254-2255
Added 15 July 2009
Mukherjee, B. (2009)
The hyperlinking pattern
of open-access journals in library and information science: A cited citing
reference study
Library & Information Science
Research, 31 (2), April 2009, 113-125
This paper appears to be another take on this study.
Added 29 April 2009
Tiwari, A. (2009)
Citation
Trend Line For PLoS Journals
Fisheye Perspective blog, April 25, 2009
A short illustrated blog post on predicting the impact of a new journal. The author,
a bioscientist, evaluates two PLoS (OA) journals using Scopus Journal Analyzer.
Using the service's Trend Line and % Not Cited parameters the author predicts
that one, a new journal that doesn't yet have an official impact factor, will
soon rival the other, which does: "I am sure it's impact factor (or quality or
what ever you love) is going to be same or may be much more." Does not claim to
be statistically sound.
Added 29 April 2009
Gargouri, Y. and Harnad, S. (2009)
Logistic
regression of potential explanatory variables on citation counts
Preprint 11/04/2009
Logistic regression analysis on the correlation between citation counts (as
dependent variable) and a set of potential correlator/predictor variables.
Result: Published journal papers that are self-archived in institutional
repositories - in this study the repositories mandate deposit, obviating the
self-selection bias postulated by some to be a factor in self-archiving - can
achieve a citation advantage whether published in journals of high or
impact factor (IF): "Overall, OA is correlated with a significant citation
advantage for all journal IF intervals".
Added 29 April 2009
Watson, A. B. (2009)
Comparing citations and downloads
for individual articles
Journal of Vision, April 3, 2009 Volume 9, Number 4, Editorial i, Pages
1-4
Measures the correlation between download and citation counts for articles in
Journal of Vision: "Download statistics provide a useful indicator, two
years in advance, of eventual citations."
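(Ed. A correlation of this kind is straightforward to compute for any set of per-article download and citation counts; the short Python/SciPy sketch below uses invented numbers and is not the Journal of Vision data.)

from scipy.stats import spearmanr

# Invented per-article figures: early downloads vs. citations two years later
downloads_year1 = [850, 120, 430, 990, 60, 310, 720, 150]
citations_later = [14, 2, 6, 19, 1, 5, 11, 3]

rho, p_value = spearmanr(downloads_year1, citations_later)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")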
Added 29 April 2009
Bollen, J., Van de Sompel, H., Hagberg, A., Bettencourt, L.,
Chute, R., Rodriguez, M. A. and Balakireva, L. (2009)
Clickstream Data
Yields High-Resolution Maps of Science
PLoS ONE, 4(3): e4803, March 11, 2009
See also Nature news article on this paper, 9
March 2009: "A striking difference in the usage maps is that journals in the
humanities and social sciences figure much more prominently than in
citation-based maps. The difference partly arises because Bollen's study covers
a wider literature than the citation databases, which are biased towards
natural sciences journals. "By including practitioners we capture a much wider
sample of the scholarly community," adds Bollen. Usage maps are also more up to
date than citation ones because the inherent delay in publication means it
takes at least two years before a paper will start to gather citations in
sufficient numbers to be meaningful. Anthony van Raan argues that this more
current view may in fact represent today's "fashions", rather than trends that
will endure."
These findings are not based on OA journals or papers, but highlight the
emerging value of clicks, or hits, as possible contributory factors for online
impact metrics.
Added 15 July 2009
Bernius, S. and Hanauske, M. (2009)
Open
Access to Scientific Literature - Increasing Citations as an Incentive for
Authors to Make Their Publications Freely Accessible
Institute for Information Systems, Frankfurt University, publications 2009, in
42nd Hawaii International Conference on
System Sciences (HICSS '09), 5-8 Jan. 2009, pp. 1-9
http://dx.doi.org/10.1109/HICSS.2009.335
A summary of these results also appears in
Bernius, S., Hanauske, M., König, W. and Dugall, B. (2009)
Open
Access Models and their Implications for the Players on the Scientific
Publishing Market (see section 1.2)
Economic Analysis and Policy Journal, Vol. 39, No. 1, March 2009
Added 29 April 2009
Åström, F. (2009)
Citation
patterns in open access journals
OpenAccess.se and the National Library of Sweden, February 25, 2009.
"Fewer analyses have investigated whether OA and non-OA journals in the same
research fields are citing the same literature; and to what extent this
reflects whether it is the same kind (and thus comparable) research that is
published in the two forms of scholarly publications. ... The citation
structures in the journals were analysed through MDS maps building on
co-citation analyses, as well as a more thorough comparison investigating
overlaps of cited authors and journals between the different journals. ... The
results of the analyses suggest that it is hard to draw any overall
conclusions on the matter of whether research published in OA journals is
likely to have a larger citation impact or not."
This conclusion is unsurprising since the study did not measure impact but
mapped citation patterns between journals. It is suggested that these mappings
could improve understanding when comparing the impact of OA and non-OA
journals.
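(Ed. As a generic illustration of the MDS mapping technique mentioned above - not the data or software used in the report - the Python/scikit-learn sketch below places four invented journals in two dimensions from a small co-citation matrix.)

import numpy as np
from sklearn.manifold import MDS

journals = ["OA journal 1", "OA journal 2", "TA journal 1", "TA journal 2"]
# Invented symmetric co-citation counts (how often pairs are cited together)
cocitation = np.array([
    [ 0, 40, 10,  5],
    [40,  0, 12,  6],
    [10, 12,  0, 35],
    [ 5,  6, 35,  0],
])

# Convert co-citation similarity into a dissimilarity matrix for MDS
dissimilarity = cocitation.max() - cocitation
np.fill_diagonal(dissimilarity, 0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
for name, (x, y) in zip(journals, coords):
    print(f"{name}: ({x:.2f}, {y:.2f})")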
Added 29 April 2009
Gaulé, P. (2009)
Access
to the scientific literature in India
CEMI Working Paper 2009-004, February 23, 2009, in Journal of the American Society for
Information Science and Technology, Vol. 60, Issue 12, 2548 - 2553, published online: 8 Oct 2009
Abstract: This paper uses an evidence-based approach to assess the difficulties
faced by developing country scientists in accessing the scientific literature.
I compare backward citations patterns of Swiss and Indian scientists in a
database of 43'150 scientific papers published by scientists from either
country in 2007. Controlling for fields and quality with citing journal fixed
effects, I find that Indian scientists (1) have shorter reference lists, (2)
are more likely to cite articles from open access journals, and (3) are less
likely to cite articles from expensive journals. The magnitude of the effects
is small which can be explained by informal file sharing practices among
scientists.
See also
Patrick Gaulé and Nicolas Maystre, Free
availability and diffusion of scientific articles, Vox, 23 June 2009
Added 26 February 2009
Evans, J. A. and Reimer, J. (2009)
Open Access and Global
Participation in Science (full text requires subscription; summary only)
Science, Vol. 323. No. 5917, 20
February 2009, 1025
From the paper: "The influence of OA is more modest than many have proposed, at c.8%
for recently published research, but our work provides clear support for
its ability to widen the global circle of those who can participate in science
and benefit from it."
Listen to Science podcast
interview with James Evans
See also articles on this paper:
Dolgin, E., Online access = more
citations, The Scientist, 19th
February 2009 (free registration required): "When the authors looked just at
poorer countries, however, they found that the influence of open access was
more than twice as strong. For example, in Bulgaria and Chile, researchers
cited nearly 20% more open access articles, and in Turkey and Brazil, the
number of citations rose by more than 25%. Free online availability "is not a
huge driver of science in the first world, but it shapes parts of science in
the rest of world," Evans told The
Scientist."
Xie, Y., Open,
electronic access to research crucial for global reach, ars technica, February 19, 2009
Added 26 February 2009
Bollen, J., Van de Sompel, H., Hagberg, A. and Chute, R.
(2009)
A principal component analysis of 39
scientific impact measures
arXiv.org, arXiv:0902.2183v1 [cs.CY], 12 Feb. 2009, in PLoS ONE 4(6):
e6022, http://dx.doi.org/10.1371/journal.pone.0006022
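(Ed. For readers unfamiliar with the technique, principal component analysis summarises many correlated measures with a few underlying components. The Python/scikit-learn sketch below applies it to a handful of invented journal-level measures; it is not the 39 measures or the data analysed in the paper.)

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = journals; columns = citations per paper, impact factor,
# downloads per paper, h-index (all values invented)
measures = np.array([
    [2.1, 1.8,  900, 15],
    [5.4, 4.9, 2100, 40],
    [1.2, 0.9,  300,  8],
    [3.3, 3.1, 1500, 25],
    [4.8, 4.2, 1800, 37],
])

scaled = StandardScaler().fit_transform(measures)
pca = PCA(n_components=2).fit(scaled)
print("variance explained by each component:", pca.explained_variance_ratio_)
print("component loadings:\n", pca.components_)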
Added 29 April 2009
Castillo, M. (2009)
Citations and Open
Access: Questionable Benefits
American Journal of Neuroradiology,
February 2009
An editorial.
(Ed. For the record, but I cannot
indicate what is questionable about OA, in this author's view, as I can't
access any part of this, not even an abstract.)
Added 26 February 2009
Norris, M. (2009)
The citation advantage of open access
articles
PhD thesis, Loughborough University Institutional Repository, 2009-01-15
Michael Norris has been named Highly Commended Award winner of the 2008
Emerald/EFMD Outstanding Doctoral Research Award in the Information Science
category for this doctoral thesis.
Two published papers (JASIST, ElPub) are based on this work.
Added 15 July 2009
Frandsen, T. F. (2009)
The effects of open
access on un-published documents: A case study of economics working papers
HAL: hprints-00352359, version 2, 12 January 2009, Journal of
Informetrics (2009) in press
Added 13 January 2009
O'Leary, D. E. (2008)
The relationship between
citations and number of downloads
Decision Support Systems, Vol. 45, No. 4, November 2008, 972-980,
available online 11 April 2008 (full text requires subscription; abstract only)
Broadly agrees with earlier findings (e.g. Brody
et al.) about the correlation - 'strong positive statistically
significant relationship' - between downloads and citations for digital papers,
notably for the most-downloaded, 'top' papers, in this case based on data for a
single, focussed source, the journal Decision Support Systems.
Added 29 April 2009
Mukherjee, B. (2008)
Do open-access journals in
library and information science have any scholarly impact? A bibliometric study
of selected open-access journals using Google Scholar (full text requires
subscription; abstract only)
Journal of the American Society for Information Science and Technology,
Vol. 60, No. 3, March 2009, 581-594, published online: 16 Dec 2008
From the abstract: "Using 17 fully open-access journals published
uninterruptedly during 2000 to 2004 in the field of library and information
science, the present study investigates the impact of these open-access
journals in terms of quantity of articles published, subject distribution of
the articles, synchronous and diachronous impact factor, immediacy index, and
journals' and authors' self-citation."
The paper does not appear to reveal any comparative findings (OA vs non-OA).
Added 24 November 2008
Tenopir, C. and King, D. W. (2008)
Electronic Journals
and Changes in Scholarly Article Seeking and Reading Patterns
D-Lib Magazine, Vol. 14 No. 11/12,
November/December 2008
From the abstract: "Reading patterns and citation patterns differ, as faculty
read many more articles than they ultimately cite and read for many purposes in
addition to research and writing. The number of articles read has steadily
increased over the last three decades, so the actual numbers of articles found
by browsing has not decreased much, even though the percentage of readings
found by searching has increased. Readings from library-provided electronic
journals have increased substantially, while readings of older articles have
recently increased somewhat. Ironically, reading patterns have broadened with
electronic journals at the same time citing patterns have narrowed."
Added 24 November 2008
Gaule, P. and Maystre, N. (2008)
Getting
cited: does open access help?
Ecole Polytechnique Fédérale de Lausanne, CEMI-WORKINGPAPER-2008-007, November
2008. In Research Policy, in press, available online 2 July 2011
info:doi/10.1016/j.respol.2011.05.025. Also available from RePEc
http://ideas.repec.org/p/cmi/wpaper/cemi-workingpaper-2008-007.html
Explains the 'widely held belief' that free availability of scientific articles
increases the number of citations they receive thus: "Since open access is
relatively more attractive to authors of higher quality papers, regressing
citations on open access and other controls yields upward-biased estimates."
Findings are based on a sample of 4388 biology papers published between May
2004 and March 2006 by Proceedings of the National Academy of Sciences
(PNAS). "Using an instrumental variable approach, we find no significant effect
of open access. Instead, self-selection of higher quality articles into open
access explains at least part of the observed open access citation advantage."
Note: OA in PNAS is itself self-selective by virtue of its OA charging
structure (i.e. authors must pay a fee for their published article to be
made OA; no fee applies otherwise).
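(Ed. The instrumental-variable logic - using a variable that shifts OA status but is unrelated to article quality to separate an access effect from self-selection - can be sketched with simulated data. The numpy-only Python example below implements a bare-bones two-stage least squares; the instrument and the numbers are invented, and this is not the authors' specification.)

import numpy as np

rng = np.random.default_rng(1)
n = 2000
instrument = rng.integers(0, 2, n)   # simulated exogenous shifter of OA choice
quality = rng.normal(0, 1, n)        # unobserved article quality
oa = (0.8 * instrument + 0.5 * quality + rng.normal(0, 1, n) > 0.5).astype(float)
citations = 2.0 + 0.0 * oa + 1.5 * quality + rng.normal(0, 1, n)  # true OA effect = 0

def ols(y, X):
    # Ordinary least squares coefficients (X already includes a constant column)
    return np.linalg.lstsq(X, y, rcond=None)[0]

const = np.ones(n)

# Naive OLS is biased upward because OA is correlated with unobserved quality
naive = ols(citations, np.column_stack([const, oa]))

# 2SLS: stage 1 predicts OA from the instrument; stage 2 uses the prediction
Z = np.column_stack([const, instrument])
oa_hat = Z @ ols(oa, Z)
iv = ols(citations, np.column_stack([const, oa_hat]))

print(f"naive OLS OA coefficient: {naive[1]:.3f}")  # well above the true 0
print(f"2SLS OA coefficient:      {iv[1]:.3f}")     # typically close to 0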
See also
Patrick Gaulé and Nicolas Maystre, Free
availability and diffusion of scientific articles, Vox, 23 June 2009
Comments below refer to the Research Policy publication
Added 25 November 2011
Harnad, S., Getting
Excited About Getting Cited: No Need To Pay For OA,
Open Access Archivangelism, August 19, 2011: What G & M have shown,
convincingly, is that in the special case of having to pay for OA in a
hybrid Gold Journal (PNAS: a high-quality journal that makes all
articles OA on its website 6 months after publication), the article
quality and author self-selection factors alone (plus the availability
of funds in the annual funding cycle) account for virtually all the
significant variance in the OA citation advantage: Paying extra to
provide hybrid Gold OA during those first 6 months does not buy authors
significantly more citations. G & M correctly acknowledge,
however, that neither their data nor their economic model apply to Green OA
self-archiving, which costs the author nothing and can be provided for
any article, in any journal (most of which are not made OA on the
publisher's website 6 months after publication, as in the case of
PNAS). Yet it is on Green OA self-archiving that most of the studies of
the OA citation advantage (and the ones with the largest and most
cross-disciplinary samples) are based.
Comments below refer to the preprint
Davis, P., Should You
Pay To Get Cited?, the scholarly kitchen blog, 18 November 2008: "This
study adds to a growing literature that casts doubt on early research
suggesting that free access to the scientific literature leads to increased
citations. Self-selection, whereby higher quality articles are made freely
available, is beginning to seem a much more plausible explanation."
Added 26 February 2009
Frandsen, T. F. (2008)
Attracted to open access
journals: a bibliometric author analysis in the field of biology
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00328270,
version 1, 10 October 2008
also in Journal of Documentation,
January 2009
http://www.emeraldinsight.com/Insight/viewContentItem.do?contentType=Article&contentId=1766883
Added 26 February 2009
Frandsen, T. F. (2008)
The integration of open
access journals in the scholarly communication system: Three science
fields
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00326285,
version 1, 2 October 2008
also in Information Processing &
Management, January 2009 http://dx.doi.org/10.1016/j.ipm.2008.06.001
From the abstract: "This study is an analysis of the citing behaviour in (open
access) journals within three science fields: biology, mathematics, and
pharmacy and pharmacology. The integration of OAJs in the scholarly
communication system varies considerably across fields. The implications for
bibliometric research are discussed."
Added 13 January 2009
De Groote, S. L. (2008)
Citation patterns of
online and print journals in the digital age
J. Med. Libr. Assoc., 2008 October; 96(4): 362-369
From the abstract: "Journals available in electronic format were cited more
frequently in publications from the campus whose library had a small print
collection, and the citation of journals available in both print and electronic
formats generally increased over the years studied."
Added 10 November 2008
Kousha, K. (2008)
Characteristics
of Open Access Web Citation Network: A Multidisciplinary Study
Proceedings of WIS 2008, Fourth International Conference on Webometrics,
Informetrics and Scientometrics & Ninth COLLNET Meeting (Berlin, 28
July - 1 August 2008), edited by H. Kretschmer and F. Havemann, October 2008
Added 10 November 2008, updated 29
April 2009 Lariviere, V., Gingras, Y. and Archambault, E. (2008)
The decline in the concentration of
citations, 1900-2007
arXiv.org, arXiv:0809.5250v1 [physics.soc-ph], 30 Sep 2008 and in Journal of
the American Society for Information Science and Technology, Vol. 60, No.
4, April 2009, 858-862, published online: 29 Jan 2009
From the abstract: "This paper challenges recent research (Evans, 2008) reporting that the concentration of cited
scientific literature increases with the online availability of articles and
journals. ... contrary to what was reported by Evans, the dispersion of
citations is actually increasing."
Added 25 September 2008
Davis, P. M.
Author-choice open access publishing
in the biological and medical literature: a citation analysis
arXiv.org, arXiv:0808.2428v1 [cs.DL], 18 Aug 2008, in
Journal of the American Society for Information Science & Technology,
Vol. 60, No. 1, January 2009, 3-8, published online: 25 Sep 2008
Scintilla
This study is a follow-up to the controlled trial of
open access publishing published in the BMJ: "According to a study
of 11 biological and medical journals that allow authors the choice of making
their articles freely available from the publisher's website, few show any
evidence of a citation advantage. For those that do, the effect appears to be
diminishing over time. ... (the paper) analyzed over eleven thousand articles
published in journals since 2003, sixteen hundred of these articles (15%)
adopting the author-choice open access model."
Added 25 September 2008
Clauson, K. A., Veronin, M. A., Khanfar, N. M. and Lou, J. Q.
Open-access
publishing for pharmacy-focused journals (full text requires subscription;
summary only)
American Journal of Health-System Pharmacy, Vol. 65, No. 16, 1539-1544,
August 15, 2008 Scintilla
From the conclusion: A very small number of pharmacy-focused journals adhere to
the OA paradigm of access. However, journals that adopt some elements of the OA
model, chiefly free accessibility, may be more likely to be cited than
traditional journals. Pharmacy practitioners, educators, and researchers could
benefit from the advantages that OA offers but should understand its financial
disadvantages.
The same issue has an editorial by C. Richard Talley, Open-access publishing:
why not? (accessible only to subscribers)
Added 10 November 2008
Henneken, E. A., Kurtz, M. J., Accomazzi, A., Grant, C. S.,
Thompson, D., Bohlen, E. and Murray, S. S. (2008)
Use of Astronomical Literature - A
Report on Usage Patterns
arXiv.org, arXiv:0808.0103v1 [cs.DL], 1 Aug 2008
in Journal of Informetrics, Vol. 3, Issue 1, 1-90 (January 2009)
Added 4 August
2008 Davis, P.M., Lewenstein, B. V., Simon, D. H., Booth, J. G.
and Connolly, M. J. L. (2008)
Open access
publishing, article downloads, and citations: randomised controlled trial
BMJ, 2008;337:a568, published 31 July 2008 Scintilla
See also this update by Philip M Davis:
Results
of Open Access RCT Robust at 3 Years, BMJ Rapid Response,
23 November 2010: "All of the articles in our study have now aged 3-years and we report
that our initial findings were robust: articles receiving the open access treatment received more
article downloads but no more citations"
Added 24 November 2008
Levitt, J. M. and Thelwall, M. (2008)
Patterns of annual
citation of highly cited articles and the prediction of their citation ranking:
A comparison across subjects (full text requires subscription; abstract
only)
Scientometrics, Vol. 77, No. 1 (2008)
41-46, published online: 24 July 2008
From the abstract: "For four of the six subjects, there is a correlation of
over 0.42 between the percentage of early citations and total citation ranking
but more highly ranked articles had a lower percentage of early citations.
Surprisingly, for highly cited articles in all six subjects the prediction of
citation ranking from the sum of citations during their first six years was
less accurate than prediction using the sum of the citations for only the fifth
and sixth year."
Open access is not a factor here: the highly cited articles investigated date
from 1969-71. For open access papers, Brody et al. (2005) showed a correlation
that predicts impact from much earlier data, i.e. download data for OA papers,
before any citations accrue.
Added 28 July 2008
Evans, J. A. (2008)
Electronic Publication and
the Narrowing of Science and Scholarship (full text requires subscription;
abstract only)
Science, Vol. 321, No. 5887, 18 July
2008, 395-399 Scintilla
From the abstract: "Using a database of 34 million articles, their citations
(1945 to 2005), and online availability (1998 to 2005), I show that as more
journal issues came online, the articles referenced tended to be more recent,
fewer journals and articles were cited, and more of those citations were to
fewer journals and articles. The forced browsing of print archives may have
stretched scientists and scholars to anchor findings deeply into past and
present scholarship. Searching online is more efficient and following
hyperlinks quickly puts researchers in touch with prevailing opinion, but this
may accelerate consensus and narrow the range of findings and ideas built
upon."
This work was funded by the National Science Foundation. See the NSF press
release and video interview with James Evans
See also the news feature Great
minds think (too much) alike, The
Economist, July 17th 2008
Added 25 September 2008, updated 13
January 2009 Norris, M., Oppenheim, C., and Rowland, F.
The
citation advantage of open-access articles (full text requires
subscription; abstract only)
Journal of the American Society for
Information Science and Technology, Vol. 59, No. 12, 2008, 1963-1972,
published online: 9 July 2008
also available from Loughborough University Institutional Repository,
2009-01-12 http://hdl.handle.net/2134/4083
From the abstract: "Of a sample of 4,633 articles examined, 2,280 (49%) were OA
and had a mean citation count of 9.04 whereas the mean for (toll access) TA
articles was 5.76. There appears to be a clear citation advantage for those
articles that are OA as opposed to those that are TA. This advantage, however,
varies between disciplines, with sociology having the highest citation
advantage, but the lowest number of OA articles, from the sample taken, and
ecology having the highest individual citation count for OA articles, but the
smallest citation advantage. Tests of correlation or association between OA
status and a number of variables were generally found to be weak or inconsistent.
The cause of this citation advantage has not been determined."
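As a quick worked example of the headline figures quoted above, the relative citation advantage implied by the reported sample means can be computed directly (illustrative arithmetic only, using just the numbers given in the abstract):

```python
# Relative OA citation advantage implied by the reported sample means.
oa_mean, ta_mean = 9.04, 5.76   # mean citations for OA and toll-access articles
advantage = (oa_mean - ta_mean) / ta_mean
print(f"OA articles received about {advantage:.0%} more citations on average")  # ~57%
```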
Added 28 July 2008
Eger, A. (2008)
Database
statistics applied to investigate the effects of electronic information
services on publication of academic research - a comparative study covering
Austria, Germany and Switzerland
GMS Medizin - Bibliothek - Information, June 26, 2008
Findings on increased usage of online full text articles leading to increased
publication, but says nothing on the effects of such access on citation
practices
Updated 25 September
2008 Norris, M., Oppenheim, C. and Rowland, F. (2008)
Open
Access Citation Rates and Developing Countries
12th International Conference on Electronic Publishing (ElPub 2008),
Toronto, June 25-27, 2008
"the admittedly small number of citations from authors in developing countries
do indeed seem to show a higher proportion of citations being given to OA
articles than is the case for citations from developed countries."
Added 25 September 2008
Sheikh Mohammad, S.
Research
impact of open access contributions across disciplines
12th International Conference on Electronic Publishing (ElPub 2008),
Toronto, June 25-27, 2008
Added 10
November 2008 Dietrich, J. P. (2008)
Disentangling visibility and
self-promotion bias in the arXiv: astro-ph positional citation effect
arXiv.org, arXiv:0805.0307v2 [astro-ph], 25 Jun 2008, in Publications of the
Astronomical Society of the Pacific, 120 (869): 801-804
Added 26 May 2008 Cheng,
W. H. and Ren, S. L. (2008)
Evolution
of open access publishing in Chinese scientific journals (full text
requires subscription; abstract only)
Learned Publishing, Vol. 21, No. 2, April 2008, 140-152
From the abstract: "Citation indicators of OA journals were found to be higher
than those of non-OA journals."
Added 28
July 2008 Harnad, S., Brody, T., Vallières, F., Carr, L.,
Hitchcock, S., Gingras, Y., Oppenheim, C., Hajjem, C. and Hilf, E. R. (2008)
The Access/Impact
Problem and the Green and Gold Roads to Open Access: An Update
Serials Review, Vol. 34, Issue 1,
March 2008, 36-40, available online 6 March 2008 Scintilla
also available from ECS EPrints, 06 Jun 2008
http://eprints.ecs.soton.ac.uk/15852/
Update to the paper published in Serials Review, 30(4), 2004
Added 26 May 2008
Lokker, C., McKibbon, K. A., McKinlay, R.J., Wilczynski, N. L. and Haynes, R.
B. (2008)
Prediction of
citation counts for clinical articles at two years using data available within
three weeks of publication: retrospective cohort study
BMJ, 2008;336:655-657 (22 March), published 21 February 2008 Scintilla
"Conclusion: Citation counts can be reliably predicted at two years using data
within three weeks of publication."
Added 11 February 2008
Chu, H. and Krichel, T. (2008)
Downloads vs. Citations:
Relationships, Contributing Factors and Beyond
E-LIS, 9 February 2008, in 11th Annual Meeting of the International Society
for Scientometrics and Informetrics, Madrid, 25-27 June 2007
From the abstract: "In a nutshell, an infrastructure that encourages
downloading at digital libraries would eventually lead to higher usage of their
resources."
Added 26 May 2008 Turk,
N. (2008)
Citation impact of Open
Access journals (full text requires subscription; summary only)
New Library World, Vol. 109, No. 1/2, January/February 2008, 65-74
Review of the main research about citation impact of Open Access journals,
focused on LIS journals.
Added 26 May 2008
Hardisty, D. J. and Haaga, D. A. F. (2008)
Diffusion
of Treatment Research: Does Open Access Matter? (pdf 39pp)
Center for the Decision Sciences, Columbia University, in Journal of
Clinical Psychology, Vol. 64(7), 1-19 (2008)
From the abstract: "In a pair of studies, mental health professionals were
given either no citation, a normal citation, a linked citation, or a free
access citation and were asked to find and read the cited article. After one
week, participants read a vignette on the same topic as the article and gave
recommendations for an intervention. In both studies, those given the free
access citation were more likely to read the article, yet only in one study did
free access increase the likelihood of making intervention recommendations
consistent with the article."
Added 11 February 2008
Kousha, K. and Thelwall, M. (2007)
The Web impact of open
access social science research (full-text requires subscription; otherwise
abstract only)
Library & Information Science Research, Volume 29, Issue 4, December
2007, 495-507, available online 15 October 2007
preprint http://www.scit.wlv.ac.uk/~cm1993/papers/OpenAccessSocialSciencePreprint.doc
(.doc 12pp)
From the abstract: "The results suggest that new types of citation information
and informal scholarly indicators could be extracted from the Web for the social
sciences."
Added 17
December 2007 Dietrich, J. P. (2007)
The Importance of Being First:
Position Dependent Citation Rates on arXiv:astro-ph
arXiv.org, arXiv:0712.1037v1 [astro-ph], 6 December 2007, in Publications of
the Astronomical Society of the Pacific, 120 (864): 224-228, February 2008
From the abstract: "We study the dependence of citation counts of e-prints
published on the arXiv:astro-ph server on their position in the daily astro-ph
listing. ... cannot exclude that increased visibility at the top of the daily
listings contributes to higher citation counts as well."
Added 13 September 2007
Kurtz, M. J. and Henneken, E. A. (2007)
Open Access does not increase
citations for research articles from The Astrophysical Journal
arXiv.org, arXiv:0709.0896v1 [cs.DL], 6 September 2007
Abstract: We demonstrate conclusively that there is no "Open Access Advantage"
for papers from the Astrophysical Journal. The two to one citation advantage
enjoyed by papers deposited in the arXiv e-print server is due entirely to the
nature and timing of the deposited papers. This may have implications for other
disciplines.
Added 22 August 2007
Sotudeh, H. and Horri, A. (2007)
The citation performance of open
access journals: A disciplinary investigation of citation distribution
models (full-text subscribers only; no abstract)
Journal of the American Society for Information Science and Technology,
Vol. 58, No. 13, 2007, 2145-2156, published online August 17, 2007
From the conclusion: "To sum up, the similarity of the science system across
OAJ and NOAJ boundaries has been confirmed. We see this as further evidence of
OA's widespread recognition by scientific communities. However, because the
magnitudes of the exponents found in this study are lower than what was
previously observed for the whole system, OA may currently perform at a
slightly lower level. According to the models used in this study, the citation
distributions between fields are strongly disproportionate in Life Sciences and
Engineering and Material Sciences, favoring larger fields in the former, but
smaller fields in the latter. However, the distributions tend to be rather
linear in the Natural Sciences."
Added 13 September 2007
Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A. (2007)
Incentivizing
the Open Access Research Web: Publication-Archiving, Data-Archiving and
Scientometrics
CTWatch Quarterly, Vol. 3, No. 3, August 2007
Added 22 August 2007
Lin, S.-K. (2007)
Editorial: Non-Open
Access and Its Adverse Impact on Molecules
Molecules, 12, 1436-1437, 16 July 2007
The point of this short editorial is that the difference between the OA and
non-OA content in the journal Molecules is clearly reflected in higher citations
for the former. The context could be clearer, however: the OA/non-OA history of
the journal, especially prior to the period under review (2005-6), is not
elaborated, and familiarity with the journal is assumed.
Added 22 August 2007
Taylor, D. (2007)
Looking
for a Link: Comparing Faculty Citations Pre and Post Big Deals
Electronic Journal of Academic and Special Librarianship, v.8 no.1
(Spring 2007)
Note. The Big Deal is where a library or consortium of libraries subscribes to
a larger package of a publisher's journals than they would have if they had
subscribed to journals individually. Big Deals are claimed to improve access
for an institution's users. "Pre Big Deal, the percentage of citations to
journals that are part of Big Deals but were previously not subscribed to was
an average of 2.6%. Post Big Deal this increased to an average of 6.1%." There
is no analysis or comment on how this result might change if open access were
taken into account.
Added 22 May 2007 Craig,
I. D., Plume, A. M., McVeigh, M. E., Pringle, J. and Amin, M. (2007)
Do Open Access
Articles Have Greater Citation Impact? A critical review of the literature
Publishing Research Consortium, undated (announced 17 May 2007), Journal of
Informetrics, 1 (3): 239-248, July 2007
Added 10 May 2007 Tonta,
Y., Ünal, Y. and Al, U. (2007)
The Research Impact of
Open Access Journal Articles
E-LIS, 30 April 2007, also in Proceedings ELPUB 2007, the 11th International
Conference on Electronic Publishing, Vienna, 13-15 June 2007
Added 26 May 2008
Sharma, H. P. (2007)
Download plus citation
counts - a useful indicator to measure research impact (correspondence, pdf
1pp)
Current Science, 92 (7): 873-873, April 10, 2007
Added 10 May 2007
Piwowar, H. A., Day, R. S. and Fridsma, D. B. (2007)
Sharing
Detailed Research Data Is Associated with Increased Citation Rate
PLoS ONE, March 21, 2007
Principal Findings: "We examined the citation history of 85 cancer microarray
clinical trial publications with respect to the availability of their data. The
48% of trials with publicly available microarray data received 85% of the
aggregate citations. Publicly available data was significantly (p = 0.006)
associated with a 69% increase in citations, independently of journal impact
factor, date of publication, and author country of origin using linear
regression."
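The kind of analysis reported above can be sketched generically as an ordinary least-squares regression of log citation counts on a data-availability indicator plus simple controls. Everything below is simulated and the variable names are invented; it is not the authors' code or data.

```python
# Illustrative OLS of log citations on data availability with simple controls.
# Simulated inputs and invented names; the published study used real trial metadata.
import numpy as np

rng = np.random.default_rng(1)
n = 85                                   # the paper examined 85 publications

impact_factor = rng.lognormal(mean=1.5, sigma=0.5, size=n)
years_since_pub = rng.integers(1, 8, size=n).astype(float)
data_available = rng.integers(0, 2, size=n).astype(float)

# Simulated outcome: data availability adds ~0.5 to log citations here.
log_citations = (0.5 * data_available + 0.3 * np.log(impact_factor)
                 + 0.2 * years_since_pub + rng.normal(scale=0.5, size=n))

X = np.column_stack([np.ones(n), data_available, np.log(impact_factor), years_since_pub])
coefs, *_ = np.linalg.lstsq(X, log_citations, rcond=None)

# exp(beta) - 1 converts a log-scale coefficient into a percentage increase.
print(f"estimated citation increase with shared data: {np.exp(coefs[1]) - 1:.0%}")
```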
Added 8 March 2007
Bergstrom, T. C. and Lavaty, R. (2007)
How often do
economists self-archive?
eScholarship Repository, University of California, February 8, 2007
Added 26 May 2008
Chapman, S., Nguyen, T. N. and White, C. (2007)
Press-released
papers are more downloaded and cited (full text requires subscription;
extract only)
Tobacco Control, 16 (1): 71-71, February 2007
Added 22 January 2007
Harnad, S. and Hajjem, C. (2007)
The
Open Access Citation Advantage: Quality Advantage Or Quality Bias?
Author blog, Open Access Archivangelism, 21 January 2007
Does the OA Advantage (OAA) occur because authors are more likely to
self-selectively self-archive articles that are more likely to be cited
(self-selection "Quality Bias": QB), or because articles that are self-archived
are more likely to be cited ("Quality Advantage": QA)? Preliminary evidence
based on over 100,000 articles from multiple fields, comparing self-selected
self-archiving with mandated self-archiving to estimate the contributions of QB
and QA to the OAA shows: "Both factors contribute, and the contribution of QA
is greater." Includes comment on Moed, H. (2006), The effect
of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter
Section.
Added 17 January 2007
Harnad, S. (2007)
Citation
Advantage For OA Self-Archiving Is Independent of Journal Impact Factor,
Article Age, and Number of Co-Authors
Author blog, Open Access Archivangelism, 17 January 2007
Further comment on Eysenbach, G. (2006), Citation
Advantage of Open Access Articles: "The OA-self-archiving advantage remains
a robust, independent factor."
Added 17 January 2007
Brody, T. (2007)
Evaluating Research Impact
through Open Access to Scholarly Communication
PhD, Electronics and Computer Science, University of Southampton, May 2006, in
ECS EPrints, 14 January 2007
Added 8 March 2007
McDonald, J. D. (2007)
Understanding Online
Journal Usage: A Statistical Analysis of Citation and Use
Journal of the American Society for Information Science &
Technology, 58(1): 39-50, January 1, 2007, also in Caltech Library System
Papers and Publications, 18 May 2006
Added 28 July 2008
Knowlton, S. A. (2007)
Continuing
use of print-only information by researchers
J Med Libr Assoc., 95(1): 83-88,
January 2007
"to study the question, "Are researchers still accessing and using material
issued only in print?," a group of journals was selected, and the impact factor
of each was tracked over the period 1993-2003.
Conclusion: the online status of a journal is not sufficient to override all
other considerations by researchers when they choose which material to cite."
Added 26 May 2008
Walters, G. D. (2006)
Predicting
subsequent citations to articles published in twelve crime-psychology journals:
Author impact versus journal impact (full text requires subscription;
abstract only)
Scientometrics, 69 (3): 499-510, December 2006
"These results suggest that author impact may be a more powerful predictor of
citations received by a journal article than the periodical in which the
article appears."
Added 23 November 2006
Harnad, S. (2006)
The
Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?
Author blog, Open Access Archivangelism, 20 November 2006
Added 19 November
2006 Moed, H. F. (2006)
The effect of 'Open Access'
upon citation impact: An analysis of ArXiv's Condensed Matter Section
ArXiv, Computer Science, cs.DL/0611060, 14 November 2006, in Journal of the
American Society for Information Science and Technology, Vol. 58, No. 13,
2007, 2145-2156, published online August 30, 2007 http://dx.doi.org/10.1002/asi.20663
(subscriber access only to full text)
"This article statistically analyses how the citation impact of articles
deposited in the Condensed Matter section of the preprint server ArXiv, and
subsequently published in a scientific journal, compares to that of articles in
the same journal that were not deposited in that archive. Its principal aim is
to further illustrate and roughly estimate the effect of two factors, 'early
view' and 'quality bias', upon differences in citation impact between these two
sets of papers ... The analysis provided evidence of a strong quality bias and
early view effect. Correcting for these effects, there is in a sample of 6
condensed matter physics journals studied in detail, no sign of a general 'open
access advantage' of papers deposited in ArXiv. The study does provide evidence
that ArXiv accelerates citation, due to the fact that ArXiv makes papers earlier
available rather than that it makes papers freely available."
Added 13 September 2007
Bollen, J. and Van de Sompel, H. (2006)
Usage Impact Factor: the effects of
sample characteristics on usage-based impact metrics
arXiv.org > cs > arXiv:cs/0610154v2 [cs.DL], 26 October 2006, in
Journal of the American Society for Information Science and Technology,
59 (1): 136-149, January 1, 2008
Updated 23 November 2006
Mayr, P. (2006)
Constructing experimental
indicators for Open Access documents
E-LIS, 05 October 2006, in Research Evaluation, special issue on 'Web
indicators for Innovation Systems', Vol. 15, No. 2, 1 August 2006, 127-132
Author preprint, http://www.ib.hu-berlin.de/~mayr/arbeiten/mayr_RE06.pdf
(pdf 9pp)
Added 19 November 2006
Henneken, E. A., Kurtz, M. J., Warner, S., Ginsparg, P., Eichhorn, G.,
Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2006)
E-prints and Journal Articles in
Astronomy: a Productive Co-existence
ArXiv, Computer Science, cs.DL/0609126, 22 September 2006, in Learned
Publishing, Vol. 20, No. 1, January 2007, 16-22
Added 28 July 2008
Jacsó, P. (2006)
Open
Access to Scholarly Full Text Documents (pdf 8pp)
Online Information Review, 30(5) 2006, 587-594
Added 19 November 2006
Zhang, Y. (2006)
The Effect of Open Access on
Citation Impact: A Comparison Study Based on Web Citation Analysis
(abstract only)
Libri, September 2006 (Full text for
subscribers)
Added 09 March 2010
Kurtz, M. and Brody, T. (2006)
The impact
loss to authors and
research
e-Prints Soton, 12 July 2006, in Jacobs, N. (ed.), Open Access: Key
strategic, technical and economic aspects (Oxford, UK:
Chandos
Publishing)
Added 03 August 2006
Metcalfe, T. S. (2006)
The Citation
Impact of Digital Preprint Archives for Solar Physics Papers
Solar Physics, Vol. 239, No. 1-2, December 2006, pp. 549-553
also in ArXiv, Astrophysics, astro-ph/0607079, 5 July 2006 http://arxiv.org/abs/astro-ph/0607079
"Most astronomers now use the arXiv.org server (astro-ph) to distribute
preprints, but the solar physics community has an independent archive hosted at
Montana State University. For several samples of solar physics papers published
in 2003, I quantify the boost in citation rates for preprints posted to each of
these servers. I show that papers on the MSU archive typically have citation
rates 1.7 times higher than the average of similar papers that are not posted
as preprints, while those posted to astro-ph get 2.6 times the average. A
comparable boost is found for papers published in conference proceedings,
suggesting that the higher citation rates are not the result of self-selection
of above-average papers."
Added 03 August 2006
Henneken, E. A., Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C.,
Thompson, D., and Murray, S. S. (2006)
Effect of E-printing
on Citation Rates in Astronomy and Physics
Journal of Electronic Publishing, Vol. 9, No. 2, Summer 2006, also in
ArXiv, Computer Science, cs.DL/0604061, v2, 5 June 2006 http://arxiv.org/abs/cs/0604061
"It has been observed that papers that initially appear as arXiv e-prints get
cited more than papers that do not. Using the citation statistics from the
NASA-Smithsonian Astrophysics Data System, we confirm the findings from other
studies, we examine the average citation rate to e-printed papers in the
Astrophysical Journal, and we show that for a number of major astronomy and
physics journals the most important papers are submitted to the arXiv e-print
repository first."
Added 5 December 2012
Charlotte Tschider (2006)
Investigating
the public in the Public Library of Science: Gifting economics in the
Internet community
First Monday, vol 11, no 6, June 2006
Thanks to Chris Maloney @Klortho
Added 03 August 2006
Kousha, K. and Thelwall, M. (2006)
Google Scholar Citations
and Google Web/URL Citations: A Multi-Discipline Exploratory Analysis
E-LIS, 05 June 2006, also in Proceedings International Workshop on
Webometrics, Informetrics and Scientometrics & Seventh COLLNET Meeting,
Nancy (France), May 2006
"we built a sample of 1,650 articles from 108 Open Access (OA) journals
published in 2001 in four science and four social science disciplines. We
recorded the number of citations to the sample articles using several methods
based upon the ISI Web of Science, Google Scholar and the Google search engine
(Web/URL citations). For each discipline, we found significant correlations
between ISI citations and both Google Scholar and Google Web/URL citations;
with similar results when using total or average citations, and when comparing
within and across (most) journals."
Added 16 May
2006 Eysenbach, G. (2006)
Citation Advantage
of Open Access Articles
PLoS Biology, Volume 4, Issue 5, May 2006 Scintilla
Further evidence for the OA citation advantage, although quite critical of
other studies with which its findings broadly agree. This example is based on a
small, single journal sample (PNAS: Proceedings of the National Academy of
Sciences). Since PNAS offers authors the choice of paying to provide open
access to published papers and/or freely self-archiving, a 'Secondary analysis'
considers the relative impact of each type of OA, although the number of papers
involved is really too small to give this result the weight of the broader
findings. The paper is accompanied by two editorials, one in the publishing
journal, the other a self-published editorial by the author:
MacCallum, C. J. and Parthasarathy, H. (2006) Editorial: Citation
Advantage of Open Access Articles, PLoS Biology, Volume 4, Issue 5,
May 2006
Eysenbach, G. (2006) The Open Access
Advantage, Journal of Medical Internet Research, 2006;8(2):e8
Added 15 March 2006
Davis, P. M. and Fromerth, M. J. (2006)
Does the arXiv lead to higher
citations and reduced publisher downloads for mathematics articles? (pdf
12pp)
draft manuscript, ArXiv.org, cs.DL/0603056, 14 March 2006,
Scientometrics, Vol. 71, No. 2 (May 2007)
Added 16 March 2006
Harnad, S. (2006)
OA Impact Advantage = EA + (AA)
+ (QB) + QA + (CA) + UA
Author eprint, 14 March 2006, ECS EPrints repository, School of Electronics and
Computer Science, University of Southampton
Added 03 August 2006
Mueller, P. S., Murali, N. S., Cha, S. S., Erwin, P. J. and Ghosh, A. K. (2006)
The effect
of online status on the impact factors of general internal medicine
journals
Netherlands Journal of Medicine, 64 (2): 39-44, February 2006
"becoming available online as FUTON (full text on the Net) is associated with a
significant increase in journal impact factor."
Added 30 December 2005
Hajjem, C., Harnad, S. and Gingras, Y. (2005)
Ten-Year
Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases
Research Citation Impact (pdf 8pp)
IEEE Data Engineering Bulletin, Vol. 28 No. 4, December 2005
also Author eprint, 16 December 2005 http://eprints.ecs.soton.ac.uk/11688/
"In 2001, Lawrence found that articles in computer science that were openly
accessible (OA) on the Web were cited substantially more than those that were
not. We have since replicated this effect in physics. To further test its
cross-disciplinary generality, we used 1,307,038 articles published across 12
years (1992-2003) in 10 disciplines (Biology, Psychology, Sociology, Health,
Political Science, Economics, Education, Law, Business, Management). The
overall percentage of OA (relative to total OA + NOA) articles varies from
5%-16% (depending on discipline, year and country) and is slowly climbing
annually. Comparing OA and NOA articles in the same journal/year, OA articles
have consistently more citations, the advantage varying from 25%-250% by
discipline and year."
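The within-journal, within-year comparison described in the quotation can be sketched as a simple aggregation. The toy table below uses invented records purely to show the shape of the calculation; the study itself covered 1,307,038 articles.

```python
# Toy sketch of a within-journal/year OA citation-advantage calculation.
# Hypothetical records only, not data from the study.
import pandas as pd

articles = pd.DataFrame({
    "journal":   ["J1", "J1", "J1", "J1", "J2", "J2", "J2", "J2"],
    "year":      [2002, 2002, 2002, 2002, 2003, 2003, 2003, 2003],
    "oa":        [True, False, True, False, True, False, True, False],
    "citations": [12, 6, 9, 5, 20, 14, 16, 10],
})

# Mean citations per journal/year, split by OA status.
means = (articles
         .groupby(["journal", "year", "oa"])["citations"]
         .mean()
         .unstack("oa"))          # columns: False (non-OA), True (OA)

# Percentage advantage of OA over non-OA within the same journal and year.
means["oa_advantage_pct"] = 100 * (means[True] - means[False]) / means[False]
print(means)
```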
Added 30 December 2005
Hajjem, C., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2005)
Open Access to Research
Increases Citation Impact (.doc 12pp)
Author eprint, 16 December 2005, Technical Report, Institut des sciences
cognitives, Université du Québec à Montréal
Added 30 December 2005
Sahu, D.K., Gogtay, N.J. and Bavdekar, S.B. (2005)
Effect of open access on citation rates
for a small biomedical journal
Author eprint, December 1, 2005, in Fifth International Congress on Peer
Review and Biomedical Publication, Chicago, September 16-18, 2005
"We assessed the influence of OA on citations rates for a small,
multi-disciplinary journal which adopted OA without article submission or
article access fee. DESIGN The full text of articles published since 1990 were
made available online in 2001. Citations for these articles as retrieved using
Web of Science, SCOPUS, and Google Scholar were divided into two groups - the
pre-OA period (1990-2000) and the post-OA period (2001-2004). CONCLUSIONS Open
access was associated with increase in the number of citations received by the
articles. It also decreased the lag time between publication and the first
citation. For smaller biomedical journals, OA could be one of the means for
improving visibility and thus citation rates."
Added 27 September 2005
Zhao, D. (2005)
Challenges of scholarly
publications on the Web to the evaluation of science -- A comparison of author
visibility on the Web and in print journals (abstract only)
Information Processing and Management, 41:6, 1403-1418, December 2005
Compares author visibility between the Web and print journals as revealed from
citation analysis based on a search for the term "XML" or "eXtensible Markup
Language" using NEC Research Institute's CiteSeer, the entire ISI Science
Citation Index (SCI) database, and journals indexed and classified in SCI as
representing computer science research. The main finding: "The author ranking
by number of citations that resulted from CiteSeer data is highly correlated
with that obtained from SCI." That is, the study compares Web visibility with
journal visibility rather than OA with non-OA impact, and finds that authors,
notably the top authors, are self-archiving and publishing papers in both places.
Added 11 February 2008
Coats, A. J. S. (2005)
Top of the charts:
download versus citations in the International Journal of Cardiology
(full-text requires subscription; otherwise abstract only)
International Journal of Cardiology, Volume 105, Issue 2, 2 November
2005, 123-125, available online 7 October 2005
From the abstract: "We have recorded the 10 top cited articles over a 12-month
period and compared them to the 10 most popular articles being downloaded over
the same time period. The citation-based listing included basic and applied,
observational and interventional original research reports. For downloaded
articles, which have shown a dramatic increase for the International Journal of
Cardiology from 48,000 in 2002 to 120,000 in 2003 to 200,000 in 2004, the most
popular articles over the same period are very different and are dominated by
up-to-date reviews of either cutting-edge topics (such as the potential of stem
cells) or of the management of rare or unusual conditions. There is no overlap
between the two lists despite covering exactly the same 12-month period and
using measures of peer esteem. Perhaps the time has come to look at the usage
of articles rather than, or in addition to, their referencing."
Added 13 July 2005
Adams, J. (2005)
Early citation counts
correlate with accumulated impact (abstract only)
Scientometrics, 63 (3): 567-581, June 2005
Working towards earlier prediction of impact. This paper is not OA and, although
it has only just appeared, was written before Brody et al. (2005) showed a
correlation that predicts impact from even earlier data, i.e. download data for
OA papers, before any citations accrue.
Added 26 September 2005
Moed, H. F. (2005)
Statistical
Relationships Between Downloads and Citations at the Level of Individual
Documents Within a Single Journal (abstract only)
Journal of the American Society for Information Science and Technology,
56(10): 1088-1097, published online 31 May 2005
"Statistical relationships between downloads from ScienceDirect of documents in
Elsevier's electronic journal Tetrahedron Letters and citations to these
documents recorded in journals processed by the Institute for Scientific Information (ISI) for the Science Citation
Index (SCI) are examined. ... Findings suggest that initial downloads and
citations relate to distinct phases in the process of collecting and processing
relevant scientific information that eventually leads to the publication of a
journal article." Does not investigate open access sources. Notes the need for
caution in drawing conclusions on the frequency of paper downloads from formal
citation patterns, and vice versa.
Updated 05 October 2005
Vaughan, L. and Shaw, D. (2005)
Web citation data for impact
assessment: A comparison of four science disciplines (abstract only)
Journal of the American Society for Information Science and Technology,
Vol. 56, No. 10, 1075 - 1087, published online 27 May 2005
appears to be an expansion of Can Web Citations
be a Measure of Impact? An Investigation of Journals in the Life Sciences
(abstract only)
ASIST 2004: Proceedings of the 67th ASIS&T Annual Meeting, Vol. 41
(Medford, USA: Information Today), pp. 516-526
Brody, T., Harnad, S. and Carr, L. (2005)
Earlier Web Usage
Statistics as Predictors of Later Citation Impact
Author eprint, 18 May 2005, University of Southampton, School of Electronics
and Computer Science, Journal of the American Society for Information
Science and Technology, Volume 57, Issue 8, 2006, 1060-1072 (abstract)
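The core idea, correlating early download counts with later citation counts for the same articles, can be illustrated with a short sketch. The numbers below are invented placeholders, not data from the paper or from Citebase.

```python
# Illustrative correlation between early downloads and later citations.
# Invented placeholder values, not data from the study or Citebase.
from scipy.stats import spearmanr

early_downloads = [120, 45, 300, 10, 80, 220, 15, 60, 500, 35]   # e.g. first 6 months
later_citations = [14, 3, 25, 1, 6, 18, 2, 5, 40, 4]             # e.g. after 2 years

rho, p_value = spearmanr(early_downloads, later_citations)
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.3f})")
```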
Added 19 May 2005 Wren,
J. D. (2005)
Open access
and openly accessible: a study of scientific publications shared via the
internet
BMJ, 330:1128, 12 April 2005 Scintilla
Added 13 April 2005
Belew, R. (2005)
Scientific impact quantity and
quality: Analysis of two sources of bibliographic data (pdf 12pp)
Arxiv.org, cs.IR/0504036, 11 April 2005
Added 28 July 2008 De
Groote, S. L., Shultz, M. and Doranski, M. (2005)
Online
journals' impact on the citation patterns of medical faculty
J Med Libr Assoc., 93(2): 223-228, April 2005
From the conclusion: "It is possible that electronic access to information
(i.e., online databases) has had a positive impact on the number of articles
faculty will cite. Results of this study suggest, at this point, that faculty
are still accessing the print-only collection, at least for research purposes,
and are therefore not sacrificing quality for convenience."
Added 03 August 2006
Metcalfe, T. S. (2005)
The Rise and Citation Impact of
astro-ph in Major Journals
ArXiv, Astrophysics, astro-ph/0503519, 23 March 2005
"I describe a simple method to determine the adoption rate and citation impact
of astro-ph over time for any journal using NASA's Astrophysics Data System
(ADS). I use the ADS to document the rise in the adoption of astro-ph for three
major astronomy journals, and to conduct a broad survey of the citation impact
of astro-ph in 13 different journals. I find that the factor of two boost in
citations for astro-ph papers is a common feature across most of the major
astronomy journals."
Updated 13 April 2005
Ongoing studies Hajjem, C. (2004-05)
Cover page for the
range of studies highlighted below, Laboratoire de recherche en Sciences
Cognitives, UQAM. (Text in French but graphs "self-explanatory"; see this comment
for elaboration)
Updated 26 September
2005 Bollen, J., Van de Sompel, H., Smith, J. and Luce, R. (2005)
Toward alternative metrics of
journal impact: A comparison of download and citation data (pdf 34pp)
Arxiv.org, cs.DL/0503007, 03 March 2005, in Information Processing and
Management, 41(6): 1419-1440, December 2005
Added 5 January 2005
Ongoing study Brody, T., et al.
Citation Impact of Open Access
Articles vs. Articles Available Only Through Subscription ("Toll-Access")
with downloadable graphs of '% Articles OA' and '% OA Advantage' by discipline
and sub-discipline
Updated 31 January 2005
Schwarz, G. and Kennicutt Jr., R. C. (2004)
Demographic and Citation Trends
in Astrophysical Journal Papers and Preprints (pdf 14pp)
Arxiv.org, astro-ph/0411275, 10 November 2004, Bulletin of the American
Astronomical Society, Vol. 36, 1654-1663
See also a note from AAS Pub Board meeting, Tucson, November 3-4 2003
"Greg Schwarz (from the ApJ editorial office) reported some work he's doing
tracking citation rates of papers published in the ApJ based on whether they
were posted on astro-ph or not: ApJ papers that were also on astro-ph have a
citation rate that is _twice_ that of papers not on the preprint server"
http://listserv.nd.edu/cgi-bin/wa?A2=ind0311&L=pamnet&D=1&O=D&P=1632
Added 03 August 2006
Havemann, F. (2004)
Eprints in der
wissenschaftlichen Kommunikation (Eprints in scientific communication)
Author eprint, 26 October 2004, presented at the Institute of Library Science,
Humboldt University, Berlin, June 1, 2004
"the use of eprints can significantly accelerate the scientific communication.
This was demonstrated by me with a small sample of articles in theoretical High
Energy Physics published 1998 and 1999 in Physical Review D. Typically the
eprints in this sample are available eight months before the printed issue is
published. Three quarters of them are cited in eprints authored by other
researchers before the journal issue appears (among them all highly cited
eprints)."
Brody, T. (2004)
Citation Analysis in the Open
Access World
Author eprint, October 4, 2004, in Interactive Media International
Added 9 November 2004
McVeigh, M. E. (2004)
Open
Access Journals in the ISI Citation Databases: Analysis of Impact Factors and
Citation Patterns
Thomson Scientific, October 2004
Added 29
September 2004 Antelman, K. (2004)
Do
Open-Access Articles Have a Greater Research Impact?
College and Research Libraries, 65(5):372-382, September 2004
also Author eprint, E-LIS, 29 September 2004, http://eprints.rclis.org/archive/00002309/
Added 17 May
2010 Comment on this paper:
Davis, P., Do Open-Access Articles Really have a Greater Research Impact? Letter
to the editor, College & Research Libraries, Vol.
67, No. 2, March 2006: "The study of citation behavior is complex and
involves multiple confounding and interacting variables.
Methodologically, it is very difficult to distinguish whether Open
Access is an explanatory cause of increased access, or whether it is
merely an artifact of other causal explanations such as article
duplication or self-promotion. Do Open Access articles really have a
greater research impact, as Antelman suggests? Yes, but Open Access may
not be the cause."
Author's response: "While I intentionally
phrased my conclusion as an association, rather than a causation
(open-access articles have a greater research impact than articles
that are not freely available), there clearly is an implied causation
and I should have been more explicit that the data do not support that.
The article was very much a product of its time, however, when there
was little solid data that there even was an association between open
access and increased citations. ... Since I did the study in
C&RL,
I have collected additional data that indicate that quality bias is
real and significant, at least in the social sciences."
Davis, P.: In 2005, Wren conducted a massive automated study of the availability of
author reprints on the public web. He reported two main conclusions:
that articles available freely online yielded more citations; and that
there was a high degree of association between high-prestige journals
and frequency of author reprints. Journals with high Impact Factors
(New England Journal of Medicine, Nature, Science, and Cell) were
associated with a higher degree of author republishing than
lower-impact journals. Wren went further to discuss possible causes of this
difference, briefly describing a 'trophy effect' (the desire for researchers to
display their accomplishments), which would explain why high-impact publications
are more common online. This is consistent with Antelman's findings that the
greatest impact of open access is with the most-cited articles.
Antelman, K.: "Wren's study needs to be looked at carefully, however, because he did not look
at the source of the open access copies he found, so the extent of
trophy effect self-archiving cannot be assessed from his data. I also
collected some data on the source of open access copies from three of
the high-impact journals he looked at (NEJM, Science and Nature) and,
while many articles from those journals are freely available online,
many or most are not posted by the authors themselves, in particular,
NEJM where only 12% of the open access articles were posted by authors
or their institutions."
Updated 5
January 2005 Harnad, S., Brody, T., Vallieres, F., Carr, L.,
Hitchcock, S., Gingras, Y., Oppenheim, C., Stamerjohanns, H. and Hilf, E.
(2004)
The Access/Impact Problem
and the Green and Gold Roads to Open Access
Author eprint, 15 September 2004, in Serials Review, Vol. 30, No. 4,
310-314 (free access to published
version during 2005)
Shorter version: The green and
the gold roads to Open Access
Nature, Web Focus: access to the literature, May 17, 2004
Updated 26
September 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant,
C. S., Demleitner, M., Murray, S. S. (2004b)
The Effect
of Use and Access on Citations
Author eprint, September 2004, in Information Processing and Management,
41 (6): 1395-1402, December 2005
Perneger, T. V. (2004)
Relation
between online "hit counts" and subsequent citations: prospective study of
research papers in the BMJ
BMJ, 329:546-547, 4 September 2004 Scintilla
Prakasan, E. R. and Kalyane, V. L. (2004)
Citation analysis of LANL
High-Energy Physics E-Prints through Science Citation Index (1991-2002)
Author eprint, E-LIS, 26 August 2004
Added 13 April 2005
Murali, N. S., Murali, H. R., Auethavekiat, P., Erwin, P. J., Mandrekar, J. N.,
Manek, N. J. and Ghosh, A. K. (2004)
Impact
of FUTON and NAA Bias on Visibility of Research
Mayo Clinic Proceedings, Vol. 79, No. 8, 1001-1006, August 2004
Notes and comment: FUTON = full text on the Net; NAA = no abstract available
This is not an article on how Open Access increases impact but on how *Online*
Access increases impact. The effects are related, but one is a licensing
effect, not an OA effect.
Added 10 May 2007 Davis,
P. M. (2004)
For Electronic
Journals, Total Downloads Can Predict Number of Users
portal: Libraries and the Academy, Vol. 4, No. 3, July 2004, 379-392
Harnad, S. and Brody, T. (2004a)
Comparing
the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals
D-Lib Magazine, Vol. 10 No. 6, June 2004
Replicates the Lawrence effect -- OA increases impact -- in physics.
Pringle, J. (2004)
Do Open
Access Journals have Impact?
Nature, Web Focus: access to the literature, May 7, 2004
Testa, J. and McVeigh, M. E. (2004)
The
Impact of Open Access Journals: A Citation Study from Thomson ISI (pdf
17pp)
Author eprint, 14 April 2004
Kurtz, M. J. (2004)
Restrictive access
policies cut readership of electronic research journal articles by a factor of
two (pdf 2pp)
Harvard-Smithsonian Centre for Astrophysics, Cambridge, MA
Poster presentation at National Policies on Open Access (OA) Provision for
University Research Output: an International meeting, Southampton, 19
February 2004
Brody, T., Stamerjohanns, H., Harnad, S., Gingras, Y.
and Oppenheim, C. (2004)
The Effect of
Open Access on Citation Impact (pdf 1pp)
Poster presentation at National Policies on Open Access (OA) Provision for
University Research Output: an International meeting, Southampton, 19
February 2004
Updated 5
January 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C.
S., Demleitner, M. and Murray, S. S. (2004a)
Worldwide
Use and Impact of the Nasa Astrophysics Data System Digital Library
Author eprint, January 28, 2004, in Journal of the American Society for
Information Science and Technology, Vol. 56, No. 1, 36-45, published online
20 September 2004
Hitchcock, S., Brody, T., Gutteridge, C., Carr, L. and Harnad, S. (2003b)
The Impact of
OAI-based Search on Access to Research Journal Papers
Author eprint, 15 September 2003, in Serials, Vol. 16, No. 3, November
2003, 255-260
Hitchcock, S., Woukeu, A., Brody, T., Carr, L.,
Hall, W. and Harnad, S. (2003a)
Evaluating
Citebase, an open access Web-based citation-ranked search and impact discovery
service
Technical Report ECSTR-IAM03-005, School of Electronics and Computer Science,
University of Southampton, July 2003
Added 28 October 2004
Bollen, J., Vemulapalli, S. S., Xu, W. and Luce, R. (2003)
Usage Analysis
for the Identification of Research Trends in Digital Libraries
D-Lib Magazine, Vol. 9, No. 5, May 2003
Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M.,
Murray, S. S., Martimbeau, N. and Elwell, B. (2003b)
The NASA
Astrophysics Data System: Sociology, Bibliometrics, and Impact
Author eprint, March 2003, Journal of the American Society for Information
Science and Technology, submitted for publication
Updated 23
February 2005 Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant,
C. S., Demleitner, M., Murray, S. S., Martimbeau, N. and Elwell, B. (2003a)
The
Bibliometric Properties of Article Readership Information
Author eprint, March 2003, in Journal of the American Society for
Information Science and Technology, 56 (2): 111-128, January 15, 2005
Added 5 December 2012
Brown, C. (2003)
The
role of electronic preprints in chemical communication: Analysis of
citation, usage, and acceptance in the journal literature
Journal of the American Society for Information Science,
Vol. 54, No. 5, 362-371, published online 6 Feb 2003
This study characterizes the usage and acceptance of electronic preprints
(e-prints) in the literature of chemistry. Survey of authors of
e-prints appearing in the Chemistry Preprint Server (CPS) at
http://preprints.chemweb.com indicates use of the CPS as a convenient
vehicle for dissemination of research findings and for receipt of
feedback before submitting to a peer-reviewed journal. Reception of CPS
e-prints by editors of top chemistry journals is very poor. Only 6% of
editors responding allow publication of articles that have previously
appeared as e-prints. Consequently, it was not surprising to discover
that citation analysis yielded no citations to CPS e-prints in the
traditional literature of chemistry.
Added 17 December 2007
Drenth, J. P. H. (2003)
More reprint requests, more
citations? (subscriber access to full text)
Scientometrics, Vol. 56, No. 2, February 2003, 283-286, revised version
published online August 2006
From the abstract: "This study aims to correlate the number of reprint requests
from a 10-year-sample of articles with the number of citations. ... Articles
that received most reprint requests are cited more often."
Darmoni, S. J., et al. (2002)
Reading
factor: a new bibliometric criterion for managing digital libraries
Journal of the Medical Library Association, Vol. 90, No. 3, July 2002
Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D. M.,
Bohlen, E. H. and Murray, S. S. (2002)
The NASA
Astrophysics Data System: Obsolescence of Reads and Cites (pdf 8pp)
Library and Information Services in Astronomy IV, edited by B. Corbin,
E. Bryson, and M. Wolf, July 2002
Bollen, J. and Luce, R. (2002)
Evaluation of
Digital Library Impact and User Communities by Analysis of Usage Patterns
D-Lib Magazine, Vol. 8, No. 6, June 2002
Added 5 December 2012
Curti, M., Pistotti, V., Gabutti, G. and Klersy, C. (2001)
Impact
factor and electronic versions of biomedical scientific journals
Haematologica, Vol. 86, No. 10, Oct 2001, 1015-1020
From the Abstract: The availability of journals (table of contents (TOC), abstracts, full text and free full text)
on Internet, in years 1995-2000, was assessed between December 2000 and January 2001. The
first 20 top-journals from 8 subject categories were included. Changes in impact factor over
time and association with Internet availability were modeled. RESULTS: Overall, 118/139
journals (85%) had their TOC on the Internet, of these 107 (77%) had abstracts, 97 (70%)
had full text and 33 (24%) free full text. The median impact factor for all journals was
1.65, 2.08, 2.10, 2.21 and 2.35 for the years from 1995 to 1999, respectively. This increase
was statistically significant, with differences among subject categories. The presence of
TOC, abstracts and full text on the Internet was also significantly associated with higher
impact factor, after accounting for time and subject category. INTERPRETATION AND CONCLUSIONS:
The impact factor has been used for assessing the quality of journals. We identified a new
limitation of this indicator: the impact factor seems to be related to the amount of circulation
of information through Internet. This could be a temporary limitation, associated with diffusion
of journals on, and spread of Internet.
Lawrence, S. (2001)
Free
online availability substantially increases a paper's impact
Nature, 31 May 2001
see also Online or
invisible, an extended version of the Nature article self-archived
by the author
This paper reported the first major findings on the impact effect documented in
this bibliography and remains its most cited paper. Note, its results concern
online access rather than open access: at the time the focus was on the
transition from print to electronic publication, with Lawrence quantifying the
improved access that resulted, based on a sample of c.120k computer science
papers from 1,494 'venues'.
Added 04 October 2005
Anderson, K., Sack, J., Krauss, L. and O'Keefe, L. (2001)
Publishing
Online-Only Peer-Reviewed Biomedical Literature: Three Years of Citation,
Author Perception, and Usage Experience
Journal of Electronic Publishing, Vol. 6, No. 3, March 2001
One of the first studies of the citation effect of online against
offline publication, rather than of open access against non-OA. Provides
data for one journal and a small number of articles over a three-year period.
This paper was added to the bibliography following this correspondence.
Added 5 December 2012
Brown, C. (2000)
The
E-volution of preprints in the scholarly communication of physicists
and astronomers
Journal of the American
Society for Information Science, Vol. 52 No. 3, 187-200,
published online 30 Nov 2000
From the Abstract: To learn how e-prints are cited, used, and accepted in
the literature of physics and astronomy, the philosophies, policies,
and practices of top-tier physics and astronomy journals regarding
e-prints from the Los Alamos e-print archive, arXiv.org, were examined.
Citation analysis illustrated e-prints were cited with increasing
frequency by a variety of journals in a wide range of physics and
astronomy fields from 1998 to 1999. Even though the policies concerning
e-print citation and publication were inconsistent, the number of
citations (35,928) and citation rates (34.1%) to 12 arXiv.org archives
were found to be large and increasing.
Updated 04 October 2005
Odlyzko, A. M. (2000)
The rapid evolution of
scholarly communication
PEAK 2000: Economics and Usage of Digital Library Collections
conference, Ann Arbor, MI, March 2000.
Also in Learned Publishing, 15(1), 7-19, January 2002.
Author eprint http://www.dtc.umn.edu/~odlyzko/doc/rapid.evolution.pdf
Notes the growing usage of information in electronic form (cf. print forms)
and of journal papers from non-journal sites (e.g. eprints), and presents
evidence that usage increases when access is more convenient.
Youngen, G. K. (1998)
Citation
Patterns to Electronic Preprints in the Astronomy and Astrophysics
Literature
Library and Information Services in Astronomy III, ASP Conference
Series, Vol. 153, 1998
see also
Citation
Patterns to Traditional and Electronic Preprints in the Published
Literature
College & Research Libraries, September 1998
Youngen, G. (1998)
Citation Patterns
Of The Physics Preprint Literature With Special Emphasis On The Preprints
Available Electronically
Author eprint, UIUC Physics and Astronomy library, c. 5 November 1998,
presented at ACRL/STS on 6/29/97
Correlation Generator Generates a graph (or table) of the correlation between citation impact and usage impact from the Citebase database http://citebase.eprints.org/analysis/correlation.php User service, free
see Brody, T. and Harnad, S. 2005 (in prep.)
Citeseer "Scientific literature digital library" http://citeseer.ist.psu.edu/ User service, free
Elsevier Scopus Bibliographic database covering 13,450 peer-reviewed
titles http://www.scopus.com/ User service
see
Added 28 July 2008 Jacsó,
P. (2007) Scopus
(2008 Winter Release), Gale, Reference Reviews, Péter's Digital Reference
Shelf, November 2007
Added 15 March 2006
Burnham, J. F. (2006) Scopus
database: a review, Biomedical Digital Libraries, 3:1, 8 March 2006
Added 15 March 2006 Dess,
H. M. (2006) Scopus, Issues in
Science and Technology Librarianship, Winter 2006
Added 15 May 2006 Quint, B.
(2006) Elsevier's Scopus
Introduces Citation Tracker: Challenge to Thomson ISI's Web of Science?,
Newsbreaks, January 23, 2006
Added 13 September 2007
Goodman, D. and Deis, L. (2006) Update
on Scopus, The Charleston Advisor, Vol. 7, No. 3, January 2006,
42-43
Jacsó, P. (2004) Scopus,
Thomson Gale, September 2004
see also Comparative reviews
Google Scholar Find articles from academic publishers, preprint
repositories and universities, as well as scholarly articles across the web
(presents citations as separate results) http://scholar.google.com/ User service, free
see
Added 28 July 2008 Harzing, A.-W. and van der Wal, R. (2008) Google Scholar as a
new source for citation analysis, Ethics
in Science and Environmental Politics, Vol. 8, No. 1, June 03, 2008,
61-73
Added 28 July 2008 Jacsó,
P. (2008) The
pros and cons of computing the h-index using Google Scholar. Online Information Review, 32(3) 2008,
437-452
Added 26 May 2008 Meier, J.
J. and Conkling, T. W. (2008) Google Scholar's Coverage
of the Engineering Literature: An Empirical Study, Journal of Academic
Librarianship, Vol. 34, No. 3, May 2008, 196-201 (full text requires
subscription; abstract only)
Added 28 July 2008 Jacsó,
P. (2008) Google
Scholar, Online, Mar/Apr 2008,
53-54
Added 13 September 2007
Harzing, A.-W. (2007) Reflections
on Google Scholar, Harzing.com, fifth version, 6 September 2007
about the citation analysis software Publish or Perish and its relation with
Google Scholar
Added 13 September 2007
Quint, B. (2007) Changes at
Google Scholar: A Conversation With Anurag Acharya, NewsBreaks,
August 27, 2007
rare public interview with the low-profile 'designer and missionary' behind
Google Scholar
Added 22 August 2007 Mayr,
P. and Walter, A.-K. (2007) An
exploratory study of Google Scholar, arXiv.org > cs >
arXiv:0707.3575v1 [cs.DL], July 24, 2007, in Online Information Review,
Vol. 31, No. 6 (2007), 814-830. Author preprint also available from http://www.ib.hu-berlin.de/~mayr/arbeiten/OIR-Mayr-Walter-2007.pdf
Added 10 May 2007 Robinson,
M. L. and Wusteman, J. (2007) Putting Google
Scholar to the test: a preliminary study, author eprint, also in
Program: Electronic Library and Information Systems, Vol. 41, Issue 1,
February 2007, 71-80
Added 15 May 2006 Sadeh, T.
(2006) Google Scholar
Versus Metasearch Systems, HEP Libraries Webzine, issue 12, March
2006
"thoughtful and informative ... altogether the best overview of Google Scholar,
other large federated search systems such as Scirus, and library-based
metasearch tools I've seen." Reviewed by Tennant, R., Current Cites,
January 2006 issue
Added 15 March 2006
Burright, M. (2006) Google Scholar -- Science
& Technology, Issues in Science and Technology Librarianship,
Winter 2006
Added 28 February 2006
Noruzi, A. (2005) Google
Scholar: the new generation of citation indexes (pdf 11pp), E-LIS, 11
February 2006, in LIBRI 55(4): 170-180
Jacsó, P. (2005) Google
Scholar and The Scientist, commenting on his interview in Perkel, J., The Future of Citation
Analysis (abstract only), The Scientist, October 24, 2005
Jacsó, P. (2005) Google
Scholar (Redux), Thomson Gale, June 2005
Myhill, M. (2005) Google Scholar,
Charleston Advisor, Vol. 6, No. 4, April 2005
Added 22 August 2007
Giustini, D. and Barsky, E. (2005) A look at Google
Scholar, PubMed, and Scirus: comparisons and recommendations, Journal of
the Canadian Health Libraries Association/Journal de l'Association des
bibliothèques de la santé du Canada (JCHLA / JABSC) 26: 85-89 (2005) (pdf
5pp)
Jacsó, P. (2004) Google
Scholar Beta, Thomson Gale, December 2004
see also Comparative reviews
ISI Web of Science Cited reference searching of 8,700 high impact
research journals http://www.isinet.com/products/citation/wos/
User service
see
Added 28 July 2008 Jacsó,
P. (2007) Web of
Science, Gale, Reference Reviews, Péter's Digital Reference Shelf, January
2007
Jacsó, P. (2004) Web of
Science Citation Indexes, Thomson Gale, August 2004
see also Comparative reviews
Added 11 December 2006 Rexa.Info Covers the computer science research literature. Rexa is "a sibling to CiteSeer, Google Scholar, Academic.live.com, the ACM Portal. Its chief enhancement is that Rexa knows about more first-class, de-duplicated, cross-referenced object types: not only papers and their citation links, but also people, grants, topics" http://rexa.info/ User service, free (login required)
Added 13 April 2006
Windows Live Search Academic Beta version. Indexes content related to
computer science, physics, electrical engineering, and related subject areas,
with more than 6 million records from approximately 4300 journals, 2000
conferences and ArXiv.org. In collaboration with Citeseer http://academic.live.com/ User service, free
see
Added 26 May 2008 Nadella,
S. (2008) Book
search winding down, Live Search, The official blog of the Live Search team
at Microsoft, May 23, 2008
"Live Search Books and Live Search Academic projects ... will be taken down
next week. Books and scholarly publications will continue to be integrated into
our Search results, but not through separate indexes."
Added 28 July 2008 Jacsó,
P. (2008) Live
Search Academic, Gale, Reference Reviews, Péter's Digital Reference Shelf,
April 2008
Added 28 July 2008 Jacsó,
P. (2006) Windows
Live Academic, Online, Sep/Oct
2006, 59-60
Added 15 May 2006 Quint, B.
(2006) Windows
Live Academic Search: The Details, Newsbreaks, April 17, 2006
Added 13 April 2006
Sherman, C. (2006) Microsoft
Launches Windows Live Academic Search, SearchEngineWatch.com, April 12,
2006
Added 15 May 2006
Citations in Economics Not intended for direct user access; instead it is made available to RePEc services such as Socionet, EconPapers and IDEAS. Uses
Citeseer software http://citec.repec.org/
Data service, free
Ranks working paper series and journals in Economics http://citec.repec.org/s/
see
Barrueco Cruz, J. M. and Krichel, T. (2004) Building an autonomous
citation index for grey literature: the economics working papers case (pdf
12pp), E-LIS, 01 February 2005, also in Proceedings GL6: Sixth International
Conference on Grey Literature, New York, December 2004
CrossRef Forward linking service tool allows CrossRef member
publishers to display cited-by links in their primary content, Data service
CrossRef and
Atypon announce forward linking service (press release) June 8, 2004
Institute of Physics becomes first journals publisher to implement 'cited-by'
links using CrossRef's Forward Linking service: Time travel with IOP journals (IOP press
release) 14 March, 2005
Forthcoming ISI Web Citation Index User
service
see
Added 15 May 2006 Martello,
A. (2006) Selection
of Content for the Web Citation Index: Institutional Repositories and
Subject-Specific Archives, Thomson.com, undated
Pringle, J. (2005) Partnering
helps institutional repositories thrive, KnowledgeLink Newsletter,
February 2005
Citeseer's
replacement? List server mailing, 18 March 2004
Quint, B. (2004) Thomson ISI
to Track Web-Based Scholarship with NEC's CiteSeer, Information Today
Newsbreaks, March 1, 2004
Added 13 January 2009
Norris, M., Oppenheim, C. and Rowland, F. (2008)
Finding open access
articles using Google, Google Scholar, OAIster and OpenDOAR
Online Information Review, Vol. 32, No. 6, 2008, 709-715
also available from Loughborough University Institutional Repository,
2009-01-12 http://hdl.handle.net/2134/4084
From the abstract: "Google, Google Scholar, OAIster and OpenDOAR were used to
try to locate OA versions of peer reviewed journal articles drawn from three
subjects (ecology, economics, and sociology). The paper shows the relative
effectiveness of the search tools in these three subjects. The results indicate
that those wanting to find OA articles in these subjects, for the moment at
least, should use the general search engines Google and Google Scholar first
rather than OpenDOAR or OAIster."
Added 28 July 2008
Jacsó, P. (2008)
The
Plausibility of Computing the H-index of Scholarly Productivity and Impact
Using Reference Enhanced Databases
Online Information Review, 32(2) 2008,
266-283
"aims to provide a general overview of the three largest,
cited-reference-enhanced, multidisciplinary databases (Google Scholar, Scopus,
and Web of Science) for determining the h-index. The practical aspects of
determining the h-index also need scrutiny, because some content and software
characteristics of reference-enhanced databases can strongly influence the
h-index values."
Added 26 May 2008 Meho, L. I. and Rogers, Y. (2008) Citation Counting, Citation Ranking, and h-Index of Human-Computer Interaction Researchers: A Comparison between Scopus and Web of Science, E-LIS, 10 March 2008, in Journal of the American Society for Information Science and Technology, 59 (11): 1711-1726, September 2008
Added 26 May 2008 Kloda, L. A. (2007) Use Google Scholar, Scopus and Web of Science for Comprehensive Citation Tracking, Evidence Based Library and Information Practice, 2(3): 87-90, 2007, also in E-LIS, 21 September 2007 http://eprints.rclis.org/archive/00011437/
Added 26 May 2008 Schroeder, R. (2007) Pointing Users Toward Citation Searching: Using Google Scholar and Web of Science, portal: Libraries and the Academy, Vol. 7, No. 2, April 2007, 243-248 (full text requires subscription)
Added 13 September 2007 Goodman, D. and Deis, L. (2007) Update on Scopus and Web of Science, The Charleston Advisor, Vol. 8, No. 3, January 2007, 15-18
Added 10 May 2007 Meho, L. I. and Yang, K. (2006) A New Era in Citation and Bibliometric Analyses: Web of Science, Scopus, and Google Scholar, arXiv.org, Computer Science, cs/0612132, 23 Dec 2006, published as Impact of data sources on citation counts and rankings of LIS faculty: Web of science versus scopus and google scholar, in Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2105-2125
Added 11 December 2006 Fingerman, S. (2006) Web of Science and Scopus: Current Features and Capabilities, Issues in Science and Technology Librarianship, Fall, 2006
Added 17 January 2007
Neuhaus, C. and Daniel, H.-D. (2006) Data
sources for performing citation analysis: An overview, ETH E-Collection,
June 30, 2006, Journal of Documentation, accepted for publication
Reports the limitations of Thomson Scientific's citation indexes and reviews
the characteristics of the citation-enhanced databases Chemical Abstracts,
Google Scholar and Scopus.
Bakkalbasi, N., Bauer, K., Glover, J. and Wang, L. (2006) Three options for citation tracking: Google Scholar, Scopus and Web of Science, Biomedical Digital Libraries, June 29, 2006
Added 8 March 2007
Bosman, J., van Mourik, I., Rasch, M., Sieverts, E. and Verhoeff, H. (2006) Scopus
reviewed and compared, Igitur repository, Utrecht University, June 2006
The coverage and functionality of the citation database Scopus, including
comparisons with Web of Science and Google Scholar
Wenzel, E. (2006) Google
Scholar beta, ZDNet, May 2, 2006
Brief comparison of Google Scholar and Microsoft Live Academic Search
Bailey, C. W. Jr (2006) A
Simple Search Hit Comparison for Google Scholar, OAIster, and Windows Live
Academic Search, Digital Koans, author blog, April 13, 2006
A simple but revealing experiment: "It should be clear that a sample of one
search term is a very crude measure".
Pauly, D. and Stergiou, K. I. (2005) Equivalence of results from two citation analyses: Thomson ISI's Citation Index and Google's Scholar service (pdf 3pp), Ethics in Science and Environmental Politics, 22 December 2005, 33-35
Added 23 November 2006 Jacsó, P. (2005) Comparison and analysis of the citedness scores in Web of Science and Google Scholar (pdf 10pp), Digital Libraries: Implementing Strategies and Sharing Experiences, Lecture Notes In Computer Science, 3815: 360-369, 2005, Proceedings of the 8th International Conference on Asian Digital Libraries, ICADL 2005, Bangkok, Thailand, December 12-15, 2005
Roth, D. L. (2005) The emergence of competitors to the Science Citation Index and the Web of Science (pdf 6pp), Current Science Online, Vol. 89, No. 9, 10 November 2005
Jacsó, P. (2005) As we may search — Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases (pdf 11pp), Current Science Online, Vol. 89, No. 9, 10 November 2005
Bauer, K. and Bakkalbasi, N. (2005) An Examination
of Citation Counts in a New Scholarly Communication Environment, D-Lib
Magazine, 11(9), September 2005.
Compares citation counts provided by Web of Science, Scopus, and Google
Scholar.
LaGuardia, C. (2005) Scopus vs. Web of Science, Library Journal, 130(1); 40, 42, January 15, 2005
Deis, L. F. and Goodman, D. (2005) Web of Science (2004 version) and Scopus, Charleston Advisor, Vol. 6, No. 3, January 2005
"Research impact translates into money: employment, salary, tenure money, as well as research-funding money: (1) RAE rank correlates with substantial top-sliced funding, (2) it also correlates highly (0.91) with citation counts, and (3) self-archiving increases citation counts by 50-250+%. Do you really think that any researcher who is *aware* of those three correlations is being rational if he doesn't self-archive?" Stevan Harnad
Added 4 March 2013
Ulf Kronman (2013)
Managing
your assets in the publication economy
Confero: Essays on Education, Philosophy and Politics,
1-35, 17 Jan 2013
info:doi/10.3384/confero13v1130117
Abstract: The issue this article aims to address is the fact that
publications may nowadays be used to assess impact and quality of
research in ways academics may not be fully aware of. During recent
years, scholarly publications have gained in importance, not primarily
as the traditional vehicle for the dissemination of new scientific
findings, but as a foundation for assessing the production and impact
of organizations, research groups and individual researchers. This
means that publications as artefacts per se are starting to play a new
important role in the scientific community and that researchers need to
be aware of how publication and citation counts are being used to
assess their research and the outreach, impact and reputation of their
mother organization. University rankings, for instance, often have some
parameters based on the publishing of the ranked institution. This
article is thus not about scientific writing as such; it focuses on
what happens to your publication after the publishing has taken place
and on aspects to take into account while planning the publishing of
your article, report or book.
Added 18 August 2011
Spörrle, M. and Tumasjan, A. (2011)
Using
Search Engine Count Estimates as Indicators of Academic Impact: A
Web-based Replication of Haggbloom et al.'s (2002) Study
The Open Psychology Journal, 4, 12-18, 13 July 2011
DOI: 10.2174/1874350101104010012
Abstract: Using a complex set of quantitative and qualitative
indicators of scientific importance, Haggbloom et al. compiled a
ranking of the most eminent psychologists of the 20th century. The
present study set out to replicate this rank-ordered list using simple
search engine count estimates (SECEs) obtained from three popular
internet search engines. In line with our expectations, our results
revealed a small, but significant relationship between SECEs and the
existing offline ranking when the query specified the scientists field
of research (i.e., psychology). Our results imply that SECEs may be
considered easy to apply indicators of a researchers impact.
Added 06 July 2011
Li, R. (2011)
Correlation
of Impact Measures of Institutional Repositories and PBRF Ranking
ResearchArchive @ Victoria, 18 May 2011
Master's thesis. From the Abstract: This study examines the correlation
between the website impact factor of the institutional repositories (IR) of all
eight universities in New Zealand and the Performance Based Research
Fund (PBRF) quality score. The research also studied different web
ranking tools and tried to find out whether these tools can be used to
measure the quality of IR documents. The research used Yahoo Site
Explorer to collect information on inlinks and also used other tools to
collect webpage rankings. The findings of this research are that
there is a small correlation between the IR website impact factor and
the PBRF quality score, and that page ranking is not a good tool for
examining the quality of IR documents as a whole.
Added 18 October 2010
Suber, P. (2008)
Thinking
about prestige, quality, and open access
SPARC Open Access Newsletter, issue #125, September 2, 2008
Brief extracts. Here are a dozen thoughts or theses about prestige and OA. I start with the rough notion that if journal quality is real excellence, then journal prestige is reputed excellence. (1) Universities reward faculty who publish in high-prestige journals, and faculty are strongly motivated to do so. If universities wanted to create this incentive, they have succeeded. If journal prestige and journal quality can diverge, then universities and funders may be giving authors an incentive to aim only for prestige. If they wanted to create an incentive to put quality ahead of prestige, they haven't yet succeeded. (8) Universities tend to use journal prestige and impact as surrogates for quality. The excuses for doing so are getting thin.
Added 18 October 2010
Willinsky, J. (2010)
Open
access and academic reputation
NISCAIR Online Periodicals Repository (NOPR), Annals of Library and
Information Studies, 57 (3), Sep 2010, 296-302
Abstract:
Open access aims to make knowledge freely available to those who would
make use of it. High-profile open access journals, such as those
published by PLoS (Public Library of Science), have been able to
demonstrate the viability of this model for increasing an author's
reach and reputation within scholarly communication through the use of
such bibliographic tools as the Journal Impact Factor, conceived and
developed by Eugene Garfield. This article considers the various
approaches that authors, journals, and funding agencies are taking
toward open access, as well as its effect on reputation for authors
and, more widely, for journals and the research enterprise itself.
Added 6 September 2010
Li, J., Sanderson, M., Willett, P., Norris, M. and
Oppenheim, C. (2010)
Ranking
of Library and Information Science Researchers: Comparison of Data
Sources for Correlating Citation Data and Expert Judgments
Journal of Informetrics, 16
Jun 2010
Open
access provides scope for new citation-based metrics, but these would
have to be tested and validated against current, preferred methods of
assessment. This paper is not focussed on open access, but it shows how
such testing and validation might be performed.
From the Abstract: This
paper studies the correlations between peer review and citation
indicators when evaluating research quality in library and information
science (LIS). Forty two LIS experts provided judgments on a five-point
scale of the quality of research published by 101 scholars; the median
rankings resulting from these judgments were then correlated with h-,
g- and H-index values computed using three different sources of
citation data: Web of Science (WoS), Scopus and Google Scholar (GS).
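For readers unfamiliar with the author-level indicators compared in the study above, the following minimal Python sketch shows how h- and g-index values can be computed from a plain list of per-paper citation counts. The citation numbers are hypothetical and the sketch is not tied to WoS, Scopus or Google Scholar data.

def h_index(citations):
    """h-index: the largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """g-index: the largest g such that the top g papers together have >= g*g citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical citation counts for one author's papers (illustrative only).
papers = [42, 18, 12, 9, 7, 5, 3, 1, 0, 0]
print(h_index(papers))  # 5: five papers have at least 5 citations each
print(g_index(papers))  # 9: the top 9 papers together have 97 >= 81 citations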
Added 06 September 2010
Aguillo, I. F., Ortega, J. L., Fernández, M. and Utrilla, A. M. (2010)
Indicators
for a webometric ranking of open access repositories
Scientometrics, Vol. 82, No. 3, March 2010, 477-486, published
online: 6 February 2010
Added 09 Mar 2010
Harnad, S., Carr, L., Swan, A., Sale, A. and Bosc, H. (2009)
Maximizing
and Measuring Research Impact Through University and Research-Funder
Open-Access Self-Archiving Mandates
ECS EPrints, 08 Dec 2009, in Wissenschaftsmanagement,
15 (4), 36-41
Added 09 Mar 2010
Aguillo, I. (2009)
Measuring
the institution's footprint in the web
Library Hi Tech,
Vol. 27, No. 4, 2009, 540-556
DOI: 10.1108/073788309
Added 09 June 2010
Allen, L., Jones, C., Dolby, K., Lynn, D. and Walport, M. (2009)
Looking
for Landmarks: The Role of Expert Review and Bibliometric Analysis in
Evaluating Scientific Publication Outputs
PLoS ONE, 4(6): e5910, June 18, 2009 doi:10.1371/journal.pone.0005910
Added 09 Mar 2010
Corbyn, Z. (2009)
Hefce
backs off citations in favour of peer review in REF
Times Higher Education, 18 June 2009
Added 26 February 2009, updated 29
April 2009 Houghton, J., Rasmussen, B., Sheehan, P., Oppenheim,
C., Morris, A., Creaser, C., Greenwood, H., Summers, M. and Gourlay, A.
(2009)
Economic
implications of alternative scholarly publishing models: Exploring the costs
and benefits
JISC, 27 January 2009
Added 10 November 2008
Oppenheim, C. (2008)
Out with the old and
in with the new: The RAE, bibliometrics and the new REF (first page pdf;
full text requires subscription)
Journal of Librarianship and Information Science, 40 (3): 147-149,
September 2008
Added 24 November 2008
Cho, S.-R. (2008)
New evaluation indexes
for articles and authors' academic achievements based on Open Access
Resources (full text requires subscription; abstract only)
Scientometrics, Vol. 77, No. 1 (2008)
91-112, published online: 24 July 2008
Added 10 November 2008
Oppenheim, C. and Summers, M. A. C. (2008)
Citation counts and the
Research Assessment Exercise, part VI: Unit of assessment 67 (music)
Information Research, 13 (2), paper 342, June 2008
Added 28 July 2008
Adler, R., Ewing, J. (Chair) and Taylor, P. (2008)
Citation
Statistics (pdf 26pp)
Joint Committee on Quantitative Assessment of Research, International
Mathematical Union, IMU-ICIAM-IMS, 6/11/2008
Added 28 July 2008
Harnad, S. (2008)
Validating
research performance metrics against peer rankings
Ethics in Science and Environmental
Politics, Vol. 8, No. 1, June 03, 2008, 103-107
Added 28 July
2008 Taraborelli, D. (2008)
Soft peer review. Social
software and distributed scientific evaluation (pdf 12pp)
In Proceedings of the 8th International
Conference on the Design of Cooperative Systems (COOP 08),
Carry-Le-Rouet, France, May 20-23, 2008
From the abstract: "I analyze the contribution that social bookmarking systems
can provide to the problem of usage-based metrics for scientific evaluation. I
suggest that collaboratively aggregated metadata may help fill the gap between
traditional citation-based criteria and raw usage factors."
Added 13 January 2009
Pringle, J. (2008)
Trends in the use of ISI
citation databases for evaluation
Learned Publishing, Vol. 21, No. 2, April 2008, 85-91
Abstract: "This paper explores the factors shaping the current uses of the ISI
citation databases in evaluation both of journals and of individual scholars
and their institutions. Given the intense focus on outcomes evaluation, in a
context of increasing 'democratization' of metrics in today's digital world, it
is easy to lose focus on the appropriate ways to use these resources, and
misuse can result."
Added 11 February 2008
Armbruster, C. (2008)
Access,
Usage and Citation Metrics: What Function for Digital Libraries and
Repositories in Research Evaluation?
Social Science Research Network, February 05, 2008
From the abstract: "This systematic appraisal of the future role of digital
libraries and repositories for metric research evaluation proceeds by
investigating the practical inadequacies of current metric evaluation before
defining the scope for libraries and repositories as new players. Services
reviewed include: Leiden Ranking, Webometrics Ranking of World Universities,
COUNTER, MESUR, Harzing POP, CiteSeer, Citebase, RePEc LogEc and CitEc, Scopus,
Web of Science and Google Scholar."
Added 26 May 2008 Mahdi, S., D'Este, P. and Neely, A. (2008)
Citation Counts:
Are They Good Predictors of RAE Scores? A bibliometric analysis of RAE 2001
Cranfield QUEprints, 31.01.2008
Added 10 May 2007, Updated 17 July
2009 Harnad, S. (2007)
Open Access Scientometrics and the
UK Research Assessment Exercise
ArXiv, Computer Science, cs.IR/0703131, 26 March 2007. Preprint of invited
keynote address to 11th Annual Meeting of the International Society for
Scientometrics and Informetrics, Madrid, 25-27 June 2007
also in ECS EPrints, 29 March 2007 http://eprints.ecs.soton.ac.uk/13804/
Latest version ECS EPrints, 27 Feb 2009 http://eprints.ecs.soton.ac.uk/17142/,
in Scientometrics, 79 (1), 2009, 147-156, published online: 13 November
2008, http://dx.doi.org/10.1007/s11192-009-0409-z
Added 19 November 2006
Steele, C., Butler, L. and Kingsley, D. (2006)
The publishing imperative:
the pervasive influence of publication metrics
ANU Institutional Repository, 30 October 2006, also in Learned
Publishing, 19(4): 277-290, October 2006
Added 19 November 2006
Houghton J. and Sheehan, P. (2006)
The Economic Impact of
Enhanced Access to Research Findings
Centre for Strategic Economic Studies. Victoria University. July 2006
See also
Houghton, J., Steele, C. and Sheehan, P. (2006)
Research
Communication Costs In Australia: Emerging Opportunities And Benefits
Department of Education, Science and Training (DEST), Australia, September 2006
Added 13
September 2007 Shadbolt, N., Brody, T., Carr, L. and Harnad, S.
(2006)
The Open Research Web: A
Preview of the Optimal and the Inevitable
ECS EPrints, 02 May 2006, in Open Access: Key Strategic, Technical and
Economic Aspects, Jacobs, N., Ed., chapter 21 (Oxford: Chandos Publishing)
Added 26
September 2005 Harnad, S. (2005)
Maximising the Return on
UK's Public Investment in Research
Author eprint, September 14, 2005
Attempts to monetise 'lost' impact: "The online-age practice of self-archiving
has been shown to increase citation impact by a dramatic 50-250%, but so far
only 15% of researchers are doing it spontaneously. Citation impact is rewarded
by universities (through promotions and salary increases) and by
research-funders like RCUK (through grant funding and renewal) at a
conservative estimate of £46 per citation. ... As a proportion of the RCUK's
yearly £3.5bn research expenditure (yielding 130,000 articles x 5.6 = 761,600
citations), our conservative estimate would be 50% x 85% x £3.5bn = £1.5bn
worth of loss in potential research impact (323,680 potential citations lost)."
See also
Australia is not
maximising the return on its research investment (ETD2005, Sydney)
for the same estimate applied to the potential lost return ($425M) there.
Day, M. (2004)
Institutional
repositories and research assessment (pdf 29pp)
Author eprint (v. 0.1), 2 December 2004
Harnad, S. (2003)
Maximizing
university research impact through self-archiving
Jekyll.com, No. 7, December 2003
Harnad, S. (2003)
Enhance UK research
impact and assessment by making the RAE webmetric
Author eprint, in Times Higher Education Supplement, 6 June 2003, p. 16
Harnad, S., Carr, L., Brody, T. and Oppenheim, C. (2003)
Mandated online RAE CVs
linked to university eprint archives: Enhancing UK research impact and
assessment
Ariadne, issue 35, April 2003
Smith, A. and Eysenck, M. (2002)
The correlation between
RAE ratings and citation counts in psychology (pdf 12pp)
Technical Report, Psychology, Royal Holloway College, University of London,
June 2002
Holmes, A. and Oppenheim, C. (2001)
Use of
citation analysis to predict the outcome of the 2001 Research Assessment
Exercise for Unit of Assessment (UoA) 61: Library and Information
Management
Information Research, Vol. 6, No. 2, January 2001
Harnad, S. (2001)
Research
Access, Impact and Assessment (longer version)
Author eprint, in Times Higher Education Supplement, 1487: p. 16., 2001
Garfield, E. (1988)
Can
Researchers Bank on Citation Analysis? (pdf 10pp)
Current Comments, No. 44, October 31, 1988
attached (pp 3-10)
Diamond, Jr., A. M. (1986)
What is a Citation Worth?
J. Hum. Resour., 21:200-15, 1986
Garfield comments on studies that attempt to quantify the reward system of
science in terms of monetary returns to author salaries from article
publication and citations, reprinting one of those studies
Added 13 May 2013
Mike Thelwall, Stefanie Haustein, Vincent Larivière, Cassidy R.
Sugimoto (2013)
Do
altmetrics work? Twitter and ten other social web services
Author preprint. PLoS
ONE, 8(5): e64841, May 28, 2013 http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0064841
doi:10.1371/journal.pone.0064841
Abstract: Altmetric measurements derived from the social web are increasingly
advocated and used as early indicators of article impact and
usefulness. Nevertheless, there is a lack of systematic scientific
evidence that altmetrics are valid proxies of either impact or utility
although a few case studies have reported medium correlations between
specific altmetrics and citation rates for individual journals or
fields. To fill this gap, this study compares 11 altmetrics with Web of
Science citations for 76 to 208,739 PubMed articles with at least one
altmetric mention in each case and up to 1,891 journals per metric. It
also introduces a simple sign test to overcome biases caused by
different citation and usage windows. Statistically significant
associations were found between higher metric scores and higher
citations for articles with positive altmetric scores in all cases with
sufficient evidence (Twitter, Facebook wall posts, research highlights,
blogs, mainstream media and forums) except perhaps for Google+ posts.
Evidence was insufficient for LinkedIn, Pinterest, question and answer
sites, and Reddit, and no conclusions should be drawn about articles
with zero altmetric scores or the strength of any correlation between
altmetrics and citations. Nevertheless, comparisons between citations
and metric values for articles published at different times, even
within the same year, can remove or reverse this association and so
publishers and scientometricians should consider the effect of time
when using altmetrics to rank articles. Finally, the coverage of all
the altmetrics except for Twitter seems to be low and so it is not
clear if they are prevalent enough to be useful in practice.
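The "simple sign test" mentioned in the abstract above is described only in outline here. As a rough, hypothetical illustration of the general idea (not the authors' exact procedure), the Python sketch below pairs articles published close together in time, counts how often the article with the higher altmetric score also has the higher citation count, and checks the agreement rate against chance with an exact binomial test. All data values are invented.

import math

def sign_test_p(successes, n):
    """Two-sided exact binomial test against p = 0.5 (the distribution is symmetric)."""
    tail = min(successes, n - successes)
    p = sum(math.comb(n, i) for i in range(0, tail + 1)) / 2 ** n
    return min(1.0, 2 * p)

# Hypothetical data: pairs of articles published in the same month.
# Each tuple holds (altmetric score, citation count) for article A and article B.
pairs = [((5, 30), (1, 12)), ((0, 4), (3, 19)), ((8, 25), (2, 28)),
         ((2, 10), (7, 22)), ((4, 15), (4, 15))]

agree, n = 0, 0
for (alt_a, cit_a), (alt_b, cit_b) in pairs:
    if alt_a == alt_b or cit_a == cit_b:
        continue  # ties carry no sign information
    n += 1
    if (alt_a > alt_b) == (cit_a > cit_b):
        agree += 1  # higher altmetric score goes with higher citations

print(agree, "of", n, "informative pairs agree; p =", round(sign_test_p(agree, n), 3))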
Added 13 May 2013
Stacy Konkiel and Dave Scherer (2013)
New
Opportunities for Repositories in the Age of Altmetrics
Bulletin of the
Association for Information Science and Technology, Vol.
39, No. 4, April/May 2013
University administrators are increasingly trying to find new ways to measure the
impact of the scholarly output of their faculty, students and
researchers through quantitative means. By reporting altmetrics
(alternative metrics based on online activity) for their content,
institutional repositories can add value to existing metrics and
prove their relevance and importance in an age of growing cutbacks to
library services. This article will discuss the metrics that
repositories currently deliver and how altmetrics can supplement
existing usage statistics to provide a broader interpretation of
research-output impact for the benefit of authors, library-based
publishers and repository managers, and university administrators
alike.
Added 13 May 2013
Ross Mounce (2013)
Open
Access and Altmetrics: Distinct but Complementary
Bulletin of the
Association for Information Science and Technology, Vol.
39, No. 4, April/May 2013
Extract: Alongside (the) growth and preference for online journals, there has
been a notable rise in the growth and popularity of a particular type
of online journal - open access (OA) journals, which expressly allow
anyone on the Internet to read them for free without paying. Such
journals make it even easier for people to discover, access and re-use
journal literature. With this change in the consumption pattern of
journal content to online, new ideas such as altmetrics have arisen to
help us better assess the influence and impact of online journal
articles. This article considers the complementary relationship between
OA journal publishing and altmetrics, scholarly impact measures derived
from online activity, as a means of capturing and measuring some of the
influence of online journal articles.
Added 13 May 2013
Greg Tananbaum (2013)
Article-Level
Metrics: a SPARC Primer
SPARC, April 2013
From the Executive Summary: Article-Level Metrics (ALMs) can be employed in conjunction with existing metrics, which have traditionally focused on the long-term impact of a collection of articles (i.e., a journal) based on the number of citations generated. This primer is designed to give campus leaders and other interested parties an overview of what ALMs are, why they matter, how they complement established utilities, and how they can be used in the tenure and promotion process.
Added 9 April 2013
Petr Heneberg (2013)
Effects
of Print Publication Lag in Dual Format Journals on Scientometric
Indicators
PLoS ONE,
8(4): e59877, April 3, 2013
doi:10.1371/journal.pone.0059877
From the Abstract: Dual-format peer-reviewed journals (publishing both
print and online editions of their content) adopted a broadly accepted
strategy to shorten the publication lag: to publish the accepted
manuscripts online ahead of their print editions, which may follow
days, but also years later. Effects of this widespread habit on the
immediacy index (average number of times an article is cited in the
year it is published) calculation were never analyzed.
Methodology/Principal Findings: Scopus database (which contains nearly
up-to-date documents in press, but does not reveal citations by these
documents until they are finalized) was searched for the journals with
the highest total counts of articles in press, or highest counts of
articles in press appearing online in 2010-2011. Number of citations
received by the articles in press available online was found to be
nearly equal to citations received within the year when the document
was assigned to a journal issue. Thus, online publication of in press
articles affects severely the calculation of immediacy index of their
source titles, and disadvantages online-only and print-only journals
when evaluating them according to the immediacy index and probably also
according to the impact factor and similar measures.
Conclusions/Significance: Caution should be taken when evaluating
dual-format journals supporting long publication lag.
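As a reminder of the definition used in the abstract above, the immediacy index is the average number of citations received in an article's publication year by items published that year. The minimal sketch below, with purely hypothetical numbers, shows how the figure shifts if citations accrued while articles were online "in press" are also counted; it illustrates the general point, not the authors' methodology.

# Immediacy index for a journal in a given year:
# citations received in year Y by items published in year Y,
# divided by the number of items published in year Y.
def immediacy_index(citations_in_pub_year, items_published):
    return citations_in_pub_year / items_published

# Hypothetical journal: 200 items assigned to its 2012 issues.
# First, counting only citations received after print assignment; then also
# counting citations accrued while the articles were online "in press".
print(immediacy_index(150, 200))        # 0.75, print-based counting
print(immediacy_index(150 + 130, 200))  # 1.4, if in-press (online) citations are included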
Added 28 March 2013
Ehsan Mohammadi, Mike Thelwall (2013)
Assessing
non-standard article impact using F1000 labels
Scientometrics,
published online 20 March 2013
DOI: 10.1007/s11192-013-0993-9
Abstract:
Faculty of 1000 (F1000) is a post-publishing peer review web site where
experts evaluate and rate biomedical publications. F1000 reviewers also
assign labels to each paper from a standard list of article types. This
research examines the relationship between article types, citation
counts and F1000 article factors (FFa). For this purpose, a random
sample of F1000 medical articles from the years 2007 and 2008 were
studied. In seven out of the nine cases, there were no significant
differences between the article types in terms of citation counts and
FFa scores. Nevertheless, citation counts and FFa scores were
significantly different for two article types: 'New finding' and
'Changes clinical practice': FFa scores value the appropriateness of
medical research for clinical practice and 'New finding' articles are
more highly cited. It seems that highlighting key features of medical
articles alongside ratings by Faculty members of F1000 could help to
reveal the hidden value of some medical papers.
Added 28 March 2013
Ludo Waltman, Rodrigo Costas (2013)
F1000
recommendations as a new data source for research evaluation: A
comparison with citations
arXiv.org > cs > arXiv:1303.3875, 15 March 2013
Abstract:
F1000 is a post-publication peer review service for biological and
medical research. F1000 aims to recommend important publications in the
biomedical literature, and from this perspective F1000 could be an
interesting tool for research evaluation. By linking the complete
database of F1000 recommendations to the Web of Science bibliographic
database, we are able to make a comprehensive comparison between F1000
recommendations and citations. We find that about 2% of the
publications in the biomedical literature receive at least one F1000
recommendation. Recommended publications on average receive 1.30
recommendations, and over 90% of the recommendations are given within
half a year after a publication has appeared. There turns out to be a
clear correlation between F1000 recommendations and citations. However,
the correlation is relatively weak, at least weaker than the
correlation between journal impact and citations. More research is
needed to identify the main reasons for differences between
recommendations and citations in assessing the impact of publications.
Added 4 March 2013
Adriano Tort, Ze Targino, and Olavo Amaral (2012)
Rising
Publication Delays Inflate Journal Impact Factors
PLoS ONE,
7(12): e53374, 31 Dec 2012
info:doi/10.1371/journal.pone.0053374
From the Abstract: We analyze 61 neuroscience journals and show that
delays between online and print publication of articles increased
steadily over the last decade. Importantly, such a practice varies
widely among journals, as some of them have no delays, while for others
this period is longer than a year. Using a modified impact factor based
on online rather than print publication dates, we demonstrate that
online-to-print delays can artificially raise a journal's impact
factor, and that this inflation is greater for longer publication lags.
We also show that correcting the effect of publication delay on impact
factors changes journal rankings based on this metric. We thus suggest
that indexing of articles in citation databases and calculation of
citation metrics should be based on the date of an article's online
appearance, rather than on that of its publication in print.
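To make the mechanism concrete, the sketch below computes a conventional two-year impact factor for a hypothetical journal in 2012, once assigning articles to years by their print dates and once by their online dates. It is a simplified illustration of why the choice of date matters, not a reimplementation of the authors' corrected impact factor; all records are invented.

from datetime import date

# Hypothetical article records: citations received in 2012 plus both publication dates.
articles = [
    {"online": date(2010, 11, 5), "print": date(2011, 3, 1), "cites_2012": 14},
    {"online": date(2011, 6, 20), "print": date(2011, 8, 15), "cites_2012": 9},
    {"online": date(2011, 12, 1), "print": date(2012, 4, 10), "cites_2012": 6},
]

def impact_factor_2012(articles, date_field):
    """Citations in 2012 to items published in 2010-2011, divided by the number of
    such items, where the publication year comes from the chosen date field."""
    window = [a for a in articles if a[date_field].year in (2010, 2011)]
    return sum(a["cites_2012"] for a in window) / len(window)

print(impact_factor_2012(articles, "print"))   # 11.5: the third article falls outside the window
print(impact_factor_2012(articles, "online"))  # ~9.67: all three articles count, changing the figure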
Added 4 March 2013
Emilio Lopez-Cozar, Nicolas Robinson-Garcia, and Daniel Torres-Salinas
(2012)
Manipulating
Google Scholar Citations and Google Scholar Metrics: simple, easy and
tempting
arXiv.org > cs > arXiv:1212.0638, 04 Dec 2012
From the Abstract: The launch of Google Scholar Citations and Google
Scholar Metrics may provoke a revolution in the research evaluation
field as it places within every researcher's reach tools that allow
bibliometric measuring. In order to alert the research community over
how easily one can manipulate the data and bibliometric indicators
offered by Google's products we present an experiment in which we
manipulate the Google Citations profiles of a research group through
the creation of false documents that cite their documents, and
consequently, the journals in which they have published, modifying their H-index.
Added 13 May 2013
Xuemei Li and Mike Thelwall (2012)
F1000,
Mendeley and Traditional Bibliometric Indicators
In 17th International
Conference on Science and Technology Indicators (STI),
Montreal, 5-8 September 2012
Abstract:
This article compares the Faculty of 1000 (F1000) quality filtering
results and Mendeley usage data with traditional bibliometric
indicators, using a sample of 1397 Genomics and Genetics articles
published in 2008 selected by F1000 Faculty Members (FMs). Both
Mendeley user counts and F1000 article factors (FFas) correlate
significantly with citation counts and associated Journal Impact
Factors. However, the correlations for Mendeley user counts are much
larger than those for FFas. It may be that F1000 is good at disclosing
the merit of an article from an expert practitioner point of view while
Mendeley user counts may be more closely related to traditional
citation impact. Articles that attract exceptionally many citations are
generally disorder or disease related, while those with extremely high
social bookmark user counts are mainly historical or introductory.
Added 4 December 2012
Daniel Acuna, Stefano Allesina, and Konrad Kording (2012)
Future
impact: Predicting scientific success
Nature, 489
(7415), 201-2, 13 September 2012
info:doi/10.1038/489201a
Presents a formula to estimate the future h-index of life scientists.
The h-index and similar metrics can capture only past accomplishments,
not future achievements. Here we attempt to predict the future h-index
of scientists on the basis of features found in most CVs.
Added 4 December 2012
Jasleen Kaur, Diep Thi Hoang, Xiaoling Sun, Lino Possamai, Mohsen
JafariAsbagh, Snehal Patil, Filippo Menczer (2012)
Scholarometer:
A Social Framework for Analyzing Impact across Disciplines
PLoS ONE, 7
(9), 12 September 2012
info:doi/10.1371/journal.pone.0043235
From the Abstract: We describe a system called Scholarometer, which
provides a service to scholars by computing citation-based impact
measures. This creates an incentive for users to provide disciplinary
annotations of authors, which in turn can be used to compute
disciplinary metrics. We first present the system architecture and
several heuristics to deal with noisy bibliographic and annotation
data. We report on data sharing and interactive visualization services
enabled by Scholarometer. Usage statistics, illustrating the data
collected and shared through the framework, suggest that the proposed
crowdsourcing approach can be successful. Secondly, we illustrate how
the disciplinary bibliometric indicators elicited by Scholarometer
allow us to implement for the first time a universal impact measure
proposed in the literature. Our evaluation suggests that this metric
provides an effective means for comparing scholarly impact across
disciplinary boundaries.
Added 16 August 2012
Daniel Torres-Salinas, Nicolas Robinson-Garcia, and Emilio Lopez-Cozar
(2012)
Towards a Book
Publishers Citation Reports. First approach using the Book Citation
Index
(Original title in Spanish: Hacia un ranking bibliométrico de editoriales científicas de libros. Primera aproximación utilizando el Book Citation Index)
arXiv.org > cs > arXiv:1207.7067, 29 Jul 2012,
Grupo Evaluación de la Ciencia y la Comunicación Científica (EC3)
Working Papers 7. In Revista española de Documentación Científica, Vol 35, No 4 (2012)
http://redc.revistas.csic.es/index.php/redc/article/view/766
doi:10.3989/redc.2012.4.1010
Abstract: The absence of books and book chapters in the Web of Science
Citation Indexes (SCI, SSCI and A&HCI) has always been
considered an important flaw but the Thomson Reuters 'Book Citation
Index' database was finally available in October of 2010 indexing
29,618 books and 379,082 book chapters. The Book Citation Index opens a
new window of opportunities for analyzing these fields from a
bibliometric point of view. The main objective of this article is to
analyze different impact indicators referred to the scientific
publishers included in the Book Citation Index for the Social Sciences
and Humanities fields during 2006-2011. This way we construct what we
have called the 'Book Publishers Citation Reports'. For this, we
present a total of 19 rankings according to the different disciplines
in Humanities & Arts and Social Sciences & Law with six
indicators for scientific publishers.
Added 16 August 2012
Joseph Bernstein and Chancellor Gray (2012)
Content
Factor: A Measure of a Journal's Contribution to Knowledge
PLoS ONE, 7
(7), 23 Jul 2012
info:doi/10.1371/journal.pone.0041554
From the Abstract: We propose a metric, Content Factor, and examine
its performance among leading medical and orthopaedic surgery journals.
To remedy Impact Factor's emphasis on recent citations, Content Factor
considers the total number of citations, regardless of the year in
which the cited paper was published. To correct for Impact Factor's
emphasis on efficiency, no denominator is employed. Content Factor is
thus the total number of citations in a given year to all of the papers
previously published in the journal. We found that Content Factor and
Impact Factor are poorly correlated. We further surveyed 75 experienced
orthopaedic authors and measured their perceptions of the importance
of various orthopaedic surgery journals. The correlation between the
importance score and the Impact Factor was only 0.08; the correlation
between the importance score and Content Factor was 0.56.
Added 12 February 2012
Paul Wouters and Rodrigo Costas (2012)
Users,
narcissism and control - tracking the impact of scholarly publications
in the 21st century
SURFfoundation, February 2012
From the Executive summary: This report explores the explosion of
tracking tools that have accompanied the surge of web based information
instruments. The report therefore advises to start a concerted research
programme in the dynamics, properties, and potential use of new web
based metrics which relates these new measures to the already
established indicators of publication impact. Its goal would be to
contribute to the development of more useful tools for the scientific
and scholarly community. This programme should monitor at least the
following tools: F1000, Microsoft Academic Research, Total-Impact,
PlosONE altmetrics, and Google Scholar. The programme should moreover
develop the following key research themes: concepts of new web metrics
and altmetrics; standardisation of tools and data; and the use and
normalisation of the new metrics.
Added 12 February 2012
Cynthia Lokker, R. Brian Haynes, Rong Chu, K. Ann McKibbon, Nancy L.
Wilczynski, and Stephen D. Walter (2012)
How
well are journal and clinical article characteristics associated with
the journal impact factor? a retrospective cohort study
Journal of the Medical
Library Association, 100 (1), 28-33, January 2012. Via
PubMed Central
From the Abstract: A retrospective cohort study determined the ability
of clinical article and journal characteristics, including appraisal
measures collected at the time of publication, to predict subsequent
JIFs. Four of the 10 measures were significant in the regression model:
number of authors, number of databases indexing the journal, proportion
of articles passing methods criteria, and mean clinical newsworthiness
scores. For the clinical literature, measures of scientific quality and
clinical newsworthiness available at the time of publication can
predict JIFs with 60% accuracy.
Added 12 February 2012
Xuemei Li, Mike Thelwall, and Dean Giustini (2011)
Validating
online reference managers for scholarly impact measurement
Scientometrics, 21
December 2011
info:doi/10.1007/s11192-011-0580-x
(Subscription access required. Online preview.) Abstract: This paper
investigates whether CiteULike and Mendeley are useful for measuring
scholarly influence, using a sample of 1,613 papers published in Nature
and Science in 2007. Traditional citation counts from the Web of
Science (WoS) were used as benchmarks to compare with the number of
users who bookmarked the articles in one of the two free online
reference manager sites. Statistically significant correlations were
found between the user counts and the corresponding WoS citation
counts, suggesting that this type of influence is related in some way
to traditional citation-based scholarly impact but the number of users
of these systems seems to be still too small for them to challenge
traditional citation indexes.
Added 12 February 2012
William J. Sutherland, David Goulson, Simon G. Potts, Lynn V. Dicks
(2011)
Quantifying
the Impact and Relevance of Scientific Research
PLoS ONE, 6
(11), e27537, November 16, 2011
info:doi/10.1371/journal.pone.0027537
Abstract: Qualitative and quantitative methods are being developed to
measure the impacts of research on society, but they suffer from
serious drawbacks associated with linking a piece of research to its
subsequent impacts. We have developed a method to derive impact scores
for individual research publications according to their contribution to
answering questions of quantified importance to end users of research.
To demonstrate the approach, here we evaluate the impacts of research
into means of conserving wild bee populations in the UK. For published
papers, there is a weak positive correlation between our impact score
and the impact factor of the journal. The process identifies
publications that provide high quality evidence relating to issues of
strong concern. It can also be used to set future research agendas.
Added 12 February 2012
Perry Evans and Michael Krauthammer (2011)
Exploring
the Use of Social Media to Measure Journal Article Impact
American Medical
Informatics Association (AMIA) Annual Symposium Proceedings Archive,
374-81, 22 October 2011. Via PubMed Central
From the Abstract: Using Wikipedia as a proxy for other social media,
we explore the correlation between inclusion of a journal article in
Wikipedia, and article impact as measured by citation count. We start
by cataloging features of PubMed articles cited in Wikipedia. We find
that Wikipedia pages referencing the most journal articles are about
disorders and diseases, while the most referenced articles in Wikipedia
are about genomics. We note that journal articles in Wikipedia have
significantly higher citation counts than an equivalent random article
subset. We also observe that articles are included in Wikipedia soon
after publication. Our data suggest that social media may represent a
largely untapped post-publication review resource for assessing paper
impact.
Added 06 July 2011
Merceur, F., Le Gall, M. and Salaun, A. (2011)
Bibliometrics:
a new feature for institutional repositories
Archimer, Ifremer's institutional repository, May 2011. In 14th Biennal
EURASLIC Meeting, Lyon, 17-20 May, 2011
From the Abstract: In addition to its promotion and conservation objectives,
Archimer, Ifremer's institutional repository, offers a wide range of
bibliometric tools described in this document.
Added 18 October 2010
Herb, U. (2010)
OpenAccess
Statistics: Alternative Impact Measures for Open Access documents? An
examination how to generate interoperable usage information from
distributed Open Access services
E-LIS, 25 Sep 2010. In:
L'information scientifique et technique dans l'univers numérique.
Mesures et usages. L'association des professionnels de l'information et
de la documentation, ADBS, pp. 165-178
From the abstract: This
contribution shows that most common methods to assess the impact of
scientific publications often discriminate Open Access publications and
by that reduce the attractiveness of Open Access for scientists.
Assuming that the motivation to use Open Access publishing services
(e.g. a journal or a repository) would increase if these services would
convey some sort of reputation or impact to the scientists, alternative
models of impact are discussed. Prevailing research results indicate
that alternative metrics based on usage information of electronic
documents are suitable to complement or to relativize citation-based
indicators. Furthermore an insight into the project OpenAccess-Statistics
OA-S is given. OA-S implemented an infrastructure to collect
document-related usage information from distributed Open Access
Repositories in an aggregator service in order to generate
interoperable document access information according to three standards
(COUNTER, LogEc and IFABC).
Added 6 September 2010
Priem, J. and Hemminger, B. (2010)
Scientometrics
2.0: Toward new metrics of scholarly impact on the social Web
First Monday, 15 (7), Jul 2010
Added 6 September 2010
Herb, U., Kranz, E., Leidinger, T. and Mittelsdorf, B. (2010)
How
to assess the impact of an electronic document? And what does impact
mean anyway?: Reliable usage statistics in heterogeneous repository
communities
E-LIS, 10 Jun 2010. In OCLC
Systems & Services, Vol. 26, No. 2, 2010, 133-145
From
the Abstract: Purpose - Usually the impact of research and researchers
is quantified by using citation data: either by journal-centered
citation data as in the case of the journal impact factor (JIF) or by
author-centered citation data as in the case of the Hirsch- or h-index.
This paper aims to discuss a range of impact measures, especially
usage-based metrics, and to report the results of two surveys.
Originality/value - This paper delineates current discussions about
citation-based and usage-based metrics. Based on the results of the
surveys, it depicts which functionalities could enhance repositories,
what features are required by scientists and information professionals,
and whether usage-based services are considered valuable. These results
also outline some elements of future repository research.
Added 6 September 2010
Repanovici, A. (2010)
Measuring
the visibility of the University's scientific production using
GoogleScholar, "Publish or Perish" software and Scientometrics
76th IFLA General
Conference and Assembly, Gothenburg, Sweden, 07
Jun 2010
In Performance Measurement and Metrics, Vol. 12, No. 2, 2011. From
the Abstract: The first Romanian institutional repository was
implemented at Transilvania University of Brasov. As part of the
undertaken research, the visibility and the impact of the university's
scientific production was measured using the scientific methods of
scientometry, as a fundamental instrument for determining the
international value of a university as well as for the statistical
evaluation of scientific research results. The results showed that an
open access institutional repository would significantly add to the
visibility of the university's scientific production. In this article we
define the scientific production and productivity and present the main
indicators for the measurement of the scientific activity. Google
Scholar was used as a scientometric database which can be consulted
free of charge on the Internet and which indexes academic papers from
institutional repositories, identifying also the referenced citations.
The free Publish or Perish software can be used as an analysis
instrument for the impact of the research, by analysing the citations
through the h-index. We present the methodology and the results of an
exploratory study made at the Transilvania University of Brasov
regarding the h-index of the academic staff.
Added 9 June 2010
Neff, B. and Olden, J. (2010)
Not
So Fast: Inflation in Impact Factors Contributes to Apparent
Improvements in Journal Quality
BioScience,
60 (6), 455-9, June 2010
From the Abstract: Here we propose that impact factors may be subject to
inflation analogous to changes in monetary prices in economics. The
possibility of inflation came to light as a result of the observation
that papers published today tend to cite more papers than those
published a decade ago. We analyzed citation data from 75,312 papers
from 70 ecological journals published during 1998-2007. We found that
papers published in 2007 cited an average of seven more papers than
those published a decade earlier. This increase accounts for about 80%
of the observed impact factor inflation rate of 0.23. In examining the
70 journals we found that nearly 50% showed increases in their impact
factors, but at rates lower than the background inflation rate.
Therefore, although those journals appear to be increasing in quality
as measured by the impact factor, they are actually failing to keep
pace with inflation.
Added 6 September 2010
Repanovici, A. (2010)
Measuring
the visibility of the universities scientific production using
scientometric methods
6th WSEAS/IASME
International Conference on Educational Technologies
(EDUTE '10), 03 May 2010
Abstract:
Paper presents scientometry as a science and a fundamental instrument
for determining the international value of a university as well as for
the statistical evaluation of scientific research results. The impact
of the research measurable through scientometric indicators is
analyzed. Promoting the scientific production of universities through
institutional digital repositories deals with the concept of scientific
production of the university and the development of scientific research
in information society. These concepts are approached through the prism
of marketing methods and techniques. The digital repository is analyzed
as a PRODUCT, destined for promoting, archiving and preserving
scientific production. Find out more about the author and the paper
here.
The record
and abstract page for the paper does not currently link to the full text.
Added 6 September 2010
Wardle, D. (2010)
Do
Faculty of 1000 (F1000) ratings of ecological publications serve as
reasonable predictors of their future impact?
Ideas in Ecology and
Evolution, 3, 2010, 11-15
Commentary article.
From the Abstract: There is an increasing demand for an
effective means of post-publication evaluation of ecological work that
avoids pitfalls associated with using the impact factor of the journal
in which the work was published. One approach that has been gaining
momentum is the Faculty of 1000 (hereafter F1000) evaluation
procedure, in which panel members identify what they believe to be the
most important recent publications they have read. Here I focused on
1530 publications from 7 major ecological journals that appeared in
2005, and compared the F1000 rating of each publication with the
frequency with which it was subsequently cited. ... Possible reasons
for the F1000 process failing to identify high impact publications may
include uneven coverage by F1000 of different ecological topics,
cronyism, and geographical bias favoring North American publications.
As long as the F1000 process cannot identify those publications that
subsequently have the greatest impact, it cannot be reliably used as a
means of post-publication evaluation of the ecological literature.
Added 9 June 2010
Bornmann, L. and Daniel, H.-D. (2010)
The
citation speed index: A useful bibliometric indicator to add to the h
index
Authors' server, undated but notice posted to Sigmetrics listserv 26
March 2010, in Journal
of Informetrics, accepted for publication
This topic would appear to be a natural complement to OA citation
effects, but the paper does not mention any.
Added 9 June 2010
Horwood, L. and Robertson, S. (2010)
Role
of bibliometrics in scholarly communication
VALA2010 15th Biennial Conference and Exhibition, Melbourne, 9
Feb 2010
Added 9 Mar 2010
Bar-Ilan, J. (2009)
A
Closer Look at the Sources of Informetric Research
CYBERmetrics, 13 (1), 23 Dec 2009
From
the introduction: The Web has existed for twenty years only, yet the
large majority of the data sources for informetric research are
available through the Web. ISI's Web of Science was launched in 1997
... In November 2004 two additional major citation databases appeared
on the Web: Elsevier's Scopus and Google Scholar ... and there are
easily accessible and often open-source software tools that enable to
collect and analyze large quantities of data even on a personal
computer. It has become easy to conduct "desktop or poor-man's
bibiliometrics". The data for informetric research have never been
perfect, but now that informetric analysis can be conducted with much
greater ease than before, it is even more important to understand the
limitations and problems of data sources and methods and to assess the
validity of the results. In the following sections I discuss some
limitations of the existing sources.
Added 9 June 2010
Gonzalez-Pereira, B., Guerrero-Bote, V., Moya-Anegon, F. (2009)
The SJR
indicator: A
new indicator of journals' scientific prestige
ArXiv,
arXiv:0912.4141v1 [cs.DL], 21 Dec 2009
Added 9 Mar 2010
Ball, K. (2009)
The
Indexing of Scholarly Open Access Business Journals
Electronic Journal of Academic and Special Librarianship,
10 (3), Winter 2009
"this
study focusses on the business and management field and assesses the
extent to which scholarly open access journals in this discipline are
currently being indexed by both commercial and non-commercial indexing
services. Of the commercial indexing services, EBSCO's Business Source
Complete covers by far the largest number of open access journals. For
business researchers working in an academic environment, Business
Source Complete, with its more sophisticated searching and browsing
capabilities and deeper historical coverage, is probably the best
one-stop option for retrieving scholarly materials from both the
subscription-based and OA literature. However, from a simple quantity
perspective, OA business journals are being most extensively indexed by
OA indexing services, in particular, Google Scholar and Open J-Gate."
Added 9 Mar 2010
Neylon, C. and Wu, S. (2009)
Article-Level
Metrics and the Evolution of Scientific Impact
PLoS Biology,
7 (11), 17 Nov 2009
Added 9 Mar 2010
Moed, H. (2009)
Measuring
contextual citation impact of scientific journals
ArXiv, arXiv:0911.2632v1 [cs.DL], 13 Nov 2009. Also in Journal
of Informetrics (to appear)
About journal impact, and not directly about open access. From the abstract:
"This paper explores a new indicator of journal citation impact,
denoted as source normalized impact per paper (SNIP). It measures a
journal's contextual citation impact, taking into account
characteristics of its properly defined subject field, especially the
frequency at which authors cite other papers in their reference lists,
the rapidity of maturing of citation impact, and the extent to which a
database used for the assessment covers the field's literature. It aims
to allow direct comparison of sources in different subject
fields."
Added 15 Oct 2009
Armbruster, C. (2009)
Whose
Metrics? On Building Citation, Usage and Access Metrics as Information Service
for Scholars
SSRN Social Science Research Network, 31 Aug 2009
Services mentioned: Journal impact factor, journal usage factor, GoPubMed, SSRN
CiteReader, RePEc LogEc, RePEc CitEc, SPIRES, Harzing POP, Webometrics, ISI Web
of Knowledge, Scopus, Google Scholar, Citebase, CiteSeer X, CERIF
Added 15 Oct 2009
Stock, W. (2009)
The
Inflation of Impact Factors of Scientific Journals
ChemPhysChem, 10 (13), 17 Aug 2009, 2193-6
Added 9 Mar 2010
Patterson, M. (2009)
PLoS Journals
measuring
impact where it matters
PLoS blog, 2009-07-13
On why PLoS will no longer highlight the journal impact factor. Instead it
will present a range of metrics focussed on the published paper, including
individual citation counts from various sources, blog and bookmark counts,
links and searches, as illustrated by Peter Binfield in the PLoS ONE
community blog (March 31, 2009): "rather than updating the PLoS Journal
sites with the new numbers, we've decided to stop promoting journal impact
factors on our sites all together. It's time to move on, and focus efforts
on more sophisticated, flexible and meaningful measures."
Added 9 Mar 2010
Canos Cerda, J. H., Campos, M. L. and Nieto, E. M. (2009)
What's Wrong with
Citation Counts?
D-Lib Magazine, Vol. 15 No. 3/4, March/April 2009
From the abstract: "We argue that a new approach based on the
collection of
citation data at the time the papers are created can overcome current
limitations, and we propose a new framework in which the research
community is
the owner of a Global Citation Registry characterized by high quality
citation
data handled automatically."
Added 26 February 2009
Cross, J. (2009)
Impact
factors - the basics
The E-Resources Management Handbook (2006 - present), UKSG, this chapter
published online: 03 February 2009
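For readers who want the arithmetic behind the chapter's title, the conventional two-year journal impact factor is simply the citations received in a year to the journal's output of the previous two years, divided by the citable items it published in those two years. A minimal worked example in Python, with invented figures:
    # The standard two-year journal impact factor, with invented figures.
    def impact_factor(citations_this_year, citable_items_prev_two_years):
        return citations_this_year / citable_items_prev_two_years

    # A journal cited 450 times in 2008 to its 2006-2007 papers, having
    # published 300 citable items in 2006-2007, has a 2008 impact factor of 1.5:
    print(impact_factor(450, 300))   # 1.5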
Added 10 November 2008
Leydesdorff, L. (2008)
How are new
citation-based journal indicators adding to the bibliometric toolbox?
Author preprint, undated, (announced 31 Oct 2008), in Journal of the
American Society for Information Science and Technology, Vol. 60, No. 7,
2009, 1327-1336, published online: 2 Feb 2009
http://dx.doi.org/10.1002/asi.21024
From the abstract: "The launching of Scopus and Google Scholar, and
methodological developments in social-network analysis have made many more
indicators for evaluating journals available than the traditional impact
factor, cited half-life, and immediacy index of the ISI. In this study, these
new indicators are compared with one another and with the older ones."
Added 9 June 2010
Final
Impact: What Factors Really Matter? (VIDEO) (2008)
Scholarly Communication Program, Columbia University, October 30, 2008
Panelists: Marian Hollingsworth, Thomson Reuters; Jevin West,
Eigenfactor.org; and Johan Bollen, Los Alamos National Laboratory
Added 10 November 2008
Radicchi, F., Fortunato, S. and Castellano, C. (2008)
Universality of citation
distributions: towards an objective measure of scientific impact
arXiv.org, arXiv:0806.0974v2 [physics.soc-ph], 5 Jun 2008 (v1), last revised 27
Oct 2008
in Proceedings of the National Academy of Sciences of The United States of
America, 105 (45): 17268-17272, Nov. 11 2008
Added 10 November 2008
Banks, M. A. and Dellavalle, R. (2008)
Emerging Alternatives to
the Impact Factor
E-LIS, 05 September 2008, also in OCLC
Systems & Services, 24(3)
Added 10 November 2008
Brumback, R. A. (2008)
Worshiping false idols:
the impact factor dilemma
J. Child Neurol., Vol. 23, No. 4, April 2008, 365-367
"the opacity in Thomson Scientific's refusal to reveal the details of their
calculations only serves to increase suspicion about possible data
manipulations. ... Now would seem to be the appropriate time for the academic
community to demand valid metrics to assess published scientific material"
Added 26 May 2008
Althouse, B. M., West, J. D., Bergstrom, T. C. and Bergstrom, C. T. (2008)
Differences in
Impact Factor Across Fields and Over Time
eScholarship Repository, California Digital library, Department of Economics,
University of California Santa Barbara, Departmental Working Papers, paper
2008-4-23, April 23, 2008. In Journal of the American Society for Information
Science and Technology, Vol. 60, No. 1, 27-34, published online: 21 Aug 2008
Added 11 February 2008
Kosmopoulos, C. and Pumain, D. (2007)
Citation, Citation, Citation:
Bibliometrics, the web and the Social Sciences and Humanities
Cybergeo, Science et Toile, article 411, published online 17 December
2007, modified 18 January 2008
From the abstract: "The paper reviews the main (bibliometric) data bases and
indicators in use. It demonstrates that these instruments give biased
information about the scientific output of research in Social Sciences and
Humanities."
Added 28 July 2008
Rossner, M., Van Epps, H. and Hill, E. (2007)
Show me the data
(editorial)
The Journal of Cell Biology, Vol. 179,
No. 6, 1091-1092, published online December 17, 2007
"Just as scientists would not accept the findings in a scientific paper without
seeing the primary data, so should they not rely on Thomson Scientific's impact
factor, which is based on hidden data. As more publication and citation data
become available to the public through services like PubMed, PubMed Central,
and Google Scholar, we hope that people will begin to develop their own
metrics for assessing scientific quality rather than rely on an ill-defined and
manifestly unscientific number."
Added 22 August 2007
Citrome, L. (2007)
Impact
Factor? Shmimpact Factor! The Journal Impact Factor, Modern Day Literature
Searching, and the Publication Process
Psychiatry, 4(5):54-57, 2007
Added 17 January 2007
Bornmann, L. and Daniel, H.-D. (2007)
What
do citation counts measure? A review of studies on citing behavior
Author eprint, undated, Journal of Documentation, accepted for
publication
Added 17 January 2007
Meho, L. I. (2006)
The Rise and Rise of Citation
Analysis
Author eprint, dLIST, 31 December 2006, Physics World, January 2007
"Provides a historical background of citation analysis, impact factor, new
citation data sources (e.g., Google Scholar, Scopus, NASA's Astrophysics Data
System Abstract Service, MathSciNet, ScienceDirect, SciFinder Scholar,
Scitation/SPIN, and SPIRES-HEP), as well as h-index, g-index, and a-index."
Added 19 November 2006
Electronic Publishing Services and Oppenheim, C. (2006)
UK scholarly journals:
2006 baseline report: An evidence-based analysis of data concerning scholarly
journal publishing, see Area 4: Citations, impact factors and their role
Research Information Network, Research Councils UK and the Department of Trade
& Industry, October 3, 2006
Added 3 May 2007
Ewing, J. (2006)
Measuring
Journals
Notices of the AMS, Vol. 53, No. 9, October 2006, 1049-1053
"in many respects usage statistics are even more flawed than the impact factor,
and once again, the essential problem is that there are no explicit principles
governing their interpretation. ... while usage statistics are only
slightly useful, their misuse can be enormously damaging."
Added 8 March 2007
Garfield, E. (2006)
Commentary:
Fifty years of citation indexing
International Journal of Epidemiology, 2006 35(5):1127-1128, published
online September 19, 2006
Added 03 August 2006
PLoS Medicine Editors (2006)
The
Impact Factor Game: It is time to find a better way to assess the scientific
literature
PLoS Medicine, Vol. 3, No. 6, June 2006
Added 15 May 2006
Altbach, P. G. (2006)
The Tyranny of
Citations
Inside Higher Ed, May 8, 2006
Added 28 February 2006
Noruzi, A. (2006)
The Web Impact Factor: a
critical review (pdf, 10pp)
E-LIS, February 9, 2006, in The Electronic Library, 24 (2006)
"Web Impact Factor (WIF) is a quantitative tool for evaluating and ranking web
sites ... search engines provide similar possibilities for the investigation of
links between web sites/pages to those provided by the academic journals
citation databases from the Institute of Scientific Information (ISI). But the
content of the Web is not of the same nature and quality as the databases
maintained by the ISI."
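The WIF reviewed here is usually computed as the number of pages linking to a site divided by the number of pages at that site, with both counts taken from search-engine queries. A minimal sketch assuming that conventional definition; the figures are invented:
    # Minimal Web Impact Factor sketch, assuming the conventional definition:
    # pages linking to a site divided by the pages the site contains. In
    # practice both counts come from search engines and are noisy, which is
    # one of the criticisms the paper reviews. Figures are invented.
    def web_impact_factor(inlinking_pages, pages_at_site):
        return inlinking_pages / pages_at_site

    print(web_impact_factor(1200, 400))   # 3.0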
Added 28 February 2006
Bollen, J., Rodriguez, M. A. and Van de Sompel, H. (2006)
Journal Status (pdf, 16pp)
Arxiv, 9 January 2006
"By merely counting the amount of citations and disregarding the prestige of
the citing journals, the ISI IF is a metric of popularity, not of prestige. We
demonstrate how a weighted version of the popular PageRank algorithm can be
used to obtain a metric that reflects prestige. ... Furthermore, we introduce
the Y-factor which is a simple combination of both the ISI IF and the weighted
PageRank, and find that the resulting journal rankings correspond well to a
general understanding of journal status."
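A toy illustration of the approach described in the quote: run a PageRank-style computation over a journal citation graph whose edge weights are citation counts, so that citations from prestigious journals count for more. The graph, damping factor and the product used for the "Y-factor" below are assumptions for illustration, not the authors' exact parameters:
    # Weighted PageRank over a toy journal citation graph: prestige flows in
    # proportion to who cites whom and how heavily. Illustrative only.
    def weighted_pagerank(citations, damping=0.85, iterations=100):
        journals = list(citations)
        rank = {j: 1.0 / len(journals) for j in journals}
        for _ in range(iterations):
            new_rank = {}
            for j in journals:
                share = 0.0
                for citing, cited in citations.items():
                    total = sum(cited.values())
                    if total and j in cited:
                        share += rank[citing] * cited[j] / total
                new_rank[j] = (1 - damping) / len(journals) + damping * share
            rank = new_rank
        return rank

    # Toy data: citing journal -> {cited journal: citation count}
    citations = {"A": {"B": 30, "C": 10}, "B": {"A": 5}, "C": {"A": 20, "B": 5}}
    prestige = weighted_pagerank(citations)
    impact_factor = {"A": 2.0, "B": 6.0, "C": 1.0}   # invented journal IFs
    y_factor = {j: impact_factor[j] * prestige[j] for j in prestige}  # product assumed
    print(prestige)
    print(y_factor)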
Added 03 August 2006
Moed, H.F. (2005)
Citation analysis of
scientific journals and journal impact measures
Current Science, 89 (12): 1990-1996, December 25, 2005
Added 28 February 2006
Dong, P., Loh, M. and Mondry, A. (2005)
The "impact factor"
revisited
Biomedical Digital Libraries, December 2005
This is a review, so the findings are not new, but this is perhaps the first
such paper to reflect on the effect of free and online availability on journal
impact factors, among other IF-related issues.
Added 30 December 2005
Hardy, R., Oppenheim, C., Brody, T. and Hitchcock, S. (2005)
Open Access Citation
Information (.doc, 105pp)
Author eprint, November 11, 2005, JISC Committee for the Information
Environment (JCIE) Scholarly Communication Group, September 2005
Describes a proposal to increase the exposure of open access materials and
their references to indexing services, and to motivate new services by reducing
setup costs.
Added 8 March 2007
Perkel, J. M. (2005)
The
Future of Citation Analysis
The Scientist, Vol. 19, No. 20, October 24, 2005
"The challenge is to track a work's impact when published in nontraditional
forms"
Added 30 December 2005
Monastersky, R. (2005)
Impact Factors Run
Into Competition
Chronicle of Higher Education, October 14, 2005
Added 28 February 2005
Garfield, E. (2005)
The Agony
and the Ecstasy — The History and Meaning of the Journal Impact Factor
(pdf, 22pp)
International Congress on Peer Review and Biomedical Publication,
Chicago, September 16, 2005
Garfield's typically dry, data-filled but essential take on JIFs.
Publishers promote impact factors of OA journals
BioMed Central "Open
access journals get impressive impact factors" 23 June 2005
Public Library of Science "The first impact factor
for PLoS Biology - 13.9" 27 June 2005
See also this discussion of these announcements on SPARC Open Access Forum,
prompted by Elsevier's response from Tony McSean,
followed by David Goodman, Charles Bailey (both 8 July) and Matthew
Cockerill (10 July), or see this summary of the discussion: "BMC's
Impact Factors: Elsevier's Take and Reactions to It", Digital Koans
(Charles Bailey's Weblog), 11 July 2005, including Peter Suber's conclusion:
"It's important to distinguish the citation impact of an individual article
from a journal impact factor. The BMC-Elsevier debate is about the latter. But
OA is more likely to rise and fall according to the former."
Abbasi, K. (2004)
Let's dump
impact factors
BMJ, Vol. 329, 16 October 2004
BMJ Rapid
Responses to this editorial; also see this list response
Baudoin, L., Haeffner-Cavaillon, N., Pinhas, N., Mouchet, S. and Kordon, C.
(2004)
Bibliometric
indicators: realities, myth and prospective (abstract only, full paper in
French)
Med Sci (Paris), 20 (10):909-15, October 2004
Jacsó, P. (2004)
The
Future of Citation Indexing - Interview with Dr. Eugene Garfield (pdf 3pp)
Author eprint, in Online, January 2004
Cockerill, M. J. (2004)
Delayed impact: ISI's
citation tracking choices are keeping scientists in the dark
BMC Bioinformatics 2004, 5:93, 12 July 2004
Added 26 September 2005
Shin E. J. (2003)
Do Impact
Factors change with a change of medium? A comparison of Impact Factors when
publication is by paper and through parallel publishing (abstract only)
Journal of Information Science, 29 (6): 527-533, 2003
"it is found that Impact Factors of (journals from the period) 2000 and 2001
were significantly higher than those of 1994 and 1995 in the journals published
by parallel publishing (combination journals—simultaneous publication of paper
and electronic journals). In particular, the Impact Factors of the combination
journals increased after the journals transformed their available media from
paper journals to combination ones."
Walter, G., Bloch, S., Hunt, G. and Fisher, K. (2003)
Counting
on citations: a flawed way to measure quality
MJA, 2003, 178 (6): 280-281
Borgman, C. L. and Furner, J. (2002)
Scholarly
Communication and Bibliometrics, author preprint (pdf 45pp)
Author eprint, in Annual Review of Information Science and Technology,
Vol. 36, edited by B. Cronin, 2002
Guédon, J.-C. (2001)
In Oldenburg's
Long Shadow: Librarians, Research Scientists, Publishers, and the Control of
Scientific Publishing
Creating the Digital Future, Proceedings of the 138th Annual Meeting,
Association of Research Libraries, Toronto, Ontario, May 23-25, 2001
Garfield, E. (1999)
Journal impact factor:
a brief review
CMAJ, 161 (8), October 19, 1999
Wouters, P. (1999)
The Citation
Culture (pdf 290pp)
PhD Thesis, University of Amsterdam, 1999
Garfield, E. (1998)
The
use of journal impact factors and citation analysis in the evaluation of
science
Author eprint, presented at the 41st Annual Meeting of the Council of
Biology Editors, Salt Lake City, UT, May 4, 1998
Seglen, P. O. (1997)
Why the
impact factor of journals should not be used for evaluating research
BMJ, 314:497, 15 February 1997
Garfield, E. (1973)
Citation
Frequency as a Measure of Research Activity and Performance (pdf 3pp)
Author eprint, in Essays of an Information Scientist, 1: 406-408,
1962-73, Current Contents, 5, January 31, 1973
Garfield, E. (1955)
Citation
Indexes for Science: A New Dimension in Documentation through Association of
Ideas
Author eprint, in Science, Vol:122, No:3159, p.108-111, July 15, 1955
Suber, P. (updated)
Open Access
Overview
Swan, A. (2007)
Open
Access and the Progress of Science
American Scientist, April-June 2007
Swan justified open access in support of her 'progress' article in a list
discussion. See blogged
extracts from that discussion.
Swan, A. (2006)
Open
Access: Why should we have it?
presented at "Zichtbaar onderzoek. Kan Open Archives daarbij helpen?" / Visible
research. Can OAI help? Organised by AWI (Flemish Ministry for Economy,
Enterprise, Science, Innovation and Foreign Trade) and VOWB (Flemish
Organisation of Scientific Research Libraries), May 2006
Swan, A. (2005)
Open Access
JISC, Briefing Paper, 1 April 2005
Suber, P. (2004)
A Primer on Open
Access to Science and Scholarship
Author eprint, in Against the Grain, Vol. 16, No. 3, June 2004
Harnad, S. (2004)
The Green Road
to Open Access: A Leveraged Transition
American Scientist Forum, January 07, 2004
Suber, P. (2003)
Removing the Barriers
to Research: An Introduction to Open Access for Librarians
Author eprint, in College & Research Libraries News, 64, February,
92-94, 113