World Wide Research: Reshaping the Sciences and Humanities

William H. Dutton and Paul W. Jeffreys

Print publication date: 2010

Print ISBN-13: 9780262014397

Published to MIT Press Scholarship Online: August 2013

DOI: 10.7551/mitpress/9780262014397.001.0001



1.2 Identifying Winners and Losers: The Role of Webometrics

Mike Thelwall

The MIT Press

Abstract and Keywords

Scientometrics is a research field that measures the outputs of science in order to evaluate scientists and to understand how disciplines and specialisms grow. The results can be used to allocate research funding and to compare research outputs between countries, identifying areas of strength and weakness as well as patterns of overall improvement or decline. Scientometrics can therefore provide quantitative evidence of winners and losers in e-science. A related field is “Webometrics” (or Webmetrics), in which hyperlinks can behave like citations. This chapter examines the role of Webometrics in identifying winners and losers in e-science, its advantages, and its methods.

Keywords:   e-science, scientometrics, Webometrics, hyperlinks, citations

Evaluating Scientists and Their Outputs

Since the 1960s, the research field of scientometrics has been concerned with measuring the outputs of science to evaluate scientists and to shed light on the dynamics of the growth of disciplines and specialisms. The results help government bodies in some countries to allocate research funding, both to individual research groups and to broad initiatives. Scientometrics is also used to benchmark entire countries’ research outputs to identify areas of strength and weakness as well as to reveal patterns of overall improvement or decline (Moed 2005). Hence, when one is seeking quantitative evidence of winners and losers in e-science, scientometrics is a logical starting point.

The primary source of evidence for most scientometric investigations is citations within the Thomson Reuters database of the top journals of science and social science. The rationale for the choice of citations as a data source derives from Robert Merton’s (1973 [1942]) sociology of science, which proposes essentially that good science tends to be cited often because many researchers find it sufficiently interesting, credible, and useful to underpin their own research. Thus, a crude way to evaluate the work of an individual scientist, research group, university, or country would be to count the citations to that work.

Although such a simple counting exercise would be poor scientometrics, more sophisticated versions (e.g., taking into account field differences in citation norms) can provide credible evidence in many cases. Scientometrics often incorporates multiple sources of evidence—for instance, peer review, funding, and patent awards—which helps to create a more rounded picture of research activities, especially for applied research. It can also be of value in the social sciences, arts, and humanities, where journal citations may not be central to the construction of knowledge within a field. The World Wide Web has also become an important new source of data, spawning “Webometrics.”

Webometric Advantages and Methods

The key idea behind Webometrics is that hyperlinks can behave like citations in two respects. First, a successful research group can expect to attract many links to its Web site; counting these links can therefore provide some (admittedly limited) evidence of success, even for disciplines outside the hard sciences. Second, links between research groups can help to reveal patterns of collaboration and information use, which may further understanding of the dynamics of science. In both cases, citations are often a more reliable source of evidence, but the big advantage of Web links is their timeliness. For instance, a new research collaboration may well be first signaled by the creation of a joint Web site, which may begin to create and attract links several years before the first research result is accepted by a journal and published. Hence, at least in theory, Webometric link analysis can describe current research trends, whereas scientometric analyses are inevitably retrospective.

Webometric methods usually rely heavily on commercial search engines, which are able to report the number of pages that link to any given Web site. For example, a study of life science researcher mobility in Europe (Robinson, Mentrup, Barjak, et al. 2007) for the European Union’s Institute for Prospective Technological Studies used Google to identify life science research groups in seven European countries. It then used Windows Live Search to count the number of pages linking to each research group’s home page. These data were employed for two purposes. First, a breakdown of country sources of links revealed international partnerships in the life sciences and individual countries that did not use the Web to publicize their research effectively. Second, the link counts were used to help select the most active and successful research groups (and a percentage of the rest) for a more complete scientometric analysis.
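The inlink searches underlying such studies amount to query construction: ask the engine for pages linking to a home page, while excluding the group’s own pages so that self-links do not inflate the count. The sketch below is illustrative only; the `link:` and `-site:` operator syntax varied across engines and over time, and the URL is invented.

```python
from urllib.parse import urlsplit

def inlink_query(home_page):
    """Build a 'link:'-style search query for pages linking to
    home_page, excluding the site's own pages (self-links).
    Operator syntax is illustrative; engine support has varied."""
    host = urlsplit(home_page).netloc
    return f"link:{home_page} -site:{host}"

print(inlink_query("http://research-group.example/"))
```

The hit count an engine reports for such a query is the raw Webometric indicator; everything that follows in this section is about refining that number.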

A second useful application of Webometric link analysis is for self-evaluation. Any research group can count links to its Web site using an appropriate commercial search engine and compare the result to a search for links to the sites of similar research groups. This simple exercise can either provide confirmation that the group and its site are recognized and seen as valuable or offer evidence of a potential problem.
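The self-evaluation described above needs nothing more than a sorted comparison of inlink counts. In the sketch below, every group name and count is invented for illustration; a real exercise would substitute figures reported by a search engine.

```python
# Hypothetical inlink counts for a research group and similar groups,
# as reported by a commercial search engine (all figures invented).
counts = {"our-group": 140, "peer-a": 520, "peer-b": 95, "peer-c": 210}

# Rank the groups by inlink count, largest first.
ranked = sorted(counts, key=counts.get, reverse=True)
rank = ranked.index("our-group") + 1
print(f"our-group ranks {rank} of {len(counts)} by inlink count")
```

A middling or low rank is not proof of a problem, but, as the text notes, it is a prompt to look more closely at whether the group’s work and Web site are being noticed.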

A third application is to investigate factors about researchers that influence their online visibility. One study investigated whether factors such as gender and industry connections influenced the visibility of life science researchers’ Web sites, but found little evidence of any important factors other than research group size and Web presence (Barjak and Thelwall 2008). However, this study showed the difficulty in using Webometric data to identify anything except the most significant online trends.

Simple link counts can reveal crude but useful indicators of the impact of a Web site and its owners, but Webometric methods can improve on them, both by gathering more data and by counting it in more useful ways. The process of collecting data can be automated through free software that interfaces with the search engines via their “Web search service” or similar facility. For example, such software may automatically retrieve, list, and summarize by domain name all of the pages linking to a Web site, even if there are more than one thousand (the normal maximum for search engines).

Webometric processing can produce improved link counts by counting linking domains or Web sites rather than pages. This is an improvement because it cancels out the impact of sites that contain large numbers of links—for example, in collections of standard links that sometimes appear on every page of a Web site (Thelwall 2004). Nevertheless, the wide range of reasons why links are created means that links are a relatively weak source of information. As a result, Webometric data tends to be “indicative” of trends rather than robust evidence for hypotheses.
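The domain-level counting described above can be sketched in a few lines. The URLs here are invented; a real analysis would feed in the full inlink list retrieved from a search engine.

```python
from collections import Counter
from urllib.parse import urlsplit

def linking_domains(inlink_urls):
    """Collapse linking-page URLs to per-domain counts, so that a site
    carrying the same boilerplate link on every page counts once per
    domain rather than once per page."""
    return Counter(urlsplit(url).netloc for url in inlink_urls)

# Invented example: four linking pages spread over two domains.
pages = [
    "http://a.example/page1",
    "http://a.example/page2",
    "http://a.example/page3",
    "http://b.example/index.html",
]
by_domain = linking_domains(pages)
print(len(by_domain), "linking domains from", len(pages), "linking pages")
```

Here three pages on one site collapse to a single linking domain, which is exactly the correction for sitewide boilerplate links that the text describes.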


Webometric link analysis methods provide a relatively fast method of accessing indicative quantitative information about the impact of research groups’ Web sites and, on a larger scale, the interrelationships between the groups. Repeated over time, this analysis can give early warning of the winners and losers within research fields or disciplines.


Bibliography references:

Barjak, F., and M. Thelwall. 2008. “A statistical analysis of the Web presences of European life sciences research teams.” Journal of the American Society for Information Science and Technology 59 (4):628–643.

Merton, R. K. 1973 [1942]. The sociology of science: Theoretical and empirical investigations. Chicago: University of Chicago Press.

Moed, H. F. 2005. Citation analysis in research evaluation. New York: Springer.

Robinson, S., A. Mentrup, F. Barjak, and M. Thelwall. 2007. Collection and analysis of existing data on RESearchers CAReers (RESCAR) and implementation of new data collection activities—the research team survey, final report. EU Contract no. 150176–2005-FISC-BE. Brussels: ERAWATCH Network Asbl.

Thelwall, M. 2004. Link analysis: An information science approach. San Diego: Academic Press.