The h-index is defined as the maximum value of h such that the given author or journal has published h papers that have each been cited at least h times. The index is designed to improve upon simpler measures such as the total number of citations or publications. It is an author-level metric, calculated from the citation counts of an author's set of publications, but it can also be computed for any aggregation of publications, e.g. journals or institutions. It applies to any author whose scholarly outputs have been cited in other scholarly outputs.
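The definition above translates directly into a short computation: sort the citation counts in descending order and find the last rank at which the count still reaches the rank. A minimal sketch (the function name and the sample counts are illustrative, not from the original):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    # Sort citation counts in descending order, then keep the
    # last position where the count is still >= its 1-based rank.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Note that an author with few publications necessarily has a low h-index, since h can never exceed the number of papers.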
Use of the impact factor to weight the influence of a paper amounts to a prediction, albeit one coloured by probabilities. The impact factor is used not only for ranking journals according to their relative influence, as initially intended, but also for measuring the performance of individual researchers.
Given the skewness of citation distributions, described later in this guide, this is a misapplication. The use of the impact factor to evaluate individual researchers has been criticized by a broad scholarly community, not least by Eugene Garfield, co-creator of the Science Citation Index, himself.
The impact factor can be manipulated. It is influenced by when in the year a journal issue is published: issues published at the beginning of a year have more time to accumulate citations than those published at the end. In addition, references given in editorials count toward the numerator, while the editorials themselves do not count toward the denominator.
By definition, the denominator consists only of citable items, and editorials are not regarded as such. References given in articles may also be incomplete or incorrect. Incorrect references are not corrected automatically and are therefore not added to the citation counts; this affects the value of the impact factor and of other citation indicators such as the h-index. To compensate for some of the weaknesses of the impact factor (field dependency, inclusion of self-citations, length of the citation window, quality of citations), efforts have been undertaken to develop better journal indicators.
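The numerator/denominator asymmetry described above can be made concrete. The sketch below assumes the standard two-year impact factor (citations in year Y to items published in years Y-1 and Y-2, divided by the number of citable items published in those years); all numbers are invented for illustration:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year journal impact factor: citations in year Y to items
    from Y-1 and Y-2, divided by the citable items from those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

citable_items = 200   # articles and reviews published in Y-1 and Y-2
citations = 500       # citations in year Y to those items

print(round(impact_factor(citations, citable_items), 2))  # -> 2.5

# Editorials are not citable items, so they never enlarge the
# denominator -- but citations flowing through them still enter the
# numerator, inflating the indicator:
citations_incl_editorials = citations + 60
print(round(impact_factor(citations_incl_editorials, citable_items), 2))  # -> 2.8
```

Because only the numerator moves, even a modest number of editorial-driven citations raises the reported value.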
While such measures arguably do a better job of ranking journals, they are still only applicable to journals and should not be used to evaluate research output at the level of individual researchers.
For that purpose, the h-index, introduced below, is better suited. When exploring the literature of your research field, the h-index may give you a picture of the impact of individual researchers and research groups.
You may retrieve the h-index from, e.g., Web of Science, Scopus and Google Scholar. When applying for a scholarship, project funding or a job, you may be required to state your h-index.

The figure illustrates a search for the author Karen in an arbitrary database, with lines indicating all citing publications within the same database. In our example, author Karen has ten publications: a, b, c, d, e, f, g, h, i and j.
Four publications have each been cited at least four times: c, i, a and g. The h-index is therefore 4. The remaining N_P − h publications do not have more than h citations each; in our example, the remaining six publications (f, h, d, j, b and e) do not have more than four citations each.

H-index retrieved in Web of Science, Scopus and Google Scholar: In this example, we use a renowned Norwegian researcher in ecology and evolutionary biology, and demonstrate that his h-index differs between the databases due to their different coverage of content.
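Karen's example can be reproduced end to end. The citation counts below are invented (the original only states which publications exceed the threshold), but they match the described outcome: c, i, a and g each have at least four citations, and the remaining six have at most four:

```python
# Illustrative (invented) citation counts for Karen's ten publications.
citations = {"a": 6, "b": 1, "c": 9, "d": 3, "e": 0,
             "f": 4, "g": 5, "h": 3, "i": 8, "j": 2}

# Rank publications by citation count, descending.
ranked = sorted(citations.items(), key=lambda kv: kv[1], reverse=True)

# h is the last rank at which the citation count still reaches the rank.
h = 0
for rank, (pub, count) in enumerate(ranked, start=1):
    if count >= rank:
        h = rank

print([pub for pub, _ in ranked[:h]])  # -> ['c', 'i', 'a', 'g']
print(h)                               # -> 4
```

The four top-ranked publications form the "h-core"; the remaining six sit in the tail and do not influence the index.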
The results presented here are based on data retrieved in April of that year. Citation counts typically increase with time, and so does the h-index; to determine the present value, perform a new search. It is possible to add rows to make sure that different spellings of the author's name are included. In Web of Science, the h-index including self-citations was retrieved for the full time span of his publications. In Scopus, the h-index excluding self-citations is lower than the h-index with self-citations included. Google Scholar covers a wider range of publication types; therefore the h-index is higher there.
Google Scholar reports both an h-index for all years and an h-index restricted to recent years. The h-index alone does not give a complete picture of the performance of an individual researcher or research group: it underrepresents the impact of the most cited publications and disregards the long tail of rarely cited ones. In particular, the h-index cannot exceed the total number of publications of a researcher, so the impact of researchers with a short scientific career may be underestimated and their potential remain undiscovered.
The share of self-citations decreases when citations are traced over longer periods, and the highest share of self-citations is found among the least cited papers. Citing is an activity that maintains intellectual traditions in scientific communication. Citations are used for reasons of dialogue and express participation in an academic debate; they are aids to persuasion, and assumed authoritative documents are selected to underpin further research.
However, citations may be motivated by other reasons as well. Note that scholarly communication varies from field to field. Comparisons across different fields are therefore problematic. However, there are attempts to make citation indicators field independent. Citations are basic units measuring research output.
Citations are regarded as an objective, or at least less subjective, measure of impact. They are used in addition to, or as a substitute for, peer judgments, and there is a strong correlation between peer judgments and citation frequencies. For this reason, citations are relied on as indicators of quality.
Citation data vary from database to database, depending on the coverage of content of the database. Furthermore, two problematic factors are the different motivations for citing and the considerable skewness of the distribution of citations. When sorting a set of publications by the number of citations received, the distribution shows a typically skewed, exponential-like pattern.
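The practical consequence of this skewness is that different "averages" tell very different stories. A small sketch with an invented but typically shaped citation distribution (a few highly cited papers, a long tail of rarely cited ones):

```python
import statistics

# Invented citation counts with a typical skewed shape.
citations = [250, 90, 40, 12, 7, 4, 3, 2, 1, 1, 0, 0, 0, 0, 0]

mean = statistics.mean(citations)
median = statistics.median(citations)

print(f"mean   = {mean:.1f}")  # pulled far upward by the few highly cited papers
print(f"median = {median}")    # closer to the "typical" paper
```

Here the mean is above 27 while the median is only 2: the handful of highly cited papers dominates the mean, which is why mean-based indicators can misrepresent a typical publication in the set.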
Works which have already been cited are more visible and are more easily cited again (the vertical tail in the figure), while other works remain hidden and are hardly ever cited (the horizontal tail in the figure). This phenomenon is referred to as the Matthew effect in science.
What is the problem with skewed distributions? Skewed patterns make it difficult to determine an average citation count; different approaches may be applied (see the figure). Being aware of how academic performance is evaluated allows you to make informed decisions and devise strategies to build and document your impact, thereby improving your career prospects. Our general advice centres on making your work visible, accessible and understandable. Make your work accessible to other researchers by adopting open science practices.
Long vertical or horizontal tails distort measures of central tendency; the mean in particular is pulled toward the tail.