About Academinence

An open data initiative mapping academic excellence across the world's leading research institutions, powered by open standards and transparent methodology.

Our Mission

Academinence provides a transparent, comprehensive view of research excellence by aggregating data on prestigious academic awards and mapping them to standardized institution identifiers.

By using open standards like the Research Organization Registry (ROR), ORCID, and Wikidata, we create a dataset that integrates seamlessly with the broader research infrastructure ecosystem.

This project isn't about rankings for their own sake. It's about making the data around academic achievement more accessible, queryable, and useful for researchers, institutions, and anyone interested in patterns of excellence in academia.

Methodology

Data Collection

We aggregate data from Wikidata, official award organization APIs, and structured databases, cross-referencing with open knowledge bases.
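The cross-referencing step can be sketched as merging records from several sources on a shared persistent identifier. The record shapes and field names below are illustrative assumptions, not the project's actual schema; the idea is that a laureate seen in Wikidata and in an official award API collapses into one entry keyed on a Wikidata QID.

```python
def merge_records(sources):
    """Merge award records from several sources, keyed on Wikidata QID.

    Later sources fill in fields missing from earlier ones, so the same
    person reported by two sources becomes a single combined record.
    """
    merged = {}
    for records in sources:
        for rec in records:
            qid = rec["wikidata_qid"]
            entry = merged.setdefault(qid, {})
            for field, value in rec.items():
                entry.setdefault(field, value)  # keep the first value seen
    return merged

# Illustrative inputs (hypothetical shapes, real QID for Einstein):
wikidata = [{"wikidata_qid": "Q937", "name": "Albert Einstein"}]
nobel_api = [{"wikidata_qid": "Q937",
              "award": "Nobel Prize in Physics", "year": 1921}]
combined = merge_records([wikidata, nobel_api])
# combined["Q937"] now carries name, award, and year in one record
```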

Organization Matching

Using the ROR affiliation API, we map institutional names to standardized identifiers for consistent aggregation.
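A minimal sketch of that lookup: ROR's organizations endpoint accepts a free-text `affiliation` parameter and returns candidate matches, at most one of which is flagged as `chosen`. The response below is an abridged, illustrative example of that shape, not a captured API reply.

```python
import urllib.parse

ROR_API = "https://api.ror.org/organizations"

def affiliation_query_url(raw_affiliation: str) -> str:
    """Build a ROR affiliation-matching request for a free-text name."""
    return ROR_API + "?" + urllib.parse.urlencode({"affiliation": raw_affiliation})

def chosen_ror_id(response_json: dict):
    """Return the ROR ID of the match the service marks as 'chosen', if any."""
    for item in response_json.get("items", []):
        if item.get("chosen"):
            return item["organization"]["id"]
    return None

# Abridged example of the response shape (illustrative values):
sample = {"items": [{"chosen": True,
                     "organization": {"id": "https://ror.org/03vek6s52",
                                      "name": "Harvard University"}}]}
# chosen_ror_id(sample) -> "https://ror.org/03vek6s52"
```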

Researcher Identification

We link laureates to ORCID profiles and Wikidata entities, connecting to their full body of research work.
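One cheap sanity check when linking laureates to ORCID profiles is validating the iD itself: the last character of an ORCID iD is a checksum over the preceding 15 digits, computed with ISO 7064 MOD 11-2. A minimal sketch:

```python
def orcid_checksum_ok(orcid: str) -> bool:
    """Validate an ORCID iD's check character (ISO 7064 MOD 11-2).

    The first 15 digits feed the checksum; the 16th character is the
    check digit, where a remainder of 10 is written as 'X'.
    """
    chars = orcid.replace("-", "")
    if len(chars) != 16 or not chars[:15].isdigit():
        return False
    total = 0
    for ch in chars[:15]:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return chars[15].upper() == expected

# ORCID's documentation sample iD:
orcid_checksum_ok("0000-0002-1825-0097")  # True
```

Catching a mistyped or truncated iD this way is much cheaper than discovering the bad link after a failed profile lookup.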

Open Standards

All data uses open standards and persistent identifiers, ensuring interoperability with research infrastructure.

Prestige Score Methodology

We use a weighted scoring system that reflects the relative prestige and impact of different awards. This allows fairer comparison than raw award counts.

Scoring Formula

Prestige Score = (Tier 1 × 10) + (Tier 2 × 3) + (Tier 3 × 1)

This weighting means a single Nobel Prize (Tier 1) contributes as much to the prestige score as 10 Tier 3 awards or approximately 3 Tier 2 awards. The weights reflect our assessment of relative global recognition and career impact.
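In code, the formula above is just a weighted sum over per-tier award counts:

```python
def prestige_score(tier1: int, tier2: int, tier3: int) -> int:
    """Weighted prestige score: Tier 1 x10, Tier 2 x3, Tier 3 x1."""
    return tier1 * 10 + tier2 * 3 + tier3 * 1

# One Tier 1 award contributes as much as ten Tier 3 awards:
prestige_score(1, 0, 0) == prestige_score(0, 0, 10)  # True
# Three Tier 2 awards come close, at 9 points:
prestige_score(0, 3, 0)  # 9
```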

Tier 1 (weight ×10): Apex Awards

The absolute pinnacle of academic achievement. Near-universal recognition, transformative impact on their field, and typically substantial prize money (often $500K+).

Tier 2 (weight ×3): Elite Awards

Often described as "second only to Nobel" in their fields. Many recipients go on to win Nobel Prizes. Recognizes sustained excellence and breakthrough contributions.

Tier 3 (weight ×1): Prestigious Awards

Leading prizes within specific subfields, historic society medals with centuries of tradition, and nationally prominent awards that signal top-tier achievement.

Why Weighted Scoring?

Raw counts can mislead: An institution with 100 society medals but no Nobel Prizes would rank above one with 10 Nobels under a simple count system.

Weights reflect consensus: While subjective, our tier assignments broadly align with how the academic community perceives award prestige.

Transparency over perfection: We publish our methodology so users can understand and critique our choices. You can also toggle to view raw counts.

Our Principles

  • Transparency in methodology and data sources
  • Use of open standards (ROR, ORCID, Wikidata)
  • Weighted scoring to reflect award prestige
  • Regular updates as new awards are announced
  • Clear acknowledgment of limitations
  • Respect for data licenses and attribution

Limitations

We strive for accuracy, but users should be aware of the following limitations.

Historical Gaps

Some older awards lack detailed affiliation data, particularly before 1950.

Institutional Complexity

Researchers may have multiple affiliations that complicate attribution.

Western Bias

Many tracked awards have historically favored Western institutions.

Selection Subjectivity

Our list of "prestigious" awards involves subjective judgments about importance.

Tier Assignments

Award tier classifications represent our assessment and may differ from others' views.

Built on Open Standards

We use established open standards to ensure our data integrates with the global research infrastructure.

Built with care for the research community

Data sources: Nobel Prize API, ROR, ORCID, Wikidata, CWTS Leiden Ranking