Inter-Annotator Agreement (IAA)

Data scientists have long used inter-annotator agreement (IAA) to measure how consistently multiple annotators make the same annotation decision for a given label category or class. This information is available directly through Datasaur's dashboards. The calculation can help you assess the clarity of your labeling task and the reproducibility of your results.

We support two algorithms to calculate the agreement between two annotators.
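As a rough illustration of what a pairwise agreement score captures, the sketch below compares two annotators' labels on the same items using Cohen's Kappa via scikit-learn. This is an illustrative stand-in metric only; it is not necessarily one of the algorithms Datasaur applies, and the annotator labels are hypothetical.

```python
# Illustrative sketch only: computes a pairwise agreement score between two
# annotators on the same items, using Cohen's Kappa as a stand-in metric.
from sklearn.metrics import cohen_kappa_score

# Hypothetical label decisions from two annotators on the same five items.
annotator_a = ["PERSON", "ORG", "ORG", "LOCATION", "PERSON"]
annotator_b = ["PERSON", "ORG", "PERSON", "LOCATION", "PERSON"]

kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Pairwise agreement (Cohen's Kappa): {kappa:.2f}")
```

A score of 1.0 means the two annotators agreed on every item after correcting for chance agreement; lower scores indicate the label category may need clearer guidelines.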

Note that we apply the scale interpretation of Krippendorff's Alpha for both methods, as shown in Image 1.

  • Discard will be presented in red.

  • Tentative will be presented in yellow.

  • Good will be presented in green.
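As a minimal sketch of how a score maps onto these three bands, the snippet below uses Krippendorff's commonly cited cutoffs (below 0.667 discard, 0.667 to 0.8 tentative, 0.8 and above good). These boundaries are an assumption for illustration; the exact cutoffs Datasaur applies are the ones shown in Image 1.

```python
# Hedged sketch: maps an agreement score to the three bands above using
# Krippendorff's commonly cited cutoffs (0.667 and 0.8). The boundaries
# Datasaur actually applies are shown in Image 1 and may differ.
def interpret_agreement(alpha: float) -> str:
    if alpha < 0.667:
        return "Discard (red)"
    if alpha < 0.8:
        return "Tentative (yellow)"
    return "Good (green)"

for score in (0.45, 0.72, 0.91):
    print(f"alpha={score:.2f} -> {interpret_agreement(score)}")
```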

✍ IAA is calculated in the background as soon as a project's status changes to Ready for Review (after all labelers mark the project as complete) or Complete (after a reviewer marks the project as complete).

Multiple Ways to View the Data

  1. Overview

    • You can view the Inter-Annotator Agreement page by navigating to Analytics in the left sidebar, selecting Team Overview, and then opening the Inter-Annotator Agreement tab. Please note that only admins can access this page.

    • You can also filter the IAA for a specific project.

  2. Project

    • The IAA is also available when you open a project's details. It shows the same IAA information as described above, filtered to that project.
