23 PNI

Note for Pathologists: Perineural Invasion (PNI) detection rates across all interpreters (pathologists with/without AI, and the AI model itself).

| Interpreter | PNI Negative | PNI Positive | Total |
|---|---|---|---|
| P1 | 768 (95%) | 42 (5.2%) | 810 (100%) |
| P1AI | 770 (95%) | 40 (4.9%) | 810 (100%) |
| P2 | 788 (97%) | 22 (2.7%) | 810 (100%) |
| P2AI | 780 (96%) | 30 (3.7%) | 810 (100%) |
| P3 | 782 (97%) | 28 (3.5%) | 810 (100%) |
| P3AI | 774 (96%) | 36 (4.4%) | 810 (100%) |
| P4 | 780 (96%) | 30 (3.7%) | 810 (100%) |
| P4AI | 768 (95%) | 42 (5.2%) | 810 (100%) |
| AI | 684 (84%) | 126 (16%) | 810 (100%) |
| Total | 6,894 (95%) | 396 (5.4%) | 7,290 (100%) |
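The table above is a plain cross-tabulation of per-case calls by interpreter. As a minimal sketch (not the authors' actual pipeline), assuming a hypothetical long-format data frame `reads` with one row per (case, interpreter) call, the counts and row percentages could be reproduced in R along these lines; all object and column names here are assumptions:

```r
# Toy stand-in for the real data: 810 cases read by each of 9 interpreters.
set.seed(1)
interpreters <- c("P1", "P1AI", "P2", "P2AI", "P3", "P3AI", "P4", "P4AI", "AI")
reads <- data.frame(
  interpreter = factor(rep(interpreters, each = 810), levels = interpreters),
  pni         = sample(c("Negative", "Positive"), 9 * 810,
                       replace = TRUE, prob = c(0.95, 0.05))
)

tab <- table(reads$interpreter, reads$pni)    # counts per interpreter
cbind(tab, Total = rowSums(tab))              # counts with row totals
round(100 * prop.table(tab, margin = 1), 1)   # row percentages, as shown above
```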
23.0.1 PNI agreement

PNI agreement: Pathologists no AI

Note for Pathologists: Inter-rater agreement for PNI detection among the 4 pathologists, WITHOUT AI assistance.
Interrater Reliability: Fleiss' Kappa for m Raters

| Statistic | Value |
|---|---|
| Subjects | 810 |
| Raters | 4 |
| Agreement (%) | 94.93827 |
| Kappa | 0.6195544 |
| z | 43.19143 |
| p-value | < .0000001 |
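Agreement statistics like these can be produced with the R package irr; a minimal sketch, assuming the four pathologists' calls are held in an 810 × 4 matrix (the matrix, its toy contents, and all names below are hypothetical):

```r
library(irr)  # provides agree() and kappam.fleiss()

# Toy stand-in for the real calls: 810 cases (rows) x 4 pathologists (columns).
set.seed(1)
pni_calls <- matrix(
  sample(c("Negative", "Positive"), 810 * 4, replace = TRUE, prob = c(0.95, 0.05)),
  nrow = 810, ncol = 4,
  dimnames = list(NULL, c("P1", "P2", "P3", "P4"))
)

agree(pni_calls)          # percentage of cases on which all four raters agree
kappam.fleiss(pni_calls)  # chance-corrected Fleiss' kappa, with z and p-value
```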
PNI agreement: Pathologists with AI

Note for Pathologists: Inter-rater agreement for PNI detection among the 4 pathologists, WITH AI assistance.
Interrater Reliability: Fleiss' Kappa for m Raters

| Statistic | Value |
|---|---|
| Subjects | 810 |
| Raters | 4 |
| Agreement (%) | 94.32099 |
| Kappa | 0.6554316 |
| z | 45.69256 |
| p-value | < .0000001 |
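A note on reading these numbers: raw agreement is high in both settings (about 94–95%) while kappa is only moderate (0.62 without AI, 0.66 with AI). This is expected when one category dominates. Fleiss' kappa corrects observed agreement for the agreement expected by chance,

$$\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e},$$

where $\bar{P}$ is the mean observed pairwise agreement across subjects and $\bar{P}_e$ the agreement expected from the marginal category proportions. With positives called on only about 5% of reads, $\bar{P}_e \approx 0.95^2 + 0.05^2 = 0.905$, so chance alone already produces high agreement and kappa sits well below the raw agreement percentage.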