Correct option is B
Introduction:
1. Confusion Matrix shows True Positives (TP), False Positives (FP), True Negatives (TN), and False Negatives (FN).
2. Contingency Tables help identify patterns and discrepancies in classification.
3. Error Matrices provide metrics like Overall Accuracy, User's Accuracy, and Producer's Accuracy.
4. Accuracy metrics derived from these matrices include:
· Precision: TP / (TP + FP)
· Recall (Sensitivity): TP / (TP + FN)
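The two formulas above can be sketched in a few lines of Python; the TP/FP/FN/TN counts below are illustrative values, not taken from the question:

```python
# Illustrative confusion-matrix counts (hypothetical values)
tp, fp, fn, tn = 40, 10, 5, 45

precision = tp / (tp + fp)  # TP / (TP + FP)
recall = tp / (tp + fn)     # TP / (TP + FN), also called sensitivity

print(f"Precision: {precision:.2f}")  # 40/50 = 0.80
print(f"Recall:    {recall:.2f}")     # 40/45 ≈ 0.89
```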
Information Booster:
1. Confusion Matrix (B):
· A table used to describe the performance of a classification model.
· Shows the number of correct and incorrect predictions across various classes.
2. Contingency Table (C):
· Similar to a confusion matrix, used in statistics to compare observed and predicted classifications.
· Helps analyze the performance of classification algorithms.
3. Error Matrix (D):
· Another term for a confusion matrix, commonly used in remote sensing to assess classification accuracy.
· Summarizes the classification results and errors.
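A short sketch of how the error-matrix metrics named above are derived, using a hypothetical 2-class matrix (rows = predicted class, columns = reference class):

```python
# Hypothetical 2-class error matrix: rows = predicted, columns = reference
matrix = [
    [30, 5],   # predicted class 0: 30 correct, 5 confused with class 1
    [10, 55],  # predicted class 1: 10 confused with class 0, 55 correct
]

total = sum(sum(row) for row in matrix)
diagonal = sum(matrix[i][i] for i in range(len(matrix)))

# Overall Accuracy: correctly classified samples over all samples
overall_accuracy = diagonal / total

# User's Accuracy: correct predictions within each predicted-class row
users_accuracy = [matrix[i][i] / sum(matrix[i]) for i in range(len(matrix))]

# Producer's Accuracy: correct predictions within each reference-class column
producers_accuracy = [matrix[i][i] / sum(row[i] for row in matrix)
                      for i in range(len(matrix))]

print(overall_accuracy)  # 85 / 100 = 0.85
```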
Additional Knowledge:
1. Commission Table (A):
· Refers to a specific type of error (commission error) in classification, but it's not used as a comprehensive accuracy measure.
2. Transition Matrix (E):
· Used to describe state changes over time, such as land cover changes, but not typically used for evaluating classification accuracy.