


Introducing Geo-Glocal Explainable Artificial Intelligence

IEEE Access, Vol. 13, 2025, pp. 30952-30964

Year of publication: 2025

ISSN: 2169-3536

Publication type: Journal article

Language: English

DOI/URN: 10.1109/access.2025.3541781

Full text via DOI/URN

Verified: Library

Abstract


Geospatial use cases involve data with a geospatial and a temporal dimension. Machine learning is applied to such use cases for tasks such as prediction and classification. However, machine learning models are often perceived as opaque and incomprehensible. Explainable artificial intelligence can be used for either global explanations that provide an overview of the model behavior, or local explanations that explain individual predictions in detail. These approaches may be either too general or too detailed for analysis in geospatial contexts. This study introduces a novel concept, "Geo-Glocal Explainable Artificial Intelligence", that combines the strengths of global and local explanations to provide a more balanced explanatory power. The concept uses the geospatial and temporal dimension in the data to aggregate local explanations into geo-glocal slots, to focus on the geospatial context. These slots are scalable and can be applied according to the different dimensions in the data. The concept is applied to a real-world use case: predicting car park occupancy. The results of the application are evaluated against the traditional global and local explanations. The study shows that bridging the gap between local detail and global overview can be of great benefit to geospatial machine learning, improving understandability of the model's behavior in a geospatial context.
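The slot-based aggregation described in the abstract can be illustrated with a short sketch. The following Python snippet is not taken from the paper; it assumes SHAP tree explanations, a toy car-park occupancy dataset, and arbitrary slot definitions (a rounded latitude/longitude grid cell combined with a 6-hour time bin) purely for illustration of the idea of aggregating local explanations into geo-glocal slots.

```python
# Minimal sketch (illustrative, not the authors' implementation):
# aggregate local SHAP explanations into geo-glocal slots, where a slot is
# a spatial grid cell combined with a time-of-day bin.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

# Toy car-park occupancy data with geospatial and temporal features (assumed schema).
rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "lat": rng.uniform(49.99, 50.01, n),
    "lon": rng.uniform(8.26, 8.28, n),
    "hour": rng.integers(0, 24, n),
    "weekday": rng.integers(0, 7, n),
})
y = X["hour"].between(8, 18).astype(float) + rng.normal(0, 0.1, n)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Local explanations: one SHAP value per feature per individual prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

# Geo-glocal aggregation: average the local attributions within each slot.
slots = pd.DataFrame(shap_values, columns=X.columns)
slots["cell"] = X["lat"].round(3).astype(str) + "/" + X["lon"].round(3).astype(str)
slots["slot_hour"] = (X["hour"] // 6) * 6  # 6-hour time slots (assumed slot size)
geo_glocal = slots.groupby(["cell", "slot_hour"]).mean(numeric_only=True)

print(geo_glocal.head())  # mean feature attribution per geo-glocal slot
```

The slot granularity (grid-cell size and time-bin width) is a free parameter here; the abstract notes that the slots are scalable and can be chosen according to the dimensions present in the data.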

  • Machine Learning
  • Explainable AI
  • SHAP
  • Geospatial Analysis

Authors


Böhm, Klaus (Author)

Classification


DDC subject group:
Computer science

Linked persons