Trusting AI when the stakes are high
Published by Raytheon
When visiting a new city, a GPS app is a must-have. But when drivers are travelling in their own hometowns, they often disregard their GPS's directions, believing they know the best route. It's a matter of trust. Drivers don't understand how their app arrived at its conclusion, so they put more faith in their own instincts and sense of direction than in a machine's algorithm. As artificial intelligence and machine learning become more prevalent in society, trust and explainability are critical for their widespread adoption, especially in high-stakes environments such as healthcare, national security and weather prediction. Read now to discover more about how Raytheon Technologies is developing explainable AI systems.