"Objects or chunks of code that can function independently, where changes made to one part of a piece of software will not affect others." (Gavin Powell, "Beginning Database Design", 2006)
"A component or device with an input and an output, whose inner workings need not be understood by or accessible to the user." (Judith Hurwitz et al, "Service Oriented Architecture For Dummies" 2nd Ed., 2009)
"A risk model that lacks transparency of its specific risk assumptions, measures, and findings. These models sometimes create as many risks for the organization as they are meant to manage." (Annetta Cortez & Bob Yehling, "The Complete Idiot's Guide® To Risk Management", 2010)
"A component or device with an input and an output whose inner workings need not be understood by or accessible to the user." (Marcia Kaufman et al, "Big Data For Dummies", 2013)
"Is a metaphor describing how people are unable to see or understand how technologies work and is particularly used to characterize the lack of understanding of how an algorithm works. While we can understand the outputs of artificial intelligence (AI) - in terms of recommendations, decisions and so on - the processes to achieve them are too complicated for us to understand. Concerns about the black box nature of AI center on its apparent lack of accountability, potential unseen biases and the inability to have clear visibility into what is driving an AI’s potentially life-changing decisions." (Accenture)