Computer hallucination
A computer hallucination is an interpretation error in artificial intelligence (AI) machine vision and machine learning technology. Computer hallucinations cause AI systems to misclassify inputs they might otherwise classify correctly, and they can arise for a number of reasons. For example, a visual system may interpret a shiny stone as an eye, and then, because it has misinterpreted that one feature, continue to misinterpret the surrounding features until it has built up an animal. This kind of visual misinterpretation was popularized by Google's DeepDream software, which would process and reprocess a given image, reinterpreting it and enhancing features according to its learned biases until those features were drawn into landscapes, creating images reminiscent of a psychedelic experience.
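The iterative "dream" loop can be sketched in a few lines of PyTorch. This is a minimal illustration of the idea rather than Google's released code; the VGG16 backbone, the layer index, and the step size are all arbitrary assumptions.

# Minimal DeepDream-style sketch. Model choice, layer index, and step size
# are illustrative assumptions, not the original implementation.
import torch
import torchvision.models as models

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

def dream_step(image, layer_index=20, lr=0.05):
    """Nudge the image so the chosen layer's activations grow stronger."""
    image = image.clone().requires_grad_(True)
    activations = image
    for i, layer in enumerate(model):
        activations = layer(activations)
        if i == layer_index:
            break
    # Maximize the layer's response: whatever features the network
    # "sees" in the image get exaggerated with each step.
    loss = activations.norm()
    loss.backward()
    with torch.no_grad():
        image += lr * image.grad / (image.grad.abs().mean() + 1e-8)
    return image.detach()

image = torch.rand(1, 3, 224, 224)  # placeholder input image
for _ in range(20):                 # repeated reprocessing, as described above
    image = dream_step(image)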
Computer hallucinations can also be triggered deliberately by repurposing the backpropagation algorithm that is normally used to correct errors during AI training. Instead of adjusting the network's weights to improve accuracy, the gradient is applied to the input image itself, adjusting it to emphasize features that push the classifier toward what would otherwise be an incorrect response.
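One common form of this repurposing is the fast gradient sign method (FGSM): backpropagation computes the gradient of the loss with respect to the input image, and the image is nudged along that gradient to increase the loss rather than decrease it. The sketch below is a hedged illustration; the pretrained ResNet-18, the perturbation size epsilon, and the placeholder inputs are assumptions, not part of the original description.

# Minimal FGSM sketch: misuse backpropagation to perturb the input,
# not the weights. Model and epsilon are illustrative assumptions.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_attack(image, true_label, epsilon=0.03):
    """Perturb the image in the direction that INCREASES the loss,
    the opposite of what weight updates aim for during training."""
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Backpropagation yields the gradient with respect to the input image;
    # stepping along its sign pushes the classifier toward a wrong answer.
    return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

image = torch.rand(1, 3, 224, 224)       # placeholder input
label = torch.tensor([207])              # placeholder ImageNet class index
adversarial = fgsm_attack(image, label)
print(model(adversarial).argmax(dim=1))  # typically shifts away from the label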
Computer hallucinations can also occur in audio. Eerily, an adversarially perturbed segment of Bach's Cello Suite No. 1 was reported to be transcribed by a speech-to-text system as "Speech can be embedded in music" by Nicholas Carlini, a research scientist at Google Brain.