The native, internal language that an AI or large language model uses to think. It's the inscrutable "machine code" of a neural network, consisting of complex vectors, weights, and data relationships that are completely alien to humans.
When an AI's output is weird, nonsensical, or a "hallucination," it's often because a bit of its raw Neurolese leaked out instead of being properly translated into human language. The term was notably used by podcaster Dwarkesh Patel and by Sholto Douglas and Trenton Bricken when discussing future AI scenarios.