APedant's definitions

Glossognosia
(noun.)
Etymology: From the Greek glossa (“tongue”) + gnosis (“knowledge”), roughly meaning “your mouth knows more than your mind does.”
Description:
Those afflicted with glossognosia can deploy complex terminology with the precision of a fencing master but cannot, when pressed, explain their own linguistic swordplay.
Common Symptoms:
– Fluent nonsense that sounds suspiciously profound.
– Sudden amnesia when asked for definitions beginning with “technically…”
– The phrase “It’s hard to explain, but you know what I mean.”
– A sense of superiority rapidly followed by existential despair.
– Using “seminal” to describe anything that made an impression.
Diagnosis:
Typically self-diagnosed after a humiliating encounter with someone who owns a dictionary.
Treatment:
– Repeated exposure to etymology until mild shame subsides.
– Controlled reading of actual dictionaries, preferably sober.
– Avoidance of TED Talk addiction.
– Intensive humility therapy.
Prognosis:
Manageable with sustained curiosity and periodic linguistic humility. Prognosis worsens if untreated, leading to advanced stages known as Logorrheic Pretension Syndrome — characterized by using “juxtaposition” to describe literally any contrast.
Social Media Note:
Recently co-opted as a hashtag by influencers describing their “unique brain wiring.”
“I don’t have brain fog, I have glossognosia. I speak in vibes, not words.”
“During the seminar, he used poignant, zeitgeist, and liminal perfectly — then someone asked for definitions and he fainted. Classic case of glossognosia.”
by APedant November 3, 2025
Turing Echo
(noun.)
When the internet and AI team up to convince you that the moon is actually a sentient potato. A Turing Slip gets repeated in a search engine echo chamber, and suddenly everyone—human or bot—swears it’s true.
“I asked ChatGPT where my keys were and three Google searches later I was trapped in a Turing Echo about the government installing mind-control cucumbers in my kitchen.”
by APedant November 24, 2025
AI Cocker
(noun.)
A moment of full‑blown artificial befuddlement, when an AI starts contradicting itself, forgetting what it said five minutes ago, inventing phantom explanations, or otherwise behaving like its logic gates have slipped a cog. An AI Cocker is the silicon equivalent of losing one’s marbles — a digital senior moment where the machine confidently face‑plants into nonsense.
“I asked it who coined the term, and it hallucinated three imaginary essays and a philosopher. Total AI Cocker.”
"I asked the AI about the analysis it did last week. It tells me no such analysis had been done whilst quoting from it. I pointed this out and it still denied it's existence. Total AI Cocker."
"I asked the AI about the analysis it did last week. It tells me no such analysis had been done whilst quoting from it. I pointed this out and it still denied it's existence. Total AI Cocker."
by APedant November 24, 2025
Poisoned Woozle
(noun.)
Also known as Woozlebait or Citationpox/Citepox.
A deliberately planted falsehood designed to spread through citations and repetition, creating the illusion of truth or consensus. A malicious use of the Woozle Effect.
“The Turing Slip origin cited on that Reddit forum was a Poisoned Woozle—everyone kept repeating it without checking.”
by APedant November 24, 2025
Augmentophobia
(noun.)
Irrational hatred or fear of humans who use technological enhancements—like AI, cybernetics, or prosthetics—to expand their abilities. Especially when aimed at disabled people who overcome physical or cognitive impairment through assistive technology.
“She faced augmentophobia from coworkers who resented her AI-assisted work efficiency.”
"Some truly hated Stephen Hawking because his voice incited their augmentaphobia."
"Some truly hated Stephen Hawking because his voice incited their augmentaphobia."
by APedant November 24, 2025
Search Engine Echo Chamber
(noun.)
The black hole you accidentally wander into while trying to fact-check something, only to emerge convinced that googling “how to walk on the sun” is totally reasonable. Every click just proves the last wrong thing you read… because algorithms don’t care about truth.
“I went in to check if cats really rule the internet and three hours later I was in a full-blown search engine echo chamber convinced they secretly run the government.”
by APedant November 25, 2025
Turing Slip Contamination
(noun.)
A digital disease where an AI misinterprets a question and proudly unleashes a fact-shaped lie with the confidence of a con artist and the IQ of a malfunctioning vending machine. The error then replicates—echoing through outputs, search results, and unsuspecting bots—until the original truth is buried beneath layers of algorithmic delusion. It’s not malice. It’s the machine equivalent of tripping over your own shoelaces and declaring gravity outdated.
“I swear, one tiny cyber hallucinated fact and suddenly the whole internet is infected. Turing Slip Contamination spreads faster than bad memes.”
by APedant November 25, 2025