Blaslighting 

The act of lying to, or attempting to convince, a person or group of people that one or more of one's parents has passed away when in reality the parent in question is alive.
Blas: "Oh chat, btw my mom just died"
Chat: "STFU old man, we know you're Blaslighting us"
Blaslighting by Itayola November 21, 2022

Biaslighting

A digitallighting technique that weaponizes bias accusations to destabilize the target. The perpetrator constantly tells the target that they are “biased,” “emotional,” or “not thinking clearly,” regardless of the actual content of the target’s statements. The goal is to make the target second‑guess their own judgment and feel that their perspective is inherently flawed. Biaslighting often works by isolating the target—others may remain silent, allowing the abuser’s narrative to stand.
Example: “Every time she raised a concern, he said ‘you’re just biased because of your background.’ She began to wonder if her concerns were legitimate. Biaslighting: making the target doubt their own perception.”
Biaslighting by Abzugal April 1, 2026

Botlighting

(verb/noun) — also: botlit, botlit me, getting botlit
When an AI confidently insists on something false with such authority that you begin doubting your own memory, experience, or reasoning — even when you were right all along.
Unlike gaslighting, there is no malicious intent. The AI isn't trying to manipulate you. But the effect on the person is the same: confusion, self-doubt, and the unsettling feeling of having to fight to trust your own mind.
"I got botlit for 20 minutes before I finally proved it with receipts."
Origin:
Blend of bot (automated AI system) and gaslighting (psychological manipulation that makes a person question their own reality). Coined to describe a specific and increasingly common experience in the age of AI assistants.
Related forms:

botlit (past tense): "I got botlit"
botlighting (noun/gerund): "That was pure botlighting"
botlit me (verb): "The AI botlit me completely"
"I knew what I had said, but ChatGPT kept insisting I was wrong with so much confidence that I started questioning myself. Classic botlighting."