The Moana Problem, first coined by Matthew Walsh circa 2024, is a theory suggesting there is really no cure for Racism, especially where children are concerned.
Basically the theory goes: you have two white children who both watch Disney movies. One white child gravitates towards the white princesses. She picks out white princess dolls, watches Disney movies with white princesses, etc. Which is bad. Then, on the other hand, you have the other white child, whose favorite princess is Moana. However, that child wants to be Moana for Halloween, which raises the issue of Cultural Appropriation if she were to wear the Pacific-Islander attire. So "no matter which way you go... you end up back in Racism".
by YourfavoriteConservative January 07, 2025
A situation that arises during software development in which one developer's build works perfectly (we call this the David build) whilst all other builds are completely fucked.
Dev 1: "Oh man, why isn't this building!!"
Dev 2: "What do you mean? It's building fine for me and we are running the same version.. ?"
Dev 1: "Why must you constantly create the David Problem..."
by Miss David January 28, 2022
Is how its supposed existence is demonstrated, right? They usually use this binary prompt-response scenario. Like "Think of a city. Now, did you pick the specific city, or was it random?" And I think that's the wrong way to conceptualize it.
Hym "So, my problem with determinism (at least in this example of determinism) is that although I don't choose the specific city, I still activate the 'mode' that searches for a city, and I can choose not to do it and prevent a city from coming to mind, OR I can misfire. It's like a hat with slips of paper in it and, on the slips of paper, are the names of cities. Now, you can prompt me to think of 'city.' I can choose to reach into the hat. And only then do I get a random city. But what I DON'T get is 'Nissan' or 'helicopter' or 'banana' or 'dog.' I activate the mode that searches for a city and I reach into the hat. See, as I have it conceptualized, thought exists in this nebulous, un-articulated format. So, to get language I need to activate some kind of process. And prompt-response ISN'T THE SAME as what I'm doing when I'm monologuing. I'm running that nebulous, un-articulated thought-matter through a lexicon that corresponds with my native language. But I am that which activates modes. I can turn it on or off like a switch. It can also misfire while I'm not paying attention. So, yeah... I think it's a failure to properly conceptualize and a failure to compartmentalize."
by Hym Iam December 02, 2023