
Brockism

Brockism is one of four factions in the 2023 societal debate about the safety of Artificial General Intelligence (AGI) "x-risk" (short for Existential Risk), meaning the risk that AGI will lead to human extinction.

(1) The "normal" faction, which includes Satya Nadella and everyone on Wall Street. Normals say that we can deal with x-risk later.

(2) The "decel" faction (short for "decelerate"), which says to slow down AI research.

(3) The "e/acc" faction (short for "effective accelerationists"). This faction is a mix of fanatical techno-utopians (like Yann LeCun and Andrew Ng) and Twitter users who post macho memes and have a "lol let's watch the world burn" attitude.

(4) The "Brockism" faction (named after Greg Brockman). Brockists believe the way to reduce x-risk is to accelerate AI software research while halting or slowing semiconductor development. They believe that if chips are too fast, we could stumble into artificial superintelligence by accidentally inventing an algorithm that makes fuller use of existing chips. The gap between what we currently do with current chips and what we *could* do with them is what Brockists call the "capabilities overhang".

Evidence for the Brockist position may be found in the accomplishments of the retro-computing "demoscene", which uses innovative software to produce computer graphics on par with the late 1990s on some of the very oldest personal computers. en.wikipedia.org/wiki/Demoscene
Brockists, or those who agree with the AI safety approach called Brockism, believe that we should speed up AI software research but slow down semiconductor R&D as much as possible, in order to reduce the capabilities "overhang": the Brockist term for the potentially dangerous gap between what is *possible* with current semiconductors and what we *currently know how to do* with them.
by SPURSROCK184 November 22, 2023

Brockism

Brockism, or potentially Overhang Reductionism (see discussion in comments), is a proposed name for one of four viewpoints represented in the famous 2023 societal debate about AGI safety taking place at OpenAI. Thankfully, all four factions agree on the need to deal with x-risk, but disagree about how:

(1) The "normal" faction, which includes Satya Nadella and almost every businessperson both in VC and on Wall Street. Normals say (at least with their investment decisions, which speak infinitely louder than words) that we can deal with x-risk later.

(2) The "decel" faction (short for "decelerate"), which says to slow down AI research.

(3) The "e/acc" faction (short for "effective accelerationists") is a trendy, recent term for optimistic techno-utopianism, in the milieu of Vernor Vinge's stories.

(4) The "Brockist" faction (named after Greg Brockman). Brockists (who may or may not include Brockman himself, as the idea was inspired by him but his own views have yet to be verified) believe that the way to reduce x-risk is to accelerate AI software research while halting or slowing semiconductor development. They believe that if chips are too fast, we could stumble into inadvertently creating an unaligned artificial superintelligence by accidentally inventing an algorithm that makes fuller use of existing chips. The gap between what we currently do with current chips and what we *could* do with them is what Brockists call the "capabilities overhang".
Significant circumstantial evidence for Brockism may be found in the achievements of the retro-computing "demoscene", which uses innovative software to produce computer graphics on par with the late 1990s on some of the very oldest personal computers.
by brockist_agi November 26, 2023
