Top definition
In programming, a naming convention where every variable is prefixed so you can tell its datatype just by looking at the name. Microsoft likes Hungarian notation; I do not. It is useful when naming GUI objects, though (see the sketch after the examples below).
non-Hungarian notation:
int number;
string name;
double value;
-------------------------
Hungarian notation:
int nNumber;
string strName;
double dValue;
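As a rough illustration of the GUI point above: a minimal C++ sketch, where Button, TextBox and Label are hypothetical stand-ins for whatever widget classes a toolkit actually provides; only the naming convention is the point.

// Hypothetical widget types, used only to show the naming style.
struct Button  {};
struct TextBox {};
struct Label   {};

Button  btnSubmit;   // "btn" prefix marks a button
TextBox txtUserName; // "txt" prefix marks a text box
Label   lblStatus;   // "lbl" prefix marks a label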
by Dahhak October 13, 2004
2
Refers to the programming habit of naming variables so that the variable's intended use is clear from its name.

The original Hungarian notation was first described and used by Charles Simonyi, a Hungarian programmer who worked for Microsoft.
Using Hungarian notation, a variable called a_crszkvc30LastNameCol would imply a constant reference function argument holding the contents of a database column of type VARCHAR(30) called LastName that is part of the table's primary key.
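A hedged sketch of how such an argument might appear in C++, reading the prefix as the description above does (constant reference argument, VARCHAR(30) column LastName from the primary key); the function name UpdateLastName is made up for illustration.

#include <string>

// The prefix documents how the value is passed and what it holds;
// none of it is checked by the compiler.
bool UpdateLastName(const std::string& a_crszkvc30LastNameCol)
{
    // Treat the argument as the LastName VARCHAR(30) primary-key column.
    return a_crszkvc30LastNameCol.size() <= 30;
}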
by Praetexta August 21, 2007