In programming, a naming convention in which every variable is prefixed so you can tell its datatype just by looking at the name. Microsoft likes Hungarian notation; I do not. It is useful when naming GUI objects, though.
non-Hungarian notation:
int number;
string name;
double value;
Hungarian notation:
int nNumber;
string strName;
double dValue;
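For the GUI case mentioned above, the prefix usually encodes the control type rather than a primitive datatype (btn for button, txt for text box, lbl for label). A minimal sketch, assuming those prefixes; the helper function and names are invented for illustration, not part of any standard:

```cpp
#include <string>

// Hypothetical helper: pull the Hungarian-style prefix off a GUI
// variable name. The prefix is the leading run of lowercase letters,
// e.g. "btn" in btnSubmit, "txt" in txtUserName, "lbl" in lblStatus.
std::string hungarianPrefix(const std::string& name) {
    std::size_t i = 0;
    while (i < name.size() && name[i] >= 'a' && name[i] <= 'z') ++i;
    return name.substr(0, i);
}
```

Reading `btnSubmit` or `lblStatus` in code then tells you at a glance which kind of widget you are dealing with, without hunting for the declaration.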
by Dahhak October 13, 2004
Refers to the programming habit of naming variables so that a variable's intended use is clear from its name.

The original Hungarian notation was first described and used by Charles Simonyi, a Hungarian programmer who worked for Microsoft.
Using Hungarian notation, a variable called a_crszkvc30LastNameCol would imply a constant reference function argument, holding the contents of a database column of type VARCHAR(30) called LastName that is part of the table's primary key.
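A sketch of what a function taking such a parameter might look like in C++. Only the parameter name comes from the definition above; the function, its return value, and the use of std::string in place of a real VARCHAR(30) buffer are all invented for illustration:

```cpp
#include <string>

// Hypothetical function: the name a_crszkvc30LastNameCol marks the
// parameter as a const reference argument carrying a LastName
// VARCHAR(30) column value (approximated here by std::string).
std::string formatLastName(const std::string& a_crszkvc30LastNameCol) {
    return "Last name: " + a_crszkvc30LastNameCol;
}
```

The point of the example is that the whole type-and-role description lives in the name itself, which is exactly what fans like and critics hate about the scheme.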
by Praetexta August 21, 2007