Decimal notation is the most common type of numeral we use. When a number is written in decimal notation, each digit in the numeral represents a different number depending on the place value it is in. For example, in the numeral 365, the "3" represents the number 300, the "6" represents the number 60, the "5" represents the number 5, and finally "365" represents the sum of 300, 60, and 5.
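If you want to check that kind of breakdown mechanically, here is a minimal Python sketch (the function name `place_values` is just an illustrative choice) that splits a decimal numeral into the numbers its digits represent:

```python
def place_values(numeral):
    """Return the number each digit of a decimal numeral represents.

    For "365" this gives [300, 60, 5], whose sum is 365.
    """
    values = []
    for position, digit in enumerate(numeral):
        # A digit k places from the right stands for digit * 10**k.
        places_from_right = len(numeral) - 1 - position
        values.append(int(digit) * 10 ** places_from_right)
    return values

print(place_values("365"))       # [300, 60, 5]
print(sum(place_values("365")))  # 365
```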
In general, consider the following "numeral," written in decimal notation, where each of the letters $d_n, d_{n-1}, \ldots, d_1, d_0$ can stand for any integer from $0$ to $9$:

$$d_n\, d_{n-1} \cdots d_1\, d_0$$

This numeral represents:

$$d_n \cdot 10^n + d_{n-1} \cdot 10^{n-1} + \cdots + d_1 \cdot 10 + d_0$$
Yes, this explanation assumes the reader already basically knows how decimal notation works. It is so ingrained in math education that trying to avoid it would have been terribly confusing. If you are confused about what "10" means in the above expression, it is the number of slashes below:

//////////
In unary notation, you just use that many marks to indicate a number. So 10 is written in unary above. Good luck trying to write 93846793284756. Some notations we actually use: hexadecimal and binary notation are like decimal notation, but in base 16 and base 2 respectively, instead of base 10.
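Python's built-ins make it easy to see the same number in these other bases; here is a small sketch (the variable `n` is just the example value from above):

```python
n = 93846793284756

# The same number written in the three positional notations mentioned above.
print(format(n, "d"))  # decimal (base 10)
print(format(n, "x"))  # hexadecimal (base 16)
print(format(n, "b"))  # binary (base 2)

# Parsing goes the other way: read a numeral written in a given base.
print(int("ff", 16))   # 255
print(int("1010", 2))  # 10

# Unary really is just that many marks, which is why big numbers are hopeless.
print("/" * 10)        # //////////
```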