I'll start with a fun trick that I end up using quite a lot in number theory, which is largely about integers. The trick is that in the decimal system, if the sum of the digits of a number is divisible by 9, then that number must be divisible by 9. The simplest examples besides 9 are 18 and 27. Likewise, if the sum of the digits is divisible by 3, then the number is divisible by 3. And of course, if a number is even and the sum of its digits is divisible by 3, then that number is divisible by 6.

My favorite example? 111111111 = 9(12345679)
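The digit-sum trick is easy to check by machine. Here's a minimal sketch in Python (my choice of language here, not anything from the original trick) that tests the rules above on a few numbers:

```python
# Digit-sum divisibility: in base 10, a number is divisible by 9
# (or by 3) exactly when its digit sum is.
def digit_sum(n):
    return sum(int(d) for d in str(abs(n)))

# The examples from the text: 9, 18, 27, and 111111111
for n in (9, 18, 27, 111111111):
    assert digit_sum(n) % 9 == 0 and n % 9 == 0

# 48 is even and its digit sum 12 is divisible by 3, so 48 is divisible by 6
assert digit_sum(48) % 3 == 0 and 48 % 2 == 0 and 48 % 6 == 0
```

Nothing deep there, but it's a nice sanity check, and digit_sum is a handy little helper to keep around.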

Moving on: for an odd number, if it is a square, then it must be of the form 8j+1, where j is an integer. For the rest of this post, j will be used as an arbitrary integer. So 9 is an example, where j=1, and "of the form" just means you have a template for such numbers.

Often mathematics people, usually called mathematicians but not always, will give a proof of such a statement. One way is to state the claim and then immediately give the proof, for instance.

Every odd square must be of the form 8j+1, where j is an integer.

Proof: An odd number is of the form 2j+1 (stated without proof), squaring gives:

(2j+1)*(2j+1) = 4j^2 + 2*2j + 1 = 4j^2 + 4j + 1 = 4j(j+1) + 1

Proof complete.

Things considered obvious may not be fully explained, and sometimes you'll see "stated without proof" like above, and in other cases, nothing. For instance, I rely on the fact that j or j+1 must be even above without stating it, as it's considered obvious, so the last step of multiplying 4 by 2 to get 8 is left to the reader. Math people do that sort of thing a lot.
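The unstated step is that j(j+1), the product of two consecutive integers, is always even, so 4j(j+1) is divisible by 8. A quick Python sketch (again my choice of language) can verify both the proof's form and the final claim:

```python
# Every odd square leaves remainder 1 when divided by 8.
for j in range(1000):
    odd = 2 * j + 1
    # The algebraic form from the proof: odd^2 = 4j(j+1) + 1
    assert odd * odd == 4 * j * (j + 1) + 1
    # The "obvious" step: j or j+1 is even, so j(j+1) is even
    assert (j * (j + 1)) % 2 == 0
    # And the final claim: odd squares are of the form 8j+1
    assert (odd * odd) % 8 == 1
```

Of course a finite check like this is not a proof, which is exactly why the algebra above matters; but it's a good way to catch a mis-stated claim early.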

(In my writings I rarely if ever use the standard form above of clearly stating that proof is about to begin and noting the end of a proof.)

The use of asterisks "*" above is generally considered standard worldwide, but in America we tend not to use one when multiplication is obvious. Also, we write 10,000 for ten thousand, but in other countries that comma can mean something different, so at times I'll just use 10000, if the number isn't too long, to eliminate that confusion. Conventions can shift among countries, so it's not always easy to be sure you're saying what you think you are saying with math.

Exponents come up a lot in number theory, and 2 with an exponent is a favorite, so for instance 2^10 = 1024, which is kind of cool for also being the kilobyte with computers. The megabyte is 2^20 and the gigabyte is 2^30.

The logarithm is just going backwards, for instance, for 2^x = 1024, x = 10, which is called the logarithm. Logarithms are easier to look at with integers. Things can look way more complicated when they are non-rational.

Math people actually distinguish between non-rational and irrational. So for maximum flexibility I'll typically use the term "non-rational" in my writing.

The discrete logarithm is a little different still as it uses something called "mod" which was introduced by Carl Gauss, one of the greatest mathematicians in human history.

So, for instance, with k^j = q mod N, you find j for a given q, which is called the residue, and j is the discrete logarithm.
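For small numbers you can find a discrete logarithm by brute force, just trying exponents in turn. Here's a minimal Python sketch (the function name and the example values are my own, purely illustrative; real applications use much larger N, where brute force is hopeless):

```python
# Brute-force discrete logarithm: given k, q, N, find j such that
# k**j = q mod N. Only feasible for small N; illustrative only.
def discrete_log(k, q, N):
    value = 1
    for j in range(N):
        if value == q % N:
            return j
        value = (value * k) % N
    return None  # no such j exists

# Example: 2**j = 5 mod 11. Since 2**4 = 16 = 11 + 5, j = 4.
assert pow(2, 4, 11) == 5
assert discrete_log(2, 5, 11) == 4
```

The difficulty of doing this efficiently for large N is what makes the discrete logarithm interesting in cryptography.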

I have a post which discusses this more: Focus on modular arithmetic

The term "mathematician" can be specialized or vary depending on your country. Here in the US, Gauss is considered a mathematician, but Albert Einstein is usually considered to be a physicist, while Sir Issac Newton is also usually considered to be a physicist, but he was also one of history's greatest mathematicians. I do not consider myself to be a mathematician, but also I'm not considered a physicist in this country because I do not have a physics Ph.D, as I only have an undergraduate degree, and do not work professionally in the field.

Some of the greatest mathematicians in history were gentlemen, which for a time meant men who didn't have to work for a living, while others, like Pierre Fermat, had professions and did mathematical research to keep their minds occupied.

Mathematics as a distinct and separate discipline didn't emerge until relatively recently, in the late 1800's, when, for instance, relying now a lot on the Wikipedia, the London Mathematical Society was founded, in 1865.

A lot of debates about modern mathematics go back to the late 1800's and early 1900's with major figures in those debates being Peano, Cantor, Russell, Hilbert, and of course GĂ¶del, among others.

One of my claims is something that others might dispute, as it may be surprising that mathematics as a clearly separate and *distinct* field is relatively young--less than 200 years old--but for most of its history, mathematics has been a valuable tool that was not considered a separate academic field of study, like say, physics. So, much of its foundation as a separate discipline was established in the very early 1900's.

That is important for a lot of my research, in explaining how certain mistakes could have gained dominance in the field. I suggest they did so because it is a very young field academically, while mathematics itself is as old as human civilization.

And that is a rough overview of some basic math information that occurs to me as I sit here typing away. It's not meant to be exhaustive, nor extremely in-depth, as mathematics is incredibly huge as a body of information and is one of the most important humanity has ever produced, touching the lives of every single human being, every single day.

James Harris