Monday, July 28, 2014

Why the object ring?

Faced some important things over a decade ago, which forced me to question how I thought I knew certain things, and as a result I worked out several solutions. I'll talk about an important one now.

When Gauss considered what were eventually called gaussian integers, he challenged conventional thinking about numbers that are not integers.

For instance 1+sqrt(2) behaves a lot like an integer. It can be said to not have certain numbers as factors; for instance, 3 is coprime to it in the ring Z[sqrt(2)]. Multiplied by another such number, 1-sqrt(2), it gives an integer: their product is -1. And mathematicians pondered integer-like numbers as something that needed to be categorized.

And they concluded that algebraic integers did the trick. Algebraic integers were simple too! They were just roots of monic polynomials with integer coefficients.
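Both claims are easy to verify exactly. Here is a small sketch of my own for this post, representing a + b*sqrt(2) as the integer pair (a, b) so no floating point is involved:

```python
def mul(p, q):
    """Multiply numbers a + b*sqrt(2): (a,b)*(c,d) = (ac + 2bd, ad + bc)."""
    a, b = p
    c, d = q
    return (a * c + 2 * b * d, a * d + b * c)

# (1 + sqrt(2)) * (1 - sqrt(2)) = -1, an ordinary integer,
# which is why 1 + sqrt(2) is a unit in Z[sqrt(2)].
assert mul((1, 1), (1, -1)) == (-1, 0)

# And 1 + sqrt(2) is an algebraic integer: it is a root of the
# monic polynomial x^2 - 2x - 1, which has integer coefficients.
x = (1, 1)
x_squared = mul(x, x)  # (3, 2), that is 3 + 2*sqrt(2)
residue = (x_squared[0] - 2 * x[0] - 1, x_squared[1] - 2 * x[1])
assert residue == (0, 0)
```

The pair representation keeps everything in exact integer arithmetic, so the checks are proofs by computation rather than floating-point approximations.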

Various tests were done which seemed to indicate these were the solution, including all integer-like numbers and the integers themselves. Tests like determining that they were infinitely decomposable into algebraic integers, and that the product of two algebraic integers was always an algebraic integer, gave confidence to those who believed in them.

However, I came across mathematics that forced me to question whether those tests were enough, and I began to ponder the issue of "integer-like" with the need to resolve what looked like contradictions.

Eventually I focused on units. With integers, of course, 1 and -1 are the only units, but there are infinitely many units in a ring like Z[sqrt(2)], and 1+sqrt(2) is one of them. In fields, of course, units are meaningless, but I began to realize that for numbers that behave like integers while not being integers themselves, units might be the key.

So in the month of December 1999, I came up with what I call the object ring. It was a result so important to me, as it underpins everything else, that I find it appropriate that talking about it is the very first post on this blog:

That definition is a lot about units! For instance, it established 1 and -1 as the only rational units, which I realized was one key element that those integer-like numbers had helped reveal.

In contrast, in a field everything nonzero is a unit. For instance 2 is a unit because 2(1/2) = 1.
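That contrast is simple enough to demonstrate with exact rationals; a quick sketch I'm adding here for the curious:

```python
from fractions import Fraction

# In the field of rationals every nonzero element is a unit:
# each has a multiplicative inverse, e.g. 2 * (1/2) = 1.
assert Fraction(2) * Fraction(1, 2) == 1

# Whereas among the integers only 1 and -1 have integer inverses.
units = [n for n in range(-5, 6)
         if n != 0 and n * round(1 / n) == 1]
assert units == [-1, 1]
```

The point of the check is the asymmetry: inverting 2 forces you out of the integers into the rationals, while 1 and -1 invert to themselves.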

By focusing on properties I could include the algebraic integers, but also other numbers they missed! And I found I could prove the existence of those other numbers and that they were in my object ring.

I consider more of these issues in my post where I questioned whether mathematical rigour had failed:

What's interesting to me is that the concept of testing the algebraic integers failed so horribly. But human imagination can only go so far while mathematics is an infinite subject. Those mathematicians couldn't prove with tests that the ring of algebraic integers was complete, as in fact it wasn't!

While by focusing on the intrinsic properties of integers and integer-like numbers, I could flesh out the rules by which a ring would have to be complete. That is the power of mathematical proof! In essence the rules found preclude the object ring from NOT being complete, as to be incomplete it would have to break those rules.

The rules are intrinsic proof. That is, they encapsulate proof within themselves.

That can be a hard concept, and the simplest way to see it is to try to come up with numbers that do not follow those rules. For instance 1/2 is blocked because it is rational and a unit, while the rules allow only 1 and -1 as rational units.

That is so weird. If a number is a member of the object ring it must follow those rules for membership. By definition then, if it does not follow any one of those rules it is not in the object ring. But those rules encapsulate everything that makes a number either an integer or like an integer; therefore, they must be complete.

What's interesting then is that simply studying the gaussian integers is enough to figure out what the rules are, as that ring must follow all of them! So there cannot be any other rules beyond those, because the rules must include it.

But you need ALL the rules, and just those actually required.

So the problem with the ring of algebraic integers then is that it includes an unnecessary rule, which is being the root of a monic polynomial with integer coefficients, while leaving out the correct ones.

Rigorous logic allowed me to prove the validity of my own techniques, so mathematical rigour does remain, but the lesson is: when human imagination is used instead of it, you can't really trust the result.

On the face of it, it probably seemed reasonable to those mathematicians to "test" the ring of algebraic integers with everything they could imagine. But mathematics was far too subtle for that to work!

In the end mathematical proof was required even for the question of what number can be considered integer-like when not an integer, and with mathematical proof, total rigour was achieved, and once again mathematical certainty could appropriately be had.

James Harris

Monday, July 21, 2014

Why three dimensions at least

Found myself looking over one of my blog posts where I extended the methodology for what I call tautological spaces, identities used with conditionals for mathematical analysis.

And the concept there is so easy, as you just subtract some equation from an identity and analyze the residue, which tells you about the equation! So it's like you're probing the residue. That sounds weird.

But one of the best things about that idea is you can look at familiar equations differently. Pull them through a tautological space and probe the leftover conditional residue. It's fun! But maybe it does feel a bit weird. Like playing with the cast-off skin.

And it occurred to me that I haven't talked about the implications of tautological spaces requiring at least 3 variables along with the v, so they automatically have at least 3 dimensions.

For instance the classic is: x+y+vz = 0 (mod x+y+vz)

Note that here 'v' is an absolutely independent variable, as it's completely independent of x, y and z. That lets you set it to whatever you want, which is why I introduced it. I wanted to be the one in control. That shifts the residue around. Like stirring that conditional residue in ways you choose. You can poke and prod that residue an infinite number of ways, regardless of the variables x, y and z.
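The identity itself is easy to sanity-check numerically; here's a tiny sketch I'm adding for this post, running over a grid of small integers:

```python
from itertools import product

# x + y + v*z is congruent to 0 modulo itself for EVERY choice of v,
# which is what makes v a free lever: it never breaks the identity,
# it only shifts how the residue decomposes.
for x, y, z, v in product(range(-3, 4), repeat=4):
    m = x + y + v * z
    if m != 0:  # reduction mod 0 is undefined, so skip that case
        assert (x + y + v * z) % m == 0
```

Of course the assertion is trivially true, which is exactly the point: it is a tautology, so anything nontrivial that falls out must come from the conditional equation you subtract from it.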

Turns out there's no value in going to something with fewer variables. It's not hard to figure out why. Curious people can play with it and see if they can do anything with just x, y and v.

For me considering such things was done years ago, and I debated talking about it, but I have often decided to stay away from areas that can spark controversy.

The problem here is there may be mathematical reasons related to identities which can explain why you need at least 3 dimensions for reality like ours. But then again, these same ideas may indicate that a lot more dimensions are possible, with characteristics radically different from our own.

Sounds wild. I like the wild thoughts at times. Who knows how weird reality might really be?

Fun speculation but for now without more interest from others to pursue these lines of inquiry, it's something I'll mention with reticence in this post. And just wander off from there now.

James Harris

Friday, July 18, 2014

Why maybe difficult for fans

One of the things I've taken for granted for a long time, over a decade I think, is that it's hard to be a fan of my mathematical research. There just isn't a lot of positive discussion of it that I find out there from other people, while there are lots of negatives, like people calling me insults such as crank or crackpot.

Fun fan reality is when you get to excitedly discuss things you enjoy with like-minded people. Not search in vain for a nice thing at all, worrying that you're just being deluded by some mentally imbalanced individual and that you just can't quite figure out what is wrong with his math.

Which is why recently I've been more appreciative of the people who supported my research and me despite the difficulty, and it's worth talking about why it really is so hard.

Reality is that the people who will tend to care most about ideas are not the discoverers. For instance I have lots of ideas, so I'm not terribly attached to most of them. Why would I be? On any given day I might muse over my prime research of which there is a bunch, or instead ponder my open source software program, or instead wonder about my political ideas.

Even if I focus on math, I can think about work with modular algebra symbology, or wonder yet again why I was the first person to do this thing or that thing or whatever.

But it's like you can see how people can be much more excited about something than the person who produced it. In the entertainment industry fans--like me!!!--may get giddy, say, about Star Wars, but you don't see George Lucas jumping up and down, screaming and shouting, now do you? (Maybe he does it behind closed doors.)

Actually I'm probably more excited on any given day about the latest movie I fancy than about my own research as it's so common to me. My research is with me every day, any day I wish to consider it. But great movies? They're a whole lot of rare.

So to me it's really not my job to worry about how excited other people get about, say, a partial differential equation that follows from a prime counting function, or a prime residue axiom, or a new way to count quadratic residue pairs. For me that's just part of a pile.

And I'm not excited about things like the Riemann Hypothesis or Fermat's Last Theorem aka FLT. There was a time before I had a pile of results when I DID get excited about FLT, but now it's like, who was that guy? I don't know him any more. He would dream of figuring something out wondering if it would ever happen and got taken in a bit by crowd mentality. FLT is a popular thing for people who don't know any better. For a while I didn't. Now I do.

People get excited about things that excite them. To my mind I note with some interest things I know aren't sufficiently interesting to certain people for them to be interested in solutions I discover. But that's not my problem. It's not my fault either.

Actually I think it's funny. World doesn't care. People come and go. New people are being born all the time. Let them grow up to appreciate it. History will not mind.

You know, it occurs to me that the people most motivated are among the ones who haven't figured anything out yet.

If I could get more excited about some of these things, like if I cared about the Riemann Hypothesis, maybe I'd figure out if my work could lead to its solution or not. But I can't make myself feel excitement I just don't feel. And I don't have that crowd mentality.

Some might suppose that any person who had the ability to resolve such things especially the demonstrated ability should necessarily go ahead and do it, which I used to believe. So I was very skeptical of Gauss, who is a hero of mine, about him reputedly saying he wasn't interested in FLT. Now I believe him. Before I thought maybe he must have worked on it, got nowhere and didn't want to admit it. Now I'm sure he just didn't find it interesting enough to bother. He wasn't lying. The crowd opinion couldn't motivate Gauss.

But that's not a bad thing. It leaves the door open for others. How many more results did Gauss really need?

I have the experience of having major math results. So I can no longer get excited about solutions in the same way I could before. I've lost that feeling, and really?

I don't miss it.

Fans of my research will be ok. The real joy is in the result, not what the discoverer does or does not do.

Or as I like to say, Sir Isaac Newton has been dead for a long time. So?

He inspired me as a kid, my hero. I thought to myself: if I had lived in his time would I have figured out the things that he did?

He didn't have to be around for that to happen. The excitement I felt as a kid motivated me.

And I'm still a fan.

James Harris

Monday, July 14, 2014

Guessing at support

One of the things I finally admitted to myself today is that I have speculations about what kind of support my mathematical ideas might have, which really are just guesses. And, um, I hesitate to admit, but I started this post to admit it, so why is it so hard? Reality is I don't know.

What I do know is there is enough support worldwide that my ideas get picked up by search engines, and I've known that for years and rationalized it, so now I just remind myself: people have to DO something for that to happen, and people other than me.

But then again, why wouldn't these ideas get support?

How many people who love numbers wouldn't be fascinated by a simpler prime counting function that leads to a partial differential equation? Or finally an explanation for the size of fundamental solutions to x^2 - Dy^2 = 1?
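For readers new to that second equation: the fundamental solution is the smallest pair of positive integers satisfying it, and its size jumps around wildly as D changes. A brute-force sketch of my own (far from the fastest method, which uses continued fractions) makes that concrete:

```python
from math import isqrt

def pell_fundamental(D):
    """Smallest positive (x, y) with x*x - D*y*y == 1, for nonsquare D."""
    y = 1
    while True:
        x_squared = D * y * y + 1
        x = isqrt(x_squared)
        if x * x == x_squared:  # x_squared is a perfect square
            return x, y
        y += 1

assert pell_fundamental(2) == (3, 2)    # 3^2 - 2*2^2 = 1
assert pell_fundamental(3) == (2, 1)    # 2^2 - 3*1^2 = 1
# Only slightly larger D can already force a much bigger solution:
assert pell_fundamental(13) == (649, 180)
```

The brute force is fine for small D, but for something like D = 61 the fundamental x is over a billion, which is exactly the kind of size behavior that calls out for an explanation.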

And those are just a couple of things I picked to highlight. There's so much now I actually wonder to myself how I pick any given thing in the moment when I discuss. And wouldn't even begin to try to talk about it all in one post.

I've explained why I changed the name of this blog from "My Math" to "Some Math", and after I did it there was that bit of wondering whether that would be it. Maybe it would just disappear, and the search engine thing would be an anomaly. But some important people, about whom I've admitted I don't have a clue, re-connected with this blog in some way. Re-connected in some way with the ideas presented here, and now the blog has regained a lot of its visibility. Though not all.

So some let it go. Why didn't every one of these people, so mysterious to me, do the same?

Part of me wishes to romanticize to some extent the people who stuck with it. What did they see? How do they know? What makes them different from so many others in the math world? Are they even in the math world? What makes them special?

Ah yes, they have to be special, now don't they? Don't you? If you're one of them?

If you can see a world where these ideas are mainstream because they're some of the best ever found by humanity, then one might ask, how?

How do you know?

To some extent that should be an easy question: by the math, what it can do, and by the proof of it.

At times I've wondered, without that support can I be sure I'd have kept going? Would I have kept figuring things out?

I didn't have to find out. And for that I have to be grateful.

There is an oddity to the world of mathematics. People can believe that ideally you know the best math when you see it, or that it's all about proof. That's correct in the abstract.

But discovery is just the start. For mathematics to be accepted there have to be those people who can see the correctness in front of them, and not just something different. Or not just the wild musings of some "crank" or "crackpot" deluding himself.

Certainly I make the effort to be understood, and go to extreme lengths to explain and make sure I step things out carefully and in detail, but someone has to actually go through those steps to even notice.

And thankfully for me, some people must have. As even to read this blog, you have to find it, now don't you?

If you're not one of them, your finding it is all about those people so unknown to me. They found it before you, and somehow knew there was something here.

I do wonder, will I ever know why? For now without further information, I am simply stuck with speculation.

James Harris

Saturday, July 05, 2014

More modular algebra symbology

The advent of algebra itself was one of the greatest things in human history, as considering unknowns with numbers vastly increased the analytical powers of the human species.

An equation like x+y = z may not seem like much to us today, but clearly its flexibility is far greater than 2+3 = 5.

With modular arithmetic I think there has long been a focus on explicit numbers versus heavy use of symbols, so it parallels the earlier focus on numbers before algebra, as that's the more practical concern.

While a shift to the greater purity of symbols also gives more flexibility, it can be difficult as an increase in abstraction, which is my guess at how I got to be the person who introduced tautological spaces.

So I got to be the guy who pushed the symbology of modular arithmetic more into a modular algebra with something as simple as:

x+y+vz = 0 (mod x+y+vz)

And the extent to which that was NOT intuitive to the mathematical world can be seen in the reticence, which exists to this day, about expanding the symbology of modular arithmetic into more of a modular algebra.

Talking about it on my blog helps then to increase the spread of human knowledge.

So you can see a HUGE increase in modular algebra symbology in my research, which has an extra level of abstraction that is clearly not evident in the simpler approaches you see from the traditional, where for instance you may see a lot of things like: x = a mod b. So yeah there are still symbols, where actually a lot of times you'll see numbers, but nothing like the heavier symbology in my mod x+y+vz.

But is that a necessary level of abstraction like algebra was to simple counting or an unnecessary layer of complexity?

Well consider, using my approach I was able to simplify reducing binary quadratic Diophantine equations!

That achievement in and of itself I would think should be kind of big, right? I mean we're talking about approaches that were handed down from ancient times one could say.

Before my research I'm sure none of the top number theorists in human history even knew that such an improvement could exist. They thought, I'm sure, wrongly as we now know, that they already had the best possible in mathematics.

Yet I improved upon them? Oh yeah, that could make it one of the greatest in mathematics, but that's just an opinion. But what makes it kind of weird even to me is that it was so easy.

But then again the increase in abstraction with the symbols of algebra turned a lot of problems into far easier to approach ones as well! So I'm looking at a justification for the necessity of this approach.

A good introduction to these things including links to the full mathematical work and proof can be found through my post:

The shift in thinking necessary to embracing greater symbology in modular algebra appears to be kind of big, as I've had this research for quite a while now.

But that's not a surprise. Each major advance in human thought requires a paradigm shift. And while I can think it's an advance things move slowly until more and more people agree with me! That's ok. It's to be expected.

And quite reasonably people need a LOT of justification and time to accept such things, as once that shift is made, a TREMENDOUS amount of human effort will be involved from then on in that direction.

Which we've seen with algebra--one of the most important subjects, ever.

It makes sense that people take their time and are careful.

For me it gets to just be exciting.

James Harris