A Plan

The field of mathematics is pretty big. I intend to set down, in this webspace, the parts of it that are indispensable for an understanding of modern physics. I only have small amounts of time to work on it, and I have a fair amount of legacy material that I need to weave into a better structure (writing the first form taught me a better way to approach it …). So I need a plan. There follow some rough sketches of the skeleton of what I intend to assemble. Some of this material already exists in earlier forms.


… among some of the fragments:


The naturals can be had from any adequate foundation – as can the surreals. From the naturals we can obtain the positive rationals, which can serve as a skeleton for the scalar continuum.


Positive rationals suffice to sketch out enough of linearity to specify scalars, simplices and measures. Linearity and simplices suffice to specify chords and thus, via a limiting process, derivatives of (not necessarily linear) functions between linear spaces.
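As a concrete sketch of the rationals-from-naturals step (the Python rendering, and the reduced-pair representation, are my own illustration, not part of the plan): a positive rational is a pair of naturals, with arithmetic defined on pairs.

```python
from math import gcd

# A positive rational as a pair (p, q) of positive naturals, read as p/q,
# reduced so that distinct pairs denote distinct rationals.
def rational(p, q):
    g = gcd(p, q)
    return (p // g, q // g)

def add(a, b):
    # p/q + r/s = (p*s + r*q) / (q*s)
    return rational(a[0] * b[1] + b[0] * a[1], a[1] * b[1])

def mul(a, b):
    # (p/q) * (r/s) = (p*r) / (q*s)
    return rational(a[0] * b[0], a[1] * b[1])
```

Enough of linearity – scaling by such pairs, and adding the results – can then be built on this without ever leaving the naturals.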

Smooth Manifold

Mappings between linear spaces can be obtained by composing mappings to and from another space, which need not be linear. Suitable structure on the intermediate space can enable us to induce local differential structure on it, from that of the linear spaces. This enables us to treat the intermediate space – a manifold – in the same terms as linear spaces, for purposes of differentiation. From this we can extend our notion of measure to manifolds, also.

Polynomials and complex numbers

From (:repeat:{naturals}) we can specify powers; and, when applied to linear automorphisms, addition and scaling then allow us to specify (and study) polynomials. The special case of angle-preserving linear maps in two dimensions yields the complex numbers and the fundamental theorem of algebra (that every polynomial has suitably many roots over {complex}).
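A small sketch of the repetition-to-powers step (Python, my own illustration): powers are n-fold repetition of multiplication, and – using Python's complex numbers as a stand-in for the angle-preserving maps of the plane – a polynomial in such a map is again such a map.

```python
def repeat(f, n):
    # n-fold composition of f with itself; repeat(f, 0) is the identity.
    def g(x):
        for _ in range(n):
            x = f(x)
        return x
    return g

def power(a, n):
    # a**n as n-fold repetition of multiplication by a, applied to 1.
    return repeat(lambda x: a * x, n)(1)

# An angle-preserving linear automorphism of the plane is multiplication
# by a non-zero complex number; here rot is a quarter turn.
rot = 1j
poly = rot**2 + 3 * rot + 2   # the polynomial z*z + 3*z + 2, evaluated at rot
```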

Quadratic and Hermitian forms

Linearity lets us characterise quadratic forms and, for spaces linear over {complex}, Hermitian forms. The fundamental theorem of algebra then lets us find canonical representations for forms symmetric or antisymmetric under transposition; and for automorphisms whose composite with a given form is likewise symmetric or antisymmetric.
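The split of a form into parts symmetric and antisymmetric under transposition can be sketched concretely (Python, small matrices as tuples of rows; the representation is my own, not part of the plan):

```python
def transpose(m):
    # swap rows and columns of a matrix given as a tuple of row-tuples
    return tuple(zip(*m))

def combine(a, b, s):
    # entrywise a + s*b
    return tuple(tuple(x + s * y for x, y in zip(ra, rb))
                 for ra, rb in zip(a, b))

def scale(s, m):
    return tuple(tuple(s * x for x in row) for row in m)

m = ((1, 2), (3, 4))
sym = scale(0.5, combine(m, transpose(m), 1))    # symmetric part
anti = scale(0.5, combine(m, transpose(m), -1))  # antisymmetric part
```

Their sum recovers the original form, and each part transforms as claimed under transposition.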


Not only am I writing an account of the mathematics I've been taught; I'm reworking it into a form that I like better. I'm taking relations as my primitive entities, which leads to new ways of thinking about many familiar topics. I'm exploiting ideas from software design in the ways that I express the subject matter. I've got a few little details that I've thought about long and hard that have led me to treat old subjects in subtly different ways. I've listened to others and picked the ideas and suggestions they've offered that appeal to me.

When folk do any new thing for the first time, by specification they don't know what they're doing. Consequently, the way they go about it may appear somewhat clunky to those who come after, once they understand how to do it well. None the less, the precedent may so influence how those who come after are taught the matter that they remain burdened with some of the baggage of how they got there. This happens in science and mathematics when the notation adopted in the earliest explorations of a subject remains in use despite containing some quirks that hamper understanding. In software, it's a well understood phenomenon, expected and worked around: the initial implementation of a program typically reflects a partial grasp of what's needed, that's not fully fleshed out until one has had a working system in the hands of users for long enough for them to complain about it and its maintainers to make a tangled mess of it by fixing what they're complaining about. Only then do you get to see what the program is really doing and recognise that it can be done a lot better if approached differently. Reworking the code to use the different approach is described as refactoring. It can prepare the way for adding new features or merely for fixing one more bug; but its main goal is to remove the kinks from the tangled mess that you got once you understood what you were really doing, in terms of what you initially thought you needed to do. One of my aims in my writing is to refactor mathematics into a more flexible and lucid form.

So I've worked with relations rather than sets. I've re-expressed group theory in terms of cancellation and completion and explored how cancellation clarifies what matters about some other fields. I've experimented with different approaches to some standard formulations and preferred the variants that save me complications. All the while, though, I have been aiming to make clear the essence of what each topic is about, in a form usable by physicists and in terms of a coherent framework – both conceptual and notational – that ensures all the parts work correctly together.


Everything that needs to be included, somewhere.


In lieu of foundation: the reading of text as a discussion in a context (some presumed, some made explicit in the text); textual fragments that denote entities in the context under discussion; naïve logic. Denotational definitions in the text providing templates for parsing text.

Gödel's fork: undecidability as the least of three evils; the virtue of humility. The ubiquitous implicit proviso, in so far as decidability allows. The grand domain in which one selects a sub-domain wherein enough is decidable (e.g. Set).

My chosen primitive tool: relations; left and right values; fixed points. Significant properties and flavours: transitivity, symmetry, reflexivity; mappings, collections and equivalences. Particular cases: empty and all. Emergent standard primitive tools: reverse, composition, union, intersection, successor.
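One hypothetical rendering of these primitives (Python, relations modelled as finite sets of pairs – far narrower than the relations the text intends, but enough to exercise the named operations; the orientation conventions here are my own choice):

```python
def reverse(r):
    # the reverse relation: y (reverse(r)) x iff x r y
    return {(y, x) for x, y in r}

def follow(r, s):
    # composite: x (follow(r, s)) z iff x r y and y s z for some y
    return {(x, z) for x, y in r for y2, z in s if y == y2}

def is_transitive(r):
    # transitive iff composing r with itself stays within r
    return follow(r, r) <= r

def is_mapping(r):
    # relates each left value to at most one right value
    # (one common convention; the text may orient this the other way)
    lefts = [x for x, _ in r]
    return len(lefts) == len(set(lefts))

# the successor relation on a few naturals
succ = {(n, n + 1) for n in range(5)}
```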

The Naturals. Lists, repetition. Addition, multiplication.


Binary operators. Already seen composition (of relations), addition (of naturals) and multiplication (of naturals). Associativity, bulk action. Completeness and cancellability; identities, inverses and groups. Inducing completeness from cancellability (e.g. integers, rationals). Commutativity. Using the positive naturals as a model of repetition; a+a+a = 3.a and similar; 0-repeat as model of identity.
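The bulk action and the repetition-as-multiplication reading can be sketched as follows (Python; names are mine):

```python
def bulk(op, identity, items):
    # fold a binary operator over a list of arguments; associativity
    # makes the grouping irrelevant, so one value results
    total = identity
    for x in items:
        total = op(total, x)
    return total

def times(n, a):
    # n.a as n-fold repetition of addition: 3.a = a + a + a,
    # with 0-repeat delivering the additive identity
    return bulk(lambda x, y: x + y, 0, [a] * n)
```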

Linearity: interaction among binary operators; addition, multiplication, composition; linear relations. Distributivity, additive cancellability, multiplicative identity. Scalars (rings): positivity and multiplicative completeness; additive completion, fields. Linear spaces (modules).

Linear mappings, duality, measures; linear independence, span. Tensor products, transposition, trace and permutations. Decomposing the identity: bases and their duals. Diagonalise: automorphisms; bilinears. Metrics and the measures they induce: integration.

Complex scalars, conjugation, antilinearity. Antilinear square roots of some linear maps, notably of trace and identity. Hermitian as metric; diagonalise.
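A quick check of what antilinearity means, using Python's built-in complex numbers (my own illustration): conjugation is additive but scales by the conjugated scalar, and composing it with itself gives the identity, making it an antilinear square root of the identity.

```python
def conj(z):
    # complex conjugation: additive, but conj(a*z) = conj(a)*conj(z)
    # rather than a*conj(z) - i.e. antilinear, not linear
    return z.conjugate()

a = 2 + 1j
z = 3 - 2j
w = -1 + 4j
```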


Linear maps, from a space of functions from a domain to linear spaces, to the destination of each function: {measures on S} = {linear μ: for all linear V, (V:μ:{map (V::S)})}. Densities as ratios of measures; representing measures via a canonical measure on the domain combined with a density. Summation as canonical measure on finite collections. Relation of canonical measure on a linear space to its metric.
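For the finite case this can be made concrete (Python; representing a measure on a finite collection as a weight for each member is my own simplification of the definition above):

```python
def integrate(measure, f):
    # a measure on a finite collection acts linearly on functions
    # from that collection, by weighted summation
    return sum(w * f(x) for x, w in measure.items())

# summation as the canonical measure on a finite collection:
counting = {x: 1 for x in range(4)}

# any other measure is the canonical one combined with a density:
density = {x: x for x in range(4)}
weighted = {x: counting[x] * density[x] for x in counting}
```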


Generalizing the triangle and tetrahedron: lists with total 1, a.k.a. unit measures. Using simplices to define chords of (not necessarily linear) maps between linear spaces; obtaining gradients of chords. Simplices of non-zero measure as surrogates for open/closed sets; derivatives as gradients of limitingly small chords.
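In one dimension the chord-to-derivative limit looks like this (Python sketch; the step size and symmetric chord are my choices, not the text's construction in full generality):

```python
def chord_gradient(f, a, b):
    # gradient of the chord of f over the one-dimensional simplex [a, b]
    return (f(b) - f(a)) / (b - a)

def derivative(f, x, h=1e-6):
    # gradient of a limitingly small chord about x, approximated by
    # a fixed small simplex of non-zero measure
    return chord_gradient(f, x - h, x + h)
```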


Repeated differentiation, smoothness. Topological considerations; relationship with measure; completeness.

The transcendental functions (exp, sin, cos, log; π and e).
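For instance, exp arises from repetition, addition and scaling alone, via its power series (Python sketch; the truncation at 30 terms is my choice):

```python
def exp_series(x, terms=30):
    # sum of x**n / n! for n up to terms; each term is got from the
    # last by one more multiplication, so no factorials are stored
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total
```

The defining property exp(x + y) = exp(x).exp(y) then holds to within rounding, and e is just exp(1).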

Manifold: a not-necessarily-linear space in local isomorphism with a linear space: charts, atlases. Inducing derivatives and smooth structure: gradients and tangents.
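The unit circle is the smallest useful example (Python sketch; the angle chart and the pull-back recipe for differentiation are my own rendering of "inducing derivatives"):

```python
import math

# The unit circle as a manifold: a chart is a local isomorphism
# with a linear space, here an interval of reals.
def chart_angle(p):
    # defined away from the point (-1, 0)
    x, y = p
    return math.atan2(y, x)

def point(theta):
    # inverse of the chart: parametrise the circle by angle
    return (math.cos(theta), math.sin(theta))

def derivative_via_chart(f, p, h=1e-6):
    # differentiate a function on the manifold by pulling it back
    # through the chart and differentiating in the linear space
    t = chart_angle(p)
    return (f(point(t + h)) - f(point(t - h))) / (2 * h)
```

For f the first coordinate, this recovers d(cos)/d(angle) = -sin, as it should.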

Information theory

Probability measures. Entropy. Bayesian inference.
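Both notions reduce to weighted sums over finite collections (Python sketch; the coin-bias example is mine):

```python
from math import log2

def entropy(p):
    # Shannon entropy of a finite probability measure, in bits
    return -sum(q * log2(q) for q in p if q > 0)

def bayes(prior, likelihood):
    # posterior(h) is proportional to prior(h) * likelihood(h);
    # normalise so the posterior is again a unit measure
    joint = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(joint.values())
    return {h: w / total for h, w in joint.items()}

# e.g. observing heads, when one of two coins may be biased:
post = bayes({'fair': 0.5, 'biased': 0.5}, {'fair': 0.5, 'biased': 0.9})
```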


… arising from:

reading what I wrote under earlier forms
thinking a bit

(without, as it happens, the help of sobriety, at least to begin with) interesting binary operators obey:

This suffices to define a multiplication, a×c = b, on the left arguments of *. One may have to live with the possibility that × is ambiguous, i.e. that b isn't unique, but the present formalism at least allows us to specify the binary operator we wish to discuss before we begin a discussion of whether what it denotes is unique. Once repetition comes into the picture, one need but allow n*z as a usage of * to ensure that repetition is enrolled, quite naturally, as multiplication in the × one constructs.

One can go further (whether the wise will is another matter): consider the standard trick, quotient, for turning a cancellable operator into a complete one. Given *, define % by [a,b]%[c,d] = [a*c,b*d], with [n*p,n*q] and [p*n,q*n] deemed equivalent to [p,q] for every n for which each makes sense. The distinguishable lists % will accept as arguments then retain their cancellability, yet are complete even if * wasn't; in the process, they represent * perfectly via [m*p,m]←p with m arbitrary (cancellability makes the choice irrelevant; I need to engage my sobriety sometime to work out what to make of the ambiguity that arises without cancellability).
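Taking * to be addition of naturals, the quotient trick looks like this (Python sketch; the tuple representation, the canonical form via min, and the names are mine):

```python
def normalise(a, b):
    # canonical representative of the pair [a, b] (read as a - b):
    # [n + p, n + q] is deemed equivalent to [p, q], so cancel n = min(a, b)
    n = min(a, b)
    return (a - n, b - n)

def q_add(x, y):
    # [a, b] % [c, d] = [a + c, b + d], then reduce to canonical form
    return normalise(x[0] + y[0], x[1] + y[1])

def embed(p, m=5):
    # represent the natural p as [m + p, m], with m arbitrary;
    # cancellability of addition makes the choice of m irrelevant
    return normalise(m + p, m)
```

The result is (a model of) the integers: addition of naturals was cancellable but not complete; the pairs are both.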

Things to put elsewhere

(at least until the bits that matter to physics are orderly)

Written by Eddy.