Friday, August 29, 2008

Adaptive logic

A while ago, Ole mentioned a presentation on adaptive logic by some logicians from Ghent. It sounded pretty interesting. Apparently one of the Ghent logicians, Rafal Urbaniak, has started a blog, the first posts of which are on that very topic. Rafal was nice enough to link to a long introduction to adaptive logic. I haven't had the chance to go through it all yet, but it looks solid. How am I supposed to narrow my interests when neat stuff like this keeps popping up?

Saturday, August 23, 2008

From Logic and Structure

Amazon has been telling me repeatedly that a new edition of Dirk van Dalen's Logic and Structure is coming out soon, so I thought I'd look at an older version. I picked up a copy of the third edition. The opening of the preface is too good to pass up, if one ignores the run-on sentences:

"Logic appears in a 'sacred' and in a 'profane' form; the sacred form is dominant in proof theory, the profane form in model theory. The phenomenon is not unfamiliar, one observes this dichotomy also in other areas, e.g. set theory and recursion theory. Some early catastrophes such as the discovery of the set theoretical paradoxes or the definability paradoxes make us treat a subject for some time with the utmost awe and diffidence. Sooner or later, however, people start to treat the matter in a more free and easy way. Being raised in the 'sacred' tradition my first encounter with the profane tradition was something like a culture shock. .. In the course of time I have come to accept this viewpoint as the didactically sound one: before going into esoteric niceties one should develop a certain feeling for the subject and obtain a reasonable amount of plain working knowledge. For this reason this introductory text sets out in the profane vein and tends towards the sacred only at the end."

I don't have any comment on the book's contents as I haven't slogged through it. The brief chapter on second-order logic makes an interesting point though. It shows how all the connectives of classical logic can be defined using just → and ∀, although this does require both first- and second-order quantifiers. The book obscures this fact in the statement of the theorem. This isn't surprising once one sees the proof, but it is still neat.
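For reference, the standard second-order definitions behind a result of this kind can be sketched in Lean, using only → and quantification over Prop; the exact formulation in van Dalen's book may differ:

```lean
-- Sketch of the standard definitions, not van Dalen's exact statement:
-- the other connectives built from → and ∀ over Prop alone.
def Bot : Prop := ∀ p : Prop, p
def Neg (A : Prop) : Prop := A → Bot
def And' (A B : Prop) : Prop := ∀ p : Prop, (A → B → p) → p
def Or' (A B : Prop) : Prop := ∀ p : Prop, (A → p) → (B → p) → p

-- Quick checks that And' behaves like conjunction:
example (A B : Prop) (hA : A) (hB : B) : And' A B := fun _ h => h hA hB
example (A B : Prop) (h : And' A B) : A := h A (fun a _ => a)
```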

Once more, again

The calm of summer is about to give way to the less calm start of the new year. Classes start on Monday. Posting has been a little slow because I've been plugging away at some things which have eaten into my blogging energy. I'm nearing completion on them though. Once classes start I should have some more things to talk about, so posting will, I hope, become regular again. I'm not teaching this year and I'm going to try to put the extra time to good use.

I have a good-looking lineup of classes. I'll be taking three. Belnap is teaching a proof theory class for which we'll be using Restall's book. I am, of course, looking forward to it. Gupta is teaching a seminar on truth. I'm not sure what we're reading; I think I remember hearing that the focus is on revision and fixed-point theories of truth, but I'll have a better idea soon. Wilson is teaching a seminar on the philosophy of math, and it looks like we're going to be focusing on Russell, Cantor, Frege, and Dedekind, which should be interesting. With any luck I will be done with official class work by the end of the term and have the glimmerings of a prospectus idea.

Wednesday, August 13, 2008

Some comments on incompatibility semantics

The first thing to note about the incompatibility semantics in the earlier post is that it is for a logic that is monotonic in side formulas, as well as in the antecedents of conditionals. (Is there a term for the latter? I.e. if p→q then p&r→q.) This is because of the way incompatibility entailment is defined. If X entails Y, then ∩{I(p) : p∈Y} ⊆ I(X). Since incoherence persists under supersets, I(X) ⊆ I(Z) for all Z ⊇ X, so ∩{I(p) : p∈Y} ⊆ I(Z), i.e. Z entails Y. This wouldn't be all that interesting to note, since usually non-monotonicity is the interesting property, except that Brandom is big on material inference, which is non-monotonic. The incompatibility semantics given in the Locke Lectures is then not a semantics for material inference. This is not to say that it can't be augmented in some way to yield an incompatibility semantics for a non-monotonic logic. There is a bit of a gap between the project in MIE and the incompatibility semantics.
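The monotonicity point can be sketched in a few lines of Python. The toy Inc property below (three atoms, with {p, q} incoherent and incoherence closed under supersets) is my own illustration, not anything from the lectures:

```python
from itertools import combinations

# A toy Inc property (my own example): three atoms, {p, q} incoherent,
# with incoherence persisting under supersets, per Brandom's axiom.
ATOMS = {'p', 'q', 'r'}

def powerset(s):
    s = sorted(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

INC = {X for X in powerset(ATOMS) if {'p', 'q'} <= X}

def I(X):
    """Sets of sentences incompatible with X: those whose union with X is incoherent."""
    return {Y for Y in powerset(ATOMS) if (X | Y) in INC}

def entails(X, Y):
    """X |= Y iff everything incompatible with each member of Y is incompatible with X."""
    common = set.intersection(*(I(frozenset({p})) for p in Y))
    return common <= I(X)

# Monotonicity in side formulas: adding r to the premisses preserves entailment.
assert entails(frozenset({'p'}), {'p'})
assert entails(frozenset({'p', 'r'}), {'p'})
# But p does not incompatibility-entail q in this toy model.
assert not entails(frozenset({'p'}), {'q'})
```

Since I(X) only grows as X grows, the intersection tested on the left stays inside I(Z) for any Z ⊇ X, which is all monotonicity requires.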

Since this semantics isn't for material inference, what is it good for? It is a semantics for classical propositional logic, but we already had one of those in the form of truth tables, which are fairly nice to work with and easy to get a handle on. One point in the incompatibility semantics' favor is that it validates the tautologies of classical logic without appealing to truth. Unless one is an inferentialist, this is probably not that exciting. It seems like it should lend some support to some of Brandom's claims in MIE, but this depends on the sort of incompatibility used in the incompatibility semantics being the sort of thing an inferentialist can adopt. I'm not sure that incompatibility as it is defined in MIE or AR is the same as this notion, and so some further argument is needed to justify an inferentialist's use of it.

Incompatibility semantics has at least two generally interesting features I want to mention here. [Edit: This paragraph needed a longer incubation period. I've removed the point that was originally here, which was uninteresting and wrong in parts.] The first is that it is possible that no set of sentences is coherent. As noted in the appendices, there could be a degenerate frame in which all the sentences are self-incoherent. There could also be incoherent atomic sentences.

The second is that it allows a definition of necessity that doesn't appeal to possible worlds or accessibility relations. The necessity defined is an S5 necessity. To get other modalities either some more structure will have to be thrown in, possibly an accessibility relation, or a different definition of necessity. In any case, a modal notion is definable using sets of sentences and sets of sets of sentences. This would be somewhat surprising if we didn't note that incompatibility itself is supposed to be a modal notion, so, in a way, it would be surprising if it were not possible to define necessity using it. That it is S5 is a bit surprising. This leads to some cryptic comments by Brandom about intrinsic logics, but I won't broach those in this post.

I'm not sure if this next point is interesting. One of the theorems proved in the appendices to the Locke Lectures is that when X and Y are finite, X |= Y is equivalent to a finite boolean combination of entailments with fewer logical connectives. The important clauses here are: X |= Y, ¬p iff X, p |= Y, and X, ¬p |= Y iff X |= Y, p. One can flip sentences back and forth from one side of the turnstile. I think there are a couple of things to check to make sure this works, but, modulo those, this is the same situation as in the proof theory of classical propositional logic. It is possible to define a one-sided sequent system for classical propositional logic, so it seems likely that we could define a monadic consequence relation, something along the lines of: an entailment X |= Y holds iff X*, Y is valid, where X* is the result of negating everything in X. I'm not sure if this is interesting because I'm not sure what, if any, advantage this would offer over the concept of consequence defined in the Locke Lectures. The one-sided sequent system yields a fast way to prove whether a given set of sentences is valid or not. It's not clear that the monadic consequence would offer any comparable gain over computing the incompatibilities to check whether a given set of sentences is incompatibility-valid. (This may be a really trivial point for any semantics for classical logic, but it isn't something I've thought about.)
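As a classical sanity check of the flipping idea, here is a Python sketch in my own encoding, with formulas as functions from valuations to truth values, confirming X |= Y iff |= X*, Y on a couple of cases:

```python
from itertools import product

# My own toy encoding: formulas are functions from valuations (dicts) to booleans.
ATOMS = ['p', 'q']

def atom(a): return lambda v: v[a]
def neg(f):  return lambda v: not f(v)

def valuations():
    return [dict(zip(ATOMS, bits)) for bits in product([False, True], repeat=len(ATOMS))]

def entails(X, Y):
    """Every valuation making all of X true makes some member of Y true."""
    return all(any(g(v) for g in Y) or not all(f(v) for f in X) for v in valuations())

def valid(Z):
    """One-sided validity: every valuation makes some member of Z true."""
    return all(any(f(v) for f in Z) for v in valuations())

p, q = atom('p'), atom('q')
# X |= Y iff |= X*, Y, where X* negates every member of X:
assert entails([p], [q]) == valid([neg(p), q])             # both fail
assert entails([p, q], [q]) == valid([neg(p), neg(q), q])  # both hold
```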

Tuesday, August 12, 2008

La la la links

In lieu of commentary on incompatibility semantics today, here are a couple of worthwhile links.

The first is a tutorial on how to use Zorn's lemma by Tim Gowers. It is quite good and has several examples.

The second is a short piece on time management by Terry Tao. Most productivity stuff I read online is aimed more at the business or tech industry crowd, so I enjoyed reading suggestions by a successful academic aimed at the academic crowd.

Friday, August 08, 2008

Basics of incompatibility semantics

I've been spending some time learning the incompatibility semantics in the appendices to the fifth of Brandom's Locke Lectures. The book version of the lectures just came out but the text is still available on Brandom's website. I don't think the incompatibility semantics is that well known, so I'll present the basics. This will be a book report on the relevant appendices. A more original post will follow later.

The project is motivated in Brandom's Locke Lectures. He does not want to take truth as a primitive notion since he doesn't want to start with notions regarded as representationalist. Rather, he opts for incompatibility, or incoherence. It is important that, to start with, incoherence is not formal incoherence. Atomic propositions taken together can be incoherent. Incoherence is linked to the notion of incompatibility by the following: for sets of sentences X,Y, X∪Y∈ Inc iff X∈ I(Y), where I is a function from a set of sentences to the set of sets of sentences it is incompatible with. From this definition it is immediate that X∈I(Y) iff Y∈I(X). It also turns out that given a language and an Inc property one can define a unique, minimal I and similarly for Inc given a language and an I function.

It is also taken as an axiom that if a set X is incoherent, then all sets Y∪X are also incoherent. The incoherence of a set of sentences can't be fixed by adding sentences to it.

Starting with an Inc property, logical connectives and a notion of entailment can be defined. These are more or less as would be expected from Making It Explicit and Articulating Reasons. The notion of entailment is one of incompatibility: X |= Y iff ∩{I(p) : p∈Y} ⊆ I(X). (I'm using the convention of dropping the brackets for singletons when it improves readability.) This definition says that X entails Y when everything incompatible with each member of Y is incompatible with X. With this notion in mind, validity for a set X can be defined as: anything incompatible with everything in X is itself incoherent, which is equivalent to |= X. Negation is defined by: X∪{¬p}∈ Inc iff X |= p. Conjunction is defined by: X∪{p&q}∈ Inc iff X∪{p,q}∈ Inc. Disjunction and the conditional are defined from these in the standard ways.
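To get a feel for the clauses, here is a small Python sketch. It is my own construction rather than anything from the appendices: it generates an Inc property from classical valuations over the two sentences p and ¬p, and checks that the defining clause for negation then holds:

```python
from itertools import combinations

# My own toy construction: the language contains just p and its negation '~p';
# a classical valuation fixes p, '~p' gets the opposite value, and a set is
# incoherent iff no valuation satisfies it.
LANG = ['p', '~p']

def powerset(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def satisfies(v, X):
    val = {'p': v, '~p': not v}
    return all(val[s] for s in X)

INC = {X for X in powerset(LANG) if not any(satisfies(v, X) for v in (True, False))}

def I(X):
    """Sets of sentences whose union with X is incoherent."""
    return {Y for Y in powerset(LANG) if (X | Y) in INC}

def entails(X, Y):
    """X |= Y iff everything incompatible with each member of Y is incompatible with X."""
    return set.intersection(*(I(frozenset({p})) for p in Y)) <= I(X)

# The defining clause for negation holds for every set X in this model:
for X in powerset(LANG):
    assert ((X | {'~p'}) in INC) == entails(X, {'p'})
```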

It turns out that from these definitions the connectives behave classically. Disjunction distributes over conjunction. Double negations can be eliminated. The entailments work out as expected for conjunctions on the left and on the right, i.e. X, p&q |= Y iff X, p, q |= Y, and, X |= Y, p&q iff X |= Y, p and X |= Y, q. The left side of the entailment sign is conjunctive and the right side is disjunctive, e.g. X |= p, q iff X|= p∨q. Modus ponens is provable from these definitions.
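The conjunction behavior on the left can be checked the same way. The model below is again my own toy construction (classical valuations generating an Inc property over p, ¬p, q, and p&q), not Brandom's:

```python
from itertools import combinations

# My own toy model: sentences p, ~p, q, p&q, with Inc generated from
# classical valuations (a set is incoherent iff unsatisfiable).
LANG = ['p', '~p', 'q', 'p&q']

def powerset(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def satisfies(vp, vq, X):
    val = {'p': vp, '~p': not vp, 'q': vq, 'p&q': vp and vq}
    return all(val[s] for s in X)

INC = {X for X in powerset(LANG)
       if not any(satisfies(vp, vq, X) for vp in (True, False) for vq in (True, False))}

def I(X):
    """Sets whose union with X is incoherent."""
    return frozenset(Y for Y in powerset(LANG) if (X | Y) in INC)

# The defining clause for conjunction holds across the whole powerset...
assert all(((X | {'p&q'}) in INC) == ((X | {'p', 'q'}) in INC) for X in powerset(LANG))
# ...so I(X ∪ {p&q}) = I(X ∪ {p, q}), which yields X, p&q |= Y iff X, p, q |= Y,
# since entailment only looks at I of the premise set.
assert all(I(X | {'p&q'}) == I(X | {'p', 'q'}) for X in powerset(LANG))
```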

The definition for necessity is a bit trickier. It is: X∪{□p}∈ Inc iff X∈ Inc or ∃Y(X∪Y∉ Inc and not: Y |= p). A necessary proposition, □p, is incompatible with a coherent set X iff there's some Y which is compatible with X and is compatible with something p isn't. Here is Brandom on the dual notion of possibility: "what is incompatible with ◊p is what is incompatible with everything compatible with something compatible with p." The semantics of modality without possible worlds involves looking at two sets of sentences (three counting {□p}) and their incompatibilities. The normal rule of necessitation falls out from the axioms and this definition. The modal logic that results from this together with the above definitions for negation and conjunction is classical S5.

I'll close with a brief comment. The definition of incoherence and incompatibility used has as a consequence that an incoherent set entails everything. The principle of explosion is built into the incompatibility semantics. The motivating idea is that an incoherent set of sentences will behave differently in inference, in particular by acting as a premiss for everything. This creates a problem, noted by Brandom, in dealing with relevance logics. Brandom sees the defining feature of relevance logic as the rejection of explosion, which would mean that minimal logic would be a relevance logic. Without explosion, incoherent sets of sentences would behave just like coherent sets, unless one already has negation in the language, in which case an incoherent set would entail both p and ¬p for some p. But part of the point of Brandom's project is that there is a coherent way to define logical vocabulary from a base language without any logical vocabulary, so this is not an option. The possibility hinted at in the Locke Lectures is to define an absurdity constant and then have the incoherent sets imply that constant, but that has not yet been worked out.

I think working through this sheds some light on the otherwise cryptic comments about intrinsic logic that come up in lecture 5, but I'll save that, as well as my other commentary on this stuff, for another post.

Tuesday, August 05, 2008

On primary math education

Yesterday I was pointed to an essay, a lament, on primary school math education in the US, written by a K-12 math teacher, that I want to share. It is well-written and makes some good points about teaching, an activity that puzzles me. It also contains some entertaining and interesting dialogues, à la Perry, Lakatos, Feyerabend, and Berkeley.

Keith Devlin also has a couple of columns about conceptual understanding and why multiplication isn't repeated addition: here, here, and here. [Edit: Duck points out that there is a good discussion of some of Devlin's stuff at Good Math, Bad Math.]

Sunday, August 03, 2008

Short note on Wittgenstein and Church

In the Tractatus, Wittgenstein gives a definition of number as the exponent of an operation. In something I read recently, although for the life of me I cannot figure out what it was, the author pointed out that the basic idea in the Tractatus is the same as that of Church numerals. The definition of number has seemed somewhat obscure to me in the context of the TLP, so this comparison helped clarify things. The definition comes at 6.02. Let's call the basic operation S and name an element x. The number 0 corresponds to x, 1 to Sx, n to S^n x, and n+1 to SS^n x. While not in lambda notation, this is fairly close to Church's definition of the numerals and of successor.
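To make the comparison concrete, here is a sketch of Church numerals as Python lambdas; the names S and x just echo the operation and base element above:

```python
# Church numerals as Python lambdas: the numeral n applies an operation S
# n times to a base element x, mirroring S^n x. (S and x echo the post.)
zero = lambda S: lambda x: x
succ = lambda n: lambda S: lambda x: S(n(S)(x))
add  = lambda m: lambda n: lambda S: lambda x: m(S)(n(S)(x))

def to_int(n):
    """Read off a numeral by taking S to be +1 and x to be 0."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
assert to_int(three) == 3
assert to_int(add(three)(succ(zero))) == 4
```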

Together with some of Michael Kremer's remarks on Tractarian views of math, it makes the TLP seem concerned with computation as opposed to just structural concerns, which is what one would expect of something in the broadly logicist vein. (This may not be fair to all logicists. There did not seem to be a comparable concern with computation in Frege or Russell. The distinction I'm using here is the one drawn by Jeremy Avigad between math as a theory of structure and as a theory of computation in his "Response to Questionnaire" on his website. Avigad points out, though, that before the 20th century mathematicians were concerned with computational aspects of proof more than structural ones.) In Russell's preface to the TLP, he criticizes Wittgenstein's definition of number for not being able to handle transfinite numbers. If computation is supposed to be an important theme in the TLP, then this would not be a defect. The transfinite numbers are not the sort of thing we would be computing with recursively. An interesting historical question is whether any reviews of Church's work pointed out that his definition only worked for finite numbers, echoing Russell's criticism of the TLP. Since the TLP was written before Church's, Turing's, or Goedel's work on computability made it a more precise mathematical notion, it seems likely that the concern with computation would remain implicit in the book rather than being made explicit as a main theme.