LingSync, OLD, etc.

Here’s a list of the things I’ve looked at so far since the beginning of the project. I’m going to use it as a reminder and to track my progress.

Languages and tools learned, or at least those I’ve familiarised myself with:
– JavaScript: W3Schools, Wikipedia, Codecademy, Eloquent JavaScript
– Angular.js: Code School
– SQL: W3Schools, Learn Code the Hard Way, various tutorials (Spector)
– JSON: W3Schools
– CouchDB (NoSQL): official docs, Tuts+, online tutorials
– Backbone.js: Pragmatic-Backbone, Adrian Mejia’s blog, official docs, Code School
– jQuery: W3Schools
– Python (brushing up on classes): official docs

Readings related to the projects:
– FieldDB white paper and other docs (on GitHub)
– OLD documentation
– A couple of chapters from Joel’s thesis
– ACL papers on LingSync and the one by Emily Bender

“Technical” work (not much yet…):
– Set up OLD on my computer
– Set up the Dative app on my computer (not fully working though)
– Set up accounts on GitHub, the lab computers, Basecamp, and LingSync

Update 04/06/14

Since my last post, I’ve read a bunch of articles, but I don’t really remember which ones, which is bad. The ones I do remember are:

Chomsky – Bare Phrase Structure

Seuren – Chomsky’s Minimalism

Stabler – Formalization of Minimalist Syntax & Derivational Minimalism

I’ve also been working a lot on learning HTML, CSS, MySQL, JavaScript, and Angular.js during the past month, mostly using resources I found online. I’ve also worked a bit on pregroup grammars, since I’ll be presenting soon, and on trying to get something interesting to say about the search-and-copy algorithm in phonology.

The last thing I’ve looked at was the Homotopy Type Theory book:

http://www.homotopytypetheory.org

I’m almost done with chapter 2 now. I’m pretty happy with my progress; I might write up some notes on it later as a study aid.
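
As a taste of what chapter 2 covers (my own notation, from memory, so take it with a grain of salt): every function respects paths, and every type family lets you transport along a path:

ap_f : (x =_A y) → (f(x) =_B f(y))
transport^P : (x =_A y) → P(x) → P(y)

The slogan is that types behave like spaces, functions like continuous maps, and equalities like paths.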

Readings from the last two weeks

Ranta – Type-Theoretical Grammar: Less interesting than I thought it would be. I was excited to read how someone could use type theory to treat natural language, but it didn’t turn out to be that fascinating. He basically gives lists of natural language expressions and their translations into type theory. I was looking for a more generative or tree-like way of doing things.
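
For the record, the style of translation is roughly this (my reconstruction from memory, so the details may be off): common nouns become types, and quantified sentences become dependent products and sums over them:

(Πx : man) walk(x)   “every man walks”
(Σx : man) walk(x)   “some man walks”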

Jackendoff – On Idioms: I liked it. It makes a lot of sense to vary the syntactic tree-depth of a lexical expression as a function of its transparency, i.e., how much of its internal structure we have access to.

Horn – Idioms, Metaphors, and Syntactic Mobility: A follow-up to Jackendoff, less interesting. He sorts the different kinds of idioms into classes, calling some of them metaphors and others real idioms. He ends up with 3 or 4 classes that don’t really make much sense and whose boundaries are quite arbitrary. He talks about how some idioms are more metaphoric because their idiomatic meaning resembles their literal meaning in its argument structure. Bah.

Stabler – A Formalization of Minimalist Syntax: Nice and very clear. It’s at once a short introduction to minimalism and a formalisation of it.

Stabler – Derivational Minimalism: Stabler’s original article on minimalist grammars. Interesting. Maybe a bit hard to get into at first because of the notation and the definitions to remember, but I liked it.
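
To fix the notation in my head (my own toy example, not from the paper, so details may be off): a minimalist grammar is essentially a lexicon of items paired with feature sequences, where =x selects a phrase of category x, a bare x is the item’s own category, and +y/-y trigger movement:

the :: =n d
king :: n
which :: =n d -wh

Merge checks =n against n, so merging “the” with “king” yields “the king” of category d; move would later check a +wh feature against the -wh on “which”.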

Stabler – Two Models of Minimalist, Incremental Syntactic Analysis: Shows that minimalist grammars and multiple context-free grammars are strongly equivalent. Interesting. A big part of the paper goes into showing the isomorphism between derivations, so it’s more technical and less fun to look at, I guess.

Stabler – On Language Variation and Linguistic Invariants: A quick overview of their program of looking for structural invariants across languages.

Stabler – Mathematics of Language Learning: Interesting; it surveys different mathematical approaches people have used to study language learning, such as Markov chains and perceptrons.
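
Just to remind myself what the simplest of these looks like, here’s a quick sketch of the perceptron update rule in Python (my own toy version, nothing from the paper):

def perceptron_update(w, x, y, lr=1.0):
    # w: weight vector, x: input vector, y: correct label (+1 or -1)
    activation = sum(wi * xi for wi, xi in zip(w, x))
    prediction = 1 if activation >= 0 else -1
    if prediction != y:
        # nudge the weights toward the misclassified example
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
    return w

# e.g. perceptron_update([0.0, 0.0], [1.0, -1.0], -1) -> [-1.0, 1.0]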

Keenan, Stabler – Bare Grammar: They lay out a different approach to grammar: instead of positing syntactic categories and their properties up front, they look at how one grammatical sentence can be transformed into another grammatical one, and at how certain lexical items always stick together and can only be interchanged among themselves; these subsets then form the categories (e.g., swapping two proper names never affects grammaticality, so they end up in the same category).

Kracht – The Mathematics of Language (ch. 6): I read it to get a more formal introduction to transformational grammars and HPSG. Really pretty; I liked it.

Leinster’s Rethinking Set Theory

This article is a presentation of Lawvere’s Elementary Theory of the Category of Sets (ETCS) for people who may not have had an introduction to category theory before. He basically goes through the 10 axioms of ETCS and rephrases them using “simple” mathematical terms like sets and functions. This gives us something like the following (I unpack one of the axioms after the list):

- Function composition is associative and has identities

- There is an empty set

- There is a set with one element (a terminal object, in categorical terms)

- Functions satisfy the principle of extensional equality

- For any two sets, there is a cartesian product

- For any two sets, there is a set of functions from one to the other

- The inverse image f^{-1}(y) of an element y under a function f exists

- The subsets of a set correspond to the functions from that set to the 2-element set

- The set of natural numbers exists

- If a function is surjective, then it has a right inverse
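
To make one of these concrete (my own unpacking, using the same inline notation as above): the subset axiom says that a subset S of X is the same thing as its characteristic function, i.e.

S ⊆ X   ↔   χ_S : X → 2, where χ_S(x) = 1 if x ∈ S and 0 otherwise,

so subsets never have to be primitive objects; they are recovered from functions into the 2-element set.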


One of Leinster’s main problems with traditional set theories like ZFC is their lack of “sense”. For instance, ZFC’s ontology consists of sets only, and we are allowed to ask about the set membership of anything in the system, e.g. “is the dihedral group a member of 3?”; even though such questions do have answers, they look extremely strange and unnatural. His answer is ETCS, where our objects of study are sets, functions, and composition of functions.


Bottom line: Fun article, easy to read.

Wadler’s Propositions as Sessions

In the same vein as the propositions-as-types interpretation of intuitionistic logic, Wadler comes up here with a propositions-as-sessions interpretation of linear logic. In this setting, proofs correspond to processes and cut elimination to communication between processes.
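
Roughly, as I understand it (my own summary table, from memory), the connectives of classical linear logic get read as session types like this:

A ⊗ B: output an A, then behave as B
A ⅋ B: input an A, then behave as B
A ⊕ B: make a choice
A & B: offer a choice
!A: a server offering replicable sessions of type A
?A: a client requesting sessions of type A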


In the first section of the paper, he describes a calculus called CP in which classical linear logic propositions correspond to session types. It is meant to serve the same foundational purpose for concurrent programming that the lambda calculus serves for functional programming.


Later, he describes a linear functional language with session types called GV. A translation from GV to CP is given, which, combined with the cut-elimination proof for CP, shows that GV is deadlock-free.


Bottom line: I read a bit more than half of it; what I understood was interesting, although I’ve never really studied process calculi seriously, so I could only get a fairly superficial understanding of the paper’s content.

Blog

The idea here is to have a place where I can keep a journal of the academic stuff I do during the week. I’m not sure how it’s going to work, but I should, among other things, keep track of the articles I read, and maybe write short summaries of them so that it’s easier to go back later and remind myself of what I read. The primary audience here is myself, so what I write here might not always make sense, but that’s okay, as long as I understand it.

The subjects I will write about here will generally have to do with linguistics, math, computer science, logic, and anything at their intersections.