We saw three stages of logic:

- Propositional logic, with formulas like **DickLikesJane** ⇒ ¬**JaneLikesDick**. While the propositions are named suggestively, nothing in the logic enforces a relation among them; the formula is equivalent to *A* ⇒ ¬*B*.
- Predicate logic, where variables (and constants) can express a connection between different parts of the formula: **likes(***y*, *x***)** ⇒ ¬**likes(***x*, *y***)**. Predicate logic introduced the idea of variables, and required domains and interpretations to determine truth. But it can't bind variables, and thus requires an interpretation of *x* and *y* to evaluate.
- First-order logic, which included two quantifiers to bind variables: ∀*y* : (∃*x* : (**likes(***y*, *x***)** ⇒ ¬**likes(***x*, *y***)**))
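To see how domains and interpretations fit in, the first-order formula above can be evaluated once we fix a finite domain and an interpretation of likes. A minimal Python sketch (the two-person domain and the names `evaluate`, `implies` are illustrative, not from the lecture):

```python
def implies(p: bool, q: bool) -> bool:
    # Material implication: p ⇒ q is false only when p is true and q is false.
    return (not p) or q

def evaluate(likes: set, domain: set) -> bool:
    """Evaluate ∀y: (∃x: (likes(y, x) ⇒ ¬likes(x, y))).

    `likes` is one interpretation: the set of pairs (a, b)
    for which likes(a, b) is true.
    """
    return all(
        any(implies((y, x) in likes, (x, y) not in likes) for x in domain)
        for y in domain
    )

people = {"Dick", "Jane"}
# Jane likes Dick, unrequited: a witness x exists for every y.
print(evaluate({("Jane", "Dick")}, people))  # → True
# Everyone likes everyone: no witness x exists for any y.
everyone = {(a, b) for a in people for b in people}
print(evaluate(everyone, people))  # → False
```

Note that the truth value depends entirely on the interpretation passed in, which is exactly why unquantified predicate formulas can't be judged true or false on their own.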

So why, you might ask, didn't we just start out with first-order logic in the first lecture? One reason, clearly, is to introduce concepts one at a time: everything you needed to know about one
level was needed in the next, and then some. But there's more: by restricting our formalisms, we can't express all the concepts of the bigger formalism, but we **can** have
automated ways of checking statements or finding proofs.
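Propositional logic makes this payoff concrete: a formula over n propositions has only 2^n truth assignments, so validity can be checked mechanically by enumerating them all. A brute-force sketch in Python (the helper name `is_tautology` is ours, not the lecture's):

```python
from itertools import product

def is_tautology(formula, names):
    """Check a propositional formula against all 2^n truth assignments.

    `formula` maps a dict of {name: bool} to the formula's truth value.
    """
    return all(
        formula(dict(zip(names, values)))
        for values in product([False, True], repeat=len(names))
    )

# A ⇒ ¬B fails when A and B are both true, so it is not a tautology ...
print(is_tautology(lambda v: (not v["A"]) or (not v["B"]), ["A", "B"]))  # → False
# ... but (A ⇒ B) ∨ (B ⇒ A) holds under every assignment.
print(is_tautology(
    lambda v: ((not v["A"]) or v["B"]) or ((not v["B"]) or v["A"]),
    ["A", "B"],
))  # → True
```

No analogous exhaustive procedure exists once quantifiers range over arbitrary domains: first-order validity is undecidable, which is the trade-off the paragraph above is pointing at.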

In general, this is a common theme in the theory of any subject: determining when and where you can (or need to) trade off expressibility for predictive value. For example:

- Linguistics: Having a set of precise rules for (say) Tagalog grammar allows you to determine what is and isn't a valid sentence; details of the formal grammar can reveal relations to
other languages which aren't otherwise so apparent. On the other hand, a grammar for any natural language is unlikely to exactly capture all things which native speakers say and understand.
If working with a formal grammar, one needs to know what is being lost and what is being gained.
- Dismissing a grammar as irrelevant because it doesn't entirely reflect usage misses the point of the grammar;
- Conversely, condemning some real-life utterances as ungrammatical (and ignoring them) forgets that the grammar is a model which captures many (if not all) important properties.

Of course, any reasonable debate on this topic respects these two poles and is actually about where the best trade-off between them lies.

- Psychology: Say, Piaget^{2} might propose four stages of learning in children. The model may trade off total accuracy for (say) clues of what to look for in brain development.
- Physics: Modern pedagogy must trade off quantum accuracy for Newtonian approximations. Researchers exploring fields like particle physics must trade off exact simulations for statistical ("stochastic") approximations.

Understanding the theoretical foundations of a field is often critical for knowing how to apply various techniques in practice.
