CSC280 : Computer Models and Limitations
Course Schedule

Course Outline

Below is the course schedule and a brief description of each lecture (where a "lecture" typically corresponds to one or two 1¼-hour sessions). The schedule is not rigid and is subject to change in accordance with the progress we make in class.


Wednesday, January 18
Overview and Fundamental Concepts
We go over the course organization and motivate the course content. We also review some fundamentals from CSC 173 to refresh our memory.


Monday, January 23
Chapter 1: The Regular Languages
We review the concepts of regular languages and finite automata, including the descriptive equivalence of deterministic FAs, nondeterministic FAs and regular expressions.
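
As a small illustration of one direction of that equivalence, here is a sketch of the subset construction for an epsilon-free NFA (the dictionary encoding of automata is my own, purely illustrative choice): each DFA state is a set of NFA states, so an NFA with k states yields an equivalent DFA with at most 2^k states.

    # Sketch of the subset construction; delta maps (state, symbol) to a set of
    # states, and missing entries mean "no move".
    from itertools import chain

    def nfa_to_dfa(alphabet, delta, start, accepts):
        start_d = frozenset({start})
        dfa_delta, seen, todo = {}, {start_d}, [start_d]
        while todo:
            S = todo.pop()
            for a in alphabet:
                # The DFA moves to the set of all NFA states reachable in one step.
                T = frozenset(chain.from_iterable(delta.get((q, a), ()) for q in S))
                dfa_delta[(S, a)] = T
                if T not in seen:
                    seen.add(T)
                    todo.append(T)
        dfa_accepts = {S for S in seen if S & accepts}   # any set containing an NFA accept state
        return dfa_delta, start_d, dfa_accepts

    # Example: NFA over {0,1} accepting strings whose second-to-last symbol is 1.
    nfa_delta = {('p', '0'): {'p'}, ('p', '1'): {'p', 'q'},
                 ('q', '0'): {'r'}, ('q', '1'): {'r'}}
    dfa = nfa_to_dfa('01', nfa_delta, 'p', {'r'})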


Wednesday, January 25
Chapter 1: The Regular Languages, continued
We show that the regular languages are closed under the regular operations. We then study and prove the pumping lemma for regular languages and use it as a tool to prove that some languages are not regular. The basic idea behind the pumping lemma is this: if a string accepted by an FA is at least as long as the number of states of the FA, then some state must repeat while the string is processed, so the substring processed between two occurrences of that repeated state can be repeated as many times as we like without changing membership.
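
For reference, the lemma can be stated as follows (the standard formulation; the pumping length can be taken to be the number of states of an FA recognizing the language):

    If $A$ is regular, then there is a number $p$ such that every $s \in A$ with $|s| \ge p$
    can be written as $s = xyz$ with $|y| \ge 1$, $|xy| \le p$, and $xy^i z \in A$ for every $i \ge 0$.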


Monday, January 30
Section 2.1 — Context-free Languages

We recall the notions of context-free languages and parse trees and learn the concept of ambiguity. We also learn a restricted form of CFG called Chomsky Normal Form and show that every CFG can be converted into an equivalent grammar in Chomsky Normal Form.
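
For reference, a grammar is in Chomsky Normal Form when every rule has one of the following shapes, where $B$ and $C$ are variables other than the start variable and the $\varepsilon$-rule is permitted only for the start variable $S$:

    A \to BC, \qquad A \to a, \qquad S \to \varepsilon .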


Wednesday, February 1 & Monday, February 6
Section 2.2 — Pushdown Automata

The pushdown automaton is an NFA equipped with a stack as additional storage. We show that these automata recognize exactly the context-free languages.
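
As a small illustration (my own sketch, not an example from the course materials), the stack is exactly what lets a machine with finite control recognize the canonical context-free language { 0^n 1^n : n >= 0 }:

    # Deterministic pushdown-style recognizer for { 0^n 1^n : n >= 0 },
    # simulated with an explicit Python list as the stack.
    def accepts_0n1n(w: str) -> bool:
        stack = []
        seen_one = False
        for c in w:
            if c == '0':
                if seen_one:           # a 0 appearing after a 1 can never be accepted
                    return False
                stack.append('0')      # push a marker for each leading 0
            elif c == '1':
                seen_one = True
                if not stack:          # more 1s than 0s
                    return False
                stack.pop()            # match each 1 against a pushed 0
            else:
                return False           # not a string over {0, 1}
        return not stack               # accept iff every pushed 0 was matched

    assert accepts_0n1n("000111") and not accepts_0n1n("0011101")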


Wednesday, February 8 & Monday, February 13
Section 2.3 — The Existence of Non-context-free Languages

We study and prove the pumping lemma for context-free languages and use it as a tool to prove that some languages are not context-free. The idea behind the lemma is similar to that behind the one for regular languages; the difference is that the repetition argument is applied to the variables (nonterminals) along a path in a parse tree: if the string is long enough, some variable must repeat on a root-to-leaf path, and surgery on the two occurrences lets us pump two substrings in tandem.
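
For reference, the statement takes the following standard form:

    If $A$ is context-free, then there is a number $p$ such that every $s \in A$ with $|s| \ge p$
    can be written as $s = uvxyz$ with $|vy| \ge 1$, $|vxy| \le p$, and $u v^i x y^i z \in A$ for every $i \ge 0$.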


Wednesday, February 15
Chapter 3: Computability Theory
We recall the concept of Turing machines and study two variants (nondeterministic Turing machines and multi-tape Turing machines), showing that both are equivalent to the standard model. We also learn about the equivalence of TMs with other models of computation and the Church-Turing thesis, which asserts that whatever is effectively computable is Turing-computable. We briefly review the history of how the concept of the Turing machine was invented.
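
As a concrete reminder of the mechanics, here is a minimal sketch of a single-tape Turing machine simulator (the dictionary encoding of the transition function is my own, purely illustrative choice); the sample machine scans right and accepts exactly the binary strings containing an even number of 1s:

    BLANK = '_'

    def run_tm(delta, start, accept, reject, w, max_steps=10_000):
        tape = dict(enumerate(w))              # sparse tape: position -> symbol
        state, head = start, 0
        for _ in range(max_steps):
            if state in (accept, reject):
                return state == accept
            sym = tape.get(head, BLANK)
            state, write, move = delta[(state, sym)]
            tape[head] = write                 # write before moving the head
            head += 1 if move == 'R' else -1
        raise RuntimeError("step limit exceeded")

    # States q_even / q_odd track the parity of the 1s read so far.
    delta = {('q_even', '0'): ('q_even', '0', 'R'),
             ('q_even', '1'): ('q_odd',  '1', 'R'),
             ('q_odd',  '0'): ('q_odd',  '0', 'R'),
             ('q_odd',  '1'): ('q_even', '1', 'R'),
             ('q_even', BLANK): ('q_acc', BLANK, 'R'),
             ('q_odd',  BLANK): ('q_rej', BLANK, 'R')}

    assert run_tm(delta, 'q_even', 'q_acc', 'q_rej', '1010011')   # four 1s -> accept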


Monday, February 20
Section 4.1 — Decidable Languages
A language is Turing-decidable if there exists a Turing machine that halts in an accepting state on every member given as input and halts in a rejecting state on every nonmember. Many problems about regular expressions, finite automata, and context-free grammars are decidable; we study several such problems.
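
For instance, the acceptance problem for DFAs is decided simply by simulating the automaton on the input; a minimal sketch (with my own encoding of the DFA) looks like this:

    # Decider for A_DFA = { <D, w> : DFA D accepts string w }.
    # The DFA is given by a transition dict, a start state, and a set of accept states.
    def a_dfa(delta, start, accepts, w) -> bool:
        state = start
        for c in w:
            state = delta[(state, c)]   # one transition per input symbol, so this always halts
        return state in accepts         # accept iff the run ends in an accept state

    # Example DFA over {0,1} accepting strings that end in 1.
    delta = {('a', '0'): 'a', ('a', '1'): 'b',
             ('b', '0'): 'a', ('b', '1'): 'b'}
    assert a_dfa(delta, 'a', {'b'}, '0101') and not a_dfa(delta, 'a', {'b'}, '10')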


Wednesday, February 22
Section 4.2 — The Halting Problem
There are problems that are not Turing-decidable. We show here that the Halting Problem, which is to decide whether a given Turing machine accepts a given input string, is not Turing-decidable. The method we use to prove undecidability is diagonalization, which resembles the argument used to prove that the set of real numbers is not countable. We also show that there are languages that are not even Turing-recognizable.
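
The shape of the diagonal argument can be sketched as follows, writing Python in place of Turing machine descriptions and using a hypothetical function accepts that is assumed, for the sake of contradiction, to decide the problem:

    def accepts(M, w) -> bool:
        """Hypothetical decider for { <M, w> : machine M accepts input w } -- assumed for contradiction."""
        raise NotImplementedError

    def D(M) -> bool:
        # D accepts <M> exactly when M does NOT accept its own description <M>.
        return not accepts(M, M)

    # Running D on its own description yields the contradiction:
    #   D accepts <D>  <=>  accepts(D, D) is False  <=>  D does not accept <D>.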


Monday, February 27 & Wednesday, March 1
Section 5.1 — Undecidable Problems
We learn the technique of proving undecidability by reduction from one problem to another. To prove that a problem A is undecidable, we select a problem B already proven to be undecidable and show that if there were a Turing machine to decide A, then there would be a machine deciding B, too.
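
The pattern can be sketched as follows (hypothetical helper names of my own), instantiated with B = the acceptance problem shown undecidable in the previous lecture and A = the problem of deciding whether a given machine eventually halts (rather than accepts) on a given input:

    def decides_halting(M, w) -> bool:
        """Hypothetical decider for the halting question -- assumed, for contradiction, to exist."""
        raise NotImplementedError

    def simulate(M, w) -> bool:
        """Hypothetical simulator: run M on w and report whether it accepts (may loop forever)."""
        raise NotImplementedError

    def decides_acceptance(M, w) -> bool:
        # A decider for the acceptance problem, built from the assumed decider above.
        if not decides_halting(M, w):
            return False                # M loops on w, so in particular it does not accept w
        return simulate(M, w)           # M halts on w, so the simulation is guaranteed to terminate

    # Since the acceptance problem is undecidable, decides_halting cannot exist either.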


Monday, March 6 (continued on March 8)
Sections 5.2 & 5.3 — PCP and Reductions
We show the undecidability of a simple, puzzle-like problem called the Post Correspondence Problem (PCP).
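
A concrete instance conveys the puzzle flavor (a standard small example; the Python encoding is mine): a domino is a (top, bottom) pair of strings, and a match is a sequence of dominoes, repetitions allowed, whose concatenated tops spell the same string as the concatenated bottoms.

    dominoes = [("b", "ca"), ("a", "ab"), ("ca", "a"), ("abc", "c")]
    match = [1, 0, 2, 1, 3]    # tops: a b ca a abc   bottoms: ab ca a ab c

    top = "".join(dominoes[i][0] for i in match)
    bottom = "".join(dominoes[i][1] for i in match)
    assert top == bottom == "abcaaabc"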


Wednesday, March 8
Sections 5.2 & 5.3 — PCP and Reductions (cont'd from March 6)
We formalize the concept of reduction and use it to identify more undecidable problems. A language A is reducible to a language B if there exists a transformation F, computable by a Turing machine, that maps each member of A to a member of B and each nonmember of A to a nonmember of B.
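
In symbols, using the standard notation for mapping reducibility: $A \le_{\mathrm{m}} B$ iff there is a computable function $f$ such that, for every string $w$,

    w \in A \iff f(w) \in B .

Consequently, if $B$ is decidable then so is $A$; equivalently, if $A$ is undecidable then so is $B$.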


Monday, March 13 - Friday, March 17, Spring Break


Monday, March 20
Problem Session


Wednesday, March 22
NON-CUMULATIVE EXAM I
Covers up to Chapter 5 (inclusive).


Monday, March 27
Section 7.1 — Time Complexity Classes
We introduce the concept of the time complexity of algorithms and define deterministic as well as nondeterministic time complexity classes. We also learn basic simulation results relating these classes.
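
In the standard notation, the deterministic and nondeterministic classes are:

    \mathrm{TIME}(t(n)) = \{\, L \mid L \text{ is decided by some deterministic TM running in time } O(t(n)) \,\},
    \mathrm{NTIME}(t(n)) = \{\, L \mid L \text{ is decided by some nondeterministic TM running in time } O(t(n)) \,\}.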


Wednesday, March 29
Section 7.1 — Time Complexity Classes (cont'd)


Monday, April 3 & Wednesday, April 5
Sections 7.2 & 7.3 — Classes P and NP
We study the two most important time complexity classes, P and NP. The former is the class of languages decided by deterministic Turing machines in polynomial time, and the latter is its nondeterministic counterpart. We learn about some problems belonging to these classes.
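
In symbols:

    \mathrm{P} = \bigcup_{k \ge 1} \mathrm{TIME}(n^k), \qquad \mathrm{NP} = \bigcup_{k \ge 1} \mathrm{NTIME}(n^k).

An equivalent characterization we will also use: NP is the class of languages with polynomial-time verifiers, i.e., membership can be checked in polynomial time given a short certificate.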


Monday, April 10
Section 7.4 — NP-complete Problems
An NP-complete problem is one of the "most difficult" problems in NP, in the sense that every problem in NP is reducible to it in polynomial time. The first problem discovered to have this property is the Satisfiability Problem (SAT), the problem of deciding whether it is possible to assign Boolean values to the variables of a given formula so that it evaluates to true. We prove that this problem is NP-complete.
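
The certificate view of NP is easy to see for SAT restricted to conjunctive normal form (a small sketch with my own encoding: a formula is a list of clauses, each a list of nonzero integers, where k stands for variable x_k and -k for its negation); checking a proposed truth assignment takes only polynomial time:

    # Polynomial-time certificate check for CNF-SAT: the certificate is a truth assignment.
    def verify_sat(clauses, assignment) -> bool:
        # The formula is satisfied iff every clause contains at least one true literal.
        return all(any((lit > 0) == assignment[abs(lit)] for lit in clause)
                   for clause in clauses)

    clauses = [[1, -2], [2, 3]]                                 # (x1 OR NOT x2) AND (x2 OR x3)
    assert verify_sat(clauses, {1: True, 2: False, 3: True})
    assert not verify_sat(clauses, {1: False, 2: True, 3: False})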


Wednesday, April 12
Section 7.5 — More NP-complete Problems
Thousands of practical and important problems have been identified as being NP-complete. We review the proofs of NP-completeness of some such problems, including Vertex Cover, Hamiltonian Path, and Subset Sum.
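
Membership in NP is the easy half for each of these; for Subset Sum, for example, it comes down to the trivial certificate check below (illustrative numbers of my own choosing), while the substance of the lecture is the polynomial-time reduction that establishes completeness.

    # Certificate check for Subset Sum: the certificate is the claimed subset itself.
    S, t = [4, 11, 16, 21, 27], 25
    certificate = [4, 21]

    assert set(certificate) <= set(S)    # the certificate really is a subset of S
    assert sum(certificate) == t         # and it adds up to the target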


Monday, April 17
Section 8.1 — Savitch's Theorem
We introduce the concept of the space complexity of algorithms and define space complexity classes as we did for time complexity. We prove Savitch's Theorem, a very important simulation result about space complexity classes.
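
The statement, in its standard form:

    If $f(n) \ge n$, then $\mathrm{NSPACE}(f(n)) \subseteq \mathrm{SPACE}(f(n)^2)$.

In particular, nondeterminism costs at most a squaring of space, so PSPACE = NPSPACE.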


Wednesday, April 19
Sections 8.2 & 8.3 — PSPACE and PSPACE-complete Problems
We introduce the polynomial space class, written PSPACE, and study complete languages for this class.
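
In symbols:

    \mathrm{PSPACE} = \bigcup_{k \ge 1} \mathrm{SPACE}(n^k), \qquad \mathrm{P} \subseteq \mathrm{NP} \subseteq \mathrm{PSPACE}.

The canonical complete problem for this class is TQBF, the problem of deciding whether a fully quantified Boolean formula is true.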


Monday, April 24
Sections 8.4, 8.5, and 8.6 — NL
We study the logarithmic space classes L and NL, the classes of languages decidable by deterministic and nondeterministic algorithms, respectively, that use space proportional to only the logarithm of the input length. We present a complete problem for NL and show that NL is a subclass of P. We also show that the class NL is closed under complement.
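
In symbols:

    \mathrm{L} = \mathrm{SPACE}(\log n), \qquad \mathrm{NL} = \mathrm{NSPACE}(\log n), \qquad \mathrm{L} \subseteq \mathrm{NL} \subseteq \mathrm{P}.

The complete problem we present for NL is PATH (given a directed graph and two nodes s and t, is there a path from s to t?), and the closure under complement is the Immerman-Szelepcsényi theorem, NL = coNL.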


Wednesday, April 26
Problem Session


Monday, May 1
Problem Session


Wednesday, May 3
NON-CUMULATIVE EXAM II
Covers Chapters 7 and 8.