Course notes for CS154
This file will keep track of the material covered so far, and list the
hand-outs, homeworks and solutions distributed to date.
Preliminaries
Chapter 0 of the book consists of preliminaries that you should have
encountered in other courses. Please leaf through this chapter to
check that you are acquainted with its contents. Ask us for help if
necessary. Homework on this material was assigned during the first
lecture; it is due on Wednesday, January 10, at the beginning of
class.
Wednesday, January 10
I presented a short overview of the course and made a start with
Chapters 1 and 2.
Completely covered: 2.1 and 0.3.3.
Covered to a large extent: 2.2, 2.3.
Slightly covered: 1.1, 2.7, 2.6 and 2.9.
To be read: Chapter 1 up to and including Section 1.6,
and Chapter 2 up to and including Section 2.6.
Monday January 15 is a holiday.
Wednesday, January 17
Chapters 1 and 2 are now considered to be completely covered.
Except for the RAM, the devices of Chapter 1 were introduced formally.
Read Chapter 1 for examples and for the RAM. Chapter 2 was treated in
class. Here are two slides with the main
definitions of Sections 2.2 to 2.7. The second
homework has been handed out.
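The devices introduced formally in class can be illustrated with a
minimal simulator for the simplest of them, the deterministic finite
automaton. (A sketch of my own, not from the book; the
even-number-of-a's example is likewise my own choice.)

```python
# A minimal DFA simulator: a transition function, a start state,
# and a set of accepting states (illustrative sketch only).

def run_dfa(delta, start, accepting, word):
    """Return True iff the DFA accepts `word`."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

# Example: a two-state DFA over {a, b} accepting the strings with an
# even number of a's.
delta = {
    ("even", "a"): "odd",  ("even", "b"): "even",
    ("odd",  "a"): "even", ("odd",  "b"): "odd",
}
```

Running the simulator on a few words confirms that only the a-count
matters, not the b's.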
Monday, January 22
Sections 3.1 to 3.3, with the exception of Section 3.3.2, have been covered
in class. Examine this material in the book as well, to familiarize
yourself with the details.
Wednesday, January 24
I finished Chapter 3 and treated Sections 4.2 and 4.5.
Section 4.3 is to be read on your own.
The third homework has been handed out, as
well as solutions to the first.
Monday, January 29
Sections 4.1, 4.4.3, 4.6 and 4.7. Section 4.4 contains several
algorithms to convert an NFA into a regular expression determining the
same language. Of these only the so-called pencil-and-paper algorithm
has been covered, together with a proof of its correctness.
This proves Corollary 4.14. The rest of this section has been skipped.
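I am assuming the book's pencil-and-paper algorithm is the usual
state-elimination construction; under that assumption, the idea can be
sketched as follows (the three-state example automaton is my own, and
Python `re` syntax is used for the resulting expressions):

```python
import re

def eliminate_states(n, edges, start, final):
    """Sketch of the state-elimination algorithm: states are 0..n-1,
    and `edges` maps (p, q) to a regular expression (Python `re`
    syntax) labelling the arrow p -> q.  Assumes `start` has no
    incoming and `final` no outgoing arrows.  Every other state is
    removed in turn, rerouting the paths through it; the expression
    labelling start -> final is returned."""
    R = dict(edges)
    for s in range(n):
        if s in (start, final):
            continue
        loop = "(?:%s)*" % R[(s, s)] if (s, s) in R else ""
        preds = [p for (p, q) in R if q == s and p != s]
        succs = [q for (p, q) in R if p == s and q != s]
        for p in preds:
            for q in succs:
                path = "(?:%s)%s(?:%s)" % (R[(p, s)], loop, R[(s, q)])
                R[(p, q)] = ("(?:%s)|(?:%s)" % (R[(p, q)], path)
                             if (p, q) in R else path)
        R = {(p, q): r for (p, q), r in R.items() if s not in (p, q)}
    return R.get((start, final), "")

# Example: 0 --a--> 1 --b--> 2, with a loop labelled c on state 1;
# the accepted language is a c* b.
regex = eliminate_states(3, {(0, 1): "a", (1, 1): "c", (1, 2): "b"},
                         start=0, final=2)
```

Checking the result against `re` shows the eliminated automaton and
the produced expression accept the same strings.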
The solutions to the second homework
have been handed out.
Wednesday, January 31
I treated Sections 4.7, 4.8 and 4.9, except 4.8.1 and 4.8.2 which will
be done later. The fourth homework has been
handed out.
The midterm exam will be on Friday February 17. You will be tested on
Chapters 1-6 minus the material that has been skipped (so far only
most of Section 4.4). I will try to treat the remaining material in
the next three or four sessions.
The remaining 8 (or 7) sessions of the course will be devoted to
Chapters 7, 9 and 10. These sessions also constitute the course CS154N.
Monday, February 5
I indicated several ways to associate a language with a context-free
grammar:
- The set of words that have parse trees (FB 5.4)
- The set of words that have derivations (FB 5.5)
- The set of words that have leftmost derivations (FB 5.5)
- The least solution of the corresponding inclusions (handout)
- The least solution of the corresponding equations (FB 5.1)
In class I proved the equivalence of methods 1, 2 and 3, following
Section 5.5 of the book. Namely 1 => 3 => 2 => 1.
Method 4 and its equivalence with derivations is treated in a
handout. Carefully read sections 1 and 2
of the handout. This replaces Section 5.2 of the book, as well as the
proof of Theorem 5.11.
The equivalence of 4 and 5 will be treated Wednesday.
It follows that all methods agree on the language associated with a
context-free grammar; the languages so obtained are called the
context-free languages.
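Method 4 can be made concrete by computing the least solution of the
inclusions iteratively from below, in the spirit of the handout's
bottom-up approach. A Python sketch of my own (the length bound is
only there so the iteration reaches a fixed point):

```python
def least_solution(productions, start, max_len):
    """Approximate the least solution of the language equations of a
    CFG by iterating upward from the empty sets, keeping only words
    of length <= max_len so the iteration terminates.  `productions`
    maps each nonterminal to a list of alternatives, each a tuple of
    symbols; symbols not in `productions` are terminals."""
    lang = {A: set() for A in productions}
    changed = True
    while changed:
        changed = False
        for A, alts in productions.items():
            for alt in alts:
                # Concatenate the current approximants along the alternative.
                words = {""}
                for sym in alt:
                    if sym in productions:   # nonterminal
                        words = {u + v for u in words for v in lang[sym]}
                    else:                    # terminal symbol
                        words = {u + sym for u in words}
                words = {w for w in words if len(w) <= max_len}
                if not words <= lang[A]:
                    lang[A] |= words
                    changed = True
    return lang[start]

# Grammar S -> a S b | epsilon, alternatives written as tuples.
g = {"S": [("a", "S", "b"), ()]}
```

For this grammar the iteration stabilizes on the words a^n b^n up to
the length bound, as the least-solution view predicts.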
The languages accepted by NSAs are exactly the context-free languages.
The proof was given in full for CFL => NSA and sketched for NSA =>
CFL in class. The details are in Section 5.6 of the book, which you
should read as well. An optional alternative approach to NSA => CFL
is in Section 3 of the handout.
Wednesday, February 7
I treated method 5 to associate languages to context free grammars,
following the same method as the bottom up approach for method 4 in
the handout (the top-down method doesn't work for equations). It
follows that methods 4 and 5 yield equivalent results.
Subsequently I elaborated the proof that NSA languages are
context-free.
Thirdly the pumping lemma for context-free languages was treated
(Section 5.8 until page 361). Please read the rest of this section, as
well as Section 5.9 on Ambiguity (on which homework has been
assigned); there will be no time to treat Section 5.9 in class.
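A standard illustration of how the pumping lemma is applied (my own
example, not taken from the lecture): showing that the language
L = { a^n b^n c^n : n >= 0 } is not context-free.

```latex
% Suppose L = \{ a^n b^n c^n \mid n \ge 0 \} were context-free,
% with pumping constant p.  Take z = a^p b^p c^p and any decomposition
%     z = u v w x y, \qquad |vwx| \le p, \qquad |vx| \ge 1.
% Since |vwx| \le p, the substring vwx meets at most two of the three
% blocks a^p, b^p, c^p.  Pumping therefore changes the count of at
% most two of the three letters, so
%     u v^2 w x^2 y \notin L,
% contradicting u v^i w x^i y \in L for all i \ge 0.
```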
Finally I proved the composition theorem (Section 4.8.2), and its
consequence (Section 4.8.1) that any class of languages accepted by
machines as in this book is closed under the converses of
(deterministic) finite transducers.
Solutions to homework 3 (paper copy only) and the fifth homework have been handed out.
Monday, February 12
I treated the remainder of Sections 4.8.1 and 6.1 on closure
properties, as well as Section 5.3 on Chomsky normal form.
Handout 11 (paper copy only), replacing Section 5.10 on Greibach
normal form, has been distributed.
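One standard payoff of Chomsky normal form (my own choice of
illustration; the CYK algorithm itself is not among the covered
sections) is that it enables a cubic-time membership test:

```python
def cyk(word, rules, start):
    """CYK membership test sketch for a grammar in Chomsky normal
    form: `rules` is a list of pairs (A, rhs), where rhs is either a
    single terminal string "a" or a pair (B, C) of nonterminals.
    T[i][j] collects the nonterminals deriving word[i..j]."""
    n = len(word)
    if n == 0:
        return False  # the empty word needs separate treatment in CNF
    T = [[set() for _ in range(n)] for _ in range(n)]
    for i, a in enumerate(word):
        T[i][i] = {A for A, rhs in rules if rhs == a}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for A, rhs in rules:
                    if (not isinstance(rhs, str)
                            and rhs[0] in T[i][k] and rhs[1] in T[k + 1][j]):
                        T[i][j].add(A)
    return start in T[0][n - 1]

# A CNF grammar for { a^n b^n : n >= 1 }:
#   S -> AB | AT,  T -> SB,  A -> a,  B -> b
rules = [("S", ("A", "B")), ("S", ("A", "T")),
         ("T", ("S", "B")), ("A", "a"), ("B", "b")]
```

Because every rule produces exactly two nonterminals or one terminal,
the table can be filled by splitting each span at every midpoint.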
Wednesday, February 14
Sections 5.7, 5.11 and 6.2 were treated.
Sections 5.12, 6.3, 6.4 and 6.7 are skipped.
The handout replacing Section 5.10 will, due to lack of time, probably
not be covered in class; read this material at home. Homework on this
topic will be assigned. Sections 6.5 and 6.6 are postponed until
Wednesday the 21st. Monday the 19th will be a holiday.
The sixth homework and solutions to homeworks 4 and 5 are now available.
Overview of covered material for midterm exam
The midterm exam covers the material of Chapters 1 to 6.
It is an open book exam, Friday February 17 from 2:15 to 5:15.
- Skipped: (No questions will be asked on this material)
- Section 4.4 except for Corollary 4.14 and the
pencil-and-paper algorithm in Subsection 4.4.3.
- Section 5.12.
- Sections 6.3, 6.4 and 6.7.
- Replaced by a handout: (Questions may be asked about the
replacing handouts)
- Section 5.2 is replaced by Handout 8, Section 1.
- The proof of Theorem 5.11 in Section 5.4 is replaced by
Handout 8, Section 2.
- Section 5.10 is replaced by Handout 11.
- Assigned reading, not treated explicitly in class:
(Questions relating to this material may be asked as well).
- Chapter 0
- Section 1.8
- Section 4.3
- Section 5.9
- Covered in class:
All the rest, except:
- Still to be treated: (no questions on this material)
- Handout 11 (replacing Section 5.10) on Greibach normal form.
(This will be assigned reading only.)
- Sections 6.5 and 6.6, simulating stacks with counters.
CS154N
Wednesday, February 21
I introduced the subject of computability and Church's thesis, and
covered Sections 7.1 and 6.5. Sections 7.5, 7.10, 7.11 and 7.14 will
be skipped. Of Section 7.2 probably only the 1-TMs will be done
(briefly). Section 7.3 is left for home reading. It is worth reading
Sections 7.4 and 7.6 before class (together with the covered
material), as this material may be hard to follow without
preparation.
The midterm exam has been returned, together with its model solution. If you didn't get it, you may
obtain yours from course staff during their office hours until Monday
(Feb. 26). After that time, it will be available with the handouts.
Homework 6(I) will be due on Monday
February 26; homework 6(II), which was handed out today, on
Wednesday the 28th.
Monday, February 26
I treated Sections 6.6, 7.2 (but only the part about 1-TM's in
detail), 7.4, 7.7 and 7.8.1.
Wednesday, February 28
I proved the equivalence of deterministic and nondeterministic Turing
machines (7.16) in great detail. I also showed that a language is
r.e. if and only if there exists an inattentive DTT (that is, a
deterministic Turing transducer without input device) that outputs all
strings in the language one by one, separated by a special symbol.
Obviously an inattentive deterministic program has only one
computation, which in this case is infinite. We covered the rest of
Section 7.6 as well, except for the proofs of the two Range Theorems
(Thm 7.20). Please read these proofs (and the covered material) at home.
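The enumeration direction of this equivalence can be sketched by
dovetailing: run a semi-decider on every string for more and more
steps, and output a string once its run accepts. A Python sketch of my
own (the step-bounded `accepts_within` interface and the toy
semi-decider are assumptions for illustration):

```python
from itertools import count, product

def enumerate_re(accepts_within, alphabet, rounds=None):
    """Dovetailing enumerator sketch: `accepts_within(w, k)` is
    assumed to tell whether a semi-decider accepts w within k steps.
    Yields each accepted string exactly once; with rounds=None this
    runs forever, like the inattentive transducer's single infinite
    computation."""
    emitted = set()
    for k in (count() if rounds is None else range(rounds)):
        for n in range(k + 1):
            for tup in product(alphabet, repeat=n):
                w = "".join(tup)
                if w not in emitted and accepts_within(w, k):
                    emitted.add(w)
                    yield w

# Toy semi-decider: accepts the strings containing "ab", pretending
# the run on w takes len(w) steps.
toy = lambda w, k: "ab" in w and len(w) <= k
```

Bounding the number of rounds lets the infinite process be inspected:
every accepted string eventually appears, and none appears twice.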
I also introduced the important notion of (many-one) reduction (7.9).
Do read the applications of this notion in Section 7.9.
Many examples of reduction (7.9, 7.13), including the proof that
first-order logic is undecidable (7.12) will be presented in the
discussion session. You should be able to prove languages nonrecursive
(or non-r.e.) by using reductions (in the right direction!). If you do
not go to the discussion session, make sure to pick this up from the
book. Section 7.13 is full of good examples.
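The mechanics of a many-one reduction can be sketched in code: to show
a language undecidable, compute a map from instances of a
known-undecidable problem that preserves membership. Here HALT is
reduced to a hypothetical "prints hello" problem; the problem, the
`main(x)` convention, and the example program are all my own
illustration, not from the book:

```python
def halts_to_prints_hello(program_src, input_repr):
    """Sketch of a many-one reduction HALT <=_m PRINTS-HELLO.
    Given the source of a program defining main(x) and (the repr of)
    an input, build a program that prints 'hello' iff main(input)
    halts.  The reduction only glues text together, so it is clearly
    computable; hence if HALT is undecidable, so is PRINTS-HELLO
    (note the direction of the reduction!)."""
    return program_src + "\nmain(" + input_repr + ")\nprint('hello')\n"

# The construction can only be sanity-checked on halting instances:
src = halts_to_prints_hello("def main(x):\n    return x + 1", "41")
```

Running the produced source on a halting instance indeed prints
`hello`, confirming the membership-preserving direction on this case.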
Due to lack of time, I will not be able to present more of Chapter 7
in class (apart from 7.8.3). Next week will be devoted to the
beginning of Chapter 8, and Chapter 9. For the final you should be
familiar with 6.5, 6.6, 7.1, 7.2 (1-TM), 7.4, 7.6, 7.7, 7.8, 7.9,
7.12, and to a lesser extent with 7.3 and 7.13.
The homework on this material is a bit
more than usual, and can earn you 70 points. This reflects the
importance of the material. Upon popular request there are more extra
credit problems as well.
Monday, March 4
I treated Sections 7.8.3 and 8.1 in great detail. 8.2 and 8.3 are
postponed until next week, 8.4 will be covered on Wednesday, and
8.5 will be skipped. I started Chapter 9 by introducing the complexity
classes P and NP (9.1). You are advised to read ahead through Sections
9.1-4. The rest of Chapter 9 mostly consists of applications of the
theory presented in these sections. The most important application is
Section 9.6.
Wednesday, March 6
Today I treated Sections 9.1 to 9.4, as well as the concept of a
Turing reduction from Section 8.4. In addition I introduced the
classes of languages DSPACE(f), NSPACE(f), DTIME(f) and NTIME(f) for
integer functions f. The following theorems, which are not covered in
the book, were stated but not proved:
- If c > 0 is a constant then DSPACE(f) = DSPACE(cf),
  and if f grows faster than linearly then also DTIME(f) = DTIME(cf)
- DTIME(f) ⊆ NTIME(f) ⊆ NSPACE(f) and DTIME(f) ⊆ DSPACE(f) ⊆ NSPACE(f)
- NSPACE(f) ⊆ DSPACE(f^2)
- If L is in NTIME(f), then there is a constant d such that L is in DTIME(d^f)
- If L is in DSPACE(f) with f(n) > log(n), then L is in DTIME(d^f) for some d
The last homework has been handed out. Before doing the homework you
should be familiar with the covered material and at least the
statement of the results in Sections 9.5 - 9.7. In Friday's discussion
session you will see examples of NP-completeness proofs. I plan to
prove the NP-completeness theorems of Sections 9.5 and 9.6 on
Monday.
Monday, March 11
Today's lecture covered Sections 9.5 and 9.6, after recalling the
concepts of NP-hardness and NP-completeness. With this, we will have
covered Chapter 9 (Sections 9.1 to 9.6). You should know the problems
that are proven NP-complete or PSPACE-complete in Sections 9.7, 9.8
and 9.10 without necessarily knowing the proofs. You may be tested on
your ability to prove the NP-completeness of a new problem using these
problems. You may skip Section 9.9.
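Behind these NP-completeness proofs is the certificate view of NP: a
language is in NP iff membership can be checked in polynomial time
given a short certificate. A sketch for SAT (the clause encoding,
DIMACS-style signed integers, is my own choice of illustration):

```python
def verify_sat(clauses, assignment):
    """Polynomial-time verifier sketch for SAT: `clauses` is a list
    of clauses, each a list of nonzero integers (k stands for
    variable k, -k for its negation); `assignment` is the
    certificate, mapping variables to booleans.  Checking is linear
    in the formula size; *finding* a satisfying assignment is the
    presumably hard part."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# phi = (x1 or not x2) and (x2 or x3)
phi = [[1, -2], [2, 3]]
```

An NP-hardness proof then goes the other way: a polynomial-time map
from SAT instances to instances of the target problem that preserves
yes/no answers.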
Wednesday, March 13
Statement and significance of Matijasevic's Diophantine theorem,
Theorem 7.82 (not the proof), Gödel's Incompleteness Theorem, and a
review session for the final.
About the Final Exam
It is an open book exam;
Wednesday, March 20 from 12:15 to 3:15pm in Jordan 041
- Be familiar with 6.5, 6.6, 7.1, 7.2
(1-TM), 7.4, 7.6, 7.7, 7.8, 7.9, 7.12, and to a lesser extent with 7.3
and 7.13. (This is the material dealing with computability theory)
- Gödel's Incompleteness Theorem (essence of argument) and Theorem
7.82 (statement)
- Section 8.1 (Rice), Section 8.4 (concept of a Turing reduction,
omitting 8.4.1-3).
- Sections 9.1-9.6 thoroughly; 9.7, 9.8 and 9.10, omitting proofs not
done in class.
- Material before the midterm, including the handout (# 11) replacing the
book section on Greibach Normal Form.
Rob van Glabbeek
rvg@CS.Stanford.EDU