**Table of Contents**

- Preface
- Introduction
- I. AUTOMATA
- 1. Automata and Formal Languages
- 1.1. Languages and grammars
- 1.2. Finite automata and regular languages
- 1.2.1. Transforming nondeterministic finite automata
- 1.2.2. Equivalence of deterministic finite automata
- 1.2.3. Equivalence of finite automata and regular languages
- 1.2.4. Finite automata with special moves
- 1.2.5. Minimization of finite automata
- 1.2.6. Pumping lemma for regular languages
- 1.2.7. Regular expressions

- 1.3. Pushdown automata and context-free languages

- 2. Compilers
- 3. Compression and Decompression
- 4. Reliable Computation

- II. COMPUTER ALGEBRA
- 5. Algebra
- 6. Computer Algebra
- 7. Cryptology
- 8. Complexity Theory

- III. NUMERICAL METHODS
- 9. Competitive Analysis
- 10. Game Theory
- 10.1. Finite games
- 10.2. Continuous games
- 10.2.1. Fixed-point methods based on best responses
- 10.2.2. Applying Fan's inequality
- 10.2.3. Solving the Kuhn-Tucker conditions
- 10.2.4. Reduction to optimization problems
- 10.2.5. Method of fictitious play
- 10.2.6. Symmetric matrix games
- 10.2.7. Linear programming and matrix games
- 10.2.8. The method of von Neumann
- 10.2.9. Diagonally strictly concave games

- 10.3. The oligopoly problem

- 11. Recurrences
- 12. Scientific Computing

- Bibliography

**List of Figures**

- 1.1. Finite automaton.
- 1.2. The finite automaton of Example 1.9.
- 1.3. Nondeterministic finite automata.
- 1.4. Transition tables of the NFA in Fig. 1.3
- 1.5. The DFA equivalent to the NFA in Fig. 1.3.
- 1.6. Equivalent DFA's (Example 1.11).
- 1.7. Non-equivalent DFA's (Example 1.12).
- 1.8. DFA of Example 1.13.
- 1.9. NFA associated with the grammar in Example 1.14.
- 1.10. Relations between regular grammars and finite automata. For any regular grammar one can construct an NFA which accepts the language generated by that grammar. Any NFA can be transformed into an equivalent DFA. For any DFA one can construct a regular grammar which generates the language accepted by that DFA.
- 1.11. Finite automata with ε-moves.
- 1.12. NFA equivalent to the FA with ε-moves given in Fig. 1.11.
- 1.13. (a) Representation of an NFA. Initial states are represented by a circle with an arrow, final states by a double circle. (b) Union of two NFA's.
- 1.14. (a) Product of two FA. (b) Iteration of an FA.
- 1.15. Minimization of DFA.
- 1.16. Minimum automaton equivalent to the DFA in Fig. 1.15.
- 1.17. Sketch of DFA used in the proof of the pumping lemma.
- 1.18. Properties of regular expressions.
- 1.19. DFA from Example 1.20, to which a regular expression is associated by Method 1.
- 1.20. DFA in Example 1.21 to which a regular expression is associated by Method 1. The computations are in Figure 1.21.
- 1.21. Determining a regular expression associated to DFA in Figure 1.20 using sets .
- 1.22. Possible equivalent transformations for finding the regular expression associated with an automaton.
- 1.23. Transformation of the finite automaton in Fig. 1.19.
- 1.24. Steps of Example 1.23.
- 1.25. Possible transformations to obtain the finite automaton associated with a regular expression.
- 1.26. Associating finite automaton to regular expression .
- 1.27. Finite automaton associated to regular expression .
- 1.28. DFA's to minimize for Exercise 1.2-5
- 1.29. Finite automata for Exercises 1.2-6 and 1.2-7.
- 1.30. DFA for Exercise 1.2-9.
- 1.31. Pushdown automaton.
- 1.32. Example of pushdown automaton.
- 1.33. Transition graph of Example 1.27.
- 1.34. Computation tree to show acceptance of the word (see Example 1.27).
- 1.35. Computation tree to show that the pushdown automaton in Example 1.27 does not accept word .
- 1.36. Recognising a word by empty stack (see Example 1.28).
- 1.37. Derivation (or syntax) tree of word .
- 1.38. Decomposition of tree in the proof of pumping lemma.
- 2.1. The translator.
- 2.2. The structure of compilers.
- 2.3. The programs of the analysis and the synthesis.
- 2.4. The positive integer and real number.
- 2.5. The identifier.
- 2.6. A comment.
- 2.7. The character string.
- 2.8. grammar.
- 2.9. The sentential form and the analysed text.
- 2.10. The structure of the parser.
- 2.11. The syntax tree of the sentence .
- 2.12. The grammar.
- 2.13. The -item.
- 2.14. The function .
- 2.15. The automaton of the Example 2.15.
- 2.16. The structure of the parser.
- 2.17. The syntax tree of the sentence .
- 2.18. The automaton of the Example 2.22.
- 3.1. Frequency of letters in characters of English.
- 3.2. Example of a code tree.
- 3.3. Example of Shannon code and Shannon-Fano-Elias code.
- 3.4. Example of the Shannon-Fano algorithm.
- 3.5. Example of a Huffman code.
- 3.6. Example of arithmetic encoding with interval expansion.
- 3.7. Table of the first values for the Krichevsky-Trofimov estimator.
- 3.8. An example for a tree source.
- 3.9. Weighted context tree for source sequence ' ' with past . The pair denotes zeros and ones preceded by the corresponding context . For the contexts it is .
- 4.1. AND, OR and NOT gate.
- 4.2. The assignment (values on nodes, configuration) gets propagated through all the gates. This is the “computation”.
- 4.3. Naive parallel addition.
- 4.4. Failure at a gate.
- 4.5. An executive organ.
- 4.6. A restoring organ.
- 4.7. An executive organ followed by a restoring organ.
- 4.8. Reliable circuit from a fault-free circuit.
- 4.9. A shift register.
- 4.10. Part of a circuit which computes the sum of two binary numbers . We feed the digits of and beginning with the lowest-order ones, at the input nodes. The digits of the sum come out on the output edge. A shift register holds the carry.
- 4.11. A “computer” consists of some memory (shift registers) and a Boolean circuit operating on it. We can define the size of computation as the size of the computer times the number of steps.
- 4.12. Transmission through a noisy channel.
- 4.13. Using a refresher.
- 4.14. A regular expander.
- 6.1. The general scheme of modular computations.
- 6.2. Illustration of the operation of the `Classical-Euclidean` algorithm.
- 6.3. Illustration of the operation of the `Primitive-Euclidean` algorithm.

- 7.1. A typical scenario in cryptography.
- 7.2. Example of an encryption by the shift cipher.
- 7.3. Vigenère square: Plaintext “H” is encrypted as “A” by key “T”.
- 7.4. Example of an encryption by the Vigenère cipher.
- 7.5. Three graphs: is isomorphic to , but not to .
- 7.6. Two ciphertexts encrypting the same plaintext, see Exercise 7.1-1.
- 7.7. The Diffie-Hellman secret-key agreement protocol.
- 7.8. The RSA protocol.
- 7.9. Example of an RSA encryption (M = Message).
- 7.10. Digital RSA signatures.
- 7.11. The Rivest-Sherman protocol for secret-key agreement, based on
- 7.12. Goldreich, Micali, and Wigderson's zero-knowledge protocol for .
- 7.13. Simulation of the zero-knowledge protocol for without knowing .
- 8.1. A Turing machine with one input and two work tapes.
- 8.2. Transition function of for .
- 8.3. 's states, their meaning and their intention.
- 8.4. Comparison of some polynomial and exponential time functions.
- 8.5. What happens when the computers get faster?
- 8.6. Transition graph of a stochastic automaton for describing `Random-SAT`.
- 8.7. Runtimes of some algorithms for the satisfiability problem.
- 10.1. Prisoner's dilemma.
- 10.2. Game with no equilibrium.
- 10.3. Finite tree of Example 10.3.
- 10.4. Tree for Exercise 10.1-5.
- 10.5. The graph of function .
- 11.1. The form of particular solutions.
- 11.2. Binary trees with two and three vertices.
- 12.1. Forward and backward error.
- 12.2. Gaussian elimination.