## 6.4. Linear Bounded Automata

In this section, we present a restricted, bounded version of the Turing machine, by which the class of context-sensitive languages can be characterized, as we have already mentioned in an earlier subsection. This version of the Turing machine may work only on the part of the tape where the input is written. These automata are called linear bounded automata (LBA).

Definition 36. Let LBA = (Q, T, V, q0, ♯, δ, F) be a Turing machine, where

δ : Q × (V \ {♯}) → 2^(Q × V × {Left, Right, Stay})

and

δ : Q × {♯} → 2^(Q × {♯} × {Left, Right, Stay}).

Then LBA is a (nondeterministic) linear bounded automaton.

One can observe that the ♯ signs cannot be rewritten to any other symbol; therefore, the automaton can store the results of its subcomputations only in the space occupied by the input, i.e., only the length of the input is available as workspace during the computation.
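As a rough illustration of this restriction, the following sketch (names and encoding are ours, not the text's) checks that a transition table never rewrites the border marker:

```python
# Minimal sketch (names and encoding are assumptions, not from the text):
# verify that a transition table respects the LBA restriction, i.e., a
# transition reading the border marker must write the border marker back.

def check_lba_delta(delta, marker="#"):
    """delta maps (state, symbol) to a set of (new_state, written, move)
    triples, with move in {"Left", "Right", "Stay"}."""
    for (state, symbol), triples in delta.items():
        for (_, written, move) in triples:
            if move not in {"Left", "Right", "Stay"}:
                return False
            if symbol == marker and written != marker:
                return False  # the marker would be rewritten: not an LBA
    return True

bad = {("q0", "#"): {("q1", "a", "Right")}}   # rewrites the marker
good = {("q0", "#"): {("qf", "#", "Right")}}  # keeps the marker
print(check_lba_delta(bad), check_lba_delta(good))  # False True
```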

Example 59. Give an LBA that accepts the language {a^i b^i c^i ∣ i ∈ ℕ}.

Solution:

Idea:

• The automaton rewrites the first a to A, and changes its state, looks for the first b.

• The automaton rewrites the first b to B, and changes its state, looks for the first c.

• The automaton rewrites the first c to C, and changes its state, looks (backward) for the first a.

• The capital letters A,B,C are read without changing them.

• The above movements are repeated.

• If finally only capital letters remain between the border ♯ signs, then the automaton accepts (the input).

Formally, let

LBA = ({q0, q1, q2, q3, q4, qf}, {a,b,c}, {a,b,c,A,B,C,♯}, q0, ♯, δ, {qf})

be a deterministic LBA, where δ consists of the following transitions:

1. δ(q0, ♯) = (qf, ♯, Right) – the empty word is accepted by LBA.

2. δ(q0, a) = (q1, A, Right) – the first (leftmost) a is rewritten to A and LBA changes its state.

3. δ(q1, a) = (q1, a, Right) – letter a is skipped in state q1 by moving to the right.

4. δ(q1, B) = (q1, B, Right) – capital B is also skipped.

5. δ(q1, b) = (q2, B, Right) – the leftmost b is rewritten to B and the state becomes q2.

6. δ(q2, b) = (q2, b, Right) – letter b is skipped in state q2 by moving to the right.

7. δ(q2, C) = (q2, C, Right) – capital C is also skipped in this state.

8. δ(q2, c) = (q3, C, Left) – the leftmost c is rewritten to C and LBA changes its state to q3.

9. δ(q3, a) = (q3, a, Left) – letters a, b are skipped in state q3

10. δ(q3, b) = (q3, b, Left) – by moving the head of the automaton to the left.

11. δ(q3, C) = (q3, C, Left) – capital letters C, B are skipped in state q3

12. δ(q3, B) = (q3, B, Left) – by moving the head of the automaton to the left.

13. δ(q3, A) = (q0, A, Right) – the head is positioned after the last A and the state is changed to q0.

14. δ(q0, B) = (q4, B, Right) – if there is a B right after the last A, the state is changed to q4.

15. δ(q4, B) = (q4, B, Right) – in state q4 capital letters B and C are skipped

16. δ(q4, C) = (q4, C, Right) – by moving the head to the right.

17. δ(q4, ♯) = (qf, ♯, Stay) – if in q4 only capital letters were found between the ♯ signs, LBA accepts.

The work of the automaton can be described as follows. It is clear by transition 1 that λ is accepted.

Otherwise the head reads the first letter of the input: if the input starts with an a, then it is replaced by A and q1 is the new state. If the first letter of the input is not an a, then LBA gets stuck, i.e., it halts without acceptance, since no transitions are defined in this state for the other input letters.

In state q1, LBA looks for the first b by moving to the right (skipping every a and B, if any, and halting without acceptance if any other letter is found before the first b). When a b is found, it is rewritten to a B and the automaton changes its state to q2.

In q2 the first c is searched for; the head may move right over b's and C's (but not over other letters). When it finds a c, it rewrites it to a C, changes the state to q3, and starts to move back to the left.

In q3 the head can move left over the letters a, b, B, C, and when it finds an A, it steps to the right and the state becomes q0. In q0, if there is an a under the head, then it is rewritten to an A and the whole loop starts again.

If in q0 a letter B is found (which can happen only when every a has already been rewritten to an A), LBA changes its state to q4. In this state, stepping to the right, LBA checks whether every b and c has already been rewritten; if so, i.e., their numbers are the same as the number of a's and their order was correct (the input is in the language a*b*c*), then LBA reaches the marker ♯ sign after the input and accepts.

Whenever, at any stage, a letter other than the expected one is read, LBA halts without accepting the input.

Thus, the accepted language is exactly {a^i b^i c^i ∣ i ∈ ℕ}.
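The behaviour described above can be checked by simulating the machine directly. The following sketch is ours, with some assumed conventions: the tape is stored as a Python list holding '#' + input + '#' (using '#' for ♯), the head starts on the first input cell, moves are encoded as +1, -1, 0 for Right, Left, Stay, and an undefined transition means halting without acceptance.

```python
# Deterministic simulator of the LBA for { a^i b^i c^i : i in N } sketched
# in the text; '#' stands for the border marker, +1/-1/0 for Right/Left/Stay.

DELTA = {
    ("q0", "#"): ("qf", "#", +1),  # empty word accepted
    ("q0", "a"): ("q1", "A", +1),  # rewrite the leftmost a
    ("q1", "a"): ("q1", "a", +1),
    ("q1", "B"): ("q1", "B", +1),
    ("q1", "b"): ("q2", "B", +1),  # rewrite the leftmost b
    ("q2", "b"): ("q2", "b", +1),
    ("q2", "C"): ("q2", "C", +1),
    ("q2", "c"): ("q3", "C", -1),  # rewrite the leftmost c, turn back
    ("q3", "a"): ("q3", "a", -1),
    ("q3", "b"): ("q3", "b", -1),
    ("q3", "C"): ("q3", "C", -1),
    ("q3", "B"): ("q3", "B", -1),
    ("q3", "A"): ("q0", "A", +1),  # step right of the last A
    ("q0", "B"): ("q4", "B", +1),  # every a rewritten: start final check
    ("q4", "B"): ("q4", "B", +1),
    ("q4", "C"): ("q4", "C", +1),
    ("q4", "#"): ("qf", "#", 0),   # only capitals remained: accept
}

def lba_accepts(word):
    tape = list("#" + word + "#")
    state, head = "q0", 1          # head on the first input cell
    while state != "qf":
        key = (state, tape[head])
        if key not in DELTA:
            return False           # stuck: halt without acceptance
        state, tape[head], move = DELTA[key]
        head += move
    return True

for w in ["", "abc", "aabbcc", "aabbc", "abca", "cba"]:
    print(repr(w), lba_accepts(w))
```

Running the loop confirms that exactly the words of the form a^i b^i c^i are accepted and all the other test words lead to an undefined transition, i.e., rejection.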

To establish a connection between the classes of context-sensitive languages and linear bounded automata we present the following theorem.

Theorem 46. The class of languages accepted by linear bounded automata and the class of context-sensitive languages coincide.

Proof. We do not give a formal proof here; instead, we present the idea of the proof. A context-sensitive language can be defined by a monotone grammar. In a monotone grammar (apart from the derivation of the empty word, if it is in the language), no derivation step can decrease the length of the sentential form. Consequently, starting from the derived word and applying the rules in reverse, the length of the sentential form never increases until the start symbol is obtained, so the whole computation fits within the space of the input. In this way every context-sensitive language can be accepted by an LBA.

Conversely, the work of an LBA can be described by a grammar working in the inverse way of the generation (starting from a word, the start symbol is the target). These grammars are similar to the monotone grammars used above, and thus, if an LBA accepts a language L, then L is context-sensitive.
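The backward search of the proof idea can be sketched in a few lines. The monotone grammar below, generating {a^n b^n c^n ∣ n ≥ 1}, is an assumed example chosen for the illustration, not taken from the text; the reduction simply replaces right-hand sides by left-hand sides until the start symbol S remains. Because the grammar is monotone, sentential forms never grow during the search, so only words of length at most |w| are ever visited.

```python
from functools import lru_cache

# Monotone grammar (assumed example, generating { a^n b^n c^n : n >= 1 }):
#   S -> aSBc | abc,   cB -> Bc,   bB -> bb
RULES = [("S", "aSBc"), ("S", "abc"), ("cB", "Bc"), ("bB", "bb")]

@lru_cache(maxsize=None)
def reduces_to_S(form):
    """Backward search: can `form` be reduced to the start symbol S by
    repeatedly replacing a right-hand side with its left-hand side?"""
    if form == "S":
        return True
    for lhs, rhs in RULES:
        start = form.find(rhs)
        while start != -1:
            if reduces_to_S(form[:start] + lhs + form[start + len(rhs):]):
                return True
            start = form.find(rhs, start + 1)
    return False

for w in ["abc", "aabbcc", "aaabbbccc", "aabbc", "abcabc"]:
    print(w, reduces_to_S(w))
```

This brute-force reduction is exponential in general; the point is only that an LBA can carry it out within the space of the input, which is the essence of the proof idea.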

QED.

Actually, there are other models of LBA, in which the workspace (the maximal tape length used during the computation) is limited by c1 · ∣w∣ + c0, where w is the input and c0, c1 ∈ ℝ are constants. The accepting power of these models is the same as that of the model presented above.

However, the deterministic model is more interesting, since it is related to a long-standing and still open question.

It is known that every context-sensitive language can be accepted by a deterministic Turing machine using at most c2 · ∣w∣² + c1 · ∣w∣ + c0 space during the computation, where c2, c1, c0 are constants. However, it is neither proved nor disproved that deterministic linear bounded automata (using only c1 · ∣w∣ + c0 space) can recognize every context-sensitive language. This is still a famous open problem.

Exercise 85. Give a linear bounded automaton that accepts the language

{a^i b^j a^i b^j ∣ i, j ∈ ℕ}.

Exercise 86. Give a linear bounded automaton that accepts the language

{a^(2^i) ∣ i ∈ ℕ},

i.e., the language of powers of two in unary coding.

Exercise 87. Give a linear bounded automaton that accepts the set of primitive words over the alphabet {a,b}. (A word w is primitive if it is not of the form u^n for any word u ≠ w.)