Words & Parallel Universes
--
Let’s assume that in a parallel universe a word looks like a tree of letters instead of a sequence, so we can no longer enumerate the letters of a word. Ask yourself these questions:
- How different would the languages look?
- How much more complex would the algorithms be?
- How could books adapt to the change?
- What would be the consequences?
In the following sections, I will introduce some concepts from formal languages & automata theory, then redefine and analyze the grammar types.
Introduction to Formal Languages & Automata
If you don’t understand the concepts, you can find additional sources at the end.
Important Definitions
Alphabet: A set of letters Σ = {a_{1}, …, a_{m}}. Each letter stands for a value.
Word: A sequence of letters w = (w_{1}, w_{2}, …, w_{l}) or simply w_{1} w_{2} … w_{l}, where ∀ i ∈ {1, …, l} . w_{i} ∈ Σ. Each word stands for an argument.
Language: A set of words. Each language stands for a problem, which accepts or rejects words.
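To make these three definitions concrete, here is a minimal sketch in Python. The two-letter alphabet and the example language ("words with an even number of a's") are my own illustrative choices, not taken from the definitions above.

```python
# A minimal sketch of alphabet, word, and language.
# The alphabet and the example language are illustrative assumptions.

SIGMA = {"a", "b"}  # alphabet: a set of letters

def is_word(w: str) -> bool:
    """A word is a sequence w_1 ... w_l whose letters all come from SIGMA."""
    return all(letter in SIGMA for letter in w)

def in_language(w: str) -> bool:
    """The language (problem): accept words with an even number of a's."""
    return is_word(w) and w.count("a") % 2 == 0

print(in_language("abba"))  # True  — two a's
print(in_language("ab"))    # False — one a
```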
Grammar: Basically a set of rules and letters that define the syntax of a language. It can create any word in the language. Two types of grammars are worth considering in this article (see the sketch after the list below):
- Regular (or type-3) grammar: It can create a word from left to right*[1] (one direction)
- Context-free (or type-2) grammar: It can create a word in both directions (two directions)
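The following sketch shows the difference with two invented grammars: a right-regular one that only extends the word on the right (generating words like abab…), and a context-free one that grows the word in both directions at once (generating aⁿbⁿ). The rule sets and the `derive` helper are my own illustration.

```python
import random

# A right-regular (type-3) grammar: the nonterminal stays at the right
# end, so every step extends the word in one direction only.
# S -> aA | ε,  A -> bS   — generates (ab)*
REGULAR_RULES = {
    "S": ["aA", ""],
    "A": ["bS"],
}

# A context-free (type-2) grammar: a rule may add letters on both sides
# of the nonterminal at once.
# S -> aSb | ε   — generates a^n b^n
CONTEXT_FREE_RULES = {
    "S": ["aSb", ""],
}

def derive(rules, start="S"):
    """Expand nonterminals (uppercase letters) until only letters remain."""
    word = start
    while any(c.isupper() for c in word):
        i = next(i for i, c in enumerate(word) if c.isupper())
        word = word[:i] + random.choice(rules[word[i]]) + word[i + 1:]
    return word

print(derive(REGULAR_RULES))       # e.g. "abab"
print(derive(CONTEXT_FREE_RULES))  # e.g. "aaabbb"
```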
Automaton: Similarly, a set of states and letters that define the syntax. It accepts the words of a language and rejects all other words.
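As a sketch of the automaton view, here is an invented two-state finite automaton for the same example language used above (words over {a, b} with an even number of a's); the state names and transitions are my own illustration.

```python
# A sketch of a finite automaton accepting words with an even number of a's.
TRANSITIONS = {
    ("even", "a"): "odd",
    ("even", "b"): "even",
    ("odd", "a"): "even",
    ("odd", "b"): "odd",
}
START, ACCEPTING = "even", {"even"}

def accepts(word: str) -> bool:
    state = START
    for letter in word:
        if (state, letter) not in TRANSITIONS:
            return False  # letter not in the alphabet: reject
        state = TRANSITIONS[(state, letter)]
    return state in ACCEPTING

print(accepts("abba"))  # True
print(accepts("ab"))    # False
```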
Word Problem: The algorithmic problem of deciding whether a word is accepted by a language. In general, word problems can be solved using top-down recursive formulas*[2]. If the root node satisfies the criteria, the word is included in the language.
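To illustrate the top-down recursive idea, here is a sketch of a word-problem check for the invented context-free language aⁿbⁿ (rule S → aSb | ε): the word is accepted exactly when the outermost (root) call succeeds.

```python
# A sketch of a top-down recursive word-problem check for a^n b^n.
def matches_S(word: str) -> bool:
    if word == "":                                # S -> ε
        return True
    if word.startswith("a") and word.endswith("b"):
        return matches_S(word[1:-1])              # S -> aSb
    return False

print(matches_S("aaabbb"))  # True  — root call succeeds
print(matches_S("aabbb"))   # False
```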
Notes:
*[1]: There are also left-regular grammars, which work in the reverse direction; each one generates the reverse language of a right-regular language, so every regular grammar can be represented by a right-regular grammar.