Evolving automata

cosmos 25th June 2017 at 9:46pm

See MMathPhys oral presentation. Automata theory, Simplicity bias in finite-state transducers

See also Exact learning


http://link.springer.com/chapter/10.1007/978-3-642-23780-5_20#page-1

http://www.sciencedirect.com/science/article/pii/S0031320305000294

http://www.mitpressjournals.org/doi/abs/10.1162/neco.1992.4.3.393#.V5JBI-02fCI

http://www.mitpressjournals.org/doi/abs/10.1162/neco.1989.1.3.372#.V5JBB-02fCI

https://scholar.google.com/scholar?start=20&q=learning+finite+state+transducer+complexity&hl=en&as_sdt=0,5


Simplicity bias in finite-state transducers

Random deterministic automata

Evolving Finite State Machines with Embedded Genetic Programming for Automatic Target Detection

Learning Finite-State Transducers: Evolution Versus Heuristic State Merging

Boolean networks and their evolution (What Darwin didn't know: natural variation is structured).

Introducing Domain and Typing Bias in Automata Inference

Random Deterministic Automata

An Automaton Approach for Waiting Times in DNA Evolution

Also, genetic regulatory networks: Highly designable phenotypes and mutational buffers emerge from a systematic mapping between network topology and dynamic output; Evolvability and robustness in a complex signalling circuit

Entropy of a Finite State Transducer

H = - \sum_{\beta, \alpha, y} P_{\beta} P_{\alpha}\, p_{\alpha, \beta}(y) \log p_{\alpha, \beta}(y) = - \sum_{\beta, \alpha, y} P_{\beta} P_{\alpha} \Big(\sum_{x \in F^{-1}(y)} p_{\alpha}(x)\Big) \log \Big(\sum_{x \in F^{-1}(y)} p_{\alpha}(x)\Big)
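A minimal Python sketch of how this entropy could be evaluated numerically for a toy transducer (all names and the example map are hypothetical, not from any of the cited papers): p_α is read as the input distribution for class α, P_α and P_β as the class weights, and F as the input-to-output map whose preimages F^{-1}(y) appear in the formula.

```python
import numpy as np
from collections import defaultdict

def transducer_output_entropy(P_alpha, P_beta, p_alpha, F):
    """H = -sum_{beta,alpha,y} P_beta P_alpha p_{alpha,beta}(y) log p_{alpha,beta}(y),
    where p_{alpha,beta}(y) = sum_{x in F^{-1}(y)} p_alpha(x) is the pushforward
    of the input distribution p_alpha through F.  Returns H in nats."""
    H = 0.0
    for a, Pa in P_alpha.items():
        # push the input distribution for class a forward through F
        p_out = defaultdict(float)
        for x, px in p_alpha[a].items():
            p_out[F[x]] += px
        for b, Pb in P_beta.items():
            for y, py in p_out.items():
                if py > 0:
                    H -= Pb * Pa * py * np.log(py)
    return H

# Toy example: four binary inputs of length 2, uniform input distribution,
# F = number of ones (a many-to-one, GP-map-like function)
F = {'00': 0, '01': 1, '10': 1, '11': 2}
p_alpha = {'a0': {x: 0.25 for x in F}}
print(transducer_output_entropy({'a0': 1.0}, {'b0': 1.0}, p_alpha, F))
```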

Ergodicity of Random Walks on Random DFA

On the Effect of Topology on Learning and Generalization in Random Automata Networks

Quantifying the complexity of random Boolean networks

The state complexity of random DFAs

http://tuvalu.santafe.edu/~walter/AlChemy/alchemy.html Artificial chemistry

Comparing nondeterministic and quasideterministic finite-state transducers built from morphological dictionaries


Topological Entropy of Formal Languages proposes a measure of complexity of Formal languages

Topological dynamics and recognition of languages defines topological automata, an extension of automata to infinite state spaces, which makes them Turing complete


Is a random transducer an appropriate random model for GP maps in Nature?

For instance, when Gene regulatory networks are modelled as random Boolean networks, the state transition network is probably not just a random transducer... though maybe it depends on the regime: in the critical regime we apparently observe the largest GP map bias.
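A toy way to probe this question numerically (purely illustrative, not taken from any of the papers above): sample a random deterministic finite-state transducer, run every input string of length L through it, and look at how unevenly the resulting output strings ("phenotypes") are populated by the 2^L inputs ("genotypes").

```python
import itertools, random
from collections import Counter

def random_transducer(n_states, alphabet=(0, 1)):
    """Random transition and output functions over a binary alphabet."""
    delta = {(s, a): random.randrange(n_states) for s in range(n_states) for a in alphabet}
    out = {(s, a): random.choice(alphabet) for s in range(n_states) for a in alphabet}
    return delta, out

def run(delta, out, word, start=0):
    """Feed a word through the transducer, returning the output string."""
    state, output = start, []
    for a in word:
        output.append(out[(state, a)])
        state = delta[(state, a)]
    return tuple(output)

random.seed(0)
delta, out = random_transducer(n_states=4)
L = 10
counts = Counter(run(delta, out, w) for w in itertools.product((0, 1), repeat=L))
# A strongly biased GP map concentrates most genotypes on a few phenotypes
print(counts.most_common(5))
print("number of distinct phenotypes:", len(counts))
```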