On last week's Numb3rs episode One Hour, Charlie, the mathematician, and Amita, his colleague/girlfriend, had the following conversation:

Charlie (to Amita): I haven't seen an inductive Turing machine used like that before.
Amita: I'm trying to find the finite state machine for these. (points to screen)

So what is an inductive Turing machine? I put the question to Bill Gasarch.

If you IGNORE the TV show, it could mean the following, taking a cue from the field of Inductive Inference:

A set of computable functions S is in EX if there is a TURING MACHINE M such that, for all f in S, if you feed f(0), f(1), f(2), … into M, it outputs e_1, e_2, e_3, … and in the limit the sequence converges to e, a Turing machine index for a machine that computes f. Such a machine M is called an INDUCTIVE TURING MACHINE.

Does this definition make sense in the context of the show? The set S of regular languages (those computed by finite state machines) is in EX, where e_i is the lexicographically least FSM whose output is consistent with f(0), f(1), …, f(i-1). Of course this is an incredibly inefficient way to learn regular languages, but then again Amita wasn't having much success. Perhaps she should have used one of the efficient finite automata learning algorithms, like Rivest and Schapire's.
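To see what that style of learning looks like in code, here is a minimal Python sketch of identification by enumeration over small DFAs. It is only an illustration of the idea, not anything from the show or the post: the encoding of machines, the assumption that f(i) is the membership bit of the i-th binary string in length-lexicographic order, the enumeration order (by state count, then by table contents), and names like `least_consistent_dfa` are all choices made for the example.

```python
# Toy sketch of learning-in-the-limit by enumeration: after each new labelled
# string, output the first enumerated DFA consistent with everything seen so far.

from itertools import product

def strings_in_lex_order(n):
    """First n binary strings in length-lexicographic order: '', '0', '1', '00', ..."""
    out, length = [], 0
    while len(out) < n:
        for bits in product("01", repeat=length):
            out.append("".join(bits))
            if len(out) == n:
                break
        length += 1
    return out

def dfa_accepts(delta, accepting, s):
    """Run a DFA (transition table delta[state][symbol], accepting set) on string s."""
    state = 0
    for ch in s:
        state = delta[state][int(ch)]
    return state in accepting

def enumerate_dfas(max_states):
    """Yield DFAs by state count, then lexicographically by their tables."""
    for k in range(1, max_states + 1):
        for table in product(range(k), repeat=2 * k):          # transition tables
            delta = [table[2 * i: 2 * i + 2] for i in range(k)]
            for acc_bits in product([0, 1], repeat=k):          # accepting-state sets
                yield delta, {i for i in range(k) if acc_bits[i]}

def least_consistent_dfa(samples, max_states=3):
    """Return the first enumerated DFA agreeing with samples f(0..n-1), or None."""
    strs = strings_in_lex_order(len(samples))
    for delta, accepting in enumerate_dfas(max_states):
        if all(dfa_accepts(delta, accepting, s) == bit for s, bit in zip(strs, samples)):
            return delta, accepting
    return None  # target is outside the toy hypothesis class

if __name__ == "__main__":
    # Target language: binary strings with an even number of 1s (a 2-state DFA).
    target = lambda s: s.count("1") % 2 == 0
    samples = []
    for s in strings_in_lex_order(15):
        samples.append(target(s))
        print(f"after {len(samples):2d} samples: guess = {least_consistent_dfa(samples)}")
```

After each new labelled string the guess may change, but it changes only finitely often and then stays fixed on a machine for the target language, which is exactly the convergence-in-the-limit criterion of EX; it is also clear why this is hopelessly inefficient compared with query-based learners such as Rivest and Schapire's.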

Gasarch has a different take.

The show DID NOT mean this. So what did they mean, and what could they have said? They should have either used a generic term for learning or just used a fictional term; then they couldn't really be wrong. Here are some possibilities:

- Charlie (to Amita): I haven't seen that learning algorithm used that way before.
- Charlie (to Amita): I haven't seen Carl Smith's Technique used that way before.
- Charlie (to Amita): I haven't seen cross-convergence used that way before.

- Charlie (to Amita): I haven't seen the Generalized Polynomial Hales-Jewett Theorem used that way before.
  Amita: I'm trying to prove the polynomial van der Waerden theorem over the reals.

The conversation they DID have is connected to a later scene in the show where they are trying to learn the maze. I can make a vague connection, though the show did not do so.

Having said all that, it was a good episode.

By the way, you can watch Numb3rs on-line for free.

--

Posted by Lance to Computational Complexity at 3/01/2007 12:33:00 PM