Thursday, February 16, 2023

The Long Story of How Neural Nets Got to Where They Are

This is a fascinating discussion between Stephen Wolfram and Terry Sejnowski. These guys, Sejnowski especially, are pulling names from all over the place. There are lots of isolated and semi-isolated figures in this story. But it’s (the beginnings of) a map of where all this came from. Should probably check it against Grace Lindsay's wonderful little book, Models of the Mind. Note that the early stuff about computational linguistics is messed up. Chomsky had nothing to do with it.

And so forth and so on. 

At about 3:00:31 Wolfram and Sejnowski make the point that, while the field of neural networks started with a small group of "true believers," as most intellectual enterprises do, in the case of neural nets (Wolfram) "there were all these separate little pockets, I think that's not so common." In many cases "the tree grows from one trunk, so to speak." To which Sejnowski responds, "Ah, OK, that's an interesting observation." Wolfram notes that, for example, in the case of "quantum mechanics there were not multiple trunks." He goes on to say that, in a sense, it all grew from the initial McCulloch-Pitts 1943 paper, "and yet there were all these separate branches, same seed but it wasn't a single trunk from that." Moreover, "the time scale from the initial seed to fruition is extremely long. Many generations." Sejnowski agrees "that it was very diverse. But one possible explanation is that you have a much bigger search space to explore."
