
Nilesh Trivedi 2024-09-03 04:01:21

TIL:

image.png

image.png

Jason Morris 2024-09-05 00:49:58

Can someone ELI5 this for me? Is the memory available at training time or at inference time? How is it connected to the network? And what does it mean that it is analogous to a Turing machine or von Neumann whatchamacallit?

Nilesh Trivedi 2024-09-05 04:51:59

Jason Morris These constructs define a "memory" to be used by the network at inference time.

I'm still going through the NTM (Neural Turing Machine) and DNC (Differentiable Neural Computer) papers. But basically, Siegelmann and Sontag showed in 1992 that Recurrent Neural Networks (RNNs) are Turing-complete.
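To make the "memory used at inference time" part concrete: NTMs and DNCs attach an external memory matrix to the network and access it through differentiable, soft read/write heads, so the whole thing stays trainable by gradient descent. Here is a minimal sketch (not the papers' actual code; the function names and simplified interface are my own) of content-based addressing, the core read mechanism:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta):
    """Soft read from `memory` (N rows x W columns) by similarity to `key` (W,).

    `beta` sharpens the attention: higher beta pushes the weights toward
    a one-hot selection of the best-matching row, while staying differentiable.
    Returns the attention weights and the blended read vector.
    """
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sims = memory @ key / norms           # cosine similarity to each memory row
    weights = softmax(beta * sims)        # soft (differentiable) addressing
    return weights, weights @ memory      # read vector = weighted mix of rows

memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
weights, read = content_read(memory, key=np.array([0.9, 0.1, 0.0]), beta=10.0)
# With a sharp beta, the read vector is close to the best-matching row.
```

Because every step is a smooth function of the key and memory contents, the controller network learns *what* to store and recall end to end; the memory matrix itself is populated at inference time, not baked in during training.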

image.png

Nilesh Trivedi 2024-09-05 05:07:51

image.png

Tom Larkworthy 2024-09-06 14:26:18

I thought I had shared roboco-op here, but it seems I had not. The idea is to mix code/runtime/chat context into a single materialised, human-editable representation to enable "mimetic engineering": copying and pasting "skills" between notebooks, and thereby engineering the AI's context to suit the task at hand, all while having a machine-checked, code-based conversation without context switches (I demoed this at Berlin's GenAI meetup).

youtube.com/watch?v=8cNRZUZSSS8

observablehq.com/@tomlarkworthy/robocoop