Nathan's corner

It's a technical blog! Or is it...
Generating fractals with complex "neural networks".

Fractals like the Mandelbrot set are traditionally defined by a pretty simple recursive function. What if, instead of a simple user-defined function, we initialized a random neural network and used that as the function?
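To give a feel for the idea, here is a minimal sketch (not the post's actual code, and the names `net` and `escape_time` are made up): the usual escape-time loop is kept, but the map z → z² + c is swapped for a small, randomly initialized complex-valued function.

```python
import numpy as np

rng = np.random.default_rng(0)
# A tiny random complex-valued "network": one hidden layer of 4 units.
W1 = rng.standard_normal((1, 4)) + 1j * rng.standard_normal((1, 4))
W2 = rng.standard_normal((4, 1)) + 1j * rng.standard_normal((4, 1))

def net(z):
    """Random complex map playing the role of z**2 in the Mandelbrot recursion."""
    h = np.tanh(z[..., None] @ W1)   # complex tanh as the nonlinearity
    return (h @ W2)[..., 0]

def escape_time(c, max_iter=50, radius=4.0):
    """Count how many iterations each point survives before escaping."""
    z = np.zeros_like(c)
    counts = np.zeros(c.shape, dtype=int)
    for _ in range(max_iter):
        mask = np.abs(z) <= radius
        z = np.where(mask, net(z) + c, z)
        counts += mask
    return counts

xs = np.linspace(-2, 2, 400)
ys = np.linspace(-2, 2, 400)
c = xs[None, :] + 1j * ys[:, None]
image = escape_time(c)   # colour this array to see the resulting fractal
```

Each random seed gives a different weight matrix, and therefore a different fractal.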

Equipping neural networks with differentiable memory: Neural Turing Machines.

Most neural networks are unable to remember, and those that can cannot remember for long. Networks like LSTMs can use their hidden state to store data, but they forget over time. How can we create a neural network that in theory never forgets, yet is still trainable through gradient descent?
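The core trick behind Neural Turing Machines is to read and write an external memory matrix with soft attention weights instead of hard indices, so every memory operation stays differentiable. A rough sketch of content-based addressing (all names here are illustrative, not from the post):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_address(memory, key, beta=5.0):
    """Soft attention over memory rows by cosine similarity to a query key."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    return softmax(beta * sims)

def read(memory, weights):
    return weights @ memory                             # convex combination of rows

def write(memory, weights, erase, add):
    memory = memory * (1 - np.outer(weights, erase))    # soft erase
    return memory + np.outer(weights, add)              # soft add

memory = np.zeros((8, 4))                               # 8 slots, 4 dims each
w = content_address(memory + 1e-3, key=np.ones(4))
memory = write(memory, w, erase=np.zeros(4), add=np.array([1., 2., 3., 4.]))
print(read(memory, content_address(memory, key=np.array([1., 2., 3., 4.]))))
```

Because reads and writes are weighted sums rather than discrete lookups, gradients flow through the addressing itself, which is what makes the memory trainable end to end.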