In a post from last year, On the Surprising Parameter Efficiency of Vision Models, I discussed a question that had been puzzling me at the time: that image models appear to reach or exceed human parity with significantly fewer parameters than the brain seemingly uses. This...
[Read More]
Fertility, Inheritance, and the Concentration of Wealth
Epistemic status: Shower thoughts
[Read More]
My Preliminary Thoughts on AI Safety Regulation
Epistemic status: Still trying to work out my thoughts on this. Things change pretty regularly. My current thinking on technical AI safety questions and threat models by now likely diverges reasonably far from the LW median.
[Read More]
Linear Attention as Iterated Hopfield Networks
In this short note, we present an equivalence and interpretation of the recurrent form of linear attention as implementing a continually updated Hopfield network. Specifically, as the recurrent transformer generates each token, it simply adds a continuous 'memory' via Hebbian plasticity to a classical continuous Hopfield network...
[Read More]
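The correspondence the excerpt describes can be sketched in a few lines. This is my own illustration, not code from the post: the recurrent state matrix `S` of linear attention accumulates Hebbian outer products `v kᵀ`, which is exactly the update that stores a pattern in a classical Hopfield network, and each query then reads out from that associative memory. The `elu(x)+1` feature map and the running normalizer follow the standard linear-attention formulation; the dimensions are arbitrary.

```python
import numpy as np

def phi(x):
    # elu(x) + 1: a standard positive feature map for linear attention,
    # which keeps the normalizer strictly positive
    return np.where(x > 0, x + 1.0, np.exp(x))

rng = np.random.default_rng(0)
d = 4  # head dimension (arbitrary for this sketch)
T = 6  # number of tokens

S = np.zeros((d, d))  # associative memory / fast-weight matrix
z = np.zeros(d)       # running normalizer

outputs = []
for t in range(T):
    k = phi(rng.standard_normal(d))
    v = rng.standard_normal(d)
    q = phi(rng.standard_normal(d))

    # Hebbian storage step: write the pattern (k -> v) into the memory
    S += np.outer(v, k)
    z += k

    # Retrieval step: query the associative memory, normalized as in
    # linear attention
    outputs.append(S @ q / (z @ q))

print(np.array(outputs).shape)  # (6, 4)
```

Each iteration is one token of recurrent generation: the storage step is the Hebbian plasticity update, and the retrieval step is a single read from the resulting continuous Hopfield memory.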
Learning Linear Representations through Implicit Subspace Selection
Epistemic status: Highly speculative, basically shower thoughts. These are some thoughts I had a few months back but only got the motivation to write up today.
[Read More]