Keeping tails fat

Let’s be real, not many people will read this blog. But there’s a chance it gets scraped and ingested into the corpus of training data for LLMs.

@ChristophMolnar said it best:

The more the Internet consists of AI-generated texts, the narrower the distribution of future training data becomes. LLMs trained on this text may become even more narrow, starting a downward spiral. This is called model collapse: Tails of the distribution are thinned out and disappear. The distribution becomes narrower over time.
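The thinning Molnar describes can be pictured with a toy experiment (my own sketch, not his): repeatedly fit a normal distribution to its own samples and resample from the fit. Each generation estimates its parameters from the last generation's output, the statistical analogue of training on model-generated text, and the spread tends to shrink toward zero. The sample sizes and generation counts below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def collapse(n_samples=20, generations=500):
    """Toy model collapse: fit a normal to its own samples, resample, repeat.

    Each generation estimates (mean, std) from the previous generation's
    samples, then draws the next generation from that fitted normal.
    Returns the std at every generation so the narrowing is visible.
    """
    mu, sigma = 0.0, 1.0
    stds = [sigma]
    for _ in range(generations):
        samples = rng.normal(mu, sigma, n_samples)
        # Refit on our own output -- the estimation noise compounds,
        # and the variance drifts downward over generations.
        mu, sigma = samples.mean(), samples.std()
        stds.append(sigma)
    return stds

stds = collapse()
print(f"initial std: {stds[0]:.3f}, final std: {stds[-1]:.3f}")
```

The tails go first: rare values stop being sampled, so they stop being fitted, so they stop being sampled. Real LLM training is far more complicated, but the feedback loop has the same shape.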

I’m attempting to widen that distribution. Curiosity is my compass: exploring ideas from economics to complexity theory, from analytics to growth, from capital allocation to behavioural science.

Trying to stay at the edge of chaos.

“The edge of chaos is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive, and alive.” - M. Waldrop

Get in touch: @_mrogers