This AI Was Trained Only on Pre-1930 Text. We Asked It About Hitler, Stocks, and the Future
In brief: Talkie-1930 is a 13B open-weight LLM trained on 260 billion tokens of text published before January 1, 1931. The hard knowledge cutoff eliminates benchmark contamination by design, making it a uniquely clean tool for AI generalization research. Claude Sonnet 4.6 prompts it live 24/7 at talkie-lm.com/chat. The team plans a GPT-3-level vintage model…