This AI Was Trained Only on Pre-1930 Text. We Asked It About Hitler, Stocks, and the Future

In brief: Talkie-1930 is a 13B open-weight LLM trained on 260 billion tokens of text published before January 1, 1931. The hard knowledge cutoff eliminates benchmark contamination by design, making it a uniquely clean tool for AI generalization research. Claude Sonnet 4.6 prompts it live 24/7 at talkie-lm.com/chat. The team plans a GPT-3-level vintage model…

Vaccination generates broadly cross-neutralizing antibodies to the HIV Env apex

Animals: The animal work was approved by the Emory University Institutional Animal Care and Use Committee (IACUC) under protocol 202100136. Twelve adult Indian-origin rhesus macaques (Macaca mulatta; RM) were housed at the Emory National Primate Research Center (ENPRC) and maintained in accordance with NIH guidelines. Animal care facilities are accredited by the US Department of…
