Version History

1.0 (major)

Initial release of 13-billion-parameter vintage language model trained exclusively on public domain texts published before 1931.

Coverage

Research

Researchers release 13B-parameter language model trained exclusively on pre-1931 data

A team of researchers has released Talkie, a 13-billion-parameter language model trained exclusively on digitized English-language texts published before 1931. The training data consists of public-domain books, newspapers, scientific journals, patents, and case law, with the researchers citing potential applications in studying AI reasoning capabilities and cultural change.
