pmc4
pre-trained markov chain (n=4)
A horrifically bad "language model" that's really just a Markov chain "trained" on random internet articles. It runs completely in the browser.
You have a few options for models:
- generic: It's pretty broad, but its 68-article dataset is mostly about software, geopolitics, (some) memes, and random things like NEMA connectors.
- ai: 53 Wikipedia articles, ALL about AI in some form, make up most of the model. It also includes over 700 slogans from AI startups on Y Combinator.
- osdev+geopolitics: 59 articles from the OSDev wiki and 5 Wikipedia articles about geopolitics, combined into one model. It likes to write C a lot.
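The core trick behind all three models is the same. A minimal word-level sketch in Python follows, under a couple of assumptions: that "n=4" means 4-grams (a 3-token context predicting the next token), and that tokens are whitespace-split words. The real model runs in the browser and its actual tokenization may differ.

```python
import random
from collections import defaultdict

def train(tokens, n=4):
    """Build an n-gram table: each (n-1)-token context maps to the list
    of tokens that followed it in the training text (duplicates kept,
    so sampling uniformly from the list reproduces the frequencies)."""
    model = defaultdict(list)
    for i in range(len(tokens) - n + 1):
        context = tuple(tokens[i:i + n - 1])
        model[context].append(tokens[i + n - 1])
    return model

def generate(model, seed, length=20):
    """Walk the chain: repeatedly sample a successor for the current
    context, then slide the context window forward by one token."""
    out = list(seed)
    context = tuple(seed)
    for _ in range(length):
        choices = model.get(context)
        if not choices:
            break  # dead end: this context never appeared in training
        nxt = random.choice(choices)
        out.append(nxt)
        context = context[1:] + (nxt,)
    return " ".join(out)
```

With only 5 to 68 articles per model, most 3-token contexts have exactly one recorded successor, which is why the output tends to regurgitate long verbatim runs of the source text (and, for osdev+geopolitics, a lot of C).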