pmc4
pre-trained markov chain (n=4)
A horrifically bad "language model" that's really just a Markov chain "trained" on random internet articles. It runs completely in the browser.
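For a sense of what's going on under the hood, here's a minimal sketch of how a chain like this might work, assuming n=4 means the next word is picked based on the previous 4 words. The names (`train`, `generate`, etc.) are illustrative, not pmc4's actual code:

```typescript
const N = 4; // context length: how many previous words the chain looks at

type Model = Map<string, string[]>; // 4-word context -> observed next words

// "Training": count which word follows each 4-word context in the corpus.
function train(text: string): Model {
  const words = text.split(/\s+/).filter(Boolean);
  const model: Model = new Map();
  for (let i = 0; i + N < words.length; i++) {
    const context = words.slice(i, i + N).join(" ");
    const followers = model.get(context) ?? [];
    followers.push(words[i + N]); // duplicates act as frequency weights
    model.set(context, followers);
  }
  return model;
}

// Generation: repeatedly sample a follower of the current 4-word context.
// Sampling uniformly from the follower list reproduces observed frequencies.
function generate(model: Model, seed: string[], maxWords = 50): string {
  const output = [...seed]; // seed should be 4 words that appear in the corpus
  for (let i = 0; i < maxWords; i++) {
    const context = output.slice(-N).join(" ");
    const followers = model.get(context);
    if (!followers || followers.length === 0) break; // unseen context: dead end
    output.push(followers[Math.floor(Math.random() * followers.length)]);
  }
  return output.join(" ");
}

// Usage sketch:
const model = train("the quick brown fox jumps over the lazy dog ...");
console.log(generate(model, ["the", "quick", "brown", "fox"]));
```

That's the whole trick: no neural network, just a lookup table from contexts to next words, which is why it fits comfortably in a browser tab.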
You have a few options for models:
- base: Trained on 68 Wikipedia articles. It's pretty broad, but it's mostly software, geopolitics, and (some) memes.
- ai: 53 Wikipedia articles, ALL about AI in some form, make up most of the model. It also includes over 700 slogans from AI startups on Y Combinator.
- osdev+geopolitics: 59 articles from the OSDev wiki and 5 Wikipedia articles about geopolitics, combined into one model. It likes to write C a lot.