The largest version of GPT-2, for example, has a fixed context length of 1024 tokens, so we
cannot calculate \(p_\theta(x_t|x_{<t})\) directly when \(t\) is greater than 1024.
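To make the limitation concrete, here is a minimal sketch (illustrative only, not the GPT-2 API; the helper name and window size are assumptions) of how a fixed context window truncates the history available when predicting token \(t\):

```python
def available_context(t, max_length=1024):
    """Return the indices [start, t) the model can actually condition on
    when predicting token t, given a fixed context window of max_length.
    The window holds the predicted position plus at most max_length - 1
    preceding tokens, so the earliest visible index is t - (max_length - 1)."""
    start = max(0, t - (max_length - 1))
    return list(range(start, t))

# For an early token, the whole history fits in the window:
print(len(available_context(500)))    # 500

# For t > 1024 the history is truncated to the window size,
# so p(x_t | x_{<t}) over the *full* history is not directly computable:
print(len(available_context(2000)))   # 1023
```

Because of this truncation, evaluating perplexity on a sequence longer than the model's context requires some strategy for breaking the sequence up, such as a sliding window.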