---
language:
- en
size_categories:
- 10M<n<100M
---

The embeddings were generated using the E5-Base-4k model with a context length of 1024 tokens, employing scaled dot-product attention.

**Original dataset:**

This dataset is a derivative work of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), an English web dataset created by the [TII (Technology Innovation Institute)](https://www.tii.ae/). Attribution is given to TII as the original author of the RefinedWeb dataset, per Section 4 of RefinedWeb's Open Data Commons Attribution License (ODC-By).

**Embedding details:**

- Embedding Model: E5-Base-4k
- Embedding Dimensions: 768
- Embedding Context Length: 1024 tokens
- Embedding Implementation Specifics: Scaled Dot-Product Attention
- Embedding Scope: Only the "content" field is embedded; all other fields are disregarded.

**Original Dataset Link:**
https://huggingface.co/datasets/tiiuae/falcon-refinedweb
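The two mechanisms named above, scaled dot-product attention and a 768-dimensional embedding per document, can be sketched as follows. This is a minimal NumPy illustration, not the actual E5-Base-4k inference code: the dummy activations, sequence length of 128 (the real context length is 1024 tokens), and the mean-pooling-plus-normalization step are assumptions standing in for the model's internals.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # softmax(Q K^T / sqrt(d)) V -- the attention variant named in this card.
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def embed(token_states, attention_mask):
    # Mean-pool token states into one vector per sequence, then L2-normalize,
    # as is common for E5-family embedding models (an assumption here).
    mask = attention_mask[:, :, None]
    pooled = (token_states * mask).sum(axis=1) / mask.sum(axis=1)
    return pooled / np.linalg.norm(pooled, axis=-1, keepdims=True)

# Dummy data standing in for real model activations:
# a batch of 2 sequences, 128 tokens each, hidden size 768.
rng = np.random.default_rng(0)
h = rng.standard_normal((2, 128, 768))
mask = np.ones((2, 128))

attn_out = scaled_dot_product_attention(h, h, h)
vectors = embed(attn_out, mask)
print(vectors.shape)  # one 768-dimensional unit vector per document
```

The resulting unit vectors match the card's stated embedding dimension of 768 and can be compared by dot product (equivalently, cosine similarity).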