Space: openfree/ginigen-sora (Running)

ginigen-sora / xora / models / transformers
19 contributors · History: 10 commits

Latest commit: 05cb3e4 · Sapir Weissbuch · Merge pull request #30 from LightricksResearch/fix-no-flash-attention · 6 months ago
  • __init__.py · 0 Bytes · refactor · 7 months ago
  • attention.py · 49.8 kB · model: fix flash attention enabling - do not check device type at this point (can be CPU) · 6 months ago
  • embeddings.py · 4.47 kB · Lint: added ruff. · 6 months ago
  • symmetric_patchifier.py · 2.92 kB · Remove the word "pixart" from code. · 6 months ago
  • transformer3d.py · 21.9 kB · Merge pull request #30 from LightricksResearch/fix-no-flash-attention · 6 months ago
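
Taken together, the directory is a small Python package of transformer components for the video model: attention layers, embeddings, a patchifier, and the 3D transformer itself. As a minimal sketch only, assuming the repository root is importable and that the class names mirror the file names (neither is confirmed by this listing), the package might be used like this:

# Hypothetical sketch. The module paths follow the directory layout above; the
# class names (SymmetricPatchifier, Transformer3DModel) and every argument shown
# are assumptions for illustration, not taken from this listing.
from xora.models.transformers.symmetric_patchifier import SymmetricPatchifier
from xora.models.transformers.transformer3d import Transformer3DModel

# Assumed constructor and loader signatures, shown only to illustrate how the
# modules in this directory would fit together.
patchifier = SymmetricPatchifier(patch_size=1)
transformer = Transformer3DModel.from_pretrained("path/to/checkpoint")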