Collections
Collections including paper arxiv:2410.02675
- MinerU: An Open-Source Solution for Precise Document Content Extraction (Paper • 2409.18839 • Published • 27)
- FAN: Fourier Analysis Networks (Paper • 2410.02675 • Published • 25)
- Differential Transformer (Paper • 2410.05258 • Published • 169)
- UniMuMo: Unified Text, Music and Motion Generation (Paper • 2410.04534 • Published • 19)

- VILA^2: VILA Augmented VILA (Paper • 2407.17453 • Published • 40)
- Octopus v4: Graph of language models (Paper • 2404.19296 • Published • 117)
- Octo-planner: On-device Language Model for Planner-Action Agents (Paper • 2406.18082 • Published • 48)
- Dolphin: Long Context as a New Modality for Energy-Efficient On-Device Language Models (Paper • 2408.15518 • Published • 43)

- GLiNER multi-task: Generalist Lightweight Model for Various Information Extraction Tasks (Paper • 2406.12925 • Published • 24)
- Scaling Laws for Linear Complexity Language Models (Paper • 2406.16690 • Published • 23)
- DiffusionPDE: Generative PDE-Solving Under Partial Observation (Paper • 2406.17763 • Published • 24)
- FoleyCrafter: Bring Silent Videos to Life with Lifelike and Synchronized Sounds (Paper • 2407.01494 • Published • 13)

- MotionLLM: Understanding Human Behaviors from Human Motions and Videos (Paper • 2405.20340 • Published • 20)
- Spectrally Pruned Gaussian Fields with Neural Compensation (Paper • 2405.00676 • Published • 10)
- Paint by Inpaint: Learning to Add Image Objects by Removing Them First (Paper • 2404.18212 • Published • 29)
- LoRA Land: 310 Fine-tuned LLMs that Rival GPT-4, A Technical Report (Paper • 2405.00732 • Published • 120)

- TCNCA: Temporal Convolution Network with Chunked Attention for Scalable Sequence Processing (Paper • 2312.05605 • Published • 3)
- VMamba: Visual State Space Model (Paper • 2401.10166 • Published • 39)
- Rethinking Patch Dependence for Masked Autoencoders (Paper • 2401.14391 • Published • 25)
- Deconstructing Denoising Diffusion Models for Self-Supervised Learning (Paper • 2401.14404 • Published • 18)