
Population Transformer

Weights for a pretrained Population Transformer (paper, website, code), built on a pretrained BrainBERT STFT model (paper, code).

Trained on the Brain TreeBank dataset (paper, dataset). A minimal loading sketch follows.
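The checkpoint can be downloaded from the Hub and inspected with standard tooling. The sketch below is a minimal example; the repo id and filename are placeholders (check this repository's file listing for the actual names), and instantiating the model itself requires the Population Transformer codebase linked above.

```python
# Minimal sketch: fetch the Population Transformer checkpoint from the Hub
# and inspect its state dict. Repo id and filename below are PLACEHOLDERS;
# see the "Files and versions" tab of this repository for the real names.
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="user/population-transformer",  # placeholder repo id
    filename="pt_checkpoint.pth",           # placeholder filename
)

# Load on CPU; the checkpoint is typically a dict of model weights
# (and possibly optimizer/config entries).
checkpoint = torch.load(ckpt_path, map_location="cpu")
print(type(checkpoint))
if isinstance(checkpoint, dict):
    for key in list(checkpoint)[:10]:
        print(key)

# The weights alone do not define the architecture: build the model with the
# Population Transformer code linked above, then load this state dict into it.
```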

Cite:

@misc{chau2024populationtransformer,
  title={Population Transformer: Learning Population-level Representations of Neural Activity},
  author={Geeling Chau and Christopher Wang and Sabera Talukder and Vighnesh Subramaniam and Saraswati Soedarmadji and Yisong Yue and Boris Katz and Andrei Barbu},
  year={2024},
  eprint={2406.03044},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2406.03044},
}