---
license: mit
multilinguality:
  - multilingual
source_datasets:
  - original
task_categories:
  - text-classification
  - token-classification
  - question-answering
  - summarization
  - text-generation
task_ids:
  - sentiment-analysis
  - topic-classification
  - named-entity-recognition
  - language-modeling
  - text-scoring
  - multi-class-classification
  - multi-label-classification
  - extractive-qa
  - news-articles-summarization
---

# Bittensor Subnet 13 Reddit Dataset

Data-universe: The finest collection of social media data the web has to offer

## Dataset Description

- Repository: tensorshield/reddit_dataset_30
- Subnet: Bittensor Subnet 13
- Miner Hotkey: 5E7icYNuWZGLeRQ22jT26VqAXownqFVGw4ZoFmndXWaMkD13

### Dataset Summary

This dataset is part of the Bittensor Subnet 13 decentralized network, containing preprocessed Reddit data. The data is continuously updated by network miners, providing a real-time stream of Reddit content for various analytical and machine learning tasks. For more information about the dataset, please visit the official repository.
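
The data can be explored with the standard Hugging Face `datasets` library. The snippet below is a minimal sketch; the `train` split name and the use of streaming mode are assumptions that may need adjusting for your setup.

```python
# Minimal sketch: stream records from the Hub without downloading everything up front.
from datasets import load_dataset

dataset = load_dataset(
    "tensorshield/reddit_dataset_30",
    split="train",      # assumed split name
    streaming=True,     # iterate lazily over the data
)

# Inspect the first record and its fields.
for record in dataset:
    print(record)
    break
```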

### Supported Tasks

The versatility of this dataset allows researchers and data scientists to explore various aspects of social media dynamics and develop innovative applications. Users are encouraged to leverage this data creatively for their specific research or business needs. For example:

- Sentiment Analysis
- Topic Modeling
- Community Analysis
- Content Categorization
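
As an illustration of the first task above, the sketch below runs an off-the-shelf sentiment classifier over the `text` field. The pipeline's default model is an arbitrary choice made for this example and is not part of the dataset.

```python
# Sketch: score the sentiment of the first few records' text.
from datasets import load_dataset
from transformers import pipeline

reddit = load_dataset("tensorshield/reddit_dataset_30", split="train", streaming=True)
sentiment = pipeline("sentiment-analysis")  # downloads a default English model

for i, record in enumerate(reddit):
    result = sentiment(record["text"][:512])[0]  # truncate very long comments
    print(record["communityName"], result["label"], round(result["score"], 3))
    if i >= 9:  # only score ten records in this sketch
        break
```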

### Languages

Primary language: the data is mostly English, but it can be multilingual because of the decentralized way it is collected.

## Dataset Structure

### Data Instances

Each instance represents a single Reddit post or comment with the following fields:

### Data Fields

- text (string): The main content of the Reddit post or comment.
- label (string): Sentiment or topic category of the content.
- dataType (string): Indicates whether the entry is a post or a comment.
- communityName (string): The name of the subreddit where the content was posted.
- datetime (string): The timestamp of when the content was posted or commented.
- username_encoded (string): An encoded version of the username to maintain user privacy.
- url_encoded (string): An encoded version of any URLs included in the content.

### Data Splits

This dataset is continuously updated and does not have fixed splits. Users should create their own splits based on their requirements and the data's timestamp.
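
For example, a simple time-based split can be built by filtering on the `datetime` field, as in the sketch below. It assumes the values are consistent ISO 8601 strings (which compare correctly as plain strings) and uses an arbitrary cutoff.

```python
# Sketch: build train/test splits from an arbitrary timestamp cutoff.
from datasets import load_dataset

ds = load_dataset("tensorshield/reddit_dataset_30", split="train")

CUTOFF = "2025-03-30T10:00:00Z"  # arbitrary example cutoff

# Entries strictly before the cutoff go to "train", the rest to "test".
train_ds = ds.filter(lambda row: row["datetime"] < CUTOFF)
test_ds = ds.filter(lambda row: row["datetime"] >= CUTOFF)

print(len(train_ds), len(test_ds))
```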

## Dataset Creation

### Source Data

Data is collected from public posts and comments on Reddit, adhering to the platform's terms of service and API usage guidelines.

### Personal and Sensitive Information

All usernames and URLs are encoded to protect user privacy. The dataset does not intentionally include personal or sensitive information.

## Considerations for Using the Data

### Social Impact and Biases

Users should be aware of potential biases inherent in Reddit data, including demographic and content biases. This dataset reflects the content and opinions expressed on Reddit and should not be considered a representative sample of the general population.

### Limitations

- Data quality may vary due to the nature of social media sources.
- The dataset may contain noise, spam, or irrelevant content typical of social media platforms.
- Temporal biases may exist due to real-time collection methods.
- The dataset is limited to public subreddits and does not include private or restricted communities.

## Additional Information

### Licensing Information

The dataset is released under the MIT license. The use of this dataset is also subject to Reddit's Terms of Use.

### Citation Information

If you use this dataset in your research, please cite it as follows:

```bibtex
@misc{tensorshield2025datauniversereddit_dataset_30,
  title={The Data Universe Datasets: The finest collection of social media data the web has to offer},
  author={tensorshield},
  year={2025},
  url={https://huggingface.co/datasets/tensorshield/reddit_dataset_30},
}
```

### Contributions

To report issues or contribute to the dataset, please contact the miner or use the Bittensor Subnet 13 governance mechanisms.

## Dataset Statistics

[This section is automatically updated]

- Total Instances: 267424
- Date Range: 2025-03-23T00:00:00Z to 2025-03-23T00:00:00Z
- Last Updated: 2025-03-30T12:09:20Z

### Data Distribution

- Posts: 6.71%
- Comments: 93.29%

### Top 10 Subreddits

For full statistics, please refer to the stats.json file in the repository.

| Rank | Topic | Total Count | Percentage |
|------|-------|-------------|------------|
| 1 | r/AskReddit | 6795 | 2.54% |
| 2 | r/formula1 | 4117 | 1.54% |
| 3 | r/AFL | 2746 | 1.03% |
| 4 | r/MAFS_AU | 2594 | 0.97% |
| 5 | r/AITAH | 2143 | 0.80% |
| 6 | r/nrl | 1908 | 0.71% |
| 7 | r/AskPH | 1690 | 0.63% |
| 8 | r/mildlyinfuriating | 1642 | 0.61% |
| 9 | r/Cricket | 1546 | 0.58% |
| 10 | r/cats | 1537 | 0.57% |
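
The stats.json file mentioned above can be fetched directly with `huggingface_hub`, as in the sketch below; the structure of the file is not documented here, so the snippet simply loads and prints whatever it contains.

```python
# Sketch: download and read stats.json from the dataset repository.
import json
from huggingface_hub import hf_hub_download

stats_path = hf_hub_download(
    repo_id="tensorshield/reddit_dataset_30",
    filename="stats.json",
    repo_type="dataset",
)

with open(stats_path) as f:
    stats = json.load(f)

print(stats)
```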

### Update History

| Date | New Instances | Total Instances |
|------|---------------|-----------------|
| 2025-03-30T06:45:28Z | 4137 | 4137 |
| 2025-03-30T06:50:19Z | 4233 | 8370 |
| 2025-03-30T06:55:23Z | 4182 | 12552 |
| 2025-03-30T07:00:22Z | 4018 | 16570 |
| 2025-03-30T07:05:23Z | 4096 | 20666 |
| 2025-03-30T07:10:23Z | 3792 | 24458 |
| 2025-03-30T07:15:20Z | 3694 | 28152 |
| 2025-03-30T07:20:23Z | 3745 | 31897 |
| 2025-03-30T07:25:20Z | 3746 | 35643 |
| 2025-03-30T07:30:26Z | 3879 | 39522 |
| 2025-03-30T07:35:23Z | 3658 | 43180 |
| 2025-03-30T07:40:25Z | 3816 | 46996 |
| 2025-03-30T07:45:23Z | 3871 | 50867 |
| 2025-03-30T07:50:26Z | 3752 | 54619 |
| 2025-03-30T07:55:23Z | 3857 | 58476 |
| 2025-03-30T08:00:20Z | 3825 | 62301 |
| 2025-03-30T08:05:24Z | 3829 | 66130 |
| 2025-03-30T08:10:24Z | 3797 | 69927 |
| 2025-03-30T08:15:19Z | 3660 | 73587 |
| 2025-03-30T08:20:23Z | 3798 | 77385 |
| 2025-03-30T08:25:24Z | 7484 | 84869 |
| 2025-03-30T08:30:27Z | 3742 | 88611 |
| 2025-03-30T08:35:19Z | 3698 | 92309 |
| 2025-03-30T08:40:45Z | 4245 | 96554 |
| 2025-03-30T08:45:28Z | 3947 | 100501 |
| 2025-03-30T08:50:24Z | 4239 | 104740 |
| 2025-03-30T08:55:23Z | 4299 | 109039 |
| 2025-03-30T09:00:47Z | 4296 | 113335 |
| 2025-03-30T09:05:23Z | 3952 | 117287 |
| 2025-03-30T09:10:35Z | 4179 | 121466 |
| 2025-03-30T09:15:28Z | 3598 | 125064 |
| 2025-03-30T09:20:50Z | 4268 | 129332 |
| 2025-03-30T09:25:29Z | 3615 | 132947 |
| 2025-03-30T09:30:31Z | 3843 | 136790 |
| 2025-03-30T09:35:25Z | 3880 | 140670 |
| 2025-03-30T09:40:24Z | 3807 | 144477 |
| 2025-03-30T09:45:23Z | 3990 | 148467 |
| 2025-03-30T09:50:23Z | 3811 | 152278 |
| 2025-03-30T09:55:24Z | 3841 | 156119 |
| 2025-03-30T10:00:24Z | 3833 | 159952 |
| 2025-03-30T10:05:24Z | 3897 | 163849 |
| 2025-03-30T10:10:24Z | 3692 | 167541 |
| 2025-03-30T10:15:23Z | 3688 | 171229 |
| 2025-03-30T10:20:26Z | 3523 | 174752 |
| 2025-03-30T10:25:21Z | 3848 | 178600 |
| 2025-03-30T10:30:19Z | 3656 | 182256 |
| 2025-03-30T10:35:32Z | 3857 | 186113 |
| 2025-03-30T10:40:22Z | 3693 | 189806 |
| 2025-03-30T10:45:18Z | 3888 | 193694 |
| 2025-03-30T10:50:24Z | 3962 | 197656 |
| 2025-03-30T10:55:23Z | 3912 | 201568 |
| 2025-03-30T11:00:20Z | 3844 | 205412 |
| 2025-03-30T11:05:19Z | 3979 | 209391 |
| 2025-03-30T11:10:23Z | 4138 | 213529 |
| 2025-03-30T11:15:19Z | 4211 | 217740 |
| 2025-03-30T11:20:20Z | 4175 | 221915 |
| 2025-03-30T11:25:19Z | 4193 | 226108 |
| 2025-03-30T11:30:20Z | 4461 | 230569 |
| 2025-03-30T11:35:22Z | 4482 | 235051 |
| 2025-03-30T12:00:58Z | 23597 | 258648 |
| 2025-03-30T12:01:25Z | 1080 | 259728 |
| 2025-03-30T12:02:20Z | 892 | 260620 |
| 2025-03-30T12:03:21Z | 982 | 261602 |
| 2025-03-30T12:04:17Z | 922 | 262524 |
| 2025-03-30T12:05:18Z | 960 | 263484 |
| 2025-03-30T12:06:16Z | 918 | 264402 |
| 2025-03-30T12:07:20Z | 1059 | 265461 |
| 2025-03-30T12:08:21Z | 989 | 266450 |
| 2025-03-30T12:09:20Z | 974 | 267424 |