---
license: mit
multilinguality:
  - multilingual
source_datasets:
  - original
task_categories:
  - text-classification
  - token-classification
  - question-answering
  - summarization
  - text-generation
task_ids:
  - sentiment-analysis
  - topic-classification
  - named-entity-recognition
  - language-modeling
  - text-scoring
  - multi-class-classification
  - multi-label-classification
  - extractive-qa
  - news-articles-summarization
---

Bittensor Subnet 13 Reddit Dataset

Data-universe: The finest collection of social media data the web has to offer

Dataset Description

  • Repository: tensorshield/reddit_dataset_157
  • Subnet: Bittensor Subnet 13
  • Miner Hotkey: 5Cw1eMv2sdpn9zfbvH2Mf8V5xaoRMj6NjVQLkYq61verzNbq

Dataset Summary

This dataset is part of the Bittensor Subnet 13 decentralized network, containing preprocessed Reddit data. The data is continuously updated by network miners, providing a real-time stream of Reddit content for various analytical and machine learning tasks. For more information about the dataset, please visit the official repository.
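As a quick-start sketch, the repository can typically be loaded with the Hugging Face `datasets` library. This assumes the data files are in a format the library can auto-detect (e.g. Parquet); split names may differ:

```python
# Minimal loading sketch using the `datasets` library.
# Assumes the repository's data files are auto-detectable (e.g. Parquet).
from datasets import load_dataset

dataset = load_dataset("tensorshield/reddit_dataset_157")
print(dataset)  # shows the available splits and their sizes
```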

Supported Tasks

The versatility of this dataset allows researchers and data scientists to explore various aspects of social media dynamics and develop innovative applications. Users are encouraged to leverage this data creatively for their specific research or business needs. For example (a small sentiment-analysis sketch follows this list):

  • Sentiment Analysis
  • Topic Modeling
  • Community Analysis
  • Content Categorization
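As one illustration, a sentiment-analysis pass over the text field might look like the sketch below. The "train" split name, the character-level truncation, and the default pipeline model are assumptions, not part of this dataset:

```python
# Illustrative sentiment-analysis sketch over the `text` field.
# The "train" split name and the default pipeline model are assumptions.
from datasets import load_dataset
from transformers import pipeline

ds = load_dataset("tensorshield/reddit_dataset_157", split="train")
sentiment = pipeline("sentiment-analysis")  # downloads a default English model

for record in ds.select(range(8)):
    result = sentiment(record["text"][:512])[0]  # truncate very long comments
    print(record["communityName"], result["label"], round(result["score"], 3))
```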

Languages

Primary language: English. Because the data is contributed by a decentralized network of miners, content in other languages may also appear.

Dataset Structure

Data Instances

Each instance represents a single Reddit post or comment with the following fields; a short inspection sketch follows the field list.

Data Fields

  • text (string): The main content of the Reddit post or comment.
  • label (string): Sentiment or topic category of the content.
  • dataType (string): Indicates whether the entry is a post or a comment.
  • communityName (string): The name of the subreddit where the content was posted.
  • datetime (string): The date and time when the post or comment was created.
  • username_encoded (string): An encoded version of the username to maintain user privacy.
  • url_encoded (string): An encoded version of any URLs included in the content.
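A minimal inspection sketch, assuming the split is named "train" and the fields match the list above:

```python
# Print each documented field for the first record.
# Field names come from the list above; the "train" split name is an assumption.
from datasets import load_dataset

ds = load_dataset("tensorshield/reddit_dataset_157", split="train")
row = ds[0]
for field in ["text", "label", "dataType", "communityName",
              "datetime", "username_encoded", "url_encoded"]:
    print(f"{field}: {row.get(field)}")
```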

Data Splits

This dataset is continuously updated and does not have fixed splits. Users should create their own splits based on their requirements and the data's timestamp.
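One way to build such splits is to partition on the datetime field, as in this sketch (it assumes ISO-8601 timestamp strings, which sort chronologically when compared as plain strings, and a split named "train"):

```python
# Hedged sketch: time-based train/eval split using the `datetime` field.
# Assumes ISO-8601 timestamps (lexicographic order == chronological order).
from datasets import load_dataset

ds = load_dataset("tensorshield/reddit_dataset_157", split="train")
cutoff = "2025-03-31T02:00:00Z"  # example cutoff; choose per your needs

train_split = ds.filter(lambda row: row["datetime"] < cutoff)
eval_split = ds.filter(lambda row: row["datetime"] >= cutoff)
print(len(train_split), len(eval_split))
```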

Dataset Creation

Source Data

Data is collected from public posts and comments on Reddit, adhering to the platform's terms of service and API usage guidelines.

Personal and Sensitive Information

All usernames and URLs are encoded to protect user privacy. The dataset does not intentionally include personal or sensitive information.

Considerations for Using the Data

Social Impact and Biases

Users should be aware of potential biases inherent in Reddit data, including demographic and content biases. This dataset reflects the content and opinions expressed on Reddit and should not be considered a representative sample of the general population.

Limitations

  • Data quality may vary due to the nature of social media sources.
  • The dataset may contain noise, spam, or irrelevant content typical of social media platforms.
  • Temporal biases may exist due to real-time collection methods.
  • The dataset is limited to public subreddits and does not include private or restricted communities.

Additional Information

Licensing Information

The dataset is released under the MIT license. Use of this dataset is also subject to the Reddit Terms of Use.

Citation Information

If you use this dataset in your research, please cite it as follows:

```bibtex
@misc{tensorshield2025datauniversereddit_dataset_157,
  title={The Data Universe Datasets: The finest collection of social media data the web has to offer},
  author={tensorshield},
  year={2025},
  url={https://huggingface.co/datasets/tensorshield/reddit_dataset_157}
}
```

Contributions

To report issues or contribute to the dataset, please contact the miner or use the Bittensor Subnet 13 governance mechanisms.

Dataset Statistics

[This section is automatically updated]

  • Total Instances: 85162
  • Date Range: 2025-03-24T00:00:00Z to 2025-03-24T00:00:00Z
  • Last Updated: 2025-03-31T02:41:14Z

Data Distribution

  • Posts: 12.15%
  • Comments: 87.85%

Top 10 Subreddits

For full statistics, please refer to the stats.json file in the repository.
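The file can also be fetched programmatically, for example with `huggingface_hub` (a sketch; it assumes stats.json sits at the repository root, as the card suggests):

```python
# Sketch: download and parse stats.json from the dataset repository.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="tensorshield/reddit_dataset_157",
    filename="stats.json",
    repo_type="dataset",  # required for dataset repositories
)
with open(path) as f:
    stats = json.load(f)
print(list(stats))  # top-level keys
```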

| Rank | Topic | Total Count | Percentage |
|------|-------|-------------|------------|
| 1 | r/AskReddit | 2606 | 3.06% |
| 2 | r/CollegeBasketball | 2355 | 2.77% |
| 3 | r/AITAH | 1012 | 1.19% |
| 4 | r/mildlyinfuriating | 685 | 0.80% |
| 5 | r/Advice | 445 | 0.52% |
| 6 | r/MadeMeSmile | 430 | 0.50% |
| 7 | r/nba | 429 | 0.50% |
| 8 | r/facepalm | 390 | 0.46% |
| 9 | r/RoastMe | 385 | 0.45% |
| 10 | r/moviecritic | 379 | 0.45% |

Update History

| Date | New Instances | Total Instances |
|------|---------------|-----------------|
| 2025-03-31T01:44:45Z | 859 | 859 |
| 2025-03-31T01:45:28Z | 1358 | 2217 |
| 2025-03-31T01:46:30Z | 1300 | 3517 |
| 2025-03-31T01:47:15Z | 1471 | 4988 |
| 2025-03-31T02:05:19Z | 28610 | 33598 |
| 2025-03-31T02:06:17Z | 1495 | 35093 |
| 2025-03-31T02:07:15Z | 1361 | 36454 |
| 2025-03-31T02:08:16Z | 1439 | 37893 |
| 2025-03-31T02:09:15Z | 1426 | 39319 |
| 2025-03-31T02:10:16Z | 1492 | 40811 |
| 2025-03-31T02:11:15Z | 1356 | 42167 |
| 2025-03-31T02:12:17Z | 1598 | 43765 |
| 2025-03-31T02:13:15Z | 1445 | 45210 |
| 2025-03-31T02:14:15Z | 1440 | 46650 |
| 2025-03-31T02:15:16Z | 1435 | 48085 |
| 2025-03-31T02:17:16Z | 2882 | 50967 |
| 2025-03-31T02:18:15Z | 1384 | 52351 |
| 2025-03-31T02:19:19Z | 1429 | 53780 |
| 2025-03-31T02:20:15Z | 1449 | 55229 |
| 2025-03-31T02:21:17Z | 1357 | 56586 |
| 2025-03-31T02:22:14Z | 1385 | 57971 |
| 2025-03-31T02:23:14Z | 1389 | 59360 |
| 2025-03-31T02:24:22Z | 1430 | 60790 |
| 2025-03-31T02:25:14Z | 1387 | 62177 |
| 2025-03-31T02:26:14Z | 1439 | 63616 |
| 2025-03-31T02:27:16Z | 1368 | 64984 |
| 2025-03-31T02:28:15Z | 1543 | 66527 |
| 2025-03-31T02:29:17Z | 1493 | 68020 |
| 2025-03-31T02:30:16Z | 1428 | 69448 |
| 2025-03-31T02:31:14Z | 1433 | 70881 |
| 2025-03-31T02:32:16Z | 1426 | 72307 |
| 2025-03-31T02:33:14Z | 1445 | 73752 |
| 2025-03-31T02:34:15Z | 1485 | 75237 |
| 2025-03-31T02:35:15Z | 1424 | 76661 |
| 2025-03-31T02:36:16Z | 1469 | 78130 |
| 2025-03-31T02:37:14Z | 1460 | 79590 |
| 2025-03-31T02:38:15Z | 1415 | 81005 |
| 2025-03-31T02:39:18Z | 1424 | 82429 |
| 2025-03-31T02:40:14Z | 1301 | 83730 |
| 2025-03-31T02:41:14Z | 1432 | 85162 |