---
license: mit
multilinguality:
  - multilingual
source_datasets:
  - original
task_categories:
  - text-classification
  - token-classification
  - question-answering
  - summarization
  - text-generation
task_ids:
  - sentiment-analysis
  - topic-classification
  - named-entity-recognition
  - language-modeling
  - text-scoring
  - multi-class-classification
  - multi-label-classification
  - extractive-qa
  - news-articles-summarization
---

Bittensor Subnet 13 Reddit Dataset

Data-universe: The finest collection of social media data the web has to offer

Dataset Description

  • Repository: tensorshield/reddit_dataset_237
  • Subnet: Bittensor Subnet 13
  • Miner Hotkey: 5ELUF52qoxx5zq3134YKeQoBvheztbynYMXHR5LmX1bNTbtf

Dataset Summary

This dataset is part of the Bittensor Subnet 13 decentralized network, containing preprocessed Reddit data. The data is continuously updated by network miners, providing a real-time stream of Reddit content for various analytical and machine learning tasks. For more information about the dataset, please visit the official repository.
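The example below is a minimal loading sketch using the Hugging Face datasets library. It assumes the dataset exposes a default train split and uses streaming so you can inspect rows without downloading the full snapshot.

```python
# Minimal loading sketch (assumes a "train" split and the `datasets` library).
from datasets import load_dataset

reddit = load_dataset(
    "tensorshield/reddit_dataset_237",
    split="train",
    streaming=True,  # avoid downloading the whole snapshot up front
)

# Peek at a few rows.
for row in reddit.take(3):
    print(row["communityName"], row["dataType"], row["text"][:80])
```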

Supported Tasks

The versatility of this dataset allows researchers and data scientists to explore various aspects of social media dynamics and to develop innovative applications. Users are encouraged to leverage this data creatively for their specific research or business needs, for example (a sentiment-analysis sketch follows this list):

  • Sentiment Analysis
  • Topic Modeling
  • Community Analysis
  • Content Categorization
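As an illustration of the sentiment-analysis use case, the sketch below scores a handful of streamed rows with an off-the-shelf model from the transformers library. The model name is an arbitrary example rather than something shipped with this dataset, and the train split is assumed as above.

```python
# Sentiment-analysis sketch; the model choice is an example only.
from datasets import load_dataset
from transformers import pipeline

reddit = load_dataset("tensorshield/reddit_dataset_237", split="train", streaming=True)
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

texts = [row["text"][:512] for row in reddit.take(8)]
for text, result in zip(texts, classifier(texts)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {text[:60]!r}")
```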

Languages

Primary language: English. Because the data is collected by a decentralized network of miners, content in other languages may also appear, making the dataset effectively multilingual.

Dataset Structure

Data Instances

Each instance represents a single Reddit post or comment with the following fields:

Data Fields

  • text (string): The main content of the Reddit post or comment.
  • label (string): Sentiment or topic category of the content.
  • dataType (string): Indicates whether the entry is a post or a comment.
  • communityName (string): The name of the subreddit where the content was posted.
  • datetime (string): The timestamp when the post or comment was created.
  • username_encoded (string): An encoded version of the username to maintain user privacy.
  • url_encoded (string): An encoded version of any URLs included in the content.
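The sketch below illustrates working with these fields: filtering a streamed split down to comments from a single subreddit and parsing the datetime field. The target subreddit is an arbitrary example, and the "r/"-prefixed communityName values and ISO-8601 timestamps are assumptions based on the statistics shown later in this card.

```python
# Field-usage sketch: comments from one subreddit, with parsed timestamps.
from datetime import datetime
from itertools import islice

from datasets import load_dataset

reddit = load_dataset("tensorshield/reddit_dataset_237", split="train", streaming=True)

# Keep only comments posted in r/nba (an arbitrary example subreddit).
nba_comments = (
    row for row in reddit
    if row["dataType"] == "comment" and row["communityName"] == "r/nba"
)

for row in islice(nba_comments, 5):
    # Replace a trailing "Z" so older Python versions can parse the timestamp.
    posted = datetime.fromisoformat(row["datetime"].replace("Z", "+00:00"))
    print(posted.date(), row["username_encoded"][:12], row["text"][:60])
```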

Data Splits

This dataset is continuously updated and does not have fixed splits. Users should create their own splits based on their requirements and the data's timestamp.
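One common approach, sketched below, is a timestamp-based split: rows before an arbitrary cutoff become training data and the rest are held out. It assumes the datetime field is a consistently formatted ISO-8601 string, so lexicographic comparison matches chronological order.

```python
# Timestamp-based split sketch; the cutoff date is an arbitrary example.
from datasets import load_dataset

reddit = load_dataset("tensorshield/reddit_dataset_237", split="train")

CUTOFF = "2025-03-24T12:00:00Z"  # consistent ISO-8601 strings sort chronologically

train = reddit.filter(lambda row: row["datetime"] < CUTOFF)
test = reddit.filter(lambda row: row["datetime"] >= CUTOFF)

print(len(train), len(test))
```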

Dataset Creation

Source Data

Data is collected from public posts and comments on Reddit, adhering to the platform's terms of service and API usage guidelines.

Personal and Sensitive Information

All usernames and URLs are encoded to protect user privacy. The dataset does not intentionally include personal or sensitive information.

Considerations for Using the Data

Social Impact and Biases

Users should be aware of potential biases inherent in Reddit data, including demographic and content biases. This dataset reflects the content and opinions expressed on Reddit and should not be considered a representative sample of the general population.

Limitations

  • Data quality may vary due to the nature of social media sources and the decentralized collection process.
  • The dataset may contain noise, spam, or irrelevant content typical of social media platforms.
  • Temporal biases may exist due to real-time collection methods.
  • The dataset is limited to public subreddits and does not include private or restricted communities.

Additional Information

Licensing Information

The dataset is released under the MIT license. Use of this dataset is also subject to the Reddit Terms of Use.

Citation Information

If you use this dataset in your research, please cite it as follows:

@misc{tensorshield2025datauniversereddit_dataset_237,
  title={The Data Universe Datasets: The finest collection of social media data the web has to offer},
  author={tensorshield},
  year={2025},
  url={https://huggingface.co/datasets/tensorshield/reddit_dataset_237}
}

Contributions

To report issues or contribute to the dataset, please contact the miner or use the Bittensor Subnet 13 governance mechanisms.

Dataset Statistics

[This section is automatically updated]

  • Total Instances: 120160
  • Date Range: 2025-03-24T00:00:00Z to 2025-03-24T00:00:00Z
  • Last Updated: 2025-03-31T02:47:14Z

Data Distribution

  • Posts: 12.05%
  • Comments: 87.95%

Top 10 Subreddits

For full statistics, please refer to the stats.json file in the repository.

| Rank | Topic | Total Count | Percentage |
|------|-------|-------------|------------|
| 1 | r/AskReddit | 3646 | 3.03% |
| 2 | r/CollegeBasketball | 3501 | 2.91% |
| 3 | r/AITAH | 1485 | 1.24% |
| 4 | r/mildlyinfuriating | 959 | 0.80% |
| 5 | r/nba | 641 | 0.53% |
| 6 | r/MadeMeSmile | 623 | 0.52% |
| 7 | r/Advice | 605 | 0.50% |
| 8 | r/tennis | 573 | 0.48% |
| 9 | r/politics | 545 | 0.45% |
| 10 | r/LAClippers | 534 | 0.44% |
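The figures above can be recomputed locally; the sketch below does so with pandas, again assuming a train split. At roughly 120,000 rows the full snapshot fits comfortably in memory, though a streaming aggregation would be preferable for much larger snapshots.

```python
# Recompute the post/comment share and the top subreddits with pandas.
from datasets import load_dataset

reddit = load_dataset("tensorshield/reddit_dataset_237", split="train")
df = reddit.to_pandas()

print(df["dataType"].value_counts(normalize=True))   # share of posts vs. comments
print(df["communityName"].value_counts().head(10))   # top 10 subreddits by count
```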

Update History

| Date | New Instances | Total Instances |
|------|---------------|-----------------|
| 2025-03-31T01:27:22Z | 1449 | 1449 |
| 2025-03-31T01:28:11Z | 1423 | 2872 |
| 2025-03-31T01:29:11Z | 1502 | 4374 |
| 2025-03-31T01:30:14Z | 1537 | 5911 |
| 2025-03-31T01:31:12Z | 1530 | 7441 |
| 2025-03-31T01:32:12Z | 1514 | 8955 |
| 2025-03-31T01:33:11Z | 1470 | 10425 |
| 2025-03-31T01:34:12Z | 1409 | 11834 |
| 2025-03-31T01:35:13Z | 1528 | 13362 |
| 2025-03-31T01:36:16Z | 1603 | 14965 |
| 2025-03-31T01:37:17Z | 1510 | 16475 |
| 2025-03-31T01:38:16Z | 1488 | 17963 |
| 2025-03-31T01:39:14Z | 1413 | 19376 |
| 2025-03-31T01:40:14Z | 1529 | 20905 |
| 2025-03-31T01:41:14Z | 1546 | 22451 |
| 2025-03-31T01:42:16Z | 1535 | 23986 |
| 2025-03-31T01:44:01Z | 2641 | 26627 |
| 2025-03-31T01:44:36Z | 878 | 27505 |
| 2025-03-31T01:45:14Z | 958 | 28463 |
| 2025-03-31T01:46:30Z | 1728 | 30191 |
| 2025-03-31T01:47:14Z | 1397 | 31588 |
| 2025-03-31T02:04:21Z | 27159 | 58747 |
| 2025-03-31T02:06:18Z | 2899 | 61646 |
| 2025-03-31T02:07:15Z | 1361 | 63007 |
| 2025-03-31T02:08:17Z | 1439 | 64446 |
| 2025-03-31T02:09:15Z | 1426 | 65872 |
| 2025-03-31T02:10:16Z | 1492 | 67364 |
| 2025-03-31T02:11:14Z | 1356 | 68720 |
| 2025-03-31T02:12:15Z | 1544 | 70264 |
| 2025-03-31T02:13:14Z | 1499 | 71763 |
| 2025-03-31T02:14:19Z | 1528 | 73291 |
| 2025-03-31T02:15:16Z | 1369 | 74660 |
| 2025-03-31T02:17:15Z | 2860 | 77520 |
| 2025-03-31T02:18:14Z | 1384 | 78904 |
| 2025-03-31T02:19:14Z | 1429 | 80333 |
| 2025-03-31T02:20:15Z | 1449 | 81782 |
| 2025-03-31T02:21:19Z | 1357 | 83139 |
| 2025-03-31T02:22:14Z | 1385 | 84524 |
| 2025-03-31T02:23:13Z | 1389 | 85913 |
| 2025-03-31T02:24:21Z | 1430 | 87343 |
| 2025-03-31T02:25:16Z | 1387 | 88730 |
| 2025-03-31T02:26:14Z | 1439 | 90169 |
| 2025-03-31T02:27:21Z | 1368 | 91537 |
| 2025-03-31T02:28:15Z | 1543 | 93080 |
| 2025-03-31T02:29:17Z | 1493 | 94573 |
| 2025-03-31T02:30:17Z | 1428 | 96001 |
| 2025-03-31T02:31:15Z | 1433 | 97434 |
| 2025-03-31T02:32:15Z | 1426 | 98860 |
| 2025-03-31T02:33:15Z | 1488 | 100348 |
| 2025-03-31T02:34:15Z | 1442 | 101790 |
| 2025-03-31T02:35:14Z | 1424 | 103214 |
| 2025-03-31T02:36:15Z | 1469 | 104683 |
| 2025-03-31T02:37:18Z | 1414 | 106097 |
| 2025-03-31T02:38:15Z | 1461 | 107558 |
| 2025-03-31T02:39:16Z | 1404 | 108962 |
| 2025-03-31T02:40:14Z | 1298 | 110260 |
| 2025-03-31T02:41:14Z | 1455 | 111715 |
| 2025-03-31T02:42:15Z | 1412 | 113127 |
| 2025-03-31T02:43:14Z | 1399 | 114526 |
| 2025-03-31T02:44:14Z | 1450 | 115976 |
| 2025-03-31T02:45:15Z | 1372 | 117348 |
| 2025-03-31T02:46:14Z | 1393 | 118741 |
| 2025-03-31T02:47:14Z | 1419 | 120160 |