---
license: cc-by-4.0
task_categories:
- graph-ml
tags:
- multimodal
- attributed-graph
- benchmark
---
# MAGB
This repository contains the Multimodal Attributed Graph Benchmark (MAGB) datasets described in the paper *When Graph meets Multimodal: Benchmarking on Multimodal Attributed Graphs Learning*.
MAGB provides 5 datasets from E-Commerce and Social Networks, and evaluates two major learning paradigms: GNN-as-Predictor and VLM-as-Predictor. The datasets are publicly available on Hugging Face: https://huggingface.co/datasets/Sherirto/MAGB.
Each dataset consists of several parts:
- Graph Data (`*.pt`): Stores the graph structure, including adjacency information and node labels. Loadable with DGL (see the loading sketch after this list).
- Node Textual Metadata (`*.csv`): Contains node textual descriptions, neighborhood relationships, and category labels.
- Text, Image, and Multimodal Features (`TextFeature/`, `ImageFeature/`, `MMFeature/`): Pre-extracted embeddings from the MAGB paper for each modality.
- Raw Images (`*.tar.gz`): A compressed archive of images named by node ID. Requires extraction before use. The Reddit-M dataset is particularly large and may need special handling (see the GitHub README for details).
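For orientation, here is a minimal loading sketch. The file names, the `label` field, and the `.npy` feature format are illustrative assumptions, not the official API; check each dataset folder for the exact names it ships with.

```python
# Minimal loading sketch (file names below are hypothetical -- adjust them to
# the files actually present in each dataset folder).
import dgl
import numpy as np
import torch

# Graphs are stored in DGL's serialized format; dgl.load_graphs returns a
# list of graphs plus a dict of graph-level tensors.
graphs, _ = dgl.load_graphs("Data/Movies/MoviesGraph.pt")
g = graphs[0]
labels = g.ndata["label"]  # node labels; the field name may differ per dataset

# Pre-extracted embeddings (TextFeature/, ImageFeature/, MMFeature/) are
# assumed here to be NumPy arrays aligned with node IDs.
text_feat = torch.from_numpy(np.load("Data/Movies/TextFeature/Movies_text.npy"))

print(g)
print("text feature shape:", tuple(text_feat.shape))
```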
## Table of Contents

- [Introduction](#introduction)
- [Installation](#installation)
- [Usage](#usage)
## Introduction
Multimodal attributed graphs (MAGs) incorporate multiple data types (e.g., text, images, numerical features) into graph structures, enabling more powerful learning and inference capabilities.
This benchmark provides:
- ✅ Standardized datasets with multimodal attributes.
- ✅ Feature extraction pipelines for different modalities.
- ✅ Evaluation metrics to compare different models.
- ✅ Baselines and benchmarks to accelerate research.
## Installation
Ensure you have the required dependencies installed before running the benchmark.
```bash
# Clone the repository
git clone https://github.com/sktsherlock/MAGB.git
cd MAGB

# Install dependencies
pip install -r requirements.txt
```
## Usage
1. Download the datasets from MAGB.

```bash
cd Data/
sudo apt-get update && sudo apt-get install git-lfs && git clone https://huggingface.co/datasets/Sherirto/MAGB .
ls
```
Now you can see the **Movies**, **Toys**, **Grocery**, **Reddit-S**, and **Reddit-M** datasets under the `Data` folder.
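If `git-lfs` is unavailable, the files can also be fetched programmatically. The sketch below uses `huggingface_hub.snapshot_download`; it is not part of the official instructions, just an alternative way to pull the same dataset repository into a local folder.

```python
# Alternative download sketch using the huggingface_hub client
# (not the official MAGB instructions; shown for convenience).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Sherirto/MAGB",
    repo_type="dataset",
    local_dir="Data",  # place the files under the Data/ folder
)
print("Dataset downloaded to:", local_dir)
```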
2. GNN-as-Predictor
For the remaining usage instructions, including the GNN-as-Predictor and VLM-as-Predictor pipelines, see the GitHub README: https://github.com/sktsherlock/MAGB.