---
license: cc-by-4.0
task_categories:
- text-classification
task_ids:
- multi-label-classification
language:
- en
pretty_name: Steam Review aspect dataset
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: data/train/data-00000-of-00001.arrow
- split: test
path: data/test/data-00000-of-00001.arrow
source_datasets:
- Steam
---
Steam Review aspect dataset
Content on this dataset card initially appeared on the SRec blog.
The Steam Review aspect dataset is a collection of Steam reviews annotated with 8 review aspects. It contains 1,100 English Steam reviews, split into 900 train and 200 test examples. The dataset was initially created to identify which aspects are mentioned in English reviews, as part of the Analysis of 43 million English Steam Reviews.
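As a quick-start illustration, the snippet below sketches how the two splits could be loaded with the Hugging Face `datasets` library. The repository id is a placeholder, not this dataset's actual id.

```python
# Minimal sketch: loading the train/test splits with the `datasets` library.
# "<namespace>/steam-review-aspect-dataset" is a placeholder repository id.
from datasets import load_dataset

dataset = load_dataset("<namespace>/steam-review-aspect-dataset")

print(dataset["train"].num_rows)  # expected: 900
print(dataset["test"].num_rows)   # expected: 200
print(dataset["train"][0])        # a single annotated review
```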
Data collection and annotation
The reviews come from a snapshot of the SRec database taken on 21 February 2024. SRec obtains all reviews for all games and mods using the API provided by Steam. To reduce bias when selecting reviews to be annotated, I chose reviews primarily based on these criteria:
- Character length.
- Helpfulness score.
- Popularity of the reviewed game.
- Genre or category of the reviewed game.
There are 8 aspects used to annotate reviews in this dataset. I am the only annotator for this dataset. A review is deemed to contain a certain aspect even if the aspect is mentioned implicitly (e.g., "but it'd be great if there's good looking characters...") or the review only mentions the lack of that aspect (e.g., "... essentially has no story ..."). The table below shows the 8 aspects of this dataset, along with a short description and an example.
Table 1. A description and example for 8 aspects in this dataset
Aspect | Short description | Example review text |
---|---|---|
Recommended | Whether the reviewer recommends the game or not. This aspect comes from the person who wrote the review. | ... In conclusion, good game |
Story | Story, character, lore, world-building and other storytelling elements. | Excellent game, but has an awful-abrupt ending that comes out of nowhere and doesnt make sense ... |
Gameplay | Controls, mechanics, interactivity, difficulty and other gameplay setups. | Gone are the days of mindless building and fun. Power grids? Taxes? Intense accounting and counter-intuitive path building ... |
Visual | Aesthetic, art style, animation, visual effects and other visual elements. | Gorgeous graphics + 80s/90s anime artstyle + Spooky + Atmospheric ... |
Audio | Sound design, music, voice acting and other auditory elements. | ... catchy music, wonderful narrator saying very kind words ... |
Technical | The technical aspects of the game, such as bugs, performance, OS support, controller support and overall functionality. | bad doesnt fit a 1080p monitor u bastard ... |
Price | Price of the game or its additional content. | Devs are on meth pricing this game at $44 |
Suggestion | Suggestions for the state of the game, including external factors such as the game's price or publisher partnerships. | ... but needs a bit of personal effort to optimize the controls for PC, otherwise ... |
Take note that a few reviews contain language and content that some people may find offensive, discriminatory, or inappropriate. I DO NOT endorse, condone, or promote any such language or content.
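Because a single review can touch on any subset of these 8 aspects, the task is multi-label: each review maps to an 8-dimensional multi-hot vector. The sketch below illustrates that mapping; it assumes the annotations are available as a list of aspect names per review, which may differ from the actual column layout of this dataset.

```python
# Sketch: converting a per-review list of aspect names into an 8-dimensional
# multi-hot vector. The per-review list format is assumed for illustration.
ASPECTS = [
    "Recommended", "Story", "Gameplay", "Visual",
    "Audio", "Technical", "Price", "Suggestion",
]

def to_multi_hot(mentioned_aspects):
    """Map e.g. ["Story", "Price"] to [0, 1, 0, 0, 0, 0, 1, 0]."""
    return [1 if aspect in mentioned_aspects else 0 for aspect in ASPECTS]

print(to_multi_hot(["Story", "Price"]))  # [0, 1, 0, 0, 0, 0, 1, 0]
```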
Model benchmark
The model benchmark on the Steam Review aspect dataset is split into 3 categories:
- Base: Non-attention-based language models.
- Embedding: Inspired by MTEB, the embeddings obtained from the model are used to train a Logistic Regression classifier for up to 100 epochs.
- Fine-tune: The language model itself is fine-tuned on this dataset.
Source code for running these models is available on GitHub, but take note that it may not follow best practices, as it was written to be run only once. I ran the benchmarks on Linux with an RTX 3060 GPU and 32 GB of RAM.
Base
Model | Macro precision | Macro recall | Macro F1 | Note |
---|---|---|---|---|
Spacy Bag of Words | 0.6203 | 0.5391 | 0.5494 | |
FastText | 0.6284 | 0.5713 | 0.5871 | Minimal text preprocessing, uses pretrained vectors |
FastText | 0.6933 | 0.5821 | 0.6027 | Minimal text preprocessing, hyperparameters chosen via autotune with 5-fold cross-validation |
Spacy Ensemble | 0.6043 | 0.6773 | 0.6299 | Hyperparameters chosen via a simple grid search |
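For the FastText rows above, a rough reconstruction is sketched below: FastText handles multi-label classification with its one-vs-all loss, and the autotuned row can be approximated with the built-in autotune on a held-out file. The file names, pretrained-vector path, and prediction threshold are assumptions, not the exact benchmark configuration.

```python
# Sketch of FastText multi-label training with the one-vs-all ("ova") loss.
# FastText expects one review per line with labels prefixed by "__label__",
# e.g. "__label__Story __label__Price Great story but way too expensive."
# File names and the pretrained-vector path below are placeholders.
import fasttext

model = fasttext.train_supervised(
    input="steam_train.txt",
    loss="ova",                             # one-vs-all loss for multi-label output
    pretrainedVectors="crawl-300d-2M.vec",  # optional pretrained word vectors
    dim=300,                                # must match the pretrained vector dimension
)
# The autotuned row could instead pass autotuneValidationFile="steam_valid.txt".

# k=-1 returns every label; keep those above the probability threshold.
labels, probs = model.predict(
    "essentially has no story but the music is great", k=-1, threshold=0.5
)
print(labels, probs)
```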
Embedding
Model | Param | Max tokens | Macro precision | Macro recall | Macro F1 | Note |
---|---|---|---|---|---|---|
sentence-transformers/all-mpnet-base-v2 | 110M | 514 | 0.7074 | 0.5431 | 0.5853 | |
jinaai/jina-embeddings-v2-small-en | 137M | 8192 | 0.7068 | 0.6075 | 0.6437 | |
jinaai/jina-embeddings-v2-base-en | 137M | 8192 | 0.6813 | 0.6501 | 0.6618 | |
Alibaba-NLP/gte-large-en-v1.5 | 434M | 8192 | 0.7001 | 0.6501 | 0.6729 | |
nomic-ai/nomic-embed-text-v1.5 | 137M | 8192 | 0.7075 | 0.6498 | 0.6756 | |
McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised | 7111M | 32768 | 0.7238 | 0.6697 | 0.6928 | NF4 double quantization, instruction |
WhereIsAI/UAE-Large-V1 | 335M | 512 | 0.7245 | 0.6718 | 0.6946 | |
mixedbread-ai/mxbai-embed-large-v1 | 335M | 512 | 0.7215 | 0.6817 | 0.6989 | |
intfloat/e5-mistral-7b-instruct | 7111M | 32768 | 0.7345 | 0.7000 | 0.7137 | NF4 double quantization, instruction |
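The sketch below is my reconstruction of the Embedding setup described above: reviews are encoded with a frozen sentence-embedding model, a one-vs-rest Logistic Regression is trained on top, and macro-averaged metrics are computed. The column names ("text", "labels") and the encoder choice are assumptions; this is not the original benchmark code.

```python
# Sketch of the Embedding setup: frozen sentence embeddings + logistic regression,
# one binary classifier per aspect, scored with macro-averaged precision/recall/F1.
# Column names and the encoder choice are assumptions for illustration.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_fscore_support
from sklearn.multiclass import OneVsRestClassifier

encoder = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# `dataset` is the DatasetDict loaded earlier; "text" and multi-hot "labels"
# are assumed column names.
X_train = encoder.encode(dataset["train"]["text"])
X_test = encoder.encode(dataset["test"]["text"])
y_train = np.array(dataset["train"]["labels"])
y_test = np.array(dataset["test"]["labels"])

clf = OneVsRestClassifier(LogisticRegression(max_iter=100))  # mirrors "up to 100 epochs"
clf.fit(X_train, y_train)

precision, recall, f1, _ = precision_recall_fscore_support(
    y_test, clf.predict(X_test), average="macro", zero_division=0
)
print(f"Macro P={precision:.4f}  R={recall:.4f}  F1={f1:.4f}")
```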
Fine-tune
Model | Param | Max tokens | Macro precision | Macro recall | Macro F1 | Note |
---|---|---|---|---|---|---|
jinaai/jina-embeddings-v2-base-en | 137M | 8192 | 0.7485 | 0.7257 | 0.7354 | Choose hyperparameter from Ray Tune (30 trials) |
Alibaba-NLP/gte-large-en-v1.5 | 434M | 8192 | 0.8403 | 0.8152 | 0.8231 | Choose hyperparameter from Ray Tune (16 trials) |
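In the same spirit, the fine-tuned rows can be reproduced roughly with the Transformers `Trainer` in multi-label mode, as sketched below. This is not the original benchmark code: the base model here is a generic placeholder rather than the models listed in the table, the hyperparameter values are placeholders rather than the ones found by Ray Tune, and the "text"/"labels" column names are assumptions.

```python
# Sketch of fine-tuning an encoder for multi-label aspect classification.
# Hyperparameter values are placeholders; the benchmark selected them with Ray Tune
# (e.g. via trainer.hyperparameter_search(backend="ray", n_trials=30)).
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "bert-base-uncased"  # placeholder; the benchmark fine-tuned the models in the table
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    num_labels=8,                               # one output per aspect
    problem_type="multi_label_classification",  # sigmoid outputs + BCE loss
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

# `dataset` is the DatasetDict loaded earlier; "labels" must be float multi-hot vectors.
encoded = dataset.map(tokenize, batched=True)

args = TrainingArguments(output_dir="steam-aspect-model", num_train_epochs=5,
                         per_device_train_batch_size=16, learning_rate=2e-5)
trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
                  train_dataset=encoded["train"], eval_dataset=encoded["test"])
trainer.train()
```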
Download
You can download the Steam Review aspect dataset from here (Hugging Face) or from one of these sources:
Citation
If you wish to use this dataset in your research or project, please cite this blog post: Steam review aspect dataset
Sandy Khosasi. "Steam review aspect dataset". (2024).
For those who need it, a BibTeX citation has also been prepared.
@misc{srec:steam-review-aspect-dataset,
  title   = {Steam review aspect dataset},
  author  = {Sandy Khosasi},
  year    = {2024},
  month   = {may},
  day     = {28},
  url     = {https://srec.ai/blog/steam-review-aspect-dataset},
  urldate = {2024-05-28}
}
License
The Steam Review aspect dataset is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.