juancopi81 committed
Commit 1eb9098 · 1 Parent(s): 36af553

Update README

Files changed (1)
  1. README.md +31 -1
README.md CHANGED
@@ -18,6 +18,10 @@ task_ids:
  - [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks](#supported-tasks-and-leaderboards)
+ - [Dataset Structure](#dataset-structure)
+ - [Data Instances](#data-instances)
+ - [Data Fields](#data-fields)
+ - [Data Splits](#data-splits)

  ## Dataset Description

@@ -34,4 +38,30 @@ The dataset mainly contains guitar music from western classical composers, such

  ### Supported Tasks and Leaderboards

- Anyone interested can use the dataset to train a model for symbolic music generation, which consists of treating symbols for musical sounds (notes) as text tokens. One can then implement a generative model using NLP techniques, such as Transformers.
+ Anyone interested can use the dataset to train a model for symbolic music generation, which consists of treating symbols for musical sounds (notes) as text tokens. One can then implement a generative model using NLP techniques, such as Transformers.
+
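The paragraph above is essentially the standard causal-language-modeling recipe. As a rough illustration (not part of this commit), the sketch below loads the dataset with the Hugging Face `datasets` library and fine-tunes a small Transformer on the `text` field; the dataset identifier, the GPT-2 tokenizer/model choice, and all training settings are placeholder assumptions.

```python
# Minimal sketch: treat the event tokens as text and train a causal language model.
# The dataset ID, model choice and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Hypothetical dataset ID: replace with this repository's actual ID.
dataset = load_dataset("username/guitar-dataset", split="train")

# GPT-2 is used purely as an example of a causal-LM tokenizer/model pair.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token

def tokenize(batch):
    # Each example's `text` field holds the token sequence (PIECE_START, NOTE_ON=..., etc.).
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

model = AutoModelForCausalLM.from_pretrained("gpt2")
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="guitar-gpt2", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

In practice one would likely train a dedicated tokenizer on the event vocabulary rather than reuse GPT-2's byte-pair encoding, but the overall training loop stays the same.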
+ ## Dataset Structure
+
+ ### Data Instances
+
+ Each guitar piece is represented as a string of text that contains a series of tokens, for instance:
+
+ - `PIECE_START`: Where the piece begins
+ - `PIECE_ENDS`: Where the piece ends
+ - `TIME_SIGNATURE`: Time signature for the piece
+ - `BPM`: Tempo of the piece
+ - `BAR_START`: Beginning of a new bar
+ - `NOTE_ON`: Start of a new musical note, specifying its MIDI note number
+ - `TIME_DELTA`: Duration until the next event
+ - `NOTE_OFF`: End of a musical note, specifying its MIDI note number
+
+ ```
+ {
+   'text': 'PIECE_START TIME_SIGNATURE=2_4 BPM=74 TRACK_START INST=0 DENSITY=4 BAR_START NOTE_ON=52 TIME_DELTA=2.0 NOTE_OFF=52 NOTE_ON=45 NOTE_ON=49 TIME_DELTA=2.0 NOTE_OFF=49 NOTE_ON=52 TIME_DELTA=2.0 NOTE_OFF=45 NOTE_ON=47 NOTE_OFF=52 NOTE_ON=44 TIME_DELTA=2.0 ...'
+ }
+ ```
+
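To make the token semantics above concrete, here is a rough sketch (not an official parser) of how such a string could be converted back into note events; it only handles the note-level tokens listed above, skips the structural markers, and assumes `TIME_DELTA` values can be read as floating-point durations.

```python
# Rough sketch: turn a token string into (pitch, start, end) note tuples.
def parse_events(text):
    time = 0.0
    active = {}   # MIDI pitch -> time the note started
    notes = []    # finished (pitch, start, end) tuples
    for token in text.split():
        key, _, value = token.partition("=")
        if key == "TIME_DELTA":
            time += float(value)           # advance the running clock
        elif key == "NOTE_ON":
            active[int(value)] = time      # remember when this pitch started
        elif key == "NOTE_OFF":
            pitch = int(value)
            if pitch in active:
                notes.append((pitch, active.pop(pitch), time))
        # PIECE_START, TIME_SIGNATURE, BPM, TRACK_START, BAR_START, etc. are
        # structural markers and are simply skipped in this sketch.
    return notes

example = ("PIECE_START TIME_SIGNATURE=2_4 BPM=74 TRACK_START INST=0 DENSITY=4 "
           "BAR_START NOTE_ON=52 TIME_DELTA=2.0 NOTE_OFF=52")
print(parse_events(example))   # [(52, 0.0, 2.0)]
```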
+ ### Data Fields
+
+ - `text`: Sequence of tokens that represents the guitar piece, as explained in the paper [MMM: Exploring Conditional Multi-Track Music Generation with the Transformer](https://arxiv.org/abs/2008.06048).