brillm05 committed
Commit 9083130 · 0 Parent(s)

Fresh start without large files

Files changed (15)
  1. .gitignore +15 -0
  2. README.MD +163 -0
  3. brain.yaml +11 -0
  4. figs/fig2.png +0 -0
  5. figs/fig4.png +0 -0
  6. figs/fig6.png +0 -0
  7. infer_en.py +36 -0
  8. infer_zh.py +34 -0
  9. model.py +305 -0
  10. run_en.sh +31 -0
  11. run_zh.sh +29 -0
  12. train.py +852 -0
  13. vocab_wiki_4k.json +4003 -0
  14. vocab_wiki_4k_en.json +4002 -0
  15. wiki_bpe_tokenizer_4000_bytelevel.json +0 -0
.gitignore ADDED
@@ -0,0 +1,15 @@
+ model_0.bin
+ model_1.bin
+ pytorch_model.bin
+ model.bin
+ split.ipynb
+ decode.ipynb
+ upload.py
+ results/
+ vocab.json
+ model_en.bin
+ model_zh.bin
+ __pycache__
+ transfer.py
+ word_frequency.json
+ word_frequency_en.json
README.MD ADDED
@@ -0,0 +1,163 @@
+ # BriLLM: Brain-inspired Large Language Model
+
+ We release BriLLM-Chinese and BriLLM-English.
+
+ Our paper: https://arxiv.org/pdf/2503.11299
+
+ Our Hugging Face model: https://huggingface.co/BriLLM/BriLLM0.5
+
+
+ ## Overview
+ This work introduces BriLLM, the first brain-inspired large language model. BriLLM is a generative language model that is neither a Transformer nor a GPT and does not follow the traditional input-output controlled machine learning paradigm. It is built on the Signal Fully-connected flowing (SiFu) mechanism, defined over a directed graph of neural-network nodes, so every node of the whole model is interpretable, in contrast to traditional machine learning models, which offer only limited interpretability at their input and output ends.
+
+
+ ## SiFu Mechanism
+ ![](./figs/fig1.png)
+ > As shown in Figure 1, a SiFu model is a graph composed of multiple nodes, which are sparsely activated and use tensors to transmit a nominal signal.
+ Each node (ideally, a layer of neurons) represents a concept or word, e.g., a noun or a verb.
+ Each edge models the relationship between a pair of nodes.
+ The signal is transmitted according to the magnitude of its energy. The energy is strengthened, i.e., maximized, along the correct route; at the least, the correct path always carries the maximal energy for the transmitted signal.
+ Nodes are activated sequentially according to this maximized energy.
+ The route, or path, is determined competitively: the next node is activated only if it is the one to which the energy can be maximally delivered. A minimal sketch of this routing rule is given below.
+
+
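As a rough, self-contained illustration of the competitive routing described above (a toy sketch, not the released implementation: the node names, hidden size, and random edge weights are made up for this example), the snippet below propagates an energy tensor from the current node across every outgoing edge and activates the neighbor that receives the largest energy norm:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
d = 8                                     # toy hidden size (BriLLM uses 32)
nodes = ["cat", "sat", "on", "mat"]       # toy vocabulary of concept nodes

# one (W, b) pair per directed edge, as in the SiFu graph
edges = {(s, t): (torch.randn(d, d) * 0.1, torch.randn(1, d) * 0.1)
         for s in nodes for t in nodes if s != t}

def next_node(current, energy):
    """Activate the neighbor whose edge delivers the largest energy norm."""
    best_node, best_energy, best_norm = None, None, -1.0
    for t in nodes:
        if t == current:
            continue
        W, b = edges[(current, t)]
        out = F.gelu(energy @ W + b)      # propagate the signal along edge current -> t
        if out.norm() > best_norm:
            best_node, best_energy, best_norm = t, out, out.norm().item()
    return best_node, best_energy

energy = torch.ones(1, d) / d             # initial nominal signal
node = "cat"
for _ in range(3):
    node, energy = next_node(node, energy)
    print(node, round(energy.norm().item(), 4))
```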
+ ## Architecture
+ ![](./figs/fig2.png)
+ > As shown in Figure 2, BriLLM implements the SiFu neural network for language modeling.
+ Each token in the vocabulary is modeled as a node, defined by a hidden layer of neurons, and each ordered pair of tokens is connected by an edge with its own weight matrix and bias (see the sketch below).
+
+
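Concretely, the released `model.py` flattens these per-edge parameters into a single table and keeps an index map from (source, target) node indices to a row of that table (`prepare_network`). The following stripped-down sketch uses toy sizes rather than the real vocabulary of 4,000 tokens and hidden size of 32:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

V, H = 5, 4                      # toy vocabulary size and hidden size

# map each directed edge (s_idx, t_idx) to a row of a flat parameter table,
# in the spirit of prepare_network() in model.py
weight_indices = {}
row = 0
for s in range(V):
    for t in range(V):
        weight_indices[(s, t)] = row
        row += 1

weights = nn.Parameter(torch.empty(row, H, H).uniform_(-0.5, 0.5))   # one H x H matrix per edge
biases = nn.Parameter(torch.empty(row, 1, H).uniform_(-0.5, 0.5))    # one bias row per edge

# propagate an energy tensor along edge 2 -> 3
idx = weight_indices[(2, 3)]
energy = torch.ones(1, H) / H
out = F.gelu(energy @ weights[idx] + biases[idx])
print(out.shape)                 # torch.Size([1, 4])
```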
+ ## Training Network
+ ![](./figs/fig3.png)
+ > To train on one sample, BriLLM each time builds an individual, ordinary neural network and performs regular backpropagation on it. This network consists of two parts: the front part connects all input nodes (i.e., tokens), and the rear part connects all candidate paths in order. Finally, a softmax layer collects the energy tensors of all paths, and a 0-1 ground-truth vector indicates the correct path. We train with a cross-entropy loss (a toy version of this objective is sketched below).
+
+
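In the released code this objective appears as a contrastive step in `BraLM.forward`: the ground-truth edge is placed at index 0 among k negative edges, each candidate edge produces an output tensor, the L2 norm of that tensor is treated as the path energy, and cross-entropy pushes the ground-truth edge to carry the largest energy. A toy version with made-up sizes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
H, k = 4, 3                                    # toy hidden size and number of negative samples

energy = torch.ones(1, 1, H) / H               # current energy tensor
W = torch.randn(1 + k, H, H) * 0.1             # edge weights: ground truth first, then negatives
b = torch.randn(1 + k, 1, H) * 0.1             # edge biases

out = F.gelu(energy.expand(1 + k, 1, H).bmm(W) + b)   # (1+k, 1, H): one output per candidate edge
scores = out.norm(2, (-2, -1)).unsqueeze(0)            # path energies, shape (1, 1+k)
label = torch.zeros(1, dtype=torch.long)                # index 0 is the ground-truth edge
loss = nn.CrossEntropyLoss()(scores, label)
print(loss.item())
```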
+ ## Dataset
+ We use a subset of the Chinese version of Wikipedia containing over 100M Chinese characters. Long sentences are cut into shorter ones with a maximum length of 16.
+ We select a vocabulary of 4,000 tokens consisting of the most frequently used Chinese characters.
+
+
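In `train.py` this preprocessing appears in `WikiDataset`, which truncates each line to the maximum sequence length and pads short ones; a condensed sketch (padding token simplified to an empty string, as in the Chinese path of the code):

```python
def to_fixed_length(line, max_seq_length=16, pad=""):
    """Truncate a sentence to max_seq_length characters and pad shorter ones."""
    chars = list(line.strip())[:max_seq_length]
    chars.extend(pad for _ in range(max_seq_length - len(chars)))
    return chars

print(to_fixed_length("《罗马》描述了罗马帝国的历史与文化"))
```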
+ ## Implementation Details
+ BriLLM is implemented in PyTorch.
+ It uses sinusoidal positional encoding, GeLU as the activation function, cross-entropy loss for next-token prediction, and an embedding size of $d_{model} = 32$.
+ We use the AdamW optimizer with $\beta_1 = 0.9$, $\beta_2 = 0.999$, and $\epsilon = 10^{-8}$.
+ The model size is about $512 + 4000 \times 4000 \times (32 \times 32 + 32) \approx 16\mathrm{B}$ parameters (the arithmetic is spelled out below).
+ We trained our models on one machine with 8 NVIDIA A800 GPUs for 1.5k steps.
+ ![](./figs/fig4.png)
+
+
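The parameter count follows directly from the architecture: every directed edge between two of the 4,000 vocabulary nodes carries its own $32 \times 32$ weight matrix and length-32 bias, plus a small positional parameter of 512 entries. A quick check of the formula above:

```python
vocab_size = 4000
d_model = 32

per_edge = d_model * d_model + d_model   # one weight matrix and one bias vector per edge
num_edges = vocab_size * vocab_size      # every ordered token pair is a directed edge
total = 512 + num_edges * per_edge       # 512 positional scalars, as in the formula above
print(total)                             # 16_896_000_512, i.e. roughly 16.9 billion parameters
```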
+ ## Complexity
+ Let $n$ be the sequence length, $v$ the vocabulary size, and $d$ the representation dimension. The computational complexity is $O(n \cdot v \cdot d^2)$.
+
+
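This bound comes from the per-position cost: at each of the $n$ positions, the current energy tensor is propagated across up to $v$ candidate edges, and applying one edge is a $(1 \times d)(d \times d)$ matrix product costing $O(d^2)$:

$$
\underbrace{n}_{\text{positions}} \cdot \underbrace{v}_{\text{candidate edges}} \cdot \underbrace{O(d^2)}_{\text{per-edge matrix product}} \;=\; O(n \cdot v \cdot d^2).
$$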
+ ## Case Study
+ ![](./figs/fig5.png)
+ ![](./figs/fig7.png)
+
+
+ ## Comparison of LLM and BriLLM
+ ![](./figs/fig6.png)
+
+
+ ## Installation
+ ```bash
+ pip install torch
+ ```
+
+
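Note that the scripts in this repository import a few packages beyond `torch`: `tokenizers` for the English BPE inference, and `accelerate`, `numpy`, `tqdm`, and `wandb` for `train.py`. If you plan to run them, an installation along these lines should cover those imports (versions are not pinned in the repo, so treat this as a suggestion rather than an official requirement list):

```bash
pip install torch tokenizers accelerate numpy tqdm wandb
```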
+ ## Checkpoint
+ [BriLLM0.5](https://huggingface.co/BriLLM/BriLLM0.5)
+
+
+ ## Train
+ ### BriLLM-Chinese
+ ```bash
+ bash run_zh.sh
+ ```
+
+ ### BriLLM-English
+ ```bash
+ bash run_en.sh
+ ```
+
+
+
+ ## Inference
+ ### BriLLM-Chinese
+ ```python
+ import json
+ import torch
+ from model import BraLM, Vocab
+
+ # build the node/edge vocabulary from the released token list
+ with open("vocab_wiki_4k.json") as f:
+     node_dict = json.load(f)
+ vocab = Vocab.from_node_dict(node_dict)
+
+ # edges that never occur in the training corpus share a single set of parameters
+ with open('word_frequency.json', 'r') as f:
+     freq_dict = json.load(f)
+
+ zero_freq_edges = {}
+ for s in freq_dict:
+     zero_freq_edges[s] = []
+     for t in freq_dict[s]:
+         if freq_dict[s][t] == 0:
+             zero_freq_edges[s].append(t)
+
+ model = BraLM(hidden_size=32, zero_freq_edges=zero_freq_edges, vocab=vocab)
+ model.prepare_network(vocab)
+
+ state_dict = torch.load("model_zh.bin", weights_only=True)
+ model.load_state_dict(state_dict)
+ model.to_device("cuda:6")  # adjust to an available GPU, e.g. "cuda:0"
+
+ head = "《罗马》描述了"
+ max_token = 32 - len(head)
+
+ # encode the prompt as consecutive character-pair edges, then decode greedily
+ start = [vocab(head[i] + '->' + head[i+1]) for i in range(len(head) - 1)]
+ ret = model.decode(start, vocab, max_token)
+ decode_tuple_list = [vocab.decode(p) for p in ret]
+ decode_sentence = decode_tuple_list[0][0] + "".join([p[-1] for p in decode_tuple_list])
+
+ print(decode_sentence)
+ ```
+
+
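For clarity on the prompt encoding used above: each pair of adjacent characters in `head` becomes a `source->target` edge and is looked up in the vocabulary, which returns an `(index_source, index_target)` tuple (out-of-vocabulary pairs fall back to a shared edge in the released `Vocab`). With a toy dictionary standing in for `vocab_wiki_4k.json`:

```python
# toy node dictionary; the released vocab_wiki_4k.json maps 4,000 characters to indices
node_dict = {"《": 0, "罗": 1, "马": 2, "》": 3, "描": 4, "述": 5, "了": 6}

head = "《罗马》描述了"
edges = [(node_dict[a], node_dict[b]) for a, b in zip(head, head[1:])]
print(edges)   # [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
```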
+ ### BriLLM-English
+ ```python
+ import json
+ import torch
+ from model import BraLM, Vocab
+ from tokenizers import Tokenizer
+
+ bpe_tokenizer = Tokenizer.from_file("wiki_bpe_tokenizer_4000_bytelevel.json")
+
+ def decode_en_sentence(head, max_token=32, do_sample=False):
+     # encode the prompt with byte-level BPE and turn adjacent tokens into edges
+     bpe_tokens = bpe_tokenizer.encode(head).tokens
+     if len(bpe_tokens) < 2:
+         return head
+     start = [vocab(bpe_tokens[i] + '->' + bpe_tokens[i+1]) for i in range(len(bpe_tokens) - 1)]
+     ret = model.decode(start, vocab, max_token, do_sample)
+     decode_tuple_list = [vocab.decode(p).split('->') for p in ret]
+     decode_sentence = decode_tuple_list[0][0] + "".join([p[-1] for p in decode_tuple_list])
+     return decode_sentence
+
+
+ with open("./vocab_wiki_4k_en.json") as f:
+     node_dict = json.load(f)
+ vocab = Vocab.from_node_dict(node_dict)
+
+ model = BraLM(hidden_size=32)
+ model.prepare_network(vocab)
+
+ state_dict = torch.load("model_en.bin", weights_only=True)
+ model.load_state_dict(state_dict)
+ model.to_device("cuda:6")  # adjust to an available GPU, e.g. "cuda:0"
+
+ head = "In frogs, the hind legs are larger"
+ encoding = bpe_tokenizer.encode(head)
+ token_len = len(encoding.ids)
+ max_token = 32 - token_len
+ decode_sentence = decode_en_sentence(head, max_token).replace("Ġ", " ")
+
+ print(decode_sentence)
+ ```
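The trailing `.replace("Ġ", " ")` is needed because the byte-level BPE vocabulary marks word-initial tokens with `Ġ` (the byte-level encoding of a leading space), so joining raw tokens yields strings like `InĠfrogs`. For example:

```python
tokens = ["In", "Ġfrogs", ",", "Ġthe", "Ġhind", "Ġlegs"]
print("".join(tokens).replace("Ġ", " "))   # "In frogs, the hind legs"
```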
brain.yaml ADDED
@@ -0,0 +1,11 @@
+ compute_environment: LOCAL_MACHINE
+ deepspeed_config: {}
+ distributed_type: MULTI_GPU
+ fsdp_config: {}
+ machine_rank: 0
+ main_process_ip: 192.168.0.0
+ main_process_port: 29500
+ main_training_function: main
+ num_machines: 1
+ num_processes: 8
+ use_cpu: false
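For reference, this file is the Accelerate launch configuration that `run_zh.sh` and `run_en.sh` pass to the launcher; a typical invocation (taken from those scripts, with most flags elided) looks like:

```bash
accelerate launch --config_file brain.yaml train.py --do_train ...
```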
figs/fig2.png ADDED
figs/fig4.png ADDED
figs/fig6.png ADDED
infer_en.py ADDED
@@ -0,0 +1,36 @@
+ import json
+ import torch
+ from model import BraLM, Vocab
+ from tokenizers import Tokenizer
+
+ bpe_tokenizer = Tokenizer.from_file("wiki_bpe_tokenizer_4000_bytelevel.json")
+
+ def decode_en_sentence(head, max_token=32, do_sample=False):
+     bpe_tokens = bpe_tokenizer.encode(head).tokens
+     if len(bpe_tokens) < 2:
+         return head
+     start = [vocab(bpe_tokens[i] + '->' + bpe_tokens[i+1]) for i in range(len(bpe_tokens) - 1)]
+     ret = model.decode(start, vocab, max_token, do_sample)
+     decode_tuple_list = [vocab.decode(p).split('->') for p in ret]
+     decode_sentence = decode_tuple_list[0][0] + "".join([p[-1] for p in decode_tuple_list])
+     return decode_sentence
+
+
+ with open("./vocab_wiki_4k_en.json") as f:
+     node_dict = json.load(f)
+ vocab = Vocab.from_node_dict(node_dict)
+
+ model = BraLM(hidden_size=32)
+ model.prepare_network(vocab)
+
+ state_dict = torch.load("model_en.bin", weights_only=True)
+ model.load_state_dict(state_dict)
+ model.to_device("cuda:6")
+
+ head = "In frogs, the hind legs are larger"
+ encoding = bpe_tokenizer.encode(head)
+ token_len = len(encoding.ids)
+ max_token = 32 - token_len
+ decode_sentence = decode_en_sentence(head, max_token).replace("Ġ", " ")
+
+ print(decode_sentence)
infer_zh.py ADDED
@@ -0,0 +1,34 @@
+ import json
+ import torch
+ from model import BraLM, Vocab
+
+ with open("vocab_wiki_4k.json") as f:
+     node_dict = json.load(f)
+ vocab = Vocab.from_node_dict(node_dict)
+
+ with open('word_frequency.json', 'r') as f:
+     freq_dict = json.load(f)
+
+ zero_freq_edges = {}
+ for s in freq_dict:
+     zero_freq_edges[s] = []
+     for t in freq_dict[s]:
+         if freq_dict[s][t] == 0:
+             zero_freq_edges[s].append(t)
+
+ model = BraLM(hidden_size=32, zero_freq_edges=zero_freq_edges, vocab=vocab)
+ model.prepare_network(vocab)
+
+ state_dict = torch.load("model_zh.bin", weights_only=True)
+ model.load_state_dict(state_dict)
+ model.to_device("cuda:6")
+
+ head = "《罗马》描述了"
+ max_token = 32 - len(head)
+
+ start = [vocab(head[i] + '->' + head[i+1]) for i in range(len(head) - 1)]
+ ret = model.decode(start, vocab, max_token)
+ decode_tuple_list = [vocab.decode(p) for p in ret]
+ decode_sentence = decode_tuple_list[0][0] + "".join([p[-1] for p in decode_tuple_list])
+
+ print(decode_sentence)
model.py ADDED
@@ -0,0 +1,305 @@
1
+ import torch
2
+ import torch.nn as nn
3
+ import random
4
+ from torch.autograd import Variable
5
+
6
+ class BraLM(nn.Module):
7
+ def __init__(self, hidden_size, use_ds=False, zero_freq_edges=None, vocab=None):
8
+ super().__init__()
9
+ self.hidden_size = hidden_size
10
+ self.activation = nn.GELU()
11
+ self.positions = nn.Parameter(torch.ones(1, 512, 1))
12
+ self.device = None
13
+
14
+ # for fsdp
15
+ self._tied_weights_keys = []
16
+
17
+ self.use_ds = use_ds
18
+ self.zero_freq_edges = zero_freq_edges
19
+ self.vocab = vocab
20
+
21
+ def prepare_network(self, vocab):
22
+ # Create index mappings for the flattened structure
23
+ self.weight_indices = {} # Maps (s_idx, t_idx) to parameter index
24
+ self.shared_param_idx = 0
25
+
26
+ # Current index for new parameters
27
+ current_idx = 1
28
+
29
+ # Populate parameters and mappings
30
+ for s_idx, s in enumerate(vocab.edge_dict):
31
+ for t_idx, t in enumerate(vocab.edge_dict[s]):
32
+ if self.zero_freq_edges is not None and t in self.zero_freq_edges[s]:
33
+ # Use shared parameters
34
+ self.weight_indices[(s_idx, t_idx)] = self.shared_param_idx
35
+ else:
36
+ self.weight_indices[(s_idx, t_idx)] = current_idx
37
+ current_idx += 1
38
+
39
+ # Create new parameters
40
+ self.weights = nn.Parameter(torch.randn(current_idx, self.hidden_size, self.hidden_size).uniform_(-0.5, 0.5))
41
+ self.biases = nn.Parameter(torch.randn(current_idx, 1, self.hidden_size).uniform_(-0.5, 0.5))
42
+
43
+ self.node_bias = nn.Parameter(torch.randn(len(vocab.edge_dict), 1, self.hidden_size).uniform_(-0.5, 0.5))
44
+
45
+ def to_device(self, device):
46
+ self.weights.to(device)
47
+ self.biases.to(device)
48
+ self.node_bias.to(device)
49
+ self.positions.data = self.positions.data.to(device)
50
+ self.device = device
51
+
52
+ @staticmethod
53
+ def _reshape12(x):
54
+ return x.reshape(-1, x.size(-2), x.size(-1))
55
+
56
+ def get_positional_encoding(self, seq_len, d_model):
57
+ position = torch.arange(0, seq_len).reshape(-1, 1)
58
+ div_term = 10000.0 ** (torch.arange(0, d_model, 2) / d_model)
59
+ position_encoding = torch.zeros(seq_len, d_model)
60
+ position_encoding[:, 0::2] = torch.sin(position * div_term)
61
+ position_encoding[:, 1::2] = torch.cos(position * div_term)
62
+ return position_encoding.unsqueeze(0).to(self.device)
63
+
64
+ def get_initial_tensor(self, batch_size, d, pe):
65
+ # initialize energy_tensor
66
+ energy_tensor = torch.ones(batch_size, 1, self.hidden_size) / self.hidden_size #(bs, 1, hs)
67
+ energy_tensor = energy_tensor.to(self.device)
68
+
69
+ # Ensure d is on the same device as node_bias
70
+ d = d.to(self.device)
71
+ node_bias = self.node_bias[d[:, 0, 0]]
72
+ energy_tensor = self.activation(energy_tensor + node_bias + pe[:,0])
73
+ return energy_tensor
74
+
75
+
76
+ def forward(self, neighbor_ids):
77
+ # neighbor_ids: (bs, sen_len, 1+k, 2) ; k is the number of negative samples
78
+ batch_size = neighbor_ids.size(0)
79
+ loss = 0
80
+
81
+ pe = self.get_positional_encoding(512, self.hidden_size) #(1, 512, hs)
82
+
83
+ for i in range(neighbor_ids.size(1)):
84
+ d = neighbor_ids[:, i] #(bs, 1+k, 2)
85
+
86
+ if i == 0:
87
+ # for the first token, initialize energy_tensor as an all-one tensor
88
+ energy_tensor = self.get_initial_tensor(batch_size, d, pe) #(bs, 1, hs)
89
+ else:
90
+ energy_tensor = (energy_cache * self.positions[:, :i, :].softmax(1)).sum(1, keepdim=True) #(bs, 1, hs) :fix dim bug
91
+
92
+ # Vectorized parameter lookup
93
+ src_idx = d[..., 0] # (bs, 1+k)
94
+ tgt_idx = d[..., 1] # (bs, 1+k)
95
+ param_indices = torch.tensor([self.weight_indices.get((s.item(), t.item()), self.shared_param_idx)
96
+ for s, t in zip(src_idx.reshape(-1), tgt_idx.reshape(-1))],
97
+ device=self.device).reshape(batch_size, -1) # (bs, 1+k)
98
+
99
+ # Batch gather operation
100
+ w = self.weights[param_indices] # (bs, 1+k, hidden_size, hidden_size)
101
+ b = self.biases[param_indices] # (bs, 1+k, 1, hidden_size)
102
+
103
+ expand_energy_tensor = self._reshape12(energy_tensor.unsqueeze(1).repeat(1, w.size(1), 1, 1)) #(bs*(1+k), 1, hs)
104
+ # for deepspeed fp16: expand_energy_tensor.half()
105
+ if self.use_ds:
106
+ expand_energy_tensor = expand_energy_tensor.half()
107
+ nxt_energy_tensor = self.activation(expand_energy_tensor.bmm(self._reshape12(w))+self._reshape12(b)+Variable(pe[:,i+1], requires_grad=False)) #(bs*(1+k), 1, hs)
108
+ output_tensor = nxt_energy_tensor.reshape(batch_size, -1, nxt_energy_tensor.size(-2), nxt_energy_tensor.size(-1)) #(bs, 1+k, 1, hs)
109
+
110
+ if i == 0:
111
+ energy_cache = output_tensor[:,0] #(bs, 1, hs)
112
+ else:
113
+ energy_cache = torch.cat([energy_cache, output_tensor[:,0]], dim=1) #(bs, i+1, hs)
114
+
115
+ if 1:
116
+ energy = output_tensor.norm(2, (-2, -1))
117
+ label = torch.LongTensor([0 for _ in range(batch_size)]).to(self.device)
118
+ loss += nn.CrossEntropyLoss()(energy, label)
119
+
120
+ return loss / neighbor_ids.size(1)
121
+
122
+ def decode(self, start, vocab, max_new_tokens=16, do_sample=False, temperature=1):
123
+ ret = []
124
+ pe = self.get_positional_encoding(512, self.hidden_size)
125
+
126
+ for i, pair in enumerate(start):
127
+ if i == 0:
128
+ energy_tensor = self.get_initial_tensor(batch_size=1, d=torch.tensor([[pair]], device=self.device), pe=pe).squeeze(0)
129
+ else:
130
+ energy_tensor = (energy_cache * self.positions[:, :i, :].softmax(1)).sum(1, keepdim=True).squeeze(0)
131
+
132
+ # Get parameter index for this edge
133
+ param_idx = self.weight_indices.get((pair[0], pair[1]), self.shared_param_idx)
134
+
135
+ # Get weights and biases using parameter index
136
+ w = self.weights[param_idx].to(self.device)
137
+ b = self.biases[param_idx].to(self.device)
138
+
139
+ energy_tensor = self.activation(energy_tensor.mm(w) + b + pe.squeeze(0)[i])
140
+ if i == 0:
141
+ energy_cache = energy_tensor.unsqueeze(0) # Add batch dimension
142
+ else:
143
+ energy_cache = torch.cat([energy_cache, energy_tensor.unsqueeze(0)], dim=1)
144
+ ret += [pair]
145
+
146
+ x = pair[1]
147
+ prev_i = len(start)
148
+
149
+ for i in range(max_new_tokens):
150
+ candidates = vocab(vocab.get_neighbor_of_node(x, -1))
151
+
152
+ # Get parameter indices for all candidates
153
+ param_indices = torch.tensor([self.weight_indices.get((x, t[1]), self.shared_param_idx)
154
+ for t in candidates], device=self.device)
155
+
156
+ # Get weights and biases for all candidates
157
+ all_w = self.weights[param_indices].to(self.device)
158
+ all_b = self.biases[param_indices].to(self.device)
159
+
160
+ curr_i = prev_i + i
161
+ energy_tensor = (energy_cache * self.positions[:, :curr_i, :].softmax(1)).sum(1, keepdim=True)
162
+ expand_energy_tensor = energy_tensor.unsqueeze(1).repeat(1, all_w.size(0), 1, 1)
163
+ expand_energy_tensor = self._reshape12(expand_energy_tensor)
164
+
165
+ nxt_energy_tensor = self.activation(expand_energy_tensor.bmm(self._reshape12(all_w)) + self._reshape12(all_b) + pe[:,curr_i].unsqueeze(0))
166
+ output_tensor = nxt_energy_tensor.reshape(1, -1, nxt_energy_tensor.size(-2), nxt_energy_tensor.size(-1))
167
+
168
+ energy = output_tensor.norm(2, (-2,-1)).squeeze()
169
+
170
+ probs = torch.softmax(energy, dim=-1)
171
+ if temperature > 0:
172
+ probs = probs / temperature
173
+ if do_sample:
174
+ index = torch.multinomial(probs, 1).item()
175
+ else:
176
+ index = probs.argmax(-1).item()
177
+
178
+ y = candidates[index][-1]
179
+ ret += [(x, y)]
180
+
181
+ energy_tensor = output_tensor[0, index]
182
+ x = y
183
+
184
+ energy_cache = torch.cat([energy_cache, energy_tensor.unsqueeze(0)], dim=1)
185
+
186
+ return ret
187
+
188
+
189
+ class Vocab:
190
+ def __init__(self, node_dict, nodeindex_dict, edge_dict, edge_decode_dict):
191
+ self.node_dict = node_dict #{'node_p': index_p} ---- size: num_nodes
192
+ self.nodeindex_dict = nodeindex_dict #{index_p: 'node_p'} ---- size: num_nodes
193
+ self.edge_dict = edge_dict #{'node_p': {'node_q': (index_p, index_q), 'node_m': (index_p, index_m)},...} ---- size: num_nodes
194
+ self.edge_decode_dict = edge_decode_dict #{(index_p, index_q): 'node_p->node_q'} ---- size: num_nodes*num_nodes
195
+
196
+ def __call__(self, x):
197
+ if isinstance(x, list):
198
+ return [self.__call__(_) for _ in x]
199
+ else:
200
+ return self.fetch(x)
201
+
202
+ def fetch(self, x):
203
+ s, t = x.split("->")
204
+ return self.edge_dict[s][t] if s in self.edge_dict and t in self.edge_dict[s] else self.edge_dict[""][""]
205
+
206
+ @classmethod
207
+ def from_node_dict(cls, dictname):
208
+ node_dict = dict()
209
+ nodeindex_dict = dict()
210
+ edge_dict = dict()
211
+ edge_decode_dict = dict()
212
+ for s in dictname:
213
+ node_dict[s] = dictname[s]
214
+ nodeindex_dict[dictname[s]] = s # nodeindex_dict: {index_p: 'node_p'}
215
+ edge_dict[s] = {} # edge_dict: {'node_p': {'node_q': (index_p, index_q), 'node_m': (index_p, index_m)}}
216
+ for t in dictname:
217
+ edge_dict[s][t] = (dictname[s], dictname[t])
218
+ edge_decode_dict[(dictname[s], dictname[t])] = "->".join([s, t])
219
+ return cls(node_dict, nodeindex_dict, edge_dict, edge_decode_dict)
220
+
221
+ @classmethod
222
+ def from_edge(cls, filename):
223
+ edge_dict = dict()
224
+ edge_dict[""] = {}
225
+ edge_dict[""][""] = (0, 0)
226
+ edge_decode_dict = dict()
227
+ with open(filename) as f:
228
+ for line in f:
229
+ # line: node_p->node_q
230
+ s, t = line.strip().split("->")
231
+ if s not in edge_dict:
232
+ i = len(edge_dict)
233
+ j = 0
234
+ edge_dict[s] = dict()
235
+ else:
236
+ i = edge_dict[s][list(edge_dict[s].keys())[0]][0]
237
+ j = len(edge_dict[s])
238
+ edge_dict[s][t] = (i, j)
239
+ edge_decode_dict[(i, j)] = "->".join([s, t])
240
+ return cls(None, edge_dict, edge_decode_dict)
241
+
242
+ def get_neighbor_of_edge(self, key, k, frequency_dict=None):
243
+ s, t = key.split("->") # s, t: node
244
+ _s = s if s in self.edge_dict else ""
245
+
246
+ # if s in self.edge_dict:
247
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != t]
248
+ # else:
249
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[""].keys() if _t != t]
250
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != t]
251
+
252
+ # select by word_frequency
253
+ if frequency_dict:
254
+ frequency_lst = list(frequency_dict[_s].keys())
255
+ # index = frequency_lst.index(t)
256
+ # half = k // 2
257
+ # if index <= k:
258
+ # t_lst = [x for i, x in enumerate(frequency_lst[:k+1]) if i != index]
259
+ # else:
260
+ # t_lst = frequency_lst[:half] + frequency_lst[index-half:index]
261
+ t_lst = [x for i, x in enumerate(frequency_lst[:k+1]) if x != t][:k]
262
+ ret = ["->".join([_s, _t]) for _t in t_lst]
263
+ random.shuffle(ret)
264
+ return ret
265
+ # randomly select k negative samples
266
+ else:
267
+ ret = ["->".join([_s, _t]) for _t in self.edge_dict[_s].keys() if _t != t]
268
+ random.shuffle(ret)
269
+ return ret[:k] if k != -1 else ret
270
+
271
+ def get_neighbor_of_node(self, key, k):
272
+ #key :index
273
+ s = self.nodeindex_dict[key] #node
274
+ #_t: node
275
+ ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != s]
276
+
277
+ # randomly select k negative samples
278
+ random.shuffle(ret)
279
+ return ret[:k] if k != -1 else ret
280
+
281
+ def get_neighbor_of_edge_broadcast(self, key, edges, k=100):
282
+ s, t = key.split("->")
283
+ _ret = [_t for _t in self.edge_dict[s].keys() if _t != t] # all neighbors of s except t
284
+ random.shuffle(_ret)
285
+ ret = []
286
+ for edge in edges:
287
+ s, t = edge.split("->")
288
+ ret += [["->".join([s, _t]) for _t in _ret[:k]]]
289
+ return ret
290
+
291
+ @staticmethod
292
+ def to_path(tokens):
293
+ path = []
294
+ for left, right in zip(tokens[:-1], tokens[1:]):
295
+ path.append("->".join([left, right]))
296
+ return path
297
+
298
+ def get_edge_of_node(self, key):
299
+ return list(self.edge_dict[key].values())
300
+
301
+ def decode(self, x):
302
+ return self.edge_decode_dict[x]
303
+
304
+
305
+
run_en.sh ADDED
@@ -0,0 +1,31 @@
+ set -xe
+
+ BS=32
+ LR=5e-2
+ HS=32
+ LEN=32
+ INITIAL_F=0
+ END_F=0
+ EPOCH=25
+
+ accelerate launch --config_file brain.yaml train.py \
+     --data_dir data \
+     --do_train \
+     --output_dir en_checkpoints_len${LEN}_sparse \
+     --hidden_size $HS \
+     --train_batch_size $BS \
+     --max_seq_length $LEN \
+     --learning_rate $LR \
+     --num_train_epochs $EPOCH \
+     --num_neg_samples 400 \
+     --initial_file_number $INITIAL_F \
+     --end_file_number $END_F \
+     --num_workers 8 \
+     --fp16 \
+     --run_name "BS${BS}_LR${LR}_HS${HS}_LEN${LEN}_f${INITIAL_F}_EPOCH${EPOCH}" \
+     --vocab_path vocab_wiki_4k_en.json \
+     --train_full data \
+     --sparse \
+     --use_frequency \
+     --use_bpe \
+     --bpe_tokenizer_path wiki_bpe_tokenizer_4000_bytelevel.json
run_zh.sh ADDED
@@ -0,0 +1,29 @@
+ set -xe
+
+ BS=32
+ LR=5e-2
+ HS=32
+ LEN=32
+ INITIAL_F=0
+ END_F=0
+ EPOCH=25
+
+ accelerate launch --config_file brain.yaml train.py \
+     --data_dir data \
+     --do_train \
+     --output_dir zh_checkpoints_len${LEN}_sparse \
+     --hidden_size $HS \
+     --train_batch_size $BS \
+     --max_seq_length $LEN \
+     --learning_rate $LR \
+     --num_train_epochs $EPOCH \
+     --num_neg_samples 400 \
+     --initial_file_number $INITIAL_F \
+     --end_file_number $END_F \
+     --num_workers 8 \
+     --fp16 \
+     --run_name "BS${BS}_LR${LR}_HS${HS}_LEN${LEN}_f${INITIAL_F}_EPOCH${EPOCH}" \
+     --vocab_path vocab_wiki_4k.json \
+     --train_full data \
+     --sparse \
+     --use_frequency
train.py ADDED
@@ -0,0 +1,852 @@
1
+ import argparse
2
+ import logging
3
+ import os
4
+ import random
5
+ import math
6
+ import copy
7
+ import json
8
+ import numpy as np
9
+ import torch
10
+ import torch.nn as nn
11
+ import glob
12
+ from tqdm.auto import tqdm, trange
13
+ from torch.autograd import Variable
14
+ from accelerate import Accelerator, DistributedDataParallelKwargs
15
+ from accelerate.utils import InitProcessGroupKwargs
16
+ from torch.utils.data import IterableDataset, DataLoader, Dataset
17
+ import time
18
+ import torch.distributed as dist
19
+ import gc
20
+ from datetime import timedelta
21
+ from tokenizers import Tokenizer
22
+
23
+ import wandb
24
+
25
+ os.environ["WANDB_WATCH"] = "false"
26
+
27
+
28
+ class BraLM(nn.Module):
29
+ def __init__(self, hidden_size, use_ds=False, zero_freq_edges=None, vocab=None):
30
+ super().__init__()
31
+ self.hidden_size = hidden_size
32
+ self.activation = nn.GELU()
33
+ self.positions = nn.Parameter(torch.ones(1, 512, 1))
34
+ self.device = None
35
+
36
+ # for fsdp
37
+ self._tied_weights_keys = []
38
+
39
+ self.use_ds = use_ds
40
+ self.zero_freq_edges = zero_freq_edges
41
+ self.vocab = vocab
42
+
43
+ def prepare_network(self, vocab):
44
+ # Create index mappings for the flattened structure
45
+ self.weight_indices = {} # Maps (s_idx, t_idx) to parameter index
46
+ self.shared_param_idx = 0
47
+
48
+ # Current index for new parameters
49
+ current_idx = 1
50
+
51
+ # Populate parameters and mappings
52
+ for s_idx, s in enumerate(vocab.edge_dict):
53
+ for t_idx, t in enumerate(vocab.edge_dict[s]):
54
+ if self.zero_freq_edges is not None and t in self.zero_freq_edges[s]:
55
+ # Use shared parameters
56
+ self.weight_indices[(s_idx, t_idx)] = self.shared_param_idx
57
+ else:
58
+ self.weight_indices[(s_idx, t_idx)] = current_idx
59
+ current_idx += 1
60
+
61
+ # Create new parameters
62
+ self.weights = nn.Parameter(torch.randn(current_idx, self.hidden_size, self.hidden_size).uniform_(-0.5, 0.5))
63
+ self.biases = nn.Parameter(torch.randn(current_idx, 1, self.hidden_size).uniform_(-0.5, 0.5))
64
+
65
+ self.node_bias = nn.Parameter(torch.randn(len(vocab.edge_dict), 1, self.hidden_size).uniform_(-0.5, 0.5))
66
+
67
+ def to_device(self, device):
68
+ self.weights.to(device)
69
+ self.biases.to(device)
70
+ self.positions.data = self.positions.data.to(device)
71
+ self.device = device
72
+
73
+ @staticmethod
74
+ def _reshape12(x):
75
+ return x.reshape(-1, x.size(-2), x.size(-1))
76
+
77
+ def get_positional_encoding(self, seq_len, d_model):
78
+ position = torch.arange(0, seq_len).reshape(-1, 1)
79
+ div_term = 10000.0 ** (torch.arange(0, d_model, 2) / d_model)
80
+ position_encoding = torch.zeros(seq_len, d_model)
81
+ position_encoding[:, 0::2] = torch.sin(position * div_term)
82
+ position_encoding[:, 1::2] = torch.cos(position * div_term)
83
+ return position_encoding.unsqueeze(0).to(self.device)
84
+
85
+ # def get_initial_tensor(self, batch_size, max_norm=1.0):
86
+ # # initialize energy_tensor
87
+ # energy_tensor = torch.zeros(batch_size, 1, self.hidden_size).normal_(0, 1).to(self.device)
88
+ # delta_norm = torch.norm(energy_tensor.view(energy_tensor.shape[0], -1), dim=-1, p="fro").detach()
89
+ # clip_mask = (delta_norm > max_norm).to(energy_tensor)
90
+ # clip_weights = max_norm / delta_norm * clip_mask + (1 - clip_mask)
91
+ # energy_tensor = (energy_tensor * clip_weights.view(-1, 1, 1)).detach() #(bs, 1, hs)
92
+ # return energy_tensor
93
+
94
+ def get_initial_tensor(self, batch_size, d, pe):
95
+ # initialize energy_tensor
96
+ energy_tensor = torch.ones(batch_size, 1, self.hidden_size) / self.hidden_size #(bs, 1, hs)
97
+ energy_tensor = energy_tensor.to(self.device)
98
+
99
+ node_bias = self.node_bias[d[:, 0, 0]]
100
+ energy_tensor = self.activation(energy_tensor + node_bias + Variable(pe[:,0], requires_grad=False))
101
+ return energy_tensor
102
+
103
+
104
+ def forward(self, neighbor_ids):
105
+ # neighbor_ids: (bs, sen_len, 1+k, 2) ; k is the number of negative samples
106
+ batch_size = neighbor_ids.size(0)
107
+ loss = 0
108
+
109
+ pe = self.get_positional_encoding(512, self.hidden_size) #(1, 512, hs)
110
+
111
+ for i in range(neighbor_ids.size(1)):
112
+ d = neighbor_ids[:, i] #(bs, 1+k, 2)
113
+
114
+ if i == 0:
115
+ # for the first token, initialize energy_tensor as an all-one tensor
116
+ energy_tensor = self.get_initial_tensor(batch_size, d, pe) #(bs, 1, hs)
117
+ else:
118
+ energy_tensor = (energy_cache * self.positions[:, :i, :].softmax(1)).sum(1, keepdim=True) #(bs, 1, hs) :fix dim bug
119
+
120
+ # Vectorized parameter lookup
121
+ src_idx = d[..., 0] # (bs, 1+k)
122
+ tgt_idx = d[..., 1] # (bs, 1+k)
123
+ param_indices = torch.tensor([self.weight_indices.get((s.item(), t.item()), self.shared_param_idx)
124
+ for s, t in zip(src_idx.reshape(-1), tgt_idx.reshape(-1))],
125
+ device=self.device).reshape(batch_size, -1) # (bs, 1+k)
126
+
127
+ # Batch gather operation
128
+ w = self.weights[param_indices] # (bs, 1+k, hidden_size, hidden_size)
129
+ b = self.biases[param_indices] # (bs, 1+k, 1, hidden_size)
130
+
131
+ expand_energy_tensor = self._reshape12(energy_tensor.unsqueeze(1).repeat(1, w.size(1), 1, 1)) #(bs*(1+k), 1, hs)
132
+ # for deepspeed fp16: expand_energy_tensor.half()
133
+ if self.use_ds:
134
+ expand_energy_tensor = expand_energy_tensor.half()
135
+ nxt_energy_tensor = self.activation(expand_energy_tensor.bmm(self._reshape12(w))+self._reshape12(b)+Variable(pe[:,i+1], requires_grad=False)) #(bs*(1+k), 1, hs)
136
+ output_tensor = nxt_energy_tensor.reshape(batch_size, -1, nxt_energy_tensor.size(-2), nxt_energy_tensor.size(-1)) #(bs, 1+k, 1, hs)
137
+
138
+ if i == 0:
139
+ energy_cache = output_tensor[:,0] #(bs, 1, hs)
140
+ else:
141
+ energy_cache = torch.cat([energy_cache, output_tensor[:,0]], dim=1) #(bs, i+1, hs)
142
+
143
+ if 1:
144
+ energy = output_tensor.norm(2, (-2, -1))
145
+ label = torch.LongTensor([0 for _ in range(batch_size)]).to(self.device)
146
+ loss += nn.CrossEntropyLoss()(energy, label)
147
+
148
+ return loss / neighbor_ids.size(1)
149
+
150
+ def decode(self, start, vocab, max_new_tokens=16, do_sample=False, temperature=1):
151
+ ret = []
152
+ pe = self.get_positional_encoding(512, self.hidden_size)
153
+
154
+ for i, pair in enumerate(start):
155
+ if i == 0:
156
+ energy_tensor = self.get_initial_tensor(batch_size=1, d=torch.tensor([[pair]], device=self.device), pe=pe).squeeze(0)
157
+ else:
158
+ energy_tensor = (energy_cache * self.positions[:, :i, :].softmax(1)).sum(1, keepdim=True).squeeze(0)
159
+
160
+ # Get parameter index for this edge
161
+ param_idx = self.weight_indices.get((pair[0], pair[1]), self.shared_param_idx)
162
+
163
+ # Get weights and biases using parameter index
164
+ w = self.weights[param_idx].to(self.device)
165
+ b = self.biases[param_idx].to(self.device)
166
+
167
+ energy_tensor = self.activation(energy_tensor.mm(w) + b + pe.squeeze(0)[i])
168
+ if i == 0:
169
+ energy_cache = energy_tensor.unsqueeze(0) # Add batch dimension
170
+ else:
171
+ energy_cache = torch.cat([energy_cache, energy_tensor.unsqueeze(0)], dim=1)
172
+ ret += [pair]
173
+
174
+ x = pair[1]
175
+ prev_i = len(start)
176
+
177
+ for i in range(max_new_tokens):
178
+ candidates = vocab(vocab.get_neighbor_of_node(x, -1))
179
+
180
+ # Get parameter indices for all candidates
181
+ param_indices = torch.tensor([self.weight_indices.get((x, t[1]), self.shared_param_idx)
182
+ for t in candidates], device=self.device)
183
+
184
+ # Get weights and biases for all candidates
185
+ all_w = self.weights[param_indices].to(self.device)
186
+ all_b = self.biases[param_indices].to(self.device)
187
+
188
+ curr_i = prev_i + i
189
+ energy_tensor = (energy_cache * self.positions[:, :curr_i, :].softmax(1)).sum(1, keepdim=True)
190
+ expand_energy_tensor = energy_tensor.unsqueeze(1).repeat(1, all_w.size(0), 1, 1)
191
+ expand_energy_tensor = self._reshape12(expand_energy_tensor)
192
+
193
+ nxt_energy_tensor = self.activation(expand_energy_tensor.bmm(self._reshape12(all_w)) + self._reshape12(all_b) + pe[:,curr_i].unsqueeze(0))
194
+ output_tensor = nxt_energy_tensor.reshape(1, -1, nxt_energy_tensor.size(-2), nxt_energy_tensor.size(-1))
195
+
196
+ energy = output_tensor.norm(2, (-2,-1)).squeeze()
197
+
198
+ probs = torch.softmax(energy, dim=-1)
199
+ if temperature > 0:
200
+ probs = probs / temperature
201
+ if do_sample:
202
+ index = torch.multinomial(probs, 1).item()
203
+ else:
204
+ index = probs.argmax(-1).item()
205
+
206
+ y = candidates[index][-1]
207
+ ret += [(x, y)]
208
+
209
+ energy_tensor = output_tensor[0, index]
210
+ x = y
211
+
212
+ energy_cache = torch.cat([energy_cache, energy_tensor.unsqueeze(0)], dim=1)
213
+
214
+ return ret
215
+
216
+
217
+ class Vocab:
218
+ def __init__(self, node_dict, nodeindex_dict, edge_dict, edge_decode_dict):
219
+ self.node_dict = node_dict #{'node_p': index_p} ---- size: num_nodes
220
+ self.nodeindex_dict = nodeindex_dict #{index_p: 'node_p'} ---- size: num_nodes
221
+ self.edge_dict = edge_dict #{'node_p': {'node_q': (index_p, index_q), 'node_m': (index_p, index_m)},...} ---- size: num_nodes
222
+ self.edge_decode_dict = edge_decode_dict #{(index_p, index_q): 'node_p->node_q'} ---- size: num_nodes*num_nodes
223
+
224
+ def __call__(self, x):
225
+ if isinstance(x, list):
226
+ return [self.__call__(_) for _ in x]
227
+ else:
228
+ return self.fetch(x)
229
+
230
+ def fetch(self, x):
231
+ s, t = x.split("->")
232
+ return self.edge_dict[s][t] if s in self.edge_dict and t in self.edge_dict[s] else self.edge_dict[""][""]
233
+
234
+ @classmethod
235
+ def from_node_dict(cls, dictname):
236
+ node_dict = dict()
237
+ nodeindex_dict = dict()
238
+ edge_dict = dict()
239
+ edge_decode_dict = dict()
240
+ for s in dictname:
241
+ node_dict[s] = dictname[s]
242
+ nodeindex_dict[dictname[s]] = s # nodeindex_dict: {index_p: 'node_p'}
243
+ edge_dict[s] = {} # edge_dict: {'node_p': {'node_q': (index_p, index_q), 'node_m': (index_p, index_m)}}
244
+ for t in dictname:
245
+ edge_dict[s][t] = (dictname[s], dictname[t])
246
+ edge_decode_dict[(dictname[s], dictname[t])] = "->".join([s, t])
247
+ return cls(node_dict, nodeindex_dict, edge_dict, edge_decode_dict)
248
+
249
+ @classmethod
250
+ def from_edge(cls, filename):
251
+ edge_dict = dict()
252
+ edge_dict[""] = {}
253
+ edge_dict[""][""] = (0, 0)
254
+ edge_decode_dict = dict()
255
+ with open(filename) as f:
256
+ for line in f:
257
+ # line: node_p->node_q
258
+ s, t = line.strip().split("->")
259
+ if s not in edge_dict:
260
+ i = len(edge_dict)
261
+ j = 0
262
+ edge_dict[s] = dict()
263
+ else:
264
+ i = edge_dict[s][list(edge_dict[s].keys())[0]][0]
265
+ j = len(edge_dict[s])
266
+ edge_dict[s][t] = (i, j)
267
+ edge_decode_dict[(i, j)] = "->".join([s, t])
268
+ return cls(None, edge_dict, edge_decode_dict)
269
+
270
+ def get_neighbor_of_edge(self, key, k, frequency_dict=None):
271
+ s, t = key.split("->") # s, t: node
272
+ _s = s if s in self.edge_dict else ""
273
+
274
+ # if s in self.edge_dict:
275
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != t]
276
+ # else:
277
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[""].keys() if _t != t]
278
+ # ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != t]
279
+
280
+ # select by word_frequency
281
+ if frequency_dict:
282
+ frequency_lst = list(frequency_dict[_s].keys())
283
+ # index = frequency_lst.index(t)
284
+ # half = k // 2
285
+ # if index <= k:
286
+ # t_lst = [x for i, x in enumerate(frequency_lst[:k+1]) if i != index]
287
+ # else:
288
+ # t_lst = frequency_lst[:half] + frequency_lst[index-half:index]
289
+ t_lst = [x for i, x in enumerate(frequency_lst[:k+1]) if x != t][:k]
290
+ ret = ["->".join([_s, _t]) for _t in t_lst]
291
+ random.shuffle(ret)
292
+ return ret
293
+ # randomly select k negative samples
294
+ else:
295
+ ret = ["->".join([_s, _t]) for _t in self.edge_dict[_s].keys() if _t != t]
296
+ random.shuffle(ret)
297
+ return ret[:k] if k != -1 else ret
298
+
299
+ def get_neighbor_of_node(self, key, k):
300
+ #key :index
301
+ s = self.nodeindex_dict[key] #node
302
+ #_t: node
303
+ ret = ["->".join([s, _t]) for _t in self.edge_dict[s].keys() if _t != s]
304
+
305
+ # randomly select k negative samples
306
+ random.shuffle(ret)
307
+ return ret[:k] if k != -1 else ret
308
+
309
+ def get_neighbor_of_edge_broadcast(self, key, edges, k=100):
310
+ s, t = key.split("->")
311
+ _ret = [_t for _t in self.edge_dict[s].keys() if _t != t] # all neighbors of s except t
312
+ random.shuffle(_ret)
313
+ ret = []
314
+ for edge in edges:
315
+ s, t = edge.split("->")
316
+ ret += [["->".join([s, _t]) for _t in _ret[:k]]]
317
+ return ret
318
+
319
+ @staticmethod
320
+ def to_path(tokens):
321
+ path = []
322
+ for left, right in zip(tokens[:-1], tokens[1:]):
323
+ path.append("->".join([left, right]))
324
+ return path
325
+
326
+ def get_edge_of_node(self, key):
327
+ return list(self.edge_dict[key].values())
328
+
329
+ def decode(self, x):
330
+ return self.edge_decode_dict[x]
331
+
332
+
333
+ logging.basicConfig(format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
334
+ datefmt="%m/%d/%Y %H:%M:%S",
335
+ level=logging.INFO)
336
+ logger = logging.getLogger(__name__)
337
+
338
+
339
+ def stdf(string):
340
+ def _h(char):
341
+ inside_code = ord(char)
342
+ if inside_code == 0x3000:
343
+ inside_code = 0x0020
344
+ else:
345
+ inside_code -= 0xfee0
346
+ if inside_code < 0x0020 or inside_code > 0x7e:
347
+ return char
348
+ return chr(inside_code)
349
+
350
+ return "".join([_h(char) for char in string])
351
+
352
+
353
+ class WikiDataset(Dataset):
354
+ """
355
+ Processor for wiki data.
356
+ """
357
+ def __init__(self, filename, vocab, max_seq_length, num_neg_samples, seed, buffer_size=100000, shuffle=True, use_frequency=False, use_bpe=False, bpe_tokenizer=None):
358
+ super().__init__()
359
+ self.vocab = vocab
360
+ self.max_seq_length = max_seq_length
361
+ self.num_neg_samples = num_neg_samples
362
+ self.generator = np.random.default_rng(seed=seed)
363
+ self.use_bpe = use_bpe
364
+ self.bpe_tokenizer = bpe_tokenizer
365
+
366
+ self.data = self.read(filename)
367
+
368
+ if use_frequency:
369
+ freq_file = 'word_frequency_en.json' if use_bpe else 'word_frequency.json'
370
+ with open(freq_file, 'r') as f:
371
+ self.frequency_dict = json.load(f)
372
+ else:
373
+ self.frequency_dict = None
374
+
375
+ def read(self, filename):
376
+ lines = []
377
+ with open(filename, "r", encoding="utf-8") as f:
378
+ for line in f:
379
+ if self.use_bpe:
380
+ lines.append(line.strip())
381
+ else:
382
+ src = list(line.strip()[:self.max_seq_length])
383
+ lines.append(src)
384
+ return lines
385
+
386
+ def __len__(self):
387
+ return len(self.data)
388
+
389
+ def __getitem__(self, idx):
390
+ src = self.data[idx]
391
+ return self.vectorize(src)
392
+
393
+ def vectorize(self, src):
394
+ if self.use_bpe:
395
+ # For English with BPE
396
+ bpe_tokens = self.bpe_tokenizer.encode(src).tokens
397
+ # Truncate/pad
398
+ pad_token = "[PAD]"
399
+ if len(bpe_tokens) > self.max_seq_length:
400
+ bpe_tokens = bpe_tokens[:self.max_seq_length]
401
+ else:
402
+ bpe_tokens.extend(pad_token for _ in range(self.max_seq_length - len(bpe_tokens)))
403
+ tokens = bpe_tokens
404
+ else:
405
+ # For Chinese without BPE
406
+ if len(src) > self.max_seq_length:
407
+ src = src[:self.max_seq_length]
408
+ else:
409
+ src.extend("" for _ in range(self.max_seq_length-len(src)))
410
+ tokens = src
411
+
412
+ edges = self.vocab.to_path(tokens)
413
+ edge_ids = self.vocab(edges)
414
+ edge_ids = edge_ids[:self.max_seq_length]
415
+ neighbor_ids = [self.vocab(self.vocab.get_neighbor_of_edge(e, self.num_neg_samples, self.frequency_dict)) for e in edges]
416
+
417
+ new_neighbor_ids = []
418
+ for i, e_ids in enumerate(edge_ids):
419
+ new_neighbor_ids.append([e_ids] + neighbor_ids[i])
420
+ return torch.LongTensor(new_neighbor_ids)
421
+
422
+
423
+ def main():
424
+ parser = argparse.ArgumentParser()
425
+
426
+ # Data config
427
+ parser.add_argument("--data_dir", type=str, default="data/wiki",
428
+ help="Directory to contain the input data for all tasks.")
429
+ parser.add_argument("--output_dir", type=str, default="model/",
430
+ help="Directory to output predictions and checkpoints.")
431
+ parser.add_argument("--load_state_dict", type=str, default=None,
432
+ help="Trained model weights to load for evaluation if needed.")
433
+
434
+ # Training config
435
+ parser.add_argument("--do_train", action="store_true",
436
+ help="Whether to run training.")
437
+ parser.add_argument("--do_eval", action="store_true",
438
+ help="Whether to evaluate on the dev set.")
439
+ parser.add_argument("--num_neg_samples", type=int, default=100,
440
+ help="Number of negative samples.")
441
+ parser.add_argument("--max_seq_length", type=int, default=128,
442
+ help="Maximum total input sequence length after word-piece tokenization.")
443
+ parser.add_argument("--train_batch_size", type=int, default=128,
444
+ help="Total batch size for training.")
445
+ parser.add_argument("--eval_batch_size", type=int, default=128,
446
+ help="Total batch size for evaluation.")
447
+ parser.add_argument("--learning_rate", type=float, default=5e-5,
448
+ help="Initial learning rate for Adam.")
449
+ parser.add_argument("--num_train_epochs", type=float, default=3.0,
450
+ help="Total number of training epochs to perform.")
451
+ parser.add_argument("--max_train_steps", type=int, default=None,
452
+ help="Total number of training steps to perform. If provided, overrides training epochs.")
453
+ parser.add_argument("--weight_decay", type=float, default=0.,
454
+ help="L2 weight decay for training.")
455
+ parser.add_argument("--gradient_accumulation_steps", type=int, default=1,
456
+ help="Number of updates steps to accumulate before performing a backward pass.")
457
+ parser.add_argument("--no_cuda", action="store_true",
458
+ help="Whether not to use CUDA when available.")
459
+ parser.add_argument("--fp16", action="store_true",
460
+ help="Whether to use mixed precision.")
461
+ parser.add_argument("--seed", type=int, default=42,
462
+ help="Random seed for initialization.")
463
+ parser.add_argument("--save_steps", type=int, default=500,
464
+ help="How many steps to save the checkpoint once.")
465
+ parser.add_argument("--hidden_size", type=int, default=32,
466
+ help="Mask rate for masked-fine-tuning.")
467
+ parser.add_argument("--local_rank", type=int)
468
+ parser.add_argument("--initial_file_number", type=int, default=0,
469
+ help="From which file to begin training.")
470
+ parser.add_argument("--end_file_number", type=int, default=0,
471
+ help="End file number for training.")
472
+ parser.add_argument("--wiki_sorted_size", type=int, default=70,
473
+ help="Total file numbers for sorted wikidata.")
474
+ parser.add_argument("--run_name", type=str, default="plusb_pluspe_order",
475
+ help="Run name for wandb.")
476
+
477
+ parser.add_argument("--use_frequency", action="store_true",
478
+ help="Whether to use word frequency.")
479
+ parser.add_argument("--train_full", type=str, default=None,
480
+ help="Path to train on full text.")
481
+ parser.add_argument("--checkpoint_save_step", type=int, default=0,
482
+ help="Interval to save checkpoint.(Only support when train_full is True)")
483
+ parser.add_argument("--resume_from_checkpoint", type=str, default=None,
484
+ help="Path to checkpoint to resume training from")
485
+ parser.add_argument("--num_workers", type=int, default=8,
486
+ help="Number of workers for data loading.")
487
+ parser.add_argument("--vocab_path", type=str, default="vocab_wiki_4k.json",
488
+ help="Path to vocab file.")
489
+ parser.add_argument("--use_ds", action="store_true",
490
+ help="Whether to use deepspeed.")
491
+ parser.add_argument("--sparse", action="store_true",
492
+ help="Whether to use sparse.")
493
+ parser.add_argument("--use_bpe", action="store_true",
494
+ help="Whether to use BPE tokenizer for English.")
495
+ parser.add_argument("--bpe_tokenizer_path", type=str, default="wiki_bpe_tokenizer_4000_bytelevel.json",
496
+ help="Path to BPE tokenizer file.")
497
+
498
+ args = parser.parse_args()
499
+
500
+
501
+ device = torch.device("cuda" if torch.cuda.is_available() and not args.no_cuda else "cpu")
502
+ n_gpu = torch.cuda.device_count()
503
+ logger.info("device: {}, n_gpu: {}, distributed training: {}, 16-bits training: {}".format(
504
+ device, n_gpu, "-accelerate", args.fp16))
505
+
506
+ args.train_batch_size = args.train_batch_size // args.gradient_accumulation_steps
507
+
508
+ random.seed(args.seed)
509
+ np.random.seed(args.seed)
510
+ torch.manual_seed(args.seed)
511
+ if n_gpu > 0:
512
+ torch.cuda.manual_seed_all(args.seed)
513
+
514
+ # if not os.path.exists(args.output_dir):
515
+ # os.makedirs(args.output_dir)
516
+
517
+ with open(args.vocab_path) as f:
518
+ node_dict = json.load(f)
519
+ vocab = Vocab.from_node_dict(node_dict)
520
+
521
+ if args.sparse:
522
+ with open('word_frequency.json', 'r') as f:
523
+ freq_dict = json.load(f)
524
+
525
+ zero_freq_edges = {}
526
+ for s in freq_dict:
527
+ zero_freq_edges[s] = []
528
+ for t in freq_dict[s]:
529
+ if freq_dict[s][t] == 0:
530
+ zero_freq_edges[s].append(t)
531
+ else:
532
+ zero_freq_edges = None
533
+
534
+
535
+ def stat_cuda(epoch, cur_file_num, step, location):
536
+ if accelerator.is_local_main_process:
537
+ with open("cuda_stat.txt", "a") as f:
538
+ if epoch is not None:
539
+ f.write('epoch: %d, cur_file_num: %d, step: %d\n' % (epoch, cur_file_num, step))
540
+ f.write(f'--{location}\n')
541
+ f.write('allocated: %dG, max allocated: %dG, cached: %dG, max cached: %dG\n' % (
542
+ torch.cuda.memory_allocated() / 1024 / 1024 / 1024,
543
+ torch.cuda.max_memory_allocated() / 1024 / 1024 / 1024,
544
+ torch.cuda.memory_reserved() / 1024 / 1024 / 1024,
545
+ torch.cuda.max_memory_reserved() / 1024 / 1024 / 1024
546
+ ))
547
+
548
+ if args.do_train:
549
+ # training arguments
550
+ os.environ["NCCL_DEBUG"] = "WARN"
551
+ os.environ["TORCH_NCCL_BLOCKING_WAIT"] = "1"
552
+ ddp_kwargs = DistributedDataParallelKwargs(find_unused_parameters=True)
553
+ init_kwargs = InitProcessGroupKwargs(timeout=timedelta(seconds=1080000))
554
+ accelerator = Accelerator(kwargs_handlers=[ddp_kwargs, init_kwargs], cpu=args.no_cuda, mixed_precision="fp16" if args.fp16 else "no")
555
+ device = accelerator.device
556
+
557
+ # prepare model
558
+ model = BraLM(args.hidden_size, args.use_ds, zero_freq_edges, vocab=vocab)
559
+ model.prepare_network(vocab)
560
+ # model.shared_weight.requires_grad = False
561
+ # model.shared_bias.requires_grad = False
562
+
563
+ # load model from checkpoint
564
+ if args.load_state_dict:
565
+ print(f"Loading model from checkpoint: {args.load_state_dict}")
566
+ checkpoint = torch.load(args.load_state_dict, map_location="cpu")
567
+ #model.load_state_dict(checkpoint["model_state_dict"])
568
+ model.load_old(checkpoint["model_state_dict"])
569
+
570
+
571
+ # Load checkpoint if specified
572
+ wandb_id = None
573
+ global_step = 0
574
+ if args.resume_from_checkpoint:
575
+ print(f"Resuming from checkpoint: {args.resume_from_checkpoint}")
576
+ checkpoint = torch.load(args.resume_from_checkpoint, map_location="cpu")
577
+ model.load_state_dict(checkpoint["model_state_dict"])
578
+ #optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
579
+ start_epoch = checkpoint["epoch"] # + 1
580
+ global_step = checkpoint.get("global_step", 0) # Get saved global step
581
+ wandb_id = checkpoint.get("wandb_id")
582
+ else:
583
+ start_epoch = 0
584
+
585
+ # if accelerator.is_local_main_process:
586
+ # for name, param in model.named_parameters():
587
+ # print(name)
588
+
589
+ model.to_device(device)
590
+
591
+
592
+ if accelerator.is_local_main_process:
593
+ print(f"start_epoch: {start_epoch}, global_step: {global_step}")
594
+
595
+ # prepare optimizer
596
+ no_decay = ["bias", "LayerNorm.bias", "LayerNorm.weight"]
597
+ optimizer_grouped_parameters = [
598
+ {
599
+ "params": [p for n, p in model.named_parameters() if not any(nd in n for nd in no_decay)],
600
+ "weight_decay": args.weight_decay
601
+ },
602
+ {
603
+ "params": [p for n, p in model.named_parameters() if any(nd in n for nd in no_decay)],
604
+ "weight_decay": 0.0
605
+ }
606
+ ]
607
+ optimizer = torch.optim.AdamW(optimizer_grouped_parameters, lr=args.learning_rate)
608
+
609
+ if args.resume_from_checkpoint:
610
+ optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
611
+
612
+ if accelerator.is_local_main_process:
613
+ print(f"before prepare")
614
+ #input('-' * 10)
615
+ #stat_cuda(None, None, None, "before prepare")
616
+ #print(f"{accelerator.device}, model: {model.weights.device}, tensor: {model.tensor.device}, pe: {model.positions.device}")
617
+ if not args.use_ds:
618
+ model, optimizer = accelerator.prepare(model, optimizer) # for deepspeed: # this line
619
+ #stat_cuda(None, None, None, "after prepare")
620
+ #print(f"{accelerator.device}, model: {model.module.weights.device}, tensor: {model.module.tensor.device}, pe: {model.module.positions.device}")
621
+ if accelerator.is_local_main_process:
622
+ print(f"after prepare")
623
+
624
+ if args.do_train:
625
+
626
+ if accelerator.is_local_main_process:
627
+ # init wandb
628
+ wandb.init(
629
+ project="brain",
630
+ name=args.run_name,
631
+ id=wandb_id, # 如果有之前的run_id,使用它;否则会创建新的
632
+ resume="allow", # "allow"表示如果有id就恢复,没有就创建新的
633
+ config=vars(args)
634
+ )
635
+ wandb.define_metric("custom_step")
636
+ wandb.define_metric("batch_*", step_metric="custom_step")
637
+ wandb.define_metric("epoch")
638
+ wandb.define_metric("epoch_*", step_metric="epoch")
639
+ print(f"Started wandb run with id: {wandb.run.id}")
640
+ print(f"View run at: {wandb.run.get_url()}")
641
+
642
+ if args.train_full:
643
+ cur_file_num = args.train_full
644
+ cur_filename = f"{cur_file_num}.txt"
645
+ if args.use_bpe:
646
+ with open(args.bpe_tokenizer_path, 'r') as f:
647
+ bpe_tokenizer = json.load(f)
648
+ else:
649
+ bpe_tokenizer = None
650
+ dataset = WikiDataset(
651
+ os.path.join(args.data_dir, cur_filename),
652
+ vocab,
653
+ args.max_seq_length,
654
+ args.num_neg_samples,
655
+ seed=args.seed,
656
+ shuffle=True,
657
+ use_frequency=args.use_frequency,
658
+ use_bpe=args.use_bpe,
659
+ bpe_tokenizer=bpe_tokenizer
660
+ )
661
+ train_dataloader = DataLoader(dataset, batch_size=args.train_batch_size, num_workers=args.num_workers, pin_memory=True)
662
+ train_dataloader = accelerator.prepare(train_dataloader)
663
+ elif args.resume_from_checkpoint:
664
+ cur_file_num = checkpoint["cur_file_num"]
665
+ if isinstance(cur_file_num, int) or cur_file_num.isdigit():
666
+ cur_file_num = int(cur_file_num) + 1
667
+ #start_epoch = start_epoch - 1
668
+ else:
669
+ cur_file_num = args.initial_file_number
670
+
671
+ if args.resume_from_checkpoint and global_step > 0:
672
+ if args.train_full and global_step % len(train_dataloader) == 0:
673
+ start_epoch = start_epoch + 1
674
+ if not args.train_full and cur_file_num > args.end_file_number:
675
+ start_epoch = start_epoch + 1
676
+ cur_file_num = args.initial_file_number
677
+
678
+
679
+ for epoch in trange(start_epoch, int(args.num_train_epochs), desc="Epoch"):
680
+ # traverse all wiki files
681
+ if epoch != start_epoch or args.train_full:
682
+ cur_file_num = args.initial_file_number
683
+ while cur_file_num <= args.wiki_sorted_size:
684
+ if args.train_full:
685
+ cur_file_num = args.train_full
686
+ logger.info("***** Running training for wiki = %s *****", cur_file_num)
687
+ logger.info(" Batch size = %d", args.train_batch_size * accelerator.num_processes)
688
+
689
+ # prepare data
690
+ if not args.train_full:
691
+ cur_filename = f"{cur_file_num}.txt"
692
+ if args.use_bpe:
693
+ with open(args.bpe_tokenizer_path, 'r') as f:
694
+ bpe_tokenizer = json.load(f)
695
+ else:
696
+ bpe_tokenizer = None
697
+ dataset = WikiDataset(
698
+ os.path.join(args.data_dir, cur_filename),
699
+ vocab,
700
+ args.max_seq_length,
701
+ args.num_neg_samples,
702
+ seed=args.seed,
703
+ shuffle=True,
704
+ use_frequency=args.use_frequency,
705
+ use_bpe=args.use_bpe,
706
+ bpe_tokenizer=bpe_tokenizer
707
+ )
708
+ train_dataloader = DataLoader(dataset, batch_size=args.train_batch_size, num_workers=args.num_workers, pin_memory=True)
709
+ if not args.use_ds:
710
+ train_dataloader = accelerator.prepare(train_dataloader)
711
+ else:
712
+ model, optimizer, train_dataloader = accelerator.prepare(model, optimizer, train_dataloader) # for deepspeed
713
+
714
+
715
+ # training
716
+ train_loss = 0
717
+ num_train_examples = 0
718
+ if accelerator.is_local_main_process:
719
+ progress_bar = tqdm(train_dataloader, desc="Iteration")
720
+ # start_time = time.time()
721
+
722
+ #for _ in range(3):
723
+ for step, batch in enumerate(train_dataloader, start=global_step % len(train_dataloader)):
724
+ # batch: (bs, sen_len, 1+k, 2)
725
+ batch_train_loss = 0
726
+ batch_num_train_examples = 0
727
+ #for ind in range(2, batch.size(1)):
728
+ for ind in range(batch.size(1) - 1, batch.size(1)): # fix: only use the sen_len-1
729
+ # ind: 2, 3, ..., sen_len-1
730
+ # if accelerator.is_local_main_process:
731
+ # end_time = time.time()
732
+ # step_time = end_time - start_time
733
+ # logger.info(f"Step training time: {step_time:.2f} seconds")
734
+
735
+ model.train()
736
+ neighbor_ids = batch[:, :ind] #(bs, ind, 1+k, 2)
737
+
738
+ #stat_cuda(epoch, cur_file_num, global_step, "before forward")
739
+ outputs = model(neighbor_ids)
740
+ loss = outputs
741
+
742
+ # if n_gpu > 1:
743
+ # loss = loss.mean()
744
+
745
+ if args.gradient_accumulation_steps > 1:
746
+ loss = loss / args.gradient_accumulation_steps
747
+ accelerator.backward(loss)
748
+
749
+ if n_gpu > 1:
750
+ dist.all_reduce(loss)
751
+ loss = loss / dist.get_world_size()
752
+
753
+ train_loss += loss.detach().item()
754
+ batch_train_loss += loss.detach().item()
755
+
756
+ num_train_examples += 1
757
+ batch_num_train_examples += 1
758
+
759
+ del outputs
760
+ del loss
761
+ del neighbor_ids
762
+ gc.collect()
763
+ # if step % 5 == 0:
764
+ # torch.cuda.empty_cache()
765
+
766
+ if (step + 1) % args.gradient_accumulation_steps == 0:
767
+ optimizer.step()
768
+ optimizer.zero_grad()
769
+
770
+ ## modified
771
+
772
+ ppl = math.exp(batch_train_loss / batch_num_train_examples)
773
+
774
+ if accelerator.is_local_main_process:
775
+ progress_bar.update(1)
776
+ progress_bar.set_postfix(loss=batch_train_loss / batch_num_train_examples, perplexity=ppl)
777
+
778
+ wandb.log({
779
+ "batch_loss": batch_train_loss / batch_num_train_examples,
780
+ "batch_perplexity": math.exp(batch_train_loss / batch_num_train_examples),
781
+ "batch_epoch": epoch,
782
+ #"step": global_step,
783
+ "custom_step": global_step
784
+ })#, step=global_step)
785
+
786
+ global_step += 1
787
+
788
+ # Save a checkpoint every checkpoint_save_step steps
789
+ if accelerator.is_local_main_process and args.checkpoint_save_step > 0 and global_step % args.checkpoint_save_step == 0:
790
+ output_dir_f = f"{args.output_dir}/HS{args.hidden_size}/step_{global_step}/"
791
+ if not os.path.exists(output_dir_f):
792
+ os.makedirs(output_dir_f)
793
+ output_model_file = os.path.join(output_dir_f, f"checkpoint_{global_step}.bin")
794
+
795
+ model_to_save = model.module if hasattr(model, "module") else model
796
+ checkpoint = {
797
+ "model_state_dict": model_to_save.state_dict(),
798
+ "optimizer_state_dict": optimizer.state_dict(),
799
+ "epoch": epoch,
800
+ "global_step": global_step,
801
+ "args": vars(args),
802
+ "wandb_id": wandb.run.id
803
+ }
804
+ if not args.train_full:
805
+ checkpoint["cur_file_num"] = cur_file_num
806
+ print(f"Saving checkpoint to {output_model_file}")
807
+ torch.save(checkpoint, output_model_file)
808
+ print(f"Checkpoint saved to {output_model_file}")
809
+
810
+
811
+ # save model for current training file
812
+ if accelerator.is_local_main_process:
813
+ epoch_avg_loss = train_loss / num_train_examples
814
+ epoch_ppl = math.exp(epoch_avg_loss)
815
+ wandb.log({
816
+ "epoch_loss": epoch_avg_loss,
817
+ "epoch_perplexity": epoch_ppl,
818
+ "epoch": epoch,
819
+ })#, step=global_step)
820
+
821
+ model_to_save = model.module if hasattr(model, "module") else model
822
+ output_dir_f = f"{args.output_dir}/HS{args.hidden_size}/EPOCH{epoch}/"
823
+ if not os.path.exists(output_dir_f):
824
+ os.makedirs(output_dir_f)
825
+ output_model_file = os.path.join(output_dir_f, "f{}_pytorch_model.bin".format(cur_file_num))
826
+ # only save the last model
827
+ if args.train_full or cur_file_num == args.end_file_number:
828
+ #torch.save(model_to_save.state_dict(), output_model_file)
829
+ checkpoint = {
830
+ "model_state_dict": model_to_save.state_dict(),
831
+ "optimizer_state_dict": optimizer.state_dict(),
832
+ "epoch": epoch,
833
+ "global_step": global_step, # Save global step
834
+ "args": vars(args),
835
+ "wandb_id": wandb.run.id # 保存当前运行的wandb_id
836
+ }
837
+ if not args.train_full:
838
+ checkpoint["cur_file_num"] = cur_file_num
839
+ print(f"Saving model to {output_model_file}")
840
+ torch.save(checkpoint, output_model_file)
841
+ print(f"Model saved to {output_model_file}")
842
+
843
+ if args.train_full:
844
+ break
845
+ cur_file_num += 1
846
+ if cur_file_num > args.end_file_number:
847
+ break
848
+
849
+
850
+
851
+ if __name__ == "__main__":
852
+ main()
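
For reference, each checkpoint written by the loop above is a single `torch.save` dict holding the model and optimizer state plus the bookkeeping used when resuming (`epoch`, `global_step`, `args`, `wandb_id`, and `cur_file_num` when training file-by-file). Below is a minimal sketch of reading one back; the path in the docstring (including the `HS32` folder) is only illustrative, and `model` / `optimizer` are assumed to be constructed the same way as in `train.py`.

```python
import torch

def load_checkpoint(ckpt_path, model, optimizer):
    """Minimal sketch: restore state saved by the training loop above.

    `model` and `optimizer` are assumed to be built exactly as in train.py;
    `ckpt_path` is a hypothetical path such as
    results/HS32/step_1000/checkpoint_1000.bin.
    """
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(checkpoint["model_state_dict"])
    optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
    return {
        "start_epoch": checkpoint["epoch"],
        "global_step": checkpoint["global_step"],
        "saved_args": checkpoint["args"],               # vars(args) at save time
        "wandb_id": checkpoint["wandb_id"],             # id of the original W&B run
        "cur_file_num": checkpoint.get("cur_file_num"), # absent when --train_full
    }
```

The stored `wandb_id` is presumably what lets logging continue in the same run after a restart, e.g. via `wandb.init(id=wandb_id, resume="allow")`.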
vocab_wiki_4k.json ADDED
@@ -0,0 +1,4003 @@
1
+ {
2
+ "<unk>": 0,
3
+ ",": 1,
4
+ "的": 2,
5
+ "。": 3,
6
+ "1": 4,
7
+ "年": 5,
8
+ "、": 6,
9
+ "在": 7,
10
+ "0": 8,
11
+ "一": 9,
12
+ "为": 10,
13
+ "中": 11,
14
+ "国": 12,
15
+ "2": 13,
16
+ "9": 14,
17
+ "是": 15,
18
+ "人": 16,
19
+ "有": 17,
20
+ "大": 18,
21
+ "于": 19,
22
+ "和": 20,
23
+ "后": 21,
24
+ "了": 22,
25
+ "以": 23,
26
+ "时": 24,
27
+ "月": 25,
28
+ "日": 26,
29
+ "会": 27,
30
+ "不": 28,
31
+ "学": 29,
32
+ "(": 30,
33
+ ")": 31,
34
+ "地": 32,
35
+ "成": 33,
36
+ "他": 34,
37
+ "5": 35,
38
+ "上": 36,
39
+ "3": 37,
40
+ "8": 38,
41
+ "4": 39,
42
+ "出": 40,
43
+ "之": 41,
44
+ "个": 42,
45
+ "军": 43,
46
+ "6": 44,
47
+ "与": 45,
48
+ "到": 46,
49
+ "部": 47,
50
+ "7": 48,
51
+ "行": 49,
52
+ "主": 50,
53
+ "发": 51,
54
+ "民": 52,
55
+ "其": 53,
56
+ "而": 54,
57
+ "生": 55,
58
+ "作": 56,
59
+ "并": 57,
60
+ "斯": 58,
61
+ "这": 59,
62
+ "对": 60,
63
+ "法": 61,
64
+ "公": 62,
65
+ "被": 63,
66
+ "分": 64,
67
+ "家": 65,
68
+ "来": 66,
69
+ "政": 67,
70
+ "及": 68,
71
+ "方": 69,
72
+ "同": 70,
73
+ "第": 71,
74
+ "区": 72,
75
+ "多": 73,
76
+ "用": 74,
77
+ "西": 75,
78
+ "长": 76,
79
+ "自": 77,
80
+ "下": 78,
81
+ "名": 79,
82
+ "开": 80,
83
+ "由": 81,
84
+ "《": 82,
85
+ "》": 83,
86
+ "南": 84,
87
+ "本": 85,
88
+ "也": 86,
89
+ "任": 87,
90
+ "动": 88,
91
+ "「": 89,
92
+ "因": 90,
93
+ "“": 91,
94
+ "此": 92,
95
+ "」": 93,
96
+ "尔": 94,
97
+ "北": 95,
98
+ "可": 96,
99
+ "”": 97,
100
+ "文": 98,
101
+ "经": 99,
102
+ "要": 100,
103
+ "最": 101,
104
+ "子": 102,
105
+ "战": 103,
106
+ "过": 104,
107
+ "前": 105,
108
+ "员": 106,
109
+ "所": 107,
110
+ "进": 108,
111
+ "得": 109,
112
+ "新": 110,
113
+ "当": 111,
114
+ "台": 112,
115
+ "东": 113,
116
+ "事": 114,
117
+ "等": 115,
118
+ "高": 116,
119
+ "建": 117,
120
+ "加": 118,
121
+ "三": 119,
122
+ "立": 120,
123
+ "教": 121,
124
+ "间": 122,
125
+ "但": 123,
126
+ "将": 124,
127
+ "代": 125,
128
+ "市": 126,
129
+ "位": 127,
130
+ "特": 128,
131
+ "现": 129,
132
+ "海": 130,
133
+ "至": 131,
134
+ "入": 132,
135
+ "德": 133,
136
+ "·": 134,
137
+ "内": 135,
138
+ "能": 136,
139
+ "期": 137,
140
+ "工": 138,
141
+ "外": 139,
142
+ "利": 140,
143
+ "世": 141,
144
+ "定": 142,
145
+ "里": 143,
146
+ "道": 144,
147
+ "次": 145,
148
+ "理": 146,
149
+ "天": 147,
150
+ "美": 148,
151
+ "二": 149,
152
+ "机": 150,
153
+ "克": 151,
154
+ "制": 152,
155
+ "者": 153,
156
+ "队": 154,
157
+ "重": 155,
158
+ "路": 156,
159
+ "全": 157,
160
+ "使": 158,
161
+ "十": 159,
162
+ "力": 160,
163
+ "都": 161,
164
+ "面": 162,
165
+ "度": 163,
166
+ "亚": 164,
167
+ "表": 165,
168
+ "山": 166,
169
+ "电": 167,
170
+ "合": 168,
171
+ "共": 169,
172
+ "业": 170,
173
+ "通": 171,
174
+ "物": 172,
175
+ "两": 173,
176
+ "体": 174,
177
+ "场": 175,
178
+ "小": 176,
179
+ ":": 177,
180
+ "化": 178,
181
+ "马": 179,
182
+ "们": 180,
183
+ "王": 181,
184
+ " ": 182,
185
+ "称": 183,
186
+ "设": 184,
187
+ "平": 185,
188
+ "性": 186,
189
+ "相": 187,
190
+ "联": 188,
191
+ "府": 189,
192
+ "院": 190,
193
+ "安": 191,
194
+ "关": 192,
195
+ "赛": 193,
196
+ "拉": 194,
197
+ "正": 195,
198
+ "州": 196,
199
+ "车": 197,
200
+ "明": 198,
201
+ "数": 199,
202
+ "球": 200,
203
+ "种": 201,
204
+ "从": 202,
205
+ "起": 203,
206
+ "华": 204,
207
+ "原": 205,
208
+ "统": 206,
209
+ "总": 207,
210
+ "城": 208,
211
+ "议": 209,
212
+ "然": 210,
213
+ "治": 211,
214
+ "达": 212,
215
+ "系": 213,
216
+ ";": 214,
217
+ "线": 215,
218
+ "就": 216,
219
+ "比": 217,
220
+ "港": 218,
221
+ "向": 219,
222
+ "式": 220,
223
+ "如": 221,
224
+ "金": 222,
225
+ "选": 223,
226
+ "士": 224,
227
+ "元": 225,
228
+ "基": 226,
229
+ "罗": 227,
230
+ "布": 228,
231
+ "水": 229,
232
+ "提": 230,
233
+ "受": 231,
234
+ "改": 232,
235
+ "组": 233,
236
+ "产": 234,
237
+ "四": 235,
238
+ "英": 236,
239
+ "-": 237,
240
+ "运": 238,
241
+ "实": 239,
242
+ "约": 240,
243
+ "该": 241,
244
+ "常": 242,
245
+ "说": 243,
246
+ "党": 244,
247
+ "号": 245,
248
+ "或": 246,
249
+ "无": 247,
250
+ "义": 248,
251
+ "影": 249,
252
+ "计": 250,
253
+ "心": 251,
254
+ "书": 252,
255
+ "接": 253,
256
+ "些": 254,
257
+ "始": 255,
258
+ "委": 256,
259
+ "意": 257,
260
+ "科": 258,
261
+ "格": 259,
262
+ "则": 260,
263
+ "量": 261,
264
+ "司": 262,
265
+ "务": 263,
266
+ "认": 264,
267
+ "县": 265,
268
+ "领": 266,
269
+ "界": 267,
270
+ "已": 268,
271
+ "门": 269,
272
+ "兰": 270,
273
+ "林": 271,
274
+ "首": 272,
275
+ "保": 273,
276
+ "结": 274,
277
+ "交": 275,
278
+ "女": 276,
279
+ "传": 277,
280
+ "参": 278,
281
+ "校": 279,
282
+ "还": 280,
283
+ "回": 281,
284
+ "尼": 282,
285
+ "直": 283,
286
+ "万": 284,
287
+ "口": 285,
288
+ "目": 286,
289
+ "巴": 287,
290
+ "处": 288,
291
+ "风": 289,
292
+ "持": 290,
293
+ "展": 291,
294
+ "各": 292,
295
+ "团": 293,
296
+ "程": 294,
297
+ "指": 295,
298
+ "导": 296,
299
+ "湾": 297,
300
+ "反": 298,
301
+ "香": 299,
302
+ "手": 300,
303
+ "广": 301,
304
+ ".": 302,
305
+ "史": 303,
306
+ "社": 304,
307
+ "更": 305,
308
+ "阿": 306,
309
+ "师": 307,
310
+ "���": 308,
311
+ "只": 309,
312
+ "曾": 310,
313
+ "语": 311,
314
+ "包": 312,
315
+ "演": 313,
316
+ "太": 314,
317
+ "击": 315,
318
+ "应": 316,
319
+ "权": 317,
320
+ "没": 318,
321
+ "省": 319,
322
+ "站": 320,
323
+ "色": 321,
324
+ "朝": 322,
325
+ "命": 323,
326
+ "形": 324,
327
+ "身": 325,
328
+ "五": 326,
329
+ "获": 327,
330
+ "列": 328,
331
+ "官": 329,
332
+ "米": 330,
333
+ "管": 331,
334
+ "信": 332,
335
+ "带": 333,
336
+ "决": 334,
337
+ "流": 335,
338
+ "变": 336,
339
+ "取": 337,
340
+ "局": 338,
341
+ "造": 339,
342
+ "集": 340,
343
+ "著": 341,
344
+ "视": 342,
345
+ "张": 343,
346
+ "情": 344,
347
+ "点": 345,
348
+ "族": 346,
349
+ "论": 347,
350
+ "少": 348,
351
+ "乐": 349,
352
+ "属": 350,
353
+ "古": 351,
354
+ "级": 352,
355
+ "维": 353,
356
+ "兵": 354,
357
+ "空": 355,
358
+ "气": 356,
359
+ "举": 357,
360
+ "强": 358,
361
+ "神": 359,
362
+ "清": 360,
363
+ "示": 361,
364
+ "解": 362,
365
+ "纪": 363,
366
+ "支": 364,
367
+ "资": 365,
368
+ "京": 366,
369
+ "派": 367,
370
+ "别": 368,
371
+ "近": 369,
372
+ "据": 370,
373
+ "类": 371,
374
+ "活": 372,
375
+ "江": 373,
376
+ "又": 374,
377
+ "历": 375,
378
+ "河": 376,
379
+ "先": 377,
380
+ "武": 378,
381
+ "铁": 379,
382
+ "往": 380,
383
+ "她": 381,
384
+ "记": 382,
385
+ "岛": 383,
386
+ "即": 384,
387
+ "非": 385,
388
+ "去": 386,
389
+ "品": 387,
390
+ "帝": 388,
391
+ "争": 389,
392
+ "再": 390,
393
+ "星": 391,
394
+ "苏": 392,
395
+ "放": 393,
396
+ "转": 394,
397
+ "术": 395,
398
+ "李": 396,
399
+ "字": 397,
400
+ "条": 398,
401
+ "夫": 399,
402
+ "件": 400,
403
+ "音": 401,
404
+ "令": 402,
405
+ "商": 403,
406
+ "石": 404,
407
+ "光": 405,
408
+ "头": 406,
409
+ "随": 407,
410
+ "初": 408,
411
+ "且": 409,
412
+ "研": 410,
413
+ "很": 411,
414
+ "宗": 412,
415
+ "洲": 413,
416
+ "未": 414,
417
+ "亲": 415,
418
+ "卡": 416,
419
+ "单": 417,
420
+ "图": 418,
421
+ "周": 419,
422
+ "调": 420,
423
+ "际": 421,
424
+ "收": 422,
425
+ "汉": 423,
426
+ "攻": 424,
427
+ "许": 425,
428
+ "复": 426,
429
+ "职": 427,
430
+ "知": 428,
431
+ "它": 429,
432
+ "专": 430,
433
+ "果": 431,
434
+ "节": 432,
435
+ "打": 433,
436
+ "连": 434,
437
+ "划": 435,
438
+ "完": 436,
439
+ "离": 437,
440
+ "土": 438,
441
+ "波": 439,
442
+ "求": 440,
443
+ "亦": 441,
444
+ "陆": 442,
445
+ "问": 443,
446
+ "死": 444,
447
+ "续": 445,
448
+ "型": 446,
449
+ "给": 447,
450
+ "游": 448,
451
+ "今": 449,
452
+ "观": 450,
453
+ "办": 451,
454
+ "除": 452,
455
+ "题": 453,
456
+ "每": 454,
457
+ "括": 455,
458
+ "奥": 456,
459
+ "规": 457,
460
+ "居": 458,
461
+ "龙": 459,
462
+ "见": 460,
463
+ "六": 461,
464
+ "言": 462,
465
+ "普": 463,
466
+ "究": 464,
467
+ "己": 465,
468
+ "我": 466,
469
+ "白": 467,
470
+ "角": 468,
471
+ "较": 469,
472
+ "剧": 470,
473
+ "案": 471,
474
+ "根": 472,
475
+ "纳": 473,
476
+ "育": 474,
477
+ "修": 475,
478
+ "片": 476,
479
+ "好": 477,
480
+ "创": 478,
481
+ "宣": 479,
482
+ "众": 480,
483
+ "段": 481,
484
+ "伊": 482,
485
+ "济": 483,
486
+ "引": 484,
487
+ "致": 485,
488
+ "终": 486,
489
+ "投": 487,
490
+ "威": 488,
491
+ "八": 489,
492
+ "标": 490,
493
+ "功": 491,
494
+ "服": 492,
495
+ "福": 493,
496
+ "样": 494,
497
+ "另": 495,
498
+ "失": 496,
499
+ "质": 497,
500
+ "助": 498,
501
+ "舰": 499,
502
+ "装": 500,
503
+ "协": 501,
504
+ "陈": 502,
505
+ "皇": 503,
506
+ "阳": 504,
507
+ "营": 505,
508
+ "希": 506,
509
+ "率": 507,
510
+ "兴": 508,
511
+ "器": 509,
512
+ "速": 510,
513
+ "增": 511,
514
+ "卫": 512,
515
+ "查": 513,
516
+ "黄": 514,
517
+ "构": 515,
518
+ "护": 516,
519
+ "继": 517,
520
+ "告": 518,
521
+ "那": 519,
522
+ "存": 520,
523
+ "备": 521,
524
+ "季": 522,
525
+ "准": 523,
526
+ "越": 524,
527
+ "九": 525,
528
+ "欧": 526,
529
+ "早": 527,
530
+ "央": 528,
531
+ "占": 529,
532
+ "推": 530,
533
+ "园": 531,
534
+ "整": 532,
535
+ "火": 533,
536
+ "%": 534,
537
+ "边": 535,
538
+ "故": 536,
539
+ "响": 537,
540
+ "具": 538,
541
+ "百": 539,
542
+ "置": 540,
543
+ "才": 541,
544
+ "仍": 542,
545
+ "爱": 543,
546
+ "供": 544,
547
+ "足": 545,
548
+ "镇": 546,
549
+ "担": 547,
550
+ "围": 548,
551
+ "青": 549,
552
+ "超": 550,
553
+ "便": 551,
554
+ "杀": 552,
555
+ "例": 553,
556
+ "客": 554,
557
+ "让": 555,
558
+ "花": 556,
559
+ "源": 557,
560
+ "革": 558,
561
+ "织": 559,
562
+ "警": 560,
563
+ "独": 561,
564
+ "落": 562,
565
+ "项": 563,
566
+ "圣": 564,
567
+ "想": 565,
568
+ "馆": 566,
569
+ "座": 567,
570
+ "份": 568,
571
+ "境": 569,
572
+ "儿": 570,
573
+ "需": 571,
574
+ "田": 572,
575
+ "象": 573,
576
+ "防": 574,
577
+ "航": 575,
578
+ "何": 576,
579
+ "父": 577,
580
+ "志": 578,
581
+ "村": 579,
582
+ "印": 580,
583
+ "密": 581,
584
+ "升": 582,
585
+ "留": 583,
586
+ "易": 584,
587
+ "塔": 585,
588
+ "席": 586,
589
+ "把": 587,
590
+ "施": 588,
591
+ "老": 589,
592
+ "证": 590,
593
+ "哈": 591,
594
+ "副": 592,
595
+ "均": 593,
596
+ "编": 594,
597
+ "败": 595,
598
+ "歌": 596,
599
+ "班": 597,
600
+ "伯": 598,
601
+ "母": 599,
602
+ "热": 600,
603
+ "容": 601,
604
+ "感": 602,
605
+ "极": 603,
606
+ "步": 604,
607
+ "半": 605,
608
+ "声": 606,
609
+ "住": 607,
610
+ "考": 608,
611
+ "着": 609,
612
+ "七": 610,
613
+ "几": 611,
614
+ "医": 612,
615
+ "止": 613,
616
+ "红": 614,
617
+ "吉": 615,
618
+ "曲": 616,
619
+ "飞": 617,
620
+ "环": 618,
621
+ "版": 619,
622
+ "盟": 620,
623
+ "看": 621,
624
+ "评": 622,
625
+ "律": 623,
626
+ "播": 624,
627
+ "域": 625,
628
+ "层": 626,
629
+ "萨": 627,
630
+ "沙": 628,
631
+ "录": 629,
632
+ "群": 630,
633
+ "农": 631,
634
+ "刘": 632,
635
+ "降": 633,
636
+ "室": 634,
637
+ "低": 635,
638
+ "伦": 636,
639
+ "真": 637,
640
+ "守": 638,
641
+ "双": 639,
642
+ "技": 640,
643
+ "艺": 641,
644
+ "承": 642,
645
+ "筑": 643,
646
+ "批": 644,
647
+ "势": 645,
648
+ "深": 646,
649
+ "博": 647,
650
+ "佛": 648,
651
+ "登": 649,
652
+ "试": 650,
653
+ "费": 651,
654
+ "写": 652,
655
+ "奖": 653,
656
+ "网": 654,
657
+ "察": 655,
658
+ "移": 656,
659
+ "抗": 657,
660
+ "胜": 658,
661
+ "仅": 659,
662
+ "楼": 660,
663
+ "积": 661,
664
+ "食": 662,
665
+ "模": 663,
666
+ "话": 664,
667
+ "洋": 665,
668
+ "却": 666,
669
+ "票": 667,
670
+ "黑": 668,
671
+ "采": 669,
672
+ "鲁": 670,
673
+ "排": 671,
674
+ "雷": 672,
675
+ "画": 673,
676
+ "洛": 674,
677
+ "确": 675,
678
+ "虽": 676,
679
+ "像": 677,
680
+ "责": 678,
681
+ "范": 679,
682
+ "—": 680,
683
+ "督": 681,
684
+ "病": 682,
685
+ "请": 683,
686
+ "远": 684,
687
+ "消": 685,
688
+ "满": 686,
689
+ "余": 687,
690
+ "射": 688,
691
+ "岁": 689,
692
+ "戏": 690,
693
+ "乡": 691,
694
+ "征": 692,
695
+ "干": 693,
696
+ "湖": 694,
697
+ "望": 695,
698
+ "勒": 696,
699
+ "堂": 697,
700
+ "料": 698,
701
+ "千": 699,
702
+ "负": 700,
703
+ "显": 701,
704
+ "做": 702,
705
+ "限": 703,
706
+ "思": 704,
707
+ "男": 705,
708
+ "船": 706,
709
+ "木": 707,
710
+ "效": 708,
711
+ "顿": 709,
712
+ "难": 710,
713
+ "依": 711,
714
+ "压": 712,
715
+ "景": 713,
716
+ "状": 714,
717
+ "控": 715,
718
+ "破": 716,
719
+ "俄": 717,
720
+ "街": 718,
721
+ "退": 719,
722
+ "康": 720,
723
+ "断": 721,
724
+ "川": 722,
725
+ "藏": 723,
726
+ "宫": 724,
727
+ "念": 725,
728
+ "精": 726,
729
+ "房": 727,
730
+ "必": 728,
731
+ "氏": 729,
732
+ "托": 730,
733
+ "诺": 731,
734
+ "预": 732,
735
+ "习": 733,
736
+ "友": 734,
737
+ "严": 735,
738
+ "温": 736,
739
+ "态": 737,
740
+ "候": 738,
741
+ "哥": 739,
742
+ "典": 740,
743
+ "塞": 741,
744
+ "桥": 742,
745
+ "森": 743,
746
+ "云": 744,
747
+ "曼": 745,
748
+ "章": 746,
749
+ "照": 747,
750
+ "配": 748,
751
+ "突": 749,
752
+ "况": 750,
753
+ "莱": 751,
754
+ "宁": 752,
755
+ "价": 753,
756
+ "素": 754,
757
+ "遭": 755,
758
+ "暴": 756,
759
+ "蒙": 757,
760
+ "久": 758,
761
+ "停": 759,
762
+ "测": 760,
763
+ "亡": 761,
764
+ "执": 762,
765
+ "奇": 763,
766
+ "策": 764,
767
+ "底": 765,
768
+ "斗": 766,
769
+ "走": 767,
770
+ "述": 768,
771
+ "附": 769,
772
+ "唐": 770,
773
+ "庄": 771,
774
+ "伤": 772,
775
+ "瓦": 773,
776
+ "甲": 774,
777
+ "识": 775,
778
+ "载": 776,
779
+ "判": 777,
780
+ "绝": 778,
781
+ "恩": 779,
782
+ "算": 780,
783
+ "左": 781,
784
+ "逐": 782,
785
+ "封": 783,
786
+ "冠": 784,
787
+ "银": 785,
788
+ "快": 786,
789
+ "授": 787,
790
+ "礼": 788,
791
+ "韩": 789,
792
+ "弹": 790,
793
+ "届": 791,
794
+ "验": 792,
795
+ "注": 793,
796
+ "祖": 794,
797
+ "词": 795,
798
+ "富": 796,
799
+ "户": 797,
800
+ "厅": 798,
801
+ "永": 799,
802
+ "毛": 800,
803
+ "般": 801,
804
+ "寺": 802,
805
+ "监": 803,
806
+ "宝": 804,
807
+ "泰": 805,
808
+ "庆": 806,
809
+ "拿": 807,
810
+ "闻": 808,
811
+ "牙": 809,
812
+ "姆": 810,
813
+ "澳": 811,
814
+ "临": 812,
815
+ "丽": 813,
816
+ "贝": 814,
817
+ "厂": 815,
818
+ "延": 816,
819
+ "拥": 817,
820
+ "迪": 818,
821
+ "邦": 819,
822
+ "旅": 820,
823
+ "索": 821,
824
+ "读": 822,
825
+ "右": 823,
826
+ "夏": 824,
827
+ "孙": 825,
828
+ "吴": 826,
829
+ "归": 827,
830
+ "瑞": 828,
831
+ "息": 829,
832
+ "尽": 830,
833
+ "值": 831,
834
+ "须": 832,
835
+ "堡": 833,
836
+ "弟": 834,
837
+ "晚": 835,
838
+ "甚": 836,
839
+ "切": 837,
840
+ "埃": 838,
841
+ "丹": 839,
842
+ "岸": 840,
843
+ "役": 841,
844
+ "杨": 842,
845
+ "束": 843,
846
+ "叶": 844,
847
+ "春": 845,
848
+ "冲": 846,
849
+ "署": 847,
850
+ "坦": 848,
851
+ "雄": 849,
852
+ "庭": 850,
853
+ "轮": 851,
854
+ "宋": 852,
855
+ "渐": 853,
856
+ "送": 854,
857
+ "隆": 855,
858
+ "侧": 856,
859
+ "旧": 857,
860
+ "似": 858,
861
+ "臣": 859,
862
+ "轻": 860,
863
+ "练": 861,
864
+ "迁": 862,
865
+ "麦": 863,
866
+ "救": 864,
867
+ "野": 865,
868
+ "害": 866,
869
+ "驻": 867,
870
+ "减": 868,
871
+ "核": 869,
872
+ "优": 870,
873
+ "阵": 871,
874
+ "库": 872,
875
+ "异": 873,
876
+ "灵": 874,
877
+ "细": 875,
878
+ "烈": 876,
879
+ "末": 877,
880
+ "爆": 878,
881
+ "阶": 879,
882
+ "丁": 880,
883
+ "乌": 881,
884
+ "草": 882,
885
+ "善": 883,
886
+ "财": 884,
887
+ "含": 885,
888
+ "罪": 886,
889
+ "遗": 887,
890
+ "尚": 888,
891
+ "唱": 889,
892
+ "兼": 890,
893
+ "略": 891,
894
+ "牌": 892,
895
+ "嘉": 893,
896
+ "免": 894,
897
+ "审": 895,
898
+ "劳": 896,
899
+ "简": 897,
900
+ "若": 898,
901
+ "油": 899,
902
+ "良": 900,
903
+ "敌": 901,
904
+ "乘": 902,
905
+ "杰": 903,
906
+ "松": 904,
907
+ "币": 905,
908
+ "亿": 906,
909
+ "店": 907,
910
+ "援": 908,
911
+ "签": 909,
912
+ "郡": 910,
913
+ "散": 911,
914
+ "婚": 912,
915
+ "挥": 913,
916
+ "养": 914,
917
+ "津": 915,
918
+ "酒": 916,
919
+ "毕": 917,
920
+ "鲜": 918,
921
+ "换": 919,
922
+ "胡": 920,
923
+ "予": 921,
924
+ "诗": 922,
925
+ "顺": 923,
926
+ "短": 924,
927
+ "徒": 925,
928
+ "端": 926,
929
+ "释": 927,
930
+ "追": 928,
931
+ "训": 929,
932
+ "差": 930,
933
+ "钟": 931,
934
+ "莫": 932,
935
+ "荣": 933,
936
+ "艾": 934,
937
+ "盛": 935,
938
+ "夺": 936,
939
+ "贵": 937,
940
+ "树": 938,
941
+ "晋": 939,
942
+ "皮": 940,
943
+ "泽": 941,
944
+ "禁": 942,
945
+ "秀": 943,
946
+ "乱": 944,
947
+ "朱": 945,
948
+ "梅": 946,
949
+ "够": 947,
950
+ "讨": 948,
951
+ "辑": 949,
952
+ "血": 950,
953
+ "架": 951,
954
+ "启": 952,
955
+ "摩": 953,
956
+ "廷": 954,
957
+ "药": 955,
958
+ "旗": 956,
959
+ "君": 957,
960
+ "扩": 958,
961
+ "籍": 959,
962
+ "玛": 960,
963
+ "仁": 961,
964
+ "补": 962,
965
+ "朗": 963,
966
+ "召": 964,
967
+ "丰": 965,
968
+ "撤": 966,
969
+ "假": 967,
970
+ "午": 968,
971
+ "摄": 969,
972
+ "访": 970,
973
+ "庙": 971,
974
+ "梁": 972,
975
+ "辖": 973,
976
+ "检": 974,
977
+ "枪": 975,
978
+ "沿": 976,
979
+ "鱼": 977,
980
+ "益": 978,
981
+ "/": 979,
982
+ "输": 980,
983
+ "宪": 981,
984
+ "诸": 982,
985
+ "逃": 983,
986
+ "佳": 984,
987
+ "迫": 985,
988
+ "屋": 986,
989
+ "杂": 987,
990
+ "玉": 988,
991
+ "抵": 989,
992
+ "雅": 990,
993
+ "喜": 991,
994
+ "遇": 992,
995
+ "袭": 993,
996
+ "弗": 994,
997
+ "菲": 995,
998
+ "讲": 996,
999
+ "顶": 997,
1000
+ "秘": 998,
1001
+ "互": 999,
1002
+ "仪": 1000,
1003
+ "款": 1001,
1004
+ "帮": 1002,
1005
+ "微": 1003,
1006
+ "否": 1004,
1007
+ "拜": 1005,
1008
+ "险": 1006,
1009
+ "企": 1007,
1010
+ "拍": 1008,
1011
+ "什": 1009,
1012
+ "觉": 1010,
1013
+ "返": 1011,
1014
+ "蒂": 1012,
1015
+ "媒": 1013,
1016
+ "股": 1014,
1017
+ "盖": 1015,
1018
+ "耶": 1016,
1019
+ "齐": 1017,
1020
+ "诉": 1018,
1021
+ "找": 1019,
1022
+ "炮": 1020,
1023
+ "捕": 1021,
1024
+ "适": 1022,
1025
+ "借": 1023,
1026
+ "愿": 1024,
1027
+ "犯": 1025,
1028
+ "序": 1026,
1029
+ "童": 1027,
1030
+ "黎": 1028,
1031
+ "背": 1029,
1032
+ "饰": 1030,
1033
+ "乎": 1031,
1034
+ "昌": 1032,
1035
+ "凯": 1033,
1036
+ "植": 1034,
1037
+ "透": 1035,
1038
+ "郑": 1036,
1039
+ "灭": 1037,
1040
+ "板": 1038,
1041
+ "销": 1039,
1042
+ "跟": 1040,
1043
+ "侵": 1041,
1044
+ "刻": 1042,
1045
+ "材": 1043,
1046
+ "夜": 1044,
1047
+ "货": 1045,
1048
+ "额": 1046,
1049
+ "纽": 1047,
1050
+ "\"": 1048,
1051
+ "坚": 1049,
1052
+ "按": 1050,
1053
+ "谈": 1051,
1054
+ "待": 1052,
1055
+ "途": 1053,
1056
+ "旋": 1054,
1057
+ "奉": 1055,
1058
+ "赫": 1056,
1059
+ "赵": 1057,
1060
+ "私": 1058,
1061
+ "损": 1059,
1062
+ "谋": 1060,
1063
+ "隶": 1061,
1064
+ "充": 1062,
1065
+ "某": 1063,
1066
+ "么": 1064,
1067
+ "介": 1065,
1068
+ "吸": 1066,
1069
+ "刑": 1067,
1070
+ "谷": 1068,
1071
+ "距": 1069,
1072
+ "缺": 1070,
1073
+ "巡": 1071,
1074
+ "激": 1072,
1075
+ "骑": 1073,
1076
+ "忠": 1074,
1077
+ "听": 1075,
1078
+ "献": 1076,
1079
+ "毁": 1077,
1080
+ "聚": 1078,
1081
+ "钱": 1079,
1082
+ "雨": 1080,
1083
+ "尾": 1081,
1084
+ "课": 1082,
1085
+ "废": 1083,
1086
+ "震": 1084,
1087
+ "杯": 1085,
1088
+ "舞": 1086,
1089
+ "巨": 1087,
1090
+ "宾": 1088,
1091
+ "售": 1089,
1092
+ "混": 1090,
1093
+ "购": 1091,
1094
+ "刺": 1092,
1095
+ "洪": 1093,
1096
+ "怀": 1094,
1097
+ "穿": 1095,
1098
+ "皆": 1096,
1099
+ "牛": 1097,
1100
+ "欢": 1098,
1101
+ "疑": 1099,
1102
+ "卢": 1100,
1103
+ "避": 1101,
1104
+ "荷": 1102,
1105
+ "郎": 1103,
1106
+ "择": 1104,
1107
+ "顾": 1105,
1108
+ "露": 1106,
1109
+ "谢": 1107,
1110
+ "竞": 1108,
1111
+ "智": 1109,
1112
+ "固": 1110,
1113
+ "疗": 1111,
1114
+ "圆": 1112,
1115
+ "讯": 1113,
1116
+ "唯": 1114,
1117
+ "眼": 1115,
1118
+ "申": 1116,
1119
+ "探": 1117,
1120
+ "兹": 1118,
1121
+ "休": 1119,
1122
+ "徐": 1120,
1123
+ "御": 1121,
1124
+ "穆": 1122,
1125
+ "帕": 1123,
1126
+ "玩": 1124,
1127
+ "映": 1125,
1128
+ "订": 1126,
1129
+ "杜": 1127,
1130
+ "急": 1128,
1131
+ "描": 1129,
1132
+ "你": 1130,
1133
+ "赞": 1131,
1134
+ "殖": 1132,
1135
+ "溪": 1133,
1136
+ "爵": 1134,
1137
+ "宇": 1135,
1138
+ "坏": 1136,
1139
+ "敦": 1137,
1140
+ "殿": 1138,
1141
+ "恶": 1139,
1142
+ "刚": 1140,
1143
+ "尤": 1141,
1144
+ "孩": 1142,
1145
+ "贸": 1143,
1146
+ "刊": 1144,
1147
+ "竹": 1145,
1148
+ "靠": 1146,
1149
+ "兄": 1147,
1150
+ "峰": 1148,
1151
+ "幕": 1149,
1152
+ "轨": 1150,
1153
+ "呼": 1151,
1154
+ "秦": 1152,
1155
+ "译": 1153,
1156
+ "紧": 1154,
1157
+ "炸": 1155,
1158
+ "妻": 1156,
1159
+ "潮": 1157,
1160
+ "毒": 1158,
1161
+ "涉": 1159,
1162
+ "坡": 1160,
1163
+ "秋": 1161,
1164
+ "阻": 1162,
1165
+ "弃": 1163,
1166
+ "址": 1164,
1167
+ "羽": 1165,
1168
+ "乔": 1166,
1169
+ "替": 1167,
1170
+ "翰": 1168,
1171
+ "姓": 1169,
1172
+ "汇": 1170,
1173
+ "魔": 1171,
1174
+ "脱": 1172,
1175
+ "泛": 1173,
1176
+ "雪": 1174,
1177
+ "拒": 1175,
1178
+ "招": 1176,
1179
+ "危": 1177,
1180
+ "魏": 1178,
1181
+ "宽": 1179,
1182
+ "蓝": 1180,
1183
+ "骨": 1181,
1184
+ "频": 1182,
1185
+ "操": 1183,
1186
+ "陵": 1184,
1187
+ "翻": 1185,
1188
+ "买": 1186,
1189
+ "渡": 1187,
1190
+ "耳": 1188,
1191
+ "尊": 1189,
1192
+ "辆": 1190,
1193
+ "迎": 1191,
1194
+ "蒋": 1192,
1195
+ "础": 1193,
1196
+ "…": 1194,
1197
+ "融": 1195,
1198
+ "培": 1196,
1199
+ "寻": 1197,
1200
+ "衣": 1198,
1201
+ "戴": 1199,
1202
+ "��": 1200,
1203
+ "董": 1201,
1204
+ "绿": 1202,
1205
+ "裁": 1203,
1206
+ "针": 1204,
1207
+ "贡": 1205,
1208
+ "恐": 1206,
1209
+ "}": 1207,
1210
+ "卷": 1208,
1211
+ "误": 1209,
1212
+ "{": 1210,
1213
+ "弱": 1211,
1214
+ "宜": 1212,
1215
+ "娜": 1213,
1216
+ "舍": 1214,
1217
+ "横": 1215,
1218
+ "径": 1216,
1219
+ "丝": 1217,
1220
+ "肯": 1218,
1221
+ "冷": 1219,
1222
+ "偏": 1220,
1223
+ "邻": 1221,
1224
+ "健": 1222,
1225
+ "灾": 1223,
1226
+ "倒": 1224,
1227
+ "沃": 1225,
1228
+ "锦": 1226,
1229
+ "租": 1227,
1230
+ "惠": 1228,
1231
+ "付": 1229,
1232
+ "隔": 1230,
1233
+ "霍": 1231,
1234
+ "困": 1232,
1235
+ "妇": 1233,
1236
+ "尺": 1234,
1237
+ "掌": 1235,
1238
+ "岩": 1236,
1239
+ "腊": 1237,
1240
+ "柏": 1238,
1241
+ "侯": 1239,
1242
+ "脑": 1240,
1243
+ "哲": 1241,
1244
+ "陷": 1242,
1245
+ "绩": 1243,
1246
+ "俗": 1244,
1247
+ "阴": 1245,
1248
+ "恢": 1246,
1249
+ "符": 1247,
1250
+ "奏": 1248,
1251
+ "税": 1249,
1252
+ "卖": 1250,
1253
+ "染": 1251,
1254
+ "毫": 1252,
1255
+ "暗": 1253,
1256
+ "辞": 1254,
1257
+ "篇": 1255,
1258
+ "冰": 1256,
1259
+ "扬": 1257,
1260
+ "曹": 1258,
1261
+ "迹": 1259,
1262
+ "赖": 1260,
1263
+ "墨": 1261,
1264
+ "犹": 1262,
1265
+ "剑": 1263,
1266
+ "汽": 1264,
1267
+ "楚": 1265,
1268
+ "伐": 1266,
1269
+ "络": 1267,
1270
+ "绍": 1268,
1271
+ "繁": 1269,
1272
+ "镜": 1270,
1273
+ "赢": 1271,
1274
+ "块": 1272,
1275
+ "努": 1273,
1276
+ "暂": 1274,
1277
+ "圈": 1275,
1278
+ "套": 1276,
1279
+ "佐": 1277,
1280
+ "概": 1278,
1281
+ "乃": 1279,
1282
+ "呈": 1280,
1283
+ "阁": 1281,
1284
+ "稳": 1282,
1285
+ "错": 1283,
1286
+ "辛": 1284,
1287
+ "廉": 1285,
1288
+ "鼓": 1286,
1289
+ "莉": 1287,
1290
+ "艘": 1288,
1291
+ "彩": 1289,
1292
+ "坐": 1290,
1293
+ "崇": 1291,
1294
+ "盘": 1292,
1295
+ "墙": 1293,
1296
+ "井": 1294,
1297
+ "矿": 1295,
1298
+ "遣": 1296,
1299
+ "泉": 1297,
1300
+ "佩": 1298,
1301
+ "肉": 1299,
1302
+ "芬": 1300,
1303
+ "扎": 1301,
1304
+ "纸": 1302,
1305
+ "鸟": 1303,
1306
+ "症": 1304,
1307
+ "奴": 1305,
1308
+ "韦": 1306,
1309
+ "赴": 1307,
1310
+ "锡": 1308,
1311
+ "味": 1309,
1312
+ "驱": 1310,
1313
+ "珠": 1311,
1314
+ "辅": 1312,
1315
+ "裂": 1313,
1316
+ "默": 1314,
1317
+ "笔": 1315,
1318
+ "虎": 1316,
1319
+ "恒": 1317,
1320
+ "翼": 1318,
1321
+ "患": 1319,
1322
+ "郭": 1320,
1323
+ "孔": 1321,
1324
+ "幼": 1322,
1325
+ "墓": 1323,
1326
+ "脚": 1324,
1327
+ "叛": 1325,
1328
+ "]": 1326,
1329
+ "叫": 1327,
1330
+ "估": 1328,
1331
+ "邀": 1329,
1332
+ "[": 1330,
1333
+ "胞": 1331,
1334
+ "促": 1332,
1335
+ "昭": 1333,
1336
+ "触": 1334,
1337
+ "甘": 1335,
1338
+ "陶": 1336,
1339
+ "静": 1337,
1340
+ "障": 1338,
1341
+ "亮": 1339,
1342
+ "振": 1340,
1343
+ "答": 1341,
1344
+ "筹": 1342,
1345
+ "狱": 1343,
1346
+ "仙": 1344,
1347
+ "祭": 1345,
1348
+ "妹": 1346,
1349
+ "洞": 1347,
1350
+ "寿": 1348,
1351
+ "池": 1349,
1352
+ "遍": 1350,
1353
+ "浪": 1351,
1354
+ "誉": 1352,
1355
+ "伍": 1353,
1356
+ "幸": 1354,
1357
+ "残": 1355,
1358
+ "蔡": 1356,
1359
+ "茨": 1357,
1360
+ "缘": 1358,
1361
+ "迷": 1359,
1362
+ "沉": 1360,
1363
+ "柱": 1361,
1364
+ "冬": 1362,
1365
+ "?": 1363,
1366
+ "潜": 1364,
1367
+ "铜": 1365,
1368
+ "烧": 1366,
1369
+ "漫": 1367,
1370
+ "析": 1368,
1371
+ "颁": 1369,
1372
+ "闭": 1370,
1373
+ "遂": 1371,
1374
+ "仰": 1372,
1375
+ "邓": 1373,
1376
+ "钢": 1374,
1377
+ "截": 1375,
1378
+ "纵": 1376,
1379
+ "逊": 1377,
1380
+ "违": 1378,
1381
+ "辽": 1379,
1382
+ "拔": 1380,
1383
+ "埔": 1381,
1384
+ "迅": 1382,
1385
+ "浙": 1383,
1386
+ "酸": 1384,
1387
+ "赤": 1385,
1388
+ "燃": 1386,
1389
+ "凤": 1387,
1390
+ "幅": 1388,
1391
+ "吕": 1389,
1392
+ "梦": 1390,
1393
+ "苦": 1391,
1394
+ "伟": 1392,
1395
+ "浦": 1393,
1396
+ "麻": 1394,
1397
+ "虑": 1395,
1398
+ "缩": 1396,
1399
+ "仔": 1397,
1400
+ "绪": 1398,
1401
+ "戈": 1399,
1402
+ "烟": 1400,
1403
+ "纷": 1401,
1404
+ "勇": 1402,
1405
+ "享": 1403,
1406
+ "沈": 1404,
1407
+ "龄": 1405,
1408
+ "厚": 1406,
1409
+ "桑": 1407,
1410
+ "衡": 1408,
1411
+ "句": 1409,
1412
+ "伴": 1410,
1413
+ "绘": 1411,
1414
+ "允": 1412,
1415
+ "莎": 1413,
1416
+ "盐": 1414,
1417
+ "颜": 1415,
1418
+ "豪": 1416,
1419
+ "琴": 1417,
1420
+ "液": 1418,
1421
+ "牧": 1419,
1422
+ "综": 1420,
1423
+ "!": 1421,
1424
+ "拆": 1422,
1425
+ "隐": 1423,
1426
+ "宿": 1424,
1427
+ "谓": 1425,
1428
+ "硬": 1426,
1429
+ "脉": 1427,
1430
+ "葡": 1428,
1431
+ "纯": 1429,
1432
+ "宅": 1430,
1433
+ "殊": 1431,
1434
+ "旁": 1432,
1435
+ "册": 1433,
1436
+ "折": 1434,
1437
+ "慢": 1435,
1438
+ "厦": 1436,
1439
+ "彼": 1437,
1440
+ "飓": 1438,
1441
+ "疆": 1439,
1442
+ "捷": 1440,
1443
+ "俱": 1441,
1444
+ "徽": 1442,
1445
+ "祥": 1443,
1446
+ "汗": 1444,
1447
+ "辉": 1445,
1448
+ "滨": 1446,
1449
+ "跑": 1447,
1450
+ "孝": 1448,
1451
+ "熙": 1449,
1452
+ "敬": 1450,
1453
+ "叙": 1451,
1454
+ "零": 1452,
1455
+ "挑": 1453,
1456
+ "珍": 1454,
1457
+ "缓": 1455,
1458
+ "裔": 1456,
1459
+ "吃": 1457,
1460
+ "敏": 1458,
1461
+ "冯": 1459,
1462
+ "冈": 1460,
1463
+ "亨": 1461,
1464
+ "瑟": 1462,
1465
+ "跨": 1463,
1466
+ "岭": 1464,
1467
+ "锋": 1465,
1468
+ "刀": 1466,
1469
+ "驶": 1467,
1470
+ "茶": 1468,
1471
+ "滑": 1469,
1472
+ "赏": 1470,
1473
+ "萧": 1471,
1474
+ "撒": 1472,
1475
+ "倾": 1473,
1476
+ "坛": 1474,
1477
+ "逝": 1475,
1478
+ "掉": 1476,
1479
+ "侍": 1477,
1480
+ "氧": 1478,
1481
+ "柯": 1479,
1482
+ "磁": 1480,
1483
+ "灯": 1481,
1484
+ "仓": 1482,
1485
+ "慕": 1483,
1486
+ "碑": 1484,
1487
+ "晨": 1485,
1488
+ "丘": 1486,
1489
+ "绕": 1487,
1490
+ "艇": 1488,
1491
+ "覆": 1489,
1492
+ "械": 1490,
1493
+ "档": 1491,
1494
+ "餐": 1492,
1495
+ "昆": 1493,
1496
+ "伏": 1494,
1497
+ "熟": 1495,
1498
+ "榜": 1496,
1499
+ "雕": 1497,
1500
+ "贯": 1498,
1501
+ "燕": 1499,
1502
+ "糖": 1500,
1503
+ "诚": 1501,
1504
+ "软": 1502,
1505
+ "汤": 1503,
1506
+ "虫": 1504,
1507
+ "疾": 1505,
1508
+ "痛": 1506,
1509
+ "葬": 1507,
1510
+ "朋": 1508,
1511
+ "凡": 1509,
1512
+ "慈": 1510,
1513
+ "孟": 1511,
1514
+ "欲": 1512,
1515
+ "壁": 1513,
1516
+ "搭": 1514,
1517
+ "贤": 1515,
1518
+ "胁": 1516,
1519
+ "桃": 1517,
1520
+ "偶": 1518,
1521
+ "恋": 1519,
1522
+ "乏": 1520,
1523
+ "邮": 1521,
1524
+ "桂": 1522,
1525
+ "袁": 1523,
1526
+ "菜": 1524,
1527
+ "藤": 1525,
1528
+ "贺": 1526,
1529
+ "惊": 1527,
1530
+ "虚": 1528,
1531
+ "莲": 1529,
1532
+ "览": 1530,
1533
+ "倍": 1531,
1534
+ "猪": 1532,
1535
+ "猎": 1533,
1536
+ "戒": 1534,
1537
+ "勋": 1535,
1538
+ "驾": 1536,
1539
+ "粉": 1537,
1540
+ "丧": 1538,
1541
+ "藩": 1539,
1542
+ "姐": 1540,
1543
+ "趣": 1541,
1544
+ "罚": 1542,
1545
+ "勤": 1543,
1546
+ "坑": 1544,
1547
+ "妮": 1545,
1548
+ "玄": 1546,
1549
+ "储": 1547,
1550
+ "饮": 1548,
1551
+ "蛋": 1549,
1552
+ "萄": 1550,
1553
+ "塘": 1551,
1554
+ "凌": 1552,
1555
+ "彭": 1553,
1556
+ "番": 1554,
1557
+ "措": 1555,
1558
+ "撞": 1556,
1559
+ "峡": 1557,
1560
+ "匹": 1558,
1561
+ "励": 1559,
1562
+ "葛": 1560,
1563
+ "拟": 1561,
1564
+ "迈": 1562,
1565
+ "沟": 1563,
1566
+ "抚": 1564,
1567
+ "轴": 1565,
1568
+ "伸": 1566,
1569
+ "屯": 1567,
1570
+ "既": 1568,
1571
+ "赶": 1569,
1572
+ "剂": 1570,
1573
+ "奈": 1571,
1574
+ "鬼": 1572,
1575
+ "旦": 1573,
1576
+ "粮": 1574,
1577
+ "僧": 1575,
1578
+ "淡": 1576,
1579
+ "挂": 1577,
1580
+ "罕": 1578,
1581
+ "纲": 1579,
1582
+ "岳": 1580,
1583
+ "污": 1581,
1584
+ "陕": 1582,
1585
+ "忆": 1583,
1586
+ "袖": 1584,
1587
+ "牠": 1585,
1588
+ "紫": 1586,
1589
+ "贾": 1587,
1590
+ "扰": 1588,
1591
+ "累": 1589,
1592
+ "菌": 1590,
1593
+ "键": 1591,
1594
+ "仲": 1592,
1595
+ "吨": 1593,
1596
+ "鉴": 1594,
1597
+ "笼": 1595,
1598
+ "琳": 1596,
1599
+ "跳": 1597,
1600
+ "昂": 1598,
1601
+ "晓": 1599,
1602
+ "乙": 1600,
1603
+ "兽": 1601,
1604
+ "灰": 1602,
1605
+ "町": 1603,
1606
+ "涯": 1604,
1607
+ "禄": 1605,
1608
+ "凭": 1606,
1609
+ "洗": 1607,
1610
+ "怒": 1608,
1611
+ "尖": 1609,
1612
+ "箭": 1610,
1613
+ "赐": 1611,
1614
+ "卓": 1612,
1615
+ "衰": 1613,
1616
+ "辩": 1614,
1617
+ "潘": 1615,
1618
+ "彻": 1616,
1619
+ "诞": 1617,
1620
+ "尉": 1618,
1621
+ "芳": 1619,
1622
+ "颗": 1620,
1623
+ "搜": 1621,
1624
+ "胶": 1622,
1625
+ "炎": 1623,
1626
+ "怪": 1624,
1627
+ "锁": 1625,
1628
+ "浮": 1626,
1629
+ "梯": 1627,
1630
+ "塑": 1628,
1631
+ "鹿": 1629,
1632
+ "『": 1630,
1633
+ "羊": 1631,
1634
+ "阮": 1632,
1635
+ "』": 1633,
1636
+ "谱": 1634,
1637
+ "捐": 1635,
1638
+ "俊": 1636,
1639
+ "俘": 1637,
1640
+ "祝": 1638,
1641
+ "详": 1639,
1642
+ "苗": 1640,
1643
+ "匈": 1641,
1644
+ "熊": 1642,
1645
+ "乾": 1643,
1646
+ "粒": 1644,
1647
+ "衔": 1645,
1648
+ "剩": 1646,
1649
+ "帅": 1647,
1650
+ "轰": 1648,
1651
+ "圳": 1649,
1652
+ "盾": 1650,
1653
+ "隧": 1651,
1654
+ "逢": 1652,
1655
+ "幻": 1653,
1656
+ "割": 1654,
1657
+ "耀": 1655,
1658
+ "盗": 1656,
1659
+ "粤": 1657,
1660
+ "荒": 1658,
1661
+ "咸": 1659,
1662
+ "拓": 1660,
1663
+ "齿": 1661,
1664
+ "泥": 1662,
1665
+ "跃": 1663,
1666
+ "乳": 1664,
1667
+ "贫": 1665,
1668
+ "闽": 1666,
1669
+ "询": 1667,
1670
+ "柳": 1668,
1671
+ "枢": 1669,
1672
+ "弥": 1670,
1673
+ "巧": 1671,
1674
+ "彰": 1672,
1675
+ "渔": 1673,
1676
+ "儒": 1674,
1677
+ "〈": 1675,
1678
+ "函": 1676,
1679
+ "〉": 1677,
1680
+ "笑": 1678,
1681
+ "撰": 1679,
1682
+ "抢": 1680,
1683
+ "詹": 1681,
1684
+ "贞": 1682,
1685
+ "契": 1683,
1686
+ "旺": 1684,
1687
+ "~": 1685,
1688
+ "慧": 1686,
1689
+ "尸": 1687,
1690
+ "尝": 1688,
1691
+ "缅": 1689,
1692
+ "屿": 1690,
1693
+ "仑": 1691,
1694
+ "鸿": 1692,
1695
+ "握": 1693,
1696
+ "阅": 1694,
1697
+ "侦": 1695,
1698
+ "聘": 1696,
1699
+ "杭": 1697,
1700
+ "疏": 1698,
1701
+ "肃": 1699,
1702
+ "搬": 1700,
1703
+ "瓜": 1701,
1704
+ "棒": 1702,
1705
+ "逼": 1703,
1706
+ "淮": 1704,
1707
+ "悬": 1705,
1708
+ "厘": 1706,
1709
+ "宙": 1707,
1710
+ "仿": 1708,
1711
+ "劝": 1709,
1712
+ "屏": 1710,
1713
+ "丈": 1711,
1714
+ "裕": 1712,
1715
+ "郊": 1713,
1716
+ "伙": 1714,
1717
+ "碍": 1715,
1718
+ "壮": 1716,
1719
+ "卿": 1717,
1720
+ "霸": 1718,
1721
+ "杉": 1719,
1722
+ "叔": 1720,
1723
+ "弘": 1721,
1724
+ "牵": 1722,
1725
+ "芝": 1723,
1726
+ "夷": 1724,
1727
+ "罢": 1725,
1728
+ "崔": 1726,
1729
+ "循": 1727,
1730
+ "柴": 1728,
1731
+ "孤": 1729,
1732
+ "勃": 1730,
1733
+ "浓": 1731,
1734
+ "砲": 1732,
1735
+ "硕": 1733,
1736
+ "拳": 1734,
1737
+ "扮": 1735,
1738
+ "湘": 1736,
1739
+ "钦": 1737,
1740
+ "粹": 1738,
1741
+ "惯": 1739,
1742
+ "晶": 1740,
1743
+ "亭": 1741,
1744
+ "禅": 1742,
1745
+ "纹": 1743,
1746
+ "秒": 1744,
1747
+ "枚": 1745,
1748
+ "迟": 1746,
1749
+ "傅": 1747,
1750
+ "押": 1748,
1751
+ "岗": 1749,
1752
+ "旨": 1750,
1753
+ "宏": 1751,
1754
+ "靖": 1752,
1755
+ "踪": 1753,
1756
+ "床": 1754,
1757
+ "填": 1755,
1758
+ "琉": 1756,
1759
+ "茂": 1757,
1760
+ "妃": 1758,
1761
+ "贴": 1759,
1762
+ "偷": 1760,
1763
+ "赠": 1761,
1764
+ "猫": 1762,
1765
+ "逮": 1763,
1766
+ "疫": 1764,
1767
+ "悉": 1765,
1768
+ "饭": 1766,
1769
+ "狂": 1767,
1770
+ "邨": 1768,
1771
+ "忍": 1769,
1772
+ "甸": 1770,
1773
+ "煤": 1771,
1774
+ "貌": 1772,
1775
+ "侠": 1773,
1776
+ "抽": 1774,
1777
+ "赋": 1775,
1778
+ "净": 1776,
1779
+ "庞": 1777,
1780
+ "姻": 1778,
1781
+ "稿": 1779,
1782
+ "坊": 1780,
1783
+ "寒": 1781,
1784
+ "棋": 1782,
1785
+ "玻": 1783,
1786
+ "颇": 1784,
1787
+ "翌": 1785,
1788
+ "箱": 1786,
1789
+ "崎": 1787,
1790
+ "伪": 1788,
1791
+ "涅": 1789,
1792
+ "娱": 1790,
1793
+ "妈": 1791,
1794
+ "揭": 1792,
1795
+ "铸": 1793,
1796
+ "肥": 1794,
1797
+ "喀": 1795,
1798
+ "诏": 1796,
1799
+ "蛇": 1797,
1800
+ "扶": 1798,
1801
+ "屠": 1799,
1802
+ "遵": 1800,
1803
+ "劫": 1801,
1804
+ "吐": 1802,
1805
+ "雇": 1803,
1806
+ "鸡": 1804,
1807
+ "浅": 1805,
1808
+ "趋": 1806,
1809
+ "递": 1807,
1810
+ "胎": 1808,
1811
+ "耗": 1809,
1812
+ "嫌": 1810,
1813
+ "娘": 1811,
1814
+ "泊": 1812,
1815
+ "焦": 1813,
1816
+ "敢": 1814,
1817
+ "浩": 1815,
1818
+ "洁": 1816,
1819
+ "偿": 1817,
1820
+ "湿": 1818,
1821
+ "狮": 1819,
1822
+ "寄": 1820,
1823
+ "舒": 1821,
1824
+ "祀": 1822,
1825
+ "曰": 1823,
1826
+ "吾": 1824,
1827
+ "稍": 1825,
1828
+ "稣": 1826,
1829
+ "陀": 1827,
1830
+ "愈": 1828,
1831
+ "冒": 1829,
1832
+ "插": 1830,
1833
+ "猛": 1831,
1834
+ "狄": 1832,
1835
+ "旬": 1833,
1836
+ "卜": 1834,
1837
+ "摇": 1835,
1838
+ "涌": 1836,
1839
+ "惨": 1837,
1840
+ "怕": 1838,
1841
+ "券": 1839,
1842
+ "廊": 1840,
1843
+ "垒": 1841,
1844
+ "鲍": 1842,
1845
+ "薄": 1843,
1846
+ "枝": 1844,
1847
+ "篮": 1845,
1848
+ "窗": 1846,
1849
+ "怖": 1847,
1850
+ "喷": 1848,
1851
+ "仇": 1849,
1852
+ "魂": 1850,
1853
+ "荐": 1851,
1854
+ "铺": 1852,
1855
+ "奔": 1853,
1856
+ "润": 1854,
1857
+ "惟": 1855,
1858
+ "歧": 1856,
1859
+ "碎": 1857,
1860
+ "腹": 1858,
1861
+ "扫": 1859,
1862
+ "串": 1860,
1863
+ "铭": 1861,
1864
+ "醒": 1862,
1865
+ "腐": 1863,
1866
+ "欣": 1864,
1867
+ "垂": 1865,
1868
+ "壤": 1866,
1869
+ "朴": 1867,
1870
+ "矛": 1868,
1871
+ "债": 1869,
1872
+ "婆": 1870,
1873
+ "摆": 1871,
1874
+ "潭": 1872,
1875
+ "滩": 1873,
1876
+ "刷": 1874,
1877
+ "悲": 1875,
1878
+ "鸣": 1876,
1879
+ "堆": 1877,
1880
+ "泳": 1878,
1881
+ "吹": 1879,
1882
+ "汪": 1880,
1883
+ "碳": 1881,
1884
+ "侨": 1882,
1885
+ "彦": 1883,
1886
+ "逆": 1884,
1887
+ "壳": 1885,
1888
+ "抓": 1886,
1889
+ "柔": 1887,
1890
+ "杆": 1888,
1891
+ "倡": 1889,
1892
+ "–": 1890,
1893
+ "页": 1891,
1894
+ "鹰": 1892,
1895
+ "狗": 1893,
1896
+ "兆": 1894,
1897
+ "崩": 1895,
1898
+ "忽": 1896,
1899
+ "璃": 1897,
1900
+ "豆": 1898,
1901
+ "竟": 1899,
1902
+ "阔": 1900,
1903
+ "嗣": 1901,
1904
+ "脏": 1902,
1905
+ "携": 1903,
1906
+ "宰": 1904,
1907
+ "豫": 1905,
1908
+ "忙": 1906,
1909
+ "夕": 1907,
1910
+ "邪": 1908,
1911
+ "堪": 1909,
1912
+ "寨": 1910,
1913
+ "抱": 1911,
1914
+ "斜": 1912,
1915
+ "溃": 1913,
1916
+ "迦": 1914,
1917
+ "淘": 1915,
1918
+ "炉": 1916,
1919
+ "衍": 1917,
1920
+ "辟": 1918,
1921
+ "穷": 1919,
1922
+ "蒸": 1920,
1923
+ "翁": 1921,
1924
+ "隋": 1922,
1925
+ "腾": 1923,
1926
+ "逻": 1924,
1927
+ "诊": 1925,
1928
+ "炼": 1926,
1929
+ "─": 1927,
1930
+ "滚": 1928,
1931
+ "龟": 1929,
1932
+ "姚": 1930,
1933
+ "掘": 1931,
1934
+ "毅": 1932,
1935
+ "吏": 1933,
1936
+ "坎": 1934,
1937
+ "膜": 1935,
1938
+ "斩": 1936,
1939
+ "耕": 1937,
1940
+ "妙": 1938,
1941
+ "苑": 1939,
1942
+ "鼎": 1940,
1943
+ "削": 1941,
1944
+ "陪": 1942,
1945
+ "埋": 1943,
1946
+ "掠": 1944,
1947
+ "添": 1945,
1948
+ "凝": 1946,
1949
+ "栏": 1947,
1950
+ "涵": 1948,
1951
+ "摧": 1949,
1952
+ "珊": 1950,
1953
+ "狼": 1951,
1954
+ "姬": 1952,
1955
+ "癌": 1953,
1956
+ "拖": 1954,
1957
+ "奋": 1955,
1958
+ "孕": 1956,
1959
+ "沪": 1957,
1960
+ "薪": 1958,
1961
+ "磨": 1959,
1962
+ "薛": 1960,
1963
+ "嫁": 1961,
1964
+ "拨": 1962,
1965
+ "愤": 1963,
1966
+ "拘": 1964,
1967
+ "寸": 1965,
1968
+ "灌": 1966,
1969
+ "吞": 1967,
1970
+ "坪": 1968,
1971
+ "螺": 1969,
1972
+ "丸": 1970,
1973
+ "韵": 1971,
1974
+ "巫": 1972,
1975
+ "稻": 1973,
1976
+ "歇": 1974,
1977
+ "弦": 1975,
1978
+ "伽": 1976,
1979
+ "募": 1977,
1980
+ "甫": 1978,
1981
+ "鼠": 1979,
1982
+ "荡": 1980,
1983
+ "挪": 1981,
1984
+ "绳": 1982,
1985
+ "’": 1983,
1986
+ "厄": 1984,
1987
+ "扣": 1985,
1988
+ "栖": 1986,
1989
+ "琼": 1987,
1990
+ "掩": 1988,
1991
+ "驳": 1989,
1992
+ "舱": 1990,
1993
+ "狭": 1991,
1994
+ "誓": 1992,
1995
+ "寮": 1993,
1996
+ "‘": 1994,
1997
+ "屈": 1995,
1998
+ "雍": 1996,
1999
+ "麟": 1997,
2000
+ "噶": 1998,
2001
+ "砖": 1999,
2002
+ "寓": 2000,
2003
+ "雾": 2001,
2004
+ "盆": 2002,
2005
+ "奶": 2003,
2006
+ "蜀": 2004,
2007
+ "脊": 2005,
2008
+ "辐": 2006,
2009
+ "谭": 2007,
2010
+ "厢": 2008,
2011
+ "渊": 2009,
2012
+ "溶": 2010,
2013
+ "凉": 2011,
2014
+ "挖": 2012,
2015
+ "魁": 2013,
2016
+ "鄂": 2014,
2017
+ "涂": 2015,
2018
+ "贩": 2016,
2019
+ "扁": 2017,
2020
+ "舟": 2018,
2021
+ "喇": 2019,
2022
+ "坝": 2020,
2023
+ "荆": 2021,
2024
+ "婴": 2022,
2025
+ "吁": 2023,
2026
+ "幽": 2024,
2027
+ "邑": 2025,
2028
+ "粗": 2026,
2029
+ "鹤": 2027,
2030
+ "斥": 2028,
2031
+ "袋": 2029,
2032
+ "忧": 2030,
2033
+ "鼻": 2031,
2034
+ "栋": 2032,
2035
+ "纬": 2033,
2036
+ "姊": 2034,
2037
+ "肌": 2035,
2038
+ "汰": 2036,
2039
+ "斋": 2037,
2040
+ "恭": 2038,
2041
+ "贪": 2039,
2042
+ "脂": 2040,
2043
+ "奸": 2041,
2044
+ "氢": 2042,
2045
+ "漠": 2043,
2046
+ "矶": 2044,
2047
+ "胸": 2045,
2048
+ "诱": 2046,
2049
+ "祠": 2047,
2050
+ "卵": 2048,
2051
+ "忘": 2049,
2052
+ "暖": 2050,
2053
+ "咨": 2051,
2054
+ "桓": 2052,
2055
+ "聪": 2053,
2056
+ "趁": 2054,
2057
+ "脸": 2055,
2058
+ "抑": 2056,
2059
+ "廖": 2057,
2060
+ "溯": 2058,
2061
+ "卑": 2059,
2062
+ "泡": 2060,
2063
+ "翔": 2061,
2064
+ "谦": 2062,
2065
+ "斤": 2063,
2066
+ "郁": 2064,
2067
+ "宴": 2065,
2068
+ "斑": 2066,
2069
+ "犬": 2067,
2070
+ "贼": 2068,
2071
+ "腓": 2069,
2072
+ "襄": 2070,
2073
+ "惧": 2071,
2074
+ "恨": 2072,
2075
+ "叠": 2073,
2076
+ "碧": 2074,
2077
+ "恰": 2075,
2078
+ "顷": 2076,
2079
+ "娃": 2077,
2080
+ "酷": 2078,
2081
+ "夹": 2079,
2082
+ "爷": 2080,
2083
+ "擅": 2081,
2084
+ "雌": 2082,
2085
+ "哪": 2083,
2086
+ "链": 2084,
2087
+ "谁": 2085,
2088
+ " ": 2086,
2089
+ "牲": 2087,
2090
+ "慰": 2088,
2091
+ "钻": 2089,
2092
+ "辰": 2090,
2093
+ "砂": 2091,
2094
+ "闸": 2092,
2095
+ "闲": 2093,
2096
+ "惜": 2094,
2097
+ "冀": 2095,
2098
+ "辨": 2096,
2099
+ "锐": 2097,
2100
+ "尹": 2098,
2101
+ "嘴": 2099,
2102
+ "瓶": 2100,
2103
+ "肖": 2101,
2104
+ "挡": 2102,
2105
+ "翠": 2103,
2106
+ "披": 2104,
2107
+ "姜": 2105,
2108
+ "凶": 2106,
2109
+ "奎": 2107,
2110
+ "瓷": 2108,
2111
+ "奠": 2109,
2112
+ "劲": 2110,
2113
+ "贷": 2111,
2114
+ "绥": 2112,
2115
+ "腔": 2113,
2116
+ "巢": 2114,
2117
+ "厝": 2115,
2118
+ "焚": 2116,
2119
+ "姑": 2117,
2120
+ "踏": 2118,
2121
+ "剪": 2119,
2122
+ "劣": 2120,
2123
+ "斐": 2121,
2124
+ "阪": 2122,
2125
+ "缴": 2123,
2126
+ "赔": 2124,
2127
+ "帽": 2125,
2128
+ "菩": 2126,
2129
+ "弯": 2127,
2130
+ "碰": 2128,
2131
+ "丞": 2129,
2132
+ "蜂": 2130,
2133
+ "邱": 2131,
2134
+ "骚": 2132,
2135
+ "亩": 2133,
2136
+ "囚": 2134,
2137
+ "卸": 2135,
2138
+ "°": 2136,
2139
+ "宠": 2137,
2140
+ "纠": 2138,
2141
+ "践": 2139,
2142
+ "株": 2140,
2143
+ "睡": 2141,
2144
+ "渠": 2142,
2145
+ "拼": 2143,
2146
+ "闪": 2144,
2147
+ "慎": 2145,
2148
+ "'": 2146,
2149
+ "棉": 2147,
2150
+ "侣": 2148,
2151
+ "冕": 2149,
2152
+ "屡": 2150,
2153
+ "殷": 2151,
2154
+ "骗": 2152,
2155
+ "宛": 2153,
2156
+ "虏": 2154,
2157
+ "蹈": 2155,
2158
+ "虹": 2156,
2159
+ "惩": 2157,
2160
+ "巷": 2158,
2161
+ "颠": 2159,
2162
+ "绑": 2160,
2163
+ "谊": 2161,
2164
+ "怎": 2162,
2165
+ "耐": 2163,
2166
+ "丑": 2164,
2167
+ "忌": 2165,
2168
+ "逾": 2166,
2169
+ "擎": 2167,
2170
+ "胆": 2168,
2171
+ "茅": 2169,
2172
+ "庇": 2170,
2173
+ "秩": 2171,
2174
+ "辈": 2172,
2175
+ "涛": 2173,
2176
+ "赌": 2174,
2177
+ "栗": 2175,
2178
+ "矩": 2176,
2179
+ "舆": 2177,
2180
+ "亥": 2178,
2181
+ "帐": 2179,
2182
+ "稀": 2180,
2183
+ "姿": 2181,
2184
+ "遥": 2182,
2185
+ "祸": 2183,
2186
+ "拱": 2184,
2187
+ "厉": 2185,
2188
+ "跌": 2186,
2189
+ "鲸": 2187,
2190
+ "颈": 2188,
2191
+ "腿": 2189,
2192
+ "蕾": 2190,
2193
+ "鸦": 2191,
2194
+ "祈": 2192,
2195
+ "抛": 2193,
2196
+ "厥": 2194,
2197
+ "蕃": 2195,
2198
+ "澄": 2196,
2199
+ "琪": 2197,
2200
+ "娶": 2198,
2201
+ "讼": 2199,
2202
+ "卒": 2200,
2203
+ "吧": 2201,
2204
+ "弓": 2202,
2205
+ "堤": 2203,
2206
+ "催": 2204,
2207
+ "淑": 2205,
2208
+ "抄": 2206,
2209
+ "叉": 2207,
2210
+ "蜜": 2208,
2211
+ "悟": 2209,
2212
+ "仆": 2210,
2213
+ "汀": 2211,
2214
+ "鹏": 2212,
2215
+ "漏": 2213,
2216
+ "逸": 2214,
2217
+ "肢": 2215,
2218
+ "肺": 2216,
2219
+ "哀": 2217,
2220
+ "欠": 2218,
2221
+ "汝": 2219,
2222
+ "雀": 2220,
2223
+ "筒": 2221,
2224
+ "镑": 2222,
2225
+ "庸": 2223,
2226
+ "剥": 2224,
2227
+ "浸": 2225,
2228
+ "弄": 2226,
2229
+ "珀": 2227,
2230
+ "禧": 2228,
2231
+ "暨": 2229,
2232
+ "囊": 2230,
2233
+ "涨": 2231,
2234
+ "僚": 2232,
2235
+ "尿": 2233,
2236
+ "纤": 2234,
2237
+ "歼": 2235,
2238
+ "悠": 2236,
2239
+ "阀": 2237,
2240
+ "硫": 2238,
2241
+ "邵": 2239,
2242
+ "咖": 2240,
2243
+ "怨": 2241,
2244
+ "蓄": 2242,
2245
+ "胀": 2243,
2246
+ "穴": 2244,
2247
+ "霞": 2245,
2248
+ "挤": 2246,
2249
+ "吊": 2247,
2250
+ "鸭": 2248,
2251
+ "炭": 2249,
2252
+ "畅": 2250,
2253
+ "匪": 2251,
2254
+ "梵": 2252,
2255
+ "丛": 2253,
2256
+ "躲": 2254,
2257
+ "蒲": 2255,
2258
+ "泄": 2256,
2259
+ "坟": 2257,
2260
+ "豹": 2258,
2261
+ "寇": 2259,
2262
+ "垦": 2260,
2263
+ "肤": 2261,
2264
+ "讽": 2262,
2265
+ "丙": 2263,
2266
+ "妖": 2264,
2267
+ "玲": 2265,
2268
+ "℃": 2266,
2269
+ "肠": 2267,
2270
+ "尘": 2268,
2271
+ "伞": 2269,
2272
+ "歉": 2270,
2273
+ "勘": 2271,
2274
+ "坂": 2272,
2275
+ "挺": 2273,
2276
+ "扑": 2274,
2277
+ "妥": 2275,
2278
+ "塌": 2276,
2279
+ "盈": 2277,
2280
+ "肩": 2278,
2281
+ "铃": 2279,
2282
+ "厌": 2280,
2283
+ "臂": 2281,
2284
+ "辱": 2282,
2285
+ "喝": 2283,
2286
+ "旱": 2284,
2287
+ "蚀": 2285,
2288
+ "帆": 2286,
2289
+ "遮": 2287,
2290
+ "嘛": 2288,
2291
+ "仕": 2289,
2292
+ "夸": 2290,
2293
+ "昏": 2291,
2294
+ "腺": 2292,
2295
+ "锅": 2293,
2296
+ "饥": 2294,
2297
+ "贬": 2295,
2298
+ "滥": 2296,
2299
+ "饼": 2297,
2300
+ "纺": 2298,
2301
+ "谴": 2299,
2302
+ "畏": 2300,
2303
+ "轩": 2301,
2304
+ "悦": 2302,
2305
+ "徙": 2303,
2306
+ "牺": 2304,
2307
+ "颂": 2305,
2308
+ "巩": 2306,
2309
+ "札": 2307,
2310
+ "肇": 2308,
2311
+ "窄": 2309,
2312
+ "骏": 2310,
2313
+ "猜": 2311,
2314
+ "唤": 2312,
2315
+ "谨": 2313,
2316
+ "卧": 2314,
2317
+ "璋": 2315,
2318
+ "怡": 2316,
2319
+ "赣": 2317,
2320
+ "椎": 2318,
2321
+ "瑜": 2319,
2322
+ "啡": 2320,
2323
+ "剿": 2321,
2324
+ "蓬": 2322,
2325
+ "搞": 2323,
2326
+ "昔": 2324,
2327
+ "颖": 2325,
2328
+ "薇": 2326,
2329
+ "钓": 2327,
2330
+ "爪": 2328,
2331
+ "腰": 2329,
2332
+ "圭": 2330,
2333
+ "捉": 2331,
2334
+ "滋": 2332,
2335
+ "戚": 2333,
2336
+ "删": 2334,
2337
+ "丢": 2335,
2338
+ "礁": 2336,
2339
+ "槽": 2337,
2340
+ "爬": 2338,
2341
+ "嫩": 2339,
2342
+ "芦": 2340,
2343
+ "谐": 2341,
2344
+ "朵": 2342,
2345
+ "庶": 2343,
2346
+ "撑": 2344,
2347
+ "缝": 2345,
2348
+ "肝": 2346,
2349
+ "葵": 2347,
2350
+ "尧": 2348,
2351
+ "柜": 2349,
2352
+ "佣": 2350,
2353
+ "贿": 2351,
2354
+ "晴": 2352,
2355
+ "冻": 2353,
2356
+ "缔": 2354,
2357
+ "喻": 2355,
2358
+ "坤": 2356,
2359
+ "矮": 2357,
2360
+ "扭": 2358,
2361
+ "框": 2359,
2362
+ "芒": 2360,
2363
+ "肿": 2361,
2364
+ "谣": 2362,
2365
+ "汶": 2363,
2366
+ "绵": 2364,
2367
+ "荃": 2365,
2368
+ "坞": 2366,
2369
+ "履": 2367,
2370
+ "漂": 2368,
2371
+ "虐": 2369,
2372
+ "坠": 2370,
2373
+ "蛮": 2371,
2374
+ "擦": 2372,
2375
+ "埠": 2373,
2376
+ "醉": 2374,
2377
+ "欺": 2375,
2378
+ "畜": 2376,
2379
+ "痕": 2377,
2380
+ "崖": 2378,
2381
+ "醇": 2379,
2382
+ "橡": 2380,
2383
+ "柬": 2381,
2384
+ "滕": 2382,
2385
+ "沦": 2383,
2386
+ "甜": 2384,
2387
+ "煌": 2385,
2388
+ "窝": 2386,
2389
+ "阎": 2387,
2390
+ "咒": 2388,
2391
+ "о": 2389,
2392
+ "艰": 2390,
2393
+ "呎": 2391,
2394
+ "佑": 2392,
2395
+ "牢": 2393,
2396
+ "澎": 2394,
2397
+ "炳": 2395,
2398
+ "勾": 2396,
2399
+ "桌": 2397,
2400
+ "苹": 2398,
2401
+ "茵": 2399,
2402
+ "鹅": 2400,
2403
+ "吓": 2401,
2404
+ "梨": 2402,
2405
+ "赦": 2403,
2406
+ "菊": 2404,
2407
+ "蝠": 2405,
2408
+ "鞋": 2406,
2409
+ "荫": 2407,
2410
+ "淹": 2408,
2411
+ "・": 2409,
2412
+ "酿": 2410,
2413
+ "竣": 2411,
2414
+ "踢": 2412,
2415
+ "毗": 2413,
2416
+ "睛": 2414,
2417
+ "祷": 2415,
2418
+ "眠": 2416,
2419
+ "哭": 2417,
2420
+ "闹": 2418,
2421
+ "秉": 2419,
2422
+ "桐": 2420,
2423
+ "攀": 2421,
2424
+ "饶": 2422,
2425
+ "铎": 2423,
2426
+ "霖": 2424,
2427
+ "疯": 2425,
2428
+ "拦": 2426,
2429
+ "膨": 2427,
2430
+ "祐": 2428,
2431
+ "饱": 2429,
2432
+ "匠": 2430,
2433
+ "眉": 2431,
2434
+ "芙": 2432,
2435
+ "瘤": 2433,
2436
+ "椅": 2434,
2437
+ "棕": 2435,
2438
+ "蝙": 2436,
2439
+ "扇": 2437,
2440
+ "窑": 2438,
2441
+ "泪": 2439,
2442
+ "幢": 2440,
2443
+ "仗": 2441,
2444
+ "肆": 2442,
2445
+ "堵": 2443,
2446
+ "茎": 2444,
2447
+ "拯": 2445,
2448
+ "跋": 2446,
2449
+ "钧": 2447,
2450
+ "芭": 2448,
2451
+ "爸": 2449,
2452
+ "娅": 2450,
2453
+ "闯": 2451,
2454
+ "斌": 2452,
2455
+ "骂": 2453,
2456
+ "绅": 2454,
2457
+ "畔": 2455,
2458
+ "苍": 2456,
2459
+ "糊": 2457,
2460
+ "烦": 2458,
2461
+ "裴": 2459,
2462
+ "+": 2460,
2463
+ "滞": 2461,
2464
+ "沼": 2462,
2465
+ "碱": 2463,
2466
+ "咏": 2464,
2467
+ "阐": 2465,
2468
+ "狐": 2466,
2469
+ "氛": 2467,
2470
+ "а": 2468,
2471
+ "妓": 2469,
2472
+ "浑": 2470,
2473
+ "吻": 2471,
2474
+ "棚": 2472,
2475
+ "帖": 2473,
2476
+ "氨": 2474,
2477
+ "浆": 2475,
2478
+ "懂": 2476,
2479
+ "涡": 2477,
2480
+ "惑": 2478,
2481
+ "盲": 2479,
2482
+ "庐": 2480,
2483
+ "谕": 2481,
2484
+ "窃": 2482,
2485
+ "宦": 2483,
2486
+ "祯": 2484,
2487
+ "帜": 2485,
2488
+ "漳": 2486,
2489
+ "虞": 2487,
2490
+ "冥": 2488,
2491
+ "聂": 2489,
2492
+ "烂": 2490,
2493
+ "罩": 2491,
2494
+ "糕": 2492,
2495
+ "悼": 2493,
2496
+ "酬": 2494,
2497
+ "厨": 2495,
2498
+ "黛": 2496,
2499
+ "熔": 2497,
2500
+ "蝶": 2498,
2501
+ "漆": 2499,
2502
+ "柄": 2500,
2503
+ "喊": 2501,
2504
+ "谍": 2502,
2505
+ "凰": 2503,
2506
+ "樱": 2504,
2507
+ "诈": 2505,
2508
+ "淳": 2506,
2509
+ "掷": 2507,
2510
+ "牟": 2508,
2511
+ "彪": 2509,
2512
+ "兔": 2510,
2513
+ "艳": 2511,
2514
+ "铅": 2512,
2515
+ "吗": 2513,
2516
+ "氯": 2514,
2517
+ "朔": 2515,
2518
+ "寡": 2516,
2519
+ "禹": 2517,
2520
+ "鞍": 2518,
2521
+ "浴": 2519,
2522
+ "猴": 2520,
2523
+ "驰": 2521,
2524
+ "渗": 2522,
2525
+ "酶": 2523,
2526
+ "笃": 2524,
2527
+ "垄": 2525,
2528
+ "‧": 2526,
2529
+ "萝": 2527,
2530
+ "庚": 2528,
2531
+ "勉": 2529,
2532
+ "苯": 2530,
2533
+ "哺": 2531,
2534
+ "酋": 2532,
2535
+ "庵": 2533,
2536
+ "舌": 2534,
2537
+ "榄": 2535,
2538
+ "焰": 2536,
2539
+ "诵": 2537,
2540
+ "и": 2538,
2541
+ "瑶": 2539,
2542
+ "藻": 2540,
2543
+ "冶": 2541,
2544
+ "岐": 2542,
2545
+ "缠": 2543,
2546
+ "陨": 2544,
2547
+ "锣": 2545,
2548
+ "峻": 2546,
2549
+ "窟": 2547,
2550
+ "邸": 2548,
2551
+ "彬": 2549,
2552
+ "旭": 2550,
2553
+ "藉": 2551,
2554
+ "凸": 2552,
2555
+ "悔": 2553,
2556
+ "枫": 2554,
2557
+ "呢": 2555,
2558
+ "砍": 2556,
2559
+ "滤": 2557,
2560
+ "褐": 2558,
2561
+ "垣": 2559,
2562
+ "滴": 2560,
2563
+ "赚": 2561,
2564
+ "弊": 2562,
2565
+ "缉": 2563,
2566
+ "毙": 2564,
2567
+ "孜": 2565,
2568
+ "裸": 2566,
2569
+ "叹": 2567,
2570
+ "饲": 2568,
2571
+ "骤": 2569,
2572
+ "皖": 2570,
2573
+ "劾": 2571,
2574
+ "翅": 2572,
2575
+ "虾": 2573,
2576
+ "畴": 2574,
2577
+ "乞": 2575,
2578
+ "赎": 2576,
2579
+ "擒": 2577,
2580
+ "衙": 2578,
2581
+ "诛": 2579,
2582
+ "傲": 2580,
2583
+ "肾": 2581,
2584
+ "淫": 2582,
2585
+ "焕": 2583,
2586
+ "淀": 2584,
2587
+ "稽": 2585,
2588
+ "磷": 2586,
2589
+ "廿": 2587,
2590
+ "倪": 2588,
2591
+ "衷": 2589,
2592
+ "谥": 2590,
2593
+ "咀": 2591,
2594
+ "沛": 2592,
2595
+ "纂": 2593,
2596
+ "烯": 2594,
2597
+ "墟": 2595,
2598
+ "蔽": 2596,
2599
+ "鞭": 2597,
2600
+ "犁": 2598,
2601
+ "挫": 2599,
2602
+ "摘": 2600,
2603
+ "倭": 2601,
2604
+ "汕": 2602,
2605
+ "匿": 2603,
2606
+ "闵": 2604,
2607
+ "婉": 2605,
2608
+ "镶": 2606,
2609
+ "麓": 2607,
2610
+ "寂": 2608,
2611
+ "胃": 2609,
2612
+ "挽": 2610,
2613
+ "狩": 2611,
2614
+ "矢": 2612,
2615
+ "垃": 2613,
2616
+ "唇": 2614,
2617
+ "戎": 2615,
2618
+ "е": 2616,
2619
+ "喉": 2617,
2620
+ "耿": 2618,
2621
+ "燥": 2619,
2622
+ "饿": 2620,
2623
+ "蔓": 2621,
2624
+ "胺": 2622,
2625
+ "碟": 2623,
2626
+ "纱": 2624,
2627
+ "棠": 2625,
2628
+ "爽": 2626,
2629
+ "窦": 2627,
2630
+ "暑": 2628,
2631
+ "粘": 2629,
2632
+ "圾": 2630,
2633
+ "冢": 2631,
2634
+ "僵": 2632,
2635
+ "笛": 2633,
2636
+ "卦": 2634,
2637
+ "彗": 2635,
2638
+ "侄": 2636,
2639
+ "竭": 2637,
2640
+ "茄": 2638,
2641
+ "嵌": 2639,
2642
+ "簧": 2640,
2643
+ "堕": 2641,
2644
+ "寅": 2642,
2645
+ "拾": 2643,
2646
+ "缆": 2644,
2647
+ "铝": 2645,
2648
+ "筋": 2646,
2649
+ "噪": 2647,
2650
+ "沧": 2648,
2651
+ "濒": 2649,
2652
+ "盒": 2650,
2653
+ "殉": 2651,
2654
+ "梭": 2652,
2655
+ "桶": 2653,
2656
+ "迄": 2654,
2657
+ "谏": 2655,
2658
+ "棱": 2656,
2659
+ "矣": 2657,
2660
+ "袍": 2658,
2661
+ "亏": 2659,
2662
+ "疲": 2660,
2663
+ "浊": 2661,
2664
+ "抬": 2662,
2665
+ "崛": 2663,
2666
+ "嫡": 2664,
2667
+ "啸": 2665,
2668
+ "诠": 2666,
2669
+ "峙": 2667,
2670
+ "羌": 2668,
2671
+ "槟": 2669,
2672
+ "滇": 2670,
2673
+ "趾": 2671,
2674
+ "兑": 2672,
2675
+ "妆": 2673,
2676
+ "舅": 2674,
2677
+ "萍": 2675,
2678
+ "竖": 2676,
2679
+ "阜": 2677,
2680
+ "弼": 2678,
2681
+ "凿": 2679,
2682
+ "陇": 2680,
2683
+ "弧": 2681,
2684
+ "雁": 2682,
2685
+ "煮": 2683,
2686
+ "湛": 2684,
2687
+ "愚": 2685,
2688
+ "韶": 2686,
2689
+ "棍": 2687,
2690
+ "芜": 2688,
2691
+ "咬": 2689,
2692
+ "稚": 2690,
2693
+ "琦": 2691,
2694
+ "绣": 2692,
2695
+ "骸": 2693,
2696
+ "曝": 2694,
2697
+ "舜": 2695,
2698
+ "黏": 2696,
2699
+ "镰": 2697,
2700
+ "霜": 2698,
2701
+ "瑰": 2699,
2702
+ "辣": 2700,
2703
+ "粟": 2701,
2704
+ "瑚": 2702,
2705
+ "匡": 2703,
2706
+ "鳍": 2704,
2707
+ "搁": 2705,
2708
+ "泌": 2706,
2709
+ "辜": 2707,
2710
+ "鳄": 2708,
2711
+ "钩": 2709,
2712
+ "摊": 2710,
2713
+ "汁": 2711,
2714
+ "蛛": 2712,
2715
+ "邯": 2713,
2716
+ "孚": 2714,
2717
+ "俞": 2715,
2718
+ "隙": 2716,
2719
+ "禾": 2717,
2720
+ "豚": 2718,
2721
+ "鳞": 2719,
2722
+ "蜥": 2720,
2723
+ "荔": 2721,
2724
+ "摔": 2722,
2725
+ "懿": 2723,
2726
+ "н": 2724,
2727
+ "挝": 2725,
2728
+ "栽": 2726,
2729
+ "敷": 2727,
2730
+ "蕴": 2728,
2731
+ "椒": 2729,
2732
+ "奕": 2730,
2733
+ "氮": 2731,
2734
+ "啤": 2732,
2735
+ "嵩": 2733,
2736
+ "谜": 2734,
2737
+ "颅": 2735,
2738
+ "胤": 2736,
2739
+ "杏": 2737,
2740
+ "眷": 2738,
2741
+ "邹": 2739,
2742
+ "蓉": 2740,
2743
+ "榴": 2741,
2744
+ "毓": 2742,
2745
+ "渤": 2743,
2746
+ "罹": 2744,
2747
+ "酵": 2745,
2748
+ "芽": 2746,
2749
+ "鲨": 2747,
2750
+ "蔬": 2748,
2751
+ "龚": 2749,
2752
+ "穗": 2750,
2753
+ "愉": 2751,
2754
+ "殴": 2752,
2755
+ "钞": 2753,
2756
+ "厕": 2754,
2757
+ "匾": 2755,
2758
+ "绸": 2756,
2759
+ "锥": 2757,
2760
+ "雏": 2758,
2761
+ "磅": 2759,
2762
+ "慌": 2760,
2763
+ "钉": 2761,
2764
+ "傍": 2762,
2765
+ "晖": 2763,
2766
+ "褒": 2764,
2767
+ "凹": 2765,
2768
+ "裤": 2766,
2769
+ "烤": 2767,
2770
+ "玫": 2768,
2771
+ "袜": 2769,
2772
+ "敕": 2770,
2773
+ "渴": 2771,
2774
+ "剌": 2772,
2775
+ "澜": 2773,
2776
+ "の": 2774,
2777
+ "罐": 2775,
2778
+ "铀": 2776,
2779
+ "→": 2777,
2780
+ "壶": 2778,
2781
+ "淞": 2779,
2782
+ "耻": 2780,
2783
+ "禽": 2781,
2784
+ "烷": 2782,
2785
+ "哨": 2783,
2786
+ "晤": 2784,
2787
+ "萌": 2785,
2788
+ "廓": 2786,
2789
+ "账": 2787,
2790
+ "晰": 2788,
2791
+ "轿": 2789,
2792
+ "郝": 2790,
2793
+ "顽": 2791,
2794
+ "飘": 2792,
2795
+ "骆": 2793,
2796
+ "俩": 2794,
2797
+ "巾": 2795,
2798
+ "砌": 2796,
2799
+ "×": 2797,
2800
+ "沫": 2798,
2801
+ "缪": 2799,
2802
+ "墩": 2800,
2803
+ "垫": 2801,
2804
+ "蛙": 2802,
2805
+ "聊": 2803,
2806
+ "讳": 2804,
2807
+ "р": 2805,
2808
+ "谅": 2806,
2809
+ "臭": 2807,
2810
+ "驼": 2808,
2811
+ "梓": 2809,
2812
+ "猩": 2810,
2813
+ "酱": 2811,
2814
+ "剖": 2812,
2815
+ "暹": 2813,
2816
+ "瞬": 2814,
2817
+ "疼": 2815,
2818
+ "橄": 2816,
2819
+ "膀": 2817,
2820
+ "蔚": 2818,
2821
+ "渭": 2819,
2822
+ "抹": 2820,
2823
+ "肚": 2821,
2824
+ "樊": 2822,
2825
+ "敲": 2823,
2826
+ "墅": 2824,
2827
+ "祺": 2825,
2828
+ "莽": 2826,
2829
+ "寝": 2827,
2830
+ "蟹": 2828,
2831
+ "棺": 2829,
2832
+ "惹": 2830,
2833
+ "膝": 2831,
2834
+ "椭": 2832,
2835
+ "妾": 2833,
2836
+ "刹": 2834,
2837
+ "舶": 2835,
2838
+ "绰": 2836,
2839
+ "舖": 2837,
2840
+ "汐": 2838,
2841
+ "郸": 2839,
2842
+ "羞": 2840,
2843
+ "塾": 2841,
2844
+ "肪": 2842,
2845
+ "奢": 2843,
2846
+ "祇": 2844,
2847
+ "驹": 2845,
2848
+ "俭": 2846,
2849
+ "裙": 2847,
2850
+ "溥": 2848,
2851
+ "瑙": 2849,
2852
+ "淋": 2850,
2853
+ "泼": 2851,
2854
+ "宵": 2852,
2855
+ "檀": 2853,
2856
+ "杖": 2854,
2857
+ "抨": 2855,
2858
+ "哩": 2856,
2859
+ "脆": 2857,
2860
+ "瀑": 2858,
2861
+ "揽": 2859,
2862
+ "冤": 2860,
2863
+ "甄": 2861,
2864
+ "谎": 2862,
2865
+ "楠": 2863,
2866
+ "氟": 2864,
2867
+ "橙": 2865,
2868
+ "$": 2866,
2869
+ "勿": 2867,
2870
+ "彝": 2868,
2871
+ "嘱": 2869,
2872
+ "虔": 2870,
2873
+ "缮": 2871,
2874
+ "榆": 2872,
2875
+ "俯": 2873,
2876
+ "炒": 2874,
2877
+ "с": 2875,
2878
+ "檐": 2876,
2879
+ "魅": 2877,
2880
+ "骄": 2878,
2881
+ "您": 2879,
2882
+ "烹": 2880,
2883
+ "岱": 2881,
2884
+ "瞄": 2882,
2885
+ "丕": 2883,
2886
+ "蜘": 2884,
2887
+ "钥": 2885,
2888
+ "壹": 2886,
2889
+ "菱": 2887,
2890
+ "濑": 2888,
2891
+ "绎": 2889,
2892
+ "岑": 2890,
2893
+ "洽": 2891,
2894
+ "嘲": 2892,
2895
+ "煽": 2893,
2896
+ "琅": 2894,
2897
+ "牡": 2895,
2898
+ "笠": 2896,
2899
+ "楷": 2897,
2900
+ "畿": 2898,
2901
+ "祂": 2899,
2902
+ "倚": 2900,
2903
+ "睹": 2901,
2904
+ "т": 2902,
2905
+ "糟": 2903,
2906
+ "苻": 2904,
2907
+ "栅": 2905,
2908
+ "莞": 2906,
2909
+ "酮": 2907,
2910
+ "膏": 2908,
2911
+ "扳": 2909,
2912
+ "诬": 2910,
2913
+ "祁": 2911,
2914
+ "蚁": 2912,
2915
+ "娥": 2913,
2916
+ "戊": 2914,
2917
+ "蜡": 2915,
2918
+ "裹": 2916,
2919
+ "驿": 2917,
2920
+ "恼": 2918,
2921
+ "梧": 2919,
2922
+ "薯": 2920,
2923
+ "磡": 2921,
2924
+ "掀": 2922,
2925
+ "撕": 2923,
2926
+ "溉": 2924,
2927
+ "谟": 2925,
2928
+ "敖": 2926,
2929
+ "匀": 2927,
2930
+ "榕": 2928,
2931
+ "缸": 2929,
2932
+ "侬": 2930,
2933
+ "凑": 2931,
2934
+ "睿": 2932,
2935
+ "钠": 2933,
2936
+ "橘": 2934,
2937
+ "匣": 2935,
2938
+ "黔": 2936,
2939
+ "锤": 2937,
2940
+ "胚": 2938,
2941
+ "戍": 2939,
2942
+ "晏": 2940,
2943
+ "哇": 2941,
2944
+ "泗": 2942,
2945
+ "睦": 2943,
2946
+ "篡": 2944,
2947
+ "捞": 2945,
2948
+ "婿": 2946,
2949
+ "璧": 2947,
2950
+ "绒": 2948,
2951
+ "棣": 2949,
2952
+ "碗": 2950,
2953
+ "憾": 2951,
2954
+ "沁": 2952,
2955
+ "焊": 2953,
2956
+ "沂": 2954,
2957
+ "舷": 2955,
2958
+ "邢": 2956,
2959
+ "瘟": 2957,
2960
+ "棘": 2958,
2961
+ "嫔": 2959,
2962
+ "焉": 2960,
2963
+ "泾": 2961,
2964
+ "樟": 2962,
2965
+ "刃": 2963,
2966
+ "慨": 2964,
2967
+ "饷": 2965,
2968
+ "缀": 2966,
2969
+ "荀": 2967,
2970
+ "琛": 2968,
2971
+ "盔": 2969,
2972
+ "堀": 2970,
2973
+ "摸": 2971,
2974
+ "雯": 2972,
2975
+ "枯": 2973,
2976
+ "釜": 2974,
2977
+ "乒": 2975,
2978
+ "渝": 2976,
2979
+ "啊": 2977,
2980
+ "堰": 2978,
2981
+ "​": 2979,
2982
+ "髓": 2980,
2983
+ "吟": 2981,
2984
+ "胖": 2982,
2985
+ "晃": 2983,
2986
+ "莹": 2984,
2987
+ "沽": 2985,
2988
+ "渣": 2986,
2989
+ "茹": 2987,
2990
+ "妄": 2988,
2991
+ "钮": 2989,
2992
+ "颌": 2990,
2993
+ "伺": 2991,
2994
+ "瘦": 2992,
2995
+ "镍": 2993,
2996
+ "狙": 2994,
2997
+ "蕉": 2995,
2998
+ "鲤": 2996,
2999
+ "邺": 2997,
3000
+ "挟": 2998,
3001
+ "淆": 2999,
3002
+ "灿": 3000,
3003
+ "′": 3001,
3004
+ "茜": 3002,
3005
+ "妨": 3003,
3006
+ "衬": 3004,
3007
+ "黜": 3005,
3008
+ "桩": 3006,
3009
+ "侮": 3007,
3010
+ "溢": 3008,
3011
+ "•": 3009,
3012
+ "抒": 3010,
3013
+ "捍": 3011,
3014
+ "洒": 3012,
3015
+ "桦": 3013,
3016
+ "鹦": 3014,
3017
+ "辗": 3015,
3018
+ "傀": 3016,
3019
+ "钙": 3017,
3020
+ "刮": 3018,
3021
+ "搏": 3019,
3022
+ "炬": 3020,
3023
+ "瓣": 3021,
3024
+ "缚": 3022,
3025
+ "擢": 3023,
3026
+ "棵": 3024,
3027
+ "娄": 3025,
3028
+ "埤": 3026,
3029
+ "麾": 3027,
3030
+ "昼": 3028,
3031
+ "瀛": 3029,
3032
+ "硝": 3030,
3033
+ "蔗": 3031,
3034
+ "汾": 3032,
3035
+ "儡": 3033,
3036
+ "嘎": 3034,
3037
+ "俸": 3035,
3038
+ "溺": 3036,
3039
+ "俾": 3037,
3040
+ "恪": 3038,
3041
+ "兀": 3039,
3042
+ "砸": 3040,
3043
+ "拢": 3041,
3044
+ "翡": 3042,
3045
+ "浚": 3043,
3046
+ "怜": 3044,
3047
+ "雳": 3045,
3048
+ "菁": 3046,
3049
+ "隍": 3047,
3050
+ "贱": 3048,
3051
+ "в": 3049,
3052
+ "锯": 3050,
3053
+ "膛": 3051,
3054
+ "陛": 3052,
3055
+ "觅": 3053,
3056
+ "熹": 3054,
3057
+ "衫": 3055,
3058
+ "捧": 3056,
3059
+ "鸽": 3057,
3060
+ "衞": 3058,
3061
+ "汞": 3059,
3062
+ "瞻": 3060,
3063
+ "粪": 3061,
3064
+ "簿": 3062,
3065
+ "锻": 3063,
3066
+ "犀": 3064,
3067
+ "吵": 3065,
3068
+ "偕": 3066,
3069
+ "忒": 3067,
3070
+ "硅": 3068,
3071
+ "扔": 3069,
3072
+ "梳": 3070,
3073
+ "讷": 3071,
3074
+ "馈": 3072,
3075
+ "蹄": 3073,
3076
+ "鹉": 3074,
3077
+ "蚌": 3075,
3078
+ "谬": 3076,
3079
+ "迭": 3077,
3080
+ "挠": 3078,
3081
+ "л": 3079,
3082
+ "浏": 3080,
3083
+ "禺": 3081,
3084
+ "斧": 3082,
3085
+ "溜": 3083,
3086
+ "吋": 3084,
3087
+ "灶": 3085,
3088
+ "匙": 3086,
3089
+ "瘫": 3087,
3090
+ "兖": 3088,
3091
+ "麒": 3089,
3092
+ "玺": 3090,
3093
+ "霹": 3091,
3094
+ "逗": 3092,
3095
+ "洼": 3093,
3096
+ "羚": 3094,
3097
+ "赂": 3095,
3098
+ "泵": 3096,
3099
+ "媛": 3097,
3100
+ "α": 3098,
3101
+ "酯": 3099,
3102
+ "瞒": 3100,
3103
+ "枕": 3101,
3104
+ "ー": 3102,
3105
+ "阙": 3103,
3106
+ "皓": 3104,
3107
+ "梗": 3105,
3108
+ "钊": 3106,
3109
+ "壬": 3107,
3110
+ "躯": 3108,
3111
+ "荧": 3109,
3112
+ "壕": 3110,
3113
+ "蜒": 3111,
3114
+ "娇": 3112,
3115
+ "诃": 3113,
3116
+ "涩": 3114,
3117
+ "垮": 3115,
3118
+ "莺": 3116,
3119
+ "桨": 3117,
3120
+ "鸠": 3118,
3121
+ "斡": 3119,
3122
+ "狸": 3120,
3123
+ "羲": 3121,
3124
+ "蜿": 3122,
3125
+ "=": 3123,
3126
+ "瑾": 3124,
3127
+ "跪": 3125,
3128
+ "韬": 3126,
3129
+ "戌": 3127,
3130
+ "杠": 3128,
3131
+ "馨": 3129,
3132
+ "磺": 3130,
3133
+ "豁": 3131,
3134
+ "炯": 3132,
3135
+ "揆": 3133,
3136
+ "庾": 3134,
3137
+ "铲": 3135,
3138
+ "猿": 3136,
3139
+ "篷": 3137,
3140
+ "掳": 3138,
3141
+ "烛": 3139,
3142
+ "譬": 3140,
3143
+ "瑛": 3141,
3144
+ "腕": 3142,
3145
+ "懋": 3143,
3146
+ "挣": 3144,
3147
+ "淤": 3145,
3148
+ "稠": 3146,
3149
+ "骼": 3147,
3150
+ "敛": 3148,
3151
+ "芯": 3149,
3152
+ "蚕": 3150,
3153
+ "舵": 3151,
3154
+ "讶": 3152,
3155
+ "戮": 3153,
3156
+ "琰": 3154,
3157
+ "霉": 3155,
3158
+ "羁": 3156,
3159
+ "腥": 3157,
3160
+ "婷": 3158,
3161
+ "颐": 3159,
3162
+ "晕": 3160,
3163
+ "い": 3161,
3164
+ "胰": 3162,
3165
+ "耆": 3163,
3166
+ "к": 3164,
3167
+ "〔": 3165,
3168
+ "苛": 3166,
3169
+ "闰": 3167,
3170
+ "霆": 3168,
3171
+ "箕": 3169,
3172
+ "泣": 3170,
3173
+ "岂": 3171,
3174
+ "噬": 3172,
3175
+ "瘾": 3173,
3176
+ "〕": 3174,
3177
+ "帛": 3175,
3178
+ "痪": 3176,
3179
+ "绞": 3177,
3180
+ "喂": 3178,
3181
+ "诰": 3179,
3182
+ "肋": 3180,
3183
+ "陡": 3181,
3184
+ "渥": 3182,
3185
+ "喙": 3183,
3186
+ "孵": 3184,
3187
+ "芮": 3185,
3188
+ "膳": 3186,
3189
+ "笨": 3187,
3190
+ "龛": 3188,
3191
+ "恕": 3189,
3192
+ "陋": 3190,
3193
+ "乓": 3191,
3194
+ "蝉": 3192,
3195
+ "诫": 3193,
3196
+ "哉": 3194,
3197
+ "坜": 3195,
3198
+ "遏": 3196,
3199
+ "蝴": 3197,
3200
+ "扯": 3198,
3201
+ "汴": 3199,
3202
+ "霄": 3200,
3203
+ "鑫": 3201,
3204
+ "鞑": 3202,
3205
+ "岚": 3203,
3206
+ "曜": 3204,
3207
+ "诅": 3205,
3208
+ "祚": 3206,
3209
+ "铉": 3207,
3210
+ "呆": 3208,
3211
+ "萃": 3209,
3212
+ "恺": 3210,
3213
+ "葱": 3211,
3214
+ "隘": 3212,
3215
+ "卯": 3213,
3216
+ "锚": 3214,
3217
+ "穹": 3215,
3218
+ "谤": 3216,
3219
+ "苟": 3217,
3220
+ "巅": 3218,
3221
+ "槛": 3219,
3222
+ "驯": 3220,
3223
+ "埗": 3221,
3224
+ "咎": 3222,
3225
+ "盎": 3223,
3226
+ "蔑": 3224,
3227
+ "沸": 3225,
3228
+ "枣": 3226,
3229
+ "兜": 3227,
3230
+ "昊": 3228,
3231
+ "悄": 3229,
3232
+ "釉": 3230,
3233
+ "闷": 3231,
3234
+ "沮": 3232,
3235
+ "潼": 3233,
3236
+ "崁": 3234,
3237
+ "汲": 3235,
3238
+ "钾": 3236,
3239
+ "泻": 3237,
3240
+ "萎": 3238,
3241
+ "桢": 3239,
3242
+ "艮": 3240,
3243
+ "隅": 3241,
3244
+ "臀": 3242,
3245
+ "【": 3243,
3246
+ "坍": 3244,
3247
+ "屑": 3245,
3248
+ "於": 3246,
3249
+ "垩": 3247,
3250
+ "靡": 3248,
3251
+ "嗜": 3249,
3252
+ "晒": 3250,
3253
+ "钵": 3251,
3254
+ "阱": 3252,
3255
+ "剃": 3253,
3256
+ "】": 3254,
3257
+ "臧": 3255,
3258
+ "魄": 3256,
3259
+ "楞": 3257,
3260
+ "卤": 3258,
3261
+ "矫": 3259,
3262
+ "氦": 3260,
3263
+ "毘": 3261,
3264
+ "峨": 3262,
3265
+ "夭": 3263,
3266
+ "靶": 3264,
3267
+ "佬": 3265,
3268
+ "稷": 3266,
3269
+ "薰": 3267,
3270
+ "皈": 3268,
3271
+ "聆": 3269,
3272
+ "谒": 3270,
3273
+ "恳": 3271,
3274
+ "殆": 3272,
3275
+ "敞": 3273,
3276
+ "臻": 3274,
3277
+ "皋": 3275,
3278
+ "萤": 3276,
3279
+ "锌": 3277,
3280
+ "翟": 3278,
3281
+ "毯": 3279,
3282
+ "韧": 3280,
3283
+ "桅": 3281,
3284
+ "诡": 3282,
3285
+ "贮": 3283,
3286
+ "埕": 3284,
3287
+ "煞": 3285,
3288
+ "磐": 3286,
3289
+ "秃": 3287,
3290
+ "甬": 3288,
3291
+ "姨": 3289,
3292
+ "叡": 3290,
3293
+ "筛": 3291,
3294
+ "靼": 3292,
3295
+ "沥": 3293,
3296
+ "妒": 3294,
3297
+ "〇": 3295,
3298
+ "孢": 3296,
3299
+ "幡": 3297,
3300
+ "朽": 3298,
3301
+ "栈": 3299,
3302
+ "藔": 3300,
3303
+ "赈": 3301,
3304
+ "盼": 3302,
3305
+ "伶": 3303,
3306
+ "蚊": 3304,
3307
+ "咪": 3305,
3308
+ "竺": 3306,
3309
+ "辕": 3307,
3310
+ "媚": 3308,
3311
+ "漕": 3309,
3312
+ "奚": 3310,
3313
+ "聋": 3311,
3314
+ "弩": 3312,
3315
+ "昧": 3313,
3316
+ "鹫": 3314,
3317
+ "侏": 3315,
3318
+ "攸": 3316,
3319
+ "旷": 3317,
3320
+ "暮": 3318,
3321
+ "ン": 3319,
3322
+ "圻": 3320,
3323
+ "痴": 3321,
3324
+ "翘": 3322,
3325
+ "锈": 3323,
3326
+ "β": 3324,
3327
+ "皂": 3325,
3328
+ "濠": 3326,
3329
+ "幌": 3327,
3330
+ "炀": 3328,
3331
+ "躁": 3329,
3332
+ "匆": 3330,
3333
+ "鸾": 3331,
3334
+ "し": 3332,
3335
+ "胥": 3333,
3336
+ "晟": 3334,
3337
+ "愁": 3335,
3338
+ "殡": 3336,
3339
+ "侃": 3337,
3340
+ "陌": 3338,
3341
+ "潟": 3339,
3342
+ "沾": 3340,
3343
+ "峭": 3341,
3344
+ "啦": 3342,
3345
+ "栓": 3343,
3346
+ "乍": 3344,
3347
+ "悖": 3345,
3348
+ "岔": 3346,
3349
+ "曦": 3347,
3350
+ "疟": 3348,
3351
+ "淇": 3349,
3352
+ "冉": 3350,
3353
+ "憩": 3351,
3354
+ "珂": 3352,
3355
+ "瞩": 3353,
3356
+ "菇": 3354,
3357
+ "瞿": 3355,
3358
+ "芸": 3356,
3359
+ "钝": 3357,
3360
+ "遁": 3358,
3361
+ "哗": 3359,
3362
+ "咽": 3360,
3363
+ "燮": 3361,
3364
+ "茱": 3362,
3365
+ "煎": 3363,
3366
+ "м": 3364,
3367
+ "捣": 3365,
3368
+ "踩": 3366,
3369
+ "沐": 3367,
3370
+ "蕊": 3368,
3371
+ "撼": 3369,
3372
+ "炽": 3370,
3373
+ "疹": 3371,
3374
+ "笙": 3372,
3375
+ "拐": 3373,
3376
+ "娴": 3374,
3377
+ "苓": 3375,
3378
+ "毋": 3376,
3379
+ "瓮": 3377,
3380
+ "朕": 3378,
3381
+ "寰": 3379,
3382
+ "婢": 3380,
3383
+ "娟": 3381,
3384
+ "竿": 3382,
3385
+ "倦": 3383,
3386
+ "曙": 3384,
3387
+ "歪": 3385,
3388
+ "窜": 3386,
3389
+ "蟾": 3387,
3390
+ "钰": 3388,
3391
+ "衮": 3389,
3392
+ "懒": 3390,
3393
+ "た": 3391,
3394
+ "秽": 3392,
3395
+ "躺": 3393,
3396
+ "д": 3394,
3397
+ ">": 3395,
3398
+ "蓟": 3396,
3399
+ "侈": 3397,
3400
+ "琶": 3398,
3401
+ "桧": 3399,
3402
+ "撃": 3400,
3403
+ "崭": 3401,
3404
+ "赁": 3402,
3405
+ "蝇": 3403,
3406
+ "篆": 3404,
3407
+ "琵": 3405,
3408
+ "狠": 3406,
3409
+ "拷": 3407,
3410
+ "濬": 3408,
3411
+ "捆": 3409,
3412
+ "褚": 3410,
3413
+ "甥": 3411,
3414
+ "辍": 3412,
3415
+ "镀": 3413,
3416
+ "莒": 3414,
3417
+ "骁": 3415,
3418
+ "耍": 3416,
3419
+ "禀": 3417,
3420
+ "癸": 3418,
3421
+ "靴": 3419,
3422
+ "醛": 3420,
3423
+ "焘": 3421,
3424
+ "卉": 3422,
3425
+ "愧": 3423,
3426
+ "挚": 3424,
3427
+ "纶": 3425,
3428
+ "裘": 3426,
3429
+ "绮": 3427,
3430
+ "喘": 3428,
3431
+ "翊": 3429,
3432
+ "窥": 3430,
3433
+ "肛": 3431,
3434
+ "衅": 3432,
3435
+ "椰": 3433,
3436
+ "う": 3434,
3437
+ "绛": 3435,
3438
+ "嫉": 3436,
3439
+ "碉": 3437,
3440
+ "莓": 3438,
3441
+ "睐": 3439,
3442
+ "淄": 3440,
3443
+ "诽": 3441,
3444
+ "觐": 3442,
3445
+ "悍": 3443,
3446
+ "巳": 3444,
3447
+ "に": 3445,
3448
+ "颚": 3446,
3449
+ "玮": 3447,
3450
+ "樵": 3448,
3451
+ "暇": 3449,
3452
+ "酉": 3450,
3453
+ "倩": 3451,
3454
+ "苔": 3452,
3455
+ "灼": 3453,
3456
+ "醋": 3454,
3457
+ "臼": 3455,
3458
+ "泸": 3456,
3459
+ "у": 3457,
3460
+ "萱": 3458,
3461
+ "羡": 3459,
3462
+ "镐": 3460,
3463
+ "炫": 3461,
3464
+ "糙": 3462,
3465
+ "饪": 3463,
3466
+ "侗": 3464,
3467
+ "琐": 3465,
3468
+ "渎": 3466,
3469
+ "腻": 3467,
3470
+ "濮": 3468,
3471
+ "烃": 3469,
3472
+ "彤": 3470,
3473
+ "媳": 3471,
3474
+ "|": 3472,
3475
+ "襟": 3473,
3476
+ "磊": 3474,
3477
+ "浒": 3475,
3478
+ "昙": 3476,
3479
+ "裳": 3477,
3480
+ "蠢": 3478,
3481
+ "牒": 3479,
3482
+ "脾": 3480,
3483
+ "捡": 3481,
3484
+ "绶": 3482,
3485
+ "恤": 3483,
3486
+ "迥": 3484,
3487
+ "葫": 3485,
3488
+ "昨": 3486,
3489
+ "帘": 3487,
3490
+ "耸": 3488,
3491
+ "酰": 3489,
3492
+ "莆": 3490,
3493
+ "卅": 3491,
3494
+ "叱": 3492,
3495
+ "○": 3493,
3496
+ "镖": 3494,
3497
+ "か": 3495,
3498
+ "囱": 3496,
3499
+ "僻": 3497,
3500
+ "槃": 3498,
3501
+ "遴": 3499,
3502
+ "晦": 3500,
3503
+ "珪": 3501,
3504
+ "弈": 3502,
3505
+ "瓯": 3503,
3506
+ "戟": 3504,
3507
+ "槐": 3505,
3508
+ "鄙": 3506,
3509
+ "谛": 3507,
3510
+ "銮": 3508,
3511
+ "颍": 3509,
3512
+ "镁": 3510,
3513
+ "蕙": 3511,
3514
+ "捏": 3512,
3515
+ "僖": 3513,
3516
+ "扼": 3514,
3517
+ "湄": 3515,
3518
+ "葆": 3516,
3519
+ "と": 3517,
3520
+ "艏": 3518,
3521
+ "ん": 3519,
3522
+ "铨": 3520,
3523
+ "潢": 3521,
3524
+ "埼": 3522,
3525
+ "衢": 3523,
3526
+ "馏": 3524,
3527
+ "咐": 3525,
3528
+ "贻": 3526,
3529
+ "谚": 3527,
3530
+ "绫": 3528,
3531
+ "迂": 3529,
3532
+ "虢": 3530,
3533
+ "膺": 3531,
3534
+ "佃": 3532,
3535
+ "苇": 3533,
3536
+ "呕": 3534,
3537
+ "砾": 3535,
3538
+ "飙": 3536,
3539
+ "憎": 3537,
3540
+ "陂": 3538,
3541
+ "哑": 3539,
3542
+ "惕": 3540,
3543
+ "鹃": 3541,
3544
+ "鸥": 3542,
3545
+ "烘": 3543,
3546
+ "﹐": 3544,
3547
+ "趟": 3545,
3548
+ "矽": 3546,
3549
+ "慷": 3547,
3550
+ "扈": 3548,
3551
+ "杞": 3549,
3552
+ "傻": 3550,
3553
+ "鹭": 3551,
3554
+ "肴": 3552,
3555
+ "畸": 3553,
3556
+ "る": 3554,
3557
+ "佚": 3555,
3558
+ "挨": 3556,
3559
+ "丐": 3557,
3560
+ "艋": 3558,
3561
+ "昱": 3559,
3562
+ "逵": 3560,
3563
+ "咤": 3561,
3564
+ "舺": 3562,
3565
+ "氰": 3563,
3566
+ "亵": 3564,
3567
+ "忏": 3565,
3568
+ "柑": 3566,
3569
+ "伎": 3567,
3570
+ "峇": 3568,
3571
+ "倘": 3569,
3572
+ "奘": 3570,
3573
+ "滔": 3571,
3574
+ "粥": 3572,
3575
+ "磋": 3573,
3576
+ "濂": 3574,
3577
+ "涧": 3575,
3578
+ "芥": 3576,
3579
+ "氹": 3577,
3580
+ "鄞": 3578,
3581
+ "铠": 3579,
3582
+ "蜗": 3580,
3583
+ "匮": 3581,
3584
+ "轼": 3582,
3585
+ "诀": 3583,
3586
+ "_": 3584,
3587
+ "冗": 3585,
3588
+ "泷": 3586,
3589
+ "俨": 3587,
3590
+ "鹘": 3588,
3591
+ "乂": 3589,
3592
+ "笋": 3590,
3593
+ "惶": 3591,
3594
+ "湳": 3592,
3595
+ "骠": 3593,
3596
+ "﹑": 3594,
3597
+ "姪": 3595,
3598
+ "孺": 3596,
3599
+ "︰": 3597,
3600
+ "骇": 3598,
3601
+ "瑕": 3599,
3602
+ "闾": 3600,
3603
+ "*": 3601,
3604
+ "尬": 3602,
3605
+ "氓": 3603,
3606
+ "<": 3604,
3607
+ "鳌": 3605,
3608
+ "雉": 3606,
3609
+ "浇": 3607,
3610
+ "驴": 3608,
3611
+ "佥": 3609,
3612
+ "稼": 3610,
3613
+ "榈": 3611,
3614
+ "妊": 3612,
3615
+ "馅": 3613,
3616
+ "尴": 3614,
3617
+ "鹊": 3615,
3618
+ "酪": 3616,
3619
+ "芹": 3617,
3620
+ "妲": 3618,
3621
+ "绯": 3619,
3622
+ "刁": 3620,
3623
+ "弋": 3621,
3624
+ "皱": 3622,
3625
+ "洮": 3623,
3626
+ "醮": 3624,
3627
+ "隼": 3625,
3628
+ "吼": 3626,
3629
+ "圃": 3627,
3630
+ "悚": 3628,
3631
+ "芷": 3629,
3632
+ "锂": 3630,
3633
+ "辙": 3631,
3634
+ "莘": 3632,
3635
+ "熏": 3633,
3636
+ "偃": 3634,
3637
+ "碘": 3635,
3638
+ "酌": 3636,
3639
+ "邕": 3637,
3640
+ "跆": 3638,
3641
+ "搅": 3639,
3642
+ "炜": 3640,
3643
+ "鸮": 3641,
3644
+ "脖": 3642,
3645
+ "吠": 3643,
3646
+ "疮": 3644,
3647
+ "绢": 3645,
3648
+ "弑": 3646,
3649
+ "钛": 3647,
3650
+ "枋": 3648,
3651
+ "懈": 3649,
3652
+ "缎": 3650,
3653
+ "肘": 3651,
3654
+ "钗": 3652,
3655
+ "傣": 3653,
3656
+ "潞": 3654,
3657
+ "茫": 3655,
3658
+ "咳": 3656,
3659
+ "劈": 3657,
3660
+ "碁": 3658,
3661
+ "孀": 3659,
3662
+ "浜": 3660,
3663
+ "黯": 3661,
3664
+ "熄": 3662,
3665
+ "琮": 3663,
3666
+ "拙": 3664,
3667
+ "猷": 3665,
3668
+ "榔": 3666,
3669
+ "筝": 3667,
3670
+ "跻": 3668,
3671
+ "奄": 3669,
3672
+ "郢": 3670,
3673
+ "#": 3671,
3674
+ "汛": 3672,
3675
+ "诣": 3673,
3676
+ "酗": 3674,
3677
+ "诲": 3675,
3678
+ "筠": 3676,
3679
+ "妳": 3677,
3680
+ "吡": 3678,
3681
+ "蝎": 3679,
3682
+ "沅": 3680,
3683
+ "骅": 3681,
3684
+ "褶": 3682,
3685
+ "拂": 3683,
3686
+ "髅": 3684,
3687
+ "畑": 3685,
3688
+ "ル": 3686,
3689
+ "唆": 3687,
3690
+ "丫": 3688,
3691
+ "颉": 3689,
3692
+ "萼": 3690,
3693
+ "剔": 3691,
3694
+ "佗": 3692,
3695
+ "珲": 3693,
3696
+ "翎": 3694,
3697
+ "胄": 3695,
3698
+ "は": 3696,
3699
+ "攘": 3697,
3700
+ "蟒": 3698,
3701
+ "琏": 3699,
3702
+ "颤": 3700,
3703
+ "驸": 3701,
3704
+ "苷": 3702,
3705
+ "ス": 3703,
3706
+ "п": 3704,
3707
+ "樽": 3705,
3708
+ "讥": 3706,
3709
+ "逍": 3707,
3710
+ "椿": 3708,
3711
+ "耦": 3709,
3712
+ "宸": 3710,
3713
+ "怠": 3711,
3714
+ "靳": 3712,
3715
+ "巍": 3713,
3716
+ "镕": 3714,
3717
+ "ы": 3715,
3718
+ "な": 3716,
3719
+ "暱": 3717,
3720
+ "蜴": 3718,
3721
+ "骷": 3719,
3722
+ "□": 3720,
3723
+ "掖": 3721,
3724
+ "嚣": 3722,
3725
+ "劭": 3723,
3726
+ "嗅": 3724,
3727
+ "苞": 3725,
3728
+ "烽": 3726,
3729
+ "鄱": 3727,
3730
+ "晷": 3728,
3731
+ "宕": 3729,
3732
+ "り": 3730,
3733
+ "楔": 3731,
3734
+ "贲": 3732,
3735
+ "酚": 3733,
3736
+ "孛": 3734,
3737
+ "荼": 3735,
3738
+ "乖": 3736,
3739
+ "瞳": 3737,
3740
+ "崧": 3738,
3741
+ "沌": 3739,
3742
+ "蒜": 3740,
3743
+ "昶": 3741,
3744
+ "羟": 3742,
3745
+ "篱": 3743,
3746
+ "鞠": 3744,
3747
+ "檎": 3745,
3748
+ "菅": 3746,
3749
+ "堑": 3747,
3750
+ "酥": 3748,
3751
+ "て": 3749,
3752
+ "慑": 3750,
3753
+ "窘": 3751,
3754
+ "躬": 3752,
3755
+ "荥": 3753,
3756
+ "拌": 3754,
3757
+ "柠": 3755,
3758
+ "耽": 3756,
3759
+ "岷": 3757,
3760
+ "枭": 3758,
3761
+ "癫": 3759,
3762
+ "谔": 3760,
3763
+ "й": 3761,
3764
+ "寥": 3762,
3765
+ "糯": 3763,
3766
+ "潍": 3764,
3767
+ "叩": 3765,
3768
+ "滦": 3766,
3769
+ "峪": 3767,
3770
+ "瓒": 3768,
3771
+ "喃": 3769,
3772
+ "嬴": 3770,
3773
+ "覃": 3771,
3774
+ "赃": 3772,
3775
+ "肽": 3773,
3776
+ "巽": 3774,
3777
+ "涤": 3775,
3778
+ "鞘": 3776,
3779
+ "嫂": 3777,
3780
+ "葺": 3778,
3781
+ "槿": 3779,
3782
+ "茉": 3780,
3783
+ "悯": 3781,
3784
+ "渲": 3782,
3785
+ "徘": 3783,
3786
+ "曳": 3784,
3787
+ "叮": 3785,
3788
+ "骧": 3786,
3789
+ "哮": 3787,
3790
+ "‰": 3788,
3791
+ "я": 3789,
3792
+ "涿": 3790,
3793
+ "呀": 3791,
3794
+ "俑": 3792,
3795
+ "娼": 3793,
3796
+ "澡": 3794,
3797
+ "佟": 3795,
3798
+ "漱": 3796,
3799
+ "徊": 3797,
3800
+ "纾": 3798,
3801
+ "渚": 3799,
3802
+ "迳": 3800,
3803
+ "撇": 3801,
3804
+ "僭": 3802,
3805
+ "溧": 3803,
3806
+ "簇": 3804,
3807
+ "邝": 3805,
3808
+ "璜": 3806,
3809
+ "貂": 3807,
3810
+ "イ": 3808,
3811
+ "艉": 3809,
3812
+ "粽": 3810,
3813
+ "炙": 3811,
3814
+ "榨": 3812,
3815
+ "蛟": 3813,
3816
+ "砚": 3814,
3817
+ "瀚": 3815,
3818
+ "檬": 3816,
3819
+ "斛": 3817,
3820
+ "蚂": 3818,
3821
+ "怯": 3819,
3822
+ "屁": 3820,
3823
+ "峦": 3821,
3824
+ "钚": 3822,
3825
+ "瑄": 3823,
3826
+ "叭": 3824,
3827
+ "岌": 3825,
3828
+ "镛": 3826,
3829
+ "ま": 3827,
3830
+ "颓": 3828,
3831
+ "铂": 3829,
3832
+ "г": 3830,
3833
+ "踞": 3831,
3834
+ "讹": 3832,
3835
+ "夔": 3833,
3836
+ "跤": 3834,
3837
+ "聿": 3835,
3838
+ "祜": 3836,
3839
+ "柩": 3837,
3840
+ "を": 3838,
3841
+ "墘": 3839,
3842
+ "骥": 3840,
3843
+ "妍": 3841,
3844
+ "纥": 3842,
3845
+ "霰": 3843,
3846
+ "娠": 3844,
3847
+ "が": 3845,
3848
+ "琥": 3846,
3849
+ "き": 3847,
3850
+ "兮": 3848,
3851
+ "迺": 3849,
3852
+ "惰": 3850,
3853
+ "睾": 3851,
3854
+ "鼬": 3852,
3855
+ "秤": 3853,
3856
+ "拣": 3854,
3857
+ "贰": 3855,
3858
+ "眺": 3856,
3859
+ "娩": 3857,
3860
+ "摹": 3858,
3861
+ "笈": 3859,
3862
+ "铿": 3860,
3863
+ "髻": 3861,
3864
+ "沱": 3862,
3865
+ "籽": 3863,
3866
+ "睢": 3864,
3867
+ "谯": 3865,
3868
+ "痒": 3866,
3869
+ "帧": 3867,
3870
+ "蔷": 3868,
3871
+ "赓": 3869,
3872
+ "煜": 3870,
3873
+ "呐": 3871,
3874
+ "牦": 3872,
3875
+ "戡": 3873,
3876
+ "洙": 3874,
3877
+ "仄": 3875,
3878
+ "砷": 3876,
3879
+ "ラ": 3877,
3880
+ "昕": 3878,
3881
+ "歹": 3879,
3882
+ "嬷": 3880,
3883
+ "饵": 3881,
3884
+ "嬉": 3882,
3885
+ "痊": 3883,
3886
+ "钳": 3884,
3887
+ "纣": 3885,
3888
+ "酝": 3886,
3889
+ "璞": 3887,
3890
+ "堃": 3888,
3891
+ "涟": 3889,
3892
+ "勐": 3890,
3893
+ "腌": 3891,
3894
+ "恃": 3892,
3895
+ "浔": 3893,
3896
+ "畠": 3894,
3897
+ "掏": 3895,
3898
+ "珅": 3896,
3899
+ "颊": 3897,
3900
+ "遐": 3898,
3901
+ "檄": 3899,
3902
+ "掺": 3900,
3903
+ "苳": 3901,
3904
+ "璟": 3902,
3905
+ "盏": 3903,
3906
+ "潇": 3904,
3907
+ "绊": 3905,
3908
+ "锟": 3906,
3909
+ "さ": 3907,
3910
+ "枉": 3908,
3911
+ "窒": 3909,
3912
+ "馥": 3910,
3913
+ "锭": 3911,
3914
+ "瞭": 3912,
3915
+ "崴": 3913,
3916
+ "缙": 3914,
3917
+ "灏": 3915,
3918
+ "辄": 3916,
3919
+ "筿": 3917,
3920
+ "羹": 3918,
3921
+ "歙": 3919,
3922
+ "锺": 3920,
3923
+ "唾": 3921,
3924
+ "呜": 3922,
3925
+ "ь": 3923,
3926
+ "咕": 3924,
3927
+ "璐": 3925,
3928
+ "碌": 3926,
3929
+ "蝗": 3927,
3930
+ "熬": 3928,
3931
+ "莪": 3929,
3932
+ "琨": 3930,
3933
+ "疵": 3931,
3934
+ "抖": 3932,
3935
+ "鳃": 3933,
3936
+ "栾": 3934,
3937
+ "淦": 3935,
3938
+ "筏": 3936,
3939
+ "狡": 3937,
3940
+ "熵": 3938,
3941
+ "懦": 3939,
3942
+ "窖": 3940,
3943
+ "戛": 3941,
3944
+ "恬": 3942,
3945
+ "忱": 3943,
3946
+ "杵": 3944,
3947
+ "嗓": 3945,
3948
+ "孽": 3946,
3949
+ "螂": 3947,
3950
+ "铮": 3948,
3951
+ "撮": 3949,
3952
+ "茧": 3950,
3953
+ "卞": 3951,
3954
+ "甩": 3952,
3955
+ "锷": 3953,
3956
+ "荟": 3954,
3957
+ "泮": 3955,
3958
+ "钨": 3956,
3959
+ "秧": 3957,
3960
+ "掸": 3958,
3961
+ "槌": 3959,
3962
+ "彧": 3960,
3963
+ "醚": 3961,
3964
+ "踝": 3962,
3965
+ "岬": 3963,
3966
+ "凋": 3964,
3967
+ "惇": 3965,
3968
+ "姥": 3966,
3969
+ "骞": 3967,
3970
+ "ч": 3968,
3971
+ "驭": 3969,
3972
+ "柿": 3970,
3973
+ "リ": 3971,
3974
+ "拇": 3972,
3975
+ "吩": 3973,
3976
+ "屎": 3974,
3977
+ "涪": 3975,
3978
+ "辫": 3976,
3979
+ "圩": 3977,
3980
+ "旌": 3978,
3981
+ "鼐": 3979,
3982
+ "筵": 3980,
3983
+ "溅": 3981,
3984
+ "螈": 3982,
3985
+ "祉": 3983,
3986
+ "惮": 3984,
3987
+ "阇": 3985,
3988
+ "癖": 3986,
3989
+ "峤": 3987,
3990
+ "氐": 3988,
3991
+ "瞰": 3989,
3992
+ "歆": 3990,
3993
+ "铯": 3991,
3994
+ "蛾": 3992,
3995
+ "轧": 3993,
3996
+ "镳": 3994,
3997
+ "汹": 3995,
3998
+ "璇": 3996,
3999
+ "ト": 3997,
4000
+ "佯": 3998,
4001
+ "楣": 3999,
4002
+ "哄": 4000
4003
+ }
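
The vocabulary file closed above is a plain token-to-index JSON map, and every key visible in this diff is a single character, so it appears to be character-level for Chinese. As a hedged illustration only (not the repository's own loading code), a minimal sketch of reading that map with the standard `json` module might look like the following; the `[UNK]` fallback name is an assumption mirrored from the special tokens visible in `vocab_wiki_4k_en.json` below.

```python
# Minimal sketch (assumption: not the repository's own loader): turn
# vocab_wiki_4k.json into token<->id lookups. The "[UNK]" fallback id is an
# assumption based on the special tokens shown in vocab_wiki_4k_en.json.
import json

with open("vocab_wiki_4k.json", encoding="utf-8") as f:
    token_to_id = json.load(f)          # e.g. token_to_id["芭"] == 2448

id_to_token = {i: t for t, i in token_to_id.items()}

def encode_chars(text):
    """Character-level lookup; characters outside the 4k vocab map to [UNK]."""
    unk = token_to_id.get("[UNK]", 1)
    return [token_to_id.get(ch, unk) for ch in text]

print(encode_chars("芭爸"))             # -> [2448, 2449] per the mapping above
```
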
vocab_wiki_4k_en.json ADDED
@@ -0,0 +1,4002 @@
 
1
+ {
2
+ "[PAD]": 0,
3
+ "[UNK]": 1,
4
+ "[CLS]": 2,
5
+ "[SEP]": 3,
6
+ "[MASK]": 4,
7
+ "!": 5,
8
+ "\"": 6,
9
+ "'": 7,
10
+ "(": 8,
11
+ ")": 9,
12
+ ",": 10,
13
+ "-": 11,
14
+ ".": 12,
15
+ "0": 13,
16
+ "1": 14,
17
+ "2": 15,
18
+ "3": 16,
19
+ "4": 17,
20
+ "5": 18,
21
+ "6": 19,
22
+ "7": 20,
23
+ "8": 21,
24
+ "9": 22,
25
+ ":": 23,
26
+ ";": 24,
27
+ "?": 25,
28
+ "A": 26,
29
+ "B": 27,
30
+ "C": 28,
31
+ "D": 29,
32
+ "E": 30,
33
+ "F": 31,
34
+ "G": 32,
35
+ "H": 33,
36
+ "I": 34,
37
+ "J": 35,
38
+ "K": 36,
39
+ "L": 37,
40
+ "M": 38,
41
+ "N": 39,
42
+ "O": 40,
43
+ "P": 41,
44
+ "Q": 42,
45
+ "R": 43,
46
+ "S": 44,
47
+ "T": 45,
48
+ "U": 46,
49
+ "V": 47,
50
+ "W": 48,
51
+ "X": 49,
52
+ "Y": 50,
53
+ "Z": 51,
54
+ "[": 52,
55
+ "]": 53,
56
+ "a": 54,
57
+ "b": 55,
58
+ "c": 56,
59
+ "d": 57,
60
+ "e": 58,
61
+ "f": 59,
62
+ "g": 60,
63
+ "h": 61,
64
+ "i": 62,
65
+ "j": 63,
66
+ "k": 64,
67
+ "l": 65,
68
+ "m": 66,
69
+ "n": 67,
70
+ "o": 68,
71
+ "p": 69,
72
+ "q": 70,
73
+ "r": 71,
74
+ "s": 72,
75
+ "t": 73,
76
+ "u": 74,
77
+ "v": 75,
78
+ "w": 76,
79
+ "x": 77,
80
+ "y": 78,
81
+ "z": 79,
82
+ "\u0109": 80,
83
+ "\u010a": 81,
84
+ "\u0120": 82,
85
+ "\u0120t": 83,
86
+ "\u0120a": 84,
87
+ "he": 85,
88
+ "in": 86,
89
+ "\u0120the": 87,
90
+ "er": 88,
91
+ "on": 89,
92
+ "re": 90,
93
+ "\u0120o": 91,
94
+ "at": 92,
95
+ "\u0120s": 93,
96
+ "en": 94,
97
+ "ed": 95,
98
+ "an": 96,
99
+ "is": 97,
100
+ "\u0120c": 98,
101
+ "\u0120w": 99,
102
+ "es": 100,
103
+ "it": 101,
104
+ "or": 102,
105
+ "\u0120of": 103,
106
+ "al": 104,
107
+ "\u0120an": 105,
108
+ "\u0120b": 106,
109
+ "ar": 107,
110
+ "\u0120in": 108,
111
+ "\u0120f": 109,
112
+ "\u0120p": 110,
113
+ "ic": 111,
114
+ "\u0120and": 112,
115
+ "ing": 113,
116
+ "ion": 114,
117
+ "as": 115,
118
+ "ro": 116,
119
+ "\u0120m": 117,
120
+ "\u0120to": 118,
121
+ "\u0120d": 119,
122
+ "le": 120,
123
+ "\u0120A": 121,
124
+ "\u0120h": 122,
125
+ "ent": 123,
126
+ "ou": 124,
127
+ "\u0120T": 125,
128
+ "om": 126,
129
+ "\u0120re": 127,
130
+ "st": 128,
131
+ "ly": 129,
132
+ "ct": 130,
133
+ "\u0120e": 131,
134
+ "\u0120th": 132,
135
+ "\u0120l": 133,
136
+ "il": 134,
137
+ "\u0120C": 135,
138
+ "\u0120n": 136,
139
+ "\u0120S": 137,
140
+ "\u0120B": 138,
141
+ "ol": 139,
142
+ "ve": 140,
143
+ "us": 141,
144
+ "\u0120be": 142,
145
+ "am": 143,
146
+ "ation": 144,
147
+ "ur": 145,
148
+ "\u0120The": 146,
149
+ "ce": 147,
150
+ "id": 148,
151
+ "\u0120is": 149,
152
+ "ch": 150,
153
+ "ad": 151,
154
+ "im": 152,
155
+ "\u0120I": 153,
156
+ "\u0120as": 154,
157
+ "un": 155,
158
+ "\u01201": 156,
159
+ "\u0120for": 157,
160
+ "ot": 158,
161
+ "ut": 159,
162
+ "ver": 160,
163
+ "ig": 161,
164
+ "et": 162,
165
+ "\u0120g": 163,
166
+ "ir": 164,
167
+ "el": 165,
168
+ "\u0120on": 166,
169
+ "\u0120M": 167,
170
+ "ul": 168,
171
+ "ter": 169,
172
+ "ith": 170,
173
+ "\u0120was": 171,
174
+ "ow": 172,
175
+ "ra": 173,
176
+ "\u0120(": 174,
177
+ "\u0120st": 175,
178
+ "\u0120con": 176,
179
+ "her": 177,
180
+ "\u0120with": 178,
181
+ "\u0120P": 179,
182
+ "ist": 180,
183
+ "\u0120al": 181,
184
+ "\u0120by": 182,
185
+ "\u0120that": 183,
186
+ "ay": 184,
187
+ "\u0120H": 185,
188
+ "em": 186,
189
+ "os": 187,
190
+ "\u0120wh": 188,
191
+ "ri": 189,
192
+ "th": 190,
193
+ "um": 191,
194
+ "and": 192,
195
+ "\u0120pro": 193,
196
+ "od": 194,
197
+ "ers": 195,
198
+ "'s": 196,
199
+ "\u0120or": 197,
200
+ "op": 198,
201
+ "\u0120R": 199,
202
+ "\u0120E": 200,
203
+ "\u0120com": 201,
204
+ "\u0120D": 202,
205
+ "\u0120are": 203,
206
+ "\u0120de": 204,
207
+ "\u0120In": 205,
208
+ "ag": 206,
209
+ "\u0120F": 207,
210
+ "est": 208,
211
+ "if": 209,
212
+ "\u0120it": 210,
213
+ "\u0120G": 211,
214
+ "rom": 212,
215
+ "ac": 213,
216
+ "\u0120W": 214,
217
+ "\u0120L": 215,
218
+ "\u0120su": 216,
219
+ "ate": 217,
220
+ "\u0120\"": 218,
221
+ "ain": 219,
222
+ "ies": 220,
223
+ "\u012019": 221,
224
+ "res": 222,
225
+ "ab": 223,
226
+ "\u0120at": 224,
227
+ "\u01202": 225,
228
+ "ity": 226,
229
+ "pe": 227,
230
+ "\u0120ex": 228,
231
+ "\u0120N": 229,
232
+ "\u0120from": 230,
233
+ "oun": 231,
234
+ "\u0120v": 232,
235
+ "\u0120he": 233,
236
+ "se": 234,
237
+ "ud": 235,
238
+ "ive": 236,
239
+ "oc": 237,
240
+ "qu": 238,
241
+ "ment": 239,
242
+ "ant": 240,
243
+ "iv": 241,
244
+ "\u0120se": 242,
245
+ "ich": 243,
246
+ "ort": 244,
247
+ "ld": 245,
248
+ "ess": 246,
249
+ "ical": 247,
250
+ "ian": 248,
251
+ "ial": 249,
252
+ "ore": 250,
253
+ "\u0120O": 251,
254
+ "art": 252,
255
+ "\u0120J": 253,
256
+ "\u0120us": 254,
257
+ "\u0120r": 255,
258
+ "\u0120un": 256,
259
+ "pp": 257,
260
+ "\u0120his": 258,
261
+ "all": 259,
262
+ "pt": 260,
263
+ "ia": 261,
264
+ "igh": 262,
265
+ "ated": 263,
266
+ "ill": 264,
267
+ "00": 265,
268
+ "ard": 266,
269
+ "\u0120which": 267,
270
+ "ew": 268,
271
+ "ere": 269,
272
+ "gh": 270,
273
+ "ber": 271,
274
+ "ction": 272,
275
+ "\u0120ch": 273,
276
+ "ak": 274,
277
+ "ther": 275,
278
+ "\u0120U": 276,
279
+ "ge": 277,
280
+ "ave": 278,
281
+ "ost": 279,
282
+ "end": 280,
283
+ "og": 281,
284
+ "\u0120were": 282,
285
+ "ure": 283,
286
+ "\u0120not": 284,
287
+ "ary": 285,
288
+ "cl": 286,
289
+ "ip": 287,
290
+ "ine": 288,
291
+ "\u0120pl": 289,
292
+ "ast": 290,
293
+ "ish": 291,
294
+ "\u0120le": 292,
295
+ "\u0120comp": 293,
296
+ "ous": 294,
297
+ "per": 295,
298
+ "so": 296,
299
+ "ong": 297,
300
+ "\u0120ar": 298,
301
+ "du": 299,
302
+ "our": 300,
303
+ "ell": 301,
304
+ "ame": 302,
305
+ "\u0120k": 303,
306
+ "rit": 304,
307
+ "\u0120have": 305,
308
+ "ome": 306,
309
+ "av": 307,
310
+ "\u0120K": 308,
311
+ "cc": 309,
312
+ "ans": 310,
313
+ "ect": 311,
314
+ "ition": 312,
315
+ "ight": 313,
316
+ "ord": 314,
317
+ "\u0120sh": 315,
318
+ "\u0120Ch": 316,
319
+ "ire": 317,
320
+ "iz": 318,
321
+ "\u0120cont": 319,
322
+ "ult": 320,
323
+ "up": 321,
324
+ "\u0120int": 322,
325
+ "ide": 323,
326
+ "ub": 324,
327
+ "ear": 325,
328
+ "ff": 326,
329
+ "ry": 327,
330
+ "ap": 328,
331
+ "\u0120has": 329,
332
+ "\u0120also": 330,
333
+ "mer": 331,
334
+ "\u0120ad": 332,
335
+ "ence": 333,
336
+ "ations": 334,
337
+ "\u0120St": 335,
338
+ "der": 336,
339
+ "ib": 337,
340
+ "ust": 338,
341
+ "ally": 339,
342
+ "\u0120had": 340,
343
+ "\u0120whe": 341,
344
+ "\u0120their": 342,
345
+ "ack": 343,
346
+ "ign": 344,
347
+ "\u0120its": 345,
348
+ "ass": 346,
349
+ "pl": 347,
350
+ "),": 348,
351
+ "\u0120ab": 349,
352
+ "\u012020": 350,
353
+ "\u0120part": 351,
354
+ "ue": 352,
355
+ "age": 353,
356
+ "\u0120other": 354,
357
+ "\u0120but": 355,
358
+ "\u0120res": 356,
359
+ "ang": 357,
360
+ "\u0120An": 358,
361
+ "ould": 359,
362
+ "\u0120tr": 360,
363
+ "low": 361,
364
+ "\u0120Th": 362,
365
+ "act": 363,
366
+ "\u0120cl": 364,
367
+ "ater": 365,
368
+ "ok": 366,
369
+ "clud": 367,
370
+ "\u0120Al": 368,
371
+ "\u0120can": 369,
372
+ "fer": 370,
373
+ "are": 371,
374
+ "\u0120y": 372,
375
+ "\u0120wor": 373,
376
+ "\u0120im": 374,
377
+ "ount": 375,
378
+ "ph": 376,
379
+ "vel": 377,
380
+ "\u0120dis": 378,
381
+ "\u0120en": 379,
382
+ "ie": 380,
383
+ "ru": 381,
384
+ "ory": 382,
385
+ "irst": 383,
386
+ "ree": 384,
387
+ "ates": 385,
388
+ "any": 386,
389
+ "\u0120been": 387,
390
+ "erm": 388,
391
+ "\u0120Un": 389,
392
+ "tern": 390,
393
+ "\u0120this": 391,
394
+ "over": 392,
395
+ "ions": 393,
396
+ "ance": 394,
397
+ "\u0120includ": 395,
398
+ "orm": 396,
399
+ "\u0120one": 397,
400
+ "ric": 398,
401
+ "\u0120bec": 399,
402
+ "\u0120comm": 400,
403
+ "out": 401,
404
+ "ace": 402,
405
+ "con": 403,
406
+ "\u0120first": 404,
407
+ "\u0120V": 405,
408
+ "own": 406,
409
+ "ime": 407,
410
+ "able": 408,
411
+ "ach": 409,
412
+ "wo": 410,
413
+ "hed": 411,
414
+ "ress": 412,
415
+ "vers": 413,
416
+ "ile": 414,
417
+ "fter": 415,
418
+ "\u0120such": 416,
419
+ "ugh": 417,
420
+ "ke": 418,
421
+ "\u0120used": 419,
422
+ ").": 420,
423
+ "te": 421,
424
+ "port": 422,
425
+ "\u0120It": 423,
426
+ "\u0120te": 424,
427
+ "ound": 425,
428
+ "\u0120ro": 426,
429
+ "land": 427,
430
+ "und": 428,
431
+ "ced": 429,
432
+ "ces": 430,
433
+ "ens": 431,
434
+ "\u0120pre": 432,
435
+ "\u0120Ar": 433,
436
+ "\u0120who": 434,
437
+ "ep": 435,
438
+ "\u0120sy": 436,
439
+ "\u0120per": 437,
440
+ "ice": 438,
441
+ "\u0120spe": 439,
442
+ "ough": 440,
443
+ "\u0120cons": 441,
444
+ "erv": 442,
445
+ "lic": 443,
446
+ "\u0120all": 444,
447
+ "ren": 445,
448
+ "uring": 446,
449
+ "\u0120ne": 447,
450
+ "ose": 448,
451
+ "\u0120des": 449,
452
+ "\u0120form": 450,
453
+ "\u0120ag": 451,
454
+ "ors": 452,
455
+ "ational": 453,
456
+ "ari": 454,
457
+ "\u0120200": 455,
458
+ "\u0120cent": 456,
459
+ "\u0120sp": 457,
460
+ "ron": 458,
461
+ "les": 459,
462
+ "\u0120they": 460,
463
+ "ov": 461,
464
+ "\u0120more": 462,
465
+ "ase": 463,
466
+ "we": 464,
467
+ "\u0120ev": 465,
468
+ "\u0120He": 466,
469
+ "\u0120sc": 467,
470
+ "ism": 468,
471
+ "ents": 469,
472
+ "\u0120into": 470,
473
+ "ite": 471,
474
+ "\u0120two": 472,
475
+ "\u0120most": 473,
476
+ "ks": 474,
477
+ "ade": 475,
478
+ "ited": 476,
479
+ "ts": 477,
480
+ "\u0120j": 478,
481
+ "ied": 479,
482
+ "\u0120rec": 480,
483
+ "ular": 481,
484
+ "\u0120201": 482,
485
+ "olog": 483,
486
+ "aw": 484,
487
+ "ative": 485,
488
+ "\u0120ele": 486,
489
+ "\u0120rel": 487,
490
+ "amp": 488,
491
+ "aus": 489,
492
+ "\u0120over": 490,
493
+ "reat": 491,
494
+ "\u0120me": 492,
495
+ "\u0120year": 493,
496
+ "\u012018": 494,
497
+ "\u0120reg": 495,
498
+ "rib": 496,
499
+ "\u0120col": 497,
500
+ "\u0120att": 498,
501
+ "\u0120As": 499,
502
+ "meric": 500,
503
+ "one": 501,
504
+ "ual": 502,
505
+ "\u0120some": 503,
506
+ "oin": 504,
507
+ "ft": 505,
508
+ "ath": 506,
509
+ "rough": 507,
510
+ "ood": 508,
511
+ "\u0120bet": 509,
512
+ "ne": 510,
513
+ "\u0120out": 511,
514
+ "\u0120gro": 512,
515
+ "wn": 513,
516
+ "\u0120num": 514,
517
+ "\u0120than": 515,
518
+ "ased": 516,
519
+ "ted": 517,
520
+ "ics": 518,
521
+ "\u0120after": 519,
522
+ "oth": 520,
523
+ "\u0120app": 521,
524
+ "\u0120am": 522,
525
+ "\u0120produ": 523,
526
+ "ings": 524,
527
+ "\u0120gen": 525,
528
+ "\u0120This": 526,
529
+ "form": 527,
530
+ "\u0120Americ": 528,
531
+ "\u0120sub": 529,
532
+ "ten": 530,
533
+ "ook": 531,
534
+ "red": 532,
535
+ "\u0120kn": 533,
536
+ "iss": 534,
537
+ "ific": 535,
538
+ "ob": 536,
539
+ "ons": 537,
540
+ "\u0120would": 538,
541
+ "\u0120lar": 539,
542
+ "ury": 540,
543
+ "old": 541,
544
+ "ark": 542,
545
+ "\u0120may": 543,
546
+ "\u0120time": 544,
547
+ "stem": 545,
548
+ "\u0120under": 546,
549
+ "ween": 547,
550
+ "\u0120En": 548,
551
+ "\u0120only": 549,
552
+ "\u0120play": 550,
553
+ "gan": 551,
554
+ "ys": 552,
555
+ "\u0120ac": 553,
556
+ "\u0120when": 554,
557
+ "\u0120inter": 555,
558
+ "\u0120so": 556,
559
+ "\u0120ear": 557,
560
+ "\u0120acc": 558,
561
+ "ind": 559,
562
+ "als": 560,
563
+ "ail": 561,
564
+ "\u0120mod": 562,
565
+ "the": 563,
566
+ "ove": 564,
567
+ "\u0120between": 565,
568
+ "\u0120ass": 566,
569
+ "ever": 567,
570
+ "\u0120new": 568,
571
+ "\u0120ph": 569,
572
+ "\u0120ra": 570,
573
+ "ran": 571,
574
+ "\u0120up": 572,
575
+ "\u0120there": 573,
576
+ "li": 574,
577
+ "\u0120spec": 575,
578
+ "\u0120many": 576,
579
+ "\u0120ind": 577,
580
+ "ral": 578,
581
+ "\u0120off": 579,
582
+ "ild": 580,
583
+ "\".": 581,
584
+ "overn": 582,
585
+ "\u0120them": 583,
586
+ "\u0120use": 584,
587
+ "\u0120him": 585,
588
+ "cre": 586,
589
+ "ities": 587,
590
+ "\u0120dif": 588,
591
+ "urn": 589,
592
+ "\u0120pe": 590,
593
+ "\u0120199": 591,
594
+ "\u01203": 592,
595
+ "velop": 593,
596
+ "ram": 594,
597
+ "ince": 595,
598
+ "\u0120Y": 596,
599
+ "\u0120no": 597,
600
+ "\u0120system": 598,
601
+ "\u0120count": 599,
602
+ "\u0120def": 600,
603
+ "\u0120Com": 601,
604
+ "ars": 602,
605
+ "\u0120pr": 603,
606
+ "\u0120trans": 604,
607
+ "\u0120qu": 605,
608
+ "\u0120through": 606,
609
+ "\u0120act": 607,
610
+ "ject": 608,
611
+ "\u0120her": 609,
612
+ "\u0120bl": 610,
613
+ "ution": 611,
614
+ "ines": 612,
615
+ "\u0120Se": 613,
616
+ "ern": 614,
617
+ "\u0120call": 615,
618
+ "ollow": 616,
619
+ "\u0120pos": 617,
620
+ "ward": 618,
621
+ "uc": 619,
622
+ "\u0120Brit": 620,
623
+ "\u0120pres": 621,
624
+ "\u0120where": 622,
625
+ "\u0120we": 623,
626
+ "ond": 624,
627
+ "orn": 625,
628
+ "ating": 626,
629
+ "\u0120pub": 627,
630
+ "\u0120rem": 628,
631
+ "ished": 629,
632
+ "\u0120during": 630,
633
+ "ake": 631,
634
+ "ists": 632,
635
+ "ict": 633,
636
+ "oy": 634,
637
+ "hes": 635,
638
+ "\u0120work": 636,
639
+ "arch": 637,
640
+ "\u0120em": 638,
641
+ "\u0120ob": 639,
642
+ "\u0120supp": 640,
643
+ "\u0120prov": 641,
644
+ "ident": 642,
645
+ "\u0120develop": 643,
646
+ "\u0120Re": 644,
647
+ "\u0120do": 645,
648
+ "\u0120inst": 646,
649
+ "tle": 647,
650
+ "\u0120fil": 648,
651
+ "cess": 649,
652
+ "ug": 650,
653
+ "\u0120known": 651,
654
+ "ivers": 652,
655
+ "\u0120set": 653,
656
+ "ious": 654,
657
+ "\u0120pol": 655,
658
+ "uro": 656,
659
+ "\u0120inv": 657,
660
+ "\u0120met": 658,
661
+ "ays": 659,
662
+ "ause": 660,
663
+ "\",": 661,
664
+ "ures": 662,
665
+ "\u0120man": 663,
666
+ "ower": 664,
667
+ "\u0120li": 665,
668
+ "very": 666,
669
+ "led": 667,
670
+ "\u0120pop": 668,
671
+ "ically": 669,
672
+ "\u0120inf": 670,
673
+ "ments": 671,
674
+ "\u0120mem": 672,
675
+ "hip": 673,
676
+ "\u0120gener": 674,
677
+ "ient": 675,
678
+ "owever": 676,
679
+ "oh": 677,
680
+ "\u0120vari": 678,
681
+ "\u0120found": 679,
682
+ "cept": 680,
683
+ "fore": 681,
684
+ "\u0120these": 682,
685
+ "\u0120tra": 683,
686
+ "\u0120ap": 684,
687
+ "ix": 685,
688
+ "ually": 686,
689
+ "\u0120Con": 687,
690
+ "ton": 688,
691
+ "\u0120about": 689,
692
+ "outh": 690,
693
+ "\u0120govern": 691,
694
+ "\u0120De": 692,
695
+ "\u0120Eng": 693,
696
+ "aj": 694,
697
+ "ized": 695,
698
+ "cted": 696,
699
+ "ah": 697,
700
+ "tain": 698,
701
+ "\u0120years": 699,
702
+ "\u0120char": 700,
703
+ "ible": 701,
704
+ "\u0120New": 702,
705
+ "\u0120follow": 703,
706
+ "\u0120fl": 704,
707
+ "ins": 705,
708
+ "\u0120dist": 706,
709
+ "\u0120co": 707,
710
+ "ium": 708,
711
+ "\u0120exp": 709,
712
+ "\u0120loc": 710,
713
+ "hen": 711,
714
+ "ople": 712,
715
+ "\u0120century": 713,
716
+ "ved": 714,
717
+ "\u0120number": 715,
718
+ "angu": 716,
719
+ "\u0120add": 717,
720
+ "oci": 718,
721
+ "ise": 719,
722
+ "\u0120eff": 720,
723
+ "ered": 721,
724
+ "\u0120American": 722,
725
+ "\u0120On": 723,
726
+ "yp": 724,
727
+ "\u0120including": 725,
728
+ "\u0120made": 726,
729
+ "\u0120rep": 727,
730
+ "\u0120differ": 728,
731
+ "\u0120fe": 729,
732
+ "ctions": 730,
733
+ "inal": 731,
734
+ "lish": 732,
735
+ "\u0120film": 733,
736
+ "\u0120sur": 734,
737
+ "19": 735,
738
+ "ural": 736,
739
+ "\u0120well": 737,
740
+ "orth": 738,
741
+ "\u0120being": 739,
742
+ "\u0120At": 740,
743
+ "\u0120imp": 741,
744
+ "\u0120both": 742,
745
+ "\u0120cre": 743,
746
+ "\u0120For": 744,
747
+ "io": 745,
748
+ "lex": 746,
749
+ "\u0120United": 747,
750
+ "\u0120serv": 748,
751
+ "\u0120called": 749,
752
+ "\u0120writ": 750,
753
+ "\u0120main": 751,
754
+ "\u0120car": 752,
755
+ "\u0120bu": 753,
756
+ "\u0120ext": 754,
757
+ "man": 755,
758
+ "ife": 756,
759
+ "\u0120again": 757,
760
+ "cy": 758,
761
+ "\u0120sign": 759,
762
+ "ature": 760,
763
+ "round": 761,
764
+ "ants": 762,
765
+ "iel": 763,
766
+ "\u0120any": 764,
767
+ "\u0120then": 765,
768
+ "uch": 766,
769
+ "its": 767,
770
+ ".\"": 768,
771
+ "\u0120ent": 769,
772
+ "\u0120Ap": 770,
773
+ "\u0120while": 771,
774
+ "bers": 772,
775
+ "cial": 773,
776
+ "\u0120stud": 774,
777
+ "\u0120hist": 775,
778
+ "imes": 776,
779
+ "\u0120end": 777,
780
+ "\u0120Am": 778,
781
+ "\u0120later": 779,
782
+ "ility": 780,
783
+ "\u0120198": 781,
784
+ "cond": 782,
785
+ "\u0120sim": 783,
786
+ "\u0120Is": 784,
787
+ "\u0120became": 785,
788
+ "ters": 786,
789
+ "\u0120dec": 787,
790
+ "\u01204": 788,
791
+ "\u0120art": 789,
792
+ "\u0120three": 790,
793
+ "\u0120sm": 791,
794
+ "ather": 792,
795
+ "ock": 793,
796
+ "\u0120British": 794,
797
+ "hem": 795,
798
+ "\u0120197": 796,
799
+ "\u0120often": 797,
800
+ "\u0120langu": 798,
801
+ "\u0120Euro": 799,
802
+ "ll": 800,
803
+ "ick": 801,
804
+ "\u0120term": 802,
805
+ "\u0120dep": 803,
806
+ "\u0120early": 804,
807
+ "ians": 805,
808
+ "ros": 806,
809
+ "ract": 807,
810
+ "other": 808,
811
+ "elf": 809,
812
+ "\u0120group": 810,
813
+ "\u0120const": 811,
814
+ "\u0120mon": 812,
815
+ "\u0120ser": 813,
816
+ "ames": 814,
817
+ "\u0120large": 815,
818
+ "\u0120aut": 816,
819
+ "\u0120high": 817,
820
+ "\u0120Le": 818,
821
+ "\u0120air": 819,
822
+ "ork": 820,
823
+ "ily": 821,
824
+ "gest": 822,
825
+ "rist": 823,
826
+ "ision": 824,
827
+ "ck": 825,
828
+ "\u0120Bl": 826,
829
+ "\u0120people": 827,
830
+ "\u0120sever": 828,
831
+ "ages": 829,
832
+ "\u0120.": 830,
833
+ "\u0120desc": 831,
834
+ "ology": 832,
835
+ "\u0120contin": 833,
836
+ "\u0120partic": 834,
837
+ "ting": 835,
838
+ "rop": 836,
839
+ "\u0120Af": 837,
840
+ "ful": 838,
841
+ "\u0120Europe": 839,
842
+ "\u0120Be": 840,
843
+ "\u0120est": 841,
844
+ "\u0120state": 842,
845
+ "\u0120common": 843,
846
+ "lished": 844,
847
+ "ajor": 845,
848
+ "\u0120mus": 846,
849
+ "\u0120dire": 847,
850
+ "\u0120orig": 848,
851
+ "ues": 849,
852
+ "ty": 850,
853
+ "\u0120city": 851,
854
+ "ann": 852,
855
+ "\u0120long": 853,
856
+ "\u0120Col": 854,
857
+ "\u0120appe": 855,
858
+ "\u0120196": 856,
859
+ "ool": 857,
860
+ "apt": 858,
861
+ "\u01205": 859,
862
+ "\u0120world": 860,
863
+ "vent": 861,
864
+ "\u0120Ind": 862,
865
+ "\u0120consid": 863,
866
+ "iversity": 864,
867
+ "\u0120result": 865,
868
+ "\u0120ret": 866,
869
+ "\u0120oper": 867,
870
+ "air": 868,
871
+ "ex": 869,
872
+ "\u0120pers": 870,
873
+ "ield": 871,
874
+ "iew": 872,
875
+ "ax": 873,
876
+ "\u0120Sh": 874,
877
+ "\u0120ant": 875,
878
+ "\u0120could": 876,
879
+ "ger": 877,
880
+ "\u0120lead": 878,
881
+ "\u0120Ad": 879,
882
+ "ined": 880,
883
+ "til": 881,
884
+ "az": 882,
885
+ "uss": 883,
886
+ "\u0120same": 884,
887
+ "ample": 885,
888
+ "\u0120own": 886,
889
+ "osed": 887,
890
+ "ange": 888,
891
+ "\u0120Aust": 889,
892
+ "though": 890,
893
+ "\u0120like": 891,
894
+ "ital": 892,
895
+ "\u0120War": 893,
896
+ "\u0120government": 894,
897
+ "ale": 895,
898
+ "ble": 896,
899
+ "\u0120occ": 897,
900
+ "\u0120min": 898,
901
+ "\u0120Sp": 899,
902
+ "\u0120Christ": 900,
903
+ "lo": 901,
904
+ "\u0120Sc": 902,
905
+ "\u0120States": 903,
906
+ "oman": 904,
907
+ "raph": 905,
908
+ "\u0120war": 906,
909
+ "\u0120small": 907,
910
+ "000": 908,
911
+ "omet": 909,
912
+ "acter": 910,
913
+ "ived": 911,
914
+ "ae": 912,
915
+ "rol": 913,
916
+ "\u0120ed": 914,
917
+ "\u0120before": 915,
918
+ "\u0120will": 916,
919
+ "\u0120she": 917,
920
+ "\u0120second": 918,
921
+ "ired": 919,
922
+ "\u0120different": 920,
923
+ "ank": 921,
924
+ "\u0120,": 922,
925
+ "ared": 923,
926
+ "\u0120fam": 924,
927
+ "\u0120med": 925,
928
+ "\u0120Joh": 926,
929
+ "urch": 927,
930
+ "\u012017": 928,
931
+ "ute": 929,
932
+ "gy": 930,
933
+ "ional": 931,
934
+ "\u0120each": 932,
935
+ "\u0120disc": 933,
936
+ "\u0120design": 934,
937
+ "\u0120several": 935,
938
+ "ives": 936,
939
+ "\u0120major": 937,
940
+ "ene": 938,
941
+ "\u0120bro": 939,
942
+ "\u0120support": 940,
943
+ "chn": 941,
944
+ "els": 942,
945
+ "\u0120bel": 943,
946
+ "\u0120because": 944,
947
+ "\u0120prog": 945,
948
+ "\u0120Ab": 946,
949
+ "\u0120include": 947,
950
+ "ised": 948,
951
+ "\u0120Germ": 949,
952
+ "\u0120example": 950,
953
+ "\u0120arm": 951,
954
+ "ety": 952,
955
+ "\u0120remain": 953,
956
+ "\u0120if": 954,
957
+ "\u0120law": 955,
958
+ "aking": 956,
959
+ "ccess": 957,
960
+ "iod": 958,
961
+ "\u0120hum": 959,
962
+ "\u0120effect": 960,
963
+ "\u0120Pro": 961,
964
+ "\u0120Wor": 962,
965
+ "ording": 963,
966
+ "min": 964,
967
+ "\u0120against": 965,
968
+ "\u0120land": 966,
969
+ "riv": 967,
970
+ "\u0120sch": 968,
971
+ "\u0120However": 969,
972
+ "\u0120There": 970,
973
+ "\u0120since": 971,
974
+ "iver": 972,
975
+ "aid": 973,
976
+ "\u0120John": 974,
977
+ "oint": 975,
978
+ "\u0120commun": 976,
979
+ "\u0120Z": 977,
980
+ "ash": 978,
981
+ "\u0120name": 979,
982
+ "atic": 980,
983
+ "ides": 981,
984
+ "ries": 982,
985
+ "ided": 983,
986
+ "\u0120po": 984,
987
+ "ilit": 985,
988
+ "ices": 986,
989
+ "\u0120str": 987,
990
+ "ning": 988,
991
+ "ird": 989,
992
+ "\u0120trad": 990,
993
+ "\u0120Comm": 991,
994
+ "\u0120incre": 992,
995
+ "\u0120fun": 993,
996
+ "\u0120four": 994,
997
+ "\u0120public": 995,
998
+ "ina": 996,
999
+ "\u0120until": 997,
1000
+ "\u0120allow": 998,
1001
+ "cent": 999,
1002
+ "\u0120requ": 1000,
1003
+ "\u0120cult": 1001,
1004
+ "ox": 1002,
1005
+ "\u0120leg": 1003,
1006
+ "\u0120book": 1004,
1007
+ "ulation": 1005,
1008
+ "\u0120Wh": 1006,
1009
+ "\u01206": 1007,
1010
+ "\u0120ins": 1008,
1011
+ "\u0120det": 1009,
1012
+ "\u0120World": 1010,
1013
+ "20": 1011,
1014
+ "ued": 1012,
1015
+ "\u0120English": 1013,
1016
+ "ley": 1014,
1017
+ "\u0120Cl": 1015,
1018
+ "get": 1016,
1019
+ "\u0120back": 1017,
1020
+ "way": 1018,
1021
+ "\u0120ref": 1019,
1022
+ "lu": 1020,
1023
+ "eral": 1021,
1024
+ "\u0120power": 1022,
1025
+ "ified": 1023,
1026
+ "itions": 1024,
1027
+ "ged": 1025,
1028
+ "\u0120country": 1026,
1029
+ "\u0120organ": 1027,
1030
+ "conom": 1028,
1031
+ "ven": 1029,
1032
+ "yn": 1030,
1033
+ "\u0120Ph": 1031,
1034
+ "ruct": 1032,
1035
+ "\u0120mat": 1033,
1036
+ "\u01207": 1034,
1037
+ "yl": 1035,
1038
+ "\u0120equ": 1036,
1039
+ "\u0120import": 1037,
1040
+ "mber": 1038,
1041
+ "raft": 1039,
1042
+ "\u0120though": 1040,
1043
+ "arm": 1041,
1044
+ "ius": 1042,
1045
+ "\u0120cap": 1043,
1046
+ "ission": 1044,
1047
+ "\u0120success": 1045,
1048
+ "\u0120val": 1046,
1049
+ "ases": 1047,
1050
+ "ience": 1048,
1051
+ "\u0120They": 1049,
1052
+ "\u0120sing": 1050,
1053
+ "\u0120elect": 1051,
1054
+ "to": 1052,
1055
+ "\u0120process": 1053,
1056
+ "son": 1054,
1057
+ "\u0120cor": 1055,
1058
+ "lect": 1056,
1059
+ "\u0120bat": 1057,
1060
+ "\u0120University": 1058,
1061
+ "\u0120op": 1059,
1062
+ "xt": 1060,
1063
+ "por": 1061,
1064
+ "enn": 1062,
1065
+ "ris": 1063,
1066
+ "ier": 1064,
1067
+ "\u0120poss": 1065,
1068
+ "\u0120show": 1066,
1069
+ "ternational": 1067,
1070
+ "\u0120rele": 1068,
1071
+ "\u0120Can": 1069,
1072
+ "ner": 1070,
1073
+ "inc": 1071,
1074
+ "oph": 1072,
1075
+ "ason": 1073,
1076
+ "ended": 1074,
1077
+ "eng": 1075,
1078
+ "rad": 1076,
1079
+ "\u0120refer": 1077,
1080
+ "ring": 1078,
1081
+ "\u0120those": 1079,
1082
+ "ross": 1080,
1083
+ "\u0120These": 1081,
1084
+ "ement": 1082,
1085
+ "\u0120period": 1083,
1086
+ "\u0120based": 1084,
1087
+ "\u0120did": 1085,
1088
+ "ouse": 1086,
1089
+ "\u0120describ": 1087,
1090
+ "com": 1088,
1091
+ "\u0120conf": 1089,
1092
+ "\u0120cur": 1090,
1093
+ "\u0120par": 1091,
1094
+ "\u0120right": 1092,
1095
+ "aim": 1093,
1096
+ "oot": 1094,
1097
+ "ize": 1095,
1098
+ "\u0120program": 1096,
1099
+ "\u0120II": 1097,
1100
+ "ness": 1098,
1101
+ "rodu": 1099,
1102
+ "\u0120offic": 1100,
1103
+ "pect": 1101,
1104
+ "ilar": 1102,
1105
+ "\u0120polit": 1103,
1106
+ "\u0120person": 1104,
1107
+ "ull": 1105,
1108
+ "\u0120life": 1106,
1109
+ "ript": 1107,
1110
+ "\u0120order": 1108,
1111
+ "ets": 1109,
1112
+ "ently": 1110,
1113
+ "ization": 1111,
1114
+ "ences": 1112,
1115
+ "ration": 1113,
1116
+ "resent": 1114,
1117
+ "\u0120around": 1115,
1118
+ "\u0120div": 1116,
1119
+ "\u0120area": 1117,
1120
+ "\u0120near": 1118,
1121
+ "\u0120Or": 1119,
1122
+ "\u0120class": 1120,
1123
+ "\u0120Arm": 1121,
1124
+ "ane": 1122,
1125
+ "\u0120non": 1123,
1126
+ "\u0120sl": 1124,
1127
+ "\u0120mar": 1125,
1128
+ "\u0120language": 1126,
1129
+ "\u0120even": 1127,
1130
+ "\u0120Car": 1128,
1131
+ "\u0120began": 1129,
1132
+ "\u0120194": 1130,
1133
+ "\u0120direct": 1131,
1134
+ "\u0120After": 1132,
1135
+ "ody": 1133,
1136
+ "dom": 1134,
1137
+ "\u0120within": 1135,
1138
+ "\u0120estab": 1136,
1139
+ "\u0120ke": 1137,
1140
+ "aced": 1138,
1141
+ "par": 1139,
1142
+ "aving": 1140,
1143
+ "\u0120using": 1141,
1144
+ "\u0120record": 1142,
1145
+ "mp": 1143,
1146
+ "\u0120techn": 1144,
1147
+ "\u0120modern": 1145,
1148
+ "\u0120expl": 1146,
1149
+ "\u0120death": 1147,
1150
+ "\u0120Afric": 1148,
1151
+ "\u0120character": 1149,
1152
+ "rid": 1150,
1153
+ "\u0120belie": 1151,
1154
+ "ode": 1152,
1155
+ "\u0120lit": 1153,
1156
+ "\u0120among": 1154,
1157
+ "\u0120typ": 1155,
1158
+ "\u0120Alex": 1156,
1159
+ "amed": 1157,
1160
+ "eg": 1158,
1161
+ "\u0120following": 1159,
1162
+ "\u0120perform": 1160,
1163
+ "\u0120av": 1161,
1164
+ "que": 1162,
1165
+ "\u0120pass": 1163,
1166
+ "\u0120much": 1164,
1167
+ "\u0120195": 1165,
1168
+ "lands": 1166,
1169
+ "\u0120influ": 1167,
1170
+ "\u0120Gree": 1168,
1171
+ "ories": 1169,
1172
+ "\u0120giv": 1170,
1173
+ "\u0120exper": 1171,
1174
+ "ograph": 1172,
1175
+ "\u0120prop": 1173,
1176
+ "ec": 1174,
1177
+ "\u0120opp": 1175,
1178
+ "ober": 1176,
1179
+ "gen": 1177,
1180
+ "\u012016": 1178,
1181
+ "\u0120comput": 1179,
1182
+ "\u0120water": 1180,
1183
+ "\u0120econom": 1181,
1184
+ "\u0120series": 1182,
1185
+ "hor": 1183,
1186
+ "ially": 1184,
1187
+ "\u0120Austral": 1185,
1188
+ "\u0120Mar": 1186,
1189
+ "tic": 1187,
1190
+ "\u0120'": 1188,
1191
+ "\u0120appro": 1189,
1192
+ "\u0120stand": 1190,
1193
+ "\u0120Sch": 1191,
1194
+ "\u0120adv": 1192,
1195
+ "ony": 1193,
1196
+ "epend": 1194,
1197
+ "estern": 1195,
1198
+ "\u0120team": 1196,
1199
+ "\u0120Some": 1197,
1200
+ "\u0120South": 1198,
1201
+ "ik": 1199,
1202
+ "\u0120word": 1200,
1203
+ "ological": 1201,
1204
+ "\u0120due": 1202,
1205
+ "\u0120control": 1203,
1206
+ "\u0120mark": 1204,
1207
+ "\u0120stat": 1205,
1208
+ "ball": 1206,
1209
+ "ves": 1207,
1210
+ "read": 1208,
1211
+ "\u0120resp": 1209,
1212
+ "\u0120sol": 1210,
1213
+ "\u0120music": 1211,
1214
+ "\u0120cour": 1212,
1215
+ "ately": 1213,
1216
+ "ister": 1214,
1217
+ "\u0120considered": 1215,
1218
+ "\u012015": 1216,
1219
+ "ilitary": 1217,
1220
+ "\u0120rest": 1218,
1221
+ "\u0120bas": 1219,
1222
+ "eb": 1220,
1223
+ "\u0120Cent": 1221,
1224
+ "ense": 1222,
1225
+ "ament": 1223,
1226
+ "ids": 1224,
1227
+ "ander": 1225,
1228
+ "row": 1226,
1229
+ "\u0120King": 1227,
1230
+ "\u0120North": 1228,
1231
+ "\u0120ann": 1229,
1232
+ "\u0120still": 1230,
1233
+ "\u0120popular": 1231,
1234
+ "ior": 1232,
1235
+ "itional": 1233,
1236
+ "\u0120eng": 1234,
1237
+ "\u0120By": 1235,
1238
+ "\u0120National": 1236,
1239
+ "\u0120along": 1237,
1240
+ "\u0120now": 1238,
1241
+ "\u0120very": 1239,
1242
+ "\u0120es": 1240,
1243
+ "ink": 1241,
1244
+ "\u0120hand": 1242,
1245
+ "ple": 1243,
1246
+ "ards": 1244,
1247
+ "\u0120fin": 1245,
1248
+ "\u0120comple": 1246,
1249
+ "\u0120gu": 1247,
1250
+ "rench": 1248,
1251
+ "craft": 1249,
1252
+ "\u0120comb": 1250,
1253
+ "de": 1251,
1254
+ "\u0120Q": 1252,
1255
+ "\u0120dem": 1253,
1256
+ "ublic": 1254,
1257
+ "\u0120members": 1255,
1258
+ "\u0120particular": 1256,
1259
+ "iet": 1257,
1260
+ "\u0120ter": 1258,
1261
+ "\u0120attack": 1259,
1262
+ "\u0120similar": 1260,
1263
+ "ried": 1261,
1264
+ "\u0120histor": 1262,
1265
+ "\u0120indu": 1263,
1266
+ "\u0120rece": 1264,
1267
+ "\u0120tem": 1265,
1268
+ "\u0120Ang": 1266,
1269
+ "\u0120open": 1267,
1270
+ "\u0120species": 1268,
1271
+ "\u01208": 1269,
1272
+ "\u0120Ag": 1270,
1273
+ "\u0120rad": 1271,
1274
+ "\u0120represent": 1272,
1275
+ "\u0120return": 1273,
1276
+ "oss": 1274,
1277
+ "ole": 1275,
1278
+ "iam": 1276,
1279
+ "\u0120rev": 1277,
1280
+ "\u0120run": 1278,
1281
+ "\u0120fact": 1279,
1282
+ "\u0120Bat": 1280,
1283
+ "\u0120head": 1281,
1284
+ "\u0120bo": 1282,
1285
+ "\u0120mov": 1283,
1286
+ "ains": 1284,
1287
+ "\u0120present": 1285,
1288
+ "\u0120Pl": 1286,
1289
+ "\u0120vers": 1287,
1290
+ "ination": 1288,
1291
+ "omen": 1289,
1292
+ "\u0120various": 1290,
1293
+ "\u0120human": 1291,
1294
+ "ator": 1292,
1295
+ "une": 1293,
1296
+ "\u012010": 1294,
1297
+ "\u0120cer": 1295,
1298
+ "\u0120said": 1296,
1299
+ "\u0120Bar": 1297,
1300
+ "ster": 1298,
1301
+ "me": 1299,
1302
+ "\u0120led": 1300,
1303
+ "\u0120French": 1301,
1304
+ "\u0120what": 1302,
1305
+ "gin": 1303,
1306
+ "af": 1304,
1307
+ "\u0120Air": 1305,
1308
+ "\u0120report": 1306,
1309
+ "\u0120fre": 1307,
1310
+ "\u0120place": 1308,
1311
+ "rent": 1309,
1312
+ "\u0120aff": 1310,
1313
+ "yd": 1311,
1314
+ "\u0120pract": 1312,
1315
+ "ches": 1313,
1316
+ "rab": 1314,
1317
+ "\u0120Roman": 1315,
1318
+ "\u0120important": 1316,
1319
+ "ality": 1317,
1320
+ "empt": 1318,
1321
+ "\u0120start": 1319,
1322
+ "\u0120vis": 1320,
1323
+ "ument": 1321,
1324
+ "ugust": 1322,
1325
+ "\u0120build": 1323,
1326
+ "cing": 1324,
1327
+ "\u0120All": 1325,
1328
+ "\u0120child": 1326,
1329
+ "alth": 1327,
1330
+ "\u0120somet": 1328,
1331
+ "\u0120line": 1329,
1332
+ "hod": 1330,
1333
+ "by": 1331,
1334
+ "\u0120Mus": 1332,
1335
+ "\u0120move": 1333,
1336
+ "\u0120ve": 1334,
1337
+ "\u0120last": 1335,
1338
+ "\u0120bi": 1336,
1339
+ "\u0120crit": 1337,
1340
+ "ray": 1338,
1341
+ "app": 1339,
1342
+ "\u0120great": 1340,
1343
+ "\u0120development": 1341,
1344
+ "\u0120pat": 1342,
1345
+ "\u0120August": 1343,
1346
+ "istic": 1344,
1347
+ "\u0120Acc": 1345,
1348
+ "ither": 1346,
1349
+ "idd": 1347,
1350
+ "ively": 1348,
1351
+ "lin": 1349,
1352
+ "ales": 1350,
1353
+ "\u0120game": 1351,
1354
+ "\u0120BC": 1352,
1355
+ "rew": 1353,
1356
+ "cer": 1354,
1357
+ "ites": 1355,
1358
+ "\u0120school": 1356,
1359
+ "ilt": 1357,
1360
+ "\u0120usually": 1358,
1361
+ "\u0120developed": 1359,
1362
+ "\u0120make": 1360,
1363
+ "\u0120claim": 1361,
1364
+ "ms": 1362,
1365
+ "\u0120given": 1363,
1366
+ "ember": 1364,
1367
+ "\u0120anim": 1365,
1368
+ "\u0120military": 1366,
1369
+ "\u0120general": 1367,
1370
+ "med": 1368,
1371
+ "\u0120short": 1369,
1372
+ "..": 1370,
1373
+ "\u0120less": 1371,
1374
+ "\u0120mult": 1372,
1375
+ "day": 1373,
1376
+ "olic": 1374,
1377
+ "\u0120home": 1375,
1378
+ "\u0120occur": 1376,
1379
+ "\u0120left": 1377,
1380
+ "\u0120And": 1378,
1381
+ "tal": 1379,
1382
+ "\u0120ident": 1380,
1383
+ "amb": 1381,
1384
+ "\u0120Apol": 1382,
1385
+ "\u0120invol": 1383,
1386
+ "\u0120national": 1384,
1387
+ "\u0120German": 1385,
1388
+ "\u0120sug": 1386,
1389
+ "\u0120former": 1387,
1390
+ "\u0120US": 1388,
1391
+ "\u0120become": 1389,
1392
+ "\u0120without": 1390,
1393
+ "\u0120signific": 1391,
1394
+ "\u0120repl": 1392,
1395
+ "\u0120proper": 1393,
1396
+ "\u0120history": 1394,
1397
+ "\u0120ide": 1395,
1398
+ "\u0120old": 1396,
1399
+ "ocial": 1397,
1400
+ "\u0120countries": 1398,
1401
+ "\u0120fore": 1399,
1402
+ "\u0120another": 1400,
1403
+ "\u0120lim": 1401,
1404
+ "\u0120During": 1402,
1405
+ "\u0120anal": 1403,
1406
+ "uted": 1404,
1407
+ "ata": 1405,
1408
+ "\u0120When": 1406,
1409
+ "be": 1407,
1410
+ "\u0120single": 1408,
1411
+ "\u0120colle": 1409,
1412
+ "itution": 1410,
1413
+ "\u0120need": 1411,
1414
+ "\u0120His": 1412,
1415
+ "\u0120Will": 1413,
1416
+ "rand": 1414,
1417
+ "\u0120field": 1415,
1418
+ "\u0120inc": 1416,
1419
+ "ague": 1417,
1420
+ "The": 1418,
1421
+ "\u0120Ass": 1419,
1422
+ "\u0120every": 1420,
1423
+ "\u0120Russ": 1421,
1424
+ "\u0120York": 1422,
1425
+ "astern": 1423,
1426
+ "akes": 1424,
1427
+ "\u0120port": 1425,
1428
+ "ivil": 1426,
1429
+ "\u0120struct": 1427,
1430
+ "\u0120published": 1428,
1431
+ "ots": 1429,
1432
+ "\u0120dev": 1430,
1433
+ "urther": 1431,
1434
+ "wards": 1432,
1435
+ "omin": 1433,
1436
+ "\u0120local": 1434,
1437
+ "\u0120prim": 1435,
1438
+ "ats": 1436,
1439
+ "\u0120arg": 1437,
1440
+ "line": 1438,
1441
+ "ware": 1439,
1442
+ "ule": 1440,
1443
+ "\u0120cr": 1441,
1444
+ "\u0120prot": 1442,
1445
+ "ling": 1443,
1446
+ "\u0120states": 1444,
1447
+ "\u0120obs": 1445,
1448
+ "\u0120Cal": 1446,
1449
+ "\u0120go": 1447,
1450
+ "\u0120family": 1448,
1451
+ "\u0120down": 1449,
1452
+ "ze": 1450,
1453
+ "\u0120Ear": 1451,
1454
+ "cil": 1452,
1455
+ "\u0120bus": 1453,
1456
+ "\u0120late": 1454,
1457
+ "ably": 1455,
1458
+ "\u0120associ": 1456,
1459
+ "ances": 1457,
1460
+ "\u0120Tr": 1458,
1461
+ "\u012012": 1459,
1462
+ "\u0120way": 1460,
1463
+ "\u0120political": 1461,
1464
+ "\u0120origin": 1462,
1465
+ "\u0120arch": 1463,
1466
+ "\u0120hig": 1464,
1467
+ "less": 1465,
1468
+ "ai": 1466,
1469
+ "\u0120European": 1467,
1470
+ "rote": 1468,
1471
+ "\u0120iss": 1469,
1472
+ "\u0120Ge": 1470,
1473
+ "\u0120few": 1471,
1474
+ "\u0120region": 1472,
1475
+ "\u0120Apollo": 1473,
1476
+ "\u0120caus": 1474,
1477
+ "ogn": 1475,
1478
+ "pecial": 1476,
1479
+ "\u0120Canad": 1477,
1480
+ "\u0120view": 1478,
1481
+ "\u0120exist": 1479,
1482
+ "ivid": 1480,
1483
+ "\u0120rul": 1481,
1484
+ "\u0120author": 1482,
1485
+ "\u0120however": 1483,
1486
+ "\u0120mill": 1484,
1487
+ "\u0120Church": 1485,
1488
+ "\u01209": 1486,
1489
+ "iron": 1487,
1490
+ "\u0120cond": 1488,
1491
+ "\u0120treat": 1489,
1492
+ "ones": 1490,
1493
+ "\u0120chang": 1491,
1494
+ "olution": 1492,
1495
+ "\u0120sk": 1493,
1496
+ "\u0120beh": 1494,
1497
+ "ential": 1495,
1498
+ "\u0120level": 1496,
1499
+ "\u0120202": 1497,
1500
+ "aterial": 1498,
1501
+ "\u0120body": 1499,
1502
+ "\u0120Bo": 1500,
1503
+ "ele": 1501,
1504
+ "ops": 1502,
1505
+ "\u0120hel": 1503,
1506
+ "\u0120included": 1504,
1507
+ "\u0120gl": 1505,
1508
+ "\u0120May": 1506,
1509
+ "\u0120independ": 1507,
1510
+ "au": 1508,
1511
+ "\u0120function": 1509,
1512
+ "\u0120indust": 1510,
1513
+ "ured": 1511,
1514
+ "\u0120men": 1512,
1515
+ "lear": 1513,
1516
+ "\u0120held": 1514,
1517
+ "\u0120tele": 1515,
1518
+ "ery": 1516,
1519
+ "\u0120Par": 1517,
1520
+ "aster": 1518,
1521
+ "\u0120prof": 1519,
1522
+ "\u0120kill": 1520,
1523
+ "\u0120Rep": 1521,
1524
+ "\u0120point": 1522,
1525
+ "idence": 1523,
1526
+ "side": 1524,
1527
+ "\u0120anc": 1525,
1528
+ "\u0120sour": 1526,
1529
+ "irc": 1527,
1530
+ "\u0120proble": 1528,
1531
+ "\u0120introdu": 1529,
1532
+ "\u0120pur": 1530,
1533
+ "ead": 1531,
1534
+ "ored": 1532,
1535
+ "\u0120Pr": 1533,
1536
+ "\u0120One": 1534,
1537
+ "hern": 1535,
1538
+ "\u0120although": 1536,
1539
+ "\u0120case": 1537,
1540
+ "\u012011": 1538,
1541
+ "\u0120day": 1539,
1542
+ "\u0120population": 1540,
1543
+ "ruction": 1541,
1544
+ "\u0120significant": 1542,
1545
+ "oon": 1543,
1546
+ "rain": 1544,
1547
+ "\u0120took": 1545,
1548
+ "bit": 1546,
1549
+ "\u0120camp": 1547,
1550
+ "ention": 1548,
1551
+ "ability": 1549,
1552
+ "\u0120193": 1550,
1553
+ "ering": 1551,
1554
+ "\u0120event": 1552,
1555
+ "\u0120prom": 1553,
1556
+ "formation": 1554,
1557
+ "ividual": 1555,
1558
+ "\u0120attempt": 1556,
1559
+ "\u0120Min": 1557,
1560
+ "\u0120Ex": 1558,
1561
+ "rict": 1559,
1562
+ "\u0120Char": 1560,
1563
+ "\u0120init": 1561,
1564
+ "sequ": 1562,
1565
+ "ott": 1563,
1566
+ "\u0120phys": 1564,
1567
+ "\u0120just": 1565,
1568
+ "\u0120space": 1566,
1569
+ "put": 1567,
1570
+ "\u0120wrote": 1568,
1571
+ "\u0120object": 1569,
1572
+ "\u0120suggest": 1570,
1573
+ "\u0120West": 1571,
1574
+ "ges": 1572,
1575
+ "\u0120separ": 1573,
1576
+ "\u0120material": 1574,
1577
+ "\u0120systems": 1575,
1578
+ "\u0120Fran": 1576,
1579
+ "\u0120According": 1577,
1580
+ "\u0120should": 1578,
1581
+ "orthern": 1579,
1582
+ "\u0120does": 1580,
1583
+ "ival": 1581,
1584
+ ".,": 1582,
1585
+ "\u0120God": 1583,
1586
+ "itive": 1584,
1587
+ "\u0120read": 1585,
1588
+ "atural": 1586,
1589
+ "\u0120method": 1587,
1590
+ "\u0120redu": 1588,
1591
+ "\u0120low": 1589,
1592
+ "\u0120activ": 1590,
1593
+ "\u0120having": 1591,
1594
+ "\u0120areas": 1592,
1595
+ "\u0120Oct": 1593,
1596
+ "\u0120relig": 1594,
1597
+ "\u0120largest": 1595,
1598
+ "\u0120Greek": 1596,
1599
+ "\u0120named": 1597,
1600
+ "\u0120Dav": 1598,
1601
+ "aun": 1599,
1602
+ "\u012014": 1600,
1603
+ "\u0120produced": 1601,
1604
+ "\u0120product": 1602,
1605
+ "rated": 1603,
1606
+ "emb": 1604,
1607
+ "most": 1605,
1608
+ "\u0120Ed": 1606,
1609
+ "\u0120role": 1607,
1610
+ "\u0120written": 1608,
1611
+ "for": 1609,
1612
+ "anu": 1610,
1613
+ "\u0120Bro": 1611,
1614
+ "\u0120cal": 1612,
1615
+ "\u0120wide": 1613,
1616
+ "aken": 1614,
1617
+ "\u0120original": 1615,
1618
+ "\u0120Christian": 1616,
1619
+ "ek": 1617,
1620
+ "\u0120individual": 1618,
1621
+ "\u0120band": 1619,
1622
+ "\u0120light": 1620,
1623
+ "\u0120established": 1621,
1624
+ "\u0120times": 1622,
1625
+ "\u0120cell": 1623,
1626
+ "\u0120test": 1624,
1627
+ "ained": 1625,
1628
+ "\u0120black": 1626,
1629
+ "ouncil": 1627,
1630
+ "\u0120Ber": 1628,
1631
+ "\u0120prev": 1629,
1632
+ "\u0120ep": 1630,
1633
+ "\u0120help": 1631,
1634
+ "\u0120er": 1632,
1635
+ "\u0120decl": 1633,
1636
+ "\u0120sym": 1634,
1637
+ "\u0120north": 1635,
1638
+ "che": 1636,
1639
+ "\u0120appear": 1637,
1640
+ "iving": 1638,
1641
+ "iness": 1639,
1642
+ "alf": 1640,
1643
+ "\u012013": 1641,
1644
+ "\u0120Union": 1642,
1645
+ "ril": 1643,
1646
+ "\u0120mean": 1644,
1647
+ "\u0120rese": 1645,
1648
+ "\u0120Sept": 1646,
1649
+ "ape": 1647,
1650
+ "ploy": 1648,
1651
+ "\u0120real": 1649,
1652
+ "\u0120must": 1650,
1653
+ "\u0120mag": 1651,
1654
+ "\u0120Her": 1652,
1655
+ "omb": 1653,
1656
+ "ends": 1654,
1657
+ "\u0120win": 1655,
1658
+ "augh": 1656,
1659
+ "ype": 1657,
1660
+ "ham": 1658,
1661
+ "\u0120either": 1659,
1662
+ "\u0120built": 1660,
1663
+ "ched": 1661,
1664
+ "ctor": 1662,
1665
+ "\u0120sometimes": 1663,
1666
+ "aur": 1664,
1667
+ "aut": 1665,
1668
+ "\u0120take": 1666,
1669
+ "\u0120service": 1667,
1670
+ "ote": 1668,
1671
+ "\u0120bre": 1669,
1672
+ "ways": 1670,
1673
+ "\u0120five": 1671,
1674
+ "ipp": 1672,
1675
+ "alk": 1673,
1676
+ "abor": 1674,
1677
+ "thern": 1675,
1678
+ "oms": 1676,
1679
+ "\u0120king": 1677,
1680
+ "ffic": 1678,
1681
+ "\u0120Arab": 1679,
1682
+ "ilos": 1680,
1683
+ "\u0120released": 1681,
1684
+ "ym": 1682,
1685
+ "\u0120turn": 1683,
1686
+ "bert": 1684,
1687
+ "\u0120theory": 1685,
1688
+ "\u0120England": 1686,
1689
+ "\u0120sent": 1687,
1690
+ "anes": 1688,
1691
+ "\u0120Other": 1689,
1692
+ "\u0120production": 1690,
1693
+ "\u0120my": 1691,
1694
+ "\u0120grow": 1692,
1695
+ "zer": 1693,
1696
+ "atures": 1694,
1697
+ "\u0120del": 1695,
1698
+ "\u0120others": 1696,
1699
+ "\u0120el": 1697,
1700
+ "umb": 1698,
1701
+ "\u0120described": 1699,
1702
+ "\u0120played": 1700,
1703
+ "rian": 1701,
1704
+ "ison": 1702,
1705
+ "uly": 1703,
1706
+ "outhern": 1704,
1707
+ "\u0120games": 1705,
1708
+ "\u0120top": 1706,
1709
+ "\u0120generally": 1707,
1710
+ "acy": 1708,
1711
+ "\u0120base": 1709,
1712
+ "ining": 1710,
1713
+ "\u0120languages": 1711,
1714
+ "\u0120social": 1712,
1715
+ "ency": 1713,
1716
+ "\u0120Although": 1714,
1717
+ "\u0120six": 1715,
1718
+ "ler": 1716,
1719
+ "\u0120sw": 1717,
1720
+ "pecially": 1718,
1721
+ "\u0120south": 1719,
1722
+ "\u0120America": 1720,
1723
+ "\u0120strong": 1721,
1724
+ "used": 1722,
1725
+ "\u0120alb": 1723,
1726
+ "\u0120study": 1724,
1727
+ "face": 1725,
1728
+ "\u0120September": 1726,
1729
+ "eth": 1727,
1730
+ "\u0120Gen": 1728,
1731
+ "\u0120let": 1729,
1732
+ "ille": 1730,
1733
+ "\u0120\u0120": 1731,
1734
+ "\u0120tit": 1732,
1735
+ "ergy": 1733,
1736
+ "overed": 1734,
1737
+ "\u0120dest": 1735,
1738
+ "urg": 1736,
1739
+ "here": 1737,
1740
+ "ected": 1738,
1741
+ "\u0120Hist": 1739,
1742
+ "yle": 1740,
1743
+ "aces": 1741,
1744
+ "\u0120official": 1742,
1745
+ "\u0120relations": 1743,
1746
+ "aries": 1744,
1747
+ "\u0120groups": 1745,
1748
+ "\u0120William": 1746,
1749
+ "\u0120how": 1747,
1750
+ "\u0120works": 1748,
1751
+ "\u0120position": 1749,
1752
+ "self": 1750,
1753
+ "co": 1751,
1754
+ "\u0120mass": 1752,
1755
+ "\u0120special": 1753,
1756
+ "ban": 1754,
1757
+ "\u0120circ": 1755,
1758
+ "aren": 1756,
1759
+ "\u0120free": 1757,
1760
+ "\u0120Act": 1758,
1761
+ "\u0120earli": 1759,
1762
+ "\u0120ver": 1760,
1763
+ "ret": 1761,
1764
+ "incip": 1762,
1765
+ "ung": 1763,
1766
+ "\u0120son": 1764,
1767
+ "\u0120further": 1765,
1768
+ "\u0120While": 1766,
1769
+ "reen": 1767,
1770
+ "ondon": 1768,
1771
+ "\u0120came": 1769,
1772
+ "\u0120Many": 1770,
1773
+ "\u0120vol": 1771,
1774
+ "fic": 1772,
1775
+ "\u0120mot": 1773,
1776
+ "ips": 1774,
1777
+ "\u0120She": 1775,
1778
+ "\u0120Jew": 1776,
1779
+ "\u0120company": 1777,
1780
+ "cul": 1778,
1781
+ "anuary": 1779,
1782
+ "\u0120song": 1780,
1783
+ "\u0120addition": 1781,
1784
+ "\u0120possible": 1782,
1785
+ "\u0120With": 1783,
1786
+ "\u0120rather": 1784,
1787
+ "\u0120Black": 1785,
1788
+ "bs": 1786,
1789
+ "bon": 1787,
1790
+ "\u0120post": 1788,
1791
+ "\u0120Ac": 1789,
1792
+ "\u0120Bel": 1790,
1793
+ "ending": 1791,
1794
+ "vision": 1792,
1795
+ "asing": 1793,
1796
+ "\u0120Br": 1794,
1797
+ "\u0120created": 1795,
1798
+ "\u0120version": 1796,
1799
+ "\u0120October": 1797,
1800
+ "\u0120syn": 1798,
1801
+ "\u0120Republic": 1799,
1802
+ "work": 1800,
1803
+ "\u0120third": 1801,
1804
+ "iation": 1802,
1805
+ "\u0120Gu": 1803,
1806
+ "\u0120January": 1804,
1807
+ "osp": 1805,
1808
+ "ify": 1806,
1809
+ "mon": 1807,
1810
+ "rap": 1808,
1811
+ "\u0120ri": 1809,
1812
+ "\u0120Alexander": 1810,
1813
+ "sh": 1811,
1814
+ "\u0120territ": 1812,
1815
+ "\u0120191": 1813,
1816
+ "\u0120Part": 1814,
1817
+ "\u0120range": 1815,
1818
+ "ridge": 1816,
1819
+ "cember": 1817,
1820
+ "\u0120forces": 1818,
1821
+ "ification": 1819,
1822
+ "let": 1820,
1823
+ "\u0120express": 1821,
1824
+ "\u0120Emp": 1822,
1825
+ "\u0120avail": 1823,
1826
+ "\u0120parts": 1824,
1827
+ "then": 1825,
1828
+ "ociety": 1826,
1829
+ "\u0120certain": 1827,
1830
+ "\u0120Har": 1828,
1831
+ "ovember": 1829,
1832
+ "\u0120chem": 1830,
1833
+ "\u0120London": 1831,
1834
+ "\u0120David": 1832,
1835
+ "atin": 1833,
1836
+ "sc": 1834,
1837
+ "avy": 1835,
1838
+ "\u0120current": 1836,
1839
+ "rance": 1837,
1840
+ "etic": 1838,
1841
+ "\u0120tri": 1839,
1842
+ "\u0120July": 1840,
1843
+ "\u0120sold": 1841,
1844
+ "\u0120International": 1842,
1845
+ "\u0120Book": 1843,
1846
+ "\u0120especially": 1844,
1847
+ "\u0120March": 1845,
1848
+ "\u0120enc": 1846,
1849
+ "lor": 1847,
1850
+ "\u0120East": 1848,
1851
+ "utions": 1849,
1852
+ "\u0120sum": 1850,
1853
+ "\u0120meas": 1851,
1854
+ "\u0120December": 1852,
1855
+ "\u0120-": 1853,
1856
+ "\u0120X": 1854,
1857
+ "\u0120town": 1855,
1858
+ "\u0120numbers": 1856,
1859
+ "arian": 1857,
1860
+ "\u0120April": 1858,
1861
+ "aul": 1859,
1862
+ "enced": 1860,
1863
+ "\u0120Mon": 1861,
1864
+ "\u0120November": 1862,
1865
+ "\u0120aircraft": 1863,
1866
+ "\u0120ast": 1864,
1867
+ "\u0120bar": 1865,
1868
+ "\u0120complex": 1866,
1869
+ "iction": 1867,
1870
+ "ope": 1868,
1871
+ "\u0120food": 1869,
1872
+ "\u0120good": 1870,
1873
+ "eum": 1871,
1874
+ "\u0120side": 1872,
1875
+ "\u0120close": 1873,
1876
+ "\u0120compet": 1874,
1877
+ "\u0120continued": 1875,
1878
+ "BC": 1876,
1879
+ "\u0120capital": 1877,
1880
+ "\u0120Med": 1878,
1881
+ "\u0120Geor": 1879,
1882
+ "arily": 1880,
1883
+ "\u0120broad": 1881,
1884
+ "\u0120recogn": 1882,
1885
+ "\u0120computer": 1883,
1886
+ "ateg": 1884,
1887
+ "\u0120list": 1885,
1888
+ "\u0120natural": 1886,
1889
+ "\u0120data": 1887,
1890
+ "\u0120June": 1888,
1891
+ "\u0120Western": 1889,
1892
+ "ming": 1890,
1893
+ "\u0120interest": 1891,
1894
+ "ps": 1892,
1895
+ "\u0120pri": 1893,
1896
+ "\u0120pot": 1894,
1897
+ "\u0120Press": 1895,
1898
+ "\u0120elements": 1896,
1899
+ "iddle": 1897,
1900
+ "ples": 1898,
1901
+ "like": 1899,
1902
+ "\u0120priv": 1900,
1903
+ "\u0120educ": 1901,
1904
+ "\u0120observ": 1902,
1905
+ "\u0120album": 1903,
1906
+ "ald": 1904,
1907
+ "hers": 1905,
1908
+ "aly": 1906,
1909
+ "ths": 1907,
1910
+ "ert": 1908,
1911
+ "\u0120care": 1909,
1912
+ "\u0120battle": 1910,
1913
+ "\u0120Art": 1911,
1914
+ "\u0120international": 1912,
1915
+ "\u0120abs": 1913,
1916
+ "\u0120Jap": 1914,
1917
+ "\u0120frequ": 1915,
1918
+ "oung": 1916,
1919
+ "\u0120million": 1917,
1920
+ "\u0120Empire": 1918,
1921
+ "\u0120fail": 1919,
1922
+ "aves": 1920,
1923
+ "atter": 1921,
1924
+ "rag": 1922,
1925
+ "\u0120civil": 1923,
1926
+ "\u0120App": 1924,
1927
+ "ief": 1925,
1928
+ "ront": 1926,
1929
+ "\u0120member": 1927,
1930
+ "iment": 1928,
1931
+ "oyal": 1929,
1932
+ "lation": 1930,
1933
+ "\u0120hold": 1931,
1934
+ "ounced": 1932,
1935
+ "\u0120subject": 1933,
1936
+ "\u0120accept": 1934,
1937
+ "mit": 1935,
1938
+ "wh": 1936,
1939
+ "eration": 1937,
1940
+ "\u0120formed": 1938,
1941
+ "\u0120cat": 1939,
1942
+ "ana": 1940,
1943
+ "rug": 1941,
1944
+ "ensive": 1942,
1945
+ "chan": 1943,
1946
+ "\u0120forms": 1944,
1947
+ "\u0120City": 1945,
1948
+ "\u0120Most": 1946,
1949
+ "\u0120terms": 1947,
1950
+ "\u0120received": 1948,
1951
+ "\u0120never": 1949,
1952
+ "\u0120type": 1950,
1953
+ "\u0120season": 1951,
1954
+ "\u0120impro": 1952,
1955
+ "\u0120Army": 1953,
1956
+ "\u0120press": 1954,
1957
+ "\u0120admin": 1955,
1958
+ "\u0120princip": 1956,
1959
+ "int": 1957,
1960
+ "\u0120der": 1958,
1961
+ "viron": 1959,
1962
+ "\u0120won": 1960,
1963
+ "\u0120throughout": 1961,
1964
+ "cur": 1962,
1965
+ "reg": 1963,
1966
+ "ators": 1964,
1967
+ "\u0120cop": 1965,
1968
+ "oid": 1966,
1969
+ "ask": 1967,
1970
+ "\u0120subst": 1968,
1971
+ "eter": 1969,
1972
+ "\u0120Man": 1970,
1973
+ "\u0120children": 1971,
1974
+ "\u0120women": 1972,
1975
+ "\u0120available": 1973,
1976
+ "\u0120192": 1974,
1977
+ "adem": 1975,
1978
+ "\u0120exec": 1976,
1979
+ "\u0120account": 1977,
1980
+ "ising": 1978,
1981
+ "\u0120multip": 1979,
1982
+ "\u0120ge": 1980,
1983
+ "ights": 1981,
1984
+ "ude": 1982,
1985
+ "\u0120colon": 1983,
1986
+ "hib": 1984,
1987
+ "ibr": 1985,
1988
+ "\u0120manag": 1986,
1989
+ "\u0120engine": 1987,
1990
+ "reme": 1988,
1991
+ "\u0120seen": 1989,
1992
+ "\u0120cy": 1990,
1993
+ "\u0120African": 1991,
1994
+ "\u0120host": 1992,
1995
+ "\u0120died": 1993,
1996
+ "\u0120ren": 1994,
1997
+ "\u0120making": 1995,
1998
+ "\u0120least": 1996,
1999
+ "\u0120previous": 1997,
2000
+ "osition": 1998,
2001
+ "ached": 1999,
2002
+ "\u0120laun": 2000,
2003
+ "of": 2001,
2004
+ "\u0120research": 2002,
2005
+ "val": 2003,
2006
+ "rum": 2004,
2007
+ "ened": 2005,
2008
+ "raw": 2006,
2009
+ "\u0120evidence": 2007,
2010
+ "\u0120Cor": 2008,
2011
+ "\u0120central": 2009,
2012
+ "\u0120complete": 2010,
2013
+ "utes": 2011,
2014
+ "ades": 2012,
2015
+ "\u0120ground": 2013,
2016
+ "sp": 2014,
2017
+ "\u0120age": 2015,
2018
+ "\u0120half": 2016,
2019
+ "\u0120himself": 2017,
2020
+ "\u0120Sw": 2018,
2021
+ "\u0120lower": 2019,
2022
+ "\u0120ancient": 2020,
2023
+ "itu": 2021,
2024
+ "\u0120Since": 2022,
2025
+ "\u0120army": 2023,
2026
+ "\u0120feature": 2024,
2027
+ "stand": 2025,
2028
+ "\u0120super": 2026,
2029
+ "inese": 2027,
2030
+ "\u0120Sov": 2028,
2031
+ "\u0120Ant": 2029,
2032
+ "\u0120philos": 2030,
2033
+ "\u0120economic": 2031,
2034
+ "\u0120thus": 2032,
2035
+ "imate": 2033,
2036
+ "istan": 2034,
2037
+ "\u0120tro": 2035,
2038
+ "\u0120capt": 2036,
2039
+ "ria": 2037,
2040
+ "\u0120force": 2038,
2041
+ "\u0120Pres": 2039,
2042
+ "\u0120coast": 2040,
2043
+ "lam": 2041,
2044
+ "\u0120offer": 2042,
2045
+ "\u0120standard": 2043,
2046
+ "\u0120information": 2044,
2047
+ "\u0120regard": 2045,
2048
+ "\u0120Australian": 2046,
2049
+ "iber": 2047,
2050
+ "alt": 2048,
2051
+ "\u0120net": 2049,
2052
+ "\u0120Ne": 2050,
2053
+ "\u0120eth": 2051,
2054
+ "\u0120pain": 2052,
2055
+ "\u0120days": 2053,
2056
+ "ption": 2054,
2057
+ "\u0120den": 2055,
2058
+ "\u0120Canada": 2056,
2059
+ "erous": 2057,
2060
+ "bo": 2058,
2061
+ "\u0120Cath": 2059,
2062
+ "\u0120command": 2060,
2063
+ "kn": 2061,
2064
+ "\u0120deb": 2062,
2065
+ "\u0120If": 2063,
2066
+ "play": 2064,
2067
+ "aper": 2065,
2068
+ "\u0120tre": 2066,
2069
+ "\u0120Great": 2067,
2070
+ "\u0120consist": 2068,
2071
+ "\u0120island": 2069,
2072
+ "ama": 2070,
2073
+ "\u0120Co": 2071,
2074
+ "\u0120above": 2072,
2075
+ "\u0120find": 2073,
2076
+ "\u0120Des": 2074,
2077
+ "inary": 2075,
2078
+ "\u0120Bra": 2076,
2079
+ "ford": 2077,
2080
+ "\u0120Pol": 2078,
2081
+ "ique": 2079,
2082
+ "\u0120Su": 2080,
2083
+ "\u0120gr": 2081,
2084
+ "\u0120across": 2082,
2085
+ "arn": 2083,
2086
+ "\u0120Africa": 2084,
2087
+ "\u0120concept": 2085,
2088
+ "\u0120means": 2086,
2089
+ "\u0120hard": 2087,
2090
+ "\u0120Ser": 2088,
2091
+ "\u0120scient": 2089,
2092
+ "cript": 2090,
2093
+ "\u0120project": 2091,
2094
+ "\u0120conc": 2092,
2095
+ "\u0120join": 2093,
2096
+ "\u0120court": 2094,
2097
+ "\u0120ball": 2095,
2098
+ "\u0120profess": 2096,
2099
+ "\u0120taken": 2097,
2100
+ "ula": 2098,
2101
+ "rael": 2099,
2102
+ "\u0120polic": 2100,
2103
+ "\u0120To": 2101,
2104
+ "lement": 2102,
2105
+ "\u0120entire": 2103,
2106
+ "\u0120Earth": 2104,
2107
+ "\u0120River": 2105,
2108
+ "gypt": 2106,
2109
+ "\u0120best": 2107,
2110
+ "pite": 2108,
2111
+ "\u0120model": 2109,
2112
+ "AS": 2110,
2113
+ "\u0120flow": 2111,
2114
+ "\u0120agre": 2112,
2115
+ "iol": 2113,
2116
+ "\u0120upon": 2114,
2117
+ "ingu": 2115,
2118
+ "\u0120tw": 2116,
2119
+ "\u0120energy": 2117,
2120
+ "\u0120determ": 2118,
2121
+ "\u0120fire": 2119,
2122
+ "\u0120begin": 2120,
2123
+ "\u0120Kingdom": 2121,
2124
+ "\u0120red": 2122,
2125
+ "\u0120next": 2123,
2126
+ "\u0120Comp": 2124,
2127
+ "200": 2125,
2128
+ "\u0120soft": 2126,
2129
+ "inct": 2127,
2130
+ "\u0120prote": 2128,
2131
+ "\u0120Inst": 2129,
2132
+ "\u0120includes": 2130,
2133
+ "ney": 2131,
2134
+ "\u0120located": 2132,
2135
+ "\u0120president": 2133,
2136
+ "\u0120party": 2134,
2137
+ "\u0120traditional": 2135,
2138
+ "\u0120maintain": 2136,
2139
+ "\u0120Sy": 2137,
2140
+ "\u0120access": 2138,
2141
+ "ef": 2139,
2142
+ "lant": 2140,
2143
+ "bor": 2141,
2144
+ "\u0120Egypt": 2142,
2145
+ "\u0120House": 2143,
2146
+ "\u0120Kh": 2144,
2147
+ "\u0120player": 2145,
2148
+ "\u0120indic": 2146,
2149
+ "\u0120market": 2147,
2150
+ "isc": 2148,
2151
+ "\u0120followed": 2149,
2152
+ "\u0120final": 2150,
2153
+ "icro": 2151,
2154
+ "\u0120Const": 2152,
2155
+ "\u0120culture": 2153,
2156
+ "\u0120conn": 2154,
2157
+ "\u0120novel": 2155,
2158
+ "\u0120sex": 2156,
2159
+ "\u0120adop": 2157,
2160
+ "\u0120successful": 2158,
2161
+ "\u0120business": 2159,
2162
+ "cle": 2160,
2163
+ "\u0120total": 2161,
2164
+ "ceed": 2162,
2165
+ "pir": 2163,
2166
+ "\u0120Te": 2164,
2167
+ "\u0120France": 2165,
2168
+ "oly": 2166,
2169
+ "\u0120father": 2167,
2170
+ "\u0120di": 2168,
2171
+ "\u0120change": 2169,
2172
+ "sel": 2170,
2173
+ "\u0120Chinese": 2171,
2174
+ "\u0120relative": 2172,
2175
+ "\u0120full": 2173,
2176
+ "\u0120neg": 2174,
2177
+ "\u0120instead": 2175,
2178
+ "\u0120anti": 2176,
2179
+ "\u0120Res": 2177,
2180
+ "iforn": 2178,
2181
+ "\u0120related": 2179,
2182
+ "ino": 2180,
2183
+ "\u0120El": 2181,
2184
+ "\u0120China": 2182,
2185
+ "\u0120text": 2183,
2186
+ "\u0120liter": 2184,
2187
+ "rast": 2185,
2188
+ "\u0120India": 2186,
2189
+ "fact": 2187,
2190
+ "\u0120doc": 2188,
2191
+ "gether": 2189,
2192
+ "ving": 2190,
2193
+ "\u0120together": 2191,
2194
+ "\u0120Latin": 2192,
2195
+ "\u0120contain": 2193,
2196
+ "\u0120foot": 2194,
2197
+ "\u0120Sm": 2195,
2198
+ "\u0120perm": 2196,
2199
+ "\u0120associated": 2197,
2200
+ "\u0120Soviet": 2198,
2201
+ "\u0120norm": 2199,
2202
+ "ads": 2200,
2203
+ "\u0120thought": 2201,
2204
+ "\u0120Conf": 2202,
2205
+ "vironment": 2203,
2206
+ "\u0120z": 2204,
2207
+ "atory": 2205,
2208
+ "\u0120trade": 2206,
2209
+ "\u0120provide": 2207,
2210
+ "\u0120sound": 2208,
2211
+ "\u0120mid": 2209,
2212
+ "\u0120Athen": 2210,
2213
+ "\u0120cases": 2211,
2214
+ "\u0120foc": 2212,
2215
+ "\u0120Australia": 2213,
2216
+ "\u0120higher": 2214,
2217
+ "\u0120mer": 2215,
2218
+ "\u0120except": 2216,
2219
+ "\u0120Phil": 2217,
2220
+ "\u0120Bas": 2218,
2221
+ "udd": 2219,
2222
+ "sy": 2220,
2223
+ "\u0120invest": 2221,
2224
+ "ebru": 2222,
2225
+ "\u0120almost": 2223,
2226
+ "order": 2224,
2227
+ "\u0120imm": 2225,
2228
+ "\u0120community": 2226,
2229
+ "bra": 2227,
2230
+ "unic": 2228,
2231
+ "ebruary": 2229,
2232
+ "\u0120see": 2230,
2233
+ "aff": 2231,
2234
+ "\u0120mount": 2232,
2235
+ "\u0120becom": 2233,
2236
+ "rest": 2234,
2237
+ "mercial": 2235,
2238
+ "\u0120itself": 2236,
2239
+ "ography": 2237,
2240
+ "action": 2238,
2241
+ "\u0120cover": 2239,
2242
+ "\u0120Pers": 2240,
2243
+ "\u0120Britain": 2241,
2244
+ "\u0120altern": 2242,
2245
+ "\u0120movement": 2243,
2246
+ "ficult": 2244,
2247
+ "\u0120miss": 2245,
2248
+ "\u0120Council": 2246,
2249
+ "\u0120Pe": 2247,
2250
+ "\u0120tar": 2248,
2251
+ "\u0120hyd": 2249,
2252
+ "cret": 2250,
2253
+ "\u0120plan": 2251,
2254
+ "ours": 2252,
2255
+ "ternal": 2253,
2256
+ "ica": 2254,
2257
+ "liament": 2255,
2258
+ "ought": 2256,
2259
+ "not": 2257,
2260
+ "\u0120ox": 2258,
2261
+ "\u0120fac": 2259,
2262
+ "icle": 2260,
2263
+ "\u0120Count": 2261,
2264
+ "\u0120employ": 2262,
2265
+ "\u0120electron": 2263,
2266
+ "\u0120films": 2264,
2267
+ "\u0120key": 2265,
2268
+ "\u0120influence": 2266,
2269
+ "\u0120fav": 2267,
2270
+ "\u0120players": 2268,
2271
+ "\u0120according": 2269,
2272
+ "\u0120introduced": 2270,
2273
+ "\u0120concern": 2271,
2274
+ "\u0120able": 2272,
2275
+ "ports": 2273,
2276
+ "\u0120difficult": 2274,
2277
+ "\u0120temper": 2275,
2278
+ "\u0120Central": 2276,
2279
+ "\u0120particularly": 2277,
2280
+ "aign": 2278,
2281
+ "iers": 2279,
2282
+ "\u0120February": 2280,
2283
+ "\u0120applic": 2281,
2284
+ "\u0120know": 2282,
2285
+ "chie": 2283,
2286
+ "\u0120referred": 2284,
2287
+ "ma": 2285,
2288
+ "\u0120features": 2286,
2289
+ "\u0120reported": 2287,
2290
+ "\u0120poin": 2288,
2291
+ "\u0120prob": 2289,
2292
+ "\u0120typically": 2290,
2293
+ "roy": 2291,
2294
+ "\u0120surv": 2292,
2295
+ "\u0120rights": 2293,
2296
+ "\u0120trib": 2294,
2297
+ "amples": 2295,
2298
+ "\u0120television": 2296,
2299
+ "hest": 2297,
2300
+ "anted": 2298,
2301
+ "\u0120value": 2299,
2302
+ "iles": 2300,
2303
+ "\u0120allowed": 2301,
2304
+ "\u0120building": 2302,
2305
+ "\u0120Californ": 2303,
2306
+ "atives": 2304,
2307
+ "\u0120lack": 2305,
2308
+ "ourt": 2306,
2309
+ "ounter": 2307,
2310
+ "yr": 2308,
2311
+ "oted": 2309,
2312
+ "\u0120respons": 2310,
2313
+ "\u0120stri": 2311,
2314
+ "\u0120tradition": 2312,
2315
+ "\u0120State": 2313,
2316
+ "\u0120defe": 2314,
2317
+ "ush": 2315,
2318
+ "ocr": 2316,
2319
+ "\u0120travel": 2317,
2320
+ "\u0120cross": 2318,
2321
+ "\u0120far": 2319,
2322
+ "\u0120compan": 2320,
2323
+ "\u0120surface": 2321,
2324
+ "medi": 2322,
2325
+ "\u0120story": 2323,
2326
+ "\u0120whose": 2324,
2327
+ "cast": 2325,
2328
+ "att": 2326,
2329
+ "\u0120fri": 2327,
2330
+ "ideo": 2328,
2331
+ "\u0120continu": 2329,
2332
+ "\u0120br": 2330,
2333
+ "\u0120Camb": 2331,
2334
+ "\u0120crew": 2332,
2335
+ "\u0120fall": 2333,
2336
+ "ibility": 2334,
2337
+ "\u0120structure": 2335,
2338
+ "\u0120contrib": 2336,
2339
+ "\u0120fund": 2337,
2340
+ "\u0120once": 2338,
2341
+ "\u0120ste": 2339,
2342
+ "\u0120deg": 2340,
2343
+ "\u0120specific": 2341,
2344
+ "\u0120required": 2342,
2345
+ "\u0120words": 2343,
2346
+ "\u0120Indian": 2344,
2347
+ "imately": 2345,
2348
+ "\u0120Catholic": 2346,
2349
+ "ream": 2347,
2350
+ "\u0120respect": 2348,
2351
+ "\u0120Germany": 2349,
2352
+ "\u0120schol": 2350,
2353
+ "born": 2351,
2354
+ "uel": 2352,
2355
+ "oral": 2353,
2356
+ "ties": 2354,
2357
+ "iverse": 2355,
2358
+ "\u0120size": 2356,
2359
+ "iting": 2357,
2360
+ "uth": 2358,
2361
+ "ara": 2359,
2362
+ "\u0120Arch": 2360,
2363
+ "\u0120books": 2361,
2364
+ "\u0120religious": 2362,
2365
+ "\u0120Cong": 2363,
2366
+ "\u0120From": 2364,
2367
+ "\u0120services": 2365,
2368
+ "\u0120Colle": 2366,
2369
+ "clus": 2367,
2370
+ "\u0120might": 2368,
2371
+ "\u0120190": 2369,
2372
+ "ysis": 2370,
2373
+ "\u0120California": 2371,
2374
+ "\u0120served": 2372,
2375
+ "rought": 2373,
2376
+ "\u0120administ": 2374,
2377
+ "\u0120Park": 2375,
2378
+ "\u0120science": 2376,
2379
+ "\u0120(\"": 2377,
2380
+ "\u0120Tur": 2378,
2381
+ "anish": 2379,
2382
+ "\u0120Me": 2380,
2383
+ "II": 2381,
2384
+ "\u0120star": 2382,
2385
+ "\u0120League": 2383,
2386
+ "\u0120integ": 2384,
2387
+ "istry": 2385,
2388
+ "\u0120mach": 2386,
2389
+ "\u0120remained": 2387,
2390
+ "oll": 2388,
2391
+ "\u0120uses": 2389,
2392
+ "\u0120disp": 2390,
2393
+ "ederal": 2391,
2394
+ "ume": 2392,
2395
+ "uck": 2393,
2396
+ "50": 2394,
2397
+ "\u0120Cro": 2395,
2398
+ "\u01202007": 2396,
2399
+ "\u0120commonly": 2397,
2400
+ "\u0120We": 2398,
2401
+ "ixed": 2399,
2402
+ "lying": 2400,
2403
+ "\u0120types": 2401,
2404
+ "\u0120Per": 2402,
2405
+ "\u0120186": 2403,
2406
+ "\u0120lo": 2404,
2407
+ "\u0120Mc": 2405,
2408
+ "\u0120meet": 2406,
2409
+ "\u0120believed": 2407,
2410
+ "ulf": 2408,
2411
+ "\u0120fig": 2409,
2412
+ "\u0120house": 2410,
2413
+ "oses": 2411,
2414
+ "\u0120industry": 2412,
2415
+ "\u0120prevent": 2413,
2416
+ "\u0120events": 2414,
2417
+ "\u0120communic": 2415,
2418
+ "\u0120gre": 2416,
2419
+ "\u01201990": 2417,
2420
+ "\u0120meaning": 2418,
2421
+ "\u0120style": 2419,
2422
+ "\u0120provided": 2420,
2423
+ "\u0120Atlant": 2421,
2424
+ "itch": 2422,
2425
+ "ols": 2423,
2426
+ "aps": 2424,
2427
+ "\u0120vict": 2425,
2428
+ "\u0120mechan": 2426,
2429
+ "ems": 2427,
2430
+ "\u0120Sea": 2428,
2431
+ "\u0120ill": 2429,
2432
+ "ington": 2430,
2433
+ "rial": 2431,
2434
+ "\u0120Em": 2432,
2435
+ "\u0120jud": 2433,
2436
+ "\u0120announced": 2434,
2437
+ "\u0120white": 2435,
2438
+ "\u0120increasing": 2436,
2439
+ "itect": 2437,
2440
+ "\u0120pen": 2438,
2441
+ "\u0120nuc": 2439,
2442
+ "\u0120commercial": 2440,
2443
+ "\u0120element": 2441,
2444
+ "\u0120tour": 2442,
2445
+ "\u0120larger": 2443,
2446
+ "\u0120emer": 2444,
2447
+ "\u0120behav": 2445,
2448
+ "\u0120Rober": 2446,
2449
+ "ength": 2447,
2450
+ "uries": 2448,
2451
+ "\u0120cause": 2449,
2452
+ "\u0120San": 2450,
2453
+ "\u0120Charles": 2451,
2454
+ "\u0120Reg": 2452,
2455
+ "\u0120killed": 2453,
2456
+ "\u0120northern": 2454,
2457
+ "\u0120site": 2455,
2458
+ "\u0120necess": 2456,
2459
+ "ires": 2457,
2460
+ "ulated": 2458,
2461
+ "\u0120rule": 2459,
2462
+ "\u0120effects": 2460,
2463
+ "\u0120School": 2461,
2464
+ "\u0120moved": 2462,
2465
+ "\u0120mission": 2463,
2466
+ "\u0120interp": 2464,
2467
+ "\u0120campaign": 2465,
2468
+ "\u0120Pal": 2466,
2469
+ "\u0120Port": 2467,
2470
+ "\u0120defin": 2468,
2471
+ "wood": 2469,
2472
+ "\u0120rock": 2470,
2473
+ "\u0120quest": 2471,
2474
+ "lim": 2472,
2475
+ "\u0120whether": 2473,
2476
+ "\u01202006": 2474,
2477
+ "\u0120prin": 2475,
2478
+ "\u012021": 2476,
2479
+ "su": 2477,
2480
+ "\u0120lost": 2478,
2481
+ "199": 2479,
2482
+ "\u0120source": 2480,
2483
+ "\u0120dou": 2481,
2484
+ "iding": 2482,
2485
+ "\u0120ord": 2483,
2486
+ "\u0120designed": 2484,
2487
+ "\u0120critic": 2485,
2488
+ "irth": 2486,
2489
+ "go": 2487,
2490
+ "\u0120transport": 2488,
2491
+ "\u0120widely": 2489,
2492
+ "\u0120god": 2490,
2493
+ "omm": 2491,
2494
+ "\u012030": 2492,
2495
+ "ership": 2493,
2496
+ "\u0120consider": 2494,
2497
+ "\u0120aud": 2495,
2498
+ "orph": 2496,
2499
+ "\u0120put": 2497,
2500
+ "\u0120domin": 2498,
2501
+ "off": 2499,
2502
+ "\u0120George": 2500,
2503
+ "\u0120past": 2501,
2504
+ "istance": 2502,
2505
+ "\u0120gave": 2503,
2506
+ "\u0120James": 2504,
2507
+ "\u01202010": 2505,
2508
+ "oney": 2506,
2509
+ "\u01201980": 2507,
2510
+ "ophy": 2508,
2511
+ "\u0120log": 2509,
2512
+ "\u0120longer": 2510,
2513
+ "\u0120achie": 2511,
2514
+ "\u0120ang": 2512,
2515
+ "\u0120legal": 2513,
2516
+ "ourn": 2514,
2517
+ "\u0120outside": 2515,
2518
+ "\u0120little": 2516,
2519
+ "based": 2517,
2520
+ "ference": 2518,
2521
+ "het": 2519,
2522
+ "\u0120nature": 2520,
2523
+ "\u01202000": 2521,
2524
+ "\u0120live": 2522,
2525
+ "\u0120church": 2523,
2526
+ "\u0120pred": 2524,
2527
+ "\u0120produce": 2525,
2528
+ "\u0120private": 2526,
2529
+ "\u0120caused": 2527,
2530
+ "\u0120purp": 2528,
2531
+ "\u0120Party": 2529,
2532
+ "selves": 2530,
2533
+ "\u0120mother": 2531,
2534
+ "\u0120rail": 2532,
2535
+ "\u0120increased": 2533,
2536
+ "\u0120started": 2534,
2537
+ "anks": 2535,
2538
+ "\u0120fem": 2536,
2539
+ "\u0120target": 2537,
2540
+ "\u0120went": 2538,
2541
+ "\u0120Court": 2539,
2542
+ "\u0120suff": 2540,
2543
+ "\u01202008": 2541,
2544
+ "ranch": 2542,
2545
+ "\u0120distrib": 2543,
2546
+ "\u01201970": 2544,
2547
+ "\u0120symb": 2545,
2548
+ "ilities": 2546,
2549
+ "\u0120Thom": 2547,
2550
+ "\u0120carbon": 2548,
2551
+ "\u0120brought": 2549,
2552
+ "ental": 2550,
2553
+ "\u01202011": 2551,
2554
+ "\u0120squ": 2552,
2555
+ "\u0120blood": 2553,
2556
+ "\u0120health": 2554,
2557
+ "\u0120No": 2555,
2558
+ "\u0120eventually": 2556,
2559
+ "\u0120UK": 2557,
2560
+ "\u0120astron": 2558,
2561
+ "\u0120away": 2559,
2562
+ "\u01202009": 2560,
2563
+ "ibrary": 2561,
2564
+ "\u0120environment": 2562,
2565
+ "\u0120reb": 2563,
2566
+ "\u0120President": 2564,
2567
+ "\u0120[": 2565,
2568
+ "\u0120remains": 2566,
2569
+ "view": 2567,
2570
+ "\u0120conditions": 2568,
2571
+ "\u0120viol": 2569,
2572
+ "\u0120Cont": 2570,
2573
+ "\u0120writing": 2571,
2574
+ "\u0120leading": 2572,
2575
+ "\u0120likely": 2573,
2576
+ "\u0120sat": 2574,
2577
+ "\u0120General": 2575,
2578
+ "itute": 2576,
2579
+ "\u0120gra": 2577,
2580
+ "aching": 2578,
2581
+ "anese": 2579,
2582
+ "\u0120dial": 2580,
2583
+ "\u0120appeared": 2581,
2584
+ "\")": 2582,
2585
+ "\u0120mathem": 2583,
2586
+ "\u0120independent": 2584,
2587
+ "\u0120education": 2585,
2588
+ "\u0120fle": 2586,
2589
+ "\u0120Dem": 2587,
2590
+ "ills": 2588,
2591
+ "eleb": 2589,
2592
+ "anced": 2590,
2593
+ "\u0120relationship": 2591,
2594
+ "\u0120cities": 2592,
2595
+ "light": 2593,
2596
+ "\u0120changes": 2594,
2597
+ "oice": 2595,
2598
+ "\u0120Asia": 2596,
2599
+ "\u0120added": 2597,
2600
+ "reet": 2598,
2601
+ "\u0120election": 2599,
2602
+ "\u0120radio": 2600,
2603
+ "ores": 2601,
2604
+ "\u0120code": 2602,
2605
+ "\u0120returned": 2603,
2606
+ "\u0120Ste": 2604,
2607
+ "\u0120effort": 2605,
2608
+ "\u0120require": 2606,
2609
+ "\u0120replaced": 2607,
2610
+ "\u0120hydro": 2608,
2611
+ "otic": 2609,
2612
+ "\u0120sea": 2610,
2613
+ "rate": 2611,
2614
+ "\u0120coll": 2612,
2615
+ "itiz": 2613,
2616
+ "\u0120Govern": 2614,
2617
+ "\u0120regions": 2615,
2618
+ "\u0120problems": 2616,
2619
+ "hing": 2617,
2620
+ "\u0120clear": 2618,
2621
+ "\u0120et": 2619,
2622
+ "\u0120themselves": 2620,
2623
+ "\u0120problem": 2621,
2624
+ "\u0120software": 2622,
2625
+ "\u0120southern": 2623,
2626
+ "ufact": 2624,
2627
+ "amer": 2625,
2628
+ "\u0120High": 2626,
2629
+ "\u0120physical": 2627,
2630
+ "\u0120vir": 2628,
2631
+ "\u01201960": 2629,
2632
+ "\u0120animals": 2630,
2633
+ "\u0120self": 2631,
2634
+ "\u0120west": 2632,
2635
+ "\u0120acid": 2633,
2636
+ "\u0120particip": 2634,
2637
+ "\u0120eas": 2635,
2638
+ "\u0120conver": 2636,
2639
+ "ricult": 2637,
2640
+ "\u0120189": 2638,
2641
+ "\u0120action": 2639,
2642
+ "\u0120Royal": 2640,
2643
+ "\u0120manufact": 2641,
2644
+ "ett": 2642,
2645
+ "irit": 2643,
2646
+ "\u0120companies": 2644,
2647
+ "\u0120cost": 2645,
2648
+ "uture": 2646,
2649
+ "\u0120condu": 2647,
2650
+ "pper": 2648,
2651
+ "acters": 2649,
2652
+ "\u0120variety": 2650,
2653
+ "\u0120east": 2651,
2654
+ "\u0120defined": 2652,
2655
+ "asion": 2653,
2656
+ "\u0120reason": 2654,
2657
+ "\u0120AD": 2655,
2658
+ "\u0120practice": 2656,
2659
+ "\u0120Bay": 2657,
2660
+ "iff": 2658,
2661
+ "\u0120quant": 2659,
2662
+ "\u0120recorded": 2660,
2663
+ "\u0120ten": 2661,
2664
+ "ows": 2662,
2665
+ "ques": 2663,
2666
+ "\u0120months": 2664,
2667
+ "\u0120limited": 2665,
2668
+ "\u0120dig": 2666,
2669
+ "\u0120Bur": 2667,
2670
+ "\u0120founded": 2668,
2671
+ "lied": 2669,
2672
+ "mar": 2670,
2673
+ "\u0120speak": 2671,
2674
+ "\u0120poly": 2672,
2675
+ "\u0120fight": 2673,
2676
+ "\u0120Columb": 2674,
2677
+ "ael": 2675,
2678
+ "\u0120bank": 2676,
2679
+ "\u0120Gl": 2677,
2680
+ "aval": 2678,
2681
+ "\u0120brother": 2679,
2682
+ "\u0120dom": 2680,
2683
+ "ump": 2681,
2684
+ "elling": 2682,
2685
+ "\u0120subsequ": 2683,
2686
+ "\u0120always": 2684,
2687
+ "\u0120ris": 2685,
2688
+ "\u0120say": 2686,
2689
+ "\u0120Hen": 2687,
2690
+ "\u0120characters": 2688,
2691
+ "\u0120establish": 2689,
2692
+ "onse": 2690,
2693
+ "itor": 2691,
2694
+ "odes": 2692,
2695
+ "\u0120majority": 2693,
2696
+ "\u0120Trans": 2694,
2697
+ "cies": 2695,
2698
+ "\u0120contro": 2696,
2699
+ "isms": 2697,
2700
+ "icles": 2698,
2701
+ "oul": 2699,
2702
+ "\u0120saw": 2700,
2703
+ "\u0120Mor": 2701,
2704
+ "\u0120Paul": 2702,
2705
+ "\u0120video": 2703,
2706
+ "\u0120club": 2704,
2707
+ "\u0120foreign": 2705,
2708
+ "\u0120AC": 2706,
2709
+ "\u0120stated": 2707,
2710
+ "\u0120dam": 2708,
2711
+ "\u0120grad": 2709,
2712
+ "\u0120amount": 2710,
2713
+ "\u0120cultural": 2711,
2714
+ "resp": 2712,
2715
+ "\u0120ess": 2713,
2716
+ "\u0120Day": 2714,
2717
+ "\u0120regular": 2715,
2718
+ "\u0120Israel": 2716,
2719
+ "artment": 2717,
2720
+ "\u0120dise": 2718,
2721
+ "\u0120Frank": 2719,
2722
+ "\u0120chemical": 2720,
2723
+ "\u0120towards": 2721,
2724
+ "\u0120hon": 2722,
2725
+ "\u0120alg": 2723,
2726
+ "\u0120involved": 2724,
2727
+ "\u0120come": 2725,
2728
+ "\u0120percent": 2726,
2729
+ "\u0120approx": 2727,
2730
+ "\u0120Battle": 2728,
2731
+ "\u0120hom": 2729,
2732
+ "\u0120smaller": 2730,
2733
+ "\u0120directly": 2731,
2734
+ "ike": 2732,
2735
+ "\u0120che": 2733,
2736
+ "ror": 2734,
2737
+ "irds": 2735,
2738
+ "\u0120personal": 2736,
2739
+ "\u0120architect": 2737,
2740
+ "\u0120Mich": 2738,
2741
+ "alled": 2739,
2742
+ "phas": 2740,
2743
+ "\u0120historical": 2741,
2744
+ "\u0120territory": 2742,
2745
+ "isions": 2743,
2746
+ "\u01202005": 2744,
2747
+ "ulg": 2745,
2748
+ "\u0120station": 2746,
2749
+ "\u01202014": 2747,
2750
+ "\u0120distinct": 2748,
2751
+ "\u0120Russian": 2749,
2752
+ "\u0120award": 2750,
2753
+ "berg": 2751,
2754
+ "ously": 2752,
2755
+ "\u01202012": 2753,
2756
+ "ali": 2754,
2757
+ "\u0120constitution": 2755,
2758
+ "\u0120lines": 2756,
2759
+ "\u0120Ben": 2757,
2760
+ "\u0120Bab": 2758,
2761
+ "och": 2759,
2762
+ "\u0120celeb": 2760,
2763
+ "\u0120too": 2761,
2764
+ "\u0120today": 2762,
2765
+ "\u0120Mount": 2763,
2766
+ "ects": 2764,
2767
+ "\u01202013": 2765,
2768
+ "\u0120primary": 2766,
2769
+ "\u0120title": 2767,
2770
+ "\u0120treatment": 2768,
2771
+ "\u0120dyn": 2769,
2772
+ "utive": 2770,
2773
+ "\u012025": 2771,
2774
+ "key": 2772,
2775
+ "18": 2773,
2776
+ "itude": 2774,
2777
+ "sych": 2775,
2778
+ "\u0120remov": 2776,
2779
+ "rang": 2777,
2780
+ "\u0120100": 2778,
2781
+ "\u0120opt": 2779,
2782
+ "\u0120adapt": 2780,
2783
+ "\u0120young": 2781,
2784
+ "vention": 2782,
2785
+ "\u0120increase": 2783,
2786
+ "known": 2784,
2787
+ "\u0120names": 2785,
2788
+ "\u0120Hol": 2786,
2789
+ "ands": 2787,
2790
+ "ression": 2788,
2791
+ "\u0120earlier": 2789,
2792
+ "igned": 2790,
2793
+ "ounds": 2791,
2794
+ "\u0120mostly": 2792,
2795
+ "ux": 2793,
2796
+ "\u0120Afgh": 2794,
2797
+ "\u0120veh": 2795,
2798
+ "\u0120Robert": 2796,
2799
+ "\u0120aver": 2797,
2800
+ "\u0120Mod": 2798,
2801
+ "\u0120Its": 2799,
2802
+ "\u0120nar": 2800,
2803
+ "ests": 2801,
2804
+ "peror": 2802,
2805
+ "\u0120results": 2803,
2806
+ "\u0120ax": 2804,
2807
+ "aria": 2805,
2808
+ "\u0120ur": 2806,
2809
+ "anded": 2807,
2810
+ "\u0120Rich": 2808,
2811
+ "\u0120spok": 2809,
2812
+ "ection": 2810,
2813
+ "\u0120fish": 2811,
2814
+ "asc": 2812,
2815
+ "\u0120Sim": 2813,
2816
+ "ables": 2814,
2817
+ "ledge": 2815,
2818
+ "\u0120bene": 2816,
2819
+ "ste": 2817,
2820
+ "\u0120Spe": 2818,
2821
+ "uk": 2819,
2822
+ "\u0120front": 2820,
2823
+ "\u0120Pat": 2821,
2824
+ "nel": 2822,
2825
+ "\u0120situ": 2823,
2826
+ "\u0120lay": 2824,
2827
+ "\u0120Museum": 2825,
2828
+ "\u0120products": 2826,
2829
+ "\u0120destroy": 2827,
2830
+ "ha": 2828,
2831
+ "rative": 2829,
2832
+ "\u0120born": 2830,
2833
+ "porary": 2831,
2834
+ "\u0120placed": 2832,
2835
+ "\u0120sett": 2833,
2836
+ "\u0120occup": 2834,
2837
+ "\u0120gun": 2835,
2838
+ "istics": 2836,
2839
+ "\u0120alc": 2837,
2840
+ "abit": 2838,
2841
+ "\u0120commit": 2839,
2842
+ "\u0120Qu": 2840,
2843
+ "\u0120gold": 2841,
2844
+ "\u0120Albert": 2842,
2845
+ "\u0120elected": 2843,
2846
+ "\u0120separate": 2844,
2847
+ "\u0120188": 2845,
2848
+ "\u0120phot": 2846,
2849
+ "iter": 2847,
2850
+ "agn": 2848,
2851
+ "\u0120sequ": 2849,
2852
+ "\u0120Another": 2850,
2853
+ "\u0120acqu": 2851,
2854
+ "field": 2852,
2855
+ "\u0120eight": 2853,
2856
+ "\u0120studies": 2854,
2857
+ "\u01202017": 2855,
2858
+ "\u0120sil": 2856,
2859
+ "\u0120length": 2857,
2860
+ "\u0120Cons": 2858,
2861
+ "ishop": 2859,
2862
+ "ios": 2860,
2863
+ "\u0120sal": 2861,
2864
+ "aint": 2862,
2865
+ "\u0120multiple": 2863,
2866
+ "ault": 2864,
2867
+ "\u0120living": 2865,
2868
+ "\u0120My": 2866,
2869
+ "\u0120construction": 2867,
2870
+ "respond": 2868,
2871
+ "\u0120But": 2869,
2872
+ "\u0120operations": 2870,
2873
+ "\u01202020": 2871,
2874
+ "\u0120discuss": 2872,
2875
+ "iences": 2873,
2876
+ "\u0120Sam": 2874,
2877
+ "throp": 2875,
2878
+ "\u0120islands": 2876,
2879
+ "unar": 2877,
2880
+ "\u0120alle": 2878,
2881
+ "onic": 2879,
2882
+ "rick": 2880,
2883
+ "\u0120Islands": 2881,
2884
+ "\u0120functions": 2882,
2885
+ "isation": 2883,
2886
+ "ago": 2884,
2887
+ "see": 2885,
2888
+ "\u0120proposed": 2886,
2889
+ "\u0120originally": 2887,
2890
+ "\u0120Apple": 2888,
2891
+ "\u0120Budd": 2889,
2892
+ "\u0120response": 2890,
2893
+ "\u0120document": 2891,
2894
+ "\u0120Congress": 2892,
2895
+ "\u0120cut": 2893,
2896
+ "\u0120working": 2894,
2897
+ "\u0120metal": 2895,
2898
+ "\u0120troops": 2896,
2899
+ "\u0120187": 2897,
2900
+ "\u0120board": 2898,
2901
+ "oe": 2899,
2902
+ "\u0120Eastern": 2900,
2903
+ "\u0120largely": 2901,
2904
+ "\u0120network": 2902,
2905
+ "\u0120beginning": 2903,
2906
+ "ky": 2904,
2907
+ "\u0120transl": 2905,
2908
+ "\u0120sources": 2906,
2909
+ "\u0120tax": 2907,
2910
+ "\u01202004": 2908,
2911
+ "\u0120rules": 2909,
2912
+ "\u0120College": 2910,
2913
+ "\u0120greater": 2911,
2914
+ "ements": 2912,
2915
+ "\u0120rap": 2913,
2916
+ "\u0120Islam": 2914,
2917
+ "\u0120units": 2915,
2918
+ "ancial": 2916,
2919
+ "obal": 2917,
2920
+ "rus": 2918,
2921
+ "izing": 2919,
2922
+ "\u01202001": 2920,
2923
+ "oz": 2921,
2924
+ "\u0120technology": 2922,
2925
+ "\u0120policy": 2923,
2926
+ "year": 2924,
2927
+ "\u0120hit": 2925,
2928
+ "\u01202016": 2926,
2929
+ "\u0120properties": 2927,
2930
+ "\u0120flight": 2928,
2931
+ "\u012024": 2929,
2932
+ "\u0120numerous": 2930,
2933
+ "\u0120economy": 2931,
2934
+ "\u0120schools": 2932,
2935
+ "rel": 2933,
2936
+ "\u0120First": 2934,
2937
+ "ivity": 2935,
2938
+ "\u0120discovered": 2936,
2939
+ "\u0120future": 2937,
2940
+ "\u0120methods": 2938,
2941
+ "\u0120magn": 2939,
2942
+ "\u0120adopted": 2940,
2943
+ "\u0120points": 2941,
2944
+ "'t": 2942,
2945
+ "pected": 2943,
2946
+ "\u0120unit": 2944,
2947
+ "\u0120divided": 2945,
2948
+ "\u0120creat": 2946,
2949
+ "\u0120below": 2947,
2950
+ "\u0120Dep": 2948,
2951
+ "\u0120Gold": 2949,
2952
+ "\u0120seven": 2950,
2953
+ "\u0120color": 2951,
2954
+ "\u0120review": 2952,
2955
+ "\u0120Gro": 2953,
2956
+ "\u0120address": 2954,
2957
+ "\u0120surround": 2955,
2958
+ "anned": 2956,
2959
+ "\u0120saf": 2957,
2960
+ "\u0120plants": 2958,
2961
+ "ready": 2959,
2962
+ "embly": 2960,
2963
+ "jan": 2961,
2964
+ "\u0120Old": 2962,
2965
+ "\u0120understand": 2963,
2966
+ "\u0120office": 2964,
2967
+ "arent": 2965,
2968
+ "\u0120Afghan": 2966,
2969
+ "ech": 2967,
2970
+ "isl": 2968,
2971
+ "\u0120better": 2969,
2972
+ "\u0120analysis": 2970,
2973
+ "aughter": 2971,
2974
+ "\u0120sac": 2972,
2975
+ "aged": 2973,
2976
+ "airs": 2974,
2977
+ "\u0120Batman": 2975,
2978
+ "\u0120highest": 2976,
2979
+ "\u0120look": 2977,
2980
+ "ili": 2978,
2981
+ "\u0120create": 2979,
2982
+ "\u0120give": 2980,
2983
+ "\u0120Je": 2981,
2984
+ "\u0120Island": 2982,
2985
+ "\u0120Associ": 2983,
2986
+ "\u0120growth": 2984,
2987
+ "\u0120Academ": 2985,
2988
+ "\u0120bound": 2986,
2989
+ "\u0120Red": 2987,
2990
+ "ients": 2988,
2991
+ "\u0120threat": 2989,
2992
+ "onst": 2990,
2993
+ "\u0120recent": 2991,
2994
+ "\u0120implement": 2992,
2995
+ "\u0120performed": 2993,
2996
+ "\u0120Scott": 2994,
2997
+ "axim": 2995,
2998
+ "nd": 2996,
2999
+ "\u0120sem": 2997,
3000
+ "\u0120categ": 2998,
3001
+ "\u0120via": 2999,
3002
+ "\u0120correspond": 3000,
3003
+ "omy": 3001,
3004
+ "\u0120gas": 3002,
3005
+ "\u0120oil": 3003,
3006
+ "\u0120epis": 3004,
3007
+ "curity": 3005,
3008
+ "time": 3006,
3009
+ "\u0120society": 3007,
3010
+ "\u0120Histor": 3008,
3011
+ "\u0120cust": 3009,
3012
+ "orial": 3010,
3013
+ "\u0120concent": 3011,
3014
+ "aling": 3012,
3015
+ "rying": 3013,
3016
+ "\u01202018": 3014,
3017
+ "\u0120emb": 3015,
3018
+ "\u0120idea": 3016,
3019
+ "men": 3017,
3020
+ "rim": 3018,
3021
+ "\u01202015": 3019,
3022
+ "\u0120heav": 3020,
3023
+ "\u0120centuries": 3021,
3024
+ "spe": 3022,
3025
+ "now": 3023,
3026
+ "\u0120minor": 3024,
3027
+ "\u0120Spanish": 3025,
3028
+ "\u0120hous": 3026,
3029
+ "osing": 3027,
3030
+ "utch": 3028,
3031
+ "\u0120vill": 3029,
3032
+ "17": 3030,
3033
+ "met": 3031,
3034
+ "\u0120highly": 3032,
3035
+ "\u0120supported": 3033,
3036
+ "ension": 3034,
3037
+ "na": 3035,
3038
+ "\u0120Mal": 3036,
3039
+ "\u0120cells": 3037,
3040
+ "\u0120farm": 3038,
3041
+ "\u0120ships": 3039,
3042
+ "\u0120media": 3040,
3043
+ "\u0120shown": 3041,
3044
+ "\u0120students": 3042,
3045
+ "\u0120Henry": 3043,
3046
+ "\u0120dri": 3044,
3047
+ "ext": 3045,
3048
+ "\u0120Jewish": 3046,
3049
+ "\u0120compon": 3047,
3050
+ "ferred": 3048,
3051
+ "\u0120laws": 3049,
3052
+ "\u0120lab": 3050,
3053
+ "\u0120loss": 3051,
3054
+ "\u0120cannot": 3052,
3055
+ "\u0120training": 3053,
3056
+ ".)": 3054,
3057
+ "\u0120Bu": 3055,
3058
+ "\u0120already": 3056,
3059
+ "olved": 3057,
3060
+ "\u0120stage": 3058,
3061
+ "30": 3059,
3062
+ "anc": 3060,
3063
+ "aring": 3061,
3064
+ "\u0120latter": 3062,
3065
+ "\u0120mathemat": 3063,
3066
+ "\u0120drug": 3064,
3067
+ "\u0120Mass": 3065,
3068
+ "rane": 3066,
3069
+ "\u0120release": 3067,
3070
+ "\u0120river": 3068,
3071
+ "\u0120relatively": 3069,
3072
+ "\u0120hor": 3070,
3073
+ "\u0120rate": 3071,
3074
+ "ampions": 3072,
3075
+ "pro": 3073,
3076
+ "\u0120wind": 3074,
3077
+ "icated": 3075,
3078
+ "\u0120script": 3076,
3079
+ "\u0120additional": 3077,
3080
+ "west": 3078,
3081
+ "\u0120wall": 3079,
3082
+ "gebra": 3080,
3083
+ "\u0120plant": 3081,
3084
+ "\u0120cred": 3082,
3085
+ "\u0120pun": 3083,
3086
+ "\u0120Jack": 3084,
3087
+ "\u0120immedi": 3085,
3088
+ "\u0120active": 3086,
3089
+ "\u0120obtain": 3087,
3090
+ "\u0120estim": 3088,
3091
+ "\u0120neigh": 3089,
3092
+ "\u01202019": 3090,
3093
+ "\u0120belief": 3091,
3094
+ "enes": 3092,
3095
+ "ston": 3093,
3096
+ "\u0120Northern": 3094,
3097
+ "\u0120ens": 3095,
3098
+ "\u01202021": 3096,
3099
+ "\u0120western": 3097,
3100
+ "erson": 3098,
3101
+ "\u0120Rev": 3099,
3102
+ "\u0120mention": 3100,
3103
+ "\u0120therefore": 3101,
3104
+ "\u0120focus": 3102,
3105
+ "ked": 3103,
3106
+ "hel": 3104,
3107
+ "ises": 3105,
3108
+ "\u0120enough": 3106,
3109
+ "ey": 3107,
3110
+ "\u0120cast": 3108,
3111
+ "urches": 3109,
3112
+ "ider": 3110,
3113
+ "\u0120Met": 3111,
3114
+ "\u0120detail": 3112,
3115
+ "\u0120collection": 3113,
3116
+ "bre": 3114,
3117
+ "ta": 3115,
3118
+ "\u0120Mac": 3116,
3119
+ "\u012026": 3117,
3120
+ "\u0120prior": 3118,
3121
+ "\u0120whom": 3119,
3122
+ "\u0120heart": 3120,
3123
+ "\u0120agricult": 3121,
3124
+ "\u0120conflic": 3122,
3125
+ "\u0120alternative": 3123,
3126
+ "change": 3124,
3127
+ "\u0120univers": 3125,
3128
+ "\u0120History": 3126,
3129
+ "10": 3127,
3130
+ "\u0120wood": 3128,
3131
+ "asons": 3129,
3132
+ "oles": 3130,
3133
+ "\u0120rare": 3131,
3134
+ "\u0120medic": 3132,
3135
+ "\u0120Japanese": 3133,
3136
+ "\u0120Net": 3134,
3137
+ "\u0120primarily": 3135,
3138
+ "\u0120Smith": 3136,
3139
+ "eds": 3137,
3140
+ "omp": 3138,
3141
+ "omer": 3139,
3142
+ "unity": 3140,
3143
+ "\u0120Hall": 3141,
3144
+ "\u0120performance": 3142,
3145
+ "\u0120Middle": 3143,
3146
+ "odies": 3144,
3147
+ "\u0120Alexand": 3145,
3148
+ "burg": 3146,
3149
+ "part": 3147,
3150
+ "\u0120birth": 3148,
3151
+ "aches": 3149,
3152
+ "\u0120Jer": 3150,
3153
+ "\u0120path": 3151,
3154
+ "cont": 3152,
3155
+ "\u0120scholars": 3153,
3156
+ "cher": 3154,
3157
+ "ointed": 3155,
3158
+ "\u0120changed": 3156,
3159
+ "ithm": 3157,
3160
+ "ords": 3158,
3161
+ "\u0120wife": 3159,
3162
+ "\u0120()": 3160,
3163
+ "\u0120Rad": 3161,
3164
+ "\u0120Que": 3162,
3165
+ "\u0120counter": 3163,
3166
+ "\u0120citiz": 3164,
3167
+ "\u0120Law": 3165,
3168
+ "\u0120III": 3166,
3169
+ "\u0120independence": 3167,
3170
+ "\u0120Bal": 3168,
3171
+ "\u0120Bell": 3169,
3172
+ "\u0120bur": 3170,
3173
+ "\u0120keep": 3171,
3174
+ "\u0120suggested": 3172,
3175
+ "fficient": 3173,
3176
+ "\u01202003": 3174,
3177
+ "\u0120potential": 3175,
3178
+ "OS": 3176,
3179
+ "war": 3177,
3180
+ "\u0120done": 3178,
3181
+ "\u0120display": 3179,
3182
+ "\u0120memory": 3180,
3183
+ "\u0120represented": 3181,
3184
+ "\u0120property": 3182,
3185
+ "\u0120get": 3183,
3186
+ "\u0120contrast": 3184,
3187
+ "\u01201999": 3185,
3188
+ "\u0120basis": 3186,
3189
+ "\u0120activity": 3187,
3190
+ "\u0120psych": 3188,
3191
+ "\u0120Vir": 3189,
3192
+ "\u0120reached": 3190,
3193
+ "olk": 3191,
3194
+ "ij": 3192,
3195
+ "\u0120slow": 3193,
3196
+ "\u0120Big": 3194,
3197
+ "reng": 3195,
3198
+ "\u0120location": 3196,
3199
+ "\u0120prec": 3197,
3200
+ "inn": 3198,
3201
+ "\u0120Italy": 3199,
3202
+ "\u0120Ze": 3200,
3203
+ "ls": 3201,
3204
+ "aled": 3202,
3205
+ "\u0120Award": 3203,
3206
+ "ift": 3204,
3207
+ "\u0120Wash": 3205,
3208
+ "\u0120you": 3206,
3209
+ "\u0120consum": 3207,
3210
+ "\u0120center": 3208,
3211
+ "\u0120esc": 3209,
3212
+ "\u0120issue": 3210,
3213
+ "\u0120Cat": 3211,
3214
+ "igen": 3212,
3215
+ "owers": 3213,
3216
+ "\u0120Pri": 3214,
3217
+ "\u0120spread": 3215,
3218
+ "\u0120quick": 3216,
3219
+ "\u0120levels": 3217,
3220
+ "\u0120pressure": 3218,
3221
+ "\u0120Sen": 3219,
3222
+ "vert": 3220,
3223
+ "\u0120antib": 3221,
3224
+ "\u0120claimed": 3222,
3225
+ "\u0120atom": 3223,
3226
+ "\u0120ship": 3224,
3227
+ "\u0120expans": 3225,
3228
+ "\u0120enter": 3226,
3229
+ "\u0120average": 3227,
3230
+ "\u0120ever": 3228,
3231
+ "umn": 3229,
3232
+ "pped": 3230,
3233
+ "\u0120status": 3231,
3234
+ ");": 3232,
3235
+ "tt": 3233,
3236
+ "\u0120hyp": 3234,
3237
+ "ulations": 3235,
3238
+ "opp": 3236,
3239
+ "\u0120noted": 3237,
3240
+ "\u0120fell": 3238,
3241
+ "lected": 3239,
3242
+ "\u0120Sl": 3240,
3243
+ "uments": 3241,
3244
+ "\u0120diss": 3242,
3245
+ "198": 3243,
3246
+ "\u0120incor": 3244,
3247
+ "\u0120settle": 3245,
3248
+ "back": 3246,
3249
+ "osaur": 3247,
3250
+ "\u0120Bulg": 3248,
3251
+ "cles": 3249,
3252
+ "\u0120contains": 3250,
3253
+ "\u0120whole": 3251,
3254
+ "\u0120sens": 3252,
3255
+ "\u0120Val": 3253,
3256
+ "\u0120measure": 3254,
3257
+ "\u0120card": 3255,
3258
+ "itors": 3256,
3259
+ "\u0120mole": 3257,
3260
+ "\u0120Im": 3258,
3261
+ "\u0120Wall": 3259,
3262
+ "\u01202002": 3260,
3263
+ "\u0120pie": 3261,
3264
+ "\u0120hun": 3262,
3265
+ "\u0120centre": 3263,
3266
+ "\u0120blue": 3264,
3267
+ "\u0120mainly": 3265,
3268
+ "lines": 3266,
3269
+ "\u0120carried": 3267,
3270
+ "\u01201950": 3268,
3271
+ "ging": 3269,
3272
+ "\u0120intended": 3270,
3273
+ "\u0120objects": 3271,
3274
+ "\u0120materials": 3272,
3275
+ "\u0120draw": 3273,
3276
+ "\u0120nation": 3274,
3277
+ "\u0120extreme": 3275,
3278
+ "\u0120matter": 3276,
3279
+ "\u0120Armen": 3277,
3280
+ "\u0120opened": 3278,
3281
+ "\u0120forced": 3279,
3282
+ "\u0120appears": 3280,
3283
+ "edy": 3281,
3284
+ "\u0120experience": 3282,
3285
+ "\u0120career": 3283,
3286
+ "\u0120degree": 3284,
3287
+ "reland": 3285,
3288
+ "\u0120sense": 3286,
3289
+ "cean": 3287,
3290
+ "\u0120provides": 3288,
3291
+ "\u0120initial": 3289,
3292
+ "\u0120behavior": 3290,
3293
+ "\u0120Ro": 3291,
3294
+ "\u0120kind": 3292,
3295
+ "\u0120Scot": 3293,
3296
+ "\u0120financial": 3294,
3297
+ "\u0120Arabic": 3295,
3298
+ "\u0120Academy": 3296,
3299
+ "\u0120AS": 3297,
3300
+ "\u0120scientific": 3298,
3301
+ "\u0120Washington": 3299,
3302
+ "ee": 3300,
3303
+ "onia": 3301,
3304
+ "\u0120presence": 3302,
3305
+ "\u0120break": 3303,
3306
+ "\u0120Richard": 3304,
3307
+ "amm": 3305,
3308
+ "\u0120Fre": 3306,
3309
+ "\u0120contem": 3307,
3310
+ "\u0120professional": 3308,
3311
+ "\u0120Thomas": 3309,
3312
+ "\u0120coin": 3310,
3313
+ "\u0120nearly": 3311,
3314
+ "\u0120Ca": 3312,
3315
+ "\u0120Society": 3313,
3316
+ "IC": 3314,
3317
+ "\u0120conqu": 3315,
3318
+ "\u0120leader": 3316,
3319
+ "\u0120football": 3317,
3320
+ "\u0120peace": 3318,
3321
+ "\u0120weap": 3319,
3322
+ "\u0120shows": 3320,
3323
+ "\u0120issues": 3321,
3324
+ "ias": 3322,
3325
+ "ellig": 3323,
3326
+ "\u0120influenced": 3324,
3327
+ "head": 3325,
3328
+ "\u0120impact": 3326,
3329
+ "\u0120rapid": 3327,
3330
+ "\u0120soon": 3328,
3331
+ "alian": 3329,
3332
+ "\u0120makes": 3330,
3333
+ "\u0120Ter": 3331,
3334
+ "\u0120Dutch": 3332,
3335
+ "estival": 3333,
3336
+ "\u0120Oly": 3334,
3337
+ "rupt": 3335,
3338
+ "NA": 3336,
3339
+ "erve": 3337,
3340
+ "\u0120Ireland": 3338,
3341
+ "osis": 3339,
3342
+ "\u0120i": 3340,
3343
+ "\u0120question": 3341,
3344
+ "\u0120More": 3342,
3345
+ "\u0120pay": 3343,
3346
+ "imal": 3344,
3347
+ "\u0120resulting": 3345,
3348
+ "\u0120emp": 3346,
3349
+ "\u0120demonst": 3347,
3350
+ "\u0120block": 3348,
3351
+ "ournal": 3349,
3352
+ "\u0120Parliament": 3350,
3353
+ "\u0120federal": 3351,
3354
+ "\u0120algebra": 3352,
3355
+ "\u0120algor": 3353,
3356
+ "\u0120ded": 3354,
3357
+ "\u0120eastern": 3355,
3358
+ "\u0120creation": 3356,
3359
+ "\u0120Bak": 3357,
3360
+ "\u0120Follow": 3358,
3361
+ "\u0120amb": 3359,
3362
+ "\u0120improve": 3360,
3363
+ "\u0120joined": 3361,
3364
+ "\u0120Inter": 3362,
3365
+ "\u0120La": 3363,
3366
+ "\u0120experiment": 3364,
3367
+ "\u0120earliest": 3365,
3368
+ "\u0120ethn": 3366,
3369
+ "\u0120Olymp": 3367,
3370
+ "cks": 3368,
3371
+ "hood": 3369,
3372
+ "ply": 3370,
3373
+ "\u0120worked": 3371,
3374
+ "\u0120Force": 3372,
3375
+ "\u0120era": 3373,
3376
+ "ocratic": 3374,
3377
+ "\u0120machine": 3375,
3378
+ "\u0120BBC": 3376,
3379
+ "\u0120streng": 3377,
3380
+ "\u0120ability": 3378,
3381
+ "\u0120probably": 3379,
3382
+ "erc": 3380,
3383
+ "\u0120bor": 3381,
3384
+ "\u0120inj": 3382,
3385
+ "\u0120Int": 3383,
3386
+ "iance": 3384,
3387
+ "\u0120spirit": 3385,
3388
+ "\u0120Americans": 3386,
3389
+ "\u0120temperature": 3387,
3390
+ "actions": 3388,
3391
+ "\u0120passed": 3389,
3392
+ "\u0120Cambridge": 3390,
3393
+ "rument": 3391,
3394
+ "\u0120table": 3392,
3395
+ "\u0120wat": 3393,
3396
+ "\u0120Association": 3394,
3397
+ "heast": 3395,
3398
+ "\u0120Cyp": 3396,
3399
+ "\u0120Peter": 3397,
3400
+ "\u0120184": 3398,
3401
+ "\u0120humans": 3399,
3402
+ "\u0120poor": 3400,
3403
+ "\u0120Common": 3401,
3404
+ "\u0120Russia": 3402,
3405
+ "\u0120sus": 3403,
3406
+ "\u0120mom": 3404,
3407
+ "iform": 3405,
3408
+ "\u0120Vict": 3406,
3409
+ "\u0120Bec": 3407,
3410
+ "\u0120Muslim": 3408,
3411
+ "\u0120happ": 3409,
3412
+ "\u0120Cap": 3410,
3413
+ "\u0120Mart": 3411,
3414
+ "\u0120true": 3412,
3415
+ "\u0120effective": 3413,
3416
+ "iques": 3414,
3417
+ "\u0120letter": 3415,
3418
+ "\u0120individuals": 3416,
3419
+ "lem": 3417,
3420
+ "\u0120heavy": 3418,
3421
+ "\u0120resour": 3419,
3422
+ "\u0120thous": 3420,
3423
+ "\u012022": 3421,
3424
+ "\u0120decided": 3422,
3425
+ "\u0120protect": 3423,
3426
+ "\u0120Southern": 3424,
3427
+ "\u0120promin": 3425,
3428
+ "\u0120regional": 3426,
3429
+ "\u0120Michael": 3427,
3430
+ "SA": 3428,
3431
+ "ares": 3429,
3432
+ "\u0120Ash": 3430,
3433
+ "\u0120flu": 3431,
3434
+ "\u0120broadcast": 3432,
3435
+ "board": 3433,
3436
+ "12": 3434,
3437
+ "use": 3435,
3438
+ "\u0120Lou": 3436,
3439
+ "\u0120rank": 3437,
3440
+ "astic": 3438,
3441
+ "\u0120values": 3439,
3442
+ "\u0120needed": 3440,
3443
+ "\u0120tend": 3441,
3444
+ "\u0120division": 3442,
3445
+ "\u0120Following": 3443,
3446
+ "vin": 3444,
3447
+ "ont": 3445,
3448
+ "\u0120assum": 3446,
3449
+ "\u0120corre": 3447,
3450
+ "\u0120forest": 3448,
3451
+ "ground": 3449,
3452
+ "riage": 3450,
3453
+ "\u0120Ox": 3451,
3454
+ "\u0120authority": 3452,
3455
+ "\u0120Library": 3453,
3456
+ "uda": 3454,
3457
+ "\u0120Jews": 3455,
3458
+ "itutions": 3456,
3459
+ "\u0120Del": 3457,
3460
+ "ibly": 3458,
3461
+ "tee": 3459,
3462
+ "\u0120pattern": 3460,
3463
+ "aith": 3461,
3464
+ "water": 3462,
3465
+ "\u0120Bron": 3463,
3466
+ "igation": 3464,
3467
+ "\u0120Kn": 3465,
3468
+ "\u0120Despite": 3466,
3469
+ "idents": 3467,
3470
+ "\u0120soldiers": 3468,
3471
+ "\u0120frequently": 3469,
3472
+ "oor": 3470,
3473
+ "tical": 3471,
3474
+ "\u0120inh": 3472,
3475
+ "\u0120Bill": 3473,
3476
+ "\u0120search": 3474,
3477
+ "\u0120becoming": 3475,
3478
+ "\u0120Because": 3476,
3479
+ "\u0120tree": 3477,
3480
+ "ico": 3478,
3481
+ "\u0120Aut": 3479,
3482
+ "verse": 3480,
3483
+ "\u0120composed": 3481,
3484
+ "\u0120medical": 3482,
3485
+ "\u0120basic": 3483,
3486
+ "\u0120Liber": 3484,
3487
+ "\u0120decision": 3485,
3488
+ "\u0120argued": 3486,
3489
+ "eder": 3487,
3490
+ "\u0120Stud": 3488,
3491
+ "ampionship": 3489,
3492
+ "ville": 3490,
3493
+ "igr": 3491,
3494
+ "\u0120step": 3492,
3495
+ "\u0120week": 3493,
3496
+ "\u0120contract": 3494,
3497
+ "\u0120Two": 3495,
3498
+ "\u01201998": 3496,
3499
+ "\u0120behind": 3497,
3500
+ "inated": 3498,
3501
+ "\u0120Both": 3499,
3502
+ "\u0120famous": 3500,
3503
+ "\u0120Institute": 3501,
3504
+ "\u0120expansion": 3502,
3505
+ "\u0120Japan": 3503,
3506
+ "oura": 3504,
3507
+ "irm": 3505,
3508
+ "\u0120daughter": 3506,
3509
+ "oud": 3507,
3510
+ "raction": 3508,
3511
+ "\u0120attacks": 3509,
3512
+ "zil": 3510,
3513
+ "\u0120cry": 3511,
3514
+ "\u0120Aber": 3512,
3515
+ "\u0120playing": 3513,
3516
+ "\u01202022": 3514,
3517
+ "\u0120taking": 3515,
3518
+ "\u0120paper": 3516,
3519
+ "\u0120sexual": 3517,
3520
+ "\u0120lunar": 3518,
3521
+ "asty": 3519,
3522
+ "\u0120spect": 3520,
3523
+ "\u0120letters": 3521,
3524
+ "\u0120Company": 3522,
3525
+ "\u0120ult": 3523,
3526
+ "\u012027": 3524,
3527
+ "\u0120Class": 3525,
3528
+ "\u0120ir": 3526,
3529
+ "itz": 3527,
3530
+ "\u0120Pre": 3528,
3531
+ "undred": 3529,
3532
+ "\u0120...": 3530,
3533
+ "\u0120Ir": 3531,
3534
+ "\u0120accepted": 3532,
3535
+ "\u0120link": 3533,
3536
+ "\u0120Bank": 3534,
3537
+ "\u0120enem": 3535,
3538
+ "\u0120attrib": 3536,
3539
+ "\u0120necessary": 3537,
3540
+ "\u0120risk": 3538,
3541
+ "\u01200": 3539,
3542
+ "\u0120bacter": 3540,
3543
+ "\u0120money": 3541,
3544
+ "\u0120vary": 3542,
3545
+ "onds": 3543,
3546
+ "century": 3544,
3547
+ "\u0120married": 3545,
3548
+ "\u0120Anglic": 3546,
3549
+ "\u0120calcul": 3547,
3550
+ "\u0120featured": 3548,
3551
+ "den": 3549,
3552
+ "rog": 3550,
3553
+ "ressed": 3551,
3554
+ "\u0120u": 3552,
3555
+ "\u0120Rome": 3553,
3556
+ "\u0120yet": 3554,
3557
+ "\u0120Second": 3555,
3558
+ "\u0120avoid": 3556,
3559
+ "\u0120failed": 3557,
3560
+ "ka": 3558,
3561
+ "ras": 3559,
3562
+ "enic": 3560,
3563
+ "\u0120date": 3561,
3564
+ "otes": 3562,
3565
+ "\u0120Moon": 3563,
3566
+ "\u0120deep": 3564,
3567
+ "ested": 3565,
3568
+ "\u012023": 3566,
3569
+ "rown": 3567,
3570
+ "rief": 3568,
3571
+ "izations": 3569,
3572
+ "\u0120appointed": 3570,
3573
+ "\u0120allows": 3571,
3574
+ "\u0120identified": 3572,
3575
+ "\u0120activities": 3573,
3576
+ "eties": 3574,
3577
+ "\u0120programs": 3575,
3578
+ "\u0120cand": 3576,
3579
+ "\u0120ey": 3577,
3580
+ "\u0120Fl": 3578,
3581
+ "\u0120emphas": 3579,
3582
+ "win": 3580,
3583
+ "\u0120native": 3581,
3584
+ "\u0120aspect": 3582,
3585
+ "\u0120inn": 3583,
3586
+ "\u0120burn": 3584,
3587
+ "\u0120Under": 3585,
3588
+ "\u0120Arist": 3586,
3589
+ "\u0120operation": 3587,
3590
+ "\u0120alcoh": 3588,
3591
+ "\u0120our": 3589,
3592
+ "197": 3590,
3593
+ "\u0120System": 3591,
3594
+ "bed": 3592,
3595
+ "rical": 3593,
3596
+ "\u0120That": 3594,
3597
+ "\u012050": 3595,
3598
+ "\u0120synthe": 3596,
3599
+ "\u0120launch": 3597,
3600
+ "\u0120micro": 3598,
3601
+ "vey": 3599,
3602
+ "uman": 3600,
3603
+ "\u0120round": 3601,
3604
+ "\u0120instrument": 3602,
3605
+ "annel": 3603,
3606
+ "ygen": 3604,
3607
+ "\u0120mis": 3605,
3608
+ "\u0120chief": 3606,
3609
+ "\u0120buildings": 3607,
3610
+ "\u0120Canadian": 3608,
3611
+ "\u0120Berlin": 3609,
3612
+ "\u0120responsible": 3610,
3613
+ "ully": 3611,
3614
+ "\u0120Dis": 3612,
3615
+ "\u0120applied": 3613,
3616
+ "\u0120completed": 3614,
3617
+ "\u0120previously": 3615,
3618
+ "unicip": 3616,
3619
+ "\u0120knowledge": 3617,
3620
+ "bour": 3618,
3621
+ "ned": 3619,
3622
+ "\u0120consequ": 3620,
3623
+ "\u0120churches": 3621,
3624
+ "azine": 3622,
3625
+ "comp": 3623,
3626
+ "\u0120\u0120\u0120\u0120": 3624,
3627
+ "iated": 3625,
3628
+ "\u0120discover": 3626,
3629
+ "\u0120combined": 3627,
3630
+ "\u0120takes": 3628,
3631
+ "alle": 3629,
3632
+ "ola": 3630,
3633
+ "\u0120Minister": 3631,
3634
+ "writ": 3632,
3635
+ "wide": 3633,
3636
+ "yond": 3634,
3637
+ "\u0120spl": 3635,
3638
+ "itation": 3636,
3639
+ "\u0120reform": 3637,
3640
+ "\u0120graph": 3638,
3641
+ "\u0120Dan": 3639,
3642
+ "\u0120sec": 3640,
3643
+ "\u0120constant": 3641,
3644
+ "\u0120possess": 3642,
3645
+ ",\"": 3643,
3646
+ "\u0120Tex": 3644,
3647
+ "\u0120alph": 3645,
3648
+ "\u0120section": 3646,
3649
+ "\u0120Street": 3647,
3650
+ "crete": 3648,
3651
+ "\u0120ideas": 3649,
3652
+ "\u0120disease": 3650,
3653
+ "\u0120ut": 3651,
3654
+ "ented": 3652,
3655
+ "\u0120chall": 3653,
3656
+ "\u0120consists": 3654,
3657
+ "\u0120Brazil": 3655,
3658
+ "icted": 3656,
3659
+ "\u0120dr": 3657,
3660
+ "\u0120Beng": 3658,
3661
+ "\u0120distingu": 3659,
3662
+ "\u0120Columbia": 3660,
3663
+ "\u0120brain": 3661,
3664
+ "\u0120compared": 3662,
3665
+ "\u0120techniques": 3663,
3666
+ "ship": 3664,
3667
+ "\u0120collect": 3665,
3668
+ "\u0120mix": 3666,
3669
+ "\u0120literature": 3667,
3670
+ "ien": 3668,
3671
+ "\u0120bomb": 3669,
3672
+ "\u0120dram": 3670,
3673
+ "\u0120decre": 3671,
3674
+ "\u0120Thus": 3672,
3675
+ "\u0120formal": 3673,
3676
+ "\u0120directed": 3674,
3677
+ "\u0120Johnson": 3675,
3678
+ "\u0120initially": 3676,
3679
+ "\u0120derived": 3677,
3680
+ "van": 3678,
3681
+ "itan": 3679,
3682
+ "\u0120retain": 3680,
3683
+ "odox": 3681,
3684
+ "\u0120Wil": 3682,
3685
+ "perial": 3683,
3686
+ "\u0120district": 3684,
3687
+ "\u0120Athens": 3685,
3688
+ "\u0120bra": 3686,
3689
+ "\u0120face": 3687,
3690
+ "\u0120examples": 3688,
3691
+ "\u0120rise": 3689,
3692
+ "\u0120Croat": 3690,
3693
+ "\u0120harm": 3691,
3694
+ "\u0120Italian": 3692,
3695
+ "\u0120simple": 3693,
3696
+ "\u0120aim": 3694,
3697
+ "enth": 3695,
3698
+ "\u0120reject": 3696,
3699
+ "eles": 3697,
3700
+ "\u0120university": 3698,
3701
+ "\u0120conserv": 3699,
3702
+ "\u012040": 3700,
3703
+ "\u0120autom": 3701,
3704
+ "\u0120Prot": 3702,
3705
+ "\u0120philosophy": 3703,
3706
+ "\u0120incorpor": 3704,
3707
+ "wise": 3705,
3708
+ "\u0120birds": 3706,
3709
+ "\u0120beg": 3707,
3710
+ "ocal": 3708,
3711
+ "illed": 3709,
3712
+ "\u01201997": 3710,
3713
+ "edom": 3711,
3714
+ "\u0120Serv": 3712,
3715
+ "\u0120Def": 3713,
3716
+ "\u0120Center": 3714,
3717
+ "\u0120reduced": 3715,
3718
+ "\u0120citizens": 3716,
3719
+ "oids": 3717,
3720
+ "\u0120TV": 3718,
3721
+ "\u0120appearance": 3719,
3722
+ "tles": 3720,
3723
+ "\u0120Alf": 3721,
3724
+ "\u01201996": 3722,
3725
+ "ini": 3723,
3726
+ "\u0120Av": 3724,
3727
+ "\u0120Sever": 3725,
3728
+ "\u0120conne": 3726,
3729
+ "...": 3727,
3730
+ "mark": 3728,
3731
+ "\u0120Sal": 3729,
3732
+ "\u0120Bre": 3730,
3733
+ "\u0120speed": 3731,
3734
+ "\u0120communication": 3732,
3735
+ "\u0120theore": 3733,
3736
+ "\u0120animal": 3734,
3737
+ "\u0120Ke": 3735,
3738
+ "\u0120remaining": 3736,
3739
+ "force": 3737,
3740
+ "\u0120benef": 3738,
3741
+ "\u0120AM": 3739,
3742
+ "\u0120Azer": 3740,
3743
+ "\u0120Hill": 3741,
3744
+ "\u012028": 3742,
3745
+ "\u0120Nor": 3743,
3746
+ "\u0120communities": 3744,
3747
+ "\u0120quickly": 3745,
3748
+ "oln": 3746,
3749
+ "urder": 3747,
3750
+ "\u0120Fil": 3748,
3751
+ "\u0120Fore": 3749,
3752
+ "\u0120Navy": 3750,
3753
+ "\u0120approximately": 3751,
3754
+ "sis": 3752,
3755
+ "\u0120mess": 3753,
3756
+ "\u0120Mat": 3754,
3757
+ "\u0120Jose": 3755,
3758
+ "\u0120speech": 3756,
3759
+ "\u0120friend": 3757,
3760
+ "pr": 3758,
3761
+ "\u0120phen": 3759,
3762
+ "\u0120hours": 3760,
3763
+ "\u0120Leg": 3761,
3764
+ "\u0120continue": 3762,
3765
+ "\u0120definition": 3763,
3766
+ "\u0120screen": 3764,
3767
+ "bai": 3765,
3768
+ "\u0120Mem": 3766,
3769
+ "\u0120secret": 3767,
3770
+ "\u0120device": 3768,
3771
+ "\u0120Mex": 3769,
3772
+ "\u0120arrang": 3770,
3773
+ "\u0120assist": 3771,
3774
+ "\u0120launched": 3772,
3775
+ "\u0120Government": 3773,
3776
+ "baijan": 3774,
3777
+ "zz": 3775,
3778
+ "\u0120internal": 3776,
3779
+ "\u0120Grand": 3777,
3780
+ "\u0120compar": 3778,
3781
+ "\u0120Chap": 3779,
3782
+ "izes": 3780,
3783
+ "cluding": 3781,
3784
+ "rone": 3782,
3785
+ "\u0120models": 3783,
3786
+ "\u0120signed": 3784,
3787
+ "rival": 3785,
3788
+ "\u0120controvers": 3786,
3789
+ "hy": 3787,
3790
+ "\u0120Civil": 3788,
3791
+ "emis": 3789,
3792
+ "\u0120Wales": 3790,
3793
+ "\u0120formation": 3791,
3794
+ "\u0120organization": 3792,
3795
+ "11": 3793,
3796
+ "arc": 3794,
3797
+ "elt": 3795,
3798
+ "\u0120orbit": 3796,
3799
+ "\u0120subs": 3797,
3800
+ "stein": 3798,
3801
+ "25": 3799,
3802
+ "can": 3800,
3803
+ "oned": 3801,
3804
+ "\u0120Bible": 3802,
3805
+ "\u0120alk": 3803,
3806
+ "\u0120rot": 3804,
3807
+ "\u0120purpose": 3805,
3808
+ "\u0120fif": 3806,
3809
+ "\u0120lived": 3807,
3810
+ "\u0120Mary": 3808,
3811
+ "thodox": 3809,
3812
+ "\u0120approach": 3810,
3813
+ "mitted": 3811,
3814
+ "wor": 3812,
3815
+ "\u0120anthrop": 3813,
3816
+ "\u0120repe": 3814,
3817
+ "ela": 3815,
3818
+ "\u0120Ann": 3816,
3819
+ "\u0120driv": 3817,
3820
+ "\u0120Greece": 3818,
3821
+ "\u0120night": 3819,
3822
+ "\u0120sets": 3820,
3823
+ "\u0120White": 3821,
3824
+ "\u0120equal": 3822,
3825
+ "\u0120claims": 3823,
3826
+ "\u0120devices": 3824,
3827
+ "\u0120powers": 3825,
3828
+ "rier": 3826,
3829
+ "\u0120Lord": 3827,
3830
+ "\u0120weight": 3828,
3831
+ "\u0120occurred": 3829,
3832
+ "\u0120Azerbaijan": 3830,
3833
+ "uge": 3831,
3834
+ "\u0120structures": 3832,
3835
+ "aught": 3833,
3836
+ "\u0120bring": 3834,
3837
+ "\u0120middle": 3835,
3838
+ "\u0120Pac": 3836,
3839
+ "\u0120dead": 3837,
3840
+ "ographic": 3838,
3841
+ "\u0120Barb": 3839,
3842
+ "\u0120industrial": 3840,
3843
+ "\u0120observed": 3841,
3844
+ "kin": 3842,
3845
+ "\u0120council": 3843,
3846
+ "\u0120ple": 3844,
3847
+ "\u0120Rec": 3845,
3848
+ "\u0120Corn": 3846,
3849
+ "\u0120Their": 3847,
3850
+ "ado": 3848,
3851
+ "ifically": 3849,
3852
+ "\u0120rat": 3850,
3853
+ "\u0120learn": 3851,
3854
+ "orders": 3852,
3855
+ "\u0120road": 3853,
3856
+ "\u0120allowing": 3854,
3857
+ "\u0120favor": 3855,
3858
+ "\u0120Atlantic": 3856,
3859
+ "onn": 3857,
3860
+ "\u0120pil": 3858,
3861
+ "\u0120places": 3859,
3862
+ "\u0120simply": 3860,
3863
+ "\u0120Spain": 3861,
3864
+ "incoln": 3862,
3865
+ "\u0120Prince": 3863,
3866
+ "\u0120fully": 3864,
3867
+ "\u0120Bor": 3865,
3868
+ "\u0120global": 3866,
3869
+ "\u0120offered": 3867,
3870
+ "iring": 3868,
3871
+ "\u0120direction": 3869,
3872
+ "\u0120summer": 3870,
3873
+ "\u0120reve": 3871,
3874
+ "\u0120Best": 3872,
3875
+ "unk": 3873,
3876
+ "ube": 3874,
3877
+ "\u0120ec": 3875,
3878
+ "\u0120suc": 3876,
3879
+ "obile": 3877,
3880
+ "\u0120refers": 3878,
3881
+ "\u0120declared": 3879,
3882
+ "\u0120Victor": 3880,
3883
+ "15": 3881,
3884
+ "\u0120bal": 3882,
3885
+ "\u0120Bos": 3883,
3886
+ "\u01201994": 3884,
3887
+ "\u0120female": 3885,
3888
+ "\u0120ways": 3886,
3889
+ "\u0120recon": 3887,
3890
+ "\u0120Lincoln": 3888,
3891
+ "\u0120depend": 3889,
3892
+ "\u0120classical": 3890,
3893
+ "\u0120presented": 3891,
3894
+ "aska": 3892,
3895
+ "ilies": 3893,
3896
+ "\u0120climate": 3894,
3897
+ "\u0120extended": 3895,
3898
+ "\u0120entered": 3896,
3899
+ "\u0120interpret": 3897,
3900
+ "ya": 3898,
3901
+ "\u0120Sol": 3899,
3902
+ "ipment": 3900,
3903
+ "\u0120shot": 3901,
3904
+ "\u0120impl": 3902,
3905
+ "\u0120records": 3903,
3906
+ "\u0120turned": 3904,
3907
+ "enh": 3905,
3908
+ "cell": 3906,
3909
+ "elled": 3907,
3910
+ "earch": 3908,
3911
+ "\u0120despite": 3909,
3912
+ "\u0120ancest": 3910,
3913
+ "\u0120songs": 3911,
3914
+ "\u0120spoken": 3912,
3915
+ "\u0120resistance": 3913,
3916
+ "\u0120newsp": 3914,
3917
+ "\u0120serious": 3915,
3918
+ "\u0120ordered": 3916,
3919
+ "\u0120Louis": 3917,
3920
+ "ooth": 3918,
3921
+ "\u0120male": 3919,
3922
+ "aded": 3920,
3923
+ "\u0120fort": 3921,
3924
+ "ipl": 3922,
3925
+ "\u0120arri": 3923,
3926
+ "\u0120clos": 3924,
3927
+ "book": 3925,
3928
+ "oured": 3926,
3929
+ "amin": 3927,
3930
+ "\u0120Each": 3928,
3931
+ "ructed": 3929,
3932
+ "\u0120musical": 3930,
3933
+ "\u0120reference": 3931,
3934
+ "\u0120interview": 3932,
3935
+ "\u0120positive": 3933,
3936
+ "\u0120Add": 3934,
3937
+ "\u0120teams": 3935,
3938
+ "\u0120Portug": 3936,
3939
+ "\u0120moment": 3937,
3940
+ "inet": 3938,
3941
+ "atre": 3939,
3942
+ "rod": 3940,
3943
+ "\u0120closed": 3941,
3944
+ "\u0120Ark": 3942,
3945
+ "\u01201995": 3943,
3946
+ "\u0120Shah": 3944,
3947
+ "\u0120Group": 3945,
3948
+ "16": 3946,
3949
+ "with": 3947,
3950
+ "\u0120rein": 3948,
3951
+ "igu": 3949,
3952
+ "\u0120Elect": 3950,
3953
+ "\u0120deal": 3951,
3954
+ "\u0120exhib": 3952,
3955
+ "sec": 3953,
3956
+ "\u0120Ach": 3954,
3957
+ "\u0120hundred": 3955,
3958
+ "aging": 3956,
3959
+ "\u0120prison": 3957,
3960
+ "\u0120victory": 3958,
3961
+ "CA": 3959,
3962
+ "\u0120cru": 3960,
3963
+ "quar": 3961,
3964
+ "\u0120Chic": 3962,
3965
+ "\u0120negative": 3963,
3966
+ "\u0120applications": 3964,
3967
+ "oes": 3965,
3968
+ "\u0120Tra": 3966,
3969
+ "umin": 3967,
3970
+ "\u0120Div": 3968,
3971
+ "\u0120183": 3969,
3972
+ "\u0120ended": 3970,
3973
+ "\u0120nuclear": 3971,
3974
+ "las": 3972,
3975
+ "iments": 3973,
3976
+ "ulate": 3974,
3977
+ "\u0120share": 3975,
3978
+ "\u0120Atari": 3976,
3979
+ "\u0120occurs": 3977,
3980
+ "\u0120condition": 3978,
3981
+ "\u0120Belg": 3979,
3982
+ "\u0120brief": 3980,
3983
+ "\u0120Sun": 3981,
3984
+ "uls": 3982,
3985
+ "estic": 3983,
3986
+ "\u0120effic": 3984,
3987
+ "\u0120opposed": 3985,
3988
+ "\u0120Franc": 3986,
3989
+ "\u0120Brown": 3987,
3990
+ "\u0120distribution": 3988,
3991
+ "\u0120virt": 3989,
3992
+ "\u0120binary": 3990,
3993
+ "\u0120Ah": 3991,
3994
+ "imens": 3992,
3995
+ "\u0120Mill": 3993,
3996
+ "\u0120Jes": 3994,
3997
+ "\u0120serve": 3995,
3998
+ "\u0120sport": 3996,
3999
+ "\u0120pal": 3997,
4000
+ "\u0120Rob": 3998,
4001
+ "\u0120Later": 3999
4002
+ }
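
The entries above follow the byte-level BPE convention in which `\u0120` (rendered `Ġ`) marks a token that begins with a space. Below is a minimal sketch of how the id-to-token mapping in `vocab_wiki_4k_en.json` could be inspected; it assumes the file is a flat token-to-id JSON object, as the entries and the 4002-line count suggest, and the path is the repository-root file added in this commit.

```python
import json

# Load the token -> id mapping from the English vocab file
# (assumed to be a flat JSON object, as the diff above suggests).
with open("vocab_wiki_4k_en.json", encoding="utf-8") as f:
    token_to_id = json.load(f)

# Invert to id -> token, replacing the byte-level space marker "Ġ"
# (serialized as \u0120) with a plain leading space for readability.
id_to_token = {i: tok.replace("\u0120", " ") for tok, i in token_to_id.items()}

print(len(token_to_id))   # roughly 4,000 entries for the 4k vocab
print(id_to_token[3224])  # " ship" in the entries shown above
```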
wiki_bpe_tokenizer_4000_bytelevel.json ADDED
The diff for this file is too large to render. See raw diff
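
The full tokenizer serialization is not rendered here, but the file name (`wiki_bpe_tokenizer_4000_bytelevel.json`) suggests a byte-level BPE model saved with the Hugging Face `tokenizers` library. A hedged sketch of loading and using it under that assumption:

```python
from tokenizers import Tokenizer

# Assumes wiki_bpe_tokenizer_4000_bytelevel.json is a `tokenizers`-library
# serialization (implied by the file name, not confirmed by this diff).
tok = Tokenizer.from_file("wiki_bpe_tokenizer_4000_bytelevel.json")

enc = tok.encode("BriLLM is a brain-inspired language model.")
print(enc.tokens)  # byte-level pieces, with Ġ marking word-initial spaces
print(enc.ids)     # ids matching the vocab files added in this commit
```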