2020/monica/Day9.ipynb
###Markdown Advent of code 2020 --- Day 9: Encoding Error ---With your neighbor happily enjoying their video game, you turn your attention to an open data port on the little screen in the seat in front of you. Though the port is non-standard, you manage to connect it to your computer through the clever use of several paperclips. Upon connection, the port outputs a series of numbers (your puzzle input). The data appears to be encrypted with the eXchange-Masking Addition System (XMAS) which, conveniently for you, is an old cypher with an important weakness. XMAS starts by transmitting a **preamble** of 25 numbers. After that, each number you receive should be the sum of any two of the 25 immediately previous numbers. The two numbers will have different values, and there might be more than one such pair. For example, suppose your preamble consists of the numbers `1` through `25` in a random order. To be valid, the next number must be the sum of two of those numbers: - `26` would be a **valid** next number, as it could be `1` plus `25` (or many other pairs, like `2` and `24`).- `49` would be a **valid** next number, as it is the sum of `24` and `25`.- `100` would **not** be valid; no two of the previous 25 numbers sum to `100`.- `50` would also **not** be valid; although `25` appears in the previous 25 numbers, the two numbers in the pair must be different. Suppose the 26th number is `45`, and the first number (no longer an option, as it is more than 25 numbers ago) was `20`. Now, for the next number to be valid, there needs to be some pair of numbers among 1-19, 21-25, or 45 that add up to it: - `26` would still be a **valid** next number, as `1` and `25` are still within the previous 25 numbers. - `65` would **not** be valid, as no two of the available numbers sum to it. - `64` and 66 would both be **valid**, as they are the result of 19+45 and 21+45 respectively. Here is a larger example which only considers the previous 5 numbers (and has a preamble of length 5): ```35201525474062556595102117150182127219299277309576```In this example, after the 5-number preamble, almost every number is the sum of two of the previous 5 numbers; the only number that does not follow this rule is 127. The first step of attacking the weakness in the XMAS data is to find the first number in the list (after the preamble) which is not the sum of two of the 25 numbers before it. What is the first number that does not have this property? ###Code #Example with open('example9.txt') as f: xmas = f.read().split('\n') xmas = [num for num in xmas if num != ''] def find_number(xmas,num_preamble): stop=False i = num_preamble while stop == False: preamble = xmas[i-num_preamble:i] check_sum = [] for num1 in preamble: for num2 in preamble: if (num1 != num2)&(int(num1)+int(num2)==int(xmas[i])): check_sum.append(xmas[i]) if (i == len(xmas))| (check_sum == []): stop = True print('The first number to not follow the rule is', xmas[i]) else: i = i +1 return xmas[i] invalid_number = find_number(xmas,5) #Input 9 with open('input_day9.txt') as f: xmas = f.read().split('\n') xmas = [num for num in xmas if num != ''] find_number(xmas,25) ###Output The first number to not follow the rule is 10884537 ###Markdown --- Part Two ---The final step in breaking the XMAS encryption relies on the invalid number you just found: you must find a contiguous set of at least two numbers in your list which sum to the invalid number from step 1. 
Again consider the above example: ```35201525474062556595102117150182127219299277309576```In this list, adding up all of the numbers from `15` through `40` produces the invalid number from step 1, `127`. (Of course, the contiguous set of numbers in your actual list might be much longer.) To find the encryption weakness, add together the smallest and largest number in this contiguous range; in this example, these are `15` and `47`, producing `62`. What is the encryption weakness in your XMAS-encrypted list of numbers? ###Code #Example with open('example9.txt') as f: xmas = f.read().split('\n') xmas = [int(num) for num in xmas if num != ''] def find_weakness(xmas,invalid_number): stop=False start = 0 end = 2 num_range = 2 while stop == False: contiguous = xmas[start:end] if sum(contiguous) == invalid_number: print('The range is',contiguous) min_number = min(contiguous) max_number = max(contiguous) sum_numbers = min_number + max_number print('The encryption weakness is', sum_numbers) stop = True elif end == len(xmas): print('End of list with range',num_range) # stop = True if start == len(xmas): print('End of start') stop = True else: num_range = num_range +1 end = num_range start = 0 else: end = end +1 start = start +1 invalid_number = find_number(xmas,5) find_weakness(xmas,invalid_number) #Input 9 with open('input_day9.txt') as f: xmas = f.read().split('\n') xmas = [int(num) for num in xmas if num != ''] invalid_number = find_number(xmas,25) find_weakness(xmas,invalid_number) ###Output The first number to not follow the rule is 10884537 End of list with range 2 End of list with range 3 End of list with range 4 End of list with range 5 End of list with range 6 End of list with range 7 End of list with range 8 End of list with range 9 End of list with range 10 End of list with range 11 End of list with range 12 End of list with range 13 End of list with range 14 End of list with range 15 End of list with range 16 The range is [408514, 507208, 753282, 695857, 570543, 444281, 626571, 592643, 500865, 693401, 599118, 661929, 814643, 662453, 712303, 852795, 788131] The encryption weakness is 1261309
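The find_weakness search above restarts the window from index 0 every time the range length grows; since all the XMAS numbers are positive, a single-pass sliding window does the same job in O(n). A minimal sketch on the worked 5-preamble example (where the invalid number is 127 and the answer is 15 + 47 = 62):

```python
def find_weakness_window(nums, target):
    lo, running = 0, 0
    for hi, value in enumerate(nums):
        running += value
        # shrink the window from the left while the sum overshoots
        while running > target and lo < hi:
            running -= nums[lo]
            lo += 1
        # need at least two numbers in the contiguous range
        if running == target and hi - lo >= 1:
            window = nums[lo:hi + 1]
            return min(window) + max(window)
    return None

example = [35, 20, 15, 25, 47, 40, 62, 55, 65, 95, 102, 117,
           150, 182, 127, 219, 299, 277, 309, 576]
print(find_weakness_window(example, 127))  # 62
```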
experiments/FOSSIL-advanced-playground.ipynb
###Markdown [//]: (Copyright c 2021, Alessandro Abate, Daniele Ahmed, Alec Edwards, Mirco Giacobbe, Andrea Peruffo)[//]: (All rights reserved.)[//]: (This source code is licensed under the BSD-style license found in the)[//]: (LICENSE file in the root directory of this source tree. ) 🦖 Welcome to FOSSIL! ___._ .' '-.._ / /.--.____") | \ __.-'~ | : -'/ /:. :.-' __________ | : '. | '--.____ '--------.______ _.----.-----./ :/ '--.__ `'----/ '-. __ :/ '-.___ : \ .' )/ '---._ _.-' ] / _/ '-._ _/ _/ / _/ \_ .-'____.-'__< | \___ <_______.\ \_\_---.7 | /'=r_.-' _\\ =/ .--' / ._/'> .' _.-' / .--' /,/ |/`) 'c=, OverviewThis tool uses neural networks alongside the CEGIS architecture to automatically synthesise sound Lyapunov functions or Barrier Certificates for any N-dimensional system.This notebook provides access to the following settings:1. [Verifier Type](verifier-type)1. [Neural Network Structure](nn-structure) i) Activation functions \ ii) Layer structure (number of neurons, number of layers)1. [Factorisation](factorisation)1. [Last Layer of Ones](ll1)1. [Barrier Function Synthesis Settings](barrier-settings)1. [Sympy Settings](sympy-settings)1. [Further CEGIS Settings](further-settings) i) Learning Rate \ ii) Batch size \ iii) Maximum number of CEGIS iterations \ iv) Rounding 1. [Primer Settings](primer-settings) i) Seed and Speed \ ii) Positive Domain \ iii) Interactive Domain [Running the tool with custom settings](running) [Advanced Functionality](advanced)If you would like to use a simpler interface to the tool with just default settings, please use the [playground](FOSSIL-playground.ipynb). ###Code # % Imports import sys sys.path.append('..') from experiments.playground_utils import * ###Output _____no_output_____ ###Markdown 1. Verifier Type (*CegisConfig.VERIFIER.k*)The tool has inbuilt support for both Z3 and Dreal4, and either can be selected as the backend to the verification. Please note that certain functionality cannot be used with Z3 as the verifier, though this is the default verifier.**To specify the verifier, change the following variable to either VerifierType.Z3 or VerifierType.DREAL** ###Code verifier_type = VerifierType.Z3 ###Output _____no_output_____ ###Markdown 2. Neural Network Structure Activations (*CegisConfig.ACTIVATION.k*)The following activation functions are available:* Linear or identity:$$\sigma(x) = x $$___* Square:$$\sigma(x) = x^2 $$___* Mixed $n^{th}$ order Polynomial:$$ \sigma(x) = \sum_{i=1}^{n}x_i ^i, $$ where $x_i$ represents the $i^{th}$ neuron in the layer, giving a mixed layer of activations. For more details on this please see [mixed_activation_functions]___* RELU:$$ \sigma(x) = \max(0,x) $$___* RELU-Square:$$\sigma(x) = x_1^2 + \max(0, x_2), $$where $x_1$ denotes the first half of the neurons in the layer and $x_2$ denotes the second half. ___* REQU:$$ \sigma(x) = x \cdot \max(0,x)$$___* Sigmoid (requires Dreal):$$ \sigma(x) = \frac{\text{e}^{x}}{\text{e}^{x} +1}$$___* Tanh (requires Dreal):$$\sigma(x) = \tanh(x) $$ Neuron Structure (*CegisConfig.N_HIDDEN_NEURONS.k*)The tool supports any number of layers, each with any number of neurons and its own activation function. However, it should be noted that larger networks will have longer verification times. The desired activation functions should be included as a list, with each element representing the activation function for that layer.
Similarly, the number of neurons in each layer should be included as a list.For example, for a network structure with two hidden layers, one with 50 and one with 20 neurons, the first of which has mixed second-order polynomial (LIN_SQUARE) activations and the second with relu (RELU) activations:activations = \[ ActivationType.LIN_SQUARE, ActivationType.RELU \] neurons = \[ 50, 20 \]The default parameters are given in the cell below and can be changed freely. ###Code activations = [ActivationType.SQUARE] neurons = [10] ###Output _____no_output_____ ###Markdown 3. Factorisation (*CegisConfig.FACTORS.k*)In order to help ensure that $V(x^*) = 0$ when synthesising Lyapunov functions, the learner can be augmented with a quadratic term as $$ V(x) = (x-x^*)^2 \cdot \text{n}(x), $$ where $n(x)$ is the original neural network. **To use quadratic factors, set the following variable to LearningFactors.QUADRATIC.** \**To not use any factors (the default setting), set the following variable to None** ###Code factors = None ###Output _____no_output_____ ###Markdown 4. Last Layer of Ones (Lyapunov Only) (*CegisConfig.LLO.k*)llo (*boolean, default True*): This constrains the last layer of the neural network to be all ones. This can improve synthesis by helping ensure the positive definiteness ($V(x) > 0$) condition holds for the learner and verifier.**To constrain the last layer to be all ones, set the following variable to True.** ###Code llo = False ###Output _____no_output_____ ###Markdown 5. Symmetric Belt (Barrier Only) (*CegisConfig.SYMMETRIC_BELT.k*)*symmetric_belt (boolean, default=False)*: defines whether the belt for the derivative barrier condition is symmetric around zero (if so, we consider $|B(x)| \leq 0.5$) or not (if not, we consider $B(x) \geq -0.1$). **To constrain the belt to be symmetric, set the following variable to True.** ###Code symmetric_belt = False ###Output _____no_output_____ ###Markdown 6. Sympy SettingsThese settings determine the usage of sympy within the CEGIS object.sp_handle (*boolean, default True*): determines whether expressions are handled using the python symbolic library *sympy*. (*CegisConfig.SP_HANDLE.k*)sp_simplify (*boolean, default True*): determines whether expressions are simplified using *sympy* (a potentially costly operation). (*CegisConfig.SP_SIMPLIFY.k*)**To disable sympy handling and simplification, set the following variables to False.** ###Code sp_handle = True sp_simplify = True ###Output _____no_output_____ ###Markdown 7. Further CEGIS Settings* Learning Rate (*positive float, default = 0.1*): sets the learning rate of the neural network. (*CegisConfig.LEARNING_RATE.k*)* Batch Size (*int, default = 500*): defines the number of data points initially generated. (*CegisConfig.BATCH_SIZE*)* Max Iterations (*int, default = 10*): sets the maximum number of CEGIS loops before termination. (*CegisConfig.CEGIS_MAX_ITERS.k*)* Rounding (*int, default = 3*): sets the rounding precision used in the Translator. (*CegisConfig.ROUNDING.k*) ###Code learning_rate = 0.1 batch_size = 500 max_iterations = 10 rounding = 3 ###Output _____no_output_____ ###Markdown 8. Primer SettingsThese settings affect how CEGIS is called and interacted with.* Seed and Speed (*boolean, default=False*): if *Seed and Speed* is True, then a CEGIS object is repeatedly instantiated on a short timeout until the procedure is successful. (*CegisConfig.SEED_AND_SPEED.k*)* Interactive Domain (*boolean, default=False*): determines whether the user can adjust the domain size if CEGIS fails.
Cannot be used in conjunction with *Seed and Speed*. (*CegisConfig.INTERACTIVE_DOMAIN.k*)* Positive Domain (*boolean, default=False*): Lyapunov only. If True, then the verification domain for Lyapunov conditions is constrained to the positive orthant. (*CegisConfig.POSITIVE_DOMAIN.k*) ###Code seed_and_speed = False positive_domain = False domain_mode = False ###Output _____no_output_____ ###Markdown Running with custom settingsOnce you have set custom settings using the cells above, use the following cell to instantiate a dynamical system and to synthesise either a Lyapunov function or a Barrier Certificate. The settings are placed into the Cegis_Parameters dictionary as shown. ###Code N_Dimensions = 2 x0, x1 = initialise_states(N_Dimensions) dynamics = [ -x0 + x0 * x1, -x1 ] mode = PrimerMode.LYAPUNOV parameters = {CegisConfig.VERIFIER.k: verifier_type, CegisConfig.ACTIVATION.k: activations, CegisConfig.N_HIDDEN_NEURONS.k: neurons, CegisConfig.FACTORS.k: factors, CegisConfig.LLO.k: llo, CegisConfig.LEARNING_RATE.k: learning_rate, CegisConfig.BATCH_SIZE.k: batch_size, CegisConfig.CEGIS_MAX_ITERS.k: max_iterations, CegisConfig.SP_HANDLE.k: sp_handle, CegisConfig.SP_SIMPLIFY.k: sp_simplify, CegisConfig.SEED_AND_SPEED.k: seed_and_speed, CegisConfig.POSITIVE_DOMAIN.k: positive_domain, CegisConfig.INTERACTIVE_DOMAIN.k: domain_mode,} f_n, f_s = synthesise(dynamics, mode, CEGIS_PARAMETERS=parameters) ###Output _____no_output_____ ###Markdown Advanced FunctionalityUsers are able to define custom systems (dynamics + domains) for either Lyapunov synthesis or Barrier synthesis. These should be in the form of the python functions shown below, including routines for generating the system dynamics, symbolic domains and data generation in the domains. Note that for Lyapunov synthesis, the equilibrium point to perform analysis on must lie at the origin.For generating data within domains, users are encouraged to use the existing functions in experiments/benchmarks/domain_fcns.py when possible.These functions can be passed to *synthesise* or to *Primer.create_Primer* as the argument f. In this case, the CegisConfig.N_VARS parameter must be passed in cegis_parameters.
###Code from experiments.benchmarks.domain_fcns import * #Lyapunov Example def lyap_hybrid(batch_size, functions, inner, outer): # example of 2-d hybrid sys _And = functions['And'] def f(functions, v): _If = functions['If'] x0, x1 = v _then = - x1 - 0.5*x0**3 _else = - x1 - x0**2 - 0.25*x1**3 _cond = x1 >= 0 return [-x0, _If(_cond, _then, _else)] def XD(_, v): x0, x1 = v return _And(inner**2 < x0**2 + x1**2, x0**2 + x1**2 <= outer**2) def SD(): return circle_init_data((0., 0.), outer**2, batch_size) return f, XD, SD() # Barrier Example def obstacle_avoidance(batch_size, functions): _And = functions['And'] velo = 1 def f(functions, v): x, y, phi = v return [ velo * functions['sin'](phi), velo * functions['cos'](phi), - functions['sin'](phi) + 3 * (x * functions['sin'](phi) + y * functions['cos'](phi)) / (0.5 + x ** 2 + y ** 2) ] def XD(_, v): x, y, phi = v return _And(-2 <= x, y <= 2, -1.57 <= phi, phi <= 1.57) def XI(_, v): x, y, phi = v return _And(-0.1 <= x, x <= 0.1, -2 <= y, y <= -1.8, -0.52 <= phi, phi <= 0.52) def XU(_, v): x, y, _phi = v return x**2 + y**2 <= 0.04 def SD(): x_comp = -2 + torch.randn(batch_size, 1)**2 y_comp = 2 - torch.randn(batch_size, 1)**2 phi_comp = segment([-1.57, 1.57], batch_size) dom = torch.cat([x_comp, y_comp, phi_comp], dim=1) return dom def SI(): x = segment([-0.1, 0.1], batch_size) y = segment([-2.0, -1.8], batch_size) phi = segment([-0.52, 0.52], batch_size) return torch.cat([x, y, phi], dim=1) def SU(): xy = circle_init_data((0., 0.), 0.04, batch_size) phi = segment([-0.52, 0.52], batch_size) return torch.cat([xy, phi], dim=1) bounds = inf_bounds_n(2) pi = math.pi bounds.append([-pi/2, pi/2]) return f, XD, XI, XU, SD(), SI(), SU(), bounds ###Output _____no_output_____
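As a quick illustration of the activation formulas listed in section 2, here is a minimal PyTorch sketch; the helper names are hypothetical and this is not FOSSIL's internal ActivationType implementation:

```python
import torch

def requ(x):
    # REQU: sigma(x) = x * max(0, x)
    return x * torch.relu(x)

def relu_square(x):
    # RELU-Square: square the first half of the neurons, ReLU the second half
    half = x.shape[-1] // 2
    return torch.cat([x[..., :half] ** 2, torch.relu(x[..., half:])], dim=-1)

print(requ(torch.tensor([-2.0, 3.0])))          # tensor([0., 9.])
print(relu_square(torch.tensor([2.0, -1.0])))   # tensor([4., 0.])
```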
Kaggle/KaggleTitanic/kaggleScikitLearn.ipynb
###Markdown Table of Contents1&nbsp;&nbsp;Machine Learning from Start to Finish with Scikit-Learn1.0.1&nbsp;&nbsp;Steps Covered1.1&nbsp;&nbsp;CSV to DataFrame1.2&nbsp;&nbsp;Visualizing Data1.3&nbsp;&nbsp;Transforming Features1.4&nbsp;&nbsp;Some Final Encoding1.5&nbsp;&nbsp;Splitting up the Training Data1.6&nbsp;&nbsp;Fitting and Tuning an Algorithm1.7&nbsp;&nbsp;Validate with KFold1.8&nbsp;&nbsp;Predict the Actual Test Data Machine Learning from Start to Finish with Scikit-LearnThis notebook covers the basic Machine Learning process in Python step-by-step. Go from raw data to at least 78% accuracy on the Titanic Survivors dataset. Steps Covered1. Importing a DataFrame2. Visualize the Data3. Cleanup and Transform the Data4. Encode the Data5. Split Training and Test Sets6. Fine Tune Algorithms7. Cross Validate with KFold8. Upload to Kaggle CSV to DataFrameCSV files can be loaded into a dataframe by calling `pd.read_csv` . After loading the training and test files, print a `sample` to see what you're working with. ###Code import numpy as np import pandas as pd import matplotlib.pyplot as plt import seaborn as sns %matplotlib inline data_train = pd.read_csv('train.csv') data_test = pd.read_csv('test.csv') #return 10 random samples data_train.sample(10) ###Output _____no_output_____ ###Markdown Visualizing DataVisualizing data is crucial for recognizing underlying patterns to exploit in the model. ###Code sns.barplot(x="Embarked", y="Survived", hue="Sex", data=data_train); sns.pointplot( x="Pclass", y="Survived", hue="Sex", data=data_train, palette={"male": "blue", "female": "pink"}, markers=["*", "o"], linestyles=["-", "--"]) ###Output _____no_output_____ ###Markdown Transforming Features1. Aside from 'Sex', the 'Age' feature is second in importance. To avoid overfitting, I'm grouping people into logical human age groups. 2. Each Cabin starts with a letter. I bet this letter is much more important than the number that follows, let's slice it off. 3. Fare is another continuous value that should be simplified. I ran `data_train.Fare.describe()` to get the distribution of the feature, then placed them into quartile bins accordingly. 4. Extract information from the 'Name' feature. Rather than use the full name, I extracted the last name and name prefix (Mr. Mrs. Etc.), then appended them as their own features. 5. Lastly, drop useless features. 
(Ticket and Name) ###Code def simplify_ages(df): df.Age = df.Age.fillna(-0.5) bins = (-1, 0, 5, 12, 18, 25, 35, 60, 120) group_names = ['Unknown', 'Baby', 'Child', 'Teenager', 'Student', 'Young Adult', 'Adult', 'Senior'] categories = pd.cut(df.Age, bins, labels=group_names) df.Age = categories return df def simplify_cabins(df): df.Cabin = df.Cabin.fillna('N') df.Cabin = df.Cabin.apply(lambda x: x[0]) return df def simplify_fares(df): df.Fare = df.Fare.fillna(-0.5) bins = (-1, 0, 8, 15, 31, 1000) group_names = ['Unknown', '1_quartile', '2_quartile', '3_quartile', '4_quartile'] categories = pd.cut(df.Fare, bins, labels=group_names) df.Fare = categories return df def format_name(df): df['Lname'] = df.Name.apply(lambda x: x.split(' ')[0]) df['NamePrefix'] = df.Name.apply(lambda x: x.split(' ')[1]) return df def drop_features(df): return df.drop(['Ticket', 'Name', 'Embarked'], axis=1) def transform_features(df): df = simplify_ages(df) df = simplify_cabins(df) df = simplify_fares(df) df = format_name(df) df = drop_features(df) return df data_train = transform_features(data_train) data_test = transform_features(data_test) data_train.head() sns.barplot(x="Age", y="Survived", hue="Sex", data=data_train); sns.barplot(x="Cabin", y="Survived", hue="Sex", data=data_train); sns.barplot(x="Fare", y="Survived", hue="Sex", data=data_train); ###Output _____no_output_____ ###Markdown Some Final EncodingThe last part of the preprocessing phase is to normalize labels. The LabelEncoder in Scikit-learn will convert each unique string value into a number, making our data more flexible for various algorithms. The result is a table of numbers that looks scary to humans, but beautiful to machines. ###Code from sklearn import preprocessing def encode_features(df_train, df_test): features = ['Fare', 'Cabin', 'Age', 'Sex', 'Lname', 'NamePrefix'] df_combined = pd.concat([df_train[features], df_test[features]]) for feature in features: le = preprocessing.LabelEncoder() le = le.fit(df_combined[feature]) df_train[feature] = le.transform(df_train[feature]) df_test[feature] = le.transform(df_test[feature]) return df_train, df_test data_train, data_test = encode_features(data_train, data_test) data_train.head() ###Output _____no_output_____ ###Markdown Splitting up the Training DataNow it's time for some Machine Learning. First, separate the features (X) from the labels (y). **X_all:** All features minus the value we want to predict (Survived).**y_all:** Only the value we want to predict. Second, use Scikit-learn to randomly shuffle this data into four variables. In this case, I'm training on 90% of the data, then testing against the other 10%. Later, this data will be reorganized into a KFold pattern to validate the effectiveness of a trained algorithm. ###Code from sklearn.model_selection import train_test_split X_all = data_train.drop(['Survived', 'PassengerId'], axis=1) y_all = data_train['Survived'] num_test = 0.10 X_train, X_test, y_train, y_test = train_test_split(X_all, y_all, test_size=num_test, random_state=23) ###Output _____no_output_____ ###Markdown Fitting and Tuning an AlgorithmNow it's time to figure out which algorithm is going to deliver the best model. I'm going with the RandomForestClassifier, but you can drop in any other classifier here, such as Support Vector Machines or Naive Bayes.
###Code from sklearn.ensemble import RandomForestClassifier from sklearn.metrics import make_scorer, accuracy_score from sklearn.model_selection import GridSearchCV clf = RandomForestClassifier(n_estimators=1000) # Fit the best algorithm to the data. clf.fit(X_train, y_train) predictions = clf.predict(X_test) print(accuracy_score(y_test, predictions)) ###Output 0.8555555555555555 ###Markdown Validate with KFoldIs this model actually any good? It helps to verify the effectiveness of the algorithm using KFold. This will split our data into 10 buckets, then run the algorithm using a different bucket as the test set for each iteration. ###Code # KFold now lives in sklearn.model_selection (the old sklearn.cross_validation module was removed) from sklearn.model_selection import KFold def run_kfold(clf): kf = KFold(n_splits=10) outcomes = [] fold = 0 for train_index, test_index in kf.split(X_all): fold += 1 X_train, X_test = X_all.values[train_index], X_all.values[test_index] y_train, y_test = y_all.values[train_index], y_all.values[test_index] clf.fit(X_train, y_train) predictions = clf.predict(X_test) accuracy = accuracy_score(y_test, predictions) outcomes.append(accuracy) print("Fold {0} accuracy: {1}".format(fold, accuracy)) mean_outcome = np.mean(outcomes) print("Mean Accuracy: {0}".format(mean_outcome)) run_kfold(clf) ###Output Fold 1 accuracy: 0.7888888888888889 Fold 2 accuracy: 0.8426966292134831 Fold 3 accuracy: 0.7865168539325843 Fold 4 accuracy: 0.797752808988764 Fold 5 accuracy: 0.8426966292134831 Fold 6 accuracy: 0.8202247191011236 Fold 7 accuracy: 0.7752808988764045 Fold 8 accuracy: 0.8202247191011236 Fold 9 accuracy: 0.8202247191011236 Fold 10 accuracy: 0.7865168539325843 Mean Accuracy: 0.8081023720349563 ###Markdown Predict the Actual Test DataAnd now for the moment of truth. Make the predictions, export the CSV file, and upload them to Kaggle. ###Code ids = data_test['PassengerId'] predictions = clf.predict(data_test.drop('PassengerId', axis=1)) output = pd.DataFrame({ 'PassengerId' : ids, 'Survived': predictions }) # output.to_csv('titanic-predictions.csv', index = False) output.head(20) data_test.head(20) ###Output _____no_output_____
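GridSearchCV is imported in the fitting cell above but never actually used. A hedged sketch of how the "Tuning" half of that section could look, reusing the X_train and y_train defined earlier; the grid values are arbitrary choices, not tuned ones:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, accuracy_score
from sklearn.model_selection import GridSearchCV

# Hypothetical search space; widen or narrow it to taste
param_grid = {'n_estimators': [100, 500, 1000],
              'max_depth': [None, 5, 10]}

grid = GridSearchCV(RandomForestClassifier(),
                    param_grid,
                    scoring=make_scorer(accuracy_score),
                    cv=5)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.best_score_)
```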
tutorials/tcirc/1-circuits.ipynb
###Markdown Simulate ###Code from qiskit import execute, Aer def sim(circ): return execute(circ, Aer.get_backend("aer_simulator"), shots=10, ).result().get_counts() print(sim(tcirc.circ)) ###Output {'1 00': 10} ###Markdown CNOT ###Code treg = TopologicalRegister(2, ctype=REPETITION) tcirc = TopologicalCircuit(treg) # prep state tcirc.reset_z(0) tcirc.reset_z(1) tcirc.x(0) tcirc.x(1) # cnot tcirc.cx(0,1) # meas tcirc.measure_z(0) tcirc.measure_z(1) results = sim(tcirc.circ) results = set([result[:3] for result in results.keys()]) print("q1 q0") print(results) tcirc.draw(output='mpl', fold=500) treg = TopologicalRegister(2, ctype=XXZZ) tcirc = TopologicalCircuit(treg) # prep state tcirc.reset_z(0) tcirc.reset_z(1) tcirc.x(0) tcirc.x(1) # cnot tcirc.cx(0,1) # meas tcirc.measure_z(0) tcirc.measure_z(1) # tcirc.draw(output='mpl', fold=500) results = sim(tcirc.circ) results = set([result[:3] for result in results.keys()]) print("q1 q0") print(results) treg = TopologicalRegister(2, ctype="XZZX") tcirc = TopologicalCircuit(treg) # prep state tcirc.reset_z(0) tcirc.reset_z(1) tcirc.x(0) tcirc.x(1) # cnot tcirc.cx(0,1) # meas tcirc.measure_z(0) tcirc.measure_z(1) # tcirc.draw(output='mpl', fold=500) results = sim(tcirc.circ) results = set([result[:3] for result in results.keys()]) print("q1 q0") print(results) ###Output q1 q0 {'0 1'}
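For reference, a sketch of the unencoded two-qubit circuit these topological registers emulate, written in plain Qiskit with the same simulator backend as the `sim` helper above. Starting from |11⟩, the CNOT flips the target, so every shot should give q1=0, q0=1, matching the `'0 1'` logical result printed above:

```python
from qiskit import QuantumCircuit, execute, Aer

qc = QuantumCircuit(2, 2)
qc.x(0)                      # prepare q0 = |1>
qc.x(1)                      # prepare q1 = |1>
qc.cx(0, 1)                  # CNOT: control q0, target q1
qc.measure([0, 1], [0, 1])

counts = execute(qc, Aer.get_backend("aer_simulator"), shots=10).result().get_counts()
print(counts)                # expect {'01': 10}  (bitstring reads q1 q0)
```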
Part 1 - Data structure and file reading.ipynb
###Markdown Python for spatial analysis Part 1 - Data structure and file reading - vector data (geometry, projection, DataFrame --> GeoDataFrame)- raster data (band information, projection, read as array) ###Code import pandas as pd import geopandas as gpd from osgeo import gdal import numpy as np ###Output _____no_output_____ ###Markdown 1. For vector data 1.1 Read files Data in **geopandas** are organized in the form of a **GeoDataFrame**, similar to a DataFrame in pandas. It supports reading vector-based data including shapefiles, geojson and zip files. ###Code ### read from shapefile city = gpd.read_file('../Data/ADDdistance0318_Detroit.shp') city.head() ### read from geojson city = gpd.read_file('../Data/census_tracts_data.geojson') city.plot() ### read from zip zipfile = "zip://../Data/Detroit_zip.zip!Detroit_zip" city = gpd.read_file(zipfile) city.plot() ###Output _____no_output_____ ###Markdown To load subsets of data, we use - gpd.read_file(file, **mask=gdf_mask**), - gpd.read_file(file, **bbox=(-98.499279,29.432841,-98.50150099999999,29.433827)**), - gpd.read_file(file, **rows=10**) (begin from row 10), - gpd.read_file(file, **ignore_fields=[GeoID]**). 1.2 Data structure A **GeoSeries**, an entry consisting of several shapes, is referred to as the **'geometry'** column in the tabular data structure GeoDataFrame, including (multi)points, (multi)lines and (multi)polygons. The geometry information is provided in the form of WKT, for example:- POINT (-98.49077 29.42664)- POLYGON ((-98.499279 29.433827, -98.49914699999999 29.433812, -98.50150099999999 29.432841, -98.50051099999999 29.43326, -98.499279 29.433827))- LINEARRING (-98.499279 29.433827, -98.50150099999999 29.432841, -98.50051099999999 29.43326, -98.499279 29.433827)Almost all of its methods and attributes are generalized from [Shapely](https://shapely.readthedocs.io/en/stable/). All the operations on geometry can be carried out for the GeoSeries. A full list of geometry attributes, methods, relationship detections can be found [here](https://geopandas.org/reference.html). ###Code ### find active geometry column print(city.geometry.name) ### we can also change the active geometry column by 'set_geometry' city_border = city.copy() city_border['border'] = city_border.boundary # GeoSeries geometric operations city_border = city_border.set_geometry('border') print(city_border.geometry.name) city_border.plot() ### find active geometry column print(city.geometry.name) ### we can also change the active geometry column by 'set_geometry' city_center = city.copy() city_center['center'] = city_center.centroid city_center = city_center.set_geometry('center') print(city_center.geometry.name) city_center.plot() ## Note that there is a warning about the CRS; it is recommended to re-project the geographic CRS (in lat-lon) to a projected CRS (in kilometers, etc.), ## which we will introduce below. ### other methods for geometry, like: city.bounds ###Output _____no_output_____ ###Markdown Other attributes not related to geometry, like income, population, temperature, etc., can be operated on as in pandas. 1.3 Projection The Coordinate Reference System (CRS) tells Python how those coordinates relate to places on the Earth. Most of the time, the data we get already includes projection information.
###Code # get projection information city.crs ###Output _____no_output_____ ###Markdown If not, we can use set_crs to help Python interpret the coordinate information, but ensure that you have a geometry-related column to provide location information.> city = city.set_crs(epsg='4326')When we want to calculate distances and areas, it is better to re-project the latitude-longitude information to kilometers. Due to map distortion, we need to choose the most suitable projection type. The 'geometry' column will change. ###Code city_proj = city.to_crs(epsg=3395) city_proj.plot() # Note the x-y ticks. ### NOW there will be no warning for center calculation print(city_proj.geometry.name) ### we can also change the active geometry column by 'set_geometry' city_proj = city_proj.copy() city_proj['center'] = city_proj.centroid city_proj = city_proj.set_geometry('center') print(city_proj.geometry.name) city_proj.plot() ###Output geometry center ###Markdown 1.4 Convert a DataFrame with coordinates to a GeoDataFrame ###Code csv_data = pd.read_csv('../Data/ADDdistance0318_Detroit.csv') ###Output _____no_output_____ ###Markdown - From WKT format ###Code from shapely import wkt csv_data['wkt'] = csv_data['geometry'].apply(wkt.loads) csv_to_shp = gpd.GeoDataFrame(csv_data, geometry='wkt') csv_to_shp.crs shp_proj = csv_to_shp.set_crs(epsg=4326) # WGS_84 shp_proj.plot() ###Output _____no_output_____ ###Markdown - From data coordinates ###Code csv_to_shp = gpd.GeoDataFrame( csv_data, geometry=gpd.points_from_xy(csv_data.center_lon, csv_data.center_lat)) shp_proj = csv_to_shp.set_crs(epsg=4326) shp_proj.plot() ###Output _____no_output_____ ###Markdown 2. For raster data 2.1 Read files ###Code dataset = gdal.Open('../Data/UHIDetroit_2010.tif', gdal.GA_ReadOnly) ### The second input indicates read/write or read-only access to data. The value 'GA_ReadOnly'/0 means read-only with no update, while 'GA_Update'/1 means read/write. ###Output _____no_output_____ ###Markdown 2.2 Data structure The GDALDataset contains related raster bands with the same raster size and their geographical information. Find more details in [Raster Data Model](https://gdal.org/user/raster_data_model.html#raster-data-model).- **Drivers:** ###Code print("Driver: {}/{}".format(dataset.GetDriver().ShortName, dataset.GetDriver().LongName)) ###Output Driver: GTiff/GeoTIFF ###Markdown - **Data size** ###Code print("Size is {} x {} x {}".format(dataset.RasterXSize, dataset.RasterYSize, dataset.RasterCount)) ###Output Size is 676 x 481 x 1 ###Markdown - **Projection**. The information of the coordinate system is in the format of WKT: ###Code print("Projection is {}".format(dataset.GetProjection())) ###Output Projection is GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AXIS["Latitude",NORTH],AXIS["Longitude",EAST],AUTHORITY["EPSG","4326"]] ###Markdown Decompose the above detailed information in the GEOGCS:- **GEOGCS["WGS 84",** ------------ A geographic coordinate system name. - **DATUM["WGS_1984",** ---------- A datum identifier. - **SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]]**, ---------- An ellipsoid name, semi-major axis, and inverse flattening. - **AUTHORITY["EPSG","6326"]],** - **PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],** ---------- A prime meridian name and offset from Greenwich.
- **UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],** ---------- A units name, and conversion factor to meters or radians. - **AXIS["Latitude",NORTH],** ---------- Names and ordering for the axes. - **AXIS["Longitude",EAST],** - **AUTHORITY["EPSG","4326"]]** ---------- Codes for most of the above in terms of predefined coordinate systems from authorities such as EPSG. - **Affine geotransform**. It describes the relationship between raster positions (in pixel coordinates) and georeferenced coordinates. It commonly consists of six coefficients representing [X-axis geocoordinate (e.g. longitude) of the top left corner, pixel width (geographic distance along the X-axis for every pixel cell), affine coefficient 1, Y-axis geocoordinate (e.g. latitude) of the top left corner, pixel height (geographic distance along the Y-axis for every pixel cell), affine coefficient 2]. If the image is north-up, the two affine coefficients are both 0.We can calculate the projection coordinates (X, Y) in geographical space of any (R,C) pixel in raster space, as follows:> X = cof[0] + C\*cof[1] + R\*cof[2]> Y = cof[3] + C\*cof[4] + R\*cof[5] ###Code dataset.GetGeoTransform() geotransform = dataset.GetGeoTransform() if geotransform: print("Origin = ({}, {})".format(geotransform[0], geotransform[3])) print("Pixel Size = ({}, {})".format(geotransform[1], geotransform[5])) ###Output Origin = (-84.15776907745327, 43.32664446836864) Pixel Size = (0.0026949458523585668, -0.002694945852358565) ###Markdown 2.3 Read raster band Until now, the dataset is still a gdal.Dataset object. If we want to get the metadata and do some calculating, statistics, copying and rewriting operations, we can first read those bands and transfer the band data to arrays, with which we are quite familiar. ###Code ### get one band band = dataset.GetRasterBand(1) # now the object is gdal.Band, with value type in Float32 print("Band Type={}".format(gdal.GetDataTypeName(band.DataType))) ###Output Band Type=Float32 ###Markdown Convert the band to an easily operated array with the **ReadAsArray()** function. **BUT** one defect of this operation is that we cannot ignore the NaN values, so necessary operations should be done to deal with the NaN problem. ###Code #### get predefined value representing no data band.GetNoDataValue() bandarray = band.ReadAsArray() # now the object is numpy.ndarray bandarray[bandarray==-999] = np.nan print(np.min(bandarray[~pd.isnull(bandarray)]), np.max(bandarray[~pd.isnull(bandarray)])) ###Output -3.1464381 2.439222
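The affine formula above translates directly into a small helper; a minimal sketch, reusing the `dataset` object opened earlier:

```python
def pixel_to_geo(geotransform, row, col):
    # X = cof[0] + C*cof[1] + R*cof[2];  Y = cof[3] + C*cof[4] + R*cof[5]
    x = geotransform[0] + col * geotransform[1] + row * geotransform[2]
    y = geotransform[3] + col * geotransform[4] + row * geotransform[5]
    return x, y

# pixel (0, 0) recovers the origin printed above
print(pixel_to_geo(dataset.GetGeoTransform(), 0, 0))
```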
Examples/notebooks/Vectors.ipynb
###Markdown VectorsThis notebook discusses the basic properties of vectors and their products.Let's begin by defining some vectors $\tilde{a}$ and $\tilde{b}$ ###Code import numpy as np #vectors origin = [0, 0] a = np.array([4, 1]) b = np.array([1, 3]) #b = np.array([-1, 4]) #extract values for plotting, let's make it into a function, #so we don't keep repeating it over and over def getCoordinates(vectors): V = np.array(vectors) #create a matrix of the vectors maxValue = V.max()+1 #to scale the plot automatically x = V[:,0] #x coordinates, column 0 y = V[:,1] #y coordinates, column 1 return x, y, maxValue originX, originY, maxValue = getCoordinates([origin, origin]) # tail locations per vector X, Y, maxValue = getCoordinates([a, b]) colours = ['r', 'b'] ###Output _____no_output_____ ###Markdown Plot the vectors ###Code import matplotlib.pyplot as plt #use the quiver plot capability of matplotlib plt.figure(figsize=(6,6)) #prevent a skewed view plt.quiver(originX, originY, X, Y, color=colours, angles='xy', scale_units='xy', scale=1) plt.xlim(-maxValue, maxValue) plt.ylim(-maxValue, maxValue) plt.show() ###Output _____no_output_____ ###Markdown Vector ArithmeticLet's do some simple arithmetic of vectors $\tilde{c}=\tilde{a}+\tilde{b}$, $\tilde{d}=\tilde{a}-\tilde{b}$, $\tilde{e}=2\cdot\tilde{a}$. What are the results? ###Code c = a+b d = a-b e = 2*a print("a:", a) print("b:", b) print("c:", c) print("d:", d) print("e:", e) originX, originY, maxValue = getCoordinates([origin, origin, origin]) # tail locations per vector X, Y, maxValue = getCoordinates([c, d, e]) #use the quiver plot capability of matplotlib plt.figure(figsize=(6,6)) #prevent a skewed view plt.quiver(originX, originY, X, Y, angles='xy', scale_units='xy', scale=1) plt.xlim(-maxValue, maxValue) plt.ylim(-maxValue, maxValue) plt.show() ###Output _____no_output_____ ###Markdown Vector ProductsHow do we multiply vectors though? We get the dot product or the inner product $\tilde{a}\cdot\tilde{b}$ ###Code import math #dot or inner product f = np.dot(a, b) #project b onto a #compute angle between them theta = math.atan(b[1]/b[0]) - math.atan(a[1]/a[0]) #atan is arctan function #project b_a = np.linalg.norm(b)*math.cos(theta) print("b_a:", b_a) #compare with projection of vector g = np.linalg.norm(a)*b_a #print("a:", np.linalg.norm(a)) #print("b:", np.linalg.norm(b)) print("f:", f) print("g:", g) ###Output b_a: 1.69774937525 f: 7 g: 7.0 ###Markdown Let's plot the results to see what has happened ###Code aUnit = a/np.linalg.norm(a) h = b_a*aUnit originX, originY, maxValue = getCoordinates([origin, origin, origin]) # tail locations per vector X, Y, maxValue = getCoordinates([a, b, h]) colours = ['r', 'b', 'g'] #use the quiver plot capability of matplotlib plt.figure(figsize=(6,6)) #prevent a skewed view plt.quiver(originX, originY, X, Y, color=colours, angles='xy', scale_units='xy', scale=1) plt.xlim(-maxValue, maxValue) plt.ylim(-maxValue, maxValue) plt.show() ###Output _____no_output_____ ###Markdown So the dot/inner product is a scaled projection of one vector onto the other, i.e. the amount that $\tilde{b}$ has in common with $\tilde{a}$. An important case to consider: $\tilde{a}\cdot\tilde{a}$, which gives the length of the vector squared. ###Code print("a norm:", np.linalg.norm(a)) print("a dot a:", math.sqrt(np.dot(a, a))) ###Output a norm: 4.12310562562 a dot a: 4.123105625617661 ###Markdown Let us define our own inner product as $\sum_{i=1}^{N} a_i b_i$, where $N$ is the length (or dimension) of the vectors.
###Code def inner_product(a, b): #accumulate the element-wise products a_i * b_i total = 0 for ai, bi in zip(a, b): total += ai*bi return total print("a dot a:", math.sqrt(np.dot(a, a))) print("a inner a:", math.sqrt(inner_product(a, a))) print("a dot b:", math.sqrt(np.dot(a, b))) print("a inner b:", math.sqrt(inner_product(a, b))) ###Output a dot b: 2.6457513110645907 a inner b: 2.6457513110645907
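Since the notebook computes the projection through the angle between the vectors, here is a quick numerical check of the identity $\tilde{a}\cdot\tilde{b} = \|\tilde{a}\|\,\|\tilde{b}\|\cos\theta$ for the same $a$ and $b$ (a minimal sketch):

```python
import math
import numpy as np

a, b = np.array([4, 1]), np.array([1, 3])
theta = math.atan2(b[1], b[0]) - math.atan2(a[1], a[0])  # angle between a and b
print(np.dot(a, b))                                             # 7
print(np.linalg.norm(a) * np.linalg.norm(b) * math.cos(theta))  # 7.0 (agrees, up to rounding)
```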
ZS-Young-Data-scientist-2018-Winner/ZS online phase I/Data_Preparation(Normalization).ipynb
###Markdown Extracting month and year from the holiday_data date column ###Code def to_date(x=None): return pd.to_datetime('-'.join(x.split(','))) holiday_data['Date'] = holiday_data['Date'].apply(lambda y : to_date(x=y)) holiday_data['Month'] = holiday_data['Date'].dt.month holiday_data['Year'] = holiday_data['Date'].dt.year holiday_data.head() train_data = train_data.merge(right=holiday_data[[i for i in holiday_data.columns if i not in ['Date']]],on=['Year','Month','Country'],how='left') train_data = train_data.merge(right=promotional_data,on=['Year','Month','Country','Product_ID'],how='left') test_data = test_data.merge(right=holiday_data[[i for i in holiday_data.columns if i not in ['Date']]],on=['Year','Month','Country'],how='left') test_data = test_data.merge(right=promotional_data,on=['Year','Month','Country','Product_ID'],how='left') test_data.head() train_data.head() # converting train weekly data to monthly train_data = train_data.groupby(['Year','Month','Country','Product_ID'],as_index=False).agg({'Sales':'sum','Holiday':'nunique','Expense_Price':'mean'}) test_data = test_data.groupby(['Year','Month','Country','Product_ID'],as_index=False).agg({'Holiday':'nunique','Expense_Price':'mean'}) train_data = train_data.sort_values(['Year','Month','Country','Product_ID']) test_data = test_data.sort_values(['Year','Month','Country','Product_ID']) # train_data['expense_sales_ratio'] = train_data['Expense_Price']/train_data['Sales'] # test_data['expense_sales_ratio'] = test_data['Expense_Price']/test_data['Sales'] train_data.head() test_data.head() train_data.shape,test_data.shape old_test = pd.read_csv('Data/yds_test2018.csv') old_test.head() test_data = test_data.merge(right=old_test.drop(['Sales'],1),on=['Year','Month','Product_ID','Country'],how='left') test_data.head() len(set(test_data['S_No'])) == test_data.shape[0] train_data.to_csv('Data/Clean_train_data.csv',index=False) test_data.to_csv('Data/Clean_test_data.csv',index=False) ###Output _____no_output_____
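The weekly-to-monthly step above hinges on one groupby-agg pattern; a toy illustration with hypothetical data (not the actual competition files):

```python
import pandas as pd

toy = pd.DataFrame({'Year': [2017, 2017, 2017],
                    'Month': [1, 1, 2],
                    'Country': ['IN', 'IN', 'IN'],
                    'Product_ID': [1, 1, 1],
                    'Sales': [10.0, 20.0, 5.0],
                    'Holiday': ['H1', 'H1', 'H2'],
                    'Expense_Price': [100.0, 200.0, 50.0]})

# per month: total sales, number of distinct holidays, average expense
monthly = toy.groupby(['Year', 'Month', 'Country', 'Product_ID'], as_index=False) \
             .agg({'Sales': 'sum', 'Holiday': 'nunique', 'Expense_Price': 'mean'})
print(monthly)
```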
models/1_solubility_physical-descriptors/02_dataset_delaney_and_new_components/.ipynb_checkpoints/model_predict-checkpoint.ipynb
###Markdown Prediction of the aqueous solubility of a chemical compound using a trained linear regression model based on Delaney's dataset and the dataset of new compounds ###Code # import the necessary libraries import matplotlib.pyplot as plt import numpy as np import pandas as pd from rdkit import Chem from rdkit.Chem import Descriptors from sklearn.model_selection import train_test_split from sklearn import linear_model from sklearn.metrics import mean_squared_error, r2_score import pickle # load the model from disk loaded_model = pickle.load(open('model_desc_total.pkl', 'rb')) def AromaticAtoms(m): """ This function calculates the number of aromatic atoms in a molecule. The input argument, m, is an RDKit molecular object """ aromatic_atoms = [m.GetAtomWithIdx(i).GetIsAromatic() for i in range(m.GetNumAtoms())] aa_count = [] for i in aromatic_atoms: if i==True: aa_count.append(1) sum_aa_count = sum(aa_count) return sum_aa_count def predictSingle(smiles, model): """ This function predicts the four molecular descriptors: the octanol/water partition coefficient (LogP), the molecular weight (Mw), the number of rotatable bonds (NRb), and the aromatic proportion (AP) for a single molecule The input arguments are the SMILES molecular structure and the trained model, respectively. """ # define the rdkit molecular object mol = Chem.MolFromSmiles(smiles) # calculate the log octanol/water partition descriptor single_MolLogP = Descriptors.MolLogP(mol) # calculate the molecular weight descriptor single_MolWt = Descriptors.MolWt(mol) # calculate the number of rotatable bonds descriptor single_NumRotatableBonds = Descriptors.NumRotatableBonds(mol) # calculate the aromatic proportion descriptor single_AP = AromaticAtoms(mol)/Descriptors.HeavyAtomCount(mol) # put the descriptors in a list single_list = [single_MolLogP, single_MolWt, single_NumRotatableBonds, single_AP] # add the list to a pandas dataframe single_df = pd.DataFrame(single_list).T # rename the header columns of the dataframe single_df.columns = ['MolLogP', 'MolWt', 'NumRotatableBonds', 'AromaticProportion'] #return single_df return model.predict(single_df)[0] smiles_new = 'FC(F)(Cl)C(F)(Cl)Cl' predictSingle(smiles_new, loaded_model) ###Output _____no_output_____
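The loop inside AromaticAtoms can be collapsed into a single pass over the atoms; an equivalent sketch (the helper name is hypothetical):

```python
from rdkit import Chem

def aromatic_atom_count(m):
    # count the atoms RDKit flags as aromatic
    return sum(1 for atom in m.GetAtoms() if atom.GetIsAromatic())

mol = Chem.MolFromSmiles('c1ccccc1')  # benzene
print(aromatic_atom_count(mol))       # 6
```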
examples/solar_irradiance.ipynb
###Markdown Beam irradiance on a normal planeIrradiance during the day on a plane always perpendicular to the solar beam, on November 12 at the latitude 12 degrees north, for different altitudes: ###Code date = datetime(2019, 11, 12) # November 12 lat = 12 # north hemisphere fig, ax = plt.subplots(figsize=(18, 6)) for h in (0, 2e3, 5e3, 10e3): t = [date + timedelta(minutes=i) for i in range(0, 15 * 24 * 4, 15)] G = [sp.beam_irradiance(h, i, lat) for i in t] ax.plot(t, G, label='h = ' + str(int(h)) + ' m') myFmt = mdates.DateFormatter('%H:%M') ax.xaxis.set_major_formatter(myFmt) plt.autoscale(enable=True, axis='x', tight=True) plt.ylim(0, 1420) plt.xlabel('hour', fontsize=15) plt.ylabel('W/m2', fontsize=15) plt.title('Beam irradiance on normal plane', fontsize=18) plt.legend() plt.grid(True) ###Output _____no_output_____ ###Markdown Irradiance on a fixed planeIrradiance during different days on a fixed plane defined by its normal vector, at the latitude 52 degrees north at an altitude of 10km: ###Code vnorm = np.array([0, 1, 0]) # plane pointing eastwards date1 = datetime(2019, 2, 3) # February 3 date2 = datetime(2019, 7, 14) # July 14 date3 = datetime(2019, 12, 6) # December 6 lat = 52 # north hemisphere h = 10e3 fig, ax = plt.subplots(figsize=(18, 6)) for dt in (date1, date2, date3): t = [dt + timedelta(minutes=i) for i in range(0, 15 * 24 * 4, 15)] G = [sp.irradiance_on_plane(vnorm, h, i, lat) for i in t] # necessary as matplotlib expects a datetime.datetime, not a datetime.time t_ = [datetime.combine(datetime(2019, 1, 1), i) for i in [i.time() for i in t]] plt.plot(t_, G, label=str(dt.date())) myFmt = mdates.DateFormatter('%H:%M') ax.xaxis.set_major_formatter(myFmt) plt.autoscale(enable=True, axis='x', tight=True) plt.ylim(0, 1200) plt.xlabel('hour', fontsize=15) plt.ylabel('W/m2', fontsize=15) plt.title('Beam irradiance on fixed plane', fontsize=18) plt.legend() plt.grid(True) plt.show() ###Output _____no_output_____
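The plots sweep a whole day; for a single point in time the same call pattern applies. A minimal sketch, assuming `sp` is the solar-irradiance module imported in the notebook's setup cell (not shown here):

```python
from datetime import datetime

# beam irradiance at sea level, around solar noon, November 12, latitude 12 N
G = sp.beam_irradiance(0, datetime(2019, 11, 12, 12, 0), 12)
print(G)  # W/m2
```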
.ipynb_checkpoints/Poker-checkpoint.ipynb
###Markdown Poker Texas Hold'em Game Author: Pablo Loren-Aguilar ###Code import pygame import sys import random as rnd import include.deck as deck import include.player as player import include.table as table import include.menu as menu import string pygame.init() #Create game window and title screen = menu.newmenu() #Prepare the table and sit the players surface = pygame.display.set_mode((1200, 1200)) surface.fill((255,255,255)) screen.create_table(surface) screen.put_all_cards(4,surface) screen.put_buttons(surface) players = [] croupier = table.newtable(4,players) #-------------------------Pre-flop-------------------------------# #Initialise the deck of cards and give every player a hand cards = deck.newdeck() croupier.set_hand(players,cards) #Reveal the player's hand screen.player_cards(surface,cards,players[0].cards) #Time to bet screen.show_blinds(surface) screen.put_blinds(surface) players[0].bet(surface) screen.show_common(surface) pygame.display.update() #Clean the table and prepare for a new round #surface = pygame.display.set_mode((1200, 1200)) #surface.fill((255,255,255)) pygame.quit() sys.exit() ###Output _____no_output_____
Complete-Python-Bootcamp/Lists.ipynb
###Markdown ListsEarlier when discussing strings we introduced the concept of a *sequence* in Python. Lists can be thought of as the most general version of a *sequence* in Python. Unlike strings, they are mutable, meaning the elements inside a list can be changed!In this section we will learn about: 1.) Creating lists 2.) Indexing and Slicing Lists 3.) Basic List Methods 4.) Nesting Lists 5.) Introduction to List ComprehensionsLists are constructed with brackets [] and commas separating every element in the list.Let's go ahead and see how we can construct lists! ###Code # Assign a list to a variable named my_list my_list = [1,2,3] ###Output _____no_output_____ ###Markdown We just created a list of integers, but lists can actually hold different object types. For example: ###Code my_list = ['A string',23,100.232,'o'] ###Output _____no_output_____ ###Markdown Just like strings, the len() function will tell you how many items are in the sequence of the list. ###Code len(my_list) ###Output _____no_output_____ ###Markdown Indexing and SlicingIndexing and slicing work just like in strings. Let's make a new list to remind ourselves of how this works: ###Code my_list = ['one','two','three',4,5] # Grab element at index 0 my_list[0] # Grab index 1 and everything past it my_list[1:] # Grab everything UP TO index 3 my_list[:3] ###Output _____no_output_____ ###Markdown We can also use + to concatenate lists, just like we did for strings. ###Code my_list + ['new item'] ###Output _____no_output_____ ###Markdown Note: This doesn't actually change the original list! ###Code my_list ###Output _____no_output_____ ###Markdown You would have to reassign the list to make the change permanent. ###Code # Reassign my_list = my_list + ['add new item permanently'] my_list ###Output _____no_output_____ ###Markdown We can also use the * for a duplication method similar to strings: ###Code # Make the list double my_list * 2 # Again doubling not permanent my_list ###Output _____no_output_____ ###Markdown Basic List MethodsIf you are familiar with another programming language, you might start to draw parallels between arrays in another language and lists in Python. Lists in Python however, tend to be more flexible than arrays in other languages for two good reasons: they have no fixed size (meaning we don't have to specify how big a list will be), and they have no fixed type constraint (like we've seen above).Let's go ahead and explore some more special methods for lists: ###Code # Create a new list l = [1,2,3] ###Output _____no_output_____ ###Markdown Use the **append** method to permanently add an item to the end of a list: ###Code # Append l.append('append me!') # Show l ###Output _____no_output_____ ###Markdown Use **pop** to "pop off" an item from the list. By default pop takes off the last index, but you can also specify which index to pop off. Let's see an example: ###Code # Pop off the 0 indexed item l.pop(0) # Show l # Assign the popped element, remember default popped index is -1 popped_item = l.pop() popped_item # Show remaining list l ###Output _____no_output_____ ###Markdown It should also be noted that list indexing will return an error if there is no element at that index. For example: ###Code l[100] ###Output _____no_output_____ ###Markdown We can use the **sort** and **reverse** methods to also affect our lists: ###Code new_list = ['a','e','x','b','c'] #Show new_list # Use reverse to reverse order (this is permanent!)
new_list.reverse() new_list # Use sort to sort the list (in this case alphabetical order, but for numbers it will go ascending) new_list.sort() new_list ###Output _____no_output_____ ###Markdown Nesting ListsA great feature of Python data structures is that they support *nesting*. This means we can have data structures within data structures. For example: A list inside a list.Let's see how this works! ###Code # Let's make three lists lst_1=[1,2,3] lst_2=[4,5,6] lst_3=[7,8,9] # Make a list of lists to form a matrix matrix = [lst_1,lst_2,lst_3] # Show matrix ###Output _____no_output_____ ###Markdown Now we can again use indexing to grab elements, but now there are two levels for the index. The items in the matrix object, and then the items inside that list! ###Code # Grab first item in matrix object matrix[0] # Grab first item of the first item in the matrix object matrix[0][0] ###Output _____no_output_____ ###Markdown List ComprehensionsPython has an advanced feature called list comprehensions. They allow for quick construction of lists. To fully understand list comprehensions we need to understand for loops. So don't worry if you don't completely understand this section, and feel free to just skip it since we will return to this topic later.But in case you want to know now, here are a few examples! ###Code # Build a list comprehension by deconstructing a for loop within a [] first_col = [row[0] for row in matrix] first_col ###Output _____no_output_____
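Since the cell above promises "a few examples" but shows only one, here is a second minimal comprehension in the same spirit:

```python
# Square every number from 0 to 10 in one line
squares = [x**2 for x in range(0, 11)]
squares  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
```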
Ch16_Recommender_Systems/16-2.ipynb
###Markdown http://preview.d2l.ai/d2l-en/master/chapter_recommender-systems/movielens.html ###Code from d2l import mxnet as d2l from mxnet import gluon, np import os import pandas as pd #@save d2l.DATA_HUB['ml-100k'] = ( 'http://files.grouplens.org/datasets/movielens/ml-100k.zip', 'cd4dcac4241c8a4ad7badc7ca635da8a69dddb83') #@save def read_data_ml100k(): data_dir = d2l.download_extract('ml-100k') names = ['user_id', 'item_id', 'rating', 'timestamp'] data = pd.read_csv(os.path.join(data_dir, 'u.data'), '\t', names=names, engine='python') num_users = data.user_id.unique().shape[0] num_items = data.item_id.unique().shape[0] return data, num_users, num_items data, num_users, num_items = read_data_ml100k() sparsity = 1 - len(data) / (num_users * num_items) print(f'number of users: {num_users}, number of items: {num_items}') print(f'matrix sparsity: {sparsity:f}') print(data.head(5)) d2l.plt.hist(data['rating'], bins=5, ec='black') d2l.plt.xlabel('Rating') d2l.plt.ylabel('Count') d2l.plt.title('Distribution of Ratings in MovieLens 100K') d2l.plt.show() #@save def split_data_ml100k(data, num_users, num_items, split_mode='random', test_ratio=0.1): """Split the dataset in random mode or seq-aware mode.""" if split_mode == 'seq-aware': train_items, test_items, train_list = {}, {}, [] for line in data.itertuples(): u, i, rating, time = line[1], line[2], line[3], line[4] train_items.setdefault(u, []).append((u, i, rating, time)) if u not in test_items or test_items[u][-1] < time: test_items[u] = (i, rating, time) for u in range(1, num_users + 1): train_list.extend(sorted(train_items[u], key=lambda k: k[3])) test_data = [(key, *value) for key, value in test_items.items()] train_data = [item for item in train_list if item not in test_data] train_data = pd.DataFrame(train_data) test_data = pd.DataFrame(test_data) else: mask = [True if x == 1 else False for x in np.random.uniform( 0, 1, (len(data))) < 1 - test_ratio] neg_mask = [not x for x in mask] train_data, test_data = data[mask], data[neg_mask] return train_data, test_data #@save def load_data_ml100k(data, num_users, num_items, feedback='explicit'): users, items, scores = [], [], [] inter = np.zeros((num_items, num_users)) if feedback == 'explicit' else {} for line in data.itertuples(): user_index, item_index = int(line[1] - 1), int(line[2] - 1) score = int(line[3]) if feedback == 'explicit' else 1 users.append(user_index) items.append(item_index) scores.append(score) if feedback == 'implicit': inter.setdefault(user_index, []).append(item_index) else: inter[item_index, user_index] = score return users, items, scores, inter #@save def split_and_load_ml100k(split_mode='seq-aware', feedback='explicit', test_ratio=0.1, batch_size=256): data, num_users, num_items = read_data_ml100k() train_data, test_data = split_data_ml100k( data, num_users, num_items, split_mode, test_ratio) train_u, train_i, train_r, _ = load_data_ml100k( train_data, num_users, num_items, feedback) test_u, test_i, test_r, _ = load_data_ml100k( test_data, num_users, num_items, feedback) train_set = gluon.data.ArrayDataset( np.array(train_u), np.array(train_i), np.array(train_r)) test_set = gluon.data.ArrayDataset( np.array(test_u), np.array(test_i), np.array(test_r)) train_iter = gluon.data.DataLoader( train_set, shuffle=True, last_batch='rollover', batch_size=batch_size) test_iter = gluon.data.DataLoader( test_set, batch_size=batch_size) return num_users, num_items, train_iter, test_iter ###Output _____no_output_____
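As a worked instance of the sparsity formula in the first cell: ml-100k contains 100,000 ratings from 943 users on 1,682 items, so

```python
# sparsity = 1 - #ratings / (#users * #items)
print(1 - 100000 / (943 * 1682))  # ~0.937, i.e. ~93.7% of the matrix is empty
```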
Weekly Assignments/Week 4/Core and Advanced Python - Week 4.ipynb
###Markdown Week 4 Challenge Importing necessary libraries ###Code import math import sys from collections import Counter from functools import reduce ###Output _____no_output_____ ###Markdown **Q1** - *A set contains names which begin either with A or with B. Write a program to separate the names into two sets, one containing names beginning with A and the other containing names beginning with B* ###Code #Case sensitivity is important as it's mentioned clearly in the question #Elements present in sample set are exhaustive with elements starting with either 'A' or 'B' and not others #Creating a dummy set to test sample = {'Actor', 'Apple', 'Banana', 'Boy','Ball'} def set_seperator(sample): """ Taking an input as a set with elements starting with A or B (assumed exhaustive) and separating it into two sets starting with A and B """ #Creating two sets to store values setA = set() setB = set() #Iterating over the elements in the sample set for word in sample: #If word starts with 'A' then adding it to setA if word.startswith('A'): setA.add(word) else: setB.add(word) #Displaying the results print("The elements starting with 'A' in the original set:\n {}".format(setA)) print("\nThe elements starting with 'B' in the original set:\n {}".format(setB)) #Running the function to display the results set_seperator(sample) ###Output The elements starting with 'A' in the original set: {'Apple', 'Actor'} The elements starting with 'B' in the original set: {'Boy', 'Ball', 'Banana'} ###Markdown **Q2** - *Create a list of tuples. Each tuple should contain an item and its price in float. Write a program to print each item and its price in a proper format* ###Code #Creating a dummy list of tuples to iterate over with item and cost shop_cart = [('Apple',20.5), ('Banana',30.5), ('Litchi',40.25), ('Mango',60.25)] #Iterating over the elements using a for loop and printing each for item, cost in shop_cart: print('{} costs Rs {} in the supermarket closest to me.'.format(item, cost)) ###Output Apple costs Rs 20.5 in the supermarket closest to me. Banana costs Rs 30.5 in the supermarket closest to me. Litchi costs Rs 40.25 in the supermarket closest to me. Mango costs Rs 60.25 in the supermarket closest to me. ###Markdown **Q3** - *Write a program that reads integers from the user and stores them in a list. Your program should continue reading values until the user enters 0. Then it should display all of the values entered by the user (except for the 0) in order from smallest to largest, with one value appearing on each line.
Use either the sort method or the sorted function to sort the list.* ###Code
#Assuming 0 is entered by the user at some point, or else the loop will keep running indefinitely

#Creating a list to store the multiple integers
int_ls = []

#Breaking the loop whenever 0 is entered by the user
while True:
    num = int(input('Enter an integer to store in list: '))
    if num == 0:
        break
    #Appending the integer value into the list if any value apart from 0 is entered
    int_ls.append(num)

#Sorting the collected integers in ascending order (the terminating 0 was never appended)
updated_ls = sorted(int_ls, reverse = False)

print('\nThe sorted integer list:')
#Iterating over the sorted list and printing each value in one line
for n in updated_ls:
    print(n)
###Output
Enter an integer to store in list: 5
Enter an integer to store in list: 10
Enter an integer to store in list: 15
Enter an integer to store in list: 16
Enter an integer to store in list: 4
Enter an integer to store in list: 7
Enter an integer to store in list: 0

The sorted integer list:
4
5
7
10
15
16
###Markdown **Q4** - *Write a program to read a list of numbers from the user and remove the two largest and two smallest values from it. Display the list with the values removed, followed by the original list. Your program should generate an appropriate error message if the user enters less than 5 values.* ###Code
#Assuming the user has entered the list of numbers on one line in a space separated format
#Inputs provided by the user can have duplicate entries but must have largest, second_largest,
#second_smallest and smallest values present or else an error message is displayed

def second_largest(arr):
    """
    Returns the second largest element in the list
    """
    if len(set(arr)) == 1:
        print('The given input does not contain more than 1 distinct element.')
    else:
        #Converting the list into a set to remove duplicate values and sorting them in descending order
        sorted_arr = sorted(set(arr), reverse=True)
        #Returning the second largest number
        return sorted_arr[1]

def second_smallest(arr):
    """
    Returns the second smallest element in the list
    """
    if len(set(arr)) == 1:
        print('The given input does not contain more than 1 distinct element.')
    else:
        #Converting the list into a set to remove duplicate values and sorting them in ascending order
        sorted_arr = sorted(set(arr), reverse=False)
        #Returning the second smallest number
        return sorted_arr[1]

#Allowing float values to also be entered
ls = list(map(float, input('Enter a list of numbers: ').split()))

#Condition to flag inputs with fewer than 5 values
if len(ls) < 5:
    print('The list entered contains less than 5 values. Try again!')
else:
    #Obtaining the values of the smallest, second smallest, second largest and largest in the input list
    val_ls = [min(ls), max(ls), second_smallest(ls), second_largest(ls)]
    #Getting the elements remaining in the list after removal of the extreme values
    ls_afterrem = [ele for ele in ls if ele not in val_ls]
    #Displaying the results
    print('\nThe list after removal of values:\n{}'.format(ls_afterrem))
    ls.sort()
    print('\nThe original list (sorted) :\n{}'.format(ls))
###Output
Enter a list of numbers: 3 3 7 6 7 8 8 9 9

The list after removal of values:
[7.0, 7.0]

The original list (sorted) :
[3.0, 3.0, 6.0, 7.0, 7.0, 8.0, 8.0, 9.0, 9.0]
###Markdown **Q5** - *In this exercise, you will create a program that reads words from the user until the user enters a blank line.
After the user enters a blank line your program should display each word entered by the user exactly once.* ###Code
#As the output wants each word printed once and not necessarily in order, an unordered collection like a dict can be used
#It has O(1) query time, so it is faster than the alternatives and handles duplicates gracefully
output_dict = {}

#Iterating until the user enters a blank line as input
while True:
    #To handle case sensitivity of words and any number of spaces in the entry by the user
    inp = input('Enter a word: ').lower().strip()
    #Terminating the loop when a blank line is entered (strip() has already removed any surrounding whitespace)
    if len(inp) == 0:
        break
    else:
        #Checking to see if the entry is already present
        if inp in output_dict:
            output_dict[inp] += 1
        else:
            output_dict[inp] = 1

print('\nThe unique words entered by the user:')
#Iterating over the keys of the output dict -- words entered by user
for word in output_dict:
    print(word)
###Output
Enter a word: cat
Enter a word: dog
Enter a word: Dog
Enter a word: Cat
Enter a word: Panda
Enter a word: 

The unique words entered by the user:
cat
dog
panda
###Markdown **Q6** - *Write a program that reads numbers from the user until a 0 is entered. Your program should display the average of all of the values entered by the user. Then the program should display all of the below average values, followed by all of the average values (if any), followed by all of the above average values. An appropriate label should be displayed before each list of values.* ###Code
#Assuming numbers are entered one at a time by the user; a first entry of 0 is void as all results would be empty
try:
    #Creating a list to store the entries from the user
    result_ls = []
    while True:
        num = float(input('Enter a number to add to the list: '))
        #Condition to exit the loop
        if num == 0:
            break
        #Appending the number into the list
        result_ls.append(num)

    #Using reduce to calculate the sum of the list
    avg = reduce(lambda x,y : x+y, result_ls)/len(result_ls)
    #Displaying the average value
    print('\nThe average value of the resulting list:\n {}'.format(avg))

    #Displaying the filtered list -- strictly below average values
    below_avg = [num for num in result_ls if num < avg]
    print('\nThe below average values of the resulting list:\n {}'.format(below_avg))

    #Displaying the filtered list -- average values
    avg_ls = [num for num in result_ls if num == avg]
    print('\nThe average values of the resulting list:\n {}'.format(avg_ls))

    #Displaying the filtered list -- strictly above average values
    above_avg = [num for num in result_ls if num > avg]
    print('\nThe above average values of the resulting list:\n {}'.format(above_avg))
except:
    print('\nZero was entered as the first value by the user. Provide the input again for meaningful results')
###Output
Enter a number to add to the list: 5
Enter a number to add to the list: 10
Enter a number to add to the list: 15
Enter a number to add to the list: 20
Enter a number to add to the list: 1000
Enter a number to add to the list: 480
Enter a number to add to the list: 0

The average value of the resulting list:
 255.0

The below average values of the resulting list:
 [5.0, 10.0, 15.0, 20.0]

The average values of the resulting list:
 []

The above average values of the resulting list:
 [1000.0, 480.0]
###Markdown **Q7** - *Create a program that determines and displays the number of unique characters in a string entered by the user. For example, Hello World! has 9 unique characters while zzz has only one unique character.
Use a dictionary to solve this problem.* ###Code
#Inputting a string to display unique characters
st = input('Enter a string: ')

#Creating an empty dict to store counts of unique characters
unique_dict = {}

#Iterating over the characters of the string
for char in st:
    #If character is already in dict -- we will add 1
    if char in unique_dict:
        unique_dict[char] += 1
    else:
        #If character is not found in dict we will add it to the dict with count 1
        unique_dict[char] = 1

#Displaying the number of unique characters found in the input string
print('The number of unique characters in the input string: {}'.format(len(unique_dict)))
###Output
Enter a string: Hello World!
The number of unique characters in the input string: 9
###Markdown **Q8** - *Two words are anagrams if they contain all of the same letters, but in a different order. For example, “evil” and “live” are anagrams because each contains one e, one i, one l, and one v. Create a program that reads two strings from the user, determines whether or not they are anagrams, and reports the result.* ###Code
#Inputting the two strings to check whether they are anagrams or not
#Also making the program insensitive to upper/lower case and spaces -- making all inputs lowercase
st1 = input('Enter the first string: ')
st2 = input('Enter the second string: ')
st1_lower = st1.lower().replace(' ','')
st2_lower = st2.lower().replace(' ', '')

#Condensed way to create dicts of counts of each unique character in an input string using Counter
#Counter objects take an iterable as input and return 0 for missing keys
dict1 = Counter(st1_lower)
dict2 = Counter(st2_lower)

#Iterating over every character appearing in either string, so characters that
#occur only in the second string are not missed
for char in set(dict1) | set(dict2):
    #Condition to check that the exact count of a character is present in both strings
    if dict1[char] != dict2[char]:
        print("The two strings '{}' and '{}' are not anagrams".format(st1, st2))
        break
else:
    #If all entries run successfully the two strings are anagrams
    print("The two strings '{}' and '{}' are anagrams".format(st1, st2))
###Output
Enter the first string: Astronomer
Enter the second string: Moon starer
The two strings 'Astronomer' and 'Moon starer' are anagrams
###Markdown **Q9** - *An integer, n, is said to be perfect when the sum of all of the proper divisors of n is equal to n. For example, 28 is a perfect number because its proper divisors are 1, 2, 4, 7 and 14, and 1 + 2 + 4 + 7 + 14 = 28. Write a program that determines whether or not a positive integer is perfect. Your program will accept a number from the user. If that number is a perfect number then your program will display true.
Otherwise it will display false.* ###Code def proper_divisors(num): """ Given a number as input it returns the list of proper divisors of the same excluding the num itself """ #To store proper divisors result = [] #Iterating over the range of 1 to num-1 to find proper divisors for d in range(1, num): #Checking if the value d is a proper divisor or not if num % d == 0: #Appending to the list if it is proper divisor result.append(d) return result def check_perfectnum(num): """ Given a number it will return True when a number is perfect or otherwise False """ #Checking to see if the number equals the sum of its proper divisors (excluding the number itself) if num == sum(proper_divisors(num)): return True else: return False #Inputting the number by the user num = int(input('Enter a positive integer: ')) check_perfectnum(num) ###Output Enter a positive integer: 496 ###Markdown **Q10** - *Write a program that finds all of the keys in a dictionary that map to a specific value. The program will take the value to search for as its input. It will display a (possibly empty) list of keys from the dictionary that map to the provided value.* ###Code #Creating a dummy dict to test out the logic and display results sample = {1: 'Cat', 2: 'Dog', 3: 'Pizza', 4: 'Dog', 5: 'Human', 6: 'Cat'} def dict_search(sample_dict, value): """ Takes the dictionary and value as input and returns the list of keys that map to that value """ #Using dict comprehension to returns keys that map to the value return [k for k, v in sample_dict.items() if v == value] #Testing logic with three values for _ in range(3): #Taking an input value from the user value = input('Enter a value: ') print("\nThe resulting list of keys that map to '{}' in the sample dictionary:\n{}\n".format(value, dict_search(sample, value))) ###Output Enter a value: Cat The resulting list of keys that map to 'Cat' in the sample dictionary: [1, 6] Enter a value: Dog The resulting list of keys that map to 'Dog' in the sample dictionary: [2, 4] Enter a value: Rhino The resulting list of keys that map to 'Rhino' in the sample dictionary: []
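###Markdown A short optional extension (not part of the assignment): if many value lookups are needed, the dictionary from Q10 can be inverted once so that each subsequent search is O(1) instead of scanning every item. This sketch reuses the ``sample`` dictionary defined above. ###Code
from collections import defaultdict

def invert_dict(sample_dict):
    """
    Builds a value -> list-of-keys index so repeated searches avoid a full scan
    """
    inverted = defaultdict(list)
    for k, v in sample_dict.items():
        inverted[v].append(k)
    return inverted

index = invert_dict(sample)
print(index.get('Cat', []))   # [1, 6]
print(index.get('Rhino', [])) # []
###Output
_____no_output_____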
doc/LectureNotes/_build/jupyter_execute/lecturenotes/lecturenotes/notebooks.ipynb
###Markdown Content with notebooksYou can also create content with Jupyter Notebooks. This means that you can includecode blocks and their outputs in your book. Markdown + notebooksAs it is markdown, you can embed images, HTML, etc into your posts!![](https://myst-parser.readthedocs.io/en/latest/_static/logo.png)You an also $add_{math}$ and$$math^{blocks}$$or$$\begin{aligned}\mbox{mean} la_{tex} \\ \\math blocks\end{aligned}$$But make sure you \$Escape \$your \$dollar signs \$you want to keep! MyST markdownMyST markdown works in Jupyter Notebooks as well. For more information about MyST markdown, checkout [the MyST guide in Jupyter Book](https://jupyterbook.org/content/myst.html),or see [the MyST markdown documentation](https://myst-parser.readthedocs.io/en/latest/). Code blocks and outputsJupyter Book will also embed your code blocks and output in your book.For example, here's some sample Matplotlib code: ###Code from matplotlib import rcParams, cycler import matplotlib.pyplot as plt import numpy as np plt.ion() # Fixing random state for reproducibility np.random.seed(19680801) N = 10 data = [np.logspace(0, 1, 100) + np.random.randn(100) + ii for ii in range(N)] data = np.array(data).T cmap = plt.cm.coolwarm rcParams['axes.prop_cycle'] = cycler(color=cmap(np.linspace(0, 1, N))) from matplotlib.lines import Line2D custom_lines = [Line2D([0], [0], color=cmap(0.), lw=4), Line2D([0], [0], color=cmap(.5), lw=4), Line2D([0], [0], color=cmap(1.), lw=4)] fig, ax = plt.subplots(figsize=(10, 5)) lines = ax.plot(data) ax.legend(custom_lines, ['Cold', 'Medium', 'Hot']); ###Output _____no_output_____
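###Markdown If you want to reuse the figure outside the book, the same ``fig`` handle can also be written to disk (the filename here is purely illustrative): ###Code
# Save the legend-annotated plot created above as a standalone image file
fig.savefig('cold_medium_hot.png', dpi=150, bbox_inches='tight')
###Output
_____no_output_____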
machine_learning/lesson 2 - logistic regression/Classification_Softmax_Regression.ipynb
###Markdown > Note: Always open in Colab for the best learning experience.

Classification: Softmax Regression

In the previous lesson, we learned about logistic regression and how it can be used to construct a binary classifier to distinguish between two categorical values (e.g., whether a photo is of a dog or a cat). Logistic regression is great when we are working with only two categorical values. In practice, we are often more interested in differentiating between many categorical values (e.g., more than 2 classes):
- Is this image of a cat, a dog, or a chicken?
- Is this song in the genre of hip hop, pop, funk, rock, etc.?
- What brand of clothing does this t-shirt belong to?

When we want to distinguish between many classes (called *multi-class classification*), we can use a classification technique called softmax regression. In this notebook, we will learn the foundations of softmax regression and demonstrate how to solve multi-class classification problems using an example--building a softmax regression classifier to distinguish 10 handwritten digits (0-9). The ideas we introduce here will build on previous material and continue to lay out the fundamental concepts used in deep learning and neural networks, which we will cover in future lessons.

Here is the lesson roadmap:
1. Review: representing categorical data
2. Introduction to softmax regression
3. Building a softmax regression classifier: recognize 10 handwritten digits from the MNIST dataset
4. Summary

Review: representing categorical data

Representing data: a Shiba Inu, Retriever, and Lab

To motivate our understanding of softmax regression, let's consider the example where we want to distinguish 3 different dog breeds--(golden) retrievers, labs, and shiba inus, given $2 \times 2$ grayscale images. We can represent each pixel value with a single scalar (number), giving us four features $x_1,x_2,x_3,x_4$. Further, we know that each image belongs to one among the categories "retriever", "lab", "shiba inu".

To make the categorical labels useful, we need to convert them into a numerical representation. There are two general ways to represent categorical data in numeric terms. The first way is to choose $y \in \{1, 2, 3\}$, where the integers represent {retriever, lab, shiba inu} respectively. But, as you learned previously, the second way is better: *one-hot encoding*. As a refresher, a one-hot encoding is a vector with as many components as we have categories. The component corresponding to a particular sample's category is set to 1 and all other components are set to 0. So in our case, this translates to:$$y \in \{ (1, 0, 0), (0, 1, 0), (0, 0, 1) \},$$where $y$ would be a three-dimensional vector representing the dog breeds, with $(1, 0, 0)$ corresponding to "retriever", $(0, 1, 0)$ to "lab", and $(0, 0, 1)$ to "shiba inu".

Intro to softmax regression

Softmax regression is a single-layer neural network | Source: Dive Into Deep Learning

Now that we have a healthy understanding of classification techniques like *one-hot encoding* and logistic regression, let's dive into softmax regression. Softmax regression is one of the most widely used classification algorithms. It is a generalization of logistic regression to settings where the labels ($y$) are categorical in nature but there are many categories rather than simply two. It is called "softmax" regression because it uses a *logit* function, called the *softmax* function, to estimate the *conditional probability* of a given class among many classes (i.e., more than 2).
Unlike linear and logistic regression, softmax regression requires a model with multiple outputs, one per class. To address multi-class classification with softmax regression classifiers, we will need as many linear functions as we have outputs. Each output will correspond to its own linear function. In our case, since we have 4 features and 3 possible output categories, we will need 12 scalars to represent the weights ($w$ with subscripts) and 3 scalars to represent the biases ($b$ with subscripts). We compute these three *logits*, $o_1, o_2$, and $o_3$, for each input:

$$\begin{aligned}o_1 &= x_1 w_{11} + x_2 w_{12} + x_3 w_{13} + x_4 w_{14} + b_1,\\o_2 &= x_1 w_{21} + x_2 w_{22} + x_3 w_{23} + x_4 w_{24} + b_2,\\o_3 &= x_1 w_{31} + x_2 w_{32} + x_3 w_{33} + x_4 w_{34} + b_3.\end{aligned}$$

We can depict this calculation with the neural network diagram shown above. Just as in linear regression, softmax regression is also a *single-layer neural network*. And since the calculation of each output, $o_1, o_2$, and $o_3$, depends on all inputs, $x_1, x_2, x_3$, and $x_4$, the output layer of softmax regression can also be described as a *fully-connected* layer.

To express the model more compactly, we can use linear algebra notation. In vector form, we arrive at $\mathbf{o} = \mathbf{W} \mathbf{x} + \mathbf{b}$, a form better suited both for mathematics and for writing code. Note that we have gathered all of our weights into a $3 \times 4$ matrix and that for a given example $\mathbf{x}$, our outputs are given by a *matrix-vector product* of our weights by our inputs plus our biases $\mathbf{b}$. From the above description, hopefully softmax regression feels surprisingly familiar to you, since it shares many of the techniques used by linear and logistic regression methods.

The softmax operation

Single-layer softmax regression neural network | Source: deepnotes.io

The main approach of softmax regression is to interpret the outputs of our model as probabilities. We will *optimize* our parameters to produce probabilities that *maximize the likelihood* of the observed data, just like logistic regression. Then, we generate predictions using a threshold (which we define), for example, choosing the *argmax* of the predicted probabilities. The *argmax* defines a given sample's predicted class based on the category with the highest probability value.

Put formally, we would like outputs $\hat{y}_k$ that we can interpret as the probability that a given item belongs to class $k$. Then, we can choose the class with the largest output value as our prediction, $\operatorname{argmax}_k \hat{y}_k$. For example, if $\hat{y}_1, \hat{y}_2$, and $\hat{y}_3$ are $0.1, 0.8$, and $0.1$, respectively, then we predict category $2$, which (in our example) represents "lab".

To interpret our outputs as probabilities, we must guarantee two things:
1. They will be nonnegative.
2. Given an input sample, the total of all the class probabilities sums up to 1.

To transform our logits such that they become nonnegative and sum to $1$, while ensuring that the model remains differentiable (for gradient descent), we first exponentiate each logit (ensuring non-negativity) and then divide by their sum (ensuring that they sum to $1$):

$$\hat{\mathbf{y}} = \mathrm{softmax}(\mathbf{o})\quad \text{where}\quad\hat{y}_i = \frac{\exp(o_i)}{\sum_j \exp(o_j)}.$$

Now we can guarantee $\hat{y}_1+\hat{y}_2+\hat{y}_3 = 1$ with $0 \leq \hat{y}_i \leq 1$ for all $i$. Thus, we can interpret $\hat{y}$ as a proper probability distribution.
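For instance, the logits $(o_1, o_2, o_3) = (2,\ 1,\ 0.1)$ map to probabilities of roughly $(0.659,\ 0.242,\ 0.099)$: every entry is nonnegative and the three values sum to $1$.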
Note that the softmax operation does not change the ordering among the logits, and thus we can still pick out the most likely class by:

$$\hat{\imath}(\mathbf{o}) = \operatorname*{argmax}_i o_i = \operatorname*{argmax}_i \hat y_i.$$

A note on the loss function

The loss function for softmax regression is called categorical *cross-entropy*. A detailed description of cross-entropy is beyond the scope of this lesson; however, it's worth a quick summary. At a high level, cross-entropy is a measure of the difference between two probability distributions. It measures the amount of information (also called *bits*) needed to encode the data given our model. The goal is to predict the correct labels most of the time (via *maximum likelihood estimation*), while minimizing the *surprise* (entropy) required to communicate the labels.

Summary: Softmax Regression

To summarize softmax regression:
- Category labels ($y$) are converted to discrete integer values and represented using multi-dimensional vectors (e.g., of dimension equal to the number of classes $k$).
- The *softmax* logit function maps input features ($\mathbf{x}$) to probabilities and guarantees that they are nonnegative and sum up to 1.
- A category prediction is determined by a threshold function, such as *argmax*, which selects the class with the highest probability among all $k$ classes.
- Softmax regression classifiers try to *maximize the likelihood* of the observed data, like logistic regression.
- Categorical *cross-entropy* is the loss function for softmax regression.

Softmax Regression: build a classifier to recognize handwritten digits

Now that we know about the fundamentals of softmax regression, let's apply this method to a real-world problem--distinguishing between 10 handwritten digits (0-9) given the MNIST dataset--a dataset of grayscale handwritten digits. In this section, we will demonstrate in an end-to-end fashion the process of creating a softmax regression classifier: from building, to training, and finally evaluating the model to solve the handwritten digit classification task. This process involves several steps:
1. Find, load, and prepare a dataset for the model.
2. Build the model.
3. Train the model using an algorithm such as stochastic gradient descent.
4. Evaluate the quality of our model.
5. Draw conclusions. ###Code
# import torch
import torch
import torch.nn as nn

# import torchvision datasets
import torchvision.datasets

# Commonly used modules
import numpy as np
import os
import sys

# Images, plots, display, and visualization
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns
import cv2
import IPython
from six.moves import urllib
###Output
_____no_output_____
###Markdown 1. Finding and preparing the dataset

For step 1, we found the [MNIST dataset](http://yann.lecun.com/exdb/mnist/). The dataset contains 70k standardized grayscale images of handwritten digits at a resolution of $28 \times 28$ pixels.
Our goal is to build a classification model to take one of these images as input and predict the most likely digit contained in the image (along with a relative confidence about the prediction):Source: MIT Deep LearningLoading the dataset will return two PyTorch Datasets:* The `training_data` dataset contains both matching training images and their labels-—the data the model uses to learn.* The `test_data` dataset contains both matching test images and their labels--the data the model is tested on.The images are $1\times28\times28$ PyTorch tensors (i.e., the x variables), with pixel values ranging between 0 and 1(usually 0-255 but the pixels were already standardized). The *labels* (i.e., $y$) are an array of integers, ranging from 0 to 9. Usually we have to use *one-hot encoding* (the technique we learned about in the logistic regression lesson) to convert these labels to vectors (i.e., arrays with mostly 0s and a 1 at the index that corresponds to the data sample's digit category), however PyTorch does this for us with the loss function. The images are *standardized* already by PyTorch. This is achieved by dividing the pixels of each image by 255 (i.e., the max pixel value). Standardizing the data encourages our model to learn more generalizable features and helps it perform better on outside data. The final data processing step is "flattening" the $28\times28$ image pixel matrices into 784 image pixel arrays, and this is done when the image is passed in the model. We reshape the image matrices into arrays because our model expects the input to be a vector/array with 784 features (i.e., values). Now, let's load the data! ###Code # Downloading the data root = '/content/MNIST' training_data = torchvision.datasets.MNIST(root, train=True, download=True, transform=torchvision.transforms.ToTensor()) test_data = torchvision.datasets.MNIST(root, train=False, download=True, transform=torchvision.transforms.ToTensor()) # Loading Data training_dataloader = torch.utils.data.DataLoader(training_data, batch_size=2048, shuffle=True) test_dataloader = torch.utils.data.DataLoader(test_data, batch_size=2048, shuffle=False) print('Length of training dataset: {}'.format(len(training_data))) print('Length of test dataset: {}'.format(len(test_data))) images, labels = next(iter(training_dataloader)) print('\nTraining batch feature dimension: {}'.format(images.shape)) print('Testing batch feature dimension: {}'.format(labels.shape)) print('\nFirst 5 labels:', labels[:5]) ###Output Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz to /content/MNIST/MNIST/raw/train-images-idx3-ubyte.gz ###Markdown Let's display the first 5 images from the *training set* and display the class name below each image. ###Code plt.figure(figsize=(10,2)) for i in range(5): plt.subplot(1,5,i+1) plt.xticks([]) plt.yticks([]) plt.grid(False) plt.imshow(training_data[i][0].numpy().reshape(28, 28), cmap=plt.cm.binary) plt.xlabel(training_data[i][1]) ###Output _____no_output_____ ###Markdown 3. Build the modelNow that the data is ready, we can build the softmax classifier. We will use PyTorch to define a simple softmax regression model (single-layer fully-connected neural network) to predict the class of each image (a digit between 0 and 9). Given a sample with a corresponding set of class predictions, our model will select the class with the highest prediction value using *argmax*. 
We also define the loss function and optimization algorithm to compute gradients and give a score to the model. We will use *category cross-entropy* loss, *stochastic gradient descent*, and track the *accuracy* metric.Notice that the second parameter in `nn.Linear` layer contains 10 *neurons* instead of 1, which is different from the logistic regression lesson (1 neuron). This is because we need to calculate probabilities for 10 classes (digits 0-9). ###Code class Softmax_Model(nn.Module): # Contructor def __init__(self, num_classes): super(Softmax_Model, self).__init__() # Defining Fully-Connected Layer self.fc = nn.Linear(28*28, num_classes) # 28*28 since each image is 28*28 def forward(self, x): # Need to flatten each image in the batch x = x.flatten(start_dim=1) # Input it into the FC layer x = self.fc(x) return x num_classes = 10 model = Softmax_Model(num_classes) # Defining loss function criterion = torch.nn.CrossEntropyLoss() # Defining optimizer optimizer = torch.optim.SGD(model.parameters(), lr=0.01) print(model) ###Output Softmax_Model( (fc): Linear(in_features=784, out_features=10, bias=True) ) ###Markdown 4. Train the modelNow it's time to train the model. We will train it for 100 *epochs* (iterations) with a *batch size* of 2048 (the number of training examples to evaluate prior to doing gradient descent), and record the training and validation losses while training. ###Code # Use GPU if available device = "cuda" if torch.cuda.is_available() else "cpu" # Moving model to use GPU model.to(device) def getPredsFromLogits(logits): # Using softmax to get an array that sums to 1, and then getting the index with the highest value return torch.nn.functional.softmax(logits, dim=1).argmax(dim=1) epochs = 100 train_losses = [] train_accuracies = [] for epoch in range(1, epochs+1): train_loss = 0.0 train_counts = 0 ################### # train the model # ################### # Setting model to train mode model.train() for images, labels in training_dataloader: # Moving data to GPU if available images, labels = images.to(device), labels.to(device) # Setting all gradients to zero optimizer.zero_grad() # Calculate Output output = model(images) # Calculate Loss loss = criterion(output, labels) # Calculate Gradients loss.backward() # Perform Gradient Descent Step optimizer.step() # Saving loss train_loss += loss.item() # Get Predictions train_preds = getPredsFromLogits(output) # Saving number of right predictions for accuracy train_counts += train_preds.eq(labels).sum().item() # Averaging and Saving Losses train_loss/=len(training_data) train_losses.append(train_loss) # Getting accuracies and saving them train_acc = train_counts/len(training_data) train_accuracies.append(train_acc) print('Epoch: {} \tTraining Loss: {:.6f} \tTraining Accuracy: {:.2f}%'.format(epoch, train_loss, train_acc*100)) plt.plot(train_losses) plt.xlabel('epoch') plt.ylabel('Mean Squared Error') plt.title('Training Loss') plt.show() ###Output _____no_output_____ ###Markdown As the above plot suggests, our model has not converged to the optimal parameters yet after training for 50 epochs (iterations). This suggests the accuracy may improve with more training (more epochs). Further, the validation accuracy peaks around ~93%. 5. Evaluate the modelNow that we trained our model, it's time to evaluate it using the test dataset, which we did not use when training the model. This gives us a sense of how well our model predicts unseen data, which is the case when we use it in the real world. 
###Code
test_loss = 0.0
test_counts = 0

# Setting model to evaluation mode, no parameters will change
model.eval()

for images, labels in test_dataloader:
    # Moving to GPU if available
    images, labels = images.to(device), labels.to(device)

    # Calculate Output
    output = model(images)

    # Calculate Loss
    loss = criterion(output, labels)

    # Saving loss
    test_loss += loss.item()

    # Get Predictions
    test_preds = getPredsFromLogits(output)

    # Saving number of right predictions for accuracy
    test_counts += test_preds.eq(labels).sum().item()

# Calculating test accuracy
test_acc = test_counts/len(test_data)
print('Test Loss: {:.6f} \tTest Accuracy: {:.2f}%'.format(test_loss, test_acc*100))

# Plotting Training label distribution
plt.hist(training_data.targets)
plt.xlabel('Classes')
plt.ylabel('Frequency')
plt.title('Training Labels Distribution');

# Plotting testing label distribution
plt.hist(test_data.targets)
plt.xlabel('Classes')
plt.ylabel('Frequency')
plt.title('Test Labels Distribution');
###Output
_____no_output_____
###Markdown Wow! Our softmax classifier fit the MNIST digit data pretty well, correctly predicting the unseen handwritten digits around 88% to 89% of the time. Since the dataset is fairly balanced (i.e., the digit classes are close to equally represented), we can be relatively confident that the test set accuracy score provides a good estimate for how the model will perform on similar unseen handwritten digit data.

Summary
- We use *one-hot encoding* to represent categorical data.
- Softmax regression is probably the most popular classification technique, and it is a foundational algorithm for classification methods in deep learning (neural networks).
- The *softmax* logit function maps the input features to a probability distribution that guarantees the values are nonnegative and sum up to 1.
- Softmax regression classifiers try to *maximize the likelihood* of the observed data, like logistic regression.
- Categorical *cross-entropy* is the loss function for softmax regression. ###Code
###Output
_____no_output_____
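###Markdown As a final self-contained sketch (pure NumPy, independent of the trained PyTorch model above), here is the softmax-then-argmax prediction rule in isolation. Subtracting the per-row maximum before exponentiating is a standard numerical trick: it cannot change the output, but it prevents ``np.exp`` from overflowing on large logits. ###Code
import numpy as np

def softmax(logits):
    # Subtract the row-wise max so np.exp cannot overflow; the result is unchanged
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

logits = np.array([[5.0, 2.0, -1.0]])
probs = softmax(logits)
print(probs, probs.sum())    # nonnegative values that sum to 1
print(probs.argmax(axis=1))  # predicted class index: 0
###Output
_____no_output_____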
99_CustomModel_v0.6.ipynb
###Markdown 'Tiny' Resnet 3D Model 1. Libraries ###Code ######################################################################### # 01. Libraries import tensorflow as tf # import tensorflow_addons as tfa from tensorflow.keras import layers, models, optimizers, regularizers, constraints, initializers physical_devices = tf.config.list_physical_devices('GPU') try: tf.config.experimental.set_memory_growth(physical_devices[0], True) except: print('Invalid device or cannot modify virtual devices once initialized.') ######################################################################### ###Output _____no_output_____ ###Markdown 2. Model ###Code ######################################################################### # 02. Model Functions class AttentionModel(models.Model): """Instantiates attention model. Uses two [kernel_size x kernel_size] convolutions and softplus as activation to compute an attention map with the same resolution as the featuremap. Features l2-normalized and aggregated using attention probabilites as weights. """ def __init__(self, feature_dim, kernel_size=1, decay=1e-4, max_norm=0.1, name='attention'): """Initialization of attention model. Args: kernel_size: int, kernel size of convolutions. decay: float, decay for l2 regularization of kernel weights. name: str, name to identify model. """ super(AttentionModel, self).__init__(name=name) # First convolutional layer (called with relu activation). self.conv1 = layers.Conv3D( feature_dim, (1, kernel_size, kernel_size), kernel_regularizer=regularizers.l2(decay), # kernel_constraint=constraints.MaxNorm(max_norm), padding='same', name='attn_conv1') self.bn_conv1 = layers.BatchNormalization(axis=-1, name='bn_conv1') self.relu = layers.Activation('relu') # Second convolutional layer, with softplus activation. self.conv2 = layers.Conv3D( 1, (1, kernel_size, kernel_size), kernel_regularizer=regularizers.l2(decay), # kernel_constraint=constraints.MaxNorm(max_norm), padding='same', name='attn_conv2') def call(self, inputs, training=True): x = self.conv1(inputs) x = self.bn_conv1(x, training=training) x = self.relu(x) score = self.conv2(x) prob = tf.nn.softplus(score) # L2-normalize the featuremap before pooling. 
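        # (Channel-wise L2 normalization keeps feature magnitudes comparable before
        # they are weighted by the attention probabilities and mean-pooled over the
        # depth, height and width axes below.)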
inputs = tf.nn.l2_normalize(inputs, axis=-1) feat = tf.reduce_mean(tf.multiply(inputs, prob), [1, 2, 3], keepdims=False) return feat, prob, score class CustomFibrosisImageModel(models.Model): def __init__(self, kernel_attention=1, max_norm=0.1): super(CustomFibrosisImageModel, self).__init__(name='CustomFibrosisImageModel') self.input_avg_pool = layers.AvgPool3D(pool_size=(2, 1, 1)) self.input_batch_norm = layers.BatchNormalization(axis=-1) self.block0_conv1 = layers.Conv3D(8, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_0_bn1 = layers.BatchNormalization(axis=-1) self.block0_conv2 = layers.Conv3D(8, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_0_bn2 = layers.BatchNormalization(axis=-1) self.block0_conv3 = layers.Conv3D(8, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block0_max_pool = layers.MaxPool3D(pool_size=(1, 2, 2), strides=(1, 2, 2), data_format='channels_last') self.block1_dropout = layers.Dropout(0.3) self.block1_conv1 = layers.Conv3D(16, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_1_bn1 = layers.BatchNormalization(axis=-1) self.block1_conv2 = layers.Conv3D(16, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_1_bn2 = layers.BatchNormalization(axis=-1) self.block1_max_pool = layers.MaxPool3D(pool_size=(2, 2, 2), strides=(2, 2, 2), data_format='channels_last') self.block2_dropout = layers.Dropout(0.3) self.block2_conv1 = layers.Conv3D(32, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_2_bn1 = layers.BatchNormalization(axis=-1) self.block2_conv2 = layers.Conv3D(32, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), padding='same') self.block_2_bn2 = layers.BatchNormalization(axis=-1) self.block2_max_pool = layers.MaxPool3D(pool_size=(2, 2, 2), strides=(2, 2, 2), data_format='channels_last') self.block3_dropout = layers.Dropout(0.3) self.block3_conv1 = layers.Conv3D(64, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_3_bn1 = layers.BatchNormalization(axis=-1) self.block3_conv2 = layers.Conv3D(64, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_3_bn2 = layers.BatchNormalization(axis=-1) self.block3_max_pool = layers.MaxPool3D(pool_size=(2, 2, 2), strides=(2, 2, 2), data_format='channels_last') self.block4_dropout = layers.Dropout(0.4) self.block4_conv1 = layers.Conv3D(128, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_4_bn1 = layers.BatchNormalization(axis=-1) self.block4_conv2 = layers.Conv3D(128, kernel_size=(1, 3, 3), activation=None, 
kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_4_bn2 = layers.BatchNormalization(axis=-1) self.block4_max_pool = layers.MaxPool3D(pool_size=(2, 2, 2), strides=(2, 2, 2),data_format='channels_last') self.block5_dropout = layers.Dropout(0.4) self.block5_conv1 = layers.Conv3D(256, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_5_bn1 = layers.BatchNormalization(axis=-1) self.block5_conv2 = layers.Conv3D(256, kernel_size=(1, 3, 3), activation=None, kernel_regularizer=regularizers.l2(1e-4), kernel_initializer = initializers.RandomNormal(stddev=0.01), padding='same') self.block_5_bn2 = layers.BatchNormalization(axis=-1) ############## self.att_0 = AttentionModel(8, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att0') self.att_1 = AttentionModel(16, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att1') self.att_2 = AttentionModel(32, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att2') self.att_3 = AttentionModel(64, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att3') self.att_4 = AttentionModel(128, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att4') self.att_5 = AttentionModel(256, kernel_size=kernel_attention, decay=1e-4, max_norm=max_norm, name='att4') ############# def call(self, inputs, training=True): x = self.input_avg_pool(inputs) x = self.input_batch_norm(x, training) x = self.block0_conv1(x) block0_stack0 = x x = self.block_0_bn1(x, training) x = tf.nn.relu(x) x = self.block0_conv2(x) x = self.block_0_bn2(x, training) x = tf.nn.relu(x) x = self.block0_conv3(x) block0_stack1 = x x = tf.add(block0_stack0, block0_stack1) # att_0_feat, att_0_prob, _ = self.att_0(x, training) x = self.block0_max_pool(x) x = self.block1_dropout(x, training) x = self.block1_conv1(x) block1_stack0 = x x = self.block_1_bn1(x, training) x = tf.nn.relu(x) x = self.block1_conv2(x) block1_stack1 = x x = tf.add(block1_stack0, block1_stack1) x = self.block_1_bn2(x, training) x = tf.nn.relu(x) att_1_feat, att_1_prob, _ = self.att_1(x, training) x = self.block1_max_pool(att_1_prob) x = self.block2_dropout(x, training) x = self.block2_conv1(x) block2_stack0 = x x = self.block_2_bn1(x, training) x = tf.nn.relu(x) x = self.block2_conv2(x) block2_stack1 = x x = tf.add(block2_stack0, block2_stack1) x = self.block_2_bn2(x, training) x = tf.nn.relu(x) att_2_feat, att_2_prob, _ = self.att_2(x, training) x = self.block2_max_pool(att_2_prob) x = self.block3_dropout(x, training) x = self.block3_conv1(x) block3_stack0 = x x = self.block_3_bn1(x, training) x = tf.nn.relu(x) x = self.block3_conv2(x) block3_stack1 = x x = tf.add(block3_stack0, block3_stack1) x = self.block_3_bn2(x, training) x = tf.nn.relu(x) att_3_feat, att_3_prob, _ = self.att_3(x, training) x = self.block3_max_pool(att_3_prob) x = self.block4_dropout(x, training) x = self.block4_conv1(x) block4_stack0 = x x = self.block_4_bn1(x, training) x = tf.nn.relu(x) x = self.block4_conv2(x) block4_stack1 = x x = tf.add(block4_stack0, block4_stack1) x = self.block_4_bn2(x, training) x = tf.nn.relu(x) att_4_feat, att_4_prob, _ = self.att_4(x, training) x = self.block4_max_pool(att_4_prob) x = self.block5_dropout(x, training) x = self.block5_conv1(x) block5_stack0 = x x = self.block_5_bn1(x, training) x = tf.nn.relu(x) x = self.block5_conv2(x) block5_stack1 = x x = 
tf.add(block5_stack0, block5_stack1) x = self.block_5_bn2(x, training) x = tf.nn.relu(x) att_5_feat, att_5_prob, _ = self.att_5(x, training) features_vector = tf.concat([att_1_feat, att_2_feat, att_3_feat, att_4_feat, att_5_feat], axis=-1) return features_vector ######################################################################### # p = tf.random.normal((1, 32, 160, 160, 1)) # m = CustomFibrosisImageModel() # # tf.nn.avg_pool3d(p, (2, 1, 1), strides=1, padding='SAME') # m(p, training=True) ######################################################################### model = CustomFibrosisImageModel(kernel_attention=1, max_norm=0.1) p = tf.random.normal((1, 32, 220, 220, 1)) model(p) # m.build(input_shape=(None, None, None, None, 1)) # input_shape=(None, None, None, 1) # input_ = layers.Input(shape=input_shape) # model = models.Model(inputs=input_, outputs=m.output, name='Resnet3D') # print(model.summary(line_length=130)) tf.keras.models.save_model(model=model, save_format='tf', filepath='../05_Saved_Models/customModel', include_optimizer=False) ######################################################################### ###Output WARNING:tensorflow:Skipping full serialization of Keras layer <__main__.AttentionModel object at 0x000001CF0E309BC8>, because it is not built. WARNING:tensorflow:From C:\Users\Enric\anaconda3\lib\site-packages\tensorflow\python\training\tracking\tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version. Instructions for updating: This property should not be used in TensorFlow 2.0, as updates are applied automatically. WARNING:tensorflow:From C:\Users\Enric\anaconda3\lib\site-packages\tensorflow\python\training\tracking\tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version. Instructions for updating: This property should not be used in TensorFlow 2.0, as updates are applied automatically. WARNING:tensorflow:Skipping full serialization of Keras layer <tensorflow.python.keras.layers.convolutional.Conv3D object at 0x000001CF0E30B148>, because it is not built. WARNING:tensorflow:Skipping full serialization of Keras layer <tensorflow.python.keras.layers.normalization_v2.BatchNormalization object at 0x000001CF0E30F208>, because it is not built. WARNING:tensorflow:Skipping full serialization of Keras layer <tensorflow.python.keras.layers.core.Activation object at 0x000001CF0E3229C8>, because it is not built. WARNING:tensorflow:Skipping full serialization of Keras layer <tensorflow.python.keras.layers.convolutional.Conv3D object at 0x000001CF0E324088>, because it is not built. INFO:tensorflow:Assets written to: ../05_Saved_Models/customModel\assets
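###Markdown As a sanity check -- a sketch that assumes the ``save_model`` call above succeeded at the same path -- the exported SavedModel can be reloaded and queried with a dummy volume: ###Code
# Reload the exported SavedModel and run a single forward pass on random data
reloaded = tf.keras.models.load_model('../05_Saved_Models/customModel', compile=False)
sample = tf.random.normal((1, 32, 220, 220, 1))
features = reloaded(sample)
print(features.shape)  # the concatenated attention feature vector
###Output
_____no_output_____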
writing_efficient_code/code_profiling.ipynb
###Markdown Code profiling

Code profiling for time

Code profiling is a technique that records how long, and how often, the different parts of a program are executed. It allows us to profile individual lines of the code, without having to use magic commands like [`%timeit`](evaluating_runtime.ipynb). This notebook focusses on using the [`line_profiler`](https://github.com/rkern/line_profiler) package to analyse a function's runtime line-by-line. This is a separate package from the Python standard library and, therefore, has to be installed separately.

> `pip install line_profiler`

> `conda install line_profiler`

We need to load the profiler into the session; we can do so with the `%load_ext` command. ###Code
import numpy as np
%load_ext line_profiler
###Output
_____no_output_____
###Markdown This means that we now have `%lprun` available to us, and using the `-f` argument means that we are looking to profile a function. ###Code
def my_function(values, multiplier):
    # Called `values` rather than `list` to avoid shadowing the built-in name
    new_list = [x * multiplier for x in values]
    return(new_list)

%lprun -f my_function my_function(np.random.rand(1000), 3)

def my_function_2(values, multiplier):
    array = np.array(values)
    new_list = array * multiplier
    return(new_list)

%lprun -f my_function_2 my_function_2(np.random.rand(1000), 3)
###Output
_____no_output_____
###Markdown Code profiling for memory

We may also want to consider the memory footprint of a process that we are building. We can see the size of an object in bytes by using the inbuilt `sys` function `getsizeof`; this only works for a single object. ###Code
import sys
sys.getsizeof(np.random.rand(1000))
###Output
_____no_output_____
###Markdown If we want a more in-depth look at the memory allocation of our code, then we can use the [`memory_profiler`](https://pypi.org/project/memory-profiler/) package that is very similar in structure to the `line_profiler` package that we were using above.

> `pip install memory_profiler`

> `conda install memory_profiler`
###Code
%load_ext memory_profiler
###Output
_____no_output_____
###Markdown Once we have loaded `memory_profiler` into the session, we can use it with `%mprun`. However, `%mprun` can only be used on functions that are stored in files, and cannot be used on functions that are merely defined in the session. ###Code
import functions as fn

%mprun -f fn.my_file_function fn.my_file_function(np.random.rand(1000), 5)

%mprun -f fn.my_file_function_2 fn.my_file_function_2(np.random.rand(1000), 5)
###Output
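###Markdown For reference, the contents of `functions.py` are not shown in this notebook, so the following is an assumed sketch that is merely consistent with the calls above (`%mprun` requires the profiled functions to live in a file): ###Code
# functions.py -- assumed contents (illustrative only)
import numpy as np

def my_file_function(values, multiplier):
    # List-comprehension version: builds an intermediate Python list
    return [x * multiplier for x in values]

def my_file_function_2(values, multiplier):
    # Vectorised version: a single NumPy array allocation
    return np.array(values) * multiplier
###Output
_____no_output_____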
Network_diffusion_with_ndlib/NDlib.ipynb
###Markdown Author: Giulio Rossetti
Python version: 3.6
NDlib version: 4.0.1
Last update: 15/02/2018

*Intro to NDlib: Network Diffusion library*

``NDlib`` is a python library designed to provide support to the analysis of diffusive phenomena occurring on top of complex network structures. In this notebook are introduced some of the main features of the library and an overview of its functionalities.

**Note:** this notebook is purposely not 100% comprehensive, it only discusses the basic things you need to get started.

Table of Contents
1. [Installing NDlib](install)
2. [Simulation Workflow](workflow)
 1. [Graph Creation](graph)
 2. [Model Selection and Configuration](model)
 3. [Simulation Execution](simulation)
 4. [Results Visualisation](visual)
3. [Available models](models)
 1. [Epidemics](epidemics)
 2. [Opinion Dynamics](opinion)
4. [Advanced Model Configurations](advanced)
 1. [Node Attributes](nodes)
 2. [Edge Attributes](edges)
 3. [Infection Seeds Selection](seeds)
  1. [Model Stability](stability)
  2. [Stability Visualisation](stability_vis)
5. [Comparing Diffusion models](comparing)
6. [Diffusion on Dynamic Networks](dynamic)
 1. [DynetX: a library for dynamic network modeling](dynetx)
  1. [Snapshot Graphs](snapshots)
  2. [Interaction Networks](interactions)
 2. [Available models](models2)
 3. [Example: SIR](dynsir)
7. [Custom Model Definition](custom)
 1. [Compartments](compartments)
  1. [Node compartments](nc)
  2. [Edge compartments](ec)
  3. [Time compartments](tc)
 2. [Compartments Composition](composition)
  1. [Cascading Composition](cascading)
  2. [Conditional Composition](conditional)
 3. [Example: SIR](sir)
8. [NDQL: Network Diffusion Query Language](ndql)
 1. [Syntax](syntax)
 2. [Command line tools](cmd)
 3. [Example: SIR](sir2)
9. [Conclusions](conclusion)

1. Installing NDlib ([to top](top))

As a first step, we need to make sure that ``NDlib`` is installed and working. The library is available for both python 2.7 and 3.x, and its stable version can be installed using ``pip``:

 pip install ndlib

The nightly builds are also available on the project [GitHub](https://github.com/GiulioRossetti/ndlib) and can be installed as follows:

 pip install git+https://github.com/GiulioRossetti/ndlib.git > /dev/null

In order to check if ``ndlib`` has been correctly installed just try to import it ###Code
import ndlib
###Output
_____no_output_____
###Markdown 2. Simulation Workflow ([to top](top))

``NDlib`` breaks the simulation of diffusive phenomena into a standard workflow:
- Network Creation
- Diffusion model Selection and Configuration
- Simulation execution
- Results visualisation

In this section we will see how to template such a workflow by describing a simple *SIR* simulation.

2.A Graph object creation ([to top](top))

As a first step we need to define the network topology that will be used as a playground to study diffusive phenomena. ``NDlib`` leverages [``networkx``](https://networkx.github.io) data structures to provide support for both directed and undirected graphs. In this example, to perform our simulation, we instantiate an Erdos-Renyi graph as follows: ###Code
import networkx as nx

g = nx.erdos_renyi_graph(1000, 0.1)
###Output
_____no_output_____
###Markdown 2.B Model Selection and Configuration ([to top](top))

After having defined the graph, we can select the diffusion model to simulate. In our example we import the SIR model and instantiate it on our graph.
###Code
import ndlib.models.epidemics.SIRModel as sir

model = sir.SIRModel(g)
###Output
_____no_output_____
###Markdown Every diffusion model has its own parameters; ``NDlib`` offers a common interface to specify them: ``ModelConfig``. ``ModelConfig`` takes care of validating model parameters. Indeed, every model has its own parameters: model specific parameter lists and definitions are available on the project [documentation site](http://ndlib.readthedocs.io). In order to get a description of the required parameters just access the ``parameters`` field ###Code
import json
print(json.dumps(model.parameters, indent=2))
###Output
{
  "model": {
    "beta": {
      "descr": "Infection rate",
      "range": [
        0,
        1
      ],
      "optional": false
    },
    "gamma": {
      "descr": "Recovery rate",
      "range": [
        0,
        1
      ],
      "optional": false
    }
  },
  "nodes": {},
  "edges": {}
}
###Markdown Similarly, to obtain a list of the statuses implemented in the selected model just access the ``available_statuses`` field ###Code
model.available_statuses

import ndlib.models.ModelConfig as mc

cfg = mc.Configuration()
cfg.add_model_parameter('beta', 0.001)  # infection rate
cfg.add_model_parameter('gamma', 0.01)  # recovery rate
###Output
_____no_output_____
###Markdown ``ModelConfig`` also allows to describe the initial condition of the simulation. It makes it possible, for instance, to specify the initial percentage of infected nodes in the network. ###Code
cfg.add_model_parameter("percentage_infected", 0.01)

model.set_initial_status(cfg)
###Output
_____no_output_____
###Markdown 2.C Simulation Execution ([to top](top))

Once the network, the model and the initial conditions have been described, it is possible to perform the simulation. ``NDlib`` models diffusive phenomena as **discrete-time**, **agent-based** processes: at every iteration step all nodes are evaluated and their status is updated according to the model rules. Iterations can be requested (incrementally) by using two methods:
- ``iteration()``
- ``iteration_bunch(nbunch, node_status=False)``

The former computes a single iteration step, the latter executes ``nbunch`` iterations. The ``node_status`` parameter allows to return the individual node status at each iteration. ###Code
iterations = model.iteration_bunch(200, node_status=True)
iterations
###Output
_____no_output_____
###Markdown To abstract from iteration details it is possible to transform them into diffusion **trends** using the ``build_trends(iterations)`` method: ###Code
trends = model.build_trends(iterations)
trends
###Output
_____no_output_____
###Markdown 2.D Results Visualisation ([to top](top))

Finally, ``NDlib`` allows to inspect the behavior of the simulated model using standard plots such as the ``DiffusionTrend`` and ``DiffusionPrevalence`` ones. ###Code
%matplotlib inline
from ndlib.viz.mpl.DiffusionTrend import DiffusionTrend

viz = DiffusionTrend(model, trends)
viz.plot()

from ndlib.viz.mpl.DiffusionPrevalence import DiffusionPrevalence

viz = DiffusionPrevalence(model, trends)
viz.plot()
###Output
_____no_output_____
###Markdown The proposed visualisations are realised using the [``matplotlib``](https://matplotlib.org) python library. They are fully customizable and all the metadata they visualize is gathered from the ``model`` object. To obtain web-oriented versions of such plots ``NDlib`` exposes a second visualisation endpoint built on top of [``bokeh``](https://bokeh.pydata.org/en/latest/): the plotting facilities it defines are collected within the sub-package ``ndlib.viz.bokeh`` and follow the same rationale as their ``matplotlib`` counterpart.

3.
Available models ([to top](top)) The analysis of diffusive phenomena that unfold on top of complex networks is a task able to attract growing interests from multiple fields of research.In order to provide a succinct framing of such complex and extensively studied problem it is possible to split the related literature into two broad, related, sub-classes: **Epidemics** and **Opinion Dynamics**. 3.A Epidemics ([to top](top))When we talk about epidemics, we think about contagious diseases caused by biological pathogens, like influenza, measles, chickenpox and sexually transmitted viruses that spread from person to person. Several elements determine the patterns by which epidemics spread through groups of people: the properties carried by the pathogen (its contagiousness, the length of its infectious period and its severity), the structure of the network as well as the mobility patterns of the people involved. In ``NDlib`` are implemented the following 12 Epidemic models: [SI](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SIm.html) [SIS](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SIS.html) [SIR](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SIR.html) [SEIR](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SEIR.html) [SEIS](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SEIS.html) [SWIR](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/SWIR.html) [Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/Threshold.html) [Generalised Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/GeneralisedThreshold.html) [Kertesz Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/KThreshold.html) [Profile](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/Profile.html) [Profile-Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/ProfileThreshold.html) [Independent Cascades](http://ndlib.readthedocs.io/en/latest/reference/models/epidemics/IndependentCascades.html) 3.B Opinion Dynamics ([to top](top))A different field related with modelling social behaviour is that of opinion dynamics.Recent years have witnessed the introduction of a wide range of models that attempt to explain how opinions form in a population, taking into account various social theories (e.g. bounded confidence or social impact).These models have a lot in common with those seen in epidemics and spreading. In general, individuals are modelled as agents with a state and connected by a social network.The social links can be represented by a complete graph (*mean field* models) or by more realistic complex networks, similar to epidemics and spreading.The state is typically represented by variables, that can be *discrete* (similar to the case of spreading), but also *continuous*, representing for instance a probability to choose one option or another. 
4 Advanced Model Configurations ([to top](top))

As already discussed, the ``ModelConfig`` object is the common interface ``NDlib`` uses to set up simulation experiments. ``ModelConfig`` allows specifying four categories of experiment configurations:
- **Model** configuration (as already discussed)
- **Node** Configuration
- **Edge** Configuration
- Simulation **Initial Conditions**

4.A Node Attributes ([to top](top))

Node configuration involves the instantiation of both the *mandatory* and *optional* parameters attached to individual nodes. Let's consider as an example the ``Threshold`` model. In this model, a *susceptible* node becomes *infected* only if at least $\tau\%$ of its neighbors are already *infected*. We can assign a value of $\tau\%$ to every node as follows:
###Code
import ndlib.models.epidemics.ThresholdModel as th

model = th.ThresholdModel(g)

config = mc.Configuration()
config.add_model_parameter('percentage_infected', 0.1)

threshold = 0.25
for i in g.nodes():
    config.add_node_configuration("threshold", i, threshold) # node attribute setting

model.set_initial_status(config)
###Output
_____no_output_____
###Markdown
4.B Edge Attributes ([to top](top))

Edge configuration involves the instantiation of both the *mandatory* and *optional* parameters attached to individual edges. Let's consider as an example the ``IndependentCascades`` model. In this model, a *susceptible* node becomes *infected* with probability $p$, where $p$ is a value attached to the link connecting the *susceptible* node to an *infected* neighbor. We can assign a value of $p$ to every edge as follows:
###Code
import ndlib.models.epidemics.IndependentCascadesModel as ids

model = ids.IndependentCascadesModel(g)

config = mc.Configuration()
config.add_model_parameter('percentage_infected', 0.1)

threshold = 0.1
for e in g.edges():
    config.add_edge_configuration("threshold", e, threshold) # edge attribute setting

model.set_initial_status(config)
###Output
_____no_output_____
###Markdown
4.C Infection Seeds Selection ([to top](top))

Status configuration allows explicitly specifying the status of a set of nodes at the beginning of the simulation. So far we have assumed that a random sample of nodes (10% in our examples) was initially infected: to cover specific simulation scenarios we can also explicitly *specify* the nodes belonging to each *status* at the beginning of the simulation.
###Code
import ndlib.models.ModelConfig as mc

# Model Configuration
config = mc.Configuration()
infected_nodes = [0, 1, 2, 3, 4, 5]
config.add_model_initial_configuration("Infected", infected_nodes)
###Output
_____no_output_____
###Markdown
**NB:** Explicit status specification takes priority over the percentage specification expressed via model definition (e.g. percentage_infected).

4.C.a Model Stability ([to top](top))

Different initial conditions can affect the overall unfolding of the diffusive process. To analyse the stability of a model w.r.t. the initial seeds, ``NDlib`` implements a ``multi_runs`` facility. ``multi_runs`` allows the parallel execution of multiple instances of a given model starting from different initial infection conditions. We can instantiate ``multi_runs`` as follows:
###Code
from ndlib.utils import multi_runs
import warnings
warnings.filterwarnings("ignore")

model = sir.SIRModel(g)
config = mc.Configuration()
config.add_model_parameter('beta', 0.001)
config.add_model_parameter('gamma', 0.01)
config.add_model_parameter("percentage_infected", 0.05)
model.set_initial_status(config)

trends = multi_runs(model, execution_number=10, iteration_number=100, nprocesses=4)
###Output
_____no_output_____
###Markdown
In our example the initial seeds for each instance of the model were specified by the ``percentage_infected`` model parameter. Alternatively, we can explicitly parametrize the seed sets as follows:
###Code
model = sir.SIRModel(g)
config = mc.Configuration()
config.add_model_parameter('beta', 0.001)
config.add_model_parameter('gamma', 0.01)
model.set_initial_status(config)

infection_sets = [(1, 2, 3, 4, 5),
                  (3, 23, 22, 54, 2),
                  (98, 2, 12, 26, 3),
                  (4, 6, 9) ]
trends1 = multi_runs(model, execution_number=4, iteration_number=100, infection_sets=infection_sets, nprocesses=4)
###Output
_____no_output_____
###Markdown
4.C.b Stability Visualisation ([to top](top))

Model stability can be easily analysed by using the visualisation facilities offered by the library: both the ``DiffusionTrend`` and ``DiffusionPrevalence`` plots allow drawing mean trends along with their point-wise variation.
###Code
viz = DiffusionTrend(model, trends)
viz.plot(percentile=90)

viz = DiffusionPrevalence(model, trends)
viz.plot(percentile=90)
###Output
_____no_output_____
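###Markdown
The ``percentile`` parameter controls the width of the band drawn around the mean trend. To build intuition about what such a band represents, here is a generic ``numpy`` sketch of the idea (illustrative only, not ``NDlib``'s internal code):
###Code
import numpy as np

# toy data: 10 executions, 100 iterations each (e.g. infected counts per iteration)
runs = np.random.poisson(lam=50, size=(10, 100))

mean_trend = runs.mean(axis=0)           # point-wise mean across runs
upper = np.percentile(runs, 90, axis=0)  # 90th percentile (upper bound of the band)
lower = np.percentile(runs, 10, axis=0)  # complementary lower bound
###Output
_____no_output_____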
###Markdown
5. Comparing Diffusion models ([to top](top))

A common goal for which diffusion simulations are executed is to perform comparisons among different models (or different instantiations of the same model). To address such demands, ``NDlib`` provides visual comparison plots. To show how they work, as a first step we execute a second model (in this example an **SI** model) over our original graph.
###Code
import ndlib.models.epidemics.SIModel as si

model1 = si.SIModel(g)
cfg = mc.Configuration()
cfg.add_model_parameter('beta', 0.001)
cfg.add_model_parameter("percentage_infected", 0.01)
model1.set_initial_status(cfg)

trends1 = multi_runs(model1, execution_number=10, iteration_number=100, nprocesses=4)
###Output
_____no_output_____
###Markdown
Then, we can compare them:
###Code
from ndlib.viz.mpl.TrendComparison import DiffusionTrendComparison

viz = DiffusionTrendComparison([model, model1], [trends, trends1], statuses=['Infected'])
viz.plot()

from ndlib.viz.mpl.PrevalenceComparison import DiffusionPrevalenceComparison

viz = DiffusionPrevalenceComparison([model, model1], [trends, trends1], statuses=['Infected'])
viz.plot()
###Output
_____no_output_____
###Markdown
The method parameter ``statuses`` takes as input a list of the model status trends we want to compare. So, for instance, if we are interested in comparing both the trends for *Infected* and *Susceptible* nodes we can do something like this:
###Code
from ndlib.viz.mpl.TrendComparison import DiffusionTrendComparison

viz = DiffusionTrendComparison([model, model1], [trends, trends1], statuses=['Infected', 'Susceptible'])
viz.plot()
###Output
_____no_output_____
###Markdown
6. Diffusion on Dynamic Networks ([to top](top))

So far we have assumed a *static* network topology. In real-world scenarios, however, nodes (as well as edges) are likely to appear and disappear as time goes by, deeply affecting network structure and connectivity. Indeed, topological transformations have huge implications on how diffusive phenomena unfold. ``NDlib`` leverages [``DyNetx``](http://dynetx.readthedocs.io/en/latest/) to model time-evolving graphs. In the following we briefly introduce some [``DyNetx``](http://dynetx.readthedocs.io/en/latest/) primitives that allow building and analysing dynamic networks. A dynamic network is a topology having timestamps attached to edges (and/or nodes).

6.A DyNetX: a library for dynamic network modeling ([to top](top))

[``DyNetx``](http://dynetx.readthedocs.io/en/latest/) is a Python software package that extends [``networkx``](https://networkx.github.io) with dynamic network models and algorithms. We developed [``DyNetx``](http://dynetx.readthedocs.io/en/latest/) as a support library for ``NDlib`` simulations. It provides a generic implementation of dynamic network topologies that can be used to model directed/undirected
- [Snapshot Graphs](snapshots)
- [Interaction Networks](interactions)

In [``DyNetx``](http://dynetx.readthedocs.io/en/latest/) a generic dynamic graph can be built using:
###Code
import dynetx as dn

g = dn.DynGraph() # empty dynamic graph

g.add_interaction(u=1, v=2, t=0, e=2) # adding the edge (1,2) at t=0 that vanishes at time e=2
g.add_interactions_from([(1, 4), (2, 5), (3, 1)], t=1) # adding some edges at time t=1
g.add_interactions_from([(2, 6), (3, 2)], t=2) # adding some edges at time t=2
g.add_interactions_from([(1, 5)], t=3) # adding some edges at time t=3
###Output
_____no_output_____
###Markdown
6.A.a Snapshot Graphs ([to top](top))

Often, network history is partitioned into a series of snapshots, each one of them corresponding either to the state of the network at a time $t$ or to the aggregation of observed interactions during a period.
Formally,

> A ``Snapshot Graph`` $G_t$ is defined by a temporally ordered set $⟨G_1, G_2\dots G_t⟩$ of static graphs where each snapshot $G_i = (V_i, E_i)$ is univocally identified by the sets of nodes $V_i$ and edges $E_i$.

Network snapshots can be effectively used, for instance, to model a phenomenon that generates network perturbations (almost) at regular intervals. In this scenario, context-dependent temporal windows are used to partition the network history into consecutive snapshots: time-bounded observations describing a precise, static discretization of the network life. Our dynamic network example can thus be decomposed into a sequence of snapshot graphs.

[``DyNetx``](http://dynetx.readthedocs.io/en/latest/) allows you to (among other things):
- List the snapshots of the loaded graph
###Code
g.temporal_snapshots_ids()
###Output
_____no_output_____
###Markdown
- Access a specific snapshot
###Code
g1 = g.time_slice(1)
g1.edges()
###Output
_____no_output_____
###Markdown
Moreover, snapshot graphs can also be read from/written to file. For additional details refer to the official [documentation](http://dynetx.readthedocs.io/en/latest/index.html).

6.A.b Interaction networks ([to top](top))

An ``Interaction network`` models a dynamic structure in which both nodes and edges may appear and disappear as time goes by. Usually, ``interaction networks`` are used in the absence of a clear aggregation time scale, or when it makes sense to analyse a dynamic network as a continuous stream of edges. Formally,

> An ``interaction network`` is a graph $G = (V, E, T)$ where: $V$ is a set of triplets of the form $(v, t_s, t_e)$, with $v$ a vertex of the graph and $t_s, t_e \in T$ respectively the birth and death timestamps of the corresponding vertex (with $t_s \leq t_e$); $E$ is a set of quadruplets $(u, v, t_s, t_e)$, with $u, v \in V$ vertices of the graph and $t_s, t_e \in T$ respectively the birth and death timestamps of the corresponding edge (with $t_s \leq t_e$).

Our dynamic network example thus corresponds to a stream of edge appearance/vanishing events. [``DyNetx``](http://dynetx.readthedocs.io/en/latest/) allows obtaining the edge stream of a given dynamic graph.
###Code
for i in g.stream_interactions():
    print(i)
###Output
(1, 2, '+', 0)
(1, 4, '+', 1)
(2, 5, '+', 1)
(3, 1, '+', 1)
(1, 2, '-', 2)
(2, 6, '+', 2)
(3, 2, '+', 2)
(1, 5, '+', 3)
###Markdown
In this representation:
- the first two values identify the nodes involved in an edge
- the third value identifies the edge operation ('+' appearance, '-' vanishing)
- the last value identifies the timestamp

``Interaction networks`` can also be read from/written to file. For additional details refer to the official [documentation](http://dynetx.readthedocs.io/en/latest/index.html).
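As a small example combining the primitives introduced above (only ``temporal_snapshots_ids()`` and ``time_slice()`` are assumed), we can track how the size of the graph evolves across snapshots:
###Code
# for each temporal id, extract the corresponding snapshot and measure it
for tid in g.temporal_snapshots_ids():
    snapshot = g.time_slice(tid)
    print(tid, snapshot.number_of_nodes(), snapshot.number_of_edges())
###Output
_____no_output_____
###Markdown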
6.B Available models ([to top](top))

As we have discussed, network topology may evolve as time goes by. To automatically leverage network dynamics, ``NDlib`` enables the definition of diffusion models that work on ``Snapshot Graphs`` as well as on ``Interaction Networks``. In particular, so far ``NDlib`` implements dynamic network versions of the following epidemic models: [SI](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dSI.html) [SIS](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dSIS.html) [SIR](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dSIR.html) [Profile](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dProfile.html) [Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dProfileThreshold.html) [Kertesz Threshold](http://ndlib.readthedocs.io/en/latest/reference/models/dynamics/dKThreshold.html)

6.C Example: SI ([to top](top))

Let's instantiate a dynamic **SI** model, first on ``snapshot graphs``...
###Code
import ndlib.models.dynamic.DynSIModel as si

# Dynamic Network topology
dg = dn.DynGraph()

# Naive synthetic dynamic graph
# At each timestep t a new graph having the same set of node ids is created
for t in range(0, 30):
    g = nx.erdos_renyi_graph(200, 0.01)
    dg.add_interactions_from(g.edges(), t)

# Model selection
model = si.DynSIModel(dg)

# Model Configuration
config = mc.Configuration()
config.add_model_parameter('beta', 0.01)
config.add_model_parameter("percentage_infected", 0.1)
model.set_initial_status(config)

# Simulate snapshot based execution
iterations = model.execute_snapshots()
trends = model.build_trends(iterations)

viz = DiffusionTrend(model, trends)
viz.plot()
###Output
_____no_output_____
###Markdown
then on ``interaction networks``
###Code
model = si.DynSIModel(dg)

# Model Configuration
config = mc.Configuration()
config.add_model_parameter('beta', 0.01)
config.add_model_parameter("percentage_infected", 0.1)
model.set_initial_status(config)

# Simulation interaction graph based execution
iterations = model.execute_iterations()
trends = model.build_trends(iterations)

viz = DiffusionTrend(model, trends)
viz.plot()
###Output
_____no_output_____
###Markdown
We can easily observe that the model we adopt to describe network dynamics (``snapshot graphs`` or ``interaction networks``) deeply affects the unfolding of the same diffusive process over a given evolving graph.

7. Custom Model Definition ([to top](top))

``NDlib`` comes with a handy syntax for compositional (custom) model definition to support its users in designing novel diffusion models. At a higher level of abstraction, a generic diffusion process can be described by two components:
1. the nodes' ``statuses`` it exposes, and
2. the ``transition rules`` that regulate status changes.

We recall that all models of ``NDlib`` assume an agent-based, discrete-time simulation engine. During each simulation iteration all the nodes in the network are asked to 1. evaluate their current status and 2. (if applicable) apply a matching transition rule. A generic ``transition rule`` can be expressed with something like:

> **if** ``actual_node_status`` **and** ``condition`` **then** ``new_node_status``

The ``condition`` can be easily decomposed into the evaluation of atomic operations that we will call ``compartments``.
The evaluation of a compartment can return either ``True`` (condition satisfied) or ``False`` (condition not satisfied). A simple ``condition`` is composed of a single ``compartment``. Indeed, several ``compartments`` can be described, each one of them capturing an atomic operation. A custom model, having three statuses (``Susceptible``, ``Infected``, ``Recovered``), can be instantiated as follows:
###Code
import ndlib.models.CompositeModel as gc
import ndlib.models.compartments.NodeStochastic as ns

# Composite Model instantiation
model = gc.CompositeModel(g)

model.add_status("Susceptible")
model.add_status("Infected")
model.add_status("Recovered")
###Output
_____no_output_____
###Markdown
7.A Compartments ([to top](top))

> *What are the atomic conditions that can be used to define complex transition rules?*

To answer such a question we identified three families of ``compartments`` (and some operations to combine them).

7.A.a Node Compartments ([to top](top))

In this class fall all those compartments that evaluate conditions tied to node status/features. They model stochastic events as well as deterministic ones.
- [Node Stochastic](http://ndlib.readthedocs.io/en/latest/custom/compartments/NodeStochastic.html): a rule that requires a **probability** $\beta$ to be satisfied.
- [Node Categorical Attribute](http://ndlib.readthedocs.io/en/latest/custom/compartments/NodeCategoricalAttribute.html): a rule that requires a specific value of a **categorical** node attribute to be satisfied (e.g. “Sex”=”male”).
- [Node Numerical Attribute](http://ndlib.readthedocs.io/en/latest/custom/compartments/NodeNumericalAttribute.html): a rule that requires a specific value of a **numerical** node attribute to be satisfied (e.g. “Age” >= 18).
- [Node Threshold](http://ndlib.readthedocs.io/en/latest/custom/compartments/NodeThreshold.html): a rule that requires at least a **percentage** $\beta$ of *infected* neighbors for a node to be satisfied.

Let's add to our model a rule employing a node stochastic compartment:
###Code
import ndlib.models.compartments.NodeStochastic as ns

# Compartment description
c1 = ns.NodeStochastic(0.02, triggering_status="Infected")

# Rule definition
model.add_rule("Susceptible", "Infected", c1)
###Output
_____no_output_____
###Markdown
The **Susceptible -> Infected** rule defined works as follows:
- if a node $n$ is *susceptible*, and
- if $n$ has at least one *infected* neighbor (``triggering_status``)
- then, with probability $0.02$, $n$'s status will shift to *infected*

7.A.b Edge Compartments ([to top](top))

In this class fall all those compartments that evaluate conditions tied to edge features. They model stochastic events as well as deterministic ones.
- [Edge Stochastic](http://ndlib.readthedocs.io/en/latest/custom/compartments/EdgeStochastic.html): a rule that requires a direct link between an infected node and a susceptible one, subject to a **probability** $\beta$ tied to such an edge.
- [Edge Categorical Attribute](http://ndlib.readthedocs.io/en/latest/custom/compartments/EdgeCategoricalAttribute.html): a rule that requires a link between an infected node and a susceptible one that has a specific **categorical** value (e.g. “type”=”co-worker”).
- [Edge Numerical Attribute](http://ndlib.readthedocs.io/en/latest/custom/compartments/EdgeNumericalAttribute.html): a rule that requires a link between an infected node and a susceptible one that has a specific **numerical** value (e.g.
“weight”).

Let's add to our model a rule employing an edge stochastic compartment:
###Code
import ndlib.models.compartments.EdgeStochastic as es

c2 = es.EdgeStochastic(0.02, triggering_status="Recovered")

# Rule definition
model.add_rule("Infected", "Recovered", c2)
###Output
_____no_output_____
###Markdown
The **Infected -> Recovered** rule defined works as follows:
- if a node $n$ is *infected*, and
- letting $\Gamma$ be the set containing the *recovered* (``triggering_status``) neighbors of $n$
- then, for each node $v \in \Gamma$, with probability $0.02$, $n$'s status will shift to *recovered*

7.A.c Time Compartments ([to top](top))

In this class fall all those compartments that evaluate conditions tied to temporal execution. They can be used to model, for instance, lagged events as well as triggered transitions.
- [Count Down](http://ndlib.readthedocs.io/en/latest/custom/compartments/CountDown.html): a rule that has an **incubation** period of $t$ iterations.

Let's add to our model a rule employing a count down compartment
###Code
import ndlib.models.compartments.CountDown as cd

c3 = cd.CountDown("incubation", iterations=10)

# Rule definition (uses c3, the CountDown compartment defined above; the original cell mistakenly reused c1)
model.add_rule("Recovered", "Susceptible", c3)
###Output
_____no_output_____
###Markdown
The **Recovered -> Susceptible** rule defined works as follows:
- if a node $n$ is *recovered*, a count down named ``incubation`` is instantiated
- during each iteration ``incubation`` is decremented
- when ``incubation=0``, $n$ shifts to *susceptible*

7.B Compartments Composition ([to top](top))

Compartments can be chained in multiple ways so as to describe complex transition rules. In particular, a transition rule can be seen as a tree whose nodes are compartments and whose edges are connections among them. As an example, consider the structure of a composite **Susceptible->Infected** transition rule (originally illustrated with a tree diagram):
- the initial node status is evaluated at the root of the tree (the master compartment)
- if the operation described by such a compartment is satisfied, the conditions of (one of) its child compartments are evaluated
- if a path from the root to one leaf of the tree is completely satisfied, the transition rule applies and the node changes its status.

Compartments can be combined following two criteria:
- Cascading Composition
- Conditional Composition

A ``transition rule`` can be defined by employing all possible combinations of cascading and conditional compartment composition.

7.B.a Cascading Composition ([to top](top))

Since each compartment identifies an atomic condition, it is natural to imagine rules described as chains of compartments. A compartment chain identifies an ordered set of conditions that need to be satisfied to allow a status transition (it allows describing an **AND** logic). To implement such behaviour, each compartment exposes a parameter (named ``composed``) that allows specifying the subsequent compartment to evaluate in case its condition is satisfied.
**Example**: In the following scenario the **Susceptible->Infected** rule is implemented using three NodeStochastic compartments chained as follows:

$$C_1 \rightarrow C_2 \rightarrow C_3$$

- If the node $n$ is *Susceptible*
  - $C_1$: if at least one neighbor of the actual node is *Infected*, with probability $0.5$ evaluate compartment $C_2$
  - $C_2$: with probability $0.4$ evaluate compartment $C_3$
  - $C_3$: with probability $0.2$ allow the transition to the *Infected* state
###Code
# Network generation
g = nx.erdos_renyi_graph(1000, 0.1)

# Composite Model instantiation
model = gc.CompositeModel(g)

# Model statuses
model.add_status("Susceptible")
model.add_status("Infected")

# Compartment definition and chain construction
c3 = ns.NodeStochastic(0.2)
c2 = ns.NodeStochastic(0.4, composed=c3)
c1 = ns.NodeStochastic(0.5, "Infected", composed=c2)

# Rule definition
model.add_rule("Susceptible", "Infected", c1)

# Model initial status configuration
config = mc.Configuration()
config.add_model_parameter('percentage_infected', 0.1)

# Simulation execution
model.set_initial_status(config)
iterations = model.iteration_bunch(100)
trends = model.build_trends(iterations)

viz = DiffusionTrend(model, trends)
viz.plot()
###Output
_____no_output_____
###Markdown
7.B.b Conditional Composition ([to top](top))

Conditional compartment composition allows describing rules as trees. A compartment tree identifies an ordered and disjoint set of conditions that need to be satisfied to allow a status transition (it allows describing an **OR** logic). The ``ConditionalComposition`` compartment allows describing a branching pattern as follows:

$$\textit{if } C_i \textit{ then } C_j \textit{ else } C_z$$

``ConditionalComposition`` evaluates the guard compartment ($C_i$) and, depending on the result it gets (True or False), moves to the evaluation of one of its two child compartments ($C_j$ or $C_z$).

**Example**: In the following scenario the **Susceptible->Infected** rule is implemented using three NodeStochastic compartments combined as follows:
- If the node $n$ is *Susceptible*
  - $C_1$: if at least one neighbor of the actual node is *Infected*, with probability $0.5$ evaluate compartment $C_2$, else evaluate compartment $C_3$
  - $C_2$: with probability $0.2$ allow the transition to the *Infected* state
  - $C_3$: with probability $0.1$ allow the transition to the *Infected* state
###Code
import ndlib.models.compartments.ConditionalComposition as cif

# Network generation
g = nx.erdos_renyi_graph(1000, 0.1)

# Composite Model instantiation
model = gc.CompositeModel(g)

# Model statuses
model.add_status("Susceptible")
model.add_status("Infected")

# Compartment definition
c1 = ns.NodeStochastic(0.5, "Infected")
c2 = ns.NodeStochastic(0.2)
c3 = ns.NodeStochastic(0.1)

# Conditional Composition
cc = cif.ConditionalComposition(c1, c2, c3)

# Rule definition
model.add_rule("Susceptible", "Infected", cc)

# Model initial status configuration
config = mc.Configuration()
config.add_model_parameter('percentage_infected', 0.1)

# Simulation execution
model.set_initial_status(config)
iterations = model.iteration_bunch(30)
trends = model.build_trends(iterations)

viz = DiffusionTrend(model, trends)
viz.plot()
###Output
_____no_output_____
###Markdown
7.C Example: SIR ([to top](top))

Let's describe, and simulate, a **SIR** model using compartments.
###Code
from ndlib.models.CompositeModel import CompositeModel
from ndlib.models.compartments.NodeStochastic import NodeStochastic

# Network definition
g1 = nx.erdos_renyi_graph(n=1000, p=0.1)

# Model definition
SIR = CompositeModel(g1)
SIR.add_status('Susceptible')
SIR.add_status('Infected')
SIR.add_status('Removed')

# Compartments
c1 = NodeStochastic(triggering_status='Infected', rate=0.001, probability=1)
c2 = NodeStochastic(rate=0.01, probability=1)

# Rules
SIR.add_rule('Susceptible', 'Infected', c1)
SIR.add_rule('Infected', 'Removed', c2)

# Configuration
config = mc.Configuration()
config.add_model_parameter('percentage_infected', 0.1)
SIR.set_initial_status(config)

# Simulation
iterations = SIR.iteration_bunch(300, node_status=False)
trends = SIR.build_trends(iterations)

viz = DiffusionTrend(SIR, trends)
viz.plot()
###Output
_____no_output_____
###Markdown
8. NDQL: Network Diffusion Query Language ([to top](top))

``NDlib`` aims to reach a heterogeneous audience composed of technicians as well as analysts. To abstract from its programming interface, we designed a query language to describe diffusion simulations: ``NDQL``.

8.A Syntax ([to top](top))

An ``NDQL`` script is composed of a minimum set of directives:
1. Network specification:
   - CREATE_NETWORK (-), LOAD_NETWORK (-)
2. Model definition:
   - MODEL, STATUS, COMPARTMENT (+), IF-THEN-ELSE (+), RULE
3. Model initialization and simulation execution:
   - INITIALIZE, EXECUTE

Directives marked with (+) are optional while the ones marked with (-) are mutually exclusive w.r.t. their class. The complete language directive specification is the following:

> **MODEL** model_name
>
> **STATUS** status_name
>
> **COMPARTMENT** compartment_name
> **TYPE** compartment_type
> **COMPOSE** compartment_name
> [**PARAM** param_name numeric]+
> [**TRIGGER** status_name]
>
> **IF** compartment_name_1 **THEN** compartment_name_2 **ELSE** compartment_name_3 **AS** rule_name
>
> **RULE** rule_name
> **FROM** status_name
> **TO** status_name
> **USING** compartment_name
>
> **INITIALIZE**
> [**SET** status_name ratio]+
>
> **CREATE_NETWORK** network_name
> **TYPE** network_type
> [**PARAM** param_name numeric]+
>
> **LOAD_NETWORK** network_name **FROM** network_file
>
> **EXECUTE** model_name **ON** network_name **FOR** iterations

The **CREATE_NETWORK** directive can take as network_type any [``networkx``](https://networkx.github.io) graph generator name (``param_name`` values are inherited from the generator function parameters).

8.B Command line tools ([to top](top))

``NDlib`` installs two command line tools:
- NDQL_translate
- NDQL_execute

The former translates a generic, well-formed ``NDQL`` script into an equivalent Python one. It can be executed as

> ``NDQL_translate`` query_file python_file

where *query_file* identifies the target ``NDQL`` script and *python_file* specifies the desired name for the resulting Python script.

The latter directly executes a generic, well-formed ``NDQL`` script. It can be executed as

> ``NDQL_execute`` query_file result_file

where *query_file* identifies the target ``NDQL`` script and *result_file* specifies the desired name for the execution results.
Execution results are saved as JSON files with the following syntax:

```
[{"trends": {"node_count": {"0": [270, 179, 15, 0, 0], "1": [30, 116, 273, 256, 239], "2": [0, 5, 12, 44, 61]},
             "status_delta": {"0": [0, -91, -164, -15, 0], "1": [0, 86, 157, -17, -17], "2": [0, 5, 7, 32, 17]}},
  "Statuses": {"1": "Infected", "2": "Removed", "0": "Susceptible"}}]
```

where
- ``node_count`` describes the trends built on the number of nodes per status
- ``status_delta`` describes the trends built on the fluctuations of the number of nodes per status
- ``Statuses`` provides a map from numerical id to status name

8.C Example: SIR ([to top](top))

Let's describe a **SIR** model using ``NDQL``.

Network creation
> **CREATE_NETWORK** g1
> **TYPE** erdos_renyi_graph
> **PARAM** n 300
> **PARAM** p 0.1

Model definition
> **MODEL** SIR
>
> **STATUS** Susceptible
> **STATUS** Infected
> **STATUS** Removed
>
> **COMPARTMENT** c1
> **TYPE** NodeStochastic
> **PARAM** rate 0.1
> **TRIGGER** Infected
>
> **COMPARTMENT** c2
> **TYPE** NodeStochastic
> **PARAM** rate 0.1
>
> **RULE**
> **FROM** Susceptible
> **TO** Infected
> **USING** c1
>
> **RULE**
> **FROM** Infected
> **TO** Removed
> **USING** c2

Simulation initialization and execution
> **INITIALIZE**
> **SET** Infected 0.1
>
> **EXECUTE** SIR **ON** g1 **FOR** 100

The complete query can be found in ``data/ndql_sir.txt``. To translate our query into the corresponding python script we can run
###Code
!NDQL_translate data/ndql_sir.txt data/sir.py
###Output
_____no_output_____
###Markdown
The translated script is then
###Code
with open("data/sir.py") as f:
    l = f.read()
print(l)
###Output
import networkx as nx
import numpy as np
import json
from ndlib.models.ModelConfig import Configuration
from ndlib.models.CompositeModel import CompositeModel
from ndlib.models.compartments.NodeStochastic import NodeStochastic

g1 = nx.erdos_renyi_graph(n=300, p=0.1)

SIR = CompositeModel(g1)

SIR.add_status('Susceptible')
SIR.add_status('Infected')
SIR.add_status('Removed')

c1 = NodeStochastic(triggering_status='Infected', rate=0.1, probability=1, name="None")
c2 = NodeStochastic(rate=0.1, probability=1, name="None")

SIR.add_rule('Susceptible', 'Infected', c1)
SIR.add_rule('Infected', 'Removed', c2)

config = Configuration()
config.add_model_parameter('percentage_infected', 0.1)

SIR.set_initial_status(config)
iterations = SIR.iteration_bunch(100, node_status=False)
res = json.dumps(iterations)
print(res)

###Markdown
In order to execute the query we can run
###Code
!NDQL_execute data/ndql_sir.txt data/res.json
###Output
_____no_output_____
###Markdown
As a result we get
###Code
with open("data/res.json") as f:
    l = f.read()
print(l)
###Output
[{"trends": {"node_count": {"0": [270, 200, 24, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], "1": [30, 97, 267, 273, 242, 213, 194, 177, 163, 147, 127, 112, 101, 86, 77, 73, 66, 58, 54, 50, 46, 43, 40, 35, 31, 28, 24, 20, 17, 16, 15, 11, 9, 8, 7, 6, 5, 4, 4, 4, 3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], "2": [0, 3, 9, 27, 58, 87, 106, 123, 137, 153, 173, 188, 199, 214, 223, 227, 234, 242, 246, 250, 254, 257, 260, 265, 269, 272, 276, 280, 283, 284, 285, 289, 291, 292, 293, 294, 295, 296, 296, 296,
297, 299, 299, 299, 299, 299, 299, 299, 299, 299, 299, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300]}, "status_delta": {"0": [0, -70, -176, -24, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], "1": [0, 67, 170, 6, -31, -29, -19, -17, -14, -16, -20, -15, -11, -15, -9, -4, -7, -8, -4, -4, -4, -3, -3, -5, -4, -3, -4, -4, -3, -1, -1, -4, -2, -1, -1, -1, -1, -1, 0, 0, -1, -2, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], "2": [0, 3, 6, 18, 31, 29, 19, 17, 14, 16, 20, 15, 11, 15, 9, 4, 7, 8, 4, 4, 4, 3, 3, 5, 4, 3, 4, 4, 3, 1, 1, 4, 2, 1, 1, 1, 1, 1, 0, 0, 1, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}}, "Statuses": {"0": "Susceptible", "1": "Infected", "2": "Removed"}}]
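###Markdown
The result file can then be post-processed with a few lines of Python. The sketch below only assumes the JSON structure documented above (the ``trends``/``node_count`` arrays and the ``Statuses`` map):
###Code
import json

with open("data/res.json") as f:
    results = json.load(f)

run = results[0]
statuses = run["Statuses"]

# final number of nodes per status at the end of the simulation
for status_id, counts in run["trends"]["node_count"].items():
    print(statuses[status_id], counts[-1])
###Output
_____no_output_____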
nbs/dl1/lesson1-pets_own-hyp50.ipynb
###Markdown
from pytorch_modelsize import SizeEstimator
se = SizeEstimator(model, input_size=(64,1,224,224))
print(se.estimate_size())
###Code
untar_path = untar_data(URLs.PETS)
img_path = untar_path/'images'
label_path = untar_path/'annotations'
img_fnames = get_image_files(img_path)
np.random.seed(2)
pattern = r'/([^/]+)_\d+.jpg$'

data = ImageDataBunch.from_name_re(img_path, img_fnames, pattern, ds_tfms = get_transforms(), size=size, bs=bs)
if normalize:
    data = data.normalize(imagenet_stats)

learn.destroy()  # assumes `learn` exists from a previous run of this cell
learn = None
gc.collect()

learn = cnn_learner(data, model, metrics=error_rate)
learn.lr_find()
learn.recorder.plot()
learn.unfreeze()
learn.lr_find()
learn.recorder.plot()
learn.summary()
learn.fit_one_cycle(4, max_lr = stage1_lr)
learn.unfreeze()
learn.fit_one_cycle(2, max_lr=stage2_lr)
metric = learn.recorder.metrics[-1][0]
learn.destroy()
###Output
_____no_output_____
###Markdown
Grid search
###Code
%reload_ext autoreload
%autoreload 2
%matplotlib inline

from fastai.vision import *
from fastai.metrics import accuracy
from fastai.metrics import error_rate
import pandas as pd

gc.collect()

# run constants
g_bs = [16, 32, 64]
g_size = [100, 150, 200, 224]
stage1_lr = 3e-3
stage2_lr = slice(1e-6, 1e-4)
normalize = True
model = models.resnet50

df = pd.DataFrame(columns=['bs', 'size', 'model', 'normalize', 'error_rate'])
#df = df.append({'bs': bs, 'size': size, 'model':model.__name__, 'normalize':normalize, 'error_rate':0.08}, ignore_index=True)
#df

for size in g_size:
    for bs in g_bs:
        print('model: {0}, {1}'.format(bs, size))
        untar_path = untar_data(URLs.PETS)
        img_path = untar_path/'images'
        label_path = untar_path/'annotations'
        img_fnames = get_image_files(img_path)
        np.random.seed(2)
        pattern = r'/([^/]+)_\d+.jpg$'

        data = ImageDataBunch.from_name_re(img_path, img_fnames, pattern, ds_tfms = get_transforms(), size=size, bs=bs)
        if normalize:
            data = data.normalize(imagenet_stats)

        learn = cnn_learner(data, model, metrics=error_rate)
        learn.fit_one_cycle(4, max_lr = stage1_lr)
        learn.unfreeze()
        learn.fit_one_cycle(2, max_lr=stage2_lr)
        metric = learn.recorder.metrics[-1][0]
        learn.destroy()
        df = df.append({'bs': bs, 'size': size, 'model':model.__name__, 'normalize':normalize, 'error_rate':metric}, ignore_index=True)

df.to_pickle('lect1-res50.pickle')
#learn.destroy()
###Output
_____no_output_____
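###Markdown
A quick way to inspect the grid-search results, assuming only the ``df`` dataframe pickled above, is to reload the pickle and rank the runs by error rate:
###Code
import pandas as pd

results = pd.read_pickle('lect1-res50.pickle')
# best (bs, size) combinations first
results.sort_values('error_rate').head(10)
###Output
_____no_output_____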
utils/loss-function-library-keras-pytorch.ipynb
###Markdown
Loss Function Reference for Keras & PyTorch
---
This kernel provides a reference library for some popular custom loss functions that you can easily import into your code.

Loss functions define how neural network models calculate the overall error from their residuals for each epoch. This in turn affects how they adjust their coefficients when performing backpropagation, so the choice of loss function has a direct influence on model performance.

The default choice of loss function for segmentation and other classification tasks is Binary Cross-Entropy (BCE). In situations where a particular metric, like the Dice Coefficient or Intersection over Union (IoU), is being used to judge model performance, competitors will sometimes experiment with loss functions that derive from these metrics, typically in the form `1 - f(x)` where `f(x)` is the metric in question.

These functions cannot simply be written in NumPy, as they are executed on the GPU and thus require backend functions from the respective model library that also include a gradient for the backpropagation algorithm. This is less complicated than it sounds! For example, in Keras you would simply use the same familiar mathematical functions, albeit through the Keras backend imported as `K`, e.g. `K.sum()`.

It is common in multi-class segmentation to use loss functions that calculate the average loss for each class, rather than calculating loss from the prediction tensor as a whole. This kernel is meant as a template reference for the basic code, but it should be trivial for you to modify it for multi-class averaging. For example, if the flattened tensors contain the classes sequentially, you can split them into four equal lengths, calculate their respective losses and average them.

I hope this kernel will be of use to you, and any corrections or suggestions are welcome.
###Code
import numpy
import torch
import torch.nn as nn
import torch.nn.functional as F

import keras
import keras.backend as K
###Output
_____no_output_____
###Markdown
Dice Loss
---
The Dice coefficient, or Sørensen–Dice coefficient, is a common metric for binary classification tasks such as pixel segmentation that can also be modified to act as a loss function:

![](https://wikimedia.org/api/rest_v1/media/math/render/svg/a80a97215e1afc0b222e604af1b2099dc9363d3b)
###Code
#PyTorch
class DiceLoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(DiceLoss, self).__init__()

    def forward(self, inputs, targets, smooth=1):

        #comment out if your model contains a sigmoid or equivalent activation layer
        inputs = F.sigmoid(inputs)

        #flatten label and prediction tensors
        inputs = inputs.view(-1)
        targets = targets.view(-1)

        intersection = (inputs * targets).sum()
        dice = (2.*intersection + smooth)/(inputs.sum() + targets.sum() + smooth)

        return 1 - dice

#Keras
def DiceLoss(targets, inputs, smooth=1e-6):

    #flatten label and prediction tensors
    inputs = K.flatten(inputs)
    targets = K.flatten(targets)

    #element-wise product then sum (K.dot is not defined for rank-1 tensors)
    intersection = K.sum(targets * inputs)
    dice = (2*intersection + smooth) / (K.sum(targets) + K.sum(inputs) + smooth)
    return 1 - dice
###Output
_____no_output_____
###Markdown
BCE-Dice Loss
---
This loss combines Dice loss with the standard binary cross-entropy (BCE) loss that is generally the default for segmentation models. Combining the two methods allows for some diversity in the loss, while benefitting from the stability of BCE.
The equation for multi-class BCE by itself will be familiar to anyone who has studied logistic regression:

![](https://wikimedia.org/api/rest_v1/media/math/render/svg/80f87a71d3a616a0939f5360cec24d702d2593a2)
###Code
#PyTorch
class DiceBCELoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(DiceBCELoss, self).__init__()

    def forward(self, inputs, targets, smooth=1):

        #comment out if your model contains a sigmoid or equivalent activation layer
        inputs = F.sigmoid(inputs)

        #flatten label and prediction tensors
        inputs = inputs.view(-1)
        targets = targets.view(-1)

        intersection = (inputs * targets).sum()
        dice_loss = 1 - (2.*intersection + smooth)/(inputs.sum() + targets.sum() + smooth)
        BCE = F.binary_cross_entropy(inputs, targets, reduction='mean')
        Dice_BCE = BCE + dice_loss

        return Dice_BCE

#Keras
def DiceBCELoss(targets, inputs, smooth=1e-6):

    #flatten label and prediction tensors
    inputs = K.flatten(inputs)
    targets = K.flatten(targets)

    #use the Keras backend BCE and reduce it to a scalar (the bare `binary_crossentropy` name was undefined)
    BCE = K.mean(K.binary_crossentropy(targets, inputs))
    intersection = K.sum(targets * inputs)
    dice_loss = 1 - (2*intersection + smooth) / (K.sum(targets) + K.sum(inputs) + smooth)
    Dice_BCE = BCE + dice_loss

    return Dice_BCE
###Output
_____no_output_____
###Markdown
Jaccard/Intersection over Union (IoU) Loss
---
The IoU metric, or Jaccard Index, is similar to the Dice metric and is calculated as the ratio between the overlap of the positive instances between two sets, and their mutual combined values:

![](https://wikimedia.org/api/rest_v1/media/math/render/svg/eaef5aa86949f49e7dc6b9c8c3dd8b233332c9e7)

Like the Dice metric, it is a common means of evaluating the performance of pixel segmentation models.
###Code
#PyTorch
class IoULoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(IoULoss, self).__init__()

    def forward(self, inputs, targets, smooth=1):

        #comment out if your model contains a sigmoid or equivalent activation layer
        inputs = F.sigmoid(inputs)

        #flatten label and prediction tensors
        inputs = inputs.view(-1)
        targets = targets.view(-1)

        #intersection is equivalent to True Positive count
        #union is the mutually inclusive area of all labels & predictions
        intersection = (inputs * targets).sum()
        total = (inputs + targets).sum()
        union = total - intersection

        IoU = (intersection + smooth)/(union + smooth)

        return 1 - IoU

#Keras
def IoULoss(targets, inputs, smooth=1e-6):

    #flatten label and prediction tensors
    inputs = K.flatten(inputs)
    targets = K.flatten(targets)

    intersection = K.sum(targets * inputs)
    total = K.sum(targets) + K.sum(inputs)
    union = total - intersection

    IoU = (intersection + smooth) / (union + smooth)
    return 1 - IoU
###Output
_____no_output_____
###Markdown
Focal Loss
---
Focal Loss was introduced by *Lin et al* of Facebook AI Research in 2017 as a means of combatting extremely imbalanced datasets where positive cases were relatively rare. Their paper "Focal Loss for Dense Object Detection" is retrievable here: https://arxiv.org/abs/1708.02002. In practice, the researchers used an alpha-modified version of the function, so I have included it in this implementation.
###Code #PyTorch ALPHA = 0.8 GAMMA = 2 class FocalLoss(nn.Module): def __init__(self, weight=None, size_average=True): super(FocalLoss, self).__init__() def forward(self, inputs, targets, alpha=ALPHA, gamma=GAMMA, smooth=1): #comment out if your model contains a sigmoid or equivalent activation layer inputs = F.sigmoid(inputs) #flatten label and prediction tensors inputs = inputs.view(-1) targets = targets.view(-1) #first compute binary cross-entropy BCE = F.binary_cross_entropy(inputs, targets, reduction='mean') BCE_EXP = torch.exp(-BCE) focal_loss = alpha * (1-BCE_EXP)**gamma * BCE return focal_loss #Keras ALPHA = 0.8 GAMMA = 2 def FocalLoss(targets, inputs, alpha=ALPHA, gamma=GAMMA): inputs = K.flatten(inputs) targets = K.flatten(targets) BCE = K.binary_crossentropy(targets, inputs) BCE_EXP = K.exp(-BCE) focal_loss = K.mean(alpha * K.pow((1-BCE_EXP), gamma) * BCE) return focal_loss ###Output _____no_output_____ ###Markdown Tversky Loss---This loss was introduced in "Tversky loss function for image segmentationusing 3D fully convolutional deep networks", retrievable here: https://arxiv.org/abs/1706.05721. It was designed to optimise segmentation on imbalanced medical datasets by utilising constants that can adjust how harshly different types of error are penalised in the loss function. From the paper:>... in the case of α=β=0.5 the Tversky index simplifies to be the same as the Dice coefficient, which is also equal to the F1 score. With α=β=1, Equation 2 produces Tanimoto coefficient, and setting α+β=1 produces the set of Fβ scores. Larger βs weigh recall higher than precision (by placing more emphasis on false negatives).To summarise, this loss function is weighted by the constants 'alpha' and 'beta' that penalise false positives and false negatives respectively to a higher degree in the loss function as their value is increased. The beta constant in particular has applications in situations where models can obtain misleadingly positive performance via highly conservative prediction. You may want to experiment with different values to find the optimum. With alpha==beta==0.5, this loss becomes equivalent to Dice Loss. ###Code #PyTorch ALPHA = 0.5 BETA = 0.5 class TverskyLoss(nn.Module): def __init__(self, weight=None, size_average=True): super(TverskyLoss, self).__init__() def forward(self, inputs, targets, smooth=1, alpha=ALPHA, beta=BETA): #comment out if your model contains a sigmoid or equivalent activation layer inputs = F.sigmoid(inputs) #flatten label and prediction tensors inputs = inputs.view(-1) targets = targets.view(-1) #True Positives, False Positives & False Negatives TP = (inputs * targets).sum() FP = ((1-targets) * inputs).sum() FN = (targets * (1-inputs)).sum() Tversky = (TP + smooth) / (TP + alpha*FP + beta*FN + smooth) return 1 - Tversky #Keras ALPHA = 0.5 BETA = 0.5 def TverskyLoss(targets, inputs, alpha=ALPHA, beta=BETA, smooth=1e-6): #flatten label and prediction tensors inputs = K.flatten(inputs) targets = K.flatten(targets) #True Positives, False Positives & False Negatives TP = K.sum((inputs * targets)) FP = K.sum(((1-targets) * inputs)) FN = K.sum((targets * (1-inputs))) Tversky = (TP + smooth) / (TP + alpha*FP + beta*FN + smooth) return 1 - Tversky ###Output _____no_output_____ ###Markdown Focal Tversky Loss---A variant on the Tversky loss that also includes the gamma modifier from Focal Loss. 
###Code
#PyTorch
ALPHA = 0.5
BETA = 0.5
GAMMA = 1

class FocalTverskyLoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(FocalTverskyLoss, self).__init__()

    def forward(self, inputs, targets, smooth=1, alpha=ALPHA, beta=BETA, gamma=GAMMA):

        #comment out if your model contains a sigmoid or equivalent activation layer
        inputs = F.sigmoid(inputs)

        #flatten label and prediction tensors
        inputs = inputs.view(-1)
        targets = targets.view(-1)

        #True Positives, False Positives & False Negatives
        TP = (inputs * targets).sum()
        FP = ((1-targets) * inputs).sum()
        FN = (targets * (1-inputs)).sum()

        Tversky = (TP + smooth) / (TP + alpha*FP + beta*FN + smooth)
        FocalTversky = (1 - Tversky)**gamma

        return FocalTversky

#Keras
ALPHA = 0.5
BETA = 0.5
GAMMA = 1

def FocalTverskyLoss(targets, inputs, alpha=ALPHA, beta=BETA, gamma=GAMMA, smooth=1e-6):

    #flatten label and prediction tensors
    inputs = K.flatten(inputs)
    targets = K.flatten(targets)

    #True Positives, False Positives & False Negatives
    TP = K.sum((inputs * targets))
    FP = K.sum(((1-targets) * inputs))
    FN = K.sum((targets * (1-inputs)))

    Tversky = (TP + smooth) / (TP + alpha*FP + beta*FN + smooth)
    FocalTversky = K.pow((1 - Tversky), gamma)

    return FocalTversky
###Output
_____no_output_____
###Markdown
Lovasz Hinge Loss
---
This complex loss function was introduced by Berman, Triki and Blaschko in their paper "The Lovasz-Softmax loss: A tractable surrogate for the optimization of the intersection-over-union measure in neural networks", retrievable here: https://arxiv.org/abs/1705.08790. It is designed to optimise the Intersection over Union score for semantic segmentation, particularly for multi-class instances. Specifically, it sorts predictions by their error before calculating cumulatively how each error affects the IoU score. This gradient vector is then multiplied with the initial error vector to penalise most strongly the predictions that decreased the IoU score the most. This procedure is detailed by [jeandebleu](https://www.kaggle.com/jeandebleau) in his excellent summary [here](https://www.kaggle.com/c/tgs-salt-identification-challenge/discussion/67791).

This code is taken directly from the author's github repo here: https://github.com/bermanmaxim/LovaszSoftmax and all credit is to them.

In this kernel I have implemented the flat variant that uses reshaped rank-1 tensors as inputs for PyTorch. You can modify it accordingly with the dimensions and class number of your data as needed. This code takes raw logits, so ensure your model does not contain an activation layer prior to the loss calculation.

I have hidden the researchers' own code below for brevity; simply load it into your kernel for the losses to function. In the case of their tensorflow implementation, I am still working to make it compatible with Keras. There are differences between the Tensorflow and Keras function libraries that complicate this.
###Code
#PyTorch
#NB: the original repo assumes the import and the small `mean`/`isnan` helpers below;
#they are reproduced here (from the authors' repo) so that the snippet runs stand-alone.
from itertools import filterfalse
from torch.autograd import Variable

def isnan(x):
    return x != x

def mean(l, ignore_nan=False, empty=0):
    """
    nanmean compatible with generators (helper from the authors' repo)
    """
    l = iter(l)
    if ignore_nan:
        l = filterfalse(isnan, l)
    try:
        n = 1
        acc = next(l)
    except StopIteration:
        if empty == 'raise':
            raise ValueError('Empty mean')
        return empty
    for n, v in enumerate(l, 2):
        acc += v
    if n == 1:
        return acc
    return acc / n

def flatten_binary_scores(scores, labels, ignore=None):
    """
    Flattens predictions in the batch (binary case)
    Remove labels equal to 'ignore'
    """
    scores = scores.view(-1)
    labels = labels.view(-1)
    if ignore is None:
        return scores, labels
    valid = (labels != ignore)
    vscores = scores[valid]
    vlabels = labels[valid]
    return vscores, vlabels

def lovasz_grad(gt_sorted):
    """
    Computes gradient of the Lovasz extension w.r.t sorted errors
    See Alg. 1 in paper
    """
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.float().cumsum(0)
    union = gts + (1 - gt_sorted).float().cumsum(0)
    jaccard = 1.
- intersection / union if p > 1: # cover 1-pixel case jaccard[1:p] = jaccard[1:p] - jaccard[0:-1] return jaccard def lovasz_hinge(logits, labels, per_image=True, ignore=None): """ Binary Lovasz hinge loss logits: [B, H, W] Variable, logits at each pixel (between -\infty and +\infty) labels: [B, H, W] Tensor, binary ground truth masks (0 or 1) per_image: compute the loss per image instead of per batch ignore: void class id """ if per_image: loss = mean(lovasz_hinge_flat(*flatten_binary_scores(log.unsqueeze(0), lab.unsqueeze(0), ignore)) for log, lab in zip(logits, labels)) else: loss = lovasz_hinge_flat(*flatten_binary_scores(logits, labels, ignore)) return loss def lovasz_hinge_flat(logits, labels): """ Binary Lovasz hinge loss logits: [P] Variable, logits at each prediction (between -\infty and +\infty) labels: [P] Tensor, binary ground truth labels (0 or 1) ignore: label to ignore """ if len(labels) == 0: # only void pixels, the gradients should be 0 return logits.sum() * 0. signs = 2. * labels.float() - 1. errors = (1. - logits * Variable(signs)) errors_sorted, perm = torch.sort(errors, dim=0, descending=True) perm = perm.data gt_sorted = labels[perm] grad = lovasz_grad(gt_sorted) loss = torch.dot(F.relu(errors_sorted), Variable(grad)) return loss #===== #Multi-class Lovasz loss #===== def lovasz_softmax(probas, labels, classes='present', per_image=False, ignore=None): """ Multi-class Lovasz-Softmax loss probas: [B, C, H, W] Variable, class probabilities at each prediction (between 0 and 1). Interpreted as binary (sigmoid) output with outputs of size [B, H, W]. labels: [B, H, W] Tensor, ground truth labels (between 0 and C - 1) classes: 'all' for all, 'present' for classes present in labels, or a list of classes to average. per_image: compute the loss per image instead of per batch ignore: void class labels """ if per_image: loss = mean(lovasz_softmax_flat(*flatten_probas(prob.unsqueeze(0), lab.unsqueeze(0), ignore), classes=classes) for prob, lab in zip(probas, labels)) else: loss = lovasz_softmax_flat(*flatten_probas(probas, labels, ignore), classes=classes) return loss def lovasz_softmax_flat(probas, labels, classes='present'): """ Multi-class Lovasz-Softmax loss probas: [P, C] Variable, class probabilities at each prediction (between 0 and 1) labels: [P] Tensor, ground truth labels (between 0 and C - 1) classes: 'all' for all, 'present' for classes present in labels, or a list of classes to average. """ if probas.numel() == 0: # only void pixels, the gradients should be 0 return probas * 0. 
    C = probas.size(1)
    losses = []
    class_to_sum = list(range(C)) if classes in ['all', 'present'] else classes
    for c in class_to_sum:
        fg = (labels == c).float() # foreground for class c
        if (classes == 'present' and fg.sum() == 0):  # string comparison with ==, not identity (`is`)
            continue
        if C == 1:
            if len(classes) > 1:
                raise ValueError('Sigmoid output possible only with 1 class')
            class_pred = probas[:, 0]
        else:
            class_pred = probas[:, c]
        errors = (Variable(fg) - class_pred).abs()
        errors_sorted, perm = torch.sort(errors, 0, descending=True)
        perm = perm.data
        fg_sorted = fg[perm]
        losses.append(torch.dot(errors_sorted, Variable(lovasz_grad(fg_sorted))))
    return mean(losses)

#PyTorch
class LovaszHingeLoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(LovaszHingeLoss, self).__init__()

    def forward(self, inputs, targets):
        inputs = F.sigmoid(inputs)
        Lovasz = lovasz_hinge(inputs, targets, per_image=False)
        return Lovasz

import tensorflow as tf
import numpy as np

def lovasz_grad(gt_sorted):
    """
    Computes gradient of the Lovasz extension w.r.t sorted errors
    See Alg. 1 in paper
    """
    gts = tf.reduce_sum(gt_sorted)
    intersection = gts - tf.cumsum(gt_sorted)
    union = gts + tf.cumsum(1. - gt_sorted)
    jaccard = 1. - intersection / union
    jaccard = tf.concat((jaccard[0:1], jaccard[1:] - jaccard[:-1]), 0)
    return jaccard

# --------------------------- BINARY LOSSES ---------------------------

def lovasz_hinge(logits, labels, per_image=True, ignore=None):
    """
    Binary Lovasz hinge loss
      logits: [B, H, W] Variable, logits at each pixel (between -\infty and +\infty)
      labels: [B, H, W] Tensor, binary ground truth masks (0 or 1)
      per_image: compute the loss per image instead of per batch
      ignore: void class id
    """
    if per_image:
        def treat_image(log_lab):
            log, lab = log_lab
            log, lab = tf.expand_dims(log, 0), tf.expand_dims(lab, 0)
            log, lab = flatten_binary_scores(log, lab, ignore)
            return lovasz_hinge_flat(log, lab)
        losses = tf.map_fn(treat_image, (logits, labels), dtype=tf.float32)
        loss = tf.reduce_mean(losses)
    else:
        loss = lovasz_hinge_flat(*flatten_binary_scores(logits, labels, ignore))
    return loss

def lovasz_hinge_flat(logits, labels):
    """
    Binary Lovasz hinge loss
      logits: [P] Variable, logits at each prediction (between -\infty and +\infty)
      labels: [P] Tensor, binary ground truth labels (0 or 1)
      ignore: label to ignore
    """
    def compute_loss():
        labelsf = tf.cast(labels, logits.dtype)
        signs = 2. * labelsf - 1.
        errors = 1. - logits * tf.stop_gradient(signs)
        errors_sorted, perm = tf.nn.top_k(errors, k=tf.shape(errors)[0], name="descending_sort")
        gt_sorted = tf.gather(labelsf, perm)
        grad = lovasz_grad(gt_sorted)
        loss = tf.tensordot(tf.nn.relu(errors_sorted), grad, 1, name="loss_non_void")
        return loss

    # deal with the void prediction case (only void pixels)
    loss = tf.cond(tf.equal(tf.shape(logits)[0], 0),
                   lambda: tf.reduce_sum(logits) * 0.,
                   compute_loss,
                   strict=True,
                   name="loss"
                   )
    return loss

def flatten_binary_scores(scores, labels, ignore=None):
    """
    Flattens predictions in the batch (binary case)
    Remove labels equal to 'ignore'
    """
    scores = tf.reshape(scores, (-1,))
    labels = tf.reshape(labels, (-1,))
    if ignore is None:
        return scores, labels
    valid = tf.not_equal(labels, ignore)
    vscores = tf.boolean_mask(scores, valid, name='valid_scores')
    vlabels = tf.boolean_mask(labels, valid, name='valid_labels')
    return vscores, vlabels

#Keras
# not working yet
# def LovaszHingeLoss(inputs, targets):
#    return lovasz_hinge_loss(inputs, targets)
###Output
_____no_output_____
###Markdown
Combo Loss
---
This loss was introduced by Taghanaki et al in their paper "Combo loss: Handling input and output imbalance in multi-organ segmentation", retrievable here: https://arxiv.org/abs/1805.02798. Combo loss is a combination of Dice Loss and a modified Cross-Entropy function that, like Tversky loss, has additional constants which penalise either false positives or false negatives more, respectively.

Since my GPU quota has run out this week, as of V16 these functions have not been tested, so please leave any debugging notes in the comments section below.
###Code
#PyTorch
ALPHA = 0.5 # < 0.5 penalises FP more, > 0.5 penalises FN more
CE_RATIO = 0.5 #weighted contribution of modified CE loss compared to Dice loss
eps = 1e-7 #clipping constant for log stability (assumed value; `e` was undefined in the original)

class ComboLoss(nn.Module):
    def __init__(self, weight=None, size_average=True):
        super(ComboLoss, self).__init__()

    def forward(self, inputs, targets, smooth=1, alpha=ALPHA, beta=BETA):

        #flatten label and prediction tensors
        inputs = inputs.view(-1)
        targets = targets.view(-1)

        #True Positives, False Positives & False Negatives
        intersection = (inputs * targets).sum()
        dice = (2. * intersection + smooth) / (inputs.sum() + targets.sum() + smooth)

        inputs = torch.clamp(inputs, eps, 1.0 - eps)
        #modified CE: alpha weights the positive-class term, (1 - alpha) the negative-class term
        out = - (alpha * (targets * torch.log(inputs)) + (1 - alpha) * ((1.0 - targets) * torch.log(1.0 - inputs)))
        weighted_ce = out.mean(-1)
        combo = (CE_RATIO * weighted_ce) - ((1 - CE_RATIO) * dice)

        return combo

#Keras
ALPHA = 0.5 # < 0.5 penalises FP more, > 0.5 penalises FN more
CE_RATIO = 0.5 #weighted contribution of modified CE loss compared to Dice loss
smooth = 1e-6 #assumed value; undefined in the original
eps = 1e-7 #assumed value; undefined in the original

def Combo_loss(targets, inputs):
    targets = K.flatten(targets)
    inputs = K.flatten(inputs)

    intersection = K.sum(targets * inputs)
    dice = (2. * intersection + smooth) / (K.sum(targets) + K.sum(inputs) + smooth)
    inputs = K.clip(inputs, eps, 1.0 - eps)
    #modified CE with the same alpha weighting as the PyTorch version
    out = - (ALPHA * (targets * K.log(inputs)) + (1 - ALPHA) * ((1.0 - targets) * K.log(1.0 - inputs)))
    weighted_ce = K.mean(out, axis=-1)
    combo = (CE_RATIO * weighted_ce) - ((1 - CE_RATIO) * dice)

    return combo
###Output
_____no_output_____
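###Markdown
As a quick sanity check (an addition, not part of the original kernel), here is a minimal smoke test on random tensors confirming that the PyTorch classes defined above run end to end and return scalar values:
###Code
torch.manual_seed(0)
logits = torch.randn(4, 1, 8, 8)                    # raw model outputs (pre-sigmoid)
masks = torch.randint(0, 2, (4, 1, 8, 8)).float()  # binary ground-truth masks

# exercise each loss once and print its value
for loss_fn in [DiceLoss(), DiceBCELoss(), IoULoss(), FocalLoss(), TverskyLoss(), FocalTverskyLoss()]:
    value = loss_fn(logits, masks)
    print(loss_fn.__class__.__name__, float(value))
###Output
_____no_output_____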
Accommodation/accommodation_5.ipynb
###Markdown ###Code pip install spacy pip install blackstone pip install https://blackstone-model.s3-eu-west-1.amazonaws.com/en_blackstone_proto-0.0.1.tar.gz import spacy # Load the model import en_blackstone_proto nlp = en_blackstone_proto.load() text = """ 31 As we shall explain in more detail in examining the submission of the Secretary of State (see paras 77 and following), it is the Secretary of State’s case that nothing has been done by Parliament in the European Communities Act 1972 or any other statute to remove the prerogative power of the Crown, in the conduct of the international relations of the UK, to take steps to remove the UK from the EU by giving notice under article 50EU for the UK to withdraw from the EU Treaty and other relevant EU Treaties. The Secretary of State relies in particular on Attorney General v De Keyser’s Royal Hotel Ltd [1920] AC 508 and R v Secretary of State for Foreign and Commonwealth Affairs, Ex p Rees-Mogg [1994] QB 552; he contends that the Crown’s prerogative power to cause the UK to withdraw from the EU by giving notice under article 50EU could only have been removed by primary legislation using express words to that effect, alternatively by legislation which has that effect by necessary implication. The Secretary of State contends that neither the ECA 1972 nor any of the other Acts of Parliament referred to have abrogated this aspect of the Crown’s prerogative, either by express words or by necessary implication. """ # Apply the model to the text doc = nlp(text) # Iterate through the entities identified by the model for ent in doc.ents: print(ent.text, ent.label_) """ Visualise entities using spaCy's displacy visualiser. Blackstone has a custom colour palette: `from blackstone.displacy_palette import ner_displacy options` """ import spacy from spacy import displacy from blackstone.displacy_palette import ner_displacy_options import en_blackstone_proto nlp = en_blackstone_proto.load() text = """ The applicant must satisfy a high standard. This is a case where the action is to be tried by a judge with a jury. The standard is set out in Jameel v Wall Street Journal Europe Sprl [2004] EMLR 89, para 14: “But every time a meaning is shut out (including any holding that the words complained of either are, or are not, capable of bearing a defamatory meaning) it must be remembered that the judge is taking it upon himself to rule in effect that any jury would be perverse to take a different view on the question. It is a high threshold of exclusion. Ever since Fox’s Act 1792 (32 Geo 3, c 60) the meaning of words in civil as well as criminal libel proceedings has been constitutionally a matter for the jury. The judge’s function is no more and no less than to pre-empt perversity. That being clearly the position with regard to whether or not words are capable of being understood as defamatory or, as the case may be, non-defamatory, I see no basis on which it could sensibly be otherwise with regard to differing levels of defamatory meaning. Often the question whether words are defamatory at all and, if so, what level of defamatory meaning they bear will overlap.” 18 In Berezovsky v Forbes Inc [2001] EMLR 1030, para 16 Sedley LJ had stated the test this way: “The real question in the present case is how the courts ought to go about ascertaining the range of legitimate meanings. Eady J regarded it as a matter of impression. 
That is all right, it seems to us, provided that the impression is not of what the words mean but of what a jury could sensibly think they meant. Such an exercise is an exercise in generosity, not in parsimony.” """ doc = nlp(text) # Call displacy and pass `ner_displacy_options` into the option parameter` displacy.serve(doc, style="ent", options=ner_displacy_options) import spacy import en_blackstone_proto # Load the model nlp = en_blackstone_proto.load() def get_top_cat(doc): """ Function to identify the highest scoring category prediction generated by the text categoriser. """ cats = doc.cats max_score = max(cats.values()) max_cats = [k for k, v in cats.items() if v == max_score] max_cat = max_cats[0] return (max_cat, max_score) text = """ It is a well-established principle of law that the transactions of independent states between each other are governed by other laws than those which municipal courts administer. \ It is, however, in my judgment, insufficient to react to the danger of over-formalisation and “judicialisation” simply by emphasising flexibility and context-sensitivity. \ The question is whether on the facts found by the judge, the (or a) proximate cause of the loss of the rig was “inherent vice or nature of the subject matter insured” within the meaning of clause 4.4 of the Institute Cargo Clauses (A). """ # Apply the model to the text doc = nlp(text) # Get the sentences in the passage of text sentences = [sent.text for sent in doc.sents] # Print the sentence and the corresponding predicted category. for sentence in sentences: doc = nlp(sentence) top_category = get_top_cat(doc) print (f"\"{sentence}\" {top_category}\n") import spacy from blackstone.pipeline.abbreviations import AbbreviationDetector nlp = en_blackstone_proto.load() # Add the abbreviation pipe to the spacy pipeline. abbreviation_pipe = AbbreviationDetector(nlp) nlp.add_pipe(abbreviation_pipe) doc = nlp('The European Court of Human Rights ("ECtHR") is the court ultimately responsible for applying the European Convention on Human Rights ("ECHR").') print("Abbreviation", "\t", "Definition") for abrv in doc._.abbreviations: print(f"{abrv} \t ({abrv.start}, {abrv.end}) {abrv._.long_form}") import spacy from blackstone.utils.legislation_linker import extract_legislation_relations nlp = en_blackstone_proto.load() text = "The Secretary of State was at pains to emphasise that, if a withdrawal agreement is made, it is very likely to be a treaty requiring ratification and as such would have to be submitted for review by Parliament, acting separately, under the negative resolution procedure set out in section 20 of the Constitutional Reform and Governance Act 2010. Theft is defined in section 1 of the Theft Act 1968" doc = nlp(text) relations = extract_legislation_relations(doc) for provision, provision_url, instrument, instrument_url in relations: print(f"\n{provision}\t{provision_url}\t{instrument}\t{instrument_url}") import spacy from blackstone.pipeline.sentence_segmenter import SentenceSegmenter from blackstone.rules import CITATION_PATTERNS nlp = en_blackstone_proto.load() # add the Blackstone sentence_segmenter to the pipeline before the parser sentence_segmenter = SentenceSegmenter(nlp.vocab, CITATION_PATTERNS) nlp.add_pipe(sentence_segmenter, before="parser") doc = nlp( """ The courts in this jurisdiction will enforce those commitments when it is legally possible and necessary to do so (see, most recently, R. 
(on the application of ClientEarth) v Secretary of State for the Environment, Food and Rural Affairs (No.2) [2017] P.T.S.R. 203 and R. (on the application of ClientEarth) v Secretary of State for Environment, Food and Rural Affairs (No.3) [2018] Env. L.R. 21). The central question in this case arises against that background. """ ) for sent in doc.sents: print(sent.text) from google.colab import files uploaded = files.upload() pip install conll-df import pandas as pd from conll_df import conll_df # conll_df expects the path to a CoNLL-U file on disk; 'parsed_output.conllu' is a # placeholder for a file produced by an earlier parsing step df = conll_df('parsed_output.conllu', file_index=False) df.head(40).to_html() ###Output _____no_output_____
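###Markdown As a small follow-on sketch, the entities Blackstone extracts can be collected into a pandas DataFrame for filtering and counting; this assumes the en_blackstone_proto model loaded above, and the example sentence reuses citations that appear earlier in this notebook. ###Code
import pandas as pd
import en_blackstone_proto

# Collect the entities the model finds into a DataFrame so they can be
# filtered, grouped, and counted like any other tabular data.
nlp = en_blackstone_proto.load()
doc = nlp("The principle in Attorney General v De Keyser's Royal Hotel Ltd [1920] AC 508 "
          "was considered alongside section 20 of the Constitutional Reform and Governance Act 2010.")
rows = [{"text": ent.text, "label": ent.label_,
         "start": ent.start_char, "end": ent.end_char}
        for ent in doc.ents]
df_ents = pd.DataFrame(rows)
print(df_ents)
print(df_ents["label"].value_counts())
###Output _____no_output_____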
.ipynb_checkpoints/hw4-checkpoint.ipynb
###Markdown Scraping and Analyzing the SF ChronicleIn Python, scrape sfchronicle.com for article information, like title, author, and date using html requests. Analyze website sections for most used words using nltk for natural-language processing and matplotlib for visualizations. Important for linguists to understand language in journalism. The San Francisco ChronicleIn this assignment, you'll scrape text from [The San Francisco Chronicle](https://www.sfchronicle.com/) newspaper and then analyze the text.The Chronicle is organized by category into article lists. For example, there's a [Local](https://www.sfchronicle.com/local/) list, [Sports](https://www.sfchronicle.com/sports/) list, and [Food](https://www.sfchronicle.com/food/) list.The goal of exercises 1.1 - 1.3 is to scrape articles from the Chronicle for analysis in exercise 1.4. __Exercise 1.1.__ Write a function that extracts all of the links to articles in a Chronicle article list. The function should:* Have a parameter `url` for the URL of the article list.* Return a list of article URLs (each URL should be a string).Test your function on 2-3 different categories to make sure it works.Hints:* Be polite and save time by setting up [requests_cache](https://pypi.python.org/pypi/requests-cache) before you write your function.* You can use any of the XML/HTML parsing packages mentioned in class. Choose one and use it throughout the entire assignment. ###Code import numpy as np import pandas as pd import requests import requests_cache import lxml.html as lx import matplotlib.pyplot as plt import nltk import nltk.corpus # import time requests_cache.install_cache("mycache") def read_html(url): response = requests.get(url) response.raise_for_status() return lx.fromstring(response.text) def get_article_list(url): """takes in url of an SF Chronicle article list returns list of links to the articles as strings""" if url[-1] == "/": url = url.strip("/") html = read_html(url) link_list = html.xpath("//h3[@class = 'headline display-above']/a | //h2[contains(@class, 'headline')]/a | //div[contains(@class, 'item rel-links')]/h4/a") new_list = [] for link in link_list: l = link.attrib["href"] if l[0] == "/": new_list.append(url + l) else: new_list.append(l) return new_list # test function with local articles get_article_list("https://www.sfchronicle.com/local/") # test function with politics articles get_article_list("https://www.sfchronicle.com/elections/") ###Output _____no_output_____ ###Markdown __Exercise 1.2.__ Write a function that extracts data from a Chronicle article. The function should:* Have a parameter `url` for the URL of the article.* Return a dictionary with keys for: + `url`: The URL of the article. + `title`: The title of the article. + `text`: The complete text of the article. + `author`: The author's name (if available) or a suitable missing value. + `date`: The date and time the article was published. 
+ `date_updated`: The date and time the article was last updated (if available) or a suitable missing value.For example, for [this article](https://www.sfchronicle.com/homeandgarden/article/Gardenlust-looks-at-best-21st-century-13580871.php) your function should return a dictionary with the form:```js{'url': 'https://www.sfchronicle.com/homeandgarden/article/Gardenlust-looks-at-best-21st-century-13580871.php', 'title': '‘Gardenlust’ looks at best 21st century gardens in the world', 'text': 'The book...', 'author': 'Pam Peirce', 'date': '2019-02-01T18:02:33+00:00', 'date_updated': '2019-02-01T18:12:53+00:00'}```The value of the `text` field is omitted here to save space. Your function should return the full text in the `text` field.Hints:* Many parsing packages allow you to delete elements from an HTML document. Deleting elements is one way to avoid extracting unwanted tags.* You can union multiple XPath paths with `|`. ###Code def extract_article_info(url): """takes in a url of an article and returns article info in a dictionary""" info = {} html = read_html(url) info["url"] = html.xpath("//meta[@property = 'og:url']/@content")[0] info["title"] = html.xpath("//meta[@property = 'og:title']/@content")[0] text = html.xpath("//section[contains(@class, 'body')]//p | //div[@class = 'text-block']//p") text_list = [] for p in text: text_list.append(p.text_content().strip(" \r\n")) info["text"] = " ".join(text_list) # unique list separated with " & " in case there are multiple authors: author = list(set(html.xpath("//meta[@name = 'author.name']/@content | //meta[@name = 'sailthru.author']/@content"))) if author == [] or author[0] == "": author = html.xpath("//div[@class = 'author']//a | //div[contains(@class, 'author')]//span") author_list = [] for a in author: author_list.append(a.text_content().replace("By ", "").strip(" \n")) info["author"] = " & ".join(author_list) else: info["author"] = " & ".join(author) try: info["date"] = html.xpath("//meta[@name = 'og:datePublished']/@content | //meta[@name = 'sailthru.date']/@content | //time[@itemprop = 'datePublished']/@datetime")[0] except: try: info["date"] = html.xpath("//div[@class = 'date']")[0].text_content() except: info["date"] = "" try: info["date_updated"] = html.xpath("//time[@itemprop = 'dateModified']/@datetime")[0] except: try: info["date_updated"] = html.xpath("//div[contains(@class, 'byline')]/span[@class = 'dateline']")[1].text_content().replace("Updated: ", "") except: info["date_updated"] = "" return info # test with article extract_article_info("https://datebook.sfchronicle.com/movies-tv/the-hidden-world-a-touching-finish-to-the-how-to-train-your-dragon-series") # test with another article extract_article_info("https://www.sfchronicle.com/politics/article/Feinstein-goes-toe-to-toe-with-Green-New-Deal-13638416.php") # another article extract_article_info("https://www.sfchronicle.com/bayarea/article/Public-Defender-Jeff-Adachi-dies-13638785.php") # one more article extract_article_info("https://projects.sfchronicle.com/2018/stem-cells/politics/") ###Output _____no_output_____ ###Markdown __Exercise 1.3.__ Use your functions from exercises 1.1 and 1.2 to get data frames of articles for the "Biz+Tech" category as well as two other categories of your choosing (except for "Vault: Archive", "Podcasts", and "In Depth").Add a column to each that indicates the category, then combine them into one big data frame. 
Clean up the data, stripping excess whitespace and converting columns to appropriate dtypes. The `text` column of this data frame will be your corpus for natural language processing in exercise 1.4. ###Code def create_dataframe(url): """given a url to a list of articles returns a dataframe with information about all those articles""" links = get_article_list(url) dicts = [extract_article_info(link) for link in links] # https://stackoverflow.com/questions/20638006/convert-list-of-dictionaries-to-a-pandas-dataframe return pd.DataFrame(dicts) biz_tech = create_dataframe("https://www.sfchronicle.com/business/") biz_tech local = create_dataframe("https://www.sfchronicle.com/local/") local politics = create_dataframe("https://www.sfchronicle.com/elections/") politics # combine three dataframes into one big dataframe # http://pandas.pydata.org/pandas-docs/stable/user_guide/merging.html frames = [biz_tech, local, politics] df = pd.concat(frames) ###Output _____no_output_____ ###Markdown __Exercise 1.4.__ What topics has the Chronicle covered recently? How does the category affect the topics? Support your analysis with visualizations.Hints:* The [nltk book](http://www.nltk.org/book/) may be helpful here.* This question will be easier to do after we've finished NLP in class. ###Code # nltk.download("stopwords") # nltk.download('punkt') def process_corpus(list_of_texts): """given a list of texts, get a list of stemmed words""" corpus = " ".join(list_of_texts) corpus = corpus.lower() corpus = corpus.replace("san francisco", "sanfrancisco") # drop "said" and "would" before tokenizing, so the removal actually takes effect corpus = corpus.replace("said", "") corpus = corpus.replace("would", "") words = nltk.word_tokenize(corpus) stopwords = nltk.corpus.stopwords.words("english") words2 = [w for w in words if w not in stopwords] stemmer = nltk.PorterStemmer() words3 = [stemmer.stem(w) for w in words2] return words3 corpus_all = process_corpus(df["text"]) def plot_fq(processed): """given a processed corpus plot a frequency distribution of the top 25 words""" fq = nltk.FreqDist(w for w in processed if w.isalnum()) %matplotlib inline fq.plot(25) return plot_fq(corpus_all) ###Output _____no_output_____ ###Markdown I removed the words "said" and "would" because they are very common but don't add much to understanding the content of the articles. What we can infer from "said" and "would" is that many articles report on what people said, which we can expect from a news organization, and that many articles are making promises, or are reporting on people making promises. "Would" means that some action will happen in the future, so these articles must talk a lot about the future.Out of the three categories (business/tech, local, and politics), the top topics in the news recently are San Francisco, California, Trump, and Democrats. These topics are entirely predictable, since the newspaper is located in San Francisco, California. Trump is always in the news, and local politicians will always talk about Trump. The politics section will undoubtedly mention him very often, as well as Democrats. Interestingly, the articles mention Democrats much more often than Republicans, but that is somewhat expected since California is predominantly Democratic. But this also might hint at the political leaning of the news site. There don't seem to be any distinctly frequent words for the business and tech section.Some other interesting and frequent words are "newsom" and "cell." The first is in reference to Gavin Newsom, who is the new governor of California. 
It is no surprise that the articles mention him frequently. The articles talk about "cell" often, because there are a bunch of very long articles in the local section that talk about stem cells. However, let's go into how the sections differ. ###Code # create a corpus for the different sections corpus_biz_tech = process_corpus(biz_tech["text"]) corpus_local = process_corpus(local["text"]) corpus_politics = process_corpus(politics["text"]) plot_fq(corpus_biz_tech) plot_fq(corpus_local) plot_fq(corpus_politics) ###Output _____no_output_____
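###Markdown To back up the claim about sections lacking differentiating words, a rough sketch like the one below compares each word's relative frequency in one section's corpus against the others; the helper function and its threshold of 5 occurrences are ad hoc choices, not part of the assignment. ###Code
# Surface words over-represented in a target corpus relative to the rest.
# The +1 in the denominator avoids division by zero for words absent from
# the comparison corpus.
def distinctive_words(target, other, n=10):
    fq_t = nltk.FreqDist(w for w in target if w.isalnum())
    fq_o = nltk.FreqDist(w for w in other if w.isalnum())
    scores = {w: (fq_t[w] / fq_t.N()) / ((fq_o[w] + 1) / fq_o.N())
              for w in fq_t if fq_t[w] >= 5}
    return sorted(scores, key=scores.get, reverse=True)[:n]

print("biz+tech:", distinctive_words(corpus_biz_tech, corpus_local + corpus_politics))
print("local:   ", distinctive_words(corpus_local, corpus_biz_tech + corpus_politics))
print("politics:", distinctive_words(corpus_politics, corpus_biz_tech + corpus_local))
###Output _____no_output_____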
Week 8/Week-8-Exercises/Week-8-Exercises/Solutions/ex08_3_twitter-SOLUTIONS.ipynb
###Markdown Exercise Notebook on Accessing Twitter: Solutions Exercise 1: Twitter API Access You should already have Twitter access set up from the lecture; if you do not, please revisit the lecture and make sure you have your Twitter credentials saved: ###Code import pickle import os ###Output _____no_output_____ ###Markdown Make sure to select the relative path to the `secret_twitter_credentials.pkl` file: ###Code Twitter=pickle.load(open('../../../secret_twitter_credentials.pkl','rb')) import twitter auth = twitter.oauth.OAuth(Twitter['Access Token'], Twitter['Access Token Secret'], Twitter['Consumer Key'], Twitter['Consumer Secret']) twitter_api = twitter.Twitter(auth=auth) ###Output _____no_output_____ ###Markdown Exercise 2: Get the WOE ID for a place of interest Find the Yahoo! Where On Earth ID for a place you are interested in at: http://woeid.rosselliot.co.nz/ Set `LOCAL_WOE_ID` to this integer number below: ###Code LOCAL_WOE_ID = None ### BEGIN SOLUTION LOCAL_WOE_ID = 44418 ### END SOLUTION assert LOCAL_WOE_ID, "Remember to set LOCAL_WOE_ID to a location identifier" ###Output _____no_output_____ ###Markdown Exercise 3: Retrieve and print local trends Let's use the twitter API to retrieve trends. ###Code local_trends = twitter_api.trends.place(_id=LOCAL_WOE_ID) ###Output _____no_output_____ ###Markdown `local_trends` is a highly nested data structure made up of lists and dictionaries; explore it with `type()`, `len()` and indexing like `[0]`, and print out a list of all the trends: ###Code list_of_trends = None ### BEGIN SOLUTION list_of_trends = [trend["name"] for trend in local_trends[0]['trends']] ### END SOLUTION assert isinstance(list_of_trends, list), "list_of_trends should be a list" ###Output _____no_output_____ ###Markdown Exercise 4: Collecting search results Now let's execute a search on Twitter for the most popular trend and repeat the filtering step performed during lecture to remove duplicate results. Set the `q` variable to the most popular trend in the list we retrieved above: ###Code q = None ### BEGIN SOLUTION q = list_of_trends[0] ### END SOLUTION ###Output _____no_output_____ ###Markdown Then let's execute the Twitter search: ###Code # DO NOT MODIFY count = 100 search_results = twitter_api.search.tweets(q=q, count=count) statuses = search_results['statuses'] # DO NOT MODIFY all_text = [] filtered_statuses = [] for s in statuses: if not s["text"] in all_text: filtered_statuses.append(s) all_text.append(s["text"]) len(filtered_statuses) ###Output _____no_output_____ ###Markdown Exercise 5: Create a list of retweet count and status tuples We want to sort the tweets by retweet count, so the first step is to create a list of tuples where the first element is the retweet count; we can then use the `sorted` function to perform the sorting. ###Code retweets = None ### BEGIN SOLUTION retweets = [(s["retweet_count"], s["text"]) for s in filtered_statuses] ### END SOLUTION assert len(retweets) == len(filtered_statuses), "Make sure you are using filtered_statuses and not statuses" assert len(retweets[0]) == 2, "Each tuple should only have 2 elements, retweet count and the tweet text" ###Output _____no_output_____ ###Markdown Exercise 6: Sort a list of tweets Use the `sorted` function to sort retweets and get the 10 most popular; we'd like to have the most popular tweet first. 
###Code popular_tweets = None ### BEGIN SOLUTION popular_tweets = sorted(retweets, reverse=True)[:10] ### END SOLUTION assert len(popular_tweets) == 10, "Find the 10 most popular" assert popular_tweets[0][0] >= popular_tweets[-1][0], "Make sure you are sorting in descending order" ###Output _____no_output_____
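###Markdown As a short extension, the `entities` field that the Twitter search API attaches to each status can be used to tally the most common hashtags in the filtered results: ###Code
from collections import Counter

# Tally hashtags across the filtered search results; each status carries an
# "entities" dictionary with a list of hashtag objects.
hashtags = [tag["text"].lower()
            for s in filtered_statuses
            for tag in s["entities"]["hashtags"]]
Counter(hashtags).most_common(10)
###Output _____no_output_____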
customscatter.ipynb
###Markdown Custom Scatter ###Code %matplotlib inline import plotly.offline as pyo import plotly.graph_objects as go import matplotlib.pyplot as plt import pandas as pd import numpy as np plt.style.use('ggplot') bitcoin = pd.read_csv('data/bitcoin-usd.csv', parse_dates=['date'], index_col='date') bitcoin.head() data = [ go.Scatter( x = bitcoin.index, y = bitcoin[col], name=col, mode='markers', marker=dict( size=15, color='rgb(88,188,255)', symbol = 100, line=dict( width=1 ) ) ) for col in bitcoin.columns if col == 'close' ] layout = go.Layout( title='Close', xaxis=dict(title='Date'), yaxis=dict(title='Close') ) fig = go.Figure(data=data, layout=layout) pyo.plot(fig) ###Output _____no_output_____
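###Markdown As a variation on the chart above, the marker colour can encode the closing price itself through a continuous colorscale instead of a single fixed RGB value: ###Code
# Reuse the layout defined above; only the marker styling changes.
data = [
    go.Scatter(
        x=bitcoin.index,
        y=bitcoin['close'],
        name='close',
        mode='markers',
        marker=dict(
            size=8,
            color=bitcoin['close'],  # map each value to a colour
            colorscale='Viridis',
            showscale=True
        )
    )
]
fig = go.Figure(data=data, layout=layout)
pyo.plot(fig)
###Output _____no_output_____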
book/melvin/gaussian_density_estimation.ipynb
###Markdown Latent Gaussian Density Estimation ###Code import jax.numpy as jnp from jax import random from melvin import LaplaceApproximation import jax import matplotlib.pylab as plt from functools import partial jax.config.update("jax_enable_x64", True) SEED = random.PRNGKey(220) N_ROWS = 2000 LATENT_MEAN = 3.0 LATENT_STD = 1.0 NOISE_STD = 2.0 SEED, _seed_1, _seed_2 = random.split(SEED,3) y_latent = jax.random.normal(key=_seed_1, shape=(N_ROWS,))*LATENT_STD + LATENT_MEAN y = y_latent + jax.random.normal(key=_seed_2, shape=(N_ROWS,))*NOISE_STD bins = jnp.linspace(-8,12,100) plt.hist(y_latent, bins=bins, alpha=0.3) plt.hist(y, bins=bins, alpha=0.3) plt.show() class GaussianDensityEstimator(LaplaceApproximation): param_bounds = jnp.array([[jnp.nan, jnp.nan], [0, jnp.nan]]) def model(self, params, X): mu = params[0] std_latent = params[1] std_noise = self.fixed_params[0] std = jnp.sqrt(std_latent**2 + std_noise**2) return jnp.array([mu, std]) def log_prior(self, params): # Uninformative priors on both parameters mu = params[0] std_latent = params[1] mu_log_prior = jax.scipy.stats.norm.logpdf(mu, loc=0.0, scale=100.0) std_latent_log_prior = jax.scipy.stats.expon.logpdf(std_latent, scale=100.0) return mu_log_prior + std_latent_log_prior def log_likelihood(self, params, y, y_pred): mu = y_pred[0] std = y_pred[1] log_like = jax.scipy.stats.norm.logpdf(y, loc=mu, scale=std) return jnp.sum(log_like) initial_params = jnp.array([5.0, 5.0]) gaussian_density_estimator = GaussianDensityEstimator( name="Gaussian Density Estimator", initial_params=initial_params, fixed_params=jnp.array([NOISE_STD]), y=y ) print(gaussian_density_estimator) SEED, _seed = random.split(SEED,2) samples = gaussian_density_estimator.sample_params(prng_key = _seed, n_samples = 10000, verbose=True) plt.hist2d(samples[:,0], samples[:,1], bins=(30,30), cmin=1) plt.axhline(LATENT_STD, color="r", label="True parameters") plt.axvline(LATENT_MEAN, color="r") plt.xlabel("Latent Mean") plt.ylabel("Latent Std") plt.colorbar() plt.legend() plt.show() SEED, _seed = random.split(SEED,2) def get_pdf(params, x): return jax.scipy.stats.norm.pdf( x, loc=params[0], scale=params[1] ) x = jnp.linspace(-3,10,100) y_pdf = get_pdf(gaussian_density_estimator.params.x, x) y_pdf_samples = gaussian_density_estimator.sample_params_map( prng_key = _seed, n_samples = 300, func=get_pdf, args=(x,), verbose=True ) y_pdf_low = jnp.percentile(y_pdf_samples, q=5, axis=0) y_pdf_upp = jnp.percentile(y_pdf_samples, q=95, axis=0) plt.hist(y_latent, bins=x, alpha=0.3, density=True, label="Latent Samples") plt.hist(y, bins=x, alpha=0.3, density=True, label="Observed Samples") plt.plot(x, y_pdf, color="k", label="Fitted Latent Distribution") plt.fill_between(x, y_pdf_low, y_pdf_upp, color="k", label="90% Confidence Interval", alpha=0.3) plt.legend() plt.show() ###Output Method = simple, Perf = -4459.041416049719 Method = importance, Perf = -4459.056578949636 Method = simple was best
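###Markdown A quick sanity check on the fit: since `y = y_latent + noise` with independent Gaussian noise, `Var(y) = LATENT_STD**2 + NOISE_STD**2`, so a method-of-moments estimate of the latent standard deviation is available in closed form and should land close to the MAP parameters above. ###Code
# Method-of-moments check: subtract the known noise variance from the sample
# variance of the observations and take the square root (floored at zero).
var_y = jnp.var(y)
mom_latent_std = jnp.sqrt(jnp.maximum(var_y - NOISE_STD**2, 0.0))
print("sample mean of y:", jnp.mean(y))
print("method-of-moments latent std:", mom_latent_std)
print("fitted params (mu, latent std):", gaussian_density_estimator.params.x)
###Output _____no_output_____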
quizes/portfolio-variances/portfolio_variance.ipynb
###Markdown Portfolio Variance ###Code import sys !{sys.executable} -m pip install -r requirements.txt import numpy as np import pandas as pd import time import os import quiz_helper import matplotlib.pyplot as plt %matplotlib inline plt.style.use('ggplot') plt.rcParams['figure.figsize'] = (14, 8) ###Output _____no_output_____ ###Markdown data bundle ###Code import os import quiz_helper from zipline.data import bundles os.environ['ZIPLINE_ROOT'] = os.path.join(os.getcwd(), '..', '..','data','module_4_quizzes_eod') ingest_func = bundles.csvdir.csvdir_equities(['daily'], quiz_helper.EOD_BUNDLE_NAME) bundles.register(quiz_helper.EOD_BUNDLE_NAME, ingest_func) print('Data Registered') ###Output Data Registered ###Markdown Build pipeline engine ###Code from zipline.pipeline import Pipeline from zipline.pipeline.factors import AverageDollarVolume from zipline.utils.calendars import get_calendar universe = AverageDollarVolume(window_length=120).top(500) trading_calendar = get_calendar('NYSE') bundle_data = bundles.load(quiz_helper.EOD_BUNDLE_NAME) engine = quiz_helper.build_pipeline_engine(bundle_data, trading_calendar) ###Output _____no_output_____ ###Markdown View Data With the pipeline engine built, let's get the stocks at the end of the period in the universe we're using. We'll use these tickers to generate the returns data for our risk model. ###Code universe_end_date = pd.Timestamp('2016-01-05', tz='UTC') universe_tickers = engine\ .run_pipeline( Pipeline(screen=universe), universe_end_date, universe_end_date)\ .index.get_level_values(1)\ .values.tolist() universe_tickers len(universe_tickers) from zipline.data.data_portal import DataPortal data_portal = DataPortal( bundle_data.asset_finder, trading_calendar=trading_calendar, first_trading_day=bundle_data.equity_daily_bar_reader.first_trading_day, equity_minute_reader=None, equity_daily_reader=bundle_data.equity_daily_bar_reader, adjustment_reader=bundle_data.adjustment_reader) ###Output _____no_output_____ ###Markdown Get pricing data helper function ###Code from quiz_helper import get_pricing ###Output _____no_output_____ ###Markdown get pricing data into a dataframe ###Code returns_df = \ get_pricing( data_portal, trading_calendar, universe_tickers, universe_end_date - pd.DateOffset(years=5), universe_end_date)\ .pct_change()[1:].fillna(0) #convert prices into returns returns_df ###Output _____no_output_____ ###Markdown Let's look at a two stock portfolio Let's pretend we have a portfolio of two stocks. We'll pick Apple and Microsoft in this example. ###Code aapl_col = returns_df.columns[3] msft_col = returns_df.columns[312] asset_return_1 = returns_df[aapl_col].rename('asset_return_aapl') asset_return_2 = returns_df[msft_col].rename('asset_return_msft') asset_return_df = pd.concat([asset_return_1,asset_return_2],axis=1) asset_return_df.head(2) ###Output _____no_output_____ ###Markdown Factor returns Let's make up a "factor" by taking an average of all stocks in our list. You can think of this as an equal weighted index of the 490 stocks, kind of like a measure of the "market". We'll also make another factor by calculating the median of all the stocks. These are mainly intended to help us generate some data to work with. We'll go into how some common risk factors are generated later in the lessons.Also note that we're setting axis=1 so that we calculate a value for each time period (row) instead of one value for each column (assets). 
###Code factor_return_1 = returns_df.mean(axis=1) factor_return_2 = returns_df.median(axis=1) factor_return_l = [factor_return_1, factor_return_2] ###Output _____no_output_____ ###Markdown Factor exposures Factor exposures refer to how "exposed" a stock is to each factor. We'll get into this more later. For now, just think of this as one number for each stock, for each of the factors. ###Code from sklearn.linear_model import LinearRegression """ For now, just assume that we're calculating a number for each stock, for each factor, which represents how "exposed" each stock is to each factor. We'll discuss how factor exposure is calculated later in the lessons. """ def get_factor_exposures(factor_return_l, asset_return): lr = LinearRegression() X = np.array(factor_return_l).T y = np.array(asset_return.values) lr.fit(X,y) return lr.coef_ factor_exposure_l = [] for i in range(len(asset_return_df.columns)): factor_exposure_l.append( get_factor_exposures(factor_return_l, asset_return_df[asset_return_df.columns[i]] )) factor_exposure_a = np.array(factor_exposure_l) print(f"factor_exposures for asset 1 {factor_exposure_a[0]}") print(f"factor_exposures for asset 2 {factor_exposure_a[1]}") ###Output factor_exposures for asset 1 [ 1.35101534 -0.58353198] factor_exposures for asset 2 [-0.2283345 1.16364007] ###Markdown Variance of stock 1 Calculate the variance of stock 1. $\textrm{Var}(r_{1}) = \beta_{1,1}^2 \textrm{Var}(f_{1}) + \beta_{1,2}^2 \textrm{Var}(f_{2}) + 2\beta_{1,1}\beta_{1,2}\textrm{Cov}(f_{1},f_{2}) + \textrm{Var}(s_{1})$ ###Code factor_exposure_1_1 = factor_exposure_a[0][0] factor_exposure_1_2 = factor_exposure_a[0][1] common_return_1 = factor_exposure_1_1 * factor_return_1 + factor_exposure_1_2 * factor_return_2 specific_return_1 = asset_return_1 - common_return_1 covm_f1_f2 = np.cov(factor_return_1,factor_return_2,ddof=1) #this calculates a covariance matrix # get the variance of each factor, and covariances from the covariance matrix covm_f1_f2 covm_f1_f2 var_f1 = covm_f1_f2[0,0] var_f2 = covm_f1_f2[1,1] cov_f1_f2 = covm_f1_f2[0][1] # calculate the specific variance. var_s_1 = np.var(specific_return_1,ddof=1) # calculate the variance of asset 1 in terms of the factors and specific variance var_asset_1 = (factor_exposure_1_1**2 * var_f1) + \ (factor_exposure_1_2**2 * var_f2) + \ 2 * (factor_exposure_1_1 * factor_exposure_1_2 * cov_f1_f2) + \ var_s_1 print(f"variance of asset 1: {var_asset_1:.8f}") ###Output variance of asset 1: 0.00028209 ###Markdown Variance of stock 2 Calculate the variance of stock 2. $\textrm{Var}(r_{2}) = \beta_{2,1}^2 \textrm{Var}(f_{1}) + \beta_{2,2}^2 \textrm{Var}(f_{2}) + 2\beta_{2,1}\beta_{2,2}\textrm{Cov}(f_{1},f_{2}) + \textrm{Var}(s_{2})$ ###Code factor_exposure_2_1 = factor_exposure_a[1][0] factor_exposure_2_2 = factor_exposure_a[1][1] common_return_2 = factor_exposure_2_1 * factor_return_1 + factor_exposure_2_2 * factor_return_2 specific_return_2 = asset_return_2 - common_return_2 # Notice we already calculated the variance and covariances of the factors # calculate the specific variance of asset 2 var_s_2 = np.var(specific_return_2,ddof=1) # calculate the variance of asset 2 in terms of the factors and specific variance var_asset_2 = (factor_exposure_2_1**2 * var_f1) + \ (factor_exposure_2_2**2 * var_f2) + \ (2 * factor_exposure_2_1 * factor_exposure_2_2 * cov_f1_f2) + \ var_s_2 print(f"variance of asset 2: {var_asset_2:.8f}") ###Output variance of asset 2: 0.00021856 ###Markdown Covariance of stocks 1 and 2 Calculate the covariance of stock 1 and 2. 
$\textrm{Cov}(r_{1},r_{2}) = \beta_{1,1}\beta_{2,1}\textrm{Var}(f_{1}) + \beta_{1,1}\beta_{2,2}\textrm{Cov}(f_{1},f_{2}) + \beta_{1,2}\beta_{2,1}\textrm{Cov}(f_{1},f_{2}) + \beta_{1,2}\beta_{2,2}\textrm{Var}(f_{2})$ ###Code # TODO: calculate the covariance of assets 1 and 2 in terms of the factors cov_asset_1_2 = (factor_exposure_1_1 * factor_exposure_2_1 * var_f1) + \ (factor_exposure_1_1 * factor_exposure_2_2 * cov_f1_f2) + \ (factor_exposure_1_2 * factor_exposure_2_1 * cov_f1_f2) + \ (factor_exposure_1_2 * factor_exposure_2_2 * var_f2) print(f"covariance of assets 1 and 2: {cov_asset_1_2:.8f}") ###Output covariance of assets 1 and 2: 0.00007133 ###Markdown Quiz 1: calculate portfolio varianceWe'll choose stock weights for now (in a later lesson, you'll learn how to use portfolio optimization that uses alpha factors and a risk factor model to choose stock weights).$\textrm{Var}(r_p) = x_{1}^{2} \textrm{Var}(r_1) + x_{2}^{2} \textrm{Var}(r_2) + 2x_{1}x_{2}\textrm{Cov}(r_{1},r_{2})$ ###Code weight_1 = 0.60 weight_2 = 0.40 # TODO: calculate portfolio variance var_portfolio = weight_1**2 * var_asset_1 + \ weight_2**2 * var_asset_2 + \ 2*weight_1*weight_2*cov_asset_1_2 print(f"variance of portfolio is {var_portfolio:.8f}") ###Output variance of portfolio is 0.00017076 ###Markdown Quiz 2: Do it with Matrices!Create matrices $\mathbf{F}$, $\mathbf{B}$ and $\mathbf{S}$, where $\mathbf{F}= \begin{pmatrix}\textrm{Var}(f_1) & \textrm{Cov}(f_1,f_2) \\ \textrm{Cov}(f_2,f_1) & \textrm{Var}(f_2) \end{pmatrix}$is the covariance matrix of factors, $\mathbf{B} = \begin{pmatrix}\beta_{1,1}, \beta_{1,2}\\ \beta_{2,1}, \beta_{2,2}\end{pmatrix}$ is the matrix of factor exposures, and $\mathbf{S} = \begin{pmatrix}\textrm{Var}(s_i) & 0\\ 0 & \textrm{Var}(s_j)\end{pmatrix}$is the matrix of specific variances. $\mathbf{X} = \begin{pmatrix}x_{1} \\x_{2}\end{pmatrix}$ Concept QuestionWhat are the dimensions of the $\textrm{Var}(r_p)$ portfolio variance? Given this, when choosing whether to multiply a row vector or a column vector on the left and right sides of the $\mathbf{BFB}^T$, which choice helps you get the dimensions of the portfolio variance term?In other words:Given that $\mathbf{X}$ is a column vector, which makes more sense?$\mathbf{X}^T(\mathbf{BFB}^T + \mathbf{S})\mathbf{X}$ ? or $\mathbf{X}(\mathbf{BFB}^T + \mathbf{S})\mathbf{X}^T$ ? Answer 2 here: Quiz 3: Calculate portfolio variance using matrices ###Code # TODO: covariance matrix of factors F = covm_f1_f2 F # TODO: matrix of factor exposures B = factor_exposure_a B # TODO: matrix of specific variances S = np.diag([var_s_1,var_s_2]) S ###Output _____no_output_____ ###Markdown Hint for column vectorsTry using [reshape](https://docs.scipy.org/doc/numpy-1.15.1/reference/generated/numpy.reshape.html) ###Code # TODO: make a column vector for stock weights matrix X X = np.array([weight_1,weight_2]).reshape(2,1) X # TODO: covariance matrix of assets var_portfolio = X.T.dot(B.dot(F).dot(B.T) + S).dot(X) print(f"portfolio variance is \n{var_portfolio[0][0]:.8f}") ###Output portfolio variance is 0.00017076
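###Markdown As a cross-check on the matrix result, the portfolio's realised return series can be built directly from the two assets and its sample variance compared with the factor-model figure; the two won't match exactly, because the factor model drops the sample covariance between the specific returns. ###Code
# Build the realised portfolio return series and compare its sample variance
# with the factor-model number computed above.
portfolio_returns = weight_1 * asset_return_1 + weight_2 * asset_return_2
var_direct = np.var(portfolio_returns, ddof=1)
print(f"direct sample variance of portfolio returns: {var_direct:.8f}")
print(f"factor model portfolio variance:             {var_portfolio[0][0]:.8f}")
###Output _____no_output_____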
code/smote_ipf/smote_ipf_multiclass.ipynb
###Markdown SMOTE-IPF implementation for multi-class datasets The following code can be used to oversample multi-class datasets using SMOTE-IPF. It also reports details of the resulting dataset, such as the number of new samples in each class, and plots the original and oversampled datasets side by side. ###Code import numpy as np import matplotlib.pyplot as plt import smote_variants as sv import imbalanced_databases as imbd import sklearn.datasets as datasets ###Output _____no_output_____ ###Markdown Loading the classic multi-class dataset named wine, which is available in sklearn datasets. ###Code dataset= datasets.load_wine() X, y= dataset['data'], dataset['target'] ###Output _____no_output_____ ###Markdown Illustrating the imbalance in the current dataset using a scatter plot Finding the majority class and the distribution of the dataset ###Code for i in np.unique(y): print("Class %d - Samples: %d" % (i, np.sum(y == i))) ###Output Class 0 - Samples: 59 Class 1 - Samples: 71 Class 2 - Samples: 48 ###Markdown Plotting the dataset with three classes: Class 0, Class 1, and Class 2. Here, we plot only the first two co-ordinates in a scatter plot. ###Code colors= ['orange', 'olive', 'cyan'] plt.figure(figsize=(10, 10)) for i in np.unique(y): plt.scatter(X[y == i][:,0], X[y == i][:,1], label='Class %d' % i, c= colors[i]) plt.title('Original dataset') plt.xlabel('Coordinate 0') plt.ylabel('Coordinate 1') plt.legend() ###Output _____no_output_____ ###Markdown Oversampling using SMOTE_IPF We use the class `smote_variants.MulticlassOversampling` for multiclass oversampling using SMOTE-IPF.We use the default parameters with proportion `1.0`. This ensures that the oversampled dataset has an equal number of samples in each class. All the default parameters are ```{ 'proportion': 1.0, 'n_neighbors': 5, # number of neighbors in SMOTE sampling 'n_folds': 9, # the number of partitions 'k': 3, # used in stopping condition 'p': 0.01, # percentage value ([0,1]) used in stopping condition 'voting': 'majority', # 'majority'/'consensus' 'n_jobs': 1, # number of parallel jobs 'classifier': DecisionTreeClassifier(random_state=2), # classifier object 'random_state': None # initializer of random_state}``` ###Code oversampler= sv.MulticlassOversampling(sv.SMOTE_IPF()) X_samp, y_samp= oversampler.sample(X, y) ###Output 2021-11-20 23:19:19,054:INFO:MulticlassOversampling: Running multiclass oversampling with strategy eq_1_vs_many_successive 2021-11-20 23:19:19,055:INFO:MulticlassOversampling: Sampling minority class with label: 0 2021-11-20 23:19:19,057:INFO:SMOTE_IPF: Running sampling via ('SMOTE_IPF', "{'proportion': 1.0, 'n_neighbors': 5, 'n_folds': 9, 'k': 3, 'p': 0.01, 'voting': 'majority', 'n_jobs': 1, 'classifier': DecisionTreeClassifier(random_state=2), 'random_state': None}") 2021-11-20 23:19:19,058:INFO:SMOTE: Running sampling via ('SMOTE', "{'proportion': 1.0, 'n_neighbors': 5, 'n_jobs': 1, 'random_state': <module 'numpy.random' from '/home/nmc/.local/lib/python3.9/site-packages/numpy/random/__init__.py'>}") 2021-11-20 23:19:19,067:INFO:SMOTE_IPF: Removing 0 elements 2021-11-20 23:19:19,074:INFO:SMOTE_IPF: Removing 0 elements 2021-11-20 23:19:19,081:INFO:SMOTE_IPF: Removing 0 elements 2021-11-20 23:19:19,083:INFO:MulticlassOversampling: Sampling minority class with label: 2 2021-11-20 23:19:19,084:INFO:SMOTE_IPF: Running sampling via ('SMOTE_IPF', "{'proportion': 0.24468085106382978, 'n_neighbors': 5, 'n_folds': 9, 'k': 3, 'p': 0.01, 'voting': 'majority', 'n_jobs': 1, 'classifier': DecisionTreeClassifier(random_state=2), 
'random_state': None}") 2021-11-20 23:19:19,085:INFO:SMOTE: Running sampling via ('SMOTE', "{'proportion': 0.24468085106382978, 'n_neighbors': 5, 'n_jobs': 1, 'random_state': <module 'numpy.random' from '/home/nmc/.local/lib/python3.9/site-packages/numpy/random/__init__.py'>}") 2021-11-20 23:19:19,094:INFO:SMOTE_IPF: Removing 0 elements 2021-11-20 23:19:19,102:INFO:SMOTE_IPF: Removing 0 elements 2021-11-20 23:19:19,111:INFO:SMOTE_IPF: Removing 0 elements ###Markdown `X_samp` contains oversampled `X` data and `y_samp` contains oversampled `y` data. Now let's look at the number of samples in each class. ###Code print('Class 0: %d' % np.sum(y_samp == 0)) print('Class 1: %d' % np.sum(y_samp == 1)) print('Class 2: %d' % np.sum(y_samp == 2)) ###Output Class 0: 71 Class 1: 71 Class 2: 71 ###Markdown Each class has equal number of datapoints as we have described above. Now we filters newly sampled datapoints to plot them distinctly below. ###Code X_samp, y_samp= X_samp[len(X):], y_samp[len(y):] ###Output _____no_output_____ ###Markdown Printing the number of new samples ###Code for i in np.unique(y_samp): print("Class %d - Samples: %d" % (i, np.sum(y_samp == i))) ###Output Class 0 - Samples: 12 Class 2 - Samples: 23 ###Markdown Now using a scatter plot, we plot Original dataset vs Oversampled dataset, showing the new samples distinctly. ###Code plt.figure(figsize=(20, 10)) ax= plt.subplot(121) # Plotting original for i in np.unique(y): plt.scatter(X[y == i][:,0], X[y == i][:,1], label='Class %d' % i, c=colors[i], marker='o') plt.title('Original Dataset') plt.xlabel('Coordinate 0') plt.ylabel('Coordinate 1') ax= plt.subplot(122) for i in np.unique(y): plt.scatter(X[y == i][:,0], X[y == i][:,1], label='Class %d' % i, c=colors[i], marker='o') if len(y_samp[y_samp == i]): plt.scatter(X_samp[y_samp == i][:, 0], X_samp[y_samp == i][:, 1], label='Class %d New Samples' % i, c=colors[i], marker='x') plt.title('Oversampled Dataset') plt.xlabel('Coordinate 0') plt.ylabel('Coordinate 1') plt.legend() ###Output _____no_output_____
doc/jupyter_execute/notebooks/operator_upgrade.ipynb
###Markdown Operator Upgrade Tests Setup Seldon CoreFollow the instructions to [Setup Cluster](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.htmlSetup-Cluster) with [Ambassador Ingress](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.htmlAmbassador) and [Install Seldon Core](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.htmlInstall-Seldon-Core). ###Code !kubectl create namespace seldon !kubectl config set-context $(kubectl config current-context) --namespace=seldon import json import time ###Output _____no_output_____ ###Markdown Install Stable Version ###Code !kubectl create namespace seldon-system !helm upgrade seldon seldon-core-operator --repo https://storage.googleapis.com/seldon-charts --namespace seldon-system --set istio.enabled=true --wait ###Output _____no_output_____ ###Markdown Launch a Range of Models ###Code %%writefile resources/model.yaml apiVersion: machinelearning.seldon.io/v1 kind: SeldonDeployment metadata: name: seldon-model spec: name: test-deployment predictors: - componentSpecs: - spec: containers: - image: seldonio/mock_classifier:1.9.1 name: classifier graph: name: classifier type: MODEL endpoint: type: REST name: example replicas: 1 !kubectl create -f resources/model.yaml %%writefile ../servers/tfserving/samples/halfplustwo_rest.yaml apiVersion: machinelearning.seldon.io/v1alpha2 kind: SeldonDeployment metadata: name: hpt spec: name: hpt protocol: tensorflow transport: rest predictors: - graph: name: halfplustwo implementation: TENSORFLOW_SERVER modelUri: gs://seldon-models/tfserving/half_plus_two parameters: - name: model_name type: STRING value: halfplustwo name: default replicas: 1 !kubectl create -f ../servers/tfserving/samples/halfplustwo_rest.yaml %%writefile ../examples/models/payload_logging/model_logger.yaml apiVersion: machinelearning.seldon.io/v1 kind: SeldonDeployment metadata: name: model-logs spec: name: model-logs predictors: - componentSpecs: - spec: containers: - image: seldonio/mock_classifier_rest:1.3 name: classifier imagePullPolicy: Always graph: name: classifier type: MODEL endpoint: type: REST logger: url: http://logger.seldon/ mode: all name: logging replicas: 1 !kubectl create -f ../examples/models/payload_logging/model_logger.yaml ###Output _____no_output_____ ###Markdown Wait for all models to be available ###Code def waitStatus(desired): for i in range(360): allAvailable = True failedGet = False state = !kubectl get sdep -o json state = json.loads("".join(state)) for model in state["items"]: if "status" in model: print("model", model["metadata"]["name"], model["status"]["state"]) if model["status"]["state"] != "Available": allAvailable = False break else: failedGet = True if allAvailable == desired and not failedGet: break time.sleep(1) return allAvailable actual = waitStatus(True) assert actual == True ###Output _____no_output_____ ###Markdown Count the number of resources ###Code def getOwned(raw): count = 0 for res in raw["items"]: if ( "ownerReferences" in res["metadata"] and res["metadata"]["ownerReferences"][0]["kind"] == "SeldonDeployment" ): count += 1 return count def getResourceStats(): # Get number of deployments dps = !kubectl get deployment -o json dps = json.loads("".join(dps)) numDps = getOwned(dps) print("Number of deployments owned", numDps) # Get number of services svcs = !kubectl get svc -o json svcs = json.loads("".join(svcs)) numSvcs = getOwned(svcs) print("Number of services owned", numSvcs) # Get number of 
virtual services vss = !kubectl get vs -o json vss = json.loads("".join(vss)) numVs = getOwned(vss) print("Number of virtual services owned", numVs) # Get number of hpas hpas = !kubectl get hpa -o json hpas = json.loads("".join(hpas)) numHpas = getOwned(hpas) print("Number of hpas owned", numHpas) return (numDps, numSvcs, numVs, numHpas) (dp1, svc1, vs1, hpa1) = getResourceStats() ###Output _____no_output_____ ###Markdown Upgrade to latest ###Code !helm upgrade seldon ../helm-charts/seldon-core-operator --namespace seldon-system --set istio.enabled=true --wait actual = waitStatus(False) assert actual == False actual = waitStatus(True) assert actual == True # Give time for resources to terminate for i in range(120): (dp2, svc2, vs2, hpa2) = getResourceStats() if dp1 == dp2 and svc1 == svc2 and vs1 == vs2 and hpa1 == hpa2: break time.sleep(1) assert dp1 == dp2 assert svc1 == svc2 assert vs1 == vs2 assert hpa1 == hpa2 !kubectl delete sdep --all ###Output _____no_output_____
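###Markdown A teardown check, mirroring the wait loops above: after deleting all SeldonDeployments, confirm that every owned resource is eventually removed. ###Code
# Wait for the operator to garbage-collect all owned resources after deletion.
for i in range(120):
    (dp3, svc3, vs3, hpa3) = getResourceStats()
    if dp3 == 0 and svc3 == 0 and vs3 == 0 and hpa3 == 0:
        break
    time.sleep(1)
assert dp3 == 0 and svc3 == 0 and vs3 == 0 and hpa3 == 0
###Output _____no_output_____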
Pyomo_optimizations.ipynb
###Markdown Integrating risk management tools for regional forest planning: an interactive multiobjective value at risk approach We demonstrate a method of incorporating multi-criteria decision making with stochastic programming. First, import the required packages. ###Code import pandas as pd import random from pyomo.environ import * import itertools from pyomo.opt import SolverFactory import numpy as np ###Output _____no_output_____ ###Markdown Import the data ###Code forest_data = pd.read_csv("INCOME_BD_DATA.dat",index_col=False) #Shuffle the simulations, so that we have different inventory errors for different simulations. #As this is a random process, each iteration will produce slightly different results. #A pandas warning can occur here when writing over a slice of the data, so we assign via .loc. b = list(range(0,25)) for i in forest_data.ID.unique(): random.shuffle(b) c = len(forest_data.SIMULATION[forest_data["ID"]==i]) c = c/25 #number of branches forest_data.loc[forest_data["ID"] == i, "SIMULATION"] = b*int(c) #Generate a list of stands based on the forest data stands = set(forest_data["ID"]) #Generate a list of options for each stand; each stand has different possible management schedules. options = {} for stand in stands: options[stand] =list(set(forest_data[(forest_data["ID"] == stand)]["BRANCH"])) options_sims = set(forest_data["SIMULATION"]) #Lists of the key elements we are going to examine. income_periods = ["First Period (EUR)","Second Period (EUR)","Third Period (EUR)","Fourth Period (EUR)", "Fifth Period (EUR)","Sixth Period (EUR)"] biodiversity_periods = ["First Period (BIO)","Second Period (BIO)","Third Period (BIO)","Fourth Period (BIO)", "Fifth Period (BIO)","Sixth Period (BIO)"] ###Output _____no_output_____ ###Markdown Create the optimization model Initialize model and create variables ###Code model = ConcreteModel() #Alternatives to be chosen model.alternative= Var(set(forest_data.index.values),bounds=(0,1),domain=NonNegativeReals) #Which simulations are to be relaxed model.relaxationIncome = Var(options_sims,domain=Binary) model.relaxationBiodiversity = Var(options_sims,domain=Binary) relaxationNO = 100000.0 #Min income and biodiversity (over periods) for all simulations model.minIncome = Var(options_sims) model.minBiodiversity = Var(options_sims) #Relaxed min income and biodiversity model.minIncomeRelaxed = Var() model.minBiodiversityRelaxed = Var() #A help variable for getting the min income in all simulations that are not relaxed model.minIncomeRelaxed_all = Var(options_sims) model.minBiodiversityRelaxed_all = Var(options_sims) #Risk i.e., the proportion of simulations relaxed from calculation, separately for income and biodiversity model.risk = Var(range(2),bounds=(0,0.999),domain=NonNegativeReals) #First risk for income, second for biodiversity #Finally the value of the ASF scalarization when we maximize the min income and biodiversity, relaxed income and biodiversity #and (1-risk) for both objectives model.asf = Var() model.epsilon_sum = Var() model.minINCOME1 = Var(options_sims) model.minINCOME2 = Var(options_sims) model.minINCOME3 = Var(options_sims) model.minINCOME4 = Var(options_sims) model.minINCOME5 = Var(options_sims) model.minINCOME6 = Var(options_sims) model.minBIO1 = Var(options_sims) model.minBIO2 = Var(options_sims) model.minBIO3 = Var(options_sims) model.minBIO4 = Var(options_sims) model.minBIO5 = Var(options_sims) model.minBIO6 = Var(options_sims) ###Output _____no_output_____
For each scenario, the management must be the ###Code #For each simulation and stand, a single branch needs to be selected def onlyOneAlternativerule(model,stand): return sum(model.alternative[idx] for idx in forest_data[(forest_data["ID"] == stand)&(forest_data["SIMULATION"] == 0)].index.values )==1 model.alternative_const = Constraint(stands,rule = onlyOneAlternativerule) #Requires that each scneario has the same branch for stand in stands: def onlyOneAlternativerule2(model,sim,branch): return sum(model.alternative[idx] -model.alternative[idd] for idx in forest_data[(forest_data["ID"] == stand)&(forest_data["SIMULATION"] == 0)&(forest_data["BRANCH"] == branch)].index.values for idd in forest_data[(forest_data["ID"] == stand)&(forest_data["SIMULATION"] == sim)&(forest_data["BRANCH"] == branch)].index.values )==0 setattr(model,"onlyOneAlternativerule2"+str(stand),Constraint(options_sims,options[stand],rule=onlyOneAlternativerule2)) ###Output _____no_output_____ ###Markdown Equations to evaluate the income and biodiversity for each period ###Code #this section calculates the values for each of the six periods of either income or biodiversity def incomeDummyConstraint1a(model,option_sim): return model.minINCOME1[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[0]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const1a = Constraint(options_sims,rule = incomeDummyConstraint1a) def incomeDummyConstraint2(model,option_sim): return model.minINCOME2[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[1]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const2 = Constraint(options_sims,rule = incomeDummyConstraint2) def incomeDummyConstraint3(model,option_sim): return model.minINCOME3[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[2]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const3 = Constraint(options_sims,rule = incomeDummyConstraint3) def incomeDummyConstraint4(model,option_sim): return model.minINCOME4[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[3]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const4 = Constraint(options_sims,rule = incomeDummyConstraint4) def incomeDummyConstraint5(model,option_sim): return model.minINCOME5[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[4]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const5 = Constraint(options_sims,rule = incomeDummyConstraint5) def incomeDummyConstraint6(model,option_sim): return model.minINCOME6[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,income_periods[5]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const6 = Constraint(options_sims,rule = incomeDummyConstraint6) def BIODummyConstraint1(model,option_sim): return model.minBIO1[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[0]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.BIODummy_const1 = Constraint(options_sims,rule = BIODummyConstraint1) def BIODummyConstraint2(model,option_sim): return model.minBIO2[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[1]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) 
model.BIODummy_const2 = Constraint(options_sims,rule = BIODummyConstraint2) def BIODummyConstraint3(model,option_sim): return model.minBIO3[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[2]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.BIODummy_const3 = Constraint(options_sims,rule = BIODummyConstraint3) def BIODummyConstraint4(model,option_sim): return model.minBIO4[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[3]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.BIODummy_const4 = Constraint(options_sims,rule = BIODummyConstraint4) def BIODummyConstraint5(model,option_sim): return model.minBIO5[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[4]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.BIODummy_const5 = Constraint(options_sims,rule = BIODummyConstraint5) def BIODummyConstraint6(model,option_sim): return model.minBIO6[option_sim] == sum([model.alternative[idx]*forest_data.at[idx,biodiversity_periods[5]] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.BIODummy_const6 = Constraint(options_sims,rule = BIODummyConstraint6) ###Output _____no_output_____ ###Markdown Constraints requiring that the risk level is met, for both income and biodiversity. ###Code #Risk is the proportion of either incomes or biodiversities that are relaxed def relaxationLimitIncome(model): return model.risk[0]*len(options_sims)>=summation(model.relaxationIncome) model.incomeRelaxationLimit = Constraint(rule = relaxationLimitIncome) def relaxationLimitBiodiversity(model): return model.risk[1]*len(options_sims)>=summation(model.relaxationBiodiversity) model.biodiversityRelaxationLimit = Constraint(rule = relaxationLimitBiodiversity) ###Output _____no_output_____ ###Markdown Evaluating and constraining the income and biodiversity risk values. ###Code #Min income and biodiversity must be less than incomes and biodiversities in all periods def incomeDummyConstraint(model,option_sim,incomeperiod): return model.minIncome[option_sim] <= sum([model.alternative[idx]*forest_data.at[idx,incomeperiod] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.incomeDummy_const = Constraint(options_sims,income_periods,rule = incomeDummyConstraint) def biodiversityDummyConstraint(model,option_sim,biodiversityperiod): return model.minBiodiversity[option_sim] <= sum([model.alternative[idx]*forest_data.at[idx,biodiversityperiod] for idx in forest_data[forest_data["SIMULATION"]==option_sim].index.values]) model.biodiversityDummy_const = Constraint(options_sims,biodiversity_periods,rule = biodiversityDummyConstraint) #Relaxed income and biodiversity are calculated by adding relaxationNO in relaxed simulations def incomeDummyConstraintRelaxed_all(model,option_sim): return model.minIncomeRelaxed_all[option_sim] == model.minIncome[option_sim]+model.relaxationIncome[option_sim]*relaxationNO model.incomeDummyRelaxed_const_all = Constraint(options_sims,rule = incomeDummyConstraintRelaxed_all) def biodiversityDummyConstraintRelaxed_all(model,option_sim): return model.minBiodiversityRelaxed_all[option_sim] == model.minBiodiversity[option_sim]+model.relaxationBiodiversity[option_sim]*relaxationNO model.biodiversityDummyRelaxed_const_all = Constraint(options_sims,rule = biodiversityDummyConstraintRelaxed_all) #Then finally, the relaxed minimum is taken as the minimum over all simulations; relaxed simulations are pushed up by relaxationNO, so they never bind. 
def incomeDummyConstraintRelaxed(model,option_sim): return model.minIncomeRelaxed <= model.minIncomeRelaxed_all[option_sim] model.incomeDummyRelaxed_const = Constraint(options_sims,rule = incomeDummyConstraintRelaxed) def biodiversityDummyConstraintRelaxed(model,option_sim): return model.minBiodiversityRelaxed <= model.minBiodiversityRelaxed_all[option_sim] model.biodiversityDummyRelaxed_const = Constraint(options_sims,rule = biodiversityDummyConstraintRelaxed) ###Output _____no_output_____ ###Markdown Define the Achievement Scalarization Function problem ###Code #parameter values for the nadir, ideal and reference #Initial values are unknown, and need to be calculated model.nadir = Param(range(6),default = 0,mutable=True) model.ideal = Param(range(6),default=1,mutable=True) model.reference = Param(range(6),default = 0,mutable=True) #Add constraints which will linearize the max-min objective #First the mean income and biodiversity def asfDummyIncomeMean(model): return model.asf <= (sum(model.minIncome[option] for option in options_sims)/len(options_sims) -model.reference[0])/(model.ideal[0]-model.nadir[0]) model.asfDummyIncomeMean_constraint = Constraint(rule=asfDummyIncomeMean) def asfDummyBiodiversityMean(model): return model.asf <= (sum(model.minBiodiversity[option] for option in options_sims)/len(options_sims) -model.reference[1])/(model.ideal[1]-model.nadir[1]) model.asfDummyBiodiversityMean_constraint = Constraint(rule=asfDummyBiodiversityMean) #Then relaxed income and biodiversity def asfDummyIncomeVaR(model): return model.asf <= (model.minIncomeRelaxed-model.reference[2])/(model.ideal[2]-model.nadir[2]) model.asfDummyIncomeMeanRelaxed_constraint = Constraint(rule=asfDummyIncomeVaR) def asfDummyBiodiversityVaR(model): return model.asf <= (model.minBiodiversityRelaxed-model.reference[3])/(model.ideal[3]-model.nadir[3]) model.asfDummyBiodiversityMeanRelaxed_constraint = Constraint(rule=asfDummyBiodiversityVaR) #Finally, (1-risk) def asfDummyRisk(model,risk_no): return model.asf <= ((1-model.risk[risk_no])- model.reference[4+risk_no])/(model.ideal[4+risk_no]-model.nadir[4+risk_no]) model.asfDummyRisk_constraint = Constraint(range(2),rule=asfDummyRisk) #A function to ensure efficiency def epsilon_sum_function(model): return model.epsilon_sum == (sum(model.minIncome[option] for option in options_sims)/len(options_sims) -model.reference[0])/(model.ideal[0]-model.nadir[0])+(sum(model.minBiodiversity[option] for option in options_sims)/len(options_sims) -model.reference[1])/(model.ideal[1]-model.nadir[1])+ (model.minIncomeRelaxed-model.reference[2])/(model.ideal[2]-model.nadir[2])+ (model.minBiodiversityRelaxed-model.reference[3])/(model.ideal[3]-model.nadir[3]) model.epsilon_sum_constraint = Constraint(rule=epsilon_sum_function) model.objective = Objective(expr=model.asf*1000+model.epsilon_sum/1000,sense=maximize) ###Output _____no_output_____ ###Markdown A function to automate the solving process ###Code def solveASF(nadir,ideal,reference,model,solver): for i in range(6): model.nadir[i] = nadir[i] model.ideal[i] = ideal[i] model.reference[i] = reference[i] model.preprocess() #This is a must after changing parameter values opt = SolverFactory(solver) if solver == "cbc": opt.options["ratio"] = 0.005 #opt.options["threads"] = 8 opt.solve(model,tee=False) ###Output _____no_output_____ ###Markdown Calculate Ideal and Nadir ###Code relaxationNO = 1000000.0 def calculateIdealandNadir(model,solver): trade_off_table = [] a= [] b1=[] b2=[] for i in range(6): reference = [-relaxationNO]*6 
reference[i] = 0 print(reference) solveASF([0,0,0,0,0,0],[1,1,1,1,1,1],reference,model,solver) trade_off_table.append([sum([value(model.minIncome[option]) for option in options_sims])/len(options_sims), sum([value(model.minBiodiversity[option]) for option in options_sims])/len(options_sims), value(model.minIncomeRelaxed), value(model.minBiodiversityRelaxed), (1-value(model.risk[0])), (1-value(model.risk[1]))]) a.append(trade_off_table[i][i]) b1.append(sum([value(model.minIncome[option]) for option in options_sims])/len(options_sims)) b2.append(sum([value(model.minBiodiversity[option]) for option in options_sims])/len(options_sims)) print("Trade-off table:") print("Income, BD, VaR Income, VaR BD, 1-delta for Income, 1-delta for BD") for objectives in trade_off_table: for obj in objectives[:-1]: print(round(obj,2), end=", ") print(round(objectives[-1],2)) return a, b1,b2, trade_off_table a, b1,b2,trade_off_table = calculateIdealandNadir(model,"cplex") #To aid computation, the income and biodiversity values are adjusted to be of similar magnitude in the optimization #We have found rounding errors can impact the solution, and ensuring similar magnitude of values avoids these types of errors areas = sum(forest_data[(forest_data["BRANCH"] == 1)&(forest_data["SIMULATION"] == 0)].AREA.values) multipliers = [1000/areas,0.001/areas,1000/areas,0.001/areas,1,1] #Create the nadir (b) and ideal (a) vectors b= [b1[1],b2[0],b1[1],b2[0],0.04,0.04] b[4], b[5] =0.04, 0.04 a[4], a[5] =1, 1 ###Output _____no_output_____ ###Markdown A simple function to print the results of a single optimization ###Code def print_results(): result = [round(sum([value(model.minIncome[option]) for option in options_sims])/len(options_sims),2), round(sum([value(model.minBiodiversity[option]) for option in options_sims])/len(options_sims),2), round(value(model.minIncomeRelaxed),2), round(value(model.minBiodiversityRelaxed),2), (round(1-value(model.risk[0]),2)), (round(1-value(model.risk[1]),2))] result_ha = [result[i]*multipliers[i] for i in range(0,len(result))] print(result_ha) ###Output _____no_output_____ ###Markdown Optimizing - using the same reference points as in the manuscript Round 1 - initial preferences ###Code # Income, BD, VaR Income, VaR BD, VaR alpha Income, VaR alpha BD #Reference values provided by the decision maker, in per hectare values. c = [250,.002,250,0.002,.95,.8] #Conversion to values used by optimization, scaled to similar magnitude. c1 = [areas*c[0]/1000,areas*c[1]*1000,areas*c[2]/1000,areas*c[3]*1000,1*c[4],1*c[5]] solveASF(b,a,c1,model,"cplex") print_results() ###Output [269.5732544740992, 0.0020456944899165334, 263.55251353178517, 0.0020363907231267025, 1.0, 1.0] ###Markdown Round 2 - modified preferences ###Code # Income, BD, VaR Income, VaR BD, VaR alpha Income, VaR alpha BD #Reference values provided by the decision maker, in per hectare values. c = [250,.0022,250,0.0021,.95,.95] #Conversion to values used by optimization, scaled to similar magnitude. c1 = [areas*c[0]/1000,areas*c[1]*1000,areas*c[2]/1000,areas*c[3]*1000,1*c[4],1*c[5]] solveASF(b,a,c1,model,"cplex") print_results() ###Output [237.1607486808368, 0.0021585780676653035, 234.1907715987702, 0.0021445424279403187, 0.92, 0.92] ###Markdown Round 3 - further modified preferences ###Code # Income, BD, VaR Income, VaR BD, VaR alpha Income, VaR alpha BD #Reference values provided by the decision maker, in per hectare values. c = [350,.0,350,0.0,.95,.95] #Conversion to values used by optimization, scaled to similar magnitude.
c1 = [areas*c[0]/1000,areas*c[1]*1000,areas*c[2]/1000,areas*c[3]*1000,1*c[4],1*c[5]] solveASF(b,a,c1,model,"cplex") print_results() ###Output [302.49599255060849, 0.0015879252467185677, 298.87334333955516, 0.0015825518629856235, 0.8, 0.8]
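###Markdown After a solve it can also be useful to see which scenarios the model chose to relax at the accepted risk level. The cell below is a minimal sketch, not part of the original analysis; it assumes, as the relaxation constraints above suggest, that `relaxationIncome` and `relaxationBiodiversity` take the value 1 for relaxed scenarios. ###Code
#Hedged sketch: list the scenarios whose minimum income / biodiversity constraints were relaxed
relaxed_income = [s for s in options_sims if value(model.relaxationIncome[s]) > 0.5]
relaxed_bio = [s for s in options_sims if value(model.relaxationBiodiversity[s]) > 0.5]
print('Relaxed income scenarios:', relaxed_income)
print('Relaxed biodiversity scenarios:', relaxed_bio)
###Output _____no_output_____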
notebooks/convolutional-neural-networks-for-visual-recognition/14_LSTM_Captioning.ipynb
###Markdown Image Captioning with LSTMsIn the previous exercise you implemented a vanilla RNN and applied it to image captioning. In this notebook you will implement the LSTM update rule and use it for image captioning. ###Code import time, os, json import numpy as np import matplotlib.pyplot as plt from utils.gradient_check import eval_numerical_gradient, eval_numerical_gradient_array, rel_error from utils.rnn_layers import * from utils.captioning_solver import CaptioningSolver from classifiers.rnn import CaptioningRNN from utils.coco_utils import load_coco_data, sample_coco_minibatch, decode_captions from utils.image_utils import image_from_url %matplotlib inline plt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots plt.rcParams['image.interpolation'] = 'nearest' plt.rcParams['image.cmap'] = 'gray' # for auto-reloading external modules # see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython %load_ext autoreload %autoreload 2 ###Output _____no_output_____ ###Markdown Load MS-COCO dataAs in the previous notebook, we will use the Microsoft COCO dataset for captioning. ###Code # Load COCO data from disk; this returns a dictionary # We'll work with dimensionality-reduced features for this notebook, but feel # free to experiment with the original features by changing the flag below. data = load_coco_data(pca_features=True) # Print out all the keys and values from the data dictionary for k, v in data.items(): if type(v) == np.ndarray: print(k, type(v), v.shape, v.dtype) else: print(k, type(v), len(v)) ###Output train_captions <class 'numpy.ndarray'> (400135, 17) int32 train_image_idxs <class 'numpy.ndarray'> (400135,) int32 val_captions <class 'numpy.ndarray'> (195954, 17) int32 val_image_idxs <class 'numpy.ndarray'> (195954,) int32 train_features <class 'numpy.ndarray'> (82783, 512) float32 val_features <class 'numpy.ndarray'> (40504, 512) float32 idx_to_word <class 'list'> 1004 word_to_idx <class 'dict'> 1004 train_urls <class 'numpy.ndarray'> (82783,) <U63 val_urls <class 'numpy.ndarray'> (40504,) <U63 ###Markdown LSTMIf you read recent papers, you'll see that many people use a variant on the vanilla RNN called Long-Short Term Memory (LSTM) RNNs. Vanilla RNNs can be tough to train on long sequences due to vanishing and exploding gradients caused by repeated matrix multiplication. LSTMs solve this problem by replacing the simple update rule of the vanilla RNN with a gating mechanism as follows.Similar to the vanilla RNN, at each timestep we receive an input $x_t\in\mathbb{R}^D$ and the previous hidden state $h_{t-1}\in\mathbb{R}^H$; the LSTM also maintains an $H$-dimensional *cell state*, so we also receive the previous cell state $c_{t-1}\in\mathbb{R}^H$. The learnable parameters of the LSTM are an *input-to-hidden* matrix $W_x\in\mathbb{R}^{4H\times D}$, a *hidden-to-hidden* matrix $W_h\in\mathbb{R}^{4H\times H}$ and a *bias vector* $b\in\mathbb{R}^{4H}$.At each timestep we first compute an *activation vector* $a\in\mathbb{R}^{4H}$ as $a=W_xx_t + W_hh_{t-1}+b$. We then divide this into four vectors $a_i,a_f,a_o,a_g\in\mathbb{R}^H$ where $a_i$ consists of the first $H$ elements of $a$, $a_f$ is the next $H$ elements of $a$, etc. 
We then compute the *input gate* $i\in\mathbb{R}^H$, *forget gate* $f\in\mathbb{R}^H$, *output gate* $o\in\mathbb{R}^H$ and *block input* $g\in\mathbb{R}^H$ as$$\begin{align*}i = \sigma(a_i) \hspace{2pc}f = \sigma(a_f) \hspace{2pc}o = \sigma(a_o) \hspace{2pc}g = \tanh(a_g)\end{align*}$$where $\sigma$ is the sigmoid function and $\tanh$ is the hyperbolic tangent, both applied elementwise. Finally we compute the next cell state $c_t$ and next hidden state $h_t$ as$$c_{t} = f\odot c_{t-1} + i\odot g \hspace{4pc}h_t = o\odot\tanh(c_t)$$where $\odot$ is the elementwise product of vectors. In the rest of the notebook we will implement the LSTM update rule and apply it to the image captioning task. In the code, we assume that data is stored in batches so that $X_t \in \mathbb{R}^{N\times D}$, and will work with *transposed* versions of the parameters: $W_x \in \mathbb{R}^{D \times 4H}$, $W_h \in \mathbb{R}^{H\times 4H}$ so that activations $A \in \mathbb{R}^{N\times 4H}$ can be computed efficiently as $A = X_t W_x + H_{t-1} W_h$ LSTM: step forward Implement the forward pass for a single timestep of an LSTM in the `lstm_step_forward` function in the file `utils/rnn_layers.py`. This should be similar to the `rnn_step_forward` function that you implemented above, but using the LSTM update rule instead. Once you are done, run the following to perform a simple test of your implementation. You should see errors on the order of `e-8` or less. ###Code N, D, H = 3, 4, 5 x = np.linspace(-0.4, 1.2, num=N*D).reshape(N, D) prev_h = np.linspace(-0.3, 0.7, num=N*H).reshape(N, H) prev_c = np.linspace(-0.4, 0.9, num=N*H).reshape(N, H) Wx = np.linspace(-2.1, 1.3, num=4*D*H).reshape(D, 4 * H) Wh = np.linspace(-0.7, 2.2, num=4*H*H).reshape(H, 4 * H) b = np.linspace(0.3, 0.7, num=4*H) next_h, next_c, cache = lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b) expected_next_h = np.asarray([ [ 0.24642878, 0.28650384, 0.3229522 , 0.35578391, 0.38507927], [ 0.5759557 , 0.62998045, 0.67939457, 0.72397886, 0.76369564], [ 0.68627387, 0.75126369, 0.80592917, 0.8506307 , 0.88618946]]) expected_next_c = np.asarray([ [ 0.34455393, 0.41038464, 0.47491437, 0.53810747, 0.599929 ], [ 0.74660534, 0.83937852, 0.93414622, 1.03080224, 1.12920394], [ 0.8788603 , 1.01444375, 1.15462326, 1.29862835, 1.44508007]]) print('next_h error: ', rel_error(expected_next_h, next_h)) print('next_c error: ', rel_error(expected_next_c, next_c)) ###Output next_h error: 8.31698571060779e-09 next_c error: 4.553917113680252e-09 ###Markdown LSTM: step backward Implement the backward pass for a single LSTM timestep in the function `lstm_step_backward` in the file `utils/rnn_layers.py`. Once you are done, run the following to perform numeric gradient checking on your implementation. You should see errors on the order of `e-7` or less.
###Code np.random.seed(231) N, D, H = 4, 5, 6 x = np.random.randn(N, D) prev_h = np.random.randn(N, H) prev_c = np.random.randn(N, H) Wx = np.random.randn(D, 4 * H) Wh = np.random.randn(H, 4 * H) b = np.random.randn(4 * H) next_h, next_c, cache = lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b) dnext_h = np.random.randn(*next_h.shape) dnext_c = np.random.randn(*next_c.shape) fx_h = lambda x: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fh_h = lambda h: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fc_h = lambda c: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fWx_h = lambda Wx: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fWh_h = lambda Wh: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fb_h = lambda b: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[0] fx_c = lambda x: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] fh_c = lambda h: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] fc_c = lambda c: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] fWx_c = lambda Wx: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] fWh_c = lambda Wh: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] fb_c = lambda b: lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b)[1] num_grad = eval_numerical_gradient_array dx_num = num_grad(fx_h, x, dnext_h) + num_grad(fx_c, x, dnext_c) dh_num = num_grad(fh_h, prev_h, dnext_h) + num_grad(fh_c, prev_h, dnext_c) dc_num = num_grad(fc_h, prev_c, dnext_h) + num_grad(fc_c, prev_c, dnext_c) dWx_num = num_grad(fWx_h, Wx, dnext_h) + num_grad(fWx_c, Wx, dnext_c) dWh_num = num_grad(fWh_h, Wh, dnext_h) + num_grad(fWh_c, Wh, dnext_c) db_num = num_grad(fb_h, b, dnext_h) + num_grad(fb_c, b, dnext_c) dx, dh, dc, dWx, dWh, db = lstm_step_backward(dnext_h, dnext_c, cache) print('dx error: ', rel_error(dx_num, dx)) print('dh error: ', rel_error(dh_num, dh)) print('dc error: ', rel_error(dc_num, dc)) print('dWx error: ', rel_error(dWx_num, dWx)) print('dWh error: ', rel_error(dWh_num, dWh)) print('db error: ', rel_error(db_num, db)) ###Output dx error: 4.825249082376043e-10 dh error: 2.429607196760839e-10 dc error: 4.632037203091362e-11 dWx error: 3.1927986058708833e-09 dWh error: 1.934765389089522e-08 db error: 5.021609036413371e-10 ###Markdown LSTM: forwardIn the function `lstm_forward` in the file `utils/rnn_layers.py`, implement the `lstm_forward` function to run an LSTM forward on an entire timeseries of data.When you are done, run the following to check your implementation. You should see an error on the order of `e-7` or less. ###Code N, D, H, T = 2, 5, 4, 3 x = np.linspace(-0.4, 0.6, num=N*T*D).reshape(N, T, D) h0 = np.linspace(-0.4, 0.8, num=N*H).reshape(N, H) Wx = np.linspace(-0.2, 0.9, num=4*D*H).reshape(D, 4 * H) Wh = np.linspace(-0.3, 0.6, num=4*H*H).reshape(H, 4 * H) b = np.linspace(0.2, 0.7, num=4*H) h, cache = lstm_forward(x, h0, Wx, Wh, b) expected_h = np.asarray([ [[ 0.0155477 , 0.01614085, 0.01673616, 0.01733362], [ 0.09358164, 0.10263888, 0.11186275, 0.12123936], [ 0.2888865 , 0.3107429 , 0.33244481, 0.35392041]], [[ 0.49459682, 0.51378891, 0.5316881 , 0.54835908], [ 0.71298602, 0.73589244, 0.75662273, 0.77538037], [ 0.85508075, 0.87260312, 0.88773107, 0.90081359]]]) print('h error: ', rel_error(expected_h, h)) ###Output h error: 1.381002062962206e-07 ###Markdown LSTM: backwardImplement the backward pass for an LSTM over an entire timeseries of data in the function `lstm_backward` in the file `utils/rnn_layers.py`. When you are done, run the following to perform numeric gradient checking on your implementation. 
You should see errors on the order of `e-8` or less. (For `dWh`, it's fine if your error is on the order of `e-6` or less). ###Code from utils.rnn_layers import lstm_forward, lstm_backward np.random.seed(231) N, D, T, H = 2, 3, 10, 6 x = np.random.randn(N, T, D) h0 = np.random.randn(N, H) Wx = np.random.randn(D, 4 * H) Wh = np.random.randn(H, 4 * H) b = np.random.randn(4 * H) out, cache = lstm_forward(x, h0, Wx, Wh, b) dout = np.random.randn(*out.shape) dx, dh0, dWx, dWh, db = lstm_backward(dout, cache) fx = lambda x: lstm_forward(x, h0, Wx, Wh, b)[0] fh0 = lambda h0: lstm_forward(x, h0, Wx, Wh, b)[0] fWx = lambda Wx: lstm_forward(x, h0, Wx, Wh, b)[0] fWh = lambda Wh: lstm_forward(x, h0, Wx, Wh, b)[0] fb = lambda b: lstm_forward(x, h0, Wx, Wh, b)[0] dx_num = eval_numerical_gradient_array(fx, x, dout) dh0_num = eval_numerical_gradient_array(fh0, h0, dout) dWx_num = eval_numerical_gradient_array(fWx, Wx, dout) dWh_num = eval_numerical_gradient_array(fWh, Wh, dout) db_num = eval_numerical_gradient_array(fb, b, dout) print('dx error: ', rel_error(dx_num, dx)) print('dh0 error: ', rel_error(dh0_num, dh0)) print('dWx error: ', rel_error(dWx_num, dWx)) print('dWh error: ', rel_error(dWh_num, dWh)) print('db error: ', rel_error(db_num, db)) ###Output dx error: 4.098644874456977e-09 dh0 error: 8.774198982513716e-10 dWx error: 1.537006983349729e-09 dWh error: 7.971014560087543e-09 db error: 1.8270440843902108e-09 ###Markdown INLINE QUESTION Recall that in an LSTM the input gate $i$, forget gate $f$, and output gate $o$ are all outputs of a sigmoid function. Why don't we use the ReLU activation function instead of sigmoid to compute these values? Explain. **Your Answer:** The gates multiply the cell state and block input elementwise, so they need to behave like soft switches with values in $[0, 1]$; the sigmoid provides exactly that, letting each gate interpolate smoothly between fully closed (0) and fully open (1). ReLU is unbounded above, so a gate value could exceed 1 and repeatedly amplify the cell state across timesteps, reintroducing the exploding values that the gating mechanism is meant to prevent. ReLU also outputs a hard 0 with zero gradient for all negative inputs, so a gate that closes could never be reopened by gradient descent. LSTM captioning model Now that you have implemented an LSTM, update the implementation of the `loss` method of the `CaptioningRNN` class in the file `classifiers/rnn.py` to handle the case where `self.cell_type` is `lstm`. This should require adding less than 10 lines of code. Once you have done so, run the following to check your implementation. You should see a difference on the order of `e-10` or less. ###Code N, D, W, H = 10, 20, 30, 40 word_to_idx = {'<NULL>': 0, 'cat': 2, 'dog': 3} V = len(word_to_idx) T = 13 model = CaptioningRNN(word_to_idx, input_dim=D, wordvec_dim=W, hidden_dim=H, cell_type='lstm', dtype=np.float64) # Set all model parameters to fixed values for k, v in model.params.items(): model.params[k] = np.linspace(-1.4, 1.3, num=v.size).reshape(*v.shape) features = np.linspace(-0.5, 1.7, num=N*D).reshape(N, D) captions = (np.arange(N * T) % V).reshape(N, T) loss, grads = model.loss(features, captions) expected_loss = 9.81829623280486 print('loss: ', loss) print('expected loss: ', expected_loss) print('difference: ', abs(loss - expected_loss)) ###Output loss: 9.818296232804864 expected loss: 9.81829623280486 difference: 3.552713678800501e-15 ###Markdown Overfit LSTM captioning model Run the following to overfit an LSTM captioning model on the same small dataset as we used for the RNN previously. You should see a final loss less than 0.5.
###Code np.random.seed(231) small_data = load_coco_data(max_train=50) small_lstm_model = CaptioningRNN( cell_type='lstm', word_to_idx=data['word_to_idx'], input_dim=data['train_features'].shape[1], hidden_dim=512, wordvec_dim=256, dtype=np.float32, ) small_lstm_solver = CaptioningSolver(small_lstm_model, small_data, update_rule='adam', num_epochs=50, batch_size=25, optim_config={ 'learning_rate': 5e-3, }, lr_decay=0.995, verbose=True, print_every=10, ) small_lstm_solver.train() # Plot the training losses plt.plot(small_lstm_solver.loss_history) plt.xlabel('Iteration') plt.ylabel('Loss') plt.title('Training loss history') plt.show() ###Output (Iteration 0 / 100) loss: 79.515397 (Iteration 10 / 100) loss: 40.884586 (Iteration 20 / 100) loss: 26.833216 (Iteration 30 / 100) loss: 11.919131 (Iteration 40 / 100) loss: 5.498951 (Iteration 50 / 100) loss: 1.803673 (Iteration 60 / 100) loss: 0.676376 (Iteration 70 / 100) loss: 0.280431 (Iteration 80 / 100) loss: 0.203684 (Iteration 90 / 100) loss: 0.122093 ###Markdown LSTM test-time samplingModify the `sample` method of the `CaptioningRNN` class to handle the case where `self.cell_type` is `lstm`. This should take fewer than 10 lines of code.When you are done run the following to sample from your overfit LSTM model on some training and validation set samples. As with the RNN, training results should be very good, and validation results probably won't make a lot of sense (because we're overfitting). ###Code for split in ['train', 'val']: minibatch = sample_coco_minibatch(small_data, split=split, batch_size=2) gt_captions, features, urls = minibatch gt_captions = decode_captions(gt_captions, data['idx_to_word']) sample_captions = small_lstm_model.sample(features) sample_captions = decode_captions(sample_captions, data['idx_to_word']) for gt_caption, sample_caption, url in zip(gt_captions, sample_captions, urls): print(url) print('%s\n%s\nGT:%s' % (split, sample_caption, gt_caption)) # plt.imshow(image_from_url(url)) # plt.title('%s\n%s\nGT:%s' % (split, sample_caption, gt_caption)) # plt.axis('off') # plt.show() ###Output http://farm4.staticflickr.com/3316/5749092128_ff6e9aaa29_z.jpg train a person <UNK> their pizza out of a <UNK> <END> GT:<START> a person <UNK> their pizza out of a <UNK> <END> http://farm4.staticflickr.com/3776/9327227623_4eefba16a7_z.jpg train several boats <UNK> in a body of water <END> GT:<START> several boats <UNK> in a body of water <END> http://farm8.staticflickr.com/7418/8826607974_4924186df6_z.jpg val a person <UNK> on the <UNK> <END> GT:<START> a skier skiing down a snowy mountain with trees in the background <END> http://farm1.staticflickr.com/194/482608166_8657a3616f_z.jpg val open cute dog standing on a box of a building <END> GT:<START> a white plate with a cut in half sandwich <END>
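###Markdown For reference, below is a minimal sketch of the single-timestep forward pass described by the update rule earlier in this notebook. It is not the assignment's official solution in `utils/rnn_layers.py`, just one direct transcription of the equations; the exact contents of `cache` are an assumption, since a real implementation would cache whatever its matching backward pass needs. ###Code
import numpy as np

def sigmoid(z):
    # logistic sigmoid, applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step_forward_sketch(x, prev_h, prev_c, Wx, Wh, b):
    """One LSTM timestep. Shapes: x (N, D); prev_h, prev_c (N, H); Wx (D, 4H); Wh (H, 4H); b (4H,)."""
    H = prev_h.shape[1]
    a = x.dot(Wx) + prev_h.dot(Wh) + b   # activation vector A = X_t Wx + H_{t-1} Wh + b, shape (N, 4H)
    i = sigmoid(a[:, :H])                # input gate
    f = sigmoid(a[:, H:2 * H])           # forget gate
    o = sigmoid(a[:, 2 * H:3 * H])       # output gate
    g = np.tanh(a[:, 3 * H:])            # block input
    next_c = f * prev_c + i * g          # c_t = f * c_{t-1} + i * g (elementwise)
    next_h = o * np.tanh(next_c)         # h_t = o * tanh(c_t)
    cache = (x, prev_h, prev_c, Wx, Wh, b, i, f, o, g, next_c)
    return next_h, next_c, cache
###Output _____no_output_____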
Udacity/Bike_Share_Analysis/.ipynb_checkpoints/Bike_Share_Analysis (2)-checkpoint.ipynb
###Markdown 2016 US Bike Share Activity Snapshot Table of Contents- [Introduction](intro)- [Posing Questions](pose_questions)- [Data Collection and Wrangling](wrangling) - [Condensing the Trip Data](condensing)- [Exploratory Data Analysis](eda) - [Statistics](statistics) - [Visualizations](visualizations)- [Performing Your Own Analysis](eda_continued)- [Conclusions](conclusions) Introduction> **Tip**: Quoted sections like this will provide helpful instructions on how to navigate and use a Jupyter notebook.Over the past decade, bicycle-sharing systems have been growing in number and popularity in cities across the world. Bicycle-sharing systems allow users to rent bicycles for short trips, typically 30 minutes or less. Thanks to the rise in information technologies, it is easy for a user of the system to access a dock within the system to unlock or return bicycles. These technologies also provide a wealth of data that can be used to explore how these bike-sharing systems are used.In this project, you will perform an exploratory analysis on data provided by [Motivate](https://www.motivateco.com/), a bike-share system provider for many major cities in the United States. You will compare the system usage between three large cities: New York City, Chicago, and Washington, DC. You will also see if there are any differences within each system for those users that are registered, regular users and those users that are short-term, casual users. Posing QuestionsBefore looking at the bike sharing data, you should start by asking questions you might want to understand about the bike share data. Consider, for example, if you were working for Motivate. What kinds of information would you want to know about in order to make smarter business decisions? If you were a user of the bike-share service, what factors might influence how you would want to use the service?**Question 1**: Write at least two questions related to bike sharing that you think could be answered by data.**Answer**: Replace this text with your response!> **Tip**: If you double click on this cell, you will see the text change so that all of the formatting is removed. This allows you to edit this block of text. This block of text is written using [Markdown](http://daringfireball.net/projects/markdown/syntax), which is a way to format text using headers, links, italics, and many other options using a plain-text syntax. You will also use Markdown later in the Nanodegree program. Use **Shift** + **Enter** or **Shift** + **Return** to run the cell and show its rendered form. question 1:> Find the number of *Subscribers* and *Customers* in every city question 2:> Duration of the trips of every city. Data Collection and WranglingNow it's time to collect and explore our data. In this project, we will focus on the record of individual trips taken in 2016 from our selected cities: New York City, Chicago, and Washington, DC. Each of these cities has a page where we can freely download the trip data.:- New York City (Citi Bike): [Link](https://www.citibikenyc.com/system-data)- Chicago (Divvy): [Link](https://www.divvybikes.com/system-data)- Washington, DC (Capital Bikeshare): [Link](https://www.capitalbikeshare.com/system-data)If you visit these pages, you will notice that each city has a different way of delivering its data. Chicago updates with new data twice a year, Washington DC is quarterly, and New York City is monthly. 
**However, you do not need to download the data yourself.** The data has already been collected for you in the `/data/` folder of the project files. While the original data for 2016 is spread among multiple files for each city, the files in the `/data/` folder collect all of the trip data for the year into one file per city. Some data wrangling of inconsistencies in timestamp format within each city has already been performed for you. In addition, a random 2% sample of the original data is taken to make the exploration more manageable. **Question 2**: However, there is still a lot of data for us to investigate, so it's a good idea to start off by looking at one entry from each of the cities we're going to analyze. Run the first code cell below to load some packages and functions that you'll be using in your analysis. Then, complete the second code cell to print out the first trip recorded from each of the cities (the second line of each data file).> **Tip**: You can run a code cell like you formatted Markdown cells above by clicking on the cell and using the keyboard shortcut **Shift** + **Enter** or **Shift** + **Return**. Alternatively, a code cell can be executed using the **Play** button in the toolbar after selecting it. While the cell is running, you will see an asterisk in the message to the left of the cell, i.e. `In [*]:`. The asterisk will change into a number to show that execution has completed, e.g. `In [1]`. If there is output, it will show up as `Out [1]:`, with an appropriate number to match the "In" number. ###Code ## import all necessary packages and functions. import csv # read and write csv files from datetime import datetime # operations to parse dates from pprint import pprint # use to print data structures like dictionaries in # a nicer way than the base print function. def print_first_point(filename): """ This function prints and returns the first data point (second row) from a csv file that includes a header row. """ # print city name for reference city = filename.split('-')[0].split('/')[-1] print('\nCity: {}'.format(city)) with open(filename, 'r') as f_in: ## TODO: Use the csv library to set up a DictReader object. ## ## see https://docs.python.org/3/library/csv.html ## trip_reader = csv.DictReader(f_in) ## TODO: Use a function on the DictReader object to read the ## ## first trip from the data file and store it in a variable. ## ## see https://docs.python.org/3/library/csv.html#reader-objects ## first_trip = next(trip_reader) ## TODO: Use the pprint library to print the first trip. 
## ## see https://docs.python.org/3/library/pprint.html ## pprint(first_trip) # output city name and first trip for later testing return (city, first_trip) # list of files for each city data_files = ['./data/NYC-CitiBike-2016.csv', './data/Chicago-Divvy-2016.csv', './data/Washington-CapitalBikeshare-2016.csv',] # print the first trip from each file, store in dictionary example_trips = {} for data_file in data_files: city, first_trip = print_first_point(data_file) example_trips[city] = first_trip ###Output City: NYC OrderedDict([('tripduration', '839'), ('starttime', '1/1/2016 00:09:55'), ('stoptime', '1/1/2016 00:23:54'), ('start station id', '532'), ('start station name', 'S 5 Pl & S 4 St'), ('start station latitude', '40.710451'), ('start station longitude', '-73.960876'), ('end station id', '401'), ('end station name', 'Allen St & Rivington St'), ('end station latitude', '40.72019576'), ('end station longitude', '-73.98997825'), ('bikeid', '17109'), ('usertype', 'Customer'), ('birth year', ''), ('gender', '0')]) City: Chicago OrderedDict([('trip_id', '9080545'), ('starttime', '3/31/2016 23:30'), ('stoptime', '3/31/2016 23:46'), ('bikeid', '2295'), ('tripduration', '926'), ('from_station_id', '156'), ('from_station_name', 'Clark St & Wellington Ave'), ('to_station_id', '166'), ('to_station_name', 'Ashland Ave & Wrightwood Ave'), ('usertype', 'Subscriber'), ('gender', 'Male'), ('birthyear', '1990')]) City: Washington OrderedDict([('Duration (ms)', '427387'), ('Start date', '3/31/2016 22:57'), ('End date', '3/31/2016 23:04'), ('Start station number', '31602'), ('Start station', 'Park Rd & Holmead Pl NW'), ('End station number', '31207'), ('End station', 'Georgia Ave and Fairmont St NW'), ('Bike number', 'W20842'), ('Member Type', 'Registered')]) ###Markdown If everything has been filled out correctly, you should see below the printout of each city name (which has been parsed from the data file name) that the first trip has been parsed in the form of a dictionary. When you set up a `DictReader` object, the first row of the data file is normally interpreted as column names. Every other row in the data file will use those column names as keys, as a dictionary is generated for each row.This will be useful since we can refer to quantities by an easily-understandable label instead of just a numeric index. For example, if we have a trip stored in the variable `row`, then we would rather get the trip duration from `row['duration']` instead of `row[0]`. Condensing the Trip DataIt should also be observable from the above printout that each city provides different information. Even where the information is the same, the column names and formats are sometimes different. To make things as simple as possible when we get to the actual exploration, we should trim and clean the data. Cleaning the data makes sure that the data formats across the cities are consistent, while trimming focuses only on the parts of the data we are most interested in to make the exploration easier to work with.You will generate new data files with five values of interest for each trip: trip duration, starting month, starting hour, day of the week, and user type. Each of these may require additional wrangling depending on the city:- **Duration**: This has been given to us in seconds (New York, Chicago) or milliseconds (Washington). 
A more natural unit of analysis will be if all the trip durations are given in terms of minutes.- **Month**, **Hour**, **Day of Week**: Ridership volume is likely to change based on the season, time of day, and whether it is a weekday or weekend. Use the start time of the trip to obtain these values. The New York City data includes the seconds in their timestamps, while Washington and Chicago do not. The [`datetime`](https://docs.python.org/3/library/datetime.html) package will be very useful here to make the needed conversions.- **User Type**: It is possible that users who are subscribed to a bike-share system will have different patterns of use compared to users who only have temporary passes. Washington divides its users into two types: 'Registered' for users with annual, monthly, and other longer-term subscriptions, and 'Casual', for users with 24-hour, 3-day, and other short-term passes. The New York and Chicago data uses 'Subscriber' and 'Customer' for these groups, respectively. For consistency, you will convert the Washington labels to match the other two.**Question 3a**: Complete the helper functions in the code cells below to address each of the cleaning tasks described above. ###Code def duration_in_mins(datum, city): """ Takes as input a dictionary containing info about a single trip (datum) and its origin city (city) and returns the trip duration in units of minutes. Remember that Washington is in terms of milliseconds while Chicago and NYC are in terms of seconds. HINT: The csv module reads in all of the data as strings, including numeric values. You will need a function to convert the strings into an appropriate numeric type when making your transformations. see https://docs.python.org/3/library/functions.html """ # YOUR CODE HERE if city == 'NYC': duration = float(datum['tripduration'])/60 elif city =='Chicago': duration = float(datum['tripduration'])/60 else: duration = float(datum['Duration (ms)'])/60000 return duration # Some tests to check that your code works. There should be no output if all of # the assertions pass. The `example_trips` dictionary was obtained from when # you printed the first trip from each of the original data files. tests = {'NYC': 13.9833, 'Chicago': 15.4333, 'Washington': 7.1231} for city in tests: assert abs(duration_in_mins(example_trips[city], city) - tests[city]) < .001 def time_of_trip(datum, city): """ Takes as input a dictionary containing info about a single trip (datum) and its origin city (city) and returns the month, hour, and day of the week in which the trip was made. Remember that NYC includes seconds, while Washington and Chicago do not. HINT: You should use the datetime module to parse the original date strings into a format that is useful for extracting the desired information. 
see https://docs.python.org/3/library/datetime.html#strftime-and-strptime-behavior """ # YOUR CODE HERE if city == 'Chicago': datetime_object = datetime.strptime((datum['starttime']), '%m/%d/%Y %H:%M') month = datetime_object.month hour = datetime_object.hour day_of_week = datetime_object.strftime("%A") elif city == 'NYC': datetime_object = datetime.strptime((datum['starttime']), '%m/%d/%Y %H:%M:%S') month = datetime_object.month hour = datetime_object.hour day_of_week = datetime_object.strftime("%A") else: datetime_object = datetime.strptime((datum['Start date']), '%m/%d/%Y %H:%M') month = datetime_object.month hour = datetime_object.hour day_of_week = datetime_object.strftime("%A") return (month, hour, day_of_week) # Some tests to check that your code works. There should be no output if all of # the assertions pass. The `example_trips` dictionary was obtained from when # you printed the first trip from each of the original data files. tests = {'NYC': (1, 0, 'Friday'), 'Chicago': (3, 23, 'Thursday'), 'Washington': (3, 22, 'Thursday')} for city in tests: assert time_of_trip(example_trips[city], city) == tests[city] def type_of_user(datum, city): """ Takes as input a dictionary containing info about a single trip (datum) and its origin city (city) and returns the type of system user that made the trip. Remember that Washington has different category names compared to Chicago and NYC. """ # YOUR CODE HERE if city == 'Chicago': user_type = datum['usertype'] elif city == 'NYC': user_type = datum['usertype'] else: t = datum['Member Type'] if t == 'Registered': user_type = 'Subscriber' else: user_type = 'Customer' return user_type # Some tests to check that your code works. There should be no output if all of # the assertions pass. The `example_trips` dictionary was obtained from when # you printed the first trip from each of the original data files. tests = {'NYC': 'Customer', 'Chicago': 'Subscriber', 'Washington': 'Subscriber'} for city in tests: assert type_of_user(example_trips[city], city) == tests[city] ###Output _____no_output_____ ###Markdown **Question 3b**: Now, use the helper functions you wrote above to create a condensed data file for each city consisting only of the data fields indicated above. In the `/examples/` folder, you will see an example datafile from the [Bay Area Bike Share](http://www.bayareabikeshare.com/open-data) before and after conversion. Make sure that your output is formatted to be consistent with the example file. ###Code def condense_data(in_file, out_file, city): """ This function takes full data from the specified input file and writes the condensed data to a specified output file. The city argument determines how the input file will be parsed. HINT: See the cell below to see how the arguments are structured! """ with open(out_file, 'w') as f_out, open(in_file, 'r') as f_in: # set up csv DictWriter object - writer requires column names for the # first row as the "fieldnames" argument out_colnames = ['duration', 'month', 'hour', 'day_of_week', 'user_type'] trip_writer = csv.DictWriter(f_out, fieldnames = out_colnames) trip_writer.writeheader() ## TODO: set up csv DictReader object ## trip_reader = csv.DictReader(f_in) # collect data from and process each row for row in trip_reader: # set up a dictionary to hold the values for the cleaned and trimmed # data point new_point = {} ## TODO: use the helper functions to get the cleaned data from ## ## the original data dictionaries. 
## ## Note that the keys for the new_point dictionary should match ## ## the column names set in the DictWriter object above. ## new_point['duration'] = duration_in_mins(row,city) new_point['month'], new_point['hour'], new_point['day_of_week'] = time_of_trip(row,city) new_point['user_type'] = type_of_user(row,city) ## TODO: write the processed information to the output file. ## ## see https://docs.python.org/3/library/csv.html#writer-objects ## trip_writer.writerow(new_point) # Run this cell to check your work city_info = {'Washington': {'in_file': './data/Washington-CapitalBikeshare-2016.csv', 'out_file': './data/Washington-2016-Summary.csv'}, 'Chicago': {'in_file': './data/Chicago-Divvy-2016.csv', 'out_file': './data/Chicago-2016-Summary.csv'}, 'NYC': {'in_file': './data/NYC-CitiBike-2016.csv', 'out_file': './data/NYC-2016-Summary.csv'}} for city, filenames in city_info.items(): condense_data(filenames['in_file'], filenames['out_file'], city) print_first_point(filenames['out_file']) ###Output City: Washington OrderedDict([('duration', '7.123116666666666'), ('month', '3'), ('hour', '22'), ('day_of_week', 'Thursday'), ('user_type', 'Subscriber')]) City: Chicago OrderedDict([('duration', '15.433333333333334'), ('month', '3'), ('hour', '23'), ('day_of_week', 'Thursday'), ('user_type', 'Subscriber')]) City: NYC OrderedDict([('duration', '13.983333333333333'), ('month', '1'), ('hour', '0'), ('day_of_week', 'Friday'), ('user_type', 'Customer')]) ###Markdown > **Tip**: If you save a jupyter Notebook, the output from running code blocks will also be saved. However, the state of your workspace will be reset once a new session is started. Make sure that you run all of the necessary code blocks from your previous session to reestablish variables and functions before picking up where you last left off. Exploratory Data Analysis Now that you have the data collected and wrangled, you're ready to start exploring the data. In this section you will write some code to compute descriptive statistics from the data. You will also be introduced to the `matplotlib` library to create some basic histograms of the data. Statistics First, let's compute some basic counts. The first cell below contains a function that uses the csv module to iterate through a provided data file, returning the number of trips made by subscribers and customers. The second cell runs this function on the example Bay Area data in the `/examples/` folder. Modify the cells to answer the question below. **Question 4a**: Which city has the highest number of trips? Which city has the highest proportion of trips made by subscribers? Which city has the highest proportion of trips made by short-term customers? **Answer**: Based on the output below, NYC has by far the highest number of trips (276,798 in this sample) and also the highest proportion of trips made by subscribers (88.84%), while Chicago has the highest proportion of trips made by short-term customers (23.77%). ###Code def number_of_trips(filename): """ This function reads in a file with trip data and reports the number of trips made by subscribers, customers, and total overall.
""" with open(filename, 'r') as f_in: # set up csv reader object reader = csv.DictReader(f_in) # initialize count variables n_subscribers = 0 n_customers = 0 # tally up ride types for row in reader: if row['user_type'] == 'Subscriber': n_subscribers += 1 else: n_customers += 1 # compute total number of rides n_total = n_subscribers + n_customers # return tallies as a tuple return(n_subscribers, n_customers, n_total) def proportion_users(filename): subscribers, customers, total = number_of_trips(filename) prop_subs = round((subscribers/total)*100 , 2) prop_cust = round((customers/total)*100 , 2) return (prop_subs,prop_cust) ## Modify this and the previous cell to answer Question 4a. Remember to run ## ## the function on the cleaned data files you created from Question 3. ## data_file = './examples/BayArea-Y3-Summary.csv' print(number_of_trips(data_file)) data_NYC = './data/NYC-2016-Summary.csv' data_Washington = './data/Washington-2016-Summary.csv' data_Chicago = './data/Chicago-2016-Summary.csv' temp = proportion_users(data_NYC) print("NYC: {}".format(number_of_trips(data_NYC))) print("Proportion of Subscribers of NYC: {}% Proportion of Customers of NYC: {}% \n".format(temp[0],temp[1])) temp = proportion_users(data_Washington) print("Washington: {}".format(number_of_trips(data_Washington))) print("Proportion of Subscribers of Washington: {}% Proportion of Customers of Washington: {}% \n".format(temp[0],temp[1])) temp = proportion_users(data_Chicago) print("NYC: {}".format(number_of_trips(data_Chicago))) print("Proportion of Subscribers of Chicago: {}% Proportion of Customers of Chicago: {}% \n".format(temp[0],temp[1])) ###Output (5666, 633, 6299) NYC: (245896, 30902, 276798) Proportion of Subscribers of NYC: 88.84% Proportion of Customers of NYC: 11.16% Washington: (51753, 14573, 66326) Proportion of Subscribers of Washington: 78.03% Proportion of Customers of Washington: 21.97% NYC: (54982, 17149, 72131) Proportion of Subscribers of Chicago: 76.23% Proportion of Customers of Chicago: 23.77% ###Markdown > **Tip**: In order to add additional cells to a notebook, you can use the "Insert Cell Above" and "Insert Cell Below" options from the menu bar above. There is also an icon in the toolbar for adding new cells, with additional icons for moving the cells up and down the document. By default, new cells are of the code type; you can also specify the cell type (e.g. Code or Markdown) of selected cells from the Cell menu or the dropdown in the toolbar.Now, you will write your own code to continue investigating properties of the data.**Question 4b**: Bike-share systems are designed for riders to take short trips. Most of the time, users are allowed to take trips of 30 minutes or less with no additional charges, with overage charges made for trips of longer than that duration. What is the average trip length for each city? What proportion of rides made in each city are longer than 30 minutes?**Answer**: Replace this text with your reponse! ###Code ## Use this and additional cells to answer Question 4b. ## ## ## ## HINT: The csv module reads in all of the data as strings, including ## ## numeric values. You will need a function to convert the strings ## ## into an appropriate numeric type before you aggregate data. ## ## TIP: For the Bay Area example, the average trip length is 14 minutes ## ## and 3.5% of trips are longer than 30 minutes. 
## def avg_trip_length(filename): with open(filename) as f_in: reader = csv.DictReader(f_in) total_trip_len = 0 number_long_trips = 0 for row in reader: total_trip_len += float(row['duration']) if float(row['duration']) > 30: number_long_trips +=1 total_trips = number_of_trips(filename) avg_trip = int(total_trip_len / total_trips[2]) proportion = (number_long_trips / total_trips[2]) * 100 return avg_trip, round(proportion, 2) data_Washington = './data/Washington-2016-Summary.csv' data_Chicago = './data/Chicago-2016-Summary.csv' data_NYC = './data/NYC-2016-Summary.csv' temp=avg_trip_length(data_Washington) print("Avg trip length of Washington: {}, Proportion of long trips: {}".format(temp[0],temp[1])) temp=avg_trip_length(data_Chicago) print("Avg trip length of Chicago: {}, Proportion of long trips: {}".format(temp[0],temp[1])) temp=avg_trip_length(data_NYC) print("Avg trip length of NYC: {}, Proportion of long trips: {}".format(temp[0],temp[1])) ###Output Avg trip length of Washington: 18, Proportion of long trips: 10.84 Avg trip length of Chicago: 16, Proportion of long trips: 8.33 Avg trip length of NYC: 15, Proportion of long trips: 7.3 ###Markdown **Question 4c**: Dig deeper into the question of trip duration based on ridership. Choose one city. Within that city, which type of user takes longer rides on average: Subscribers or Customers? **Answer**: For Washington (computed below), Customers take much longer rides on average: about 41.68 minutes, versus 12.53 minutes for Subscribers. ###Code ## Use this and additional cells to answer Question 4c. If you have ## ## not done so yet, consider revising some of your previous code to ## ## make use of functions for reusability. ## ## ## ## TIP: For the Bay Area example data, you should find the average ## ## Subscriber trip duration to be 9.5 minutes and the average Customer ## ## trip duration to be 54.6 minutes. Do the other cities have this ## ## level of difference? ## def avg_duration_by_usertype(filename): with open(filename) as f_in: reader = csv.DictReader(f_in) subs_duration = 0 cust_duration = 0 for r in reader: if r['user_type'] == 'Subscriber': subs_duration += float(r['duration']) else: cust_duration += float(r['duration']) total_trips= number_of_trips(filename) avg_trip_duration_subs = (subs_duration / total_trips[0]) avg_trip_duration_cust = (cust_duration / total_trips[1]) return round(avg_trip_duration_subs, 2), round(avg_trip_duration_cust, 2) data_Washington = './data/Washington-2016-Summary.csv' temp=avg_duration_by_usertype(data_Washington) print("City:Washington , Avg trip duration Subscribers: {} , Avg trip duration Customers: {}".format(temp[0],temp[1])) if temp[0] > temp[1]: print("Subscribers take longer rides on average") else: print("Customers take longer rides on average") ###Output City:Washington , Avg trip duration Subscribers: 12.53 , Avg trip duration Customers: 41.68 Customers take longer rides on average ###Markdown Visualizations The last set of values that you computed should have pulled up an interesting result. While the mean trip time for Subscribers is well under 30 minutes, the mean trip time for Customers is actually _above_ 30 minutes! It will be interesting for us to look at how the trip times are distributed. In order to do this, a new library will be introduced here, `matplotlib`. Run the cell below to load the library and to generate an example plot. ###Code # load library import matplotlib.pyplot as plt # this is a 'magic word' that allows for plots to be displayed # inline with the notebook.
If you want to know more, see: # http://ipython.readthedocs.io/en/stable/interactive/magics.html %matplotlib inline # example histogram, data taken from bay area sample data = [ 7.65, 8.92, 7.42, 5.50, 16.17, 4.20, 8.98, 9.62, 11.48, 14.33, 19.02, 21.53, 3.90, 7.97, 2.62, 2.67, 3.08, 14.40, 12.90, 7.83, 25.12, 8.30, 4.93, 12.43, 10.60, 6.17, 10.88, 4.78, 15.15, 3.53, 9.43, 13.32, 11.72, 9.85, 5.22, 15.10, 3.95, 3.17, 8.78, 1.88, 4.55, 12.68, 12.38, 9.78, 7.63, 6.45, 17.38, 11.90, 11.52, 8.63,] plt.hist(data) plt.title('Distribution of Trip Durations') plt.xlabel('Duration (m)') plt.show() ###Output _____no_output_____ ###Markdown In the above cell, we collected fifty trip times in a list, and passed this list as the first argument to the `.hist()` function. This function performs the computations and creates plotting objects for generating a histogram, but the plot is actually not rendered until the `.show()` function is executed. The `.title()` and `.xlabel()` functions provide some labeling for plot context.You will now use these functions to create a histogram of the trip times for the city you selected in question 4c. Don't separate the Subscribers and Customers for now: just collect all of the trip times and plot them. ###Code ## Use this and additional cells to collect all of the trip times as a list ## ## and then use pyplot functions to generate a histogram of trip times. ## def plot_graph(filename,city): with open(filename) as f_in: reader = csv.DictReader(f_in) data = [] for r in reader: data.append(round(float(r['duration']),2)) plt.hist(data) plt.title('histogram of the trip times for {} '.format(city)) plt.xlabel('Duration (m)') plt.show() data_Washington = './data/Washington-2016-Summary.csv' plot_graph(data_Washington, 'Washington') ###Output _____no_output_____ ###Markdown If you followed the use of the `.hist()` and `.show()` functions exactly like in the example, you're probably looking at a plot that's completely unexpected. The plot consists of one extremely tall bar on the left, maybe a very short second bar, and a whole lot of empty space in the center and right. Take a look at the duration values on the x-axis. This suggests that there are some highly infrequent outliers in the data. Instead of reprocessing the data, you will use additional parameters with the `.hist()` function to limit the range of data that is plotted. Documentation for the function can be found [[here]](https://matplotlib.org/devdocs/api/_as_gen/matplotlib.pyplot.hist.htmlmatplotlib.pyplot.hist).**Question 5**: Use the parameters of the `.hist()` function to plot the distribution of trip times for the Subscribers in your selected city. Do the same thing for only the Customers. Add limits to the plots so that only trips of duration less than 75 minutes are plotted. As a bonus, set the plots up so that bars are in five-minute wide intervals. For each group, where is the peak of each distribution? How would you describe the shape of each distribution?**Answer**: Replace this text with your response! ###Code ## Use this and additional cells to answer Question 5. 
## import numpy as np def plot_subs(filename, city): with open(filename) as f_in: reader = csv.DictReader(f_in) data = [] for r in reader: if r['user_type'] == 'Subscriber': data.append(round(float(r['duration']), 2)) plt.hist(data, bins= [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) plt.xticks(np.arange(0, 76, 5.0)) plt.title('Plot of Trip Durations of Subscribers in {}'.format(city)) plt.xlabel('Duration (m)') plt.show() data_Washington = './data/Washington-2016-Summary.csv' plot_subs(data_Washington, 'Washington') def plot_cust(filename, city): with open(filename) as f_in: reader = csv.DictReader(f_in) data = [] for r in reader: if r['user_type'] == 'Customer': data.append(round(float(r['duration']), 2)) plt.hist(data, bins= [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75]) plt.xticks(np.arange(0, 76, 5.0)) plt.title('Plot of Trip Durations of Customers in {}'.format(city)) plt.xlabel('Duration (m)') plt.show() data_Washington = './data/Washington-2016-Summary.csv' plot_cust(data_Washington, 'Washington') ###Output _____no_output_____ ###Markdown Performing Your Own AnalysisSo far, you've performed an initial exploration into the data available. You have compared the relative volume of trips made between three U.S. cities and the ratio of trips made by Subscribers and Customers. For one of these cities, you have investigated differences between Subscribers and Customers in terms of how long a typical trip lasts. Now it is your turn to continue the exploration in a direction that you choose. Here are a few suggestions for questions to explore:- How does ridership differ by month or season? Which month / season has the highest ridership? Does the ratio of Subscriber trips to Customer trips change depending on the month or season?- Is the pattern of ridership different on the weekends versus weekdays? On what days are Subscribers most likely to use the system? What about Customers? Does the average duration of rides change depending on the day of the week?- During what time of day is the system used the most? Is there a difference in usage patterns for Subscribers and Customers?If any of the questions you posed in your answer to question 1 align with the bullet points above, this is a good opportunity to investigate one of them. As part of your investigation, you will need to create a visualization. If you want to create something other than a histogram, then you might want to consult the [Pyplot documentation](https://matplotlib.org/devdocs/api/pyplot_summary.html). In particular, if you are plotting values across a categorical variable (e.g. city, user type), a bar chart will be useful. The [documentation page for `.bar()`](https://matplotlib.org/devdocs/api/_as_gen/matplotlib.pyplot.bar.htmlmatplotlib.pyplot.bar) includes links at the bottom of the page with examples for you to build off of for your own use.**Question 6**: Continue the investigation by exploring another question that could be answered by the data available. Document the question you want to explore below. Your investigation should involve at least two variables and should compare at least two groups. You should also use at least one visualization as part of your explorations.**Answer**: Replace this text with your responses and include a visualization below! ###Code ## Use this and additional cells to continue to explore the dataset. ## ## Once you have performed your exploration, document your findings ## ## in the Markdown cell above. 
## based on an example viewed on GitHub import calendar def per_day_ana(filename,city,usertype): if usertype not in ['Subscriber','Customer']: return "user does not exist" user_each_day=[0, 0, 0, 0, 0, 0, 0] user_weekdays= 0 user_weekends= 0 with open(filename) as f_in: reader = csv.DictReader(f_in) for r in reader: if r['user_type'] == usertype: if r['day_of_week'] == 'Monday': user_each_day[0] +=1 user_weekdays +=1 elif r['day_of_week'] == 'Tuesday': user_each_day[1] +=1 user_weekdays +=1 elif r['day_of_week'] == 'Wednesday': user_each_day[2] +=1 user_weekdays +=1 elif r['day_of_week'] == 'Thursday': user_each_day[3] +=1 user_weekdays +=1 elif r['day_of_week'] == 'Friday': user_each_day[4] +=1 user_weekdays +=1 elif r['day_of_week'] == 'Saturday': user_each_day[5] +=1 user_weekends +=1 elif r['day_of_week'] == 'Sunday': user_each_day[6] +=1 user_weekends +=1 # day with the highest ridership for this user type day_sys_max_use = calendar.day_name[user_each_day.index(max(user_each_day))] print('Busiest day for {}s: {}'.format(usertype, day_sys_max_use)) x=np.arange(7) plt.bar(x, user_each_day, 0.5) plt.xticks(x,('Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun')) plt.title('Ridership on various days of week for {}s in {}'.format(usertype,city)) plt.xlabel('Day of week') plt.ylabel('Ridership') plt.show() if user_weekdays > user_weekends: return "For {}: Ridership was greater on Weekdays".format(usertype) else: return "For {}: Ridership was greater on Weekends".format(usertype) data_Washington = './data/Washington-2016-Summary.csv' ###data_Chicago = './data/Chicago-2016-Summary.csv' ###data_NYC = './data/NYC-2016-Summary.csv' print(per_day_ana(data_Washington, 'Washington', 'Subscriber')) print(per_day_ana(data_Washington, 'Washington', 'Customer')) ###print(per_day_ana(data_Chicago, 'Chicago', 'Subscriber')) ###print(per_day_ana(data_Chicago, 'Chicago', 'Customer')) ###print(per_day_ana(data_NYC, 'NYC', 'Subscriber')) ###print(per_day_ana(data_NYC, 'NYC', 'Customer')) ###Output _____no_output_____ ###Markdown Conclusions Congratulations on completing the project! This is only a sampling of the data analysis process: from generating questions, to wrangling the data, to exploring the data. Normally, at this point in the data analysis process, you might want to draw conclusions about the data by performing a statistical test or fitting the data to a model for making predictions. There are also a lot of potential analyses that could be performed on the data which are not possible with only the data provided. For example, detailed location data has not been investigated. Where are the most commonly used docks? What are the most common routes? As another example, weather has potential to have a large impact on daily ridership. How much is ridership impacted when there is rain or snow? Are subscribers or customers affected more by changes in weather? **Question 7**: Putting the bike share data aside, think of a topic or field of interest where you would like to be able to apply the techniques of data science. What would you like to be able to learn from your chosen subject? **Answer**: Replace this text with your response! > **Tip**: If we want to share the results of our analysis with others, we aren't limited to giving them a copy of the jupyter Notebook (.ipynb) file. We can also export the Notebook output in a form that can be opened even for those without Python installed. From the **File** menu in the upper left, go to the **Download as** submenu. You can then choose a different format that can be viewed more generally, such as HTML (.html) or PDF (.pdf).
You may need additional packages or software to perform these exports. > If you are working on this project via the Project Notebook page in the classroom, you can also submit this project directly from the workspace. **Before you do that**, you should save an HTML copy of the completed project to the workspace by running the code cell below. If it worked correctly, the output code should be a 0, and if you click on the jupyter icon in the upper left, you should see your .html document in the workspace directory. Alternatively, you can download the .html copy of your report following the steps in the previous paragraph, then _upload_ the report to the directory (by clicking the jupyter icon). > Either way, once you've gotten the .html report in your workspace, you can complete your submission by clicking on the "Submit Project" button to the lower-right hand side of the workspace. ###Code from subprocess import call call(['python', '-m', 'nbconvert', 'Bike_Share_Analysis.ipynb']) ###Output _____no_output_____
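###Markdown As a small appendix, here is a sketch of one of the explorations suggested above: ridership by month. It is a hedged example rather than part of the graded answers, and it only assumes the `month` field written to the condensed summary files in Question 3. ###Code
#Hedged sketch: count trips per month in a condensed summary file
def trips_per_month(filename):
    counts = {month: 0 for month in range(1, 13)}
    with open(filename) as f_in:
        reader = csv.DictReader(f_in)
        for row in reader:
            counts[int(row['month'])] += 1
    return counts

monthly = trips_per_month('./data/Washington-2016-Summary.csv')
plt.bar(list(monthly.keys()), list(monthly.values()))
plt.title('Washington Ridership per Month (2016)')
plt.xlabel('Month')
plt.ylabel('Number of Trips')
plt.show()
###Output _____no_output_____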
Notebooks/1_Data_Cleaning_&_Visualizations.ipynb
###Markdown Load the data ###Code # Imports used throughout this notebook (collected here, since the original import cell is not shown in this file) import gzip import re import unicodedata import inflect import numpy as np import pandas as pd import nltk from bs4 import BeautifulSoup from nltk.tokenize import word_tokenize from nltk.corpus import stopwords from nltk.stem import LancasterStemmer, WordNetLemmatizer def parse(path): g = gzip.open(path, 'rb') for l in g: yield eval(l) def getDF(path): i = 0 df = {} for d in parse(path): df[i] = d i += 1 return pd.DataFrame.from_dict(df, orient='index') df_reviews = getDF('../Data/Raw Data/reviews_Electronics_5.json.gz') df_reviews.shape df_reviews.head(3) df_metadata = getDF('../Data/Raw Data/meta_Electronics.json.gz') df_metadata.shape df_metadata.head(3) # Merge the 2 files to have all the info in one file df_metaReview = pd.merge(df_reviews, df_metadata, how='left' , on='asin' ) df_metaReview.shape # Remove the unnecessary fields df_metaReview.drop(['reviewerName', 'unixReviewTime','fit','also_buy', 'image', 'tech1', 'tech2', 'also_view', 'similar_item', 'details', 'date'], axis=1, inplace= True) # Check to see the Null values df_metaReview = df_metaReview.replace('', np.nan) df_metaReview.isna().sum() # Remove the Null values df_metaReview.dropna(subset=['category'], inplace= True) df_metaReview.isna().sum() df_metaReview.shape # Inspect the data df_metaReview['main_cat'].value_counts().head(10) df_metaReview['category'].value_counts().head(10) # Extract the headphone products only df = df_metaReview[df_metaReview["title"].str.contains("headphones|headphone|Headphones|Headphone")] # Create a new column combining 'reviewText' and 'summary' (it must exist before we can filter on it) df['review_summary'] = df['reviewText'] + df['summary'] # Drop the rows where the combined text is missing df = df[~df['review_summary'].isna()] df.head() df.info() df.drop([ 'reviewText', 'summary'], axis=1, inplace= True) df.tail(3) # Look at a sample review df['review_summary'][1785367] # Create a new column for rating class based on ratings df['rating_class'] = df['overall'].apply(lambda x: 'bad' if x < 3 else 'good') # Add year of review df['reviewTime'] = pd.to_datetime(df['reviewTime']) df['year'] = df['reviewTime'].dt.year df.head() df['rating_class'].value_counts() # Ratio of 'bad' to 'good' reviews: the classes are heavily imbalanced 7534/50761 ###Output _____no_output_____ ###Markdown Text Preprocessing ###Code # Count the raw tokens so we can later measure how many tokens cleaning removes raw_tokens=len([w for t in (df["review_summary"].apply(word_tokenize)) for w in t]) print('Number of raw tokens: {}'.format(raw_tokens)) def strip_html(text): soup = BeautifulSoup(text, "html.parser") return soup.get_text() def remove_between_square_brackets(text): return re.sub('\[[^]]*\]', '', text) def denoise_text(text): text = strip_html(text) text = remove_between_square_brackets(text) return text # special_characters removal def remove_special_characters(text, remove_digits=True): pattern = r'[^a-zA-Z0-9\s]' if not remove_digits else r'[^a-zA-Z\s]' text = re.sub(pattern, '', text) return text def remove_non_ascii(words): """Remove non-ASCII characters from list of tokenized words""" new_words = [] for word in words: new_word = unicodedata.normalize('NFKD', word).encode('ascii', 'ignore').decode('utf-8', 'ignore') new_words.append(new_word) return new_words def to_lowercase(words): """Convert all characters to lowercase from list of tokenized words""" new_words = [] for word in words: new_word = word.lower() new_words.append(new_word) return new_words def remove_stopwords(words): """Remove stop words from list of tokenized words""" new_words = [] for word in words: if word not in stopword_list: new_words.append(word) return new_words def stem_words(words): """Stem words in list of tokenized words""" stemmer = LancasterStemmer() stems = [] for word in words: stem = stemmer.stem(word) stems.append(stem) return stems def remove_punctuation_and_splchars(words): """Remove punctuation from list of tokenized
words""" new_words = [] for word in words: new_word = re.sub(r'[^\w\s]', '', word) if new_word != '': new_word = remove_special_characters(new_word, True) new_words.append(new_word) return new_words def replace_numbers(words): """Replace all interger occurrences in list of tokenized words with textual representation""" p = inflect.engine() new_words = [] for word in words: if word.isdigit(): new_word = p.number_to_words(word) new_words.append(new_word) else: new_words.append(word) return new_words stopword_list= stopwords.words('english') stopword_list.remove('no') stopword_list.remove('not') def lemmatize_verbs(words): """Lemmatize verbs in list of tokenized words""" lemmatizer = WordNetLemmatizer() lemmas = [] for word in words: lemma = lemmatizer.lemmatize(word, pos='v') lemmas.append(lemma) return lemmas def normalize(words): words = remove_non_ascii(words) words = to_lowercase(words) words = remove_punctuation_and_splchars(words) words = remove_stopwords(words) return words def lemmatize(words): lemmas = lemmatize_verbs(words) return lemmas def normalize_and_lemmaize(input): sample = denoise_text(input) #sample = expand_contractions(sample) sample = remove_special_characters(sample) words = nltk.word_tokenize(sample) words = normalize(words) lemmas = lemmatize(words) return ' '.join(lemmas) df.info() df['clean_text'] = df['review_summary'].apply(lambda text: normalize_and_lemmaize(text)) df.sample(5) # Look at a sample review, before and after preprocessing df['review_summary'][1785367] df['clean_text'][1785367] # Let's put aside number of raw tokens in order to measure of cleaned tokens clean_tokens=len([w for t in (df["clean_text"].apply(word_tokenize)) for w in t]) print('Number of clean tokens: {}\n'.format(clean_tokens)) print('Percentage of removed tokens: {0:.2f}'.format(1-(clean_tokens/raw_tokens))) # Pickle headphone data df.to_pickle('../Data/Processed Data/Headphone_CleanText.pkl') ###Output _____no_output_____ ###Markdown EDA ###Code df = pd.read_pickle('../Data/Processed Data/Headphone_CleanText.pkl') ###Output _____no_output_____ ###Markdown **1.** Number of reviews per rating: ###Code ratings_counts = df.groupby('overall', as_index= False)['asin'].count().sort_values('asin', ascending=True) ratings_counts.index=[1,2,3,4,5] plt.figure(figsize = (8,5)) mycolors =['orange', 'darkorange','chocolate', 'sienna', 'saddlebrown'] ratings_counts['asin'].plot(kind="bar", color = mycolors) sns.despine() plt.xticks(rotation=0) plt.title('Total Review Numbers for Each Rating') plt.xlabel('Rating', fontsize = 13) plt.ylabel('Number of Reviews', fontsize = 13) plt.savefig('../Images/Reviews_per_rating.jpg'); plt.figure(figsize = (7,7)) mycolors =['lavenderblush', 'pink', 'lightcoral', 'indianred', 'darkred'] df2 = df.rename({'overall': 'Ratings'},axis=1) df2.groupby('Ratings')['Ratings'].count().plot(kind='pie',autopct='%1.0f%%', startangle=90,explode=(.1,0,0,0,0), colors = mycolors) plt.savefig('../Images/Reviews_pie_chart.jpg'); ###Output _____no_output_____ ###Markdown ---**2.** Review length per rating: ###Code plt.figure(figsize = (8,6)) mycolors = ['peachpuff', 'orange', 'darkorange', 'chocolate', 'saddlebrown'] sns.boxplot(x="overall",y='review_length',data=df, order = [1,2,3,4,5], palette= mycolors) sns.despine() plt.xlabel('Rating', fontsize = 13) plt.ylabel('Review Length', fontsize = 13); plt.figure(figsize = (8,6)) mycolors =['peachpuff', 'orange', 'darkorange', 'chocolate', 'saddlebrown'] sns.boxplot(x="overall",y='review_length',data=df, showfliers=False, order = 
###Markdown
EDA
###Code
df = pd.read_pickle('../Data/Processed Data/Headphone_CleanText.pkl')
###Output
_____no_output_____
###Markdown
**1.** Number of reviews per rating:
###Code
ratings_counts = df.groupby('overall', as_index=False)['asin'].count().sort_values('asin', ascending=True)
ratings_counts.index = [1, 2, 3, 4, 5]
plt.figure(figsize=(8, 5))
mycolors = ['orange', 'darkorange', 'chocolate', 'sienna', 'saddlebrown']
ratings_counts['asin'].plot(kind="bar", color=mycolors)
sns.despine()
plt.xticks(rotation=0)
plt.title('Total Review Numbers for Each Rating')
plt.xlabel('Rating', fontsize=13)
plt.ylabel('Number of Reviews', fontsize=13)
plt.savefig('../Images/Reviews_per_rating.jpg');

plt.figure(figsize=(7, 7))
mycolors = ['lavenderblush', 'pink', 'lightcoral', 'indianred', 'darkred']
df2 = df.rename({'overall': 'Ratings'}, axis=1)
df2.groupby('Ratings')['Ratings'].count().plot(kind='pie', autopct='%1.0f%%', startangle=90,
                                               explode=(.1, 0, 0, 0, 0), colors=mycolors)
plt.savefig('../Images/Reviews_pie_chart.jpg');
###Output
_____no_output_____
###Markdown
---
**2.** Review length per rating:
###Code
# NOTE (assumption): 'review_length' is not computed anywhere in this notebook,
# so we derive it here as the character length of the combined review text.
df['review_length'] = df['review_summary'].apply(len)

plt.figure(figsize=(8, 6))
mycolors = ['peachpuff', 'orange', 'darkorange', 'chocolate', 'saddlebrown']
sns.boxplot(x="overall", y='review_length', data=df, order=[1, 2, 3, 4, 5], palette=mycolors)
sns.despine()
plt.xlabel('Rating', fontsize=13)
plt.ylabel('Review Length', fontsize=13);

plt.figure(figsize=(8, 6))
mycolors = ['peachpuff', 'orange', 'darkorange', 'chocolate', 'saddlebrown']
sns.boxplot(x="overall", y='review_length', data=df, showfliers=False, order=[1, 2, 3, 4, 5], palette=mycolors)
sns.despine()
plt.xlabel('Rating', fontsize=13)
plt.ylabel('Review Length', fontsize=13)
plt.savefig('../Images/Box_plot_reviews_length.jpg');
###Output
_____no_output_____
###Markdown
---
**3.** Number of reviews per year:
###Code
review_per_year = df.groupby('year', as_index=False)['asin'].count().sort_values('year')
review_per_year = review_per_year.rename({'asin': 'Number of Reviews'}, axis=1)
plt.figure(figsize=(8, 5))
#mycolors = ['crimson', 'crimson', 'lightcoral', 'pink', 'pink']
sns.barplot(x='year', y='Number of Reviews', data=review_per_year, palette='Oranges')
sns.despine()
plt.title('Total Review Numbers for Each Year')
plt.xlabel('Year', fontsize=13)
plt.ylabel('Number of Reviews', fontsize=13)
plt.savefig('../Images/Reviews_per_year.jpg');
###Output
_____no_output_____
###Markdown
---
**4.** Distribution of review lengths:
###Code
plt.figure(figsize=(8, 5))
plt.hist('review_length', data=df, bins=150, color='chocolate')
sns.despine()
plt.title("Distribution of Review Length")
plt.ylim(0, 14000)
plt.xlim(0, 4000)
plt.xlabel('Review length', fontsize=13)
plt.ylabel('Number of Reviews', fontsize=13)
plt.savefig('../Images/Distribution_of_review_length.jpg');
###Output
_____no_output_____
###Markdown
---
**5.** Look at the top brands in the data:
###Code
brands = df["brand"].value_counts()
plt.figure(figsize=(10, 6))
mycolors = ['saddlebrown', 'chocolate', 'chocolate', 'darkorange', 'darkorange', 'darkorange',
            'orange', 'orange', 'peachpuff', 'peachpuff']
brands[:10].plot(kind='bar', color=mycolors)
sns.despine()
plt.title("Number of Reviews for the Top 10 Headphone Brands")
plt.xlabel('Brand Name', fontsize=13)
plt.ylabel('Number of Reviews', fontsize=13)
plt.savefig('../Images/Top_10_brands.jpg', bbox_inches='tight');
###Output
_____no_output_____
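###Markdown
As a closing sanity check tying back to the bare `7534/50761` cell in the first section, which computed the share of 'bad' reviews by hand, `value_counts(normalize=True)` gives the same class balance directly (illustrative):
###Code
# Class balance of the target we will model ('good' vs 'bad').
print(df['rating_class'].value_counts(normalize=True))
###Output
_____no_output_____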
Archive/U-net V2 (no dilation).ipynb
###Markdown Libraries and Global Parameters ###Code import os import cv2 import pandas as pd import matplotlib.pyplot as plt import numpy as np from skimage.io import imread, imshow, imread_collection, concatenate_images from skimage.util import img_as_bool, img_as_uint, img_as_ubyte, img_as_int from skimage.transform import resize from skimage.morphology import label import random from random import randint from keras import regularizers from keras.models import Model, load_model from keras.optimizers import Adam, SGD, RMSprop from keras.layers import Input, concatenate, Conv2D, MaxPooling2D, Activation, Dense, \ UpSampling2D, BatchNormalization, add, Dropout, Flatten, Conv2DTranspose from keras.layers.core import Lambda from keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau, LearningRateScheduler from keras import backend as K from keras.losses import binary_crossentropy, sparse_categorical_crossentropy ####### UPDATE THIS ######### ############################# model_num = 2 ############################# ############################# model_checkpoint_file= 'unet_v' + str(model_num) +'.h5' submission_filename = 'unet_v' + str(model_num) +'_pred.csv' # Root folders for test and training data train_root = "./stage1_train" test_root = "./stage1_test" # Size we resize all images to #image_size = (128,128) img_height = 128 img_width = 128 img_channels = 1 # 1 for B&W, 3 for RGB import warnings warnings.filterwarnings('ignore', category=UserWarning, module='skimage') #warnings.resetwarnings() ###Output Using TensorFlow backend. ###Markdown Preparing the Data ###Code # Import images (either test or training) # Decolorize, resize, store in array, and save filenames, etc. def import_images(root): dirs = os.listdir(root) filenames=[os.path.join(root,file_id) + "/images/"+file_id+".png" for file_id in dirs] images=[imread(imagefile,as_grey=True) for imagefile in filenames] resized_images = [ resize(image,(img_width,img_height)) for image in images] Array = np.reshape(np.array(resized_images), (len(resized_images),img_height,img_width,img_channels)) #Array = np.reshape(np.array(img_as_ubyte(resized_images),dtype=np.uint8).astype(np.uint8), # (len(resized_images),img_height,img_width,img_channels)) print(Array.mean()) print(Array.std()) # Normalize inputs Array = ((Array - Array.mean())/Array.std()) print(Array.mean()) print(Array.std()) print(images[0].dtype) # print(resized_images[0].dtype) print(Array[0,0,0,0].dtype) return Array, images, filenames, dirs train_X, train_images, train_filenames, train_dirs = import_images(train_root) ## Import Training Masks # this takes longer than the training images because we have to # combine a lot of mask files # This function creates a single combined mask image # when given a list of masks # Probably a computationally faster way to do this... 
def collapse_masks(mask_list): for i, mask_file in enumerate(mask_list): if i != 0: # combine mask with previous mask in list mask = np.maximum(mask, imread(os.path.join(train_root,mask_file))) else: # read first mask in mask = imread(os.path.join(train_root,mask_file)) return mask # Import all the masks train_mask_dirs = [ os.path.join(path, 'masks') for path in os.listdir(train_root) ] train_mask_files = [ [os.path.join(dir,file) for file in os.listdir(os.path.join(train_root,dir)) ] for dir in train_mask_dirs] train_masks = [ collapse_masks(mask_files) for mask_files in train_mask_files ] resized_train_masks = [ img_as_bool(resize(image,(img_width,img_height))) for image in train_masks] train_Y = np.reshape(np.array(resized_train_masks),(len(resized_train_masks),img_height,img_width,img_channels)) # Plot images side by side for a list of datasets def plot_side_by_side(ds_list,image_num,size=(15,10)): print('Image #: ' + str(image_num)) fig = plt.figure(figsize=size) for i in range(len(ds_list)): ax1 = fig.add_subplot(1,len(ds_list),i+1) ax1.imshow(ds_list[i][image_num]) plt.show() # Plots random corresponding images and masks def plot_check(ds_list,rand_imgs=None,img_nums=None,size=(15,10)): if rand_imgs != None: for i in range(rand_imgs): plot_side_by_side(ds_list, randint(0,len(ds_list[0])-1),size=size) if img_nums != None: for i in range(len(img_nums)): plot_side_by_side(ds_list,img_nums[i],size=size) plot_check([train_images,train_masks],rand_imgs=1,size=(10,7)) # Check size of arrays we are inputting to model # This is important! We need the datasets to be as # small as possible to reduce computation time # Check physical size print(train_X.shape) print(train_Y.shape) # Check memory size print(train_X.nbytes) print(train_Y.nbytes) # Check datatypes print(train_X.dtype) print(train_Y.dtype) plot_check([np.squeeze(train_X,axis=3),np.squeeze(train_Y,axis=3)],rand_imgs=1,size=(10,7)) ###Output Image #: 176 ###Markdown Now Let's Build the Model ###Code # Loss and metric functions for the neural net def dice_coef(y_true, y_pred): y_true_f = K.flatten(y_true) y_pred = K.cast(y_pred, 'float32') y_pred_f = K.cast(K.greater(K.flatten(y_pred), 0.5), 'float32') intersection = y_true_f * y_pred_f score = 2. * K.sum(intersection) / (K.sum(y_true_f) + K.sum(y_pred_f)) return score def dice_loss(y_true, y_pred): smooth = 1. y_true_f = K.flatten(y_true) y_pred_f = K.flatten(y_pred) intersection = y_true_f * y_pred_f score = (2. * K.sum(intersection) + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth) return 1. 
- score def bce_dice_loss(y_true, y_pred): return binary_crossentropy(y_true, y_pred) + dice_loss(y_true, y_pred) ## used for meshnet def create_block(x, filters=21, filter_size=(3, 3), activation='relu',dil_rate=1,dropout_rate=0.25,l2_reg=0): x = Conv2D(filters, filter_size, padding='same', activation=activation, dilation_rate = dil_rate,kernel_regularizer=regularizers.l2(l2_reg)) (x) #x = BatchNormalization() (x) x = Dropout(dropout_rate) (x) return x ## used for dilated unet def encoder(x, filters=44, n_block=3, kernel_size=(3, 3), activation='relu'): skip = [] for i in range(n_block): x = Conv2D(filters * 2**i, kernel_size, activation=activation, padding='same')(x) x = Conv2D(filters * 2**i, kernel_size, activation=activation, padding='same')(x) skip.append(x) x = MaxPooling2D(pool_size=(2, 2), strides=(2, 2))(x) return x, skip def bottleneck(x, filters_bottleneck, mode='cascade', depth=6, kernel_size=(3, 3), activation='relu'): dilated_layers = [] if mode == 'cascade': # used in the competition for i in range(depth): x = Conv2D(filters_bottleneck, kernel_size, activation=activation, padding='same')(x) dilated_layers.append(x) return add(dilated_layers) elif mode == 'parallel': # Like "Atrous Spatial Pyramid Pooling" for i in range(depth): dilated_layers.append( Conv2D(filters_bottleneck, kernel_size, activation=activation, padding='same')(x) ) return add(dilated_layers) def decoder(x, skip, filters, n_block=3, kernel_size=(3, 3), activation='relu'): for i in reversed(range(n_block)): x = UpSampling2D(size=(2, 2))(x) x = Conv2D(filters * 2**i, kernel_size, activation=activation, padding='same')(x) x = concatenate([skip[i], x]) x = Conv2D(filters * 2**i, kernel_size, activation=activation, padding='same')(x) x = Conv2D(filters * 2**i, kernel_size, activation=activation, padding='same')(x) return x ## 3rd place carvana function def get_dilated_unet( input_shape=(img_width, img_height, img_channels), mode='cascade', filters=44, n_block=3, lr=0.0001, loss=binary_crossentropy, n_class=1 ): inputs = Input(input_shape) enc, skip = encoder(inputs, filters, n_block) bottle = bottleneck(enc, filters_bottleneck=filters * 2**n_block, mode=mode,depth=3) dec = decoder(bottle, skip, filters, n_block) classify = Conv2D(n_class, (1, 1), activation='sigmoid')(dec) model = Model(inputs=inputs, outputs=classify) model.compile(optimizer=Adam(lr), loss=loss, metrics=[dice_coef, bce_dice_loss]) return model ### non-dilated unet def Unet(img_size): inputs = Input((img_size, img_size, img_channels)) #s = Lambda(lambda x: x / 255)(inputs) c1 = Conv2D(16, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(inputs) c1 = Dropout(0.1)(c1) c1 = Conv2D(16, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c1) p1 = MaxPooling2D((2, 2))(c1) c2 = Conv2D(32, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(p1) c2 = Dropout(0.1)(c2) c2 = Conv2D(32, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c2) p2 = MaxPooling2D((2, 2))(c2) c3 = Conv2D(64, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(p2) c3 = Dropout(0.2)(c3) c3 = Conv2D(64, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c3) p3 = MaxPooling2D((2, 2))(c3) c4 = Conv2D(128, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(p3) c4 = Dropout(0.2)(c4) c4 = Conv2D(128, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c4) p4 = MaxPooling2D(pool_size=(2, 2))(c4) c5 = 
Conv2D(256, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(p4) c5 = Dropout(0.2)(c5) c5 = Conv2D(256, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c5) u6 = Conv2DTranspose(128, (2, 2), strides=(2, 2), padding='same')(c5) u6 = concatenate([u6, c4]) c6 = Conv2D(128, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(u6) c6 = Dropout(0.2)(c6) c6 = Conv2D(128, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c6) u7 = Conv2DTranspose(64, (2, 2), strides=(2, 2), padding='same')(c6) u7 = concatenate([u7, c3]) c7 = Conv2D(64, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(u7) c7 = Dropout(0.2)(c7) c7 = Conv2D(64, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c7) u8 = Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same')(c7) u8 = concatenate([u8, c2]) c8 = Conv2D(32, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(u8) c8 = Dropout(0.1)(c8) c8 = Conv2D(32, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c8) u9 = Conv2DTranspose(16, (2, 2), strides=(2, 2), padding='same')(c8) u9 = concatenate([u9, c1], axis=3) c9 = Conv2D(16, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(u9) c9 = Dropout(0.1)(c9) c9 = Conv2D(16, (3, 3), activation='elu', kernel_initializer='he_normal', padding='same')(c9) outputs = Conv2D(1, (1, 1), activation='sigmoid')(c9) model = Model(inputs=[inputs], outputs=[outputs]) return model ## master function for creating a net def get_net( input_shape=(img_height, img_width,img_channels), loss=binary_crossentropy, lr=0.001, n_class=1, nb_filters=21, dropout=0.2 ): inputs = Input(input_shape) # Create layers net_body = create_block(inputs,filters=nb_filters,dropout_rate=dropout) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout,dil_rate=2) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout,dil_rate=4) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout,dil_rate=8) net_body = create_block(net_body,filters=nb_filters,dropout_rate=dropout) classify = Conv2D(n_class,(1,1),activation='sigmoid') (net_body) model = Model(inputs=inputs, outputs=classify) model.compile(optimizer=Adam(lr), loss=loss, metrics=[bce_dice_loss, dice_coef]) return model #### CREATE MODEL ########################################################## #my_model = get_net(nb_filters=21,dropout=0.1,loss=binary_crossentropy) #my_model = Unet(img_height) #my_model.compile(optimizer='adam', loss=binary_crossentropy, metrics=[bce_dice_loss, dice_coef]) my_model = get_dilated_unet(filters=16) ############################################################################ print(my_model.summary()) # Fit model earlystopper = EarlyStopping(patience=10, verbose=1) checkpointer = ModelCheckpoint(model_checkpoint_file, verbose=1, save_best_only=True) reduce_plateau = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=4, verbose=1, # min_lr=0.00001, epsilon=0.001, mode='auto') results = my_model.fit(train_X, train_Y, validation_split=0.1, batch_size=1, epochs=100, verbose=1, shuffle=True, callbacks=[ earlystopper, checkpointer, reduce_plateau]) for val_loss in results.history['val_loss']: 
print(round(val_loss,3)) #print(results.history) ## Import Test Data and Make Predictions with Model # Import images (either test or training) # Decolorize, resize, store in array, and save filenames, etc. test_X, test_images, test_filenames, test_dirs = import_images(test_root) # Load model and make predictions on test data final_model = load_model(model_checkpoint_file, custom_objects={'dice_coef': dice_coef, 'bce_dice_loss':bce_dice_loss}) preds_test = final_model.predict(test_X, verbose=1) preds_test_t = (preds_test > 0.5) # Create list of upsampled test masks preds_test_upsampled = [] for i in range(len(preds_test)): preds_test_upsampled.append(resize(np.squeeze(preds_test[i]), (test_images[i].shape[0], test_images[i].shape[1]), mode='constant', preserve_range=True)) preds_test_upsampled_bool = [ (mask > 0.5).astype(bool) for mask in preds_test_upsampled ] plot_check([test_images,preds_test_upsampled,preds_test_upsampled_bool],rand_imgs=2) # Run-length encoding stolen from https://www.kaggle.com/rakhlin/fast-run-length-encoding-python def rle_encoding(x): dots = np.where(x.T.flatten() == 1)[0] run_lengths = [] prev = -2 for b in dots: if (b>prev+1): run_lengths.extend((b + 1, 0)) run_lengths[-1] += 1 prev = b return run_lengths def prob_to_rles(x, cutoff=0.5): lab_img = label(x > cutoff) for i in range(1, lab_img.max() + 1): yield rle_encoding(lab_img == i) def generate_prediction_file(image_names,predictions,filename): new_test_ids = [] rles = [] for n, id_ in enumerate(image_names): rle = list(prob_to_rles(predictions[n])) rles.extend(rle) new_test_ids.extend([id_] * len(rle)) sub = pd.DataFrame() sub['ImageId'] = new_test_ids sub['EncodedPixels'] = pd.Series(rles).apply(lambda x: ' '.join(str(y) for y in x)) sub.to_csv(filename, index=False) generate_prediction_file(test_dirs,preds_test_upsampled_bool,submission_filename) ###Output _____no_output_____
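###Markdown
To make the run-length encoding concrete, here is a small hand-checkable example on a hypothetical 3x3 mask (the expected value below was worked out by hand from the function above):
###Code
# The mask is transposed and flattened, i.e. read column-major; runs of 1s
# are reported as (1-based start position, length) pairs.
tiny_mask = np.array([[0, 1, 1],
                      [0, 1, 0],
                      [0, 0, 0]])
print(rle_encoding(tiny_mask))  # -> [4, 2, 7, 1]
###Output
_____no_output_____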
courses/machine_learning/deepdive2/building_production_ml_systems/solutions/4b_streaming_data_inference_vertex.ipynb
###Markdown
Working with Streaming Data

Learning Objectives
 1. Learn how to process real-time data for ML models using Cloud Dataflow
 2. Learn how to serve online predictions using real-time data

Introduction

It can be useful to leverage real time data in a machine learning model when making a prediction. However, doing so requires setting up a streaming data pipeline which can be non-trivial.

Typically you will have the following:
 - A series of IoT devices generating and sending data from the field in real-time (in our case these are the taxis)
 - A messaging bus that receives and temporarily stores the IoT data (in our case this is Cloud Pub/Sub)
 - A streaming processing service that subscribes to the messaging bus, windows the messages and performs data transformations on each window (in our case this is Cloud Dataflow)
 - A persistent store to keep the processed data (in our case this is BigQuery)

These steps happen continuously and in real-time, and are illustrated by the blue arrows in the diagram below.

Once this streaming data pipeline is established, we need to modify our model serving to leverage it. This simply means adding a call to the persistent store (BigQuery) to fetch the latest real-time data when a prediction request comes in. This flow is illustrated by the red arrows in the diagram below.

In this lab we will address how to process real-time data for machine learning models. We will use the same data as our previous 'taxifare' labs, but with the addition of `trips_last_5min` data as an additional feature. This is our proxy for real-time traffic.
###Code
import numpy as np
import os
import shutil
import tensorflow as tf

from google.cloud import aiplatform
from google.cloud import bigquery
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value
from matplotlib import pyplot as plt
from tensorflow import keras
from tensorflow.keras.callbacks import TensorBoard
from tensorflow.keras.layers import Dense, DenseFeatures
from tensorflow.keras.models import Sequential

print(tf.__version__)

PROJECT = 'cloud-training-demos'  # REPLACE WITH YOUR PROJECT ID
BUCKET = 'cloud-training-demos'  # REPLACE WITH YOUR BUCKET NAME
REGION = 'us-central1'  # REPLACE WITH YOUR BUCKET REGION e.g. us-central1

# For Bash Code
os.environ['PROJECT'] = PROJECT
os.environ['BUCKET'] = BUCKET
os.environ['REGION'] = REGION

%%bash
gcloud config set project $PROJECT
###Output
_____no_output_____
###Markdown
Re-train our model with `trips_last_5min` feature

In this lab, we want to show how to process real-time data for training and prediction. So, we need to retrain our previous model with this additional feature. Go through the notebook `4a_streaming_data_training.ipynb`. Open and run the notebook to train and save a model. This notebook is very similar to what we did in the Introduction to Tensorflow module but note the added feature for `trips_last_5min` in the model and the dataset.

Simulate Real Time Taxi Data

Since we don’t actually have real-time taxi data we will synthesize it using a simple python script. The script publishes events to Google Cloud Pub/Sub.

Inspect the `iot_devices.py` script in the `taxicab_traffic` folder. It is configured to send about 2,000 trip messages every five minutes with some randomness in the frequency to mimic traffic fluctuations. These numbers come from looking at the historical average of taxi ride frequency in BigQuery. In production this script would be replaced with actual taxis with IoT devices sending trip data to Cloud Pub/Sub.
To execute the `iot_devices.py` script, launch a terminal and navigate to the `asl-ml-immersion/notebooks/building_production_ml_systems/solutions` directory. Then run the following two commands.

```bash
PROJECT_ID=$(gcloud config list project --format "value(core.project)")
python3 ./taxicab_traffic/iot_devices.py --project=$PROJECT_ID
```

You will see new messages being published every 5 seconds. **Keep this terminal open** so it continues to publish events to the Pub/Sub topic. If you open [Pub/Sub in your Google Cloud Console](https://console.cloud.google.com/cloudpubsub/topic/list), you should be able to see a topic called `taxi_rides`.

Create a BigQuery table to collect the processed data

In the next section, we will create a dataflow pipeline to write processed taxifare data to a BigQuery Table, however that table does not yet exist. Execute the following commands to create a BigQuery dataset called `taxifare` and a table within that dataset called `traffic_realtime`.
###Code
bq = bigquery.Client()

dataset = bigquery.Dataset(bq.dataset("taxifare"))
try:
    bq.create_dataset(dataset)  # will fail if dataset already exists
    print("Dataset created.")
except:
    print("Dataset already exists.")
###Output
_____no_output_____
###Markdown
Next, we create a table called `traffic_realtime` and set up the schema.
###Code
dataset = bigquery.Dataset(bq.dataset("taxifare"))
table_ref = dataset.table("traffic_realtime")
SCHEMA = [
    bigquery.SchemaField("trips_last_5min", "INTEGER", mode="REQUIRED"),
    bigquery.SchemaField("time", "TIMESTAMP", mode="REQUIRED"),
]
table = bigquery.Table(table_ref, schema=SCHEMA)
try:
    bq.create_table(table)
    print("Table created.")
except:
    print("Table already exists.")
###Output
_____no_output_____
###Markdown
Launch Streaming Dataflow Pipeline

Now that we have our taxi data being pushed to Pub/Sub, and our BigQuery table set up, let’s consume the Pub/Sub data using a streaming Dataflow pipeline.

The pipeline is defined in `./taxicab_traffic/streaming_count.py`. Open that file and inspect it. There are 5 transformations being applied:
 - Read from PubSub
 - Window the messages
 - Count number of messages in the window
 - Format the count for BigQuery
 - Write results to BigQuery

**TODO:** Open the file ./taxicab_traffic/streaming_count.py and find the TODO there. Specify a sliding window that is 5 minutes long, and gets recalculated every 15 seconds. Hint: Reference the [beam programming guide](https://beam.apache.org/documentation/programming-guide/#windowing) for guidance. To check your answer reference the solution.

For the second transform, we specify a sliding window that is 5 minutes long, and recalculate values every 15 seconds (a sketch of this transform is given at the end of this notebook).

In a new terminal, launch the dataflow pipeline using the command below. You can change the `BUCKET` variable, if necessary. Here it is assumed to be your `PROJECT_ID`.

```bash
PROJECT_ID=$(gcloud config list project --format "value(core.project)")
BUCKET=$PROJECT_ID  # CHANGE AS NECESSARY
python3 ./taxicab_traffic/streaming_count.py \
    --input_topic taxi_rides \
    --runner=DataflowRunner \
    --project=$PROJECT_ID \
    --temp_location=gs://$BUCKET/dataflow_streaming
```

Once you've submitted the command above you can examine the progress of that job in the [Dataflow section of Cloud console](https://console.cloud.google.com/dataflow).

Explore the data in the table

After a few moments, you should also see new data written to your BigQuery table as well. Re-run the query periodically to observe new data streaming in! You should see a new row every 15 seconds.
###Code
%%bigquery
SELECT
  *
FROM
  `taxifare.traffic_realtime`
ORDER BY
  time DESC
LIMIT 10
###Output
_____no_output_____
###Markdown
Make predictions from the new data

In the rest of the lab, we'll reference the model we trained and deployed in the previous labs, so make sure you have run the code in the `4a_streaming_data_training.ipynb` notebook. The `add_traffic_last_5min` function below will query the `traffic_realtime` table to find the most recent traffic information and add that feature to our instance for prediction.
###Code
# TODO 2a. Write a function to take most recent entry in `traffic_realtime` table and add it to instance.
def add_traffic_last_5min(instance):
    bq = bigquery.Client()
    query_string = """
    SELECT
      *
    FROM
      `taxifare.traffic_realtime`
    ORDER BY
      time DESC
    LIMIT 1
    """
    trips = bq.query(query_string).to_dataframe()['trips_last_5min'][0]
    instance['traffic_last_5min'] = int(trips)
    return instance
###Output
_____no_output_____
###Markdown
The `traffic_realtime` table is updated in realtime using Cloud Pub/Sub and Dataflow so, if you run the cell below periodically, you should see the `traffic_last_5min` feature added to the instance and change over time.
###Code
add_traffic_last_5min(instance={'dayofweek': 4,
                                'hourofday': 13,
                                'pickup_longitude': -73.99,
                                'pickup_latitude': 40.758,
                                'dropoff_latitude': 41.742,
                                'dropoff_longitude': -73.07})
###Output
_____no_output_____
###Markdown
Finally, we'll use the Python API to call predictions on an instance, using the realtime traffic information in our prediction. Just as above, you should notice that our resulting predictions change with time as our realtime traffic information changes as well. Copy the `ENDPOINT_ID` from the deployment in the previous lab to the beginning of the block below.
###Code
# TODO 2b. Write code to call prediction on instance using realtime traffic info.
# Hint: Look at this sample https://github.com/googleapis/python-aiplatform/blob/master/samples/snippets/predict_custom_trained_model_sample.py

ENDPOINT_ID =  # TODO: Copy the `ENDPOINT_ID` from the deployment in the previous lab.

api_endpoint = f'{REGION}-aiplatform.googleapis.com'  # The AI Platform services require regional API endpoints.
client_options = {"api_endpoint": api_endpoint}
# Initialize client that will be used to create and send requests.
# This client only needs to be created once, and can be reused for multiple requests.
client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)

instance = {'dayofweek': 4,
            'hourofday': 13,
            'pickup_longitude': -73.99,
            'pickup_latitude': 40.758,
            'dropoff_latitude': 41.742,
            'dropoff_longitude': -73.07}

# The format of each instance should conform to the deployed model's prediction input schema.
instance_dict = add_traffic_last_5min(instance)

instance = json_format.ParseDict(instance_dict, Value())
instances = [instance]
endpoint = client.endpoint_path(
    project=PROJECT, location=REGION, endpoint=ENDPOINT_ID
)
response = client.predict(
    endpoint=endpoint, instances=instances
)

# The predictions are a google.protobuf.Value representation of the model's predictions.
print(" prediction:", response.predictions[0][0])
###Output
_____no_output_____
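###Markdown
For reference, here is a minimal sketch of the windowing transform that the Dataflow TODO earlier in this notebook asks for, assuming the Apache Beam Python SDK used by `streaming_count.py`; `messages` is a stand-in name for the PCollection read from Pub/Sub:
###Code
import apache_beam as beam
from apache_beam import window

# A 5-minute (300 s) sliding window, recalculated every 15 seconds --
# the window/period pair described in the TODO. Sketch only: the real
# pipeline around it lives in taxicab_traffic/streaming_count.py.
windowed = (messages
            | "window" >> beam.WindowInto(
                window.SlidingWindows(size=300, period=15)))
###Output
_____no_output_____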
ai/3.Hidden_layer/hidden_layer.ipynb
###Markdown
Classifying non-linear data with one hidden layer

Hello and welcome to this workshop, in which we will build our first deep neural network, containing one hidden layer. You will notice a big difference between this model and the one you implemented with logistic regression.

**What you will learn:**
- Implement a neural network with one hidden layer for 2-class classification
- Use neurons with a non-linear activation function, such as tanh
- Compute the cross-entropy loss
- Implement forward and backward propagation

1 - Packages

Let's start by importing the packages we will need:
- [numpy](www.numpy.org) is the fundamental package for scientific computing in Python.
- [sklearn](http://scikit-learn.org/stable/) provides many functions for building models and analyzing data.
- [matplotlib](http://matplotlib.org) is a library for plotting graphs, charts, etc. in Python.
- tests provides a few test cases to check your functions
- utils provides many helper functions we need for this notebook
###Code
import numpy as np
import matplotlib.pyplot as plt
from tests import *
import sklearn
import sklearn.datasets
import sklearn.linear_model
from utils import plot_decision_boundary, sigmoid, load_planar_dataset, load_extra_datasets

%matplotlib inline

np.random.seed(1)  # seed to compare against the expected values, do not modify!!
###Output
_____no_output_____
###Markdown
2 - Dataset

First of all, let's load the dataset. The following code will load a 2-class "flower" dataset into the variables `X` and `Y`.
###Code
X, Y = load_planar_dataset()
###Output
_____no_output_____
###Markdown
Visualize the dataset with matplotlib. The data looks like a "flower" with red (label y = 0) and blue (y = 1) points. Your goal is to build a model that classifies this data correctly.
###Code
c = []
for i in range(Y.shape[1]):
    if i < Y.shape[1] / 2:
        c.append('red')
    else:
        c.append('blue')
plt.scatter(X[0, :], X[1, :], c=c, s=40, cmap=plt.cm.Spectral);
###Output
_____no_output_____
###Markdown
You have:
- a numpy array (matrix) X that contains your features (x1, x2)
- a numpy array (vector) Y that contains your labels (red: 0, blue: 1).

Let's analyze our data a bit further.

**Exercise**: How many training examples do you have? Also, what are the shapes of the variables `X` and `Y`?

**Hint**: How do you get the shape of a numpy array? [(help)](https://docs.scipy.org/doc/numpy/reference/generated/numpy.ndarray.shape.html)
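###Markdown
To illustrate the hint on an unrelated array (so as not to give the exercise away): `.shape` returns a tuple and can be indexed.
###Code
# Illustration of `.shape` on a throwaway array (not the dataset).
demo = np.zeros((3, 7))
print(demo.shape)     # (3, 7): 3 rows, 7 columns
print(demo.shape[1])  # 7: the second dimension
###Output
_____no_output_____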
###Code
### Start of code ### (≈ 2 lines of code)
shape_X = None
shape_Y = None
### End of code ###

print('The shape of X is: ' + str(shape_X))
print('The shape of Y is: ' + str(shape_Y))
###Output
_____no_output_____
###Markdown
**Expected output**:
- shape of X: (2, 400)
- shape of Y: (1, 400)

3 - Simple logistic regression

Before building a full neural network, let's see how logistic regression performs on this problem. You can use sklearn's functions to do so. Run the following cell to train a logistic regression model on our dataset.
###Code
# Train the logistic regression classifier
clf = sklearn.linear_model.LogisticRegressionCV(cv=5);
clf.fit(X.T, Y.T.ravel());
###Output
_____no_output_____
###Markdown
You can now plot the decision boundary of this model. Run the following code.

**Warning:** Due to matplotlib version issues, you may not see the original points. What matters is seeing how logistic regression tackled the problem. Call an assistant if something is unclear.
###Code
# Plot the decision boundary for logistic regression
try:
    plot_decision_boundary(lambda x: clf.predict(x), X, Y)
except:
    print("Due to a matplotlib version issue, the original points are not displayed. Refer to the previous chart and imagine the points plotted on top.")
    pass
plt.title("Logistic Regression")

# Print the accuracy
LR_predictions = clf.predict(X.T)
print('Accuracy of logistic regression: %d ' % float((np.dot(Y, LR_predictions) +
      np.dot(1-Y, 1-LR_predictions)) / float(Y.size) * 100) + '% ' + "of points correctly labeled")
###Output
_____no_output_____
###Markdown
**Expected output**: Accuracy: 47%

**Interpretation**: This dataset is not linearly separable, so logistic regression does not perform well here. Fortunately, a deep neural network will be able to solve our problem. Let's go!

4 - Deep neural network

Logistic regression does not work on our dataset. You will now train a deep neural network with one hidden layer.

**Here is our model**:

**Mathematically**:

For one example $x^{(i)}$:
$$z^{[1] (i)} = W^{[1]} x^{(i)} + b^{[1]}\tag{1}$$
$$a^{[1] (i)} = \tanh(z^{[1] (i)})\tag{2}$$
$$z^{[2] (i)} = W^{[2]} a^{[1] (i)} + b^{[2]}\tag{3}$$
$$\hat{y}^{(i)} = a^{[2] (i)} = \sigma(z^{ [2] (i)})\tag{4}$$
$$y^{(i)}_{prediction} = \begin{cases} 1 & \mbox{if } a^{[2](i)} > 0.5 \\ 0 & \mbox{otherwise} \end{cases}\tag{5}$$

From the computed predictions, you will be able to compute the cost with the following formula:
$$J = - \frac{1}{m} \sum\limits_{i = 1}^{m} \large\left(\small y^{(i)}\log\left(a^{[2] (i)}\right) + (1-y^{(i)})\log\left(1- a^{[2] (i)}\right) \large \right) \small \tag{6}$$

**Reminder**: The general methodology for building a neural network is the following:
 1. Define the structure of the neural network (number of input features, number of hidden layers, etc.)
 2. Initialize the model's parameters
 3. Loop:
    - Implement forward propagation
    - Compute the cost
    - Implement backward propagation to get the gradients
    - Update the parameters

Once you have created these functions, you will merge them into a single function called `nn_model()`. When this model is trained, you will be able to make predictions on new data.

4.1 - Define the neural network structure

**Exercise**: Define 3 variables:
- n_x: size of the input layer
- n_h: size of the hidden layer (set it to 4)
- n_y: size of the output layer

**Hint**: Use the shapes of X and Y to find n_x and n_y. Hard-code the hidden layer size directly.
###Code
def layer_sizes(X, Y):
    """
    Arguments:
    X -- input dataset of shape (input size, number of examples)
    Y -- labels of shape (output size, number of examples)
    """
    ### Start of code ### (≈ 3 lines of code)
    n_x = None  # size of the input layer
    n_h = None
    n_y = None  # size of the output layer
    ### End of code ###
    return (n_x, n_h, n_y)

X_assess, Y_assess = layer_sizes_test_case()
(n_x, n_h, n_y) = layer_sizes(X_assess, Y_assess)
print("Size of the input layer: n_x = " + str(n_x))
print("Size of the hidden layer: n_h = " + str(n_h))
print("Size of the output layer: n_y = " + str(n_y))
###Output
_____no_output_____
###Markdown
**Expected output** (these are not the layer sizes we will use for our model, just a test to check your function):
- n_x: 5
- n_h: 4
- n_y: 2

4.2 - Initialize the model's parameters

**Exercise**: Implement the function `initialize_parameters()`.

**Instructions**:
- Make sure your parameter sizes are correct. Refer to the diagram above if needed.
- You will initialize the weight matrices with random values.
    - Use: `np.random.randn(a,b) * 0.01` to randomly initialize a matrix of shape (a,b).
- You will initialize the bias vectors to 0.
    - Use: `np.zeros((a,b))` to initialize a matrix of shape (a,b) with zeros.
###Code
def initialize_parameters(n_x, n_h, n_y):
    """
    Arguments:
    n_x -- size of the input layer
    n_h -- size of the hidden layer (set it to 4)
    n_y -- size of the output layer

    Returns:
    params -- python dictionary containing the parameters:
        W1 -- weight matrix of shape (n_h, n_x)
        b1 -- bias vector of shape (n_h, 1)
        W2 -- weight matrix of shape (n_y, n_h)
        b2 -- bias vector of shape (n_y, 1)
    """
    np.random.seed(42)  # do not modify the seed, we will use it to compare our results

    ### Start of code ### (≈ 4 lines of code)
    W1 = None
    b1 = None
    W2 = None
    b2 = None
    ### End of code ###

    assert (W1.shape == (n_h, n_x))
    assert (b1.shape == (n_h, 1))
    assert (W2.shape == (n_y, n_h))
    assert (b2.shape == (n_y, 1))

    parameters = {"W1": W1,
                  "b1": b1,
                  "W2": W2,
                  "b2": b2}
    return parameters

n_x, n_h, n_y = initialize_parameters_test_case()
parameters = initialize_parameters(n_x, n_h, n_y)
print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))
###Output
_____no_output_____
###Markdown
**Expected output**:
```
W1 = [[ 0.00496714 -0.00138264]
 [ 0.00647689  0.0152303 ]
 [-0.00234153 -0.00234137]
 [ 0.01579213  0.00767435]]
b1 = [[ 0.] [ 0.] [ 0.] [ 0.]]
W2 = [[-0.00469474  0.0054256  -0.00463418 -0.0046573 ]]
b2 = [[ 0.]]
```
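###Markdown
Why random values for the weights rather than zeros? A quick illustrative check (not part of the exercise): with an all-zero W1, every hidden unit computes exactly the same activation, so their gradients are identical too and the units never differentiate from one another. Random initialization breaks this symmetry.
###Code
# Illustrative only: zero-initialized weights make all hidden units identical.
X_demo = np.random.randn(2, 5)           # 5 made-up examples, 2 features
W1_zero = np.zeros((4, 2))
b1_zero = np.zeros((4, 1))
A1_demo = np.tanh(W1_zero @ X_demo + b1_zero)
print(np.allclose(A1_demo, A1_demo[0]))  # True: all 4 hidden rows are equal
###Output
_____no_output_____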
4.3 - The loop

**Exercise**: Implement `forward_propagation()`.

**Instructions**:
- Refer to the mathematical formulas of the model above.
- You can use the function `sigmoid()`. We defined it and imported it from utils at the beginning.
- You can use the function `np.tanh()`. It is part of the numpy library.
- The steps you have to implement are the following:
    1. Retrieve each parameter from the python dictionary "parameters" (which is the output of `initialize_parameters()`).
    2. Implement forward propagation. Compute $Z^{[1]}, A^{[1]}, Z^{[2]}$ and $A^{[2]}$. Reminder: Z is the pre-activation and A the activation.
- The values you will need for backpropagation are stored in "`cache`". The `cache` dictionary will be passed as an argument to the backpropagation function.
- NB: `np.dot()` may be useful
###Code
def forward_propagation(X, parameters):
    """
    Arguments:
    X -- input data of size (n_x, m), with m the batch size
    parameters -- python dictionary containing your parameters (weights and biases)

    Returns:
    A2 -- the sigmoid output of the second activation
    cache -- dictionary containing "Z1", "A1", "Z2" and "A2"
    """
    # Retrieve each parameter from the dictionary "parameters"
    ### Start of code ### (≈ 4 lines of code)
    W1 = None
    b1 = None
    W2 = None
    b2 = None
    ### End of code ###

    # Implement forward propagation to compute A2 (the probabilities)
    ### Start of code ### (≈ 4 lines of code)
    Z1 = None
    A1 = None
    Z2 = None
    A2 = None
    ### End of code ###

    assert(A2.shape == (1, X.shape[1]))

    cache = {"Z1": Z1,
             "A1": A1,
             "Z2": Z2,
             "A2": A2}
    return A2, cache

X_assess, parameters = forward_propagation_test_case()
A2, cache = forward_propagation(X_assess, parameters)
# Note: we use the mean here to check that your results match ours
print(f"{np.mean(cache['Z1']):.12f}, {np.mean(cache['A1']):.12f}, {np.mean(cache['Z2']):.11f}, {np.mean(cache['A2']):.12f}")
###Output
_____no_output_____
###Markdown
**Expected output**: 0.262818640198, 0.091999045227, -1.30766601287, 0.212877681719

Now that you have computed $A^{[2]}$ (in the Python variable "`A2`"), which contains $a^{[2](i)}$ for every example, you can compute the cost as follows:
$$J = - \frac{1}{m} \sum\limits_{i = 1}^{m} \large{(} \small y^{(i)}\log\left(a^{[2] (i)}\right) + (1-y^{(i)})\log\left(1- a^{[2] (i)}\right) \large{)} \small\tag{13}$$

**Exercise**: Implement `compute_cost()` to compute the value of the cost $J$.

**Instructions**:
- There are several ways to implement the cross-entropy loss. To help you, here is how we could have computed $- \sum\limits_{i=1}^{m} y^{(i)}\log(a^{[2](i)})$:
```python
logprobs = np.multiply(np.log(A2), Y)
cost = - np.sum(logprobs)  # no need for a for loop!
```
(you can use `np.multiply()` followed by `np.sum()`, or `np.dot()` directly).
###Code
def compute_cost(A2, Y, parameters):
    """
    Compute the cross-entropy cost given by equation (13)

    Arguments:
    A2 -- the sigmoid output of the second activation, of shape (1, number of examples)
    Y -- vector containing the labels, of shape (1, number of examples)
    parameters -- python dictionary containing your parameters W1, b1, W2 and b2

    Returns:
    cost -- cross-entropy cost given by equation (13)
    """
    m = Y.shape[1]  # number of examples

    # Compute the cross-entropy
    ### Start of code ### (≈ 2 lines of code)
    logprobs = None
    cost = None
    ### End of code ###

    cost = np.squeeze(cost)
    assert(isinstance(cost, float))
    return cost

A2, Y_assess, parameters = compute_cost_test_case()
print(f"cost = {compute_cost(A2, Y_assess, parameters):.9f}")
###Output
_____no_output_____
###Markdown
**Expected output**: cost = 0.693058761
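###Markdown
A quick numeric check of what the cross-entropy measures (illustrative, computed by hand): a confident correct prediction costs almost nothing, while a confident wrong one is heavily penalized.
###Code
# Hand-checkable cross-entropy values (not part of the exercise).
y_demo = np.array([[1, 0]])
for a2_demo in (np.array([[0.99, 0.01]]),   # confident and correct
                np.array([[0.01, 0.99]])):  # confident and wrong
    cost_demo = -np.mean(y_demo * np.log(a2_demo) + (1 - y_demo) * np.log(1 - a2_demo))
    print(round(cost_demo, 4))  # ~0.0101, then ~4.6052
###Output
_____no_output_____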
###Markdown
Thanks to the "cache" dictionary you computed during forward propagation, you can now implement backward propagation.

**Exercise**: Implement the function `backward_propagation()`.

**Instructions**:
Backpropagation is usually the trickiest (most mathematical) part of deep learning. To help you, here are the different steps of backpropagation:

$$\frac{\partial \mathcal{J} }{ \partial z_{2}^{(i)} } = \frac{1}{m} (a^{[2](i)} - y^{(i)})$$
$$\frac{\partial \mathcal{J} }{ \partial W_2 } = \frac{\partial \mathcal{J} }{ \partial z_{2}^{(i)} } a^{[1] (i) T} $$
$$\frac{\partial \mathcal{J} }{ \partial b_2 } = \sum_i{\frac{\partial \mathcal{J} }{ \partial z_{2}^{(i)}}}$$
$$\frac{\partial \mathcal{J} }{ \partial z_{1}^{(i)} } = W_2^T \frac{\partial \mathcal{J} }{ \partial z_{2}^{(i)} } * ( 1 - a^{[1] (i) 2}) $$
$$\frac{\partial \mathcal{J} }{ \partial W_1 } = \frac{\partial \mathcal{J} }{ \partial z_{1}^{(i)} } X^T $$
$$\frac{\partial \mathcal{J} }{ \partial b_1 } = \sum_i{\frac{\partial \mathcal{J} }{ \partial z_{1}^{(i)}}}$$

- Note that $*$ denotes elementwise multiplication.
- The notation you will use is common in deep learning code:
    - dW1 = $\frac{\partial \mathcal{J} }{ \partial W_1 }$
    - db1 = $\frac{\partial \mathcal{J} }{ \partial b_1 }$
    - dW2 = $\frac{\partial \mathcal{J} }{ \partial W_2 }$
    - db2 = $\frac{\partial \mathcal{J} }{ \partial b_2 }$
- Hint:
    - To compute dZ1 you will need to compute $g^{[1]'}(Z^{[1]})$. Since $g^{[1]}(.)$ is the tanh activation function, if $a = g^{[1]}(z)$ then $g^{[1]'}(z) = 1-a^2$. You can therefore compute $g^{[1]'}(Z^{[1]})$ with `(1 - np.power(A1, 2))`.
###Code
def backward_propagation(parameters, cache, X, Y):
    """
    Implement backward propagation following the instructions above

    Arguments:
    parameters -- python dictionary containing your parameters
    cache -- python dictionary containing "Z1", "A1", "Z2" and "A2"
    X -- input data of shape (2, number of examples)
    Y -- vector of labels of shape (1, number of examples)

    Returns:
    grads -- python dictionary containing your gradients
    """
    m = X.shape[1]

    # Retrieve W1 and W2 from parameters
    ### Start of code ### (≈ 2 lines of code)
    W1 = None
    W2 = None
    ### End of code ###

    # Also retrieve A1 and A2 from the "cache" dictionary
    ### Start of code ### (≈ 2 lines of code)
    A1 = None
    A2 = None
    ### End of code ###

    # Backward propagation: compute the gradients dW1, db1, dW2, db2.
    ### Start of code ### (≈ 6 lines of code, matching the 6 equations above)
    dZ2 = None
    dW2 = None
    db2 = None
    dZ1 = None
    dW1 = None
    db1 = None
    ### End of code ###

    grads = {"dW1": dW1,
             "db1": db1,
             "dW2": dW2,
             "db2": db2}
    return grads

parameters, cache, X_assess, Y_assess = backward_propagation_test_case()
grads = backward_propagation(parameters, cache, X_assess, Y_assess)
print("dW1 = " + str(grads["dW1"]))
print("db1 = " + str(grads["db1"]))
print("dW2 = " + str(grads["dW2"]))
print("db2 = " + str(grads["db2"]))
###Output
_____no_output_____
###Markdown
**Expected output**:
```
dW1 = [[ 0.00301023 -0.00747267]
 [ 0.00257968 -0.00641288]
 [-0.00156892  0.003893  ]
 [-0.00652037  0.01618243]]
db1 = [[ 0.00176201] [ 0.00150995] [-0.00091736] [-0.00381422]]
dW2 = [[ 0.00078841  0.01765429 -0.00084166 -0.01022527]]
db2 = [[-0.16655712]]
```

**Exercise**: Implement the update_parameters function. Use gradient descent. You must use (dW1, db1, dW2, db2) to update (W1, b1, W2, b2).

**General gradient descent rule**: $ \theta = \theta - \alpha \frac{\partial J }{ \partial \theta }$ with $\alpha$ the learning rate, $\theta$ a parameter and $ \frac{\partial J }{ \partial \theta }$ its gradient.

**Illustration**: You can compare below the difference between a good learning rate (convergence) and a bad learning rate (divergence). Animation credit: Adam Harley.
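###Markdown
To make the update rule concrete before you implement it, here is a tiny numeric example (not part of the exercise):
###Code
# One gradient descent step by hand: with alpha = 0.1, theta = 2.0 and
# dtheta = 0.5, the new value is 2.0 - 0.1 * 0.5 = 1.95.
theta = np.array([2.0])
dtheta = np.array([0.5])
alpha = 0.1
theta = theta - alpha * dtheta
print(theta)  # [1.95]
###Output
_____no_output_____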
###Code
def update_parameters(parameters, grads, lr = 1.2):
    """
    Arguments:
    parameters -- python dictionary containing your parameters
    grads -- python dictionary containing your gradients

    Returns:
    parameters -- python dictionary containing your updated parameters
    """
    # Retrieve your parameters
    ### Start of code ### (≈ 4 lines of code)
    W1 = None
    b1 = None
    W2 = None
    b2 = None
    ### End of code ###

    # Retrieve your gradients
    ### Start of code ### (≈ 4 lines of code)
    dW1 = None
    db1 = None
    dW2 = None
    db2 = None
    ### End of code ###

    # Update your parameters
    ### Start of code ### (≈ 4 lines of code)
    W1 = None
    b1 = None
    W2 = None
    b2 = None
    ### End of code ###

    parameters = {"W1": W1,
                  "b1": b1,
                  "W2": W2,
                  "b2": b2}

    return parameters

parameters, grads = update_parameters_test_case()
parameters = update_parameters(parameters, grads)

print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))
###Output
_____no_output_____
###Markdown
**Expected output**: W1 [[-0.00643025 0.01936718] [-0.02410458 0.03978052] [-0.01653973 -0.02096177] [ 0.01046864 -0.05990141]] b1 [[ -1.02420756e-06] [ 1.27373948e-05] [ 8.32996807e-07] [ -3.20136836e-06]] W2 [[-0.01041081 -0.04463285 0.01758031 0.04747113]] b2 [[ 0.00010457]]

4.4 - Integrate parts 4.2 and 4.3 into nn_model()

**Exercise**: Build your neural network in `nn_model()`.

**Instructions**: The neural network must use your previous functions in the right order.
###Code
def nn_model(X, Y, n_h, num_iterations = 10000, print_cost=False):
    """
    Arguments:
    X -- dataset of shape (2, number of examples)
    Y -- labels of shape (1, number of examples)
    n_h -- size of the hidden layer
    num_iterations -- number of iterations in the loop
    print_cost -- if True, print the cost every 1000 iterations

    Returns:
    parameters -- trained parameters. These are the parameters that will be used on new data.
    """
    np.random.seed(3)
    n_x = layer_sizes(X, Y)[0]
    n_y = layer_sizes(X, Y)[2]

    # Initialize the parameters. Inputs: "n_x, n_h, n_y". Outputs = "parameters".
    ### Start of code ### (≈ 1 line of code)
    parameters = None
    ### End of code ###

    # Loop
    for i in range(0, num_iterations):

        ### Start of code ### (≈ 4 lines of code)
        # Forward propagation. Inputs: "X, parameters". Outputs: "A2, cache".
        A2, cache = None

        # Cost function. Inputs: "A2, Y, parameters". Outputs: "cost".
        cost = None

        # Backpropagation. Inputs: "parameters, cache, X, Y". Outputs: "grads".
        grads = None

        # Parameter update. Inputs: "parameters, grads". Outputs: "parameters".
        parameters = None
        ### End of code ###

        # Print the cost every 1000 iterations
        if print_cost and i % 1000 == 0:
            print ("Cost after iteration %i: %f" %(i, cost))

    return parameters

X_assess, Y_assess = nn_model_test_case()
parameters = nn_model(X_assess, Y_assess, 4, num_iterations=10000, print_cost=False)
print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))
###Output
_____no_output_____
###Markdown
**Expected output**: Cost after iteration 0 0.693207 $\vdots$ $\vdots$ W1 [[-0.60314397 1.127818 ] [-0.74257953 1.3776121 ] [-0.69186287 1.27565127] [-0.7349128 1.36877347]] b1 [[0.26051203] [0.35648363] [0.31850012] [0.35292858]] W2 [[-2.10582454 -3.16994587 -2.70047639 -3.11996342]] b2 [[0.21989657]]

4.5 Predictions

**Exercise**: Use your model to predict on new data.
Implement `predict()`. Use forward propagation.

**Reminder**: predictions = $y_{prediction} = \mathbb 1 \text{{activation > 0.5}} = \begin{cases} 1 & \text{if}\ activation > 0.5 \\ 0 & \text{otherwise} \end{cases}$

For example, if you want to set the elements of a matrix X to 0 or 1 based on a threshold, you can do:
```
X_new = (X > threshold)
```
###Code
def predict(parameters, X):
    """
    Using the learned parameters, predict a class for each example in X

    Arguments:
    parameters -- python dictionary containing your parameters
    X -- input data of size (n_x, m)

    Returns
    predictions -- vector of predictions (red: 0 / blue: 1)
    """
    # Compute the probabilities using forward propagation, and classify as 0 or 1 using 0.5 as the threshold
    ### Start of code ### (≈ 2 lines of code)
    A, cache = None
    predictions = (A > 0.5)
    ### End of code ###

    return predictions

parameters, X_assess = predict_test_case()
predictions = predict(parameters, X_assess)
print("mean of predictions = " + str(np.mean(predictions)))
###Output
_____no_output_____
###Markdown
**Expected output**: mean of predictions 0.666666666667

We can now train our model on our dataset. Run the following code to test your model with a hidden layer of $n_h$ neurons.
###Code
n_h = 5
parameters = nn_model(X, Y, n_h = n_h, num_iterations = 10000, print_cost=True)

# Plot the decision boundary
try:
    plot_decision_boundary(lambda x: predict(parameters, x.T), X, Y)
except:
    print("Due to a matplotlib version issue, the original points are not displayed. Refer back to the plot at the beginning of the notebook and picture the points as plotted.")
    pass
plt.title("Decision Boundary for a hidden layer of size " + str(n_h))
###Output
_____no_output_____
###Markdown
**Expected output**: Cost after iteration 9000 0.178517
###Code
# Print the accuracy
predictions = predict(parameters, X)
print ('Accuracy: %d' % float((np.dot(Y,predictions.T) + np.dot(1-Y,1-predictions.T))/float(Y.size)*100) + '%')
###Output
_____no_output_____
###Markdown
**Expected output**: Accuracy 91%

The accuracy is much higher than that of logistic regression. The model was able to learn the pattern of the flower's leaves. Unlike logistic regression, deep neural networks can handle non-linear problems.

Now try several different numbers of neurons in the hidden layer.

4.6 - Tuning the number of neurons in the hidden layer

Run the following code. It should take 1-2 minutes. You will observe different behaviors of the model depending on the number of neurons you set in the hidden layer.
###Code
# This should take about 2 minutes
plt.figure(figsize=(16, 32))
hidden_layer_sizes = [1, 2, 3, 4, 5, 20, 50]
for i, n_h in enumerate(hidden_layer_sizes):
    plt.subplot(5, 2, i+1)
    plt.title('Hidden Layer of size %d' % n_h)
    parameters = nn_model(X, Y, n_h, num_iterations = 5000)
    try:
        plot_decision_boundary(lambda x: predict(parameters, x.T), X, Y)
    except:
        print("Due to a matplotlib version issue, the original points are not displayed. Refer back to the plot at the beginning of the notebook and picture the points as plotted.")
    predictions = predict(parameters, X)
    accuracy = float((np.dot(Y,predictions.T) + np.dot(1-Y,1-predictions.T))/float(Y.size)*100)
    print ("Accuracy for {} hidden units: {} %".format(n_h, accuracy))
###Output
_____no_output_____
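###Markdown
A small optional sketch (our addition, not part of the original exercise): collecting the accuracies into a dictionary makes the hidden-layer sizes easier to compare. Keep in mind these are training accuracies, so very large hidden layers may look best simply because they overfit; a held-out set would be needed for a fair comparison.
###Code
# Optional sketch: store each training accuracy so the sizes are easy to compare
results = {}
for n_h in [1, 2, 3, 4, 5, 20, 50]:
    parameters = nn_model(X, Y, n_h, num_iterations = 5000)
    predictions = predict(parameters, X)
    results[n_h] = float((np.dot(Y,predictions.T) + np.dot(1-Y,1-predictions.T))/float(Y.size)*100)

best_n_h = max(results, key=results.get)
print("Best training accuracy with {} hidden units".format(best_n_h))
###Output
_____no_output_____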
osmnx/geocoder/geocode.ipynb
###Markdown GeocodeGeocode a query string to (lat, lng) with the Nominatim geocoder. ###Code # OSMnx: New Methods for Acquiring, Constructing, Analyzing, and Visualizing Complex Street Networks import osmnx as ox ox.config(use_cache=True, log_console=False) ox.__version__ query = '서울역, 세종대로, 중구, 서울특별시, 대한민국' ox.geocoder.geocode(query) ###Output _____no_output_____
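###Markdown
`geocode` returns a plain `(lat, lng)` tuple, so it can be unpacked directly. A small usage sketch (the second, English query is just an illustrative assumption; any Nominatim-resolvable string works):
###Code
# unpack the (lat, lng) tuple returned by the geocoder
lat, lng = ox.geocoder.geocode(query)
print(f"lat={lat:.5f}, lng={lng:.5f}")

# an illustrative second query in English (any Nominatim-resolvable string works)
ox.geocoder.geocode('Seoul Station, Jung-gu, Seoul, South Korea')
###Output
_____no_output_____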
prediction/multitask/pre-training/code comment generation/large_model.ipynb
###Markdown
**Generate the comment for java code using the codeTrans multitask training model**

You can make free predictions online through this Link. (When using the online prediction, you need to parse and tokenize the code first.)

**1. Load necessary libraries, including huggingface transformers**
###Code
!pip install -q transformers sentencepiece

from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
###Output
_____no_output_____
###Markdown
**2. Load the summarization pipeline and load it onto the GPU if available**
###Code
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask", skip_special_tokens=True),
    device=0
)
###Output
/usr/local/lib/python3.6/dist-packages/transformers/models/auto/modeling_auto.py:970: FutureWarning: The class `AutoModelWithLMHead` is deprecated and will be removed in a future version. Please use `AutoModelForCausalLM` for causal language models, `AutoModelForMaskedLM` for masked language models and `AutoModelForSeq2SeqLM` for encoder-decoder models.
  FutureWarning,
###Markdown
**3. Give the code for summarization, parse and tokenize it**
###Code
code = "protected String renderUri(URI uri){\n return uri.toASCIIString();\n}\n" #@param {type:"raw"}

!pip install javalang

import javalang

def tokenize_java_code(code):
    tokenList = []
    tokens = list(javalang.tokenizer.tokenize(code))
    for token in tokens:
        tokenList.append(token.value)
    return ' '.join(tokenList)

tokenized_code = tokenize_java_code(code)
print("Output after tokenization: " + tokenized_code)
###Output
Output after tokenization: protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }
###Markdown
**4. Make the prediction**
###Code
pipeline([tokenized_code])
###Output
_____no_output_____
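###Markdown
The pipeline also accepts a batch of tokenized snippets in one call, which is usually faster than looping. A small sketch (the second snippet is a made-up example of ours, not from the original notebook):
###Code
# Batch prediction over several tokenized snippets (illustrative second snippet)
code_2 = "public int add(int a, int b){\n    return a + b;\n}\n"
pipeline([tokenized_code, tokenize_java_code(code_2)])
###Output
_____no_output_____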
vacation/VacationPy.ipynb
###Markdown VacationPy---- Note* Keep an eye on your API usage. Use https://developers.google.com/maps/reporting/gmp-reporting as reference for how to monitor your usage and billing.* Instructions have been included for each segment. You do not have to follow them exactly, but they are included to help you think through the steps. ###Code # Dependencies and Setup import matplotlib.pyplot as plt import pandas as pd import numpy as np import requests import gmaps import os # Import API key from config import g_key ###Output _____no_output_____ ###Markdown Store Part I results into DataFrame* Load the csv exported in Part I to a DataFrame ###Code ###Output _____no_output_____ ###Markdown Humidity Heatmap* Configure gmaps.* Use the Lat and Lng as locations and Humidity as the weight.* Add Heatmap layer to map. Create new DataFrame fitting weather criteria* Narrow down the cities to fit weather conditions.* Drop any rows will null values. Hotel Map* Store into variable named `hotel_df`.* Add a "Hotel Name" column to the DataFrame.* Set parameters to search for hotels with 5000 meters.* Hit the Google Places API for each city's coordinates.* Store the first Hotel result into the DataFrame.* Plot markers on top of the heatmap. ###Code # NOTE: Do not change any of the code in this cell # Using the template add the hotel marks to the heatmap info_box_template = """ <dl> <dt>Name</dt><dd>{Hotel Name}</dd> <dt>City</dt><dd>{City}</dd> <dt>Country</dt><dd>{Country}</dd> </dl> """ # Store the DataFrame Row # NOTE: be sure to update with your DataFrame name hotel_info = [info_box_template.format(**row) for index, row in hotel_df.iterrows()] locations = hotel_df[["Lat", "Lng"]] # Add marker layer ontop of heat map # Display figure ###Output _____no_output_____
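###Markdown
The instruction cells above are intentionally left for the student to fill in. As a reference, a minimal sketch of the heatmap and marker steps might look like the following. It assumes a `weather_df` DataFrame loaded from the Part I CSV with `Lat`, `Lng` and `Humidity` columns (these names are assumptions), plus the `hotel_df`/`hotel_info`/`locations` objects built above:
###Code
# Minimal sketch (assumption: weather_df comes from the Part I CSV and has
# "Lat", "Lng" and "Humidity" columns; hotel_info/locations are built above).
gmaps.configure(api_key=g_key)

fig = gmaps.figure()
heat_layer = gmaps.heatmap_layer(weather_df[["Lat", "Lng"]],
                                 weights=weather_df["Humidity"],
                                 dissipating=False, max_intensity=100,
                                 point_radius=3)
fig.add_layer(heat_layer)

# Add marker layer on top of the heat map and display the figure
marker_layer = gmaps.marker_layer(locations, info_box_content=hotel_info)
fig.add_layer(marker_layer)
fig
###Output
_____no_output_____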
notebooks/3c. Rejection sampling.ipynb
###Markdown
Rejection sampling

Takeaways and objectives from this notebook
There are some similarities between importance and [rejection sampling](https://en.wikipedia.org/wiki/Rejection_sampling). Rejection sampling also uses an auxiliary (designed) probability density $q(x)$ to guide the sampling process. However, rejection sampling brings in a key new idea: **not all Monte Carlo samples are accepted, some are rejected.**

We need to assume that we know a value $M \in {\cal R}$ s.t. $\forall x \in {\cal R} \; p(x)/q(x) \leq M$. Note that the ratio $p(x)/q(x)$ is the **likelihood ratio** we encountered in importance sampling. In particular, this means we can assume that $Mq(x) > p(x)$ is always true.

The rejection sampling algorithm
1. Draw a sample from $q(x)$.
2. Draw a value $u \sim U[0,1]$.
3. If $Mu < p(x)/q(x)$ then accept the sample, otherwise reject it.
4. Goto 1.

The samples that we accept are drawn from the distribution $p(x)$.

Example
We will use the same functions $p(x)$ and $q(x)$ we used for importance sampling, but this time $q(x)$ will be set up reasonably well for you :) - no more tricks, that's a promise.
###Code
import numpy as np
from scipy.stats import norm
import seaborn as sns

# This function definition will also work on lists and np.arrays
def eval_p(x):
    return (1./8/np.pi)**0.5*np.exp(-0.5*(x-3)**2) + (1./16/np.pi)**0.5*np.exp(-0.25*(x-6)**2)

# Parameters of our designed (Gaussian) distribution q(x)
q_mean = 0.
q_stdev = 5.
M = 5.

def designed_q(x):
    return norm.pdf(x, loc=q_mean, scale=q_stdev)

def sample_q(size):
    return norm.rvs(loc=q_mean,scale=q_stdev,size=size)

import matplotlib.pyplot as plt
%matplotlib inline

# we shall illustrate rejection sampling using a figure
x = np.linspace(-20, 20, 500)
fig = plt.figure(figsize=(18,6))
plt.plot(x, eval_p(x), label = '$p(x)$')
fig.gca().fill_between(x, 0, eval_p(x), alpha=0.2)
plt.plot(x, M * designed_q(x), label = '$q(x)$')
fig.gca().fill_between(x, 0, M * designed_q(x), alpha=0.2)
plt.plot([3, 3], [0, eval_p(3)], color='g', label='accepted region')
plt.plot([3, 3], [eval_p(3), M*designed_q(3)], color='r', label='rejected region')
plt.plot([6, 6], [0, eval_p(6)], color='g')
plt.plot([6, 6], [eval_p(6), M*designed_q(6)], color='r')
plt.plot([9, 9], [0, eval_p(9)], color='g')
plt.plot([9, 9], [eval_p(9), M*designed_q(9)], color='r')
plt.ylim([0, 0.45])
plt.legend(fontsize=12)
_ = plt.title('Rejection sampling illustration')
###Output
_____no_output_____
###Markdown
Let us think about the graphical illustration of the rejection sampler and look at the algorithm from another perspective.
1. We sample from $q(x)$ to obtain a point $x$ and we compute $q(x)$, the density itself.
2. We then sample $u \sim U[0,1]$ and we map $u$ to the point $uMq(x) \in [0,Mq(x)]$.
3. If the point $uMq(x) \leq p(x)$, meaning it's in the green part of the line, we consider the point a hit and accept it.
4. If the point is above $p(x)$, we consider it a miss and reject it.

We argue (informally) as follows: drawing samples from $q(x)$ uniformly distributes samples in the orange region under $Mq(x)$ everywhere, that is, including in the blue region under $p(x)$. The algorithm above rejects all samples that are in the orange region but not in the blue region. The remaining samples must therefore be uniformly distributed in the blue region under $p(x)$.
Thus the accepted samples $x_i$ reflect the density of $p(x)$: where $p(x)$ is larger, there will be more samples $x$.

**Note**: it's not at all trivial to ensure that $Mq(x) \geq p(x)$ everywhere, but in this example $M=5$ ensures this condition.
###Code
# The sampler is written using a loop on purpose. Monte Carlo is embarrassingly parallel and all the operations could be done at the same time on all samples.
def rejection_sampler(n, M):
    # sample from q(x) and also compute densities q(x) and p(x) for the x values
    samples_q = sample_q(n)
    samples_p = []
    # could be written as a list comprehension but may be harder to parse
    for x,u in zip(samples_q, np.random.uniform(size=n)):
        if u*M*designed_q(x) <= eval_p(x):
            samples_p.append(x)
    return samples_p
###Output
_____no_output_____
###Markdown
Question
If you run this sampler, with the settings in this notebook, it will accept about 20% of the samples (see below). This may seem inefficient. How could we improve the acceptance ratio?
###Code
len(rejection_sampler(1000, 5)) / 1000.
###Output
_____no_output_____
###Markdown
Compare results
Let us plot the density $p(x)$ and compare it to a histogram estimate based on our rejection sampler.
###Code
samples_p = rejection_sampler(30000, 5)

plt.figure(figsize=(12,6))
x = np.linspace(-20, 20, 500)
fig = plt.figure(figsize=(18,6))
plt.plot(x, eval_p(x), label = '$p(x)$')
#sns.kdeplot(samples_p)
plt.hist(samples_p, bins=np.linspace(-5, 15, 100), density=True, alpha=0.3,label='rejection sampler')
plt.title('Histogram estimate vs. true $p(x)$ density')
_ = plt.legend()
###Output
_____no_output_____
###Markdown
Exercise
Observe that if we set the value $M=0.5$, the inequality $p(x) < Mq(x)$ ceases to hold everywhere and our histogram becomes skewed. Think back to the rejection sampling algorithm above and explain why this happens.
###Code
samples_p = rejection_sampler(30000, 0.5)

plt.figure(figsize=(12,6))
x = np.linspace(-20, 20, 500)
fig = plt.figure(figsize=(18,6))
plt.plot(x, eval_p(x), label = '$p(x)$')
#sns.kdeplot(samples_p)
plt.hist(samples_p, bins=np.linspace(-5, 15, 100), density=True, alpha=0.3,label='rejection sampler')
plt.title('Histogram estimate vs. true $p(x)$ density')
_ = plt.legend()
###Output
_____no_output_____
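###Markdown
Since the draws are independent, the loop above vectorizes directly. A minimal sketch of a vectorized variant (our own rewrite of the same algorithm, not part of the original notebook):
###Code
def rejection_sampler_vectorized(n, M):
    # draw all candidates and all uniforms at once, then keep only the hits
    samples_q = sample_q(n)
    u = np.random.uniform(size=n)
    accepted = u * M * designed_q(samples_q) <= eval_p(samples_q)
    return samples_q[accepted]

samples_p = rejection_sampler_vectorized(30000, 5)
print(f"acceptance ratio: {len(samples_p) / 30000:.3f}")
###Output
_____no_output_____
###Markdown
Note that vectorization only speeds up the sampler; the acceptance ratio itself is governed by $M$ and how closely $Mq(x)$ hugs $p(x)$.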
P2-Traffic-Sign-Classifier/test/CNN_framework_from_coursera.ipynb
###Markdown
Create placeholders
###Code
def create_placeholders(n_H0, n_W0, n_C0, n_y):
    """
    Creates the placeholders for the tensorflow session.

    Arguments:
    n_H0 -- scalar, height of an input image
    n_W0 -- scalar, width of an input image
    n_C0 -- scalar, number of channels of the input
    n_y -- scalar, number of classes

    Returns:
    X -- placeholder for the data input, of shape [None, n_H0, n_W0, n_C0] and dtype "float"
    Y -- placeholder for the input labels, of shape [None, n_y] and dtype "float"
    """
    X = tf.placeholder(tf.float32, shape=[None, n_H0, n_W0, n_C0], name = "x_in")
    Y = tf.placeholder(tf.float32, shape=[None, n_y], name = "y_in")

    return X, Y
###Output
_____no_output_____
###Markdown
Initialize parameters
###Code
def initialize_parameters():
    """
    Initializes weight parameters to build a neural network with tensorflow.

    Returns:
    parameters -- a dictionary of tensors containing the weights and biases
    """
    layer_depth = {
        'conv1': 6,
        'conv2': 16,
        'fc1': 120,
        'fc2': 84,
        'fc3': n_classes
    }
    filter_size={
        'conv_1': 5
        ,'conv_2': 5
    }
    tf.set_random_seed(1)

    weights = {
        'conv1': tf.get_variable(name="conv1w", shape=[filter_size['conv_1'], filter_size['conv_1'], 3, layer_depth['conv1']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'conv2': tf.get_variable(name="conv2w", shape=[filter_size['conv_2'], filter_size['conv_2'], layer_depth['conv1'], layer_depth['conv2']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc1': tf.get_variable(name="fc1w", shape=[1024, layer_depth['fc1']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc2': tf.get_variable(name="fc2w", shape=[layer_depth['fc1'], layer_depth['fc2']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc3': tf.get_variable(name="fc3w", shape=[layer_depth['fc2'], layer_depth['fc3']], initializer=tf.contrib.layers.xavier_initializer(seed = 0))
    }
    biases = {
        'conv1': tf.get_variable(name="conv1b", shape=[layer_depth['conv1']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'conv2': tf.get_variable(name="conv2b", shape=[layer_depth['conv2']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc1': tf.get_variable(name="fc1b", shape=[layer_depth['fc1']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc2': tf.get_variable(name="fc2b", shape=[layer_depth['fc2']], initializer=tf.contrib.layers.xavier_initializer(seed = 0)),
        'fc3': tf.get_variable(name="fc3b", shape=[layer_depth['fc3']], initializer=tf.contrib.layers.xavier_initializer(seed = 0))
    }

    parameters = {"weights": weights, "biases": biases}

    return parameters
###Output
_____no_output_____
###Markdown
Forward propagation

In **TensorFlow**, there are built-in functions that carry out the convolution steps.

`tf.nn.conv2d(X,W1, strides = [1,s,s,1], padding = 'SAME')`: given an input $X$ and a group of filters $W1$, this function convolves $W1$'s filters on X. The third input ([1,s,s,1]) represents the strides for each dimension of the input (m, n_H_prev, n_W_prev, n_C_prev).

`tf.nn.max_pool(A, ksize = [1,f,f,1], strides = [1,s,s,1], padding = 'SAME')`: given an input A, this function uses a window of size (f, f) and strides of size (s, s) to carry out max pooling over each window.

`tf.nn.relu(Z1)`: computes the elementwise ReLU of Z1 (which can be any shape).

`tf.contrib.layers.flatten(P)`: given an input P, this function flattens each example into a 1D vector while maintaining the batch size.
It returns a flattened tensor with shape [batch_size, k].

`tf.contrib.layers.fully_connected(F, num_outputs)`: given the flattened input F, it returns the output computed using a fully connected layer.
###Code
def forward_propagation(X, parameters):
    """
    Implements the forward propagation for the model:
    CONV2D -> RELU -> MAXPOOL -> CONV2D -> RELU -> MAXPOOL -> FLATTEN -> FULLYCONNECTED

    Arguments:
    X -- input dataset placeholder, of shape (input size, number of examples)
    parameters -- python dictionary containing your "weights" and "biases" dictionaries,
                  with the shapes given in initialize_parameters

    Returns:
    Z3 -- the output of the last LINEAR unit
    """
    # Retrieve the parameters from the dictionary "parameters"
    weights = parameters['weights']
    biases = parameters['biases']
    stride = {
        'conv_1': 1
        ,'conv_2': 1
        ,'maxpool_1': 2
        ,'maxpool_2': 2
    }
    kernel_size={
        'maxpool_1': 2
        ,'maxpool_2': 2
    }

    # CONV2D: stride of 1, padding 'SAME'
    # tf.nn.conv2d(input, filter, strides, padding, use_cudnn_on_gpu=None, name=None)
    Z1 = tf.nn.conv2d(X,weights["conv1"],
                      strides = [1,stride["conv_1"],stride["conv_1"],1]
                      , padding = 'SAME'
                      , name="Conv_1")
    # RELU
    A1 = tf.nn.relu(Z1, name="Relu_1")
    # MAXPOOL: 2x2 window, stride 2, padding 'SAME'
    P1 = tf.nn.max_pool(A1, ksize = [1,kernel_size["maxpool_1"],kernel_size["maxpool_1"],1]
                        , strides = [1,stride["maxpool_1"],stride["maxpool_1"],1]
                        , padding = 'SAME'
                        , name="Maxpool_1")
    # CONV2D: filters W2, stride 1, padding 'SAME'
    Z2 = tf.nn.conv2d(P1,weights["conv2"]
                      , strides = [1,stride["conv_2"],stride["conv_2"],1]
                      , padding = 'SAME'
                      , name="Conv_2")
    # RELU
    A2 = tf.nn.relu(Z2, name="Relu_2")
    # MAXPOOL: 2x2 window, stride 2, padding 'SAME'
    P2 = tf.nn.max_pool(A2, ksize = [1,kernel_size["maxpool_2"],kernel_size["maxpool_2"],1]
                        , strides= [1,stride["maxpool_2"],stride["maxpool_2"],1]
                        , padding = 'SAME'
                        , name="Maxpool_2")
    # FLATTEN
    FLAT_2 = tf.contrib.layers.flatten(P2)
    # FULLY-CONNECTED without non-linear activation function (do not call softmax).
    # 43 neurons in output layer.
    # Hint: one of the arguments should be "activation_fn=None" (this hint applies to
    # tf.contrib.layers.fully_connected; here tf.nn.xw_plus_b is used instead)
    FC3 = tf.nn.xw_plus_b(FLAT_2, weights['fc1'], biases['fc1'], name="FC_1")
    Z3 = tf.nn.relu(FC3, name="Relu_3")
    # FULLY-CONNECTED
    FC4 = tf.nn.xw_plus_b(FC3, weights['fc2'], biases['fc2'], name="FC_2")
    Z4 = tf.nn.relu(FC4, name="Relu_4")
    # FULLY-CONNECTED
    FC5 = tf.nn.xw_plus_b(FC4, weights['fc3'], biases['fc3'], name="FC_3")

    print (A1)
    print (P1)
    print (A2)
    print (P2)
    print (FLAT_2)
    print (FC3)
    print (Z3)
    print (FC4)
    print (Z4)
    print (FC5)
#     Z3 = tf.contrib.layers.fully_connected(P2, 43, activation_fn=None)
    return FC5

tf.reset_default_graph()

with tf.Session() as sess:
    X, Y = create_placeholders(32, 32, 3, 4)
    parameters = initialize_parameters()
    Z3 = forward_propagation(X, parameters)
    init = tf.global_variables_initializer()
    sess.run(init)
    a = sess.run(Z3, {X: np.random.randn(2,32,32,3), Y: np.random.randn(2,4)})
    print (Z3.get_shape()[1])
#     print("Z3 = " + str(a))
###Output
Tensor("Relu_1:0", shape=(?, 32, 32, 6), dtype=float32)
Tensor("Maxpool_1:0", shape=(?, 16, 16, 6), dtype=float32)
Tensor("Relu_2:0", shape=(?, 16, 16, 16), dtype=float32)
Tensor("Maxpool_2:0", shape=(?, 8, 8, 16), dtype=float32)
Tensor("Flatten/flatten/Reshape:0", shape=(?, 1024), dtype=float32)
Tensor("FC_1:0", shape=(?, 120), dtype=float32)
Tensor("Relu_3:0", shape=(?, 120), dtype=float32)
Tensor("FC_2:0", shape=(?, 84), dtype=float32)
Tensor("Relu_4:0", shape=(?, 84), dtype=float32)
Tensor("FC_3:0", shape=(?, 43), dtype=float32)
43
###Markdown
Compute cost
###Code
def compute_cost(Z3, Y):
    """
    Computes the cost

    Arguments:
    Z3 -- output of forward propagation (output of the last LINEAR unit), of shape (number of examples, number of classes)
    Y -- "true" labels vector placeholder, same shape as Z3

    Returns:
    cost - Tensor of the cost function
    """
    cross_ent = tf.nn.softmax_cross_entropy_with_logits(logits = Z3, labels = Y, name="cross_entropy")
    cost = tf.reduce_mean(cross_ent, name="cost")

    return cost

def evaluate_with_batch(X_data, y_data, batch_size=64):
    """
    Creates a list of minibatches from (X, Y) for evaluation

    Arguments:
    X -- input data, of shape (input size, number of examples)
    Y -- true label vector, of shape (number of examples, number of classes)
    batch_size -- size of the mini-batches, integer

    Returns:
    mini_batches -- list of [mini_batch_X, mini_batch_Y] pairs covering the whole dataset
    """
    mini_batches = []
    m = len(X_data)
    num_complete_minibatches = math.ceil(m/batch_size)
    for k in range(0, num_complete_minibatches):
        mini_batch_X, mini_batch_Y = X_data[k * batch_size:(k + 1) * batch_size], y_data[k * batch_size:(k + 1) * batch_size]
        mini_batch = [mini_batch_X, mini_batch_Y]
        mini_batches.append(mini_batch)
    return mini_batches

from tensorflow.python.framework import ops

def model(X_train, Y_train, X_test, Y_test, learning_rate = 0.001,
          num_epochs = 1, minibatch_size = 64, print_cost = True):
    """
    Implements a three-layer ConvNet in Tensorflow:
    CONV2D -> RELU -> MAXPOOL -> CONV2D -> RELU -> MAXPOOL -> FLATTEN -> FULLYCONNECTED

    Arguments:
    X_train -- training set, of shape (None, 32, 32, 3)
    Y_train -- training labels, of shape (None, n_y = 43)
    X_test -- test set, of shape (None, 32, 32, 3)
    Y_test -- test labels, of shape (None, n_y = 43)
    learning_rate -- learning rate of the optimization
    num_epochs -- number of epochs of the optimization loop
    minibatch_size -- size of a minibatch
    print_cost -- True to print the cost every 100 epochs

    Returns:
    train_accuracy -- real number, accuracy on the train set (X_train)
    test_accuracy -- real number, testing accuracy on the test set (X_test)
    parameters -- parameters learnt by the model.
                  They can then be used to predict.
    """
    ops.reset_default_graph()                         # to be able to rerun the model without overwriting tf variables
    tf.set_random_seed(1)                             # to keep results consistent (tensorflow seed)
    seed = 3                                          # to keep results consistent (numpy seed)
    (m, n_H0, n_W0, n_C0) = X_train.shape
    n_y = Y_train.shape[1]
    costs = []                                        # To keep track of the cost
    training_accuracies = []
    validation_accuracies = []

    # Create Placeholders of the correct shape
    X, Y = create_placeholders(n_H0, n_W0, n_C0, n_y)

    # Initialize parameters
    parameters = initialize_parameters()

    # Forward propagation: Build the forward propagation in the tensorflow graph
    Z5 = forward_propagation(X, parameters)

    # Cost function: Add cost function to tensorflow graph
    cost = compute_cost(Z5, Y)

    # Calculate the correct predictions
    correct_prediction = tf.equal(tf.argmax(Z5, 1), tf.argmax(Y, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

    # Backpropagation: Define the tensorflow optimizer. Use an AdamOptimizer that minimizes the cost.
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

    # Initialize all the variables globally
    init = tf.global_variables_initializer()
    saver = tf.train.Saver()

    # Start the session to compute the tensorflow graph
    with tf.Session() as sess:

        # Run the initialization
        sess.run(init)
        num_minibatches = int(m / minibatch_size)     # number of minibatches of size minibatch_size in the train set

        # Do the training loop
        for epoch in range(num_epochs):
            minibatch_cost = 0.
            seed = seed + 1
            minibatches = random_mini_batches(X_train, Y_train, minibatch_size, seed)
            print ("epoch: "+str(epoch))
            for i,(minibatch_X, minibatch_Y) in enumerate(minibatches,1):
#                 print ("batch: "+str(i)+"/"+str(len(minibatches)))
                # IMPORTANT: The line that runs the graph on a minibatch.
                # Run the session to execute the optimizer and the cost; the feed_dict should contain a minibatch for (X, Y).
                _ , temp_cost = sess.run([optimizer, cost], feed_dict={X: minibatch_X, Y: minibatch_Y})
                minibatch_cost += temp_cost / num_minibatches

            # Print the cost every epoch
            if print_cost == True and epoch % 5 == 0:
                print ("Cost after epoch %i: %f" % (epoch, minibatch_cost))
            if print_cost == True and epoch % 1 == 0:
                costs.append(minibatch_cost)

            # Do the evaluation loop
            print ("evaluation start")
            minibatch_test = evaluate_with_batch(X_test, Y_test)
            minibatch_train = evaluate_with_batch(X_train, Y_train)
            train_accuracy = 0.0
            test_accuracy = 0.0
            for i,(minibatch_X, minibatch_Y) in enumerate(minibatch_train, 1):
#                 print ("batch: "+str(i)+"/"+str(len(minibatch_train)))
                train_accuracy += accuracy.eval({X: minibatch_X, Y: minibatch_Y})
            for i,(minibatch_X, minibatch_Y) in enumerate(minibatch_test, 1):
#                 print ("batch: "+str(i)+"/"+str(len(minibatch_test)))
                test_accuracy += accuracy.eval({X: minibatch_X, Y: minibatch_Y})
            # accuracy.eval returns a per-batch mean, so average over the number of batches
            train_accuracy /= len(minibatch_train)
            test_accuracy /= len(minibatch_test)
            print (train_accuracy)
            print (test_accuracy)
            training_accuracies.append(train_accuracy)
            validation_accuracies.append(test_accuracy)
            print ("evaluation ends")

        print("Train Accuracy:", training_accuracies)
        print("Test Accuracy:", validation_accuracies)

        # plot the cost
        plt.plot(np.squeeze(costs))
        plt.ylabel('cost')
        plt.xlabel('iterations (per tens)')
        plt.title("Learning rate =" + str(learning_rate))
        plt.show()

        save_path = saver.save(sess, os.path.join("model", "true_Lenet","fake_Lenet.ckpt"))
        print("Model saved in path: %s" % save_path)

    return training_accuracies, validation_accuracies

train_accuracy, test_accuracy = model(X_train_norm, labels_train, X_valid_norm, labels_valid, num_epochs = 10)
###Output
Tensor("Relu_1:0", shape=(?, 32, 32, 6), dtype=float32)
Tensor("Maxpool_1:0", shape=(?, 16, 16, 6), dtype=float32)
Tensor("Relu_2:0", shape=(?, 16, 16, 16), dtype=float32)
Tensor("Maxpool_2:0", shape=(?, 8, 8, 16), dtype=float32)
Tensor("Flatten/flatten/Reshape:0", shape=(?, 1024), dtype=float32)
Tensor("FC_1:0", shape=(?, 120), dtype=float32)
Tensor("Relu_3:0", shape=(?, 120), dtype=float32)
Tensor("FC_2:0", shape=(?, 84), dtype=float32)
Tensor("Relu_4:0", shape=(?, 84), dtype=float32)
Tensor("FC_3:0", shape=(?, 43), dtype=float32)
epoch: 0
Cost after epoch 0: 1.250390
evaluation start
0.0138262845169
0.0126130121635
evaluation ends
epoch: 1
evaluation start
0.0146079208787
0.0131359224687
evaluation ends
epoch: 2
evaluation start
0.0151055830164
0.0136185149352
evaluation ends
epoch: 3
evaluation start
0.0153299198117
0.013838919193
evaluation ends
epoch: 4
evaluation start
0.0152729322491
0.0137166217071
evaluation ends
epoch: 5
Cost after epoch 5: 0.087421
evaluation start
0.0153548063154
0.0140337888075
evaluation ends
epoch: 6
evaluation start
0.0153469123098
0.01392028794
evaluation ends
epoch: 7
evaluation start
0.0152511712284
0.0135869937013
evaluation ends
epoch: 8
evaluation start
0.0153906381876
0.0139338494023
evaluation ends
epoch: 9
evaluation start
0.0154196322934
0.0140298791889
evaluation ends
Train Accuracy: [0.013826284516886538, 0.014607920878710063, 0.015105583016414587, 0.015329919811694374, 0.015272932249066591, 0.015354806315421989, 0.015346912309843969, 0.015251171228415298, 0.015390638187561792, 0.015419632293446178]
Test Accuracy: [0.012613012163547162, 0.013135922468708757, 0.013618514935175578, 0.01383891919302562, 0.01371662170708585, 0.01403378880753809, 0.013920287939966941, 0.013586993701333632, 0.013933849402295759, 0.014029879188861977]
###Markdown
Even though there is no file named model.ckpt, you still refer
to the saved checkpoint by that name when restoring it.
* `.meta` stores the graph structure.
* `.data` stores the values of each variable in the graph.
* `.index` identifies the checkpoint.

So in the example above: import_meta_graph uses the .meta, and saver.restore uses the .data and .index
###Code
with tf.Session() as sess:
    saver = tf.train.import_meta_graph(os.path.join("model", "true_Lenet","fake_Lenet.ckpt.meta"))
    saver.restore(sess, os.path.join("model", "true_Lenet","fake_Lenet.ckpt"))
    g = tf.get_default_graph()
#     for i in g.get_operations():
#         print (i.name)
    learning_rate = 0.0001
    # fetch the tensors (not the ops) we need from the restored graph
    x_in = g.get_tensor_by_name("x_in:0")
    y_in = g.get_tensor_by_name("y_in:0")
    cost = g.get_tensor_by_name("cost:0")
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
    # the new optimizer adds its own slot variables, which still need initializing
    new_vars = [v for v in tf.global_variables() if 'Adam' in v.name or 'beta' in v.name]
    sess.run(tf.variables_initializer(new_vars))
    for epoch_i in range(1):
        minibatch_test = evaluate_with_batch(X_valid_norm, labels_valid)
        test_cost = 0.0
        for i,(minibatch_X, minibatch_Y) in enumerate(minibatch_test, 1):
            _, curr_cost = sess.run([optimizer, cost], feed_dict={x_in: minibatch_X, y_in: minibatch_Y})
            test_cost += curr_cost
    save_path = tf.train.Saver().save(sess, os.path.join("model", "true_Lenet","fake_Lenet.ckpt"))

def resume_model(X_train, Y_train, X_test, Y_test, learning_rate = 0.001,
          num_epochs = 1, minibatch_size = 64, print_cost = True):

    (m, n_H0, n_W0, n_C0) = X_train.shape
    n_y = Y_train.shape[1]
    costs = []                                        # To keep track of the cost
    training_accuracies = []
    validation_accuracies = []

    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(os.path.join("model", "true_Lenet","fake_Lenet.ckpt.meta"))
        saver.restore(sess, os.path.join("model", "true_Lenet","fake_Lenet.ckpt"))
        g = tf.get_default_graph()
        x_in = g.get_tensor_by_name("x_in:0")
        cost = g.get_tensor_by_name("cost:0")
        # the optimizer must be created after the graph is restored and "cost" has been fetched
        optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
        new_vars = [v for v in tf.global_variables() if 'Adam' in v.name or 'beta' in v.name]
        sess.run(tf.variables_initializer(new_vars))

#         m = len(X_train)
#         minibatch_size = 64
#         num_minibatches = int(m / minibatch_size)     # number of minibatches of size minibatch_size in the train set
#         num_epochs = 10
#         seed = 0

#         # Do the training loop
#         for epoch in range(num_epochs):
#             minibatch_cost = 0.
#             seed = seed + 1
#             minibatches = random_mini_batches(X_train, Y_train, minibatch_size, seed)
#             print ("epoch: "+str(epoch))
#             for i,(minibatch_X, minibatch_Y) in enumerate(minibatches,1):
#                 # print ("batch: "+str(i)+"/"+str(len(minibatches)))
#                 # IMPORTANT: The line that runs the graph on a minibatch.
#                 # Run the session to execute the optimizer and the cost; the feed_dict should contain a minibatch for (X, Y).
#                 _ , temp_cost = sess.run([optimizer, cost], feed_dict={x_in: minibatch_X, y_in: minibatch_Y})
#                 minibatch_cost += temp_cost / num_minibatches

#             # Print the cost every epoch
#             if print_cost == True and epoch % 1 == 0:
#                 print ("Cost after epoch %i: %f" % (epoch, minibatch_cost))
#             if print_cost == True and epoch % 1 == 0:
#                 costs.append(minibatch_cost)

#         # plot the cost
#         plt.plot(np.squeeze(costs))
#         plt.ylabel('cost')
#         plt.xlabel('iterations (per tens)')
#         plt.title("Learning rate =" + str(learning_rate))
#         plt.show()

#         saver.save(sess, os.path.join("model", "true_Lenet","fake_Lenet.ckpt"))
#         print("Model saved")

    return training_accuracies, validation_accuracies

train_accuracy, test_accuracy = resume_model(X_train_norm, labels_train, X_valid_norm, labels_valid, num_epochs = 10)
###Output
_____no_output_____
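###Markdown
For reference, here is a minimal, self-contained TF1-style save/restore round trip (our own illustration, independent of the model above; all names are made up):
###Code
import os
import tensorflow as tf

# Minimal TF1 save/restore round trip (illustrative only)
tf.reset_default_graph()
v = tf.get_variable("v", shape=[2], initializer=tf.zeros_initializer())
inc = tf.assign_add(v, [1., 1.])
saver = tf.train.Saver()
os.makedirs(os.path.join("model", "demo"), exist_ok=True)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(inc)
    saver.save(sess, os.path.join("model", "demo", "demo.ckpt"))  # writes .meta/.data/.index

tf.reset_default_graph()
with tf.Session() as sess:
    restorer = tf.train.import_meta_graph(os.path.join("model", "demo", "demo.ckpt.meta"))  # graph from .meta
    restorer.restore(sess, os.path.join("model", "demo", "demo.ckpt"))                      # values from .data/.index
    print(sess.run(tf.get_default_graph().get_tensor_by_name("v:0")))  # [1. 1.]
###Output
_____no_output_____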
label_shift.ipynb
###Markdown Idea: Combine BBSE (Lipton et al ICML '18) with MALLS' (Zhao et al AISTATS '21) subsampling technique while using MAML (Finn et al ICML '17) to reduce bias in importance sampling weights learned from medial distribution and focus on domain adaptation on label shift. ###Code import numpy as np import scipy import pandas as pd import matplotlib.pyplot as plt import tqdm import math import time from collections import Counter, deque, OrderedDict from sklearn.datasets import load_digits from sklearn.model_selection import train_test_split from sklearn.metrics import accuracy_score, confusion_matrix from sklearn.neighbors import KNeighborsClassifier import torch import torch.nn as nn import torch.nn.functional as F from model import Network from maml import MAML #set reproducibility np.random.seed(0) _ = torch.manual_seed(0) X, y = load_digits(return_X_y=True) #multiclassification -- replace w/ CIFAR/etc... test_ratio = 0.2 X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=test_ratio, random_state=42) ###Output _____no_output_____ ###Markdown Create Imbalanced Dataset ###Code def group_by_label(y): """ Groups data by label and returns indices per label """ label_dict = {} for i in range(len(y)): if y[i] in label_dict: label_dict[y[i]].append(i) else: label_dict[y[i]] = [i] return dict(OrderedDict(sorted(label_dict.items()))) def dirichlet_distribution(alpha, idx_by_label, size, no_change=False): """ Create Imbalanced data using dirichlet distribution """ class_composition = np.array([len(idx_by_label[k]) for k in sorted(idx_by_label.keys())], np.int64) print("Original Class composition: ", class_composition) if no_change: dataset = [] for v in idx_by_label.values(): dataset += v return dataset distribution = np.random.dirichlet([alpha]*len(idx_by_label), size=()) idx_by_label = idx_by_label.copy() #Group data by label for label in idx_by_label: class_size = math.ceil(size * distribution[label]) if not class_size: class_size = 1 #min number to support A.2 assumption (BBSE ICML '18) indices = np.random.randint(0, len(idx_by_label[label]), size=(class_size, )) idx_by_label[label] = np.unique([idx_by_label[label][i] for i in indices]).tolist() class_composition = np.array([len(idx_by_label[k]) for k in sorted(idx_by_label.keys())], np.int64) print("Shifted Class composition: ", class_composition) #Build new dataset of indices dataset = [] for v in idx_by_label.values(): dataset += v return dataset #shifted distribution def get_distribution(labels): """ Returns the distribution of classes as ratios """ dist = dict(Counter(labels)) total_size = 0 for key, value in dist.items(): total_size += value for key in dist: dist[key] /= total_size return dict(OrderedDict(sorted(dist.items()))) idx_by_label = group_by_label(y_train) #label : [indices of all labels] train_ratio = 1 data_cap = int(2 * X_train.shape[0]) size = int(data_cap * (train_ratio / (train_ratio + 1))) shifted_dist_idx = dirichlet_distribution(alpha=0.01, idx_by_label=idx_by_label, size=size, no_change=False) #### Imbalanced test dist. 
idx_by_label = group_by_label(y_test) #label : [indices of all labels] test_ratio = 1 data_cap = int(2 * X_test.shape[0]) size = int(data_cap * (test_ratio / (test_ratio + 1))) shifted_test_dist_idx = dirichlet_distribution(alpha=0.5, idx_by_label=idx_by_label, size=size, no_change=True) def plot(y, indices, dist_type='Train'): ### Original Distribution plt.bar(x=np.unique(y), height=get_distribution(y).values()) plt.title(dist_type + " Original Distribution") plt.xlabel("Class") plt.ylabel("PMF") plt.grid() plt.show() ### Shifted Distribution plt.bar(x=np.unique(y[indices]), height=get_distribution(y[indices]).values()) plt.title(dist_type + " Shifted Distribution") plt.xlabel("Class label") plt.ylabel("PMF") plt.grid() plt.show() #train Distribution shift plot(y_train, shifted_dist_idx) #test Distribution shift plot(y_test, shifted_test_dist_idx, 'Test') ###Output _____no_output_____ ###Markdown Sync With Data ###Code ### No subsampling - take source Dist. X_train, y_train = X_train[shifted_dist_idx], y_train[shifted_dist_idx] ### Shifting test distribution X_test, y_test = X_test[shifted_test_dist_idx], y_test[shifted_test_dist_idx] #Get source (train) and target (test) label distributions dist_train = get_distribution(y_train) dist_test = get_distribution(y_test) print(f"Train distribution : {y_train.shape}") print(f"Test distribution : {y_test.shape}") ###Output Train distribution : (571,) Test distribution : (161,) ###Markdown Train Model ###Code device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu") #Enable cuda if available ##typecast to tensors X_train = torch.DoubleTensor(X_train).to(device) X_test = torch.DoubleTensor(X_test).to(device) y_train = torch.LongTensor(y_train).to(device) y_test = torch.LongTensor(y_test).to(device) # implement backprop loss_function = nn.CrossEntropyLoss() def train(data, epochs=500, epsilon=1e-5, print_st=False): """ Train the model. Assumes access to global variable: loss function """ X_train, y_train = data #extract info start_time = time.time() losses = [] model = Network().to(device) #load local model optimizer = torch.optim.Adam(model.parameters(), lr=0.005) #gather accuracies train_accuracy = [] test_accuracy = [] for i in range(epochs): model.train() #set back to train y_pred = model(X_train) loss = loss_function(y_pred, y_train) losses.append(loss) ## training accuracy predictions = np.array(y_pred.argmax(axis=1), dtype=np.int16) score = accuracy_score(y_train, predictions) train_accuracy.append(score) ## test accuracy model.eval() with torch.no_grad(): y_pred = model(X_test) predictions = np.array(y_pred.argmax(axis=1), dtype=np.int16) score = accuracy_score(y_test, predictions) test_accuracy.append(score) if loss.item() < epsilon: if print_st: print(f"Model Converged at epoch {i + 1}, loss = {loss.item()}") break optimizer.zero_grad() loss.backward() optimizer.step() if print_st: print(f"Total training time (sec): {time.time() - start_time}, loss - {loss.item()}") return model, losses, train_accuracy, test_accuracy model_normal, cost, training_accuracy, test_accuracy = train((X_train, y_train), print_st=True) #graph cost plt.plot(cost, label='loss') plt.plot(training_accuracy, label='training accuracy') plt.plot(test_accuracy, label='test accuracy') plt.xlabel("epoch") plt.ylabel("Loss") plt.title("Full Batch Training Cost") plt.legend() plt.grid() plt.show() ###Output _____no_output_____ ###Markdown Test Model ###Code def predict(model): """ Predict accuracy => y_hat = f(x). 
""" model.eval() #set to evaluation mode # predict X_test data predictions=[] with torch.no_grad(): for i, data in enumerate(X_test): y_pred = model(data) predictions.append(y_pred.argmax().item()) predictions = np.array(predictions, dtype=np.int16) score = accuracy_score(y_test, predictions) return score, predictions ### Estimated distribution score, predictions = predict(model_normal) print(f"Test Accuracy : {score}") ###Output Test Accuracy : 0.8509316770186336 ###Markdown MALLS - Subsampling instead of directly going from imbalanced source to target using IW, let's convert the source to a more uniform distribution (medial distribution) and then compute the Label Shift + IW on that. Creating uniform distribution from imbalanced dataset using Probability integral transform ###Code biased_probs = 1. / np.array(list(dist_train.values())) biased_probs /= np.sum(biased_probs) p = np.zeros(y_train.shape) for i in range(len(p)): p[i] = biased_probs[y_train[i]] p /= p.sum() medial_idx = np.random.choice(np.arange(len(y_train)), size=y_train.shape, replace=True, p=p) ### Medial Distribution plt.bar(x=np.unique(y_train[medial_idx]), height=get_distribution(y_train[medial_idx].numpy()).values()) plt.title("Medial Distribution") plt.xlabel("Class label") plt.ylabel("PMF") plt.grid() plt.show() ### Subsampling - take Medial Dist. X_train, y_train = X_train[medial_idx], y_train[medial_idx] ###Output _____no_output_____ ###Markdown BSSE - Label Shift IW ###Code delta = 1e-8 #0 < delta < 1/k where k = number of classes. validation_ratio = 0.5 data = X_train.clone(), y_train.clone() #store original training distribution. #Split training into training (source) and validation (hold-out) X_train, X_validation, y_train, y_validation = train_test_split(X_train, y_train, test_size=validation_ratio, random_state=42) ### obtain classifier by training on X_train, y_train f, cost, training_accuracy, test_accuracy = train((X_train, y_train), print_st=True) ### Estimated distribution score, _ = predict(f) print(f"No IW test score : {score}") #graph cost plt.plot(cost, label='loss') plt.plot(training_accuracy, label='training accuracy') plt.plot(test_accuracy, label='test accuracy') plt.xlabel("epoch") plt.ylabel("Loss") plt.title("Source Only Cost") plt.legend() plt.grid() plt.show() ###Output _____no_output_____ ###Markdown Generate Label Shift ###Code def calculate_confusion_matrix(X, Y): """ Calculates value for \hat{C}_{\hat{y}, y} @Params: - X : Validation data, i.e. X2 - Y : Validation labels, i.e. 
      Y2
    """
    k = len(np.unique(y))  # number of classes (uses the full label set y, so k stays fixed even if a class is missing from the validation split)
    conf_matrx = np.zeros(shape=(k, k))

    # freeze params
    f.eval()
    predictions=[]
    with torch.no_grad():
        for i, data in enumerate(X):
            y_pred = f(data)
            predictions.append(y_pred.argmax().item())

    predictions = np.array(predictions)
    for i in range(k):
        for j in range(k):
            idxs = np.where((predictions == i) & (Y.numpy() == j))[0]
            conf_matrx[i, j] = float(len(idxs) / len(X))

    return conf_matrx, k

def calculate_target_priors(X, k):
    """
    Calculates \hat{μ}_\hat{y}
    """
    preds = np.array([f(xp).argmax().item() for xp in X], np.int16)
    target_priors = np.zeros(k)
    for i in range(k):
        target_priors[i] = len(np.where(preds == i)[0]) / len(preds)
    return target_priors

conf_matrix, k = calculate_confusion_matrix(X_validation, y_validation)
mu = calculate_target_priors(X_test, k)

def compute_weights(cmf, target_priors):
    """
    Computes label weights
    """
    w, _ = np.linalg.eig(cmf + np.random.uniform(0, 1e-3, size=cmf.shape))
    if abs(w.real.min()) <= delta:  # non-invertible matrix
        return np.full(shape=len(target_priors), fill_value=float(1 / len(target_priors)))
    try:
        label_weights = np.linalg.inv(cmf) @ target_priors
    except np.linalg.LinAlgError:
        label_weights = np.linalg.inv(cmf + np.random.uniform(0, 1e-3, size=cmf.shape)) @ target_priors
    label_weights = abs(label_weights)
    label_weights /= label_weights.sum()
    #label_weights[label_weights < 0] = 0  # strictly set rare occurrences to 0 instead of abs (see BBSE)
    return label_weights

label_weights = compute_weights(conf_matrix, mu)
for lw in label_weights:
    print(float(lw), end=", ")
print(f"\n|w| = {np.linalg.norm(label_weights)}")
###Output
0.007581779925065469, 0.05502089063609477, 0.0937116784166715, 0.06616855010477055, 0.27762282777887914, 0.021293825050924836, 0.21843521523923445, 0.02670093778416191, 0.1724922060523485, 0.06097208901184891, 
|w| = 0.4191310524162318
###Markdown
Importance Weights Training
###Code
def train_iw(X, y, network, epochs=500, print_st=True):
    """
    Train the model using class weights
    """
    start_time = time.time()
    m, k = len(X), len(np.unique(y))
    loss_function = nn.CrossEntropyLoss(weight=torch.DoubleTensor(label_weights))
    losses = []

    model = Network().to(device)  # load local model
    cloned_params = {}
    for layer in network.state_dict():
        cloned_params[layer] = network.state_dict()[layer].clone()
    model.load_state_dict(cloned_params)

    optimizer = torch.optim.Adam(model.parameters(), lr=0.005)

    # gather accuracies
    train_accuracy = []
    test_accuracy = []

    for i in range(epochs):
        model.train()  # set back to train
        y_pred = model(X)
        loss = loss_function(y_pred, y)
        losses.append(loss)

        ## training accuracy
        predictions = np.array(y_pred.argmax(axis=1), dtype=np.int16)
        score = accuracy_score(y, predictions)
        train_accuracy.append(score)

        ## test accuracy
        model.eval()
        with torch.no_grad():
            y_pred = model(X_test)
            predictions = np.array(y_pred.argmax(axis=1), dtype=np.int16)
            score = accuracy_score(y_test, predictions)
            test_accuracy.append(score)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    if print_st:
        print(f"Total training time (sec): {time.time() - start_time}, loss - {loss.item()}")

    return model, losses, train_accuracy, test_accuracy

X_train, y_train = data  # regain data
f_weighted, cost, training_accuracy, test_accuracy = train_iw(X_train, y_train, f)

# graph cost
plt.plot(cost, label='loss')
plt.plot(training_accuracy, label='training accuracy')
plt.plot(test_accuracy, label='test accuracy')
plt.xlabel("epoch")
plt.ylabel("Loss")
plt.title("Full Source Training Cost")
plt.legend()
plt.grid()
plt.show()
###Output
_____no_output_____
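###Markdown
Before testing, a tiny 2-class sanity check of the linear solve $\hat{w} = \hat{C}^{-1}\hat{\mu}$ used in `compute_weights` (the numbers below are made up by us):
###Code
# Made-up 2-class example: rows of C_toy index the predicted class, columns the true class
C_toy = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
mu_toy = np.array([0.3, 0.7])  # predicted-label frequencies on the target
w_toy = np.linalg.inv(C_toy) @ mu_toy
print(w_toy)  # [0.33333333 1.66666667]; class 1 is over-represented in the target, so its weight is > 1
###Output
_____no_output_____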
###Markdown Importance Weighting Test ###Code def predict_IW(model): """ Predict accuracy => y_hat = f(x). Refer to BBSE, ICML '18 """ model.eval() #set to evaluation mode predictions=[] with torch.no_grad(): for i, data in enumerate(X_test): y_pred = model(data) y_pred *= label_weights #IW softmax predictions.append(y_pred.argmax().item()) predictions = np.array(predictions, dtype=np.int16) score = accuracy_score(y_test, predictions) return score, predictions ### Prediction score, _ = predict_IW(f_weighted) print(f"IW test score : {score}") ###Output IW test score : 0.9006211180124224 ###Markdown MAML - Importance Weight Bias Reduction ###Code ### declare maml maml = MAML(X_validation, y_validation, f_weighted, label_weights) num_meta_updates = 2 for _ in range(num_meta_updates): maml.update() label_weights = maml.get_label_weights() print(f"Updated Weights : {label_weights} \n |w| = {np.linalg.norm(label_weights)}") ###Output Updated Weights : [0.02758347 0.07501227 0.11369767 0.04616687 0.29762258 0.04129891 0.23843501 0.04669691 0.19249243 0.0809733 ] |w| = 0.463004207342887 ###Markdown Run Predictions with updated weights ###Code ### Prediction score, _ = predict_IW(f_weighted) print(f"MAML + IW test score : {score}") ###Output MAML + IW test score : 0.906832298136646
references/exercises/ Dummies and VIF - Exercise/sklearn - Dummies and VIF - Exercise Solution.ipynb
###Markdown Exercise Solution - Dummies and VIF Please run all the cells below and find the exercise and the solution at the bottom of the notebook. Importing the relevant libraries ###Code import numpy as np import pandas as pd import statsmodels.api as sm import matplotlib.pyplot as plt from sklearn.linear_model import LinearRegression import seaborn as sns sns.set() ###Output _____no_output_____ ###Markdown Loading the raw data ###Code raw_data = pd.read_csv('1.04. Real-life example.csv') raw_data.head() ###Output _____no_output_____ ###Markdown Preprocessing Exploring the descriptive statistics of the variables ###Code raw_data.describe(include='all') ###Output _____no_output_____ ###Markdown Determining the variables of interest ###Code data = raw_data.drop(['Model'],axis=1) data.describe(include='all') ###Output _____no_output_____ ###Markdown Dealing with missing values ###Code data.isnull().sum() data_no_mv = data.dropna(axis=0) data_no_mv.describe(include='all') ###Output _____no_output_____ ###Markdown Exploring the PDFs ###Code sns.distplot(data_no_mv['Price']) ###Output _____no_output_____ ###Markdown Dealing with outliers ###Code q = data_no_mv['Price'].quantile(0.99) data_1 = data_no_mv[data_no_mv['Price']<q] data_1.describe(include='all') sns.distplot(data_1['Price']) sns.distplot(data_no_mv['Mileage']) q = data_1['Mileage'].quantile(0.99) data_2 = data_1[data_1['Mileage']<q] sns.distplot(data_2['Mileage']) sns.distplot(data_no_mv['EngineV']) data_3 = data_2[data_2['EngineV']<6.5] sns.distplot(data_3['EngineV']) sns.distplot(data_no_mv['Year']) q = data_3['Year'].quantile(0.01) data_4 = data_3[data_3['Year']>q] sns.distplot(data_4['Year']) data_cleaned = data_4.reset_index(drop=True) data_cleaned.describe(include='all') ###Output _____no_output_____ ###Markdown Checking the OLS assumptions ###Code f, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True, figsize =(15,3)) ax1.scatter(data_cleaned['Year'],data_cleaned['Price']) ax1.set_title('Price and Year') ax2.scatter(data_cleaned['EngineV'],data_cleaned['Price']) ax2.set_title('Price and EngineV') ax3.scatter(data_cleaned['Mileage'],data_cleaned['Price']) ax3.set_title('Price and Mileage') plt.show() sns.distplot(data_cleaned['Price']) ###Output _____no_output_____ ###Markdown Relaxing the assumptions ###Code log_price = np.log(data_cleaned['Price']) data_cleaned['log_price'] = log_price data_cleaned f, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True, figsize =(15,3)) ax1.scatter(data_cleaned['Year'],data_cleaned['log_price']) ax1.set_title('Log Price and Year') ax2.scatter(data_cleaned['EngineV'],data_cleaned['log_price']) ax2.set_title('Log Price and EngineV') ax3.scatter(data_cleaned['Mileage'],data_cleaned['log_price']) ax3.set_title('Log Price and Mileage') plt.show() data_cleaned = data_cleaned.drop(['Price'],axis=1) ###Output _____no_output_____ ###Markdown Multicollinearity ###Code data_cleaned.columns.values from statsmodels.stats.outliers_influence import variance_inflation_factor variables = data_cleaned[['Mileage','Year','EngineV']] vif = pd.DataFrame() vif["VIF"] = [variance_inflation_factor(variables.values, i) for i in range(variables.shape[1])] vif["features"] = variables.columns vif data_no_multicollinearity = data_cleaned.drop(['Year'],axis=1) ###Output _____no_output_____ ###Markdown Create dummy variables ###Code data_with_dummies = pd.get_dummies(data_no_multicollinearity, drop_first=True) data_with_dummies.head() ###Output _____no_output_____ ###Markdown Rearrange a bit ###Code 
data_with_dummies.columns.values

cols = ['log_price', 'Mileage', 'EngineV', 'Brand_BMW', 'Brand_Mercedes-Benz',
       'Brand_Mitsubishi', 'Brand_Renault', 'Brand_Toyota', 'Brand_Volkswagen',
       'Body_hatch', 'Body_other', 'Body_sedan', 'Body_vagon', 'Body_van',
       'Engine Type_Gas', 'Engine Type_Other', 'Engine Type_Petrol',
       'Registration_yes']

data_preprocessed = data_with_dummies[cols]
data_preprocessed.head()
###Output
_____no_output_____
###Markdown
*** *** ***

EXERCISE

Part 1
Calculate the variance inflation factors for all variables contained in data_preprocessed. Anything strange?

Part 2
As mentioned in the lecture, your task is to calculate the variance inflation factor (VIF) of all variables including the dummies (but without the dependent variable).

Part 3
Now calculate the VIFs for a data frame where we include the dummies, without 'log_price', but DO NOT DROP THE FIRST DUMMY. Anything strange now?

***

Part 1 - Solution
###Code
# Let's simply use the data_preprocessed and the VIF code from above
variables = data_preprocessed
vif = pd.DataFrame()
vif["VIF"] = [variance_inflation_factor(variables.values, i) for i in range(variables.shape[1])]
vif["features"] = variables.columns

vif
###Output
_____no_output_____
###Markdown
Obviously, 'log_price' has a very high VIF. This implies it is most definitely **linearly correlated** with all the other variables. And this is no surprise! We are using a linear regression to determine 'log_price' given values of the independent variables! This is exactly what we expect - a linear relationship!

However, to actually assess multicollinearity for the predictors, we have to drop 'log_price'. The multicollinearity assumption refers only to the idea that the **independent variables** should not be collinear.

Part 2 - Solution
###Code
# Let's simply drop log_price from data_preprocessed
variables = data_preprocessed.drop(['log_price'],axis=1)
vif = pd.DataFrame()
vif["VIF"] = [variance_inflation_factor(variables.values, i) for i in range(variables.shape[1])]
vif["features"] = variables.columns

vif
###Output
_____no_output_____
###Markdown
As you can see, all VIFs are pretty much acceptable. The ones that are particularly high are 'EngineV' and 'Registration_yes'. We already discussed 'EngineV' in the lecture.

In the case of registration, the main issue is that **most values are 'yes'**, so all types of problems come from there. One way this imbalance manifests is in multicollinearity. Remember that all independent variables are pretty good at determining 'log_price'? Well, if 'registration' is always 'yes', then when we predict 'log_price' we are predicting registration, too (it is going to be 'yes'). That is why, whenever a single category is so predominant, we may just drop the variable. Note that it will most probably be insignificant anyway.

Part 3 - Solution
###Code
# To solve this one, we must create a new variable with dummies, without dropping the first one
data_with_dummies_new = pd.get_dummies(data_no_multicollinearity)#, drop_first=True)
data_with_dummies_new.head()

# Let's simply drop 'log_price' from this new variable
variables = data_with_dummies_new.drop(['log_price'],axis=1)
vif = pd.DataFrame()
vif["VIF"] = [variance_inflation_factor(variables.values, i) for i in range(variables.shape[1])]
vif["features"] = variables.columns

vif
###Output
C:\ProgramData\Anaconda3\lib\site-packages\statsmodels\stats\outliers_influence.py:167: RuntimeWarning: divide by zero encountered in double_scalars
  vif = 1. / (1. - r_squared_i)
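###Markdown
The divide-by-zero warning in Part 3 is the dummy variable trap at work: within each categorical feature the full set of dummies sums to 1, so every dummy is a perfect linear combination of the others ($R^2 = 1$ and VIF $= 1/(1-R^2)$ diverges). A quick check (a sketch of ours, not part of the original solution):
###Code
# Within one categorical feature, the full set of dummies always sums to 1
brand_cols = [c for c in data_with_dummies_new.columns if c.startswith('Brand_')]
data_with_dummies_new[brand_cols].sum(axis=1).unique()  # expected: array([1])
###Output
_____no_output_____
###Markdown
This perfect multicollinearity is exactly why `drop_first=True` was used when the dummies were first created.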
Cap05/Cap05-Exercicios.ipynb
###Markdown
Data Science Academy - Python Fundamentos - Capítulo 5

Download: http://github.com/dsacademybr

Exercises
###Code
# Exercise 1 - Create an object from the class below, called roc1, passing 2 parameters, and then call
# its attributes and methods
from math import sqrt

class Rocket():

    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y

    def move_rocket(self, x_increment=0, y_increment=1):
        self.x += x_increment
        self.y += y_increment

    def print_rocket(self):
        print(self.x, self.y)

# assigning values to the attributes
roc1 = Rocket(10 ,20)

# calling the attributes
print('Horizontal displacement, ', roc1.x)
print('Vertical displacement, ', roc1.y)

# note: move_rocket() and print_rocket() return None; they act in place
print(roc1.move_rocket())
print(roc1.print_rocket())

# Exercise 2 - Create a class called Pessoa() with the attributes: nome, cidade, telefone and e_mail. Use at least 2
# special methods in your class. Create an object of your class and make a call to at least one of its
# special methods.
class Pessoa():
    def __init__(self, nome, cidade, telefone, e_mail, valores=None):
        self.nome = nome
        self.cidade = cidade
        self.telefone = telefone
        self.e_mail = e_mail
        if valores is None:
            self.valores = []
        else:
            self.valores = valores

    def __str__(self):
        return 'Name: %s, City: %s, Phone: %s, EMail: %s' \
    %(self.nome, self.cidade, self.telefone, self.e_mail)

    def __invert__(self):
        self.valores = self.nome
        return reversed(self.valores)

# instantiating the object
p1 = Pessoa(nome = 'Carlos', cidade = 'Mauá', telefone = '(11)93273-2754', e_mail = '[email protected]', valores = 'Carlos')

# calling the special method __str__
print(str(p1))

# Exercise 3 - Create the Smartphone class with 2 attributes, tamanho and interface, and create the MP3Player class with
# the attribute capacidade. The MP3Player class must inherit the attributes of the Smartphone class.
class Smartphone():
    def __init__(self, tamanho, interface):
        self.tamanho = tamanho
        self.interface = interface

class Mp3Player(Smartphone):
    def __init__(self, capacidade, tamanho = 'Pequeno', interface = 'Led'):
        self.capacidade = capacidade
        Smartphone.__init__(self, tamanho, interface)

    def print_mp3Player(self):
        print('Values for the created object: %s %s %s' \
              %(self.tamanho, self.interface, self.capacidade))

# instantiating an object of the class
device1 = Mp3Player('64 GB')
device1.print_mp3Player()
###Output
Values for the created object: Pequeno Led 64 GB
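###Markdown
A quick check (our addition, not part of the original exercise) that the inheritance in Exercise 3 works as intended: an Mp3Player instance is also a Smartphone and carries the inherited attributes.
###Code
# device1 was created above as Mp3Player('64 GB')
print(isinstance(device1, Smartphone))  # True
print(device1.tamanho, device1.interface, device1.capacidade)
###Output
_____no_output_____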
Natural Language Processing of Twitter Samples using nltk.ipynb
###Markdown Natural Language Processing - twitter_samples dataset Natural Language Processing with `nltk``nltk` is the most popular Python package for Natural Language processing, it provides algorithms for importing, cleaning, pre-processing text data in human language and then apply computational linguistics algorithms like sentiment analysis.http://www.nltk.org/`nltk` also provides access to a dataset of tweets from Twitter, it includes a set of tweets already classified as negative or positive. Download and inspect the twitter_samples datasetIt also includes many easy-to-use datasets in the `nltk.corpus` package, we can download for example the `twitter_samples` package using the `nltk.download` function: ###Code import nltk nltk.download("twitter_samples") from nltk.corpus import twitter_samples ###Output [nltk_data] Downloading package twitter_samples to [nltk_data] /Users/anandkarthick/nltk_data... [nltk_data] Package twitter_samples is already up-to-date! ###Markdown First let's check the common `fileids` method of `nltk` corpora: ###Code twitter_samples.fileids() len(twitter_samples.strings('positive_tweets.json')) ###Output _____no_output_____ ###Markdown The twitter_samples object has a `tokenized()` method that returns all tweets from a fileid already individually tokenized. Read its documentation and use it to find the number of positive and negative tweets. ###Code number_of_positive_tweets = len(twitter_samples.strings('positive_tweets.json')) print(number_of_positive_tweets) number_of_negative_tweets = len(twitter_samples.strings('negative_tweets.json')) print(number_of_negative_tweets) ###Output 5000 ###Markdown Build a bag-of-words model functionThe simplest model for analyzing text is just to think about text as an unordered collection of words (bag-of-words). This can generally allow to infer from the text the category, the topic or the sentiment.From the bag-of-words model we can build features to be used by a classifier, here we assume that each word is a feature that can either be `True` or `False`.We implement this in Python as a dictionary where for each word in a sentence we associate `True`, if a word is missing, that would be the same as assigning `False`. ###Code import string string.punctuation ###Output _____no_output_____ ###Markdown First step we define a list of words that we want to filter out of our dataset: ###Code useless_words = nltk.corpus.stopwords.words("english") + list(string.punctuation) useless_words[:5] useless_words[-5:] positive_tweets = twitter_samples.strings('positive_tweets.json') positive_tweets[:5] ###Output _____no_output_____ ###Markdown Tokenize Text in WordsThe first step in Natural Language processing is generally to split the text into words, this process might appear simple but it is very tedious to handle all corner cases, see for example all the issues with punctuation we have to solve if we just start with a split on whitespace: ###Code positive_tokenized = twitter_samples.tokenized('positive_tweets.json') negative_tokenized = twitter_samples.tokenized('negative_tweets.json') positive_tokenized[:2][1] """Build a bag of words model""" def build_bag_of_words_features_filtered(words): return { word:1 for word in words if not word in useless_words} build_bag_of_words_features_filtered(positive_tokenized[:2][1]) ###Output _____no_output_____ ###Markdown Create a list of all wordsBefore performing sentiment analysis, let's first inspect the dataset a little bit more by creating a list of all words. 
###Code words = [] for dataset in ["positive_tweets.json", "negative_tweets.json"]: for tweet in twitter_samples.tokenized(dataset): words.extend(tweet) len(words) ###Output _____no_output_____ ###Markdown Study the code above: it is a case of a nested loop, where for each dataset we loop through each of its tweets. Also notice we are using `extend`, how does it differ from `append`? Try it on a simple case, or read the documentation or Google for it!Now let's filter out punctuation and stopwords: ###Code filtered_words = [] for x in words: if x not in useless_words: filtered_words.append(x) len(filtered_words) ###Output _____no_output_____ ###Markdown First we want to filter out `useless_words` as defined in the previous section, this will reduce the length of the dataset by more than a factor of 2: Find the most common words The `collections` package of the standard library contains a `Counter` class that is handy for counting frequencies of words in our list: ###Code from collections import Counter counter = Counter(filtered_words) ###Output _____no_output_____ ###Markdown It also has a `most_common()` method to access the words with the highest counts: ###Code most_common_words = counter.most_common()[:10] most_common_words %matplotlib inline import matplotlib.pyplot as plt sorted_word_counts = sorted(list(counter.values()), reverse=True) plt.loglog(sorted_word_counts) plt.ylabel("Freq") plt.xlabel("Word Rank"); ###Output _____no_output_____ ###Markdown Build the features for machine learningUsing our `build_bag_of_words_features_filtered` function we can build separately the negative and positive features.The format of the positive features should be: [ ( { "here":1, "some":1, "words":1 }, "pos" ), ( { "another":1, "tweet":1}, "pos" ) ] It is a list of tuples, the first element is a dictionary of the words with 1 if that word appears, the second the "pos" or "neg" string. ###Code negative_features = [] for x in negative_tokenized: negative_features.append([build_bag_of_words_features_filtered(x),"neg"]) positive_features = [] for x in positive_tokenized: positive_features.append([build_bag_of_words_features_filtered(x),"pos"]) positive_features[0] negative_features[6] ###Output _____no_output_____ ###Markdown Train a NaiveBayesClassifier ###Code from nltk.classify import NaiveBayesClassifier ###Output _____no_output_____ ###Markdown A classifier based on the Naive Bayes algorithm. In order to find the probability for a label, this algorithm first uses the Bayes rule to express P(label|features) in terms of P(label) and P(features|label): P(label|features) = P(label) * P(features|label) / P(features). Let's use 80% of the data for training, the rest for validation: ###Code split = int(len(positive_features) * 0.8) split classifier = NaiveBayesClassifier.train(positive_features[:split]+negative_features[:split]) ###Output _____no_output_____ ###Markdown We can check after training what the accuracy is on the training set, i.e. the same data used for training; we expect this to be a very high number because the algorithm already "saw" those data.
Accuracy is the fraction of the data that is classified correctly; we can turn it into a percentage: ###Code training_accuracy = nltk.classify.util.accuracy(classifier, positive_features[:split]+negative_features[:split])*100 print(training_accuracy) test_accuracy = nltk.classify.util.accuracy(classifier, positive_features[split:]+negative_features[split:])*100 print(test_accuracy) classifier.classify({'miss': 1, '@RileyMcDonough': 1, 'Thank': 1, 'smile': 1}) ###Output _____no_output_____ ###Markdown It looks like the accuracy on the test set is very high; check the most informative features below to understand why: ###Code classifier.show_most_informative_features() ###Output Most Informative Features :( = 1 neg : pos = 2362.3 : 1.0 :) = 1 pos : neg = 1139.0 : 1.0 See = 1 pos : neg = 37.7 : 1.0 TOO = 1 neg : pos = 36.3 : 1.0 THANKS = 1 neg : pos = 35.0 : 1.0 THAT = 1 neg : pos = 27.7 : 1.0 miss = 1 neg : pos = 26.4 : 1.0 sad = 1 neg : pos = 25.0 : 1.0 x15 = 1 neg : pos = 23.7 : 1.0 Thank = 1 pos : neg = 22.3 : 1.0
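###Markdown The two emoticons dominate the table, which goes a long way towards explaining the high test accuracy. To classify a brand-new tweet, the text has to go through the same steps as the training data: tokenize, then build the filtered bag-of-words features. A minimal sketch, assuming the `classifier` and `build_bag_of_words_features_filtered` defined above are in scope (`TweetTokenizer` is nltk's tweet-aware tokenizer; the example tweet is made up): ###Code from nltk.tokenize import TweetTokenizer def classify_tweet(text): # Same pipeline as training: tokens -> filtered bag-of-words -> label tokens = TweetTokenizer().tokenize(text) return classifier.classify(build_bag_of_words_features_filtered(tokens)) classify_tweet("My flight got cancelled, so sad :(") # expected: 'neg'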
RealTimeDRTraining.ipynb
###Markdown REAL TIME DIGIT RECOGNITION FOR SINGLE DIGITS- Training ###Code # Importing libraries import tensorflow as tf import cv2 import numpy as np import os from sklearn.model_selection import train_test_split import matplotlib.pyplot as plt from keras.preprocessing.image import ImageDataGenerator noOfClasses=10 # 0-9 digits imgs=[] # list to save images classes=[] # list for corresponding classes of images for i in range(0,noOfClasses): imgList=os.listdir('Data/'+str(i)) for j in imgList: currentImg=cv2.imread('Data/'+str(i)+'/'+str(j)) currentImg=cv2.resize(currentImg,(32,32)) imgs.append(currentImg) classes.append(i) print(len(imgs)) # length of image list print(len(classes)) # length of class list # Converting to numpy arrays imgs=np.array(imgs) classes=np.array(classes) print("Shape of Images list : "+str(imgs.shape)+" & Shape of classes list : "+str(classes.shape)) ###Output Shape of Images list : (10160, 32, 32, 3) & Shape of classes list : (10160,) ###Markdown Now that we have all the data ready, we need to split it into training and testing data ###Code train_imgs,test_imgs,train_labels,test_labels=train_test_split(imgs,classes,test_size=0.2) print("Training Images Shape : "+str(train_imgs.shape)) print("Testing Images Shape : "+str(test_imgs.shape)) ###Output Training Images Shape : (8128, 32, 32, 3) Testing Images Shape : (2032, 32, 32, 3) ###Markdown Now we need some validation data as well, so let's split train_imgs into training and validation data ###Code train_imgs,val_imgs,train_labels,val_labels=train_test_split(train_imgs,train_labels,test_size=0.2) print("Training Images Shape : "+str(train_imgs.shape)) print("Validation Images Shape : "+str(val_imgs.shape)) # Count how many samples of each digit class are present in the training labels # and save the counts in a list noOflabelSamples=[] for i in range(0,noOfClasses): noOflabelSamples.append(len(np.where(train_labels==i)[0])) # Let's print them # for i in range(0,noOfClasses): # print("Class " + str(i) + " : " + str(noOflabelSamples[i])) print(noOflabelSamples) ###Output [647, 665, 655, 672, 642, 644, 639, 649, 668, 621] ###Markdown Let's plot our results ###Code %matplotlib inline plt.figure(figsize=(15,10)) plt.bar(range(0,noOfClasses),noOflabelSamples) plt.title("Number of Images in each class") plt.xlabel("Class Number") plt.ylabel("Number of Images") plt.show() ###Output _____no_output_____ ###Markdown Converting our images to grayscale and preprocessing the input so all our images are properly visible to the network during training ###Code def preProcess(img): img=cv2.cvtColor(img,cv2.COLOR_BGR2GRAY) img=cv2.equalizeHist(img) x,img=cv2.threshold(img, 127, 255, cv2.THRESH_BINARY) img=img/255 return img # Now preprocessing train_imgs= np.array(list(map(preProcess,train_imgs))) test_imgs= np.array(list(map(preProcess,test_imgs))) val_imgs= np.array(list(map(preProcess,val_imgs))) # To test if the images are properly preprocessed img=train_imgs[30] plt.imshow(img, cmap='gray', interpolation='none') # Display using OpenCV # cv2.imshow("Preprocessed",img.astype('uint8') * 255) # cv2.waitKey(0) # cv2.destroyAllWindows() print("Training Images Shape : "+str(train_imgs.shape)) print("Testing Images Shape : "+str(test_imgs.shape)) print("Validation Images Shape : "+str(val_imgs.shape)) ###Output Training Images Shape : (6502, 32, 32) Testing Images Shape : (2032, 32, 32) Validation Images Shape : (1626, 32, 32) ###Markdown For better training we augment the images - to augment images we use ImageDataGenerator, which requires a rank-four shape, so we have to
reshape images ###Code train_imgs = train_imgs.reshape(train_imgs.shape[0],train_imgs.shape[1],train_imgs.shape[2],1) test_imgs = test_imgs.reshape(test_imgs.shape[0],test_imgs.shape[1],test_imgs.shape[2],1) val_imgs = val_imgs.reshape(val_imgs.shape[0],val_imgs.shape[1],val_imgs.shape[2],1) datagen = ImageDataGenerator(width_shift_range=0.1, height_shift_range=0.1, zoom_range=0.2, shear_range=0.1, rotation_range=10) datagen.fit(train_imgs) ## Creating Convolutional Neural Network Model model = tf.keras.models.Sequential([ tf.keras.layers.Conv2D(60,(5,5),activation='relu',input_shape=(32,32,1)), tf.keras.layers.Conv2D(60,(5,5),activation='relu'), tf.keras.layers.MaxPooling2D(2,2), tf.keras.layers.Conv2D(30,(3,3),activation='relu'), tf.keras.layers.Conv2D(30,(3,3),activation='relu'), tf.keras.layers.MaxPooling2D(2,2), tf.keras.layers.Dropout(0.5), tf.keras.layers.Flatten(), tf.keras.layers.Dense(512,activation='relu'), tf.keras.layers.Dropout(0.5), tf.keras.layers.Dense(10,activation='softmax')]) model.compile(optimizer='adam',loss='sparse_categorical_crossentropy',metrics=['acc']) # Model Summary model.summary() ###Output Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= conv2d (Conv2D) (None, 28, 28, 60) 1560 _________________________________________________________________ conv2d_1 (Conv2D) (None, 24, 24, 60) 90060 _________________________________________________________________ max_pooling2d (MaxPooling2D) (None, 12, 12, 60) 0 _________________________________________________________________ conv2d_2 (Conv2D) (None, 10, 10, 30) 16230 _________________________________________________________________ conv2d_3 (Conv2D) (None, 8, 8, 30) 8130 _________________________________________________________________ max_pooling2d_1 (MaxPooling2 (None, 4, 4, 30) 0 _________________________________________________________________ dropout (Dropout) (None, 4, 4, 30) 0 _________________________________________________________________ flatten (Flatten) (None, 480) 0 _________________________________________________________________ dense (Dense) (None, 512) 246272 _________________________________________________________________ dropout_1 (Dropout) (None, 512) 0 _________________________________________________________________ dense_1 (Dense) (None, 10) 5130 ================================================================= Total params: 367,382 Trainable params: 367,382 Non-trainable params: 0 _________________________________________________________________ ###Markdown Time to train the model ###Code history = model.fit(datagen.flow(train_imgs,train_labels,batch_size=50), epochs=10, validation_data=(val_imgs,val_labels)) ###Output WARNING:tensorflow:sample_weight modes were coerced from ... 
to ['...'] Train for 131 steps, validate on 1626 samples Epoch 1/10 131/131 [==============================] - 61s 469ms/step - loss: 0.3046 - acc: 0.9011 - val_loss: 0.0664 - val_acc: 0.9785 Epoch 2/10 131/131 [==============================] - 64s 488ms/step - loss: 0.2480 - acc: 0.9225 - val_loss: 0.0519 - val_acc: 0.9846 Epoch 3/10 131/131 [==============================] - 68s 517ms/step - loss: 0.2282 - acc: 0.9268 - val_loss: 0.0487 - val_acc: 0.9865 Epoch 4/10 131/131 [==============================] - 51s 386ms/step - loss: 0.1697 - acc: 0.9439 - val_loss: 0.0380 - val_acc: 0.9865 Epoch 5/10 131/131 [==============================] - 50s 381ms/step - loss: 0.1487 - acc: 0.9536 - val_loss: 0.0322 - val_acc: 0.9895 Epoch 6/10 131/131 [==============================] - 49s 375ms/step - loss: 0.1472 - acc: 0.9548 - val_loss: 0.0242 - val_acc: 0.9932 Epoch 7/10 131/131 [==============================] - 49s 378ms/step - loss: 0.1295 - acc: 0.9588 - val_loss: 0.0323 - val_acc: 0.9889 Epoch 8/10 131/131 [==============================] - 51s 386ms/step - loss: 0.1229 - acc: 0.9594 - val_loss: 0.0273 - val_acc: 0.9926 Epoch 9/10 131/131 [==============================] - 51s 390ms/step - loss: 0.1149 - acc: 0.9629 - val_loss: 0.0325 - val_acc: 0.9877 Epoch 10/10 131/131 [==============================] - 51s 386ms/step - loss: 0.0961 - acc: 0.9695 - val_loss: 0.0244 - val_acc: 0.9920 ###Markdown Lets Plot the results ###Code plt.figure(1) plt.plot(history.history['loss']) plt.plot(history.history['val_loss']) plt.legend(['Training','Validation']) plt.title('Loss') plt.xlabel('epoch') plt.figure(2) plt.plot(history.history['acc']) plt.plot(history.history['val_acc']) plt.legend(['Training','Validation']) plt.title('Accuracy') plt.xlabel('epoch') plt.show() ###Output _____no_output_____ ###Markdown Now lets evalute the Training ###Code score = model.evaluate(test_imgs,test_labels,verbose=0) print('Test Score : ',score[0]) print('Test Accuracy : ', score[1]) ###Output Test Score : 0.02635903046924756 Test Accuracy : 0.992126 ###Markdown Save The MODEL ###Code model.save('trained_model') ###Output WARNING:tensorflow:From C:\Users\My1\AppData\Roaming\Python\Python36\site-packages\tensorflow_core\python\ops\resource_variable_ops.py:1786: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version. Instructions for updating: If using Keras pass *_constraint arguments to layers. INFO:tensorflow:Assets written to: trained_model\assets
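###Markdown For the real-time part, the saved model can be reloaded and a single camera frame classified, as long as the frame goes through exactly the same preprocessing as the training images. A minimal sketch, assuming the `trained_model` directory saved above and the `preProcess` function from earlier are available (`predict_digit` is an illustrative helper, not from the original notebook): ###Code loaded_model = tf.keras.models.load_model('trained_model') def predict_digit(frame): # Same pipeline as training: resize -> grayscale/equalize/threshold/normalize -> rank-four batch img = preProcess(cv2.resize(frame, (32, 32))) probs = loaded_model.predict(img.reshape(1, 32, 32, 1)) return int(np.argmax(probs)), float(np.max(probs)) # predicted class and its confidence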
final_project_files/code/SciSpacy_NER_Models.ipynb
###Markdown Steps from following links - 1. https://github.com/allenai/scispacy 2. https://towardsdatascience.com/using-scispacy-for-named-entity-recognition-785389e7918d ###Code text='In 2 sibs with autosomal recessive primary microcephaly-19 (MCPH19; {617800}), {3:DiStasio et al. (2017)} identified a homozygous c.760C-T transition in exon 8 of the COPB2 gene, resulting in an arg254-to-cys (R254C) substitution at a highly conserved residue in a predicted WD40 protein-binding domain. The mutation, which was found by a combination of homozygosity mapping and whole-exome sequencing and confirmed by Sanger sequencing, segregated with the disorder in the family. It was filtered against the dbSNP, 1000 Genomes Project, Exome Variant Server, and ExAC databases, as well as an in-house database. A mutant mouse model engineered to carry the variant showed evidence of decreased cortical tissue and increased apoptosis of certain neuronal cell populations, suggesting disruption of neuronal development. {3:DiStasio et al. (2017)} concluded that the R254C allele is a hypomorphic variant.' ###Output _____no_output_____ ###Markdown en_ner_jnlpba_md ###Code nlp = en_ner_jnlpba_md.load() doc = nlp(text) displacy_image = displacy.render(doc, jupyter = True, style = 'ent') ###Output _____no_output_____ ###Markdown en_ner_bc5cdr_md ###Code nlp = en_ner_bc5cdr_md.load() doc = nlp(text) displacy_image = displacy.render(doc, jupyter = True, style = 'ent') ###Output _____no_output_____ ###Markdown en_ner_bionlp13cg_md ###Code nlp = en_ner_bionlp13cg_md.load() doc = nlp(text) displacy_image = displacy.render(doc, jupyter = True, style = 'ent') ###Output _____no_output_____ ###Markdown en_ner_craft_md ###Code nlp = en_ner_craft_md.load() doc = nlp(text) displacy_image = displacy.render(doc, jupyter = True, style = 'ent') ###Output _____no_output_____ ###Markdown en_core_sci_sm ###Code import en_core_sci_sm nlp = en_core_sci_sm.load() doc = nlp(text) displacy_image = displacy.render(doc, jupyter = True, style = 'ent') nlp.pipeline ###Output _____no_output_____ ###Markdown Adding Linker with a knowledge base to current pipeline ###Code from scispacy.linking import EntityLinker linker = EntityLinker(resolve_abbreviations=True, name="umls") # name represents the knowledge base to link to, time-consuming step and memory(RAM) consuming on google colab nlp.add_pipe(linker) doc = nlp(text) displacy_image = displacy.render(doc,jupyter=True, style='ent') ###Output /usr/local/lib/python3.6/dist-packages/scispacy/candidate_generation.py:284: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray extended_neighbors[empty_vectors_boolean_flags] = numpy.array(neighbors)[:-1] /usr/local/lib/python3.6/dist-packages/scispacy/candidate_generation.py:285: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. 
If you meant to do this, you must specify 'dtype=object' when creating the ndarray extended_distances[empty_vectors_boolean_flags] = numpy.array(distances)[:-1] ###Markdown Examing a sample entity ###Code entity = doc.ents[1] for umls_ent in entity._.kb_ents: print(linker.kb.cui_to_entity[umls_ent[0]]) ###Output CUI: C4540488, Name: MICROCEPHALY 19, PRIMARY, AUTOSOMAL RECESSIVE Definition: None TUI(s): T047 Aliases: (total: 1): MCPH19 CUI: C1138529, Name: MCPH1 protein, human Definition: Microcephalin (835 aa, ~93 kDa) is encoded by the human MCPH1 gene. This protein plays a role in both cell cycle regulation and neuronal development. TUI(s): T116, T123 Aliases: (total: 8): microcephalin protein, human, BRIT1 protein, human, MCPH1, microcephalin, human, microcephaly, primary autosomal recessive 1 protein, human, MCPH1 protein, human, Microcephalin, BRCT-repeat inhibitor of TERT expression 1 protein, human CUI: C1417075, Name: MCPH1 gene Definition: This gene plays a role in both mitotic checkpoints and brain development. TUI(s): T028 Aliases (abbreviated, total: 11): MCPH1 Gene, BRIT1, microcephalin 1, MCPH1 gene, BRCT-REPEAT INHIBITOR OF TERT EXPRESSION 1, MCPH1, MCPH1 GENE, FLJ12847, MICROCEPHALIN, BRCT-repeat inhibitor of TERT expression 1 CUI: C3273388, Name: CDK6 wt Allele Definition: Human CDK6 wild-type allele is located within 7q21-q22 and is approximately 232 kb in length. This allele, which encodes cyclin-dependent kinase 6 protein, is involved in cell cycle progression. Variation in the gene may influence stature. A translocation of this gene and the MLLT10 gene may be associated with acute lymphoblastic leukemia. TUI(s): T028 Aliases: (total: 6): MGC59692, CDK6 wt Allele, MCPH12, Cyclin Dependent Kinase 6 wt Allele, PLSTIRE, Cyclin-Dependent Kinase 6 Gene ###Markdown ###Code linker = EntityLinker(resolve_abbreviations=True, name="go") # gene ontology knowledge base # from scispacy.abbreviation import AbbreviationDetector # # nlp = spacy.load("en_core_sci_sm") # # Add the abbreviation pipe to the spacy pipeline. # abbreviation_pipe = AbbreviationDetector(nlp) # nlp.add_pipe(abbreviation_pipe) nlp.add_pipe(linker) doc = nlp(text) displacy_image = displacy.render(doc,jupyter=True, style='ent') entity = doc.ents[0] entity.vector ###Output _____no_output_____
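###Markdown Instead of inspecting entities one at a time, the same objects can be summarised in a loop: each entity's `_.kb_ents` holds (CUI, score) candidate pairs, and the linker's knowledge base maps a CUI back to its record. A minimal sketch, assuming the linked pipeline built above (`canonical_name` is assumed here to be the name field on the knowledge-base entity record): ###Code for ent in doc.ents: if ent._.kb_ents: cui, score = ent._.kb_ents[0] # best-scoring candidate print(f"{ent.text!r} -> {cui} ({linker.kb.cui_to_entity[cui].canonical_name}), score={score:.2f}") else: print(f"{ent.text!r} -> no candidate above threshold")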
ScalableComputingUsingPython.ipynb
###Markdown **Scalable Computing Using Python** *Multi-Threading* *Multi-Processing* *Message Passing Interface (MPI) - Cluster Computing* [image.png: inline base64 title graphic omitted]
M6kIsN4Lz/dRM5mXDhpA4VUYN2CEG0lCnEfe0e9Fvry2QuvzAeVdQasgP+SWvD3SRaWj7KSeX4ypbD5t9oC3cvDvm2e1HnATfGsje29zMPtVSF8Eqdmn3oa/NJfh2MbBqL9p/zHq31ebUBHDzKPeq6WCQEZQrZvnZRgrpk7i5Md+TsWM/ajD4y1zO7Tqp98vwSTiVzLW2h7uXBgwh60qkewI1W9qxcpYdZalv01WXuyetNtgrdf856NO9zu4KWD2X3XcwiIShSyPe1e/aT3i6Zw1VI3/jAmadsXp3dAkQecBIw10uWPswt0Txpw13yQ7HtXhbDjVr2vFyFR4rghfBK1WQuPz8lW4PuP2c9mvd5sQGcesRBsJCJn7ctFQJf/avTrZ9AdcmcnObulOLx6BfL15DdQhDeTwtXl7li270shhu1mfNyFR5ZBDV5pWoy5zuVkS0h3eKR2dh6tGUwGOpR7FHXxbXuqZWGiMWMljwFHy3qL6dqMJ/aSV8RBZZ9v2CQcW5X+NaR3UKwhty14ONc5pptYe7FwfMxbH8w7MYXw3gzV+UqPKJFyLboa8qcHsiWkO8/Zz2a97mc9CjVNCzzfbVHXRdyJrS7K6durUKQp1Db106GEO5Ates5O7fUMzzH5WAd2XWQ7Y6n4nKfr+da2sILVByybffy3fg8jDZz2/Z+lcw1tuijzG03+f5z1qN5n8s4qDaAM498j7pOEECEo1UIarvcCb6vnVkuxh0y9xhubrUs6qREM+K9LrsOfD2LeyDHairOLZtt4e7lQZ8ccWGtduNzzyhaQrm9XyVzWSG8UqXMHRSLXLIFZPvP6bFzA7i471y1R103I3tK0xMe6WWEmEuRQravnUzmupVdVoosvt+NFrVRne4mHfXq7DqpdscbHOEhSDwNKb+xDLW2cPcDizKMMatt92C2CHkzF+UqPMxS26KvOmhhTDrJ52CTNoCLRZjoJT7jKLvR/fWyAZ1D1oefNm3CBnCPVobikgSHmzBceKuUbBu+FMlvRH8UO16U/HbX4IqnVXKy7QbyOdikDeBOJiHUH6NJi62vl15v1BNCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEbDaj6vMkOR3OH8XoQ15uPVjwSbE+vDY+2RxO1zcAh+2fJOlw/iBO8U0H3efkfXltI/BzL1sHNhJ1oyJW/RLi5ZKvU67Cpsic1KlbrqTe45sP+FJPvfnTVNezdd67EcmrwSi0r4srd7DDkO9nvAqT7BvKkZfJXFtKrwJVU9xekOp7NnnfM8yyOJS5r4AQfEszI4Rbt77wm0JtQ+BlMrfYd3Wu/fPuk9Z0de83UGwm9w6U1aTMfQVIJ+su2MaxjLjXdWJb7M2QubDw47LP1adxKXNkndiXtDOkk+V80i1iO2nI3B62vk/4J78HwzylUXU6lsdOYdXxoEhHMOcsrtCRUitlzP1ReQFvWM/ONuvI2clS6C1zO7V80sXXsjhGPXAZSKOW1WyVuVrK6XPrLfFrrFRY8hbo/vxudqSTd9LlEjENXOZcCe3YTjf4RitcxFM3sjjEERsMClcw2144the5AHdcnbGNQxBTr194DCc5xyL5pqQYF1VKj+4t6liPUmKkmYUF2BfEdzHRdLVQZXbi4CbDopiON6ME8GMYisk+TBuXu0eIvusb/9hIliMaAhdeasVRGoF31ep7Y1mMG3E3u9GUuax5rWpIVYxF/NcXlrwNNpDrI3Ekf37W9ShdUsqchB+fT3TqFxfpRIzUK1kC6ohFiAsRwqvxsw74AzgKuDQohzkCiTqUmOKfYkTE+UmcsR+wyjxSGo8lD1l2VSmJSUOL7GU76uRh1S3cnWBGgaUjO6lYvjsdvG8RCNuK+Oic+0SBiJJCDKdHVYRimYwRSZ1k2CKshKsVx6gHllljJhX2vVvhK3WSkqrVachc3ryxaupTxNdCmKMeVy8seRvQ5oLbDHSyyGK07JQyt+8+kAlxCVdyRKfjO+kSC84ndiom07EeY1IyeOG7AytiIlRt70Q4Y38Bmf719oSrABEdHFJK7uy5YqeoytHDPtpEImoa47kjO6QQ975SCVRPqY0KfOPc0it0IpHUrtIwjzpD9/9BgnaGXSu60QzsRxxEXJCvHPIo4ns+csyjbF5UzSaOMv7rC0veBjS74DZDe8d7QYdzIXM2wA1x0c4TETFVaSd5EQ8ZI8TpHEhMGzmegiPOtq3ItUZ6jFukuTTElGZRIs1hjrRqYePGvTb0OrJzbeZlTheO/HSsIXM4zxaeRbljHehWY2KBJTUb1vWiF1SB1Xpsxyj+IjR6dFDChLspZpOq+WlKGX9thSVrxnfzcJuhMmcnKGIW0StkTuQrXdM0FyH2VRQZw109kOgYPSop5jS/FVg5W6SkhNwQfW330DNcUEUOOr+XYX0Y6xmp/HZkJ9gZK0wpCpZ1ODSvoWC7U/xa0Ov8nNxvr6Q7m/WiF6TAKV35iScQbk1oASPupljlU9Xq8ddVWLJutCfLpbPKnCyqxWi/hcxh090YIfV3JXO2iBgMRuOx9Gwuc8WJS4qZBraSnCWSLC1kUEwUH52Zr5wTzcLuBBcSbLCVYaXIZjP56chOET/1jXchkypvypwlJj9+9AF6IJV1Oc3EqCx6oj2w/KTTebMmtDuMyqNq3lS1evx1FJa8CVfze129VFgn63WxqV4cK2XOHkzRHkz9XZM5mVqNXOZccxodQpCcJZJoU08F+LVE9YNB9KnYVNwmOl17OCBhZfQkxLMjO0dOxuT06yml7uumpsxpMmNZwuppsofHYlaBJRvGiXRaIHQFlh9vZSEZlKbMFc2bqlaP//rCknfDOlkXQtY3NZmzq524dpJcSpkT7yfIcSlz2T2/LGa3zMl6rNw1LqVkJ0c7OFuCi4Uqw1YDUOnILhLgWpXPNUZT5sQ0ktXjjpypPVerMJ1/mqeLZdGNzsDyc2mOQjIoDZkrmzdVrRH/tYUl74d3shz82ldD5uI6PbmUMmcCmVw9UH7xpUsIqizUkE6InCoFMV1hvTIP5+5ahr2ogoKO7CIyFPNzMz/LbJG55/Co2eGKxLNem4ixmsO4LLrSHVh+XCyEZFAaMlc2b6paI/4rC0veEe/keZj7xcsWmbN+TS51mYtXoHOZk/NACwQ6hECcbYCZtDfvHLpBTHZJ5izML+JFzDysjDO7bml0ZBfRq+0SxsvngzqFnaULL0dSNb0/EYJfCPSlUcswLouudAeWn92obYoTAng2ZC5v3lS1RvxXFpa8I97JsqrxLipkzrbnlA5OLkJd5jTQfZI5u+QSe3Z20CkE4qyZyxjBPaeD2uWdynqLCQEGSdRuQdTClhsrt2d37HIpeg0FjdWNIzOFza7+yEJSK/cUfGT7GZ20hx7TMK4XXekKHK16kNLoMdIic3nzpqo14/curHR1s7DkHYmdLD1jjxcVMhfC/Wgo54kQrtTfpczJQv0Cs2ySuTC+lIBY+t+PH7R/24UAziIsY9Fy5j2VpcvFaBQvKcaUbNhoRETAsR4WDzNdjUbjqY6r9uxQx
Ad9LEPHnihiiXItvzavpLAiP4/jJ1UueCgFR0xI2jhSkNloT1rI3NMwrhdd6QpsVty+xtXIWR4FnjWZK5s3Va0Z/3WFJe9I7OQ0vRcyJ50KdDim/i5lTjs7hJNnc023wGSRBXALr10I4CwqDtjpE2Zmxe7ZppSQk11ikxncU6qF9XTshKsjO4gb8Jvilny6dV+F1TvnqhNlskn5WxFljSTMxMGc4zCuFcfoCOxpQtzx0F0srNGQubJ5U9Wa8V9ZWPJ+HMY+HsWn7/xRdH8EfTS+1mUejH5KEqPs+nF/fCsSEW0SwzTH4Pjy0qPGmDvls/4jGdn7l/lXSI7Ht+PqqlpKKb0AkD8OX4bdHd+N46XvjuwGB+PJebbwG44nVQJZ2OE4Oo/85Ym9Kt/R9VgCouRwViejLI7RGjilJdnLb3z5wshsMVzRvOnFAKEW/7WFJYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgjZHKoXmd8cfzP9NWhpJ40PYZnL4auTJ+SlJEG60U90L+LzyFz2HQRC3puvUuZ2KXPk4/gqZY6QD4QyR8gbgo9F+sfvInWZO8WX2rCZKj5qik9F4vux9+KGPVsRXL80dW2fjtStBmQYC/ZdO4n5ZFlg19Fiuyrxwjfj4r52ub9+49G/eoWPUz5FmcPWQekzcikJ3zm2LdvcLZe5KjdzsVrjQ5Jz1Edqip2e8X1BCcnv1JH1gREZZcypyZyMvJlv8CnDc2jh8StDHcYr+6ipf8r5Ch8nFy8M9Qt1f4aYDPULk3CEmxOebS8Z3diu8JdhPpE/TRhiIExV5kRObuZZGjEJ/VylHBvZFm6IaBKW55bJ3I0URpzV0T+JiU/cCghIyBrAZ5XLHVQx+ny/3XuVuWf9FOZUh62Mx2zLYfuwqX8w38eluqgQVe6agKnDs/y75+IlciRj2yyV/9g0kG3oa1+YtbBXtstChaeun5dty7bmJj9RwqrcMpmzcPgWpNQUn3iXZBDEd3cg5PXY94FVCyTUyajWc0d6/mY6AEH0fNQ289CfUXjWzyp7AGBGT/3YP9ae+YtZz0r1UPh7oBPsVnzsWyGozD34XhkJdcVnpCtRLrJ1opv8qIQVuTVkDvgewRPL/SKevRLyWuzMKX3UWMGn8ZXHTOaGOgZteFZj08es6MlJ2MPiZ+SDGVggD3rrq7NqVCfzBF8QL/zdYw952l4Yfg2lvlF/Si5L1oyZg5C5aRWK3AqZi7LlNb2wxeNB+tY6Ia8FFyuyL3SDNFzjdUvdeaJb5obQBWKC5clU18i+u5+CxA/xA7UqblaZK/zdJOCk1opn11CwSboVwYjJ2bGZbdNNq1DklskcVqt2tchremoyZzMOIW9EGq4mcyKVT1fjq26Z018ZmzMc1EGG9ATbyERPPZz4KlGtinupzBX+aUWp5lzmBju4ZqIOSjTrsSXbpptWocgtkzlbo9rKlTJH3o04XF3mzLrg3BKq6FQk4iSMd1RL3NkZmnl6UBWsGrlX4R+TB5P83FKZNc5f7diWbdNNq1DkVsgc1nopGGWOvA9p9KnM+YOIdZnTJaBdvhRN+DCFIYQrVUqegB3cMtbLgCXupQJQ+LuHcumXR6qVnF+FVPxiDvbVjrHsUFhyN61CkVtN5kQ16m09yhx5N9Loy/VcuZ6708t52K9OrXKEXQ5qf9abD7WTvLjXWrZ4dC9TOrn/ZdxoDj/xbgBkTgXgOLuIKM4iILs6B7Rl23SLElblVsncgYp1FowyR94FH65R5mYh3D6EWSFzEgb4jpGwIaRYVSeJany8llWgesbU8DTLQ7wNbbjZZK7wxzWOiWQMGbArOGEMSwjTybRMwoAUtmXbdLMq5LlVMieuE3HEwyeUOfKOpDF9bfe4sN/pJR6mgjHutIj9SQ9j0DM71xR1Yw9Z4fLffHClnik1fWwr3GX3Jdxr4ldJcn9cyohXKLGH60Rc9gaDfWSbD39JApJjN/Xbsm24xSpUuZmLeurtSs3Vg53iFiFSsauZhHz1RNEihLwPlDlC3hfKHCHvC2WOEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghmwXeObP3q9dBy7OS60t8JRrZ8jFO8r7c4s3NWfZyqXMV5idXbylz9or2u9PMljJH3hV8nm7i2w8U+Ei8qX9M+YU0B/YUmx28H7EizWwpc+Q9sW/NDXaaw99HYvGhulfw4QO7uyKUOfKe3PnXRZpQ5gh5Pc39526LL1GK1rMdamQcgpl9HShci10/k/6CfeeyzeuEKgc52UNciYlMXPLxBSJgNoArOfo1rjJpJHpkiU5sK6CBnx9nm9llaaeKeLZF4Slz5M3QcedmYxQHHsCXyvHhO5EdDRnmaai+Yt85RTevK3KQUa9fvcu3pZNEriBZVXyRMt+CrkgaYWKiz67AzEuip83ssrQrmYsf26vtVEfIW9C2/5wM85mrPl/bPdjBR6Kfkr143zkZ7PaLQ5FDGvzDuC3dkX5d9lw/n2lkW9DlSReJFjJX28yuSjtWxLJt7lRHyFvQtv+cjMF4q8DPH+WIr7v6SIxDVa0v2HfOT2ZP1anIIcqcOunhJgoufpVqC7ralnVZooXMeVzdzE4sVdqlzDkW2uMQsn7a9p8TsJcx1Ekce/d6Nd1t1VAV/fiCfeeiUY/RojlEmVMXzcajZ7FFqiD/QtuWdXZskzndzC5arAqUOfIRtOw/p8ww7IZx7F3qXeNiwMJXVnLiBudV9p2LRhzLHJoyd6AnjtfxmglIW9AVSeeJ1mUuEl1iFWoy17Z7HSHvRwhnuKhplks/MVNLHKpqF2lYdd8599FjmUNT5sSSBCESt6CrbVmnfnasyVwVKgZrk7mWwhPyntxB8cSxN8nPNJPMvXDfuZioHqNFc2jK3F3YF1morqA4ugVdkXSRaPQpclDytFMwzbat8IS8J3OM9LjUixf91HLvuzC+dN85T9SuNBY5NGXOzXX08mJty7os0RtVy+kqDcyOW0zaYkU0W/exQxGHkDfG7hyc6LC7ssH3GO+K41fGetykRs7xcHMunvz13Heu2LyuyKEpc3IaKSd8JyaXSrYFXZ50kaitEs/MUtvMTs0mc7Eimm1r4Ql5DyBBQIVEjnplwX30oCH8ROyF+84B37xOTCmHpsx52Cx2tgVdkbSGEoWbWTwHJB83s/OELG0NJhXRbFsLT8i7oJssxk3e1OJXDeNIxLVOCMdL950rN6/Lcsh2goMkiNQ/WEaPqsWUYgu6PGm478fIuMySLLhT7iXO004VsWzbC0/IV0aUtd5CQGkh5DXsuszprcBeUOYIeRWywBrtjOT00x8AXQpljpBXsaPXV9Iz18uhzBFCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQsml8QwghhGwbrsN64BEIIYSQ7cF1WA88AiGEELI9uA7rgUcghBBCtgfXYT3wCIQQQsj2
4DqsBx6BEEII2R5ch/XAIxBCCCHbg+uwHngEQgghZHtwHdYDj0AIIYRsD67DeuARCCGEkO3BdVgPPAIhhBCyPbgO64FHIIQQQrYH12E98AiEEELI9uA6rAcegRBCCNkeXIf1wCMQQggh24PrsB54BEIIIWR7cB3WA49ACCGEbA+uw3rgEQghhJDtwXVYDzwCIYQQsj24DuuBRyCEEEK2B9dhPfAIhBBCyPbgOqwHHoEQQgjZHlyH9cAjEEIIIduD67AeeARCCCFke3Ad1gOPQAghhGwPrsN64BEIIYSQ7cF1WA88AiGEELI9uA7rgUcghBBCtgfXYT3wCIQQQsj24DqsBx6BEEII2R5ch/XAIxBCCCHbg+uwHngEQgghZHtwHdYDj0AIIYRsD67DeuARCCGEkO3BdVgPPAIhhBCyPbgO64FHIIQQQrYH12E98AifmB+D8F+3EEII+Qy4DuuBR/jEbI6a+/7HH3/8g5sJIYS8AtdhPfAIdaAawo9u2W42Rs39TRs1fOdWQgghL8Z1WA88Qh2dkKnm1sq/tFHDb9xKCCHkxbgO64FHqKMT8mvV3K9/kUR+/pXbPop3VXOLqvwbFCT8022EEEJejuuwHniEOjojv1bN/aCpfO+2j+Jd1dxmVJkQQj47rsN64BHq6GxNNbcyVHOEEPIeuA7rgUeoo7M11dzKUM0RQsh74DqsBx6hjs7WVHMrQzVHCCHvgeuwHniEOjpb19Xcd3/88c+/dvM333z/5398+fm/X7787Qd3yPjND+BPmsrv1axUkcvEvv3Tly/h5y9///237pDzw49/+/Llfz9/+fLPH3/rTgv4/k9///Lll5++/PWP8cH9VjX36z/lVSn57Y8//qGtIMq3v5f0//vLly9//1MZv1+Vf/w/N7fQq6L9+4AQQj4xrsN64BHq6GxdV3P/haOuVH79bw2Q+F85L//dnRv83QMIKbFv87SyAMIPP7lzxk/dM/r//cfDRP4ObdWm5r5X7/aUuld/v2sU55cfoz7sXeXW98P7V1S9+vQBIYR8ZlyH9cAj1NGJs0PN/cpfAAv/+edfvyTNkk/fv3O3Br/3AIIn9v3/1COSKYXvNYTxk6zQ3Cj8u3Wt5W9fy3z/RdaGkT+sS8396p8aB8iCqyrbL6Zdele5qeZWqqi69+kDQgj5zLgO64FHqKOzZqua+7VOsD//qXo37P9smm6+ELbwRpVG+l4XQT/Hdci3P6RUv3P1pwuyxG/d9V9ur/i9e1TF+vZHUxia0avV3J81RvhX9g2Tb3/8Wd3y65DLq1xXRStWVF1X6ANCCPmUuA7rgUeoo5Nmq5oDP9U+WGV3pP7htsTyOR/83PZZkD90etky5pdaqn9V13/X3sv+Ll3Xe62as4uK9QYRdSSa7lVqbtWKqiPo2QeEEPIpcR3WA49QR+fMDjX3U/NSmmmZ37kt0kvNtV5os7XTX91WYh8T+aXQaKqawp/clhEv7r1SzekFy1+6nlnJWFXNrVrRqOZ69wEhhHxKXIf1wCPU0SmzVc393DrZ62qnvpTooeb+15yuBbvR9We31bHp/99uA5ZRe/jvNKPXqTnTID203KpqbtWKeses0AeEEPIpcR3WA49QR2fTVjXXuvqyVUn9St/yOb9jUtZ7XuX0nvMXeOdKSpds/3NLnbbV2Wpq7ld6k695xbKFFdXcqhX1jlmhDwgh5FPiOqwHHqGOzqYrqDm7xeSWyIpzfsIeJ+m+/GZKqnrc4ju1d6mh16s5dem3gc5qVV61ooK6rNAHhBDyKXEd1gOPUEdnzBXUnE3vbom8VM39A15dizOgi6BKE9n83qUtXq/mtDzda66c1aq8akUFOKzSB4QQ8ilxHdYDj1BHZ8yPUnN6k+k/bmnDXotzS1xtdeSzBjWn5Wm+w9DGalVetaKCOlDNEUK+dlyH9cAj1NEZ86PUnL4z9pNb2rBPjrglWlsfZhFer+a0POX3WbpYrcqrVlRQB6o5QsjXjuuwHniEOjpjfpSas7fd3NKGPt9faSJbzXV9LfL1ak7L88Uti1mtyqtWVNAYVHOEkK8d12E98Ah1dMb80HtznWpL0FVQ9WSG3ZtrT6tdzf1KY6x0b+5ntyxmtSqvWlFBY1DNEUK+dlyH9cAj1NEZ86PUnD2A2PKyt2Nvm1VKyp60bH/Hul3NLdIWzfB/1NC9voy8WpVXraigLlRzhJCvHddhPfAIdXTGXI+a63irekFi9oBh7fMfFb/SNU7+5KNd/Ou4Odeq5vRrx39zS0kzvL031+tRyxWrvGpFqeYIIURxHdYDj1BHZ8zXqrlfq2P2if6cRWrOVjFdz2bo84m/5ErNPhfSevfMP/9fV3P6XZMiDedb+9x/Gd6+F/kXty1ixSqvWlGqOUIIUVyH9cAj1NEZ87VqzlZBHY8oLlJztqIK/227+Gc79/xSfuvYvsbV1HPfqqYQ6mrOrnM2IqT9bWrhbROetm+2/PrfIS/KqlVetaJUc4QQAlyH9cAj1NEZ87Vqzp+Hb7+ptVDN+TInfKl/euTXprd+ql/nMz3ne79FfqfXBJuvWAO73/Zzcd/re13JtYa3h0Xqy7Rv9WtcRTOtWuVVK6quVHOEkK8d12E98Ah1dMZ8tZqzJwXDf02Z/O5f+dMWi9Vctaf4//4a1zM//M2Sq2sz5f9sc7nwxf2++6uqq/DTt/ryQEPNffN/6h3Cn/yioG8N/tO3WpVGeN/PTorjD0b+5sdYmlJDrVrlFSuq7lRzhJCvHddhPfAIdXTGfL2a84dDEtnVvCVqTkjbgZf8VLuMF/m9K7oc3E7rUHNxBVjwF1k8tau5+NpCnf80SrN6lVepqPpQzRFCvnZch/XAI9TRpVB9gxidptsfr9C10S9uyfnVn21VpWR7ey9KrOIPNZ3x0587HqdUfvVHe37E+PcfLLdvof7av6j13T8y1fhfT1ur0v5UyG/L4nxpVzcvqXLvimrSK/YBIYR8OlyH9cAjEEIIIduD67AeeARCCCFke3Ad1gOPQAghhGwPrsN64BEIIYSQ7cF1WA88AiGEELI9uA7rgUcghBBCtgfXYT3wCIQQQsj24DqsBx6BEEII2R5ch/XAIxBCCCHbg+uwHngEQgghZHtwHdYDj0AIIYRsD67DeuARCCGEkO3BdVgPPAIhhBCyPbgO64FHIIQQQrYH12E98AiEEELI9uA6rAcegRBCCNkeXIf1wCMQQggh24PrsB54BEIIIWR7cB3WA49ACCGEbA+uw3rgEQghhJDtwXVYDzwCIYQQsj24DuuBRyCEEEK2B9dhPfAIhBBCyPbgOqwHHoEQQgjZHlyH9cAjEEIIIduD67AeeARCCCFke3Ad1gOPQAghhGwPrsN64BEIIYSQ7cF1WA88AiGEELI9uA7rgUcghBBCtgfXYT3wCIQQQsj24DqsBx6BEEII2R5ch/XAIxBCCCHbg+uwHngEQgghZHtwHdYDj0AIIYRsD67DeuARCCGEkO3BdVgPPAIhhBCyPbgO64FHIIQQQrYH12E98AiEEELI9uA6rAcegRBCCNkeXIf1wCM
QQggh24PrMEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQsiq7I1Gh278RHyWWu2MTsdnI7d8CEVL9m/Wd+iAzzlyCSHrZhiEG7cMhqfjPnNqz2ArssZUy1ptK6dzVAM8ucv7U7Rk/2Z9hw5YksXbjFFCvmquIXWXbomM4TjfdVtEXU/d8sEUk8UlLGHHbZ30DLYi60z1HWbZt+cOlZhejcc3s4k7vT9FS/Zv1nfogMVZvM0YJeTrRqUulBptx07Hr9zqqOuzWz6aYrK4gGXulm56BluRV6R6NpncFafu7zDLvjnaHmdu+TCKluzfrO/QAYuzqI2mxgh5IetKh5Dt5KopdrpsE4ZuN/RE89gtH01tstg9cMNiegZbkRenWtZBeIdZ9s2ZoA5u/jiKluzfrO/QAcuyKEbTuoqzrnQI2U5s6ZZrNF/MhXDnDsouXD7uKlSNz6AQPqeae5YqfPyav2jJ/s36Dh2wUhbrKs660iFkMxmORvWbbCVnkIFco2ExN9FFXn6dQ2/i1dYt+6PRqFzytbOzOJwUsfbs2dKE1zofSf6j9tshu6PRvhtXpjvVSKMOea0OuqP3bXawOGy95Zd0VA6CjvbcktFbzaGCbixZcbS0Bi/GR/tgaWuZ5cNq5aEs1SykpmcfG53FWdit9Sxfoeaa9VmQ896y6hDyBuzeY3yD+kMmOU8IUMmFLuZG+vvgTkJ9AtjTi1POY00UKnavZh5GucsERFMcDg49QFwo9ku4KE5lOYGpXHKq530yFXGGUj4clXlxQXbnoii40qrxVko1oecWOdowntaOPYgA6tF7NrvSGdZLWbT8go5qsJ8nHKYn7mytn9G8Q6c5S+NWFQwPWauuOFqWBW+OD2dxy+QhM1YfypfxsogwO3dHz6Kjj6sCtI8QYeEQaGbZns4IpjFMETjEgnfUpzvnnfOsbaYvPjMkZHXyET7vHns65ONQ9sXcYHAO10oAb8RWPXupd8rD8934ZHxvA7x2+uh4uMnVeDzBab5Ih/u4LB2aq2AF6JtwNR2UllsY44yiPMBFVx3NOCPNfTZRRS8cuWd81vRufDzWFJVM6WeskmrFQTYdAXv0XqNbQ7VG79s6YEFYzaZs+UUdVUMHhvB8ez2ZmvHafIbZXCe0DDnNeagZzCcT7RrJKa4AVhwtS4O3jA+wrGWqkDkrFs5vUmo9vZVMejRsZx9XBWgfIYsK355lezo91NwKonms1pvx8fkYRWhvQELeAr2bluictvwR8HgBSWPBglGerj4dwDWpj0exVOegO7pmbJvMByfzm+ykV9c4PiW6LAmP2TXV3glX00Fp2VEBzPLM3oJoxsnsp2qLEdEi83hFbg8TRe2x04oVUq2Th1SWRO/dOsKisDGbrOUXdVQJznfCYwpsOmtaXasqhk2dmPOd57yrsePsuuJoWRq86JXU0EtbpuiSxIqF00ETbmK77FzH5ydj2PY+rhWgtAmLCt+ZZUs6PdSc0FM092FOXoM9Xrgk74eqpsTMXVvQUR3VIARYh7tegoqaDfN+mrww4qf5bRkV+mwS6AKneikVk6V5Hq1/wsV0kFtUfqt1l0rgrZmbcXLR1zNSn7hU1VcK5AjWrnVT/1Qb1MIui75Ksy8Ma9kULV+j6KgCPSVKlymBXn2rWryHmvP+UPSWb/s5xLLRUqMZvOiVaFneMkWXdLGscLqcqoZQhYXt6uNaAUrbksJ3ZvlCNddbNHV9zyuV5GPwi0LGhTu2oUPW5q60mLPoczszU6mI52sql8Vr4qpQe7w4rkurKDwqS8WFwBUSLqaDwqKzbhJgXBWKS43uOApmCS+OFqSSWy1F++MSq6TaoBF4eaF6NvvisJpNV6GUoqNyNOHaUMKMXwVeruaKCmoN2187XDJa6jSDt/VKj5Ypy9fBksLpGq317cFmFvkQqfnWwi4sfHeWL1NzRX0W5qxzxrw49yHk3bDL6WC+8GmF7M1vaDx/7FJFwZ5dwUwWZcAuWz1McuCyYHY4GI3GoJgQm/K+QsJF5DIlnHdGHaWTUaz6gjggn5/RHtUaDCuYqZsbrJBqnUbghdFXafbFYZvZRFo7KkfvVNY0vp7MpylwVTWnRS2Wyj1HS2R58Nzy0paJ9CycXtNrlblm2LzFar61sAsL353ly9RcEX5xs+1BU8O6cJYh5I3Yu5rMn+/Pl10t14dVcCaoJ2Zx0tELVLg8r+dyaXWjA7xJ+9ncMYS4oHtuWCHhInKZ0h5sdqNdzziTMC+IA/LZZh96zh+Ou4C5+wmeFVKt0wi8MPoqzb44bDMbobujcvSZCTdHdMpMyS2qcFvOeiYSK7HKaBF6Bs8tL2iZyCqF06BuLmkP20/NLSx8d5ZrUHPLBt+uzhVgWqz5CNkgMH3h7l22mMtGOxZI1eJGz+z6XYq3Zz2fTl3Ndl9XUlZIuIhcS0mv3ugqFJJfXXtZFEfIZ5tDvZtfkd9NqrFCqnUagRdGX6F1loRtZrOwo3J0NqvdptS4acpcVOG2nLWodtNntdHSP3huWbllIqsVTtvpbVZzXYXvznINaq7P4Ns9Q02Ezse1CPlQfNgXizm/Z3egesNv0gG9FNrrnE2nhmzQL5kb+idcRq6npFfWROBxmy49L7kkTjHboML3g72LyVN4vB8vLlH/VBs0Ai+MvkLrLAnbzGZhR+VowNoNIG3uNGgWVbglZw1va+8VR0v/4Lll1ZaJrFg4vZBbvNgSaYbNW6zmWwu7sPDdWXaNtOJlWjh0q7m+g29Xbxlk7UTIBoFzwfkOTtqyD6LoPbuJymEmPztizdVHN3qpI3soecnc0D/hMnI9pR3cKpgODuGcLqssiVPMNpDV6gHpxfRPtQFefihu9C+MvkLrLAnbzGZhR+Vo1NJHp9dqYltUYYs+z58GzHbIWHG09A+eW1ZtmciKhdMnfLtbsAibt1jNtzZCFha+O8uWkaZ3n/1VPEWXa91qrv/gg+x09j8hH4qO7MYlKZ3CcG+7GLh6y2vW43azPvWYzv39myzdc0P/hMvIjZRUwV1j+shdF8fJZxt7Jfx+PD4eCYuv1fRPtUHjoYHF0fu3jq/Ou8I2s1nYUQX2pZPqWctdnf+z0/dFFbacpWBR0dnJv7/LsuJo6R+8sCxsxWYukVWHst7Ort4zG5zMXLCaYfMWq/nWR8jCbu3Msu3xFL1Q8xAv0ejJxiI1t7DZzjIJ0etBxdVQQjYHHfdCObohgqBc2/iTVU9XF1AD41tYy3iGvf07uxgN9o70dBEsmBt6J1xGbqak59pCMd8uiVPMNrHeiccXvlCQp1pHp47wOB5fPtlCYUn03q0jLArbzGZhR5Xo0zmyWL46GZ1d+ysrafoXFlXYctYEZtfjsX89K14+WHG09A9exl2tZSIrD2VVOtJOcrJ0Y81kOqIZNm+xmm9jhCwcAl1ZtqTjt/LC8+14rKZnVGqBmluQs4a+PxsNh6NzDTSpbnAQslHEfQnKyc0/UxjHfy
L/KqHS8SG7M08VzE5UohfNDb0TLiK3pOTzbxF3SZx8ttHl3Hh0pM+O3+sMF2Zly0RWSLXBgc3zQK8HLY3et9lBd9hmNgs7qs5B/ESV8Vy+kbywwp6z3ulx7rMrgSuOlr7B63FXa5nIykNZH9FN3Ho9m2HzFqv71kfIkiHQnqXQTGcwzLrxfk/PDBepuQU5x5NK4zm/HETIhoFP6jeHKM7dspkoY/dYzgTlbPCke6oF+6Px9e34RM/wdvOP2u81vn7u9Em4iNySEr6WXl+ALY6Db8/baaje5SguvOjZcL6LQ0bfVNsZ4jWsdNWnT/Seza50hG1ko3R2VAvD0/Hl5Hp80SzDwgpX8+f+xdVkPG4OtxVHS7/gLXFXapnIykN551ia6XJ8nDdII2zRYs2UyhGiLBoCbVkqLensnY3vrsd+mnK4sBROZ86SuCR13HXNgxCyWWApWFu/4sy3saQlq9K+TCCEEPKu6FMVxemq3r7o++wl6YRqjhBCNgB/OfzhfAQurtU67bpVRfpDNUcIIRuBbRmXMWm5C0JWhmqOEEI2h4MTPGd5uvhZDLISBxfjk/YHmgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQsin5XwyWf7Zyl6BerG+lAghhLwjw3N8BjJx0djb8S3YOxuPzzv2czy8kGL0+Cgl9t45dXNJ5tMdqIOqOY5KpbZyShvGwfh6MnuaXDU37ySEkM/MoczeBe8xlevO3sKV2zOO3Wv5Hjqdeke/qe8bqq6qnOrNcenuW67mTqw2DjebJYR8RYww77n53dBMH+XvxB0SUIDweI2aGwzH4zNfs6yqnKrm2D+5hzmp4k1RcyhHxzq4i13sUTS1tt67mIa5mr4iXtBmhJBPw8epuV1sfVq727UzC+FZV2OvUXMZL1dz4BS2WzNvsZq7rGrxdUI1R8inZXQ2Pj1wcwddam400j3GTsfj0+o+2e5oPB5fmE/CQ+4cyxqqyuxwPD7uVFWWKX6n5Z0iWcjN91rUXFvGSe8cid9xcTPvcBSjN5TTUIpZC51Ta44zsfnaJ6UkNRufNG9wLWqbAfxa4nQUZl+CX4w67qG1T9mtuUeeJMrYzQWjYmO+nVhaIRpxo/Sk2VptBW8dMdgA8KQ+BF81jLpL1l2b1jbrWYeFnUEI+VD2ZWWkPJeiXNKh5mRieB4cego6zwwxVyQe8inkebD/4O5hLqpg98YtIZx7qBLP9EJ+791JQbyR3Vur5rYFGYfT3ZRxeEpR1KdhlJnvWgMq8wt3LKk1xxGslmM9u9ts5lvYNsP2OB2FyXIID+6WqNoVxFtsXbkn0M7hyC0ZuJaZzf6wehtqwffsqi2o2raz4BqlGDHVoJDTmai5XjuMukvWUZvWNutZh4WdQQj5aPKnDg7drYVuNTfF0urq6OR6grlIb5jd4WT34NTMFhAhZzIxTM92B4e38LiH9+VwsH8O26MHK4iZYraqHvIYIIJMaKWaW5QxmI13B8MznY+iyoRPU83tYFEzPVajzn1tiq7WHKiQp5qy2xsMT3UmTGcPy9qmJU5XYfAEjqmE3dPnZ3XKGY1GyOBcjqO4wujMvcLvMqIfcxaqOS341d5g7+QOkWPbdraiRClGzFDrO5aht3eCYlmozsJ6jkuHUXfJumrTbLO+dVjSGYSQD6Y64Q3h2t1a0HldZwAlKkQReBH3atbARFE9F7kDaz79x7Ny1VBhFk/dEawt76hMdqZy1OkGYOmEW0iFmluS8VOcuneRlAeET0PNaV7V1TAk5GFySjWH2HO/fVjUc3AllpmblxSxLU5nYaCuF18gQ5rZbL4g94wDTPoCzgkqFqq5rG1VLVge3a2oUaoRswfrmVvEah7dhS2aasEw6izZgtrU2qx3HZZ3BiHkI8FtpUjLBauIzusZ2YP42XPnOL+upidRKnh8xB7dQ8hKX+Bi0I2bTXG1LeeSMtmHwVc4mBifYMjVXO+M7VUAU0mZT2XEcxjVtKvBm+qg0voXV4iaNUItO0yVNh/2L2IVp7MwZfu1gTSz2XxB7iXHrunCU3rsZ7Gaywqu3aTRulsRUbIRU1uoG2sYRp0l66/metdheWcQQj4Uva8E5tWZa5M0rzuuc2qzCQIVqeBaj80AZUg8m1jNDFBYbRd7qjUTlPFUTTjHnutZeq7mYOyVsZ17mz3zqYw4aX+YVIi1WrYlSq3/lJ0f1LKD1WZOBOxXxCpOd2HUGG66LzNXiSgI3ZF7kyO71zT3xu2t5qq2XVzwKsoOPBqVgOMrh1Fnyfqrub51MPvCziCEfDC749vJzfniyy6VxikoBV4Vj5uNsTjYKXAZclU1pxOd3p9BOjafZGquf8aWkj1PmPkko15Eq5GduEc6mkOoZQerzpwrFDHFWViYC72pJQ3XvgZPiSiLcm/lQFO3hu6v5mLbLih4GaVZLqF/U/VXc1Wv91VzvesAlnQGIWQL6KXmBri4lK51Acwu9vhb3/kpJ88Ud1dknsKNK3+eTmdDn6F6Z2wn6TYZZT6VEdNekVIrq6u5FYpYxVlWmCMEDfO2ZUSViLIg93bwaqLf6epWDLWCV23bXfBaFJSrWLiBNQyjzpItqE2tzXrXwVnQGYSQLaCfmsOdlsZNFZsqypCrq7k9JHWUx8vVXO+M7d6crVwzn8qIGTW7I9PBC9Rc/yJWcXoUBtec7XJuCRJJT+0szL0DXOazpkZSWSm61Zy2irZtd8FrUVCuxqOsaxhGnSVbUBv1qtqsdx0qujqDELIF9FNzB5iPqgcb9Fk19y5Drq7m0lcs07MquZpbknG4j5dk9fl1X8VkRaqMqk4z/dDOC9Rc77bJ4vQozI4Eic9y5uC5iOzZjgW5Rw7P88crcZ3QVQse1Z/HBrRbuZmaC5OWtu0ueK2u+tnSKuC+FnkNw6izZAtqU2uz3nWo6OoMQsgW0E/N2dQQrjFvHJxidkrzRN/5KafMFE++ZTNUoeYWZwy/6fnOYHiGa1fpWl1WpMyok2q4tBl+9+RmXr34VvECNde7bfI4HYWR5vMHHk4wgbe92acT+M3uYPfUrtV15x7B6mV+czraHYz0DbYwjRXX985uDiQmTMjRmx0lFabSorW27WzFWl3989x4b26ouWrKrx9G3SXrrk29zXrWoUdnEEK2gJ5qrnxBofbliF7zU04tU526qhs5hZpbmPHz4FTnTeUxrVmyIhWlyz/LITwnxVrxEjXXt23KOK2FGVbVyd4OK4mr3/RmdVfukfyTH8KjqUflJOU3v9R1XqbmTo+qwuQvP3a0Yq2uwnlWmWms92uH0aKSddWm2Wa96tCnMwghW8Cw/BRgZDRqPECg3xsUsmkSlCEPRmkiFyTx1ulhv8xUIuUqZ2+UvklptGYcQw2xP125gsmKVK/HAUKPzzufJ+hoDqFMaWc0stWA06dt6nHaC4NvZo6P81asszs6Kz4c2ZV7zv6J+J+31G3vaDy+sOaumj3O99q2Lam2Fbze0mAf4c5r7q8aRktK1lob0GyzXnVY3hmEEEK2j3JZs0lsbskII
YRsDVRzhBBCPjFUc4QQQj4xFyHMOm9ffiibWzJCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQsgp7oxG3qSSEEPJJGQbhxi2EEELIJrAP5VTxdLbrHitDNUcIIWTjuFHtVnC9434LOZtM7kZuVqjmNodG5xBCyNeKqrk4I+5dzGENx25fBMIVSo1qbnNgVxBCiFOoOeEU9nDqtgUgGNXchsKuIIR8JQxHoyX32upqbnAOh3u3LADBOtXcwWg06rj0uSNeoz23tCKF7opcR/I5cGPJslwkj9pTofsSYejmxbw404VV60x1WcHQ1m5MNDrH2evOnxBCto3de0x24NJd2mioOdVWQQzHOE7M0dkzpzMccnQWdjW3c6lOYF67+Lk/cQ9leuLOQCMPB7tX6gXqketc2vVVZXbujmB5Locz9UqV28tjPC56KeIVmS6oWmeqnQXTNEWvVm0dHvbNq71zBjvnXmkw9bDC/t1kcvXix44IIeTjyKe7eTWt1WmouQs4PMB0C9OFOjoPcNkbHGTTMnhST516J8/4nU2ecBCO1EvRZaLwfHs9mZrx2r088qg7ch2f/+eTmFScqZfncqi5AFNzWuHwfDc+Gd+bKmhfVb0q0wVV60x1QcE0zaGmKfG0W0QP6jrtsLVz9Jwl3IyPz8fIL630dtS9PJkhhJBtYFfnr8ijuzapq7lDDW8OOu1mGlLn3XjbDubmRcvM0e7ypcttmtFjstocPY1X0JZErrGnU/lNjLxzHeZu7JfLY7Z8eRR7tbza0SVwq359babtVetOdVHBYpp3XpFdzbLSVrDlnaOvjVRryL2YHdUcIWRrOdD5KzJz1yalmtu3C51+zU0nx2czC2pNk2dhATb1Zk66goirmjtY8mt5A73gpstGYXHkOqp/23RRr1zmufaEMpnm99P02mKbfn1dpl1V60x1YcEszVuzKNdwuHJLo3N0udm6pN+9uL89czMhhGwTfiXLKC49FthKZCJgXlWqtY5e+Ey6Bhfcpm7uUnOFC6Zwn+x1aq+VQq/W5fN2V+Q6uh5qm5t75VIkqjGK50r1/KDlSdPXZdpVtcWpdhasmaYmGheCjc7Rtf28UMWEELLt2K0dMF/wVIWquYz74pxfb8/5xS5dolS3rWArJtrm1Israb4Y1ITyW4CCLjE65+0sch1dcrbVafVctPoP0PKJRhhjjZlmVetMdXHBWtScRojd0/De01WjOC4YCoQQsm3sXU3mz/fn6UZMGzo7no+6HlrXNYKu7vSmXbZogbWYSRdO5/rohRkToyzCSmoOXvXElNVzUd3RpGXds8ZMs6p1prq4YC1qbpx5NztHFnR6bRVMi0UiIYR8blTN1dYiGXo/Tp9QgMLLH1WARzGTLpzOdY6tPcGol0THZl5JzWlibcuS1XPR6nc/h5qxxkzr7dK5mussWIua0wjxXl7TW9k9U7Wa3cQjhJDPzhI1ZxO2LOLwiIMt6xy4FzPpwulck6ndg9Krfa4fVlJzekkwf78ssnouxcOjC1ljplnVOlNdXLAWNYdE7eUBoemd2NV7sNRzhJCvhWVqzmbs/SP8Fm81402u4mmOhdO5epYPMOoMH6fbhZHr2K4KLRdZV89FH6ifL/xyibPGTLOqdaa6uGCa5jx/QFOftEzfAWh0Tg70XEfLEkLIp2OpmtOrlVPMm+Xj/Y2HJxZO54MT+Gb39nb15lNaVCyOXEcfQ8y+JXIy86XTyrnY84uzPg9mrC/TvGqdqS4smKYp3lHR2RKtejuy3jln2eVPferSL6hKfnNZBPa6aksIIVvJcjVnC46GzrEX8x7H48sn+27HEk21b1/nmF6djM6u/W2H6lLfamrOlIMkdj8e31haPqevmkt6BvHp6gLP4YxvYa0FcdaWaVG1zlQXFUzT1Fxn1+PxlX0j5c49hVrnaPD7s9FwODrXVCfxsST9fBtfDyeEfGKWqzm7+dR8HuLAJleg19aWaqqD+KUr4zm/5rY0ch3fMsi5re4arpYLyL8MqWSffCxZU6a1qnWm2l0wT7N6Z0S0WBVPqHWOPoaZePa7hkDz7vGlbkII2VYOln+YX2fT2kMWynA0Ho/TBbG9+lf/8cX+2tsMw9Px5eR6fNFQJH0i19g5lrQux8fNUP1ziewej8d3t+PxSZeGi6wj02bVulNtL1ilOvcvribjcaa3EmXnDAZivbseH9dPaQ77bsxACCGfFX38JLsiRj6c9mUpIYSQF6AfF54tXliR94VqjhBC1gbuJBWPVJIPh2qOEELWhX6nIz1/TjYCqjlCCFkXB+PzUZ/Xp8l7cnAxPikerSSEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEII2V4OxteT2dPkqmV/0E/IJIRw6mZCCCGfnRN8Nj8xcdfPi24T8PmrSQghBOxiX7rpiZr3LqZhriYBa56RmxfRN9xbsloZhuPx2fJl64fWaxMalRBCPgWXMqHeurng86q5flDNEULI5jM6G58euLmdJ5lQW3dfbZlpd0bn4/H4vEywdUYeHkvA43K/u9FId1rbEa+zKolDCTd0c8aCBAYj8ToplmNtZdiXYBej1lXb4SjluFqai4p1Oh6fmkdM8/BC0izCNoKCtjRfnjkhhHw97M9ksgTPC2bACwQ4ckvkRqNF9D7WIfRh4s40Qks40WPXbhfmF+YmyMz9PNh/cI8wPx0Mdqv45x4KLEpgmBIIt91l2K1ChQd1yYEO8UdQVkhzYbEOvamhPtVh794chKekUxtBW9N8VeaEEPL1kD9acuhuLdh8fGULkMhoNHoU13M5jnRFtCu2e7uDN4LCm5vmbIQb7MB3eqxGnbDjnCwz8kzm4+nZ7uDwFh73iHk5HOyfw/bowZYmMBvvDYanOrN3leFYrFNdMO6ePj/jWCAJZWquZ5qLijWVsPOro5PrCVoxpnm1N9g7uUPQewvZCNqR5msyJ4SQr4dqPRHCtbu1cYCHUITZuJgnZQZtXjczsAJMKZbhdqZirS5JYrquNIosrTwLfdoxzGJABPMEeyYwuBLLzM31smJ1puqhHYReNc0lxZpnuavDU0xTNdSVmWtBu9N8eeaEEPL1cIYp0Klflaxx7JouPO27S32mLdhHSDfXwuGJljM3C4di9bUMwsVZfDDANbgbNw8GR2Lz5VzfBAaY+mO+tbKWiTfIEuqd5uJi+ZVFo5YmGitYs9aCdqf58swJIeQrAspDmVcrgW6O7A7VPN7hKWfayMHoaDzGqiddCyzDYaXxMKkQazAfGKvZ/1RslSbC4s4T7JtAkW9uBhor3HRcqs0S6p1m72I1HdCuZq/5dKf58swJIeSrYnd8O7k5X3D9ruRA7065cihnWlGDccVntKu5PfMs8NVGOSN3qbneCRT51ss6GFxoVSTVlnVsllDfNPsXq+mAm2n2LGvpsyDNl2dOCCFkATtQDn6TrJhpB6O5WO9cA2aLr3o46MLqwmdO
OSN3rub6JlDkW5YhcgTnMG+s6bKEeqfZu1hNB6zFTNnWfLrTfHnmhBBCFoELbK59MIPqs31gRyzVQxANNZfC6dIlu42UUc7InWqubwJqzVVSVYYMXLedujmRJdQ7zd7FajiMxOoPxNR8utN8eeaEEEIKDs/zxyvHMmvGdQOe47g0o71P4M/aCwiW1FwRbrCHVV+rxiln5E411zcBtUaVVJYhY0dSS89ORrKEeqfZu1jmECbxSvEQC2R/K7AWtDvNl2dOCCGkAAuF+c3paHcwOsHVtTCNykwfYLnZHeye4oob3nSb60w7tBcVkporw9nD7+HSdOXuyc08asdyRu5Uc30TUGtUSWUZJG1/+uQEOia+Y5bIEuqdZu9imYMwFd02PNM2je++14N2pvnyzAkhhBTkX9cQHnVWNfCStXJX2EJ4GIpyTGquFm6QfelEefZ1TTkjd6u5ngmoNaqksgxDLH4i6QW2iiyh3mkK/YrlDkdVGap3FhtBu9J8eeaEEEKa7J+Mx+PzUVxKJHZHZ/m3Eg+O4ocfd+JnG41aOOHgQoKOz4unP0aj/L2Gg1HSJ8KwTLBPAlIGW90YtTIcIvpxnkVGltAqaQrLi1WpniHCZqcNQj2o0pam8LLMCSGEkLeFKyxCCCGfGKo5QgghnxiqOUIIIZ+YixBm9RtthBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGkxuFo6KYutu9rxsvrRAghZEsY3ooWMmaXaf/qvmDn7zBxSwfbpub61IkQQshWcDjHnD6dXE4eYZi7c292pxLryi0dbJua61MnQggh28AxdFvaN2bn4k0md27BRggh5GO4Ew107uY3g2qOEELI+tm7x1It3Oy6vQ3cl3t0c52jJ40fwkO2TahEeBoM9tVrjpRvxHBkfoPBKTSa8pTcOtWcJTU4f7YY91U5G5kIxw8aSng8NpfIxczcp+Ospq2FHwxOcD1SmN/suUubW1anspDz4pRgJ+bsvPn5AiGEkIqkFsKtu7Qwgv88U0mRA1MTk4kd79wZKut56Hoh4HHETIfZpD+zu3whXJhrp5rTpFzHKc/x8cZGJvse7FnvJIqmSwrt2F2MZ3PsKjzUV3i4vZkg0gK3rMAoykFeyNSYe1rAyZVnIozdhxBCyNtz5XMvWLDMsMk6zPKVkDAWt4e4uDlDAH8GExpAbAdmEyqVsPN8m5x3kGzUc5nWyLGknj0OLp+GfTPXM9E7iJdu2VX97Ss6xEpa+mw+1WNX4dEkaSXoWbW55QW2ojyaAj6AKvTbl0dinHqjnIuZOo4QQt6Va0zPTlpXtbFrMzlUXXqdAHolrYEGgxOxXptRA+fpZSohB1rg3s0dQeCcFnCDwYVYXR/VMkFp5tVFRtVcushThRZ1TaKz8NCPUZNF2tzyAmshU976qoHZsApMehihsgISQgh5c/SpeOOxrgganPp1xrkvbHDLLruntSdWvx6YaQCj4WBAIXiUriB1ZxR4pKaaD64KFvfjcNNRF1W4TFr4gM7Cn4op3FeKFbS55fnXigKrlbHIBcpyaSMTQghZK0em6J6KZzC6OdNbaw9qLm54OerR0E01h9FNFnVFNQflZZf+Sh9dQhXXVC/FAa9vq09NRS0q/NDvVz5l2ba5ZfnXCgmrqbkDMc18BYeV840ZCSGEbCy40BjOYILhZFSgIZq6KXc4UgXzNLkaj8e45/UCNae513ywHGuqOVyHVJ/GxUI4thde2IkPdmYXXhtuWf61QsLqqek6MNFSP0IIIRsGHujQj1xls3mN2rSfOyB2vB330ouWdpus5oOLlkVp4K/XKuHTeEwUvu2Fjxxj4ZrdvgO5W5Z/rShV2lLb5/0rVeyT2vM7hBBCNhNM2nrZEI92tL9OV5v2Mwe9gqhOoKeaq94M0Ct/HqMWAT75gyZ4sNHseA6k8QhKd+ETO6hpTTdlbln+taLAqmoO1yxt6UkIIWRjuQmz86Ql8Ehi/GIxLgtOsyXRTrzBV5v2cwdc+YuvLkDV9FFzITyYZtnFQzBRY9Uj6MsR0WEPIaMawwpyFst5PrcrmB2FP7hJt/HwMKYuPNvcivxrRYHVUtbrnNPJ7Vg4GdVVLSGEkE1AtVHGPD2ucqqX40QLTSb21rerg7oGyhyGGvDx9hoPdcyeeqk5+1CLcx+VRSMCXjYQ5vAA1Utq+nZ3wvVke+H1Ztp0Mpnoozmmz9vcivxrRYHVFai+WZ/Bbz0TQsgGsn9jz2MKz7VH88+TT7g7jRoIy6cTNyu5wxAWYX6pKzB8JQs04himQfZddc2yd9hbIgyTPptVCzDlwF+GCNNqYdpe+P1rU3pS1aS52tzy/GtFQWH1VGAHujzFOEIZ5ln2hBBCvnpqC6XtQrRo/sa6vka3tZUhhBDyBmyzmtvF6i17hmU452qOEEJIwVav5vSu5tPVaLQzHNk78c+1JzcJIYR83Wy1mhvs2DtzTv3T14QQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYSQj+RoPD5wI/lq2b0Ycxt6Qj6Cw1G5q9sqvMvHKt/vi5jYJzw2xmuapc65pHvk5jofU7s35ijuAZjt7P7erLVh1zMapFXiZviEkDdhiK3SjNml7yUzhC3upb0yn1bNva5ZahxLWhdubvAZ1dyV5DS/PDm/mn4ONbeu0bAnydy5mRCyfg71O/vTyeVEN+Gem+suzruvzLw6bzJH30jh9t0MPkIRvK5ZSjBJ3rt5M2r3xhxJRnE/+Y9jjQ27ttFwIumM3UwIWTdYUoRDtwx2LtYyh7/JHI1E81XAtiuChyLVz1a7FrBD38dP5u/XsCuAQuUnOYSQNXInAnbu5rXxJlPJJ1MEF5Jodk7xyWrXhixYN0DBvF/DrsCBFOrjF7qEbCF79yI9Idws2vAT9+Xa7n9jTvLHIyQIRPAc82EI80Ip7lzM1DVinrWp5OjJPMNDWjauCB7WqLhWN8/kwh5ryItlBd7XXOde+Y4ynCIV5al8GGTf2i48ie7JFMGrmqUAAfbMuDG1ewEnVsAwv/HKCC3Z4rJcYuY3gJeMC6voztiacmJhPM70RG2JzqRiR0zjtrfdDdvVXt0dnY2GtpZYoQLoj2M3E0J6g6tixq27tDCC/7ycBIHPBm58PjAZN1JyeyrZk6tJnGT8olQWeXBgfhMP87Jb7V2KYNxaLBR46JOOzuBdZbD5Z2Z3JfPnQYpZ+R5JuSJ4VbPknIrrg5s3pnYrg3k+PNzeTHCD191as0V1E3r/d/m40AZ2LaUc7HqaShxgC5I6zvd3D8/q1t2wXe3V3dFIy4vR1hK9KzAYnIn9ZaJByNcMnmuLtKwmIjYnh1k823UyCYZRlnw2FR5AjP1iG54pmPqZOabqNJlnkXFD5iGe3kKY46n8yiDR+mW91mK5zyy9ktZVhp3n2xRmB80QZzbVO3667TqoVc0JqzRLDua7vFc2onargiGW1iB+b6kzW9UEaWrvMS68CWxcysoIzOyEDBFSzp1J4YJ
8OoM7m0/12NmwnQVf3BVWo7aW6F0BYRf2WAdCSE+uITlOmmra2DVxFBEcV1NNJcFqfE4iqE9Rmw3zVpoZECoGqiLjCZfsLBXrCFutrA4SrSuC1mKpT1blfmWAcvIHHxEhXg4UdCXiiqCqWXf+nc2Sg9myXpsPr92q4HrBwicnsmxLNden1GUTwFZddcCdzaw+rUlB/cXTjYruhs3JC764K6xGrS3RswIKVot5/xNCeqCPOxuPDWGvc+pXU+bxDLOS4NwIYDWBxAlquhUCQY/ZVDGKIPaGkF07SlyKUyvxlDlS5Wt0Fqvus7wMANOXOyNC/l0KTEHtam61ZslAIQrlkhVf6Uy97rPO2iV6doqqyPsFOjLLtlRzfUpdVhR64cbNsqiqInQmhcuF2YLJ6W7YnLzg3TEqn9aWKCN2VkDB9dYsE0JIP/yjE0/ZJLCIM72N4DeMMhHtFHOI6szPVrF2TEJcxcCipY56JPTyTRv1DyBV+Rqdxar7LCzD6Cbz9nkHzZa32YpqrrNZMnRNsBY1t87aJfp2ytDv/z5lBRLasi3V3MJSO2VFoUiqlszUUFdSjSY2uhtWaC14d4zMp60lyoidFVAQ9uPftiDk84NLNT6TZSJaSqtaXcz1HDZRhaliwP1kVKDuLyDLV+kuVs1HbB1lONJJ7WlyNR6PcXfF5x04vkLNdTZLDnzqtfnw2r2EHX8CMV1H7ci2VHMIsmxclBXt1BJdSemCOV0yTHQ3bFfBu2OUPo2WKL0XqzlErT09Sgh5C3DT3r5elIlop5hL6Of9K50cJsUTLFWMKvCrqSfVWawFPiWobbo/ks07iPCKi5adzZJTn9bqhexMfYFPyQtq92KOcSHAbpB1ZVuquc5SZ5QV7dQSnUnhOmC6F5YoU81idxa8M0bdB2QtUfNerObEWt3QJYS8HZibG+8EdIk5Ls7VrywaVQw8BrCmz9Li1aI8u65iNXy6yoCZprpSls07uDOVPfinTyP2V3PdzZKDOT9/6GIjavdydjByoNI7sy3VXJ9xUVa0U0t0JoX82h9BaWvY7oJ3xWj4KKkl6t4L1dyhWGduJoSslZswO08zAR5Si5+izUS0U8z1Gs10cjsWTkbZhJLFwKSaf6l3p+eNwiaYzvCabqSzWHWfzjKg/PGhfqSe5h2c2M/93Fo9VlrNdTZLDt5XzOe1jajdihzcpHh43NGWQp3ZFmquz7goK9qtJTqTQkVn0eN8blcwOxu2s+CdMSqf1paoRVyo5nCRtP7MFSFkLfg0l5jHKSIT0W4x13fLM5rvE4l069W7EB4mE3tT9sUrh/Rqrc0i3cWq+XSWYaiWx9trPD4we6rNyIkbNJKXOku6O/+uZinArJo/B7gJtVsRTNyizycTfdTJz48WZ1sVfPm4KCu6QEt0JlVU1Fd2nQ3bWfDOGJVPa0vUIi5Uc8ib1ywJeSP2b+x5TOE5m3ZxJuz3jjIjgPCqMtzBfJDE+Ajz9NzWLrUY5ymHcHfasbrpQ9yuzPLsKlbDR2kvwxBBhfmlnlBny6n4kSh8ognX9fxZhizprvy7m6UAujCb5zaidquyf+3phOdqOu/MFk+dFgVfMi7KikJLVFd5oSXyxW9XUgfx7GEaL1p0N2xXwbtjVD6tLdG/AnjZoPHeICHkw5GZZZ7Pj3iBKZvtvlb6NguWGvl3UMhXy64sR+0bLYSQTQLfJ8q+pjEYiqy2LVu+Mno3i35Q6oXLKPKpwPNHvGRJyAai9/WerkajneHI3qt97np8/muid7NAzxWXLcnXCZ6haX6uhRCyAezYy2FO/ePPXy29m2VPwlUPJZCvFHyUYfkbKIQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIb3YxU7bl25ZymqhvzL2H7BbeQjzE3f41DxKTT9y6/WPzp+Qz8/wYjwen+27rclIvMfjodsGgwNMgPdu2SCGKFfv6WK10FvJ3sXNZBbC8/3Fgbv0QzvYeHanTw0qOnFznf2ryeRi1y2J07vJ7bGbX8+i/Akha+ACUgba5TbNeSN3GNyotVJ7mwLVXM6pKLic/lXVHn+2/t28Xn4DUN8uNaODPQ39CBzXdwaA1KjmCHlDVJCn+Gk759/DtT38VbK+9ywu5255e5B3rzn6HdXca+K+C9ptUsaj/cHB6FI6LAT3Wc4tQu+5RYHDRlf3daB6VHOEfGJMkJ/kZ7rjThnQf1ftsv5OIG+qudXY0aVc3mOjMzcsB0Nh5mZj06v7SlA9qjlCPjEmyHvtsnavrlRzNV4T9z04RQGv3LIqWPuVc/imV/eVoHpUc4RsKRd67Wp65NZWXJCPcajPjFeIvVOX9bt5eMoewdu9msA/hIezajl4F8I97tyP9BJYeI5eu+d6BS3c1a6Q7pze6IXTEJ7uU07DiQC3KQyTSTXV7o39WcCH8/R8QFJchzda7fntoicvUuj9G7uLdX9qHhlnUlMwv6v82sokdZzmamCn1kLqMKvufbalm+jyjC26f63lXVg77cxHt7RyHJt7dpf12t691AjZz61ye6+vbmvHKl6hoY6Q7Cr4wtbJWJrygqYaeaffQjRgeJ2aW7mSHflvSMsSsj1gKeZcuFMLUZCh0mqPoeiiYL8u6+U66Bq2RHxCX8MMD1wMlee97HEXUInfXh4OzO25z3O3JuyO0W5WMeDK2cq1i8ezE931bgt9k1+1PbKJKBKfrW8r0xjHLCttyvz2ljr4s6wd6RrdntaiQztJcLprp8k8Yqpv48Rmu4pr91D1mHH+yup2dSywCh3GorieWdg6GctT7m6qstMfNfgr1NzKlezOfxNalpBtohCEOJE1SYKs66P80Tp95A7SsEDNQXZmLmHD6zA2k4WBnE1wIj2yGQfJzMeiSnYv1Z7WjiLd8xtPf8f8Kn0LW36G67ry/hDm3XPU0p8KVHfhEZH3LJ3OJU0K/axntoc29STVe6hWr85Qm+bOLADWvEyo3zwplV14y5m6Wz0vq+3CdBd5ahpo7V6181lwfpErbsceT5kdW3mPbfbLzm9QmcUXLXtXd1HHaii9JXwy3D+51cl9cavnLE25u6lMl08waHbHce5/hZpbtZKL8t+AliVkixjpwI7M3bVJEmR9YXpqjkCnQ12edau5E5hbFg0aJolVWgPN0tUjSKGq0CaabzXtIGA+x4qAi5JzCzj2KzD1PE1vV2vGEg9dvSC+rzOOX9+tX8I9QJmqmRK+eZm0rZODNpeQWgxX/eyhjoXpLvRcrXbZwnl6Xqo6PXl5zNyGqhKrR1SWq7ne1S0pO9YrlC85l7R6N60pdzSVZvJUqRK7NLBQzbVSNlGiXyW78t+4liVko9nByE50v85d6TCVseocD9OdRauCAJUiF0WNkl8QcjTMg1uA3t9J55t+dpnJXY5ml+ZhWLI51mZptxRYntn8rSfDXfNXI7TVxSYvjVmUTueJ1ASw5PO+PYTvOhwpP6MOhfrSui9Md3Gmq9UO7GmhlGkquaVTPl
OrTtULB8vVXN/q1ik61nLNHwBd1uoLaKbc1VRqnuf1X9iQmnArHWquVyW789+4liVko9EzNucpl6sSlRAb8LoCiCfBd2L26TALIqgUxVkPocRWf424CAP0ukoWSpVwx9yi67UUFpYsJb0X2CrqjTxbJuyKZmi7l6mnwFratPIE+iRqWvrBUsTVlbCrdTTJiTr4TFOdLyxMd3Gmq9UucphUXbxqrbnUltHqlvq3h5rrWd06Rcc2K7Ss1RewJOWsSkfwLBfBcFmo5hrqAI5dTb+kKEvy37iWJWTDiV/CeNL7WB3kgqwTvV23U8nwm98L1NxgFG8BPp1mqrQhaaXqEmDP55YzU5iJFBaWLCVdFuYJJRbObg2aoa3KWvtqHVSQghcW5QxOdpdPDHL6jcu0dtKuF2ytJRemuzjT1WqXc6gXiMPMitDWfnpqn67xNdOF90uqq3R1bLNCy1q9Tv+UsyrpQCz1Flxeqeb6F2VZ/hvRsoR8MnJB3oHOmkOS9KZbXDQtUnPCvj3AL9xFTdcIs1jNpRtJ+iA7JqUqLCxZSvqoSCbrFQtntwbN0FZGPSNQXXCp3/LMqPQ4vMu4Wi69bYK4aCtkjutEejru13UXprs409VqV7Kris4uR7e1n64w0ql9M114v6S6Czu2WaFlrV6wUspZlXQsl6d9cHmNmlupKEvz//CWJeTzUQiyfc/Qj+n8fomaU05V0mKoRphFak7lN7vDV4aFJUtJH6l+u4uW9qiOVtifRmmjGTe1GFSG1kt1h9QCaT1pkCXpLs50tdrV0MJl12Nr87a6pXs5zXTh/ZLqLuzYZoWWtXrGiilnVapWSwm4vFzNrViUpfl/cMsS8hkpBVml8HYHUlQ9jFIGaUqRoe4uro0wpeoSYPfA9amkDAtLdkNChb71WehmuVZTc/pUjOWkT6N0v4NRL5OiNw2HejJud0GgNSc2a8XKLUx3caar1a7GPiKbBtf2Kyd1fU6veoCvXc29pLoLO7ZZoWWtnrFiylmVtJejtlAW3SZuZGTAMTbRikVZnv/Htiwhn5GaxKTL+NlMVwZpSpEDwfTZsBGmVF0C7C7beouhert1Ry+xpLD6ApCbAXIpnhob+kntwtmtgYbO3nAY7OpNRr+cpEUqHk0rqJcJ6Hck7/BGvSthzeEYlakKtTDdhZ6r1G53XLUm2NGLlr44b7RfdaXaaKb70uou7NhmhZa1esaKKedVUi2STfk6uF+h5lat5NL8P7ZlCfmM1AUZU4KQzXvdam5XVn5R2EzU/HJiQ9IWqTl9AmLmngc6JWdhdVqY6G0Ec9N5Oj6sdgABtqI1pXu5mpN6jvVa3o4+fVNdDTVtX10VGhzezKpHtBtlAvrNGCyMopNeCxTyx8cXprvIc5XaaUn0O1Lg2BosXijb0bfI0pw31PbO3zBopvvS6i7s2GaFlrV6xoopF1XS+s9tzBxYYxRqpmC5mlu5kkvzf4uWVeHs07KEfEbqgqyPGpdX8jvVXPlunoh+1I0NSVuk5gbpu0TGBWalKqyeiRrmGKU+4s9fL5ndamhoV26R/IXaOHdkZM9fN8oEbCqpimD7BNTuxCxMd4HnKrXbgVdJ/j74yMtVUUx4Lem+tLqLOrZZIWFxq2eslnJZJfuEVmR2ALXycjW3eiWX5v+hLUvIJ0SVRv7ol54pFm/2lEH09ncUEf80M7CvZhlFGKAPhuUP+EFys68YxWTm1xJIlh/VF49E8UZllG7JpeCSRrrs2MhTz5vzd9RzEBp3qw7j+XRefCN5CfO78rm0ZpnETWejaVVyvQ+WhzAWpdvpuVrtBge38UUPmUjxgbWSs2rSe65/EAyLv3q6L65ud8c2K2QsbJ2MVVKuN9VRrL5+FxlPFddvPUYa0qGgbbP0Vq7kkvw/uGUJIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCiHAxmVx/wh0gN6Ja55PJZL9h7KRPGEII+bzsnI/H2A0543A8Hr9yYsy3//9EbES1JlII3zQ9M3ayYnBCCPlkjGTqC252bsRl7OYXQjX3drxYzQ3FFCZqJISQrwaquRXY8tXccDw++4SXkgkhZBFUcyuw5WqOEEI+G6Oz8emBmzvooeZ2Rufj8fi8TGg02sXhdDw+3VMHYe84Bkv6YM/DOYej/Kbf7miU4vbOZDCUXMbHlb1BzHIk4U6aq5fO+F0eLdUy9iWHi1HH+iiW4vBCSlFPtG9t28ItUnMtVcjDHI5yJd0ovhdgRxI5q7I8lBQ/4SkLIWT72Z/JBAeeW6b0xBI1d/gE/8hdmhNl8nweHHoOOgmezc0ijCt9AEM1EcNW3RxC1s8w9M9k59osYH6hgRporOGDhwrhNpvIO+N3erRXS1R0lUF4MKcCLcXevYcI4Slpir61XRCuVc11VKE9eFvxtQD7yWMuQXcxFoxzD0UIIRvCiU9P4NDdWlis5nbFeG8PYo4w686jxpQpcfoo9qujk+uJrAF2pvC9PBoMjjGZAp3Xkfw8ztDn6n7stgESHMmxfybwnWr8HZ1/WxWdxJqJspiN9wbDU9UaaW3UFb/bo6Nag2MxTXXFs3v6rLq6RizF1d5g7+QOEe/No29tF4Vr0VudVWgN3lp88UaRp2e7g8NbJHEvxQmXw8G+dtyjByOEkM2gWkiEcO1uLaiaa9J2b+5C3GNKmDHDHErKgDpISucInq4PMF9eqWkwsJVKnFXPxHzj5ooFmajOqS6mYQ72+btAYz3YFcPB4EosMzN2xu9OuLNaWPNkq8QmWoqnWApVQrEZMhY1aU49XENvdVehNXhr8bUAseH0scwwi0kiwQWjiBBC3h9okYisRrpYQc3ti/uTmzElZs+mX4rVVytKdXVvFxf97IbcWKZNlMr0xo54pHVexZJMztwsHIo1zzOSzesA87/pjs74Cz3aq4ULhE0dnVErBarlzZCzoLYF9XANvdXdNq3BW4ufhRTKIFDxXM4RQjYLW3wI8+osv8nii5aRg9HReIyFUXaBK5sSdXWQLkYK2U0sXPDS2RsK7xzLgrkuGJBaloKwLBOsKB4mFWKtlVypxYLV1Fxn/E6PBdXSQOGm83JwrRSaVFHdZbWNLApXGbvbpjW4GuvFz7yFU/i72RZ3sQCEELIp7I5vJzfnC6+tLVVzR5jaKzrmZHj5/K/8f3t32No4zoVhOEkzoRhCCYFSgqGUUhjKEChDCYQS/P9/1eqRZEtyfFzP7vvuTL339WHHjmVZcuGclaw4WT7wEVjPmFyodmdrnKHoqbjZjV0mXeQmHCwMjX56TdOuT3Pm+XbF2rK6tTiEOdjmNDhW7rUiv6nTbumUct3myL0ZKi7XzS8bQJoDMA/jaa7SlONr/H/+PNSVIdHH/zza5/lAie0cztYZqt0NL7UqI5b4lYtMeAlZ7yzthtGcef7oAatb3la1N5eBMV2vFT7Z+6qm9nZKubRp35vB4lHZ/PIwaQ7APIymuaUOdlOeIzFZC01STOzlAz3lqVWrf7ijF
YRHf9mYSidfRO3Knj9ZemdpN6Q583zzwGi3Is0Nuyze12uFv80aV0/t7aRyadO+N4PFk6z55WHSHIB5GE1zSkndcnwtIbEykJ80C6vfHUXILB9osUmjcBoec6ma3XuqavpFVE/+qGxY7yzthjRnnm8fcJ+b3YrUu7iUM6fLNsd2wnitGUL/zbOpvZ1ULm3a92aweJI1vzxMmgMwD+OTlhrOXHz0XIcvKBgZKKx3OalkpUOK6ikfhPzQPUjTmMgJuceZfBG/ar55DJNzq93LJeWBTO8s7cZLmeebB6xuuR7F5Rs7fdZ95yBRcefsctv6XjOW7ferp/Z2Srls0+zCUPHh5pcNIM0BmIfxNOe/Rxy9rd0BKwOFAUdwOfidbNij0imvhTqzycDpF0nv6PBOQ8tremdpt8uo5vnmgeFurTV4anXfjsv5VmxTsfSls6m9nVCuOMXowlDx4eaXDSDNAZiJdfZiyaiq4qIEb7Ot47shl+17Gp2q6h4cdXauYPylut5hd5F0quop3m3pTL/IYnPQZR7yJpbKs1x95bXM860DRrf0s3z1XZdBe9qcsVal/bWYU3v7abn+KUNdMIoPNL+sbVPlB91fMP0BAQD/deXQCACAWSHNAQBmjDQHAJixQ9N82I8PAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMvq6fjRNG/xh0L/KIfj8Tn+0homezgej72fRgKA+djUz8eP9+PT3bT8cNP9KOcf+G5i/TR59oOuw/5b71We0FteNA1gvnYuwiXH+OkIn+X6PxT6x5iS5vRT2FN6Og9TekuaAzBXK6WF885v3xzOzcVvjdq7M17itqMIaf2E9m8waTS3ruv7+c5s9v8iE3pLmgMwV48uvn2P2xO9uFOyp3JfMc3N29/4i5DmAHxN1X2938TtYe8uvtVx+9qqquv6UK3ibqA0l4XEMqjeVEXp2ypf2LCqqpu4uVhWD67uh7J18eR9Xe+7gov1nSt4l/aH3Lgyoa6rNDfYh9uqK9M2WMV2V4Oetft4P/67cd9ckUPVP9NqdWrpYpM3Ii+5LG/icGV2u6/SXNZbZ+jOk+YAfEHfPlzsktNIijiowOCDtrVCX+ctxNQ67gZrZbxEz3+UZVK81F56KlS5vZM2bpVcO69djHZXPC1uY7N9ZF4+hx25HHyha/fdkhiXsYs0N9gHJ4vq/prrt1DC+Z4yxq3qymVTtdEqndi8xc9GWr1PLX1Z6l7GlupCWWIq+mBUZrT7+i9S9HbkzpPmAHwx+dKSkeHID1/gqRw9OD/18atGDJt92PYfV1WlMx7dvxoiuP/q4IN2/YBGqezSRs8HndfcxT0/clQwX7l/f4THgZU+u7Rp2AXbs6vu8rTdPR/dlZc6evbnL338Hkp0y7OqeHSp+k7BWtoUYfShiOpu88Ml1o/6ZrHe+wybtaZ581UpO31sq+uJwDtX5OyHRav9yWdwx2y13VI7zZmVWe2++ovkvR2786Q5AF9MyF/Bc/xsyCaOWT7qPNUpVD7FbRdhtfsj7ijaZiFRETIL0d+zE8O4rA3/9277ejyk4WTbPB/8L11lPi2kuTW1YSAUq1CX/raqoU0Rdh+yqO6v2Y3zntzOR9hUjr4Pm/4i8eOCBlPdiCiyW2231ExzdmVmu6/+Inlvc/07P1AEAP5gyiqtT5b/38VM17y3T9LyZCUrzbaFYcAnac6XDNXULvSqFSG0L92BbpyXfHMF3uO2qspWvmt5TJdoFotbt9um2kSF8k9TihjpQxbVewFeaSX0RgdiFgp1DQyJNZ/Yz9xmq+2W2mnOvgVmu8OhCWmuf+cHigDAnyyMF5xLsdrAsA2PeS4hvGqzOEu5LQb08TTnR0E+VymtPGgAcvFjDg04yki6qbZ1rY/b8V4v2Grs8nZM3G4TDyVqdjct6qQUoeJGH7IL9a6ZeqPBTjv20lcF24xQ8G1qXvIMaLbabqmd5uxbYLa73JZeUeezOw8AX8Oq/n58ebgeQRk2fpJRIdt/ozh8GGm9RBxHfJLmfGjWoMmFURdFNQBRblGV3Uhtq0CeDAfbm3CwkI31In0ak4XXpYixPmQX6gX4rDevbvO8d3fvVvO/H/lFMocwM9uc4ojZbrW2Bltqp7mRW2C3u9iWvOikOw8AM7VUyPaPa9LMY6TcFr8s91maU2I7hzSjYiruBlVKGjGoV6r9NY6AVMwItorIn75oUYXyOdkuRYz1IbtQ75qpN+uYv7xzVuTaVmc1l9Ajs9V2S/3mUJqzK7PbXW5LKjr1zgPATGlezU/rafxy9VwrxtuBNJfPxYUnVrWK/dSeVvcd/QrM+P28pdtMs4kjwVYXyh5MDdNTs/zxWEoRI33ILtS7pnZDinBZTnOWy+I7ZybNELvc7pittlvqr5qdk47Yt8Bsd9jO/yJd0cl3HgDm4vYhX17pvxXnE8FGKSEuOXH8gr82DPbSnJLaY9wOtNikUdwPsVa17t5TSFXe61bt+6NGsPUvzywz6DU/r9e1VC8ia1PESB+yC/WuqV2fLrRA42fxaG+U+hyWOpqttlvqn2d2y3PyNZj2LbDa7fT/Il3RyXceAOZCKevysq9Wi2qnZ2rNOUZBH1+bZwXbzV4JIgXbXprzUflltVjtuxm5EMK7B2kanDjdPJqGNRdf3Tp868EKtj4zNY9hBLbavVxSjE789U+qrtL5mmqMycPuQ3ah3jW1Gxpa5ZOWTfM6kPJcP+Pqk50KxwUrZqt7LZXYUj+l27y4S2y0lfXBrMxs9/VfJBWdfOcBYCbyV2w4P7NnR/kXEop3efTSnP+OtNd++TpEzRR2Q4Fsvq47o2ne9CYVO9h+C+s/W6fB5TR+EOq5ZmqnTR5mH7IL9a6p3dDutdL+S/29XeDYPZtM1j6PRu9pXGy2OmvpfT5p6fJkV9XlseyDUZnZbqf3F8mKTr/zADAf33Z6x2HxVsVgqVcp1nW+bkKqqjeyWVX3xVsoXQKo8tcyLt1eGI90NltXsX8VY/4Cx6uaZXNQIx7iwolh6kEdLtGrY7gPWaGyvGtNqCb/8rSz0qhqoHGLW9V+12WYjtVqf699DivSnHPjbskhfODfMJMZqsxod1D+RYqiv3DnAQBzpcdY7VDH03hnNNP+sn6aAwDgX6Mpy/Qa5xu9AfIXf7DoM6Q5AMBvsywfiTXNR2/a9R8jzQEAfqfV7llTlc3P42PxvYv/kdemOf0fqgUAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAP+ixeIv3CSYc3JSkKEAAAAASUVORK5CYII=) 
PP1yeTd227dx5550nZ3K4m5tDXORVN2U+9NBDHW0f4ZkxWWWVVSSPnOWZaqqpOk42AsLyL4h3xx13SN0+4IAD5OJ7i5YUBjSWG2ywQer6Hzjgw2EkBUPMiy++ONl///2lEJIOJ+MAwkAbLYtXXnlFGvo999wzWXrppaWhU2QJA8LxUylgSgy5tPEtJwwQJCH4qZwEBAgDvacCULhOP/301PU/IADYyUtjy4nELFDIF198cal0Ns8UbC1EVEI0HpJvzpNYcCCLsITh4JVqGAFWGPTv3z8ZNGiQ2BX9+vWTqSSw3nrryUnGEBRKFwbFRygMaGzoKNDgcIeKBf/8sssukztWaEQhpjsmmmgi8afcMR1oEQoDgEYbjniPGjVKDg3SaCrKCQNbTgH1RDs9HJZcfvnlxQ4QBpT/EAi3Y445Ruy2bQhB53PBBReUi5P0fXl32g2dMl5zzTWTHj16SPkO80Z7ssIKK0j9CC9qDYXB+uuvL3YFJ3wRrODOO+9MDj30ULFb0H62jTCgMQqHqFtuuaU0RPBnmGGGcfyR0ioMOHZNw69gDq579+6pawwoEKSjyBIGm222mRzNtkAQEQaUEwaxn4AA49gGQBgMHjxY7AAdB8suu2zq+h8Y6XDMnZ48PYkQjHLoCVEJY+svCBO+GcJCL0FS0Ktj9EHhDgsXlZZCDUJhYEczwAoDRiNWeCtcGLQGQmFAuaTOKGjgtWzTKaATEYL/TBzCUr8srDCgLqEuwUJPfSvKCYNQQNFo0ikC9M45xW5hR8cKTrD37dtX7NNNN52YFtR1Zi5owOedd96U+z/Q6OOHqaNpBfWcHjvQWQXFY4891tHAh8KA9seCtkfbB/JiRzMKBA0qIixaUhjwgvvss4/c3sYLQOuss07qOwb0HvgZ+NFI0vCjP4OeCA0ijRDqASy4+Y1hIXEY7vFM9H5ogWE0Qc8E0FjygxQ0WIsssojEpaf/yCOPpD6J6BSx008W/HRGFjT8+mx+uk4RAXoS9LZCMK1FeIiefQh6F+OPP774I/xCUPA0/u67755yx4DDsgzZ8eObjRw5MvUZMzymsGncs88+O/VJpKJw6x5gKuCuu+4Su2L48OHJbbfdlrrGTBXpd5t88slFqGK3jYoLg2IiFAbUDTpPaExmOmfOOecca0RLeZhyyimTQw45RDoelCut13TGaKAsWBDWaRU6OZQLOlXEZzqK9OjUIDQA5ScLxGUkTN2iZ0/bQP1RkMbGG2+cusaABpve/7bbbitqT8g701MKbfDpmVP/EA60MYq//e1v8o1orI899liJb/25ZZI4xOWdqKu8J0A4cQ8BMwzkl2+lo+gHH3ywY9Ge7xfmGyHHtJCCto/4fHPUpTANS4curJt8o2agrqfQSNJIaE8eYRA2Xq0KFQZdEQzTETioWaHXhpBh15CuZSAQ0OJADyacQnB0Pmjo6FhZwV1UNKuhKxqoU9QxOw2FFhk7iuL/QS0hDBwOh6MehHPyXQ009oxU7CaXzoILA4fD4XC4MHA4HA6HCwOHw+FwjIYLA4fD4XC4MHA4HA6HCwOHw+FwjEbDhQFbp9jzzBYyp9Yh1Q3jaH14/Ws96oxzIg0VBrwMitM4AcmpQE7lWYJn+epWXmjacNatvFi8mN2S5cXCWnsYrpy9VLwYT+MoxcLYcNavVJhKedb85z//OY6aAEfr4ddff5XTs/a/6z+3POWXcivP8tWtvNC04UI7Zhg+Fi6kWJhy8bLiZPFD/1i4rDCWVy5cqTCop2kmGi4MKIjasChxwMK6lSi4lmJhShH6eVDQhiZCnlFJWtYPO+oy0Mdi42GiMRQNiJwS5GdlxbcUCwORN74JJ3xRNsdpRMLw80kfNQ+cgNZCAk+/GXlDd4pNTyl8prpDfqnvryZhIEdrg3/JifLwP9t/rXz95+q2pHwbXt1KlFVUM6DM8ZtvvhknXGi3cdXNcyj/yoPgUV/QzYU6CA5pqZ8NZ9MqRYQhr1988YWopEEvGo0v+pjQ5Et9Q2srbsLj1vehPUOthKYTPt/aS32z0I6ppG7ahWaiKcKAj0pjWgnxo61ZijQMHx1tf+gq0Q+JbhQuh4BOOeUU4aMqF5UZqFJAFwp6jtAeSP6OPPJI0cWz0047iSZVwmva6A9CxxJ2eKipoEFGFQdK5ngWSqrQfogfulbQBIm2Vvw22mgj+Q6a7w8++ED0m1DoSRN9TjyfSoQKbL4bhXTCCSeU78hxdM0L6nPRf6J5sRR+u2q+ZUgIVp7haG1QF2jswv+bR1mxcVAOiQ4i6hLlBp0/6A6jrKMFlMaXuti7d29p5NDbQ91Bvw91Db1Y1BXioa+Hxlmfg8Zf9PYwy0Da1E90bVEHsaOFFG3BPAP9Syjf23HHHaVBR702+oC4yErzSh4nm2wyUYFNetRd1K2g2pq6hj4m+NRRVFVTDy+99FKJh+4z9C9hD79ZaCqFbkul0mhLYUDDYlXkKsGzVIoX+lk+6aOPH+2J2PlxCAcaZRRZUTD4eajxRZkUCqjQ50+B5LY2tHhSGAiDAi/uD6Dxp2BgpzCjXI6eO4UEPen0FvCnx4BWSPQ0UXDg4da7D9DIig4S7jTQ74B+cwo3FRU3GhWpDCjbIjyCZMSIEdJzQeGYPodCQr5QCkZavLtS7LsoL+YXxrGkPN7V0dqgjNHIhf9Z/305snFi8ZRHnUB5HI0YjXevXr2SueaaS0bT1DUaapS+of4dZZFo70UbKmWbDhpKL2ncaYhpfCn/+FFP0IJKJwmhwbNQqoiqbOoz4Sin1F/qDfeLoEqe+oKyOQglfWg61TzjR9qMMnAzCicuPX74CCw0+DISoZ4h5BA+2AlHXmgLiBv7JkrhN1N3jK92S/CbiaYIA4gfAFm7pVJhKonDx0NxGrr4GQGgShb1uoRBwyIFkRECmjspTGuvvbZoB0RjID+Z+xUQIBQk1O/SC9G0+SloMiRtLruh0MBHgygXYaBiF0FAYUYtLY04z0VlNGnyfPyIo3mm14KmQgQN2k95BiMWekbkR59NeHp23OPA89GuyLvatNSubssvR7E42Jkuo/I5WhsIAxrYrP+s9lJUSRz41Iftt99eLqOhDnEPAWWZesioHdXoNLLcUUD9Q+00ii1xc7EO5Zt6hpp4LeOaNj170qZzhiZheJTRAw88UO5P4D4DGmjqOGkRD5XR1LFNNtlEnkEczT/1jctymB2gc0VcvhP1jHj6bA1LB5K6ykgGN3xNq5Q9pCy/GJ/nNBNNWUBmzpLGMA+ioY3xm01Z+ag1f0V7L0wXBq0PhAGNW/ifa6F2Kdt5UVZ6eT3H3lrYDDRFGDBtwlDVqXWIhWoXBq0PhAFrU7F/7FRsakthwJwcvRNLTLXEqJRfpVTJc+qlZjwDIn39fjH/PMi+iyUXBq0PhAG71mL/N6ssxPilKIxT7hl5UTOe04xnQKRPPVfC3ZbCgJ6JJeYWmUuksYFYAFJ7Fg+35cXiQMy9Ufh5DtvcqAyxtKxd3daMhVEiTdYIeAaLTEjxStOK8UJ/5TFnyFY3nsNCHNvrNGxW/DAt3GEc5U
G8CzfL6b9hLQPCTlhHa4P/y44e/b8QZYrbymyZULJlw/rHwlm38tiEwfw8z+HSq9DfupVn+epWXmgq0ViyiYO6zvNiYWw6Mf+Qp+GVWPRmkZn6wPdiXSEWz8a1dkthHMtjrYL30Hqn1JbCgMJIAcSEWAzSObG8iYaZxo3ncF0mDWosXD1EmuwW4hlcy8ePjIWrl5heQ3Dqu1AYY+HqId6F/8Ez+Ef2P1FYHa0NhAEdFvtvcbODJ1Ye6iU6Y5yd4Tlsj46FyYO0jFLX6fQ1op5T3+j08Rw6Y6wFxMLVS/wX3sPWPQi/ZqIpwoDCR6OGCdFjYGjEvDTEVjT2CnM/Kztn4NGoY+p6g5oaB0J6cgcvO4SUR2+BngLP4y5kmw47f9jLjLTXNPHTMJhIaeVD5Kdnz56yV5mGX8NxsI1nUFh4noanYHIVJLsYEHga3qZrn6fx2D2x+uqrSzwKHTzehWfyzXgX/Takyy4M3h1/0rFk01ZT7aTN9lV6cMrj+9n/o3YaEkdrg39I+bT/ljJD50LLBZ0Zzsew1ZOyZcuqNeHb8kRYzuCww015hKE88SzO4Sif+P369ZMyy/5+LctQWEb1mepmlyD5I56GpafOu1CXqYvKp3xzDzj3NpMXeKRn01VT40DkZ4sttpCdSRoPHkKA59D5o+eu4dkByLuwFV3T1/TUVL7G0TDsTGIrufL5H/Yf6X9qS2HAD1PixWkwVRLSoG+99dbJeeedJ4dS2C/MJfVsEWNLGDzi8HPZcsaHuvLKK2WbGtKTrZ4XXHCBFGiIj0hPlzgcJqFB5TkUUA6gnHbaaZImF1rzLMLxUym4HObab7/9Oi6lZhvcPffcI7wddthBtrSRFs+hMvE+FBaEm75Ljx495F3IP9tK2WJHemyt44J6hGD37t3l0m8OmfEMDrKQT96bbXnMGeq70Msijw8//LCkz3N4P97ljDPOkMvHOSjHt2IrKwd92Md96qmnihCDECQ8g3diWy3vy1Y80uK59BLDf0ThdGHQ+uAf0nDqf8WkA8NJff3/bKtmiyZbLOmM4F5llVXk7ACNPWVxySWXlO3anOGhUaY8UQ4RBGynplyTHkTDyXMoj8TV57DVlLpK3WWLKM/k0CedJw5oUtfYpk0eKPekTaN7yy23SGeMraikQ5rUOd6LcPY5d9xxhxw8O/HEEzu2l/NczvBsueWWUq65KJ9zQ7wn9YF3YVs39XObbbaR99I8w+ddEDjaZvGufA/OJ11yySVSFzk0SrtE/ePQ2/vvvy/fkK3rxOUZ1HXqGu/I9nfqs76Ltln6jzAZZTUTTREG/DBLDB8pkLwwPzSkLD7xYnwl4vFhIdLnMFdWWkql/Ev50cDyLvxcfrb1K/fMGMXi8B4IG57DYTkKSRgm61nwY+FDIhyC0P4fiMLpwqD1wT8M/y1liroRKw/lqFR5w6TcUE95DgInDGcpK61yRDxtQ+jk0fnRtGpJMyuOCgHehc6fCiClrHjl8hD604Zom6X/CHtbCgMKnyU+LD+THoQSEjg01U7vgV6+hlVSf0s0bPAZfdAb0HQ0rI1DGLUrxcKF8SF66rwLp4k5DalhLdk4IS90a16sPwURk+cwlUYY/DVMVlrWzelNGyck+Ag2+38gCqcLg9YH/5AG0/5bGh/qhv5/WzZidsuLURiO+s1zqCM2nA1Tzo1p7TYMxGhA6wwjbEwbzsa3fqEZkvUnfQ6w8S7Uca2PYdystJTwt+mGfrwLpv1HUFsKAxp+CqB+AAoLQymGcEzbWGLqQ02GXkyvqBv9I2E4S0zPMEzkp/E8pnU4Rk48SNPQdJieUj5u5Vs7fkrwyDNDPn4g74IwYNjMEXaNq/HDdDEtX3lM6XDyUf0hhsTMleq7IAwZ1qq/5kfTsemrW/ONCgDUcqgb0vBMBdCD4xmWKJwuDFof/EMaNf2vlFnKFFMvffr0GavMaNlQN6a6tczESMOgDobpR+oEz6EBpXzZ9MO46qd2TUvt1q1hmOZFAPA+tC3Mv1P3NX0b38ZT/yyyYajP1Gt9F+o70zzU/zAt3Pocm07s+WE4poaZHtM2Ugl3WwoDPig99koJqcj8JT+djxILUw+RJj+CNQMqRixMM4m5fNZEaJRj/vUS70hlOfnkk6P+MeIbuTBoffAPacxi/9ipuESb2VbCALBrhcaIBo/pnnLEiAHtoSNHjoz650EMkVFeN/XUU4s9FqaZxDujIIveWsw/L7r++utl8ZoFvJi/ElNSzGuiOdLR2uAf8i/5p7F/7VQ8oq2kzUTpHx3qZqHhwgDwQuUIMA3C1i4QC5MXAXbzoCkRxMI0k5iSQRiwvTTmnycBdlKxYwPEwig52gOxf+tUbOoMNEUYlANqmtEeyPayZgH11SoMOhs333yzCINmDgvRP89+7N9//z3lOBytBc4UDB48OHU56kWnCwN09NNLZ/9uM8EIpCjCgP3JzRYGABXAnNNAXa7D0Qw8+fSzybvvvZ98NbrsFYW++/775Iyzz09z2HXRqcKAAyAc3mB6pNlg8ZhLN4oATmYiDDix2GxwPzWjJLYBOhyNxhNPPTO6E1K8zsdpvc9NbV0XnSYM2DbJNEVnLVJy9LwowoAtsAgDjvd3Frj8h5PSDkcj4cKguOgUYYDaB6YnOmuhBGy22WayY6kI4DQoR+Y7G2znRUWAw9EoWGHAWuF//vMfsatpwVkCC9oL4qC914K4rH2xhoA/FILdOWoyE8GUrE3HhUEnCAPODnA/amdjjz32kEXUIoBj71zQXwRwMIbDeA5HI2CFAaNR9G/RKePgJWdb2FrJgSzc+DONCf773/+K7iCIdUbaEdYZUfqGOnzO6DDdiq4f9Jehcwi9YPvss4/E59AYwoRw7N5DELCVW+HCoMnCAMVO/KQigNPN7LcvAtBkiLKuogDlYEUQ2I72g08TFRdNEwY0LmgdLQroUaC6oghAZW4RpoksUIfBRf4OR55wYVBcNEUYsHWSOfoiAcHEidwiAFW5nLouGjiYxgjK4cgLbC19+513k79+8WWhyLeWNkEYfPXVV3JxTdGAbqKiTBOhLqCIwgBwGBAFXQ5H0UAHc4YZZkhdjnrRUGHA5Q1TTDGFaE0sGtZcc01ZLC0C0I6IMOjM3VVZYGfGjDPOKDvAHI4igalVFwb5oWHCgF0A3Ch0/PHHp5xigRucUK9bBLCDYtVVVy2kMACcBWGEQD4dzQPfnXpUK7HdMsZvB+LduHGM28fYWmr5Nlwl1ArfqVF5tGiYMOAquPXWWy91FQ+LLbZYcswxx6SuzgVb6LiXuMi44YYbkplnnlm29TkaDzoGNHZo/a2E0HAZ2i0vi8IwuGPxyqUVi5eVliX1t+EqiRejrLTUXgmFacTSLEc2jo1XaRqxOJXEjYUpxUMdDVt2FQ0RBiiPmnbaaWX/b1HBHahcjl8EoG+eE8ixwzJFAqODoq5ttBsQBlzAzn0gHJCCrN1SFr8UNSMtp+qpmf+SC/8bKgyQOggCFmiLDA62TDnllKmrc8HaCsLgH//4R8opJjioQz59h1HjocIAfVVZxPmUGL8Wq
iUtGyfLXikRB3UsnA5WHrvsUNViw2URp49RuEhvl7Rwc6MiB0vx//HHH6Vx1PAffPCB3J2MnfD4EV/9lUq9S5ZfqThZVEucLKo0Lba0N1QYbLPNNsmkk05a2PlvwBzj7LPPXhitpRtttJE0shTkouOCCy6QvLqm08aC+kMnAYFQijSMDYvdkvJDCv1LxSnlZ/3L8UI/G+aqq66SBtryEAiY3NTHVvBrr71WLqSiHI4YMUJGq4TZfffdZVqNdUBOGHPd7c477yxhPv30U1HGiGDgPmibB2tutdVWHW5LYfgYqZ+NpxSGCe0hLwxnSfkhhf5qj8XRsHyvhgkDrmrr1q1b4RcaqWhzzjmn5LUI2HzzzaWB5Qe1AmaddVZR9OdoHFQYsEAKoWLBkvJifsq3cbPCWQrD2zihn/KzKCtuzF/p0ksvlZGAujWeEuolUHD5/vvvi33DDTdM1llnHZlmRU0FjX+PHj2k17/TTjslq622mtzLzCYW6hiCgGlOTPsMNQmjfEsaxrpDXkhhGLXH4lWaVqk0QrJhwvCaFkK0YcJgrrnmkoai6HPfVLQ55pijMFpLt99+exEGFPJWAIvJ448/vuh7cTQGlFEq62uvvSb0+uuvd9gbTTyr2udlha8mLcLRUMfCc0k8dwO/+uqrHWGtv7o5s2P5lkrl45VXXinpnyc16zkQz8p6Hu1NQ4QB809Mu3DauBVAj2KqqaZKXZ2Lyy67TBrXZl/wUw/Y482OLEdjoMKARhCi96tmSCE/Fq6SMCERppa0stLO4peiWuLEqNZn55VnTatceqF/Vlql3Fn80M19Mg0RBnvuuadsjwz3rhYVO+ywQ7LXXnulrs4FKqy5eS1U2VtkMBXI/0brpCN/IAzY8YHKZS4echqb0D6Kdl3u8475O5UmRlmsUdpZnNyEAT1bDnK1Clgs2mWXXVJX5+K2225Lll566ZZZMwC//vprMtlkk8kUl8PRGdhxxx1bZiaiFZCLMGBln8VYVu1bBSwsscBUBLz00kvJKqusIsO4VsLKK69cSL1Tjq4BDrayRdyRD3IRBgzX2B7ZSjjzzDMLs2bA3B33GTB8ayWgU2neeeeVrXwOR7NxzDHHJDfeeGPqctSLuoUBawTs9UW3TiuB+34nmGCC1NW5YCscu4lY1Gkl/Pbbb3JD2wYbbJByHI7S4D6Dv/zlr2PpxykC+X0GOQgDtiixv5dr6loJLB7PNttsqatzwXY5trlitho45OO7ihyVwi+3KS7GEgavvvZGcvZ5FyW9z+lbMZ159gXJ8Sedlpw++mPG/POg8/pekpx97oVRv1rorNFpnXLaWcmJp5wR9a+Uzr/w0uSsCL9aOuOs8+QbUiBj/tVS34svT865oF/UL2/iOx534qnJWX3i/pb6jC5bH330SVraHF0RLgyKi7GEwUsvv5p8VcAf9cCDjyQ//lQ8bZkjRt4mqi2KhiuuGpS8/sZbqas4ePe9D5L3P/wodTm6IqoRBllraEzrhGCLJPp2soCuIvDZZ58lv/zyi+yGs3XXhYELg7rgwqA6uDBwWGHAhUkoVUN/0CWXXCKXYKF3iB1C6BZCPYXug8fkxj38UDp37733in4sLqiiYWe6etiwYcmf/vQn2aI9aNAg2eBw7LHHypkN9tUjRPr06ZM89NBDYrfX3rowaKIwqKfRbKQwiPUyKkUjhUE96TZSGNSTLxcGjmqnibjgB225bFZoJFwYlBAGaLdjaMXP4FAUemgGDBggEhmJys07nI5U3HLLLcnNN98sdsIAzh8o+vfvLyYnB9Ftw24e9JAwZLvwwgvF7/zzz5ctn+Dkk08WE1hhwDAPoO756quvlj36w4cPF/sJJ5wgvQT0jCjoHej2szvuuENMtCMqeA49B7Qlcqp26NChHYr26EUA8qSHW+w7WWHA+wNU4aJZkZO5I0eOFMVaKNGix/L9999LGDBkyJDknnvuETsnTYG9FQ4Ni1xAgWIp8n/RRRdJvhBepA/IH70noN8QWGFAJUKBIOB59K64+5meFL0lYLfnofaX70lPjN4VGDVqVEcPTZ/Nu/G/6anRQ6OHp9/m8ssvT0466SSx8+4KFwYOXzMoLuoaGdAw07AjHKyOi0qAMEFzHqDBwm1he+zVjgw++ugjyddNN90kDX014EIeGm6AYLEgjzaf1Y4MaIBpeFVoVgOGxioIERL2vciT/f7Vjgx4Z/2PtdypwHsB8kAvzsKm58LAgTB49rkXk3fe/aBuev+Dj4Teeff9cfyqoQ8/+jg54+zz0xx2XfiaQR3wNYPq4MLAkSfYGj7ddNNV3eFzxOHCoA64MKgOLgwceaJ79+5yq6ILg3wwljCoBcwXt+LNV8x5F+UEsl4nyXpKq4E1pQknnDB54oknUo7D0RygF6soB0fbAXULA4B0LrXHt4hgwZh8FwEIAwo1F1S3GlgnQJV10e9vbkWwaA8x+nQam8D666+fbLrpptIRjYVxKk26KUSRizBYdtllOxaDWwXsYppooolSV+cDwWR3QbUK+I6ujiJ/UFHZMMAmBr6xk1PexM7H3C+34bJqtna2EvgQ9GiLAi7ob8WRgW7pdeQL5sHZ3s1Orx9++CFK7DAr5a6GakmrlufVk8eQ8nx+nvmqhSp9fiXhwjBZcb788sv8hQHX880444ypqzXAXPe2226bujofXNShW1pbCWirdRXW+UOFARXWEmd/rBnjl/KrhK8Ev5o4X331VUccOjZZcWNu4lo/S9zIZd36jJCX5ccZGLWTlj6LcNjDZ2elpe9nwzM9rmFtuuqfZbfuUvzQT6lUHMtXiqXFd8ldGIApppgitbUGmOPmAv+iYKGFFmqpy4EU3HvtyB8IA9QqUCZCotOA+cc//nEcXrUUi1dtWtx/zn0mNIzK46Ajd3rTAG288cZymPGAAw5IHnzwweToo49O9ttvP7l6ljA9e/aUd0EVOnG5AnaPPfaQw6tMle27777SWdL0OdeCsLHvD5HvGI/DpuQRO3evcA5ps802S0488UTZ+IC2YPJ24IEHymFY8kUHl00mW265pai0eOyxx5LFF19cNnksscQSkhZpoj4DuxKXVJFn+3zI5smS+oX/slScLIrFKZdOQ4TBkksuKadkWwn89KKAk8Tca9BK4ITyfPPNl7oceUKFAWUiRu+//36UX4qqiVNNWBpXhAENWpb/wgsvLJoMsHNvMSfwuSaXhnP11VeX0SULwggH9A1x2p5T/Jxwp8F+/fXXk7333lvi27QZPS211FKib8jyLem7kB7PePPNN+U5nJRHGwEn8tGFhBthgM4k8kee0Sww00wzyXO4553eNAKCC6nC55A3RsoIEuVV8h0b9V+UsuLwTg0RBnzoeeaZJ3UVHyx4b7fddqmr87HeeutJYWoVsMDJxoHBgwenHEeeQBjQqNBwlaI33ngjyodK+eVNnKxH5cxbb701Fh91KvS4odBPCb5SzD9G+m6YqKQJ+ZUSjTrPxTzssMMkn2zkUH9O5TOKQZDZeDFCgCFU1F1tXkqFrzatSgiB2BBhQGbZK89wsBWAXiSGikUA
27y4EJ+eVauARWP+t9W35MgPKgzQ5AnRc1Z7vZRnWpVQs59XdMr6Hs3+TgjqhggDCi+NwzrrrJNyig2GTsxDFgVcLs+0QKugW7duyTbbbJO6HHmD+sQwnrl3FApaQkFkyCtHecYpSlp06Jg+qiUNpaK8S6V8qJbnhES5YkTUEGEA0PiJQGCUUHT87W9/kznAooA5V4ZtrYD7779f/jOFytE4MBWnyhGdxia+DWt+CyywgNTlWBinbGrYoTMFUmaGGWYozPRLKVCA5pxzztTV+VhkkUVaRhiwjZhFP4ejM8E95jPPPLPsmXfUj1yFAWChBYFQdF1FHOZhO1lRQOPaCio9uJ+BbcR6B4PD0Vmg/nJy38tiPshdGAAWQ1mZLzIQVuwSKArYn8y+5aJjqqmmGusiHYejs8C208kmm6zht6B1FTREGHAghMat6Aui555bnNuNOKOhl9cUAeF8IuDWOA4DcXrb4agFXG5TxJ6833TWIGEAOD3I1jgaEA5tsHOHk35FAppLi4IrrrhClEcVBeyZ5oT2LLPMIqqCd9ppJ9lD7XDUA7/2srjITRhwbSJTCKgnQAEcmiynnHJKmdNDPfPkk0+e9O7dOw1dDDz33HOprfPBicciga2N888/v6wPjD/++EKc3mSelkW7ffbZRwS9w1ENXBgUF7mODOjZsl8eDZxcHANx8QkqCyaeeOJOVXM99dRTJyussIIsbpOfU089VS5rZ0fCpJNOKusHNIDNAMfd2T00zTTTJIsuumhy+OGHC4+8zDHHHMlKK61UiKE0pzIR7vxDBDwqv/mPCAO/v8BRC6wwYHujzvcz9Uj9Y9uj4p577kltY6BbItkJyDQm4TG5L534qIrAX+9Px1/30es93ZiqCdauNbgwaOA0EY3FcsstN5ZA6Gz06dNH8kKDxsITdkYs9H6bPWVEQ8vBLfLA82locSOYdt111zRU54O93OSRkQEHCrWiORy1wAoDFNRxJokdiKzfPf7447JbDQVyF198sSwQa3mj0Wdd7ZxzzhFh0K9fP1HCRieKxh0FeOeff77o3KFuoSaFkeuRRx4p8VE0h2AYOHBgMmrUKBEEN998s/gBFwaBMECSqmTVwwnYY27lhaYNg530GDE88MADHf5hOA0bmjF7jLL8YvEpOPRyaeAQVByeQg1ELI7lWX4YRs2QYv7YlQAaHVVgYp5yyilSaDW8jWfjhzxrKlm+9QvdyouZaDbkxKP2wGxctatbyfrbcJSFZo2+HMVELdNEzSgzLgwCYcCtSuia+fvf/y49+3JmFuFfLk4sjVgYpZi/Ula8kKcm10yyYItQoJGyYcrFtWboH/KUQj/1V5MGF6GEorrQr5SpVMqdFTc0lWLurDihaSnk4WZojkBwdF34mkFxMY4wQP84t4DFCEFRjTtGGqZUXOyl0qokjUopK60sXikivFLMH8ryC/nqriYt686yZ/HUHZqlKCusdWNXNxeLuDDo2nh8tDC44657k4cfeaxuenTUE8kTTz4T9auGHnvsyeSMs89Pc9h1MY4wYB6Ok7BKnBmA1M0NOZbH/Bs8G96a1q437CgfwcPeeg3DTUHqhxvdN2q3/JAHkc7zzz8vfNLVW4c0LHb4pdLChDScxucwGFr+YmGxh3zmL0lD3fbZ5Et5aiqPMJCG1zD4a3zmVdWuzw6fryb5ZscUbuLYbwLpcwirz1KTxThNS/OifvAYzaAoTN02LdxqKqnbhYEjL6AhmR2Lfu4lH4wjDDgbwBw6GhMtwUNQsL0QEzfEDiH8MaebbrqOew3gcRKZLYgoYbvyyitFwRk9RC7CQZ929+7dZf86DRaNz+mnny56jUif+GgWpdHR52u+rF2JcDYvJ598sjSG+++/v+yXJ30aL/hcWsF7cGMRjRh28sgWWBa0OBPBAhU7fvT5TOHos/T5YR7Uj2+ASeO36aabyqIW6ZIfdluxo4GLPVjoYvFs+eWXl8aSZ22yySayfZNvwKIamgW5RIZ8kibfRJ8bPh+35Wl47HyTs846S6ZrODjG4hz/g2fybj169JD3ZpcVZeCoo44S/fQsIPPPDjroIMkD74Pg5UAh8fR5pLPuuuuOkwdLhHFh4MgLlMO55547dTnqxTjCgIaMhiNGKFLjEhYat5g/DQ/nC0iDyj/vvPNK44KqYxo17hxmpwB76rlAYq211hK9+Ojxxs6Vcqg6YItl+AwaGBZYs56tRB64uWjDDTdMHn74YdHZTWN6zDHHSAOMylsaODSWog+IRm3NNdeUBphGecSIEWInLbZ5Ej98BhdgsC015IdEXAQKuyD4BvDYSsqBPN6Xb8Cz2fEwbNgwEQ4IT7a58g353uSVxfcwbYhvoWsNoR/fQe2E40YnBA27KkiX25oQvuxcIh9sZ2VEg6BEcCEc+VZrr722LCCT30MOOUR2bNi09Vk8g91Glh8SQsaFgSMv0ElhJ17stLyjeowjDOi10ghk0bPPPiuNt+URB+KuURouy7fhaGQYEWBaviWNUy4flZBNS2/34dl33XVXct99940VlukuTt3SMNq4IdWSrzzepRbK87nl0mJ0FysX1kQguDBw5AWmh9gizo41R/0YRxhQoZl7x7SkPOuXZbc8pZBXiX9INnwYJsav1B3jh6b1j/HUbfmhvRRPG9PQP2ZXsu5YOGu3ZP1DXmi3vCy+9Q/DWH+EsQuDykF9pMGD2BcPqduS8nVrb2gP3WFaYThrV7flW3dWvmzckMLwNlwsLSj2fIhtpyE/ZrdU6vlZcZRC//D9Y2nF4li3JfXDjKWldnWXen7IVz8b127bHUsYEIBeHAewuGDaqfjEVFg1/M4ipqeYznJhUBmopEyNcsAqi37++eeSdsvLoliYWtLCPwwT44UU889Ky7pDsnGsGYsX4ymFcdQe8mP+MV6Mb8mGtbyQb/2zKBbG8sK02N3XsJvOHA5HPkAYsO6GWhLUrVszixfjh/4xKpdWJRSmkZVmKbJxlCy/GsqKG0srK334SuoOw4QUhsmKE+NnxcVUsv4xsnFifEtsWnFh4HAUHCoM2AXXaGJXDoSdHXDbb7+9NBRhuJAISx7Z+KA8NBOzI5CRKbvn2BWnfqTJrjQaJuWVItJlRxt2TQuT50LMYsDDbuPpu1i3hlE/4qKyIowbEuHJN5snbFjWGMkf/uRJTfzYNafvjYmf+mtYfRe2e7OZxaZNfLXXSzwrxofwc2HgcBQcCAN2usWIXVlQzE8p5p8Vh11+NEzqpjHCpLHYaqutpDHceeedpQFD8zD6gdgSTSOmgoCdg4TXHXtMCTI1eNNNNyULLbSQpMnzmfrS50AIE7WH+cOP8PDZjciuO7Zbo1OI7ek8jw0tZ5xxhuwMZDqSLdKcT2K7+BNPPJEss8wysl7F7r/jjz9e8qzvap+N3bpD0m9CXLZmb7755nLmh/dkizxudiySV84/9OrVS9IjPNvn2VXJu7DBBkHENm78Y98EHWqsr1leSOG3qoRi39eFgcNRcKgwYHuyEtuNQ7dS6G95Id+6IbY90zCE4TBp7Nh+zPZh3GyrPvTQQ0WdC1u
UaXTZpvzUU08J0fMnDLvy2KKNQFANwaShz4dopFdbbTVJR3n22ZY490O6pIeb9BEIaPxFILEVGr1ebE3nOQMGDEhOOukkEQ6MADhfxKiFc0c0xvYZbIvmMi4OlirP+tt80Uiz9ZubHBFKCB7sN954ozyPfDGKgEiPZ/K9jj322I70Lr/8cgmnbk1bn8O3eumll8byC8NYt+VlhQvtmAgHFwYOR8GBMKCy0phANGBqz5tKpd3I59ZDYb6y8mn5eb5nteEtlctrPWlXQwhNFwYOR8GBMKBXzpZjJaZErBnyS/FwW14sTowfC1dJmJAIUy5cLEwWz7qzqFQ8tVeaVkjVppXlBz+WVimqNE7ID92MqFwYOBwtgB9//FHm35lOcRqXOLW/+OKLy1RNzN8pmyhXKJCk06FwYeBwOFoS7MTZd999x+rdOmqHCwOHw9E0vPHmW8mQYTckQ4bWTwMHDUmOP6lXcu3Q4VH/amjY8JvSHHZduDBwOBxNg19uU1y4MHA4HE2DC4PiwoWBw+FoGlwYFBcuDBwOR9NghQEHwLjDhAuX+vbtKwvCnHC+9dZbkwsuuEAuYFJwZ8Ell1wid5XcfffdcicIijV79uwpB7Q4xctWXE5JcwCMQ13c4cHdKIDngIEDB8qJajTCospe4cLAhYHD4Wgiqh0Z0Lhz6pdTwtUCgYDaDHD77beLqUAgsLVS4cLAhYHD4WgifJqouHBh4HA4mgYXBsWFCwOHw9E0/PjTT8mnn/45F/r4kz8mDz70cNSvavrTn9Mcdl24MHA4HC0L7gFw5AMXBg6Ho/BA5cR0002XrLjiisnMM88s9yecddZZyciRI0Xl8/TTT59svfXWaWhHLXBh4HA4WgZsJR1//PGTiSaaKJliiimSCSecMOnWrVsy6aSTymU2jtrhwsDhcLQU/vOf/0jjjyBAMIw33niijtlRH1wYOByOlgT3I88yyyzJv//975TjqAcuDBwOh8PhwsDhKCq+++675Lfffkv++c9/jkUxXowqCReGyYqj/FL+WX5QzK9UnDzTyqKsOOXSyfIP+eXSgZqRVkga7m9/+5uo+VC4MHA4CghuoELfzs8//5z89NNPTk65E7qg/NpLh6PgQBh88sknUmG//vprodCu7iwzK1xoz3IrWb76xczQ37qVSoUpxVe7kvWzYUq5MUO7dWt4SzF+VtzQDMnyQ3sYNzSzwoV2684y1c4ZDRcGDkfBgTD4+OOPkz//+c9Cn332WQcp79NPP0023njj5A9/+IO4v/zyy+Tggw9OPv/8c+G99dZbIlDYaYM/Ct8glL898sgjcmXkDjvskLz77rvizx7++++/X9K98MILkwUXXDD58MMP5Zk0INttt52Es/kI86TE4i737H7xxRfJgQcemOy3337JuuuuK6OdQw45JOnVq1ey0047SVye8d5773WkRd7J9zvvvJOcfPLJyUUXXZRcffXV8m40YCid4101PBQ+X3mYPJtvwnttuOGGyVFHHSVnEy699NLk0EMPTY444ojkvPPOEy2mbF1dbLHF5PkrrbSSKLvD5Jl8H77V6quvLppSyQPhyVPW8637zDPPFIV7hN91110lX+utt548i29BHuabbz7RsLrCCisko0aNSpZffnmJS76JC/Xr1y/ZZZddhEd+eLfBgweLqc/iHe+7776x8qB+aqccuDBwOAoOFQaoZc6i999/Pzn22GOjfsTddNNNk3PPPVcaVRoc7LfccovE6969u/QOuUz+yCOPlEaCxp4GgrhbbbWVpL3OOut0qIem0db0SQM10jTcyosRcZ577rnk/PPPT2666SYRUDSEPAchQdo0agiNHXfcURpnBNYPP/wg/qitRiX19ddfn5x++unjfBOeT+P5xhtvjMWP0auvvioCioYdgXTPPfdI2hxcg4dA2mabbaRh5ZuRNo0xDSfxyT9Ccvjw4eO8two01GuX+yb433DDDfJNUNfNv3n88cel8UcI8G1R302Dvskmm8i3WXTRReW7kReE1h133CFCBWEUewbvRn5jfhD55bkuDByOggNhQKNBrz6LaNjWX399MUO/hx56KNliiy2k4VAevUgIOw3u8ccfnxxzzDEyGqDxww9Tw2LiVh4NOSZ00kknJS+++GKHuxTZNDCvu+466d3SM37ttdekwaVx0zD0frnPACFBA6vxIZsHiDwgzDRuKSKu/QY2XQ1j/dUd+qvd5oV8cj/Dyy+/LPwwnyGRpqZF2Fg+NJz623BKGl/jKfFNEAZZ/vAQCC4MHI6CQ4UBjUuMmL6wZoyvFPqFFMYJ+eX8LMX4WWGVqklLqdo48GuJE+ND5eJglopvycbJ4iuFfiHZODG+2iEEmAsDh6PgQBgwnH/mmWecnBpCjBBcGDgcBQfCgP3g33zzjSxUOjnlSd9++62cNaCcKVwYOBwOh8OFgcPhcDhcGDgcDodjNFwYOBwOh8OFgcPhcDhcGDgcDodjNFwYOBwOh8OFgcPhcDjqEAYod+LKuQUWWEBOs3UW/vrXvya9e/cWO3ejoh+kUuyzzz6prb3BtYB5viu6ZRZeeOFkuummEyVajuLg+eefFwVnEMrnGgH02tiTq6UQ6r9pZ6AryF4WUw9QSId+oaOPPlpURzQDNQsDtO6dddZZqavzgIY/lHUBGr3ZZptN7JVg+umnT23tjd9//z3p0aNH6soPG2ywgWhOdBQHqHpG8yedpEbdDTzVVFPJpTuVYNVVV5WLVLoCaE/+8Y9/pK76wL/Tju7QoUNTbmNRlzA444wzUlfnwYVB54Hv7sKgWEAYoFG0KFhjjTW6jDBoBLhzoSWFAT8enRdrrbVWMsMMMyRTTjml6CG3QN/Kcsstl8w444wyzbDXXnulPkny66+/JksuuWTywgsvJBNNNFHH8Agd5rPOOmsy00wzSZpXXXWV8EE5YYBknXbaaYXQl25RShigx/2XX34R/ebkk94QutUtuAQDVcDdunVLHnzwQeGR12mmmUbef+aZZ5b8WVAx4PPsKaaYQi4ZsUBvyFxzzSXxeSYXVigYgqKjnri8D5d9WJxwwgnCJy7/wuodWWihhVLbmJ4d+SActPjii8t/sUCf+tRTTy3p7b333sm//vWvZPvtt099x8CFQfGQJQzQlX/ccceJSutwKoNpHO4c4J9Tfi2YriA8qqYvvvhi4aFN1YJ6e80114hKbC59sSglDODTk6ae0egNGjRI2g8LLqShHHPvAncFKOD36dMneeKJJ6IjIN6D70CapB+CXvc555wj9ZbpZYu///3vcjkPl9Fw74IFYWn7eFfUZ1twP4B+W5QMAr4b+Qi/C2DEzhQ7F/gQF4TftmWFATcCcWkGF3kr0FU+cODA1JVIg25fmA+BbnPAT5t77rnlJiGUKCmPRskWYASGfuxSwmC33XaTBlJBnrkZSlFKGDC8pQFlzlNBXC6kULBmgmDjIg6AFkCbJpVszjnn7HhfKs144403VuWYY445Op6BUjKEHYUR8M403BQacPjhh0tlV1AgufADoLuc248UDz/8sNzMpLDfhTxsu+22qSuRiz
CWXnrp1JXIXOXOO++cuhLRl8+/3WyzzVLOGLgwKB5iwmCVVVaRho3yecUVV0jnRUHZpcxedtll0mCx5rblllumvmPKPBfB8O+pP4Ayq+sApEmHgTsRKCcHHHDAWGWnlDBAgHDjGBe30CgjqGafffax2gvK3Z577ikX79DwgzXXXDPZY489pCElDdoHWw65uYwpTMJzecz4448vl+goevbsKZf3IABZ/5pgggk6pr0QfnwPOmmoe+ZyGW3naF+ojyNGjJBO4GGHHdbR9gDW0bTdop2jPnKhEJfqcG+ErWMIPeo6/jyHdorvxk1nFi0rDJZYYglplCyQ/JNPPrnYaQx5eQs+lM5n0/DTUFlw0cXNN9+cusaAtQokPsgSBqRFDzsENwnR+walhAECqW/fvqlrDOgVkD/tcU822WRj9WSoJBQmC0Y3a6+9ttj52RROCwoV1/oBCjw3HlkgKLRy8p4PPPCA2AHvy01NgMpOz87CFqJQGISLehReAH+SSSYZp/fIPPTmm2+eusbAhUHxEAoDRnThaJlGTHuqjHzpdVsw+uTmMEBjyJWMFlYYUCbC9QP8FaWEAXWYhjyELZ8IMhp3BZf2kD8L2ggaYkDdph2yYJSAkAEIL67ftGCUQX0H1MUrr7xS7AqEFXjqqafkwiALZkG0roTCwM5gAPtd+O5hW0m7pvlUtLQw4Cq9EExF6DQEP5leBENW7l/lA2ECGnCmZELQUx42bJgUCn4GBUh7yFnCgALOz0Hi8iyInjQ/lqEbKDcy+PHHH1PX/0B+tafOyMCCRjQEhWOeeeYRO4VcC0sMTC9pXpWo2ExHAd6V6TIqKBVJRxCACsl0D5UHIcYowyIUBiG0oJK/pZZaSuwWDKtdGBQfsZEB5YfecHjnLZ0aerqU8++//76DqJ90LgA9bG7dsqCs2HQAnT7CMb1iy1e5kYGObC0oVzpaXnbZZceaqsFN4x+CegG4W9mOnkMw6uXaSvu+1B2mbgGdN9orwoSLwXwn2h7uKtbZAAsrDJgGD9+bjqhCnxcibFPaThjQyCEM/va3v0mjxLQQP5uw3Laz9dZbSziEQThMYgiFtNS7QvkhSNwBAwaIf5YweOyxx6TRpCAxRFSiF6A/rBZhwE9UYcCUlsXEE0+c2v4H3nveeecVO5WEPGaBisk72vxCtsGlF8K34y5WCs7ll1+e+owBI5Nrr71WGgCuFVRUIwxU+Fi4MGgNxIQBZYbOA71fpi3Zrgioj0wZcTE+UzFKrOPpWh9XT4Z12goD6hnlm3CMOLgX2XboygmDcE0RsEal64U0oDY+zwpHrUDLN9NZ4ejaYqONNpIw9n0hRr4K6hwdTzpXzC7QBinICz14pnx4po7MQSgMwrqu07YI4SxhEM5mtLQwoAdvQQ9VG13mK8OeAHPbujAZEwYIinDxhZ9XbmSA5I5NEzFSqHSaSM8vKOhBMP+oCIUBjXO44PTkk0925I8hJXe9WrA3WafJVlttNRmKWtC4s64Cdt9993EKmDbsscUuKrr2YCoVBhRUFu9V4Ck4pxAOkV0YFA+hMKC8aHkH/F/WnnRbuE7hhmB6CZQTBpNOOulYZYX07dRPuWki1hhCsO6oZyRCYUCHJKwjgMYXUGftup5Cvwlz97YBV+gogHMzdkGZaWA6eeSBziFtlIJvRF5VWFQqDAAjmXDkQfhQSLSsMODH0btFsgJejpdmgRLQi7DzdXx4Fp90MTMmDJja0V0MgEUf5up1kSlLGAAWmthxoLjvvvvG6vWWGxnwLjpEpoDwM21PJhQGzE2SNy0QOqzUysjCOg2xLrBTmBBYumOI+HwPLfyMKihUjGYAPS9b0RGShAcsarEDSsGzJ5xwwtRVuTAAzJvSCGgPjN1d5MNHBsVHKAwoi4zMLVio1Y0VdKzCDtwiiyzS0TMvJQwguxgNaKht+SolDAYPHiyLt7ZRpPNDvVOEwoC82J1xgBEGm0UAaTFfj1BS0OHUxVvSD+Oz0Mz0E6CzGgoT0qPOssjNyMKC9QsVTtUIA6bTOBhoQX2y9RC0rDBgZw0NGwWAAgHdeeedqe8YsHtA/SiIgIYKiU7jGO5YASysEp6tmAgGRhs0upy2pDesvQt6KMxxWtAL0ufRA7dznQiLLLCLgR9LfjR++FOYhgqBcEJIEJ7CEfbWadjpTeBPJQ1PbzNS4FS3+iP8LNhOqvmhINpeDFNG6ofQs9tSVWCCcIENUOkt2EarabGmw7ewu0SAC4PiITZNxBoSnTIEAFNCduRHh4PGcb311pMRKI0RHTAF/zgUBqSl5Y4pHRo55uqpe0yv0KFjbh6UOnRGIz5kyBCp10xpslBNB8aeaqeRDuPTWaGx5Rl0gAhj6wFChhEPnSfKbLgOSQNLJ4rpMp7Jrh7t+PBd6KCxwMs7sXORdBS8I+906qmnSv1feeWVU58kmX/++TuEAemHwoD20YI6t8wyyyTHHnusrCuyo6ttdhPZbZatDoSBXaDtKqBHxZ70EFQI3QKscGFQPMSEAaAsM2L/+uuvU87YYCSJP+sIFuwAtL1sEE5v0Gumk0NYQCPIwiyIxVcgDHTalHLE1FAYltmCWHyEGOGz2hs6lviHGykUxKejluXPd+KdYm0A78Z2cX1Hhc1r+I2AnWJSYczzWSfkO5Fn3bmkaAlhwGo788gcvNAP5sKgPUCv6JRTTpGGgULNoiC9JwU9N/aU879dGBQLCIMTTzwxswEuEqww6GrgnINuj1cw0rDrGfxDRuiFFwZs72JujTl53VvPsK1dGlAWmnTI19VAIWS3FgeOmPJjasEOeZm64vQm0w+xHpCj88DedebH+W+NUlSXFyhjt912W+rqWmAUwDQv01BMdbFj0o7oOH/EJhY6XM0SmDULA4fD4agHdDpaYQTTSLDWwTSbXfPoLLgwcDgcDocLA4fD4XC4MHA4HA7HaLgwcDgcDocLA4fD4XC4MHA4HA7HaLgwcDgcDocLA4fD4XC4MHA4HA7HaLgwcDgcDocLA4fD4XC4MHA4HA7HaLgwcDgcDkfjhQE3CKGRz6m1qKtrk2wX8B9j/9ep2BS79L/RaKgw4IW4RB5d+BC3+0Bqt6b1Vwp5Gi6kWBjLU35ohmTD27BZfOtvebF4Wf7qtrzQT+2WH/pbt+VZUn4sXOjuqnc5tBMQBNwvkvWPs/jl3Fk85Vu7JcvTMMqLuWN86xf6Kz/m1vDWP/RTfszf8iwpP/RXu+WH/tZteRD3HdgrepuBhgoDXoqr43gxrpmDsIdua4Z8a6+UF1KpeJixeFm8WLxSvJg9DGN5WeEsxcJk8bLclqd86+bGOh8dtDbojHETHQ2N/uPYf1deaMbCWP9YOGtXCt3KC+NYnvKtGbNbsrxYWGu34ay/5cfClOOF/mE4a8/i4aYTTfvZTDRcGHC/J73MZhEfM8Z3qo64cMOFQWsDYcBtZzQusX/cCPL6lw/99NNPIsSbiaYIAy575jJoCLu6LR/iCkXrVrLhbXzrR4F/99135Yo4RiN8UBtGw6nd8my4jz/+WIbWyiNPpM1l4S+99JI0khrPxo+lF
doxlbhM+4UXXpB0qUDweDYXbWN+++23EofKzLWi2LmTGD/yZNPS5yjhX+pbahwbN4zjwqD1ocIgrAulyoaatmxYv5A0LfwZhdxzzz1yP7aW6WrSgijrXBJv45H/r776Knn22Wc7hFsYLyT7DJsWpuaZqyXfe++9jnBffvml1D+Id+EdaAuocxqf+vrjjz+KPawzlsLnq2mfH5Llt6Uw0Ma1kURhOeCAA5IJJ5xQ7hOdaKKJ5LJtPi4fnzD6odWuP0XTwM3HH2+88aQwWP7xxx8vaS622GLJxBNPLEJB09GKpm77HMzwORAFcIoppkgWWmihZJJJJkn2228/WTQi/zPOOGMyyyyzJJNOOqkIt3XXXVfutCWtN954Q/JHxbDpNYIQRi4MWhsqDLQ8NpKefPJJqSNrrrlmMv3008uF75R9nq3lX/OBW/00Pnbq0jLLLCP3a9s68+ijj0q9o/6NP/74yRlnnNFRr4iDXdMlHX2m+tu0IDpV3D1MXZtjjjmSBRZYQBpf7vqmPs4777xSz7iMns4sdtoypm6olwg8m14jiM5i2wkDGlY+Ph8yyyxFYZiYmwLPD6NXwYd86KGHpGFF0q+wwgrSy3366aeTLbfcUvwRHBSsHXbYQQrL+++/LxdP77zzzvKzP/roo470//rXv0rar732mhSqXr16JZtttpkUwF122UUK0o477ihhL7roIqG11lpLwhJu0UUXTYYOHSr5hCioK664YrLnnntKGvRAunfvLnmcfPLJk1tvvVV6Pssuu2xy2mmnJZtsskmy0korSb7pcVHhvvjii47vEH6PSiiMG0uL3pkLg9aGCgPKjv7XSqmSMqJER2a66aZLBg4c2NGQ0Xi/8847SY8ePZK7775byhOX9NORufbaa5PFF19c6gl1j7JP3aTTQ11AGOhzqJ9zzTVXcu6550qd+uCDD5IlllhC6tHll1+eLLzwwlLH6bHTeUIYkdarr76anHjiieJPZ8t+gyuuuCKZbbbZRABQB5dffnm5mJ92YaeddpJGmOcR5k9/+pO0CdRJRgS857333jvOd4h9l/CbhWYp4pu0pTDgI/LhQwr5uC0vZo/xGD7SYFMocSMUKIw04PC//vpr+YH0Aj788EMpoCyQ8pMfe+yxZO65504OP/zw5JZbbpGeB4Xq5ZdfTl555ZXk8ccfTyabbLKOZ/EjKYgXXnihpEfhRrCcffbZkka3bt2SO++8Mzn44INFSHz++eeSByolafCjZ5hhhuSGG27oSJP0KKyMFrbbbjsplMQhnY022kiEB8/lfRAGCCj7HWKEfyVhstwuDFofKgzsP4ZKlQ3ll/O3bhpsyivlEh6NNnWC0Tmj30GDBknHjLpBfmisaWTpLNH4UncIT0M/7bTTJuedd550fKh/zCxQ/55//vmOZ1Jf4PNMhMlRRx0ldVjr+2WXXSZ1hxEKjf2CCy6Y3HXXXRKX+rfPPvsk22yzjeRT80v923///SUfzAQwath2223lObQTvCNCDmHAyMB+JyXNXxbZONa0/sprW2FAw6vEB7Vuy8O0/ll86wdRiCgE/FSeR+98ggkmEDt8CiK9ExpvCuGcc84phYef/OCDD4r54osvSoFQAUE8SAudPu+ZZ55JhgwZkmy//fbJXnvtJYXr6KOPTjbeeOPkiCOOEAFAw73qqqsmU089tRRIGv/nnntO4uNHj+PSSy/tyD/DUaZlGBkwVKUwa3gqDj0XCggCigrFaELjhqTfSinGt+GzyIVB6wNhQMeIBoZ/qv8+LAfqDvmlSMNhUs+pIzTePKtfv35S5ulI0RAjDBAUdGQQBkx9Mh2z+uqrS+eHunPCCSdI3WAUTH2gLpAmo4ippppKRvv6bPypz9RjGnumqBAYOnKmHaCzRl2h8Wba9bDDDuuIz4hhlVVWkXDUX57B1G/Pnj1lhuDiiy+WBp93o/NIXnXEM8000yT33Xdfx7sradoxXjmycTQe9a/thAFTLvqSSDvIvrjyKiEbT+Niam97ww03lOEehQg7HxU7PHoC88wzjwxlKaD0/ik4/HTce+yxR3L11VdLePyIS8EmfeKR3gMPPJBMOeWUUtBIh94DjTZChsb90EMPTXbddVdpuBEMFHoEC0NfKoHml8aekQv5ovdBOvx4CvSIESM63g267rrrZMoLoUUviiEzeVN/wmp4tdv4lsqFszxGUy4MWhsqDLS8hP9bKSwLoTuLZ/0YFdOrf/jhh6Wzo6NppnTo0DClowID86233pL1ga233lpGAvPNN590tBACZ511lsSFqEsHHnigdKruv/9+aayZ72eqlFE89Zd6wXQsDTp1mnjUO0ba8NZff/3kkUce6ci/rr2df/75yZVXXil2Oo50xOjkac8c0pH9BRdcIHVcw4bfI2Yv5W/DWLfyqH9tJwyQrEi5Wokec8weupmuQbIzpXLSSSfJXCG9kZtuuknm3ZmWoQdPIaIBZh7xmGOOEX92H1FgKYRM7zD/aJ9BOoSnt0+vhB9GgWN4C+/000+XH8jawIABA8SfeKS/9tpryw4n3JpfwjJvyfwmwoP0iUOhZ21Dn61hGekgWBjeklfNlw1nKfRTt+WXig/xPV0YtDZUGFRSVqopG0o2HPWhb9++Uh9233136e0zMqAu0cDiR92irJ9yyilSJ6kXxx57rKRDR4pw1CWtL5o2wqx3797Sm2cKVRtORgfUr912203qNZ042gBtUJmmYvRx6qmnCk/TI21GE1tssYXkkw4d/oxgEFo2LPannnpKOpMILoSK+md9p2r5kPXDzmxGWwoDpFweRAMV4yvxkyg4fEzs8Pi4FAz9yBpO3cojjMaLPQe+pqM87DxPefYZkKan7pC0x6ZuzZcNA+lzbNqWyn2XGJWL48Kg9aHCIFamSlEt5QnScko51rJMWlpvtKxTjmM87FllXONoeCXlaZ6tP+njn5UmfpC6CQfF3j/27Eoo61uW+sb4IdzaUhgg5aohPkSM3+rUau/lwqC1ocKAxiX2f2PUzDJalPpQxHrJCKrthAGLuyx48nKWyvFi/lmkYbPil0sr5l8urSyepdA/i9Q/Fgez0vjWnsWz/JCsPxXEhUFrQ4UB/zL8z6Hd/vssivnH0qqENGxW/FJp4VdN/FhY62fJ8m24kNQ/FgezXHxLGt7Gwd6WwoBFGKfWIhcGrQ8VBjQssX/sVGxqO2HAPmAKpCV21oT2GC+kMIxS6B/jWbdSyMtyW761h2GUF4aLhSnFs+6Qr3Z1h+Esz/KzwmTxXBi0PlQYxP5xJXzrjvlnxbH+Id/yQn8bLsYL49lw1m4pDBOGU3cYTu3qVgp5YbjQjIUpF07dbScMUL3ADpiQ2GXQCGrGMyDS58xCs96l0c/hXZRwM1R1YdDaUGGg/1T/cyPLEqTPynpOHs8v94w8SNPP+5uFaYXPUXtbCgP74hBTRyiByiKOfsf4pUjjcK6BZ7AXmPMHtaRVjjiAgs4SiOfx0+rJc4wPkT7vwsE3dkbEwiqVSivGV+JdSN/+H8iFQetDhUH4b6l/1M1yZSNG5coZI0qeQdmldxuG
s1RrmcVf6znvwsGzcnFiVC4OC7n6Lpjwas1zjIhD20FbRfqWEAhtKQxobCyhPoIj5ZwAjJlKoVt5lq9u5XGghGegF4VtWjZcGF55liwv9Nc4bJ+jMPIcTgVTGEulpfFCv9DfEmm+/fbb8gym2igclaYVC2ftlse7kL7+GyXmmV0YtDZUGIT/loNfNEa2rMTKRsi3vCyThoxnUD8ov9ZPCbflhWnE+NZO3eDQGs9BBQWHxMJwSiFP3ZZv/W043oWGGuI5sbA2nZh/KR6EmgvaKvt/VPi0nTDgIAgFQ4mtpqNGjZIfGCM9dRgz1W75SowCMGmceQ5CgQZNw4bxbZyQbyn0w81WPRpQ3oVTk/SubVpZ6cHXcKX42CEKoFYqnhXGse6sdEO38tREYFIY7T+CWMByYdDaUGEQ/lv0/NAg2TKjZcOWEVtOlLLKmfLpqVNWqR8InTCcdVsK/XArWR7PgbRuoEaG/f/qb00bT8nmP/YulrSO8058M302fuEzLOmzYnzrJi1UcNBWhf8IgdB2woCGRgsIJoRKWn5gI4jCwTP4wFSEWJh6iSkURjwUFk4M24M2eRGHXKhMvAu9IJ4XC1cv8S4IG/03Si4MWh8IA+ae+Z+2/nHilsYoVh7qJcopz8BEFUQsTL1k6waNNJ2zWLh6iY4sdRxiNkMPnuVJtB0INv03+p8YjbSlMKBgaOOJyZFue8BCT+NhZ87R+inF+MTjY1p/hAHP4AMz5LLh9XSf5UFZz4Two+Jo/iAqGAUFeuKJJ8byg0K3JdILn4ebfPEc5eGmMvEuCDYEgo2j71Iu7zE+hVDtvAuXfGih1//kwqD1oSMD/adKjGY5aWvLR6kyayksU5RZeMqnnPIM6j0NtQ1LeY3VP0uxMkt5pSFWN2kgDHgvBBtl1YbXk882LZvHLKKnbt3aCaOe0+kL817tu2gebD3nu6NuO/xHbSsMtOGEcKOBULdPQZtuuqnoEuKD4Ka3islPVpNCzbSPxiEMCuNQR63hIAogz9Ceg/JptOeff/7kkEMOkR8CT+NpunY/tsbjxyyyyCLJQQcd1JEvhnAUFJ6D/hWNT+FAfwlKs2hgScfmGdK07TOwc3cBuor4RsrjJjSegWBDI6OGR0ig/I5n8Uy+jaZn30XT0XgQyr7Qh6RuBCZp23+keXBh0NrQkUH4b9G1o3WAMo1OLuofap9tHdCyE5YtJRrI2WeffawyRseFMkv9QOgonzlwlMyhuwi3rc+Y6rZlGaLBJA7aTJWHP3WA5/AM0lY++ohQPImOIa2vNp6a9l1wUwe4XwHFdRoOHs9g5IwuI+Xz7XgX1F3zfeHZd9E0ralEeDSpIoxxE56pbZ6j/wc7I4S2EwZ8yJDQwMkPZFF0+PDhosWTgoXSOBTGoYGUj4+aaVRCo68cddH40RNAMyIfkA+NxkLSIT2I4RzPoLDwYeFRwNCkSO+Cew1WW201aRDRQqiXYSBY0IaIumhGFzwDIi6ChYtv+JGkx4+isafAox1R+YTv06ePPJdGl+dQycizvstVV10ll+qst956UtiIQ6NPwaDAUxBIi3ciHu9Cz4E8wKfwIGzwHzlyZLLBBhvIt0HIobALxVt8T7SbommV92Ukpu+CznYKPWlBCEyEjf0/EPlxYdDaUGEQ/lvqFv+X/095owxRrqgzaOClfNB54sYvlMfdfPPNcq8GdYg4lCU6QZQnFMXZ+kdZ5RnUD+o0PPKAIjrKM3Wf+wJQukg5Hjx4sCiXRMX05ptvLnUELabUK55D/ngOeaITps+izvAcBJvuxqGBnXnmmSU8ZR6NqeSPaWm0ntJxpI1ZeeWVpYNH/eUZWjdQbY2iSX0GdU7fhfdVPorvaH9Y+0R7MWq4EQ5oHEYTKwrwaF/QhIxyPgSWvgvvh6JMvrd+G9ob/TdK1Mu2FAZ8TG08MdEHziIJjSo/nasq+RE0yOg1p9dLQeLmIwoQH5XGDq2eqH3mB1Co0FrILUekSVqkyYfnOXxg5VNYaIC5QYnGmoLFTUY8k1EJqmz5UTwPrYikQaPNc2iwKWBoV6Qw8gwaexpQekEUNHrXPIewSy21lGhYJD6NNIWR9Cj4FCg0NqIql/fhHXgGF2/Qk2F0gMZGzTOCjXehklL44ZMHKtKZZ54p34r0yT+aImnoSZ+eC2q3eQa9PXpLFHLyi/ZVVGZT+PVdKNj2H0H4uzBobagw0H8K8Y8ph5QjyhONDh0gyhCafdEMSjmiHNJg0uHg0ie0dlLGKOOU7euvv17KDZfXUH5Ji/JEWaWcUT94DjyIOwO43Y8yS/nnedRz0kfgMCvATWiowqYucGcIZZZ6TANL3act0ecwA8D7ILSIC593pbeOlt/+/ftL3WBETyM900wzSZ4RHrQ3vAv1i2egup46SieK51CfeQbvx7vwztRzeDwHtdk06mglZlRFvWX0cvvtt4s2Vr4xnT06sbQFCCDqOXmm3UHVN9+JtKjn1O3wHzEj0XbCgJfTD6pEb1WHQ/xIbeyw88H4GITDTlzC8rOwEwfSRkx7CLghfjbhiItk1+dgUnBIGzf+fHBMiGcTD+GjaUHkH3/yqGkRjrTJIxKfuPhh0jPS9LRiaH6oIPB5Bnb7LrgRgJoWxLvwDCoEBdeG511IFzvfhnh8C54Dj3yTnqal8fRbK480SB9T/w/kwqD1ocLA/lf+Mx0whICWAeoEZUXLA2UEO2WJOkY9gYddyxNEPA2jaWk5px7Qc1a+lnvqDXbKLH7EJQ51mHDkg7KLHSIM5ZMw6oaoD+QVgUN4fQ7pUDe0E6rtCibpYCcPWgeUCE9a+gxMracQ9dyGJz55Jz3NP+9MHM1PrM3CJK6GIT7vYv8RBL8thYF+UC0kSFY+UqXET9VGvhwxBOY5/BRt5IgfhquHKEwUBN6F3j0FIRYupGryQVh9FwpPpe+vRP5i/DAPWjHsP4JoRFwYtDZUGIT/lka60jJbLdGwUS8of0zV1Fr3ysWjoeVdEGw8D16pOLXkg2eQNnWEdc5YmHqJfOm7WEJo0H42Ew0XBrwsH1OJHgbDSssrR3wcppZifiFRODDpYVAwQ/88CCEDUeBZG9ACkyeRJu/CMxBs+l6VEnEY2pbLG4KAsOrmeZALg9aHCgP+p/5bTBpp6pT+8zyJDgzPoX7QUYqFyYP0OZRxnhULUy8xwuAZtFnU81iYeon6qe+i/wdCWLelMNAX5aPy45iPZGhFQwRhzyL8mfO2bmtXt9rp9fAcetJ8ZA1rKYwT2m2YmBtJjsm7IKQ0roYJ44YU87c8TUvfhedRgcNwoVvtarKmwtA35FsTgQnxHEsslrkwaG2oMOB/av2D6OVSdikDSlomyrkxrd2GgShvmjZ1AzOMY+NZvvJiFManoeZdqBeU3zC8hi3F17QszxIdMN4FP6a2w3ChXd1ZppLlQ/oulphRaUthoC+tH4FFYH4gjVwpE6nPjh8+lvKUcCvRs1U7hUM/MD9T+TYeYSAKa1Zayrf+akfIYOc5THnZsLHw1m3DKp+KynV/1h9inpJnINjoZSlfw2nYMM82nP1+MeJdGEV
R6JX4Ry4MWh8qDGzDw/+lLMGLlYdYGVJelmmJOkvalEk6I1lhS6WhhJ/1t3bqNu+DYKP8Wn8brxJT7ZYPaQcMYjbDhrHxskxLGl79bJ3Vd1HBAzGF1HbCgJfixS3RgHLBNA0gdMYZZwhhZ2UeYvsWWyV79uzZwYNsGI2Lye4aVuzpmfDzaOTYwQA/Fp+dRBBuJZsHddu4ENve2DWgP5RL8tm9oHGVwng2rTAMOxJ4V+WRZ3ZsMDLgXRAG7ITQnQtKpGnzrDzc+jx2F+ndshpO/djZwLZBKpMWev1HbHtzYdDaUGFAg8M/1X9MuWLHji0PtgxB1o2/hlGyPA3H/eDUB55F/bj11ltlB58Np3FDt9o1rZA0DHWAu8apE7wLDSl1Reu5houRPlfN0K7hsFOntbMIsSuPHVA2jMa3biXlaXglwusz2WnE1lo6a2H9YxG57YQBL6UfFKKQ8PJIXXq+McKPLWhs8ywVLiQKua4VYFJQYuHoTTD1RKMe8y9F5AeBo1IdoUNvKBa2UqJgc1m/unmGfReexbtU8y2UiEMFYhtcGD98FyXcLgxaHyoMwv9Leau3zMaINHXUzDPL1fNaiPSoC/pOvAt1JRa2HrLvAjXyXbSeW2pbYcDLVkM01Hvvvbf8gJh/vcQwmUMhSPuYfzOJQs3BFA61YY+FqZeY70QYsF885h8SFYx96C4MWhsqDPifsf/sVFxiYbkthQEStlJCUnKQjKFg6IdwCHlKWX4xPnOZnPblwEvoB5V6TrVULi0q6rrrriuH4igEsTBKYVqVvLPaEbActuH7hmFCN3YXBq0PFQax8lCKSoXJ8quWDzUjrVqo0rRsuErzXEnahGFHUVsJg//+979ymo9tWfTCyxE9WE7IcnKQBZtYmDyIefJJJ51U1hRi/s0k1k/QmcRJaIRULEweRNq77LKLnNBkLjQWBuK7M+xGgZYLg9YG/w+lbfzPRtYnp3yJdRdOJv/nP/9J/2Rz0FBhALjNp5zecAhd33yIOeecU+4HiIXJi1Bgtfjii4v0jfk3k9Bnjt4Xpq0a/d5cpMFzUMZVSh87eWp2r8SRPxAG/MdGlyunfIn2knaz2Z2xhgsDwEuVIwotCpx0S2MjCU2fKK7iyHfMv5kEEEwouwKxMHkSWiBRzIda4Ji/kqN9EPu/TsWmzkBThEElYAspu4eaARROoUaW/f1FAAq70BzZLLAFkHUKh6NV0dkNZzuiEMKAOWzmspulmIkRwQQTTFAYYTDJJJOIuulmgu/NtJzD0YpgJgE10BxqdeSDThcGzI1xpwD73ZsFdsqMP/74smZQBCAMuIegmeDd2VHF+oDD0Sw8/ezzyVPPPJ88XSc9+dSzyaOjnkgeePCRqH81RH5++eXXNIddF50uDFjM5K6AZoJT0QgDjn8XAZNPPnmy/vrrp67mgdPUnFB2OJqFfpdemXzz7bfJV1zbWhB6+NHHR+fpuzSHXRedKgy4hIW5axZ0mwmmicYbbzzZ01sEMDLgYo9m4x//+IcsJqMLxeFoBi7rPzC1FQfPv/CyC4PR6FRhwDV06OtoNtAIiDBgj38RgDDgNqfOAGcuOIzGmRCHo9FwYVBcdJowQBkTWyp/+OGHlNM8cMsTwoCDWEVAZwoDwPWYXDfocDQaLgyKi04RBmwH415jVFl3BrjSkTUDTmYWAVNOOaVcnt9ZYEcGl4T/9ttvKcfhaAxcGBQXnSIM+vbt2/StlBaff/65jAwYIRQBE088cXLYYYelrs4Bl6AfeuihqcvhaAxcGBQXTRcGnCVYZJFFZEdPZwHlXQgDjn4XAeSls7e5Ml239NJLd8q0naPrQIUBSvTUVHsIdhpa6MgVFc/AxkOvWQzMQrBRQuNyJwFAgabChcEYNF0Y9OnTRw6ZdSY4rEIDXJRpkYkmmki0iXY2GJ2g3tvhaBRUGHz77bdJ//79RU8YDTSn4qHrrrtOdvuhrO2kk06SsIBt4Fw48+uvv4q6d84KUVZRy85ZJd2IwiVRXHLFRTEjRoyQRp+LsgC7B6+44gqx8xzVv+XCYAyaKgzoibNQiqTuTKDJEWFQFLB+QeEuApZaaqnkl19+SV0OR75QYcCOPhp/evfc+nXUUUd13JT2/vvvy02IdlMDmz2GDBkiOrW4PY1Rw7Bhw+Q+ddqTgw46SOIx1cktblzORHy0f+pZGsq1qrwhrmo8cGEwBk1tEbnNi6smOxucui2SMGDNgMJbBHAQbZNNNkldDke+qGbNABXO3EaI+u1qT8ozWrjwwgulwUdbLyMKC0YOChcGY9C0FhGpvNNOO8mP6Wyg0hdhUJS99VNMMUWnrqFYMMeKmoqirKc42gu+gFxcNE0YMKzjAugiAF3+RRoZdOvWLfnwww9TV+eDLb9+7sDRCLgwKC6a0iIyGphqqqlk0agIYMjIPH1RMMMMMxRmmgiwsD7NNNP4ziJH7nBhUFw0RRiwK2CrrbZKXZ0PFq3ojRcFCyywQGEWkBUsuu27776py+HIBy4MiouGCwMWcOhlNlsZXSnQ42WaqCgXY0w44YTJO++8k7qKAXZh8N/YveFw5AW0lv71iy8LRQ8+PMqFwWg0XBjccMMNnap3JwZ2GhRJGKCjqSinoS24fe3cc89NXQ5H/Xjt9TdzoUcefSxZboUVk6HDhkf9q6V/uiqWxguDeeedt3A3arFmgHK4olz6zuU+Rbl1zeKll15KZpppJhGeDkeRoGeFbr311pTjqBcNFQboAJp//vnlxG+RwKX7U089derqfHDQi7sdioi1115bDvQ4HEUCyiYRBpwyduSDhgqDJZZYQoRB0cBxd+bpi3DmAUw22WTJww8/nLqKhauuukoqnWs0dRQJOjJwYZAfGioMZpxxRjkuXjSgr4RpomZdwF8OM888c2FuXQvBYjujKNdZ5CgSvvzySxEGXM7kyAcNEwb9+vVL5phjjtRVLHz88cfJfPPNVxhhwAIydwoUFcccc0wy22yzpS5HM8DpeMpnI4i1shi/lYhR/dtvvy0HSGP+IbXCO9eSx3reC3UfFg0TBrPMMkuy3Xbbpa5i4ZFHHkm6d+8+zsfoDHDmAWFQpENnIVAZzCgPNR6O5oDG7ueff5bNDhButcfcpUjDZsWJ8StNv5JwYZhK045RPXGVNI3QrITCsLXkp9Y0CFcubMw/K06o76khwoCGDX073BtQRKBCe/rppy9E44YwmH322ZP7778/5RQTTBUdfPDBqcvRaKCJkzMelFGIigtZtzWzyMazpo1n7eV4IT8WLiQNY81YvEp4No2QV47CuLG0ylE9cZWIY+NVmpaNZ00bL/QP7ZZCDQMNEQY0GkWdIgIohWPN4N133005nQvuM9hnn31SVzHBCfKFFlqoEKOprgCEAYukTINAKA6EQrcl5WeZMbt1W7JxbTgbPrTbMNZtKQxn+dY/y23J8myYLDMMl0U2jNpjcWLhQv/QtOHUjFEYJmaqPea2FMZRd1OEAT3dHj16pK7igW1pjAy+/vrrlNO5YGupXs5RVDA3yV3NRdwQ0I5AGHDxC/q8aiVGFjF+LVRLWn
k+v52os7+lptVwYcCNRGzbfOaZZ1JO8cCIgAVRLsPobHAKmvug995775RTXEw33XTJuuuum7ocjQTCgPM5qHHpqkRnLcbPgxqZdqtQw4XBsccem8w555yFnk5gDg1NoZ999lnK6VysuuqqyV577ZW6igt2FaFUz08kNx4IA9bcOLiJUKiGaomTRfWmZeOXS+v777+XLaPqHjRokEyV2TBKYVr0domvbkb/nEFAzQsNPw2fxuGAJ2nbNJg2UXslVO5dYpRnnDzSYrrIIldhwFQCvce11lor5RQTzz33XDL55JMXQgUEIwN2NrXC7WKstbC+4Xu7Gw+EAZpsEQj1EI1iNfxSRJxa48X4lmigN9poo+TVV18VN3E4+Y4wQGOAfgs2p+CHG2WK+KHkkc7UcccdJxfj08nbb7/9ZKMIo/+HHnpIOqiPP/64pEGjiL8+mzR23HFH4SuviJT1HWv9Jw0VBkhgDoLcdNNNKaeYuOaaaySfTz/9dMrpPDCCIi+ofWgFTDrppFJxHI0FwuCjjz6SRo9zMUq4lWft1m0pFq5c+NBf7WGcUmFLhVG+ddMQs85oefr+NOgrrLCCrKGwiUEbcy7LQpEidoQIUx+MXLGvvvrqco8yU8I0fKx1rb/++mMJEpsH7mGmkQzzBSnPmiHFwsd4sTil/CyFcSoJn+UPr6HCYMCAAckEE0wg0rzIuPjii6UBvuWWW1JO54HRFHnhrEErYLfddpOFZLbEOhoHhAGNIDfgdQWiIeZcEgIg5t+/f3+5MJ9ePm5O7DMlTfj9999fRvsIAxRjMnpgdx7h8OeSfO7mWGONNWQUYdPVML169ZIGMvRrZ2qoMFh22WVFH1HRMXToULmEnhFCZ4OTgOzhL/JWXIsHHnhAhEFRtuW2KxAGNFx8Z3RpQdjVXcosRYRRUnfob90hL7SHpH5h2BjPxqOnvs0224zFs/HwX2655URA0ngvvPDCEv6QQw4R94orrigq1xnt33333cmSSy6ZbL755uKmTUID74EHHpjsueeekobNA26EQcjXPIR5UQr5GicrPJTlH8aNhQnJhld36G/dlihbrJNY5CYMWJTlKskibylVXH311XIorgjbJFE7gG4iFpFbAQzVEV70yhyNA8IAFSVvvfWWmGqvl2JpZaVdyzNrzScLveuss07y2GOPdfA0rRdeeEE2L3D7Hioo4GlDSKMG4c+93UwbEYaGHX/smMTRhjDMI7rKtthiC/GzfKUwvHVX+76Ez4qTZ1ohxcI1TBggeREGw4YNSznFBRfuFGUhlKPi9LRXWmmllFNssODNGQ3mbh2NA8LgtddeK0s0ZKX45fyrIeLklV5WWsz3x3hPPfVU8vzzz4/jp8TmBmtWQ+SDzSQxfsirlOqJG5KmlZVmrc9qmDDo27evzH0XVS+/xahRo2RkcO2116aczoOqo2DY2yqgB8baEILM0RggDF555RXpZNFQQdjVrXbrDsNZUj9Llm/DhXbLU3dI1s+GDfmhf6n4YdiQH5qh3VLIt3Gz4tswygv5StZtwyplhQntWeEsZYWJuWN2iPNgDRMG9GynnXballBXwIdB8VoR1gzoaS+22GKFvPchC+hRQhgUYTdWuwJhwPQIveGQWCyFYn6WwjCxOHmnVcqdxc8Kl0XVhi9FtaYVi6e80AzJ8rHHwoW8cm7lxfghaTgUIVrkJgxo0FitbwUw7OzWrVsycuTIlNN5QBhwNqNVpokA87Hk+cwzz0w5jrzxyy+/SIVlegShG6Msv1JxsqiaOBq2mueXC1uLP1oOrLtcWqWo3PPzoFJp5fn8cnHwh1jntchFGDBdgMpq9vW2AtiZwDRRUS5759vtuuuuqav4YGqLxb5WORvRiuCUNwufqFvnFjynsemuu+6SHYFsMY35O2XTo48+Kgvr4TRvLsKAk32sF7DtsBXASvryyy9fGP1JyyyzjFwG1ErYdNNN5Z+zG8qRPxgxsu2YCus0LjHFMdNMM4kq5pi/U2mibFHGLHIRBuzdZdolVHxUVHDggvUN7vctAqaZZppk5513Tl2tgZNOOkn+OfpgHI5mgwNTbLwIGzRH7chFGMw111wtc4IWcNqRQyqXXXZZyulcsNbSSgvIgO25DNOZynA4mg06IWxv9pFpfshFGDBdgJKpVgH6SRBeHFQpAhAG3MncSkAIsKOIA3wOR7NBHWaaKFwEddSOuoUBC13sLEHDYKuAU7SsGRTlQhmO0aNgq5XAHaqcnN5ll11SjsPRPLAAitJEVF478kHdwoCTbHPPPbcokmoVoPWQrbCsdRQBTFnRy241cNENmiAdjmaD9clVVlkldTnyQN3CgNO89GzZe94qQNU2t7EVRRjQw2GqrdWw++67F2aqzdEauO2Oe3KhIUNvSA48+LCoX7V0+2j6+W9/T3PYdVF3C8TdBUjoVpq7Q9UtU1uHHXZYyulczDPPPC0pDFgvYFTjOzocleLSywfItsYi0TPPvpB88+13aQ67Lupuge68807ZTcQccquAaSKmZc4///yU07mYZJJJZHtpqwFFfwiyIt937SgWLus/MLUVB8+/8LILg9GoWxgcccQRcqEEErZVwAIyJ5DPPvvslNO54D7mBRdcMHW1DrhohH9/xRVXpByHozRcGBQXdQsDpogWWWSR1NUa+Pbbb2Va5sILL0w5nYuJu3VriUuBQnDwh+k2DqA5HJXAhUFxMZYw6HPeRUnvc/pWTGede2FyYq8zk+NPPC3qnxf1vfjyKL9mOvuC5ISTT096nXZ23L9CuuCiy6L8aohvSF5OHv0dY/7VEulddEn/pHefuH+edNZoOp68n3ZW0nv0c2NhLF1yeTFOfDs6Dy4MiouxhMGJpxRTC+WVAweltmLh6sHXpbbi4F///ndy8qlnpa5i4cJ+rbP92NEYuDAoLlpCGFwxoJjCYOCg4gmDX3791YWBo7CoRhigYTOG2AVabGTJArvd9J4VlGoCq1PLhcEYuDCoAy4MqoMLA4cKA3b0oakXtSYDBw6US+zRFXbJJZck3333ndxCyCX1Cu4nvvjii0Vb6ZZbbin3Ht9zzz3JlVdeKXc/9O7dW8Ltu+++kh6X5HMQlkOxZ501pj5gP+ecc8SOkkrVa+TCYAxcGNQBFwbVwYWBQ4XBJ598IsoOuRuDjRxsQqDxfuKJJ+SOAtTh77///hIWPPvss8ngwYNFUJxxxhly8c8tt9wi/N9++02uYkXAHHzwwSIMaPiHDx8ud5ecfPLJkgY3x2222WZiv/HGGyUecGEwBi4M6oALg+rgwsBR6TQRQoKeOyMARgU03PAqBTrTGElwGPaPf/yj6PC3uOOOO1KbCwNFU4QBc3b1nFJtlDCo9+Rso4RBPWp5Gy0MqqmQIVwYOKpZM6B+cpiVhryeclcOLgzGICoM+AncIMR+/N9//13smLgB9gEDBozVmLLnnLk7Fn2Q4oTBZI6PoR2SGn/sgJ+MHw0fh5fAqaeeKuYrr7wipsIKA80XBYT0MHVBiWcwh2iBHiLywiXuHIwjT1w2TjyGlYB86LuRL54BuAQH6DDzwQcfH
KuhVmEAj3gozyJdnoNb0yRfOm+pQCUG4Zg31XxpwWdoC8gH3xWQZ4jvx3woUN1K+v1AKAz0+5NHFs34Z6SJCY+hNs9U8E3x5xnkn0oIDzsm70he9d30HwOG8eCaa64Rf+6NIM8KFwaOaoRBs+DCYAwyRwZ9+/YVk3k2buGiIfnggw9kNf6+++5LLr/8cvEHNCycQuU6SdswcU8poLEgPn5cRv+Xv/xF+DTcDAEZxr388svJRRddJA0gjZQVCFYYHHjggWIyr3j00Ucnr7/+uqixpWEnf/YeZho1Gib8EAYAwcAdoIAGGdx6662iEhc+z2YIyTvSiKNqgbuSadR4P/KqsCOD66+/XkzurGWu88033xTlfXy/n376qUPQAQQU85rky56C1otimDcFQ4YMETt5AyyYsWuCvNx2222SL/LLUBr97iAUBpdeeqmYfGdOi/N9db6Vf0w6VhiwgMczeU/dgcFwW/HSSy9Jfkjv448/FkHTp08fSYNLtjndjYDjX4Dbb79dTODCwOHCoLiICgN6qujtocd97733JoMGDZIGl8YQNw0JDYQdGbD6jz8NPT1EeoXaS6ehRKEdjdmLL74ojTBpsHsAIUHDDlg00hGE9s6BCgMaJxoeeqc0YoxOiPvpp5+KkKHxtFdZ0jsdNmyYNLo0btddd500ZBqGfCKwWExC+NBIo2+HtImr289ogMkXwsIq5FNhQFjehfyxaMW34D0RCHwTnmOFFA0m34RvpflGqA0dOlT8+W4ITvKJEKN3TprsuGBhTHW48/6Ab8A/A6EwOO+88yQPzz//vKRBXBp7hCSCi3zzvRU8m/QQOCiiowwgpHl/hCRCDMH9xhtvSN7J5wknnCB5pJwAvrGO1uxNaC4MHC4Miou61gzo9dGQ0tOsFjRq2oDpVIhCe+yKatcMaNDZqaA96mpAI6c9YhUGitBd7ZoBeYJqUfdNw6vzpqGGWJ16A7WsGdCg8x919FQNmErTqSBGQBbh93Jh4ECbwPsffJS88+4HddG7730g6bz73odR/2rojrvudWEwGk1ZQK4XjVpArheNWkCuB41eQK4HLgwcH3z4UfLhR3+om24cMTJZYYWVkqHDro/6V0Pk6V//ah1Fm42CC4M64MKgOrgwcOQFpm5R/c40sCMfuDCoAy4MqoMLA0deYP2R2wp1g4SjfrgwqAMuDKqDCwNHXmDdbaqppurYzuyoH2MJg1NOPzu5btiNVdHQ4TcJxfzyovP6XhLl10PDJN8jon6V0vkXXhrlV0tDrx8h+Yn5VUuDrxue9O5zQdSvEVRN3i+4yHtxjnzAVur55ptPdu858sFYwqAWoOtj0003TV2tgznmmCM57LBDU1fnYp999mnJay8BZz1mn3321OVwNAdsXeeCKj3f46gfdQsDFEytttpqqat1MNtssyVnnlmMabG99tormXPOOVNXa4EzCGuuuWbqcjiaA84OMTJg7cCRD+oWBhxc2nDDDVNX62DttddOHnvssdTVueDQ1sYbb5y6Wguo6iiKUG0ncKCTcyWceXEalxAG3BvOIUi+VSyMUzahucAeGgZ1CwNOxa6//vqpq3Ww0047jaPHqLOAPqcll1wydbUWVl999bFUkDjyAWpTUPHByW+IA5pqr5camVYz80mDxneK+YXU2d/PxsmKX2+6tZBF3cKAubt55pkndbUOdt99d5niKAJQ78AaRitirrnmGktfkyMfoCKEE92YpQjlhjF+o6jZz3OqjSr5T6HGgLqFAQ9lIafVwCUY6EcqAlCbsdBCC6Wu1sK00047jvoQR/1AxQdqRlDyiPoRS8rDtHbrn+WXRWH4MK6aoT1GGj8k6x/aK/G3FIazVEkYKAwXC2v9rRmSja/2MGzotjzrF9pDCsOEFIYL7epGx5tFLq04wiCcfyo6tttuO1FJXQQwzbLCCiukrtbC9NNPn9oceQJhgJClwlpCj1fIC6mSMFAtaVWadrWUlW4tz2tUHuulWr5lpd+l1rQschEGM844o+i1byWwnRNtoUUAGkuZbmlFTD311KnNkScQBmi3/eabb0oSGnOtafkhr1KKxaslraw81JqvUpRnmrW+a4yXR1qaTh5pKQ9qiDBg8RP99q0E9sdzV0ERgNbXQw45JHW1DlBTTUfAkT8QBp999pmoHFfie8fs1h3yleCX8ou5q0kL9e/qh4l6dusO7ZYYAUHWbZ+hfjbtrLRCsvkgHfJpn6X5thRLW/OkbuwIaxumVFphmpZfyq8SvqVq4jREGGy55ZYtdxKwd+/eHZfcdDbQ+c9Zg1bDyJEjW/KMSSsAYcA9HWyhbAXi8iY2k2Cn97nrrrtKA6QNpjbKmDSsEGE5SdyrVy+5LIuw8Pfee+/koIMOEjejdy7Ax054zKWWWqrDXYp4FvWcS5iwn3baabINmjRxc1HXcccdJ9vjSY/8ar4g5dHIc/8Hd4Kw2YPtrMTncihG9YTl3bjYn4umNH7RiWkji1yEAbdj8fNbCdyYZi/F7kxQyNii2WrgKs8TTzwxdTnyBMKARoydWggFiMYWUndIMb+sOKXSgapJi1vtKAfqpoHnoirCcn8HN/89/fTT0kZw4dMuu+yS7LHHHnJrHmt3NKAjRoyQrd40qPPOO69M4xJ/1KhRyQEHHCDrapo2lyjps2Jkv9mzzz4r3xE7QoB0uPCJ9UKeDR8tClxGhRDCjiZUnt2jRw/ZdUh+uRjqqKOOktsCae+IxztxPSx2wpM27407JPJUzb+MUVacculkxWmIMOCimkUWWSR1tQZ69uwpt7YVAUxXUfBaDdtvv31hdmS1GxAGNDQ0ZNyJrYRbeTF7SKGfppMVp5xfjM8UMYcPY36ffPJJsvTSS0sjyWVYjILpTdMQ0/NefPHFpWHi8iamSpka22STTYRH3MMOO0yu1KV+0JiGeSAMYS3PkobHZPaCA7LrrbeejArozBCfWx1p2DlJz/O32GILEW500mjb9ttvP7nZEd4555wjW9LDtDGZaaBNwZ5FhFWyPOun/JBCPw2fFaecX0OEAYlOMcUUqas1wDWc3O5VBPD91llnndTVOphsssmkQjvyB8KARpZedzWENs8YP0bVhFWKxaG3zyn6kK/E3QPslqPhRRAwCqbhZNRAbxuVEscff7zcRc40DFOP+JMuHQ6mZHbbbTfpvYfP58Am6XHjHu5S70SjzzMRsgiWVVZZRW42pCNL3maYYQYZoTANheBg9oDpL0Ym5I+rccn/scceG02fkQHvGvOrhRr9LxsiDDjePPHEE8vcWquABeSiqL+lQDI0biUw1Jxooolkz7IjfyAMOH/CFEqzCSEU42cRvXoa9ax4NPJ0vLDTwNN75u5x7IwUEBSMAJi6QQgwikBA0Pjed9998h0w6anbdIl76KGHyp3lPDvr+eqHtgRIvyuj2nXXXVeEBHnZYYcd5Nn40bg+9dRTMoJgeou8MXpgFE/eYs/inRj9hPzOpKxvAuV+6EzB4aOizMFXAuYsKaRFAJWj1TS/UjkmmGACUQngyB8IAzoJNFKYas+D8kxLiUVghELIp6e97777
yjz8W2+9NY4/ROMc86skn6w7VBIuFoZnKtHoMx3EeoTmBWHDqACBgz2Mb4nGlWmnF154IeqvRD6y8lvJe4RUSxylhgkDpgxaaTGRaZmizHczZcXiVSuBofVyyy2Xuhx5A2FA40rvGMKuFLotnzlu9bdmyFPCbePE+ErWz5qWH3u+9VeyvCwzKy3sSsqzYWImaWk4S5WkoWTDxYgw9jmxNJTCMCHBj70/pqUwjjU1L7jDtKCGCQMOHy266KKpq9igN7vNNtvI0LMIuPnmm5OtttoqdRUfKAdjVMD2P0djgDCg18s8eSmiJ1oNv0jUynlvJjXie5Bmw4QBZw2mnHJK0bTYCmB+sCjTRMw1br755qmr+KBXgQoSRjSOxgBhQIXlQCLbIyHspSgMp+5SvNA/DGftNpz1C3mhu9J4MV4pu7pj/llxrH+psOX8Q3csrFIYPxZG/aw95i7Fs36WbBh1s/YRrvflJgxYxKGBYJW6FYAwKMpuovvvv7+l7jO46qqrZGTArgxHY4ACSCoti5YhUZHz4EOl/KqlZqaFP/PljJ5i/pZqyVdnf5esOKXSqiYOayR///vf09I2BrkJA060tdI1dJw8LMotSczjIZxaBYxiUFDHLjJHY8C35YAVozC2WzqNTTRmtDdM9zIfHgvjFCfaG3YDcheCRW7CAK2lLCKzVasVsO222xZm9xMFe8cdd0xdxcdMM82UrLzyyi2nqbaVwLfVy1ucxiXUezM67d+/v9wIFwvjlE0NuenMgkVQ7iVtBbB3uChz3gzbWuUe4e+//z4Zf/zxk0svvTTlOBzNB1McE044YXL22WenHEe9yFUYXH311TJ046hz0cGeYua+iwCGbq1yIT47sPjHHGZxODoLXM6CMODksyMf5CoMUE1AQ8HumKKDuUYWvYsALuZvlZEBu8YY/fkUkaMzgTBgmgjFcY58kKswoIGYZZZZpNdddKAoqygqrJ944gm5hrMVwCU8qPJwODoTbL1lZMDpYEc+yFUYACR1K2jgZH2jKIfOULaFcCo6WLRD7QhaDx2OzgTnmRAG7Ap05IPchQHKpFhgLPrhs5122qkw01mMUNCBUnSg5GvyySf3KSJHp4NzGLPOOqvoA3Lkg9yFAdu8uAaT3m6RwYUbRdFNxOEZ1jCKjo022kjU+joctWLA1UOSgdfUT1cNvDY54qjjknMvuDgZOOi6aJhq6IdAnXNXRO7CADBNhEraIoOdT+gfLwLYfcV1fEVCqI2U0cDcc8/dMocKHcXEZf0Hprbi4IUXX0m++fa71NV10RBhoBdT6HQC9xzQ+BYJ5Kco6ijYpslVfEUBp1/ZCLD11lt3XF7DQRWO/mM6HLWiiMLg+RdedmEwGg0RBgD95Rym2nnnnZOZZ565cIdDOGNQlPMQ3LrE/QBFwoILLihaaGebbTY5A8GJbVc/4agXLgyKi1yFAVrwUFTHtXEIgLXWWkvuRkWPzcILL5z89ttvacjOx9ChQ2URqgj48ssvC3eIi4vAp5pqqmSxxRaTe2rRnTTPPPMke+65p2g+dDhqgQuD4iI3YYDyMnQTceiMS0/mmGMOEQhLLLGEbAHjjuR//etfaejOB3eVsthdBPzwww9yqXeRwHoKKsn5d+wO479uttlmstCNXiIuFS+ScHe0BlwYFBe5CQMuruZkKo1Ht27d5E5k7silIcHs3r17GrL5YHfTNNNMI9fvoSqaW7rYlkbvl6mQZZddtmm7eVirmG666ZJNNtlEDm8xN4/iN84Z0MDynVZaaaVxLp5oNng+ZwpUkGMyumNrKSOGL774Ig3pcFQOFwbFRa7TRCzIMiJAGHBUHKKhY8SAsOgscMKXee8ZZ5xRpjw4cMaJ3549e0ovlwa4WXP2nJzkVjjm5Oeff34RRlwXesopp8jZBxrgoqyvoHoCIcB/nGSSSZJJJ51UhEGRFrsdrQUVBrpbDVNVKauWVsD6FFO5Fnp2iY0MgDC6jqXbxBnt27QhRrA6K8Edx4AL7xUuDMYg9wXkr776Si53RwBoQ8LooLPvR0a1Nlsjme4gT5g0xphLL710UxeThwwZIo0+30jzgtCkx42QKIpGUA7CkT+IkRRCzAWBox6oMOA0+0UXXSR3Nlx55ZXS8F922WXJtddeK5cmccCR9UYFZ3EIj04i1q8Iw1mm4cOHS2Pfu3dvCcdo/5prrhHlj2yDZlcjl/WDUaNGJeeee67YBw8e3CF4XBiMQe7CQMGcM9ND2thxm1dngh45IwPyRKOm01iMYtC302wgMHk2PW0IgYmJgCjKFAxrGeRLhfrtt9+e+jgctUGFARfSXHjhhdKTpyGns3jBBRck9957rzT8CAZ7Kp/t6ggNyuT555+fjBw5Uhp0rq6lbjMF/M4778ho/+KLLxaVKZdffrl08hh1Ay50YVccQPjoaMGFwRg0TBgAfhJz4UwxFEE9BRdB6zw4jS5CgEXS119/PQ3RPPz4448yPUVeEJYstCOcinb4bLXVVpMRFT04h6NeVLNmwBQPGn2Z5q125x9TT5dccok0+Kx/he0PV0EqXBiMQYcw4IAY828MnfiQpUy1W74lG5Z0mRbhp8T8bfzQrbyYqXbLD+3qVuL9GCZqbxdisRaE4cO4oTvk2/ihGZKGfe+99zryQc+baSKGvOqvYWOmUuhWnpINEzPVbvlq53udc845HTdJqZ8l5WOG8dWu7qLs3nJ0HqpdQKb90MOrjYILgzHoEAZUVOb7aYyQokq4LU/dIT8k62/DKy8WxpphGOsfhsniW7f6I5Q23HDDjh45DZWNY8OrqWT9snhZ7qwwCErygoDiIJyGCcNZdyleKXfI0zSUbDgbxppqD+NYt+Vb/19++SUtbY6uimqFQTPgwmAMxhIGzLMxtQNxrZzaQ8JPSd3WtOGy3FlxwzjKC/mWp/ZYGOtWHsTUxw033NDh1rChGcaLuUPThrFu5SnhpqFkQYzpGBpM5cfihhSGDeNUkhZ+Suq2pg1Xyj/mZ8MgCJjvdXRtuDAoLsYSBiy2cIqY+Wzm2SDs6rb2UlQqTiV2Sza+2q3bhlWK8W083pFeuA2XlRakfqFp/S3PusOwStafHQ+ffPLJOGHUP0xLzSyy/qHdpmH9siiMX4lfVrouDBwuDIqLcYQBF56z7UsJt+WF/jHSMBo3K041aam92jjqDuNWkg5USVrlSMNWE0cpfFYlaWSFqSetMG5IlaTlwsBxzvkXJ4+Mejx5+JHH6qYnnnomGfXYk8lDEb9qaMjQG5Ovv/k2zWHXxTjCAA2jrB2UozCcuiuJnxU3dFeTVj1xQn6WO0ZhWpXEUcqK24i0KqFa4oZhstyYLgwc30mn4odcaOGFuydnnd0n+X50uYr5V0N6UK0rY5w1g7/85S+itjjLRKkadiVuNgvDWFMJN/vnbXx4n332mZjw8bd8tn+pW+NYtzU5QUwa2Gl8ND3c+lzsGsfGt3YbTu1PPvlk8vnnn3eEDeNYO40eW1iVb/OBXQm35kvdapJ//DA1Dfjsk+b0JXb7TGu3JtvyyA9u+01wY8aIMPjbtJSPqW4O8Gg
4KBbP2jFdGDjyBNvD9RCZo36MMzJgP7kSytOs+6mnnkrWXHNNqdw01qhePv3008Xv22+/Tb755htpdLAThsoPEZbDIZwC3m233cTN4iK6eDhkQkPCDh+0nRKP9DhhePzxx0tYmwcl+BB5pKFG1w9TEbwDi8OrrLJKctBBB0nau+66q6R/3XXXdTSKTGto2sSDz1oCSvY4EIPeIPb8k59DDjlEGuKs56sdPu/LKV3saCLle3GGgEN4LBKfeuqpogKDxWsO2qy66qrJAgssIHngUBzCFZUeJ5xwgrwDaeB+++23hfr27dvxX/T5miflqZ3b3HhPtrCiN4r0UMPBO+24446ieG6vvfaSk56HH364VCzy+uqrr8p/5V34l+uvv77kk+9IXGifffaRb8LzaehR8wFfv0WYL8iFgSNPoEHg5ptvTl2OejGOMGAhU4nGHlI7F8ij7VN50IcffiiNwjHHHCO3mw0YMEBOA3L6j6Ph6ARCrTXqrGkkrrjiCunF08CiQI4LrWkoaCQ5dfjWW2+NlbbaSxH5ocHDpBGicYOPuuVHHnlEdCYhMBA+3LOAygfyQ8NK73mZZZYRIYDwoJHjXlVOP3KknTTRYxI+U8l+C+yQfkdOSfbv318aSfLH8XgEDY0uozBOYFKY77jjDol30003ycI2R/L5pugqIt8o19O0+ZaY+kx9bui2YaEjjjhC0kQAIHhIk2/FaU8IwYGQ5D/yvnwDhBTfD6FFWq+88oqoyiA93sc+j1OfpBfmwbpdGDjyBPWDzqQjH4wzTUSPPIu4QJ4efswPQvMnPU8aE9wcWCIObnrmNAo0dKg1oOFBeNBI0IjQ86S3igChMQ3TpvHhpHDID4l3oEHjUh2OnJNf9JzwHBpCGuPTTjtNBBNK4sgXDS6CjvB77LGHCAUEFtNUsWcgJGL8kHjWs88+K4KPdyZ9ppAYcdDjp4Elnxyr5xuQ97XXXluEI3YEE/lCQMbS57tW8k34njT29OwZkfBc8oHg4XuTzwceeEAqF8KZdFE/Tn6JC49RCSMG/mn4f3Aj2MqVHxcGjjxBefZpovwwjjCg0Q2JxgGiQaMnmxWGXjC9cuwokEJTKbpC6F3SAEOopyB8nz59knXWWUeOjKNYiukj4jHdQKMTpk3PFkVVlh8j9JMwAkAfifaMmeagEaNRo/FkJEBDi5veO9c76jTS6quvnrzwwgsysqFHHKbPN0KvEb3s0M8So5ozzjhDLpBHuNCo77fffvIM3hVdTZwvOPbYY+U7kM8RI0bIiWg0MPLO5AMV16QVewb5R59LzM8SPX+EDN+fb8L35TtTkfiujJzIJ1NZ5JURDFNde++9t3xzBBbTVmgxRSjEnkE+SDvmp8SUlcORF6hPtDmOfDCOMKCy0xO1phK9bBpIGtzQn8aORpXeNHzC0PDSEGFHIDBtww/kpiz21isRHpN01LTp08jQYGpY5asZ2jVd5TMFRSN35JFHyrCSho+Glt65xkEoMU9un2HzommxboKiLPs8DW/dmLw/Da/yNW014fF9ND6kbo2j3xqy4cg7Qo1eufJsmDAO6dr3wc3ogEYaf/x4ls2bxoV4F4SSujUdiHiMMPR7hv4Q6bowcOQJOpuMqh35YCxhwDQOlT5GTCVA7KyJ+TP3znoCDUzMH6JBgEgn5l8JhXHVXSpNbQgxacyZo0cY2Ljw6BkTJowfUixMqecXjbLyWss7aJzHH398HD9LhHNh4MgT3PvB9KQjH4wjDJgPbzdiOyYU86uF8kyrqxDfzIVBdVAlbU7jEthiiy1Et1fM36k8hRhHGNBjZnsjJsSUjvIsX+3Kt/5qWn91Z/GUb/3DMNbf8ipxW35or4Rn3ZZv7TactVuK8S1P7TZM6M8/UXsYJoun/Jh/yLf+WXalGM/yLbkwqBxoeUVXFYoVVeOv1fyr/Cwibix+aLdkw0M2XJZfmJa6w+eH/jHKSguKpUWbhWZdddswWfkN7ZZiz1Cyfhof09rVHyI8FPO3vBg/tGtaMT9rL0UazqYFz2IcYcCcP/T000932EMq5VctdXZaWXHyTKsWata71BIni8o9x4VB5UB/P+deMC2h8C/kKT/0U3dWnCwqlVY1pOlUm1bMPyutUpQVXnm1pJUVJ8bPiqPuWJwsvqaTFSeLYnHUjb4wiw5hwLCBg1dcJMHcuSWulwt5Ts2jPL9/Z/xLnsnIgIOGjsqAplcO/HHNI4RggKzbmllk41nTxrP2cjzlh2YpykqrEp4l/DWMNWPxwnAhL4uvvBhlxSkXL0ZZaSkvRrFnhTy1x3ghhVu9xxIGDE2JSCAnp7wILaY0bq7/pXLwveicxb6nk1NeZNEhDBwOR3GAMEDNh6p5wcyicv6QhsFUCsOofxYvy6yE8khDqZa0Qr8sd6k0lMKwWWnFeOXCxuKGZNMqFT7mZ+O6MHA4WgAIA07Jc1I/Rqo8sBQ/DJPlDvkoN7RuKAwbS0v1flmedeNP2pWmRaNleTZMSBonFk7zZf1KpR2mpfkOw1h3Ob4lG4a0NX8h2XBhulnukB8jDePCwOFoASAMOOWOqpRGEIJG7fY56K1Cv5i6Ldk4ECMX1K2gews3JqfcOViKHbUqNjzbi1Fxgj1MKyTiox+MTQe4UdcC8UzcpE16+uxSZPPBc2kMOXiqeSiXF87IDBo0SJ4F4eYgJXbyRPqYpAPxPTVN+Goq4ad8vh+aHfRZ4TerhMrlP0bEYfrWwoWBw1FAIAzQCEDDAtF4lDLVbt2Wn2VeffXVyaOPPtoRloYSXV40SjRSaA5gEwAaBjhpzolfVfyImhfyiBZeercogUT9DAdQSRtVEahT0QaSBhSVK/qsUvni+aisoQdLfPQQcWB0rrnmEtUtaDsgPfRp8TzOsqB3i3wRF5UuNLRsXujRo4coWVxhhRUkDdJDaIXPVcKthJtT+qjhwY0yTfSWcb6BLdSkQ3roWeOEPup20MOFhmbystFGG0k81N9wWBVllIRHUST5JN98W5sP0iCuui2FPHVjKoXhLN+SCwOHowWAMKBxobGthlBPUglPaeDAgdK4WR5bzIkDH2FBA0ijhXCgZ85hL070o52Yhhn9XjS+NLwoiaTnTOODugh2kdFY27StXZ+lPEukjUkYVNnQIKNxGP1e5E2FAudutt12W9F+gMJLRibo/kJFChoTUD6JinsabBQy2rSVbL6gME/qzzPQH4ZeMBp1lE6ioHOxxRYTfWIInRVXXFEaetLg+WhERqDwTNRuIxRQ2Knp2mcjSBAGNNax75L1rZSy4oR83C4MHI4WAMKABgS9XI2kK6+8UnqmMT8aDZQT0iun8UDnFJp/OTdCr5yRAQKLxpcpGxr+Xr16SaOITq5DDz1UpoVoHIkfpk+DzWiCNEI/S/gjDOiJY0cQoASS74NSSvKDgkzC0ijTOKMCBZ1kOjJgtAPddttt46TPe9K40xCHfpbwZ7TCCIk8MHWEPjOEAvef8H3QcYZaHoQP6dLQqyZm0mC0xBRV1r/l2yHYYt8rb3Jh4HC0ABAGNDpKKAGEYnbLK8WP+aGdl4YNt5KGpe
FF/w9nj3AzMkDLLSrZafhoXLmwqXfv3tK4oYEXFfbcecFogsYYPkoiacht+jR6TNvQq8etfPt8dTMqIQ0EDekRFzcNLnmi983dJKwx0DOHyCMCA51j8BFOaOmlsdZ09VmE5z1I2/JtOOWh9h118CjspMHGpJfPd2G6DaGEEDzppJMkn9zfQuNPPonPd2M0gV/4HEz4jHZorC3fhlV3LL71szzLV3vmoTOHw1EcIAyYTlAFi6rkMW9iiofpE3Xrs2gs6AWvscYa0pCrPyMCCDuNMOsJ9P6tH0SjBylf7Uq4mW+nYbL8LNJ0sROXRp4pHxpZLlai92+1IbNegIp4BBXvovHCfECsi/AuIT9GxCc9TUfTtO+Mv+bV5hvi+2rckOAjsBht2Di1Uqlyg58LA4ejBYAwYO6dHm0pygpTSVylUmGrSadayjNtTcumWWn6WeFqyV8tcSwRP0yj3jRjxGjIhYHD0QJAGDA9woIl8/EhwbeU5Wd5sTDWzQggFi40Y2EsT90xnroh+zxrxuyWYnzrrjSO8kN/Gy4rThZfTd4t9MuKY/3CMKG/5YXhrD3rX1pyYeBwtAAQBizIMvURElMgWXzrF4aLxQvjhH4xe8wd42WFyTOt0F1JuJCsv9rDtGJplOOpPQwXixcSYWJplaIwTuhn7RCqhyxcGDgcBQSaJam47IvPIua5S/Gtf1ZYpVJpKVmeNUOK+WO3lBXW+mXxsasbbbjKt/7WDCnGLxc2K06ptKyfDRvyQ56lrDhZ4aFyaakdZXUWLgwcjgICpZHsS2e4zwjBaVziYBcLxEx5xPyd4sT3YssrdxpYuDBwOAoKtLwiFJzGJe5fYcsm97JzwU0sjFOc+F4xDcIuDBwOR0uCUQFnJBz5wIWBw+FoSaD6goNejnzgwsDhcDQNQ4bekBudclrv5JLLr4r6VUs//jT2zpquCBcGDoejabj8iqtTW3Hw0suvJd98+13q6rpwYeBwOJqGy/oPTG3FwfMvvOzCYDRcGDgcjqbBhUFx4cLA4XA0DS4MigsXBg6Ho2lwYVBcuDBwOBxNgwuD4sKFgcPhaBqqEQbctxwDev9DcGtaFjh1q6oXuNkNcHmMwoXBGLgwcDgcTYMKg2+++UZuBePKyLvuuivhXuNzzjknGTBggFwXydWR3CKmQGkfl8mjdplL8NGtw01rXD3522+/Jccff7yE23zzzeVqSVQ2X3PNNaK3n1vHAAra+vXrJ3Yu5EFIABcGY+DCwOFwNA0qDFDCd+ONN4qeHC7d54pMrojkik0UqdHQc4WmAj6njb/66iu5zhJtpdxnjIkwOPLII+Wyfq6cRAgweiA97i0+5ZRTJA1GA1zSD4YPH94xWnBhMAYuDBwOR9NQ7ZoBF9tzRaP24isF4bmv+O9//7uMPv75z3+mPmNw7733pjYXBgoXBg6Ho2moRhj83//9n/T6IbSUNgouDMbAhYHD4Wgaqh0ZNAMuDMbAhYHD4WgaXBgUFy4MHA5H0+DCoLhwYeBwOJqGCy6+LHn33Q+St95+t256+533ovxq6bY77nFhMBouDBwOR9Pwx0//lHz66Z9zoUceGZV8+qe4XzVEnv797+p2K7UjXBg4HI6WhN9yli9cGDgcjpbEn//859TmyAMuDBwOR+Gx1157JfPMM4+oothqq62SjTfeOFlxxRWTs88+O9lzzz3lZPHRRx+dhnbUAhcGDoej8LjhhhuSRRddNJlpppmStdZaK1l11VVFkR2njHv06JHMOeecyZNPPpmGdtQCFwYOh6MlsNhiiyXzzjtvMt544yUTTDBBMv744ydzzTVXMtFEEyWLL7548qc//SkN6agFLgwcDkdLAD1Dk08+uQiCKaecMunWrVsy4YQTCm+WWWZJQzlqhQsDh8PRMkB76UILLSRCYIoppkhmnXXWZOqppxZldo764MLA4XC0FM4//3yZKkIgYLKG4KgfLgwcDkdL4R//+Eey7LLLijBg3YB7DBz1w4WBw+FoOSAQZp999mTgwOLpOmpVuDBwOBwtiVtvvTW1OfKACwOHw+FwuDBwOIoIbvni6sZ///vfTk65EuUKCm+Pc2HgcBQQXNbOvnru7oW4+lHtliy/kjAxd0j4lwtTKdWSVlaccmnVEieLstIKeZVQGC8rnUrSrzYtzKwwv/zyS1raxsCFgcNRQLBA+sMPP4gJIRigmN2GUbvlhWHDONZt+aE7jGfN0K5uJesfCxeasTCWH5o2jLpDXsy0/payeMoPzSxelrtcvDCO8sKwIU/tlkKeuilfFi4MHI4Cgsr6zTffJD/99JOTU0PIhYHD0QJAGHzxxRfJd999l3z//fdiNpryfE6rpvXtt9+Ow8v7+bFn5E1ZebZ8FwYORwsAYfCXv/wl+eqrr5Kvv/56HLL80K7uMG4snFIsjHVnmWpXUl4YxrpjZmi3bmsqxdyWbJjQDMnysYcUhgnd5eyhGVKp+FlxY24bPiuMNV0YOBwtAIQBl7cgEDhhGyMqNaMHdT/22GPJBx98UDKO0pdffilxMdWN3h90/ODWtDWtUaNGdcQtR6R10003deSNhge7uvHX52scfY7NO3YNR37gYR8wYIDwNVxINo2HH35YtJli1zxoWqGpz1LSZ2BaP+XzvT/++GOxWyJMyGM0MGLEiI40eCakYfUZatfnEOb+++8fK00Nq+GhYcOGdYTRuOqXRT/++GNa2sbAhYHDUUAgDD755JPks88+E0IwKClv1113TW688cbk888/FzencV9//XWxv/nmm8kf//jH5K233pIG68MPP5SG/v3335eGgAtiTj755OSUU06RxpPG5LrrrkueeeYZCbP66quLv6Y9aNCgsfKg+YjxaIiOPPJISRchMv/88yc777xz0qdPn+Tll18W+2677Za89tpryTvvvCP5/OijjyQuz3v77bfl3ck7dxiQ7lJLLSVhSZO8ETb2fEj9oP79+8v7w+/Xr5+80zLLLCN+yy+/vHwvFN/hXnPNNZM99tgjOeuss5I777xT7kx46qmn5DtD1157rdCmm24q+eQuBfIYPjPmZnqGb0K8hx56KFlyySXlHgbyxL/igp7dd99d7mTYaKON5EDdgQcemAwfPlwu8EGg8f94F/K93377Jccdd1zHM7j0h+dAjz/+uHwjhII+35pqd2HgcLQAEAZ/+MMfpKGM0aeffioNOj3OmN8qq6ySvPrqq8l2220nDfw+++yTXHjhhckaa6yRPPfcc8kFF1wgDQyNHA3aDjvsIDeFPfjggxK/Z8+e0hjR+ITpV0M8g7sG3n33XWlEd9llF7HDR7/QIYccInmkYbz77ruTbbfdNjn++OPl3oLnn39e+OR1yy237Pge5C98TjkizgknnJBcccUVyQsvvCACcbPNNhMhR6NPw4kwJS8IIhpLbk/be++9RQA9/fTTyeWXX54MHTpUGtvYMyol3p1n07jzvGeffVbyxggAP/4TwuGee+5JXnrpJckzl/sceuihkjeEKflFGJBP0rTfBIHDfybfyouRCwOHowWAMKCHTo+ZXj2mJRpGGkgahZg/j
TiNBz1wGhAaYBp+BAHC4fTTT5cGgcb3vffeS66++mrpoRKXMPAJf/3114+TNsQUCY1ZzM8Sz5577rmlQSM8DRnCh8aLaytp+Ojx8+yjjjoq2WSTTaQx22abbeQdjznmmOSAAw5Ijj32WHmHMH3yft9990nY0A+y3+aWW26RRp0eOHwaYvK03nrrSW+5b9++ySOPPJJcdNFFEp7dXHwn7Hwret9cvxk+i7TgcUF/LI+WCMs3WW211WTUwnvfe++9MkriWzPyOPfcc0Vg8a0QiIxAEMx8cysM+C64w2fwbREGmKGfJRcGDkcLAGFAw0JjFyMExeabby4NZ5Y/qp2xs45Aj5zGbIUVVpCGi8Zlxx13TC699FJpnJmaoHGkN7rFFltIL3jkyJEy3RCmTXrdu3eXkUfoF9Jtt92WLLzwwsnFF18s8V588UXJN89+5ZVXpAFmagYhgBBiKokRz9prry1TTDyfeLiZurFp847XXHONvAt26xcSaejUDKMe3pnpFqZleBemppZYYgkRFEwT0cgyFbPuuuvK1BZp8K3IL2mF6dPwcrcC6YZ+lsgn35W8ICBxM3XFSIHRAIKaxr53794yQuIb8b34Z0svvXRy8803J8stt5zkk28aewYdAYQNcWL+EGWL7aUWLgwcjgICYUDDokRjFRK9Z+aUQz4NDI0WDQ4VHx69byXc9MiZr6bnTcOgxLMwNRx2TRfCHyFw0EEHdfhl5VF5Nj16wieeeGLSq1cvWadgGoaGjzxrPKaq1llnHekJ6zPCfECkecQRR0gvOvSLkb5/+G7WrXbyrmH1PdQvRgg51l9ifkr2O9lnYarbPlfjaT6tqWE1jKUHHnhAhL79piER14WBw9ECQBgwdcACp5pMaWAqaSNieYQhvE4FWD/IpqUUhgkpjKN25avbhg0p5CMUiEMvO8wnjRU8GizLj5GmY/MVhoH0+ZiQDRfGCdMK827TsnzNi+UphXGy0raUlS8lm5b147shCNSdRS4MHI4WAMKAHjhTFEwVQNhjZP2ywilf0yoXLrRbCvlZccqlBS/GV8ryyzMO/HJxQv8sfhbP8jFtmNCe5WdJ+WH4kEqlxcK0CwOHowWAMGDqgSkQJRYT1VSyfjFe6K/ukB+GtWbMbt2WH5o2nLpjvFJhwnCVhImFC81YmHK8SsIoL3SH4cq5s8JYviUNY/1jcTTMzz//nJa2MXBh4HAUEAgD3fnj5JQ3savKhYHD0QJAxTC9OPaes/XQElspQ14tfKiWOFlUS5wsyvP5zXqXZjy/VFqVxsHN2QZXYe1wtAD++9//ypwu6gis+gEnp3qJMkXZ+s9//pOWtjFwYeBwOBwOFwYOh8PhcGHgcDgcjtFwYeBwOBwOFwYOh8PhcGHgcDgcjtFwYeBwOBwOFwYOh8PhcDgcDofDBwYOh8PhcDgcDodjNHxg4HA4HA6Hw+FwOHxg4HA4HA6Hw+FwODppYPDHP/4xueGGG5Jrr702GTJkSIf5l7/8JQ3RtYBO8R9//FH0i//f//1fyk3EzuUT3377bfKvf/0r5eYD0iNdbtNytAcoL3//+9+T7777Lvn3v/+dcosDLu4fPHiw1HWlO+64I/n111/TEA6Hw+FwODoTnTIwGD58eDLJJJMke+21V/Loo4/KVZxcyUnnuCvi/fffT1ZcccVk3XXXTX777beUm0jn7owzzkhmmWWW5Lbbbku5+YCbFKeddtrkqKOOSjmOVgdlhzo1xRRTJLfeemvKLQ7++te/Jk8++aTcwz5ixIhkueWWSxZccMEuOyHgcFSCq6++OplpppmS2WabLVlttdWEVlllleS9995LQxQfr7zySrLOOuskJ554okxe5Im99947WX311XNP19F5eOSRR5INN9wwOeWUUwo3cUS922effZJVV11V6uJ8882XjDfeeMmwYcPSEK2PThkYsFrAwIBOr+0Id1UwMFh55ZWT9ddff5yBwZlnnikC4fbbb0+5+eCBBx5Ipp9++uToo49OOY52AGWGa/S5PbnI+Pjjj6W8L7TQQj4wcDhKgIHB7LPPLp2kVgW7BPr06ZPcdNNNucv8NdZYI5l66qllxd3RHnjnnXeS8847L7nlllty3y2RN/r37y8Dg6FDh6ac1kdLDQzYKvHWW28lI0eOTB588MHk+++/T30qA/HZznD33XfL6O7hhx9OPv/889S3MrDt58UXX5S4bNmI4YcffkieffbZ5MYbb5SC/eqrr5Yc9eY1MPjwww+Tm2++WVZgym0Ryntg8OmnnyZ33nlncs899ySfffZZyq0c/+///T9pDMhXVkeR7/Hyyy/L6skbb7wh7mrAf2EmgvLHignCqpoO9O+//568/fbb8i9eeumlimcyeMYf/vAHKXfMlD/xxBOyjasSfPPNNzLLTp6J/9FHH4213axWkMYnn3wiKwt33XWXfItq0uV/0blnK9C9995bdefeBwYOR2XQgcFJJ50k9c4xNnxg4OhMXHrppT4wyAOVDAyee+65ZPnll0922GGH5KyzzpItByzb0Dhef/31yeWXXy5LiHPMMUcy2WSTJaeddlp0KZH0e/funUw11VTJMsssk5x88snyfDqXpLHtttsm00wzTbLRRhtJp8uC/f277rprssgiiyRXXnmlLBnRQK+11lrJMccck/zpT39KQyYyYIFPWttvv31yySWXyDN4Vq9evZLFF188mXLKKZOLL75YBhcWtQ4M6GzyfSaddFLZhnTqqafKYIRnwGcLEts1Hn/88TTG/1DrwIB3ZtvTsssuK+83//zzy7sdfvjhyTXXXJMMGDAgOfDAA5OFF15Y/jH2L7/8Mo39P/Tt2zeZe+65k2OPPTbZdNNNZVsT32DfffeVbwkQgnTkeQbbY7beemspC/x/zC233DKZYYYZZEmPd4wJTfLLt6CM8H9OOOEEqcD8zz322COZeeaZ5b8wkxUDA4Gtttqqo4xQlng+sxmUHb49y/qEs6CTTYd+qaWWkjJzwAEHJNddd50MFPv16yfvPOGEEyZrr732OIPTL774Itlzzz3lmYQjPN+aPfmkQ3r4MVAIceSRR8p/D/+5zmrwnyiflAvqEtv6SJd4CyywgAhY8hkbJNCZ79GjRzLRRBMlSy+9tMQZNGhQctVVV8l/pn4sscQSsi1Q8561vO8DA4ejMlQyMIDPhAN1irZurrnmErk177zzSjuD/CSdrIkQJpLwp62deOKJJT522kZtN6y8U/z5z3+WNgyZwBaL4447TtoH4iAjFEzkzDjjjLLVMZaH1157Tdqlbt26Sbju3buLDME++eSTJzvttNM4bayi1oEBsop8Iltpj5AHtFvIBNpCno0/8o1JLyaGQvBetPEbb7yxTADyfuOPP77EQ7bad2XypWfPnvKOyA3+DTKQ782zTj/99OSrr75KQ48LtlpfdNFFyayzzir/lLjaZpPmzjvvnLz55ptp6HHBJNPZZ58t8SkjPH+e/8/eXYDrUlV9ALcDA1sUxQSVEJRSFAkVkVBAJARBCQHpEkG6u7uRFCkFBKSkBUEwABUbFRADERVFv/n4
dt8gWEYjBGUHmDKAYEfx1P59Bp26yPBF5C8HOf+9zUQerlJdJZbbXVEqMk+LeBRUSjEhJZS0pQDByFxjLtAyJMts1O+KCFU662k5Z8IJi2j2P33Xd/ysc/UsUA08W4Cdj1xwOEO7MUGADmXb5zNIqBemQN9jFgpBh2DYyEsCx94ep6pRhoL4xCP/A/BQpczVgI1pjAJz7xif9JA+66664koGIGFL7aSqBeCLHqkMWiBuu4WQvvIKDW8TEffRDjq/8xASw9mIuBlzBfQp/Qdyk/bYMKJs+foF2WjYJCaOfedoKHuv3MZz6T8qxdMbkMioF4FJhaMTBDYQDhTxmp69q3RLkURhuX6YLlRtoLUYzquqI4qEN9wqCBwYZiEAiMDoMoBsYGYwi+jB9R9lmlhfePHIYC37V02gQcwpvZSYLyxhtvnHgZXkLQZ0hhVDA+4RulcuA7JlAbYxg78Bj83Oy8mdjM9ygGeDle7F0lCOWMFXgoXukdBFSKCmGTQMrIhW8xjtQW5tEoBvk/St4x00wzpSO8GWEIjcYOY9sSSyyRlCPXeuzHA41zxoh11lmnWWCBBZKxiCJx4IEHPmmQU2cEWgKoTeQMKuQZY4D2IVcYCygkp59+eqtyYPUCo6R6IPOwkpuRl09trxzaWHvVvFudCW8s9h6GG/KDNpYPMwBkCe+v5Sx5MWOvjZVP+3gvodlMutUUxnljrDG8hDYkZEtbGeWVm7FUnZMByGbuy35FMTB7rc+bkShhlkJ7k1+kTYHQX/V36RqbyIsEfvVcQ93IN0Oj70V8bU4O1DfN5PhO5JlyIRz5o2vWLBSDMcKgigErME2aQF5a52tgLKyntMZaSJW+zswSTaBqg497o402Sp2QBloiKwY62QYbbPA/whBk5oCp+oC64EQJZbZUprTgj0QxwNwJ/d6JeXfB7ISPR0cv9x2MRjHwEWOUPk5KTheUy5IWH58PrAQGm631BPtf/OIXQz7/BaaFcfvo2qz5GcIZuDCZcjbGYCRt9cqS0wUMhVVAPZd1YRqYdYYlwfKZLlAEDQws7nk6GPMhtHNX9jZGDxgyKwWrewbmOu+886b0TGG2QZkJ35hnyYy4dykGGJyBwwxNF1jztJfleWV8fXzDDTdMA5uZoi4oN8ugdg3FIBAYPQZRDPA9/A/faAtjjDR++S7N+pXAG/BJCj8hvxTQMvAV4wjBsVwO6zsmKOJzxhKWYYJ6LZj2UgzwX2O88aAco0oQngm0+FItoI1GMcDbGeuMH+SNNhmBgc1Ms3eYIS+BLxL4zWrgu8ZUAmZtdLEkiaWdgbHNwAT2JBrv8OnauCgPlt6qJzMTbWD4MntAwakt/2QPBkRW9a7jP41xhGFGsHKZKmOr/sFPXbeBEqQOKJW57MY8Qjz3Lh4vnwRq/bY8faiXYsAAqC2MHdqsXDKboU/JrzHd91PCmGoGQHvIVznOZQjDSGa8068pjKEYjDOGoxiYCfCx9furoQ5LMWBpLT9uHxjhlWDYJZzRps1OYBKWnZTIS4loy6zTbTB1x5JBYOp1tJb4NH5LNcrOOhLFQL7kyXIaH1EXMGSaOsGXcpU/2tEqBj5M1qG22ZMS2teAwxpVWqrEs7zLh9fF6LQJKxLG2wvqT1izBgTsPCixsBiIDKrqoQv8MBvMr1xCwwqByRpQu6ZnIQvDlFiWJlDPLA/eT0jvYioUopNPPjkNvBlZMaDQdu3R0OYYvBmFUlntpxgQ+llsuqD/YnKWBYmTYWZEP2Tt6tfmmDJlmvIYikEgMDoMohgQsPJS264lD2b88BW8MvNIYyL+i2cQensJNtZhGy9YmzPP8R1TDIy9LL/GpTb0Ugyka+zDL7uALzPSUH5qI9JoFQNxpd0mYGbgmcY7hriS/2XFAK/TPm0z99LNs9K9/uGgTVi4GVTILGUdW25DwLZEuV6yWoKB0nvKMdPYJT3yAgG+VtpKiG+M0KYZyq7ezdp0GaoIx+QaxsqcP2OUNsfju5QRZTRbRGHKYycMMmNAUfLeNiijmQD9Mi9nA+7kIP1Ym/Q6wIRcRJHyLuUKxWCcMRzFgFank2eBtgsaTSPSdkurA6EfU6o/BsIkBmrNJWsEwZUQWFpuAaMj2LJmdEHedPDa0sKdMI+xE5YoDj4SluBSAB2uYoCZE2QxYtYf7+mizPgxQEwnK0djoRiw3nhHL2BKLBWs22VbayOMhsKibmqI52Nk4e61GTbDyTqYACUoMyZ1Kp8YST9oP3VSKo/aS/+Tdl2vJYHpS+8q289ATBjXrwzGGJ935DgZ+mbZPzGrbK0wE2YwIaSXecuo+3UvxUA/4tdLScozOab0y0GAkuU7kJ9+zE/9m3pVH6EYBAKjwyCKAWMC4YSAZQljzWPAd4vXsh5nfzOcvukuobYE45wlGgTkPOMtHUIkodUY1WXA61IM8C95xz/bZo0zLLU0G0H4qo1vo1EMWOgJwr1m+jMYf5STkJ5h3CBMMwox0rVZn80QGKsdflGuFGiDupAnckw2zBgPLA8zW9A125CBZ6tnMk0eG8g4+g9DXC/DJRiXjaOE8ixvGH/VPQs8A2st52SYSdK/skJjHDAjzmBHluslWFsyViq0vRQDS6W0d70ku4Sye7fZION3hjJRgLVjr1UoGVmGCMVgCkDlDaIYEKZ8DPXUZxtsMtH5MUbr9mqYRWB9lZaTACwdMj1IyzWNSAjT8a01K5FnDGiX/aAzEvhNG+q8lmwQ0MxiLLrook9uZmKdGI1i4KORtg/Ypl/3mEFN2V0YH4K1gHl95mgUA+vPWUhs3O4HA4L3ECrLJU8YEIs4paGNWWFwmBErRz9mCtYMUn4oK3nGyLSmj5rfcKGe8qZn+Wepqes3E6aHcQvLIpMHXcoNRZMCRAA2oMqfAcw6Tow/M9Ea+rAB20DKEvixj30s5YGSx3rfxYD6KQYYbe4DbbAm2JIBpw6Vs1rWveqf8tBLscjwfamPUAwCgdFhEMWAUIXfMWSw7jKmGe/wzl7fK96PDzOMsQxTFJzIUhNeQujH54xjee9AVgzM7NtT0PWuLsWgBkEP3/Q+ioKZEDO3ljHabGv9/lgrBmY38dR+oDwQ8OUlIysGxtcuAxTDjvyZaekSZDNyHctTXqHgyrDWy0LeCwxShFbjGAVL3ba1sba3WVodmzHOy3fVEx6t7cwYM74R1vW5XrMXwKDEOGapmSXU9jmQfcgyveqil2JgjFWfpcBfQ78kg5G3SmMvJcuyIGNkm5xYg5LNOKnuQjEYZ6i8QRUDH0gtLLeBMEjg1nFLywdh3Zo/m69YQk2l6myENZZ7ApQPmpJAsGybMcAIWQN6wUeFQUpT2j4GCgcLyworrJCER0c4crffYTSKgXcpD2VGhyTI9SIDhbKbZcja/mgUAxYTey7kuR8wThuLKXisGBkUA3ky4LVZijB/eddHlLcfMA/tSglSNsCIMJB6veUgoGiKTwHSZuqwrtdMlBv+SB8oBW/3+h/rhb6hj+lPmIi+wb0UwEuoF3tmTIli1piZWTF9WJ0aYGshv59igEaiGLDosfxQqrssR
iUooaEYBAKjxyCKge+J0Mcaah8Afmv8IWQSSBmr2paB4BfGTEKfcUl4xrKasrulNMbkzFOzYsAya3zpQj/FgIDJ2szohHcRTPEgVnaWbnk03o21YqCujKF5zOgFQqIx2vr0DOObmXh8rMuA6aQg+bN/ox+MO2aLCbTaBowfyl7v+xoUDGMUGns0tHHdtpm0sX5GATOWlfv19B31S0HRhurMrLIT8yhuZprbBH1KHsVIWPHUt3GfvKVezJC01f0gikHX8iTQL8l1ZnLKpcrkDnkxppcnRnWB8kCpYRgOxWCcMRzFgMXYGux+0LmkiYmUS1NoyzRtswEYjlNmTHnpyIQlHxqm5XxdQle93j0rBjbRdkEnxhyE0+lZa2zixMAoAKwAykPowux8fKNRDDAP+dFhrcuUrvJ0UfZnFfDBwGgUg7yUiPDXDxinvKsbDDwjKwYGgDbB2MdGkMZM5LsfMD/WHwJznjFwMoR89tqc3QV9wskaGLQ1sIT0ul5LUseorOMS2h9zxWgon2Yj7DcxjU1xLDf0lcBsbQTDxAjZmDPmrl4MltIqGfJ4KQbqwIAh3/2sRJBna0IxCARGh0EUgwzfvAM0nIBHgMVjLNc0Vpi1Nr6UVn1Cl3HPjK7T+wifvQh/xfezUOU7JvD6jnvtEeilGOCZlm3iZ5QZvIeRz7vMSlM8zLASXglgY60YeG/XzG0JioFymsnOyIoBRcw+wzbkpS+nVqsR2kCWyJvEs/KlvlitjUfl/slBwSBnrx7er40ZvNraFvGjmODJ9bhM/rDkR3kJylZnkOEoi4xVDttom93XRygZJ554YhqXKB2EaAomQxlZphbSB1EMyr0DNbyzTTGgPFumZfa+a89oCYqLfIZiMAWg8gZRDFjZTdHpiP1A+MYAaYNZ6NaRrQvM+xS6zrT1cbOEEiy7FAOCThdo9Dpw3kDTZVHVEc0mYCyjUQzk14YdVoRB1sm1YSwUA1p/P6hzZcZ8y4+/n2JAeHUkJ4s9wbsfCMhZyMwDn8FRPjGsfsAwpOGascMOO6Qp8nKmY1BgTPJRDsIZ/FhS1AemiGlj+rnfuPIXroa2pyiw4BvM64FyvBQDJ1YYRM3I9GN+8k7ZUfehGAQCo8NwFIMMfIfAbXmQ5RsUeksSKQiMYBkMEtLNxioCGeGzjfgZg3zfmYf4jikGlrX2soh3KQaMDE5tM0Y7DtT4h9/jXd7H36y/WeNPfepTyWI9loqBMZ/c0DabUkPejNEUloysGKhXxpM2GKPlz5KjNp5egvBp6TFFIx8Sgacbp7kPYiSrYTmP9IwJxoFB2zjvUaihzOQHwj7h3NIiCp08UuSk0QYGLGmra8K+2SHKHoMTY1cpC0pjPBQDy6McMkLY7zLGlbBP0Ey5Zb2hGIwzhqMYsNhqyH4fro+SkEHjzUzLphkdTqP10g51QkKyj6fW6vspBj4emjDG4GxhH00XbBySx9FuPgbWCcyUYOmD7oK6EJYVQFnyxz4WioGPtpxubIMj1NSrdijz308xAO1PMSgHsjYov4FPX1F/mflag6iO1HcvYVj9UyK0TfmTPJYqFn11nIX2NqhT08iYXLbyYKyWXLHC9Fq/ap2mdjdlnvu4gdM6SAN6F3Pmrq9jWuVRrOOlGOiv3A2krCi9oJ/bcM3KFYpBIDA69FMMfEu+9SzYt4FgbjkR5cB4hh8AnoMPWw5ZGkW6IB3vGCvFwBhNMCdY9uIrZmzHY8YAPzN2mw3oB8s+GePK9wyiGFjazPhjDFYPvWBcNMtj5UNeYqu8DGvc25bdlqBA4afGHmM8MFwSWh0IUR5D2gX1gvLYY3z13i7Dqr6g7exVYTzKR76rG7KFd7aNY/qO/qdfGMcY4oybMF6KgTJYSWBple+hH4zB3hWbj6cAVN6gioGGxUzqtf8lhCX8EUAtFcrwkRE8da6uPz6Cjicv3tN1XGkvxYBQSBCy2bdLgMQQ8t9gTfGOVjHAxA0Wlkj1WkPvwyTkWV9Y7r0YrWJA6FMWzNJH2AblYl0h/NX/lxhEMbB5zqCBeu0zMNAIo5zlCUaYgGlLgxarSRdY7tUxa1TJeKUlj/pGr41KLFx5Oj4rAcpuMx7FAoPq2mglj5QXM115GtZAR7hnpcB0upAVg/LHbeOlGOjn2hpD9V12lQdjtzlP3/DthmIQCIwOgygG1nFbf8+Y0Qv4LQNKPhrSOEAAM371W+pCQLXEx1iUhSTvHo1iYJknYRsvaxMeM1huCV7qYCwVA0tZjGW9VgQAXoon4uflEdKDKAZkD/lmGcdfe0EaZqmNm1kQ925jNXeCai+oJ8Y0exCzkTIrVZTCWsiu4cQq46BDJvL7Cfr2w+HTpQxSw2wBGYx8APqt/XGW4dRtVoJSZuxkDMszIuOlGOhj9jYYc+2P6CXI83Poh3eFYjAFoPIGVQwIDj5GFe+jqdc356MdNR6NuPTXuazX0wmsf6wFIkKMJRLWnNk81cYcB1lKxMqMwRBCfVg1LHMyjYgBWhrFel4KfCNRDICwrWwEOcyqFtAJ/IQ0A4FNaeXMwmgVAwIpRmdA8AHWlipCLkVIeTGV+qMaRDEA08zatj7JCQinBjgnaqgHymNdB2Zz5NWgad9H3QcwaszLZupagMdEMCFpW/5l70pbHWN8BhcDU/Z3JbCzSHl32w/S1Jk+rnzlSResKJQCfd/61DZmY08FZuobKvvSeCkGYIBjtdJP1UseODIob6aV9XNh9LuuZVjqJxSDQKA/+ikGQHjyzRkn8mxADYKiMQpPLnm+9fvGPssEax5bwj6AfBpdtnz7jkejGDC44AGWHnbtXcL/GN3wSTMLtZFvtIqBuAw7bX/GB2OCMY8cQsYoMYhiAPJmjCBwdxkPtYnlVITr+kerZv3xVQJ+188+1bHlvWYnSkNqFoYZdSgMXaf8GRfIUsZC4TPIFtbk4/29VghQ7vKRpmD8sURWvfVSOik7+q5Z/yxHjJdiAMrgOzA29zqiN88W6POhGEwBqLxBFAObj1U4pkBY0cFo0jYjazTCtClQjWfGoK3T0kZ99IRDS310WsKKNdo6Moao0TE7TIvWysqZO4HOJ4zppy4QkHzwPnxTgD4qafiYKQQ+KJ3QchX5tKFa3jFFH60PEuPAsOsNTAQ9HY5gXCsGPjyCGKVG3m0Ks3yG9chUos6vbigA9VIsigGBFrPoNRi0Qdo+ZO1jM5N3WP9oLb928UFicNy1T9vZy6zx9mQY8HopJsqszX2cZj3Ukzoys6M9TV1m4bKN4WHchG9MFfNmJcAozCzJpzzIJwWybVZCP7BPRR17V65j8TEoA5X4GGCtHHm3k6i0K2uNqVKKn74h//qauL6Fuu9ihtImXGP28uwUBvHth1FmVP8+HxPXvzD4sjzaXz9CvRQDsx/6sDy1KWwUZYoO65XBXF/X5k4koeQpj+81b6Crf6+fEYpBIDAYBlEMGMgsWyQU+v5q5QAfw398k4RL318GPsGQIK6xwphc8hRxfat4EX5ZCuZZMTAeDKIYMCaVioGllMYpioXjQMtZA2lbZ49nM4IwYr3tbW8b
c8XA2IAvShuPLQ1ohD5llz98tZ45zooBYb6XYqCcNn/LJ/5djsf4MYOi8Zg/I16tJBnbyCn8Xf24tWwjfUS6/LVhbbRh8CFDaGPjWV2HZIC8ogFf158yjMHaDs9nRKut/wysZDJta7wvxw1jnbYzZli6U489Vjs4HIYcUi4DzoqBdhmNYqC8tWKg3uxvIBMaY+WxrC91beaNv7yT/2KPwRTAoIoBIUyF61CWexDOfZw6RSbPBMSyI9ewjIQ1hHCZ42EGGBIhV8f2YVjiQwB0lFW2/JsNYOH38feCD58SQAkp85enyDJD8QFxo8li4DopZmg5Est3fYKOvREYBuaY19+V8AGwDliKQsgq321dIgWkVgoA86MMmS0xozEcqDMfDeXK1B/lhNBevtuHTqDvsjBgHgRebdrv/cqofqzD1z7lO7QNIb1kkm0giLOCY/BlPrU1BtJrUxcmjVGoK++s4xs4Sut8CfWsfeS9jIcoGtqgrY6UmUKFEdfvVAf6BCG9hu+EEEHYLgc46zzFocD1qisCBQGDZanrm9KXpVO2BcL8fQOEgDwTUg+kGcoXikEg0B+DKAZgMyUeZWxhSMNbCXsEQfF9jw6CaOM3DALrrLNOEv6Myzb64k3iGlvENdtoqUoJY5eZazyq36lEBF7feynoiU/AY2lm+CP04h3GRkYnwjoeywDC2GDcIeAaD7IwZzZW/vzwajggY+DByiVPyk24lT7rtSPKySks6PzKZcoZFAMHgVBaaqNeDW1naQoDovZR32QXdU1uUH713TXjQyAXlrCrXowN6sk/Y7KBVN7NxrfB+8k4+ocyO9yDjGW22v4A8Rl32vaaUCy0jbrQHvKunSgM9oDqN5bz1jMdZBbjq3GXgE0W0q8YFRmvyA3GEctUS+GcYmAcUk4z/SUY2+S133GlFAgyn75Sg2HUgTQMqtout7m6zEZNqwQoFWRQZe+aaQnFYIwwHMVAx8iaIeZiDbcpP2fnEqIJ8F1TcyUIfrRT2qFlGJhoKfjoSARoggzhKmvs8mGNXpe2WMNHxYLBam+Zi+eys/hQaPvykqcEvZvVBnOurc7KzKKu8wnXBWVhXaEE+WAIldLrAoZMIPfB97IgtyErBnktoXyZAcjlpsjZjFSXpYQ2U9/qYND3qwPMTfspp7ankA0KwrE60n9Ysm3KGs5sifrSbzC/HL/fZrAMzF6b68vah8JZW23agFl6p/Ka7VLH6rYU+kvoL9q03vCljvWjXrMzkNtFfnu1i/5qdkJ5zMBh3rku5CH/+Vhe2qDPhGIQCPTHoIoBEAoJWRQAVnzCJms2IxgBrdf3T1iy9JLwRlBlPCMoS4s1uk3g9B0TzAmKeFMX8BXCPYNDrZjgU8bKvBTFe10ZcuzTygeH4DcMEoR5xij5BeO1M/iHe3qccVlalsmAcZnQqp4tH80zvcrWZeDAI/FzSoTxoB/wZeMmodOST+1DDlLnZhxKWaENxjsz8/groTrPhJOnWMGNkb2At5vdpaAou/iMq/qH2d+uZUogbTzb0mD9QlwCPyVDO+VTlGookzGb1V19kh0okt7POGg8rVdDkL8onBSP+uAYBjHl73cIhndSnsoDRUpQ6siDlt+pAwqK9rDEmfKhLsinlAcHkHS1TSgGYwSVNxzFoJdmGJjyqBWDwLQPwr5lbSyG/ZifcGbFzFh1KdShGAQCg2E4igEQwhnCGD0Yhyj5XdbOGr5LBh0CrLgESWn1Mr7xY6joFQbwEGl3zVgSDglZFBBCGcG/NE7IG0MJI5h0PIP0hCXoDQf4GOGUBTxDGoRvZZcP+enH78SRn0ENXPJthlkZGU5yeQaF+Opb3sSXjvbN9dEPFDH1yNA13P4hLqOQfsHo6YrHD1J2efZO9cqIpNy1QpChLJQD/nV/Ud/y3++dwrX1S+mVbvpOLo9vRn/PdWkWyKyDWaHS0FYiFIMxgsrLikHXWbLChGIwMRGKwfQHzNNSJNbHXpY5DNt0uNkCU7VdzBT0o1AMAoHeKBWDflbSwOBoUwwC0z4cIuIIdKsqeilTlBIzDmaOrIToAmUnFIMxgMqznt50ofXXLIyWWZRLK0IxmLgIxWD6hH0aZgIo9aZzWdZYujBQCoElAqZ+bVL0J8zy6Fgol9GZdqdA5PWlgUCgHaVikE9GyxQYOUIxmD5hyRaDlOVn9RiVYXbLnhlLi+zLKJedQ/kNmrEJxWAMQECw2cPaLWvbMD3rCZ2akKGCbVZyVGgvbS0w5ZFPBbBmNTD9wDep7a0ptdHYOlWnmdi8ZqOgDXC+ad824b8WXEzTWq9rjaoNjdKwVtXMQiAQaId9AcZHs3WUaAq6vQBda6cDgyEUg+kTLPw25tt0bIOzvaoUBN+ZJU5m5RzNa/+EVS3lD0TBUiTG7FNPPTWRPRvGPftyQjEYBawNc8ShE0ys3dIIThQoz/CnsdkR7hjIro0/gakDG05t/mk7FScwbYNSb6Ox0zRYUgj4ZhAQhcC33LXh2FpUpzv5N4VwTvoypRvfdyDQDWuYHd1o9tzGSN+aI4V9R4GRgxD3mte8Jll6A9MX7Ccwu00pZO23wd6xqQzVTlfSL5zw5SAVsmgJz1ZN8LfyJcdxopPxcVrBFFcMwOYPm0NKqjcaqWRrm/ttagpMWdgUZLNS/cEEph8YVG3SwmDtE3KUHsG/V5/wTbd9912bEQOBwBP81ok8TkJzioo9Po7OtKwoMHLgPWYrneYTmP5AtjRb4Eh3x8E7Xp0C7me2BH+nXbXBeGVJbP4WXRlL/Z9jWhrLpopiEAgEAoFAIDC1QJALw8T0De1P+WboYvB03+vQjOkFoRgE/p+9s4C3qtj++N/OZ3d3K3Z3J3a3Yis2oigqGAgIggKKrYQYYLdiF3ZhPfXZ3fF8NX++y7NgMc7eJ+45955z7/p9Pusze6+ZNXtm9t4za006HA6Hw+FwOBxuGDgcDofD4XA4HA43DBwOh8PhcDgcDsc4uGHgcDgcDofD4XA43DBwOBwOh8PhcDgcbhg4HA6Hw+FwOByOcXDDwOFwOBwOh8PhcLhh4HA4HA6Hw+FwONwwcDgcDofD4XA4HOPghoHD4XA4HA6Hw+Fww8DhcDgcDofD4XC4YeBwOBwOh8PhcDjGwQ0Dh8PhcDgcDofD4YaBw+FwOBwOh8PhcMPA4XA4HA6Hw+FwjIMbBg6Hw+FwOBwOh8MNA4fD4XA4HA6Hw+GGgcPhcDgcDofD4RgHNwwcDofD4XA4HA6HGwYOh8PhcDgcDofDDQOHw+FwOBwOh8MxDm4YOBwOh8PhcDgcDjcMHA6Hw+FwOBwOhxsGDofD4XA4HA6HYxzcMHA4HA6Hw+FwOBxuGDgcDofD4XA4HA43DBwOh8PhcDgcDsc4NLxh8N///jf85z//Cf/+97+dnJxqSPxn/G//+9//Cn+fw+FoSfAvehvo5FR74h/TNrC1o2ENA60Q//jjj/Cvf/2rJOLlqqvXeWTDpGRSccQy1q8YDzf2t34xT69LkbGU4meFVSonLuXHfnqfJQOVKpMXF7wsvnVL5ef5VVMm5isvi1+Kf+q
+Uhm9pnJ048DhaFnwD/Iv0gZW8j9XImP9skjD4ZYiEz+jEhnrV4yHm/JPUZ6M9Yt5el2KjKUUPxVPnp/eV1MGKlWmWFxZfKjUuOx9uTIxP4tn+bg2TGs3DhraMOAF/fDDD+G3336byEAodp1y88KleLh54VL3xWSy/FJ86x+7SpZvyfrbe0txeOvqteVbP0txmPjehlWKw9h7G07J8uPwsZ9eW8qSKRY+Fa4WMvG13qfcmJfnb6mUcNz//vvvQm2h18ThqGfwD9L20QbG/6v9n+117G+plHCpe6U4TMrNC5fi4eb5Z4XJCpdy9TqLn+UqWb6l2E/Dx3xL1j++j/2UZ129tnzrZykOY+9tOCXLb4pMfJ1FGiYlo25MGiYVrphMfK33eq33tH9ct+bOsYY1DKgUeUGffvpp+Omnn8a/tH/+858TUR4P117H/pbywqXCK18p5ZfFszI2XDVlbHjlWT/lF/PL4qdkbBjLtzzr5l1bgp8KF/NjSvllySgvdvU6SyYOl7rO4ynFcVl+ln8pMll+KT70yy+/hB9//FF6Kh0OR8uBfxCj4PPPP5/oH039t5anfOum/FJ8pTye9bNhLC++jnlZ4bJklCzPupasX55/imdlbLhiMjEvvrY8Jcuzfsov5pfFT8nYMHF45dnwyktdW4plivGV1M+GyZKJebGM9SvGw83zR+ek/cMgb82dYw1tGPz666/h/fffD999991EL1F7NSHLi934OualwuHGMpZn/Swv5a+U8suTicPlXceysV8pMnGYVLhqytjwWXzrH/tZf8u3/nl+yo95qXDF/JVS4ey1dS1ZXjGZavHy/KkU+d8YVnU4HC0H/sFvvvkmfPjhh3/5T1P/bjm8Uv2tn/W3buwf8+JwqeuYlwqHG/MtL3ZtmJhv/WO/PJk4nLpZfOWl/EqRicOkwll+yq/Ytb3HTYWzfOuf52f5MeXJlMuL/W04S6XIYBB8++230kHmhkEdQg2D9957T14UL46Xpi9Pr+19zItdGyZ2bZjUfXydclPh4jDluqWEwVXiXsnep/ysTOzacPZayfpluXqtZO+tf+zGvNR1KffKi90UD1cpz9/ysq6tm8WL77lO+SsvDpvlb3kx37pZfvRQooy4YeBwtCz4B7/++mvpHMv7d+11zIv51s3zi8PE93muXqf842u9V7L+qXBZ17FM7CpZfp5fKa5ex/eWbyn2tzKpe+UVc/VaKS+sdS1ZHtd5MqlrXCV7r+Fs2Pg+JRO7pfrHvJivlPLDRefkn2OWihsGdYjYMLAvszVR/KE6ObUkff/9924YOBx1gNgwaI3k7Z9TPZEbBnUOaxigqOhLy6Ji/jGVG76axLN1TlvqvrlIn8v6DSrnVBhLGl7TqtdZsuWWcbnhIZWxbiqNNm6u8aPh1UVHSnF+CAufsHE+uVcZGz+k9zE/jyqVqeYz3DBwOOoDahj8/e9/T/6rStWuA/KoWjLUnbautfeVPKNSsnV4yj8mDU/7Ael1KmxzUVxeNo3ki2t4Ngwy8FNk88OUGu51dywbB6Txp8pA01XO+yRsueGtWwrlyZBfNwzqGNYw4EXxEnlpKSLczz//PNG9dVNEeOtfTCbLn/s8v9iF+In+8Y9/hKeeeirceuut4e677w4vv/xy+Oqrr8b/wJqfVBy4MVl+MRnuIeaTM3/13XfflWcrP5YhLaSLd8BCuFdffTU8+eST4n700Ufj/W38cRyWUv5cx+/E+sV8vVc/TQPuxx9/HF566SUp3zfeeCN88sknEla/IcLAe/vtt8M777wjRBnwrSGLv4Zlao2GQ2HW5xKGMkNG18HAV7LpUhnrZ+8tX2VSfjFP+alnZJGGzUqXGwYOR33AGgb6f9r/VSnrf86iYnWAvbcUy8RuivCz/lxTT9P2vPnmm+GBBx4Io0aNCo888ojUpcSvCpuVsW5Msb91Y56S3lOHU/fTDpImlbHhSRMu7TbtwQcffBCee+658Mwzz4jsl19+Ob591LCQxmHj0vtS/LL4KRl1SQczLEgX7d+YMWPk++E7UuWdsLRXlLe2f0rkjfwQVuNkA5i33npLygiFWZ9LedFe8ixtY1VGXciWieXHPHXj8NY/5hE2lrFxKc8SfCsb+7thUMewhgE/Ly+MF6kfgF7HlPLLCw9VS6YYIQPdddddYe+99w5LLbVUmHfeecOCCy4Y1lhjjXDaaaeJQqthY/kUxeGy5GK+lu3BBx8c1l9//XDzzTcnwymPiuShhx4Khx12WFhttdUk7ausskrYbbfdwrXXXivKpCrmKSKOrLhjnqVSZeDxnZCPvfbaK6y88sqSRsp1//33D7fffrv87OT7iy++CGeccUZYffXVw5prrikuRDnss88+4ZprrgmfffaZ9IbwPogDuv/++yWPxEEDccMNN4RNN91UyuDpp58WfpyuYpSX/1Lzbkn9bbhSZSAaFTcMHI6WhzUMsv7XFDW1Dsjj5VEp4QlDZ1KfPn3ChhtuGBZZZBFpA5dYYomw3Xbbheuuu07qXq1Lq5GuFBEHyu9NN90UNthgA2mPUXKz6nD4KMYDBw4M22yzTVhuueXCMsssE9Zbb73QsWNHqf9pG/LSluVXiUxMGm7s2LHh3HPPDRtvvLGkb9lll5U2qkuXLqLcq3Hw+OOPh/bt249v22jTcbfYYotw0kknhSeeeEIMHUYKevbsKe3jgQceKB2DPIe80iFG2HXWWSd0795d2o68NjCVl2L5q0QmRXky1g8dAT3CDYM6hTUMsGD1pTUyMeR25513hkUXXTRMOumkYc4555TKkZ9upplmCtNMM03YY489wiuvvPKX3gebf+VBytOwlBkUy8Rh+LFff/310K5du/B///d/oV+/fuPDWBmle+65R9I52WSThRlnnFEqHQyaSSaZJMw333yhd+/eolBqmvRZ5AOK06P+ysfVSkX9uUfW8lOELD3dV155ZVhyySUlTXPMMYdU3nPNNZfkj/Ref/31Ehe9G7vssovwp5pqqrDAAgsI/e1vf5P3Ms8884TLLrtMnvnYY49JOGjYsGHjK38MJRq3qaeeWmQpH/ip9DUSuWHgcNQHrGGQ+lcbjahPqVs6deokde0UU0wRll9+eWkDMQwmn3xyMRIuvvhiCUd4bQds3QqPe4hry4/bDOsfh6F8afeo22mTUXRTdTg82ozTTz9d2hXaCOp8lG7abdpEjIvRo0f/JZ02PVnpVT6uyus9/inZmGiXUPzp6KN9nnLKKcPiiy8uaaSNgujAYpSfDq/bbrstzDbbbJJ33IUXXljaSt4BbSKdZIzogEMOOUTCrbDCChPNEGA0fssttxQ/OtS0AzeVvkYhRkHcMKhjxIaBvrRGJfLClJSNNtpIfiSU7FtuuSW89tpr4dlnnxWrfKGFFhLjAOueXm2UNH4+wnCvZUAFhvGAyz0/Koox8TMk++ijj8pUJfxURl2GC/FH/sUXX5ReAiqDSy65ZKJwSsSNzO677y7pxpCgMqUSZAiYygb+zDPPLMPC9Eggg+JMulGYMYZ4nlb2PIOfj7xRUZF2KhyGPcknlQ
vvnPv77rsvPP/88+N/Vps2JfgMm9JzQVpoaOj1J58o81Ry8BdbbDEpM4ZG99xzT+FtsskmMsrAiEL//v3lvcDfeuutpYeInhPuoeHDh0slTx54NxdddJFUwlTA9957r/BT6Wsk4h25YeBwtDzUMKANTP2rjUbUj4yyUmdOP/304aijjpKOF9oJptNSJ6NkL7300jI6Tb1OO0Y7RRtEmwKPtoG2g44t2gviVj5tBW3GCy+8IGWnfvp82hHC0GOOso8RgiJMxxHlHNfhyDI6TN1P+0xbiRI8cuRIaV8GDBggRgXtw1ZbbTVeV4FoZ2iXUMJpL5mmo37ETZvHiDRtEnkjT5A+Ez5ylAXTdeCprCXSTJ1N59yss84qxsrxxx8vbS9pPP/888d3eh1zzDGi1DNrASOMsKeeeqqEJU/4zzLLLGK00b6BI444QvLHqALvgzTQDpLWbbfdVsrkgAMOkLw3ehtIGbthUMewhoEqxbw0JX2JebxUGEulxBHzSnlGKgxDcoMGDZJKcbrpppMKkh8UP/JGJXLcccdJb/faa68tFQbzGHfcccew+eabS+WA0k0FcNZZZ4kSfN5554mCyg9J3Jtttpn0ENALQw9G586d5UfWioPKjaFCptigAFMxUyFSMVJBptKO7B133CEVC2nv1auXPBM+xFoDlGsqR4aB4ZEXwjFNB6UZhXyttdaSIU4qO3osmJ+50047Sd6GDh0qw50777yz9FJAJ5xwgkwHYgRgpZVWCkcffbSsw9B0aTrV7du3r5QdoxgYAxgbpIUKjHyvu+66YtRgOJE+Rmao7BhCpiKgbMkXRhnlQRnR8FjDgHiJj/dFecaGgX6jcRmWe5/iNVec5Atyw8DhaFlYwyDrfy3nPsVrrjgg6kemr1CX0tPMtBetT+kMevjhh8ePYNNW0K4xdQfescceK20Z4VC2aSuYevTggw9KPY/xcOSRR8r0VuruVVddNeywww4ySqzP5nm0ifhhfDBqvN9++0nvOu0gyrfW4TbtGBD0xJMu2hHaPMLxXNLTo0cPaXeZvsMz8KPdQFmml522keeRHjrJiBdZ2myUbabjnH322dJm025S/zLFifwhB9HZxSg2BoSWr7roSSjptMOkkU48Rg94BkSYDh06iF5AmsgPhhiGAbMWaH91ETFtLLoBbSkGAVDDgHJTw0CfSRrVMLCdlzwzi/CPw5R7n+JVIw5IjTs3DOoQsWEQv0AUP0ivU3zrp3zrZ8NYvl4r3/rFYeL7rPAcr62K//zzzz+eb8NfffXVMqxHzwTKOD0fhEX5pFebhgIllh4LflQqKyoKwqIQY+VT0VCJ0BPAvU6JoQKlwkSOsFSEOnRI7w2GAT8DxgqVtRJpoycdOSoWKm+UaPi8Ex3VoGKmh5174sKQmGGGGcQ4wIghD/RmdOvWTfJKekgHvRgMIzNSQo8LlT5zNqeddlqZ0oORRAWGPz1K9LLYsiMNzHukh4Q0UjlqpUgY/Pl+6JXC2GL+KnHoCAhxMreUuBi9YD0CvVY0Rhgo9CwRDsIw4JuknGiwMQwoZwwfelx4FvFAmj5NaxbFYa0M16k4KpUpJTz5csPA4Wh5xIaBpfhfzvunU/xSwsd+pcgoX/1sGK5pz2gD6aUmf8qnvqbTSNs2XO4Jxz0dR5QDijidMLQdTOuhXUQe5ZV4adPobFpxxRWlbcFF2aUcMQpok2jvaP9Q2Km/kWPaKYYBbZ5tA6nvGe0mTtJBjzptiNb1tAW0HzyD9oPnEH777beXdoR06nx/0oPhQucU8bKmkDhnn332MPfcc0t7h0FEbz6dYrTfGBYo5HRW4c90WcpNyxWXMmFkgPadvNAukT6bRsqONGK40EYzQq6GAaPrhIHPLAY640grbSo4/PDDJZ2kQxdp80zaVDtiYPU0mz7c+Nry4rDlyOh97G/vLS92U9duGNQxrGHAj8gLyyMU5FJ4McVh8mTUz4bJCm/D8sFRydA7zQ9GZYWFbsMThh2KUDIZyuNnpQJk7h8KNT8sFj0fLb0c/IzM/aNs6F2gIjrzzDPHD73qlCWMB35YrWBR7hnapJKgkmP0AiWcqUQqyxQeiOFYhj9PPvlkqSgwOBjF4L3YPFKpU1HwM1Fx0MOhz6aypYKnx1/zzjMYAdEhWNI0ePBgqdwwcqgkqbRY1IwRcuONN8oiNYwNely0wtNys1OdcElzHIY0QqSRfGp4GgQMNkYoKEOdd8lCK8qWNHEPYRiQT+KLDQN6YOwzlex3EPNKvU/xioVJ+VtS/1jGDQOHoz5gDYP4X03dx7yUv+XHbnydokpkIOpM2gHaGua70zNOG6/+1Mv0ZKOM0tagTDNiywgudS8j3WoY0GFG24GiTLtIm0OvOyPQtKG0tZdffrm0F9TntB+0ZWosoOii1BKWqbTaDtCO0KlEG0f7h0vnEJ1hKPQo6vTs0wZrXU/euSZdELxLL71U8kgbxogFI9RsXIFxwLNoi6lju3btKve0wXRI0Y6ziJnREfiMMJAeFHZ6/OFhoKjiyrMpV55JHjG6yDMj96TJvhfC2zYQHYD08T4YxcFowrgin5QRoyikhw5NaxjwDomPvFrDgPTTQRc/F+Le8mL/Unil+ls3JWMp5U9Zu2FQx0gZBrw0feFKlqfXuPZa/ZVSflbGUsyLZWJeFmEIoNDzg9FDTuWlcvys9FIwx4/RAno9mP5CBcg9hgE9I8SBkq+GAYo3ZcMPSeXFdBqUXObHY1zwLHrEqfDYSYF7dhWiAub5KL1UiFQCVGYMKWJQMOpABUGlyboBKmd6Ipirz3oI3gtph3gvVBKkHz497PSUUCFfddVVMoUKf/JDRUTvyIgRI6SiRNmnsj3nnHMkHHExdExFSW8QQ73kByWeMiFejB/9ibVsyZ9ODdp1112l8qKC0jRSeZIG8kxZ05Bo+BQxpEt5IkPvjvIxDKgwiI+RG8pbDQN6eeDrO1XXXmt6lB9TqTJZ4VL+MS8rnF6TLzcMHI6WB/8g/yKdK/E/nfp3UzxLxXipeGI/e41bigwu9TXtDnU79Ti72FDnqx8dQLRtdFbR1jBFBaVYe9WZekQ9T/3N6KwaBijElBHr66ifmWaLEs2IL8+hvWEkHgOC3nGUZ2Roq6ivUfRpS1HaUXRpb5ClDaTXHmWZTiyUYtoq4iedWtdDtDW0LRDtMXGQZowZ0gUf4+Cggw4SPiPjvFN2xuOe5zH1iPad0Xc6qODTBh966KFSJux+BI+OQtok4qRstR0kTyyIZjSETkXSZ98D5YYM+Sa9GEVsGkKcMTGqcuKJJ4qCTGckOgN8axgQHyMkahigk5D3+Ll6HfOsX+o6yz/mFZO1VExGifdL/twwqEPEhoF9eVQEsaukYWKK/ays5VuyccbXNpwl62dlqOTpKeAnorLSITn8UCqZYoPSiwXPkCCVEYYBQ5FqGGjlyTAr8VABck8vOko703WoGJi6wxAkPzNbd1LZMG+QngAqNpQ/nq3zCYmLKUcXXnihPN9WEgw54sc1IwEYClQwpJsKggpvyJAh0gPEugEqHCpklH4WJ2MwUFlQm
VP5kkZ6c8gbYUivhqN3DCNBe5UwbljUTP4xFGgI2BmCZ/M9ULa4pEGnEjGqwXCpli2KPPcYKSyapjJjxEAXTVPZsVaDvDNqQVkSnripHDCeyA9hqXCJl3xTabJgnB4a3hejO5omfefxNW5Msb/K5IW3YVNuiqwf11kybhg4HPWB2DBQSv2/sZsiG0bJ8lMUhysmE/vbcNSbTFOhLkXZ1U4WlE1GX6mf6dTCn/VvjPzqSLcaBsgwOouCTHtArzb6AT3e8OjkYsSZXnDaGtpa6n6Itkanw5IWOn6YJqtrDJgChDLP85RoV1msrHzWMdDeqGFAHIwuMH2WdpLRcDUAaKfZUYo2g847jByMHtbb0SarYUDc5I24GKXQtQK0LbSBEPmgsw0jgk41dCMtV8oEY0GnErFbHvmz/rTblC9rG0m/TiXiGSj1yNDRRecgbR5GHO0Z7TLvivRgKJFO4iZOpuZi5KA/6OwF1dPs+1fS9FiyflluTFYmvs+Tid2UDNduGNQxrGHAh2xfZnNSuc/NCo8yzRAfPzmKJlu2oegzP5EFxijC9KbwY/Oz0dNABcaIARUc02roYaEs6NUnDix5KhhGDpBjVIBKDMV33333lZ+ZEQPi0hEDKgH8+fAZoaDnhfmLVJD89OwSwU4IEJUNFQTTe6hUIYwXKgCt1OmpIE/0pmC80OtCRUtYKiJ6QXgWlSYKPpU5xoMaBlRMVPSEoZxQzikHjBvSQIVKw8GwMhU3BpX9iXExKFDsMXxoFIgDHhUy8WJssEaBMiZ9xKGGASMHvAMqNIg8UfERN3lkaJf8EZahX/jEqcPeVIoYV1SmtlIslcoND9VSBoMH1w0Dh6NlYQ0D+49C9VZvlELUj7qGAEWcNoA2EOWedo56n7aIOpURbOoiOoIIT48503rQCXRnI3q8aUuYFsMoPPLszEPnBguEGWWmzcEooG1ixAAebSl1OG0pU1xpN0gPHUIo9rS7tD243NNeqrJPRxJKtirHtDO0taSZBci0VWrMsD6OjijSTJvDVCf4tPcYDJo3et15x8RHe676ACP18GlraJtp//BHcaUstVxp53iOGhS09SjtxAfRhmJQkEZGIwhPudEW0+Fm1xiQL9pAbcvQWzSdbLJBmVB28Om8ZFQDXeSUU06R59h0lUr19C3zPt0wqGNYw4Afg5fKS4P0Bapryb7gvHCWnyejrlKefxzW8iHywFQXfjIqLHrvWYRLj7Uqn7hY78jRo68LhpnDxzQeFHgqRQwBDAMMAYZdCcOiXRRfKlp6RuAxDYcKUHsnMARYwETlRpwo9CjN9BaQRq0QcPUaw4I598hTudLzQc868VDhwKcS5NlUDrptKBUeSj8ViJ4bwK4OVPD0etAYYEBgeFAxUSkRllESyoHyIX+MMDDnlIXM+NsfXMuXXhsWqPEMeqUwFHi2bgMLn4qZ8FTemh7KB8Mg9X2Rf74/7cWiImWqE6MfNED0UMFn4ZX26tl0xfFl8VN+ykvJFuNZfso/i4dLpYjrhoHD0bKwhkGxfzfmWX7KP4uXJxfflyJr+dSnjCpTj1Jv0hvPHHxGjHW3Iog2graO8GwHCo8ec4wF6l5GweExdQaj4IorrhAFl7aJnnCUaDre6KCinWUqEQo+HTjI0VlGu4vBgDIPTw0D2iBt+5RIO1ORUIwJy6g0cdK+sLYBJR4+ijx5JU90ePFsZgIwvZY00kOPEq27Ceo0KdoX0ozCTltN2glHOaDAY+Sw7gBdgDZfe+a1bIkLY4h1grpGjo4v9ADSSHmhL5AmdkLSact0olGuOhKu70vjhWiXiQeDCgOK90SZ884w8tAfmHqEsabtn8rb+CyvmJvFS/HLlcnzw6Uc3TCoY1jDAMXNvjh9qXqdcmOekr2Pr/NkUn5510rWj5+ZCgrLHuudH5gfiwqMn45rej1Q2LH4kWGOofpT0TCnHcUZHsN3KO1Y68jxk6OUo6xTCRE/yjI9L1TIDGHCwxCgEqXHnviYusNQLs+jnOO8wKMiJd08lzgsMcWIioJKBBldK4EfaeJZVEzMxUfJp0dCF1ZTWWEYwOM5vGsqU/JK7xG9QqSP8mLxNsPL8beg98wxpQeDZ9n0cU/eMTJ4B/TWqIGGYcAuRRqfzTtEeBobXVAN8S40Xgwi/DUNGk+Wa695L1xnhVU35inZ+/jakvKtn15bPq4bBg5HfcAaBql/Nr6Pry0p3/rpteXnXadcGy6rPovD0Aag1Gt9SnuEIkz7Rz1P+8aoL2ExEHSTChRw2gWMAKaYsl4AhZ3RWtbDEQcdYnQg0QbSbtC+0dGGokcnEbLERccTbSRtIHIYCPTGx20LhMJLu4RCr+2aJdopOprocKK94H3RSUe+yA/pJO08hwXF6AC0dzqyQA+/TtHh+YzaM72I9gVFXw0pFHk64+itt+WLiyxrMjAgNI+WUN5Z+0D85Ic2WkcMMHJITxwnRHjyzgYklGUcLzxGXdDTiFflUnFxrcS9/V7UP+/a8qxf7B/z9DoOE7tKbhjUMaxhgOLGC+NDspTi5fGhaspkEeGzZPiBUfpZfMWwHpUYPQNM8WEhFIo8w6YM2fGj6c4MVCiER4FnWg4GA70QfMRUNPy4TDGil55RBSpfhj+pDKiwSA+97cgRjl4EKgSeiYHBOgfSF6db80IFQa8802lIC4o2hgJzQen9tz8XcgzdMspArxD5wdhh/iKjP1RCKOmkmTmbzKskr8gRD4o7PfMo7yje9KhQXoxIaPz2WSoHYcCwwxAyNBbMTcVwooHRSp+hYXpX6PGgR4o0aTw2To2X9DLCwtQh3gHvizIgXt4TYaxcHEcp/HJloGIyebJZ5IaBw1EfUMMApTH1r0K1qANiKvaMFGX5UVdSx9ATzsJe2hB68rWepgeeDiN606mnCU/HEaOyKND0mNPTTztE28aiXeJlcwuUc21LGVlg3QHTbFGmaXdpB6nvqbs584c46C1n1x2eR7uTlW7aJ3rq6RknDbTTtC+0B5xjwDsiDPLabl9wwQXj20ryhyGgbR35YhMOpiFhuDDViHeNPH4YB4wc8BzaUMLxbNKiacTVa8oUo4SOM0ZWKAPkeDbtKDzbwcoUWdozDAkdhde8xoQf74Ky4z2QJkbwSRPThSk34tXwNl2WUjylLJk8ygof8/U+7xnKpxyZ9eCGQZ0iZRjw0iB9gTHFfL0vNbzlZcnEZMOVKsMPTH7ooUBBZis1ehmQp+LgR+VH5J4fjuFFpt9A3FNBUclRLsQHj/DMQaTXQD9ywhAv4eERjoqDKUpUZMgSBz+2KseaRr22LhUEcZEO0k08PJdK1/5YVDw8i+djBGG48B7hQ4QhHipD0sM1slae+BgNQZZKlp+VODWchlVXr0kjecKIwJBh7iXxa5lrWNJNvrUMY7LxQ8hTRuSd94XLvaYpJRPHlSIbthSZmJ+SjSkr3pQMZUWe3DBwOFoW1jCw/2j83+p96n9WyvrvS5XJ4+t9qeGph2kztM3Sepp2Dh5KPW0Ldb5tt2gHcKnjaTtoX5AhftoM2j7aG/yIn7qddodwPJcwhKXu
pn1AUeYZ+BMv1zYvcbqR17SQPtJNGw6f9kFlIMIRH9OTaC94Hu0Fadd2iOfz7LgNQp5whKdcyBP50DZM06WuJZ5LGNp9nks7TblQHpp/wlFuPNuWoY2He8tDFqLNJO+kyY5kqUzKzaOmytjwebI2bJ4M74w20A2DOkVsGOS99Jh4uSl+HjWHjA1PfviJqQAg/Wn1B9T8IqOVo/L1Z8TV+GwYjT+OC+I+FZf6FyN9jsZh02BJn2/DwtcysM+26VOK5VNhUkT8hEWGcsVNpVGfjYuMfTdZRFiNNyvdeVTKMyyVmq6YmiLjhoHDUR/IMgxKoeaoN5r6DFufaj1t62Urp20BfOpdDat1MK4No/HHcWnbYOMqpy63z1HKktWwNn82/+pv02dJ048sbipMFiHLc/XZcRq5JwzENekq9j41vZpvlU2FTVEpz4ipUpkUP4vsM3DdMKhjWMOAXgT7Ap2cnGpDbhg4HPUBaxik/lUnJ6fqEu2fGwZ1DGsYMDRoX1qtiCG1FD+Lyg3f2qmS8qi0DJvzWa2Z4jLh3g0Dh6PlYQ0D+49WmxqhLi33ec2dvuaken9f9Z6+PNJ0uGFQp7CGAXP6eGGWmH8HxXzrZ/2zeKlry8uTybrPCqN+1j/Fs5Ty1+sUT6/tfeyf4qVklJ+61/DFZOLrVHj1y7qOZbLigDR8Si6LUjKxa8Oqq2T9U1RMJi+OlEx8r7xi/qXKuGHgcLQ8rGEQ/7/xvfKK+VdTRv2sfxavmH+Kl7q2vJhfyn3Mi/2sv16neHpt761/uTIpnvWL/W2YFL8Umfg6Dl+KfEoui2x4lSkmG4eP/VL3eTIxpWTcMKhjpAwDFuukXiSkfurG/JgsP+s6pjyZLLk8fiUyxa4twW+KTBwmTya+zgoL4ZeSse84plhGeVl+SjZMys2jUmUsn2sbvpiMfsuxTKossu4tP+vaEvw8GTcMHI6WR8owiP9pvc/7n/XaEvymysR+Kb6lUsJANlx8nRVHU2Ri/1je+qWIMFkyWfKxTOyXxYdKaRtSfL3GzQsfy+h9lgwUh7FunhyUJRuT5eu1lkWpMnqfJQOPON0wqFNYw4CV8/ZlshhZr1OU8q+FDBSHKVemlPAxIVOunMqUI6dhS5Wx8ddSRqmc8JU8o1KZajwjL46UTF54KOWfJeOGgcPR8lDDgN3Z4n+01nWAUrky9ZSGYvGmyMpUKl+unIYvVc4+ozlkSg0PlfsMqDllSgnvhkEdwxoGuq2kk5NT7ckNA4ej5WENg9R/6uTkVH1yw6COYQ0DSF8aOxRBep3Hwy3mn+VnKRXO3hfj6bV1rX8qTDF+zEtdp2QsP+XGMnlh1dVrpbyw1rUEz/Jt2Ngv9re8mNTPho1dvVZSng0Tu3odh7d+8bV18/xTPNxi/ll+lp8KF/u7YeBwtDysYWD/T/u/ptwsXqn3lqfXeTzcYv5ZfpavFPuneLGfvbZuLKM8GyZ29Vopy8/ylWf9Uv7KtzwbJs/Va6W8sNa1FPNKkYn58XUso/cxX3mlyFj/VPgUX69jN8s/xcN1w6COERsGvDR2J1JXr+19MX7sZ8Pk8fU6z6+UcLhZ/JSr1/be8lNusWsl5Vm/+Nq6sUyKp/zYP48X+5fCs/elylg3K5y9t/5xuFL8s3iW1N/6WV4WX++tnw2Tui4nnBsGDkfLwxoG+p+m/ld7Hfvb61LDpa5TfjHfkvWzYYrxs/yyrksNl3JjnuWn/K2r11n8lJt3rffFeHptXeufxVO+dYvxivnn8XCtv1Iez8rYcOXy8vyVUuGUmLbuhkGdwhoGnLDHAmQlXqS9t/yYlB+HsXzlxfc2jFIcJr6Pw9lrDWPvNUyKbPhYJs9VsvcpP6WYn3WfCq986x+TDZN1bd2Yl7rOu7f8lJvFi/kpimVS19aNeTHZMKl7S9Yvlol5MT9FsZ/KuWHgcLQ81DDg5Fr9N+3/GpP6a1h7n3JjnpK9j6+LydgwMV95MamMJeXH4bKuS5Gx91YmJhsmz1Wy91zHFIeJ7204e295yk+5eh2HV36WG/NS1ynKk0n5KaXCxG4xXsofN+alrlOU8nPDoI5hDQN2ZdDju2PiRdp71iOUSuWGhyqRySOb9phS4euZ4vTbd5MKX89k8wE1cl6gOB+49lr9qSjdMHA4Wh7WMND/M0X1XjcVS5emPaZU2DyqRKbaFOdBKRW23smmv96/sWKUyguukvV3w6COkWcY6Iu2L5xwuHnWIUSYFD+PypUpJTzpZNhK024/YL2vRV4qoWLPsXmJ343NS7H8lEuV5L8Umby8QMXyUqt0xVRMRt8L13FesggZNwwcjpZHlmFg/2H7X+vIerF6lvApfhaVGx4qRYa6ibRSp9q8QOQFv2J5KZdqkRdNp+Ylfj/wNFws21SqZX7i96J54bpYXmqRrkpI80LalXhOHrEzkRsGdQprGDDH0r7YmAjz5ptvyqKRP/74I/z+++/hn//8p7jlULky1XgGH6HNn+YF/m+//ZaUKYUqkamE9Dn/+te/ZOFOnJexY8fKj6l5sTKlUnPlX2X4hug1wCAlD5oX8oYbhy+HmlsG94svvpgoL3lEZeqGgcPR8rCGQepftYQi/cYbb4Sffvppon/f1gnFqCn1TDlk0/fDDz9IvWTrJvJCG/jjjz/+RaZUau68QL/88oukHdK8UO/y/niPhK3kGZVSuc+y4dG9SLvmhffDNXn5/PPPm1XPgpqSF94LCj/50feSR24Y1DGsYcDHqJVHinjhL774oig1fBAoocWIDybFz6JU+GJxFPMnf3ywVOg2Ly+//LJ8oHzQKblqULn5h/JkqChIPxU6FYjm5dVXXxWFutS8VDtdWZQnwzfEt4RRo3nBff3118Nrr70m7y0lF1M95AU/jBz7XvKI79ENA4ej5aGGAfVQ6l+1RBs5ZsyY8P3335dUh6TCFJMrJd6Y8mTw+/bbb6Vu0k4XzcsLL7wgfim5mKqdrkoJo4w2gnbP5oX2nM6ZUp/ZHPkpFh7FmLzoe6Ht4PqVV14RxblWehZU7bzwXngPpfxHkBsGdYzYMODDzCL8n376aXGRQQnlxULxtZK9T11nyVh/68b+MS8Opy4/G8qz5oX7Z599dnyPSUrGxhXz4uuYF/NLCZe61nu95ufkxyMvVIyaFyp4eKm8ZF2n3FS4OEzspq5tmKxwVCbkwVbyuFTwNL5UNFYmFZflxa4NE4e3vJSMvVbKCofL/8CIDQaNvhdL5M26bhg4HPUBaxjYfzZFdC49+uij4euvv/5LvaDX9j7ln+JlyVjXhonDW4plqJu++uoraR9sG09ennjiCfGLZa283ufxUtcxLxUON5axPOun1xhltBE2L7TjtOeMptvwKXl7nXJjnpK9zwoXy1heSob2+qWXXpK80DZA5OW5556Ta9rIWNbK22sNE9/H11YmFS7Ft7zYhfjGeC+05do5Bml7l7p3w6COYQ0DKkaUmizihVOR8PJ5oXzUSihx9j6LZ/mxf7Hw8XXqPuarS9qxwlGiyQt5feqpp0SR44MmXCxjKcWz/Ni/WPj4OnWf4vOuKH+
bFyoVKkWMg1LyUoyaS4aKhbTraAeVBffkg3fD8DfhbF7i5xR7bqUypfCU8ON/YPRJGyz736SIsG4YOBwtD2sYpP5VJeoo6qoHH3xQeqZtncB1XEdk1RnKL0dGqRIZ6ibSiwJKO6h5oe0bPXp0+Oyzz4rGmfUMy69EJqZiMriMcNBG2LzQJqKbMAIdx1HsPsUvVyblX4rMd999F55//vmJvj3yoh2wtJE2Hq6z4s2iSsLHMsXiwJ/3gl5C+jUveYQR54ZBncIaBvxo/GRZRA/DY489Ji+fj4GPGuUNZdQSfCjmpdzYP49XikyKzz1pV8ucvKiRgyLHB23Dq4y6Wc9RsmGtmyob5Vcqww9I+asCSl6oVJ555hnpZaCBi2XtM6C8Z+h1SkZ5luK4NHwpMnxDvBfyo8o0eWG04PHHH0/KKukz9NryceN0qV8sY2XLlcFVGVzSzzdGHux/kyL+NzcMHI6WhzUMUv+qEnUUddX9998v6+z4520dYCmuK+LrFKVk9L5Y3WR56tq6ifSiTFPfal4wch566CGZAmllsuKMeZav18q3fpZSMnpdTEZdRjho6+K8oJvQ4WLzHcuqq2FiivPPdSkyuPYZNo48YuRJZy7od0Ze0E3g0d4TLvVsfU6cZqVYRsNXIqPX1g/S8Li8F9o/DE7NSx65YVDHUMOAH4oPkZ8si7AEGUZFKeVjQKFuBKLS52dTpY28UKnw88Hj50zJ1SPxA1L+cV7oYaCybKS88A3xTZEfbZT5BskHlTzvLSVXj0SlSfpZg1PsP4LcMHA46gNqGBT7b6mjMAzuu+8+Uab551N1Qb0R6WTaBj3T1Lc2L4x+4NcoeYEY/UCZjvOCbsKocyPlBWWaTj3acP3O1DAgf7T3Kbl6I8qc90L7R/o1L3nkhkEdw44Y8CHyk0H68vQewv+RRx4RpZSPASW0EYifj4qDj1bzwo9IrzS8L7/8MilXj8QPSPmrAqp5wTCgsmykvPANqcGmecElH1TyvLeUXD0SigX/iw5xkxf7/8T/Eg2YGwYOR8vDGgb2f43/WYhODAwDlGlkUnVBvRHpRAlTw8Dm5YEHHpDtTBslLxC79ahhoHmhfUc3oXOmkfKCMo1hEOcF3YS2kfY+JVdvRJnzXmj/SL/mJa8N5Ltzw6BOkWUY2EpS7xki4ufj5aO08VE3ClEJWmWavNIrTWXJHMuUTHMSP1WKHxM/IeUf54U5+RgHxfJS6nOUbPhSZUuVUYPNGgYYOVT6fGfFnqf+pabLUiUyeYRBRh50uF7/G0jzpteQGwYOR33AGgb6r9p/1v67tCP33nuvbDTAP5+qC7Kopeop0snce6Zo0obbvGAYsBFCuXlpSWK0BmU6zgvrJbivVl6ao31hihftdmwY6JRt2vuUHGSfUerzKsmLUp4sZc57sYaB/Yf02vLcMKhjxIYBSg0vz5LysGAffvhhUab5SFBC+bBxLenHomGgVLhiVKoMCibTUnCtjF7jonyq0gZRqfDzUVnyQatMTPoz4Obl2fJT/nkySsRPA8XwIRWCLT8l0kKaMQw0L7y3J598UoyDauVFSf1tuDwZ66fvRXsTbDiItOjoh357uFT6fGd5z7F5UV5eeEul5sUS4Xgn5Cf1THikXXvl4n8IIn967YaBw1EfiA0D+5/GPOqru+++W6beaj0Q1yHF6qZS6xxLeTI8R6dzWL6tm1D+1TDQ/JAXRj9sXmLSvDQ1P3GYLBmeQx0LcZ0Kg0KJMo0+Qj40LxgG5C2VF31esbzElPLLCq98XIjn8F7IC4pzHB5i5Ek3QdFvDCMH3QQ3S07jj/OSRzbdNq3KSxH+StzzjfGvxOFIA6NSfGOkW/OSRfi7YVDHsIaB/mgoNlZRUxdLEIWNOeB8DCihfNjq8vHw0XDPS9cPintLKlMqZYXXZ6JY3nDDDeOVfPWzYQmjShv5IS/0StM7zQedklHlnN4W3FrlRXm4TKEZPHiwuHE4iHRQ/uRFKxPeG0OPzEvUvMREXqhktBFI5SWPSs2nDUfZXnbZZePn5MbEN8R7wWDT90KeqCj5zrKeqe+FXjvyod9iqVRqXpQIT9nRGA0ZMkTKUMtP4+Ke98C7IQ+p/0ddSIe83TBwOFoWahik/tOYR12FYUB7GdcBEIoa9RN1E/fUTXFdW279A2XJEDd1PmkaNWqU1E3qZ+sm0kt9TLuneSMv1M10UsQyEHUseaEth088tc4L9SJtOfUsvcqxngFRtrQR5EXfC+0IC6m5Jh4bnmviQUkvNy8pXjFSGd7FbbfdFq6++mpRllPtFOmh3Y7zgm6CG8to2jE2MPbgaV6KpdX6l5ovG47vgfyw+D5OF8/nvfCNYaTF/03KJe/suuSGQR0iNgx4aZAqN/aajxuFjZfPB8OLVeLj4Ec+44wzwvLLLx8WX3zxsMkmm8iHhB/h+XBwVZZ7KjUbV3zPNTy9V4LHx8niqZ133jksscQS4dJLLxV+KjyKtCrTkBoG9E6TDhsWeYyBHj16hOWWWy4stthiYb311pPKCn/yo+FsXnBtetUfV/OufpY03M033yxlt8IKK8jzrr322vH+GpZno3xq7w+khgFEujWsEnH37ds3LLXUUlJOK620Urj88svH++HatFo5vY79cW3YOBzfyvrrrx823HDDsOaaa0rlGMsQTt+Lfnu4VPpU8qn3Am/gwIFh1VVXlbzwnWFIEZdNr6ZVyfrpPa7e67Ul5VHmVML7779/mGmmmSR9VIRxWL4p3o39j2LSf8kNA4ejPhAbBinSupa66q677pL/N64zqCeYZrTddttJ3bTooouG4447TnpItc1ItQNcx0QY26ZoWEuEI4zW7VtuueX4zrE4HOmlraNe1vxgGJBe5n7b8CpD59Ruu+0m7R906KGHSv6JX5+teYFUzqY3vtfwem/5lP9RRx0VlllmmbD00kuHgw46SJ4Xh0fhZoScvOh7IS+0GdS9cXjSi5K9zz77hEUWWSQsueSSYffdd5eOHs0L4UrJS+yn95aUf+GFF4a11147rLXWWmGvvfaSd2Pjg2hX1DCweUE3Ie9xeNod8n3AAQfI98V732KLLaQ80Ic0PZoGmz6NS/k27jis5eMSN98DZbf99ttLR1kcjvfCN0ZZk4/4H9LvTq+Rc8OgThEbBvpxpghLkJ+PH4qXihKqiigfOD/CsssuG0455RRRqjfaaKNw/fXXi+w111wTOnbsGPr37y8fPh8RvcldunQRJfWEE04QGX4QfhTiwsi48847x+95qx8gRKUEkW6Mlc0331xkND24GoZrfkpV2kgPPxdDjyh5pEXzQngqCxTZhRdeOBx77LHhoosukvhJO/LDhg0LJ510klTI/AiE1zxcd911oVOnTpJ2nkH8yHXt2lUUfxoJnmHzwjU80oOSi7vppptKJamVtuaDnxnDzOaFMmPokR/X5gUibfTAzDnnnOGYY44RYwOj47zzzpOKeMSIEaFz587hggsukDiQIQzvkHdHPikDnsMzBw0aFE4//fQwfPhw+bkJb/OjaSUdVBAYbhiIxMO70nxAhKXy492QF+
IjL1RyOvdVw2p40oux1qFDhzB06FAxos4991zp9aLH7NRTTw3du3eXb4Ln3XjjjeO/sdNOOy0cf/zx8gzSBo93xbfJ83lGKi+UIe9l6623DnPPPbe8H3g2bYTjffNu9L3EpBUj5IaBw1EfsIaB/V9j4r+mrqJN0npZ/3/qC/79HXfcUdo96haU6r333lvaTepW6lzaiFtuuUXaNNqGM888U9oR2q4jjjgi3HHHHZIO2grqst69e0sbwzN4Xlw3wafuow6knqXupI3QdBEGIr2026RF80Lde88994xvkzQ8cavyucoqq4RLLrkkHHbYYaJMkw+eQXtx4oknSn1MXUZ48gKf8LRdtJPEfeutt0r7R17ovKJtiPMC0bFI24OOQTwrr7yyyOJn80O9jo4Q54W2hnTYvGj4ww8/XBRp2mLKFWWa8LyzPn36SPtEe0d6iY+2gvd1xRVXiCzvk3h497QvPXv2lHYAns0LpM8lLtJIWDrJbr/99r+8G8qCMiHd+p2RF52ZEeed74ayxcAZMGCAlNOKK64oeeFZ8E4++eRw1VVXjS8bvg10K9qwgw8+WMqAdNLzf/7554dzzjlH2lvi5hlxXrgn3XwP6EHoJoyMabo0bZQF3xj6nebFkm3/IGTcMKhTWMMAhYkPKYv48PgAqdBQ2vio9SfnRe+yyy7y4RCWj4yfiB94v/32k14AKhd6q1FMqWD58eaaa66w7rrrSoWKgnrllVeGddZZJ2yzzTZimS6wwAKiuBEfHy9DnxAWtVZoVJzbbrutVK6kR9OmLnmzyjSEYsjPRyWHv+YFGaxyehcwcqis8KdnhQqLHiAUU3qPqbjoESdefrYZZphB0s4IxpFHHimK62abbSYNxQ477CDWNkYGzycuzQtlqnnh+RhMyBAnlb6mDeIn5OdTBRTivWAUUGnH4bH0DzzwwDD//POLLD87PV48kwqOnq099thD3gGVF/m8+OKLRQFeY401pDGggSOdW221lfAIz/s866yz5JuhgtS88I40LzRwGIP0/mAw2W8GIj2UHaTfHi7fDPHY9wLxnaHYU/5UpqxhGDlypOSbUYR5551X0krPGQ3aTTfdJBU7eWzXrl3Ydddd5d2QXr5TeFSU+GP8kG6+K5sX8kD89Pj06tVLRnR4Jmm3aSNvfFN8i5oX/gl9RzHRULth4HC0PKxhkPpX7X9MXUW7Rl1h6zMUJxRh6joUfRQn6ijqU4hOJtoC2g2uu3XrJu0PCuqCCy4oowy0EYxKo6TSVhxyyCFSR9FeoixS5xOX1k+khTqSOp1nEj91J3WvrZcg8kZbR1uheaENxjDAz4ZHnvipZ+kgYjoRz2a6EsotbcDGG28sdScjqCj9pI/00wFF201dS5poY+i8of5klJd2Rut8nq150Q46ylGNEvQJNSQ0bbgYIvDjvFBfw9NwuNTT8ChzypPpRLxPDA7eI20aPfroKAsttJDkl3qcvJA3/Gnv6OhjRgJ52WmnnaStJC+knbxrPiDaC55LOVJehCUO8qxpU6K9Jjxp1O9MDQPK3LYzxEfbT3uFLkXbDp9OOto0ypzvhbzw7ghDufLN0TbyfdA+8n1hnBJP+/btJY+rrbaatJekwepZlDPvhHdJHmjzMUDjthlSYxfDQN9LVhsIn/y4YVCnsIYBPyQvDcXGKmp6zQvno6HyITwfAh82Hzv+/MhUGHzQ+MPn40KJxErnQ6Ynfuqpp5aeXH4AFDgULpQ8flqMBX5irHR6evn56IGgUuKH5CNG8aPHGGUYZZEKj95ceiR4LmnStOk1aYI0P+QF5Zy84K+EPD+CGjOkUT98wmMI0LPDD4sCP/vss0tPA/PvSDeVJAt0yA95xbDhx6SXHUub3gN+ShRuzQvX5IXnYxjx46GoUtnZcoZIBz+fKqAQlQoKLUSlafOPkcOQ7HzzzScWO/LwqWAwxihTeBgxDEv269dPGgoqQAw1GkwaOd77zDPPHPbcc0+pHGjIeKf00lOJkw/eARUNlQnpJm2UDRUjBgKNqeYD4rn6XvTbw0We3gzNg7qUOT1uGGykh+8LP/JFupiyRJzIUjHyLoiPtB199NESniFQym+KKaaQ75UeE1waOeSoTDEayAtlgxGFcYERSMMyxxxzyLcR54X88k1Rrnn/kRIGpxsGDkfLQw0DHbFU0v/W/r/UVSiUhLV1M+0QhgFtBr2z1EP4UefQps0yyyxSr6EMoThTx6McUUfRCUQ7RBrwX3311YXo6aU+ov7UjhwMCeomCGWN+h4Z2j7aF5RMrRdt3UR6aSdRejUvqpyTP8KpDPIYIHSC0BFDvav+PId2gjaMAypJO204fnQo0W6QVvKCDHUo8dD+0dNNe045UO+iqJIPyoPRBsJTf1JXUz50JtGBQtw2P/CII84LbQJ1sH0vvAd49LDTDtO2408cjH6TdnrZaRdQmmlDkKOjiU5L2gT0EuKhsw8jjlFm2jPeBSM7vAfyoO+Ftpt3Qh1PmWNsbLDBBqLzxO+G98c7s3mh3UfXoI3X74iwyNJ2YXzS5nKveeXbo9OVb4TRbN4bhhhx0H7xDhiNQi8hr7TttOeM+mAokH4MH8JiFNFmQrS3pJEONd4X7Tt6DnoKedR8QJQp+UVnIh/F2kC+KzcM6hRqGPABomCi1OiLi6/5wfhR+JD5GPkQ9MflJ0Dp5eOkAqAy5Iegt58PjQ+P3nJ6DvgZMRioSLDSUVx1eBVlGiUM65epLyjN9KJQITOcx88MoXCSXj587cklbp3/qWnTayocwiJDfvgRqcDIC2m34SkLht2oAJlXSb7ICz+ODqdR+fM8KhLKhEps1llnlZEGKgAUR3ol8CfvVCSE54eiUsPVvDDsx8/Ez80zsfoxjBjiJK1a1qSNH5Cfj/zYvGAU0MtAXjTPmheGZ6nQUH55B1QaVIb89IxwMJTIO6Bnit4H0oNBwzvAYKBRIc30FFFhM2qCckyFCJ/4qUgpH8qN9NCDQS8LfCpQ3iXx2LSRL96JNQyQ5Z3QMGk4dakE+Rb4Pihnnk9lzTMxVjAGqBipmOkFwnAjPI0V74Ay5zmUtY5cUTHyTpEnHYTXvPC9URGTF+LlvU811VTS2GE8xXmhQlfDIP534ms3DByO+kCWYaD/qrra3tALzL3WyxB1E/UWdQPKsrZ/tBMoW8zRp95k6gdtAu0B7Q+dM9T3KMrUpcSJAkudRZ2JYUDdTD3Ec2kvqJsgOsuoR1jHR33IKAPGRJw2rjFCMD5UaSMv1He0l+RP6zINjx8jzfQkn3322aLYUwfSSUf6aZ+of6lz6aRDMdc5/ISnLaINpI3RkVlkaDe0DGmDND+0ldSfdLqhkOuoO+nTvJBGiHgp6zgv6BTEYfNOeN4r5Uj5UM+TD/JDXlB+STdtPG0h6SQOeHQC0YbRvlPOtJMYfvvuu6+kk3zTTmFc2ffC9CraB947ugnTkuhQZEqvGiVa3sRLp6A1DMgL3wbvXBVuSN8j5YKeQDnz3dCG0bYxwwJdi28OAwV9g7aLtDKNiilGfDMo9OhevCumumGI0emF3sJ7s3oWeaH8c
Hl/vG/eD2WHYq950feCbsK3QD74ruJ/SF3aejcM6hixYZBH/DD8JHzI+lHzMUDc88Pzw/FR8pGioGH58tNTyVFp8uOhfPFxoKAywoAVys/Kj85z+Gj5kfmw+Wj58UgfFjIfEx829/w02mON0obyhyzpgvg5NY38ZCht/IA8g2fx86FQ23AQ13zADPei1JI+fkYsbioqfkR6mXkuhg95J680CKSBioCfmzyitPJDkRcMJsqCn8jmBZc0kHYqEKZF0SPPj09aiR9/0oYsFXycFyoA8sNPHOcFHg0VPUq8Fyp8DDHC03hhAFBhU6kRJyM1hCXvVOy8V56P8UNeqExwMSKIX/OiRBp575QReaECpdwIC2leuCYfmhfKHJdKX4e4CaN5gahQmDKmZcRIAXnh/VLu5IVKHQOId0CFT08HeaFhVkMQI4wKk/dC+jCGKGe+K30nmhdt9AnP98q71rTZvFAhaq8c7yWPkHHDwOFoeVjDIPWvWlKllrpK20CtAyCmY1A/0gbS3lEP0aGGEk1dQ53ISCvtDlMSqY/pbKH+RXEkPtpYenSpx5BB2ed51EPUSdRPEO0faUY5JB7qbKZ7YGCooqZ1E20RfG1jNS+MflBfEcbmhWv8aONQnlHUMWqo31AYUY7h0YajD1Df005SP5IGOviICz6jAeSPckEx53mkXetYiHqXNoyOONpRFFzKD6OHcrZ5Ic88E31E86KGAXVw/F64R+FF90BPIN2kgzqddNIe8l5oH2gTCUt5kg7Ckl/iITzvAsONd0O+KA/7XnBpD2mndCSBfCBH2du8QKSRb8G+F/LCN8D70vA2L4RHkScvxI3CTzr4fuCTF/Qt9BXKn/TyjfF+0DHU8MOAIO+8F9pO8q16luaFe8qS6WoYDLSvxEO54afpgngv6CbErXnhu9VrS/CJ3w2DOoUaBnzcVGD64vh5LcHjR+TnU2Ua65WPQV2IF84PBPGx40dY4obHj8s9YfmAUOgg+PxM+OFSaWl44lCyzyMs6dI4IH2mpkfl+GDx17wQjqFHKiOeZ2WUtDLVH5cwPJM4iI+0EQ4e6UXB5Rnky+aFe/KifGT0efa5xEs8mhfKzPrr84mLikDfF++Fn5pKzeZFXWTIC2kmv4TXuJCHz3M1PPkiLzwDvuYFf/x4Pv7EaZ+jrl4jT7q0nOKwxKnvhbyQf1wqfXo2CGPDaxykBznei34fxMW3xzNJm5YzeSV+zQtpJqz6kRf8kbXPsc8jPHIaN4R/HJb4SZet5PV7w7VEReqGgcPR8rCGgf6vqX8W4t/HMODa/vu4to6knqVe0foGPrLUD8QNDxce4XA1Tvy4Jiztj32WPs8Sz9O6CVfrPkjDE4e2p5o36iuUf+orGxZX00wdST1LvMipv9Z1mhfqT/KhRD1OWIgwyBPe8olHnwmRbq2vIa2vNYy6PEvrfvKheaH3njLQtFsZiGcjp+lWf9pleMRj86LvRsvH+qmRZeO31xDx8TzKHblUusgfbST5Jk0Q6VHDABmNT2XgETdymm4NQzxazoQjfsqGNPMObF4ob80LcvF3Y/NDeH03lIvGo2FxySN55XmkSfPDdUyEdcOgjhEbBkr6Au09PyI/Hx8kHwMfHR+TJfsxcR3zSwkLKQ9KPceSxpGKB1mIj58fQ/PCR87PR17i+PU+K33KVx7huY75WeEt2WdrOKVUeHgoznFeyAeGDj9cLKNyGqeN1/JjXioNqfAxaZ6y4lCCTyWmFRZ5wSV/jDDE78XK2bjj94ULpcLaeLL4WWTjSvlTYZIfzQtk/x+9hojDDQOHo+WhhgH/t/1H438W4h9n1Jd/PKvesPWEhsFVvpXTcDFfw0LKyyIbVsPH7YrWq6pMa17oVValMFXf2rjj9Ckpz6YhDpviF3seZP00PMqmNQw0L4wyo7imnoGblQ7lW57eZ4WN+VCcn7ywSrTXtN2xkUMHLO8L2ZQc8Wn8Nu4UT9MQx6Vh4/B5lBee90KaMWx4J/b/sdd674ZBHSPLMEgRLxzDgOEpPmhIrcJSqVQZVXap0Cp5jpLKqmFAfOSFHxHDgLzwkaZkSiUNX0k6y5XhB9ReD5sX8kF+uE/JKVWat3KoVBnC2ffCe+C9kz8d4k7JxVRJnoibZ1KZp8KkqNhzbE+S/jNZROXqhoHD0fLIMgxi4r+mrsIwoC0st96plCp5jpXhmjbCKm3khfqK0Q8U0VimFGpq/it9HulnRCbOC6PMtIs2Xr2uJK2Vpq8cIv203bwfriGMHJ2Zkdc+NUeeLBWTJe2kme9J85LVFtL+MmXJDYM6hTUMUMqUeKH2HuKFYxgwZYUXm3rh1SDipicDRTfrwyqXqBRVmSYvauTQy16tZzQH8QNSKZIXKhPSznvhnejiq5RcPRLvWd+LfmPkSQ2DWr0XnkuZ8QwqYSrfVLgU5X335IOhVM0L6dfvTa/1HiXEDQOHo+VhDQP9P+0/bF01DKg/ymkDywlbbdL6jnqVdk/zRl5oZ1GqU3L1SqQdZZq86HtRw4D6tyXLulyK8wKRF52ZoQp3IxB54RujTbX/jb3G1Ws3DOoYWYYBCpq9h/TnU4U99q8GES8/CduAWUW+KUReUKZ1jiU8Pl5+Pl2wFMvUK1EeVCSaF4hKn3fCUKqt+OudyIu+F003LhUiPVm1fi88g8WCpKEazyIftvHV96P+eg+5YeBw1AesYWD/V/1n7b9Lm8SGF7SFtWoDa0Gkl3qVdo97zQuLe23HTCMQ9SsdYbR79r3Q0UP920jvhbzQOcl70bxgsKFnkUdVuBuBtO0m/eQDnuYpDss7YvGyGwZ1itgw4OXqy4xJDQOUacLxY1aTeAY/Nltksj0WH1i1nsMHyzQPzZ8aBhDXKZl6JNJPhaHKtOaFyoV302h54b3YvMDH8KHBqmVeeJ5WwGy3SsOiz6+UGP1Qw4C4YlI+LlPC3DBwOFoesWGQRfy31FXsJqRtSaoeqEeifqOupQ3XvJAHRj/IU0qmXok80N7ZvKhhoFOMUnL1SNp24+p3Rl5ol+Bzn5KrR+J90HarYRAT70Wv0TXdMKhjWMPAvkB9iXoN8cJZFErPNB8yHwJkr1O8lH/M58NCqWI/XrbdorJKhY3jje/1Or7XXnaeQ17gkQ+IfFmZOB7lx/64+kPEMvbeyli/YjIpP55HhYESavPCiAEVo82Llcvipa5jmbz7LBlNV8rPyvBebF7gYfjQYBV7L6Xcx3LxNc9m6zWeR0MZh0/JpPxwaZQgzUseuWHgcNQH1DDgn+TfjNs9Jf5r6gu2R8a1/751lex9ll8pYeL7LJk4vPIg6jZV2jQvtIfUe+TFhrduzM/zi8OUK5PHV4KP4kx7Z/NC/nR7zjiOOC57z3V8H4eJ71N+MU/5KX/LJw/khXv9zsgLehYdsDasdZXie0ulyKT8Yp7ys66VeC+03bialyxyw6DOERsG+uL42VTBUZePmA8Wa5ZrPgB1LVleKf76kdFzyzkIjBZkhc/iFbvng1UFFOJ55APiR0zJxDy9pvLBiEFB56dW/zhc6r7YM1K8+Jpn8nzNCz8llQgV
Y6l5ie9T4VP8LBmu6ZFCMeZeh6hTpHK4vBfyQh74xnApVxqsOLylmFfuvfIgyo59nDlwRys49Y/D2/uYp/nXvNh/x7oQ6xrcMHA4Wh6xYZD336phoPVcXAdUcp/ilSITU54MdSx1LTzND4YB06Ly8hLz8p6RxStXppg/bZwaBvpeyAtTQzWPqTigmG/vUzJ54VP3KV7ePXmh/YGneYFHJ592WsbyUDFeKTKVhMm75x3QdpN+8hH/S/YaXdMNgzqGGga8JF4aSk2K8OPl88FCXPMBWIp5qXvL46OiUmI6D4dlcQgIp+/BtzKxa+OIST9Uy9MPlspRrVzCYBSQFxtW47dxxNcorVRAHLTFIScpmVgOsvmCrEzsxv6WR3lRbpoX4oVHxUhes+LQa+sfh4nvY9eGsfekBcMEw5F7Dh7jWvNM+FT+qeA1L3xjuLwrHa4v9lzuLS/2tzzr2mvKDKOUA2nIQ+xv3VjeEt+E9srpv2T/Kb0mj+zw4IaBw9HysIaB/qtZRBvC+jc6AeJ6IOva8mK+3sdu7J/HS93HPOpYq7SRF4wc6lnqrFLiiEn9rZsnQ72YklH/lKyVUSLdtHdc63uBR2fK9msFAAD/9ElEQVSUdtTZ8NwrL3Zjf8uLr/Nk7H3slwqj97Q95IVrzQs89BL0kzgejSOm2D+Wif1jnm2bU/72Xnlxe847oPxJP/mI28C4LaQz2g2DOkWeYaCKJy7Ey+djpUef6RcDBw4M/fv3L5k4dS++50ReTk/khEVOPeSkSBtOr2PZUogeYE60ZYEp6ebD5WPWvGCRY4hwejGnG5b6LE6aJJ2cgMwJwISvJH3lyJAXplmRXhRuNXIgflJ6HfDjdGJOME7FXUkaSyXi5mRMThrmfvfddxdDj6PT47DkhePi+Y6oAGm0eC98a+SFBoy8XH/99Zl5qSYRP0YB75LvkBMhU+m2pGkiHCdCMv0NowKFQb8x++/E/5MbBg5HfUANA0bx+D+V4n+X/5q6CmWak9P574vVEzGVW5dVUvepDC71J/UonVkobbR7mhfaEPh0ylEfV5q2UuTiMJU8izaa9o00097RVmheaNvpGCMvtJP2vZT7LChLppS4SglDXuiMYsSGvOh7gcgLbQl5QX+hvYzl9RlNyVs5snlhyQs6FP8FRg6GgeZF/x3rQm4Y1DmsYcAL4ydT4kXaaz5elDY+2H79+oVzzjkndO3aVeiMM84QogcdF55eW571O+WUU0KHDh3kmG0USRRt+FbWylg3K057361bN1HgqRgxAmLDgB5elFM+epRYldc44viUx3QnrlEiOeo9JRPzbByxa/0tz4YjL3369JHKhLzw82nFiEuvDwYQlQjGVvxMG5/6WZ4NG4fj3sqkwuJquUAco9+pU6eJnoELkRdGWzAuMXLseyEvKNdUMhieGGEcXx8/K04Prl4raXj1V14sCw869thj5Qh8jvvv3LlzURlc8tK7d2+pwGmYaGxtg6X50msIf7bUc8PA4Wh5WMPA/qeQ/rvqUlfRmcH/TifI2WefPb4u0HokriO4j3mxi7+VVRn1j2VsfFbGhuOe+hPFjVEO2ghrGJAX6l/q4V69ekleUnHY+/jZKVfJ8pViP4j4bFjl2XvaANo12gQ6jWi747zQ+Uf7SNtCvZyXl5inpPwsmdhfrzUcbhzG8vReO83QPzQv+p3pKAJ6C21+9+7dJ0pH7NpnZ4Wx4fRa+TZsLBfHqXLKg8gLOhR6oY4YkA99N+pagvfuu++6YVCvUMMA6w3DAKUli/h46THhB0QJpdeEjxcrHlev9T7LHTJkiFj1xxxzTDj44IPDTjvtJJUSuxGVGoe6StwrKY/nUIlQkVMpUnnEeUHJptK0edE4UnFaPxTJo446Sp5j/W2YUt0sPyXNCz0JNi/8ZDYvzH8tlpeYp8+w/Ng/xbNk/amcN998c6n4SHccHh4VIoo0hqbNCy7KNRWj9szZ8oVsGqwbP8dSloy6pJnG8eSTTw577723vFu+USufciHywjB2nBfNj82bum4YOBz1AWsY6P8ak/63KD3859RdjBKm6jfuLc/ex64NE7tKqTDWL8uFqD+1J5c2grZC8xTnhfY3FZeNz94Xcy3Bs/z4Osvf8sgL7RvtXCovKNi0j7ST9r1YN8XDVUr5Z4VVisPYe+XptVKcF9JvvzHaePQW9Kxi7yV2Y9KwSspLuTHPkuXpNeVM+tCh0Avp1OO9xO2dkuW7YVDHsIYBVpx9iTHxwlF8ePlUKFiHDIOVQ8hA9OR37NhRpm8wcsBHVUl8eUR8pJP0km5bkWhe+CnLzQuVLJUpIxwYNJWWRTmkeaHCQHGm8qjme6kGUclBlM8666wjvVGptMCD1MAhL/bdcM97oaJvzryQ9ssuuywceeSRYZ999pFeEHipsEqkL34vNi8pwt8NA4ejPlCKYaDE/81/zv/eknVtqUT6qEdV+YzrWc0L4RolL7RzcV64Ji/aBqbk64lIo+YlSzfRbywlXy+k3w3pTOUli9wwqHNYw4CXpT9cFuFvSRU7dWOK+cTBHEHWFZx00kmiWDPPjo8lS6bYM6A8Gb2uVl4wYuhNZhoRRo1Ne0z2+Xpv3SwqJpPKi81Tsfih1DOKUZYMz6bxoeeJIcUVVlhBetLjskk9x+ZD86BuqVRq+i2lZOhdY7j0uOOOk+Frem00D6U8I86LJeLBJRwH17hh4HC0PNQwYN1P/M/GFP/vSnl1Q+yn980lAzVSXlJ8eJZfLD/FnqFk/UuRif2aImOvU3lRNy9uS6l48ygOU4mMJfWL85Ii2kI3DOoY1jDgZZXycVRKxI9licLFFCJGDBgqU4OkEYiPGqOAhdLrrrtuOOKIIyT98FPh2wrpD8+cVqbiLLnkkrI4j1GoVPh6JYwb3i/blzKdiJEtvtlU2EqJf4yTH90wcDhaHtYwSP2vTk5O1SV0BTcM6hhqGLD4mJeFYoTiosTwnFIW3/ql+Lh8DFyzUJOFxu3bt5dFUSjVVs7GY/kp/xQvSyaLZ2WtX5YM+WCxzUYbbRQWWWQRWYCsH3tePPY65Z/ixX7FeFbW+uXxivnHfOta0m/n6KOPDosttlhYcMEFZaEYhkFefDFf77P8svipa73Pk4n55IMhXHYmIi98q+xYZN9vKi6lmJ+6ppw44MUNA4ej5WENA/u/xv+z8q1r+SkZe19NGctTvnVjv2rIWF41ZbJ4Vtb65fGK+ad4sV8xnpW1ftWQsbxi/jHfuim/FKXiU16Kb/1jvyx+1vU777zjhkG9gpfy22+/hffff1+UH5QWfXnFiLClhiduXZS6zDLLyNoC5tjBT4W3VE6aoHLSpVSqDOllkeoGG2wgCvCBBx6Y/CGyqNx0Qc0hUyx8yt/yKBfmGNLLvvTSS4cFFlhADAMMvyyZUilPpli6UlRMhrywiI2dF/bYY4+w1VZbyfQoRryyZGLK8oMPuWHgcNQH1DBgjUHqn1VK/dOV1gEpfhYRvrlkUvwsas50VSKT4udRvcoUC5/yr7ZMtZ/hhkGdg5fy+++/hw8//HB8bykKXjWJj4EFqYc
ddlhYaqmlwpprrinrClSZbCTio2ZvaPIw7bTThr322ks+cvKYCt+WCCPghBNOCPPMM0+YeeaZZas1lOlU2Hon3jM7R/B+F1544bDLLrvIN8y7ToUvl5hK9N1337lh4HC0MPgHv//++/Daa695Pe7k1AxEO8r0dTql3TCoQ/zvf/8L//znP8MXX3whO6Uwn5odd1ggjPLeVCIeFnCygw97xM8999yhS5cusoA3Fb7eiW0pGTFgGtH//d//yVQTeperVV6NSuSfXYjYzWeqqaYKf/vb3+TAM959KnwjEHliP+w11lgjzDfffDJtjPcPpcIXI+T47jGgqBR/+umn8J///KfwJzocjpYA/yD/oq6z4x+t9B93cnLKJnQldEx0zc8++0x0T3TQ1oqGNgy0Yvzkk0/C2LFjZa7lSy+9VBUiPqYQ7bnnnqJcMTXj9ddfr+ozmpMYbmZtxKKLLiqGAVOJyEuj5qeaxNAghsEkk0wihgGLj994441k2EYhpvxwkB9GLVPgONjv1VdfTYYtheiVZNoeUxcYqWvNvSUORyOAfxAFhX+Sf5N/NPXvOjk5NY3Qk958883w6aefis75r3/9yw2DegQvBeIFsQj5xx9/DD/88IMMrTaFmCbB/DEqWk6RZftK9rb/4IMP5IPAPyVX70T5MBS2/fbbh+mnn172vIdfjTJrZOJ9/vHHH3IOwCyzzBLmnXde2boU5TcVvlGI983C/FNPPTW0a9dOFp1TqcFPhS9GfCc///yzlBUGeWuuFB2ORoFOqaXNaut1uZNTrYh/i7aTKUTonK29Y6xhDQMFCgovCWWlGkRcKD+sWdh///3FMOBkZcCczpRMIxDlRJ4YAZlxxhll5xq1elPh2wrxvikDRodmnXXWsMQSS8hpiCAVvpEIPPPMM7INK1PhOPgMVPq/aFk5HI76QbXbQCcnpzS1lTaw4Q0DwIuqFgEWWKI4s6d9//79hZcK20gEUHhXXHFFmTLDiIHy47BtiQA9bp06dQpTTz21jBpwRgVIhW8kAhh/V199tSygX3755WXtBEiFL4UcDkf9IfWvOjk5VZfaClqFYVBNfPnll+Hkk0+WHV0222wzmY7RWsDi7LXXXlvWGLDtKtavI4Rvv/1WFhzPNNNMMs2KLT5bE5hmwBoDDrZba621ZJqcw+FwOBwORww3DAzoOeb02NVWW0161tnhgeGj1gCsXRYf02uMYXDQQQfJwjVHCF9//bUYSowYTDfddOGqq64q+LQesJsCW5cyCsZBd4wkOBwOh6N5wZRkncbrnXOOeoQbBgbsBc0+/xwA1rNnz1a1VztGD9Oi5pxzTjEMDj30UFcOC2DEYMstt5RyYVeiIUOGFHxaFzi4beWVVw7rrbeenMXRloZGHQ6Hox7AJia0OW+99ZbsFuhnwjjqDW4YFMAUIg65mn/++WXf99Y0hQhQ+fTt2zfMNttsogAffPDBssjaEWRnIjUMmErEnPzWCHZW4OAzvnHWmHz00UcFH4fD4XA0B2hvvvrqK9kXf+jQoT5y76g7uGEwDkwX4hCLTTfdVIgFmq2xN51KCKVQRwy8p+JPMJyLwqwjBtdff33Bp/XhnnvuCUsvvbRMlcMAcuPQ4XA4mg+MGLD15dtvvy0n1ft0Ike9wQ2DcXj33XdDx44dZb/30047TSz61gimksw+++yiAHfo0MGVwgKoqNu3bz9+xICtS1sraJAuvPBCWUeDEczZFg6Hw9Ha8Muvv4YXXnw5PPLYk+Gxx5+qG7r/wdHhvvsfCrffcXe4+ZZb6yd9TzwVxr71TqH0HG0Zbd4wQCnkpNtdd901bLPNNuPPLGhtYD756aefLotrdcTA1xj8CaaNcYidjhi01jUGCnYlOuqoo8Iqq6wSjj32WFl87XA4HK0JX339Tbh08NXhvAv6hD79BtQpDUzwWoa6n9crjLz1jkLpOdoy2rxh8NRTT8kJx6wroKe4tU6vYbpU9+7d5XAzFOADDzzQ5zYWgKK84YYbSrlgOLEzVWsG3wLzW1lnsvPOO4e777674ONwOBytAxgGlwy8PIy67c7w2Wefh8+/+MIph87v2Tfc4oaBYxzatGHAAqCuXbuGrbfeWtzPPvus4NM6wTacnICLAswJyIyWOEL49ddfZcSIcplmmmlkz//WDo55ZzH6IYccIud2vPHGGwUfh8PhaHxgGFw8cHB44KFHChxHHnr0uijcMur2wp2jLaPNGgZMo2HB8eabby5TiDj8q7Vv38iIAXPoUYD33Xff8PPPPxd82jY++eSTsNFGG403DAYMGFDwad144YUXZG3NFltsEXr06CEGksPhcLQGqGHAnH5f4Fscbhg4FG3WMPjwww/DHnvsETbYYINwyimnyFaOrR0XX3xxmHXWWUUB3n333dtEnksBW9XusMMOYdJJJ5VDzvr06VPwad1g2hw7VW2//fayXStrK9w4cDgcrQFuGJQHNwwcijZpGFBJoPyxCxG9pQ899FDBp3WDHZdQfDEMmFvOISuOEN55553xi4+nmmqqcP755xd8Wj/ee++9cNxxx4U11lgjnHTSSTK9zuFwOBodbhiUBzcMHIo2aRh8/PHHMlrA9JHevXu3mUW45557bphhhhlEAd5xxx3DN998U/Bp2/j8889l6041DCintgSm0TFisvbaa4vxiKHkDanD4WhkuGFQHtwwcCjanGHA9IlOnTrJguOVV1453H///W3moC96wt0w+CuYSmQNg7Y0YgB++eWX0KtXLymDVVddVdbe+Fa2DoejkeGGQXlww8ChaHOGwejRo2XB8cYbbxzOOecc2cO+rRz0RU84+/SjALMLT2s9yK1csBuVGgZMteIAsLYGdiU68sgjw+qrry5rUfzwO4fD0chww6A8uGHgULQpwwAj4OyzzxajYJdddpHFt619JyKLyy67LMwxxxyiAPt2pRPA7kw77bTTeMOgf//+BZ+2A0bN+D622morGTV47LHHCj4Oh8PReHDDoDy4YeBQtBnDAAMAhY9DnRZbbLFw6623FnzaDjjHYK655hIFeJ999gk//vhjwadt4/fffx9/jgFTrdqiYQA46K1bt24yasCBf0wxcjgcjkaEGwblwQ0Dh6LNGAavvPJKOOigg2QXog4dOogy2NaAYcCIAXv19+zZ06eLFEBvuRoGq6yySrjzzjsLPm0LNJ5XXHFF2G+//cJ6660XHnzwQV9r4HA4GhJuGJQHNwwcijZhGLDrEIsr11prLTnM7Lnnniv4tC0wVWTGGWcMK664YnjzzTcLXAffB4uxMQyOP/74Nj3F6oMPPpAyYEoRBgI7eDkcDkejoVaGAWe9UC+W07FG51Mp05bZLpozlv7zn/8UOOXBponnMU1WDzLlnniz0uGGgUPRJgyDZ599Nqy00kph+eWXl2kibfVgr0GDBsnJxxtuuGH46KOP2sxuTMVARc8hXxgGLNBu61NomGaHAb3aaquFO+64o8B1OByOxkHKMEAptu2eVaRjpdq6FpyUz851b7/9doHzVzAjgdFWlX/mmWcmmrprDRWrrDOz4fHHHx+flr
x0WODPNtOvvvpqgfPn2jk2kmA7asBz8P/HP/4h9zHcMHAoWr1hwA964oknhqWWWkoOcGrLPaDXX3+9nHzMdJnXXnut4l6J1gYqVc61wDA46qij2rxhQKPEGpTZZ589dO3atcB1OByOxkHKMEBZv++++8LIkSPDTTfdFG688Uap72gPuWbXQkaMx44dG6677jqZWvnoo4/KqDKKPXLXXnutrFV8+eWXJU4LngP/nnvuCddcc40o+myHfcghh4S+ffuGhx9+WDom0UPYFhriORgZtDtcP/DAA9I2Q0xrvfTSS8Pw4cPlMEraKtI2YsSIcPnll8tz6NiC2Fjl3XfflXQQ7sknnwynnnpquPLKK4UH6Oghn6mp1G4YOBSt3jBgnjQ7zhx99NHh008/LXDbJqjQZplllrD00kuHF198scB1UInutttuYhhwIrQjSKOyxBJLhMMOO8xPyHY4HA2HlGGAe8MNN8hp7yjJTz31lBgGDz30kGzXfOihh4ann346vPDCC9KRiKKOQs+OhkOGDJE2lB74LMMAhZv1e2eddZaM0GNw0HNPeHrvqVcxABh1YHSauDlLiXuMFtYB8gwMEdqlwYMHh2OPPVYMEtKCcTB06FAxDNgool+/fmIUkD7WTmo+v/76azEc7r77bjFQFLT7PDM1auCGgUPRqg0DflJ+HhZSYnW3ddD7MO+884YVVlhBeh0cE0CPDobBgQceWOC0bdAocRAgZ34wbK7zVB0Oh6MRkLXGAEUcfYCRAeo55vSjRN9yyy1yhg3tJAr4gAEDpGcfF4UaBR0FnB5/6sYxY8YUYpwAlHS2ekYZR/egdx706NFj/BQhDADOEMIAIKyuP4AYLWBkX+vbu+66S9L2xRdfyDRopkUj99Zbb4lxwHOIj3WTrA1TcM/OcoyEH3HEEeMPM8UgwCB5/fXX5d7CDQOHolUbBowWsBMR2y8yx6+tg6FTpodssMEGvlVpBBbaYhjQ2+P4EwxLr7HGGqFdu3YylE3D5XA4HI2AlGHw22+/iYLNaABKO/fUcxx2isLMCCmKOCMI5513nowcnH766aJLsPaKkQY2Mtluu+0kTAwUb+QwPPr06TPeMMDYYBRh4MCBMn2Iuf6nnXaatDdMN8I4wOigE+bkk08Ow4YNE+MAo4SRBhT9Ll26SFoZPTjhhBPC3nvvLRuKMOWIraYZMVBggLCOEIOEKUY6dYjnkn6eFcMNA4ei1RoGDNcxP5rdZvihsOTbOqhQ2K6U3ZnogXBMAL0rGAY0Ao4/QWNKw8NidRqrtj4Vz+FwNA6yDAOUY6YQ0fuuCjNKNKMCuCjZ7M7Gzn2ff/65hMcFKP6EY0Qh1YaipKO8P/LII3KavB2pIB4MDZ5JfCj7TC0iTuQ4gZ80MZWJjkzC0bOPEs9UI6Yloceg2xBP7969pbMPowI+6wlYgKxgJIE0sH4BQwFgqDAliXKI4YaBQ9FqDQMW+TBSsNxyy/loQQH0cMw333yi6LX1BbYxWHSMYUBl65gAGiVOyeY0ZBbi0YA5HA5HvSNrKlG1gLLOSAAjCdBtt93WLNuAM72JBcWMPmDIKDAAMCqyQBlgjGDUpOCGgUPRKg0D5gAyXLb44ouHY445ps1uTxqDHRcWWGCBsPbaa8viJMcE8J1gGJxxxhkFjgPQmPAvcSI0C7N9CprD4WgE1NowYLchFG2Uce3lp/e/Fs+yIH4WJ6eQN92zWLrcMHAoWqVh8Pzzz8t8OxYL+UFeE6CGAb2/bXnb1hQ6duwohgFzNx0Tg1EDti9lpIkhcofD4ah31NowaG1ww8ChaHWGATsNsE0YawtYyMPqf8efUMNgzTXXlPmMjglgURmGAQvCHBODXih2ymArP6Zc6Q4XDofDUa9ww6A8uGHgULQqwwAFhkU59G5uu+22ogg7JoA9jeeee+6w7rrr+t70EdiNAsOA3SkcfwVb+nFQYPv27WWHIofD4ahnuGFQHtwwcChalWHA6EDnzp3lYCbcr776quDjAOyjvOCCC8oaAy+bicEUIgyDM888s8BxWLDomP2zN9lkEznzwb8fh8NRz3DDoDy4YeBQtCrDgMU/yy+/fFh44YVlhwDHxGBrtIUWWkjmivsi0onBdpyTTDKJHHDjSIOt8zjXYMkll5RdOBwOh6Ne4YZBeXDDwKFoNYYBe/5edNFFsnvKXnvt5XPoE2AHBXZq2myzzWQthmMC2I1o8sknly3gHGnwzZxyyilhuummk3Mf7FZ5DofDUU9ww6A8uGHgULQKw4C1BezRz7kFLK5l8TGHezgmBj2+Sy+9tIwY+BqDicFC9SmnnNLPMSgCRp1WXnnlsMgii/ionMPhqFu4YVAe3DBwKFqFYcD85759+4app5467LvvvrKXsOOvYBvXxRZbTKaD+Cm2E0NHDFib4sjG999/H0466aQw88wzy1H7HNvvcDgc9QY1DB58+NECx5EHNwwcilZhGHAM+KabbhrmmGOOcPXVVxe4jhic0siuRGuttVbyOPe2DM698O1Ki4Oj9FmEzCL2ZZddNtx77725h+o4HA5HSwDD4JJBl4eRt94RPvr4k/DJp5855dD5Pfu6YeAQtArDoH///mGaaaaRqURjxowpcB0xHnvsMVHo1l9/fT8NOgK94IwYMKXIkQ9Ozd5jjz3CpJNOKrs4/fLLLwUfh8PhqA988eXX4aKLB4XTzugWzux2fotTV3F7iALef8Dl4eJBV4RefS8JZ3XvUfBrWTp1XDndcOMthdJztGU0vGHAseQspsUw6NKli68tyAFG06KLLirnGPjBbxPw73//O3Tv3l2+oW7duhW4jiwwX/fiiy8O0047bVhttdXCiy++WPBxOByO+gAbkrw59u3w/AsvhRdefLku6LkxL4Q+ffuFtdZeL7RbceVw5FHHhCeffjYZttnppZfDBx98WCg9R1tGwxsGt9xyS5h11lllMST79Duycf/994d55503rLPOOn56rQFTYTp27Bgmm2wyP8egRLz99tuyiP1vf/tb6Nmzp48aOBwORwlggwumrUK77bZb+PXXXws+Dkd9oOENAxZAsuiY7ROZ4uDIhp7zwOJjX6A9AcybZ/H6jDPOKKNO//znPws+jixQZmeddZaMGnDK+JtvvlnwcTgcDkcKdEINGTJE2hoMg1122cU76Rx1h4Y2DD7++OOw3nrryYFmbFHqyAcjBiussIJM/xg7dmyB66DHhrUFnIFx/PHH+xkPJYJzMbbaaquwzDLLhIMOOkh2LHI4HA5HGuygyDRM1mdhGOy4445+iryj7tCwhgE/2AUXXCA7EXESKzulOPJx5ZVXihLXvn378McffxS4js8//zx06tRJpsUMHjy4wHUUA2sNzj333LDEEkvI2pWHH3644ONwOByOGOgtvXr1Gj+VaIcddpB1kg5HPaFhDYM33nhDTvGdYoopwoorruinsJYAlN6FFlpItnb1cwwmgIr58MMPlylpzP9kMbKjNNx9993y//EfMp2Phs/hcDgcfwVTiWiHdcQAw4COKYejntCQhoGOFqCMQCeeeKLvRlQC6N1lu
gyKnG/rOgGffPJJ2GabbaSiZiqRr1UpHX//+9/F0MSo4tRxTkZ2OBwORxqXXXZZmGSSScYbBn6mkKPe0JCGAQtn27VrJ/vOcwLrPffcU/Bx5IGFtTPNNJNM/XjwwQcLXMe7774rW95SWe+1114+mlIGWKjNoXDsduW7OjkcDkc+Bg4cOH4qka8xcNQjGtIwYNEjq/rZppTtvr799tuCjyMLDGGyiwxrMlZdddXw6quvFnwcnJxNzw2jT0cddZTsuOMoHZyozXbB7FDE+hXfutThcDj+CtrhAQMGjDcMaHd8VyJHvaEhDQMWGs8zzzyyG9H1119f4DqKYdiwYWIU0Evx888/F7gO5ngyzYrRpyuuuKLAdZQKhsLZlYhRg/nmmy8MHz684ONwtE4wndXJqRxiswbcG2+8MSy99NKyYcMxxxwju7lhMMThnZxqRXyLeWg4w4CDleiVZC/+5ZZbzk9dLQMseuIcg6233tr3nTdgjcFxxx0XpptuutCvXz/frrQCvPTSS2J0Mk0Nw9NP1na0VrCejSl07OzWUkQaUvwsKje8U+2IEQI2T3n55ZfDhx9+KCc0Z72fSt6bfxvVodZc9jwXYzQLDWcY9OjRQyztBRdcUKYR8VM5ioOddtg/eamllgobb7yxHHbm+BPsaEWPNwtoOc/AF4OVDwwBDuth7c9ss80WnnrqqYKPw9F6QE8bU+V+/PFHaXtaijBMUnyn+iV9Zyhm2mNL722136V/G9WhSsqxXJmWelfMGOHby0JDGQYoH9tvv73sN88OKD5loXTwIXTs2FFGWTCo/OTjCXj88cfDHnvsId8VB+X5dqXlg0PiTjjhhDD99NOHWWaZxf9NR6sEjSm7ltF5gIHAeiS+faX4PkXlypQSJ2TDlSpjqZR0FYu3OWSKxVeMVL6SeJpDphbPSPlXIlOMYpli91k8S9VORyXxpajUeMp9dhwm716vi8Vr/am3qMMwULNGDRrKMGArxGWXXVYUOHonXYErHXwAJ598ssxt3HzzzWWbScefYEh3v/32E6WWHXYclYFFyPybbAqw//77+0nIjlYH2hzWJDH9kCmHdLjQ0FpXr+197K/3sYx1UzzcrGvcOGzKL+Wf4uFa/xQvz9+6Mc+S5ZUiY/2sv+Vbf6XYT8OnwiplycTXSvY+Fc5ex+Gsvw0XuykerlKWP27My7q2rqU4nL2P/VNhYje+jnm4sYzlWUr5V8LDzfNP8XCzrvN4uLF/iqfXNoz1S/EsWR711meffSajFQ1vGDAnqmfPnqK8cbAZi0UdpQNLEeWXXYlWWWWVMHr06IKPg61b11lnnTDllFOKwcm35igf9EJsueWWso0wBvy9995b8HE4WgeoG9jOmBFXphNBNLRK9j72y+JnhVNK+SvP+mVdZ/nlhYnvs2RiSoWrtkwqXCp8VnxZMln3Kdms+xS/XJmUfzFe7F/sPsWrlzhiyosjS7aaMqWGtzzrl3Wd5ZcXJr7PkonJhvvhhx+kY6NVGAZsKbnRRhuFqaaaSubJ33///QUfRylgS1dOpkX51d138haftCU8/PDDYckll5RzDDj8bezYsQUfR7ng8B624Zt77rllelbePEaHo9FgDQNGxGhk1U1Rll+eTIrKDQ81l0y5VGm68uRSfpU+J8XPo2rKlMvPo3qVyQuf5VftdFUSX4qqna4sKlcmLzxT8luNYfDEE0/IomNGC5gn73v/lgcUtG7duskBZxxExbQin4r1J1h8vP7664tCy6J2psQ4KgNGFmeMzD///LJLEQa9w9FaoIbBBx98IA0sDXCWq9dKyrN8ex/7pfxTPL1Wiv1j1/oX48XXWf4pXjH/lF/qXnlKqTDWL+Yp3/qneHqvFPuneHpv/ax/fJ/ix9fWVdIw1t9eW1cpz7+YjF7n3af48XXMi/nWjXlZfOta/2K8+DrLP8XTa6XYP3atf4qXcq1/MV58neWf4tFJ3CoMAyrjPn36yPxlpsJ06tTJe7vLBCvRL7nkEjGsmOpxyimneG9uAfwwLMhmxADD4KGHHir4OMoFcxj33ntv+cY4Z4T/1uFoLaAtYn4uhgGdUzSySnqf4sc85aeu9b7aMnl+WbzYn/tyZYrxUn5ZlCeTxWsuGXVj/1R4yy9VxoYrVcbyU/5ZMhB+sX9WeOVnyaTkrEzKrxKZPF7sz321ZfL8svixP/fF4mmKDFN+W4Vh8PHHH8ve+7PPPrv0eN91110FH0epYHTgpptuCquttpqc8Nu7d++Cj4MV+507d5btSuecc87w3nvvFXwcleDqq6+WU5AxDDjTAGPB4WgNUMOAzRu+/PLL8NVXX/2FaHhT/DxqDpm2ni6ntk2t6Vtuigz1VqswDB577DE5mIsRg5VXXtl31KkAGAa33XabTO+gZ/yMM84o+DiYGtC9e3cxOtlq0894aBpee+21sMIKK8iowdprrx0eeeSRgo/D0diwhgG7E7FtKURja8ny4utUePWzruWnZCzPylpS/zicXuu9Xlt/Gy7lxteWsmRwUzKWF/uXIhPfF5OJXb1OySilZEuRsf7FZJSfktH7mGIZDV8tmTiM5afCx36lyMRhraukYZUsL74uJhP7Z4VXvyw3JWN58XUpMva+FJnYja9jHm7DGwb0Nnbt2jXMMMMMsvCYKTCsrnaUB3Yluuqqq2TEYNJJJ5U954sdi91W8Prrr8v0NBZlzzXXXOGee+4p+DgqBQvdGZlaZJFF5NA4h6M1wBoGuBgHTm2TULBS/Hom0tyI6XaqHlFvNbxh8O6774YttthClFmmetxwww2+vqACYARQduy6w4jBqaeeWvBxsFL/xBNPDNNMM02YZ555fCvXKqBfv35iyDNqsNdee0lj5HA0OtQwoF2icWW0sTmokmc1h0y1n5HlV22ZFJUanvfP9GZ2pir3GVAxGeKHuE6FpS5lrjhr4+gBjv2hlFxWuvW6WLpSVK5MJc9oLmqO/EMtXc74QQ1tGLDLSbt27URpY1rC888/X/BxlItHH31U9utn9x16yB0TwJoLzshgOhHnGjiahueee05OJ2f6H9sM+/bCjtYANQzYbYvdzFC01K0lVfKM5pCp9jOy/KotkyINj9KEwo3izbzsOBzKFfUZp+TzHXAfh8mjvHTxXJ0TzmLR2J+0MVVz1KhRYciQITLNWhV9wpNmSI0ASxgUrM9kDdhbb70lMvA1PeQDWYwOrkspv0rKuFwZqDlkssJrOaWo0fIC4UeeGtYwYFEoW2yyE9F0000XDjvsMPlRHeWDEQMUXtZoYBgcfvjhvl1pAUyzYo0BhgE0bNiwgo+jUnDSIjs9sXUp04l69erl35uj4WENgw8//FAIxYzGFlcpvreETMo/S6bSZ+DWWgbCv6kyxcJDlcrk3ceEP+XA6Dqj6ky/ZdoYCpX6ozRffPHFsrHCK6+8IkoWfEs2vtS9Jcu/++67pdPuqKOOCv379x//bPxwn3322XDSSSdJ3brvvvuKccJmGYxgcQDs6aefLjoT6+Q0XfpsjI0LLrggHHjggWHMmDHj86TPf+qpp2QTjgsvvFA6YNU4
EikUgkEiuAUAxsriybyPlyvzbh+j6ug2qeoF6auK/T1PGRps0r0rR5RlidR1Ckqe/9tnnUVKep4xdJE9dtG2qqeeo0dVxQyzMlTU11Pdp0NQWf37iu+cfSRXgvTY/qeGla3l7aSFOXEXF1HnV4XLdp2nqiyKPlCb42TR1Xp4lrv20eNW8d34bFfS9dW482j7E0ce23lybue/F1WEtjacbKiLBe+NQ0dXxbLwpCKgaJRCKRSKwAasXA5utlw5p6YUGxcdf3vd+aZqXp8Udce1+nq+Pa+Ljv/dbUpqnD699eXI/G+Kemiet5aYLatFPS1LQU/kXLWU6aXlyPllNGnWZK+pZnqWlmpWvj43o5aXpp27Be+pqEzcpnSpqWd0qaXrjfVAwSiUQikVgBUAyOPfbY4pL3BZeavCBY/7bXPWrT+F00TXs9Ri3PrDRLKWOpaVqeWWkirk03L01NEdby1dTyR1jN09KiaWr+4JvFj5abpg6reVpaapr2d2qa+n5eGlTzzEsTcW2auO5Rm8bvlDQtz6w0ETcv35p6aealb9NQDlIxSCQSiURiBRCKAXe8l0+RL9nEb31d/7bXcR9Ux9d8vbigXrjroF5cex28QWN89XXw1vfB0/L1woLauLHr4J/KV1/H/bywuK5/p4S18VPC4jqojZ8V5ncWX31d/7ZhPb6lhNX3U9PUv1P45oXNi/fbxo/xxW9QL74X5nfeddA8vvitKeJ7fHVYzV/H1fcUhFQMEolEIpFYAdSKgS+FoPgH3ZYiruZZapq4rqnHG/djaYJWKk1931Ivfi2kmZdfj+o0cT0vnx7fSqcR10szi+o0s8JqiriaZ6lp4rqlOr/6d2qaXlgvbS+ux1fTctLU9yudphc3xh/hlINUDBKJRCKRWAGEYuCFPp9UTEpKWrtEGO6Fb8pEOUjFIJFIJBKJFcCpp546HHfcccVj4F9j/XmW3951TW14ez0vTVzX9zVvUB0+JU0d3963cT2e9r6Nm8XT3tf8EV5ft/x1fMvT3sd1HT+Pp75uqeVp72veCK+vW/46vuarr+M+ruv4mq/HU1/X1IYFX011fM1T39fhdVzN0/K11y3V4VPStOFxX1PNX/PU92NxvfC4ru9r3qA6vOXvpWnD6/s2bhZPfR+/lIOjjz56OOWUU1IxSCQSiURiOTjttNOG3//+9+U79gceeOCw9957D5/+9KeHvfbaa5TEz+NpaTXT9MLHaLXqtRRaShmrkWat1gstpS1rOU0vfIw2tXp99rOfHT7/+c8Xb+fxxx+fikEikUgkEssFxeDPf/5zsbj5R1LndVtXfVJSUtJaI88qn1n+1a9+Nfzxj38s3s9UDBKJRCKRWAb+8Y9/lPcMTj755OGEE04o7xscc8wxSUlJSWua/P+KZxalgLfAs2wMqRgkEolEIjERrGw8BxQEZJNNSkpKWuvkeeXZNUspgFQMEolEIpFIJBKJRCoGiUQikUgkEolEIhWDRCKRSCQSiUQicTpSMUgkEolEIpFIJBKpGCQSiUQikUgkEolUDBKJRCKRSCQSicTpSMUgkUgkEolEIpFIpGKQSCQSiUQikUgkUjFIJBKJRCKRSCQSpyMVg0QikUgkEolEIpGKQSKRSCQSiUQikUjFIJFIJBKJRCKRSJyOVAwSiUQikUgkEolEKgaJRCKRSCQSiUQikUgkEolEIpFIJNJhkEgkEolEIpFIJBKJRCKRSCQSiUTidKTDIJFIJBKJRCKRSCQSiUQikUgkEolEOgwSiUQikUgkEolEIpFIJBKJRCKRSKTDIJFIJBKJRCKRSCQSiUQikUgkEonE6UiHQSKRSCQSiUQikUgkEolEIpFIJBKJdBgkEolEIpFIJBKJRCKRSCQSiUQikUiHQSKRSCQSiUQikUgkEolEIpFIJBKJ07FJOAz++c9/Dn/961+HE088cfjtb387HHfccTPp97///XDKKaeckTqRSCQSicSGgNNOO23485//PPzud78bjj322EK9ff74448vPHj/8Y9/nJE6kUgkEolEIpFIJBKJxCbhMPjb3/42HHDAAcNzn/vc4Z73vOdw29vedrjd7W433OEOdxhuf/vbFxKGNttss+GZz3zm8M1vfvOM1ImNHYxFf/jDH4bDDz982HfffYc99thjOPDAA4df/epXo4Yk4RxLhxxyyLD33nsPn/jEJ4avfOUrw69//evh1FNPPYNrbeHvf//7cMwxxwxf+9rXShv33HPP4bvf/W4xmCUSibUDRm9GbfuQdYqs26OOOqo4vxPj8Izbfffdh6222mq4zW1uM9zylrcs+33s9cjev/nmmw+PfvSjh1133bUcJEgkEolEYhGQq+0fRxxxxPD9739/+Pa3vz1861vfGg4++OB/oW984xvlF9/JJ598RurESoJeZjzo/Mj1Wj8MoH4OKJLr1Hmt6o+JxKYOeplnivWKrNW1/nxZ33Bg235Hd/3hD39Y9sd2b7RfIvouHge5sl/XHjYJhwGD6Pve977h5je/+XDWs551ONvZzjZc/epXH+5xj3sMD3/4w4fHPOYxwyMf+cjhUY961LDlllsO22+//XDooYeekTqxscND/yc/+cnwlre8ZbjTne40XPGKVxzuc5/7DJ/85CeLANeDTYNyYK4wSl3rWtcqc+gzn/nMmjXAc4rsv//+w+Mf//jhyle+8nDVq151eMlLXlIe5IlEYu2A8vjlL3952HrrrcteZb0+7nGPK88XJ+YT4/DmAMfvNttsU/b3hz3sYcNjH/vYsr9vscUWw/Wvf/3hAhe4wPBv//Zvw6UvfenhZS97WXEOJxKJRCKxCDioP/KRjxTn8w1veMPhCle4QtEhrnKVqxQZO+iyl71s2cef9rSnFd0hsbIgM/385z8vh7de85rXDK985SuHj33sY8OPfvSjNX3I4mc/+9mw8847l3mx3XbbDV/84hfzUEgiscZgTbILeta//vWvH173utcNH//4x4vtKL9IMg59Y79705veVOxq1772tcveGDYo5Nr+eKlLXarY4Bxo/ctf/nJGDom1gk3GYfDBD36wnDT8r//6ryLQPf/5zy8nNuOTBIgXDHkw8CQmNg0w/vNqEjAZk855znMOt771rcvJ0zHBzUOQR5TB/WpXu9pwwQtecLj73e9eBNS1enropJNOKgbH+93vfsO5znWu4TznOc/w1Kc+dfjFL35xBkcikVgL8Nzh3OOEtE7/53/+pwhbTs4zUCTGEacMCZyxp7v+05/+NPzgBz8ogj4n77nPfe5i2Hn5y19e3gxLJBKJRGIR/PKXvxze/va3l7fYzn/+8xfytro31RlJHETaYYcdipHJPRn8N7/5zRmpEysFnxz+whe+MDzhCU8YLn7xiw//+7//Ozz0oQ8dPv3pTxfdZ61Cnc2Xf//3fy965Ktf/eo1Xd9EYlOENclBwH7CuO2wEf3ss5/9bNEtEn2wpdrv2Fs5W3bccceyHwbRv+51r3sN5z3vecshLjrZLrvssmYP3m7K2CQdBle60pXK6UNeL4aFxKYNc+Cwww4rgtqNb3zj8uDy2
YrddtttpsPAq1XmkbcLLnrRi5aHng1lLTsMbG4PfOADi1JDoH7GM55RFJ5EIrF2wOh9wgknlD3Kafl99tln+M53vlMEr7G3nhLzceSRRxaB1SeJOGLSYZBIJBKJpcLbae985zvLnnKxi12sHCB63vOeVw4UMZb4JENLiZWHN6i9lUmncYKVUc+bhZ/73OeGP/7xj2dwrT34XPLNbnazYixzMOQVr3hFcX4kEom1A8+XT33qU8VJcM1rXrPQE5/4xGG//fZbszaftYTePoiOPvro4Y1vfGN5ZnsGeuPgve99bzoM1iA2WYfBS1/60vL99jGDcGLTQToM0mGQSCQ2bhBOf/rTn5ZTLZ7v6TBIJBKJxHIw5jDg4LfnJFYHDlnQ1/yfhLemkf+Bov+v5e9hf/7znx9ucYtbFGOZtx59ligdBonE2oJnCNuOT8KymSCfP13rz5e1Dofg3vzmN6fDYANAOgzWscPA9/GdbiDEIF7K1fzckQeZzzEQQDzcCFBOrqrTar5d4VSstquH17eWUjbhu9cW+S6nLWvRYaCtxkj7fDZrJb7ntqE4DKxXfyit7cZ6fTj1rFHjqHzjYP4uRSgwT/S7MTRPV/MPzbQhnj3Wi191Wc0T6lEH7V+tss0X4xbtNpeM5boS6uRrzmpj9PP6mreLQJ2tseijlXjGLIJY56s1L9NhkEgkEomVxJjDgH6QhqTEPKTDIJFIbKrwiV2f6kuHwdpHOgwWMOow9nl9xmuP8jOp995773KSIU6SMHw4WfLRj360fBPfH1X6o0XCJCMFo7KwF77wheWblv6M2Z8crdR3qRnRfSvs/e9/fzFmb7XVVsVAfJe73KV8t/lWt7rVsNlmmw33vve9y590+S+Hd73rXeU7igzHDJxTwMjuz5q0f6eddhre/e53lzwYzYDxRz+85z3vKWV4jesBD3hAeYXrHe94x3D44YcXvllgyPrGN74xfOhDHyqvaT7pSU8aHvSgBw13vetd/6Ut+lT+z3nOc4a3ve1t5RVPf361SFvWp8NAX/kzHWnf+ta3ljz1kzEyb+54xzuWMfR/A9tuu23pb6/BGa9FTjCZm6vtMGAkt84+/OEPF0PdBz7wgfKZlVCk/PpMiHkkXn3Uzx/faPvd7na38qel/vzVevLnYF4xPuKIIxY2MGq/18R9R08+e+65Zxl3zwfjrB577bVX+cM030E113wD1f1BBx009zuF1rBxMT7m0rOf/ezhEY94RPlzdWNofviDdevBd9TN66985Ssl3Uoolgy+vtGufZQOf27tu/fWiPViTquL/vQHa9rlm/jm3ko4uSg51rw8nRjwjPNKuHms/cbSn9AaS+vZdwq//vWvL0s5MiaeJcbyta99bVkjD3nIQ8rzzjeFfRv2zne+83D/+9+/PHdf/OIXl2eSchnLlwLPOGvf+FmPnrHy10b9rFzPqAc/+MHlmWUs7AfmPefJVDjJ4pnq+cwgYW4xTozNe31hrVljb3jDG8pv/Zz1PD3wwAPLOnv6058+3Pe+9y1zQ331l/o+5SlPKQZ0Y/OlL32pnP5Y7tzUZvPSGHm++ZNh5VjnxibWubnx3Oc+t7Q3yg54ztkLpNc2eXkGTH3GQzoMEolEIrGSWA2HgT3f/m3Ps5fKmzzhD32POuqoheSKpcDeSe4hIyv/kEMOKXov+X61QH5XPhmH3L5aRnVtpPf7c1Ntp0OQc/3vHd1JPRbRw1qsdYeBfqdPG/Pvfe97Zc7pj9U8YML24mAJeZa+Ql5T/qL9Ts+RVj7aQx60rlbjIBc7A/vMj3/849KPsYbNK/25Gt/BJy/TK7Q/nh/rciy12aEgzwrrRZutHWNoTq2LvqfL18/KWK+eGdar+ixnva4rqJfnm3ExP/SXubqun+1gn4ry9Zny9RfbhDFcl0iHwYaDdBgs4DCw2TDuM0pd/vKXHy5ykYsUg5y8GXZMcgZC+f/3f//38J//+Z+Fx7+CMyT5I93rXve6JczCCPJHRwyk22+//XDwwQcvvFA87BnACBoMVYRWf6D0H//xH6UsBm2GM0YhpB7Xuc51ShwedWDwVgfGGP0ybwOxuTHaMED67uIFLnCBYhxkrNdHDEL+FKZuJyJMM6R6iPeg7eLUg6HRv6af7WxnK/W88IUvPFzjGtco9WdoCqOgPypW/7Oc5SylDP2pvxmYbVDz2rI+HAYe0DYD/Ay72mW+qL92GMMb3vCGpR4cI1e/+tWLkT/6UX8z9hlzRmcP+3lYHw4DGxBDrjWh3r4rymCnTPOE4Vo79flZz3rWM/+I9CY3uUkZ5xvd6EYljfUUbWfou+lNb1ocRL7vTpCcopgR1Bh4bUj+9JmQ/qpXvaqMAecdg6+5HOUg/czAzqGh/1rYzPW/P79WJ/WPtMbzMpe5TGlDtMX92c9+9jN5/DmbNct4bw33ypgFgg/DqrVoHllf+jHyt2Y8f5RvTVgryrSmxOtXY8PgzXlJ+V1EybU+CN2M0/Hsi/ad4xznKM8A46t832k1tvq+rt8973nPMicISFPevlI/Qi/HkWeJfpWndeNXGTe/+c3LeCJlW191udar9eN79hyf89psvZvLHF8creaQdnoumSPy17+eS5tvvnkpXz2in42JPcPa82fqFId5MB88U/WjfYtB338ZjK11Y2eteWYqU9/rVwKgeutn808/xL5kX2DosBa0yTqMPrrQhS5UnrGM9BSLRRx0+ssa9wzVBs+vet5bw1e+8pXLmtFvN7jBDUp9tVWfeYZ79nm+EqAZK1xHevX1B/OLGCyslXQYJBKJRGKlsNIOA/sUOVB6B1voUne/+92LTEFvY1i5whWuUPZP39EmX9nHHXAh3/qTX3Va6pvs0pGJPvnJT5a98VGPelTR8+gh17ve9Ur5iCxp71Y3MjL96atf/epkA5d2kt3333//coCEXPWiF72oGNDpUPZ2+ZHRHdwhx9G16APkEgcoajA8MkySRR34cTjHuDCOTh0HeWj7Jz7xiXKwQZ/SMfUxeUm7yXHkBjINuUXfk83YFMjhDMCL2BVW22FANqOnP/nJTy5yv0MatT5u7tAT1cPhG+1jt9B28r05Rz8z3x2GJN+xXdDnyGlTjbH6moxonjlE5UCPOeRgiLGn1/hmvEM+7Cz0BzKqcTAv9fOsOa4u+tb82XLLLYs9gdxHx9cGeVlP+p7cbgz9Ubk1x8C8XAO+OcCuQPaWrzWk35StH61hfWpO02HMM3WINWwuL8dYq/4Mv/Q6Y2ws2Xj0ozrE8yPGkn7gWePwpzUzdRxraLNxoVNby/4g2Lr13PBc1GZrx7U5pS76xQEmh5T0+6KH1zxvtNNBMPYE+dXPylivdDTrVX3Me/OKDsEuYE3Mg7Ypxx/cq68x1bfGeCy98TOW5p8y6fnyMPc94+l4jOWesZ6l6mxczA/PF88d84JtyXp1GJF9btE+qsHGxmFjXr7gBS8oB7XYIDzfjUmUb4zMS89az1N6oLJjzXFU02UdAHTAlQ1FexZ1cKTDYMNBOgwW2NjlQ4CxuVlIHkQeTDYzRiIPfguO4czDy8bHyGRB
EIyQawIJ77Y64b3c5S5XDEqMJR6sDKEW3hSBz8POYvNQYexhBPHwt3jVNcqPz7swNkU9pGV082BnuGFIYky0aRGYZp3A9cD6zGc+UzYhhh4PNw9EpA8YpD14PAjVb4899iiGVQ9Xxr62321OPMJOlzptymB0vvOdrzzcbfgEnPDEq792aI86Eiz0KaeNzdaYaAuDuE3QQ21WW1bTYWCj0E5CCWGL4Y6RWl95KDudrS687/pJvbXXw1n7vd3ylre8pQg/xsp4mzPG0Jslsx6y68NhoB3qq1/VlbBHuTAvhF3iEpcoGyWFwfhpI4GVQVW79YH6MXoSxp/1rGcVIUNe1gvjqLw4DuYJePqdQdX8kM5mbMyMtf63YREcbaLe0JGnZ4T6yLtWOgijBAHCMgUmhH3CNeHZ2mNkVXdtqNtifOVNICfAmQPGg7Bg7esHYzVLYBOnjz5wunJks7dWzHltUCdvGRBMrH9rxokLdYi1on7WpfVhrTG2+w3nyDyBhPFY+7zRwygf84nwSZAyj40ZoTfar776ktHWeib0E1LMf2PpGUSg0j9jsO4Is+aUeWMeUNK9veTNJQ5Xfay9nhHxvNMHyjX/zSHPK23mMKHwUlrG2mzsrS3zgoCtn80XQpI3Bwhf9XPJ3FB+nGTCY46a6+HE9VaC56HnwRi0hWBqnXu+UKgoQGPKpGcEoc48Mh89x9TXXuWZTPDjRPWWgbUQ+0LUV3pOG889zyJzSf/oX/sUZ8U8Ic6eZY6Zf9aWeRmOc3PdnJePftE/sc6lMVfMKWPEEeS5ak5ak4R7zy3jba/UFmO2iHBqzaTDIJFIJBIrBTLNSjgM7E/kB7IROZBxK/QgOqv8yRHkF29v2iPJEQxk5Bl89mu/dF2ymT11ih6pnuQAMgk9yF59yUtessg6cVDLPsyoz0DEMBfGZHWr5XFvLjJs0S9nyTfaSz8gs4RuTk7x9i3dmHxFllEHMoS2KUefkFvJMTXIpAyOjJ5kH/lxNOCb1wdx6M7BHwZksqxyyV0cI3R8+RpXfW989L16GCfyjbqR8YyHvidLTTm5vdoOA3KuPtePyiQDMXyS170ZbJ7R6Y07OdfYkL2MB3mUI0Yb6Z36Rz87DGK+xNjT4+YdMCHL06/ofuavtpPp6fvq5wCUuplX8qcn4DEX9REZtl1fyqQ/OAhELzIu5o202kTnNJbyNocZcrXFvFN/+dNl6Kmhj1tDUwzKYJ6RZ61hMqtx1Y/qUK8jfWgdRX/Soawj5WurNUzPJa/T46Ye1mHL8DwiG8ub7O/gj/Wg/e7ZHYyl8uMtdOGeXcpWB7K3t6PZeLRnHthJjKWDntoiL22mu9DrPCvUhw7FfubanDIexuic5zxnWW/WmuebNTHvMJB4Tqqwx1mzxpk9jXzvWRXPSuuWwZ3dxNg65IVX3+D1zOM0m7Ve6Rr6gxNRu/SnOaKuY+n0C53WHLbWjKk/+KW3q4/2c6KYF5655qM6I/3P4YLHOot14JmoHz2vphrVQy+je+kLNhD95bnqucWWydambZ5z7ELsCNaQ/jKexskc9UylI7JhqKf+lhedjBOCLrkI6KHpMNgwkA6DBR0GYTiyWXogMjpZcB50DGUW5JQHbMDm7XMn8owF4xSotxgYh8eMoDZKGwMjmU3Ww0S7GLA5I6ZuMKC+NkbCj4eCzcJDlsNj7EGo3h6ePgXkwemBb1Oy2egbRhjCm/SzDJ8gnsHGQ4KzwGbpIUog9lCcZ7ysYYwYPD1obQoEDg8/BtKxB9BqOQy0kxBJ2PEQjo2ZkLrIw9/Y6y+fRLHpMlgTuGwwBJUxYXN9OQwIfzYomxNjHwqhn2HePJ6qWBkP85tDzAZnPWuHjW6ek4vxnDLCqK3vnWQmMBgLG7XxVpd5pzr0IyMwZcl6JXiYt54xjK5T26IcgjVl02Zr3hNabdTm8Ky26FfzyLgr34Zvozf+8xwnAUIE54HynbbXl+Y+oV9fUiB7CMXMM4ugo/8JhdY8I/cUJQmUz9DOeWZ+6EtCNeeX9rXPDfeEHkKJOhLaCBfmPSfL1PWj3wk71mvMH44SztPWAI2XEZuQZ7zVj3BKyeEomKeIBjwTCIkUCGVSxDzfvvnNb44qI8t1GCDzXL3tE9b51LmJl/OWI9oep3xK8ry9RbrYk5Stb/WzdS5uan8xNHhWWq/WubXB2eLNOf1HkE2HQSKRSCTWJ1bCYWBvIguRQR3YsueTTe1z8qafkCHICvZG+6hf8gmZmEzA2E9vI8/as+2TDHV0i1n1IK+RNRgnyWHKZfi2b9srySgc+/iifOWSA8hc5ESfECS3MYjZV/UDGY2cLG0P2hxyBiMq3ZMu6RAJXV0/ktXJd2QnBj3GW3KxvmidEerDuMqIpf/of+R6dZsld9j71YGhk5xBnma8Y/gmL5Cn9bE2132vL4Q7EEPuNuZkL+UyQMchrnny+Go7DIyZtjH0hcxtrB32U38Hqcim5GHyNn59q/0o2s25xYai78xZOgj5jFGTEZ5NYZYOIx9OCoe46A90UnUyBxFd3DjoVwcOlWWslKv8dk6bS+RfedE1GaDJ6uRWb7EwZJo30iq7bgs50libr+RsejmbBrmXjUNfjOlDAfmwmeg78r2x1B6Hmeg4ZGfj2q4j18I8R7xRwfajfPIuXdnaUK95oi/Y4QAA//RJREFUdiXx6smIzRZEl/Ms0B4HeBzSYWRXz5jL0X79onwn7jkl2aC033pUH8+yVh8LSGuN4WO30e/WrcOb5r85YP7E2on1I0xajkVvF9ChzQHpjbs3HYxpD2GzYufx3LDmpKd7eFYaz+jj3rPS8469xzxTnuedk/Z0irF+to71r/7Rv55P9DNOizGdV3mccJ6p1ppni741vvYLzg66vnbW86JXXwcR6UJhR7FmOUOlmQX19gy3N1jf4SjxzFE+O57nKr5Y58qPeSmcncJ46G9zm97kWU/PY7dkg/K8t3/Nm6ct0mGw4SAdBqcvyKmQTzgMLPj4VI4N1uuLjGjzFu8Ywphpg7PRWIAeRjagnrDjQSuO4YrhlQHExsjJYNEvAg8HG4J2cXx4IPAieiCPeQuV70Hj4RmfwOC0IOA60b+IwKNf9Z32eyAxXjJkc74sRXCyMdqstMWGzaNu/Ak/PXhArmuHgTHE79SBehHSzEebCePoUmAjdsqdUmATYrz16qHNuyekrm+HgTmCbO42F5vvUkG5ciqGYBpCBqcBw/HY/LfGOBqsMXPV2uXc4rCxIU95FuhzG2f9Ro85q9ylIozgBHaKlrnkRDchtz1lQWjTp9YKZ4H+JKRR9pY6jyg8Ub685EvZcYKnhb7VV/oMn77XF5RaAudSQAFQPgEqhFTPMob6GhQEbfR2gbVj/hpLdbG2Fnn2xtrxjHXqhFBtvbcKjjmhvQQr69y6pVia05wtY0J0D4Qyiq+yzH9r1RiPCUbKXYrDgJJgXtjr7A1TlI0efAqIEkCYtmatNc9286WFfmAY0If6h1Crr7Y8XWEnEBvjpcB8MzcIt/F5J+uWcpAOg0QikUisT6yEw4AOQgci9zCc2O/
t3Qxpi+xNDDzS0GPswfKi6zJW9mD/pNOSRUK+oX/ZI8lf6jUV5C/laDvjMT2WzEAmILO0spJ7ugfjNaMoec7+zhjGsDXlrekaZIVFHQZ0XzoReYJcof1Oni+qy+snZZPDyaZkWLowHYVcNgvrw2HAgB0OA2SueBvV/ON4GjOAjoHs56sKDL3GUX76gaF6lg4dDgPGR/2vLsbAFxucVieDztPLrDHziM4ftgjzzwl682dRewj5lszM0C+/OPRCt6eP9WR+/UWWJ6Mbe2nImN40IXNOfQ6AsbdmrH/6lfawGTh008snZG/2GnYb5bMfsQsIG7N7jMFcdmDMm9d0XPOE0868aNeEdaV9PtOjveaveWRMe/rjGOTrUKe1Z+wcKKQj0dGs0Rqhk3muKYsBXNnzviTRgk7kqwXsIXQc7VQ+u1oPy3EYxNs81gUd2xsenIzz3qKogZftgS3FYWVz07iY42Ow1h1I9ixlHzI+9Hb6rPHs2apmgR6nz8zNeGuCc1r/sb9wOKXDYONFOgzmbEY15BMOAw8LC8biJ2wJX3STreHBywDFAE1YtAhtmhaOBdVuUh7qNjYbrhMkcfqiZyjuQX4exOpsQ/Fnr16N89C0+BnZZwk74TBg2HQq2kJnfOHhnVqHgHp4yDhR6uGmLYy7Ux+m2qL/jKUHmgexB7K28D4TYtRL//SgL9e1w8BY6U9KhYe9NIRyhu/lIE7HEHjNRXOGQbX3uuZacRgQpH1/dNHNqgXhwMlva1G+hAdC5piAFk45rxATMswN81ddpigH8jTmlBDPEWXaOKVfdM63CGMvA7gTOuaHupqXtdKmnsZbPOGDYOjVRs+ApUK7CKmeh07yMGpzgPSMvE7ZeN2Xo8U6ccLAfFvuHLKG5RsnMZwSMlfr555r68i6JMxTLMxjyhYlS509K82r3vi30JfmEIVf/V23wql7J2C0MYzMsdY9U8wH6absI+rkeUu4V54xtybH6roUh4E+DIeB9c0YvogQXcOpEgqw03LKZ0igaPYMGOYoZ7W3XDx3ndTjSPLM08blwJia86FUo3zDIJFIJBLrGyvhMGDUJMMxANG/7N0Mdva9RQ+C0HPoIXRcuimDvLch27qQRZQZJ+OVa1+1x5NPatlrEWi3chl+yA2cBurTysjyV476OWTAmOVQAMOWg3DkrkWwqMNA+Q7EOdVND9J2sjQZa8zBMAvkFG99kw0ZCMlhTgWTOWZhfb9hwEFjrIyZebJUkMUYeB3AoR/Rcx1CcmCkpx+Rv8NhQO9h82BT4TwiW0+d9/qXnBoHqPQ9ZwH5dalQXzYRRmu6Ij3DW8i9Q4z60wlua5dOQn8zl8yFpRzUAfoEfcbb5gzoDqXSdc3Xel26phPpd7qsNcyQTFfwFvUiBukang3mpblsLOXrOUEfrddGPYacPg7u0cPNYfacqc8/0Ga2H3qvg0ry7NnVjI01bZ073Gp8vEVN17AGp0I76Eb6ibPJF0KsC07PVg8E5S7FYcAYHg4Da86YKnNRR5a+NM+i3Q7XeeOAPtwDncza8Ekk9gJryzrTVg7RRcsPGHNrmg7uGatdbCnpMNj4kQ6DCYaeQDgMeOdMagYHD0ne1XmnCKaAYYWwZnPwMGA0sxF6IFukU+FBwNinTh4YhABCHA+uBxXhzYK0SL2mZCP0ChkhlfBg03NqmuFwrF3hMJDWZqJfCX3LOTXeg3bboBnWGcFtQh52xoGhloHTRsEI5KSx79YRvhgeCSAeaIzpBPr15TAwHurMMOWBrU7mjfK8UiYfGyUD4RTCG/we3PLhFFEHQqo6mTPtprc+HQbaazMwJgSrsTdXFoExsEERMLWH8ZygwXNeG9kD5k8Ip8ZAX5kz5tQUMJYbX04C68QGTMhYibYQOJzqIKBSOgmJjK+Etno+EQA5LMNhQUl1smLe67IrgVDwlE8AotxxUFl7+nzReRzkOYwIMtquXdaUZ2HrtDCnrWNtJiAz+BJS1YeiaQ2GkmyuOy1ibau3PlrkORrwDCBEMsZTfj3vlKmuntXxR2X2GM5K/UDZnecQmAd9s6jDgGND3+lDc8jetBTlFziwKSnWrjpwAFDkem+S6CMnT/S//YNA6xu/6rvIHjsGY+jZrl0oHQaJRCKRWN9YCYeBPZqB094Un/Mj07omz9O1nPh32GDKfio/fGRHemtPHmZopOMpg4PfPq98+/5yoM2MaGQlcjK5nI7SnnQOeZKs5jACQys5g9zLILjoQYOlOAykYTS0/5Mr9e8UQ5o26l/yB/nf2JFH6APkL7IxGwHDafumbIv14TDgiDIuyiSv+ZY92XXM8DkV1gIDJd2KIZeBlL7gsFYLsjg9lVPFASTjb+34WgN9oDdne2BzYDjVFmPuE0L6dMo4zgLdlfE6bCP0C3VrHUDWi4NqdAHriOOCw6vlWxT6Rx3oAZ4lrhmtzduAPmJbYM9yEM6XBuin3spd7hySXjvoEdrvECn9tz6ApC6eI8LpQeEwcciILq4e6k436TmNWsSajOeWPmjXrnu6fLzVQBcjy9NP9AMHIJuXek2Zz/LDp0y/+rT33F6qw8Dc0H/mJxuktac/6nGcAvyeM2wY1gsbgDH3DO8hDs/Sm+jH7Fp0Zmuj5xBZBMaI7cWcj2cIhwGbWToMNl6kw2ABY4Z8GN05DGyIDEdOgXjtbiWMhoQPC46B2wLkqfWKlAfSrMXjQeZBwvtNcPXZG8YlGwch0Abm9DnjCEOOe/W32H0vktHPhmtD5CnkofZHOF6Bm+cw8AqcNwyks1EzHC4H8iVceNAxGBonwoxTC8ZPnxg/5AGjLQS0aAunhY3DiRl1IsD7Pvz6fMPAxmujYShXZ4KUDU4/ezjq96WS/AgJlIt4Bdlrpb1vo68vhwEjpn71yql+YqBbVBnoQfs4wGyI+sCaYXAnMPSExXAYqIt5Yc7bqIRPAQEQP+He6RkCtz/o6p3EXxTmmY3cmzHGUP0IaOZg3VfWuTeBbKzIOnWye6lG6UXgGcRR54QEIU355hGB0rrszc8pZP1I71Vaa8M8cc9ZStnqKXvqYjw4DD3vPOsIqfKw7s1t88EYeZ2XgdkzkWHfc8WJHIL21LHTv9axV6QJM57LlFz19/xTljLNb89ZSobxo/x6rniWcnD6XFr7fBjDch0G+tArrEsVDkMR4rie5zAwHj6FwBnH2a3PPbv18VTFbxa0mSHAN3K1LR0GiUQikVjfWAmHAZAByaKMbGQ8ug0DJFmLXEPuZPBxyIWxzGdTyL/kBHv1InshmYAswiBKV2LkI8vQUXxGRR3ozIxRZJcphB+5Zlilk5HlyEf6w6GL2kjmOhwG+o7+Qq9jQNaeRWVaMv9S/sOghXo5XEIPJb846e20uUMqZBwHoMh0PvPBKEoHJgsaL2WGjEL+wrdWHQZ0JmWyDfhU61KMmC30m4NP9BJ6tDH1aRyOgRb0N+EcDPFtdXKztTT1MzpkS3OUzqQt5pq2MBYvFwzj9Av6MvnbWqRrmmM12JH8bwAexFax6OdxlgqGavOHEZs9CllLDk3px3pNTiF9id/6d0DKgV
D6nXlNX6ZftA5F61R/sy2wjbHT0ME8U+hEbE30cuuS/qwMtiO2NM8g/bToWjfu1qe3OPS9+WN8lOkZTGdRd2vU88x4sME4QCbdUt68WAmHAfuUOTXLnjcGa1Pf+8SRZyunJJ3Xgdke7Ev6mo3SAVp1Ni/osCsB42dc6XvmvbeJ0mGwcSMdBst0GDhFsVIOAw8xD2rCB0Mcw49X6xgE2weSB6yFSZjhBWZA0S51Iqg4dW1zYxD20MTjwUXYYURSjge8V1J5Y31/jyDq0zEECAY4D6J5DgPCgE2a8OF1TobqReFBaNPgbCAs2nA9jBnUCbEesgRlD2dvRBCoCAXawvhn0yDQxQYkH4Zc461eNlAC9vpyGDBK6msGP2Nk82VQc69MD0ubgNO8i1KkRa59R9IG5o2MtfKGAcGDIOjtAv20Ug4DCoI56GSDeU9I8ZocY3DrLIHaYWBueSuDsjK1/dITmAhO3i6gxJmLK+kwcCqbU4/C4y0fXvxauDG/65PWBPPlOummgkJK+bIWKbGUQePqGWX+McSag715OoXkoX/lQ9Cxnp34miVMenboH4IUYZCzyPONEuLPx/QVw71XVil05ghDsXmo7tagNeMZOCb0tbCuKEbeIvBstuY9X8xrzyh/nmVf8Pyx3q0zzxTzxbNIvQiN6jzLmL5ch4FfCu5qOQwIeTe72c2K4qdswqkx6TnvFoU267d0GCQSiURirWClHAYBsqA86WRkIjKh/c5pUgcgyBMMy06NOsHNSS/egS3lkkXohbMMN/ZNskv8Z4J9Nb6LzWBDVvK7FNJ+cg95B5G36czt50Xsx2Rvxms6L7mB4Y+Bb9HvrsNyHAb0EfYAcqcT8ZwAdGiyj/bQcY2t/nfPUEd+NOZkcLIZZwvdis6qL/VhHHqZhfXtMFBPxuKezrQo6L7mHocX+Ur++qhnF6gdBk5MGy9yGVvNVJ3KPCGjRlsYt42hubVcaAtd2iEu8rf8HT702eQa1lH8B4Z5Em9rLCKbLhXKoPOYqwzIKJ4L1nVvfU4hc1x6tiDPBHKytnMw0kF7oGewg5H56R3WIYMymxKdltGa3kgXoq/TEYy3k+nsEPIm0zPoT52L1jzZ3Roi1xsrBnLPRH1gTnmusHeYI+wu+ooO61nJhuQ5S3ea94xYCYeBPmSrWspaM6fpZHTseQ4DvPRT+4eDtA60aTs74Nj4LQr7mwNxnvHmvbFOh8HGjXQYLOgwqD9JZHMnUHjo9Iwoi8ImSfC0efLaESIYOBmza8NSGD4IVt4OYASzoREabb42Kw8FxjYPOQ/yWRuoeBs3I7syl+IwIEh5eDByLQICtYcgox2Dexh+CZAMOT6545Mx0Rb9ME8YiJPgHpRrwWFg3nDMcHZ4wNvEbFrt6fF1jfXlMLA+GBJtBhQTm/pKnAAxNhxfxpdwat4yKhJOe0bSnsOAw2nq2iWcmfMM+V619ixx0nwpyk0Lc4YR2Rwy/z2jvN5M2ak3zvaktXViHq2EgqHPlKWd6mOO1wpvPHc4CKwR7TeXCFFTje2rBfU2Pxj2ja9niDXo9BPBlJBKiCVMUi7NBc+IsefdFBA49R+h2bz3TKVMON3iOUGp8Dyybxjf+GP5MeVoQ3IYENwJwvHH3RQ3zhonIBfZY8fgdCKDiHahdBgkEolEYn3DfriSDoMa9lWHqeK0u0+YkmHqQxEOzJCvybRO2ZLt6acM2fa63h/Hku/IQ3RZeyB9kwHcG5gMkOQWeiRj2lJIWqS+dGb6KFmslkXsx7XDQN3J5u5Xw2GgfPIDmUoacg79k3GNDq5fvSlhbL3NoV/Iuvgd0qFnkEvIesZIW8hzZGK6FfmSgW6tOwzMFW1cCV1UvcmBDm6Rr+gpDgCytbSgX5ib5jFnmH6nF+vrqe03ftpC3tQWbSKjshUsF/ojDMTkb6fXtctnV2uY34ylDpGZP+RUdovV0O21X3/TLegyngFkcPYPB+fMLdRbo/PI+o30jOL0ETr0FJnbOneQy7PRGrFe6M6cOerGFmJtGm/zzzPL3GfkZ9NyeJXuwO4zz94TMJ88Y+gt5hvHFR2BDuQgqmeltw58gs18ocMql8PM+NF1PBPHxm1DchiA5xL7hgO3nEgM8p73xmMl4PnHMcgR5HnLLpIOg40b6TBYwJghH5tD/OmxDYShnAGNULIco5l6qI+8bJxe5eMptXnWBiUPDQvMWw02BkIRA6OHrwfkIgaUgLKd5mXwZ3S1+G3y69phoC02BN++tskqV1sIah6qHv5LAaHR5mBTkJ8H5vr8JBFh1cObg8ccNLZOoXhIEjKXo1SYc94mYJx08sEcskmrQ7vRrk+HQf2nx8bD3FnOhqBPKVEEcnNWvjZRbwyYsz0hY7kOAzBPODysfWVyrBGoKCtLRTjNvDXjhJX54TSAftPGWslSjtP4vPqEcc8yyk4rxC4KzxhvM1iH5n4809oxYgwn4BPsPEsJX5RSfbscWIOMtuaw/rS22rcL8Cif4kxw84kf47Ho66X6U97KYbSPsTQfWueL9W9+eO5bO4RDc3rRZ71x8/wJB5e3bTxnGNXVo7cWNiSHgTLMQW+zEfwoL9YZJ8xynYPmRbylpF0oHQaJRCKRWN+wHy7XYWBvwot6smsLPHQQ+x8DDRleeQxy5ARGFwZM+iF5m9xRyyzk5zhY5W15DgP7O51vJQ7ATIE2rE+HAVnSQTVyH12IwVI96GnkSk4A8t+U8QByDhma8ZF+tdYdBnR2ZdKFyG10yKXKigHjJm+H9chXxsHBJ3OtxUo4DIBhnIyqLeRk9hnj1xvzRUCvjv/XUDeyonXdjqc20KPpZIyn+Dn0lruOjEUYwH2Klt7hnh4UMD/ZfcxzzhJ1pFOwn6yGwwLqZ9cUaBf7BJ2TLctYGUPOOc9Qa9GbCNYEnVh/1/OyLm/q2jTXGPz1n+c1OwkbmnFjN6CLGTtvo3vbhi7VtmdDcxjYFziuOEmUr33eVqNjT+23WeBUokdpl/XBrpQOg40b6TAYMQj3IJ9wGDDGWLBOGrsWxsCzlAcB2BDkwchNaCDEeYC2f8hqE3Raw4OUgMdz6KFh02VcXAosWA8dn+1grCTsEHoIn+vSYRBGX0YlgoKHjnbzLtuElyK8EKAZ7WujHOMSgX59OQwAv9M2Xle0sROO9bd6LVWwsEnx3PuOps2LEdd42AgJO+2msL4dBnEqniBpTJzYWerGpc7xup31Yu74liFj8lieK+EwIMgSBj1LGEWNo5NchIalgrGSYGoNGBdKZyg6rSBizZh3xs1pFv1JSHRqfqmGWfPCvHVCQHs8A8wjz8cW5raTH+pHGcDvRAghi4K1FFD0CDZO7xBqvM6tTwnC9VjqC0ZpzkBOIu13woGwOXZKfx70mzboRwJqaxAnYMrfGwnKcxrF6Rj7wJiQOAvG2qfb9DGF3nPWqbRe321IDgPw7OXEMY7xGnB813Kp42O+Ueo50Oy12uXVbyeq0mGQSCQSifWJ5ToM7Ev2R7KN/ZouR8aUvqdLzAP5wFsI8VYvOYOhiGxcywIM5nQFBk4yNIMN2YrcuZRyA
/Zkp1gZzzn6GX98psTBjlqec72+HAb0MaeJ6Qwh8/qMJDl0EZkioF991pK8RI7U5wxfjP9r0WFA3wiHAVmNPES2MyeWCuNpnBlUyVZsLdYEO0JPvjUGK+EwUKZPzNDb6Q50Tetn0YNENaxb+Toc5tChcaHn9BxqZEd9Zww53vCzhVjPU9b/GMjfHDkOF7JZcQSYn3W7zDt9yDHFiMweRF6n57IpLQfKYRSmv7PR0IPYnupng2tr3XPEQTb6tPXG/lA7NqYAPz3aoTV6IGJgN5a1HYpORv9SlvXmk9T6Wvii/S2N+cYe4TlgTShfW9s5uKE5DDwPPZPMITanWI/GUj5LnZvSsaWYc2xO2mXt0pE989NhsPEiHQZLdBhYqIQMExxZkIQTRoxFFozyGVkYvhiF5KV+Xg31wG8fLB4aFpiHZJxkIKTaoAiYixiHLHybDYMJg0yctLXpMPTos3XpMNAWJ30ZK33320arLTZGxvVFDXKMQSF8GGdt0T/e1CCErk+HAdhwGEHjD5I88J0o9ymo1jE0D4Raxn99pd/iE1Y2d/3Qmwfr22EQBj+k7f4LQJzNayrMCXOMU0nfyctmZePmOZ+l6KyEw8CcJUj4fJCTW9pBOQhBd5G1HwqO8SfkxRtLxsQr4bMEXuvW88Y4qgPhxUkebwot8kyz/pw+s0YI2wRja9/a7p1Q0X5zz+uqTucTsqRhTCYwmseLCCIUFM8ywnjMD0KNtUeYqyFfa9gJMHPHnHeqhwBnXi9yooYCab5oAwO0chmhtat+5hoj65zQ5b9h9JH17tVOQuoigp9+83yIUxkUNc+CMefohuYwAIqhN8Y4DcxNbeQYJRBSNlrFfQz4GDisc04a7QnybM83DBKJRCKxvrFchwHEvhkyJfnU/wswapNBFgVDkT3SfklPdVCJvFLLAmSXMNh6e89+yDhJNyaXLlVuYGS0/zPY0zHI2XTJVm6xH68vhwE9gt7NOEhGIVPQpfTbUkCOdvCD0Uy5+l1fku3XosOA3hPGPqQPQodp5e6p0PfmvTcr5Onwo0NdnGHGukU9/5bjMCD3M6R72zvawzC7zz77LNnoqF7WgQOM9ANGTPWkz7aGcPdsNXQE61b57CkOfC5nPnHqyEd+dCM6h/C2L7XfHHcQiS3G6Xz9yNC+VLnWmNHL4s0NNhGf321tZvjoXuxX5hM9xcEzes5SDgnREcwhNjB50Qs9D2qdQ5scIuLgY/tQLtsT29wi+hh4xrE10SH1nbzohHSLdh1saA4DYEOwNoyPZyLbnrnhgJ+5tBQ4mEgvk1/o7Pa9dBhs/EiHwQLGNfnEnx6H0ZExg7Blk/RQ4DhgOGa4Iez1POvCGEMYvnz6x4PRQvEA8IkP4TamMRB+PFgZuD04GM4IWx5GjCGMdmPCno1YGyxQhkEbIiOlxWpzigc1Iw9j7tiGUzsMnLzV7kUdBqAtjDhOojACxp/iECB9d47QPTZGHuBOAUjrYcVApi7a4E9vPHgIIa711ZhhunYY4CU4L+IwYEi0cRFYZjkMgFCrzv4QlEDp1D0jpM3HKSAPXPXslSstQYaHmNE/vMYe3BwuFA5za0xJqR0G5gvBdjUcBtZCOGKsE4oVB4J5RlAlLFpTBDzzrd1MjbNTDAzbDKwcBdpts3JtLbenp3oIh4ExJhwsxWEQMBYElMc+9rFlvli/hH1vPDBAMvjbOFtFxTOEAmGe+OyPfuFw0B7fTSVkmlfzBApt9QzAHwI6xdDzyDzSV54DPSO6+en7gwz8nj/WiL4k4HM6eG61QnELggghFT+jq/L1g1NtFF0n3PR3bx4z8noGGQtKLcOy9Iz/niGzytduTgNCEgO/MvUdwYnArNxZ85kQ6w0JfW/NSev5aU4yrrfCDiHNWMefs3nWW6/msme1uccJ2HvOBxjyPUucgtJGCggBy7PTSZax50U4DPB6vkx1GHge6U+/K+Ew0GZ1sCfMcxjoL+tVPZ3O8ozyppe14fnu7QxvFnnmt3NDP1gbnoHSmlfmpec5A0R8zk1+5o2xTodBIpFIJNYXVsJhYI8mY9AnyHBkQnuTP9Jl0FrEeEKvZTQna5ADyA5kF/tyC3Imo6Y/5yWHO7QiHX3CPuyk8FQwJDHu2efJSPZqxin/4dQzGrtfXw4DZftTUHoYPYTuTS+gS44dLOtBmXQPB/zqg2qIfqff5530Xl9vGJD3o65kKro3e4C3TLy5PkuPrUFOZJym/6q//OhlZMdZdgxzbyUcBmB84xCTk+nmHvmXbGd+T12H5EC6Ex3Z/CF/0lXjVPYY5G8evPjFLy62FP1p3fkcDNmfUXQq6G3k5NAVGLHN6d4b3wH6GAeY5wV7Dr3ac4QuZv731v4Y6JUOxYbszs7F4cgw3upl+t16Mf6cneaQfuM0MI6LvPVO96LX0SH1u/ZzUpiL9fPPs9K88cyi91lz6kj/o4tNnbfq7hngpLwxUyZ7GiefOdOzQ2xoDgNprCU6L/0z5jRbnzHWt1N1KPkYU4cLY52zX1lr7G5scOZ6Ogw2XqTDYMQg3IN8GLoIgxaciU2w8sqdh4AHGEMEAYRBiQBEcPH5DMKDB7iHsA1NXBgKbZScDOpIQJti4LFB8YDyhDoZQbj0sGb88nDysLX58xDbjNVV2epjI0MMQTYUJ0JsUoxXPJEevh4q4QDh5WXcY7SNunlAMj7bEG1QyvfwdIplUXioESh5Qj30bfTaQuj0gIu2ECaj35XpIaUvGRzF21xsmoxChGx5RVv0tzQEU0ZWp4ujLTYOhiptJBxotzY7MUIg7MEDn0JAUPTQZoD2xgej/bwNS3vNJQqCT0sxxPH8hiHSNcUjnDjItfbqDw9rxi79oP/NP592aoXiFnESw39EKMO8Y/BdisF8KsJhQBg3NxnF4xuh+kqfaZe1YIysHeOtvTZk88qcjdPK1os+MI42YmvYRq5P54EB2ydojDEhjNGRoLOc9htHxnfzwHoyftqijsbQ+tEWc9qvtul742ccxftuoj4xH8ydKW0JmJ/WrpM8BDZjqnx5WxvhQONMQ5wLMY+Mh3qoG0HT+rfZTy0fHyGC44RgTlFUpnGSr2eM9WccrRHlG0/lG0/zXV3NY04OwthUwRafNUy4o6grS3s8N9RBP3veEqzMPW005sZE2co1Vhw8HJPm4zwnibchOLWsGfnGnIy2yl8bzQPx8cw1JjEvzGVOGgKReTdrzXqWcmzoQ33GWea5NqZMcRhw8uprgpxfJ3Gm7Cc9WLsEOcqs/rTmKPNTjOuxP1GGKT/hiLV+Q6nQrnqde4YaO31lbtg3OeQ4kLRNPvFZM31IQZw3ZjXM13QYJBKJRGKlsBIOA7CXkePlZa8lWyD5MfbYex02Up69lRyAnBRl5GPMIceRe+g8jH30P0bwecZOsjHDE0OffVh6e7F9kmHJJ0ccYGAcdDgiyiU3KZeDn+xDBlMu2Uc6fTCmQ9mPOQyi7+zH5AT3y3EYkM/oEfQJ+mzPYQBkdwZGh6b0sXqT6Rit9Zn2kvmin7Vb39M5GMaMsXEiQ2i7Ayx0K3Jw6P4OeTgg
QZaShzJbeYyMRCch1xhvBky86wrqYC7Rh5Xp12dt9Zt+YADUJgcK6ad0RvOLYc+YmytkfrKlvlN3/ab/zH/p6M7z3lQIh4GDevRCepFDXBxMS2m/+WSeO/1s/pG3yZtkTOvHYRcyIyOx+WVszWf6m9P0xtM8NHc4zsinHBAOd035VJN1pHzOBfmQcRl1zQO6mTcQtK1eR/qIzYcuI51DROReY2A9sA1w7uCdp5exyXBa6E96DTnafDKW9GzhDnmy4yiTzSXKN570dLooXUO96ZDWP53COh17lqmXuWHMvbFD59H35oK3ADiE9Lt5Q79ULluEX3I3nYaTTFvNI33vGWZOWp8947o1xBFHb2bAZrOhM2grm5xnSDwrlVOvYePPbma+s42FDmceq4c52Xte0DmtBW00RuYVm9M8h4E1pF3WmmeL58FyHAYO+rEJcSaxe9El54FdQX/QdThktFk/sw1wjrF9xdywNoyTvjNXzX9ORH1lXM0LOi4HASehZ4ZniL0j3zDYuJEOgyU6DGxwjG82lPrzHR5OFqaTywzoHAFOolpghCFGbwYYnkYGIA82Qs5yFgdBiSHEYnVS2+bg8xHKVJ52e3DzEjOW2+idgLZQ6wejhwoBjzBB8LTxewjI0wOW0SY2DQ9sZfo0iwcYIwxjmjyXAw9SD0WnR238HmY2LW1B0RaCmHj9bFOx6dUbmnx4iZ2wtfFEWzz4CLjKCH6/NhJjy4Bsg2OEZ4gdM+gJ11Zl2+gZr+VLsFrEiAU2MkKLhyRhkfDtG+420Gi3OeThTAC2qdl0tGGRssx1Y2YDYCRlPDdmsz59s1zUDgObjQ3OJh/zXZ1s/JwHTlcRuI03RSfWi43cRsuzr84EbJvaooZQa5NgZIxtUIREz4CVar98CH4EPGuGkEqJMGe1x3gydFqHHAxOslhvK6UcEP48e7xuGPPInPcsICQgfUsoVAd9yVlo7Sw6Z1tYD/KhnOlfY2mNxvNP+9vyCSnWzzzFYhYIUQQUJyWsQ2tdm5Vj3lg7yqbEEdrNL4K7NUA5X0q7zTvCFAeHNrTtNGe11Xh7zpu3nMnWmj4ae6a08HxRhmcLoZNTmqIxtmeZf54j+kH/U5DM7zFBfx7MS/PZeKqDZ6L5uuh6UV99bQ+x/xgDClX0VaxxyiJh3LOJ47fuKwoLpxKBEllbns+LwFxJh0EikUgkVgpkgTB6L8dhELC/Oixgn6K3MJ45kMBQE0ZQhnlGaWGuGYDcM/YxvDCi0csW/SNbJ4LJHE7ekxvtjwyocTiCUY7DP+qC7KOcC/RhcjrZky417+SqvrHvkiu8Qaje5CW6+hQDbQt6MJ2PjOGEO8MTo6ADB7P6gJzBIUKOo3MYQ21ltNU+hlft17+IYZsx2bjoZ+0ld9Mv4j/5tEO/GRvOFwcuHH5gcGzbRnal0yqTIdLhn5XSSXqge9HxGXbJUg6W0LeVST9ySt5c9gaCcdUGxsUg80+b9I95YG6S5cwZ+gRdegro6GQ4zhmyuUM9DObkxOW2n62CPm5MGVcZ7RmV1TcOKmmXuYyMsfFiG3KQzEEkc9iYLgpypvL1KX1E35gv+sn8iTKNtzqoj19x5h7bEvmWA4+OvyjoNHRqeqDDpA4vxWGceF74VQf9YEyFRR9wfpHR6bGM/FOhXPK19RtrSbnyVQ4nhLa6R9F29XCoSt+bd9bs1E/MmsucJJ6VniHmUDwrzVNlKsf69ZxyHW01JxzkpbvRzzwnZtkD2cc4NzgJ6HecMtaKNTPLYWBvsCcYX/Pbaf+l6J3mFT3IuFgvnvX6i/NyKtTTM1KdGPw9w8wL42NNx7qOeeE35oU22I88Ozzn7FEO5epHDgO2qaXYk9JhsOEgHQYzHhAt5NNzGDDeTjF6LdVwk0hsqOg5DJxYWORTIolEYu2DcZ/S7iQOodHvch1RQKHn/OD88YYBBcMr2sIXQToMEolEIrGSWGmHQQ1yMgMPI6qDNozTDj0xRCLGNQcgGDgdnnKSd7mHtoCBXT6MQt48dcjM4QhvDSiX0cyhGIY2BnIHxzgnFjk5az+mU9MR6NUO3jmtO+tQxCyQPxgZ9Renh/zo+FNOZweMJaMXIzbHgzc8GcW01wE1xnzOBf3scFhrUFaOgxb6wqEUuo+T3YxsDjoxJLd95MCLgzYOEDkU4gT0SshNY9DntcOATubAVg1t0HeMkQ6LGHd94CChz4aad/pHPoyo+m3ROlsb5ETGbXkwEjO+6qOVan8cZOKUcUDMeKg3+dGBTnPYARWHYsxz47YUJ8EYyMPa4yAPBwJjr7Xq+aB888u8srYcbow5MnW+zkO9vsxHcrQDh/EMUb7nh/LVzbxeyts9LTw/rGMOJM8OhzD1tTLNIXOJg4nsbUzMP28rjRnep0BaeXhWeh57VvqChvWrTG/DeEtZ/5u31jmHzFSHKj5v2HBQaBcyX6zfsUNgwukcnkXewDC/OWWXOr7Gk7HeetFOB+6WqrvEGnfQ2OHBWOP6yxo3N6wVfWletG+ZCONc4/ShR0kvv0XXbjoMNhxssg4DJzC9erSIgCOf5TgMEolNDekwSCQ2fhCA44QaIdNJFG+ROOXklNJyBGQKujdEnHxxUshbZ07qLUWgJGx7HjHupMMgkUgkEsvBunQYJBIrDXJT6zDg8EkkEhsXGO85NDgtOFMY/TkKpjpJxsAZxPHiM7zxlhJn1FLeDEuHwYaDTdJhwEjA0+g1QicSOA2caAiyyHqCnnzSYZBITEc6DBKJTQP2USepfDrI67feBvB6s5NFTg5O3SPtvXilsU/Lw/c6OQx84svpwTFHP8cEIhDX+7r7eB03/jfGa8vpMEgkEonEUtFzGNjznLZMJNYa0mGQSGwa4CzwBoH/uvRpKp+19rYHPW0pb3PQrbxZ5c0yn33ztrcD2N7Y8TbaUg6GeePCm1vpMFj72GQcBr6T6JSi73I5Wej7YwwbXsfyPXGfKPLWAa+ZE40MnS3CYeBVKh41C8U3ydJhkEj0YR35BIjNytpj/PO6bzoMEomND4zzvkvLSeg7vr5viXzH1X7rJAkBltHfa7VexUdexRbmtV3fXvbtVc8KezXng//98b1QvLP+GJvA6pSL/+jxqQSve0vn9WT7u1ei/VcCZZkDwjeD02GQSCQSiaWgdhj41jMntE9geAvOvuKTKMi1z4345MlKfXIlkVgU6TBIJDYNWOve+nbwyjqPt6r9n4o/4PY5MG+GzzL0+6yStxJ8kSX+ZNr/HtCf2EB9isj/sI69tSBvn+ZyOJtzIPbDIJ948la6/0xJh8HaxibhMDBZ/Ymib3L55hbHgT902myzzYbNN9+8kG8aE/j8+RAjgwXQwgT2xyicDL6nzKDhDy19+24lv3uXSGws8OqaTYbBz+dEvGng1bV0GCQSGy/i+5g+SeT0iTcDvJVHKPQHWYRWfzjmlApB1qlMr7cyuCDX/gh5yy23LN+XPeSQQyadiCGceubEnxja5/0Zdezz9nz7vHDk5M1S/2QxkUgkEps2OAwYYP3hrzcMOLjtbe79eai30B0su+9971v+aNI
btgwlicT6ADsGHcwf0jLQmau+b59IJDY+cE6HTsRRQOfyJ+z+8Nge5SCXA9P+x8De5NngsJVDX/HfE/Qk+hhHAacDnY1T3P+u+Mzr2H84gDe9vZVAz6ILysu+aE9EW2yxRTlQ6o1vb6X742p6YzoM1h42CYcBIwLvltPOvuPF+NCSPy9BTkdaXP4RvYVF4RUffP4wxZ/0HHTQQUX4G/tEQiKxKcM64sXmsHN62LrxPb08YZVIbPyw93IO+h8DJ138+Zc/0vNHaARSfzjnrT6/Xkv1x43i991333KiZZE/LAw4IMAB4Dljr5ZPb8/3eSLPJnv6cr/pmUgkEolNDw6LefPNm7QOpXFUM8xwdtfEUOLNOX8uuhJ/LJpILAUOXjhd7CsL3v50iJK8lUgkNl54+5tz26deOQS8be0gl0OcDmz5JBBywt+bAw53cSxc6lKXKp9f9xnXxzzmMUVP8wY3vWkK2Hr8kbtnjre97Y/yQbE32i/d2yO9Ce5gdtpU1x42CYdBIpFIJBKJtQVOeAKl30UdA4lEIpFIrE/4zx1Oap924Dzg5O4RAwviQJ91IjORWJcwX81HhycPPvjgYsxzoDKRSGwasP9Y8z6TZ/07VOXw8wEHHFC+mLLXXnuVA9E+Fct4L96bBD69vuhhT3qdA1kOj3rzvN0Xa7I/qpf9NPXBtYd0GCQSiUQikUgkEolEIpFIJBKJRCKRSIdBIpFIJBKJRCKRSCQSiUQikUgkEol0GCQSiUQikUgkEolEIpFIJBKJRCKROB3pMEgkEolEIpFIJBKJRCKRSCQSiUQikQ6DRCKRSCQSiUQikUgkEolEIpFIJBLpMEgkEolEIpFIJBKJRCKRSCQSiUQicTrSYZBIJBKJRCKRSCQSiUQikUgkEolEIh0GiUQikUgkEolEIpFIJBKJRCKRSCTSYZBIJBKJRCKRSCQSiUQikUgkEolE4nSkwyCRSCQSiUQikUgkEolEIpFIJBKJRDoMEolEIpFIJBKJRCKRSCQSiUQikUikwyCRSCQSiUQikUgkEolEIpFIJBKJxOlIh0EikUgkEolEIpFIJBKJRCKRSCQSiXQYJBKJRCKRSCQSiUQikUgkEolEIpFIh0EikUgkEolEIpFIJBKJRCKRSCQSidORDoNEIpFIJBKJRCKRSCQSiUQikUgkEukwSCQSiUQikUgkEolEIpFIJBKJRCKRDoNEIpFIJBKJRCKRSCQSiUQikUgkEqcjHQaJRCKRSCQSiUQikUgkEolEIpFIJNJhkEgkEolEIpFIJBKJRCKRSCQSiUQiHQaJRCKRSCQSiUQikUgkEolEIpFIJE5HOgwSiUQikUgkEolEIpFIJBKJRCKRSKTDIJFIJBKJRCKRSCQSiUQikUgkEolEOgwSiUQikUgkEolEIpFIJBKJRCKRSJyOdBgkEolEIpFIJBKJRCKRSCQSiUQikUiHQSKRSCQSiUQikUgkEolEIpFIJBKJdBgkEolEIpFIJBKJRCKRSCQSiUQikTgd6TBYA/jnP/955u9K0T/+8Y9CvbA2fKk0r4w2rqZ58TX1+CL9vDym8KAeX4T14mpq48f454X34mqq46fw1xT889K0PPP4W4r089K1Zczjryn456Vpeebx92heOb0yFuEfC2up5lmUf1ZYTW38ovxjYS3VPIvyzwqrqY1flH8srKWaZ1H+Nmxe2qB5fL28IqwX16N5fGPxEd7GJRKJRCKR6KPeL6dQb/+tqbcPR1gvrkfz+MbiI3xW2kWol1ddRhu3FBrLZyXLQL28ppYxhQf1+CKsF9fSvPigXl4RNi+PmmcWfy8uwnpxNbXx8/hbCv55aWqeKfw1Bf+8dG38PP6Wgn9emppnCn+P5qVr4xflHwtrqY5flL8Om5WujZ/Fi3r59cJaquPn8ffi56VBLc+i/GNhQYkNG+kwWGXUC8evhXXaaacNp5566pn097//fdnUy2u1y2jjapoXX1OPL9LPy2MKD+rxRVgvrqY2fox/XngvrqY6fgp/TcE/L03LM4+/pUg/L11bxjz+moJ/XpqWZx5/j+aV0ytjEf6xsJZqnkX5Z4XV1MYvyj8W1lLNsyj/rLCa2vhF+cfCWqp5FuVvw+alDZrH18srwnpxPZrHNxYf4S3Z42ohMpFIJBKJTRH1HtjTA9t9tUfz+Hp5RVgvrkfz+MbiI3xW2kWol1ddRhu3FBrLZyXLQL28ppYxhQf1+CKsF9fSvPigXl4RNi+PmmcWfy8uwnpxNbXx8/hbCv55aWqeKfw1Bf+8dG38PP6Wgn9emppnCn+P5qVr4xflHwtrqY5flL8Om5WujZ/Fi3r59cJaquPn8ffi56VBLc+i/HVYj0IHTGyYSIfBekAtIJ5yyinDn//850J/+9vfyoIThupFOC+8DZtHY3mNxc3in0WLpFuXZdTxyymnFx7U5juljJanzaOlNj7uZ6VpaQp/zRPXdViP2rgp/HV83M9K09IU/jo++Kekq2kefxsf97PStDSFv44P/inpFqE2v3VRBlqtMlarnPp6qWXMSlfHxXUdtgiNpWnzi/sx/llUp4309rwUGBOJRCKxqaLWAe2Lf/3rX4sO6LfeP+O6pno/jes6bBEaS9PmF/dj/LNoVro6Lq7rsEVoLE0vv6llLJKuFzeLf4wWTbOUMtC8dG38PP4eTUmz3DLQvHRtfNzPStPSFP46PvjnpWlpShk1T9zPS1fTFP46PvjnpWlpShk1T9zPS1fTPP42Pu5npVkKjZWxkuW0+a2LMtBYGUstp04bh8cSGx7SYbAeQFgMZ8Gf/vSn4be//W2hk08++f8ssCDOBNSGryStRhloY2tLL3wlaWPrr2zLdMq2LE4bU1s2dNJHDCH2Ofsbg0g6DRKJRCKxqSIcBvZIe+Pvfve74YQTThj+8Ic/lLCeDpi0GK2WjLYa5WRbFqdsy2K0sZSBsi2L0bosQ75/+ctfiv4XOmC+bbBhIh0Gq4wQFC0Yi4ig+Itf/KLQSSed9H8ExVjIs2gRvpZ3ajnL4VtO2jHq8S6SfgqtrzJmhfdoCm+PZ0q6mubx9+LnpWlpCn+PZ0q6mubx9+LnpWlpUf6g1ShnXZXR8k1NFzSVv+abmiZoKv+i+da0SNoe79T0y+FbTtoxankJhwwiJ554Ytnf7HnpMEgkEonEpopaB/z9738/HH300cOvfvWr4jSIfbS3n47RcviWk3aMerxT0y+HbzlpezTGNyUtWk45y0nbox7f1LSL0GqUsxJlTOHv8UxJV9M8/l78vDQtTeHv8UxJV9NqlbEIf9Ai6VajDDSVv+VbjTIWpaWU45cO+Mc//vFMHdABMnsfHTD1wA0L6TBYZYTDgGPAwvFmwRFHHFGI4BgLzSJDsQjjfiotmmapZSySblH+oEXT9Pjn5bFo3Xr88/KYF9+jRctYCvXyXOlyVqMMtEg5Sy17kTJamsqHenxjadvwqeX0+JaTdox6fMtJ26NefXphPZrKN0ZT0y63jCnpp/L1aJG0U/jscYREhhDOcc4De18KiolEIpHYFBEOA/ujvZGz4Gc/+9lw/P
HHl30zDC09mrpHL7KXt7RI2kXKWGqdFk23lHLWVRktz0qUMy+PefEt9fin5DGFp6Ye/2qVsUgeU6iX37ooo82zF7Zc6uW3Lspo8+yF9WgqH1pqGWgRvqWWsxy+qWnRVL4ejZXTho3xBYnjFG91QPubPTCxYSEdBquM1mFAQAyHgcUUC9AiQ7Hw4r6memH24urrlm/svg6rqRfXhsV9GzYrvqaxuDoseHph83ja+LjvhcV9GzYvPu7buF5Y3PfCxnjG4uO+pYiv+errlqcNa+MjrKVefH1fx7dhLX8d1qOap+WP+17YWHyPWp72uhc/j2cW36y4NmxWfBvXXrcUcXX82H0dNouv5mnjWv5Z8XVcfT3G2wsfu6/DFuFr4+v7Nq4Oq+Pq65avd9+Li+s6rMfX8tZhbXx7Xd/XYVP44jocBhzjBEbCYpwuSSQSiURiU0PtMLA3/vKXvywOg+OOO67sn+EwqPfX+jru47oO6/G1vHVYG99e1/d12FS++r6NqyniejxtWI9vXvzUuPq65avv6/gIa6kXV4fFdS+s5e+F9fh7YXHfC2vj27Aefy8srmuesfs2rI2P+15Y3Ldh8+Lb+7juUfC3aXr3bdis+JZ6PO11L34eT496adr7NmxWfI9anva6Fz+PZ4xvVlwdNi++ph5fHd/GjfHX8RE2i6+OD4q4mqe+bnkibBG+Nr53H9d1fI+vvo8wv/Y4bxeEDuhtg3QYbJhIh8EqY5bDwGKKhSZuKRQLtxe3UrQaZaxlivajeDiGl7WO66VNWhrV/dpSj38eLSftVFqNMlBdzrwyI76lHm9NLV97PSWPKbSu8q2pzXe1yljX5axGGb37WRSfI0qHQSKRSCQSfYfBkUceORx77LElLAwtU6nekxfZnxehNt91VU5Nq1EGWrSc4EetDljrgb00dVjSYhR92FKPd1OiRfqh7rexuTpGNV+bblbcIrTctL3wltoyllPmLKrzXY0yevcrRXW+U8vA438L0mGwcSAdBquMWQ4DC6pdiPFHITX14tqweTxxXYe11Iur78fiezwt3yzq5VHHt2HB3wuL+x7N46/D/BqX2GDdewg6EXTUUUcNv/71r4ff/OY35S0RD8SaF9V5xnXc12Fx3wuL+x7V6Xr8bViPr77uUZ0mKNoZb8bU/VNTm0edb011mui7yFe808o+3eXXvfCa6vS9fNv7Xljc1zSVf4wnwtr4mmre4KmvW772fiws+lH/+I3vCaKYq/PGLq4jH/w2fRRjX4fVcTE2kX/UMfIdyzN463oEiYt0kSbyRy1/fR/59cLivhcW9zVN5R/jibA2vqaaN3jq65avve+FxX1Ns/h7YW18G+Y6HQaJRCKRSPz/GHMY0CHsnSHD1BT7ar2/1tctX3vfC4v7mmbx98La+DYs+Ht8dXwvvL5uedqwuG/52vixsCnxIXPG+LQ64DHHHFPGk1wtHm/NP6Wc+r7lm0U1fy9tGxb3vbC4b6nHH30SOkDI/DV/na6+HqM6TZ23+9AB9T3ettzIO9LX+dYU8TX1+IJ6fIumW5S/Dav5apqVJsg4RR+FXM5e4Y/WIw65jvkaFHlGXD0mtc7WUh0uTaSP/ILq+tX5BW9bjzZNpIs0qMdfh8V9GxbXdVhLdVyPt73vhcV9y9fGt/e9sLhvw+K65undt3G9sLjvhcV9Gxa/tQ6YDoMNF+kwWGWMOQx++tOflut6sXmoB8VirMNqauPb6158fT8W1tKsPOJ+Fs9YWE1tfHvdix8Li/sezeKP+3o8CIMHH3zwsNtuuw1vetObhhe96EXD0572tGHLLbccHv3oRw9PetKThuc///nDa1/72uEDH/jA8IUvfGH4yU9+cqahrB3XKGNqWNy3NI8/7pcSVlPEaYcN2Wmob37zm6U/3vGOdwxvf/vbh49//OPDd7/73bI5BG+dX+RRh9UUaWwq+vt73/ve8OlPf3rYZZddhte97nXDNttsM7zwhS8sv+6Vu+uuuw6f//znhx/96Eelr1uBoc67vg6KsFlxi4bN4qmpjqt5I2wWXxvXCyMQUkS/+tWvDh/72MeGt73tbcOrX/3q4cUvfnGZv694xSvKXNa/e+yxx/C1r32t/AF7OBKiD/0S0j2n9t5779Lvb37zm4cdd9yx0Fve8pYzr9Fb3/rWwvPe9763zAlrwfh4xoUyJX/PQd/v/cQnPjHssMMOZ9JHP/rR4bDDDittwRttch1z49vf/vaw5557Du9617tKu2LNaS8BJeoeadu+6YXV93Fdh7XhvbA2ro5vr+v7RcLauDq+dx/Xvfh54UFtfHvdizdW5o1xT4dBIpFIJDZ1zHIY2CNr2aXeT9uwNq6O793HdS9+XnhQG99e9+Lr+7GwlmblEfdtWFzXPHVYSzVPyx/3KORO40O/Ia+Sc1/60pcOT3/604fHPe5xw2Me85jhCU94wvDc5z53eNWrXlVk3wMOOKDIvcZYnjGuUU5dRoS14XVYXPeo5u+lXSQs7luq02gL2e473/lO0QHpf+jDH/5w0Qu1uTbgR771dY8ib9d0zO9///tF53jf+943vOENbxhe9rKXFR1Q37/mNa8psv+HPvShYZ999hkOOeSQImvWOmCbf1Bbj/a6Fz8rrA6vaVZ8L25qWE11fPDENQcLne7rX/960fHe+c53ln6jA77gBS8o/alfd95552H33XcfvvjFL5ZnEUcCA3w9fvRJOqB5TWekA4buV//GNR3wPe95T9E9999//+GHP/xhGR95yRcpw1yhw9Mb6aPmER3wW9/6VinTWEabYkw56tgH9tprr2GnnXYqadVp3333LXVU/6h73R+u6/s6rKap/D2+Nq4Oq6mNm8Xf42vv67CaenFTwtrrXnzct3ER5rfWAT1H02GwYSIdBquMKQ6DdtElrT4ZhxB2fv7znxeBaKutthpufvObDxe5yEWGs5zlLMO//du/DWc/+9mH//mf/xnOda5zDec4xzmGs53tbIXOf/7zD9e97nWHRz7ykWXzs1kTfoxtbH4oHrIx7rPGPuJr6vEFiZcvCuFpSrqW2nz0iwe+zZ/gccMb3nD4z//8z+E//uM/hlve8pZFUDCfbQx4p5QX9SLc/PjHPy4COWHmzne+83CZy1xmOOc5z1n6/KxnPWspC+nn//qv/yp9fY1rXGN44AMfWIQNfT1lLUW7tCloah8FT4+/javT1Tzio/y4HuOfSpEHYYqwRQikzNzoRjcaLnjBC5Y+M07Rd9GP//3f/z1c9KIXLfP7yU9+cnHCHH744UXoMobIKSrC2QMe8IAy181/9O///u8lT2MjL79xb9ysl+td73rD/e9//yKYckqoH0ERvvzlLw+bbbbZmfmh61//+qUO2hRCa/QVZwBB0Vjf/va3L2tPmstf/vLDU57ylKI4xKv9+Ns+Slo9Mn8I/CEsCkuHQSKRSCQ2VUxxGNT7aNL6IfIjedrbA5/5zGeKM+C2t73tcOlLX7rIz2RfMnTogOTokLHPc57zDNe85jWLXvLGN75x+MpXvnKmDljL+XFvzNu4liK+ph5fkHj5hiwc1
/PStdTmg8zfo48+uhiH6YDkfTraVa961XIoid4cc3xKeVEGmZFNRH8zZt/97ncfrnjFK56pA+rbWgdE+hrPPe5xj2G77bYbDjzwwHKoKMavVx6KMqNNKNo5r87iI31ct/Gz8ol4VJc9xj+VIk/yNj3p/e9/f9GL6OZ0MfNWP+q3Wgd0/b//+79FV3MA8t3vfndJT98KHdA6oAPSKS92sYudqa/VOmBQ3NMV6Z50dOOz7bbbDl/60peKXkC3owsYb/aVyE869aDjKZO9TL8gacwRB8rYVu52t7sN5z73uUu6C1/4wsXm4pCh52r0Ra+fktY9GS92oFoHTIfBhot0GKwyZjkMGNFis7CoktYP6X9jY1NyMpuQSAiKzYxwcoMb3KBsTASa8Kg7VcLYesc73rEYuW16NuYrXOEK5fQJj3kY0uOBqgzEUI7iHk/8BvV463hk/hA8bO7y59k1rygi5pp04ghyMQcj7zofaSMfeYaH2CYs3gPfyZnXv/71w41vfOMiFBAQbnWrW/2Lw0Ae8qrz7pEy5H/QQQeVfiTcEL71N0FR/9/1rnctJ3ie97znFWfC4x//+GJovtSlLlX6Gu8FLnCBEkZAZ/BWT21Qhnoox716CdMegiWhhEE8+ki/hAAX9Y/+D4pxiLHAG+W5tzkiAle0se1TY6N8v3EqXjya0m81ydMvQZ1QRwAz9whz5iFhkOOAArP11luXUzrPec5zhoc+9KEl/HznO1/pQ/yXu9zlhoc97GHlBL96em5RDghiD3rQg4qChJeSdJWrXKWMjfFw2soYPfzhDy/Onqtf/epn5osY9b2J89nPfrb0NVCkrJngQerDYaA90V9I/6gPQdbpls0333w473nPW8afwvDUpz61OAzi1X78dR8lrS5ZG+kwSCQSiUTi/2HMYUAWtG+GLJe0foicEvqCU+sORpFJHRAjn5J7r33taxdZ2Ju6DJt0QG+Xk4HJw2Tv4L/4xS9eeD/4wQ+WA1F1GT1dIqiuU4TN45UnmZlhVf3pFaEDhp4h3vxDoaP18gl9JPIJHVDZ0sqT0fYmN7lJMTzTAa92tasVh4H5HIZeeUW+PRKPyIre4qdX0uNCd6Bfku/pFIzV3uSnv5D3GYwve9nLnqkDGptb3OIWxTBNn9QmfRFlqY82hw5ILq11wFoXa3VAVPeR63os5K0sfSZee1odMPrUb4yN8jmT6DZ1Hq7r8uZRjJm20LHMxWtd61rFIaBv6EoM8fe9731LnD5E7BN0bvozvpiz97nPfcobAv6Q3XiHDmgMxMfYXOlKVypjw9FgTOh4j3rUo8rYKN840ivxO+BHt6SjajOYK0984hPL/MFj3XBC0fG8gU6vjjmiX/Q3h4E3GDiU5G/8OTHCYWCu6j/U66ukdU/Gy/y1BkIHFJ4Ogw0T6TBYZYw5DHy6xsMzFlq9EbUkblY8msczJY95NCWPqeXM4lutMpAHnM3aJu41SEbQS17ykmUTs5ld+cpXLhvi5z73uSJcEAgirWsbNScDgccGanOMTdLmS2A03nhtksrwKp1PwzBy+3zLoYceWuJDUCFE2Bw/+clPls+1EF4JpzZFRnvx8XAmwDGmfupTnyqvahKanv3sZw/PeMYziuODIGeT9eohQy1BQN2VVfcBQcac1E4CIccI4zJj/fbbb1+MuRwghA0bO6GBUMJhoG6EYv0aAk/b30HKQjYSJ885AjgH5MXIzXD9iEc8orze6xM0IdBZNwQJr8PqP4Zjjhx9jbzd4RVWfVMLfdaYfNRdOwiehBsCEIFFX+lfr7iqjzLUXzvkE/1i3PSv12IpC17j1GYnW5zMMP7CjQMDvg1SWuPllUmvUMbY+KyVX/deF5U3Z0cIj+3zIMapvo9+pngaG28BONWhLzheCPTGX33krR/0oV+vq3rF9FnPelYR9kPJceKE0OeNDX1HWHTqh+ITzpxLXOISw/3ud78iVMbnt/Cph1e3nW6RB0eB9SNPAusrX/nK0hdgHt7pTnc6c+wQ5cyrzdoXgn3McWUYd/1LqTDutcPAnFWH4O/1WUvz4tFayqMXHrRSZaxEHua88bJm/RoPikc6DBKJRCKxKaJ1GJDB6AXkW3tnK/P1aEr8LJ558WhTzAPRAckqdAA6j0Mx5FJGT9c+Q+utcwejyDWR3i/Zk9HbwSknqkMOpyORV8nK5GPyvfEme9Ox6BJ0ko985CPl7WCyObk+ZF7ycugdoQPSCeMzPMrHaz7RIcnq9ImXv/zlRXejZ5Dx6ZryoCfEp0Lpm1FWyNrqZ06qnzeVpQsd0LV6KkP+3ggm25PD6RDke30Thvromx6JJ+NrI13D52bJ//LTb3QMh5y0Rb/qO+2Nw17edKev3fve9z5TLzFOjNh0Km1Qj9ClQkf2eR46N4ePE/h0QIes9JG3FOicTsIrT/2lt17lYb36hI7PoeoLjiK6lXzp//SecCJ5W55uGH1LT4pP+ugn5dFbnvnMZ5Y5YN4YV2NoDML50PabOtX9GuOnbuwIjPfehAm96zrXuU5xEjgEps/0nX4MXdqc0xZvxdC9Y84y+rMXGB/zlSOidhiY3/peX/h8lHkrb7zKkVa/WjfGBdEH6bz4gY7soFntMHAoUx/q/5hHyBjoF3nHGwbhMFAnDoNwRkSa6Ku6v3o0Fl+nn5XHvHg0iyfilpMHWkt5+LVWQwc0Hukw2DCRDoNVxhSHwawFyABTX8d9fd1SxI3xtOE1bxs3i+bxt3Hz+Hs0L00bN4+/JRuuMSLsEFyckA7vPO87YSm+rV6fAEDGLrzfjKWMxl5ftelKT5BioGZcZqgncBASGMhtkAQjp70JGB6uFAr5myM26cc+9rHlJIXX75xeYejeb7/9yuapfQQMzgDCFWHJaQKCgvKduohXZtGFLnSh4vnnSCA01gIj4YaxVnlO0diMnR7RBuRaXaXXP/ISJt9b3/rWRUgihOqH6J92DOJeeeL1NyGL4ThOInhzQH/4/iEh3CYThn/kmiBB2NE/BC1vJ7zkJS8pJxO0i3CoPBsV4Z9we5vb3KYIFuqr3tFHxsBYu/Z5Hp+7IUzKm0BjbG18xs0JfeMpvdcg73WvexVBheAsX0Zs1+YQB4X0nBT6lOCmT5WrPHn4RdLJ9yEPeUgRkBnGldl7JkR/itPP2qgs7TdP9KE81cP3Kgnj+GsHCtKH8iC8UQII0RxEhEcOLuEhLMYnibRR/gRSY0RANYfAvPWMk6c+VzYHjnaG002/mq/gk0SzHAbqGoIf0h/aGQ6DeMPAmiD4U6rUVfn423lXX9f3NY2F96jNp75v42qKuLH4lmreNl17X1PEjcXX1PK112N5RFwb79lUC4vmWzoMEolEIrGponUYMIrRA8m59s2evBdU77Ptntve1xRxY/E1tXzt9VgeETcW31LN16YbyyP4Wv4xavna614exofsyAhKdg1DJh2QDE+HM0bGL+TTkDfJ13QFeqIDXmRbMqr0Dp8x2jokQ0alJ9IRGGnJx/g4GRirHSij85DtGWDpevRFn3ahJzC6qgsDMD1HPehuHBLe8KWbhQ5It/FW
sHTe2HbvjWN5MSyT38290NfYIxh6GbIdNmIUlk/oZn7pSA7/0BHFk8HpG8odcxiMjQO5EL//IWN8jv5SR/osvYTOXOsu8lVffUT3cFiL3kdvoefRvx2qY0BXvj7yxgHjvjdAfAWAThw6IJ1Gv7j3q003velNi0GfM0D5xlX5jOt0JLpV6NfeanCYig5sntDTHaZjM6CLOqglH33qrXw8ofv51X/mmXGiv22xxRalPXQdcrS2R7/1+lG9/HKqaL95qw+RcWZ74AAhh0c7QgeUt7Ey7nGYjtMEcWzQ4/SfvtSn5l04DHzmKMaILn7aaacVAuVwaNHL6XThMDB3OBHUFfSt+9ZhQBdVpnxiHhlz/UF/5PiqHQbxhoF+Zk/DH3Ov7q/os7Yva5oVV1Pw1bzz7ueF9yh4a/72ur6vaVZcTcFX8/bu47qmlg/pe2G1Dmgc02GwYSIdBquMWQ4DD+R4uMXiS1o90u8eZgR5m7RTJASs2HQZOgljNlbj18vD5isfG6ATKE5XExJtZgzLD37wg8vG6iSCT6cwcDKMy5+Q5FVBgprNLl7n9KC1ATLMEmJiw1U/p6nNoRAUvZ4XggjBzel8hlXGV8IvYYWQFxuzU9kEGn9Spf6ERnyEH4IMHn1AcPK5JUIYYVDZBABlRf8QdFqHQQh2rlG9gfsVZzNhzHYaRN3kpb8IVYzfhBV1IzTUfY0iX9eECIJjkHtrTF8SqvV9fHeRcEiY81YBR4MTKtrNgaLtxkI/x5shBFnCkDXqdAhnDYEz2q6/CbdOUXBIEFoIrpwdHAy+3X+zm92s8BFOGbcJl9748CaEfiXkcx4R6LVfXpQJBnvt6T0bov3iQpgzNpw46qX+FBAnhCggIcDXeUQ+fts+ROKkodQ6+VI7DMxDwrf6U6LU1Vh6i4LDgxBonoQCQLDmRKNIee4BIbb9JJGxCYdBKDH1fBlzGJijHDytwyBp3ZD5gnpxSFw6DBKJRCKR+H8YcxiQW+ybIY9tKDRPDlgpWo1y9D3di97AKMwYSSYlYzqIRa6nC5BhyJhtejKnccVDXiYD07PoE/Ji4HQQh47EYOqtam9SK4NeRqali5gPtcOArkfvYEx2Apw+SWdg7DdvEN2SoZkuQs/AS88gn9Mr6RoMx3SROMjG8KuOHBL0P4eb8NEbyOt46IDkeG/c01kY3aWRTzgL8NFvGKdbhwHSV/o2ZPnQK1xbA4z58qZ/0YH0V23oJkPq17bP5RPrxdxodRekXXQjp9g5WtRX+xnVfY6HLuHAGh2aPkyHCx0QH52FDkX/I8MaN2/X158q1t/0YWPt8z76neHemHgTg/Gb84iebJwdtvJ2tr5SrtPyTu47MCav+IwsG8EPfvCDMhei33rt15fmAP2HHhRzypjQr7zVzk6Az5yq86jzafsw1pw4c8Ocrh0G2ix/TgE2DYfc9A+91zziIGM/CFuBvtfvnBHeEgC66ZjDQJnhLEHqrl4cET2HAZsHe0nY09r5krS6ZO7UOqD5lw6DDRPpMFhlzHIYEAo8EGORod7i69G8+Hk0pQy0nHJWowy01HL0vc3FhkRAseGHUIWcDGdYbh0GkZf0xpRA4dNENi0GaQKETduGSZiziTJ4EgC9JhqnwcX7Y1iCnc2udhg42U045M2PjZFh2GlqwhDh1EZtMyd8INdej7SxKo/goj5Os4QzQH5Os6sLgzNDr3IIKuIZnm3INmabu/bZrJ2WIGD5FmH8h4BNnqCFV3lO0egvwiXBm/CEOAYQgYHgjDdO7XPKEJQYpG93u9sVAZLwoUx9EWNVU/R9CEJBce8kAkdACL+EMWOpfKeA8IQg6sQPIYXQHt/pJ9gRNAlDBK7999+/OF4InuLDgaPfjBPlL76Hqe/1AadDOJ8oHYRUTghjhhcfB5Px4ejBZ84QVH1ySbw2jj0fhFNA5Kms+BSWN0E4ShjfQ/DSzjZ9kDj9Zg6HswcJD4eB0yTRN+aRa0qDsswFv+aV9moDPmR+EKQ5T/Sz8sDpG5+UCj7EYaDf1FE9ou3q4ZnphE+MU89hoE+DP9raUrS5FzdGkWZq2ql8LdVlTMljCk9Ldd5T0k/hacm6NV7pMEgkEolEYrbDwL4Z+2zsuTXV+2tLU3haqvOekn4KzxjVZczLY6lloEXKqHlckxuNC4O3E/2hA5JfvVXt9LVxcoqafBl51GnJ2XQ4+lS8jcwY6tS5N5cZn+klYXimIyqDvMyozLgbBnf6CR2Pjka+pXeQu8M4Sl8k2+NxEI1hnP7nZLi40AHpb/QTJ+YdzNIeZdL16KV0Bwd+OA7oIaGXkuPVibxNV9NP5Dj146yghzJO04PUi/6ij+hW2kj3oAM6/ETvCx2QTiicHknf1lfeRqabqRs9jaGZg8YhJDJk6AIxXjUJb3XAWF/0DWXSvbQpjNbedOaokBa/Q2l0PIfeGMHDyM0gTb+jb9Mt9IPDbKGrIQcDHfjzdgj92HiYJ/qdLuPAHz1RP9GT6M7GiS5j/Kx/Y0dX1QeRrzc49CenQdSzbnfMO9fsSPpb3mHQpx/F+Bqz6Kc6fT1/zWn9XJMwVDsM6Hnyp6/TAc0jZeqH0AH1W60DuuZkMRbRHutIn/ccBnRFhy+NY9RRfegS0tONZzkM5B9906PIsxc3iyLd1LRLKacuY0raqXwtresyQgfkgA0dMB0GGybSYbDKmOcwqBekhRZU39cLcQr1+KfkETzz+IJ6fFPTB9883h7PlHQo+Gbx2xSNDwGL0GQTtMHbxGyGNiibLUEu0siLUMJob+OzkRFCCA6Mq4zuNjOGZyc0bGaEiHjDIE6q1w4DcyFOZ3jQSmMjrB0GjM6Mo2GM46UndDktQYhjvJc3gY+xXFucSnEf3ygkwDAoy4eQ42QLo26c8OBciBMe6kFYUSfE2O10hI3dBq9eypQHB4BTLe7jNHqPGMR9gomgSniUl3B5MRoT6DgerAlCeD1urkMYIJwRLBjN9a01pQ8pYoQwJzlCGDEOnEH6Wd9FP8vLGOI3LpwX6kEYuctd7lKEO4KiceMwiBMcFAoCtXgCVTh6jMk3vvGNwfdG6z9xVg/CsHRB4qIP8SD9T8CltPhsT7S3fj74jXBlq7uxZ8CXB2GRQEUBMT8IxOZ4pK/zMIcJuPqOsK8vpdGP4uVvrGqHAQFQGYRE/UUxIhD6dTqIYmIOeO3ZvOQckI9y9REI671hQMjGZ2ziuaju4TAgTN7hDnco81m/UbqMWzgM8OuXaGO0dwr1+KOf2vCgefEt9fin5LFIGWhqGXXYGE9QxAVfjz/u02GQSCQSicT/Q+0wsC86rMOYR/60Z9Yy3hjNi2+ptz/38qjDxniCIi74evxTwtr4MYp0LX8vLMLr6x5PS3gYMV07NR2yNGII9UYAGdn4GSd8fmMcyc30QLoCI623jcnH0tO/HNBidJWHN7vl57CLeHpBOAzYBOJTLORyxurWYcCI7aARPUcd/NK96Kn0V/Ixfo4KOgYdk87nNw6N0U8ZlDkYGObV2Zvq5Pow8DJic1jEQRx1Ir/Tucj15HW8dBuyvzDOC8Zrsv2
YDkjn8b93dEZ6HkN9rX8yyPvvNW8vh/4ZMn2QdiPxoQPSZY2B+upHehEjt/5Xrr7zpjRnhXGwDqNNDM0OhfnGPv0XPx1NvRyGoxsZW86NcBhot3bQSej/8jF2dCq8nC21A0LbQgfUN0Hu9WPo51FXXyow/toZenBNwvWL9nqboX3DwDzQx/qGDkUOly76MPJQX7q0PtGHoQOaf+a3/jV3vQkRDgNtN1fc08HMF2/Hhw7I+aHv6NwcH/rWOlGeutC/leNQXszJ1mGgP/GjWQ4DNhrrTR1be1pQPXdmUc2/SNpFykA1/9Ry5vG1cfV9pA2q+VqaF99Syx/36TDYOJAOg1XGPIfBrEVssaHefVz3woJ/Ks1KF3F1/Cz+MarzadP24urrRajNZ16czcXDzHgQFn2j0OZlE2MktekRPoybcSRk2GBt5t4WuNnNblZOLjCSOiEfJy8Q46lT7U6VxGl+GzsDtvwJE04oEADNBfWwUSrLHzSJszHbGG3OBCDlysspEnk7kUHokB/nws1vfvPiWHC6w/8b2IB9I5IAGTzhMCAI4dEGeYTA5o+ufB+SIGXO2uT1mTLjRAjhx2bvk0SEKp8R8hkgf+LkZIVv/BNKkfKRk/BOmRCatYGB2Ilx9UIh1MW3EaPsEAL0PeGJ4MDZID/lE16lUz5BxyeEjFsIYfJVH8IsQ6Z+lhdizNbXBCJjRxgyLl7P9WYIJ0nrMOCM0UZ9qLwQao2btqmb12wJhGHY9hkfAn38iRihmOLg1Io+QcbTp3s4iwhv5mf9fIh5616/aEucWIo5pf5ORSmD0wG/dqqfdDHfXRM0zWHOEXPAXFInbRAfbxhwaoUCQEDTl16n1Tf6j/Bn/RgzAigBwdjpF/WM+qsHxUy9zAtzSJ7InDIXCcDSoFBU9CvFxsmjmO/SEu71p3UlXbQv+qld63HfC4v7eTQvjzq+DQv+eVSnj3T1dcvThsX9FOrl0cb1wuK+JXHGKx0GiUQikUjMdxiEjNTupTXVYT2eNizup9CsPOJ6LH4RqtO1ecR9Lyzup1AvjzauDkOMmMbAISCycBgyyfr3vOc9i1HdmOFjGKYDMoSSP+l+vvvvTQKGUod+QrblGCDjO/FOVyBbO5AVDgPGXW/b+pwnGVr+5Ga89E66HP2FLhPGUfqKuUP+ft3rXld0oNABGVJ9SohjgS7GwMoZ4eBaGM/jDYP4Vr2DbvQTB3G020Gg+iAOOZwcR7Z2sI6eQl7HGw4DOo+3Cxw080YB/Sj0PmXRGRDdU39wctBx6Hr6N/RT+i6915vo7CTKrXVAdaG/6B99xtHBcE8voLPRRTku5E/fou9Gm5X/7ne/u+RrDENPoXtrqzYzftOD6TvGVRnqGQ4DbZWftjNchwNCHekq5F68HCgcBsYDr7c3OIbomRwT9D9kLsiX/ucUPnLwzv8KxH/JybueqzF3lefXITXpvZmgbsgYerNEP6hT6KjRj9qtHx10xGcO0lmRz2DRDcMhQwd0KM38i77Udv+3wCZgLnLWIGuE/K9uPR3Qr76nM3oTR/9Ef6o/Hc88kgaps3ykdWiMA4KdJXRrzjH1ZRcZs6fV6z2ue2F1mlnUpq/DFuGZR2N5zIuv7+fRlDym8NQk3JxLh8GGj3QYrDJahwGDmg3Gw9WmVy8ycUGxIOeF1XH1dcvb3o+FzaKWP+57YXEfYXEd7a3jW+rl1wuL+zbM75Qygs+mZJMjgBGy4jVGxHnACM/wTliKDYlg5DQ8Y3UIa0E2M5523/JjzLSReYAylDo9HoZnQiWngrzNBbCZ2zAJZTZ9wgthkUHYK3w2RgKJ0wMErfjsDgHUxu9khg0ZtIngQWgLPkJjOEEIA+rnTYY4CUFgsyETWpzcZ+C1+RNeGLMJlnjVSbsJyIQjvPpTX8aGHX0VFAKfX5sJYZXwSPjUZ9pKGCFYOaXvBAo+Gw5HAeE1BLt4pRepBwGGYd96Y+gmiMlLPKHWmDKiM7ArX37aRWAibBGyCS34CSH60ngRhsLRE6/shsOAUKVO8ol2E/o5gAi9HE7aRBBSH8KVcgPapnxj5KQNYZezwDwzZ2Kt1PM6SFl+jaH/TaBIcCopjxKivcaV08PYaHMIbwRgygIBO94cQD5rxGFA6KPccgZwwnhbIxwGXq0l9Kuz+uOL9itDvaLOQVF3PBQiypP+ML/DqRNCqHnt+RiKkzwpAPXJEvyUG0IuoZUgaX3FnPNb91es+ZrauLifwu83ymjjWqrDg68XFvc1KWOl2lJf11SXgdp0Y+Fx34b5NS+Qe2OYDoNEIpFIbKoYcxiQVe2Z9uB6L61pQ5QD6vua6ripfPV9GxbXNU8dNotq/pDfycZkTU6DkIvJ+/QDuqGDK/Qqaegv9DQG69Cvghgz6QsMxHQWaZTlsBOjaOiY5F+6It2E/mmekNHJvE6p0z3lhZfs7c1dOgKZl3PDoalwBPh1apsOSGcC841exIEQny0la9MJpSeL02/ivxLE03k5IeLzsPJiv2DkDb0zyqQzOWTGSSBeebUe4Dco5ph4OoMDafpQfzBURx+S8b2JLk9toRuTI2Pt0MvoKU6140d0HuOgTfK1vhiX9a14ehF+OjlHj/GTJ0Mm/cib85w+3nSIOtC3lUUXi4NZHAriw2FAv6LzaZNx8xs6mT7XT8aYvspA7u1qbQ/oF3nTGf23Hl3aNX1bO6IP63kd19HPyvMmuPoqR1+YM95ycZCOk0R+eMNJon98XootoNYBtZsuSVelg+GjA4bjCo83VBwi40RTNnuDvrSGYh2pt3oGRb3FKZ+Ox+7BfhIHNNkV6P76wLzUT/Rl/WqMOL7M23Bc0OvNG04zjjhrLOZYlBnl1vc1jfH10rRhdTk9/qA6bhZfUM0T7enF1dSGx32Pt46P+6W0pb6vw4xDrQOmw2DDRToMVhljbxjYXG2EvcWG4r6mWXE9GuOflc+suB4tNa+IG4tvaYx3Vh4RN4snyEZng3LqnuHW6YAQYAgbDJQ2YwZyhlZCE6GK8T82vCCCGWOmjYxBmMBh06McyJtgEmlseoz9Tnkz8DI0Oy1hc45TKgQOm38In4RJJxts9jZv9bPZMvQzdCvXSQVGbQ6JMLIiwiIhk7AQfeI0h/bGp4vUjdHca43y8EqteAKXvJQnL4Z6GzbBlpPDPNbOEHBQbHg1CcdDIIrXN9U9TpmoA8eFOugrccjbH4ziysUTfafNBM84SaFd3pBwkiNOhOB12kR++o1grb+Nob4lAOpvgrx+JoA5RYQI3E6rxCl+QiWjue+DOiEWbUbqEK9O3vnOdy6vNxPgzAnCEUFLXsrWJkK6eYaHcEspIUDabKOvYo6281i8Mgl18f8YBNqYW/IksBLozVvjSajWJ+ZLOEi0Wz94a8BzSZ42ecKgeWJORp6ENW3wdoi26++6jr16BqkvIVfbnIqhJBm/mE/GVH9xIKkr4dEpH28SaIf64jVPzDsKBaeSeWT9Tq3HGPX4p+QxhaemHn+EjeUzK65HPf4I68W1PL24HrX8rmth0fxIh0EikUgkNlXMcxiE7BL7ab2n1j
Qrrkc9/gjrxbU8vbgejfHPymdWXI/m5dULR1PLIZ+Se42Nt5HJuWGcROR0+h79jLHYiXXyKRk2DM1BcZjKW+uMr+ShMPCSqcnhwUs3odcwktNNyNvS+rxsnS89hTGXbsK54HQ+o3R84pbhl4PBoTTGWKf+t9xyy+LsoE+FnE0XoW/6XwVzkB2CgVvZ5Gx85H36jjrRdXwjnzxOR5E+nBh+6RP0Tbqz/gtdKPQX5LomYXj0Ox3QYTR1os/KN+R8/U1vog8g+pt61fqzOqsf3Zj+FToGZwN9SvsdeKJfyN+nbxjSGdh9+sebIfpQW+Snn/WFPpEf3YKjiL7ikFOUG5+t5TDQJuUibaIDMnDTtc0huhj92tcIzCsOGl8MYCeg5xpn+o3PRzkQxyHEHhF9FXO0ncvi1M/BNXqqN17oyeEE0CY6Fd2dzqQ8eiedSrg+wad+3kynU5lb8qbX+jyR+cpRVb+t4a0RbedgCydBu8bG1lyMDz2XU+fGN75xKT/qK3/zzNwz5/w6dKdv6KrmhvFUX4c2ra94u6Duq6C2z8Yo+Hq889KPpRujHv+UPKbwBI3xRfis+KlloJbfda0DpsNgw0U6DFYZsxwG8QmPetHZoFDcj1HLN5auFz7G26Oab1a6Xtws/ppavrE0vfx6YWNU8/XSGQebmXAGTaeXCRY2JgIdI6tNn+CIXNt0nfxgqMVnYxYWBmInIZzgYHi2+Rt3nnmCH37CCSHGJmlDZrAm9BE0lO2kC4GC0MMA7g+qCAdOoXMayIeBNfKwqRKo5MNg7VVUAiijsbLkxfjvVUv58OA7SU7YcIKC4Vh7tIFzQDvkR8D1LXtEmOU8QepJMCIs6kNt1I+9/m3JZoIYpp288FaD/LVHXaO/CT8EKgKkNhIiCT8M5P6XwNs68omyg5xKcAKEsEOI4oTRLvnWZGwJcoRORnNjT4i20Rk3nzjSN8ZC+caaUGdcrWFzRvnRJr/WOCO+100JafrK2EivTL/qomxOHYJrnPhhAI98xij6Vzvx2pyNgdeVveqqTMItYdo4mq/6UF9SHsw18V7pdbqGcG0+EP70IzIu3qIgWJtP8iLccbYQIrWdUFvXp65fXLckb3WmLOsjJ2/ky3lijPRPrDO/+orQrd/NaX2qXgT1erx7ZdVU17Gtb01tXNyP8dc0lrbmGQsf4+1RzTcrXS9uFn9NLd9YmprPby0spsMgkUgkEpsyeg4D8otfe2Ytv9T76Tyq+Wal68XN4q+p5RtL08uvFzZGNd+sdL24Wfw11Xy9NMYGCWc4ZVQnm9O/Qi9qdUDyKX2FfEp38UsfY9Skd5Fv6VdxwMaY+8wLOZreQRYnW8vbNV2BDuhgGt2N7kVWp086Ie+gEtmcsX/XXXcthmc6BP2CviYPOhtdj57k7Wh6IB71IsvTAb3FzbCtTmRxuiXDP0O4tOojP/K4POk/Djo5RKZ+HCV0MQZop8LDeFzL470+rin6W984/MNxQA9j1NeG6O/QAfW5OhkLPE6+x1vUygwnRIyhcI4VfaD+kac2ySfaR+9gpPY2Nb3fQT+6H6M5fcUJdm9cMF7rC+3WF94GMA7Kq43m7o2RPmWE52BiG6h10NAFjZs5Q9/3WVYHAtU9nA9tn9UUfRt87AvmBx3J1wDkq8xaB0T61Fwwztr0/7H3FuB2Fdf7/xfa0gLF3bVIcXcvlOLuTiC4uyZAQtCEhODuEJzi7g7FrcUdgktb+vvPP5/hrnQxnS3n3nNP7s193+d5n3P2zJqZvWf2nlkza4T+Le8o7wHp2uA7/Vj6WkzKI7+5V95xJtXx7tH3tv5vWs5cp25GyyM7hxHDDoYb7oe+HnlDP5U+P7+WT7y73C+GEQxnpE8axJVLJ2V6T7n7M5lULidbxKqw5lYlV8ZceO9vbjm5nGyOjYQ1d34Zw7BxDBkMui9kMGgxqgwG/mMTuwYpDxo0iOWaBpxloCzHZOYGCgBnBDB7hBUDKA1YuVEwafwY+EZBwVjALAWUMipPaySJHyWDMDR6KKXEwzYxDKCieDGIz8wL4kO5ZPYESgzvDoNxxEGcyLLEksFmZupzf+xJSTwMIPOOYVxAcSEtBthZYkjaKK72rPwySM7SRbYrwqBAXCyTRDFgaSbKF/4M7qOUEB97JBI/YS3v0vxMiYyXs3vgucgXZnNwj+QLeUl+Q+7H8oh7Qd7KycdvcVpek5fMZiAfMW4w84ZBcmY3YCAgT8gP0raBTrsn8hilBoWRdJHl2VnRQFlYufrnsfQh7w/lQD5hKEIZRfFkFQhp85zcG4oa4YqeJY0/R8sLyoL8IU3KjtUOvKt0Bvhl+yOUaJ6ZvPFhLS27d3tPyTuMBGyJxbNTf+Gfu98y+mexNC2PeK/4lvjOyB/ulfLnPcAQwjZgPBfPRxxlaft0OoutSKO7ke+HbwLyHclgIAiCIPRkVBkMpEd0PZp+ir5JHw793/pZbKNj546hT6NnM+CNHHo3MhgEGGjFjxXYDEBTzpQ/ROenX4FuTbz0B/hlgJY+J/0Lfunf0f/E3XRv9Ct7Z9DR0Z0ZOEdfpn9DPwd529qGuJj4hA6PLk2a6NO8g8RBXPas9OfQ89nKxvqArEJHnv4m/qzg5p7oA9JH5T4JW6aTl9HyhV/6W0zaoh9Kflifm/y2Z6MPRl4jbwP1uTjtfvjO6E8TJ31I+hf0ATGa0Pdmi1Wez/qAhLPw/BLe+vfkH89Nn5iysHJN74Frnon8R45JWbwLGEVIl2ex8QOel/fB0vTx+PjSNFISFlqfijSJH0ME/U7GLchL+qG8L7w39AH9ZDtLB/JsvCP0yXmfeHZ+edeJ38q86r5SWvwWlnh4t4iXNOjzUe7cM+80fUKeg/eOb8nSJX9z8UNLI+fXLLYiDdidnoXvh/fP+oAyGHRfyGDQYqQGAyo5Gu+cwYAPzeg/vjqsks/5V4VJWUc+J1MnXCNsZRqQ8qFBRTFhBgWzDiD/sa5bY+vLkgrTKk0fr10jQxjCEoe36ltc1iAiY+6E8/Ehhx9h03hM1tLxtLiMdk+5+Ezens/fl78nH5fR3Lx7Ts772f3aPZDPltd2P9yHD+Pp3cvi5Jfr9DlSpuG9vNHk7L8P6/PU0i9K2+Iw5uJNr3O0ciJ+e16fj6TNPeXy0f+HFg8kjL/f9P3O0eJJ//vrovv194o/cj58Go+/LvIrcqtDHy6NIxdfKlOHZWFyfmXyZSwKl3Mvkk3J++CVRRkMBEEQhJ6MIoMBg3W0m3Xb4TK/MvkyFoXLuRfJVrHRcFXyOf+qMCnryOOPzgkpN3RR9FLr/5mOanq8yRIOPYgBV3ShXDq4eZ2XuFNd19I14p7GZfEQlni8zkx4k+Ead6/H2736uCxNH1/6fPhD727hfXzeLeeXIzLE6e8h7Q9Yumm4ojRwS5/L4vPP58P6uCw8cj4PTcbkikhY4rfwli7/YVHaOTe7LqPJ+jQtD30+8gxWdj5cGhfxWHnzy7WFq+oDWpw+3vQ/zOVRrsyR8+F9HN49d23/20uLsyje1K8jLIo3l0bOrQ7TcOn/NM6cW0reB+o96
j/eFRkMui9kMGgxqgwGdT7A9pIPt6oy7yhbkQbsbs9SVK6teA5o6XR2Wq1Ip9VptCqdnH8z6NNoVTo5/2bQp9GqdHL+zWAr0jCOqnT4L4OBIAiCIPyM9hgMmsG0fe4stiKd7pZGWd++uz1LGUfHZ+nMtHwarUon598M+jRalU7OvxlsdRqtSifn3wz6NHw6/JfBYPSADAYtRpnBAKXRPjY+LlEURVHsrkRR9MqiDAaCIAhCT0aRwYB9zmkr1QcURVEUuzvp/7FlFP8Z75TBoPtCBoMWo8hgYPvU+49MFEVRFLs7adNkMBAEQRB6OooMBuzXbu1l2oaKoiiKYncixgIzGGiFQfeGDAYtRpXBIP3I/Ifn3T2L/MvCpCyKo45/WTjPsjiMRTJV4VKWyZtfzr8sXI5FshZPLr4yvxzL5MriKfNLWSbn40llyvxyLJMri6fML2UduZyMuVWFNVbJ5vzNLeeXYx25nIy5VYU1Vsnm/M0t55djHbkimTphjSZbJN9eP0+TK5Ivcq/yS2my7YmrzM/T5NoTT5mfpwwGgiAIglBuMKC9TNtUa2eL2tr2+nmaXHviKfPzNLky+fb6eZpcVTw5mZxbjj58kXyRf1mYHMvkzS8nk3MrY5m8+aX+ZWFyLJM3v5xMkXsRy2SL4vLuqV+OZXJl8ZT55VgmWxSXd0/9cqwjl/O3cFVhYR25nIy55fxyrJIr8jf3srDGOnI5GXOrCmuskm+vn6fJlcm31w++9957MhiMBpDBoMWoazDgA/PMfYTGVDaVz7mlfkX+Ri+Xk8+5pX5lMsZUNpXPuaV+Rf5GL5eTz7mlfmUyMJUz5mS8W+pXFcbLpX45+ZxfTsa7eRnvXiRf5Jf6p25eLvXLyef8cv7e3ct59yL5nHvOP3VL/1fJ5/xy/t4tlbHr1M3+55jK59zS/zl/f51jUbjUP702t9Q/xzSMd0v/F8l7v9TN3HPh0v92bf+9W04uZVGY1D+9TuVzbqlflYz/XySfc/PutGkyGAiCIAg9HXUNBtZ+5tpW759ep/I5t9Svjoz/791ycimLwqT0cjn5nFvqVyYDU7mcbJG79yvyN3q5nHzOLfUrk4GpXE62yN37FfkbvVxOPueW+uX8vZuXM3pZk0ndzL0oXOrm5VK/nHzOL+fv3bycMZVN3VK/nEzq5uVSv5x8zq8qjJdL/XLyOb+cjHdL/dPrIjfPOmH8tf3PuZl8jjkZ75b623XOza5T+jAm5//bdU4+dbPrHKvCFfl5t5xc6k6bpi2JujdkMGgxygwG7F/pP7CUHIoMc351WDd8R9JpRRrQhy2Lq6PpGFuRRhUtnc5My6dh6fj/zWKahnfzch1hK9KArUoHtiqNNB3v3wzm0ujsdFqRRu7as8yvij5sWTxlflWsG7YROX61wkAQBEEQ8gaD1157LfYFaS/r9gPL2uEyvyrWDduRNGAr0mlV2Fak08w0yuJqbxopq9JoVjpltHQ6M61cGs1O06fh421mGrAojWam49OweP3/ZjKNtxXptCKN3HWz6ONtRhqEl8Gg+0MGgxajyGDw8ssvx1/7OI3vvPPOSNp16pfKmVvq7t3sv3dL6d1zcnXc7Donl17n5MyvzM2Hzckay+Sq3Oy/dyui90/l07A5/9Qtx1y4Ir+cm12ncilzYdL/OX9/ncql9DI5+dTNX9t/75Zj6p+T926pv117txy9TE4+dfPX9t+75Zj65+S9m/3P0YdJ6WVy8mVuKb2MZ+qfky9z8/T+Kb1/Tr6Om12ncqm/v/b+5pa6p2E8vZyX8f9TuTpudp3Kpf7p/5x/6p7+51crDARBEASh2GDwxhtvjGwzPa1N9fTuObk6bnadyqX+/rrM37v7/0Vy3r/ouq6bXadyqX/6P+df5GbXqVzO3zMnU+Zm16lcyiK5Om52ncql9HKpfHqdutl/75Zj6u+vU7+cm12ncim9TCqfXqdu9t+75Zj6p//TsDn5Irn02tyK5L1b+j/1zzGVycmn/rlr75ZjLkzqn8qkft4tx1QmJ1/kljKVMab+OfkyN0/vn9LL5OTruOVkPOvI13GzaxkMuj9kMGgxqgwG/sPzRKGEOb8q1g3bijRgK9JpdhpFcbU3DVg3bO5+6jIXNhdXR9KAdcN2JJ26YTuSRiNsRTrd6VmqwjcrjValk3M3NiuNOunk3Osyl0ZH48wxl465m7Iog4EgCILQ01FmMChqR1O3Rkj4NI6OxpljLp0c68rlmAubi6sjacC6YTuSTi5sLq6OpAHrhu1IOnXDdiQNWDdsR9KpG7YVacBWpNORNHLMxdXsNIrYjHSq4mhFGrBZ6eTcjc1KoywOVs3JYNC9IYNBi1FmMLDZJXx0nGcA7SM0mnvq593Nz/8vYk7G3HJhc35Fssacf+rmr+1/zs3kc8zJeLfU365zbnadMuefuqX/c/7+OsecTFUcqVtOxjPnn7rlwpuMZypjLPL3bjkZc/P0/ilzMqlbLg6TSeXKZMvcymSq5Mr8vFvO37t7v5ycuad+ubBFMlVyZX7erczf+/n/KXN+qVuZTJVcGb18Lqy5eT//v4hl8rmwJpPK5WSNOfkiP+9WJedpfiiLMhgIgiAIPR1FBgPOMeB/rh+YtrX+v3fLyeVkjTn5Ij/vViWXskimyt37Fckac/7mVuWXunm5lDn/snDm5/2LZI2pfM4t/Z/z99cpi/zL4rDr1M3+55jKp26pv13n3Ow6Zc4/dauSyfmnzMl4tzL/lF7GM+efupXJeHr/lDkZ71bm7/38/5Q5v9StTKZKrsyvKqy5eT//P2XOL3XLhTWZVK5MNufu/6cy5lYlV+aXuuXCmkwql5P1lMGg+0MGgxajymDAh+U/yPbSPtKcH2cl5Fgml/p5ehkfpoi5sDk/z9Tfh8nRh03lUz9P7+/DGL1sM5lLK2VO1seRMifjw+aYk/PhU+ZkfNgiFsn6eFKZIveicM2gTyPHMjkfTypX5F7EMjkfTyqXcytimZyPJ5XLuaX0Mo0yV4/l0vAsk0/dvJynD1PEIvnUzcul9OFyLJP3bl4mdfNhjF62UZa1LymRk8FAEARBEPIGg1dffTUaDazNTNvRRlnWRuf0AWOZnPfzTP19mBx92FQ+9fP0/j5MEdOwOfeUqYwPl/q1h80uF++eMpXx4XLMhc35eRb5+7CeZXKpn2dOpm7YjtKnk2NOzodPmZPxYXMsk/PxpHI5tzKWyfq4UrmcW1GYZtCnUcQyWR+Xl8u5lbFMzseTyuXcylgkn7p5uZQ+XB35KpbVZZ7IyGDQ/SGDQYtRZDB46aWX4uwSPq7cR13GXJh//OMfI0n89gsxTORo/jm51M/Ty/gwRcyFzfl5pv4+TI4+bCqf+nl6fx/G6PPVmOZ9GZEnbgtbllbKnKy55ZiT8WFzzMn58ClzMj5sEYtkfTypTJE79Plp/3P5b7SGjP8m72nx+DRyzN1L6udZ5OfD5Vgm5+NJ5XJuRSyT8/Gkcjm3lPileZyWCfTlYvRhLJ5cGp7+nmDOz1+nft6tjEXyqVvRtXcropdN5b1blX9Kn59Gn++e
vkxyZZRzM5qyKIOBIAiC0NNRZjCgzSxqS4uYa399uw5NFyjSB4xlct4vJ5Nzy9GHT+VTvyJ/71bEqjhyTGV8OGOat0af/0XMhStLy2gyqZx3T5nK+HA55sLm/DyL/H1YzzK51M8zJ+PDpvnpr3PlkNLLw6J0cszJ+fApczI+bI5lcj6eVC7nVsYyWR9XKpdz8+5p/ubKAJbVZT4un0YRTb7MzzPnnoZLWSbn40nlcm5lLJJP3fy1/fduKS1vPX3eG3255Moo5+b9ZDDo/pDBoMXoiMEg/Zhz18RFPJC9o4l/+PDhUSnlP7/2vzPZinRGxbPYf/IUfvLJJ3H5cFnlC325IP/xxx9n4/XpdgZblU6raM/j+cUXX4RPP/00dsLIbysX/82k5YIM396HH34Y4/DfjE/Hp91stiINOCqehf+UC98L9ZLlfVoevh6zb4r/tge+6rKO0adj/8lTvhcUOr6BonJpL2UwEARBEISfUWUwyDHXFqdu/Ded1lYroHOhe1m7n+oBncVWpAFHxbPw33RR02lNZ7Uy8WVj/61cKG/Cja79jFakAX069t+7oXdaufDrv5P0e/F9wKL4Rhe24pnSNPjP+87/9957L+Y3+Z5+I0YrN/umCEM/ZXT9ZuCoeBb+0z7wn++FsiDvoS8P+3baQ74ryo80ZDDovpDBoMUoMxigMFpF2R5SsRLH888/H8kH+sMPP0TFlA8Uki60685iK9LpCs/y9ddfR2WRLaVeeeWVWAZF5fLCCy+EF198McoT7p///OfIeLvCs4wO5Nn+85//xEaJRsq+q7RcuLZyoUyQo2Gj8fzxxx9/8c1YvP66M9iqshkVz8J/8vSbb76J9RKdJvLeFPW0bHDne6JskKNzxfeiuqxjTNOxa74XjJjkP/lO+aTl0l7yXVHmfFsyGAiCIAg9GTmDAe0uzLWhdUmbTduN3vS3v/0tutG2oxPn2n277gy2Ig04qp6Fa/oK9OUYaCPfy/qA6LSUC31zdCLC+fgszlHxLM1mK9KARengxrfF4D9lQv+c/M+Vi30vkPEXvkfGTSzuojTEcubyjTqPvGUCWFG5UCb2vSBDufD/o48+imXq+4CwFeXTijTgqHoWa4vIY8qAPndn9AFlMOjekMGgxagyGFhl2R7ygRPPE088ER577LHohlJCBU1angzctIc+fC4uc8v51WWdsB1NA1aF92kUyVn5Pffcc9EgYJVsrlyeeuqpSK6xlFNxpmWTS6OZbEY6VXG0Ig2Y88cNJR6LOfn8zDPPRIWD/ygdvly4plyeffbZ8PTTT8f/KDIMalMuFl+deyljnfAdTacVacCq8D4NL8d/8hRlnO+FMvF1nikWVi4oinSs+F4oH1bl8L1Qtmk6lkaj9OFzcZlbzq8u07C5uMwtdW+EPnwuLnNL/eyaWTsYMk1BN0Xefy85+nLLXUMZDARBEAThZ5QZDHzb2SjRc2m70XsfeeSR2CehbWeyhbX1nqYHNMI0bC4ek8n51WWaRhqXueX86tKHzcVjbkV+6KSUHzoOehPM9eNNp6Vc6Juj+9J3zKUBfTp1WRXWx18mV8Y6YTuaRh36NHLp4PbVV1/F78omUFIuuT4g5UIfg76G9QEZN6lKoxHWiaMV6bQiDWgyqZwZ2JjMZ2MmVi7Wd4BWLsjQN8f4SRjKlDh8/Ll06rJO2I6mAavC+zTK5MqYhs3FYzKpHH1z8pZ+Nu0H3wG/tCf+e8n171LmZDDEyWDQ/SGDQYvRWQYDq2BpGB9++OFw//33xw+egVMqBT5SIwOixiL31K9Ixsv5/94tdTe3nHzqn8ql11VuOb8iGZPz/1NZ/ltFyzUzn8l3GrWcUsJ/3PB79NFHY9mgVGLJpYL2aaTppG6pu6eXSWU74ubp/VO5um6pX5G7Z5Ffzo1ysYFpvqknn3yytFxQWJBBiUdpRMGkXBr9Zrxb6ueZ8yuSN3fv591SP8+cX5lb6l7l5/2r3CD5afUdSggKIHnPt5Mac7jmG7FOL2VDHWlKfFE65pZL3/vlZBp1q+OX+nuZum6pe+qXk2nEzeoy6/RSJijpfDtpubSXpizKYCAIgiD0dHSGwYC2mvC04Y8//ni455574iAok5PQiU0HyOkBdf1S/1QmdauSy/nlZHLuVXJVflX+3t3LmRu6DDopfUDyH50W3YkyyPU10GkpF/rmyLKi06dn9Gnk3HN+qX8ql15XuaXuOf9ULr2ucsv5Fcl4uTI3u6ZsGPtA96QfQX4XlYv1E+lrWB+QsBaX0afr0/J+3i3188z5lbl5d++W+nnm/MrcUr/U3ft55vyK3Gxgmvw3Q0BaLpC+B+WCDP0/JsEig/GTOtPiz6WR80vdvZ9nzj0nb25l8jk/71/XrY5fkX+VG+RbIU955+kDMl4CfR8wLZ9GKYPB6AEZDFqMMoOBVZztIR82BgIUlgcffDDcfffdUXFEGaVSoJI2ouT46yI3T/yrZDxz8nXiaCQNmJOvk0Yj6aTy/pqBf/IdxZy8RwHxlSz/KVcaRowFlA2VMbMYqKAtLh9nLp0i1pHxNHkfpioOH6ZMzpiTrRM2F66MOXlrAFHireNkSnxaLpQVZYFCQtmgNNJgUi7E4+P3aaTXRawrB03WyzcaNueXY5pGnbDtScPLk58oCnSSKAuUePLeZjFQJigW/Johh2+K74Wywf3LL7+McRSlYdfeLcc6Mp6pvF2XxVHlnzKVt+uqOOrIeHp5fikXyPdCO0SZUFfZTDn7XjpCGQwEQRAE4WcUGQzQh3JtaB2iRxGeNpzJSXfeeWfUbW0rD68npHqAp8mkrPLP0ctb+Ko46sh4enn7791yrPJPmcrzH70JnZT8Je8ZbEZ38jqtlQtli05LuWDIYSCUPqDFVZROkZtnlX/KnHydOOrIeOZkOyONNAz/KRu+K/oU9CPob5SVC/1E+hrI0gckrI/X/tu1udn/HHNhypiTrxM+F66IXtbk/f8qNiKXxst/9H8G/elb0N/mm0n75tD6gGbI4btBznZm8PEbfTr2v4hpmCrm0qgTRx0ZTy9v/71bjlX+KVN5/pOnvPP0ARkvaXYfkHhZwS6DQfeGDAYtRpHBgI/TGrT2kA8bowMf+gMPPBDuuuuu+OEzEESlwCCo2FzS8LF3JQ0bFnAaP8qQsvDlght+Dz300MjZJVSeVNAonLm4xfaR/KQBRInn/UfRMCU+Vy7I0KlCWaQM+RYpFxTOXPxi+0i5UA+xp6jNHCHv+W/lgmLBL4ohnV6UReoyygYZvjfKNhe/2D5SLtbpJf/5VqirqNNMke8oTVnku5LBQBAEQejJKDIY2KzO9hAdifDoVQyy3XHHHVH/tW021ddoLslPyo4+IGWHTsvAtNdprVzoa6DTUi5M5kO3JZzFk8YtdozkKWMf5D39OsqmrFzoA9LXoFzQVxmYJg6VTXNJ/40xL/IdI40Zcny5QOsDUnaMm/DdUH6UKXGoXJpH8pK+OXlLH5D+X2f1AWUw6N6QwaDFyBkMmIHJx0mFyAfqSUXqmfp72jYe9913XzQYMPjDQBCVAoqN2FxSdlS
CNuhsymJaLrh5gwH/6SD4E//F5pD8pAGkk0Q+o2hYuaCYWJnwHze+ETpVZjCgwaRc9M00l5QLeWoGNptdwn8rF6vfuOabQqGkLrPt1fjeKNtc/GL7SLlAvhfyn2+FzhXKuv9erHzSNijnlpJZXlphIAiCIAj/azBgf27aW/pwvu209tXT+6ckPDoteu/tt98eV2eiczEgpL5Gc0l+Wh+QvgSz09Gd0jKE1tegD8jKD3Rbm0Shcmk+yVPGPsh78pr+RlG5mIGNvgay9AFZCU0cKpvmkv4bA9OUBWMmfDP8T/sa0PqAfDOQviITzohD5dI8kpf0zW2lFHUYJP/5PnyZFLVBVW0T3xTnI8hg0L0hg0GLUbTCgMow16DVIZUtJA4zGDC7hP8MBFEpMAjaHlK5wyq3lHVkPHPyVXFU+afMydeJo0gGNypBa9iKBqYpV/yYwXDvvffGypgOAkoJ5V+URlG6RX5l8sZGw+Xcy+Rhzj91S/9Xydcl+UkDyAAleU6Hid9UKbFyMaMCg9IoMDR6lEvdbya9z9x959w6QosvjbeZacCiNNqTDuVCntJJstUDqcHAysU6V3xT1GWQsiJdyjYXfx0261mq6OO1/96tGczF1540KBfI90L+862YspiWi/2369TNu3t/DAbWOZbBQBAEQejJSA0GTCCivUz1obq09hY9yQamb7vttvhL28uAkPU12sMqfSPn3yhzcaRuOZlGmYujPfGSn/QV6AOS7+i0DIDmdFpfLrZVlE0a8+XSyH0UyVbFUeWfMidfFUeVf8qcfJ04ymQYXCbvyWvKpqxc6CfSN2fyGH1AwppunEujLN1G5Y1V4VL/9LrIrVH6OHLx5dzqkv4bA9OUBXldZDAwQw59QFtlzjXGT+JoT11m9+3vvSPPUsRWpQObkQZ5SRvEO0/eFxkM0jIyt9Q95yaDwegBGQxajCqDgX1sjZIPm4E4PnQaPgwGVMYoiyg1VNKN0ocri4OKBub8msXOSINnSol7Lp1c+lyz3yEDnygl5H1ahpQLZYufGQxQXihzjDmWZntZFj53z0VsRNaz0XA5ee/my8IaQC9bRQuHMs777w0GlIUvF9zMqMCgNIYD/CiXRtOty9zzN8pcHKlbTqZRpnHYu2Z5bO51aGEwlJkxgLynzvLlAikX+6b4XiBy3AsKTi7+9tA/X/qszWIar103Oy3iI4+N3t3LpTR5vhfKgXqK74Yy6kh7ZERZlMFAEARBEH5GlcGgvSQ8uhMD07feemvsc9BHoZ/p9YK6TPUJo9crTJ+p0jWawWanY8+HbuqftU4ayNJXsINC6dehP6VlaH1AyoWBT/rm9DXYWcDiSeNuhGXhy/KrWXnZ7DSsHCDlYmWDXyNxoXOS7wxMs3o87Wv4cjGDgfUBmUBj95CL21j2fGVhG82XnHxVHFX+KXPyqZvlSVoudUn/jUF/yoK8NkNOrg9IH4Q+IHUY5JoyJY5G0y1i+nzp8zfKND7v7v93Rjp2beVj/71MjibPO0850P9rVh/QylUGg9EDMhi0GGUGA6s4i+g/whyphPnQOVSJ5ahUxnykVAooNpDKGtp1SmSpPKiUrVHw11QuVEy5sJ5V6TSD7U3DKlvCMoCJ4obSR17l4itLxwwGKCWmLFLJ+nLBDT9msVM2tvUNFbTlZVka0MrFk7Lw5eLLuYhV6RjryuXY3jSsXFAK+CbII365xp3ns/wqIv7kBeXK+0/HKVcu/MeNQWtkUBZRGnGnXOrkJSQ98p9yMEWmPeVSl7m8zbl1hEXxUQ58KyxbpGzo5OJOHtQpF/LBvhdvMEi/F67tm+J7gTa7hPzMxZ/S3gP/rXRmuaRsdpkU0dLgea18IGWTSz+9Lys7yoXvgW8GmrLoy6Wq/cnRlEXujYER2jwZDARBEISeijKDQa4d9Sxrh9Gn0KsYYLvllltinwN9AJ2Hdr5ID/BEL0LedCVP06N8XEUsS6OZbE86pvcQjrynn4EOhJ6SiyuXhoUnf9FP6dfR/7aBaU/ra1Au9M0ZLLVJY3XLBTnTYdOy4dp0Wh9fjmVpNIvtTcPu3XRZ6wPSN8c9fb6idJAhDGVBPwKdNtfXoFzoa9AHpJ9BHxB3whIHrHoWZHy52Ddi5cJvM8ulrlxHmEuD+6dvzHdCuUD62chZXnn5HMkXypb+BXldVS6UHXUY5JpvlTjqpIVMUblA/tcpl7rM5VnOrSMsSoNfnoPysTEt8oprL5sj4cgH3nny3eoxyihtj8ranhytXBkzIH7GO5k0JoNB94QMBi1GmcEgrTj5bzS3nDu/hOcDpwI2gwEfPpU7FQqVNJWHkWujVUK482GjzAwbNiwMHTo0nHjiieHkk08Op556arjkkkvi7AiUI4vXKisqHKuQrQL26SFnFbjJWEVlbvz34SzuNJyX8c9R5gaJzypFlOmTTjop9OrVK2y++ebhsMMOC9dee20sAx/G6OPz8TN7lgFps5ZbGRqtXCgLBqU58IrBUqtAuadcOpBrywPymziuv/76cMYZZ4Tjjz8+nHDCCbGMLrroovDXv/41pk9D4cPlysXStDRy+evdysoFP5jGbfFbGvbf08dnA/w8yz777BM22GCDsPrqq4fNNtssHH744eHKK6+M/shZOB+vd+M+yAfyixk9psRTFr5seI/xQ4ayQWlEhnLxz+tp6dnz0yjTEbjppptiuQwYMCAcd9xx8Xs599xz4zvFLAm+ccITDvpysfzNlUuav/wSxhQe3HwY4rBw+Fs4/Mpoz2XxGEmHd5z37ogjjgjbbrttWG+99cKmm24a9t1333D++efHfCacL/uU+HEf9r3YihzKwMrF6jLKyr4pMxjwDZEGz5OL3+6bNCDlQhiW5Z9zzjnxW4FDhgwJp59+enyfcuVi+ZsrF2jPYflrZWN5buWS3mdaLrm4c7Qysefz7vwSHmWYmfw82/rrrx/WWWed+B6StyzfJR0f1pPwkHwg3ylLiHKefi+eaVuUXnt3q+tkMBAEQRB6OlKDAXotbaX1H9L2NP3vr83N+hrotAywmcGAgT30jVz7b7qF6QEQeXQjdNfTTjttZB8Q/eK8886LOhXp2GAh9DqR6T+p3oGc6UCQ/14nQnfifxqOa+TK4k51JLv2bpB74Bd9ib4sevqOO+4Yddp9RvQ7rrnmmpiPJpuG93Hij15DfpEf9Ovoo6B3+bIhPtzQac1gwGAp4Sw+n07OjbTo0xAHfY2zzz47lgs85ZRToh5+4403Rt0NHdvCFJWL5YOl4eVyZVNVLhbGxw3T57Dr1J1w9HF5dy+99NJw8MEHh4033jisscYaYcMNNwwHHXRQdLc+rk/D4k3/Ex/lQl5TNmm5QCsX+oD0zekD4u7fbR+vJ348M+8A3y3fBd8HZUIfkHKhD3j11VfHeNGDLZzPO5+/lh5y/OKWyvhwkGu7R2j3lZZd7lnS69TdyL1Tnxx77LFhhx12iGVC+ey9997hrLPOis+XKxdP4uU+yFsz0vC+UgZpX4P8pF9NH9C2paWc/LhTLn5oz89/4uI8zwsuuCCWSf/+/cPgwYPjPV
922WUxXragsnCWvz7vuLb0kOMXOfyMyEDCpH1AC2P3VRR3EQlvTN34b/dN/l9++eWxfNZdd93Qp0+fOK5BWtDC5kh48paysHqMMrL2qIhpW5S2S+ZmfUAZDLo3ZDBoMeoYDDxxTytT7+f/0zjyoTPAxnJUlD4qESoWPlYG0YxcG/HHakgYBmeXW265MOOMM4bZZpstLLTQQuGPf/xjmHDCCcM000wTGwoaQSoWKjwqG9KnsWVQkTio2GkU8KcyIz1kaCBQlqjAbYYxDQJhUHaIg3gJSwVG3FQ2KFkMiNuqCSo2q+j8c/jnMdo1stwLyhT3YQPTSyyxRHzO1VZbLSrFKBfcr4VP8yv9j9KHkmPWcu6fcrRyg7jhR+PEs2M1J795DosnTQdyz5Qf8gx2/ulPf4plMMMMM4QFFlggzDfffGHaaacNU045ZVhxxRWjDOkQjvxDKSbvbr755pEGJJ6ffLBGChnyAxkUHt4Z7p94UKgpU8JaY00DR77QyHJfhIH8t4Y3Vy6WnymR5Zd8uuKKK6IBZ6WVVgpbbbVV2G233cLaa68dn3OZZZYJhxxySHxH+F64f8JZ/JZ/uOGHDM+LIsiz8G34MoE8pxkMiJe8oqx4TuLx5WK0Z+G+eX9RCimXySefPEw11VTxe1lyySXDzDPPHH7/+9+HueeeOxx55JHxfScM34TlHflK3vENUA7Ea+8915a/lAPlwj1zv3wvGPT4JnhG3hFTTnhunpd9UpHjmfj2uG/La8svyzt7phxRfsgnOla77757OPDAA6MiglIy//zzR6LM0zG198Piz+WbfS88G7+Ui30vVpdxzftGvlAufDNWp/AMadwWv6VBvBhvMDrxbVOXWbnw7fzf//1f/G6o6ygX3hfqYfKRvKNcyFu+V8s7e++5f2TsvrhHFCv7Xq677rqoWBOvvaeUKfeIHDJW1/GcxG3vcvos9jwpTc4UT/Ieg+cKK6wQJplkkjDGGGOElVdeOd4/yz8t/RwtHeoF3n3eJ0h+p+VSh/ZteTcZDARBEAThZ1QZDOrSt7f8ok+h/6AToMMwYET7m+oYqQ6AHzoAeuNRRx0VdVp0WPSnRRZZJOrgE0wwQdRp//KXv8QBa/QFJiugG5EGejS6DemirxIfuofpH/R50JdIA/0HmbQPiE7FM5AfpjsRN30UBsohfS3TrSD3b89h9PqSd0eeuEmbCXDogOhN6O5LLbVUHOzlfkyW3zSvzM1+uT90O+JEd0oH2igX8gqdCn3c9HJ0fIvP4vJpQNPpeWYGOukPzTLLLFGnpUzQv7n3ySabLPbbGRClHCzvKAPCUiaQZ/PlQv6g03LvlAv6KfKm0+JGudAP4Zqwlu8WDj90XsLRt7H7tmfyz+PLxYgs90KeYRSg37fWWmvFPuAee+wRjQaMQSy22GJh//33j/o5eWLhfVnYf9zR53kO7gt93MrFvhn+W7lYH5Bf3Ijf7tnHa9fWl+KbvfDCC+O4CH2LqaeeOvb5+GZmnXXWWC7zzDNPnFxF/ITleyF+8o5nIe9Il++DOC1/yWveJ/KX/jky3DPPwRgPfXPKlL4U7yB5aOF4Jr4lwhKO5/X3bc9ixC1H5AlHnjBmQv8PHnrooXHy2NJLLx3LhXLiveadsDQI50k6+NFP5LvnmyYPiNvKxJcN9RjvK3UYtPePuIkrjR/afZOXF198cZwMOtNMM8UxEr6VZZddNpbP2GOPHfvs9GN5x6kT6Etxb5Z3lAt9KO6FuH3+cm/IUH48B3UWz0QZY3SkbHC3coHEQVyUHXEjSxjSJF7/XFYu0J7J0/zIT9oP8pAJsNTb008/fRhvvPHimAnvJn146mnLo5SWhn0vVo/5PiC0+qyKXt7+W10ng0H3hgwGLUaRwYAPk4+e3/aSj52GkYqMypvKlgqcypBKyRM3PmAqDBp5Kridd945DkQvvPDCYc8994yNFOFpcBgwZMCWSgl3FFKUxp122ikOtjNgTWW1/PLLR8UFCyfWaBQjKk3uh1kDyNHAMKDFbFjCLrroopFbb711nP2LLJZS0ttiiy3CJptsEq3ZKEsoq1tuuWW8DypznsMaISo88pLfIpo8FSgVGTP1iZf4URZ5Vssfn2/pNSQ+KkIqVxoHfm0A1EiZWrnQUJhCRmXKvaZxWjqUC40M+UcDPddcc8WGACUKJYEyIz0ULAZw+/btG/OMNHA74IADYhkw+A5RiikXZtIcc8wxMe9Ij3JEaaZxmXfeecMqq6wSZ/WjsC2++OKxrMiffv36ReWDtAm/zTbbRCVpo402iumsueaa8d5QWHk/UKTsedIySIkcjTBhaIjJN56fPKN8uCcUEww8vBtmmEjzztLjveado+Hj+YjHD0xbueBGHiJD401eU1aUC+mncUPSpVz4tphx/4c//CF2pJhxb/cGySeUqr322ivOOuG9uuGGG2I5MZBtZWKkXBj0RTnn3igf8pz3nW+DX74DyoWODWWFsYs0uG/i5/21FQCUC+kw25zyJG6MDDwfec7z8Vv2vfC8/CLL4DTPTXiep1evXnEAnnIfNGhQzEdkyfs0z6C90+QNih/fAGFQmKze82XDe8A3hTIHCWP3koufb5q0iZd6h7yhY8V3jdJEOtwDihp1G4ois054HsqfGfnUR9RP1GFWPrz7zKLhm+J+iZ+OG+XNN0OdRx6TD4Sjc8A3g5GI+4cYIqnbiMvXZZQTcaOI87w8B8/Ac5aVC8/Be0heorBSP9Cpov7GqEPnim+Gd/qrr74aqQymeQaJCxIXecE3A1EWrVyMpvx5tyo/3Iib55LBQBAEQejpKDIYpG1uGXPtLW04egr6Izoo/QGbjEIaufYf3Yk2Gr2dwVomV9AHRJdglYHpBqw2QHdC/0ePQmdhpeYuu+wS/vznP4/Um9Ch+EWfom+CLM+FroJexMphdFj6gOhOXKOvMQBOn+LMM8+MA4Toq+it6FroTOix6LSkhRy6tOncPAPPkupKOfLc1gfkl8kl3DP3Q78SnY24cros7paPlh6Dk+hL9AnQncirtFzoa6DDUi7cMzq79ZFSEi96runo6PbkI/rmHHPMEfOCwVBLF8MH+Xr00UfHPiBpXHXVVVEvpAzQUXk29FkGTNENGXinvNEzuWf6DuiwlDv9RMKho9Lvwo3/lDv9E0i/bPvtt4/6LH0NdFn68vynP0lfh+ez947nqOpr0AfknUZX55eyId9Ii2egP8r7yXtN/87yyfIszUPGLsgf8o9B9bJyoQ9AP4Nf5GzcxMdr15BBWPrwTBijHzTxxBPHd54+Bc9N3OQv/QrKgXeMcRnyhT4yA9mUifU3yHPyb7/99hv5zXFfxI87/X/Kjm+BsuFdoFwISx+T8Ru+GcZpGJOhHPhG+F4oO94Zxl+YVU+ekOc8R1oORisre3bymf4fZcK3TJ+F/ob1f3gm8o4yRNbnmyffFHFTd/AN+O/F12eUi/UBqcMg/XjKvSh+3DHGEJadF8gvXy7ETzi+U/qIrCyib8Z9Ez95zdiSlQvkm6F+oi6kH0ndyn1QR1EfUfbET16TD5QHhgnKa
7vttovjPdwPq9zJL74TP25C+VCn4s99kX9W76Rl4ok/ZcF/3m8bx2J8wCZarrrqqnFyL313aOFyJO9Im2/P6rFm9AEJjxttEGnIYNC9IYNBi1FmMKBi8R8o/405N+9OWD5wPnQaPpQ0BrdoeKgoSIMKAfIf8gFTWRAPg4BUNjbbngYOhYT7s8qERpQPnwqKhpDKEis6lR/KJAPKKIVY06ebbrqRFlwUJBpXZocwcMbMlTnnnDMOOmIgoBGlcaUBYZALpY0BsNlnnz1a6hkEZWCUipDKmbgxbCBH48wzU+EyM4ZGESUTpYUwKFI0CihQNMA0GFTIkIoRRYwGgWcgPu7BKuVcnnniTvo0IOQJvzR+vnx8ufBsNCA0glSihCcdS8PSwY37xOqMckQDQ16iULOKADk6GZSLhSd/iZfBWxoL8gcFkDxBmSCPUSzId2ankKe4odBx3zSek046abTEo5RQnpQJZcf7RNnS6BKOOGgQUUJQcGhMyUOUWRR/8hTliEYYxQJlgrKgXCD/KR8GxGmEKRcaP2souSYcig+NNu8CnQPead4/ZNKyMJIf+COHkojybEqJLxeIG8+ODPeL4kCDyXueKxdIfqGUUC7k0/jjjx/fZ94d3OmEmJKMckM5o2RY3i244IJRCUdhZ6kqsxHIExRG8pSy5lskHPnHO0958c6jXPC9kA+kj+KOksg7RRlwH8ym4DumrCkb3mveBZRtZOjgES9p00nA6JGWDd8M94vCRpnwnvE81DO8SyjAfIcMTFM2fF+m7CDny8No5UKdQp7zrvLLt5HWe1yTZ3xT3mBA/MRRVC7cJ+/srrvuGusxvhfeRdLhvlCakOEeeD/4JR+ZHYM89Q35TV3Bcw4cODAqgbzvPC9lzLtE/pFvGHIwGPHuo4ASjm+KcuG5eEaURL4pZoHR0aVMyF+Ma4RntgtlRJnzvnL/fMPIpOXCt0y5UK/zbtGhpsOG0k68KLzcP8+C0kq58D5bvvnyMFIu0Dqf3AOk7NvTHqVExpRFGQwEQRCEno6cwQAd3trcHNM2N/ef9h59B70Q3QZdhT6K6WWp3oRegO5EWHRuBiPR4xlsog+AToBeS1/E+n+ERz9g0IwBMgbl0F/RP9Bf0H/Q1+m7McmJfgK6EXGhH6HXM7ECfRc9iK2OGCQ1fRbdjy1D6Icyaxs9jolj6HLonuhbTAZBX0PPRX8nbgbR0cPQlbzuBOk3cQ/kDffPM6Pboveg1zG4aAYD9E7yxeeZ5Zen9REoN/RT9DL0Q8rAysVIX4Pn4vnoU9HXQMf38aXlQvroeeQp/WjKBWMOuidx0Ten34Q8ZUNfn+ejL8xgJPI8E3oj+cu7QF8M93HGGSfq7gwgkwfcNxP00OlZxU54+gesmCV9+q7cO1sfMRBKmaC7MgBKX4bJXOjK6J6QiTDkI+VCGpQLfXPKhb4fZcJz4E6c1u/meXhuvgPC0WfhHeN9JAz3Qr55fbaofMgP3lPug/5JWbnwDtHP4Bc58p37IR7iT8uGvOfZGOxlVQE6PvdKfF988cXIPiDh+P64Dww7GFnQ+/leCIsb/TnKBSMA7zvlwrfFt0ZYnpv+H2VGebK1DmMpvEc25kJ/hfylf0k/kPyiX8NYDH0E25GAb49+A2XJL+875WDfiRE3Bt0pF3R3nofnpp4hbcqQNIiX++X+iZN8Q86+Dcszn3e8q3wvxE1fI1culD8ylB3vHuTdpkyt7NNyIb/p4/Ed8k7aygLymfhsWx6+e+6B75a0eB4mWDKeRR4xRkU/C3f6gNSJvOv0AXlOvhXuhf+UJavV8cNgx3dGvct3wz3zDAzmE54xHN5l+oSUFd8RadJXo46kb08Y3gfynjqUsrC6zL4d5KxcyJ9evXrFCYTUkXyzrPhn3IT6l//I8LyWbzniR9nxTpN/aR8Qpu1PkZunhaeOoXwYH5HBoPtCBoMWo47BoD0kLB87lRkNHw0rFQUfKpUqFQYVgqdVFlScDCBSkWG1phJk1ggNNnHQiNIo0TgxOIksFR7KC0ofA3S4UTlyDzRouDMAjRWWSpT7IA4aLBoZKliUBeLDjwYGCzaNB5Ue/iiFyBIHCghKSe/eveOgKpUts355dipoKlLkGCRH2UGR4ZfBOJ4FpdSUEvKC/CYccZoVmThQaMlLGqZcnuFm5Jq8I8+J25RFX478xw0/GgPygDzFjzQom1walAsNAo0wSjKNCvlBo2DlAilrFAcaGcqARoaBScoQZZzGhYFLFCYG6Jn9jBJPvjB4TEeAe0TZQVlB4UBZ4D5RWLgPGljyiw4FygcKCco6+UvZo5Tw7jCAzi8KK8op5c6sFO4DI4YvFxphW5GA8ky50EixagWlivukwSNO7ptnZlCfRsw6Lb58fLlwzyhaNHy8u0Xl4g0G5CN5TfyUC3H4+C0NGlbyg3umEUfBnmKKKeJ9kp+W1zw/3yDvGB0hFA9k+L7oCKFE8M5A/MgfOlEM9tNRI7/xo0OEssGAP+8BZc2zcS80wNwP+cL3gOKKYkn8vPMoTQwgk8+UC8qhGY9IEyORrUKgXCD/eTcoN94v0iC/yBveCZ6LvEKh4Z55HlabUD4o8qaMpeViip19L0UGA35NiSd96gfSJQzxFZUL98k7RJ6Z8k3HiGchP4mHcuEb5Pn5XviPwYq8o5PDs6D0UbakzfuOkQclkg4l7yH3jT/5Qx1FOnwDbKfFPfNummJEeuQ927nxjqDsUx7UnSjzxE3avEd0ZslXOsrIcN+8C/bNUC68P6TLe0V+0EGmDFEO6YDwjqAks9yWVS+4U++Rr2XlAq1Twz3zHVKX+++lLgnjw5E2eSKDgSAIgiC0z2BQRcKiT6FXoa8x4IVuSBtsA3heb4LoU5Y2ehH9AHRviM7IYC76MXoV/T90D+LkFx0WnYOBMPR5BiBJGx0J/ZCJDOgj6Ik2MYhw9OWYNIP+xMQI3Lh36wNyTwyY0i9jogV6MfoQkyIY2OO+mKGNboVOiz6G7kR/El0UXYk0ra/BL/1GVoaiv1q/y3Rs+jv0l1j1QB8QHZn7MR0pzTNouhP/bVIQz43u5HVaKxf6H+QN6aN7Uj74WRppOlyTD/xHlnxCx7MJJkzos3KxPiB6MmWIcYRVBPTBTZY8Ij36ZgxSsmUJg5fkM2XP/ZmRhz43/WL6b4ThORloJd8Y4KafjW4N6W+j/yKP3oofxHhEXvC8vCeUC7LW17A+IDowOjd5wjgAecc7xzgB98n98AzI4M47aH3ANN/s2ogM7xx5hF5bVC5mMKCPYH3AsnET3K3fzKQc3mUGpnnf6WuQ17wLxGXfDH1A3j/6z3wzDLKjy+PPPaL3o+/Tx+e7oH+GTk56GHyYmMnEMd59+vm8c9wbej0yxMN3RZnTByQO+gb0zfmmGTegr8HYCvdDflNG5LP/TuyXMRb6j/RfeQ/JD/KRPCKvSI/VDJQ13w3xUz7knQ1MW35ZeVje8R6R5+QT70jaN4e+D0g/jToHWe6DuHPlgjv5wT1wb1Z/0O+iPqI86DeRrpUJ+cB901+mXJjwSD3D90R61Fu9evWK
4yL0AakrKFvS4L3GDT9kGI/iHbM+IO8I5cS3y5gJhlX6i4xvUI/Rt+f955p4Cc99sSqA74l3hfJIy4ZxAMqFuHkG+pKUAffON8N3SZ+T7bC4P/rD3DN9sFy5QNztezGDAf8pB18udUhZWnnySz1LuVm/WAaD7gsZDFqMnMEARYmPio/TPrZGSVgqMT50KnUqFCp4lAEqByppIxUDtGv8CctsZ2YZsP0JDTUDTlRSDCraLHysnAwOMsBFRYg/FTKKAI0GllKuqaxp3KgEuSeUMBovFDMGphmgo4GlokXBoeIjD2gkuX9WMBAfjSgDZVSqEDfuAyWFuHkO8i99Pn8NeUYaKyoxGgkaZGZt2MAiDS33jBuD1tyzheM3zTNz456RpWGzgWlfjvynEUGGRo+GlgYLdyrRNH6fBhU5jQIKIYOVDARjOEBBQSEmH8h3ygdlwQ7yMQWNckQxoUxQAFAiyXOUBAYbuS/iZgCTwVIaVxo2FFOW5qGkoZzwTNwX7xMDkOQR5UK6WOMpQ5R3GjiUE2ZLkxc8H8/gn8+eDfLs+COHEoLyxT0z8M0sF94VjB/cM+40sAzMcj+UOeFz8RIn770ZDMh7njVXLtwnMuQDyi1KSu6+fRqWLvmBcYx74/2hk8X3wDuKMsw3wzuLYkY+orjb8m3CkH+UC+XKbA1kUThtlgHfF98giiKKJAoBZc4AM0uPTTnjOchz4iZNyobyQGGkfOhc8b2gsPKc5A0Kgn8+ezb/jPac3AsKDvHyXDwf9853TpnzfLz/5JsphGm8Fh/fC3mOAsMvz2DlYgoG5UvnivylE8Q3QxkSn5VLURr88n0xS5/3njJBieJ7QbnllzwlT8gP4iYvUbpw513jGXkuZCknZCln6lTuB6UbxY16gxUGfDuEQZFDwaRzgnLEt0sdRT3It0dcfJd8f/wSL98OHQfi5lnSuixHe26el7KkTPl2+C6pG/h+MGRR5tS7yPKe+LAWl5UL92rKIvfPd+C/l/aSMiU9GQwEQRAEodhg0JE2l7DoU+hV6LL0qehL0Qajl5l+lOoBpgMQB/oCg1f0GxhAo09kA4joL+iADKjRz0AvRBeiD4LegR6LjkV/DT0UPRcdlH4J90Pc9C/pX6DP0gfkP7oQ/Rtm5XIP3As6HMYAdE3SRe9Fv4H0TUiDuBkIRFch//zzFRG9lz4mehx6K/0VdD4GZCH3Q5o24E28RX0B0uM/Oq0ZStCdvE5r5YKObgYDBgX5xd3fm0/DrkmD+NHxyA/0bjNuMABPvlAm9Dt4FgwAGG7o36L7UwaUBfkFyUvKiv/0NXhG+j8YWxhkpq8BTYY+CGVt/SL0QyahocOiH/NekA59F+sDMjDNgC3Pa/2MsuezfKTM0cl5FvqhDHwzDsEzcG8YOEiXiVnkHeUC0ziNlDP3TT+C+y4rFwaTeScoF/RgG6yHxGVpWDqWLvHTr6OvQd+csuE7IB/oE/BtkDeUHf0M8o5youzo81GelCFlxDPSJ7D+FPfN98X7SX8YvZ7+jPU1mAVP+XE/9Jno0zMOQJqUDXGRNt8XZcm3Qz/BjIg8g+W9PZenPT/1B/0kvlHioT7gl3LifeKbp39EvNRhRfHyH3feI/Kc8jZDji8X6PuAvPuQMrT4LT5Pu2f+mxGGPGBAnz4aeU7dQX5z35QZz8T3Qj+N2fjUe+Q3z4Y/3xblRHkia0ZY3lPKjtXnGEWZPEv8jFEwfsS90ufifqj3rP9MuuQddRffifUBGWux+s8G9S3fcrTnNfK+8B7yfdAvp27F0IpBgmcjf7mfou+F9Hjn0z4g5eDLpVGSLu8PZS6DQfeHDAYtRpnBgI+TD6w9JCwfOx+6GQyokImXj5XKwJOK14g/5D/hmaVAw8ZAFoOTWLhpaKh4qCypCIiDdBncpCHEHzkG+rGQUrFS4VAxES/3RIWGtdOIkkI4Btnwp+HgPghD3DTiKAcoH9wLS1KJnwqZ5yN+7sPuHdq1PafR4uXeeQbSpDG2+4A0CDTstnTVwvn4LA37z31SwdKw2cC0L0f+o6iQr6SL5RoFDTfC5+7V4ubXKnfkCUsjaOWCMscz8Cz4Uf52T6TBc5B3PCNGBJRJGiU/+Er5Wf6StxB5FADyGeWWeO0+qPxpnIiHdwR5ZqbwS/w0juSHf+/sOdNnNTcGK8kfwhIH92rvCGVC2WCooGFHKSdu7sXHRxo+z+w+eYeIm/zLlQt5Ycoi5UNni/tJ7zVNwxpa3ifeXd558hDjC7OUeAbyz74Dwtg9MXOHb4E8Jt/o9GBQ4B2y+FFcMQJQvsiQB8RPnqDo0xnk/rkHnpd75t5Jj3eEuCkX5ImfvKM+4NmJn+eztIz2fFYukPi5f8Jj2OO9o9NHufB8dBq5D5P38fk4+Y8/+eW/F94tXy5WNqYsUi6824QhHp61KA3zt3QoT1YJkBeUCSQfyT/eNZ6L5yMOyztmYCFPvqG4Y6CiLCg78pjyo77DIIGM1RuUkcXLcxEn90L85A/5xDvCe8F3S11JHHTG+AZ5Zv88/Fqe2rPZtckRN+SeeE5fn3E/3D/vCffNt27hoMVp1zwfec73RV7z3xT5jpBy4J5lMBAEQRCEcoNBR4iu4Q0G6E+4oyd43cHodQvTrWj70YXQxdFt0PnQJdH/GKRE3yVOwqCLo1eiHzKIxsAXOoj1BdC9iR/9An0MnRYdiHiRg+gtuKPncf/cAwNYpm+hG6PDEjfGDPRg9GeLHzm7d/889rz2H1q86PzoTPQ/TV8yXY5f/MhH9CaLx8cP7Zq8QOfjXrmfnE6LLkV86Hr0XfnFzeJN6dMwnRZ5ypO84n7RZ+kTk5+4WV+NcJSL9TXo3/FckD4b+YmOR9zkB/8pV57b9EdIvPQleI+sP8x7RNyEIf/x517QZ4kfHZj4SduXi9Hnpb9GljwkLXRw+uF2P/auoC/bKm7kCZsrG4uX+yRO3k/CFJULz2Z9QMqFciJsGr9Px9IgP3i3Cc+APXnG92J9DfLHv6eQvOM5+UboayBH/5Zno9+ODAY+5Jj9Tv4a7ZuhTJngaX0N7oNwPAvvAeH4Xq1cLH7ukzywZ7DnyD2nuSNLX4nnoE/BN8jEUb5DnoG6wt6PXFy+XHgfuE/yg3vhGfnmi8qFsqO+gbxT5HUuDZ8OfpYf9CEZtyIvuGfru1qfmPfCwpMe/TS+JXv/eOdYVUPdxX1x/9w7/ULywsqPOCFujLsgY/fBL8/DMzBWQ9yUC/UZ9Q/x40c+2P375zKan/fnP+8K8VO+1geEvIuUO98j7z7fug9r+WXx8M7z7pPPlAv5Qby+XOjPQe9WRsoVedKQwaD7QwaDFiNnMMCiSEXIxwn5yIpoMkb/YVIpUFFhDaZBokLmY6VSoPKkQoD8LyIfNpUclVBKq3C8LBUo7qkcceBvslaxYRzxNGsq/v4+rXK0uJEzmpXUy1s6xqJnLYrX34/dey4O78YvZUAFS0NPJZsqJfynIaBcaBRQiikX5Chz7idNI00HIsd98eyWz/znfvnv8yN9RntO5K0BQ470eU787dl9PqT5bHH7dwQ5o8n
jb2HSZ0ufy+jj9Pdg/3HneZCzeNI4jKRNvqNs0HGycin6XpAxgwHXli9laUD8SYtntrzgHn258EyWF/xauSCDvxE3/EyW9C0enw8wVy4WJi0X/iNv5W5h0mfLPStukHj9u0R8RuL1ZWJM4yJdSBmgpFE2ppTwfeBOmUD+o7ggQ12GEsc3RrzEUZSGJ3LkEfdo92z5wq+Vi8VpeWdyFobn9nnt88KXCbR4LT+gxW9xp+WSlnvVc+Vo957eE9fcOzJFeWbpku/W6aV8fLnYN2PEzZeXlVmOyHN/MhgIgiAIQrHBwLedvn319DJG2ln80Glpxxn4ZOIX+hP+pvMX6QHeLacLQfQW3Lw+xG9Op+U/cVi6kP+4ex0F5uI1et3GdCfTn1L5qufytHv28fp7wo+06QsUxUHa+JG/6LI2MM3gqpWJlQu6lDcY0NfAzeIinqr7Jz2e2eezMc0/fv0zejnTOS1+ZKvKhbzw91NULsjjbvJG/3z+vydh7PmsPOzX/lu5WDw+fBoveU9/jr65LxfKI1cufCuUC99POm5icdq1udmvv2+f39Dnh5ULbsibDP+tXJAhXl8uxOvzIo3X4uba4rYwkGvcLX4L45/NP1dKH6/dr9HfR1kckLTJW74X+t30N9LvBXLtDQaUDYP/+JGWf4Yc7R54Xu7R8sD+Q/8upXln5WLPmn5bhLV4rEyKysXC2DuCDLKE5b8vd5P3eViVp/hZ/BZ3ej/4l8WDP3nL90I+873wXVgfMK3PzI3/RnNLaWUmg0H3hwwGLUYdg0F7yIdNBcuHbgYDGkE+ZCoJ4qeihvzvLDY7Dbt3ez4aF56pFc+So0+XXxT01GDgK1n+c8/40fAx+Em58CyEt+dL6dPpLKZpdHZ6sBXPRZ6S75QJeW3KYlG5IIMhh/JBlgaORjQXd1dgK/IQNjMdysTKxX8vfmDalA7r9CJDudhqKe6lmeXSiny0+O35U/9msb3PYvdEvpuySPmkymJ7SZmijKIsymAgCIIg9HTkDAboNrk2tC5Np6UdZ+CTGcH0N9CnbMAobf+7M1ulvxWlQX7iR/7SbzCDgddp03JhaxX65vQ1cLO4RvWzdDZHxfOhe9LHYIIeOm1ZudAH5FuhXAhj4ya5b8ank6bZWWxFOq1IgzqO/jXfCXluBgNfLtD3AZnMx5gW//FT3zzPjqTNe27fC/lsfUDKwZcLTMuqjMgSL2XGlkhmMGD8UwaD7gcZDFqMIoMBHzofV3vJh80HzoduBgOURtzNKtidaPdMI0+Dz8Ahz8Xz8byp/KgijR2DbNyjH5j25cIz4EfDx3JBm8VAhdody6ark2+JcqBMyGsrF8oiVy7I8H5RPpQl/sSRi1tsH3nPYfq9UAa+XCAyKPHeYGDKYncqF56X++UZmU1Dh9KewfIjF66VtHsgz1EWuUfKh/opLZf2kOdFWZTBQBAEQRDKDQa5drQOTaelHfcGA/qFDAp1BX1jdCL5SXmR5/RLbaCN67JyYQsX+hroWLl4xeaQvKc80L3RadFxc+XiDQaUC2Hw0/fSfJKn5D3lQZ7zSxmkfQ3rA9JPZNyHfiD/cVffvPm0cqFPbvUY9VP6zTRKytX6gDIYdH/IYNBilBkM+Lj4QPltlITjA6cCpnJFWWQ2A5VxR5TQUUWrwFCw2KuP5bVUZKkyNipJvpPn3Jctr7PGz5eLN+Qw+GmzGPDvjmXT1UkDSDlQJrw/Vi7+2/LlwneCskj5MGCKn8qluSQ/IXnuvxeufblY2ZiySLnQwUIeP8o2F39XpD2zvYvUY5x/QP1s7xjPlAvbEVq6Ob+UJmffAnnON0D+p+WSo8WT87NnlMFAEARBEH5GkcGgqC2tS9px+hbotJwzRn+DPooNblt73Sw2omuMbuS5yXPyF50J3Yn+N2WQlgs6oJULk/nQAbnOxSs2h+iflAd9QMqnrFyQYTKf9QFx76nvdWeSPKVc6AP6ckn7GlzTB6HfxLdC2WBgQLY79QG7C61c+F58H5DvwJdLES2e1J04cZfBYPSADAYtRh2DgdFf+w/Q/nvyYfOB86EzyIbBgIFpq4zxt7g6g8Tv00ivy5jKcc3gFdZ+Btk4nMYUrFSubhodoU8nTdMGNs1anjZ+/DdDjhkMbBYD8ZhMyjSdzmAr0hhVJM9RSFDQfbl44oYf3wkKCe+YGaWKyqWIrcrLUVVmzUiXPDUl0L4XysnitW+Ga2Y6IEe52PZquKN8pPE2ymY8SyO0Z+IZOISOQ6io1yDvHHlgClsz7quR5zM5q6PIc74B6lrzs3Lx4XJu5u6vTVmUwUAQBEEQyg0Gvi3N0fun/9Fp6Vug05rBwAamvXyz2Iiu0Sy2Ks2qdMwfndYmwdD/RpfC35cX5YJOS7kw+Q29z1bY5uL0bs3mqEhjVKRJntvANL9FfUDKgX4i34r1ASnDou+lFc8yqtlZz0iekueMmfAtWF6ncqRNvUWfifEsvzNDo/XYqCyvVqTdjDSsXKi/cn1Ak0nzvszN3Gl72PKIMxVkMOjekMGgxagyGNjH32gFgLw3GKAsMkDNR09FQEXriZt3z8mYe3v8uBfo/YvkzR1SUTFAyMnxnCh/2WWXxWsqMeLz4XwaPl7/39PkUnnv592Mlq73t/8o59wbDRv/cfdlx3/kzGDAs9gshjR/LN5cOkVM5b1bUdjUr0wW5vzNLeeXyqT/i5iTMbeisKkf/8lzlEAUDJSSdGDafy/4oSyikNgshqJy8f9Tf8LYe1IkU+ae88vJ+XQaiSv18//Ta/uf82+vn12bIYB8zpULNDlkUOLpYFFGPLd9T2m8nt7fs8y/Kiw0Gc8yuZw7z8DzYzQ466yzYj1HnWezZuz5LI5cPN6/UT/o/e0/7QT3UaQsVhHZVJ5r2jSURRkMBEEQBKHYYGCDLO0h7S3tNnoVk5KYnEB/g2v8Up3A6wG5a88iv7IwRi9TJt9eP5j6l8kX+ZWFMXoZ+0VXZQAU3cn6dV4X4j9yNgmOyXz0NUz/9fS6vbn5NFOaX+pfFoY0oJcpkze/nH+Rey4NWCRvzPlXhYHe3+T5FigX+g45ndbKxRsMGDuhHJFN07R4/bX3T5nKe7ecn5fx/4vkYJG/uReF9X45OSs/u87JmFvqnvr7a/KceMljKxf8fLlAZK0PSJlQNvTlcUvT8+n4/8Y6z+L9ity9X1kcsMy/jp9nTs5Y5F8W1vxS2vdCntv3grsvlzJShmk5cm19QBkMuj9kMGgxygwG9oH5j80+ZvsYPb27ffB86CgjKItUtCgpVJb4Qas87bqMRbJ14qgjY+TeaQjYeuSMM84IRx99dDj++OOjsYCGhXjs+TzrplEmVyeOMhkURAamoTcCeOJGuVAeNruEiplntnjL0vAsk0vdid+IH++UzTSqSqssHVjmXxXWWEeuTKbMHWUcBQMFnbzHLS0XGk/KDBnePYgS0+xyMRbJ1AlrrJIt868K698XiBvvDIPZ/HrZsnhgLi2uKRe+FVMWcTMlxuoyrpFDhjKx1VK+XIrSSFkl0+o4IM/Bs2E0oL47+eSTw9ChQ8NVV10V6w
V7N4viSN2NVf6wSIb8pq6lbKxusu/EysXo3VOZ9NqURZajymAgCIIg9HTkDAa0k9bvg74PaG52bW7pf9pydFr6GH4r11SfqNIVqvxhs+LIuRvrxlElU8WqOIr80ZPIbwbZ0J3Ia9ytTKxcCIuOZQYD9FqT9ay6D2OZXFUcVf7GMrmqOKr8PdubBiyTYeyDcqHv4I0AnoS1csG4ZjPZ/bhJWRopq+TK4qqbTpVcmX9VWGMduTKZIj/c+GbIY/KcvgbuablQX1EG9JMwFvC9MNGSsrJ4i9JIWUeuTKZOeNiRNIxVMp0VB9f2vZDnfC9cW98cWn1mdZp3S//7a/qAnKEjg0H3hwwGLUbOYMCHhMGAj4sPl4/UPlT7792M/pr/VMRUwFSwDEbR+KHMUCHgRwVQRPyNZf45P1jlD03GaBUVldSVV14ZBg0aFPr16xcGDhwY9/qmQfFhfRzmnqPJVMkVsW4a/KL0UcFCUwB9eUHcKBcMBja7hGf28eRo92HM+Xk378790MiaAkRennfeeaF///5xkNIGYFPm4vP/czJVftD8c3LeLeffCAlLfpsSiJJhSomVh30vvHuUma38gOQLYcvuwe7RmPNL/7eHFj6NI3XLydQl4cgb/vN+Ymiks0ke4cZA9kEHHRROP/30mFfkWVFaZfdh6dj3Qj5TLvhZnWd1GdfUW8hRJrZaClniKErDmLsPf53zT1nHP5VJ3YrisGegDmCFwbHHHhuOPPLIMGTIkPisvH8mk4sjTSMnY6zrT7mkyiL34MslR/9N+WsjbRptmwwGgiAIglBuMEjbVf+/zI9fa8vR4egD0t+wSVdlegA0XaDMrT1xeOb8U7ecjGeVP2xVHPyix9LHQnfiP+5WJlYu6FJmyLGVH5SLpVGVTpVMFRtJo0yujHXCFqVRJ6yxSs7iQo+2vq8fAPW0cqF/gXHNJo1ZH7AsLfNPZVK3nEyZu6fJVIUvkqlDC5uLI3VL/RshYclv/pPH5Ll9A2m5UF9RBsgxnkU9Rv+Uuq3sHvDzzPn5a+/fCC2uNI7ULSfjWeZvfjkZ75bzT1klgx/fAd8LfXP6guS/9c2h1Wc5en8vT3h+6QMyMVoGg+4NGQxajDKDgX1g9sGlxM/7p//58KmAqWCZXWIDoFS6NniKwtJZJH5jkR8D2NzbxRdfHM4+++w4EMkg9v777x8OPfTQOECJP/fLvadxWTzePb3uKH0aPl675t7IUypVU0i4VxozGkQrJyOVsTcYUC7El3tGn6b/X5cmz8AjRhhmLfM+wAEDBoQtt9wy5vUVV1wxMv72pJOyKp5mp2Fx+WtfLvyavynxReWCDGVixhwrF359+mW0e8j55Wjy7Q2X88uxThr48W7yrvBd0tHkvcbvmGOOie/MfvvtF7cI4z0qemdzaeBm+WnlYm7kPXWWfTNWl5kSjyzlQn2AIk8cuBE2l1ajzN2zuTUjfliUhj0D+YlxFCPpEUccEfr06RMGDx4cv0/8TN6Hr0tLOxfe3MlP6+z6cjKjhS+XRklYGQwEQRAE4b8oMxhY25lrT1N378Yv+hR6FW08ehyD07Tz1k8p0gc6Sou3s+I3tiINYy4tuyYvc7qTDUz7cuI/fQ36h4Slb0Zfg/+EgYT3aTebdt8wdfNyHWEr0oBl6UDTYf07z3WuD2h9DeQwGED6QlaePg1PSyvn1wgtns5MpyqNHNsrn4bzbr5czM0MBkXlQhj6fjZuQphG+4CWVtF1SvOvkitjnbAdTQNWhfdppHJc23tOmdj3AtM+ILT6rBESjnFNDt2XwaD7QwaDFqPIYMAHxYflP9BGSDhTFqkIUEoY/GNA/sQTTwxHHXVUOPzww8PBBx8cZwwfeOCBhcQ/lanjlvqbm3ffZ599wk477RQHITfaaKOw7rrrhrXWWiv+br311mHPPfcMBxxwQGH49LrIrcyvrpv3I98OO+yw0Ldv37hd0qmnnhrOP//8eCCzrRiggaPx82Xoy4WKGIXk0ksvDWeeeWbcioQBWQYKrVz8PVRdl7kxwLvXXnuFvffeO/4n37fddtuw/vrrh6222iq6W1gfh//v46vjVsZU3q5zbnZd5OZp5cJAKwYRBlvPPffcOPhtqyuKysWUeBQQlBIGaTFinXLKKdGIxazvQw45pPIezN/L5MLk3OrSwvrwZW52XYcmzzuxxx57hH333Tc+N258p6uvvnr8xd3ir0rDlwsz6Fk9RN6SxzaLh7xPDQb8WrlQdpQhZUmZUraUMXESd3vqMruucmuEdeLLuZk7v+Q93+eaa64ZVl555bDOOuuE7bbbLrpXhfduObn0P4ZZ3m3eccqFsxTY/g2lnDoqNeS0l4Q3ZZE2TgYDQRAEoacjZzBg2wabNNaetpcwtNvoTgwC0degXU/7GqbTmk6QY6pD5NzsOudm11XMyXs3+59z82HKmJOv45a7tj7gcccdF1eEopcyMYvJHei09DVS3SktFwY+ra/BRBFW1ftySdP1zPnVcasjk/q1180zTSP9n4av6+b9rK+BTmt9jXPOOae0r2HlQl+DMkOGMqQsra9BXPQ10JeL+uf+v79uxK2Mqbxdp272v8ytiLn4itzsusjNEz/yjnfbdo7gnadOom7iW8iVC7RysT4g4yyMtzDuwrfHN1inD2j36GXK3Oy6PczFl8br/7eHufiK3Ow6JX5WLrQJJ510UmwjWO1uq/kx5NgqG18ujZJypV8vg8HoARkMWowygwEfFh8oH1l7aEoJg9dYEBkAotG84IILYkXNVjTM4KfSRdmpIo0mv3XlUxLOwlLJU1HtvvvuoXfv3iMNBptttlnYbbfd4rkFDNQ2cn+N0J4F+vvy7mVE/rTTTosVKw0XjR4zn23GNYNsXlk0Up644UclTGVs5XLRRRdFxcbKJZduEblvaM9iz+OJm8WLkQNDDcaZ7bffPg68FoXz9Gnk/JtB/yw5f2N6H3Zt5cKWSxhjUC5sRjrlQsNHGfjGz8rGyoWBUhR5m2WP0kicxG3pcY/2v4h1n6Uj9GlYOrl7MzcvZ9f2P6XJ8uyE551ZddVVozEP5QK3om/UwkJkrFzIS5QR8tZmiZDn/ntJy4Uyo+zMmMNsOeKgjK1c0vuwtPmf5ofPC+/eCH38OZIGrJLzRI56j++TzgnGvV122SV+o+Q5dSUdSZQ6k7ffummkJO+ocygX6iA6SSjwtBm0HbQhpixa2fDbKAlPm8YZGDIYCIIgCEK5wcDazrQ9zdHrTTDta9Cu077Tzlf1AU2n8H7t0WnKWBRPM9Moon+WnH8dlvU1Up3WMy0X62v4PiBx1703exb+d1a++TSgT8On6WXaS0srfRb/v4jINNLX8Ez7gJQlZUof0Pc16tyHsehZmk3L96o0OnIfdZ4l527y1DXW1yBPKRfqpLRc0r4G174PyDgL4y18c4y/MMmJybCWtuVFGes8i6eXqxum0TTaS59Ozr+Idl9WLtQ91EGMRzEuRR8QI43vA6ZtTCO08PQB33jjDRkMujlkMGgxcgYDLG98UGYwaA+pdCEfuRkNbIkRFloaQhpQKgWIJ
dGWRtp16pdzq5Kx/5D0GOijUsIizOAjg2DMnN1kk03CFltsES2dNMwsneUeffiiNNJ0cjKpn3fz9P4m4/970tDxTLZVCEoi+WzGAvLeGr+0XPCnXGgk65ZLeh927Wl+nubH/RI/eUtngcFIDDUYblCq8IcWxof1TONN/+euU7fU3649y+TsOkdfLraclDymXMjzVCmxhsx/L5QLiomVC3ERp88fu4/0nuzas0rO3Dxz8qmfZ+qXkzOZIhbJkQ+4Y+BbccUV48oDOp68T7k8SWnvFnlImLJygb5crGzw998L31xaLkW0e/NM/XKy3s3+l9GHNXq/nFzqzy+KNIZdlDlmNrHKirpyww03jCsPMCagdKM0W3gLm8aZc7P/RvLPygXlnXefvLZyMWXRl0taRjm39NqURRkMBEEQBKHYYGB9QNNXq5i2t0U6Le289TXq6G/ml8qk/3P+/rrILfXz/v7a+6duJp9z89f2P+eWyufc7BqaTlvU18j1AX25pDptnXLx1/bfu6XM+efkvVvqb9feLUcvk5NP3dL/Kc3PM/XLyaXlkuq0ReViZWPfC+VCv973NSyNsvswP88yuZxfzi0n71kmZ/9z9PK5sCm9Xypn1zlauTTa14D2vfBt+T6gjZvkvpf0nuza0/vl5MytjF7Wh/XuqVwqm/P3TP3s2rNKLnXjP/lmJB/Jz1y52PeSlov/n3Pz8sTBfxkMRg/IYNBilBkM+LCsYWsvCW8VLR++JxUvFQLkf+66DqvCWIVDBYRFmKWCrCJgEIwlZGyPw6xZrJw0Jr5h8KxKJ/Wvks+xPWGgPSOsU2ZWLmnZWHzpfbTnvtLw1lnAYMAsiV133TVsuummsTzI91ye16HdW9H9pX5FcnVYlU7qZvlq5dNI2Vh5WrnwW5Z+oyyKqxVplPnZ86KUoTigKHPNu7PzzjuHhRZaKM56R7mwfPLhPXNpWHnAumUCfblYeIvPx98RFuVJs1mVjj0neU6decIJJ8SzRsh/VhtguGFVEH6UkeWHz4tGn8XHAS2/c2VRRVMW7driok17+umnZTAQBEEQhBHIGQw46wcDu7Wf1pY2Smt7Yao75dio3mCsCpf6V8nnWCeMl6kjn2N7wlm+kscwVxaedcrF30NnPUvq36o0yuRzbE8YaPkKyetcWeRYVC7pfbT3vmBZWO9XJlfFsrCpX5lsFcvCej9fHkb7DnLlkNLKxZcNTNPsCOs+S0fYijRgI3H5/ISNlEsdEhf9QvqAMhh0f8hg0GKUGQz4sKxitA+3zgecyhrryhlTec+cvNFkuH8zemC1ZGUB+6Qx6NWrV684YM2+c8yixdpJBeUtkWXpmL9nTs6Ykzfm5I1l8v6/d8sxlfNshryxSJbBX1Z3sGyPrWTYf37ttdcOyy67bDzbAOsyee/jsf8p0zQ8myFvzMkbG5WHjYZpVN7YSLicLMzJejYaphF574+xgGW4tucte+rPPffccTUQhx6jiFinNsc07hxz4Yx15XNynrkwxlSmSN7kypgLZ6wr7/34xSDA8l2MBpw7g3EVshqLrYlY6cH3jTx1p4X38aRMZXLMhYM5WWORrLVj3N9TTz0V2zgZDARBEISejjKDgbWf1pYa07bWM5Utks/JpcyFM+bkYV05YyrvmZM31pUzennPnKyxEVmYynvm5I3NkIc5WWMz5I05eWORbHqdk/X0cimbIW/MycOcLMzJwpysMSdvbIa8MScPG5H1bDRco/Kw0TCNyhsbDdeoPMyFMdaVz8mlzIWDOVnYiKwxF8boZcrkzQ/SjuFGH1AGg+4PGQxajCqDgVn6bMClu5DKgfuHDGhhGGD/bcg5BWyvwcHL+LGND3JWueTiE9tHy1dWEDDoyzZQGAo4SJWB39lmmy3OWMaYoDIQoTXuvAtsjcNKoDXWWCMsvvjiYfrppw8zzzxz3BoH4xPLdc1gkItL7DjJW2Z/sHyUlUEYCljhQV264447xoOQOciesjLDQVf7hq0d495kMBAEQRCEn1FkMGASj7WfaZsqiqIoit2FtGP0TWUwGD0gg0GLkTMYsF0DBgMbKIK5jw/aQEzKnJz/X0QfplF5c6NCQNFljzoGqRnYWm+99cJKK60UlltuubDOOutEAwKHCdmKgrJ0fBom5/9XMQ1bxJxczi3Hjsh5N++esq4cNBnylvJgG6J+/frFLYhYVbDwwgvHgd9ZZpklbLPNNuHyyy+P8sj6wcZc3EaT8XLezbun7Khczi3Hjsjl3HL0cmWyRXI5txzrysFUrm5Y/G3AmcFn9slnJdDSSy8d35UJJ5wwzDDDDPGAcgavzWDg35m66dSVT/3rhIFerhH5KjdP80+Zk4U5/7phyGPqVM546d+/f9yiiPMM1l133WgEZJs3Dv9i6zFkKcdcPN7N0/w9c3LGnExROGvHuK8nn3xSBgNBEARBGIEyg4G1n/xa++rp29mUOZmqcObvmZMz5mQaCeeZkzPm/KvCmb9nTs6Yk2s0XKPy3s3LFDEXNsecXJ1w0Ictky+Sy7nl2BG5nFuOXq5Mtkgu55ZjR+Rybjl2RM67efeUdeVgTi7nlmNH5Lybd09ZV86YytUNW1fO6OXqhvVyjcin13XD1ZEv8i9yt7FM+oCvv/66DAbdHDIYtBjeYPDDDz9EZfHNN9+MBgMGV+wQXf8RNov+Q875t4e2rQ0z1jnYmAHqBRdcMEwzzTRh1llnjYNbAwcOjINayOYGGtvDzniWlD6NVqWT82+EtmqAgUa2fmIrKIw2888/f5h66qnDdNNNF1d7MDsZORsszsXVXnZ2fhlblUZnp9PRNOqGL5PjHeDd4b3gcGwOOp599tnD+OOPH6accspo9ON9Yo/9shUGde+liHXDdySdrpQGLJKlTOzbZAu3QYMGRaMBK7YWXXTRMM8888SVIEcffXQ8X8LKJfc9N3I/KRsJa7K0YWxfhdsTTzwx0mBAmyeDgSAIgtBTYQYD2sPhw4eHt99+Ox56TB+JtrPRPmDdNrquXI6NhDXZ9qZVhz6NVqWT828GW51Gq9LJ+TeDPo1WpZPzbyZbkY6e5Wc2ErYV6XT1NGCj6UD6gPzSBzSDAUZyGQy6J2QwGAVgsOQ///lP+PHHH8OXX34Z3nvvvfDiiy/GAR8MBv40ePYUhwwIdSXayep2oC4rCLbaaquw1FJLxUFpuP7668f90HkWf7q92HkkjyHlcsopp8QyYSuiiSaaKIwxxhjh97//fZwtfsEFF/ziHcvFJfYMUv68B6weuOiii+IKIQajMRb83//9XxhvvPHCKqusEo499thoGKR+0rfcGlrZwGuuuSYaX3fYYYewwgorxBUgk046afjDH/4QNt544/i9I0f5WNg0vs6i3SeGSuoV3iUGPR599NHw/PPPh3fffTe2dbR5MhgIgiAIPRW0f7SD//znP8PXX38dPvjgg/Dyyy+Hxx9/PLabtJ+0o7SntK2tbMtFURRFsVHSTvk+IOOZGMExhmMU/+KLL37RB1Q/sHtBBoNRAD4SW2WAtY0ZJhgNXn311bjSgEEWLHVY55iZ35XItiVU
AvxSOQwePDjuib/aaqvFPc9tAIuDdRngshUTubjE5pN3hjyn8mZGMgOJM800UxhrrLHi4O/EE08cttxyy3DZZZeNLEeYi0vsOeS9YYXBpZdeGmewsxplzDHHjO8MhgO2wBkyZEhUBpgxwLuTi0fsHFq9C6+44op4CDLfNis/MNLOMccc8ZfVIawSoSwpp1xcnUXu0VYVPPbYY3GLBWaVvP/++1FR9DNLpCgKgiAIPRHWB/zXv/4V20XaR9pJ2kvaTd8HlH4uiqIodgdaH5AJ0M8++2x47bXX4vgmK8zZjog2jwnT6gN2P8hg0GLwkRjNaMCy1G+//TYqjSzZYbYJHxizMt95550uQ+6H+/roo4/igBBbYWywwQbhz3/+czxMd9ppp43XF198cXjppZeiHPK5uMTOIe8OHQ8suswWZzbyvPPOG1cYMPg77rjjRgMPFTrlaczFJfYc8s7QoNPIc4YBhx2bkQmDwZprrhn30mcl1Mcffxzlc/GInUvqU8gWPxgFOaAaAw+ruf74xz/GVQc777xznOHBVneUVSu/b2sjPvzww9iWsaoAJZFZJbR1UhQFQRCEnoy0D0j76PuAtJ/qO4miKIrdidYHZCzqk08+iW0abRvjnGYsoN0Tuh9kMBjFQGnkA4IojpCPiqWqEEWyq5CPnMEfls0OHTo07LvvvtFYwAx2zi044ogj4mnottwoF4fYueTdgVTSTz/9dDj77LOj0WCZZZYJk002WZhqqqnCnnvuGQ0KVoHDXFxizyHvAN8sRgNWEXC4rq0yYPY679D1118fB6CRs06u2HrSLtj+xyhoHGC+3XbbheWXXz6eVYLxlvKiLra2pVVtibVbkHeK+7R7oP3g3YGCIAiC0NNBe0j7SFuZ9gFzbawoiqIodkX6/h+kPUv7gEL3hAwGXQg2mNLVaGCwkOWx559/fjj44IPjYbpzzTVXnOHKYCIDWCAXh9gaGpgBPmzYsLDHHnuEZZddNhp1fv3rX8cVBsxAfuGFF2IFbsjFJfYMGuiwsoTwyCOPjAPPY489dlxhwAHmG264YdwK57PPPmuT1jszKulnaPCtU/9yjsx6660XFllkkTDffPPF/6wKMSMPSONpBQVBEARBqEauDRVFURTF7khh9IAMBl0IXfXjYiDxH//4R9ziplevXmHppZeO219wVgEzkdneBouiQZXEqAfLmtmyhNUErP6YYIIJ4uDvb3/727jlDAesqYwEj2+++SYeUNSnT5+wxBJLxAOyeWc49Ji98jnwmJUrQtdA+v2youj0008P/fr1C1tssUWsoznwnDrggQce+EUd3QqoHRAEQRCEelCbKQiCIAhCV4MMBkIp2IKIrS1OPPHEsO6664YFFlggbmvDDGT2z+ZwTQYaha4Ftiq55JJL4moCBn+nmGKKkQYDjD6UKUvFBMGAkYlDi1hhwKoUzi7gnWFVyiqrrBIuuOCCKCN0XVA+rATZbbfdwl/+8pcw88wzR26zzTbxXAPOFBAEQRAEQRCEngIZ5ARBENoHGQyELNjygsNL2B+7d+/ecQCRgSdWFWy66abh0ksvjYPSflsboWsAQwCHTg8ZMiSsvfbaYYYZZgi/+93vfmEwYDYy+8sJgoHVA2xtw8HlE088cTy/wFYYrLrqqnGFkd+SSOh6oDPEAVO33357rLcxFmLcnWeeeWK9jRGRw6jUaRIEQRAEQRBGZzBOwZbJX331VZwEyWpbJjpyZhuTaLiWTiwIglAMGQyE/wEDyQw4n3baaXHAGUMBg84cnMtBx8xU1UzjrgsOnuFQ40GDBsWBXg47tsFfDAZ2IKoMBoIHyjMGQlYTjDXWWPF9gWxNxOHmF154oQwG3QR0gJ555pm4vdSSSy4ZZplllmg4WH/99cMZZ5wRXnvttfD9999H46I6SoIgCIIgCMLoBvRcjAUfffRR+OSTT8J7770X7rvvvnDZZZfFrXvRh9UfFgRBKIYMBsIvgPWdgabBgweHjTfeOB6eyT7Yq622Wjj++OPDE088EWewapCpa4MzJxgYXHPNNeN2RN5gwAqDJ598UgqS8AswgIwxkPMKeE+8wQAjAoedy1DYfcDZMxxufsIJJ4QVVlgh1gMYfinLvn37ajs5QRAEQRAEYbQFOyZ8/fXX4eOPPoorCoYPHz6iD/xEGDZsWLj++uviBEn0ZUEQBCEPGQyEkaBBfeSRR+IAE+cVLLLIImHhhRcOW2+9dRwsfPXVV+MAE8v7ZDDo2nj//ffjtlGbbLJJmHbaacOvfvWrOPjLzPEdd9wxGoV0hoHgwXf92GOPxa1rxh577JEGA7YkYj98trPRCoPuBcr0jTfeCGeddVY0AFOfzzXXXGHFFVcMe+yxR7jmmmvibCvV54IgCIIgCF0f6Hbfx212vg6fD/9iBIeH4SN+xf/lJ598Gt56+53w6quvhTffeju8++574bnnng/33Xd/eOCBB8NLL70cPvn00zD8i3z4nsaf36ef/3/77XfxXRMEoWdDBgMhWtY//PDDcMcdd4R+/fqFzTffPCy22GJhoYUWCptttlk0FjDopEaj+4CZxf379w9LL710mGCCCUYO/vpDj7XCQPBguS4HG6+00kq/2JJIZxh0bzC7ihlV99xzTzjiiCPiarEFF1wwrhxbb731wqmnnhqNwarfBUEQBEEQujaGf/lleOrpZ8NVV18fTj/r/HDq6WeHoaefE4acdvYIntX2K0YOPSuccuoZYeDg08MpQ+AZ4eRBQ8OJA4eEkwadGt3IM/IwG77H8Of3ZugZ54Qzzj4/nHP+xeGue+4b0X/4su2tEwShp0IGgx4OBo053JitSI477rg4mMyhpxtuuGHYZ5994iGoH3/8cRx0EroHGPhjpviBBx4YFlhggTDOOOP8wmDACoNnn31WSzCFX+CVV16JdQDGwtRgwCAzKww450DonmArOQ47x0Cw3XbbhdVXXz2uHKG+Z1UZdQLnnwiCIAiCIAhdE598+lm45/4H42B4vwEnhdPPPC9cc+2N4Zbb7gy333l3uPX2O8WRvCvcBu9o+237f/sdd4/4hXdlwvQ83nn3vfH33AsuCUf1Oy4cedSAMOzaG8LHn2grWkHo6ZDBoAcDYwF73TNzmO0pGETadddd48HGHHj8+OOPx4OChO4FjDuvv/56OP300+MZBlNOOeUvtiRiBcm9994bBxAFwcAsdLao4aDz8ccff6TBgO2JllpqqTjQzCoEofuCw5A54I3D3vbff/+wxRZbhG233TYcdNBB4dxzz41GA9ULgiAIgiAIXROffvZ5uBeDwWlnxRnzt995T/jww4/Cv3/6KfYBmTgmio3Q3puHH3k8DDhhUDi8b/9wzXU3RuOUIAg9GzIY9FAwcPS3v/0tHm7MGQVsOcLBmAwgDRo0KM5E/eGHH9qkhe6Gv//972HIkCHhz3/+c5hkkknCGGOMMdJgQBnfd999GhgUfgEOA8NggJHJr0rBYMDWVkOHDpXBYDQAZ5e89dZb4brrrgt9+vSJq8owErE9EcZiDoJ7++23da6BIAiCIAhCF0NqMLjjrnv
jXv3S24SO4pFHn5DBQBCEX0AGgx4IDi5mwPiYY46Js82XWWaZsNxyy4VtttkmzjJlaxIMCkL3xcsvvxzLd4kllgi///3vf2EwoMzvvvvu+B4IgoGtyTgclzMMxh133JHvDAaDJZdcMhoXOetE6P5gFhFbzd11113xrBOMiGxRhOG4d+/e4bzzzgsvvvhiPAgfWXVCBUEQBEEQRj1yBoOPP/5E2wcLHYYMBoIgpJDBoAeBQR9mEd95551hl112iXuVM6C87LLLho033jhuYaPDjUcPsCXRiSeeGJZffvkw4YQT/sJgwPkUN998c/jiiy/apAUhhPfffz8ecM7AMQdl2zvzu9/9Lh6Se/zxx4f33nuvTVoYHYBB4MEHH4xnV2BIZCUJ7cIaa6wRDQlPPvmkzjUQBEEQBEHoIpDBQOgsyGAgCEIKGQx6EDAE3HbbbfGcAgYFGRzCYMCWFFdffXV455134rkGQvfHc889Fw499NAw//zzxxniDPyawYCtRzjMWgfYCh5vvvlm3MaKeuE3v/nNyHeGg7LnnXfe0K9fv7gKQRh9gBH5u+++C0888UQ0MG622WZxtRntAoZFDAl33HFHfDe+//77tlCCIAiCIAjCqIAMBkJnQQYDQRBSyGDQQ8De4zfccEM4+OCD437VCy20UFh44YXjrNIrr7xShxuPZnj11VfjjHBWjzBb3AZ/GQhed911ZTAQ/gefffZZPAx3lVVWiUYCGQx6DjjXgFVJrDDZeeedo0GZ9mGeeeaJZ1oMGDAgGg4420Db1QmCIAiCIIwayGAgdBZkMBAEIYUMBj0A7DvOntTrrLNOWGqppcLKK68cFl988Xh9yimnhEcffTRuNaJDjkcfvPTSS784w8AGf1lhsP7664ebbropDB8+vE1aEH6uJxgwXnHFFeN7Yu+MtiTqGfjpp5/ilnQYlk844YR4rsGiiy4a5ptvvrDVVlvFVWiffqpD9QRBEARBEEYVZDAQOgsyGAiCkEIGg9EYzBrl8NszzjgjbL/99uEvf/lLWGSRReIM4qOOOio88MADccYoW1JoEGj0ArOFTz755Dj4O9FEE/3CYMB5FcwW1qoSwYMVJ6w2Wm211X6xjRUGAwyMgwYNCh988EGbtDC6gnNuMBoccsgh0WiwwgorxLNQtttuu3DJJZfIaCQIgiAIgjCKIIOB0FmQwUAQhBQyGIymYLUAB9sedthhYZ999omDxGwrwoxR9rZ/4YUX2iSF0REM6jFbnO2HppxyyjDmmGPGwV+2JNpkk03iwdcceCoIBuoMDEkbbbRRGG+88UYaDDAesFKF1UisQhBGb2A8ZvXRvffeG1cpbbvtttHI/Ic//CFucYbhiJUIMjILgiAIgiC0FjIYCJ0FGQwEQUghg8FoiC+++CLceuut4cADDwwbbLBBJDNE2ZMaAwIHWEqpGL3BXvO2DdUUU0wRxhhjjEiMB7vsskt46KGH4soSQTCwN/1dd90V6wtWFWAs+NWvfhWmmWaauOrg7LPPHtEh+bhNWhjdQf3AdnUYijgYn7MMOBB7pZVWilsWcUg+B+nLcCAIgiAIgtAayGAgdBZkMBAEIYUMBqMZmDX+17/+deTBlWxDxJYSbC1x7rnnhn/84x9tksLoDAwGlDcGg0kmmSQO/jLwu80224Qbb7wxHnCrgT7Bgz3smVWOwQBDAe8MBiZWqZx++ulxezO2ORN6Dr7//vu46uTwww8PO+20Uzz/ZLnllovvBGdaPPbYY+Gbb75pkxYEQRAEQRA6EzIYdD7oI9PnYfU1E6qYINMTIIOBIAgpZDAYjcB2Ieecc07Ycsst44qCBRZYIP727t07XH755XFGqAaJewZYRTJ06NCw6qqrhnHHHTcO/rKdyGWXXRbfkx9//FHvgvALMKMcY+Naa60V3xdIHcK2NH/7299kLOih+Ne//hWeffbZ0L9//7i13dprrx2WWWaZeHg+Z+E888wzOjBfEARBEAShBehuBgP6nffff3+4+uqrw3333Rc+/fTTNp/mgT7KRx99FM9aY4C/o6CP/Morr4TbbrstPPLII/EZ0IdbAfro77//fnyeXN+L+2Dr4SeeeCJO9GKbab9rAMYNrtlxAqKj1303ZDAQBCGFDAajCWg4LrzwwjjYN/3004dZZ501GgsOOuig2JjQYDCDWIPEPQN+S6IJJ5wwDv6yD/1ZZ50VZ4qjrGkAWPDwKwzYvop3Zq655opbm1kdovqjZ4LOB6vTOECflWszzDBDmH322cMaa6wR3XQYtiAIgiAIQuejrsGA62+//TYOPr/99ttxdTk7EbC96Ouvvx7+/ve/hy+//DLq9pAVo+hzuDPJkPOsGJzGjz4jYek/IkN8TE6jv0kc9CE8CEfaDHq/9NJL4frrr4/9iSOOOCJOQqoLnoEBb9Ilvddeey2eo2Xpcm/oqNwvW6eeeuqp4bnnnosD5oTjvomD++N+Pvnkk3j/3Nfnn38e/9tz8HzEhTx5haGAiVSPP/54TN9WGZAmssRB3hKW/OCXcN5g4fOO9IgH4wPhKAfy3PKO3xdffDFce+214corrwxPPfVUNB54cA+kw0TQfffdN+y2227hggsuiM/lwXOxQhgjzcMPPxzvlXupggwGgiCkkMGgm4PG5a233orbzzD7k0ON2YboyCOPjIce0+injY0w+gNF5OKLLw4bbrjhyC2JOPT6uOOOC88///wvZiIIgoE96zkU27Yk4vyLrbbaKhoM6iiawugNjAacX7DkkkuGiSeeOEw33XRhzz33jLOb9H4IgiAIgiB0LuoaDBioRm87//zzw3777RcOPvjgcOKJJ4ZBgwbFXyaWMbjOOAGD2ffcc084+eSTwyGHHBLPPDz22GPDddddF+NgIJw+wsCBA8Pee+8d42KsoV+/fuHMM88MDz74YJRhUB6DAGniRzrojQcccEDYfPPNw6GHHhr7oXXA/TOAzoRI4hk8eHCMi3hZRc+A/ldffRUnTTJ5hRWwnLO1zz77xGekH8w9c1/0exkT4XlYMcszEg/x8UwM0HNfDODDJ598MrqTDwzOM9bC/QAG7bmvIUOGhG233TbsuOOOsX990kknRYPF7bffHu+JeEgTowOyAwYMCEcffXTYY4894nmCPA/pIAcwJnA/7AyB0QAjgwEdG0MExheMABdddFHMY+Lkf2owYJIX5cmWsuQN/8mH9B1JIYOBIAgpZDDo5sDKjkWdMwuY7Qlp4GlQqhoFYfQFMwlQfjbbbLMw+eSTx8HfOeaYI87sYAuRZizXFEYvoIwym4UtzX7729/Gd2b88ceP+9WjaAoCoHOGgRrDNIYDOoB0vugs+s6NIAiCIAiC0Fw0siURM/2Zac72ogwus8UOM9wZ2Ednw1jAL9vvMLDNADOD1n369ImD/Py/6667Yr+SMQcG73FjQJsBdgbkTzvttDh4zSA6kxUxCjBITZzEzWD2nXfeGeNj5wO2uawDGyC/5JJL4oA86Z5yyikxbrbYZUsemwDHxBUG95l1z7ZHrD5It+JhEJ3VAqy2J56bbroprrRgJQXxkJ5Nfi
HfyCvSxGCA0cRWGADiZ9UEk2Ywsrz66qsxrmuuuSbeH4YJjAHEwYA+ZYDxAF511VVxS0+2keb5SJf7xNCCIWb33XePOrUHz0L+MuaDQQRjDUYR8hQdPN0elPsnrSuuuCLmHffAPVZNIpXBQBCEFDIYdFPQENB4MSi86667hkUWWSTMP//80dJNI46VWui5YKYIitAOO+wQpppqqjj4y+CezUKQMUnIAeW+V69e4Te/+U18Z+acc8544C0KrSAYqENoZ2hvWLlE27PmmmvGDmndmWOCIAiCIAhCY6hrMGAXAramYZCfQWZ0NAa50dNstjm0gWgGoRlcZiCb2e977bVXXE3AljcMZjMAjkEBwwPGBQwIbHfTt2/faGBgoJ7Z9fxnRQMD86xQZrCcyYxsk0sfA7mqfij+bNnz0EMPxZUC3B99WFYzYBjYaaedoqEC4wdAL2XVAc9p6XJvd999dzQSIMcqAfrGPCOD7BgCMF4QlnEVjAX8Yljg2ZgcQ3ysREDnZVIV+YaM5SsrBVg9wJZEhGHVBvdHHhI398A9k2eskoDEyVZCyBCGcsIY4Q0GPLcHRgW2TsLwQnyskGDCDisq6OtTHhgxDKyG4HkZJ+JZuS/6ct6okIMMBoIgpJDBoJuChglr/v777x8HbZZbbrk4E5gGgb3tvBVc6HlAyUJJQJmYdNJJ4+DvUkstFWdpYEzS9iFCDiirrDDwZxjQgeDcC0HwYDYWM6joqHBeDoYDOi50SOkkff/9922SgiAIgiAIQjNQ12DAVjfoY8zGZ7sctthhwJ/rp59+Oq4yoD/IylFm6N9www1xEBsyOA8ZVL/lllvibH4Gv+lbEp4Z8IThMGPGHjAqMKDOhDW248EdN7bosfgwQvAfw0OVjsh90V9lsJ974L4xGjBIP2zYsGgIYEa/zZg34wgrohnoJy3GSRhIx0DCVr0MmKO3YlggLzA4YAggL1hlQBzkGTPxb7311jgwzwA/M/mRxXDBFk621RArBXg+W6nALH/yh3S5P/ri6Mqcr0DecG/kMasWGPAnf0jLtjriedmyaLvttosrNbifHGzlBVsTkS/cBxO+/OoBjAcYTXhWyCqROhMGZTAQBCGFDAbdDDQeNC4sd9t0003DfPPNF88toHG59NJLozW5qjEQRn8wOwIFj30VbYXBsssuGxUjbRsiFIGZLrwzv/71r+M7s8ACC8SZOLgLQgraIzpAW2yxRTRY0wHCgMCstMcee6xNShAEQRAEQWgGGjn0mNnwDCQzs5xBegaw+cU9lWeyIbKQQWyuIf+RhwxW84v+B80NpgPchCUt4iMtri0eZIm3iPj79Ll/DByQtMomvuHPcxKG8AB54sTP8gLyn/vh/pCBlp7R8g2arMUFkbf7NDf8iYv/TORkVQUrEVhdwKqE448/PrphqLBnIQ4maGEgsbMQeN4U/lkg90RYiwfY6gLGi9gWCSOO9y+CDAaCIKSQwaAbgQYKCzmWfQ44nnnmmcNkk00WVlhhhWg9lrFAMDD7gZkF7K3IwaQM/i6zzDJxVgYzSQQhB5RalsL+7ne/i+8Ms8aZMf7aa6+1SQjCL2F1DefozDTTTGGcccaJWxRhNKAzRGdGEARBEARB6DgaOcOgqwGdkRn/9EeZ5c92P0xM8mQwnZUIrFLg7AQb9O+OoEwY9Gf8hgmf9KdYCYERoqi8MACw9RHGhPY8O2EwdJDXGDkwZNQxFgAZDARBSCGDQTcBjYpZi9kXcK211gqrrLJKtFJjQOA8AxoEQQAoCcz83WOPPcL0008fB39ZicKyUfZxlGFJyMEMBmOPPXZ8Z2adddZ4oBZLcAWhCBgFmMHE3rQLLbRQmHvuucOf/vSnuIyb7fNU3wiCIAiCIHQc3dlgIHRtyGAgCEIKGQy6AVh2xv537MPHQUKbbbZZJP/Zkw4rtCB4MJuBmRnM8p1hhhni4C8HY7Mfo864EIrAu8E7M/7448d3ZpZZZgn77rtvPNtAEMrAbCb2aOXwOw5AZosijE8sq8bgpI6sIAiCIAhCxyCDgdBZkMFAEIQUMhh0cbCEjBnhGAvYH3q99dYLf/nLX+L2Dxyqkx5yIwiAw46Y8bvTTjuFqaeeOg7+Lr300vGcCwxMdZcmCj0LHOJF3WJbEjFT/Kijjor1jCBUgfNRHnroobjMHGMB7dX6668fBgwYEA+Vw6ggCIIgCIIgtA8yGAidBRkMBEFIIYNBFwZ72LHVEFsOsfUQZxXArbfeOu7tx8E42oZIyIGBu7vvvjtuSTTttNPGwd+llloqvksyGAhFYNszBnrZh553Zq655oormahrBKEK1Ctsh/bwww/HfWk33HDDsMYaa4SNNtooXmM0YMWcIAiCIAiC0DhkMBA6CzIYCIKQQgaDLgq2jGG2L3vOr7322nHQd6qppopbPXDK/osvvqjDJIVCcJgSg3acdzHjjDOOXGFwxRVXxEOUBCEHtiTinZlooonCGGOMEeaZZ5546DEHdQlCXXz++YjO7L33hv333z8stthisQ5aeeWVw7HHHhueeeYZtV2CIAiCIAjtgAwGQmdBBgNBEFLIYNAFwSzNTz/9NFx33XXxrIIpppgiDuAx4MvWDs8++6wGXIRSsPXHI488Eg+snXnmmcOvfvWreAjpjTfeGL777rs2KUH4JVhhcNBBB4XJJ588vjMM9g4dOjSeoSIIdYHB+7333gsXXHBBNHhPOumkYeKJJw7LLrtsOOGEE8Lzzz8fV9BppZMgCIIgCEJ9yGAgdBZkMBAEIYUMBl0IDJ5gCHjzzTfDsGHDQq9evcJss80WJpxwwrDAAgvEgTz2h9aAr1AFtv3goNEjjjgizDHHHHFPegwG11xzTTzfQBBy+OCDD+KKAmaEjzXWWGHRRReNK5refvvtNglBqAeMBrw3F198cTwAGSPUeOONF5Zccsl4MPKjjz4aty8SBEEQBEEQ6kEGA6GzIIOBIAgpZDDoQsBgwKxM9plngIVZmQywLL744tFYcNddd2n/eaEWGKxj2yoG5jAYjDnmmPEMAwbvPvnkE71DQhYff/xx3GueVSm/+c1v4hkGvEM69FhoD6iHWJ1yySWXhI033jgewI7hgHMNOLRfK1cEQRAEQRDqQwYDobMgg4EgCClkMOhCYN/5++67Lx5UO8sss4Rxxx03zD///HHGL7PFtQ2RUBfffvttuP/++8Pee+8dZphhhniGAdvLcCYGs8ilVAo5MIDLOzPBBBPEMwz+8Ic/xG2tqH8EoT2gLnryySdjO7bEEkvEFXOsYOEgZDu8ny3UBEEQBEEQhHJ4g8HgU88Md9/7QBg+/Is2X0FoP5544uloMDhMBgNBENogg0EXAYdE3nTTTWG77bYL008/fdw/nFm+22yzTbjhhht0UK3QEFiJwnvD+8QZGBgMWGFw+eWXx3dNEHJg8HbbbbeNK1J4Z+aee+7Qp0+f6C4I7QGrDNh6CKMBZ/BQD7FyjjMNON8AIybb8LGNmiAIgiAIglAMDAb3PfBQGHrGOaH/gJPD4KFnhosuvSJcefV14aprro+/4s+86urrw
7AReXL1tTeEq6+78b+89sYwbIQb+XVVJlxPo703p595Xji6/wnhqBGUwUAQBCCDQRcAs72feuqpOLMXIwF7h7Md0QYbbBCuuOKKuE2IZoQLjYDBNw4WZTsZzsFg8JcDRznD4Ouvv26TEoT/gm2qWGHQv3//MPvss8dzL+abb744yPv666+3SQlC+/Djjz+Ghx9+OOy///7hj3/8Y6yTMGZysD/1EiufBEEQBEEQhGJ8PvyL8OhjT4QLL7kinDRoaDhp4Klh4ODTwsBTRtB+ezLJgxE8ecT/EwcOCf0HnBSO6NsvHHxYn3DQIUeGAw85IhxyeN/Q5+hj42z6k08ZOkL+9J6dd23PTp6RH6xcufnWO8Jnn2vCqiD0dMhgMIrBNkRPPPFEOPjgg6OxgEEUZl5yhsFFF10UPvzwwzZJQagPZvTefffdYffddw/TTjttfK84bJTzMVhhoDMMhBQYmZgFvt9++8VtrDBczjnnnOHwww/XCgOhw6DOoe655ZZbwk477RRmmmmm8Nvf/jaea7D11luHa6+9Nnz00UdxRYIgCIIgCIIgtBf0a956881wyimnhEUWWTSezUZ/GE422WRh0003DTfedFP45ttv2kIIgiAIKWQwGMVg65jTTz897u08zjjjRC6zzDJh0KBB4Y033tDgidAuYDDgPIx99tknGqLY4gqDAYeP6tBjIQcUa84qOOyww+LZBQzm2qHHHKCtVU5CM/DFF1/E7dLYbo/t937961/HM3t69+4dbr311ugvCIIgCIIgCO0Ffd33338/nHbaaWHRRReN/RozGEwyySTxLC1WuH755ZdtIQRBEIQUMhiMQvzrX/+KWxExcDL++OPHxuvPf/5z3ALk6aefjv6C0B5899134bHHHov7zy+44ILREMXvCSecEAd/WdkiCB5sGcM2VkOGDAkrrrhiPJwWYxOHsN91113aykpoCjCCv/XWW9F4ufnmm0fjFKsMOJQdowFnGrzyyitq/wRBEARBEIR2gYlO7NRwwQUXhFVWWSX2hb3BYL311gtXXnllnLwpCIIg5CGDwSgCjRhbxrAVw6yzzhqmmmqqMMccc8RzCxiw0xYgQkfAjAoOON5xxx3jezX22GOHlVZaKSpGbAuilStCCt6LO++8Mxx00EFh3nnnDb///e/DAgssEA2YL730kg6lFZoG2j86aDfffHPYeeed43vGKoPpppsuLL300mHgwIHh7bff1jsnCIIgCIIgNAz6uuiSJ554YjybzYwFZjBYZ5114la9rLwXBEEQ8pDBYBRg+PDh4brrrgvrr79+mGiiieLqghlnnDE2ZgzwPvLII+Hf//53m7QgNI4333wznHvuuWG77bYL88wzTzwXY8011ww3sVfjN9qrUfhfcLg6A7gYDDjseLzxxotbpZ1xxhnxMGRBaDYwUp1//vmxLaSeoj2krsJocPLJJ+uwbUEQBEEQBKFhYDB49913w+DBg8P888//C4MBuiYGg0svvTT2fwRBEIQ8ZDBoMX744Ydw4403htVXXz0OyHGw6BhjjBEP31ljjTXCxRdfrK0/hA6D8y845IktrhiEY5/wxRdfPA7CabsPIQcUZlalMHhL3TTmmGPGQVzONMCIyTZXgtBMYBjnoO0jjjgiLLLIImHccceN563w7i211FLR6KkzVwRBEARBEIRG8emnn0ZdknP80C9zBoOPPvqoTVoQBEFIIYNBC8HMbg573GyzzcIUU0wRB3FZXcB2MQsttFCcycugnQZHhI7ib3/7Wzj88MOjgsRe9L/5zW/CH//4x3gI8v33369VBsL/gH3lqYNWXXXVOHCLIXOmmWYKvXr1ioeCffDBBzr4WGg6WGXAirstttgizDbbbLETh8GALbGWW265eKaBjOiCIAiCIAhCI2BchUOPWTntVxhoSyJBEIR6kMGgheCwWfZrnnzyyePBOxgKJphggjDXXHOFAw44IO4TLggdBQanxx9/PG4ts+iii8Z3DOMUZxnsvffe0WDw7bfftkkLws949dVX4woUDjw2gwGH0WLgxGDAwK4gNBvUV//4xz+isWrttdeO5xjQNprRgIOQtTWRIAiCIAiC0Ag49Pj000+PZ2WlBgMdeiwIglANGQxaBLaAueuuu+KACAO40047beScc84Z+vbtG2f36oBHoVlggI0BuM033zwaCnjnGAhmyyuWXmqmuJCCfT6vvvrqaFSyMwzmnnvucPTRR0djpt4ZobPwz3/+Mzz77LPhyCOPjFsTTTPNNHGbPt5BVt/xDr7wwgtafScIgtCNQR0uiqLYmTTQb8FgwAqD9AwDJqZwth8Gg3RCVC5OURTF7syOQAaDFoBzC5jVffDBB8ftPtgaZvrppw+zzjpr2GijjcLtt9/eJikIzQEGqAsvvDBuJ7PggguGGWaYIb5rnJ/xxRdfdLjiEEY/sOUQh2JzZgGDthiZVlhhhXDJJZdoOyKh08F5Bhg6zznnnLDuuuuGGWecMa5wYUUehiu2WHv55ZejnCAIgtA9gO4AOYCUiVGiKIqdTeob9EW2tHzmmWdin7hfv36hT58+8dysE044IQwbNixOiGLVveooURRHZ1K/Uc+1ZwxQBoNOBsvcrr322rDllltG6/Ziiy0WZ0xyhsEyyywTzjrrrPD++++3SQtCc8D2MsyowEjAllcMvvG+HXLIIeGhhx4K33//fZukIPwMjAJsPcS2aRgzmX0z77zzxoHau+++O65MobERhM4E+82edNJJYfXVVw9rrLFG3FaN1QYcwM1qPA50FwRBELo2bFYbHdUff/wxfPfdd1H35D+rynoaWWkOc37NYivSEMWuztx3QL1jZCKn/fcyjbBV31or0mnVs4jdn616V1qRTquepSvQ6juMqO0xGshg0IlASWZlwU477RQHbZktOemkk8Z982afffY4k5e9mwWhmUAReuyxx8IxxxwTB93Y9ooBYA53Ypui1157LVaQgmCg4eCMlSFDhsTDZ2eeeeZ4zsqyyy4bzjzzzDgDh4OyaWQEoTMxfPjwuDUW7yErC2g3p5xyyjD++OPHFS+33HJLu2ZHCIIgCK0D9TSTDOisfvnll+HTTz+NW38wm9cP2PHf07uZTJFbEcvCe/cyOe+WY1lY714m591yLJIzd++Xk8255VgW1ruXybXHzbt7v0bkUtkyN++ek8v55eTK3FK/VM67p3KpbJlb6l5EL5cLV+aW88sxlcuFycVlbjn5HL1cLlyZW84vx1QuF8ZkUrnUrYhFYXMyObnUvYhVcj4uL5dzK2IduVx8ObfUr8rN0/xTuSp3uy5yy7FIzty9X04255ZjWVjvXibn3XIskjN371cml7qnzMmZm3cvk/NuOZaF9X7+fyrn3YpYFt67l8l1xM37+f9FckzUYKUVehgTNxiflsGgiwAlmZmQAwcODEsttVQ8vPG3v/1t+N3vfhdXGeDOwC2FJgjNBJ2zO++8M+y2227x3AJm5s4yyyxxm49LL700rmjRTHHBA0PAww8/HGdwM1DLGQYM0LKdFct2NatbaBWom95+++0wePDguLrg
N7/5TTyEm1V5yy+/fBgwYEB4/vnn1XYKgiB0YZjBgM7qJ598Et555524nzjbYmI0wN1WHdh/f+3dyvzK5FK31L9ILnVvr1xOJnU3P/+/SNa7lfmX+eVk68jlZFJ38/P/y+Ry7kYfNpVL3erK5Vg3rJcr8rPrlD5MKpe6ebmcu10XMSeXutm1p/ez/2VMw6Vu9t8zlfNuOeZkUje79sz5mVuOORnvZv89U7mcm13XdbNrz5yfd7P/Xsa7mbv/nzLnl7rZdZFbSpNJZXPu/r9nKlfmn8qVudl1kVyOObkyN7tO5VK/lDm5Mje7LpLz7mVyZW52ncoV+eVkq+S8v7mlct69Ss78/P9UxtzK/BqRs2vvb26pnHdvVC7HorAQfYsdb1jBj9GAccJGVxnIYNBJoIDYimjDDTeMBzgyW5cBD1YW7L777tFYIAidASqABx98MOy5557RWDXTTDON3ALr2GOPDX/729+0wkD4BXhnnnrqqdC/f/+w/vrrj9ySiIPZe/fuHR555JE2SUFoDTjkeN99943111hjjRWN7rPNNttIo8Hf//73NklBEAShq4EOKcvf6axiKHjzzTfDu+++G1cZMNsNd1Yu8mv//bV383513VJ3f23+qYz3q+vm3YuuvVtOrsjNu3u/9H/q7/3quqVxFF17t5xclVvql7obfViTq+uWxuHdUubCVrkV+dl1Sh/G5Oq4mXuRfMqcf5VbmZ+5eXp/k6njZu5F8jmm/rkw3q3Mz9xy9DI+TJGbuVfJm0zqZv+9W869yM+72X8v493M3f9PmfNL3ey6yM3T/D2L/LybyRTJpUxlvFyZm10XyeWYk/Nu5u7/NyqXytZxs+squUbczN3/ryPn3VJ3c/Pu/tr7m1tOro5b6pf+T/29X123NA5/nfPPydVx8+7eLWUurP1H32LSBltPs4ofgwGTOWQwGMXAgnP99deHrbbaKm6n8Ktf/SoOdjBrd/PNNw/nnXdeVJ4FoTOAFfGKK64IG2ywQRxsY3Yuq1sYBN5uu+3iu0nFoe1lBAMNxw033BDPvMCoyYHH1Fu8O6xSOf/88+N71UjjIggdASsIOIvlwAMPjAbP//u//4sr9GhTV1lllWj8xKggCIIgdD2gL5jBgI4qW7CyygBd4quvvoqdWTqyOeIPc37NZCvSyaVRlmYqXybrWVfOmKZRJ3xdOaOXLwub+hXJFTENWxS+SC4nn7p5+UbYSJhcGrnwORlza1S+Dk2+PWFSN3/t3Uw+J1PGjsjXTa8RuUbcU9aVy9Hfo/+fssyvimnY9H+RX+66jGlcqZ//30i8no2ETeUaSbNKtpH7KGIz4qjDVqVh6eTSS91S+Zy/vza3nHsRU/m64avkUj8vXxa2Si69ZlUn51Cyw4gMBl0IbJfAKfyLLLJIPLNg8sknjwNvM8wwQzjooIPifuAo0YLQGaAiYEY4A20LLLBAXN3CYNuvf/3reJYGA21vvfWWBn+FX+Dll1+O74ZtA8M7M8YYY8TzDPbaa6+4ZRF74QlCK3HVVVfFswx4HzF80o7OMcccccUUB7ujGAmCIAhdC6nBgFVhbDfHhBU6sNTdLI/nP+R/o2wkbCvSaUUasL1pwLrpNHI/KVuRBmxFOo2mUVc+ZSPhWpEGbE86rUgDtiKdVqQBW5FOK9KArUinFWnARsK1Nw1j3fAdSacVacBWpNOKNGDdsB1Jp27Y9qaBkYCJ6jIYdCEwY4YDG9daa60w8cQTx5UFf/jDH+LqAg6dZZsiGQuEzgQVwTPPPBO37WB2+FRTTRVnjDPYxn+2KtK2REIKGhNWEqy88spxJjcDtGOOOWZcpbL11lvHFQg0PILQSrB9H4as+eefPxqyJpxwwjDddNPFNnWXXXYJt912m95LQRCELoacwYDJKqzCptNKvW0dYE/8YJWbp/lXyRlzcnXCmkyVHCySqwpv/lVyxpxcnbAmUyUHc3J1wppMlZwxJ1s3fEfk6oaFjcil8abXRWxUzsv7/2WsKwdN1svn3HKsIwNz8Xk3756yyt8zF1/OLcc6MjAXn3fz7imr/FOm8nZdFUcdGU8va2GrwteR8Uzl7boqjrpyMJWrG9bLVcnCnFydsCZTJQdzcnXCmkyVnDEnWxXe/KvkjDm5OmFNpkrOmJOtCm/+VXLGnFydsCZTJWfMyebc2AZSBoMuBGbL3H777XEQY8455wwTTTRRHKRldcGf/vSncOaZZ8Y9PDWzW+hMYAh4/fXXw4UXXhi23377OLA2ySSTxC1mZpxxxtCvX7+4NEkQPDh35b777gs777xzfE9YkYLRgC1g9t9///Dcc8/J2CmMEqDgHHXUUXG1HqteMGhhNOBQbt5XVlSpXRUEQeg68AYDOqsYDOgDYTz49NNPY73O9kR1SIcX5vxglX8dNiOOOqybTkfupyulAdubBqybTiP3k7KRsO1NA9ZNp5H76QhblUZnp9PK/NKz1KeepXF2NI0699mMZ6kbR0fSaiRse9OAddNp5H5SNhK2vWnAXDroW+hdMhh0EbzyyitxJuTiiy8eVxcwSMuWRBwcyqDG008/rQE3odPB2QRUDDfeeGPYZ5994tZYzMplAJitPI477rjw3nvvtUkLws/A4Mlh2X379o2HZbM6CoMB9RjbW1G/0cAIQqvBe8dKgs022yyuksIQP9lkk0WjAauohgwZEgei9H4KgiB0DeQMBpxjQKeVVQZ0YtmeyMh1FXNyRXF495x/EXOyZeHNzzMnlzKVS8P6a/vvaXJlzMl6t9Tfrj3Nr4g5udStTCZ1L2JZHFX+Ob8cc3KpWy4ek8n55VgUh/+fypibp/dPmZNJ3XJxmIxnKlPGNEwuDnPz9P516MMVxWHunqlMGdMwuTjMzdP716EPl4vD3FJ6mSqmYXJxmJun988xlfPXqZ938/T+RfSyabj02rt5ev8cU7n0fxqHuXl6/yLmZMvi8H5FMilzcj58lV/qX8ScrL9O/e3a0/yKmJNL3dL/Kc2vjDk575bGZdee5lfEnFzqlovHZHJ+OabxoXfJYDCKQaa/++674aKLLoqHhjKIwQxIVhawHRGHz15yySWxwAShs/Hjjz/GFQaXX3552HXXXeOe9AyuMTN3yimnjLPF2a9eM3IFA+8CBgHqqSOPPDKsvvrq0VCAwQCj57bbbhvuvPPO2PEXhFEBDFrDhg0Lq666atyaiO2yMMwvtNBCYauttorvLrMqBEEQhFGPMoMB16x09cSIUMYiuaI4vHvOv4g52bLw5ueZk/PMyaVu/tr+e5pcGXOy3i31t2tP8ytiTi51K5NJ3YtYFkeVf84vx5xc6paLx2RyfimL5LxbTsbcPL1/ypxM6paLw2Q8U5kypmFycZibp/evQx+uKA5z90xlypiGycVhbp7evw59uFwc5pbSy1QxDZOLw9w8vX+OqZy/Tv28m6f3L6KXTcOl197N0/vnmMql/9M4zM3T+xcxJ1sWh/crkkmZk/Phq/xS/yLmZP116m/XnuZXxJxc6pb+T2l+RSyS826pjF17ml8Rc3KpWy4ek8n5pUzl+M9EYRkMRjE
oDM4mYJ9vDgj1B4YyWDt48OCoJGt1gdAKsCXRq6++Gs4777ywzTbbxL2/MRTwPjL4u99++8WOmwwGggcztDEyccbFYostFsYbb7xYj7FCiq2t2G6N0/YFYVSBeovVLmyZRTvLqilWwsw222xht912iwdzowgJgiAIoxY5g8Ebb7wR3nnnndhxZSUsxK8RWriy8KlMkVwZq8J6/5Q5+RzrhPEynjnZIlaF8/4pc/I5VoXx/ilz8mUsC+fjTZmTL2JZGB9nypx8GcvC+XhT5uSLWBbGx5kyJ1/GOmFtMCvnV4c+jaJ0UpkiuTJWhfX+ZXJGe25P8ysL793rpONJGjZgmPM3pvE2kobR4kjjSv2LruvSwhWF9f5lcmWsE9bLVMnmWCdsKlMkV8aqsN7fMydbxDphUxnPnHyOVWG8f8qcfI51wqUynjn5HKvCeP+UOfkyEoaJ7TIYjEIwOMsBs+wLz1ZEDLKxVzxbwMwwwwzRiMDM3B9++KEthCB0Ptiz7NZbbw377rtvNFrZbHHO1TjkkEO0JZHwP6DheP7550OfPn2ikWnsscceucJghx12iOcbcM6BIIwqcHjTddddF1ftcZA77yfvKQdzr7XWWuGEE06IB7prayJBEIRRiyKDwdtvvx11UDqxdGD5b7PfOpOtSKeVz5JzbyZHt/xqNJ1Gw7QnDdiqdBphLg0bqEoHvO07LiMyGApfeuml8Nhjj4VHH3009jeoC4gjF6ZZ7Eh+cW/pMxvx9/EW5ZnlDwN2PC+TsyAHwJMnaZhcHtsv/lXPYuVE/OQx+f3EE0/EvCf9XJjcvefuo05Z12UuzWbT0mhlWjm/ZlLP0hj1LD+Hs4ka9KNlMGgxyGgaALZCYG9lth9icA2jAQMY66+/fjj33HOjksy+8oLQKjATnP3oDz/88LDEEkvErTsYXGOrLGaQv/baaw1VFMLoD+ootqo6+eSTw7LLLhvGH3/8+M5Qn2244YbhhhtuiNvCCMKoAgoOSg+r9lgFg7EAw8H0008fD0BmS0C2BtTWRIIgCKMWOYMB22UyWEY9Tv/J0zrDqbuxyr8O68TRinRakQZsRTrNTKMsrjK/umx1Grm4itwbZVka3j/nV5c+DnZKoI/w7LPPhscffzwORD/33HPxu859z9DCM9D8wgsvhPPPPz/ssZUPk00AAP/0SURBVMceYaeddgqnn356ePHFF+MEN/x9WkXx5Pzq0uKoSgPaIBsGTiaFPvDAA+Guu+4Kd999d3jooYfixBTqMW8MSOOA5A35dOWVV8bzA5nEx1bBnGvJ5L0LL7ww5iV5Sx1JPpLHTz31VHRnoJ/zL7kH8oo07d6M/hkg98S+5ZTTGWecEfObrYgvuOCCeN82cJgLC/HnuQmPYYd7IBz3RfrIWNiie2iUZfE0Kw1YlU7OvVFWpdGKdGAz0qoTR7PSybkbm5VGWRzNSANaPEVxNSOd9qRhxkIZDFoMMpllXsx2ZBUBWySMM844cSY3AxhzzjlnbAyo6DUrV2gVeC9532jsUVB23HHHMNdcc408wJbDQpktznvJ6hhBMKCsUp9hUFpggQVifcY7w7vzl7/8JVx88cUaiBVGOVByMIbuvvvuYfbZZ4/vJ4YDDkPGiHDQQQeF+++/P3zzzTdtIQRBEIRWIzUYoJdiMGBwjNmvdGB7Ohmcgzm/ZrEVacBR9SydkW6r0knZijRg3XSQYQCab/fGG2+M4xpMzFhttdXi2AcTjFh9jD+yNjBlYY0MVDHozWSPjTfeOK4I7d+/fxyIpl9h4ZBloIv6gnMfMSZABsCZ5Y6fyTAGgx/hIfK4WzykiQxhLZ6UhCEti9cG1xgspw/NpDvOyGJVKxOnePZ11103bL755vG8N/KE898sDtImv0iXwXW2ciUOwhN2n332Cccee2w46aSTwqmnnhquueaa8OSTT8Y6kftkMh+TUOmHbbrppmGVVVYJiyyySDyra8sttwyXXnpplLH75J4tv43cB8+FoYLdL7hvyor0SAuZXFhz414oU+5zk002CWussUacAMt2oLfcckuMn7LwYUWxFeT9zL3zzWQr0oCj4llkMBgFYFAWyysNARU5g2vsEc+s3FlmmSU2DFTsVKqNFIggdATMEkdxYTY4A2ooGqx6sXM1WGGAosMsCQ2oCQbeG96JXXbZJb4zU0wxRdwfnneGwdill146DB06NNZngjCqwaytIUOGhKWWWipuAUjbO+6448ZtADmwe9CgQbEDKwiCIIwa5AwGnK/FL4NSDKhhOOC3M0kanZ1OK9KA7UmnPfKteJZmsVX324p0WvUsdckgEwPgzG5nRQCD2BgLGNBmjIOZ8HzTyBbdN+4MWrHCgPMeGYg+4ogj4ux63KgbbFCLuoEJbQxM40+/AyMDaTOhifQYMGcwnDPX0APhwIED4wx6ZuUzKEa6/CfMmWeeGQfojz/++Lht5YknnhjlifOyyy6LKwds5j0GTdJmRv7KK68c1l577XD00UfHgX1WFjBZ5frrr4/3j/7JVtR9+/YNjzzySEyTOFh1QX/q7LPPDnvvvXdYccUVo9wWW2wRn/2cc86JeUc8xOln7pNXbCOEIYLBftJn0h/ndC2zzDLRaMDzUB4YBcgzy2cj90CZkY+s6OD+eGbGBQhHGoTLlRdukHwgrzFeMJaw3HLLRcMB9859mkGkI7S0cn7NYivSgD6NzkpzVDxLZ7GVz9LZ6bTqWZpBGQxGAbBM04CwFREHyo455phxcI3BCxoGGicaQpRlQWgVqACoFFASmBHAgbVmLICsMEDhQ0HSAbaCgfcG5ZVZP5NNNlkcgLV3BmPokksuGU455ZSo2AvCqAbKDp0ZlnZjELV3lXZ45plnjiur6LDp7CBBEIRRgyKDAYNRGH0xGkD+l12nrPKHORl/bf45OS+Tcy9jGp9dp26pv113hGXxNjONonhTv46wKJ5mp1GVTkfTqptGkUxdVsWR+ufkc270J5lRziD2aaedFge9mbHOIPo999wzUgamcaTx8f0zLsIWNwz4M4vftigzIsNA/3nnnRd69eoVt5qce+65Y5rMkGdgnjgZaGd1w0orrRT7uZzVd8wxx4SHH344TmzifvFnwP1Pf/pTXNXQu3fvaOjYfvvtw6qrrhonR62wwgrROPDXv/413iNbD7FKlcF5DAYMtpMWA4IMlDPgxmoGBuP322+/uJvEvPPOG1cbkCYD9czO517RQ7k/BvshW73abP/ddtstDBgwIPa7yAeeiecnDcaX7rjjjnif9Mm4BwwM5Pnyyy8fn4nxJz/o7/MZ4oZhhVUd5DUrCzAWmLE2pZWVxWNxo0dzrt2f//zneC9nnXVW9KM+L4ojdTcW+Xu39sbhmZNJ3doTh2fO39xSehnPKn+Yk0nd7DqVS/1zfjDnn7rZdSqX+uf8yujD2f+cW06+IyyLN/XrCMviTa/by7J4Uj8ZDFoMMpiKFOsxW3WwxzeDFQyyTT311LGRo5LVft9Cq0HnDCWBWRksZWQwzYxZcKyxxoqrX2677bbw5ZdftoUSejp4b9j7nfMubLWUvTOsMG
CrF2bkoAwLQlcAnTaWjNMR5AwDq+c42J2OHrPOGJyS0V4QBKH18AYDBvEYrGI2rRkNuLZfT+/G/9y1d6tiTj51S/+nNL8i5uRSt6Lr1L2MOVnvlvrbtaf3s/9VzIW1a/vvWRSuil4+DWvXnt7P/lexLKz5eaZ+dl3klmMqZ9eeObkityJ6OQvnmfrZdZEbg03MWGcGPv1KVqgzGY0Bcrbbob9ZFD4Xn9EGslIZ3PhlcJvVCOz1zwx90iV9jA30QzAqcCYAqx0YzGbmvq0SYHyGQXjOkDz00EPjZKc777wzGhOGDRsWDjjggBiGPjLGAgbtCcvWQgzMo1MyuM/e/8hjCDBjBveHAYV67IorrogT8GadddYYhnEf/HgeVhncfPPN8R7pczPYznZOxIdBwc4GIG7LQ+6bX56N58FoYTsCsF0R/XaMHhghmPVP/pAWRgbCFdHy1H7TPE/dkINcU8aspiCv1lxzzbi6gbxg4DEN6+Mwd/vv5Yqui2RSP++Wo5fJyadu/tr+e7ccU/+cvHdL/e3au+XoZXLyqZu/tv/erQ5z8t4t9bdrT/MrYk4udUv/pzS/MuZk/XXqb9eeObkqpvLp/5Spn11XsSysXZubDAYtBlt3kOEMVGyzzTbxoGObwc3M3PXWWy8ui8MCq4OOhVaDzhmdMRQklKwJJ5zwF4O/bDPDDAWWYvKOCgLgPAuWmrK9GqtQ0ndmuummC3vttVd8twShK4D2lWXRtMVsQ8SWRLY9EecZ0IGjQ0dHSxAEQWgtcgYDBuUgugQDZaIodl0y2ERfkQF4Zs2zqwLjHAwi33rrrfFbtkGpXPj20NJ86aWX4pZCbOvDKgFm5jOxiYF/jAe4MfBP38UOX4YWD3UM98eqBrYZYosfzh9goierA4ibcAzUM5iGLDojz8jkqW233TaudGA1qw2gU4dhFCBOZtuzUoEzLFl5wdlZGDOIj0F1Bv6R2W677aJhgS2R2NKIezMDhOUbv4wtcT9sm4RRgNUL7FiBgYZnJ98xImCgmGeeeeKZAhgeyCvCW1wdpT0n/ynjww47bKTBgEOUyQuMFGk4URS7Jvmm+W5lMGghfvzxx7gUjaVvWKDZM5lBWQbWZppppmiRxnqs/eGFUQEGfpmxwKwGFAvO1PCDvwwGozAxo4Jlj4IA6NRjREJB9e8L5NyL+eabLx5QhqIsCF0FtMd0HpltRQdqmmmmie8rK6m4Pvjgg2OHSlsTCYIgtBY5gwGDgNBWGtignhkRitz8dZFfXTe7Tt1S//R/zr/Iza5zbmXydl3Hzf7n3Mrkza+um13n3Mrk7dq7pe45f5Px/+06lbdr75bzy8l4t9Q/5+fdUrkcvUwqn/73196tzC/9b9d15FKZ1M+7M7jN1jas2jSDAec3MoOe79gGpXy4ori8u/n5//7aBrnYz58tfNiKhy19MBKwlRD3wvkGbLnDgD/34ONgUJ8Bb2brEw5Dwb777hv38mdFAQP6zNonfrYiIgyD/jwb57axspqJUsyyJ07SwBhg2w4xiM9kKrZDIg3yiIF05KAZDDA8cHgw/XLS4blsNYEN5kHykhUI3CuGAgwlbC/MSgmMIvwSHwcnkwecKYAR5N57741lRF5Z3qVM89a7pe52T/ynjFmlwapd0mNbKvxYyVEUX86tzC8n691Sf7v2NL/UP3XLyfnr1M/T/HIy3s3LpG45+ZTml/qnbjk5f53zs//eLXX3bvbfy+Sui/zqutl16pb65/xy/t7d/0/lctdFfnXd7Dp1S/3T/57mnvMr8vfXOT/qB8ZwZDBoAVB8qSipsFkGN8EEE4wcUJt44omjpZnlWjQEgjAqQAWA4sAsABQKtiSyWbe8q6yG4aAmViCg0AgCwNDE4VYYQanX/DvDVi+8R+zHSaMjCF0FrDJgdhWH4zELi/OEeG9ZbcAsLJZzX3zxxVGmEcVIEARB6BiKDAZsN2JGAzMg2LUxdfNyZfTyubDmlvql//11yip/aDJezl+nfjnWkYFeLg2TXqes8jemcv469cu55WRy9HJpmPQ6dcv555jKpf/TOMrky5iLq8rd/8/JpCyTy/mlbnaduvn/fLfsgU+/kZnyTDpj1jn7/jMonwufuy6Tybnxn8Fr6gsG/tk+aP7554+TNBnQx1jAID1yfnCMMGz3wyoEBrkZ1GcyFKsAmPjEDH7uf6uttoqrUxn4xxhiBxDzy2oADCMMlLNCgPMDGN+hb82ZAhgrGNRHhjEh29KIe8UowIHMhCENVitwXsIGG2wQn4FBfu6dA5cZ7GcSKr+cuUB8nInAhD8MA0zuIy+YGcwvBg5bKcEzYdRgiyVWUNj2Rj4Py4ic0btZPnJNGbOVEqs5yEvyFD/GGSxMGYvS8NepXOpXh7mwubhyco3Sh8/FZW7ez/+vw1Q+F9ZkUrmcbBGrwppbmVx6nbLKH5qMl0v/++uUVf6eZfFWxZHKF7Es3lwcZfJlTMPYtf9v19TjMhi0CMzIZgkbjQ6rCZi9zcAaxgIGYY866qg4mxHlWBBGBX766adYKaD8YDBg0IwDodjXm8FfZt6yl+LVV1+trTqEkfjuu+/iDBa2sUIJZYktZxnYOzPXXHNFJZlZJYLQlcBZQXTQWN1Hx5B3lpVUtM+sjGHpObPGeMcFQRCE1iA1GDDhigFGBrb4ZXCNXyPXxtTNy5XRy+fCmlvql/731ymr/KHJeDl/nfrlWEcGerk0THqdssrfmMr569Qv55aTydHLpWHS69Qt559jKpf+t2v7791S+TKm4Yrc7TrnZtdFLJPL+aVudp26+f8MQjNbnnPM2OqRwWpbYYC/DUr5MD4O80/j9dfeDdogF4YA0mdshcN32arHDkBGr7O4PQmHwYCJJOh/nCu5ww47hJ122imS/xgBOA+AwXYGwTFIMHDPYDh9aO6BwXLCL7fccnEVAcaDddZZJ6ZPHCeeeGKUIS1LG8MBg/oYBBjwJ20MEqwyIE22ssaNlQ4YYBjoZ5Us8WAwYLWsye6yyy5xRj95zwA96WBkoC+GH8YIJnKxPRRnL+BPXnEfPl+L6PPbu1keQsoY4wrPjnGFfMef2co+riIWpeGvU7nUrw5zYXNx5eQapQ+fi8vcvJ//X4epfC6syaRyOdkiVoU1tzK59DpllT80GS+X/vfXKav8PcvirYojlS9iWby5OMrky5iGsWv/367RvzAYcH6pDAadCGbgUkmypGyBBRaIB4Eyi5HBCQYmWF3AcjGtLhBGJeic8Z7aHogYDDgQFKOWDaQxg+Gmm24Kw4cPbwsl9HSwtQuKKUox7wwGUTMYsIKKWSwoxyitgtCVgJEURYjOLDO+aI9pn3l3f//738dVM9SHKE0oSYIgCELno8hgwKAae3WnxJAA0/8p6/jVDVslW+TuWSaT/m+EFq5u2Pak49PwzMnCKn+YkzE3T+/fKHNx+Lhz/o3SBltyfo2mUyZXN52cf8oi2bI4zC8nwzWD9mz7g47FLHn2s2ewnRn3HMjL6oM0njSOlN4/JasGMBAwEYStf5itzyx3ZrgzaG0HLiNHvuXih8TF2QucQ
cCgO8/gyQA7cSBnYYiPwTWMB2xVxGA5k+zYAolZ9gycs5c/Ky0Y2GfgHuMJA/mkg8GB+AhPvpAu6dh/nzYy1IWky6+Fwc/C2P3Z+2HPgz/xIcM17iZnz+5pz1fknvoT71133RX7hRhHmBTLlk4YDMgfjCppfGkcOb8q//Q6lc+5pX51ZPx/75aTS1nmB1P/Ivkid+9XJeP/l8lV+dUJW1euyK9Kxv+36yL5HC1cnTB15HIy5ubp/VNWyRT5m7tnKlOXFh5jogwGLcCHH34Y95ZjdcHUU08dByNsyw7OMcDKyyn8X3/9dVsIQRg14CBQtpdhFgjv5njjjRcNBbyrzBZnhgTLHD///PO2EEJPBw3HNddcE/cInXnmmeOSX1ZP8c5gOGD5L7Nf6Dg00sAIQivAailW/zF7a7bZZotGA95d3mFWWDETi86f6jxBEITWIGcwoOPK4BZkkCxH71cl216m8bYqjc5Op6Np1A3fkXS6UhodZSvSGZXPwoATk9AYQKZfyb76TDqzrXyYjMHseAatbYDKh09Z9CwWFj8OBmbGPLs27LzzzjE9tuhZeOGF46A1qwI4nJiZ+RY2jc/SwM8GzIpo6UK7Dwbk77nnnnjeAbolqwZY0cDWQaxGuPLKK+P5AuQJqwQwAhDe4sulm7rbtd1zURi7N5NJ5czNnttkGyFhLB4MF5QpWyphGMFQQ9+Q1Q8Yb0iTd6LRdOrcWyrTaBqwKp1cGmXy7aWPtxVp5K6bRR9vK9LIXTeLzUqjbtiOpAGbkQ7fNeM3Mhh0MpjBSGZjaWb/PPb4Zl9vDAb8x+o8aNCgWBjICsKoBINiKDMsm2RrGQbPMBQwgIbhAHcUHPYxEwRA48GAKgZRtiRiC6tf//rXvzAYsHxWKwyErgjeXzoxGLVYPs7KAt5dMxrQ0RwwYECUYbWgIAiC0LkoMhgwa7aK1vnN+TWTrUinLI1RnX4z6dPprDTTeFuRTivSyF03ix1Nw+SZ0c6gObP7GVCGDKozWG5yJttoGtCnw+x2ts694IILwtlnnx23TOWXQXomN5Eug/ppHJ7+ftrL3OCbxWs0t1z4ZtHSyvk1i5YGpEyZBEu/kB0J7rjjjlj2lucduRdLI+c3OrIVzzu6pAHL0mnFPbQiDejT6Kw0LU7qKFY/yWDQyfjhhx/i0jj2lJt88smjsYCBCH4ZkO3Vq1esUDV7URjV4OPncCSULJZNTjHFFCMHfiGHHrNPPYc0aT96wcCWRBiZmDXEO8N7Yu8MRiYO4cJgSsPTSAMjCK0CW6wx84t2mrMMeIdtlcw000wTttxyy7hKkNWCeocFQRA6FzmDgc1gZUay2HlkRrhtV9JZbEUaYtch3206EFXnW270PbF0bBDN6Afs07Rb9S62Kp2uQF/e/DfmZDvCVuTp6JKG2Dkcnd8P0uS75UwSGQw6EawYYFbtRRddFJfFeYMBe3svueSScUkeqwv+3//7f22hBGHUgUNAWULJUkK2l2G2rQ0As9Jg+eWXD2eccUZ4++2320IIPR00HMwaYk/OP/7xj3HllA228u6wzcv+++8fZwdqsFXoimBgihkUHGBnW2txzhArATnTgPMNWGXAUmpkBUEQhM5DzmDAgJ91YpmtmqP3s052zi9Hk/csk/PXRX4pzT/HnFzuOvXL0WRyzMn5/8zStj3OzS9H888xJ+ev+a2TBjSZHHNy6X+75pc0zc3T5HPMyfnr1M/yz9xTmpxnkUz6367tfxktnGfO37vV8fM0Oc8imbrunpaXJuvp5XJuKYv8m51GkUyz0qnyhybjmZMrY1U4H7dnkWyRe5Gf0WQ8zc/y1MuZn2eZHzR/T/MrS8Nfp345mown7j4NL5f+z12nNH/PMrncdeqXo8l4lsml/3PXKc0/x5ycvy7yy9FkUubk/H9oZZeT9zT5HHNy/tqnkcqnNJmUOTn/367tP+mxWoj/nD0ig0EnAkWXZVgHH3xwPPRlkkkmiQYDZm0zINGrV6+4LE/buwhdCQ899FA8jIkttJgxblt0MPiLwUArDAQPGg068X379o2HerHvu21jxTsz++yzhwMPPDAaDAShK4J3mDOEqPs4FG+JJZaI27GZgX+yySaLqwxuvvnm8Nlnn7WFEgRBEDoD3mDA2VoYDJgdTIfWOrI5ej/rZOf8cjR5zzI5f13kl9L8c8zJ5a5TvxxNJsecnL8u8ktp/jnm5HLXqV+OJpNjTi79n7tOaf455uT8dZFfEU3Os0gm/W/X9r+MFs4z5+/d6vh5mpxnkUxd9xxN1jPn791S1vFPmfP3bikbkfHM+Xs3zyp/aDKeObkyVoXzcXsWyRa5F/kZTcazTK5RP2j+nmVyuevUL0eT8SyT89dFfinN37NMLned+uVoMp5Fcv5/2XVK888xJ+evi/xyNJmUOTn/P722/zmafI45OX9d5JejyaTMyfn/du3/c4A8/2Uw6GRgCLjqqqvCxhtvPPIwRQYgmLnIjEUO5WGgjW2LBKGrgA7ZvvvuG41cGLYmnnji+N4y05Ytic4880wZDISRoNGwPeA544K6jneFgVZWUs0666zxfaLBEYSuCt7jt956K64IZIUV77EZ+dlaizb7+OOPj7NcdZaBIAhC5yFnMGB7CzqwjzzySCHxr5JJ2WiY9qQBGwmXytYNW1fO6OXrhq0rV8RWpJMLW9eto2xFOmVp5Pzay1xczU4jR59Gq9LJuXu39tKnkcaZc+sIW5EGbEU6rUgDNpJOkXsVc+HK4irzK2KjYdqTBmxFOh1No274unLGVqTh2UjYZqdj1xgOXnjhhWgwYBcSGQyajO+++y5m8p577hln3DLowPYGDKJNOeWU0YjAoTxaXSB0NbD86IADDgjLLrtsmGmmmcKEE04Yt5jhMFv2qT/vvPPCu+++2yYt9HTQaLC/3eDBg6PBYJZZZomGAlZS8c5gMNhnn33CG2+80RZCELomWGXAqsCDDjooLLLIInF7Ldps2m/ONthmm23imUPMshAEQRA6B2UGg4cffjiuBoP8z117lvl55uQ6EjbHnFzdsLCubE6uI2FzzMl1JGwRc7J1w3dEriNhi5iTa3Y6ObmOhM2xSK5O2EaYS6co7fbS4kvjbWYasCiNZqcDfbytSMNfNzstH6dPw9yawVx8qVtOplHm4vBuOX9jmZ9nTi51y8mUueeYk60bviNyHQmbY5FcR8OnzMl1JGyOObm6YWEqy3+MBkwKlcGgk8Ae7xweazMVGUBj4IFBtHnmmSdu0cHqAkHoaqBTxvs533zzhXHGGWfkgNnUU08dNttss3j4J3vKCoKBzjwrDDAyWV3Hu8Mg68ILLxwOP/zweJ6LIHRlfP/997FdPvbYY8MKK6wQtyJiWy2M/dSBCy64YDSMYTDVuUOCIAidg5zBgMP36Lw++OCDPY4PPPBAZM6vWWxFGqIoiqLYmVR72RzyfBgOMBi8//77Mhh0BjjImIGFDTbYIMw999wj94FnexdmabMPPNsfCEJXA1sSscKA7YfYf37SSSeNB3Yz43a33XYL11xzTfjwww/bpIWeDhoNtmkZ
LlP5A9heNb+8lHG4cD0/gS45jjWGHCoSAveauza3f8lKc8pWyxpN577LFH+Z9039h6j40ZV93nbE561xTtEG0bbVq3cdvevbA41hfaUr05zpWnPcxGc43uiak86vM2LI7rsCm9iHN9HBUchOkwSCQSiUQikZgN3JV3DxNXrnKVqwyXvOQly2pbz9TeoxOJRCLRx5IdBsgFpMVyS7wot+de4hEWCARb5Rjsw2HghTqInjqvhUhbbkjkTRC/yHZbuiDJEWZmOCLJoo6RTjv5RfiZmW1WrTojQqRDqCBjEShm1yIJzR5FgCgTYSFfTgD7V9NBnCNbEDnER2oRLwihmG2qDv4AkUpm8CJgkE5IRCSkmeLqIC1yW95IYOQq4hAJ4/roIQPpIXr82boeRDDiEQmt/vKMD1Ail6xecO11WyxEgiBxzDmEGELumRWgPLODtXWtd25I9Lf24MBBZmlPdsGZhWSy0sBMaiS8a0Hs1vWOPJyb5ezhBSHGMWJ2LtJPu+oD7YAMNjudY0Z6ZLg8OZKCvFWmetBHCHL6sBnOAH3mfkEuBznKTsxk3m677Ypd0JUGkShPwkaRfx62ENyINKQkstB1sktlckSY0axubJHTAyHrOpDU9BCfCEtjCBIs2nMxws7kgUx03R4COSa0ofbWN66zd1+vhoSd+iWIaW3k3rdKwL2jnbWr+999p7+NF/qKA0D/xGoPba9fbe+CtNU37M5sc7qcCe4R7azNjZPuVeOmPjd2Wb2gj/Wnsumrj/EkxgZOOfe7MT6IWG1pjOFI4FhgE+yNrUtnjGFDxjCks/43hkXbRxs4Rl6rm2twv6i36wwitm7DpYoyXYP6ayfte9nLXna43OUuV67d+Oo+WKottlJf7zwS+vpK2yPx3VPa2P+G+0x/cQZyBGlDbay9akHsI/DZE6cSp4h08vJfQNgXm+O40V/uZ2Ubg8JhpU+R5MYezgX3rP8/7Wi8YDPqxk6Nf+rtv4t9KIP9+WWz4TBVLw4zNuaaOI38/+gDTgH/Pa6PTcmbnj4ydjlmo8YPxD3ngros9n+mlravtKN8a4eBe8q9Js79WKdfaVE341g6DBKJRCKRSCTmB+7KylfvWCYpXexiFyvv2LnCIJFIJKaxZIdBOA1InPckdHq6U2G1eGn2oo4cRfqYPYykRiKYISsceRD6vfx6YVNCT55myxIkLbI1tuNBSiJqzfBVNy/1ka4uQ93MHkbGIzkQI8ge4g/MOcKFDtJFmiAJ5IOA5RBApCD9kTgIXcQXYsf2L0hZZD0yPZwX/gQRQMqNMtvynSNFpIlrlUaYGaFRN9doFmhcn2MOBnF0iHK0kXzUP3SjHeI8pBfuGFGj/shJjgrkua2U/NEjRZUdJFGk7eVV59nG1WE9afV6aUKHhH3U7aF9/Zp1He2nT6LedR6uhw1pQ+QpMtfKDaSa2d5IYkQZco3zgWNBvtJy0FiZQge55sOz7EqZ8tOOdfn1NZCoj35DKofjKNKxKd/vCKeF2cPshV2zEzakPNeuDbSFesmTjvzC7vzKVznix+6ZMQm90NVmrss1s5GrX/3qhchEkiPYENZt2jgfk7qMVnp6ddhUuGvVNtqUbSBPOVT8ImORsJwc2hTBGn0lL33kF6munbUjcYyk1QbSxDijrcUpK/KSftZY5Ff/KAfxHNcRNhLjA506bfQtRwF7NI712iCuRX3Yinw4UdiDsLCHnkg7JlN68kTwskP3DzvxAW+2HE4ROq0ttlKX0eq1YbVeT+q0tYjTDvpT3+nDaONoZ2H6UF9GmlbE0WdXbEzfIOU5r/2HxSqF+j+TOK/Hf3n41XbRR+45ev6XxKln9HekZ1/GJ+lcS9QznAphf0Q+8lbn9r8nbDLG0HBYjF03ERfSC6/DxoSeurIP/63+czn3vFyyk94zUJ12nnJqvV6aCNPmzo1ltrFLh0EikUgkEonEfMBdedcygc8Kg0tc4hLlmQ5v4v07v2GQSCQSfSzJYeAl1ot7K8JDevGtTOnX4X69NCMV7L9slqKZqWYtmy1pxiVyBBkRL9xtPnV+bVwb5kVdWYgKMy6RBvZUNqs2Zn+b3cthgOAIci1e8Hv5TUkQBFGfug5IOGSi1QK2mfEHRxwjj80GRqyb2altkLltPlFGlNOe1/qRptXv6dTSxse1x3ErtW6tg5CRH9IIQaPdt9tuu9K/SBy6UV6bT+RRy1j8VJqQOn5KX3jdFj0ZSy8syEoEmdmztiIxE52jyKxb7YC0N5vXzGOz+5FuiEMkIGeCWefs0qxdM//ZhBnl9ILQq+tR18VxxLM5bW/ljPI4quRpGxezbG2fxWmHvEP+Bpnc5t/m2+pEfBzXdWnDWol8HCMlbc3DgWbWuodCJGWUt9gyQqeWnl5IT6dN6zfqrn49qesc6UIibS217qz4MZ2e1OWGTKVv695LH9Lm06aZymMqrifqZfUAx4RxnJ2wYatSxCNi2/wWWsaU/jxxdXyc1+0TEm1Memnq89Cvpb5XI00vXaSN41a/1ou4Nn0d14uPOtU6ER/hs+oQecZxK61uLy7OowxlcmL4j+Wk5pw3rvr1fy+eM7LNp82vjqvDWqnT9vQ9WwjzUpsOg0QikUgkEon5gMPynTgcgpXQVhfbUtMznfcA3zdIJBKJRB9LdhiYbRripXalJMrw8uwXIY7IR5QiVBGbCHUzEmPWZLxozytRBokwpGOQ9bYiQJ7as50gYxH4Zmua+Rj1myq3zX8eiTSRP7ICuRwz0p1HmUE21OnqvJZT6nqNlbPU8rU/BxBC2GxkBHWrM1X+vBJ5TOW11DLIPGUg68Mpop/1MQknStiYfqdjNrctutgihwLnEXHMocR22bB8lSHtrGvR7mYx2/qIo4JzLvI1k59jKj4Uqw5hf7VMXedySrQZx4nrZCfC4zqXow6rdR2rUc5yy2rUe6llxD3Dro3VHAfum7gnSJSxktey2mWsVjm9+OWQuozVKCfO2Qr74BT1fGFFlRVdVmj4LxIf/7PzyqzrmBWnTP8D6TBIJBKJRCKRmB+emaxk9R7r2162rzUBzpacViD/v//3/9ZoJhKJRKLGoh0GPl7pBRbhEgTnSslYGe3Ldbxwt3rzyth1CEc0+aPhqPBbi7h522DsWpYq7bWvRBmtzHMtS73euC62Rnr9u9QyyKw8VqMMEvFhz208iTaJc2nYIGdK2CQ7DVvt2eesepDIN/KLvImyZtm9uHnKWYrUZWgzNuK31VuqrPa1pCy/9Oxk7B5bKVmN/l0NO1qNMkhdzkr11dS1KJOdhM3U/0ELvf5Z+lP1EK5cq0N8syYdBolEIpFIJBLzwTOTd1kOgxvc4AbDRS960fJ9L6vTPeP9/e9/X6OZSCQSiRpLdhgEURm/tQibR3r6U3nUcfGSHdLqhvTy64W1cREf+fde5Otye3n2wtq4sfie9PSn8oi4KZ2e9PRn5TErvpWe/jx5hM6YXhvX05tKTyJ+Sq8O7+lNpQ0JnTG9Om5Kb5Z9TqVt4+I40td5tfnWEvnUefXi2+Ne/JiEzpheG9fTjbA2vBc/S6c9b
uN7cT2dqfj2fFZYLXX8lE593OrWYXV4LfPGt+ezwmqp48d0apnSncpjKt2Y9PSn8oi4sfgx6elP5RNxY/E96elP5RFxY/E9GdOfyiPSTOm00tOfyiPiSIxztUylmRXWyjw6hI6yrSrzUXAOg7/+9a/pMEgkEolEIpGYgd/85jdly1/bTPqG2eUvf/nyrUDfwPzBD36Qz1KJRCIxgiU5DHhkvcj2Zt2npKy21LPqV0pWowxSl7FSZbb5rlYZK1FOLatRBlmtclJSUtZvMc5wHNiSLh0GiURiXUS8S9r6w9iVkpKSspJirDHu4K+sznzXu941PPKRjxyucY1rDNe+9rWHLbbYYjjssMOGn/zkJ0U3x6eUlJR1XYxhtRjXQoyHi3lvXLLDwIusbwbEby1BqM2SqTS9uDqsjRuTKf1eeB02FhfnrYzFR3gdV4fVcfXxmPR0IqwOr8PG4trjKQm9Vr8Nj7j6uHc+S3ppa2njenqhMyatXp22F97TG9MZC4/zWnq6dfxUWC/tlG59XsdHWBseYXV4HVaH13HtcavTC2v1e2F1eH1ex0+F9dK2YXVcfdzqTKVr43phdXh9XsdPhfXStmF1XH3c6kyFTcX1wmqp40OnF9bT74XV4bX04nthU3G9sJ7UOnWaCGv16vBe2Dwylq4Or+Pa8DpuTKb0e+F12FhcnLfSi4+wNm5WeJzPK1N5jcW1enE+JqHXSzsV1+pxHPhugplw6TBIJBLrAoxPXli9P/7jH/9YsEg3lbaNj/OpND2Zpd/mGedT6dq4WfqthH4tPb1a5tVrZVYZbfiU7phEmlZ6umRW/CyZpwyyUmXUYWM688pY2jbfpZYxVU59vNhyFpJ2sWWQupz49cz0xz/+sXBYv/jFL8qHjq06+NOf/jT87W9/+6f080hdxjyyUP2QhaZZbBkLSbdQ/ZCFphnTnyp/Ko60cYvNayy8J/OU0YufSteTWfptfJxPpWllsfoLSUMWo99KT6+WefVaWUgZZKllhPT0QubV8d64GMfBkh0GXmR9ZNQHAkN8CJW054sJq8OnwuJ8MWFt3FhYHNfS6vUkdGq9qbA4r2UqjkR8qzcWFse1tHo9CZ1abyoszmuZiiMRX+tNhdVpQ6biQkKn1psKi/NaeuG1fhzXelNhcV5LL26esDjvhcX5POG9sJ608XWaqbipsJ7UOnWaOryN64XV4XVce1yf19KLmwqrw+uwNq6WXtyYfoTXcXVYG1dLL24qrA6fCovzxYS1cWNhcVyHteFTYXX4VFicLyasjWvD2vixsAjvhbX6U2FxPiY9vQirw+uwCK+PW5mKCwmdWm8qLM5nhdcSOrXeVFic1zIVFxI6td5UWJy34YQDwTaQ6TBIJBLrCrw7elH985//PPzhD38ohJ33SATdPGJfcTIrbkpvHpmnjN75lLR6U2nr8NDrhcX5lIzpjeXZC4vzOrw97+lFXH08pTslU+nauDhv9XthrYzFt2njvKffC6/Px9L14uJ8TL+WXto6vg5v9dqweWUqXR0eelP6UzKWps0vzlv93jlpSbQI7+m3aVudhUgvfRsW563eQqSXdiXKaPMYO6/DFiqz0i6mjFZ3VvqIn6U3S8bStvkutYxe+l4ZPb15ZCptG7fYMshCyhgLi/MxmdKr4+K4F1anGZMxvTaPOB/Tn5KxdBHmGYws9t1xSQ4DH+HzUnviiScOJ5100jnCgUDq8zo+wqakpzsVFudjYT0JvVZ/LCyO67BWryc9vamwOG/12rhaxvTGwuK41qnDxiR0a/2psDgfC+tJ6NX6U2G9tHXYmIRurT8VFudjYT0JvVp3KizOx8KmpNWP815YnI+FTUnot9LG99LUYVMS+j2pdeo0EdYLH5PQr9O0x/X5WNiUhH6dpg5r42qdNmxKennVYW1cHV+HTUkvn6mw9ngeCf023VhYGx/nsyT0e3n0wtr4CJuSMf02bFb8LOnpT4XF+VhYT0Kv1q3DIrw+ntIbk57eVFicj+n1pKc3FRbnY2FjErq1/lRYnLd6xLOWSRrf//7302GQSCTWehibjFHGKmPWz3/+8+HMM8/8J6eB31bo18chtc48MpWujYvzMf0pGUvXhsd5T7eOr8/H4mrppZtXv9Vrz8fC47in25OF6NV519LqttLTr49bvfa81RuTnn4vfR0Wx3XYlPT06+NWrz1v9cakpz8rbZ2mp9uG9XTb85700k1Jqz+VdiqullYn0tXh7XktY+FTEvnV+dbHrdS6Yzqkje+lac9rqfXn0auPa932vJZWvz0fk57eWLoxvZ5uLT29qXRjumP6tfTS1vF1eE+v1R9LX0ub11hcHNdhrYzFLSRNnC9Efyyulp7emC7p6bfHrbRxcT6m38pi9MbKcM5R8Lvf/a48h3n+4ii18nMh749zOwxiSamCFWiWm5dXM99OOOGE4jQIx0Ec987p1vpjEi/JvbA2vJWllFGHL2c5vbDlLmOp5fTCW1lKGfNKL6+6jDZusdLLZyXKaPNa7jLGZKycXrnz9n8rY2UstyyknIVcSy+/5b6WXr17YUuVXn4rUUabZy9sKTKW33KWQcbKmKec89K4vJRrqcto42qZtwzSy2eeMshyXUsd3spKXUtPN8oinAgmaaTDIJFIrAswNnlv/M///M+y/Yc9xH/2s5+VF1dhXmZj1tssod8La6XVWYj08mjDejoLlV76lSijzaMNW4kyIrw+Xoly2rDVKqPVWaj08mjDVquMVmeh0sujDuvFL1R6ebRhPZ2FSi+POqwXv1CJPOp8eudxvFiJPMfybeOWIlP5LmcZvXJqqfUXK3U+bf513FKkl1ddRi9+oTKWR53/mM5CpJdHHdaLX6j08mjDejoLlV76Ot/lKqPNow3r6SxUeuljKzbPYZ6/rDpYMYeBFQZeTs0I+f3vfz/89Kc/HU477bQy6+3Tn/70cPzxxxc57rjjhk996lNLFvksV14pKSkpKeuGrNbYvxrlnJeuJWV1RX96pvJ8xWFgcoYtiZBunsE88KXDIJFIrK2oHQZWF3B2/vjHPy77hnuBtXd4/I6J+FanDYvzVm+h8b3w+nhML2SWTi+uDYvzVm+h8YsJa+Pm0VlMWBs3K34xYW3cPDqLCWvjFhLf0++FtTKlE3F1fE+/F1bLVHzE1fE9/V5YK1M6EVfHT4XFeStT8RFXx0+FxXlPxnQivI0fC4vjnrT6vbg6fioszluZio+4VmcsLI570ur34ur4ecNqmSeu1umFteERVstUHOnFt2Fx3uq18b24kJ5OHRbHdVgr88ZPhcV5q9eLn6UzFhbHdVgr88YvJqyNG9NZSPiYbh03K74N82s7yF/96ldlwsZZZ521sg4DEqsMeCsUyGnw7W9/+5xvGXip9ZJ77LHHpqSkpKSkpKSkLEI4C6w6iI8df/e73y2km9khnsEWs6Q0kUgkVgs9hwGHp5lunJ5eZr3IpqSkpKSsXWKMJr245ZTVKGe1riUlZW0U741WF6yqwyCcBlYa8Foo2J6UHgZ/8pOflNkjHghTUlJSUlJSUlIWLp6lTMr4xS9+
UZ6xYhsPD3qewaz6JPF8lkgkEmsTjEu1w8C374xtMZ55kUXiOCZB6rTns8JqqeNbqeN7+lNhvbietHq9NL28IqwO7+nVcfVxqzcrbcTNo1dLL74OG4ubR7eWXnwd1oubR6+WXnwdNhY3S6+Vnk4dVsfVx1N6rfTi67CxuFl6rX5PZ1b4mF4dNyt+VtyUXsiUfns+Fj6mF7JQ/ZBab540C9UnoVdLTy+k1enp9/KJsFrq+Fbq+J7+VFgtdXwrrU5PfyyslVYnpI3v6U+F1VLHt1Lr9PR7YXV4L66VVq+XbiqsDR+TWreXrhdWh9dxPb0Ib/Va3V5YHVcfT+m10ouvw8biZunVMqZTh+HqvUuumsMg4FhBlsN7GKy/Ms+REB9dSElJSUlJSUlJmU88Q4XEc5VnLM9a8YAXz2P1c1kikUisTTA+tQ6DH/7wh+XF1bZEXlx9j6UV4SG9+JB5dMakLmMqj1nxs6RXRi/PXti8Emnr9L38emELkUhf59HLr9VZjLRltPktVxlTeaxGGSR0llLWrPR1GVN6UzJv2lZv3nQh8+rXevOmCZlXv9WbN13IvPqz9KbiZ6UNmUdvSmeeMsisctr49ngqbci8eiGt/rxpF1JOr4x50s6r15NIOyuPWfGzZJ4yyFLKqcuIPOrjVq8OW4hE+jqPXp69sHkl0tbpe/n1whYq8+TheWtVHAY9tIX0ChWWkpKSkpKSkpIyvwTq40QikVhXYOwKh4GXVQ4D2xJ5abXKwEus7Yl6Im4qnsyjM0tm5bESZfTyW65y6uM2v+Uqo86jl99ylVMft/ktVxmz8lhqGWSeMuapy5TMk365yumF19KWsdAy59Wv9eZNEzKvfqu3kDLIYsvpyVj8cpcxpjNPGWRWOW18ezyVNmRevZBWf960CymnV8Y8aefVm5JZeSxHGWRWHstRTp1HL79e2EKlzaOXZy9sobIaZZBZeXje8tx1rjgMEolEIpFIJBKJRCKRqDHlMPABPi+xxPEsmVevlShjnvTz6rWykDJmyaw8lqOcWXnMW8YsnXni5ylnSmblMW8ZU3rz5DGPDpmlNytuOcqYJfOmXUo586ZdjTKWKlHOapbVi1tOWY0yVlqirVbyWlajDLKQchZbn7qMedJP6c0TNxY/r8yTx3KV0wsPWa4ypvJYSBljuukwSCQSiUQikUgkEonEWoGew4D43p0tioQRH+ObJQvRDYk086adV6+VhZRBxnTnyWMeHTKlNyuPWfEhrV5NTsyTxzw6ZEpvVh6z4kOm9ObJYx4dMqU3K486vm5rMqbXylRcyKz4kLG8psro1XdKpvKaiiOz4ueRefIInaWUNSt9XcaUXiutjcxKu5gyWpmVvi5jSq+t+0JlnjKWKnUZq1VOL76WxdSlbeOlljNP3Fh8LfPk04sLmUeHTOmtVhlTecyKr6Wn69xzVzoMEolEIpFIJBKJRCJxrqPnMPjud79bPujuxZXTIOW8IT7QHx/sd3xu9G8QJb24KVloutD1i2iz5YMtIer9os3wRNS0aeeVqTppW23sPiKLbe+pMpZTopy23kFg9dIsVOa5ltCZpTclS00/jyxXGdpWG4/Zydp6La2dOO7pjUmUWZddH6f8s8yyk5R1X/SnvvWbDoNEIpFIJBKJRCKRSJyrGHMY+PAxggIxEb8pKytT7axvkN6/+93vht///vdFkN4I7yDvxkRapPlJJ500vPrVrx522WWX4UMf+tDwzW9+sxAV4nvpliKLtRnpXI/rCoIfsS8s2gCZ4lfde+VEe/jlHDnttNOGE088cTj66KOHd7/73cNhhx02HHXUUcMJJ5wwfOMb3yi2Tl/+ypL/H//4xyJ/+MMfSlsL04Z1OT1RJtJHvsccc8yw//77D29961uHj3/848O3vvWtyf46t+4z9SHa89vf/vbwkY98ZNh3332HN77xjaXNhGkX8XUa7aF/tI92ijar2429Rh9G+vY6o1/DieNcO0WYcsbabKGi7MW0c6RRF9fkGl2fOsb9Ezrq6jrE06PfXn+03xlnnFHaeL/99hve8pa3DB/96EfLfUmP/th1L/Y6Fip1OeoS13D66acPn/jEJ4p977333sVm/G/Q6dVZWNxf2kz/uk/cn/IUFn1PopyIizZnV47DLqRxbwoPmxMXfUIcRx6hV9spEUcn2tyvvNlu6MvbubqLV76yo5/9ShNjV4RHXNQr2qLWUUbo1GMEPe3/ta99bfjABz4w7LXXXsMBBxxQxi7/1VHXuNZapBNf15PEtUZdemnXdqntciVlNcqpywiHgT5Kh0EikUgkEolEIpFIJM419BwG3/nOd4Yf/OAHhcwJUideakOET0mrN2/aefVCar1509Z6s3RJT2+etKEzS29MIq0+OeWUU4YjjzxyeP3rXz+87GUvG3baaadCMH7qU58qRDT9IB/aPIKcQpbf+c53Hm53u9sNr3vd64YvfelLJU2QUnW6VqIus/TGZN60QcYhUjk1XO/znve84clPfvKw1VZbDc997nOH1772tYWAZ6OIryBaQhAubBbJdsghhwzPetazhvve977Dve51r+He97738JCHPGR44hOfOOyxxx6F6ESIS4cERd4qc8cddxxe8pKXDC9+8YuHXXfddTjooIOGz372s8W5MNZewtSFDl35P+IRjxge85jHDG9605uGL3/5y4W8U782bU/mbbPQm6Xb06nTavevfOUrwxve8Ibh4Q9/+HD/+9+/XLt2RDgGuSidtvrYxz5WCONXvvKVpb2I9nrhC184vOhFLyp2+prXvKY4aD7/+c+X/lKGfJQnL8ennnrq8N73vrfY9NZbb136+RnPeEbJ7+1vf3tx9sR3Veq2G7uWOqyVGNNCd0q/1lGu+9A1u3fYBjtkL1//+tfPIb6lcz+yK3b08pe/vDheOI+0Yzin6LLdL3zhC6WNtfdjH/vYouu+lF+0U1svIp7UdRzTJWN6U2lIW060v/HozW9+87D55psPD3jAAwqRzUkWfRr5+hVGOEI4RLTLtttuW+5Bos/d19tss01p2+OOO670N2GPhx9+eGmjF7zgBcWmtLv7Wl+8//3vL22mncXttttuw7ve9a7hM5/5TLE37a092RdnKbvcfvvth+22267kx16lZfOciGxROk5GdeUM0dfS7b777mUcOPnkk0ufc5hw9Oy8884lL/fBEUccMXzyk58sbSNfIi3bfuc731nuA/ah3w8++ODhVa96VblflKHu73jHO/7J3o2H2v/4448v+W+22WbDox/96HL9xsiwpbofo6/EceKoj+twrcrR/pwP2rbu20jfSsRP6dTS052VPuJn6U3JPGmXWgaZlb4uY0qvJ/pDv6fDIJFIJBKJRCKRSCQS5ypahwGCAYGKHAvCZTESRFN7Xocth6xWGW2eK1VOfY5sEIZoRJw+7nGPK4T/Ax/4wGGHHXYoRCzyCgFFPwiHOg/nCAhENX2EOZEfgk8axFLoO0dAcTAgLUjMqqUnPvKL2eXIZDNrhcuD3dCNWb09kSZm3LpO+tKrk3o++9nPLuQYIhW5h0xDPiM
V73Of+wx3vetdh2c+85mF+HKNylc3eakbpxeikbPA9SIlP/zhDxfbRlwiBJHezqPOMWt6n332GZ7whCcMG2644XCta11ruNzlLjdc6UpXKk4LhJ/rHmtr4Y6/+MUvlnye9KQnDU996lMLsc5hoB3VsU4XaXv5tWFTMkt3qgyiD9gaR5Q6IyaRs8K0b5BfdLWhWc4cIY961KOGW97ylsPNb37z4ph5ylOeUvpPG+rD29zmNqW/XvrSlxYiV7n6XXubLS39xhtvPDzykY8s5Kp+RrRquwc96EHDlltuWfpf3+hXdsK29ANRN/n5dc62SMyijvuovW7h4iMvaSI/NlTnx2ZdOxtybTe4wQ2GS13qUuXaENGu5U9/+lOxX8SwerOZq13tauW+ZdMIXnkQba0M9rTnnnuWtmKr7ktErjpGW9f1XqzIp5Y6vNabJdqMINYR3pxhrpV9x4z3qDdx7le7IcbZA0eU+1G7IfuR5vK4+MUvXtqLnSDU3ZPKkfdDH/rQ4T/+4z+Ks/M5z3lOIeYR636NhXe7292Gm970psV5wWbF6Vv3OVKfbbFBYyfnhDJijNlggw2Ga17zmmVs0R+uj31zWKrzHe5wh6KjDHXlDNTfnELyuNnNblbsQR/q509/+tPl3r/LXe5SrumGN7zh8PznP784LYwBbNjvoYceWhwN8r3uda873OMe9yhOCfdVOAyMxY5dD2eKe/LpT3/68J73vKdcm7bV3nUfRdsL93/unjvwwAOLs0A73va2tx222GKLUr7rpOea6zyWIlH+rLClyGqUQRZTzqz4Vlp9/W68SYdBIpFIJBKJRCKRSCTONfQcBghopAaiAmlDHPfOx3RmxffOI6yN74XN0mnj47wXFudt2Kz49jyOe2Gh3wsb00HkODZD1oxVxOPDHvawMqPVjFPEgj6L9L08CDJCXp/73OcKmYnoM0OYTpAUCCO6yCUzUBFLN77xjQsJh6A3O9VsamQZUgyJj+C60Y1uNNzxjnc8h8RCQCG5kG2IY0SyPG5xi1sUcY5gu/Wtb13IvVe84hVlBqy6mLWLaOMkuOc971mIQSQbewwnBlIS+fj4xz++kHibbLJJqQ8d9XeNZgub2X+rW92qEHHIQCQZUtKMaGnNRpa3to120EZIYmUgqF23mdDa++53v/uw0UYblZnCCD1tivSNNq7bXF7uH21p1rJZ0EhEJKMypK316/S9sDifFVZLT38sPo7VSztyQpkRj4w121uYeode6GorJDjSH7mPMA2y1jhCB6mpLfWFvtfutsbiUEGU3+lOdyq2gMBForJpBH7YK5L0wQ9+cOlnM8P1mfa0+kD/sBNEOycD8pVNscfb3/72xYb1nZnirkHe8tU/7DSIXY4gtoxQ3nTTTYstq4++d79xgiCZg6Q2C9wKCA6SK1/5yoWMVoZ6s2X25b5AGCNm1UFZyg+b0Tbqon2kkbd7BmHNiVW3dUjbd72wOB8L68XFca1bh0W43xgvtKf+ed/73leIZ7Ye1xRpnNPV12bH6yv339ve9rZznGfajBiP9LV2l5/4KEv72EaNY4kjS5nSGg/UQ7vpe/HsApHv/uPM5HyxOkEfPe1pTytjnzTqxm45qtgfp4Ny1FsfBYGu7rYUY9tWJqmfVV2caFYyxIoYzg/jgvQcQframM2uXDf75ARRb3HKd43GY45PduJ+4CChIz7a26+xQ/7sxJgSdtL2Zdt/0scKBOUrw33DrtmxMZdO9Fubvg2L415Yq98L6+n3wuK8F9bGt2Gh3wurz+N4LH5enTasjY/zNmwsnn2kwyCRSCQSiUQikUgkEucqxhwGyAjHyNNahPXC55Fe2nnDFiKRvs6jl2cvbF6JtHX6Xn69sHkFkeM3tv9ARCJBka4ISnGIHr9LKSckSAuEbjgNzK41qxW5hFxXHpLYLHSkPMIXWWy7I3ajzhGP5L3f/e5XiDmEKbILmc/xYfY+sg7Bh4BDnCLCEIIIX8SiMpFqQTyqGzKFncof0XeNa1yjEIFIQnVTBw4FBJ3tThB2nAUIMrODEcT254+ZwMqVr/wRhcozG1daM9uRha7b9cnDDGez3xF46tK2YdsPcb7YvhlL1wtfajlt2qkwbYaIRDiawa29zDZH7NoqBvGpLxC2yGLOIrPxxXM6IX3ZF2JdO7MN7UnkjThTnrzMjGZHHAeIWSsNYvY3R4QVIGZ4cxZYocBpoY/kb1Y6B5WZ4OxCfyNO5WOLKuVbgcLpwHmlLPVhW5e//OVLeg4F24G5Vs4UqyqEIZBdM/vjXLAtj61fOMrMohenXeTpOthn3K/zylQf1GFLkXnL6IXNEkS1/rTaRzu5H83yt82YMU27Iuo5HNx77itlBJnqf0jbcbpw+LEDzgFkvXsytv7hZLRagE188IMfLA4hxLv8jCtWbqiDvmc/HD7sAnlulYOVIYj+sDui3urPOcDRwGkUW0dxFBgX2QH7Z9OcFMYR45V+Nm6zCY4j452xxxjDxo2Rrp3dcRa4L4499thy7WEjvfaeN6wW9VE3zi7OE9fsOoyv7lM6rrvOo5ffVBmzJOrYltHm2QtbqLR5xHkbFseLkV5+Y2FxPo+wOfaRDoNEIpFIJBKJRCKRSJyrqB0GCBYvuMgMZB+SwXn81lKHOe6dj8mUXsTV8fV5GzcmU3oRV8fPOh+TWXptfHs8lRZ54BfJ1XMYSIsMinwir/o4zuO4lp6eMv2aWWsrEOQ+os4KA0Qcgsn3A2zfgWhFlKoLm5EW6SEvM2CRwGaqy4tTwJ7+ZvoiUxGGSGMELuL5q1/9aplt6xo5EpCuZpMrL0g89UKmcRiIQxBe73rXK3la3SAOIaceSH4kIZIS0WxlgLAgBeu6EmHaEqmIELRNCLIXsawsDgZ1s0WR1QaxxZA00kY+tbTtO69EuiCR4trH8pqKG5M2TZz3wsb0tbX+0RbIU/2GDEam2oLFTH3EZMzct9JAmyFVOXy0MUcCGwpCPexZ/zhHxJoVrQ+RxWZ40zXbGoksD3aAmKUrHZLXrHU2yTnEMXD961+/2BSSFnGrnupl1YLVOtIEmet+QyIrU1/TQWobE5Gu6uN6kb1sFgnLkcS5YJY7m+EYc484l96949pce7ThPBLt3rZ9HNc2QiJ8MVKX0yuzDZtHIg0CX5+xFe2m/W2vxpkXzkNjm/tU/8bs+XAYCDdT3/Y+Vvpw1rAtTiI2cJOb3KQ4Gq180tYcBdpbm7g/9S+nAHKHc/IaAABZ0klEQVTfeMbuOHRsacZOjFHEeKTMuu5EWmODVU9Wlhh3OEw5FI110f50pXVsnOIwizTGKeOIsU5bsBs2xi6Nk+EUrduvrkcbXsfV8e2xfF0TBynHhHGb89P4yBkSbVSnqyXyb/ON8zZuTObRq3Va/fZ8TGbptfH1eRtXSxs3phdS6/fS1udhc/rCOJQOg0QikUgkEolEIpFInKsYcxgQJA+SLOXcEX3h19YfiC7kPaLMjGWzpZEMiJ423VJFuYgrRJjVAYhyJJuZvWbp254D4WQ2L6cAMg3RoT418eEY0cpRYBYtcgyRjHhD3p
ux7Vd6xDPSDLlm1QKnglUCSC71QXoRJCAyEPGFBEbcqVdsJxKkW+0wUHeEI7Kaw4BOTdY4d91mHptljtxGFCN7rZ4wi5hof/Uyqx1RadaxssJh4HrbtlyIRNupm7paNcGpgtREaiMf3ZPRzr08VkvUE7Gl79gIJwtiHukb35ywSoVjQL+6Htel/7QZB4/2RQCb8Yx4FS5P/aFNkbf0zAY3Mx3hisg3Gx1xz3GGLGaLtrkxo5vdKQPxr276SDq2YuWAmen2iY/tjORh1QB96QiHQVwTW2BfnGTa3hZNbC8cAcIR2uyanXESmNHOica5gVQ2M912XvIOW1uq6H9jtGtmt8okZqlr67ClXtrVkChfH1ixpG6cNfpNOAJfHxsDOHqszHDPWfFhix92Lx824T7gHNLmnFC2pPJdiT/+8Y+lPfUBhyYnjrFRWytXHHtgd8YHKxE4JjiK2Key5c9GjDvijavCY3yIOqivtubwMP6pizzcl641xoD6+qOvrYbhIOXoMA6a2W8VjFVUxidOp3CkKTfyWC5Rf7arrsZvDgP3aWwNpa4rUW7KwkQ/pMMgkUgkEolEIpFIJBJrBXoOA4Qsss5vOA9SVl8QCH4RarZBiQ+MIsVi2w391aZbqoSjCAGGeA/y1PZEZsqayYsERWQj+oPsiHQEQYd8ty2RWcSIUwQyIhXBiiC2b7gtZJC4Zt3K65BDDikEsXTSIGcRgGbkInYRj8hD5LRvI5idK88oF9nHwYKctBWOWchIQrPeOTeQl0jfegsU9UYUIrqRadKouzzoxexjaZFsnAaIbv1x0EEHlfLkEdJr03kk2lFbmAWPQL32ta9d2gLB6PpDbynlLIcYHwiymjMGAcmZhbjlENIm+pMNsdGwD78EKc+ho4/Zh7bUtuxGWvk6R+T6hoD8EbQIceXqF33EFpHMnFG2t2JLHBT6jr6+MvucvdgjnwNKHBtjz2wD8a8s9eagkI6TQTp1Q/ayMfZCh+1zOLgnOERci2s1W9v94n5FCiO3OSU4Klwr24x2WKqdIKiVgwB2bbZ70gacc1bzKGclxoZ5RR2Vr83c0+5TTh+EOVuO76Eg4dl7bPllfEH8I/H1s7z0iVUoQXRbTeJ+dO9yQOgDJD470a9sQL4cNPrFNxHkaQWI7YHM7EfiK5+dcfJYIUJH/+rHsFl1cB4rptgcm2XrnETyZlOceZwUdRvoY/avr4yVrs93YdzP7m0rYzhK2AW9lbqn5a2t2IUxkb2ov3bTjnFf9tKmrI7E/5C+4jDw/Y10GCQSiUQikUgkEolE4lxDz2GAJEF++PUi6zfEeUgbVp/H8Vh8nLdxdVgb1x7X521YG9+e12Fjem1cL6w+b+PqsDquPu6dRxjSzTGiGrGE0KodBnQQDHUedT71eRvXC4vzCJO3OiCakGzIT7Nk7QcfHwwWXxMejpFQCFjEu33CkbqIPqSebX7MrDVDn/jgJ9Id4Ya0Qp6ZMcy5YHa/bWgQa7EfvF+EHQeAbZq0jXRmZkrLGWAWurKQumagW7GAbFQu8hZ56XoQy4hFDgxEmm1SbI+CUDPzVz5BvrpOhCGyWDsgGOWNoPTNAyS3NtMGdZvW7Vqft3Hazb2HPEWWaq9LX/rSwxWucIVyzdqbjrrU6Wvp5duLG4sfC+vpIFCRqwhU5LpZ27aYiXZmI4hdDgD9ou5hJ361k2vlAOJwkIe0+sD16j+/bMDMb8Qsh4l+kAeSVtlIaP1gxjdByiLOrQyI/DgEkM5sTFoEsPvHSgOkP6IZicsepVcX7S9fNuR+42RgL7YjYn9sWtlmw3N+uRZEnzLUFcHNdn18Wb7yMDM+7FwbRLvWEu3bnkdYtJ120P7KuOAFLzhc5SpXKatglNG2d+RVS51nHNe67XmrV8e1x0TZ6uh61UkfIMk5Ntxf+so9zCHEseI8Vmi4LtdA9LOxQJ+4N7U5fVtNuR+1vbbWh2zQmMPhEx8+dl9yOobjUtnGUf1mxZDxgJ1Iz87c764j+shYxkatdvJ9ldjKiuNBOnmwX1sNcUDUbeBYHmb4GyvVlb2xHWOaeipPPZXXtmF9PhZXh/X0iH7wfx5O3HAYcAQb/6Kv2vRxXIeF1GFtfJz3wubVn9Krj3thod/TC+nptMf1eS8szlu9qfj6vD7WP47TYZBIJBKJRCKRSCQSibUCrcPAC6uX1yBsHM9zXoeNhddhY1Ln0Us3T1wd1pM6/Viasbgx/Z7UedQypteGI3L8mkWLvEZuIazMykdiiUP01HnXUuc1b1gtykd0IcTtO44INtvabGBEE50gO+pjs3oRymbtI7qRtUS6EASba0CYIeUij7geM7qtJHC9SFczwZF6iENOBqsTxJvJzUmABJbejGbEGNIxylG2X+dmO9vuRP7qKJ0Z4pwGSGF6xIxgs9ldZ5A7ZrULjzzlg3jmtECM1m0QbduTuNZaT/62JLFtCLL9Ote5zrDBBhsU4hxRLV/9Qa9N25O6jFa3Dqt1xqROG4Ig1x6I2WjrWpDE0X702zyDKJOPdt9uu+0KocwRg4xFHrt2bWFrH84H7Y2UlYbdWFGACI6PbrMD/YpIpsv+9A195bVtJzxmmetXdXY9hF0pm3OAramjvPQPXeWxAcfC2JH+8atdzGBn+/SkZV/KEh/1qNuzbqNeWEgQu9rWdkr24mcr2sB1yz/I50jTy68XNiW1fi9dL04dHLu/tKm24AwjHIpIf1v8uCfFa1/3r3SuUVrjj/bVNxwx2tTqBP2vPa080tb6W5z+EBf3rbHI+Ok+ZacxDsjLLxFmLJIX/fqatCdHFSemfmzHMXmquz5nT217xDkJu5VGfYw/dXkhvbRjErq1tDphl66BU4TTxf2l7dVBO8V/TU9mlTFPXB3Wkzp9Kz29OizC27CeRPqe9PRmhfUk9HrS04tj/eDeTYdBIpFIJBKJRCKRSCTOdYw5DBAMyIwxET9LJ2VpEoQXIs03DHxLwOxWs4oR+Ag3ZFkQT0iHOn0ts/pLn/uVF4Ja3majmgFstrZ9+xG5Ztgj2JD8QXbU+dTlBCEyS+r0iELkuzogidXB7H+zwa0KQJJymKgHktnWPchTZFydX53/mNTl9uJJrVPLvHrzir42Exl5bea1WcD6WFtrT0RxL925LW07tNJLQ8JuYga3/jULHaFpWx2zwjkKbPsi3LmtZjgFOITcD7a3sYUNu7RqxDc1bHeD2GXHygjiuS2fCNfufukjotk9R4TZ+ohVDgyzsc0G54DokcK9/Ou4MZ3FCGLXL2JcPc2oZy8Ic3WK62nTrQ0y1h69sFrq+FavF0fG0i8kXS093Vp6aUJiTFxouqVI5M/hYiy1HRcHGAesrbis3rFNFrunF/81Kf8r9f/YSkmUEeW4f9NhkEgkEolEIpFIJBKJtQI9h4GXV6QtQZDVEi+4vbha6pdh0oubp4yFlNMLn8oj0swqI2SqnDqsloiPesyrX4chfcxMRR5bafDqV796eMMb3lBm6po1agZvpJunDNLqBIFhNi0y0hZICEmkPKcBQdzaksVsX1t9RLrIo5VePXphPXHNy
GTXE2RKkKFmIYvzSzfqXqePcsbKirjol55OLfPkNW/4mLiOmI2N5DPjWTuIm8proeUsJq+pND2Z0o/waHsS/al/kfz6mp4wbRB95JeDCNnJuWA7Gs4jK18IpxLynB1Lr00jbV1+SNivGeIcDe4r2wfJi83bIsc5B4V+Uc9IW+cV11GXU8e34T0J3VY/wuQfce4/7aBOxgb10nZ1mjiuJfIKaeP89q6DtGl6ebRxbfxUmpCIj+sd04+4XnyEzboW8VPlRHgvPsIijzqu1qnj2zxCIq9efIRNlUF69XDvWN1ixQyngfHbf4iVErZ7ijF0Ko+QWfG1RH6zwloJnShjXv1ZYbVEfFzLvPqzwloJnXmuJephu7F0GCQSiUQikUgkEolE4lzHmMMAkTAliAi/XnTbuHlFHpFPTyI+Xux7OrNkNcqYR2bVY0qibmNS6y6lDL/S284DIYl0te0HcUzMSBW/2HIWKu31RVgtbfxCZCn9shLiehBHQWQv9frWZol2H+vLOryOk45jJWy0tdNwPtT6bR6t0Bmze2Hsvs23lvpa2rjlEmXEtbCNsJMgJXtpFip1Gat1Lb1yIr4NX4jMKqPW6cXNK7PyiPipesySWWWQiG/LiHJribGF1Lqzyon4XtrllFn1WA5ZjTLIPOWI155WEKXDIJFIJBKJRCKRSCQS5zp6DgMEVLzEIslSzn0J0qGVnu5SJPKtyYyaGFrJslPWP1mMLUWa1kbDTqfS9OJIm2edH5mVnsyjsxyyGuWs5rX0wpdTVuta1gYZu84Ij7ZYF9pjXajjckk44a2uSodBIpFIJBKJRCKRSCTOddQOg1/96lfnOAy8wJrFOyVm3ZJe3HJKXc5KldnmuxJlkLaMlSqnltUqY6XLWa32SklJSVluWa3xqy5jpcqs812pMlpZjXJW81rac89ctmJLh0EikUgkEolEIpFIJM519BwGZtbGS6ztOHpSxzkO6YWNSa0X6Wpp4+J8KWFxPhYe50sNq8PruJ7ucoTV4VN6iwmrw+u4Kb2lhNXhU3pLCWvjxvTqsLHwCKvDp8LifDFhdVwvrCe9+LE0EV7H9cJ60tOZCqvD67A6vJVe/FiaCK/j6rA2bpb09Hv51GFjcXHeSi8+wtq4Njzi6uMx6elEWB1eh43Fxfk8UudTp50VHufzylReU3FxPhbWk1ovjmvp6U2F9aTVi/Na6rg4rsPa8FlhcVxLT285w+q4+rjVq8MivBc2lr4On9JbSlgdVx+P6aXDIJFIJBKJRCKRSCQSaw3GHAZeYGN/8Hml3vd7KmwpshplkNUo59wqYyp8sdLLb10sg6xGOatRBlmNclajjDFZjXLOS9eyUrKabdQLX05ZjWvplbES5a5GOatRBlmNclajDNLm6dgzlw+Xcxj84Q9/SIdBIpFIJBKJRCKRSCTOPbQOg29/+9tlz24vsZ/5zGdS1nE56aSTuuHrouS1rN+yGm2mjOyblJWWHMtSatGGnAbpMEgkEolEIpFIJBKJxFqB//mf/xn++7//uzgMzjzzzLLCgMPAy+uJJ554DoHmuJYInxXXkyndsbg6vI3ryZT+YuN6MqU7FleHt3E9mdKfN64OH5Mp/Tqu1umFzSM9/TqfNr4XNkt6+nU+tfTi6nRT0tOv82nje2GzpKdf59PGT8VNSU+3zqOO74WTOm1PenptGBItZuEGoRY6bdqeTOnV+bQ6U3E9mdIby6cOb+N6MqVX59PqjIWPyZTuWF51eBvXkym9Op9WZyx8TKb0x+Lq8DauJ1P6Y+Gz4noypT8WV4e3cT2Z0h8LnxXXkzH9OnwhcT2Z0q3jap1e2DzS06/zaeN74X5POOGEMs6deuqpw89+9rPiMPjHP/6RDoNEIpFIJBKJRCKRSJw78DLKYfCXv/xl+O1vfzv88Ic/LB89RtR5iT3++OOLHHfccUuW5cxrSlarjLyWhUmUs5JlrXYZq1VOL345ZTXKmaeMY489dvj4xz8+fPjDHx6OPvro4ZhjjinHn/jEJ0ocnU996lP/kq6W80p7kbyWhUley8JkNdvrvHYty1GWPD796U8X8cx1+umnD7/85S/LBI50GCQSiUQikUgkEolE4lyFVQaWv//pT38qqwx+8IMflKXxPszHaeDF9pOf/GQh7ZB5KSkpKcslMa588IMfHA466KBhp512GrbccsvhLne5y3D3u999eNzjHjfstttuw6GHHjp85CMfKfo5FqWkpKyr4nmK49PzlW8XfPWrXy2rO3/+85+XDx7/9a9/LRM5FuIsgHQYJBKJRCKRSCQSiURiWeCFNFYZcBqY2eaFlePA0ngrDr7//e8P3/3ud1NSUlKWXb73ve+VXyubrCh46UtfOmy66abDLW95y+I0eMITnjDsvffeZYWBGbj0c0xKSUlZVyXGsB/96EflOcv3o84666wyaYOzwHelTORIh0EikUgkEolEIpFIJM5VeDGN7xl4WeU8+Nvf/lZeXm1X9Oc//zklJSVl2cX4YqyxbzeHwNvf/vbhyU9+8rDJJpsMD37wg4ftt99+eO973zucccYZxZlJ37jUyyslJSVlbRdjWIixz/OW5y5iG6LFOAsgHQaJRCKRSCQSiUQikVgxxKqDWrzApqSkpCy3GF/Ant1m3h544IHDVlttVbYj2myzzYZtt912OOKII8ps3NimI8eklJSUdVViDOvJUpAOg0QikUgkEolEIpFIJBKJxHkGZtb+5Cc/Gd73vvcNz33uc4cHPvCBwyMe8Yhhhx12KN83+OlPf1rItkQikUj8K9JhkEgkEolEIpFIJBKJRCKROM+AM8Be3j4K6iPHW2+9dXEc7LnnnsPHPvaxssLAKoREIpFI/CvSYZBIJBKJRCKRSCQSiUQikTjPwHYcPrb+oQ99qKwqeOQjH1m2Jtp9992Lw8BHQu33nUgkEol/RToMEolEIpFIJBKJRCKRSCQS5ymcddZZwzHHHD0861nPGjbeeOPhQQ984LDLLq8aPv3pTw+/+MUvhv/v/8stiRKJRKKHdBgkEolEIpFIJBKJRCKRSKzF+Md//WP43e9+P/z0pz8bvvf9H54tPxi+/4MfpoyI9vnil748vOPAdw7P/P/bew9wS4oy/39zcFd33VUfRRQw66prQEVcXBXZNawK67LsIqIERUFFEckiSJYgOQoiKkhGyTnnHIbskGGGOEMSGKD+fmrnnX9Zv+o+5965985l5vN5nve53VVvha6u7nPP963us9Y30qf/4zNphf9ZMW22+Q/TUUcfky659LJ08y23NssuiMacmjr19jRt+v3p6aefmT3rRGRBxYCBiIiIiIiIiMgkZvoDD6Szzj437bn3/un7m2+dNvz+D7Otv/FmWodtsMnmaaPfj9HGP9gqj9kmm22VNt50izxu5G3QKLMgGmPygx9uk7bcZod0xFG/TtOmPzB71onIgooBAxERERERERGRScz9DzyYzjz73LT7Xvul7XfcLR186JHpggsvTtdNuSFNuf5GrcOuv+GmbDfc+H8W+y3fBdWO/vXxaevtdkqbbbFtOvLo36Tp9xsw
EFnQMWAgIiIiIiIiIjKJiYDBbnvum3bdfZ90+pnn5Hf0i8wtF19yedp2+53zUxgGDEQEDBiIiIiIiIiIiExiyoDBLrvtnU457cw0bdr09Nxz/nCvzB0XXHiJAQMR+QMMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGIMGMh4YcBARGoMGIiIiIiIiIiITGJeaAGDW265JR199NHpV7/6Vbr44ovTI488Mjtn8sJYXnXVVemoo45Kp59+errjjjvSU089NTt3/Hn22WezPf/887NT/n9ImzlzZrr99tvTtddem37729+m3/3ud7Nz/6/vlOVvq3wfBgxEpMaAgYiIiIiIiIjIJGbYgAFi8TPPPJMef/zx9Nhjj6Wnn34626OPPpoeeuihNGPGjCyC40dZRGeEaAR9fJ588sk0a9asXBf5lCWNusKPv5Sr20awfuKJJ3Ib06ZNSxdccEHaaqut0uabb56F+GGJY6Au+kR9GMdEOvn08dZbb0077bRT2mabbdJll12W8+krfaZvGP6kURd/MeqM42As8Iu+33DDDem8887LovzDDz88Zyxok3rxoZ2yX6RRvoT9GDd8MbZJo08xdvwlCHDcccelfffdN51yyinZr+a+++5Lv/zlL9Naa62V1lhjjfSzn/0s3X///bNz/69/1POb3/wmHXzwwen888//g/w+DBiISI0BAxERERERERGRScywAQPEZlb0b7jhhunzn/98WnfdddO2226bvvGNb6TPfvaz6Stf+Uo68cQTs+iNqH/44Yenb37zm9n3i1/8Ytpoo43Ssccem1fX33XXXXmbupZffvm07LLLpk9/+tNpxRVXzHUirCOqEzyYOnVqOvTQQ9MGG2yQVllllbT66qunr3/96+kLX/hC2mSTTbIAPwyI6TfeeGP66U9/mst9+9vfTl/+8pfTSiutlI+Fvj/wwAPp5ptvTjvssENaZpll0pJLLplWW221tOmmm6Y99tgjnXHGGenee+/NARLGYuedd05rrrlmFtqp42tf+1o+BvqKQI/fPffck8eCvjMWu+66a35KIgIGBBao93vf+1761Kc+ldtdYYUV0sorr5z7edJJJ+V68Lv77rvTCSeckIMlCPyMOWP3uc99Lo8lbXIM8OCDD+axZLyOOeaYHHwoIfDA0wQEAvbZZ5+04447pu9///s5YMD5K6H90047Le25557pJz/5STr33HPz+SGY0IcBAxGpMWAgIiIiIiIiIjKJGekriS699NIsRK+99tpZfK9fCYSwjajOynzEbcRyBGlE7h//+MdZeEbUZtX6QQcdlIXqk08+Od15553puuuuy4L0AQcckAMBiOUbb7xxbo92EdlZUc8riRDq11tvvXTFFVfMbrkbhG1W/dOnAw88MAv9++23X/r5z3+efvGLX6QjjjgiXX755XNEdQILP/jBD7KIf8455+SyrOAvx4RgxjXXXJNX72+99dbpkEMOyQJ8FwQY9tprrzwePMHAUwUB2wQsCL7ssssueXwwgiqMHcfL8R9//PFp//33z77UgRFM2WKLLbLQzz6BAPp59dVXp/XXXz8HVziGEs4RQQvO3zrrrJOfpGBcNttss/SjH/0ol637x7HxGqjtttsunzeemChfXdTCgIGI1BgwEBERERERERGZxIwkYMBTBqxIR2BGoEYwZ/U5q/cBsRqxHRF+yy23zEGC3XbbLQcFEPdZwY5gjliNHyI7Ijrv9WclPgEDxHxeB4TAjlBOOUR0RGxehcMqe4Tzb33rWzloceGFF+a2+yBgQKBhypQp6de//nXae++9c/Bi++23z6814ukH+hyr83lND/0gnWAG+wRGWFVPPQju06dPzyvtEd0R23lVD2nkxdjxlzHh9UKMG+NBgOGSSy7JdcZrkBhDggM8UUB95BF04TcP6CMCPQEDgi0EDBDuzz777GyMH+UIsvB0BHXSJsERnjrgSQpe4VTCUyCcu8MOOyyfG4IyBBZ4WoFx5UkFjjOgTgI8BCc4BvrAWBJE6cOAgYjUGDAQEREREREREZnEDBswYJU9rwqK19zw+h0EaYT3s846K+dTBkGfFfCks0qfwAHGa3oIGMT79BHIqYvV8QjXCO6s9GdVfAj1t912Ww4I8DogXvlDHbweiG1eJfSd73wn+yGA91H2i2AEovfuu++eAwcENKj/1FNPnfO0BEI4oj6v4OEYedIgylAHq+sJaFAOQZ6xILBBfYj4BA5YlU99/MYCvxEQr1RCmGeb1x5xbPgg3rNPPbyyiKcdzjzzzBwsoH0CBqzwp96LLrooBzfiOHjtE4aIf9NNN8151RGvfWL8eaUSY1SeT4IU+HGcPNlBMINzs+qqq+YACue5/L2DeB0VT2VwXgm6UH8EirowYCAiNQYMREREREREREQmMcMGDBCYEY55Nz7vuI9V8IjYrEZnVTsgRrPKntX4iPQI0uHHSnpW21MPZdjH2OZ1QAQdWOVPGwQB6ANGPnUQQKA+6uJpA3xphzoR3qmL/drIp258KIvYzW8pUBf10B+Oj74HtEsfOM743QWOm3p4FQ/9pb0YC4z+0Nd4coAxaflxLOzTLj7Uhw99oa8I+bRNGmNBnfQPX9J4XROBC55GIOiCiM8+T2nEeaNOAgU8nUGAgScCytcMlXC+GBv6z1/Kluef88jvIBDs4S+BiUHBAjBgICI1BgxERERERERERCYxI/0Ng8kEAjji+5VXXpl/OJjfPODpBZ4W4G9s80oeXoOE8P5CBpGe4+XJBJ7KIGDAa4s4fgILEbQJ8CfQwdMSBAwIDIyUCFIQJODJCtof9NsFgQEDEakxYCAiIiIiIiIiMol5IQcMWHUfTx9MnTo1BwV4dQ8//svf2OZJAkRvBPPyKQIZXwwYiEiNAQMRERERERERkUnMCzlgIJMbAwYiUmPAQERERERERERkEmPAQMYLAwYiUmPAQERERERERERkEmPAQMYLAwYiUmPAQERERERERERkEmPAQMYLAwYiUmPAQERERERERERkEmPAQMYLAwYiUmPAQERERERERERkEmPAQMYLAwYiUmPAQERERERERERkElMHDE497aw0ffr96fnnn5/tITI6LrzoUgMGIvIHGDAQEREREREREZnE/GHAYK908qmnp3vvnZZmzXo25/OkgaaNxGLenH/BxTlgsGkOGBxrwEBEDBiIiIiIiIiIiExmCBicdc55ac999k9bbL1D2m6HXdKue+yb9tx7/7TXPgfkdK1tjE9YK39Btb33/Wn+u+OPd//9nPpR2nKbHXzCQEQyBgxERERERERERCYxM2bOTNdNuT6dePKp6ZBDj0y/POSI/FfrtoN/dUT62c8
PSfvsd0DabY+90557/yTv/+qwtv+CZ8UcOuyodNgRx6TzL7w4zZgxc/asE5EFFQMGIiIiIiIiIiIyX8Hrdu6995508cUXp1NOOSVdeumlacYjj8zOFRGRLgwYiIiIiIiIiIjIfMWTTz6ZbrjhhnTCCSekQw89NJ199tlp+vTps3NFRKQLAwYiIiIiIiIiIjJf8cwzz6T77rsvXXHFFemss85KV155ZXrEJwxERAZiwEBEREREREREROYrnn/++fToo4+mO+64I02ZMiVNnTo1PfHEE7NzRUSkCwMGIiIiIiIiIiIyKUDo5/cHZs2alZ599tm8zd+wFpTBSngl0V133ZWuvfbadN1116V77rknPfXUU7NzBxP1RX/q+kVE5lcMGIiIiIiIiIiIyDyDQAC/L3DBBRekAw44IG266aZprbXWSquvvnr60pe+lFZdddW0zjrrpB133DEdccQR6aKLLso/Ynz88cennXfeOa2//vrZtttuu7TPPvtk23bbbdN3v/vdXG777bdPRx55ZH410TnnnJNOPvnkdOyxx6ajjjoqHXbYYenggw9OBx10UNpvv/3SnnvumXbfffe0yy67pF133TXtscce6Zhjjkm333777N6KiMzfGDAQEREREREREZF5Br83wJMAW2+9dfrgBz+Y/v7v/z796Z/+abY/+ZM/yX//+q//OqcvtNBC6e1vf3v6wAc+kN71rnelV7/61Tnvr/7qr9LLX/7y9IY3vCG95S1vSYssskj6h3/4h/TiF784/1100UXTO9/5zvShD30offjDH07vf//705vf/Oa08MILp1e84hW5XsqRt/zyy6fPfOYz6eMf/3gOOpx33nlp5syZs3srIjJ/Y8BARERERERERETmKfwg8UknnZTWXHPN9Na3vnVOwICAAAGCN77xjemVr3xlDhq86EUvSi95yUuy2P/6178+veY1r0kve9nLclBgmWWWyU8nbLTRRmm99dbL26uttlp+UmHllVfOttJKK6UVVlghffKTn0yLL754DhZQH/V85CMfyb5f/epX09prr52fQhjJq4xERF7oGDAQEREREREREZF5Dk8aXHnllWmDDTbIq///9m//ds4TBYsttlj6x3/8x5z2l3/5l+nv/u7vcrDgPe95T37SgOABoj/+7373u9PSSy+dlltuuSz+85TAJptskv+ussoqOf0Tn/hE+uhHP5p94ymFP/qjP0p/9md/lp9I4EmDzTffPF144YX+WLKILFAYMBARERERERERkUkBv2dw6623pp122im/Puhv/uZv0h//8R9nMZ+/YbyqCHGf4AGvI/qLv/iLnIZfafgSZOC1Q6961avyEwqUK31e+tKXpre97W1piSWWyK8teu1rX5tfjbTlllvmAMbTTz89u3ciIvM/BgxERERERERERGRS8eCDD6bjjjsuv0KIVwXFEwWI+ksttVT6p3/6pyz0R1CA1xf9+Z//eQ4gEDwof/8AI6jAkwPU8Y53vCO96U1vyk8s4E858jHa4ZVIvBrp9NNPT4899tjsHomILBgYMBARERERERERkUkHvx3AK4E23HDD/OohXjnE7xfE0wTxtEEEC3gagScJMLZJwxdjO/yoAyNAwF+eQIhXElGOVxXttdde6a677prdExGRBQcDBiIiIiIiIiIiMmmZPn16OuCAA9Kyyy6bXxcUAQD+8oQArxgqnyggj/QyD2M7LPwioIAvAYT3vve9abfddkt33313fj2SiMiChgEDERERERERERGZ1MyYMSOdf/75+YeI3/e+9+XXERE8eN3rXjfndw7KVxCxTWCAJwfwffnLX55e9rKXpRe/+MU5QBC+8aoiflh5pZVWyoGJqVOnzm5VRGTBw4CBiIiIiIiIiIi8IOBpg4MPPjiL+zwNsNBCC+UAQB0wCCNgQEBg4YUXzsEFggwEEHi6gIACv2NAPWussUY6+uij07Rp02a3JCKyYGLAQEREREREREREXjDMmjUr3XzzzenAAw/MP4rMDyDzOwQEDPgdgjpoQGAgXjuEESzAj/RFF100rbLKKumYY47JwYLnnntudisiIgsmBgxEREREREREROQFx4MPPpgOOuigtPTSS895LVErYFAa+RiBA15DtPbaa6cTTzwxP7kgIiIGDERERERERERE5AXIM888k6ZMmZL22WeftPrqq6cll1wyv34oggI8cVC+piieQOBpg8UXXzxtttlm6ZJLLklPPvnk7BpFRMSAgYiIiIiIiIiIvGAhcMATAscee2xaddVV8+8URICA3zAgQBBBhJe85CVpqaWWStttt1266qqrDBaIiFQYMBARERERERERkRc8TzzxRLrmmmvSbrvtlpZbbrn8+wQvetGL8quK+L2CxRZbLP9eweGHH57uvvvu9Oyzz84uKSIigQEDERERERERERGZb5g5c2Z+1dBWW22VPvaxj6VFFlkkLbHEEmmTTTZJF110UXrsscdme4qISI0BAxERERERERERma946KGH8iuK1l133bTiiiumLbbYIl1++eXp+eefn+0hIiItDBiIiIiIiIiIiMh8xXPPPZefNLjzzjvTzTffnO666670u9/9bnauiIh0YcBAREREREREREREREQMGIiIiIiIiMj4w2tANE3TNE3TtLG3scSAgYiIiIiIiIwp5RdYXgvy7LPPplmzZs2xZ555RtM0TdM0TZsLK/+34n+tMP73Kv8XGykGDERERERERGRMiC+mESB4+umnsz311FNztrXxMcbYcdY0TdPGwubnz5QF4fMyAgp18GBYDBiIiIiIiIjImMEXU76kPvbYY2nGjBn5R0efeOKJ/GOjw1h8kW/llTasX8tGUnYi2pnsbWB95cv0uWlnUBtl3mjb6LNWGxPZzli2NRFtYGWdZRuRNhZW1zcebWBlnWUbkTYW1qpvrNvAJqKdedVGX/pY2ty0MWzZuWkDm4h2JqINbCLamddt1Hl9voNsUFn+73r88cfzNgs4RvqUgQEDERERERERGRPi6YInn3wyPfTQQ+nee+9N999/fw4cRNCAPE3TNE3TNG3sjUDBww8/nB588MG8aIMnDuIpg2ExYCAiIiIiIiJjQvl0wX333Zduu+22dM899+QvrqTxJTb+dhn5tU9fWl9emVZbK78rrVVfK622YctFWl9emVZbq2yrTMuvTC/TamuVbZVr+dV5dXpYWbbPD+vym5uytfX59eXV+bVPnTbIr5XX8mmlD/IblFfn1z5lWukTabVfnV5al0+d1uU3KG9Qfple+pXpdV6ktazlU6fFfmmtvEirrSu/TAuf0lp5kdaylk+dFvul9fnX1pVfpoVPabVvnVbbMOViv7Tav/at8+u0SC+3a6t967Tahi0XaXV6mVenh/WVq/dLa+WVabUNWy7S6vQyr04Pa5VtlSn9uvLKtNpaZVvlWn5leplWW1dZ7NFHH82LNfg/jP+/eBKBxRwGDERERERERGTC4ctoBAwIFPz2t79Nd95555xVbnyJjb9dRn7t05fWl1emldaV35XW8m+l1TZsuUjryyvTSivLlX
6tMi2/Mr1Mq61VtlWu5Vfn1elhZdnwK7e7fFvpZVptrbKtci2/YfLq/NqnThvk18pr+bTSB/kNyqvza58yrfSJtNqvTi+ty6dO6/IblDcov0wv/cr0Oi/SWtbyqdNiv7RWXqTV1pVfpoVPaa28SGtZy6dOi/3S+vxrG6ZM7JfW59+yYcrFfmld/oPyu9Jiv7Q+/5YNWy7S6vQyr04Pa+X3pfXllWm1DVsu0ur0Mq9OD2uVbZUp/bryyrTaWmVb5Vp+ZXqZVlurbGzzVOe0adPS3XffnZ/2NGAgIiIiIiIi84xWwOD222/PK90eeeSR/GWWL7Jsl1amsd3aL9Pq9C5r+dfb5X6Z1sprWct30H6ZVqd3Wct/UNmyzCBfrOU7TNlWuT5r+Q8qW5apffvSyvSWX22tcmV6mdaV1+VXW6vOVlqZ10rvs6ivLFvvl9aX12dRrizbVVdX+iCLcmXZvrpa/sPYSPzLNkZTrpXXslYbg8rX/oMs/Mty5XaXDeNTWviXZertcr8rrc/CvyzT2o/tMq2V3mVddZTbtU8rrc/CvyzT2o/tMq3267OWf71d7pdpdXqXtfxb+7FdptV+fVb7t8q29lt+XdbyH6Z8q1yXlb7hX26XfuU+TxXwdIEBAxEREREREZnn1AGDW2+9NQcMWOnGl1a+1PJFVtM0TdM0TRt7i9+QMmAgIiIiIiIi85wyYMCXVQIG/I4B2w888ED+4srriWojvStvrGx+aQOLdibqeMaznYluY6LaaeWPhZVtTFQ7rfyxsLKNiWqnlT8WNhFtYBPRzkS3MVHttPLH0iaiHY9l5BbtjGdbZRsT1U4rP4z/t1i0YcBARERERERE5jmtgMHUqVPzF9fp06fnL7G8nmgYw3ck/qOxiWgDm6g2PJbhbSLHy2MZ3ubHY5lf2sEmqg2PZWQ2vxzLRI7X/Hws/L9lwEBEREREREQmBa2AAb9jwJdW3qfLl1heTxTG/iAb1g8r6w5r+dXW8u0rH3mltfxqq/3qsvV2bZHXZy3fMq0vv5XXspZfndbnU6d3WcuvLN+qK9JaeS1r+dVprXrCp5XXsq46yu3aJ9JaeS1r+dVprXrCp7Tap7Q6vy7TqiPSSivza2v5lGmt/DK9tNqntDq/LtOqI9JKK/OHsbJcq45Iq630GWR1mVYdkVZamT+MleVadURaaWX+aKxVT1l/K79lfX6tvEgrrczvsi6/rjoivbTap7Zh/cLKukdStuXXVz7ySmv5ldbyK9Pq/NgvLfL6rOVXprXqirRWXstafmVaK79Mb+W1rPRjm/+7DBiIiIiIiIjIPKcVMMDuvPPOvNqNNIzgwSAb1q+0qH/Ydvr86rz44t6V32fDttHK6/Mprc9vUB2D8oexYeoYaRv1mGOD2on8Yf1GmhcWPrVfa56U+aWNpI1B9XTlDyqLhc8wfl3pw5Qdid+wea3x7qtjkA1Tdm7bGMbKNoZpp+U3TNnwGeQX1vIdVD7yB/mFhV99bgdZXxutulp+g6yvjZaNxLe0sswwYzDaNkbSvz7fYfK68ksbpp6+vK782rr8hh3rYdrp8xumjmHawFp+pPF/lwEDERERERERmee0Aga33HJL/uFjvrjGI/L8HY3FirmuvHI79ulHl4V/aXUbbN9xxx35GDC277rrrv/Hb6TWVz7ySmv59Vl5nKWIEDaojdKP4+W4sRAhSt/YL+uP9kqfulxt0R7t8NsXvM6K7cgfVD7aKK3l12eDytT1h8UY0W/mSWucaotxKq0cM7ZjP9qJvDIfi7JhpV+XlXW26h/GBpWL/LJvff2r/bCuNhhz5ghWj3fLf5B1tRNW54+2DeYJfWauRDC19mltMxb1fAkjry5TjmGULf3Cp5Ve5vfZMP5l/XXfIq+2Mj3qLsuSH1b6lH5h9f2E/FaZsDK9zi/3y/xhLerpKl+m01ee0CvnSd3vSCuta0zD6vbL7fApx7c08krfskxY5Jd+LR+uX46L4+Qv+3XZsHq8ynrKeltWluuyuH+X12U9RrVFXqtNLPJLq8u28kpr+Q1jcTzUwY8gGzAQERERERGReUZXwCC+gPMltrT4Ulynj4VFvbR90003pauvvjpddtll6fLLL0/XXXdd7luIAnU/Yp8v3ldccUXaY4890le+8pX0ta99Le21117pkksu+X/Klfu1lfm1b7RZW+kTFmIA260ykYfowTFyvBdccEE2tklDgMK3rw7GZcqUKenUU09N++23X9piiy3ShhtumNZdd928fdhhh+XxRJDAn7+cZ+q/6qqr8hgzbvhcf/31uc3W+S+N+cKPNVJ+t912S//5n/+Z/uM//iOPPfmskqzLjIXF2LbyhjFENF4BwZzYeeed0yqrrJLWWGONtO+++6Yrr7wyH3eIR7TDPnOPsWKMMPz4G+PGGDD+ZbCt7GPsM+7MbfwvvvjifJ4vvPDCXA/pMb+j7aij3K6tK73Loq7S6nT6GcfMsTIv6F/MnyjDsd58883p2muvzX78ZT8CMPiEGMd1fdBBB6V/+7d/S8sss0z62c9+lucZ9ZEf/ZsMxvExTzgGrsPtt98+z+1VV101HXPMMfk840PfW2UxxoBr6dJLL03nn39+Ou+88/K55rxT5zXXXDNnrCjHPeCGG27IYx3zjHmCT3lPjHnHeFM/5aJd/G688cY/8AuLOjlH9J9zSbnyHIYPxnymvugbbcW8j/sEfaI9jiXaII80ytJGHBfHEnVTT1l/jBnvcqdvu+++e/qv//qvtNJKK6W99947188738dqnkR7rbyRGvU88sgj+Zh/+MMfpk9+8pP5s+fggw/OadxryrbK+0mMZ1g5howZ1wdj2Oov+5w7/CjHveScc87Jc405F/UxzlFHfU2X8yf6hX95rhh7fOjL2Wefnfbcc8/82fK9730v7brrrrlN6o7jrC3qjnkQczP6h9Ef+hX3zzi+ONYw0mIOcC0xT5gjK664Ytp///3zcTJPWmXDYr7TXox3XIvlWPfVMRbWaiPaN2AgIiIiIiIi85RWwIAvzny558srX/Tj77CGf9gwabGPQMeX+VNOOSVtueWW6Utf+lJaYYUV0je/+c0sUpx11llzhLr4cl2Wjy/hiB0IwV/4whdyHYjZCChRLvzDWn0qjT5hLZ9IK/PCv/QJQSD8oq8IKCeddFLacccd01prrZVWXnnl3G8EELa//e1vp5122ikde+yx+bjKfkQdtIFos95666UPfvCDaYkllkj/8z//k771rW/lsVt77bXTj3/843Taaaflc8t5pt3jjz8+C6FrrrlmFvs/9alPpc9//vM5yPDzn/88jxmCHm1Em6UhliLOIPbssssuWVBdeumlc3/pEwED/GIcYizKtDo/0vqsz79Mq/Njn37zGgsEJ46f8UYIJtCBgBQ+HDd/GTMCMYwhAajPfOYz6dOf/nSem6uttloWB7/61a+mb3zjG2mTTTZJ++yzT57DCG+cI8YbsQ2h+Be/+EXabLPNcoAizvX//u//5nmKAHfAAQfkfjHu0e/oO3/jnNdzKdLLvPLYwyesrxxG308++eS09
dZb5z5yrD/4wQ/yPIxrkHsF4/KjH/0offnLX07//d//nYXEo48+OgtwzNWoN9rg+N/3vvel9773vTlgUPaj7O8gwzesTmv5lBb5g4zzxjEg7m+zzTbpox/9aL6uOD6OPY6trJNtxuf000/P9x2uP8aPseE8U575w7XG3GGuUD/lLrroorzP9ch1iH33u99NRx11VPY54ogj8jXOufjEJz6R5w/XHUJxiPMIusxpgmC0y1/mJ23R9uc+97k5wTECZiHi0sbGG2+clltuufTv//7vuQ3uSbTLOCBCMwYcB/3nXoVYfPjhh+c+c6/h+uc+8p3vfGfOPEaU5b5NEJP5TXn86Bv3aK6JOP+MJ9clIjv3ENpZfvnlsyjMvY9gAtcjx1laOfZlWld+7A+y8G/VUaZhBAwQ35n/H/7wh/P1/Mtf/jJfRxxTeR0QZDnhhBPyeHJuGLPPfvaz+fxwnriXkM79hHsFY3nmmWfm+mmXMeB+QgCK88j9+otf/GI+Z9TBPMO4RxGY47qNOcZ9adttt83+nAvaQGhnLjAPmLeMN+eT+UfQZp111kmHHnpovs8zR8nn/L373e/O93sC8oxB636PkUZ/zz333Oz79a9/PR8z55ZxWn311fOcZE6tv/76eU7/5je/mfN5x5jFOMcYss28xzcC1dRNPp9JZT9iO8oxJxn/uD4ZN8ab42Ke48MYRzn+xnZpZVrLJ9K6rPQNi+M1YCAiIiIiIiLzlDpggNiFoIEghpjEF1gMQarLRuLT5Uc6ghDt8sWfVZrvfOc7swCCYIKwxApY8rvqIY1jQHBCrEG8+f73v59+9atfZQEzxCb8+MuxIroghiCa0A6CJqIKogJ1Ug5R9JBDDskCJ/nHHXdcFtPpC3UgMiOu0s6BBx6YDV9WU8dfVvgjnCHKIAjQPgIMohyiDkJ/PA1B/QizrKxFuGEsEPJ4UiBEf4QRzhPjQrsIiUsttVQOFiAgIQDSbwQgRBDapQxCBccfZTl2BCXEpbe85S3pTW96UxZGEYAR0BGaYtzq8SaNPM4Lx4fYgwBDsIH8CACF1eXDBuVjpQ/zEuvKj/0yP9IwxoBzwDnkXHH8COOIn6x+xZdjw/DlPDMOnGOCBksuuWRaeOGF89xEdEIwRaTlXDGOCIbkEeSiHGPP+CDOIfQiUG266aZ5BTJBG1asI5AiGiLCIcYxz5iDtE0wiL4xB5kbzCMExF//+td5nlMPAhuCHGUi2Fca1wSiIEIZ9XDcnHvmE8Iu9R155JE5Dx/SmDuImoiCiy22WPrABz6QRT3aYv5w7XB9MR6LLLJIzmce0keOOcYxxhwhjzHk+BEiaYdVyczL1vkKi/PWysPq/JZ/K61lpV+Iegi1nA+OnXsKx8B1H8cW5ZhT+HI+ESA5l1y3zA+uxTPOOCPPgw022CCfa8R7VqRzPhkDBFLuQ4i5jCk+zCuuX+YB9xnmH0L62972tjzXmGO0GW0zFxhfytMHrmF8uDYJBLz97W/Pf6NehFwM8ZX7H/3lXL7//e/P1wRt8hQR55r5vfjii+f5QGCMewx1cP/huv+Xf/mXbPgRVGEuMk4cF/cz7l+Ufc973pNFYspx7+DYY55yHMxV7iEEGJhfjBl1dd2HyvNanr/a+vKwQflY+JR+bDOPGX+CIIjfXDecc66TmN8Y/edcci1y7XJeuP8zJgSJuZ/89Kc/zWPP5wHj9aEPfSgHEvDn84jrm88ozi/lEPYJ3sR1y/xkH0H+Xe96Vw6Gcp3TPvd85hgBBe73r3nNa3L7nJsTTzwxzwPqpv8f//jH8+cJAaC45hGyCYL85Cc/yfMMoR6hnTnE5xLBbz7zCBAxF+kL55885hLlOE6OCaGf+yCfc1wjBEc4To6ZPm200UZ5DCkb8yPGkL/cq7ln4cec495FHp875TmLc0Qe29THODJWBBz4fGWc+MzjPosPc60sj1EHVqeHDcrHSp/W5xjGNWDAQEREREREROYprYABgg1fqhE24kttCEstG4lP6Vfu85eVmLSLaI6YhgCOQIfoygpOhJeyfG3k8WWcbUQKRC4ELbY5LvLw4Us524hQiFEIgYh7b3jDG7Kgi4jPSkp8EFAQXxA4EdF4nQoCM+IM5clHlEMAo8+s2kTIQcBAqEF0IfCBgMex0B9WYhIsYKUnIusb3/jG/CQBIgZ58SOOCByIiZtvvnkWZhAIEVUQHhFkyUMMZEXqRz7ykfT6178+H8O//uu/ZsEa4RFhiBWiiJgcD2MYx89fxGhELgQ/+kxbHB8rP1kJinDH+DFu9Kceb+pgbOk7x0Z9vOKBfPLwCavLRlorv7ZB/nVa7Hel0TfmOX3lHCKkcQzMv7ocvojHCKcEGRhXBCbOH3kzZ85Mjz/+eBbUEOLJ53xwThG0EPoQ/xDJmEcIs7TF9ca5RhxCJGIFOSLcq1/96izqUpaxR0BESCMQgTEPEAOxZZddds6KdcoiniHEcs1wjjlGBHwEQfrFPMKYq4jDzFeEO9pD7KUunu5BqOMaINhF37kuENZoY6uttsqCMMfFMSLuca0y1xDgmAPcOxibcg4whoiNCJIRlCMtfGK8a6vPR211frkf22Van5V+5TbiJKv3uafQ7/reGHOE80VwgbHgKQrEY0RUzjflmCNxrTDG1IlojkBJPdxXWOnPPInrj0AQ55M+IIpyr+G+wpMICPHcKzDqQcRHaEfkRfykHPOKewB1Uob5hGAd54d5QnnmIH0ioEHQkGOgPY4ZIZt95hCCMNc8fWb+ci9CUGbucd+J+w0COvnMc+rl3sJTEfSPfMYwxN8YS/pD3dwfaZdrk/EirWuelGlRT73d2q9tUD4WPqUf2xwDfWRcuS9zP+H4Y57UvtyHOV8E7fj84TrmSR3ynnvuufx5Rz6BHa47jOsunjRg/BhrPrc4p4x/3MO45mmHMWR+MKdIp036w/VJgJLAJJ9n3E+4D8TnGsdBgIHPBO4xnPt48oqgPtsECcjjXsLnH21zr2B+cI+iXu4NBA6Yv9TJ8VM/84u2CToRpOC6mTFjRr6/co6Zn8xV7qHMJcrQNn4xhhh1MkbMEa4jjpd06ij9WtvMXT5rmZdcT/SZz22uGXy4FvhbWllHK62VX9sgf/YNGIiIiIiIiMg8pwwYIBrxJRwBCGGLbQSGiTC+LCMs0S4iOMIlAiur5RHXEJ4QAlplS4s+88Ub8SXEqMgrjwlxmPZY5Y0gy6pdhH2EGYRkxgPRlBWQrIpk5SMiCKsmGSOEGVYpInBQBnEHsZb6EI55FdDHPvaxLL6xuhzxDYEDMQDhB8GH1byIsbxagv4wBowF/UMkQYBENETUiJX/CJKIIwhHiDcIywQSEI0REglMIBzST4wAA8dDnSFKMA6IW4x1CIEEahByaY/+ItrwSpRY4Ut/YuzCBo136TtZrJwH9JV5FWJk7YuVQg7iF3OB4AxjxmpdVtLyl9XiCKIIcPHu9Vjtyypgzg3zB3GPc0ed0T5/OdfMIVb+YjvssEM+z5w/xFYE/de97nVZoGe+cd4Ro3nSAF/aRGTjfCLykcc8ZM4wrwkM
sHIZkRKhjEACwQvmPvXy1AR1MIcR9hGbmRMEpDguAkqIiMxFgmFsI/4yVyiHD/Uzh7g2YtwYwxhz0hD+Qvwrz8VkNo4j5jf9Jq3ud6RzzlhJzfzgnBOcQSBFkOT8cy4RUjk/+FKGucX1j8DLmHKv4XwToEEQJgDAtcg9g1X+rCynPs491yVicAQj4okRzi8iNPOBPrCanUAP54YyHAttx3HQf+rhnsfKfu4nPOWA+EsQkvNMeeYG5WP+xjEz1wguISQT9KB96iaPfpBOPvcv5hbtdY1lOd7lHCp9JpNF37iWGRv+xrGVFsfMMXHv5l7LeeS65fVRXI9c16y4R1DnHsxc4JqLQA/XDvcPAjeU5X7D5yXnmeuS+cJnAPcdAguMNe0xxyjHtb/ddtvlYBafr9xLmF/URdCHzymMfO4ZPNXCnKAOPhOpjyAB55PPaOYo9wbui9wHuAfE0yfMR+Ylx07bzFfapAxPI+FHetwTmH8ENAhw8tnJHOTeyniS3xrP0cwTzhHHwhxl3DkOxpjPOvJbbU2EcTwGDERERERERGSe0woY8AU+ggaIWC0jrys/8vrK10a7fFGmXcSICBggkiHShpAe/qNpozbaRDjgizptIHIhxLHaGpEC4Y1V1AgJCL6IGIgtlEOgoA7ap88I6ghAiMG8zgGRFgEF4QYBFqEHf0QNXl+BCIyI+NrXvjaLcYg71MPKxxAP8EWcQXxkxT8BAwQhBGpWG9N3xgwfBEjqo5+IxQg15CN8ILbQ5zhuxpEyCE6IJRwzr0RCOEZURCBGqOFd8wieiEis8mReRN+irtpGc17oW9m/YW00bY3Uom76h5jFqmfO6ytf+co8ZowVK4QR10IQRrxHqCM4xJxgPiO8cZ4R81jxzXzhPGCcZ+pnviEm85QIwSuEOsRZxDq2eSKA80FwiAAQQTTmC+eY80mAAhGPgALXDauI6Q9lWKGMaMs8pD2OJVaIIwIjKPPUBKvaEa3pD3XyxEAEB3gShadvmDPx2iWER46X4Bhzvnz6ZrTntcuiPqw89+N9/odpI/K5hrk2GWeET+YBwUACQQSUEEuZP6zM5i9PqfDqJ+4nCPncPwgYcP0hFnPPQSjmmifQg7jLuSIAxH0JwTVE6rg2mTO0y0puVq7TDueN1dzUT13kM5fpb3k9Mzcw5i5l3vGOd6RXvepV+f7DPMGX+Rb+GHOYdOrjPsY8QDimPOI3q9GZV9xfmJPcaxkvyoxmfpTnJMa9tJgjdfpIra+NkVhZR/SNY4+AAdcXTwoQIOIewrXEfQVjvnDdcU3iTz3ME+7v3AMQ5RHh+YzgHoFYz/2awAxzLJ50InDJeeUeTkCPeURbvBKIecU84lrmHBEY4nOAexbzh/ZZxU+fuWdwX+CeRv28qozgJvc2AhvcJxDh8WVORuCEY2ZOMYe4NxEwYJ4zPyJggD/G9UDAivsW9zwCUdRBXeV4xpiOxqiLz07GlfnK2HPsfFZTL/lz28Ygi/rLdhg3AwYiIiIiIiIyz+kKGETQAHFiIowvzIgGiF0I3ogRETDgdQN8iaZvrbJza9SLYIeIj8iG8MErVnhNAyICfxH+ETIQLuLLPn8pzwpahDjEPYSyWD1JsAHhkNW+iLkIh4jCIfAixLCKEoEYQS6CEdEG9bLiEREGERihEEGIV17QPuICFq8MQSxmRTACLiIz55D+xXmkTgwxl/4iXiPK0Db1IuyxshUxCIGT/HhdBcIm/aOusTgPIZIwDohI1M8qVcYfgRSxlbkQYzwvLc41IhJzhPHgtx4QQRHrOB8IahwLQh5jRBn8GWvEOEQ4RN4Q7pkTIdbix9MFnEOCVTx1gljHOWLeM2cQ8xCReS0WYiBCFwIcY0QbiMEEGygfQR7mG8Ibc4e2uZYQ8xAeKYcxxzjv5PNqLERKjonjZg7RVwJRzEECTAiP5CN0IjozJ1gVjcgZATL6Td9i3OrxHInFnOU6QthEdEcAZfU1QiLiM8cR86lVx0QYfeS6ZS4QMGS8uIY4L/QPsZZ8xoXj4BiYCzyVQbCPeYMfY4pIjFBKHcwdzi/luW8wtsy7CB5yfyCP+xfziXsBT4WQx1zg/sU4Mb/oG+PHPQqxFnGU4AZzsBw7+slnAfcvXplG0IBjYo7jh3957KRxT6D/zAdeacVqd9ohuMRcZps6IpjEeI3l+aI+jp9X4zC+9IFx4j4cTzSEX112Io1jDuP64trhXPE5wD2C80s6gQD6zHyJcnG/53OIwAvnkDHlemRMOQfMBY6Rstwz+A0CPssI3nANxTnis4pzw/gg2hN84J6EX/SHp44Q0rkHcO6pP55AIZDIZzT1c19hjhF8IGDNkzWU4bVGca4x5hpt8TQDn1WU41wxNzku+sa9juuGzx3ud2WQvp53c2PMca4nxo7PP/7X4Lh5siLy6zLjbXGOGWMDBiIiIiIiIjJPaQUM+OIeFoGDMogQVqeV+7E9rFEG0QuBhFWqiBGImIgLCBl8uUd0KP2HaafPL/Kw+MKO4MQKbn4c9KUvfWleoYnwgoCNXwhOUScCHQI7ghgr8hHYCBggfCPCI96z+pw8RBIECkQB6kGsRaDHHwGQOhBCqQ/RD3GPVcQEChB3EGkQ/RBhaBtRCcGQdhBpaIOVqqz8ph7SEW54WoKgC/1GSELcYXUpv62AP0I9wiSCEj6Il4g9iFEEIFh5zmtJqBNBC8G5Hr9yTGO7y0Iw4/h5ZzTt8773RRddNIuciGGMd/i22himnZZfpJXW5c827SN0MQdZmc1TKLwShh8LZWwQ6TlfCGGcD+Yv5RjHMMaT8UXEZ9U4xlMJzCsCSQj2CO2IZAh18Q54yhFMQBTHl1dXEajgvCEoEzhg7BD544kcxo/VwfH+es4joht180QEIjUrarmuKMu8oRwrhakXkZW+IgoiGCOiUY6gA8dOvYiLcZ/g9Vs8VYC4x9xnjnP+EDUjeBLjWY5rnV6nxX60w1MZXEe8CmqhhRbKK5+5RhibumxZR19alOuyslyf4cs8QWREEOdaZTzpK3OE+wnXOavACRISLGCcmAdcU3GuI/BHQKp8uieCPJxL6iKdecA1icDLPYt5xjmjHe4XrPrm/sn9gznK6nMEZuYB5QhKcJ5plzGO+xr3Cc47rypjfnEMPAVFnzjPHAP9QOStx4k6uDdF0AOhmiAUojIBJe5VzJ24LuoxLLfrusv9Oo+6GHvuJwQHmNNvfvOb5wTeGBN+e4Qy+Eb50lr1ltvDWquO2rifMIbcNxhjVudz7XHeCHQQRCCIinjO3I9yIZozx7h2KYOozzXNtcl9hHs79xTqIYDHk0DMR8aedhH+4zVkfLYxFxDuCdJy7rnvcl0xb3mKins/TyMwx+gDfxlP6uS8ck9irpDHnGA+Msf4XOEznPsaIjyBVoT/mN/cK+g3QTH6ymcV9yQ+u7iH4UPggD61xjHGt8/6/BhHnmZgnCNgwDFxjyefsR6mnb7+RF6ZX+6X23GeadeAgYiIiIiIiMxzugIGCD+In7E9WqN81BHbXWmsjkTUQFRDdIhV0YgGfIlG7CnrDiv
rK/cHpUU6X9SpG3GWbQRPxBQMgQ7xk3GJ9inDl3zGB5EYgQ1xHxEYIZ6V+Qg6iCGsGkXoJI16OZYQCjDqo37EeQQWVvPyOhoEFfYpSx5CGK+SQPiiXcQZVnsj1hCcQOhF3EUIivboD31AjEFkot/0F0EpxGL6i8BMGvUhTOKDMByr/lkFTZ0IO6QjOsXcqMeztK78CAIgTiEKIZIjTCJK01eEJfxCSKnLl9bVRpkWPi2/PsOfc86YEOgh+MN4IDAhLiHUc34RdTkP8U56+o1FHRwr+wRkWKXPuWJu/fM///OcV5EggCPoIsYhYhEYQsxFhI1xQgAkUIDgzFxiXiCyxStESCNwwfkJsZ726T/nH0GRuRLzErGaNIJKHBPXHOmsPKd9RESEX0Rnzj8riBEbmcPUj0CNwBaruZl/9A1xun6SIcYitssxrtMinf4zzxAbaYP5EUI5c5M+ks85Kuto1dlKG9aibFlHncb5ZV4TwCDgyXjxZAZBQAI0XEOMMdcl84V7HCIu/tzzQkTmOmTVN9cl1zLnFcGVMecegADLvIvflKAtrhfuBwSIOP8I+5TlvHE+wxCnKUv7/EAtAQqOhXFmnnEMBIAITq2zzjpzzilzneOhDoJP3AOYy5SNeR7bWFwvnCPmD/OUgA9iM/kxZuU4Rh0jsShL/xGAuU64XxPQ44fDEbMR5bmf4Rdtt+rqsmgjrJVepsV2beFLHzhPBGsYF84h54ox5nxxbUZghs8F5kWU5TiZ68wZnh5a5/fnKOYV5ZlvGPUw7pxH7jdcP1yv9INtAsgEGDinlOMewjXLuefccV0z1/DBuMYJ9NCXmB9xD+HewVM09A3jyad43RFzh7oJdDMvCTYTjGBOc8wYc4wx4FiY24wJQQ+uBe6lMWb1eIbF2JTbZVrtE/t8nhMwYJ5yffK/BmPBcXNPieBXV/lh0kZqlKVd/tfhR68NGIiIiIiIiMg8ow4Y8IWVL8wIDPyN7dFaWUdsd6Uh+iASIHwiCiKoImrzJR6hnLyy7rCyvnK/Ly22qRMBh/oR5BE3EG9ZVY1wFyuYQwio62MbIQXBDlEFwYX6aiMPn7J8CAWIOIixCFysrkRkRGzBEIQYA4RGxBSEJvpKf6inbJO/rT6wX7bNMZNW+pf9q48n8kufYawcozCOl/GkLoRpBCOeLmCFJwI2QQvax6cu27JWG5Febnf5DWscN+PA2Mf5ZD/GOMambou/HDN/WTFMQAAxjTnGKlrOb3meEf8QdpmHCP+snGZ+EJBitTSr0gmqICbSJiIs/WDMoi3mRt1+pOHH+accwiB957Ug9IlXcdEXxGT8KBPHGcY+edQddbV8WvOE9qNffWkYfUYcpZ8EKQiWEFzh+ojXY8VxxfhG2ZG0M4xF2bKOOi3SY0xiXjBfENcZa84/+zGG9Lvsezme5ZiSFnnlOGP1WNfnJPoRRlrUF2XKY4h5Hv7Rl67y0fdyO/7iQ32lb1hZptwfiVEuzj99Y57w5AafGwRbEbPPPPPMOeOMterps+hf3c+utNjusvBhXMpzHNvl/YVxq9spy5fniTnG/MLYJo/ycdyUYx+L+VG3H/Mo2sO3PH9YzMGyj3W/yvlX1t3Ki/6zTR51lH2Oerus9IntMq1OZ59+0AfuLfE0BfcWgi3c+/Dh3lOXr/cHpY3UKEvQ0YCBiIiIiIiIzHNaAQO+vPLlPb7klxaiQ73dZS2fSCvT2UYkpl1WNLKCkaABq68RSll1ycpWvuTHF+y6vtgv01p5WAgCrOSMVzewMpWV7q94xSvyqx54zQuvWGAlc6xObbUTX/bjb8vIC9/oU9SFyMPKTYRbXueBaE7wgtWaHDPjwWpkVokjIjNGUZ76+trFos26v2V+WOTXdZd/y7piHFpGfu0T9SLYsIqUYBCr81kJG686iXa76iitL38keV1+GHnl8cd2ba06ymNByOMYWdXKUwScT4RvnkzgvHOe+W0AzjlzDvGQ1busmuaJBFZN8zqet771rXm+8p54ggfMl1I4j2Mr+xP7CFL0h8AFr8Li+iJgw3znFSQ84RJBA0TuCJZRbxwLRl3lPtbyifZLi7608sIoz/EgJHItMD+YJ4wPARTSo71WXcO0Ufr0+Y80j336Rf9q8TMs+l2WJS2s9AuLeuu02sp2uix8W/0OK/3L/SjXqqOuJ8qGb1eZ0kqfPn/Soz/MB64jVt7z1Ew8vYFf2X5Ztm+/Tm/lt9JqK/PDP6wco9IiPcqWZWI/fLB6jrFd1hHlyrLRTv03/MOvTov0aCvyW/VHfllHmVf7lBZ1lXW3rOUTaXU6bZHG0zrcTwhQ85sLvDqLABO/98JnLj6MYV1fWVeZ3sor8+vtLiOf/0sMGIiIiIiIiMg8pxUw4Isrgi6GON2yMm+Qb5d1lWPFIcI4X+wRWFlNzTYCZqxEHNa62uALOn95XQWCK4I9Ai6GSIv4hJiL6IRwG1/q63qGtbIfrT4x5gglISy0bFAf6npb7YyFlfXObRtl2fL46nrnpp2+smVen98gG6ZsHF9Y6xyXRhkCZMx9RHPmJfMz5irzljzmZ91WV39ifOt5T90x77neuP64DsN/PGyYMattpP0Zpo3R9KO2QXXU+aNpb1AbWN3GSNsZto3SZ6RtYIPaabXR5x8WPnGNDSpT1ltuj8QGlavzB/m3bJgypc8w/i0bVK7OH+Q/WhvPNrivEaQnKMu9bm7+vxjGhjkW0pivBGgNGIiIiIiIiMg8pytgwBdnjC+ysV1bfPkdaV5pXX6RjpVfqlt+5XbLpy8v0kNgqr/E1+Xq/dL68sK6fOr0Lr9hrK9smdfnN4z1lS3z5qadKNsqX6Z3+ZTW5xN5dX5XWrlfWsu/zmvll2l1fpQp52aIobVvXaZOK7fL+sLKcuV2y7ryy7TR1lFay6dOG00dpbXy67SWT2mD8rGWT5nWyq9tkE/k9/n15WGD8rHw6fIbSR2tPCzyB/n12TBlu3yGKRs22jYG5ZU2jF9ffpnXVVdXeml9PmXesH4tG0nZvnq6LOoo730tn3K75TOM9ZWNdPrAk4wGDERERERERGSe0woYIEbGu4VZvTxeFm2Mpp1hy5VttPz76ugq02XD+tc+o2lnJOnYaNoYiX/YSMuMto2RlBupP9ZXZqTpY2l9/Rorm4g2sGhnPNsq25iodlr5Y2FlGxPVTit/LGxQG2PV9qB2ShttmyNpY7RWtjGadoYtM7dtjKTc3LTTSm/Z3LQx3u2UZXhdHdsGDERERERERGRSUAYMpk2bNidgEF9ieSUKxna935U3KK1lpU/t35dXprXyaivz6+2+/TKtTq+t9Kn96+1yv0wrrcyv/cr9YfLKtNpqn/Art+v9rrwyrbQyv/YblNflF+ml1T7h15Ve5sV+K63lX9oweS0r82v/2C+tL73L+nzKOlp+fXm19eXX9YTV+WWZlvX5RV7Lyvy6XMu6/KKOltU+ZbmWtcq28mor80v/2C4t/Gr/Vl5ppU/t35dXprXySqvz6+3W/qC0lrXK9OWVPq20ltX59XZrf1Bay1plhskr0+r02mqfej
v2a7/ap5VXWplf+tdl6/0yrbQyv/Yr91t5tV+ZVlvpU/qW27Ffl2nt85ffTjBgICIiIiIiIvOcVsCAR+T5Ass7ffuM9wBjXftlWp0+EivLt+qKtFbesFaX7aqrK31YG6aNQXmDrC47Hm1gdRvj0U5Ztq+evrxBVpeN/VZ9XenDWFlvbS2/Mm1YK+usreVXpo3Eynpra/mVacNaWWdtLb8ybSRW1ltby69MG9bKOmtr+ZVpI7Wy7q76+vKGtbJ8q65Ia+UNa3XZVj3h08ob1uo26roirZU3rNVlW/WETytvGKvLtuqKtDp9JFaW76urL2+Q1WX76hptG1hZ76A2uvIGWV22tc//XPwwMwGDGTNmGDAQERERERGReUdXwIAvsPwwahg/gorV+6WVebFd+pZpXenlfuSXVufF/tykxX5prby+tJZ1lQlr5fWltayv3EjTI622vjJ9eX1pZXqZVueV1srr8o/0Mq9Mq/NKa+X1pZXpZVptXWUjr9zu8h0mLfbHyq+VFvu1X+SV231+dXqZFttlWsuv9u3za6XF/iC/Mr2V1lc+tsu0lt/cpsX+IL+RpEV6uV1a7TdWabE/yG9u0mK/9ivzxiIt9rvSY39u0mK/tC7f0abFdmm1X1damV5bnV+W6cvrS2tZ6VOWKdPrvFZamV5bnV+W6cpjm/+5rrnmGgMGIiIiIiIiMu+pAwY33XRT/gE+vsSed955mqZpmqZp2jjZueeem//nMmAgIiIiIiIik4KugAEr3vgSG3bOOef8gZV5I7VBdUxEG9hEtNOXP6hs2DB+XT7DlA0bbRsjsWHqGKt2WulhY9VGXx1j0QY2qI6xaqfPxrKNvrrGqp2JaGOQjVU7fXWM5bH01TVW7UxEG9gw7cxtW8PUMbdtYMO0MbftDFPH3LaBDdNGl09fXmmD/PryB5UtbZDfoHbK/bPPPjsHDq6++up09913GzAQERERERGReQtfRmfNmpWeeOKJ9MADD6SpU6emK6+8Mn95PfPMM9Ppp5+eTjvttHTqqadmO/nkk7Odcsopo7aoq5WHTUQb2Lw+lmHbn5tjGbYNbDKMFzZW7bTSwybiWMaiDWwijmWQjWUbfXWNdTtd6WPVRp+NVTsTMcewvvk8lscy3m1gw7Qzt231tRE2Vu200sMm8lha6SOxuTmWYY9z0LGMRRvY3BwLRnn+v+L/rLPOOisv0uBHj++77768gOPpp582YCAiIiIiIiLzBr6M8qWU1WwzZ87MX1ZvueWWdNVVV+VH5M8444z85fbEE09MJ5xwQjr++ONHbZQvreUzFjbRbUxUO638sbCJaAObqHawiWpjvNuZyPHyWIa3+fFYxrutiW5jotpp5Y+FTXQbE9VOK38srGxjotpp5Q9j/D9FMIH/r1iccdlll6Ubbrghv47o4YcfTk8++WR+8tOAgYiIiIiIiMwT+DL63HPP5acMCBqwsu2hhx5K9957b7rtttty8ODGG29M119/fZoyZYqmaZqmaZo2SuP/KQIE/H/F/1m8hognPFm0EcEC/i8bSbAADBiIiIiIiIjImMAX0ggasJqNwAGPwvOl9fHHH88BBOzRRx/NxhdaTdM0TdM0beQW/0/xvxX/Z/H/Fgs2CBTwP9hInywIDBiIiIiIiIjIuBDBA03TNE3TNG38LRZvjCZQEBgwEBERERERERERERERAwYiIiIiIiIiIiIiImLAQEREREREREREREREfo8BAxERERERERERERERMWAgIiIiIiIiIiIiIiIGDERERERERERERERE5PcYMBAREREREREREREREQMGIiIiIiIiIiIiIiJiwEBERERERERERERERH6PAQMRERERERERERERETFgICIiIiIiIiIiIiIiBgxERERERERERERERCSl9P8BWXdXp3vIiMQAAAAASUVORK5CYII=) **Multi-Threading vs Multi-Processing** 
![image.png](data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABuYAAAIBCAMAAAH12MlPAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAADDUExURQAAAAAAAAAAAAAAAAAAAAAAAAAAAG4yoAAAAF4tiwAAAFQkeEkhagAAAD8eXQAAADoZUwAAADQXSgAAAC0VQigSPAAAACUQNgAAACEQMQAAAB4PLAAAABwMKBkLJAAAABYLIQAAABUJHQAAABIIGwAAABAIGA8GFgAAAA0GEwAAAAsGEQAAAAsEDwAAAAkEDQgEDAAAAAgECgAAAAYCCQAAAAUCBwAAAAMCBgMBBAAAAAIBAwAAAAEBAgAAAAEAAQAAAMlvhQMAAABAdFJOUwAIEBggKDAzODlAQEZITVBTWFlgYGZobXBzeHmAgIaHjY+Tl5mfn6WnrK+yt7m/v8XHzM/S19nf3+Xn7O/y9/lACsycAAAACXBIWXMAABcRAAAXEQHKJvM/AACMAUlEQVR4Xu2di2PVNBvGd2PgQGHygYhHAdkUEacMYTrH6P//V33v5UmapJfTbj3b6fb8YKe5N8mbN0nTtN1w7t5MWLr5wtLNF5ZuvrB086WzdBWOvfyK47oSS1d5cWKhovWJGdr5d1gltIITHNvB0DMNSu93HJdSl85KgTIKYvgmGDr5F8eL4Kf6KindoJKNoi6dJX4gP/L/QH+f/PBEf7XYx+b56VsE87Dy+6+WT7Lpbn1SblL9pz+StFaipvdIE/gxJCeWf/8w05Nf/7tbPXj5792fvg0ZUH/zW3LSvHT48dLpad2q+EF/rV34Sbx0wfO+/Q7FUviSlM5+pHTBz/7u3/1HfhV3u/vSLGr+4+7d3zxQJ0nppI9YXjrhn7vfy685JKWr7n6yEIPRFCRKT+kUHPW88vtZiusu8itSq5bUaFq66js/LJHd3bs/qUGNuezGoXHkz/XOEtCftHR/yd99zUvN3/IXwg5QhLR0+uf/v0/M8l/BQWpPzVKTX0lGUtlp5Y7Bo2npLDl3qfUuhrD/+BG9Mwf7UdlV1QO1dRFLd0m8i5iCEal8JX+9wacq3f+stU7BmFSkTq+kdOsJSzdfWLr5wtLNF5ZuvqB0hJA1o8JxNdw5h6GTic6/iWPBikqHZJPUO040wfmr1/Ij1VjZv2fuaOaN6oP+Tkml6WnK+Pd2o3puDhunat/dqPT40APsVA8vfXpPwE8mRzu/5qJ6rA7yOyWPP+rvvW07QTjZxj0xH+L0alcfO/nGiVovy2Ljoya5caZpCm83jvRQLdWMC7KwsuFkG0+tPPJvd+OeOZ3r+acr3RpwiOPNRLWfEHJz8PElRa3hb2owzPUgozxMShgXL4BG3NGjGLSQ9t/+dOJwpmYLFbxlImGD4YV452lJEiHRncNKBjn7p04b1bON6u1TLQ+COE8RQv5vipvOovSwBe9OdCqkKek/teo8CBZPzc07G2/ksCN/l+m7NUsbr9wof5KoVqyfz86jrsFsf+99KPSswF9+d+VXMzMUT1B+tLR16nCWjGhd7sqfneUyeHz9lUSbpYtm+0tmK8HJf18NLt22TV7l375G1Ozb/NLPsmnFkYxYBYrxjka5DJqO5SyWbqNa2LnE8GajOhHvEERLdyYu6onzC+a9s1Ftu3VCbsrcjxByNXinhJ7sSkA3GLn4XGE54Vx6LM/bwoAgDXSZAYb6tz5q6fY2Ni6/pqLIpMaP+t8GlGrj3AeW/ZONjzK4q2VzU352dnQZSf/tiHDt7PrjTmZdho1h8qdjlUeU300Z3+K/ox0rnRjhomP2oQxDGnQ8sSFoWn7qYNbjli3o+Bh3Yg1WfWCKUeTPIyxD1xYlpJTurUZJo8s/KzxKF+0CgkmlXxSZkfo59MeOGzJt0KMvVwlaJvcRkx62Nt7Jrzq58zA0rKTk51OrLhGFJPY3vWUGu+FBfHHywrRlsM3tuvgax4uh8+cS0e+1ISwlE0LI9YAtAjcOlm6+sHTzhaWbLyzdfBlUuq49//ZgyURM8zBDQSydp97+sF/XmcX9wpnCgzUxfmHtZfBJ69JZlObDfnUZmww+Swv+4Fz98JAY9GGuaalL1/Gw38snT8KDR9Uf/pTfX7+Z1bJW3f3XPB8dfPtFIg3HYrU8zQifb/7UQ3Uf1viDJwzv/qU/d7/crR5pYl3UpdPgLQ/7QXZ/wBEHKYmfRUon1k9uHEOa6eyJuJf26//lT3/t53d7sDBa4efGLrLStT3sh9LJQe2VYL8aQF28dANOVCKh259m/NEM8itm+dNfO+0//mChPmFobvLrD9L1kJTO/hoP++WlM9JgSenGoXFbn2aUcknLKEqnP+H4N6wDTpqWTgM3HvbDOYPszBvHX+zXS/d3kOdQNHRIKHuaURqQ/MqJ5fBzeIwXY4+aXSPM8mjZSWPpLomexZ5svDRaroHoSXuLN2Hp+qtxMONK13/SqUq3nrB084Wlmy8s3Xxh6eaLl44Qct0UuzeWbqm+HFe2VwQnyg+r2gp3hmPN6sppm5XkT/elVI/tPNXGuW5sX9h+ooW6TIamv/1czvYC1gqbs6oT31z0cf9cfvX8706n2PZsBdo4s9Lpnz9UqGfS/WfuOxmeXKWbxPQkeha16tN/enwRdm1VlW9fm2gPtqT5Qn5O3WxnDfvwpuZOSPOt/KlR/qQifduYls731xs3ZIc5SnlDudmlI+T2IQNPOTGR8eZQ1Lxawe7TqvM59o5+pUqfs8vCLOuItjSA/O28qDZeuVl+bTQ9lNHv1Z4WvZKpxPnGHQ2GEfiCaFR7/C4mGtI72pGz+256+48gWg9iqfY3Nt5ti+HekT7fqA/LVW+l+i1qDx4gyE4tfjot3cb21xt74WFDfTCv2nhvpgvj6dfnCf+0dHAWMDvTP51eiKnSGaHNouTPf+W/7kofhM33YkTBSidF3wsPG1YbT+Vvgkf6dXezn6f+l5buVVo6nayIScJUUjp3t1B60Gf2BiBRNcVzE6Q+TegtU728Zdq/j+IrboNS7EPPEhPFvwqls3P74wAWxKZiGkEmcC8ql6QGkemcP4M4NdOnSAi5wex0LjXknUk9hbnY0kSM/7gcZFbZDdpQ00r7WS82/mVptZROe/r2843EhgxLSQcSGXVsxPYz2jls2HGT/PPn/Mxhoes7ulLwwdYKBoFgEv4OHsVDYo4+wqiW6qnb38G7/jeWeuqgPz6QhvrUtY7gqSOrpx9NUnBf5dG/QWD6rPWCOCEJQaaQcPdlJV9VCi4XX2Sx5DWZD1o6mVnWrSUkvhvKpM/5qcm99N9zvJdiOBJLp3Uwyr/N8NygWsKvvqnCvd3eshJ6WXThal3w+iCEELI6sFGAzANIzYATmQeQmgEnMg8gNQNOZB5AagacyDyA1Aw4kXkAqRlwIvMAUjPgROYBpGbAicwDSM0wh68qfZoXfNN8Zi91qf6EQRn7TGFlT5KOjXV5LlzAy/Ht5CWF1AxzqCp7mt74rUrfh+BYDtLSP0GexmbNRHcNjC7gugKpGeYgelDhieMH
YtSSfeVS+dELZYUXfhCDPj5tRA8NouZf3fizOOEzrAimfLI3KajoPqnTpy+SdqxPDaefVl4VYwuoDn9VlX5PV8wPxNM/dIujoN8r9kLWxb0vJjtLNCSJq0ktxri3RiRAaoY5aMJIW01aMvQq+pYJ+KFRVvrEdNQ697GcIHdf/IAAgvm6618mun/V/K/XS2UfpLUuqo4xPaMLWOn7M+Twi5m/U7PIXLGG972bf8uKazJXogHp+q9WA4w/6e+FgNQMc9AUvRXqzxjR2a/bMidkUjlWU3BNROcOtSeCrITRBbSfu3d/suBujj3J3/LnRgEmPcG3Fb53HQ1I14xWST5gRNUdD6RmmINV4J/y46eaQnSKCW2I6Ayzr4jRBYS4rDsNonPvdtEFYjHcUP94JembPqpaKccDqRnmgPNYVy9HLRlOmOYAJdADyp3mCuY682L8n/2qg7ve79O6VYJiDC+gO9819YH7f+5k0kQ3KXqIgJEqfJ7fDObpIVzrrEYuAaRmmEPIqM8avGRa0CqTitj1A+ZWkGP1DB6Z6CxchZ5B+MF99SUvMkFoF51VKFxWwugCVj8+MG9zdtGZP5zcbGV0VzV+UYOO3dHg6XoUF53NCl4GBR4PpGbA6ToJ1RFqZR2I4poUzG5C27kAkJoBp2tF59JV9Stsa8FqRKfNVIiXmOOB1Aw4kXkAqRlwIvMAUjPgROYBpGbAicwDSM2AE5kHkJoBJzIPIDUDTmQeQGoGnMg8gNQIIYQQQubDWWXvBwCp0YHNGPrp/jUhz35r3vPyLWUtym/vBBLw6ekt/Xa0GKuNw8rfhqSo32Zl7xDb0kKq6I4Wbl5fKnzDN3wP+7D6uLGxfVQdbeurmrR01UblX8Kt9s+1Ht5t7NoLOs+lIcvh7dYbOTysKn31BMq/8V6jaPXoKwuuE2mP+r1nN0ru9RvUJjrJrZkNGA4ruO1UCznW/mtJrWrbqHUviPxV+v4JPZqjBoDXe/2B+cRVUf/Qy0j5d92k1YO3sK4Hki/LmvxZDj2bghj0vXcqL7Hoe2vvud96q12CZLR67iWSv6iPflApm6ArfQOJHuXw7jxa9Qflt1f+uePJurxc9sQzr6U4kWP1zq0OPJ5L+YVta4XVkZvXH82nZPjcD6EwJirz17fxCEdSukNx3qq9VXR7bkb5zUt+xWFtREcEk5Ue/UAIIYSQdWLJOslaXcuslnpBIqHVsZ/uCp1qMoQLtWGiCy8Ij5e6640tfNQvjbwgPTUjl++dVdcdrb+qlyJXKVgm8uWeHZkf60dxpKB69elXNLoKJIcj8Q2is/PKr4gueIc/XWjwgvg1kqdw9fjlmBqwXqQfBLBr1vC3Kbl8JU5nG0/VqkG1jF4D8v/cDvvqXv9lqfhbbv0rQuqiBnWXK3o9mNXtH87l8HWamPlehiNfxDMkMWs68vewvvS0S2+Ya9E9tHyK6DJvKYoEUaNeI/llkgTUt9RfPefaFoVQR54tPyKnobjRuXaCi1oEDR7d4xFX5Vj/lcjbUtiFRQhVYj9qDm7CTnih/zTYGc7TjFu7CV562BftiqKTPOjCkmudogdUyL6YdeHIRXd6ihDXBj4PoTVnn4RQW4vo3mdO+l/indm7jVPRyV9IJSyoWOPwn9fyi4VOCxKj+ft0kZiGsPBrzgyySFqZ9uNxhBBCCLl94LojQyfK0U2nymEdbL8lMOiZT+qtypSnuhbTE34qQtHq88fvVIiLX8LUFN+wwPWQJKK7H/Roh+un3hnkR92fUn0QixU2iM4seo3tVSBmvaT0KBJ0B84eTA4i5J1Kd+bo9bqHcTepPXWRs0oKavQPt4jBvngm7voFvWl4h7zq6fXn0IpzVlV3nuqZLVPqZbfGEaracD/NkUUSguj2bLnJouhuFwsuGdZyaKaToloUi79K5DxY6hCTbtl4EYshPxCd50Jbp5sUvyL3/9psq7gcJH+n7y2qB9cQiZv9COrsEbNgZp4Er0oz6c8LK459WjI5k/ykucCfLY8Epyg6c3lov76dzP1rY3r8iJ12V4GdUxdKdL8JMtAjOjFvibReizKh7M8QTP5OTpqig5stmcJvtaKr8fR1xVINH9IzqbkputCzmrUW3UZl67ryH3cQzN//9BCP8uvHq0G1Ts6oumSf1NY2C9GpZcuKKN1NDGzdh/mi7NoZmiGKzsK5J9w+mIv0SrbRUZuJH23tPQSdGj2nVracf6GnNxc/k/ygrOJb6dqf2TblB+bHiejcX/6kb7KhBVYEDUYZxf1wHQw98/XlcL2p7sFACCGEEHKrwIsByDyA1BS4kHkAqSlwIfMAUlPgQuYBpKbAhcwDSE2BC5kHkJoCFzIPIDUFLmQeQGoKXMg8gNQUuJB5AKkpcCHzAFJT4ELmAaSmwIXMA0hNgQuZB5CaAhcyDyA1BS5kHkBqClzIPIDUFLiQeQCpKXDpp/NToqO/MbpO3xJMuMy3Uq8USE1xh3/STxYeh0+LRtKP7mVfsx/9NT6LfA3iu2gBL4l/ontCIDXFHar4uVH9PGoj614w/7j77+aLD71fUHL4IvQVMrqA09A80yWB1BR3qBI9SM3ACuafngXpJ5/HMHVJhjK6gGsKpKa4gxQmfBP1+5squXEFXFMgNcUdqru/hWxX//OCVY/caj9SMH1YxZRf/tz8g3kcV9UvGkb7wMq/Zy9+/n3nu7+KEV/A/0qMFrn+ERf71LN/E9v8V8XIAtpPzFCl5VDD7+IU5jJiRCFr0+fKv1IdDRZaKiZWkjQPwy3jgdQUd9B8uikaK+/13SwFe/KyevJE3cTl0ZPqBzf/WP139xFy+U31pzbbb+2r+B5PVPM7Nx6IBA+qn93Zfr6qXkp1WvVpkEqSV/Nq0BMUxr4Cys9fkvdP1bdmluMfepSGdoBa168ZmzErrshM66I2qC5LxcRK+iyp/iKVp+aLAKkp7iBnCjnSfJihKFjsTNwFvaUpzVfuVH2nB8w/kJwAX4uAEtZG+3i+z+bqGCtA0h5ZQPs9Tsz46vvd6jf9sUmjljQprgcTaoNJLq0k/fH2eyEgNcUdNClPLpqGSe4nP9iP+/+aBjPUlLnWP374bOZf4LISNG1PP5qWFNCMTbNcYOgR5rxg0bE2mOTSSqqNFwJSU9xBk7LO2nTAzzFIcuqRBpJDGsxQU+Za//jhDzNbjawKTXtcAc2IqjdjkJEqT33lkBYsmBODRU8rqTZeCEhNcQdN6gf9qZO/uORAbTGDWkPI+BPM2p2EMXwl6GnGFdCMd21+5eZ/4aTWaM6L+0UH9tRg6WaVJH11LdjxQGqKO3iq36LH8p9L6pyCqab+ZZHjjx902olp6arwU44poBndw80+5glyjOZSCr+aaKLBo6eVZJj9QkBqijtYWtKRYLi1n5f6C3OzYJ2S80HLeOlG/Q2uHjL++MFncCvFzjaqgGZMzY/gpJcX9QpZUlzDJyLBYOkmlfRXEXo0kJriDp6gTNT9aD/2+7eb9dzfuCd8kZs6U8GMtBSMBu7rxtrsDnZYrboZOP2IAv6lvz62BXe/lPO5pbUABZ6RaFeDpZt
U0qcy9FggNcUdPMGq+t2P/itX2nJFaUY7d9Az+0GYOlPBDK/v9VdnVbies9/Kx4f6J5gNXLKvBD/bqAJKc7rvwvLg0hA1lEf4XN3H7L4uros0NXi6WSUZF2+pkJriDpZq3ugkX5VcQfr57Ny60KEWNWsWRCxZptzsKw2Vluzu/9RQR4irF/HHD66b93G9tBL8bCMKKNm2DLvZseKg1v9Ss5nq4ppBnWqDpptVkhv9cAEgNQUu1wrKYWq6LqwmL9XPOPphPJCaApdrBeW49PA9JSuSXHEcDaSmwOVa8c7Dr8fXhdVk5osNpY/CpcN4IDUFLteLXLhiYFgbVpSbX6ykcWI6GkhNgQuZB5CaAhcyDyA1BS5kHkBqClzIPIDUFLiQeQCpKXAh8wBSU+BC5gGkpsCFzANITYELmQeQmgIXMg8gNQUuZB5AagpcyDyA1BS4kHkAqSlwIfMAUlPgQuYBpKbAhcwDSE2BC5kHkBohhBBCCCGEEELWCXw5vYtZf9jYviF9OZZUzxWju95hrL+4r9h++PANdGdeknss2dcvnzv2AfOSh9UWTINIq+fa0e+dy3+QZ+2ozOi8JKe5fepGoTXvVfh6/zDWQXIxxzjq4fUzzVqigSa5w+qjm6US1C5/Zl5bzkMBcNystHUeVUdH+qn9c3F5u/UGfviw//nGmR7fbexW+gF/cX22sXFavVU/lH9b60WrRxO4TqQjccNptakHtUmud8T5A3xcclLqF3q4s/HaQkn34uZ15TQWDUep6uqh5V1dVG2qj6f75nV4tlGpSdx31f30xCJV1c5zjaPiRPmlpcsgp9WDVK8LaYchA2/MZH8qOZiN0FvKwU2SbxkXov9aIiULGYymQ5TA/vbqypfjXnQXqb13s1lP4e4HN+XVcz2cVfdg2tj4iJxFyX2tY7sYTHJvzbxnDu7s5rUlzV3ItRyCSSYtcd5ibvgTzTL3EHRHfxah/Oq0HpLL0B5QD206Vz2Hw2O1bFSvgnkeWCe3sbC8oyBygOTOTJJv3P3deS45nbbJeI7ye8z1k5z815GgVXJydAe3RPPao/P9zUp0pixIlJxZtcjVrvzdMw1VQau7jfYw60Fa7M56SU5b3YeNjTtyWNgMxayOFvhcXTyYZVoy7+a1R8dynX7Jr86kNcu7nvVMciaX4C6HMJCrk0yhUX4v9Nrp3K0HwggSJbOBkpsrQXJrfJlKCCGEEEIIIeSGsWQx7hat1dVbOxJaHXvpXo+p9Lb7BOz7iuowydnaa3Jcc7RoY3Pa2Jii9dy1W0VXqDvOsHrJ4bxLChhCeREqvfW//ryfJJc9iWx7bbTuTFmZ5KrqBQx671cO4qI5kCb0Vm/l6BaMI7/hIw0r7DOUUPXBvXf01sjGPW95Ets2Adyr7Ha6h75qohLEs0u29vOcanF1z4kUOWxnQA34jRwN9Mp+5Sfc3UEq4i42RW/K6sHP5IHvya9tV1F9NGc7eGJmu5zktpCQ9pa2D1GtIkTbTqPZE5l80K03X5s9BJYMicAPT9UN3lKSjZPqbOOpBpNYKjoNrlskujqZlaJ1A5MfrHFWm3lOw99mLTn985un4ropxdT6Ni93DKkcxi0B8aDtfCGV4xuV/B6fGcy+KX8hMb0Felmdk3ScKmxYONUNNGJM7v1Kt2gNC4HlkP5X7xgUTo3jVfO+lNxDPUofUObURPEm0TmrAQmV7NWAuzg2UskO+n/T7NVjbETS3hLRsnCX7S23kx2/4Qb+ibhpiw2SM6yA2bm1Cel/9y7roz765par59U7GHB61X6VT5lTk9xRJjmpAQl1okaVuJbP3MWxkUp2wH9lH0JzycmQImSJXVJyGahkybdut0h1Tg+LxGxNetsMcCrrIx4fV6914LxGqjt20N5x402n5IKzHlxyz9SIMH4Qx0YqerCj7lh5U73QwcNcM8mFAUMP8meHiSSHvlCTVMnpRuYguTP19y2kahfM2X6idyyJOmm3q7I3nxDp2ngqHdiGKCByl+VUDy45s8LJJWfGXf19pxaZcwRH+YupKOqHCaY5+07IVHIhqBwssX3psiaSnGqxbrMRo+RbrSeQnHlJs5Vfz52HshmNGcw7rQ+LDaNa7HCNaBcl+dF9QzLTSnNqB0juo5dPnSC55xLB9g7ty0Fim2OZimKblMzkB7dmkkMQJKYhHk/YW64G35ZP5oe1QjJDxq/1EUIIIYQQQgght5RqY0Pv7Th6r2A5tp7XQfY2HFCnv3pazr+3dI1n7ZeAbBMMdjZEJNf17gqxPA3rIb582kaf5EovXXJt370xKZrZ8LoFJ6716s2XIlflOnDwRoHfdhX82tAXTjQo3WzhW7jXnfs+yZVcUR34Q/r2HhqQnbjIRZmpYLfdJ3JorahrIOoZMqS/ejfjhd/Y0Zzqiyb0Jlxljdd6NxeP36E/2qi2Kt15oXcw9O1FElrdrZFKhGS3jTvIr74MAWZNXx9OV299Ol2OExHTMsnZyf1mi5/YdkdpQTT7kgncOTa/lk1TG5XuAZD4VlESCFuE9Na5BRGjFVVSVXu4IbYq6h1EkNy5nFe3A70NexukUJoJvHgIOifZ1DcWaRR11T+LKoews8bStVtf9W4bdbJfdfYULUl1UW/8TYOUDGmlknMn/bXdUfG0m/Gev/q1bJoSs5qqTa2ouEVI/3bt5nIsqu6e0kbfNqhPSSI5N9oeStMpzZb8P7Qb5LAHyWmjFCvkKXYcsqiulMluGz1o8ZT4ahJJ/7kedzx1CzMFyQ6iVHLe5cmfZcokJ8fT80xy+ldsmpKj7kuprIlbmMeWYX1zmDTtrKh671SPKyXuIILO+RDmVb2vp9eaNQ9YIia6+IIbsyiplMrdNnqwd6RJB+PR5Q/p63FSySU7iBLJSbZFGdRgmdIfPepWu1RyhumR2RXzOn3rklN8i5C2QPHIiir2npnA1ATJVdqtJVWtNfrMfcxSY/4I6BYllVK52wYHc091TnVaO51pJRcxydnuSre6wTKlP3o8+Wg7SIOf/ukBwjTUS/eHBZ1Tp1pyWVF1D/io92VeCkjuXIqwkPN/nYxz2hNIJyO/uvlF0YNO509PzeR/6nCWSSn4JG5Wi/bSPrdr+tq52NlWKDnbx+Vp65mwG8h+fPblG1jh17JpSoPYi85McvUWIUjO0pY/l1wd6wp4ph3ACTJgryDyl/Bo7vT9O2aRHG1r4G1x0O3nviHO46inDAyZ5IrdNvpfkBFN0hOLTDEt/Y9qF79VSU4xg5l9L5FnyiQnk2HtWDWQWNXPouSbpixj+mdNXP3lN5FcWlT1t8PVg/Mv57oyuO5cW70MlRx2AZOS66sXmzsvR0dw0mTXxhRCCCGEEEIIIYQQsh7gA/+EkMmBkhXAkxAyOVCyAngSQiYHSlYAT0LI5EDJCuBJCJkcKFkBPAkhkwMlK4AnIWRyoGQF8CSETA6UrACehJDJgZIVwJMQMjlQsgJ4EkImB0pWAE9CyORAyQrgSQiZHChZATwJIZMDJSuAJyFkcqBkBfAkhEwOlKwAnoSQyYGSFcCTEDI5ULICeE5AVf0I01
DGxxhJVf0LExnAyuVx64CSFcBT+UdfUfwdLClfqcdfsHTSJbE/JTKMBVeoc5KHb2G8wVy5BK+VNc1WCpSsAJ6KiqWqHsCWUFW/VtUxLJ3UEntSVd/AqBz8+hVMhccV6ty3f/0E001m9RJcK9Y0WwlQsgJ4KtptfGnpOqrqk/xdXGI116dzt4PVS5CMAkpWAE/Fhmr7yYAzdW79Wb0EySigZAXwVFxYVfXFreA/uAaJfVNVT2AUDmoBQ2KSSuTX5R4/wyVtKCrwf1NHOWfgH3cRPsFFSeYYv8BJL2xqnZPkUpMMBk6i9N/BCfwO5zkh2bbfFUnQTHfv/gi/qvoBrupeiEwvtgLpDBBOQuwOW6X7B1yq6o9OlzRb37S2pLuf4QTgemVAyQrgqXiepAb+dLshdaeH4RITOoezhoe0fSxtSOXEZKtKkk0CSsaCn7QfXJj9U4cXx3AGXSyAiMW1Q+ei87didJOd5L6b5JLoFzfNDS/OKiUos9Soat9ntZeJTKrwM4xqjsFE3uG6WvpGN4h3U7p1nPvf+7HpIk5ZtpotSXIYtDjJxBUCJSuAp4JsSX8RinL37ktkdUU6l1iTShFj2miq6j+YBOlkH8FYU8fNwoocWnUu9rB3H1XVz26SluEGQeLBNDNWL0E5A3omJa33VGSiO8XYVhgSWqUrWSyWR5ouGjXJVltLStOuy3+FQMkK4KmEvEqTC0tfMhB477EinQseglQ3TLm72DJBtcntOLhJH+cGp0PnYFJiamlM6YLTbM6H1UswF83v8TyZu5wAPZkjs3azS6phiIp0SFeOiW4rTZfkpNnp65YU5sZKcaKrAUpWAE8l5ioaZKb20k2r1znxgyl3F1sJPIQDaV2O26PBGaFz0i6ifIpU5kOzHqaWYC4aCXbgpsw9doEBjDcNd0FyWuDudlkeLtyMpktHtuqWFFOzGXZ99XllQMkK4KnUOQymesJwvToHU44oT/X54IkSJ4NF2BE6pyWxy7hHcs2Rr0HMh7o0wTS1BHPRyCDapnN5FQu4umu4C4XEMv4Wz/xOfu7Ska2sJfkw/6sc2/YKrBooWQE8FckXTGLUZpdMh2uJiTGZIFyfzqXOsQMtwo7QOS3rb2IT/ucuMySWRo0rkWAuGqmwtrmltPBsTHkA1ayD1yRZbqUpj9qlI1uxJcl5H3zrK5fp+sAVAiUrgKciWYPp7v80l4ldzInEPsEkJC24LrcUOulIh3gInTr3c73WlSBzhfqaOuqcNKBE2o9G6VzjEn1+rF6CWVSxhbi5yNLzCkklZ+5Ku3QTmnGS5OpsJaePLenztU9YoGQF8FTS0ulgnBY2kZg08NBNirEOlJQ7WSkWhnj06JxOKJpbBeX6K97MSXKRLEaLxo3QOXGWMh4fHxwc3IhxbkUS1OAhHb0vA2MaVZABpu7C7tfB9BYojKoRdmiVLmasgkSxlJsuebaS08eWZLtMP5tQr2NiKUDJCuCppBVid7dgVBKJ2ZVP9csvOmjfb9eUnzTE8RfcVx7i0adz6iD8e2wNJART+VV/H6jbP0lcvWlUHf+mKvS3xBt+PScD5984gQLXeZHleyUSFJPdEv/LqqpWhFJkelNOErDrr3DjTDH7f17PQQ/MkkvXTP8e2y1Wn+Q2XdKTZqevS6RKd2wbv42r3+YAJSuAp/LNQToLf1lXqPDjQTYT/+G4+s9LcBBXhX9OI9z/7fjzX7iRNszj7kG05e7Ot3/9K11W7vFEJHtsc6CDdH7ySKR3bHL+9iCIpy5bXsrvDjwv0jPXy8qlYs6G1UvQG7fU8Ofj9IKvTWT3f1c5NCeOP6pzrqFt0pUMVp8O0tsDDZf6pPnpQ0uSSY8bjG+Si9urAkpWAM/bjvSCMBkyWmZNioByPFtnkhHcKGR8FUDJCuB525EuMelApUtML2dIZE46JzPLev+mzV0aq6arBkpWAM9bD7ZGf8ZN9nLLA3HmpHM2sikyJ1X+husVAiUrgCdRfjz467frWuKaB38cz+2WypODg+OD7LrwCoGSFcCTEDI5ULICeBJCJgdKVgBPQsjkQMkK4EkImRwoWQE8CSGTAyUrgCchZHKgZAXwJIRMDpSsAJ6EkMmBkhXAkxAyOVCyAngSQiYHSlYAT0LI5EDJCuBJCJkcKFkBPAkhkwMlK4AnIWRyoGQF8CSETA6UrACehJDJgZIRQgghhBBCCCGEEEIIIYQQQgghhBBy5WwfnlXvdmHJ2KkqmDIWKZtwbNARmUzD86Pq5DnMObv24n5Y+nlxtIBpBbABdHFeVYe7d1601k+fzp1UlR2pc9fAtijVw409kQEcUqpqG6ZliOyrI5inhw2gg8d9FdNXa0fLapRVvjqq6hVMLQyu9+fVuQRuneFMARuA8a457aiqDzAJe8HT+z+ptS2LUR26c0LQuVMPIOJTXrvlhRhR5WeSRuZOxmJVl80jvVIDqQgewvxMzBJKcclU1U45rKmA6j5XDB89wk6rNUbXo/Ja3Qv5o4VZonXraZ8D3w5EpRRUD1CXU8wQmzrndpGeHROKce7cIuwnjqZzIgJtHKk7GYkKKAgi8FxdVK8SXASmLYponx3l8k4PEj7pXJWq+lp+o2glgHWJd3zoa1o9uiTnjUWUzY4OTl5V78z6UP5i62nk9FbhXVTjEuyNuqqhqXNuFbuozp4Bl0LnFm6VdKxbFCSydKLojhN3MhJfE/kIW+SeuvpY40AEUtc4YNboytOYq3yoTuwo1+U4IoD0kz3Wqtp3g5hCWxD85DKdcasSW88JzkRyZB4hvz06d29j58iAFgWdsx736Chezh+qXbVaI8sI6q6JO5kMmbq9lUMhgnh4YYtcglZ7qXOY8xgmlhjA1afLWq/QuC5nJ/+AcxvUuSXIFEJ+YzUVOid9qhtqoHOiSmYNnazi0exXvO+4oxCTIxNh1wmlCOIhHQTFmuuc9qKOd7d1AJ8fdlt9Bqtrp/Kbn/xr2IwobupcTlW90YN0UI/dqrNB0aegc9W52GV6oRP0DOicXDXopFEvt+XgibkXqtyaRepOLo3M2fVSTNdL/JCIQGvcDipLO947c1umc1WiBz5xleA6y9RlD3csrIi+CQc5nQ6PxcmlHem121uddlLnuti1haezMOuzlab9jWfWuW2Jvthlg+ljjlwDRkN1vr3hKzN2Zeg1LCnZ0XrAxJ1MgNUnLsQKEYh62EF4b6HemmzPs0XjsMBiiKQkiGqVRoCUCmsaXRWsOgktJj+558y62NAAJJn3biCEJDQGwsxKCJka6hwhhBBCCCGEEEIIIYQQQggh5MazFza+Xojk2Q5CVsL2RRrZujXMLX+ozp6KutzO/7jl3NB3dhSPd6UPN5NLYg/RtG6FnZQXg5/AGh4SbOIZ81EbXi6yPyZvmNcPdrHumbJNqXP2wFz92o7n1Vn2kCO5FFK3tum49302k7C0n4w5GNejvg07659JWcxlEDPUuZ2i2afPvQWd291L+qv4YLgT49/ZSx/8vqPODZ0Tl/D0nD6fl+hcHltP6cedvVRwe3v5G6wuN/OdMVmtKP6wasZmIamOO
u1yLthJ2wAIMZXkZC0a08hLM7n01Q5dD3kVqZj4Xecyn7wx5ZFaGuYVo0NP+hSpIPa6YYvOnUgGNZBaZfpy8krfqmevwnBP85Hft/v6uhl1t0c+LE5D5/AuDkENQeckYB1bUpUBUKO/Ev2Pp7be7+S1zj7MWlX2UNGRGPBGm+GvlJs5T7Xg4VEZR+wwAQ3xRivPZvMdddrh3Hw5QNoGxKoHPcO5Ooqips3CXxxhT+AlIfO8pMkFtjLxWVLxtSnmpSdZLCS/9oaJKH4xSVpn+jiRPkKo9qQx5Q22o2FeLagge4YxYg/p4yVEUkE+cueXYrH2IfhQX6/MQyYJZpNqLHUuPkZsTzZC5/LYkqqdSmKHx7/8qUePqFcJepA8emfmjyerw/g5xjxR+QiwGWXh40tKKntUraNOO5ybOpe2AbG6xZt4SpSEHeqQZV7amlQ+v7LihXzY43cfwls9XO4SAGNZqAk8AZg3poBbOhrmFWPX3VmP6eiDhlpTqLb00d69xULqRwscPUWR/NUo0oeINankhs6Jm9ZdeA2O6lwRO6YaatVTqeehvkxQtzJPt/0luDcTFVp8k4IROnIg4oNJZgfy21GnHc5NnXOrtwG3+ouIEupmIdHMBSE785I/Ld6ic3ibivuIwoZGooKvxR9NFqxoTEqds46GuTZ4SYsKklnNRx0AvQjR01t9JHaBedEQSueC773TijqnlkBMNW8I9btO/XHoutI9QJ7IbUOmXDAZcVKgEpPfjjrtcB6ic/UZhLxZyEF/myGLvOQ6J7GewmQDkb37w7pmD9+4AurUObc5ec46Guba4G+gLCooLO57EaLnZjYX1xf4uSEvGqpD6jOMSa5zReyYat4QvOcTvLcUe9Q5Md9peUHLrSJWjyF16HUn7rpS3FGnHc5yMFsZKtU5qXCcQcibRYzuIeW3PS+FzsnIGVKUFuFeYpBYNjgV07EunSsaU56zjoZ57ei05ehVWKooKkguRp/ru5y8CNHTbr292ttbnHr5xfb2a+nfXmVFC12QjKCx6DZfzGPHVIMMUEFSYaf7T/UqGK61zkkDyOV3C9FVg+pIZmNWP1ojr/b0Zdk2eHTUaVdVi+hOFvJzmIdKdc7ed/RuIeKQOW7eLCSVavFCAiFkV14KnVMPOf/C1jg8O95WEEqa5Mfne3uYqnbpXNGYmjlraZjrwLPFi3DtFFfjN7G2e2fxRrqdLfdPl+q3Fm8XmC0L9xYv1C9fEA4ruHHnwL14mjR2SDX44mTC4xev41u3s+0HclkK063m68WzWiI7i6NFnI511GlnVW8vXus0zOVXtIEo9YcvXoTJRdYsJMjC1CuGbM1LaFIpjyVg3YqE7XSl/+HiTUimFn80xZBZUyxy1towyXikb2MdEnKF1C+ZI4QQQgghhBBCCCGEEEIIIYQQQiYhbu1ZLVd0mlvMhV7jU3O52MQ3rzpLNndgs6oENGtCi5OTvt5oKHFP7DgGvwCnM7OzwupViB9iHFR+BEr3i48HG6MjKmMYDbHGj8iTVoonkLoJyhCea0twp4YosUfcX280lAvq3LIX4NSZy57Lmit4RMM6THNoKX9LRYYnXXp1bokASp2TzjqNUVX1U6vEKN9BtFTn4otnmsqwl6tfKcpMFG1kb4uJr8axWNlrkOpMD3i3Tcpm/XKatnZWvuKoeH3NGtHyDqKQ1Y9lLW+HKsk8UrlbXTSqMpLGW/qmKctK/Zyq2PbTFlWeJOQtcy+z0mijs0a7xeIJ3EznpN9UPYpP1evbY/QRNq0p6Bwarz72dITH7szJp5FpYnq22KjzVweJnMoXyYQX3OCsYrGk3IrT9LzbxnNnL6pRxCy//nSWmJPMIf+NVxzlGVonnmrOizdq1Dqn5ZTfKJ1q8exIipi/GwhVGANJtVh960fjxYqrCvVN4mliS980paH25M9S0tcrHMUWVb5FSBrXiU1E5U8f/PO2oQmWUpUwr5GIntYNM6XtHURSQ45XvJb91ENI+e1b7Ch/FJi2WWmsanHQjHFIkHYcXm+UvzpInM0mYtBD2tLlNJ5MfAUOWlvydouWd9sgd0oupSOf4MbMuSH2zHiGvcjQWqGVGPIXSHTOm7iXHw/VO8FcVyEqKSYWhJHoXH2Qo2tF75umNJhkBc/t2/mhc823CJlNtM4WyeLz/6VUo7hNbncQbb60vIMoG+cEDeEyqJuxdVVRYC6nZHUR7Tk264zweiM/T2ziCGrW7AU38azZY8pL322TZtZ7CmF3b6EjnRqTM3r+Q5stXnEUk1kjVCT5O4iS/Gc6p31hDBmKUlcNTLGwHrVD58Qzfb1PohwtOuddqv+iRXW8RShecdixU6rqr2pcvxfnBoEaiqh8vVLTZiw9Xiaw4ToXa9mSRZIxqNmTy4HkrIUu1YFa320TrJvxQX8ph2pTPHs8o+c/iNJbXJ6h9SfROZkAym+sDpuK2uuzg0vtBVMs7DKdsyPofKGPZ8UGQ48RdS67hgknzXWuU6ownhYnuxnkVWvvdBGhqVkK7dXcMrdEEAfVeWazgyb+eiMNJhMFf7o/b+LiXF+px1ovdE4MUZla3m0Da71ug/EtuMTMFfmPU5w0Q+tP1DkRmM1bYnUooUxmSbxgSuZveq/h3C7e6paAQ/F6H5kTuqFd58RfQph0o85lE6pQw7nOSY7apSpIOl8n1huElAw8iy3Q26uUX679FtKDmWMUmNWdOJ4uNG7tJFeLHxYf6xU2jV6/3shUy9RDKJq4LnmHF9zEWi/njP3vtnGrJOHzGbvuONvbttJpqJg5nFoaSOsrjuLp1hstFvCVQC+/VOPejgjRtEL80ncDCTBJKCmvrm74C9gl4BtRhZNYFx4vf72POre+0Ceov/qbATrX8RahXOc6paqIY2K7qYRF2y29DWDlv/NqEWZhvkEkbmLYXBwtXN7RaWdRv3rGSF9vlL46qPkimfoFN8U7a9JtKX3vtmnZvrL3WjMYbgggc/UmjNZXHJWL27MB5d9bvEY9CuW7gWCywkpVqtl5+lYt8R1RiCftYPmbppJYsaHEWUvbW4SCPGJNt0rVSN+BeTvI+pwJ4KuDyCjOi7f83wKm1rl4rU7IALKlNUIIIYQQQgghhBBCCCGEEEIIIYQQQm4jdwkhqwJalgM/Qsj0QMty4EcImR5oWQ78CCHTAy3LgR8hZHqgZTnwI4RMD7QsB36EkOmBluXAjxAyPdCyHPgRQqYHWpYDP0LI9EDLcuBHCJkeaFkO/Agh0wMty4EfIWR6oGU58COETA+0LAd+hJDpgZblwI8QMj3Qshz4EUKmB1qWAz9CyPRAy3LgRwiZHmhZDvwIIdMDLcuBHyFkeqBlOfAjhEwPtCwHfoSQ6YGW5cCPEDI90LIc+BFCpgdalgM/Qsj0QMty4EcImR5oWQ78CCHTAy3LgR8hZHqgZTnwI4RMD7QsB36EkOmBluXAjxAyPdCyHPgRQqYHWpYDP0LI9EDLcuBHCJkeaFkO/Agh0wMty4HfBHyqKpiGMj7GSP636hPcLFYuj9sHtCwHfhPw72iJjY8xkgM2ojGsXB63
D2hZDvyMl9W/v8NY8HtV/QJjJ50Sq7o8rlDnvq2qP2G8yVy1BK+XNc1WCrQsB36GlKG9FP91eaT0SexbGHOuUOf+HFCAG4DKqbWcK5Lg9bKm2UqBluXAz1C5VP+DJcU8YO4kkdg3Tx7ApEjcYM09rlDnpJtf8bnWAhPUSiW4TqxptlKgZTnwM6pKGukXWBL+qqrjURLL5fvtXz/BVHhcoc7dPfj1K5huMquX4DqxptlKgZblwM8QickcBJYEuRa6jMQSrk/nbgerlyAZBbQsB36GSOxBVf0FW+RnqX/q3CxYvQTJKKBlOfAzRGJtlV1V/1Hn5sHqJUhGAS3LgZ+hEntZVT/CCr7VK1Xq3CxYvQTJKKBlOfAzVGLN2v6iDonEzB6pqr9hgsR+kAQi5oHtDU0Pj/GjOfyXLHF80mWArz6r8xM43ZdElF9hNx79446f89UraXTKd5nOVdX30aTBpUBCfsvqibmBlpWI9We1EhT+kREzhkhUu0VkP5itqv6APeA1n1V9q3RFeEpyv7F0qbNlp29pSQJO54S8XRnQshz4GSYxKUjWhL+yuk0kJlmHSamqf2GCxH710jkDPKQBOLUE1f2RO6J+IRPlG3exjV2RVK5w0vlUpnMhfU1CLnAAHBW0kQBcZ8VqJRhMsaI+w7VFZHJZGUl3IySJx3trbdKVTAc6XYpstbSku9/BCXRsGFgd0LIc+BkmMSlakIHhnclwid198kTHC/l98uSROfR53BeXl2L8TY7uKIi7VvBP93/8IhkSVMi6LPxIfKJYxPhZxjLvyepqVmdpct9onC6d+0sVUpKTIPVwJhG+aNq/iOsPIYszY8USjKLRtvu9HKPSlSL7RixfflCDiuyTOhlS89VvcvyfigcDUpd0TVWlYZi9zaWRrUZLUglrfmR2XX2R4rjrFQIty4Gf4fmT7LnV8UFkhMSEPEifR4yuteImuKtSAKiIkgSLUyLrU2HUDRfozDRoeuZa56QRoo8VcQc5SBP6B8Yk2sxYuQRNNKGfkoFFW7lSikxs8bJAzKHuE/GJs4dola5ME60/FUyH21zKbLW0JDHed5OcZenetxUALcuBn+ESk8aXzKtlLqCHlelcXNcWAYaJuLonl1PSg9U2qfog55r6npT0r7Hv1Z4vOXOtc/WZkqSTsDLzXPubre2sXIIqmnrUqkMVIpO+rN6VpUoHY5asTznbpZtfcipNlyJbLS1J1C/Oa7NTXxnQshz4GS6xPHfohFalcyGyTQN+hlFrEEbln1pHhNzPkUs7nccIErUOmwhboiU6F68aktTSbrqqjmGaGSuXYCYy7eugWoXIsjFPN7u6JokOZMskSrt08+SUpktXtuqWJIObGwSfYV810LIc+BmQmFRR7KSky7Tr8VXpHExK3YGKO1TISK/Vs5zUIOflCTqu55JB7I8Ywq8znBb5zoOVSzAXmVz2oXPKRSa1WndrQkitrd23S7cpgRaZdGUrtiQZbt0gyGU6pplXCbQsB35GaLnJaB9KeiU6F2y5u4xWaeco0w+YEhKdS5cwO3QumIQ6tfQseTZnxMolmIumDtYlSifYS3ehQ7o/pVNYo+nSma14mkLnsn7gaoCW5cDPCBKrp84Pwm2ta9Q5ubqqrw2sb23WXci5dOrpldhynZPUYKoLoua0750RK5dgLpo6WJconWBvqdcu6UoJPufjUtOlK1vx9PUsxuawMF0l0LIc+BlBYvWlZ1ycuEad08XfnHrJ93t9Ls7wnL/Mn2QZp3MxrCSazm1nxMolmIumDlaKsu6/lNDeW66TO6Vr5rBSaTRcurIlwdzwXTKIRscrBVqWAz8jSKzOX8zzNeqc3tLJgUd6LxU5xxpdYIzOyRCJmw/p0svMWLkEc9HUwUpR5rr1O3xbbkp3S9fvntcX2U2XrmxJoNqEBRppK2t8r0CQwlm7lA7f7NeqczJBaL341fsw1V8/2K3bROfSiecYndPz2zRHJZv1rzNi5RLMRVMHK0WZLlumOtd45KFLuopc3AnpnprcpStbEgQmaQCmao9kYprcyrg6oGU58DOixEKu68vWa9Q5ufhtVQEJH91DzuXqIA07Ruf+lvQCYWvm7Fi5BHPR1MG6ROmEc9fbDiJd0nUeiK6ktxJyl65sxdPbHVpwPVfo0LIc+Bm1xOQqQHqfn+ryXqPOSd8WspUik4VaNUIQmcCnd8xH6NwD6T6/1V1IyW3U+bFyCeaiqYN1idIJ9tJd6JBuRK/3YAS1S1e2YgAdb3HB2PbCiisAWpYDP6MuvrRA6ZGS8TiRWFm+VeuceMRdXglZUjHn+YLyCJ1Lb5/Ol5VLMI8qJ0EHVSaZ2mp76a60S7cmLgJFoktXtuJp8jXR6wBalgM/I+lyNNvfJL1DIrH6EkGQMFegc5kVZK6JzqVhk/szy3Tu8/XM9ydm5RIUU3L5JXMNNOpcZNKBpdPz78PsQ+bvja3j7dKtkaEapkB0ybIFkxKTLOal1wC0LAd+RiKxl3LtKTNn2DKJyWwgNlqdMK9c56SJJFoSSJOS8Qw5l6h116aP18Eowft1TprPNU0/pmTlEhRT0o7rULnINFEYlWiTcbHRs7VLt0Yu+GAKRJcsWzAp8XzXf58VWpYDPyORmOY7tSYSq4tkq+ptEpMGnHQwQzyEOtnc3Xya/VXiqKtTyKpINcZVlYuWZTpnl9ufj4+PDw7mumgprFyCYqpHOqngcPurEJlM/+o5vviFbEjkWPnfY1ekuDWk+1Xd/4m3HpouRbZgUtIAVWUyPbimDhValgM/IxWRLePBLKQSE3l4byX9pZhbJPZd7SoM8RDq8+Xudp64NPLVH7hBo0v6ZlAB1zkXi2dOW1PUKE29X+fCE5dO8eT5bFi5BMUkuDAkZgxfiky8whCTLtNrl+i36PSZct8y1CZd6UUxCZWErUhNlyJbMCl1vtLF6KRqrg5oWQ78jDRb2u8nNzZTiWmZquMDcZLOrFViFuLvEGeQR1pTubugA5l2WF6F6FvNfKy5qOSaJOZc7Z/1zXLSH4s7XMV5ic5ZgW1oNK78ieJJWLkE1aRen35VjauDlyKzaYOcw36TCZ4qmDhb5KCJLdKVK0DBheu5a7qU2aqRIDDJxFULEYDrFQIty4GfkXUFeRZD7Tu+pm5XQO0SsymLYJZBHukJc3dFlCcSnvjx8U34L8956Nv+sWhwlBBLdC7pje2NKtewIfbyrFyCZgp7R5Il4qbITK2M7FnEejpRS6FFuvXLM8LukaZLma2IhHHD78kmvvsSppbwVQEty4Gf8fNBMrG+f5AuPd0/yGruqz/+q47tuue7g9A4vzlINyl+f1wdo/4Genx3ENa0cnfwszSa49/rrZbKwT/VPzYXeZLm/O5vx9XfLpiD2AR/bjEpIUS63GCtIN9JMRNWLkE07t/+qY7TG6GtIvtBJPb3n41Z+oM//60+/VFsPmlK95vfpPP0HIKGS33S/PShJclMNlxvKs1+YfVAy3Lgd9uRuVAcP5XYVZKM62i3FybeyHNkenrlt+ugZTnwu+0kqy0Gda6dWelcIUTRuSu/YIC
W5cDvtiPygMmRSQxMJGXOOncdW42gZTnwu+3I3DLdfitNa6Z3C1bMrHROrtGTy0a5Rr/6O+TQshz43XpEPv+FmYeurfVvAry1zErn9K3DceHoj2u5XICW5cCPiEiUYz9Q5dqZlc7pLjHlM+5bFGulVwG0LAd+JAhI+Y8Tyw7mpXN378v0JZDeNrgyoGU58CPKk4OD44ODa+gOZ8OTcMtuPvx48Ndv17aHFlqWAz9CyPRAy3LgRwiZHmhZDvwIIdMDLcuBHyFkeqBlOfAjhEwPtCwHfoSQ6YGW5cCPEDI90LIc+BFCpgdalgM/Qsj0QMty4EcImR5oWQ78CCHTAy3LgR8hZHqgZTnwI4RMD7QsB36EkOmBluXAjxAyPdCyHPgRQqYHWpYDP0LI9EDLcuBHCJkeaFkO/Agh0wMty4EfIWR6oGU58COETA+0LAd+hJDpgZblwI8QMj3Qshz4EUKmB1qWAz9CyPRAy3LgRwiZHmhZDvwIIdMDLcuBHyFkeqBlOfAjhEwPtCwHfoSQ6YGW5cCPEDI90LIc+BFCpgdalgM/Qsj0QMsIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCyMo5r17AdAEuFZlcgvdVVe3DfI2wAXTz/Kg6eQ5zTlUdwpTycJHQHlFpj0ymYfvwrHq3C0uOaFxVncDSy72jt9swrgA2gC4em4ja66fLNQWOTVjlK+Ssu/Z3q2pnY0P+L0cSOIdxBbABdLAt9f5wY++kVXzttebjXFW9siMcm7DKV8e5VO7unRetQnvT0w/mbHVo7USwAXQgHaYbnvoho6/WrDftg1W+Mjaras8M/pvzsTqDaRlH1VFVdV8cXBY2gA5kuIKpBerceiLXAzC1cDLsWk6oqq33Kxzo2AAcmZR8DaNTVR9gUt6h/9w4M7lprd2RkfCdueVEnXt8KjOUGOKVWN5vqglVvnduSdXuZCzPq+oURudhoSmJCN4dnVfnR0dHZnkrzm/NJOJ4t7GTX77JhZ9eWsC2oSEPJcLjdmsSXZ3P4rwol/+GXHRU77fUpA3gnjS4N+Z+a5H6KGYTolFWQU5V4frMRSG19sGiVE1tgc7p9aBxB66GG1XnRFCq1Ik7Gcm7ZtVlPWUmAhg1+Ncw2m0DkcM9tagZnKo2oGe0APJfMf1uWkP0fXMV1L2Uv10iCjp5qluPed1StEcqqkDrrFbChs7ZorPWtrmmQOcWJjO5wLAQZ3YtsYPI4iMy0NaRupORqBSq6jVshgjyPHaDuQji3FKkpsLE7NEUaP9OOuZYTxpXXCyADGqiJg87rBZdGsyZ9NIy1FrE4uRyvLex8cxsGkWE/yKfSt02tBYE2ByZuFTVM1iaOme3Ne+UM1Ihzi0NkYQe0kHTdA71nQ2mZBR7KqByUV9dTvPJB0QQdS5cqZ+bCFRpchn4RaFoi9+i0wBm8FW1FqtHDxokLsmyG04uE0uzbuiMVKJYa1rlNeP6I7WgwAZ0pAvrYA2de282MbgcE3KdE6u2gHzMPJQ/F9Aql8duOjqGCMU1tYxP5VUergCgczLD0EO4+MtVRMHQGWQsAVyHRcdbrR49uktEnMAI8k+GtJByzMmtRK6ahY+wRWymDgEUOqcTB0EuKaS695SwcaGhc2qVi3bEsMhRBKk7Gcm5SQ0tvca0DmYDEoHOxZt3PgSJrrg1AocQrg5gI1qXtb4nmN8d9JP7RBREmWcBbx266aShcoJqnSpTl85Z/aqMBXcRG3TuKdzNqleMH715iCEml7qTscj0Tq+SGkj9+q24TATQOVNJINaGzkWVwRwnVTK5kOiyHkX3sHianjw2CoU614+M/7qo3KVzNk9/e6TEtWdUr1b3+ZGIwq13tFe2MBJZhrcgg9qdTIYonR5yEUDnxGr7hAS9uGronEYCpripkomCd1k/RPev3aQJxJOHi0ODOrcEqTn77ZpbNvblQcKiSqlVOfGeUyMHXwPuZDLkWkHmfYUIoHMysddDoNQ5XCU66pAqmfx0Wcu5ZX5yzi3H4AtjcSjyakpqrXFbHBKuV1+izolZb8la5Gy9zd3JZNwxMRUigM752kek1Dm7Qnd8eSQGkOuMHitGN+Hcxsf85NlUhjrXzpbfytRuT0c4TFakluwoB3tcRGaejeuJqHOWgsSvde6DaahVubSKWgruTi5NuEp2cRUigM6Jc7qrr9Q53AVSXNgSwLvHeHOgsJqtTlZO5ysA6cnjzFOXqalz7Tz2O2/hDue2DWiiKG7Vo8hXKrRZa1HntLfTBVGxbpkAJLjOMLzKX6m6Zu7k0mA/lbRwvw+WiKDWuTfhBs1hNow5PkI6fgdbAthMVQbAeBM8s2pQ4b0rnTjYnYri5HJ0q5mpc22oOjm+qKhLZAKqWFzdrrVfgEq2+wzCiVpDaBM6qlz7ycydXBq5MnbsVkwmglrndBR0dESrlcbInt0yLZIAuo9SsESbVj0oaCJwKE4uF3mG3pmjznXha8phtmLWj6GazuVXH/Rv05V0/0L1WvpcVUvfM6Gdr0b22YtexaXu5PLs2i6+M9x8yUQgB2xkgD6cW7UXN6azO+S2DqJaZRFcSoU1je43B8Imh/zkbvUrkvhuhvzUhBBDlSyhsBJCpoY6R8jVQp0j5GqhzhFytRSLLIWVEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIWQE25d6ufmuvW6NkBWy1/a1lCWsdcOMryS8EHy0+AayE1/DuIzhIS/Bhd4ZsXYN018D6l/ImlLnziVVGA2x5t8oJJfCXyy78tf0FlLsYXhIoJ/7V8aMQjdC51Buz9aUOqdphrebCm/DOcgUhK++rfxbR9lXXtp4eoQcLA2ZU79CfMynY26Czr3GS3ePVqBz8evSSlW9oc5Nhn4FXt+jvXmy8m8dWfPooW4zy0LmSAn8XdIyIRr+7ZiboHN5eafVub2q2odFv0JSfBGNXBwZI+xDjcL2db+R/oJt5rT+CMZZ9qnIfuaoc3vF1z6SbyIJXn97yXRlcy+fbcdlozQQnBs6J/NJWMS2n+pcHlvsePV+5r5ZhNopIt0aGgV/39KK6k+9g9Y67XQuaPPdqweypFm06VwzL43k0k91ZZ/+r7mTtz2rBde53VAII0u8aLAtDfOKeaezZ5id3C7159Ns/2jVVvhMj9nkqF+N0LnMtrtD/9x2WBRNdE7SQt2Iqf7yZxZbUt0yq5zElgXC97KOzBUjpVa1ZkaCYy7ln6a/DVj1pD2jVloxo7TPl8VFqo467XTGgqPXaAwVfF2lfKnNRte0WbjJzCFkR16CXJ3sQkPS04AoE4Yyl78rYxS/mSwr4doxa0x5g+1omFeLX3hnV7qiCEkfI+WWetWPKOlX6cRaHS207PC0D7VIzcjVxMliIeXzjk8dP+jnkLKi6dV9nLl+kGhB5/LYIgyx2yKn/EkymGeIoXqj536tNqlqayV6/WnerV39zUTLjU8rAunB8nHkmQQ4ea0f6zFrR512Ohc6J9cEaRswTVJJHOEEagzNQlNT6pCNvGTJBYK3YR
f+UQtdshLg7b6OEeoUxQ+TntxDl40pbbBmbTbMq0VzKsDmqEO8JAi+OJx7f4F2r54+M0Wnhe/jSql0NFMZqjWgOidKHiybUefy2JqqHKQuvd7Ql0pgO7nUlyau3t43hNZ3az6t5aNENvuK1RqAt9QSPsTYWqedzoXOxaMfVJNkxDCFeq0ddtksfHgLhva86FEPAdEEmATpCqwjCemKZD/40CCOqqq1+NWkp3mM0HljynPW0TCvFjm5Ahuw4fiZm4OnVULkyOtHPL1IH5J5g/1665ey2RGozsmPVYl9yhM614jtsaTDtPmBf/VTnLH84jccpKoxPENY6YdCbzb6DVMhrJkor4rCx+tm/65+d512OJc6Zza0AW/iMmypJSM2i0znuvKSNylxTr48LtKVpoBbSy7ZMJJ7UWvxi8lP53pVNCYQc9baMK8W/X5jqIMaF6oZk/rzo4E1/3i7LQ4xZqhbf56y6RyGdVM96FweW379lGFq4ddptYRkpiC/UtVu3dh308dQ2TcfE0/8xqISZgyB2jf0Y2112u1c6FzWBtyafSMSxGYRw3vI9ryIwY9OdFYkIxIUSmqSjfL3zyTX4o+mU2uNRWMCnrOuhnnFmPT8cjNFCuLZalbQ7t5CJ8hqTDzfHxnmUk908qKZzrlY/TfqXBo7phpv5tmxvsT2zriudPHXoRMT+NuA3ovLL8JjXQGrapjSfiyv027nITqXt5r2ZuEh2/MiBj86sQNXvg4p6WWDnUraijeSssuNJv+Ec9GYhDpnXQ3zqnl1/i5dZY3I8KsrY2UF2YWr4m7RMyITwLrTzYvmlW9ziFObSESdi+j0MaSaNwS55nabfhZZfhOdO9H2d6+233y2j04w9wcy1mdSrCvDa7O9Trudh+icmgNdzSIPGexZcoGsF0GGbAeiD2yiNRGx9uhcxK5F0px1Ncy1wS+TigqSvGtXFaskeqaLaJgSCHnRXOdsPu4eUefS2GL1VPOGUN/Z894q0TkznpaL5beLbCal1jDou/6012m388hxrrNZeMj2vIjBj85ZapULTD081sOJDYC1vhg9Olc0piRnXQ1zbfCuIa+gMDg3KzeVeH3pFQ0GJhlywCV/1LmivXiqeUOo5wU+y0x0TgI8l7/i3v7twqUVqa+2woJkW512O2ff2s/bAKxxlU3obhYesj0vYvCjI5cM9cVBWFBRzfbFE5lPmwvo0bm0MeU562qYa4OvxeYVhAl7S+WmVxdboa9Jdp0o0LnzSnoxW7yMOpddm4RU84YgVYt1KN+vkOqcpIMLgFuL9GJY1pWq0TpCdcSVyLY67XI+x0MlUT5pG4A1VfKiWZzVS2tZyCIvYvAjSJ4yibdaT6tTXDQUdyB7dC5tTHnOuhrmdfMQRZPRpLnIhMFZbxC5Gzxlyp1eUcNbksiKBp2T6zG4Q6aN2J5qo324FdWY6pz4nfkNmNuLVI9PCO9o05PrO1ssFEnZhqrOOm11FgXWq8PXqPK8DcAqY4b1m/viVjQL3BoLIbvyguQi0hpc1VXl3rhJFO4EwfObI506lzemRoO1Q9kwrxsZMKr3dufeippXkOT9bG9bKicUAZ56t+bj8709LC5KLVcLkWM2RdfgpnNacp85QOeK2CHVoiHoBqJXe9oObKpS6Fxqu5XIOCADhcrNLmxlKnG6/1RvBZlvR512OGt9vhG3eF8nbQPBqqdbaFNoNAtp9h8WH2Wqj5AdeUFyNbrccXL0VneoxPv9IU1b6hb57y1OLXqnzuWNqchZR8O8dkxsgk/XiwqS+bFwBlv0tA7F8OmBaobu8sqLFnQu9oNB5/LYIdWyIYgoDd8xlOmcZgvG20tYofPrGW25gjXDzjrtqmqpXCHIr2gDwSqqpOhldN4sfFOFTJdCyPa8hNA1kg0nTpPtBjKM1qsoOj/s1rm8MRU5a2+Ya8Du4uhZmDvH96GErdp7rxcy7/Bd4tnLUh4u3izqJaPFkc50dtMAG/diWuFiud7xncSuT4lMxF3iO5JqPEX2Goy2O7S3j73FIqnwxy9ex9vmHXXaXdVP36r8XGJFG4jWTRFHuEORNgsVlMkphuzIS/SObC9ev4ppGmmgrcXbRVDHWvzBtBkfLciaYp6z1oZJLsBtX0Eh5Ko5z66vCSGrJl/4JISsmDW73ULIjWc3LvkQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhJCZUb/WaaVc0WluM9k71EazG9/eRlYNXlT4uKkTLU4XJ74PcRw7/lrj5Uya2fVhUPk9UPau0PHwzRiXxl67acChC1eGF8VnmIQWJ+Dv+8QrsgdyQZ1bnn+nO7Nz4qFVbFV9iB2IWGDqwQNNq3P6wuX0Rfb6pthb/f2WAUgVATh04cpQfDBFCU5HyddqDaQ7TsQX1rnsyyMNQuZa8j9D7PXgTvyUSln+p0eNEd0DLdG5lngZRez4pnKg1ps5lZiOwTUEZWjpw+BUivI1Pnd0NKqVX1DnNuovK7USM3cj+mC8hn5TXw5u3+5oKX9bRdrrjpfo3DIBFLFN5+xrIMZz/ToAda6fsTrXTSlKfVP8eC6qc0vob2dzI376QcrVNVfurMjJdS6dO0g/S50r2Cv7+WU1FN8g35BF+Rb4UpTxW4Id7MQ3zivhqwZ+mvzl9nGhbS++iB60vAQ/IfFtaWebZeQ8Q2vETlnKWuc2w4daIqFUubyStUrXuc6ay+LloSyRoiY1K/XXTvW7+UmLulN/osIITWYna4alVC+3rrpuvNOJAMwg1bmwwCBdlfzKj38cxb/p57KInaR/p0U/52NO/kUUwX2V1OKfbBLsc/4a45UGDtcgvthib0SX0/hZfc4kFyz3xKKx9fNYApqYeOh3xupwmjtzEPRjNXYC/w5XkrmYf3yQyD9C0cjQOmHFzLuvWuf0S/B6gKY807BaRXYUpM3XVViL0CtcY+CD7QKCOKorujwiQAPcdigh3A40K6dR789kmhlblFexf+hKTypB7XM9mptY0w2pWl7Tb6Si5cwUb3t5u8rGOXj6BF3KL3atOKu2WmBucy/pr8wpfLoqWaSUKo4fFpMLDzfYNzUlxonUpYY3RzUdnbrN3M/0e0r2FSwRuQaTetfmdPIaodRDGksaTnOHDzapJsmE5+TVQk4jeUgyl+S/eqPntTW3MkNrheaq+G52onNSNtUPL7801cOH9qmrUBPmFaowilCq7Ug/pqU9bK5zSTytwcVCLs98BiKOR6LgEs2sAc2K99GKGkKLkghv97WXV4ucVFLSGn4l+atruiFV5BVfjlSXeb9BWIsjwOZkOid9mQwReDm5BtUpAGo50zkRjcprUy11M7ZDjSYQPvccGo25SAxL7HHsT9W28VoVXiOppT5g6RkaLHGtT2yEsyQU/7zavttl9LIjDiGz0lJs1iNl03IUGVorfDSvuy8l0TkpmbZOzz9GfWulsUY0dqhCddLC6odNJWGtyVznsniWGL4fKOqmNSWH+tSKZSWMw4c+xFqL+uD9t0xatFPUk2pHrpqu8xnp/m1K1SJVy6v0CnqY/zqzlkiAz
YEbXHUYklqC0BAyDHsQmDv5txgVOImbHRLsa4Dha5L2iT6d71sMl6sPM9LLmc0IZ8XsQqyuBfErBTIt0kMjHFqKGtPrAQQLsWP+8WUznzkXGVorMEhnnyLKdU6z7AVIh8NYIxIZHYk7SWH9W8I++ejSuQ/JfM9+vfvEXDZiWQn5saDQuTBAea8nJ7WaFUX3eZY3jBapel7tGkSIpZgr+hlLATbHnYRglUpNL+CEU6umKDD5lfFAPRw0YwlghxRvL2ra98NHE2SIISnrqbJPOMazepiQGTGEzwiGNlCEC7I5D23FgRDDGf3U9eWCzHLkt8jQemF1GEtv5DqnY4yXP71wSCoolMmdYmHFLj9dOhfnKGbwztKtMDieFVcw/zCg61ysYp94xpMGpXLFb0q1zqvpaIg2X0x6+apQNrf0D4zH2oHQQrXWAovfK1ZCdaZuEfF0d5+p+MVBFIB/oDbLUtJU/DdaQz69MXSFS7uDvYV+TrpF5/Atc0GuJ+S3yNB6IdOrVJeUXOfq6tB1DtRDXSOlUyys2EUa3Tr3/sgwB5kLmqu6w+CExqEdXVB/lZS4e/SiV6t1Ts/WKVWki+Fv1rw6f2fT6JpC58Qe7nDG8vsFUiqw7EI6VGdH9ci4qZI40WYjF8jqVDTxLGI8q7sWVsWdusOhe5aO1mnRuXC1EFpckaE1Y/voxCfokUTn5ErABwgrv138+Tif1AgMMMXCil1m2N06F5FQ9RnrUxvuYaOaZEVdvEXZMgkQazxpqXNmEfy0dV7FKBd2xfh+Qyh0ToaJunZQfq/WVGDZJ+BCdaZuKX4pZ6FOXaxRAKPGubDToqp0q25HOJ8HC9Ixf9TupVXn6vx7/11kaO1JdA6LRLE6tnSVQg1JBcEAUyys2GVG2K1z6VJpPeBEg4OsqHK8d22P45xaAvGkpc51SBUBipPdEAqdqyqZXRYd5YndAUgF5jMyEKqzq35iG3guf7YoVTTxsM5ixLOGWNEaLvpcTu3hZCDFOB6G61adq2dKPsssMrT2JA0a085YHTbW1WteqQGmWFixy0+0NnQOEwajvgKuT20gK7r4FtfetEUVC47JWXKd65CqXyS8LubUN4Rc53QGKBe95hQvItKrdK87uaL2mlNCdeaVXIPlYBFO+Ph+FEDQuSRmrHV3TKwIBH1vDxcWrdUU1sdbdE5+kX9vn0WG1p5a52QKh4aO6hA+WCHO6tWn4OUmKayLPA6RZpOj6VyIl19DiiL7sJdNcQRkRfzDUOjJh9VvEGu41Dk4l1IV5Ixhx8MNI9M5UTcZJrB2L/VhC9RYHY4Cc4sL5Ew6p1CdxerMQ9S5jCShls+wEl80celDrWr31TXWenpWYR9Te/Ti7eHsjruDHvRd1DnPXMy/B/QL/DJDa0/QuU0RlVeDH8M9UJ2mxXWuWFMwSWGDVlhniLVCcTWdC/GkLaTyRI2JNJEsCFkRd1x7xS47vb0RazjXuU6pCqdVdhPpBqF15ZjFHnkJfdB5Vb3WjQhwhMDUogsU7xbSxEUsoTrFafGiriRdAX2/0EtpVKOY4Fs2cfE4XYjw1DUJ7r9BCJKZ0/2nervDrK3hpBXZUpmY5bLmua/CQuc8czi17jh6tafb8k0zywytO1pXAPcTrfxSjKc7e9JSdX4tOvNh8dHWIUIVuklCiT4sdK+bucrIdbKQn0PXuRhP0vn4fG8Pa7yiHZWuAktzUGsk6Jx4wQM6p2uor/b2FqfhpOqovqnOdUpVUBnNQhqjkYKBuosTccgYJeW3ZWo8HhAFZjb3sXsswUmEJZjRCCtX4WJNLqLgGwUQmjh2HOlsMNa6h0mEoPu8BIihNZyHEMTsae6fuIxD5vLMhlv7jQytObpjyohb7bLy+7qE7UhwOZo9mKSwMsQpPvtGxW4vXAtiPGzKDJf32kFpWwhVBYLOiXuUjM+ccBKflMYaLnSuS6oKot5s7oSN9bad3Mr//Cg89I8XlcRN3w9fvMBku972v0hubgu7i6P0e/v1BXOIsRlOuLk4Wrgxvg/Ft7Rnr0d5/OJ1XDruC+fcWbzZl1lLDOeZi5ndkVNGkTYyNDdC+RdvF7EmdhZWwLpqYNJt/Hde1OE2thevte8JZUc84eHiTV1HG1Jh8ls+ThIeHYinuRcDbElusN8n1nDwjXJpl6oy+31f48n6nAkIKyiEDAIXNbeJqXXuPN8zSEgvckE400nHxZla54p1TUJ6yTY73RIm1rnyxg4hvfgtj9vF7gLrWtOwmy6nELKMZJGHEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCyKTcJYQQQuYGxrABIAIhhBAyHzCGDQARCCGEkPmAMWwAiEAIIYTMB4xhA0AEQgghZD5gDBsAIhBCCCHzAWPYABCBEEIImQ8YwwaACIQQQsh8wBg2AEQghBBC5gPGsAEgAiGEEDIfMIYNABEIIYSQ+YAxbACIQAghhMwHjGEDQARCCCFkPmAMGwAiEEIIIfMBY9gAEIEQQgiZDxjDBoAIhBBCyHzAGDYARCCEEELmA8awASACIYQQMh8whg0AEQghhJD5gDFsAIhACCGEzAeMYQNABEIIIWQ+YAwbACIQQggh8wFj2AAQgRBCCJkPGMMGgAiEEELIfMAYNgBEIIQQQuYDxrABIAIhhBAyHzCGDQARCCGEkPmAMWwAiEAIIYTMB4xhA0AEQgghZD5gDBsAIhBCCCHzAWPYABCBEEIImQ8YwwaACIQQQsh8wBg2AEQghBBC5gPGsAEgAiGEEDIfMIYNABEIIYSQ+YAxbACIsFZ89eTJk0cwr4qrOAchc4X6QdYejGEDQIS14t9K+BGWFXEV51gtB1qCf2EhZFLmrx/kxoMxbACIsFaMV7FvDg4OxunkDR3mxlcEIS1chQ7eZlhbE4AxbACIUPKbtnJluCi+/YwoL+FwYUarGHL7ANYh3Mxh7gIVQdaVG6+DtxjW1hRgDBsAIpS4FJRPcFnGS4SvqmO4XJjRKvaXnbf6FtYh3Mxh7gIVQdYVl6VyQ3XwFsPamgKMYQNAhBITwh/+C6d+vtegX/7W39Wo2KMv4vj5K9hyvtUI1Z+wBfpi3NBhrr0iyCwxWc5cB0k7rK0pwBg2AEQoMSkcmN5UP8OtDxfbo2P9XY2KPVHH6hvYhtAb42YOc+QGofKduw4Ssjowhg0AEUqsPR94V1p9D8duHug0r/ruLoe5q4PD3A1H5Tt3HSRkdWAMGwAilFh7Prh7908zLGvYX/2noV7e5TB3hXCYu+GofOeug4SsDoxhA0CEEmvPomJ3/1HDfx03uAKfNNCvYuAwd3VwmLvhqHznroOErA6MYQNAhBJrz6pi920ppH+rl0037XZqu4o9ennwc9cbFb47OPjxPsyBQsW+faL4LrIfzGykKT746eDgfzALA2Lk5/jm5z+OP/97fPzbEzg0kDPEMtx/eXxcfT7+/Ycy48qDn347Pv7v8/HxHz8v2Uf14MeD4+O/Kzntny+TzLfzzcvfj4+/fDr+9aewCbl1mCsqwkizPqSo4MEPUpDP
1d/Hx7/8iAo02MtdESrf+eqg8c1PBz/V7eWbn387/vTl+PjXl10b6Qdr2ZMDaNmfB9/BqYf7P4jy/Ctn/v1lVx0Yg3X3fz//JZr7z/HxwZPO2Ud/GCrqBGAMGwAilFh7VhXDje3fzbWdXzSAK2Grin2jjlW7sFr76lzFfldbG0meLhPjkW1Nq/mvVW8svDac+2nwolp+soWjlM/tewe+t4rK+WLV3cb/bDqf8Ltq/oCqc2LWBxZVeNCSv4heMpDVY5U9Vx10zM1MZdOr/m4b6mJT7dGyJ3bdmvOpeyj4vhH8y0HbwDlYd38KzyZGfmsMY8vDdNYWFXU4GMMGgAglVk3e7/5o5u7nTX9SbyyprELFfKtZCz8ggHDhGF/hAZbqnz9/PY6jSdH8FDTBb3JlSBXw2+glUzCZPMJS/VvOqv5nk3Pnk4Ssm3TrBuP4lPB/EhZGzeLYYW5wUe9+Bz/J3S9/NjuVtihkeqyy56qDjrndvfuV314UpAnXDarZd6OpdmvZNxbCEd1JNOnvtrGrPrFcF9YqWX0pTj1Yd2tt/Pv4l1iUz/AEQ8LYKaiolwNj2AAQocSqCZcXv5qla6uXVfUXTM1WoWJg/J225TFeWtP4/LKeaf3P3FvGG2+CNqv9HHTkfrIa8QCKks0AX/qk7m9YgSV1/EM2vfvRQzafSfrB3Ku/ao/7B67clszgYe6R6c6gorq+fEpm25hcHjdyR1aI1fm8ddDc7vvk7LekQdmwLCctG5SF79ayoGS2mhH5Dq5/wV7zMzySM98/cE1LFwyH667v8/kvXdH8SgJm68lDwvTUFhV1OBjDBoAIJVZbUDE8st/eul1/gkxnNswpaStR/P5D43ncGP5z66q96+1xkZaU3Yak0AM5L/9s6678vGXdeff2d9FsH8QLwMHDnDKsqK7zxX0D362+bBsEmRKTw7x1MDa9f0ut8REoV4wkfKuW+TVtm5fXzpciZz4MhBqs+U5GsKR9j9Bde1i/b/FYGBJmSW1RUYeBMWwAiFBiNRgbiM9P2qruK6vV2G3Pbpj71FzqaJ84hyZYnsTxdthUKMF0rX/7gONp5Dps1dO2WBUWNkYMcwOL6qdsdCUu6N9gI1eACWLeOtjd9JBWcVqEb9cyHxnbbzj53csvWfXYguWX3j0nygjd/cqCdpQfDAkjjKstKmorGMMGgAglVoVR9D5FaOurTTJ1NzyzYe5zqw5Y4y6nTt4E/2vRV8FvXfwEW443u+7bKpEHzUS8BO13wh9YjgYPc4OLaqFLEQo2Sf0PFnIFaIXPXAd7mh7uYOXDi4Xv0LL+18H4OJcuMfrQsHSUG6O7PoS1XWjWDAkjjKwtKmobGMMGgAglWlVJE/T23bwSt2uKZH41s2GujOHYrLE1T+XYF7BG2NLiDFO3f2Dpw+42Z2pv1dvVYgdX3biiqkvR+Ri+tNNRm2QFWIXPWwd7mt7dr+zmV74zw8J3aJkFL25zJ9he06SEPkS1NOSCUbprddsx1w0MCTO6tqiobWAMGwAilFhVJXXoV/blNYlNyNI7vzdimPNbALAEesLjRMmmswzfRDKg2Vnlpe3Wr++6NHWKYa6lqHbbr3k3H+/igIVcAVbh89bBPq3x8+bbLXvCuxqV9xJqvIT1Pg1PvXHDrWSc7n7lGx/bFmEjQ8KMri0qahsYwwaACCVWVWkH60sMeSuzKUZ2nXIjhjmPB0ugJzw0Sncit2HTxY7iG4/wOKcFTOvc23aXZg+uunFFNaFWjXtAfj++63KWrAATxLx1sE9r0LyzYbsn/NK1OLvaq4th4bsv/gJjddfnGnIV+nP3CDokzMjaoqK2gTFsAIhQYlWYXUf45r60iduqdn7X93YOc50PzyY0Vty//8N3LBekde4q2FGAVQ1zvopT3lzALf5lSzFkQqzK562DfVpz93/qmRewJ7w1y761fyt33ZYtfMvFTsF43U1e1vDvyw6FWB5mZG1RUdvAGDYARCixyspaIB4HqTXKKzRv87dzmHMFwwSwlV/yaR2eJGojrXNXwa4Wu6phDsX5J9mp/I1rFJ8nuFKszuetg31agzwNHeas6H07ll1bYEH4pfv6x+uucv9nS9340rEpZkmYkbVFRW0DY9gAEKHEaitrgWiVsaGF735krJOKXSCG09qkesJDwZZurgp8g8u442I50iovrXOrnPLBmMiqhjn7dib4R9Q8Tkw7Sk9WhFX6vHWwT2vwDo/Mtye8X8nC0obdk6qLYeHLamgyUncTHv0WFOW/jhrpCzOytqiobWAMGwAilFh15SqGrbeYIPnEstyH26pivsX2Bg9zPhy170lu4juhvjTrozHM+c2Lrka7omHOkv3T960lHHeqMlkRVu/z1sE+rYHaZHO9nvB2r61zzidYZdRbUCx88YatFsbpbskDz1Xv2yPbw4ysLSpqGxjDBoAIJVZjhYrhpqffM7ar44Z0W1XME2sX3wpV7AIxnJYm1RseU+XlCySGrzO1BW4Mc77TskuFBlfdqKLaIOyBnxzYos3xr99zsfI6MNnMWwf7tKbtcZme8L7psdxnWuMTgHog99lkcaXbZJTutuHjzC+wtdMSZmRtUVHbwBg2AEQosVotVQwzKm06tkLQfMtau4rZ1XT7c/ljVazjac+JYjjNJiX0hA/vW1i6ednwp1bbVkkawxwWajpuzq1mmNPAX2Am14rJZt462NP0Wt8+0hfelvrz3TYJfmWb7Kz05+aWb7Uco7ut2BhWVl9BM8zI2qKitoExbACIUGK12lAxbO554H11y/3gdhWz0G2bf+77cvIgFXtkQTsecJkohjN+mOt+Q0UTf0tX29Blg1pW537l16hPBa9qn3yYs85k6RyYXAEmm3nroDfS8pWshrftYutkT1PF5VqXklmtZOXzYbT/MksYo7ut2GXVkreONMN01hYVdTgYwwaACCVWq00V8ynSf7Zy0radp13FfPGt4Rw/ODFIxfzUHasLE8Vwxg9ziDJog5PNnVsWX9y9qHO/9GuOc/e9r1vBMIdnfv787patf6whJoh566C5Cc0o2D9RLID2NNWgIc1v4wj+5Z4v+RqJXae1PkH26O9kOWWE7rZis9PmRXVGM0xnbVFRh4MxbACIUGJV2FQxzMGU1uv8dhXDOvnn7A74NzaLtDnJIBXDnqj2+ctUMYwLDHNh9+RfbVdpD14eJ/rnXUUxy3wQepuyzn2cKz6P9b2dbHDVjSvqIy9Jg/+Oux4SIqvBqn3eOmhuxj/Z6ITRtTG29DRVwa/nmt8SeOSTvk9latj7UVx93rcVxKReB+uuVOFf5Yqtq0t9GTkkjNJZW1TU4WAMGwAilFiFtahY+P5Zxx7cDhXDk6ByEQMB4JO+n+6b+IapmG+lqv51Rf3+r/SKaLIYykWGufCKCjnbr/X86tFLDF/pbPa+5yp+8firH70y/rX5WVnn4Rusx+hbHvzqrfvTfbt5Pv0wJ/gyVhtdDwmRFWA1Pm8ddDcsVPyLTZWhCbeUraepKvGb4v/9Gor+5DeoU/mlVCVU1H+/YofmtwchdDZUDtRdlOPvH8M
49sTqWuLVA+yQMEp3bcGSQ0VtA2PYABChxOqqTcWwa6jjFVRdKhYuSzJ+EcGPUTHsyIgkY8d0MYSLDXNh/38b/+Xzu7aAMpe08zbr/AcMdCl6LbiiYQ4T7WSL9aMnT346+DPkggPdVWHVPW8dDG4/tFx5tDWknqYKwohU8Kl1xO/SyX8aoYfpblxzycgvVIeEEfpqqwUqahsYwwaACCXWLNsrypa8Ozb22jJBxxt2HvyR9Nf//uxzSptilvd/TdxtN7q/+jlRl+SD2lPGQJ7KTUw94RO+bbzB6+9fs2UicP9nSw/8+4tVht3gb6vzr35Kp21//+jFuK/VWb79qC2bo4rq+t7agWKzQt8zQmRCboAOmpt33D+kA+SnjgfVeppqzY/FUPsJ5ejguzz4cdcoOkh377+0LEa+/FauUA4L011bVNThYAwbACIQotjGhmJ9pcZXZC6x+ZrcLqzj7r06IxeDiipgDBsAIhAi+MOyLZNPx7173kRBSAqHuRVBRVUwhg0AEQgR7J59eZumxieJsBCyDA5zK4KKqmAMGwAiECL4XvX2rQ24QXDh9/+R2weHuRVBRVUwhg0AEQhR/FGj1keI/md9VtfGB0KacJhbFVRUAWPYABCBEAPP/ej+tSdh2/WD7w7CW1eWfPqfkBQOcyuDisphjlycn8p91ZHPt+BRHDIlHOZWCBUVY9gAEIGQmq++/90XPiL//N51H4CQTjjMrZZbrqgYwwaACIQQQsh8wBg2AEQghBBC5gPGsAEgAiGEEDIfMIYNABEIIYSQ+YAxbACIQAghhMwHjGEDQARCCCFkPmAMGwAiEEIIIfMBY9gAEIEQQgiZDxjDBoAIhBBCyHzAGDYARCCEEELmA8awASACIYQQMh8whg0AEQghhJD5gDFsAIhACCGEzAeMYQNABEIIIWQ+YAwbACIQQggh8wFj2AAQgRBCCJkPGMMGgAiEEELIfMAYNgBEIIQQQuYDxrABIAIhhBAyHzCGDQARCCGEkPmAMWwAiEAIIYTMB4xhA0AEQgghZD5gDBsAIhBCCCHzAWPYABCBEEIImQ8YwwaACIQQQsh8wBg2AEQghBBC5gPGsAEgAiGEEDIfMIYNABEIIYSQ+YAxbACIQAghhMwHjGEDQARCCCFkPmAMGwAiEEIIIfMBY9gAEIEQQgiZDxjDBoAIhBBCyHzAGEYIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQsg82d7buwfjFXN9Zybkmtjce7x4ugfLbYc9ALkadirhEJYr5frOTMi18Phc27zyES63GvYAZDRbL6ISiRrtw3UZo5raqaXdykMEGQ4bOZkpF1O1jbca/PTVYnF4dgSnS7PpOvkG1lnBHoCMZN9aewo8ljCqqZ1Zwq0M1vUIGzmZJRdVteca9CksU/HGMlBVj2GfE+wByDhea4s52YVtY+9oJcNcgpygqnZguRBs5GSOXFjVTGVgnorHlhn9uZQuXg/sAcgo7mmDeQfLKDjMETKCi6uajkYnME+E6dBj+53h3T72ACRjZ29vC8ZWFtpgXsAyiqSpbe4JZhpA+zCnSXSNfdvitwmzsqyRl+EJuQImULUOLega5u70KI3lp3s34kdJUobcp0vytCtngLGFPt/m2a2X2IalpE//7UTxIlhJewDNBLX9FrP1TluD0t2QrcF0XFztqdcCFkMdwj1wNLWtF3p0Bs2vymFu61V24+5tmpfNZ4nf6R245sPcrgV5beb28ISsmMuqWqcWlLfzwh26bVMj8CEZUOwsOxv3kFzHjpVX4nWug4Pl+2t3BJaC6FKi2O9rVRrg2zz7nTS31Wl2O75P/4V0z87ZMzjaaQ7TTJyP38lGbgQ2UwPnnV3+ofl/aJtlDRjmjnRWWJ0fHdkqv0wQl8+rimHObq9XJ0evFgsk8gE+GxsPzX64ePhsobHCyIZG7maL88Gn0R3hCVktl1a1Ti3YycaAkDiCv13sL955gHixY8pxD+rYNcx9rV42MGzqKGIDXsSHKktBFPu9HiU7IcgA3/Lsz2A7efP6CFuufVaq9Ok/Ogs7EyK6ottpPPiZd0FCPliT28EWpO+kbScnzBdPHpdD1IBhTniDaFvW6k7LRBoUw9z++WEyf9NZZtSBO2qpZ2nbmS7ZGLZpiWGQ6wxPyEq5vKr1aIGgmpUuWn4Qe335spldkQW1DErRho1tmATaiJfdMQwpvEUKrtil2vf7pme30f1DLJ4PkbGf6Cv5tl3JHYagm6+rczeF04SJrO2n6bpQJjeZXRN94AyubezFGePps1T/hgxz6WM3tpNs6SVUMcwVqG/QaJsFtsyN7cx6HlegqFFd4QlZKZdXtYJUC4RimNNR7jS9JrTBASrlanne2+XruHgKs2ttWA1UOhT7lZuH+GZnt2f+smVKW2p8D0tBVnK7gmu7SPPTJF2NreMkEwNya8CCgvMcjh3s2uBj1CEH3ZtL0WaJ+VY3/cOc3aeHr02Sz5vP1eHMNqql08au8ISslkurWkGqBUI+zFmXnj3tZsMsXEw5OsYQYOuE9d4Ru7pKZof9ij3ANzu75bYoaE8XkJbcrtFaHxdsZkLz0F9qckPxZW/lfMgL4HbCA6NhoBg9zNnVld4l2E3uG5c3K9rb+O7e3kLJfLdtOidnybNvZ/6opzhLBjmhIzwhK+ayqgZataAc5kzL3h+lqAtUsUUtC2xtP9kpY4NkcrHYp9jjfa2kxX5Mm6Dmj6W3ldwWY1urs3ma4oKX3Ca2Xx2dn7zrWx7J2dL1kKgDo4c5m4qp5vpekEC+7NAY5h7adDKl9t2yJQ/ltFYLP7P3LIUCtYYnZOVcTtX6tSDvw02BmmDEbFHLArv4sjEF2LVoveLXp9jjfW2DCMwB61hiqM6Sm7sbC5qnyauIkF6syaK9XOxqruvJmEAxzPkmtY/hpny6aAG2nkIPsP4fzuw3v983OpZGeELWkFTV+rUg78NNyzpvQbeoZY5FbyFuaelX7LG+Nu/MHnpDadGx9JTcovJqjqwAbUDptt3sOSB16BnmtK0tfadCPsxZK0/Go5ZhTvG5b3qfW8/sO5WLhR8jC0/IOmKt11RtiRbkfbgtY3SuVbSoZYbt9nyH6zhge0LOww2AfsUe62tFK26w2UKmD319JTf1TjfHRJqn4TBHRmBaELZE6+2vdNiyaVs2zJ2ni4a24av78ViQD3NmS+6wdQxzvr0M7bhu5Ju2JPKxbaUoCU/IGpKo2hItyPvwTfU871o0aRmGUmyPfkMv7Hy5Yqd3GmwvJRR7gG92dnPJNdqGLwxtfSW3m4itnUHzNBzmSDcPq+pNqjA2z4r7gW3ciquC1pwLbaiqszDQ4Rk22LrJhzmbSMbJHl4mAd+nycqM7aLEOkfayP2JGZ/ydYUn5PrpU7U+LRCKPtz2jJx1bHVpjgAZdh+uGdWcsR8yKHYYynxlJCj2AN/87P6wYL3Xcss6gHAB11tyu8WfvN9k/6xzpw
uHOdKNPRwqvD9avAovGkiaj2/oOHmzWJjpRC/nsmHOmuXZ4WKBN/a8hWcP+TDnN9jOnu9tbH8d7xq4r5/g6d7Ozt4zy9oRBtyskW+ahupjRJ3hCbl++lStRwuUsg/HjuKPr57rayIXb9QakmqOACl2sdQy+9tWdwx/loJtlT573VTsAb7F2e9Y6Or01f7e09d49KIe2HpLjq1sp+8Wi0OPmWQxOw2HOdJD9m48oXhqdSe8SUd4t21akg1z2bvtqnfJ6kMn+TAn12CuBcbZvrVX+JpORuovmBSN3HVBLui6whOyBvSqWrcWCM0+PFM7peOFrwX2kYTWu+f+XR4zIoX6GYlMsQf5FuwmvYhwkm297i35xvPEVy6HcabmaTjMkWXsPlssjl48a399+PbTxdvXC7TMe8nbwLfD+8bvLF4cLRbF1v5umu8pv7O3eP1msW8pb+W+e3Ih+XrxME9czpytuyQvKW8NT8ia0KNq3Vqgr+9PhkSw9VDa+pvFYj/fdVkqR4ZoSscezR05hw8v9Rhy5/krUex8utjv2332ncfSS7xePG87fY/+C5sPJeaLxcO0Bhqnaa8iQgghpKT9iizQ70sIIYSsORzmCCGE3GA4zBFCCLnBcJgjhBByk9l9vtjv3jPd70sIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBAyAVvnVVW9gOWK+cBvFBNC5s6VdaLsMZey/fzw6KyqTt4934WLcp3fw9dTH8HcYOf5YrF4ege2JnvivVjswEbImnHn+dHRh+rD0dtn3Y14Luyqqr6DZSJWkWaD3RdH7/U8H49eP9yE2/RcWSeq5+nsMcnGYxngUqJM1nWYe66eykM45JiOKHtwIGR92PuI5hn4eA8+8+TQCjHtnHIVaWY8PbUzJJy/Ws1Qx2FuLdjWi2oRxMPdjTt7L07UDJ+1HeZMCaydpteeASuQFYrDHFk3FtowhaPFs68Xr468t30Lz1myLX3G+TNYLobWQdbPTJBmD5vvrNars9eP94SHh5jnf9hCgCnhMLcObJqI0wFh7ykM6z3M2aT4tGUGpj3HKw8CF0LWgzs2/TrLLt+ena18eW7d0Uq5un5mX09XfciuFe/Y/L5awcjKYW4deKy18wqWgvUe5rbbg+hM7QhB4ETIWvBQW2X1NWwkoLVyZf2MXU+fNxZEXTbTj3Mc5lbPc5s9nvYolkn3AywFUUL3Di2d8zfFIuHm40OscX98l48pb6vq3Zak8EY9k+WHp28tper87WO41Oxh8eCNZlcN/cOcZ70coV+Jm1zjtQ1zXedGXjf2LLPVyVNcIm498yne2+bS6MNQ7rO3ITQoC35P0nhfZOTVeXXeMbMgc2W5quGmcdtCe8LSpjWwnY4KvSWTww9ZK319Vp1EtQ1p3RFXoegHRK0+7sNsbC9sa4c0/GcaK9DaW+wcCep2qoajIwwIjTQH1Et77kqs1zjbhi1hTz0q3RK0J2cqdfaFuC1gXtqVpN3e0k5U6O4Vu3tYZXCPeaPBErTyHE5NrKLaV6VdQlu6UzVSJ7SN+o+c17vGLObOPRderP2vXSiB87Qd52f5YAksGeZsSCu2odjFqeSjMcx1n9vzupuW5kS0IO5zUbJJ3n4oVuA1PIRGwX1RONOqp+oy/bSRXCODVM2aeN55FyxvWkPb6bjQ1sXXvbigDkH/PK0dHx5BXUrzrS9XdMRMwXyuq7d4BmvEVKVIc0C9dOWugYVrFYKMUZ6wPQOQ66zVG1Jd1pXk3Z4XpasTFbqT6+thx/WYN5mskpJ2kYO6PH+eT5EUqznlg44l2y/c7H528X9+iJFk0/3ikGMx7ebZ/s6d/Tcm1nvqFnRpx+Z79a13XzI40qWErUWQ+7JhbsMSSZcfbL6s7aQY5vrObXnVlnmk06w9VxeNf76QKtnyktUXX75l5+yhTwweel57Cm6nTi+XLchtvxtzwxikajaWnMHSxoCmNbidjgs9YJjT3LT2A+YbhyQfR9/Z3cetZ1ovrp99vYWdLBnUyjQH1Etn7kq8o4Elx6af52qyQO/N0bFzeHUs7UqKbs/clPbs9SXXW2ejeswbjF+EB0x87cQZ3umzfKiDhOrmb1Jovw6xphhrGTHTi0QTS7JUt6sRgrjN82MMjRnh0mHOpl2n7qhYJuxRzHyY6z13WUrML8/i2oJtAA9zLBtIPyT1tGNdXNi201Jwq97k+VCtxPO8osm8GaZqdlMo681zhjWtge10XOgBw1xnP2C+oVhWxnQO97B5c0LIe4slw9zIeunrpZDBdASr8RHHLuJeqym55tLKOjedHtKVZGtj/dnrTS4nr7NxPeYNZtMKHui9fNj29XvltL4Ecgm9T9rXljp0VKUNLCGsyzZu2RQsaiJPCMrPZp5Z1993pmQMM/Wsrwl1zuoFzYa53nOjlG427FZBPW1yNUN0C5zv7/SyZpa04FjQitVqesZdCDeKYapmbTIbSzIGNa2h7XRk6AHDXGc/YL4YknxEcnMvWW/RP8yZeWm9dOWuxE6cnatGvXDxaVe/cZ2o1tkBXUmh/b3Z60+uIK0zizi8x7zJ2MwAfMyaSRv34lAXF11MQlmTUOmfwFxgTSE0jGZMu+aOU0nFdkr6Vc7XasznX+rS21S9KdilUtBPXVyHPiRBlpy7mVcLHpu4oHbkxfzq1XPD3NKGXmiR3Z47gwDuaICsYZP5M0jVrE3Wc7KS0U2rr52ODD1gmMsaddoPpL52FZSMpZ1kvYWdLEs/TXN0vfT1Un7i9omIjdGVXyCZmn40IyzeK9qJh3clQm/2+pMrSOtsZI95swnvNxn6noV7tpIR9yEtb0BP7bZtTWi4zZj19WKGhzEBhkHJUZcuoaVjmF0q+cWRtRlkPRvmes/dzGuugoLakRebE6d+gk3A0Ee0NXTvRN64WWswqA+5OQxQNZuSdd+bG920+trpyNBTDXNtZajp6i3sZFn6y9LsV7m+Yc5uwLV72va1IJ90/qy9IqKM7EqE3uz1J6d01NnIHpPkbNlAhynnkgYU7+md605g9YqtsRnTGusLe9dkwuPk2ivvH9SlS2jpGLap6/TnOrjZU59hJpkNc73nbua1r0Ow3U3ZHizMrHrmc0jRblLYUFzoLLkd2EWB7VhvZXTTWsNhrq0MoKe3sJNl6S9Ls1/lsl6qwELXy6Up1mfECymbP1uflOrsyK5E6M1ef3I9dTayxyQF6ZV7r4R81219mzZTo2ZMk0vXLSmbYuX3qtWlS2jZGGYZljzZMWpqFqT33M289nUI1uTzWZS7YUG+raEL1pxFVa07ydfuya3BJuadPdHopjXhMJcrj5L6Nht1OpCkvi2P+Di9vYWdLEs/TXO8yvUNcz58te1BsU06vs9E2dQcayq2MyXo7MiuROjNXm9yfXU2ssckBTblxF6x5RJKWl/WcJsxrYPv2mptLSlbyrOb+l1Cy09tEn9jrbK+8ZEF6T13M699HYJNI/N82Q6osNbR1tAFy90H33iO1
Uty67BW0LlDZXTT6h24LhA6u23Y8M0addcwZ2Vou/3Y21vYybJqaaQ5SuV6hzm7+dVyd9z3WSZDTrhi1NRikUZ2JUJv9nqT66uzkT3mbWdrkS+ibNqi5ZBVb5ua1pE37YIlNNwWeVv4rr0Xdus6EbdJuFNohfzjAnfStPMgvedu5LW3Q7AllDSpetVUaWvoirXnF5qrsBeF3D78lZbnxfT98YnvTRzbtCYc5vzxgvoyzK6WutPqHOaaZdjxq5He3sLOnjwXtCzNZSrXO8z58JXsxXcsg/mFlXVKe1oVic6O60qE/uz1JddbZ+N6zNuO3Xa1d8UoD61FxYvhXgnZXeAz1PquqUkthDZ5+2iUPIxy7/Asrt/ZwyTnPi7tmkXoEloxzFmuhNjwhdaRsP3cjbz2dgibnrnQMnes4PV257aCG95xCAN3A5EbiT/eVH186u1rZ9/6LlepsU1rymHOr3HsnVkbdyxPQmda3cOcj0lhD+Cu9tSmhb29hffZR1ZS1Is6hDRHq1z/MLexpaOk8CpscbzzyvcP6WtiUjQdm5gkfc24rkRYkr2e5PrrbFSPedvZxBCRUD+I2S+h+E4b57n6BSG0yTtKKiHe8PVV/cjZro6/XUIrhznX0WwqVgbpOXcjr/0dwsZeeOVAJA7WHQU30BKzGyDk1hE+ApPyPjS2cU1rymFu40526pMdDd6ZVs8wV8/nAG6D9fUW4VrKMMcizZEql+aulfzNYOB9cX2HGzhC3UspY7oSYVn2epLrr7MxPSbZ2H2D2Y1wpm8Citg90EzG2lfXt2/9DaXC+WuZB8k8Lt7AbcZ07oVph3D+Nu4oMr4O8rZ3tuq+ya67GKZJ6VWRzXuyp0gaQbrP3cir7W9KJ3ZaP9kdtad10zzJX5LWVfAwoWi7901uGeFNvcLZYfFE2Iim1dtOx7fqraDNpn+6ISPoX7NRp/1Awzd2C5J+8kxYd28hbIfR3++CNc84RuXyXqqd7YVfdzpFVxTwT/Y0kxrclQgDstfdK/bW2Ygek5ArQoe5dF2VEEIIuTnYusyQF0QQQgghs8PWjDp3IhNCCCFzxnYx8yVfhBBCbia6VeyUN+YIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEELIeLbOq6p6AcvsuVmlIbeSa23EH+TchzATcj3cWzTYgtfF2JFmHdv1rlrewdLJsFAF288Pj86q6uTd8124rIKsNCvnQhXRx+QJkouTqNr+3l6blk0nriylq23EBXruI5jb2Hku9fH0DmxN9qzCdmAj5AK80lZYcDmFyHTq0BJc1kaHhUp5LANcysp0+Gp7iPEVsYTJEyQXpkXVPjyEH5hOXFlK6zzMPVd/pagKYMO1sgcHQsZj6nBoE6bI5dQs06ntk6o6fwZLJ0WoNIFWtnURRsI83N24s/dCIlcVfCbnanuIYdXVTSOzl02QTEeiau+OjtSinKX99yBxaaylLTJL6XKN+FKRlw5zViun+tO2KGOabtrOYY5cHGtlkzahyw8MyxLYtEu5NNN7T2GYnGudCI9mVpm9bTRUbfe9ulRn3Qt2rWiccUJe+2Fu76P8nG7CKUHHv1fT91HkljHHYe6xBngFy4rhMEcmok3VXqhbNe5yW2OME/L6D3Pb7aHemSuHOdLDc7vaP/0a1lY6m9BzuWQ6Sj2enVTVO59wvavO3+px99BvkR09Nmcn16m359XHfZiN7YXPYav3z+qb8DHUzpGgvqdqODpqVa+HGuADLAn3JIfvi7K8Oq/OfUR8K9nXM955bZk+f9NYJGnLWizNvUOrzSIaEt15Y351d/Xw0NZhZKr+9mk+Se3PRlJdd7wGIm+33V2uZh+H1D++i8XtqLlG/V84a6SHC6vanjrW96UKcW29wuLmexdVl5BbWmGaUmzEd6Cw71KF3ZLB5EOWL5H/iaXSo45P5QTK+ds0LWcPp3mj9aGGpcOc63Q5ddW7mXKN11ZxXacPbXjP6qI6CU18S7sv4W2zTS9ViFiv/R0MuQ50JgSew6mFzmFu02Rf70LWfcFYPjet2djRhYbIabyhlw1z+ZhnGpWC9lGHembOCbFvTzEl+qCtOcMXM7MYT9XFNVZNOzve2EFaL71Z27LSB+ponug917iozfvBHnjt7kp/NuqKaKkK1+dt6GTk3Ne82msuTVC5eNZIN5dRNZPbCSy5uF6rLaKq2CPkohVmKbklb8SHsUe3kXYBi6EOlkqXOn7tw1jgPJ1I5af5YOdePsz5Bp18G4qt2kjrblRc9+m9KnZTHTmRPMd9Lkp27bxcIZJ67e1gyHWQ9YaJ7Ao6hzmby8QGareCT6AZJn5N/8R63l1v16GJuk61WdDa3t1T89YzTcEHxzzU8lUSdPXnz/PJ18Y9dU0v8yxh31ZtRm2mtrtt21eL6rC9WVNao5mv3VfY37mz/8b6ON8fc/bQR+GHrpFRf/uzYb7Nom/qKc5dvRZiOoTANj16TLxZc3mCl8ka6eRSqrZpsdBRZ+JSUYT7djuv64EoDWNYrKIVZimZRXGFvecKGzrnnmHOUFt2PlOzEGPHVkBsccewy7LqSLVna+EtbNAwt2HppLvfbJOlVkxRcX2nt5JqKz/SGfmeT9k0/vlC+ootb9P1xdcAhcjqta+DIdeAL4YEzuHaxJpQynm98GIzlVNtAtZ237grZJte6d2x9oHlAPPNFCxYpIfO28RDLDhkoQYMc8n07PRZOtSZc/IcrOrAuQdApmt1Nv0Iqt6ftc5o8E0vLE05PySZ2rFeMOyRGZBeo+i7qoxpkjWmpz09UpbgpbJGuhilas0ZpdVy6LTVDHHtqzlpWJEkjAPBZcsbmeARoFRYaPrYYa5cX/TmCYt5fow5wRrJkGHOnmE/dUfFmrZlOa+43tOXbRiXo2dxqdKWoMLV3zCFSOu1p4Mh14BPEgPd843GMJc2eGtBMnzZgkK9mdHE/z4VrmkKVl6aCgaLNyo352RRBg1zogW++K6c1n2HaVW02uAFZW5keksdoH9LstYVzX2TmoFLvmfMA6WW/vTKopuqdl0kmPxicmrJoqcJXi5rpItRqtYc5qyRpiKC2XSqbQ00CeO4FNNWmKfUlGymsCOHOWsVyTCDgccLZp5Zv7+sEdW1YvmoLwv1SswrM6u43tOjpG427M5ivdzhmo7oFniZQhT12t3BkOvAhyjnYybKjA7dA5t21a8pnSe3bk38maK59H3Wk/mmFrvVkLS4miLB3NbDvTjUhVHAVs/PUNw76hM0oplpLRs0fXDWhCRai68ti6S3KgRzS/VwRHqo2k5VMkWLSz1qyaKnCV4ua6STy6maObdczflNg+owXccz0jBGW6vJ3FoCJAo7cpizJpP0BjLjVBe/xPlajfkCgLoMGub8UilkREuPISiruN7TN0tqwdMqVDuyM14h+joYcj2EN4V8tBtOHfQPcxsbGEgy7W0Rv7UO74s7FcwmVg2lVYoEc9sS7tkyRHWGG8Oms1he1V76
oxvbMp104oOzJvQPS20p2XSzrSMz+tOz25ChcIGn3gNG4vnUkkVPE7xc1kgPl1E1u1GGzjYXwV646ffxcap+6tIp5EDm1hIgUdiRw1y9ipLhIWzWlZdRXYYNc+mMzvKHZp9VXO/pmyXNp4GC2pGd8QohdHYwZI1ZNsxZU5CLufQR1hbxW2tyHe9UMNPnosN2igRz21K2bKALqx2WE7uxZpoSW3Ez00knPjhrQv+w1JaSTXG7ppv96VnUvJOItyXPdYu3xh44zF0ua+SytKua7WoIt6QaIrjjj7EIb+NIp7ZOIQcyt5YAicKOHOasR3hhL3NJwEBsZczHenUZOMzZ7m7bamU3JsPySlZxvadvlrRvmBuvEEpXB0PWmP5hzq4lPtqkElsylBbx6zwMN987FaxlyzAoEsxty7H19ni33vRAGq9pb3FDMUs16cQHZ03oH+as6Rf1aW7Iybj0bDE1uz9jN+oTp1yN1ZIlniZ4uayRy9Kqav7cTmimTREoj1UOddRGmLZYmVtLgERhTX2GD3NWjK5FdNu1lnQVgroMHOY8K9Li8ixlQXpP3yxp3zA3XiGMjg6GrDGtuhewqY1cJtlmsFreTfHblBQ33zsVLKTWpEhQbd138pvYEnnc4bapA8EH3+YdN4e2ZTrpxAdnTegf5iylXKtty9gZLGPSs+cI8vsQDXk1h7ms5tIEL5c1clnaVG3bVjvj7Z2mCBxzD6JTc6eQA5lbM0BDYbPGrw51Q1Fbcj7r3bt2RFm62TqebdDJW11GXis2TL4xFa5zlAXpPX2zpH3D3HiFMDo6GLLG9A1zNrOxCwe72on350z86aaULVNWrFVkjSOz2CJBesd2B1clRXuyR1VgbmFrkb8E0EeDejpqivBCCxZuFSvNNpt24kOzJvQPc82U6pUYZUR6trMhPKsYsLtydfk3bWoZ1bhRc1mCl8oauSxNVduy+WOyZaOlPTkqujDU9AsZZG5mSXbVb2zZNWRYXDT9qdcyTO2T7r88nzXBtBml2PpDMgpZoYcPc/Wtt6Td5UF6T9+oir5hbrxCOO0dDFljmroXsUaA5QFbKw836FxrBP8yHZ6NCXrSVLDYUizFoNW72l79zEV7MlU5shaUNtCAvRzB3iOkPPRE04US11MhvUnQbLNZJz4wa8KSYW7Tu66gPP6umHrX8vD0bGKbpy3YLfIzVMuuJZ6ocaPmstNdKmvksqSqtrX3NLxiKt3on4hgS9p4mM74ZCYoWL+QQeZmFuHcFXYzV1jfqujvyNq4Y6cS6pGpcT4fipK19HuHZ3HxztrYuRcT76ZOE2tQdkDa5oQ41ghFkL7TN6qid5gbrRCgtYMha4w1oQJbdLDL97qx+aZpH0xc/HnM+hnKpoLVLSW2D4AnXMr2ZBM2J22hwJ9yyCienYZ2Zfcbmm0278QHZm3pMCeTPWy7q0kW8Ien1yyl1UV895DzXIPVlVTWXHG6S2SNXJY2VTvN97MnIsifxhM51P3+EiEbmZtZMLgF0oee/WnxwMmOjg3JyNRQxzC9SsCuDcFWfiJnuzorHTHM+Zib3X0rg/ScvlEVvcPcaIUItHUwZI3xbYo5KlkTcfoYgQ8uppVB/PfCZM3fIQSyL/I3Ps/vr0E16jcDN0JtB5VsvV8mDf0NpsLCmb7GJ8fzmj4l2nIOa6tpmGFZy6I1fZ2nda2e5G8kG55e3FEZCNPMmM/z19L1yWXoedJjFTXXPN1Fs0YuS65qHw8f52vvSiYCvH1YyRRsgJALN7Xovet2hVW2wrnsdc/6QoL07l+LOsakhPO32QMPG1+HglpiuhCU3UrMsellelVk6xXZo3eNIN2nb1SFaVFyZdi8ozZGIQJtHQy5YXTPctYFbYXpsgchhEwGO5ibz9oPc7bK0vp8ACGEXBJ2MLeAdR/mbKGic88xIYRcAnYwt4E1H+bscRi+g4cQsgrYwdwK7CXh6zuZ0c1Tp1w3J4SsAnYwhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIWQaNjb+D54z3TH0g7mkAAAAAElFTkSuQmCC) ###Code # only_sleep and crunch_numbers """ Created on Mon Oct 22 09:45:22 2018 @author: DR.AYAZ """ import os import time import threading import multiprocessing NUM_WORKERS = 4 def only_sleep(): """ Do nothing, wait for a timer to expire """ print("PID: %s, Process Name: %s, Thread Name: %s" % ( os.getpid(), multiprocessing.current_process().name, threading.current_thread().name) ) time.sleep(1) def crunch_numbers(): """ Do some computations """ print("PID: %s, Process Name: %s, Thread Name: %s" % ( os.getpid(), multiprocessing.current_process().name, threading.current_thread().name) ) x = 0 while x < 10000000: x += 1 ## Run tasks serially start_time = time.time() for _ in range(NUM_WORKERS): only_sleep() end_time = time.time() print("Serial time=", end_time - start_time) # Run tasks using threads start_time = time.time() threads = [threading.Thread(target=only_sleep) for _ in range(NUM_WORKERS)] [thread.start() for thread in threads] [thread.join() for thread in threads] end_time = time.time() print("Threads time=", end_time - start_time) # Run tasks using processes start_time = time.time() processes = [multiprocessing.Process(target=only_sleep) for x in range(NUM_WORKERS)] for p in processes: p.start() #[process.start() for process in processes] for p in processes: p.join() #[process.join() for process in processes] end_time = time.time() print("Parallel time=", end_time - start_time) #Repeat the 
Experiment for crunch_numbers ###Output _____no_output_____ ###Markdown **Multiprocessing Module in Python** ![image.png](data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAACAQAAAGdCAMAAAHP9fOgAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAABjUExURQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAGZodN4AAAAgdFJOUwAIEBggKDA4QEhQWGBocHiAh4+Xn6evt7/Hz9ff5+/3v+H4wgAAAAlwSFlzAAAXEQAAFxEByibzPwAAm5pJREFUeF7tnYdiozoQRW2nOZ10p5r//8o3d2ZUQdhO2yTvnt0YdY16QcCMEEIqDvz6qfT6f+G6iqVfB4iXioHTF78KR379CP3sEZdONcY+TIeSlNT24in+Gq/6OxrO5uCN3mSpyAy2DOhW/oJDqAO9G/YAgYm+70Ksroeb5a26eDnrZ2ItqhV+b2f9VT+7fVxp4LcwEqv+VRzKRQOe9XC5eoMfDQnXWX/Xn6lTsYB3NVb/p7N71UhMWi3XKguc9mvEpd7VvUShHs1W4rztF1CtrvduYAgZzA4K9Yj/8lPkgESnV/k7gwtwEnNAZHCzpflCcvZNhE4utyKDpM0M3AK/biI/a1cHvV6CE7PQq/x/QRSHZiK/EtPdGs5VFg17CSvzrCw9B7J/GlK/etGL6s5EBqFVBxSxOJeL5IDmBwKyHLgW3dryKOSA/JkwqpE64P+AWsyOTQ+lFiJ0cGdqsJ7dmXEKAELtRcF6yQg4RZl5DlhOql49K8szdZj/m8sf/uuP/j9Tp5pq96bqktADHMwucLnMOp7TGcxOZhK0gkyRzqdTP8di+ABvwQMsltKMLJJLi7OTJMD1AX7Fh/yK8b44lE5MNOZ9ObcoDuEFvjUmD1mSCtXVETI1pAPcqPnc4hQ7xCM5JBf5j1DgX0tGA4e14Gn5OnIRx9F6OYa3uUlGPU/52yZQQsj/nYmeorJJLl2hk8kRfk/fo4O9zuj28LOU+bZc7D9+5N8VrsdpIt4voLufvcjkxbJEZl+nNzoLlBHMQvk93a9Kujo6lt8LCK85IQWrSpkFyFVH0zfo3XkPXS+jLsCsbCZTZRirJ/sNml+CpkJIQj9gIgZiooKdVAnVWSHLJBvzvktoLNGe9F+ZAz2m60ga1g5W2EhHr1ValWqC2ZdZQI3/aq32QQUNnP+P+J8ll5A/jfRh/1tukHb5wzaMTAlO0J9jXiBGoWfv+0uo7D8mUGqpDsSXqMxK3f5CHiG3/GGvoZ8tRHXSySiJ/cCun2MiiCSHPQa4OdekItFdj30ItZQfmP5aeuy2yMz2OiZDd0QxedakqSm2kWz/xgxf5ddy4MaMZCr5R9H014waEkLIbwad+v+Q1fzJVdv19zJnmO2ZUqcHv3o8wBxOd8lEobcd3axfncxkJnShGyWiv531T+t1P+tWp6K/hkNxdryyzTNo5E9mBhf9E26U2mwSAf149E4hhMXF/kxs3CCezZ5VPXudh/t7umWmyv3Val9yRGzNxq5263DW3Voofs/ux6OzX/zltx1NpRq5hBx4MEO1hgJ3sKBEs4AhcuBVpoy3dr/1t+QAQGHKH3pCbdsHS9wTVeVssZS8Weq/fbsbeQSHYunbpbIymksumGsxw6ERvQf7B/EN1AFWDwj5I7zObq9ciTMJn0ceWLhF8dOw1tyns2pZX75dK59IWRHAD+wzdPYif3eaAzrahf/9Gzp22M9i5fApoZ/nEDedjIbiZJX73RPliXlUR/JffsSrXNVUfjC/lEHHDt7BEH//bo/FJIg5cIY6IMqrYI7fspRtHRHs30IOwOAFO2+S0jB8wAxH2lRpV6lry5RcVDjNHmDzie9HBMO/Fz3wZgmR33PslD31/VoF1/+aAvmTIlSV/B7qzNhyAB5EIZNGmU3HgOHjwJT97EyuyIFeC19M7H+8/ij6HjuAO3AqaUidyST/08UoIYQQQn4ex74ZpLzruZ3PeNgnkCZJ/WiwcWN7a8IqBuRqgH2/OAfVqE3dnpWO2sCwnMr6mcSxKV+W2zUWTiRuyjdo7ViBLGaEmK9OjHmMDCcfbrHBXUzIfdMbBrcve6trm6/r30qPV8AgPPTjFmqGw3Ya8lx0umw4eu4fsfyBAxzVUGf4s3WSPqnUv56LGgsPC0T+4Y69O12qY3hWE3Uk//blqusXPKD0ZCuO55nEpTJ04uFaI3hcwUZ3+6F31MoUEgJWfKJT4RVc1cz/2d74GcwRsgqmzwUBs1Cv9zhqAdQOR0vUCx7iUWkXVgP0EKp5X/mDSupdXFlMqW6qeQfHCNp2pO0fihV/9xqQtpjVqpe4xKFGhiMvCEA1/q9AtGaKrXD89x81ezWz8E9NLZ3QSKVJjwVlGWDJV59CyABd+KrX3raQswywEN3hnSocM/UHSpABPbYngvuUAQvLAPwoWGRrms0ITu3RJ/yrkOLAxrc++rOEbLHtYBdcbLsDWc1ik1wdCHPsol/A0J4KQjN1C3EucS+1lACKHeUtQV1IzvSnaqpG+gvvx/6YEmRDMHsakyglCI9WkKs9GOSmoodDszxTO1HinLdcNBWwPYIYS5ff/n3qnt+uQBzLmRFwRGsTTc8t5l4HCCFklPXWnUTfh53c4KPhM4yMPx8kAH/Y0bWHgvp9PDb0gCRo6mDkSvntZweqk4wwo9nsGQr86biH58ZlRpFOofxs4sY//mkycAnPEKWLP0HW99jvvRKdzplgZ//sNog/mCQ/v6cGAEhdZIBfXWkqvYvTYyKD1EkF8U1/ywDMMc2B3eD4jRmAKeMB0hP+q4n8yTLlUrWqkT+sXqQGiBFmm2p4ovb+WJKE8rsy4IMg7WGdSQj51aBj+yifEMQ/Ich9hjfUSEb0va6jsX8kV90Dkr++PxWF3QmX/2L2pqoz02M6AMPfCOTGLs3LrOuwodDPnqCYLaGXxIU3KN1Do2a21SGIxu6HL8IjRr+WY2SAzHButLTVCHsqrlmbAlNfzII8pfpckWaAPYKE/CGEEPKb0a6/RdyJ38RkKD+J4eSl1O+V6RXLdGNBXTYGvobxjyLeHNX3Iz5pwjCl05cr3uCwrFjCVLS2GaC5hbEeL2KEVnSYOa1O1aP+V9+PbvvD0QT0mAKqCkZ21ale0rsKF/nT2244E6oGC9HbltA9zoOK6kAP1kZfPx2IaXt8KvCtXEPz9U3g55AUe1LIMsANMBX0R4v0eXxbP0D7azJAb+WimXtTv8ACQGr+Pib7uFcnWeKtHg8KhcyRhC9wJ1L1+mTRmapPbQMRd1E/88zIz6BZpNwTI3+Gm9nKj7cIeA9bSfHYRKFp4q5GXxr6w7AX38ZubuwGBxx4TzDSyyefGc2O40cBKWV
0050NHG8BPQ5ene7N5n5so5cKgclNNsxdhFyQsU/A+RNRIBRVoNyxtRZcyTVTIrvsCSPMujCGqpXuuP0rNGYtR1E92qk0TbReMT0StUpsF50NmR1+kAG4ii4+E/TmTgX8uBK/cGsPGKkPnEfyiYa7+XY8pXqur19LQVgGaIFASLzgGwZykXGvP8SET3MEf/L/0qdQuIeKR4IQjHrBI0TmT350yISNZgDM9Amj+WVw3Z8jwB/EzuLAg+bEZmw7lRBCCCGEEEL+LXr7Q69Xs8O+w4NaTloQ2XJOnfoZkQ8xsdTymKYxefXMZtrP34p2xA3sPOioR7ESaXGjaXOo4y5CYvHRh50YDc7W8UMa5uFeX6/7digSuPNcFWK+2mrfa4D+6G+wL/MfOjcpHJgMmRIULgISU3QWMsguyZkJ4GJYDXBLuySXudIe+qucjjrGNag9lnjNEKOUL26Ve81c64247DkzB941YMOTCxcHfn/FXLufQiO4yp4q05JWExdKt3lhN1oD7uNdzHSBCjVAPOBhTjP0S0qp/UKxLtxJQSAmHMYPrQKXN7yRxrSCvqlbruJSv8TVH5t48neKR9+gzHxorKoIz+OpK2sxao4/qQT+YiJzY78AT1TibTiSKDG0GiABwKX7CLELtUBmrIqXmT7u6IZ+sT8rQg1VD5QZcrVgLGZ94MK+eXevBuEea6gB8brn/qTg4n6p/q79YYX06xl17c8qC0HylG32p+ZtzBYFp+6tBvT6qABMPF0aqKpUNM0t0SV3oQaYOmwIqr+YebgFa0pEgbcn6ecFYSV/TuEjJkVf7e2u9vSKH7vT73vR8qNXKF2Bi+49akcR+gAkUh3ITxZ7KVAYGXGTwFQxTaJUJ2olEooGeosIT9PoKys1GFEgZjXXH/1btmvATSaRvdo+6IDLJL/4HzKq1+osBK8p2+zPRGnT9/d4xtrdWw2Qgkd+4f/9nsaIQOUC7GVQS7x5PbjrT7HfG4U3I23wQcwbaYpLvE9KvNxZFP0xtFFsXKHMfNzjHAbecQ/98t5dqcO1Pc7Sn/UvksX9Q4eG+tRfi1T98XFw2d/ZHr4F3nd+rhMt03zE2GGUCyRI3dcqJrruxYLTNEnnJ67NCtKqjeQhtPgTCeSqwajOjN0SfxB49aQhwjuyBldLp/q77S8f9PWdYvYgVVcmZ/shXZYP8j9klDDXp+VDYizbUowuyq+lFt/0WyUqOto1C7Sg5W/C35ZBwtmrP382zZYB1sCbDjKEEEIIId/FN889rn7DebNfSo8z3ruX56QPWG4Isn/zVzZsx29f0Pw8zmURaSrUAH1HSL/cw94WzF96P+kHg4t+Np9jr0IWwcLC9uf8/5Ne1u4IaiAXPUUvC265mB3KMGzFiMredobY8IFbfasdwsqEkHjwvKbJom+vc7/kU8AugitsSwnZq1mMzMcVNlpP9EvE8eVc2SVY6+aYfq7YUUvptw/EATYCYSc1QB0qWj/i36PubkAdjPTn2YKH1voAU5JPBn2AXvGH0Ra5vocnZ1Sn+PtS9GaFm9rOMDapVCOEm2gCVHiI5k6UWupiJ2WIHSxDFdormAZF7ZuQSQi/qCysAX8VL1TtA8j/kVAD7GEFQgghJLHdM4rk04mvUS35hql4FQUn/9+Krr6BL9pwj/ziIRSD/ELhF/1d20vEZNW30OWcWrjCLGanYglTC+cYp7v0B4jF2dod+mEYcxydytUNyPdi2b60HT0gF/t/hkddFdEs8TVn17wc4OSKKoPbS3ssVo3Nzg9q6U9waWrEBcxML7o9EJ/SJd+LdAdrK5W1b/vhQXGca7oMPYX8Sg2AS1GFU1kwn/d6PlOU4h4X/dVwbuKPujSFXkINQBQambmy/+RXsX2JsWwJIYQQQv4xtzLrzm7tB4p52uSkbanPVvgEv0ZMYdF3skYwE8eXhELDZ4vdXJM2odT1fIBmqz51KBZ2DEDVs9kCzx/m1vo8oj/WCNIpAiE+rCioewsmHQzwQJapBvRH4UONbheen8yesgyyQiv+gkPyAbInR7Gqf00PX5bPLNrTLqbRp8CgeLOzPr5rf6chZcc98Id6gef77L/WAFPGC34EP42We8C9YrXGn72xDufLwlOq9ugn+TT8jJBmqvwUV/9VPU57ifIIj6xKeRc39KWY6hqQjn3Jf68B8XFS+Qt9ADaEUNzFObEb1YbHUxf9MaLPn1K1I4vkU4inxMKjlP7MohrO+xU2ZvFfH/u8tfcFXi/1RZPqpO9v8PrEQR8gQ78/qJ31Afbo5Ut/6m8t1C5Hfu+TB3t+0iqEPZ560c+etAKGp1lZA34BOx/7Ch6sYhBCCCHkn2HT89k1ForzeOs240ym5htX5e0hfWKwz6x8sriBwtG7oiTOjZY60Gt/qOu0vr/VoyK6cSi/J1Ix+h41QLQX6gWO1Aeu5kRN7L9by4/u68h1OevfgkcYuQL/9Q08otuHicSL8KSmiU5niliPJI8SyLN6A2pk5hqGnVEQAxNbNabSUMSVB9bfBXMcYsByF5sgqrTvk6uduB1+d+Gv8ajpFWwH71UW5XKRPy0JUUkZLGQ9dvxkfYDayd96Prs+hQ/Vq5OZvnQwhGfO5E+n+67DAaOus10BV1h+z5bYI8TpkxCvfhdcHaIuHgdT/O7bxazlzzRvS/0qqJva55AscI0IoQhFYPjD7+zkqetWnb4+zH/UT0jL/wQUQdgXlD+UhO4QXOtnOfV/VgPwa4jKneiPWOmRU/UctxjwK3/piFmwWF2HPUH5gTLWgDt7GZ+YwIH2E/J3jydNTQOt/pkG5TXHU6ceq1neB1FDDTAHZqkOJMAjdzObdYhUfoI+KUhFyMKvI8bgZfc5fGpghBBCCCGEEEIIIYQQQgghhPwqOvlbjh/y/b7HdLaKSe8U+g3J3UTb+cWFL0968eeaCsTKI98c6vjd08/N1lZom2I56ns7f9/jJMTTcxI2OxbhKnG60yvBW4znhzJhFXF5Vb2Vh8h2J89yQiwj2ShW+H+1RajZK3dz3LR6pnIz4wdWWqFsCN0Oz8iPfRNP/saqjAWCh34u/UmynQhPAwh6LGiiVlpM5sPVA9T80b+x1nI0AA6/oAYIm0NtOEAISOjnvFI1RFJHNh55fPbTrOU3PNhjxATbU5tmbo99mdodVBmTHvec5zZWA9REa4ASPqtZhtFbh5vXALNPT6vam8f0R6zCVbFr6rKzR1zVoZSVOw0+kqL8zqc+zCSe9HySf9m0SJNamYQp1Px519yxuYNJGYT8T40j9+dqfXA36fCTJ0mJIUpoqrbIVJ3rS+xopIDv1An9yTWM5E/Pfx31z6JD1e5Xj3BhrrwG4AmufoXxwNzF7yeCF4Qj6iecwBSjVwu2n637K/m9E6V+ZWDR909+jNM8yB/IfNhPJ1GdSkd58SZ5pa5w+MuUcCHXJ3FiWgjbdzfqbHYLrxICBi4P7RZOVrkPDQDkAs36a/3CIPzBRBxLXlqaYKLvVPc/KUIJVYPTo7N40M4Oy9qQCWf3a5HmVvJmTxxem5mGaLFpMJZOj17zyjXye6mvaU
e6LB+0ZDSjZq/9eQ9/7lotEEInv3rgVgzUtkKs7PCdOpHLbeoD5M8MvXPDqGsqrwESfnjSGPpUA5B/9llVJDM+Ciz/vaKLUvsAGEkx6mFEPCwoLccfXyx8qNoeKs6fV4axfnoWRzkhNgzW9bdyXX6zDkWXGcFHjB2mQSCJKX7KNQ6NUl6eJn2QUlIarLQGuBJXQY8gQmC5WkOQPzgqvsmrNSAmND48Lf7CB3BtwDWd/g/PbSOszFRt5OKPVesfTteK4hVXta3Q6gNCDehaNeCg0zavBl4D7ARnchdrgF00t1ZCMrIasETpxxoAE7/mT4IG61gDQg49y1W5lbxyO1PYBxfR+nCN5ZAXt6uCUfBh5hZ7bnQpUZgetUMdFWlSE7fKa4Amc2Ynn6HKrtFRMKxrQLjigoS8eTZLc/N4Vav5gLDUof63h6zdxJsrgkPBRVGaXJi1/I7WAH9y2w28BhyaNrobqwEwh9L+II9/j/e9NUCksbaqGm0uphp8BdmfcE95Ltfg2o2CDzMf1gDJ7fTlFZ2p52laxKxS91UN8OxS6xAzGm5wFCynaoB/Ahlq0YXXeKYn9xGWOpT/1tMDNdm5Bpg1FiujNUAHrbIG6OsjkaToDj6Rp9Dpn/WYwI3Cw+byp1UORnKJHeJIDcADGqqWYtXijisqvaAf1rwJEb/iM8WwAaZIef6IRT1UwQgX8WG2qQZEgdTEem03SWmy52SiVVUDdJoKNayzmDNp/OI1wBMa0gkrc7pn2ewxKaqyD7ZYGG5kqMnuNaDDY/lwk9cA+4SzxBOf3YdTqQGhxhWfepZey76jDAv8IW33y2wUwLeQxYX69e8/5y8qCGID1YevJ+OjzPZigfTOAv/mNH4kLP8kdPYdbHvHARzIBE2/JW3xiq198hpW0Qf+zvy7zdmbE5DOoBaNf9/a0mTBuZX8lxogcz+NU9uxf2Db7HGx5X5RA+xT3VCGz0Rnb1yAvQoeXt+wXl6GdFk+5DVAQFUOJY9r+Jb1djVAHN1LxAKWGSKVrSDupI3N5W9hPaua5XTYQBTgbja72YOL4FkdX9xg7RKNxPnySqPYh1KuHgSUML6ERsh9zDoZOPbn2hLO1bW40vTgA2XgViMWMWw2fAFHBw8oCHEp+XSvCygrgtnFsSYp9xFiV6MkkBVlzDuJGwaeJts3NStxLP+lrtxhiNJQZt2BGwthBIEyRI2/k06X2aqVWOUa0qluZ8sHCVPQ51Zmhyt0FJYuzQeE5UFpBIczOICJrhUl26A+QBhSthbkr0ULI0ez3otmGm11oA/Va1vgUWaLEx8N3+574hbzVsIO0rkdYdD5w7wzZ0CqAX7dmjn6Vld/iO0DeW86IemuFZwQQgghhBBCCCHkPYQ3wn8Xf3vX69+CPaXdt6MmveDTcoMiqwx23HJjDfg6kLe75+9kDcABhw0lfMQa8K/BIStT4O989jJ7EdW5F8wL7lPOOjuXY5pYZLc93hgnNaDDXcnZ7EHvYpsjAzVA71jokTy3g36pb+gDvb1y8GV2itBmzxZ8IcRBb0ON+8bb4cjnId10duJGsrpf3+ot/H29La5nOaX8cRNW/ku5ya+Wtzic3d+iBvSXdljkZma3puFILnCCPqCTvz0tN7dDWUrbVxeis2N2Unt6vBcQXzOVSy6E+Qmy9Pt2Qph8ElID9E6y5jIKzt6fKX93OM8bbNQ2ahTX2igwtAaYBqAMVZ25SI5wmBc6feVkUEtRZ0IA6ahyWXafrpAme95cJV/t5iKy11qZdNB6VEF6YQFHoND40VqBt0Q41/PluJEqenPkaN8P02d14QGYkQEVjkVpobpN/1gIofFLD2HRqgPpa8jnoxkuF8l8PdAiP+iXw7FQYPfZrc6kGoAGqxpzGG/Gi6nWgD09TRvt7L+DuoXwihpwWQiB03E+Rpgb1oAvAnmLi5cFzkPqicFoI+R6/B5pDUhGOqCYGoQ+IHNh/6VG6JuGcUbTTOQvnGTDJRNCn5HwpxjckDXgS7Actsx/sJapTRQtVy/6eyU/dhIFzwSdwjk+WBes4686gQau9RrtfFxRB+ZsLfMAM4EF3FRCdDIEuA0MWAP+HlrkuNqF/O9gDfi/k29KEEIIIYQQQohhD62S7+e8j3cO/ylcI/4j9H0Nw/aHj0B+MVUUrAHfi23ZQ+EX3JrXTXncFNC3R7gelqJa6mGvx9lZOMuhFvpBWb3DLG4P8KS/nimCqX66NXy/FSFceggWl0WhkZkrBBSOMpEvB68WNVW44P79bHaCUxuztz1toLBSayvm7P/Z7AWvRID6bG3Wdm9Y/p/iR8KRX9UrZQj+ngrrAyQyddXP1nY6gHwLoQJY7tslfBp4+XRmNSB+Klj/wkd6Z/gzb2Zhv/K3f6Vlm328N0RjTmIIitUARGZOn/BGJfLtFEWkqvPZrdcA1QtmeWZvuVqdo8zdrf7ZL/4utRDNTq7Q6Y9bewhur1FoZOoqhEm+GX1hpvTk8mslIL8vZ/FFV4pbul5a7PoUZ3/V5PrVrHHkVP+fB3eqCz/rYB1M4ru0JDJBy98tyPeCF5L6R71FdT576vt9KRT0zKI3c/uTaZ6qUvdgDlQjqn39qL2d85B64CGYw7kuOS0EM4KlRxZMs7Nn5HewdYmxaP8orAGEEEIIIYQQQgghvxV/lEN5/3mh4PNAjwTkvPud97lg5DPoRz8jWuzHh2e7xpEA7CTACOLT7gVc1Wc9onv/QNi28EbBpyM5ql8ULNm+BqBkxfF4m3afI6UWjcKrKbaENeCziD07clTPhYYy1GvMaOi0HN3aLsGtUJQIzIMdPNU1wD/aOfdDQCAdAvB65v71kqvtIoK5jnyIcI/WSqc/nqWPcOrXLr0G4Fua1yhH+2Co9Bb44Ff/om99CE7sIv/zz1zaBzHFp4QkRvIjBRw/RKqHCZXVk9WA4CF9eXMpPyv7cKJ+Cit8B9U+Gko+iOSwfXMU+Y/3dKAY1v4hPv+YlVniN7bk3PTUXx/X28Ef+a+lNYufuZTpgdUA08mUwD7aaXHIH4Ctf8BUPazjOZFkbhdTqWBuSD5A/OYo6oIoi892vr14DfB3v2kN8A+G4iSQv+gvggDwH61cBnXtIGB8X9cA+2jnK95SCCOgVuFPPag6GNnPSQiSNeALsNwMH+HU9myfGhTSa6P8g6Gizn4josP/ugbc1DVAHdvXdINacb15UHUwUnV+now14NPx3MTlde0ZbJ81dVP5LV4b5Yf4ckSH/3UNWNQ1wD7aqd/QxF+46CegoREPD9kogEVCUNuFNeAL8NyMH+G0r12GGiCTsActPMznRH+srxFd4vuu3g9gdueNNK8BNnsra4D0MPYhUplJwshfHaf2wUPx5U37sGl/r28/jN9BdVvyScS1lX+20752qR/IFA5xmhtO/IOh9g3MW5iEzbn4pU4Yhs9c9rMOfYUYZZ/HjB/t7M5NO9uzN4+Kxj2UX970D5teXJo+fAfVbclPZudW6h7QU5C/AGsAIYQQQgghv5n1AR4kHrkTt83NueF7aAJtm91ZFAdPbnAzaYzSGdkAXuSCL
wBghSZ/Iyu7cdOStoMJr3g1TWBjDGCprzhxrps1s3BGNmB3b+3lQPt+p7ditHD8HTCB6KZwDM2o75ItnBhF0bZ9sQZsgW/Vh3zUgx32N5ud6g08e8mTmOEOQj9bnukRwxsxhXHrPVBisvQAVIMA02uogCreZGzQuNZwYc5wmwDeVIurBCyXt14/g6eGXrRq5M78b3XmsUpIt+LsGl86S6+t8sCQiCJADyMqX+wYpJr+beo3SSEHpFitK5D/p2qMlzztzxZqhHvH8b/gfcDgPVDQyZ8FoBr5if/NTP9nZvgx9VrfQKKGM/2E3ewEN6KO1EiE0BqQGelZhwvzIv8lVkS3fMFZB/9v8xALDInIfeNHJTHl7EIr5vH/YhoRa7ld5Vf+Ww3o/BSQveQpWCLzzRWMQg0YvgdKNDEA/bE/f4mUFIAp9rR9yn+UqTuT/xKqKaIpfvftEmpAZqS6/kTfb2WxwlRvfJavrcIlJiL6tndeLXEGRn/83Vl6MuZ/g+UQkh1qwCXuEEoW6UueYK2WoQaEj1e23gMlFw/Af+zPXyIlY4YpDv0mczofoP9RA/rQbPUPvzI/MU1WA8xIdfLv8C6KLb/LFxl3ytdWJf+5b7eUgcF+gh7V4H+DnuDHQC3XNApcyDirL3nq52EUiHZG6z1Q+JU/CQAKe79UshVMkXx0F3o1rfUBBt5F2c/u8ZFSO7IUakBmJHQ3T94NBbHhTC5mb78eGPznvt3SFf3s2r52l1n8H8BHxvQ8YKwBTzpJiG+UwgMGqQZcrfzcoLgZew9U3x96AKaDDf78NVSusHMhYhCiUrXVgG6lQ4OM4OpDXEi52KlkqwFuBEvV4WOXcrVYoTJZi9dWWWDqPw9QfiRqvO9Uf/RD6jrbNV9kwJXUCfT/XwgyXwvgU0uBRfpZSPP44syMMbAGEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCHk53Jqb0sFmfL30/knB3bjfb428kXB/u/YwwM7eLxmheupZOzwrZ8jRrP+Z78ddEzkjXxamm7SU1aZUtA3a3wNt1/9ZJcQniI3tk2Mv6JD83apjx1/iqylMNPUJWtifCXbV8BdM+ODmfemDR7gRVKgx9Pewpt/kk6s9JoxYlRlKh7KdfzTNl/N7crELtlf+atARBBT7ERVUWIc45G1WaTYMyX4/C4gJvkbuoC1fhAismVizJfnrT6t/SmyVsJM47HHgjQxvpAdKmCdGZtq24cyr0spv+7tcyD24iahty+LbE3dBbhK+pgv72BB1Vqd8FmLd1IFGrXjkbXpe/92Q6EEn98FxCR/fRdQ5+52iXFfRR5+gqy7FbXHvmtBfgt1ZmwScvvMw2vW8NKOhDTUVB8vENAz3AgSrfKaIjg0E0jjRksz8Vd6FoKmLmB2EiwkAhDeJJSFZ189FKwO3Up3/go76TukZ7Iv2wG8tk6mK9Ynih+8Cs0mLaoQRPlgKrwPRQMRZHSI2eTTHpsyIiZNsgt1uSq/lSzijcThVwQYhbBuztOk6iysN38lkhCVKturBGCtpp05nqB8A+FWdCq2ztxiw8MSo0pynrgqjjrtkUEOQSdKXRiC4GHuKR0kJiunl+im10+cJl+eOrtA1kd4spmnmGrJS3BZcWbSFGlWQrCNylNVMYsW1iDoNd68PLPUl9kScsPeorLAO2fwlWUlK6+sPngFjBH4wD7IVKEsOLUVVDk7drUkV7BAYt3ehOZH35+4FqTyASb2YBYQZdfS0Y8RupG9zGfPs8Uz0ci6ALFATRVnOuEQMXRpkIc36AKsq4K4yAXJDlhKmPrmIn/HmFgiBJlY6/wvRP9sUttLKQdDohvDMy6ICRUTLzACbhxpxJErLBnm8Q4vJgqksLRzNYJS+kV84AmNEimeyhz3kc8dIDbKUbyLb+na1dQbQZ7kPHF1HHXanZEc0hyV2ozLtUmmhHZYJaYsp76/gUZakFWX6Cs2OlwQDYwvvaMQrYlRFmeUpkqzEIKFk5HKM9oFxEumz8szxVdni2SiyvmKmfPaKqSMiLi4Q5RXXh8kKFw8Aq9RI5kqINayVkYhdbyVEtbWL4GEksRlM9Z/hN5HEa2rgEnV6AJecinLOOv2AcouAOoYs+VUEd6wC1DdQTJGhgadK0J856U2YKZVF/AUY7WaGROC9+GO0IpjTCHtSypGnr2BEImQ4vMmbSmeypzwVsWMKLbJ5bXch8q614MFKlQVRyPtzRx68nEq6CGlV/BBYkxnCqmniDe02dyXZp1dYjTSB+AyKA8zjs6qNGfBBidV5dm6C4h69JdZ6stsyQv6KcyD7+2tqKm88vrgQYUIrOSGmQpirLE/LIWUq7a4MAVKubKR0+eVBhnxfHQscxpdQJzmAI9TStfY1AWIfawC7rAIr9EFiDZcxdiXHYoZWnwucIxefBvQVO0hzactyiwmS2xFK44xhb44U6eg1b60jZFKUKbs0aRNZg5s01pIiWKLS/sV9w9eS8e6AElcHUcj7c0cEhup2jKVsNqXBVglpi4nq6ehMSQxPOsqYURvv56vsDFco4ZCkeYs2CxZ4Yo6tXsXgPLMAiuyJTYS4PIBnejk5ZXqgwcVIog1qsxUJYtVS6YWUq7m3os6y5WdkWlUqloiKy7NLiAbmMzo1Y02dAE+nYkG1j0X4cUFnZVPlgXhCuMYgjHIzJA94Yrfqj28hg5brJHyLKYPdwFZO/dAjacwA86VcXiokjaSOYoIkIadJHYoLLGVeqTKVhdQx9FIezOHXvrl2ertxpMrxPBENZ4YB9V87Z+eLnxpWHaJ0XhbHmSwGUdnQp5mtwZZssIVQlVVTC5Z7EmR6VGeMbAqW/IRTtIWCziSl5fVBw8qRGAlN8hUJUvCl3cBqA0uqohkdwVDrQrxewTSW+j8Bq+WdiMf+GSBMdEF3IdwZcah9lLG6BWL8KQoUS7YPME1y4JwhbEElTWDGJ8L7L2ZhKuOfO12FILwMF1msVapspg0sQ+xszfG4ygUIfFwYBl2bWF7WD7tBpnywcQMKZ7IHP2eMyLAdWXmIrZqRSzb2pFpZZx81EmGXwhWxVFbS5yua+VQf3O61OajZHd7q8RU5QSnPr0vfLk4dgkpksqkKQoZXxVnlEbI05wFmwkcrhCqqmIhgqwgXZ+XZxZflS1vLtcdvIcWqeTlldcHDyokzGqURJNlaiiCLAkaQi2kXD+vC9AKY3gsrS5AqxBACbmRjO/Cw9lIFyCpMdbJ0E1s9VOE53ZvckF2ZFkQrpZL6koo4gsCq01cnchFTSUsXb2FMGV5ptgr0bOYPLNd7zTiSApxYOnQOYBvx9uAJwq7WJ0QMqWLucY4oHrVjmWO1F1Fjf0qYqMuZ5MFiTl4HSRZfFlJqpfosLKW6mTzmEYOSUU57rpO5sFqEKfgoJGY0DKwC2Wq3JfnrV1u+85qojawaFkXZ5QGpDTnwWbJCtdMqFDFYgRqmvRyycozj6/MFoxxwOYGeHG9Iuq8vPL64EGFeK1GlZkaiiBLQlZweg1Cf2YXQD5EKJQmaRGZKz/IsMwfv7oWSDN2lbYh6YFc
)

###Code
# Random Strings Example
"""
Created on Mon Oct 22 09:55:16 2018

@author: DR.AYAZ
"""
import multiprocessing as mp
import random
import string
import os

random.seed(123)

# Define an output queue
output = mp.Queue()

# Define an example function
def rand_string(length, output):
    """ Generates a random string of numbers, lower- and uppercase chars. """
    rand_str = ''.join(random.choice(
                           string.ascii_lowercase
                           + string.ascii_uppercase
                           + string.digits)
                       for i in range(length))
    output.put(rand_str)
    print(os.getpid())

# Set up a list of processes that we want to run
processes = [mp.Process(target=rand_string, args=(3, output))
             for x in range(6)]

# Run processes
for p in processes:
    p.start()

# Exit the completed processes
for p in processes:
    p.join()

# Get process results from the output queue
results = [output.get() for p in processes]
print(results)

# Cube Example
"""
Created on Mon Oct 22 10:04:29 2018

@author: DR.AYAZ
"""
import multiprocessing as mp
import os

def cube(x):
    print(os.getpid())
    return x**3

pool = mp.Pool(processes=4)
# pool.apply blocks until each result is ready, so the six calls below are
# spread over the four workers but executed one after another.
results = [pool.apply(cube, args=(x,)) for x in range(1, 7)]
print(results)

###Output
794
795
796
797
794
795
[1, 8, 27, 64, 125, 216]
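###Markdown
A side note not in the original examples: since `pool.apply` blocks until each call returns, the Cube Example above farms work out to the four workers but still runs the calls one at a time. A minimal sketch of the non-blocking alternatives, `Pool.map` and `Pool.apply_async` (`cube` is redefined so the cell is self-contained):
###Code
import multiprocessing as mp

def cube(x):
    return x ** 3

# The __main__ guard is required on spawn-based platforms (e.g. Windows),
# where child processes re-import this module.
if __name__ == '__main__':
    with mp.Pool(processes=4) as pool:
        # map chunks the iterable across the workers and blocks until all
        # results are ready; results come back in input order
        print(pool.map(cube, range(1, 7)))

        # apply_async returns AsyncResult handles immediately; get() waits
        handles = [pool.apply_async(cube, args=(x,)) for x in range(1, 7)]
        print([h.get() for h in handles])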
###Markdown
**Practical Applications**

###Code
# Kernel Density Estimation
"""
Created on Fri Nov 16 09:26:20 2018

@author: DR.AYAZ
"""
import numpy as np
import multiprocessing as mp
import time

def parzen_estimation(x_samples, point_x, h):
    k_n = 0
    for row in x_samples:
        x_i = (point_x - row[:, np.newaxis]) / h
        for dim in x_i:
            # a sample counts only if every scaled coordinate lies inside
            # the hypercube window of side h centred on point_x
            if np.abs(dim) > (1/2):
                break
        else:  # "completion-else": no dimension fell outside the window
            k_n += 1
    return (h, (k_n / len(x_samples)) / (h**point_x.shape[1]))

def serial(samples, x, widths):
    return [parzen_estimation(samples, x, w) for w in widths]

def multiprocess(nproc, samples, x, widths):
    pool = mp.Pool(processes=nproc)
    results = [pool.apply_async(parzen_estimation, args=(samples, x, w))
               for w in widths]
    results = [p.get() for p in results]
    results.sort()  # sort the results by input window width
    return results

np.random.seed(123)

# Generate random 2D-patterns
mu_vec = np.array([0, 0])
cov_mat = np.array([[1, 0], [0, 1]])
x_2Dgauss = np.random.multivariate_normal(mu_vec, cov_mat, 10000)

widths = np.arange(0.1, 1.3, 0.1)
point_x = np.array([[0], [0]])

start_time = time.time()
results = serial(x_2Dgauss, point_x, widths)
end_time = time.time()
print("Serial Time = ", end_time - start_time)

start_time = time.time()
results = multiprocess(4, x_2Dgauss, point_x, widths)
end_time = time.time()
print("Parallel Time = ", end_time - start_time)

for r in results:
    print('h = %s, p(x) = %s' % (r[0], r[1]))
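###Markdown
For reference (this note is not in the original notebook), `parzen_estimation` computes the classic Parzen-window density estimate with a hypercube kernel,

$$p_n(\mathbf{x}) = \frac{1}{n}\sum_{i=1}^{n}\frac{1}{h^{d}}\,\varphi\!\left(\frac{\mathbf{x}-\mathbf{x}_i}{h}\right),\qquad \varphi(\mathbf{u}) = \begin{cases}1 & \text{if } |u_j| \le \tfrac{1}{2}\ \text{for all } j,\\ 0 & \text{otherwise,}\end{cases}$$

where $n$ is the number of samples and $h$ the window width: `k_n` counts the samples whose scaled coordinates all lie within $\pm\tfrac{1}{2}$, and the return value divides by $n$ and by $h^{d}$. Note that the code takes $d$ from `point_x.shape[1]`, which is 1 for the column vector `point_x = np.array([[0],[0]])`; the full 2-D window volume would be $h^2$ (i.e. `point_x.shape[0]`).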
###Code
# Kernel Density Estimation - Benchmarks
"""
Created on Fri Nov 16 09:26:20 2018

@author: DR.AYAZ
"""
import numpy as np
import multiprocessing as mp
import timeit
import platform
from matplotlib import pyplot as plt

def print_sysinfo():
    print('\nPython version :', platform.python_version())
    print('compiler :', platform.python_compiler())
    print('\nsystem :', platform.system())
    print('release :', platform.release())
    print('machine :', platform.machine())
    print('processor :', platform.processor())
    print('CPU count :', mp.cpu_count())
    print('interpreter:', platform.architecture()[0])
    print('\n\n')

def parzen_estimation(x_samples, point_x, h):
    k_n = 0
    for row in x_samples:
        x_i = (point_x - row[:, np.newaxis]) / h
        for dim in x_i:
            if np.abs(dim) > (1/2):
                break
        else:  # "completion-else": sample lies inside the window
            k_n += 1
    return (h, (k_n / len(x_samples)) / (h**point_x.shape[1]))

def serial(samples, x, widths):
    return [parzen_estimation(samples, x, w) for w in widths]

def multiprocess(processes, samples, x, widths):
    pool = mp.Pool(processes=processes)
    results = [pool.apply_async(parzen_estimation, args=(samples, x, w))
               for w in widths]
    results = [p.get() for p in results]
    results.sort()  # sort the results by input window width
    return results

def plot_results():
    bar_labels = ['serial', '2', '3', '4', '6']
    fig = plt.figure(figsize=(10, 8))

    # plot bars
    y_pos = np.arange(len(benchmarks))
    plt.yticks(y_pos, bar_labels, fontsize=16)
    bars = plt.barh(y_pos, benchmarks,
                    align='center', alpha=0.4, color='g')

    # annotation and labels
    for ba, be in zip(bars, benchmarks):
        plt.text(ba.get_width() + 2, ba.get_y() + ba.get_height()/2,
                 '{0:.2%}'.format(benchmarks[0]/be),
                 ha='center', va='bottom', fontsize=12)

    plt.xlabel('time in seconds for n=%s' % n, fontsize=14)
    plt.ylabel('number of processes', fontsize=14)
    t = plt.title('Serial vs. Multiprocessing via Parzen-window estimation',
                  fontsize=18)
    plt.ylim([-1, len(benchmarks)+0.5])
    plt.xlim([0, max(benchmarks)*1.1])
    plt.vlines(benchmarks[0], -1, len(benchmarks)+0.5, linestyles='dashed')
    plt.grid()
    plt.show()

np.random.seed(123)

widths = np.linspace(1.0, 1.2, 100)
point_x = np.array([[0], [0]])  # query point; defined here so the timeit setup below can import it
mu_vec = np.array([0, 0])
cov_mat = np.array([[1, 0], [0, 1]])
n = 10000
x_2Dgauss = np.random.multivariate_normal(mu_vec, cov_mat, n)

benchmarks = []

benchmarks.append(timeit.Timer('serial(x_2Dgauss, point_x, widths)',
                               'from __main__ import serial, x_2Dgauss, point_x, widths').timeit(number=1))

benchmarks.append(timeit.Timer('multiprocess(2, x_2Dgauss, point_x, widths)',
                               'from __main__ import multiprocess, x_2Dgauss, point_x, widths').timeit(number=1))

benchmarks.append(timeit.Timer('multiprocess(3, x_2Dgauss, point_x, widths)',
                               'from __main__ import multiprocess, x_2Dgauss, point_x, widths').timeit(number=1))

benchmarks.append(timeit.Timer('multiprocess(4, x_2Dgauss, point_x, widths)',
                               'from __main__ import multiprocess, x_2Dgauss, point_x, widths').timeit(number=1))

benchmarks.append(timeit.Timer('multiprocess(6, x_2Dgauss, point_x, widths)',
                               'from __main__ import multiprocess, x_2Dgauss, point_x, widths').timeit(number=1))

plot_results()
print_sysinfo()

###Output
_____no_output_____

###Markdown
*(Embedded image removed: the horizontal bar chart produced by `plot_results()`, "Serial vs. Multiprocessing via Parzen-window estimation", comparing the wall-clock time of the serial run against pools of 2, 3, 4, and 6 worker processes.)*
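###Markdown
A closing aside, not part of the original benchmark: for a problem of this size, a vectorized NumPy implementation of the window count often beats process-based parallelism outright, since it removes the per-sample Python loop and the cost of shipping the sample array to every worker. A minimal self-contained sketch; the function name `parzen_estimation_vec` is illustrative, and the normalization deliberately mirrors the original `h**point_x.shape[1]`:
###Code
import numpy as np

def parzen_estimation_vec(x_samples, point_x, h):
    # A sample falls inside the hypercube window iff every coordinate
    # differs from point_x by at most h/2 (same test as the loop version).
    inside = np.all(np.abs(x_samples - point_x.ravel()) <= h / 2, axis=1)
    k_n = inside.sum()
    return (h, (k_n / len(x_samples)) / (h**point_x.shape[1]))

# Quick check on data drawn the same way as above
rng = np.random.default_rng(123)
samples = rng.multivariate_normal([0, 0], [[1, 0], [0, 1]], 10000)
query = np.array([[0], [0]])
for w in (0.5, 1.0):
    print(parzen_estimation_vec(samples, query, w))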
fuZWIj9q1tAX+t9LYOxRX1Gk9buEq7wMRZWVrkmdCrbhGedYyGgUkeEsAkHtDPbXWin+oshi2eXUDfeaucCrlIQ3q5L0kwfzhX5LzAPc7YpvYCLbPQ+34Ul8Rv1l5MAu7hV+Uh60+2MJ+pfIEhMpGXuls2ckGP+ZV5X2tCU6stYxmh9rjX/Ljeeh/lsbXhNNJTlkMMUXQwxw5eXKylPssUJYqiExanl2VhEvrUozt5NBaEVRZAjSOG4Fb6wqnUp7+JFkPc0cVgL+5tY5TtdwfB1Qm105EobhVlJcolSdJWWPrAI5I9Lrc1mDZyQY/5lXl5VjtMPxnFXhXn3QTfsbKOz9LTZIvMKgLkw436pcw3WtAFmXxcTllUnniJvYO+Wnlmdt+uLWNYl3h9JTnw8R/3FEFiEcfoNZ5tas8eSpNRgJ7zufWm6iV9+mDPMGtDZcN+4Q+DZENnyU175TyuB+U8Gnl2fRBLA5IOpeyJvhv2X0rKE96ZgXlvcUf3PRykx56ta7wF2ZXykMHYI9uvaTB/OFf6hwPEI//8nnMrvL05Rxw67Bf8emYQnmqemyOdEagc+Cy1HyE3+gWIfXy9ljup3jjAwVdeZKEz4ZOK09EtGf44CRDgbr4X1wckFqxR9p9DCQrKE+zsJt/q0iplVWZscyZ304xLzoyefQ16IAychcc8NePUApbCeRP1TnuwJ8Fy4O78sQlUBnrBltpcuiI1++l3C1s0oZo7cV57yyrRifugdT6ttxGeTjW41rwIbCkFcXsnvhMlCdHERuC6eGhfBNozVZ8JsoLPgP0k9g3cis75bmx7ezJ5bKHrSFW4abLJCu2/qGiy8wRthBMbK48rsue4HDBvLeD1WDV2ur5IrZE5LhVb5eJ1j8cpisoTatjy5ZOuW+gEaX3fHTr+2r5jaQYsFpSZZms5HVfylv7DLue4berBkDOpw8nz5DUL0Ok0+/XsyO4rWl2NGNn3nHQJBmYD/V4LDlg8LPolEXO8cM1v4no0x2vPriQhKABmvRQ3qvI/9oolfrAhbFcvZDLHhLIaPmcEclXLq41gSa/VnkpYyGrY70EYvWVfKA8yUSKu0vWPsOOEJ5H1asGFoAMPslJZVTfTrcLEqVOzD/SPenM+0JeuqT/GX2lOaD/5fk+Da1y5JG91SlffSASgM7WpHIam5HDCxkNEn0kLy+yqxfF3xs77y4eu3Ih/3m9pOgLCR1mrxGomaT6Mk6Vd/dfhmMFtejVzrDrGX6/aiBu/mFIWif4KRsiUebksLQodJEIm1ewHOTA0TvNKE4CyZZ1onUlIT7IlJMm5dk1NJGSSGKR/ufB1i6IySthRx9irWTFYy2WuBOrwl0jSpFzMiueYZcz/LKr1+ssKi258E9TMJw+++NGztmZl9euVFKvGjBIc5CTXLQfkSM5Rz+nBr8tNVxIBDxpUp5EATpUeXb1oviTYMGvXFjotPI0E2+xxMGd4++Q1O2+sVrYRqFVUk5h54kqdGiLtObCocizmJ5Sosyp15zVy0vR3BJ7S6zlINOaTEsSWuXolgc4k3Ijv9g+Ql08qVz0w59HikN1YVcvij9uHLQLYjyditZNKs8y8X6TOLilCneNV0LcN51hN/TqeUrFP82lvAxBJEqdfJWzz3/FZQmgrqozHnlolaMoT+uUrz4IesFM/jRpoX51uC7aP+aXDhhEeRe655pQnl4yp6dRXr5md4dIJVY9w25h7CBgI43RqSnpMoRGWbS49clJP/PO9zj6VQOJthyKsjXccoSTyqNfxxjj/TQ/vPgpktpbqxVKiS786kX1p0GyzVcusEw2Advwj4+UfvRMihabmx5ty/3Bmu4dHE9bQg5dNuiELVZhc/ZTebbdArQee13/WjyIfttH5V0VO+Rbg1k2v9h4ZbZahSAIgiAIgiB4KBSveb6/N9lPHOXu4i6lrVx5Gx0c3HnXyfHosNArn0BOC9VICI5deV4oI1772NCKlFnrSakKOcs0Zix4ezRPqQNfr6ueTkFsI4Ov9Xsp7+TBFaeV73Tdytgl6TXhyWK92Vhrs0R57vSQ+eyRnd1XipR9PSyxIaa9O+W9l8uBn/SZKrtAfCRXohoshI2Hk2fJ9XK/oBc6FQreRnnrWQGv6RAqT874AqvBEx0FLqEt6CqPV5YVuUSUXI0eUk6k8hz5gGNaJLLIQkaumOcQ29qTo604561a0jrlsV458DitKSYP/FCPfhGQ0hpnmEciRVy+wmhqKh8HsdC26yqno0FWpRtku6RJAsrj2Xg6ZYo4GgZ5fOfNsCjvc5hQnj1gO/CjhJJ8GHgjBc+8qxQ38sVCEcP/E3lqgY5UFH2S9mw455M3w/DCguH8JDlJrijRBE7fi/diIbdd27dHB37Ci2G8OYHPozMM6Sh5yivdzwe3LdlgpmKU+vWKBa8GmYhSyH+typMfszGLBVL/XtcNWlM6eQ2bIehueqVKSEKPxSRTWA6vAamrY3maFTd2vVroKs9upNRgi+RWLI9+uiWQlB5RHrfm5R9+/CJ6GZycJiB3HzyZfbJbMayWEqsO/Ef5wzt5CIs+LIWO0ffpSW9KyodJ1YkffebabtXIsOFzNUuUarId5eVQ/MnGuo5/H+XEJ6vnkfivVbKMUsxq6Cv2waTyzs7ORAWWa0d5wLrjUh730ci5XPgulKeRhBEqJBIj5WmiEylaQvOTgVWu9v+jyuG/YLVdFBfer/VxU2YIyUcmodH2xw1+xHiHM34rADDMwTwpykNtXdZlCjmTEEfuOtmp0p4wvUH5fIy7kEGVSuXlflqZZZZniLNveQkbV771bjZJ+VWP3Tm4VHkSQuTOBvw1uWrf+ciBU7cSey4BDEOr+EgVFhgu4bdcSAAzQTa2n5N5zm/VqEEAw7CHkCj+WJ/rr5BC8ZeD5Qk+jkCmxT/UuJYplYcf963KlpTnSezviitq9/tv5ZlWXrEO4ATJKN+owCe7gcXmDhd31YokNmiVReXrpJ+0pSkJ/tNtJiDBHpegEDcLn+tWV56GHOPv5TB7xUWtZoX/WiXLSIP9d3Uq5Q1vUw0uhzd6N5xluVR5w1zmBt4zcSV3sT4ZFofw2zIjL1iOhqNL9bTK+3Seeh+7GVvVpCw01ywg9wgN53POSyZqsW++Fkk40SquB+B8NBzPL09mw6u5HU6lJPw7X0hm6F8ZDRJnjyEvcAAsb8UYXrLnn9HL8AO95Q1uE0TyftdhjUXnIbtPXPZnVSqVl/tpdbBnPfAF+/xVsdxe+GkX8R2KjOlLNuVR20KMgcnnC4k9YuJXC8lNIwEjTi0LjrxUFFg81jMs9B+epgdKUAmWo7magNxyyHY+f8+Z03PQsbx4wQCGoVUUkpvJtC2pRaJnIEkt9LmFCad+CsaTnOr+aFE+b3wBFSbBftfNjs50cmewRCWHymi2GpT66XNGdnYrDVIK+ZPjFTelXjn3lQSDAp0MV8BvVRtxQ3KfiZaDTFYTDII1mMvMXDIKCB4gnGoGLs/qWWG1qSq4P1xD3DanukN5DxCbDnVTKk+ws992ciB4SKQz9Xb+m8szHAZhcy
3a0rPfEh7Ke2hQJelU/kfb4nhVtviPQ99jffxHzlUFDwk75VSc/9Yw2eK/nv2WSz8MCh4Sprzi/LeGyVY3dPHthOYOHgymPD2pyvPfFiZb/NczhumEePCQMOVh32ePl9rpfdOWnf3GukUeZNRrC8EDIZ3K9/PfdnqfQXLwYGfIF8/o3cU9lEEQBEEQBEEQBEEQlOhZsB2y6l0V/dB8j9xqTOVyp+y8EpdyoU4u4W3MTQ8Gynm1m+6q8Mg6NJHuTl0BFmfOe+UuKiEPtNxGecseDPQGcLv0rgp/NG6ixesoj499mROME66R1a3YXTnZpl15FlIY+9ju7WFIjyjmsixb5eMNSA3p3FWRkjIEf1UGtoHykpjjAdyq+0AzL/J2Z5m4iOS52iMdVCahG/n13PScb5UFIs3rJTbR9OWmbJnipgZVXu/Bxncy4dEpf83Dg/Iwowjivz8YuP5dFVYiX3I6DOez9Fhk/eSkla1P0DMg1eL5cI0iPvHa1SCPfEl9RIRIQS6L/8daDJE8j+s2STv0CVDPrXqS02FSPj+G2or3CH5/4zRjJRtryrbJD1eyYP7AWz/YKJdnNeTi6i2f3dFrfibDL31RjG4NEqp8VrqrwmPcSeF3oycni7LtY6USoLXg1UYbPGd8oEMjDTrl72j2cZAnkDySeVbPIfknu/QXIDcKFE9yGn5nQXpSiVH05w1+tSnbJz1cCacoj/u86sFGeQhoeMm3Dci/l7kDKQNr8t0kgtKDgZbP6ndVeIkmIH98OKt5clLLfsL3E5sBFLWQZ4r490Se4tH/jsSprI03/Mjg9fZIApWwp0RFrMrN62OIS2MEZu5+/OWHTf0BrF1RK0+CiPQRH8gx5VWjn0gScciPhlo+q99V4SWagPxRebajsVgrW15Fa0G+QeauPAwweyzWIonFCUl5QqU8CQEiIk+Aem4ySXl9DBFnsPiw1WaLX+PMc5fKKx5slFbI3+EVFvJwLVWe/2o+KrrKXRVeoofwj8rz+Ug3RUenz6DbBpmb8mgZS5V3ce0vDxe6yjNXyq15ktPQPnNhiza/hpnnLpUHleUHGy9fiQLZ03M+YuiNtYcHS+WlBwOLfFT7S++qIFaiCMDBLVvcPDmJxYU8ZqljihS1kBrzRQPzE0br//TYI2rgsviP5Q+LIaXyXILt0CdAU24AexCvj46+x8PxO75jm6kJFk0LdlH1PCkjd608Wc5KnegqHmx8aTtbBosQJ3JxyK0R2gzAIH8wMOWz0l0VgpaokbqVhz3rJyeRyVt1e39ZVlKgl/DugE648D8/H8l8KPuY2VJMq5Dbwz+rmLZDnwCV3JonOVPpKi5yAENI3yFVPk/KyPTcagCuR1+DmiAr+VbUT3KeTnxsLc1CwTJW1cnWRrwuWAy3tJZQXrA1xpcBgn1AFkzjywDKTdcQgvvD53duR5cLuoHBfWLzo25K5Ql6MqqOTOfUJ/fgwV1w42WAawS8GY4v0ml/BuZz6kmfwd3D3l92GYBnIQULkEA/p065XZw2D1ZDjlCWXAbAhswXPItWBtp21+d/giWY8pZfBniFJWYo7+Fhypu8DMCz1vJSylJ5cipbvaG8e8SUh31f9zLA8+HDx4EvGZUPV9pp//KcOpQXbxq4L268DPCWJ8gfv9fju4UF5nPqPG0uQUEQBEEQBEEQBEEQBEEQBEEQBEEQBEEQBEGwCU+DINgRZmRdTCQIgq1jRtbFRIIg2DpmZF1MJAiCrWNG1sVEgiDYOmZkXUwkCIKtY0bWxUSCINg6ZmRdTCQIgq1jRtbFRIIg2DpmZF1MJAiCrWNG1sVEgiDYOmZkXUwkCIKtY0bWxUSCINg6ZmRdTCQIgq1jRtbFRIIg2DpmZF1MJAiCrWNG1sVEgiDYOmZkXUwkCIKtY0bWxURuwc/DYK5tMgx/MdeXxnc76c/750vUqBlZFxMxNtH6fVjeT/zI3x/NU/IbRnxvnnX5D9Ka8x7p6uCbcX8Mww/mWhG07vfm3Arsauc7C1tCWF6NiRj7YnnQ9ffD8FvzFQzDP9YekZnv/vEbc307DN+Y867ZouWhm8xFfv/9/5lrO2R7Q58P/zP3JOta3j2qYGuYkXUxEWOPLO/p/zrlDsOP+NvY8jKfo+Vtm3JP99ebiwrLqzERY58srzOuLDgsz7g7y1uhqLC8GhMxCq3/AX3p/NvCiAWBnzUgWx6Gx/BndTLsLyoGLBD8aCHEV3aAyxXDDkWynpD/SKka1C5xfrHQYkTy0M3J5VGnf7PQMnMr1MKFf0jE09+al/xHgwCb/jsN/JcFGf3emyq3KPKXDSxPNPCrZTD8qoFFl1qGWVHisijtw5Ra4o1CWT0bmLS8qU7/S9H4by2UdOTNJ5gK9hEzsi4mYhRaH/5rDhl15uQSzw8W/m6hrlDaXTqCx7IvGdyfcvqfco/DTHws/T4LfOU68VjkbkOpRBOgxGwFokBuCsvDsWBKDLcXorWzumLMpTrlKjUTLtK6EGqYpFA52kqHfu9NlYu+MMtlV2T5zE2WlychaOiv5mS1zUVcUZbgR3VDHlW0bvopq5cd5k0rFFRQWl4hMN3pKOpP6uakrK5p+VhtygSedmXjeFUo7a444wHfV+YEvWQ5EHodT2s60pC3DZAaS4spNI9HHGrINo9IDOhiv1rUAq6s1KJyU5ZXmxcs3MY2R7C6lpB7b6pcZJPrScs2Z8FNlmfrD4JW+wRYZ6WKMldOAKEslQ+eYYRF5/UqhUx+IChwGP5ggUs7vTckJuXD8oDPbugMm7QKqFAOl7L/kKIcKP/qnYf8wcvKnV3ADLCDKPdpBZ4EEp4x7FcHXBqRGK1/U5fyx+Svaof511xFeKX2toLJn8fyMvK+YaLcejm/4Wozk6tbV7yyPHOBpAeSCodGy+mw6KQEcnfySdMVO51DQuxwWv6Ltrw/pRW4jp1KSQ7UCNIKR6m7GZ1YHBJ8B3tR1J+HUIFcNuiUphRpzfGbVIWU3ai2addV1Q6VM1cR3lhesUsB2B2oY7nltb03VS521mmPAe7a8mAD5gLJwv4pFS9Ja2MntYvzrk+AK3Z6GhLT8pUK9hQzsi4mYhRalyOO7//8LfE+7o40BvI0w5J9HvIyJUF4+PU7yRT2p2HdS9/MAIdCHZskyKVx5d1jSjSqbTqg6FtAGV6pvV5sFmO12x9Cr/emyv1HPcS6loekfzeXk8/qNNUYd46SxaoEfcv7d7U27JLbxbWHrXBX7PQ0JKblKxXsKWZkXUzEyFpHX+aFpfcxJsLxqlH77r/FcTKouzmlK4dCmu3q8WFIBrwj5XcWUFEk0XMLhXEky8OIzidVwW/TUOlbQBneWF5dweQfDRun23tT5WKBVa4YpiyvCYWN+OFcU40sWifKYlWCvuXBkexqglIiHR+u2OlpSEzLf1mWh34wV6W0oveKUMPViLi8Yq/PjHgyLL/ynjFZHox2bF6mJ56GFn9NEfoH7u1KqbyjbNJm70qWl1dXWPeWh7d/Ks+wqKOlKjmP0Ily63piT2+uEthHvaAvEqEaxf4p73+ajNeyvCZtj9Lys
niTLnv7Q2JavlLBnmJG1sVEjLKfkpFgEvY+xvSc++lXdSY18hqWp4EzGQD3XOpC8p/UpRLZmUbkn8x8UxBm03SGPpHTymxRehHnBfOEfrL08pzhzZYHZ7EL/6UcNWivj7hqBJcUBRe9N1Uu62lOvSPV3BU8aDQngLnk+qEa2faQQTo1AiPMM11R26reE5ZXtqFPZXlQsl7XWNLpWTP0mGtSHjLlKmovMSPrYiLsFsH8FvIDdkzY6Rd9LMG/SLDdslyoEVq3y0oYY1AiDnVEMqWm5Q7//Y6BPxUjXkSHHzBQygzUIYZbnoEgCDIX4Fltc5JCv3J9CH7sVYHvCcrMQd/y/k8S/s+OpVgH+uW3GvLmahHBUe9NlSviv/5HxL+CvIU2pKvdggUSVoNBP/5DetBCCb3/RbbiWdPyZM5gI7T3ehd+Cstjh9kZ2slOVz1rrxR3HkzINyrYS8zIupgI+vnX4efmjtrvfhp++icd335Xzn5/Qdf9kIfQN98VC/W/fqcLRx1jv/th+PWHZrH+LTr6B1lGfFcW99v//Dz8+O+8aPpbodVvvqtOPIOqTBRqDuEv31XHol/9i9WtW1ZmjnokXxX+1T9/+PX7Yhn8Z+Tz3/9UedfVaOj03lS55J8/DP/Vkyh1eM3f2Zh0X7dhhvTPn4YfmlPMWEP8MPxgvZdrW9X7q0oPTeF/Y3n/6q/6/lqNiqe/+y4vyXudrjljSPzvh9HBRU8etCrYO8zIupjItqlm92C3TO96g/vFjKyLiWybsLw7JCzvoWJG1sVEtk1Y3h0SlvdQMSPrYiLb5t8/NMciwe741o/kggeGGVkXEwmCYOuYkXUxkSAIto4ZWRcTCYJg65iRdTGRIAi2jhlZFxMJgmDrmJF1MZEgCLaOGVkXEwmCYOuYkXUxkSAIto4ZWRcTCYJg65iRdTGRIAi2jhlZFxMJgmDrmJF1MZEgCLaOGVkXEwmCYOuYkXUxkSAIto4ZWRcTCYJg65iRdTGRIAi2jhlZEARBEARBEARBEARBEARBEARBEASfLdfDW3Nth8NhMNdnxeNhGK7NvSPusuduUdaaPbEv42H31eQ3lDL0n1jMdjjSJkjeK7G65D0yDBfm2h3Wc3fC2lpKrNsTS1q1WuknhZQ5kVA5EC/8c3XcgvV7YjOG4Si5dmJ5N7LtcnfLLvXied+D5Y25US3rVvLmVt1QZNfyxNKwO9Wd78O0vDfDcGnOgtrynl0Pw3vzz2anmExOze28vETgB/MwFv0xvOx5radPhxfinS0Q8/FQnAeUOpdOOjtDtc7OzuBMks9ZxqlNY6fD4ezxVVkt4x3q+hEp34mvSgOquh+z6Bx3mfRzqb1cCqNxqPn17CIJXacl1SPWFkU+0mq9H4bXDH6FmlxbH8xmc63sJ+nsD57Lle8gmrJm75D2CdxHljcGn/Tcs+GjCqH2x+ZSiqbOk9DbYcFNWZWyikrR706jJe+pQi2ZouCiJwQfWp7PwfWVbIvWSknFaCm0IqmqIusuVZDWXChPnW5pGFuyrSyv0Tqo2j8aWKa4opTtgDKBDtOC0vKG4RNVyF9qXnT5sarIQmalA0jSh76Ufvw0DM+7XgrZRIaoZ7PZawkbrtl4FEQxF0gO9DnD0Q0WfVVWy0EP4vejabxJU9f9auAoOMrteO5twuBphaURLx6/ZzD9FPKJhuTQ8+Hy6MUbuhiEdDLCkDfy5Cwjgpq3OPjbKev57JHJpry157zB1o9O3VSMLQn1JPz1quQqGnW/K5a79n3ZU0ktTtPHEJCNcqE7Hex8NHTgbNFprQ4PTgadsoqRwGBvhzNteRiQKllY3kjriG3GXTWwGsVtD1oLKAeRUFqeKlGPhNPxcBqmJbqwRtfolDIXmbGXHu3Oxm4E68miv1XSps+UvqyW87X5fPxVaZq6D0MefMqFCshM2Qgjh1fq/1AIZdyXFCRjTB3QK3LTPrBsa8sbl6XConSTAanlMoyG4WtujHH3iP5EqK4Kfi2/hnIE52y07wszdbUYbcEIkI0xDNxxI5EoWto31VodHp2yvMimHU7X8hTfrxcJxlp30rhrxnuluG2CyYq7nYbOcZ4UjWVKQoOVVxbGVK4DJoHi+17NNhcjcBFC6C7KrSV11NbVSmhbbMfUpGnrzkXSufar4VnjpxHOjUA0FyLZL7i3qFYCIdlO1VFb3lRZF3qywr0WriP0QnbuibZ7cAyB33PJwHIm2pNWxUzZ70qlpbKn2tRtwdiIz8C4oDo+IRjphwHL0Mme1blkXJYXqUmEsg6T+zwsU61upamOtT457vDTKm73lB1a1qSaGBNYcchWU5V9iZ++V7MtVziYVWQ6ssYW5aqkd57ufepqObYLN9kmTa/u2M1lpTC5598I50ZQ4JEJZTy6qFa5S5IBKKijtrypsvqWxxFl5pdpuwcqucgTUFkVeK2KRtPvSqUlwXqqTT0qeFSxhQRxNXIis8Vkz+bwuiwvsmmH8zKn80MBq9QTsfTsT1Ranx53+GkVt3umLA+/9XG9MAyPucHANcvTkw/YnXa99Gi2WNzTgzlaJh5xcmABHIyIt5W81NyaahkX1bp5nGZc909VinfD8FYOBFrhPD5kp3NiQgmPTtXy1hjI17YSfG6x5p0oa8LymJkvgpy2qRRCPUVoVBWrotH0u1JpSdGeSmoxRgXXhXFMvJSjLUyyFjVqrfYldkayJVqWle5FNu1IWCdKWZqzG9ZxKrG2vErrTftTk92r50BSIbtm0vI4aJTiAjvmGuHCLQ/zGpGDuLGXG8sWQ0jg2hp7TqILJWmr1MIkMY4U3dm01TLSUkaGQZOmqrvvHv3kosIQc1YNtVorHxFoTscDUrU44RriNbfnwzV+4e2XZZaHoQHKOsBr+8xE21S5oG0qqquSq2g0/S5YWSJb95Q4fXCAtuAyF4IBLSHMxc5Ktq3FoTPh8KjK8ppKCIpsujTx2kLTdJQszRZjFosK9LRet78dWBo5oGzxPmjyCBEa705Jfd8cBm0ZX8VswFY6w04ZBkHFvVqeTmZYNTYH0dukcxVmZbbRGdXeKQic5gTsDs7HLkGXZptbxgpcNYc6a7GNzrjFLjcIgiCYZvUd1qqSE7ey737PeLf73iC4FasfpN0geWInxNqTic7ujwaXlIDJwFxB8DDYmuU5rZT779XyjNH59iC4Wx7xipXc9CbDFfsruW9eqG8XbyTBmVxqOuRlHL+0JFyfj29lb2/ErwqqHzOoMhzfsu4nPXZyU/y4MePb299bN3Tupm8eTPiAtfaTK7snMwgyb+QO6sNLLg8xXHmHV3HfPLcIleHcSCLk04BRSDHe1PKuuNyr0Xkf57i/LQgDuLwFvsywroOyy5vix40Z3d7ef0DBqG8Vw+8lV9y8LCyhQaBUV+sxiHSvo3N0uvFH7hJoJXnrmzxpNx8PKkaTNsb9o4JsJ6bpygzrOji7uym+0xgHOz5ukg1NVa2xPL0ZHo7+IW/whYKllbmAG4zdyYQx5mCQtpLAPHJLUbXm8ozKJMT9o4L8BiUdt0WGdBm2iyI7vCl+3Big1w01
y1QRCyNF1UaWZ3HYZaojCIjfZiqMDKK8XXwkieWdTfrgJcafOcHaluc7jXyEZRlO3LLO8c343dwUXzcGe9OJ29t7VZuyvA9xD1hQ4ebDvU5jEH5ftzGWxOGZhAile03LG997TySwqYNzgP2SHF7h4MskynlASrjNTfFlmAvYTj/ZUr9q5xZqxouNnEWC+cpzkemUUvDFg7EhwNkYRHu7uLkLSWzOUnBxk5Nn5Pk5GKxgXJAs8Yjs8cytGTZ1cHgCkttt3xRvruqOLUwwpL29faJq9YMJkFa/HhNjy00QBLsl22kQBHdHWF4Q3Ac7fpIiCIIgCIIgCIIgCIIgCIIgCD5P5uPXym/MWnk9Kh4lIJW3jbyR1ROsIrlu6V8gG46bbQ63nfBc720Cdqfvrkh3jI1B4eZakSV5dWiuZtePqa59qXv1BCtI7uaRWdUnGN8lepKfC5kd6m1x8sbwGxi95vruWE/XiQ2T3SHNPf67Y1lX6AhZfUyH5S2lsK6Kj3yqt3giS25jrW57n0S/eXgDWx1KntlyXY8iV0t2Dxy1y5vllvd4XulwlPpGcgbaFfMyh8rDTitGahOnPJrrzKt5PTGfMp+XdzmX5HwfM9OO5R1UpTWNNua2A9HcliWoJTO1WKcupF+40ERNK2PC8r7m0/kpLn1TDbs9dWSeWNZH857FdVVDyozqyvaSlGG1AgTPrKPrQnhU9yXJpiu+e2R1US8vSst77259pQJET1/wjn8GoSUXTMw3OtOf3rsg+AgyYXsGoc0AE+4ZM2FvYJV78rx85M2eCGDAG5S1WFw1qyXELV6fyWjWyvB5BS2O79q/eMenX8Xbf0aOL5GRJI3lIeya4frZCzhynR1Gn1n2kuCKD0bo43l1glqS5aKi+sqMUqxfl0poc2VAdtJ6c1xK4J8hFJAz3xqDvI+hHKmiBXMz8MMpue1VP/uTG3xYEptc2VLTThXGVO9ZVL7Rrsis0TXHxvECIZg3yiKF6WQTY+qusIpVX8bxh1ikE+3dBY+ltv6ojD4Si5bYs3NoAjcqZByo5wJq4zM66h1loN/E1dFSTgBmGraZffKXQ1QjSD81qyAvFVVtpSfl7MHvruW50GE92tFyd1CqrrNRPuOXEthzgXWCRvKE3WuPB5Vik3UphTZWBgIViWrwPoUJyVYqYS6AnGX8I1BL0PfYIFg9KpraXvWzb7Ct2zE+kqy1r++xgsPenCF4Zo2unabIxESy/pi6O2D4SWdGvdrUHpGnOBFxpoiE9T3RVwHVs4cK4ce2L5dkICn59iPtGgaIK/cVP30u6VyCcE9i56tSXr5v9WWEPqzXtbxCwfVotzKkI5o6G9XXPlMCkWgSNJLYO+hybSTWq0tb+MbKMHQa0LVEUvrY8orMCo/LaQ0tuG47fruW11S21LRThKV9ezW3psxaXZP5YoH9pdSvSkImkk2MqfsE3WQuIh8GvJC5p44o9SPOy2r8smlzKPkAe7QLmxInM/Ap8ZArJg2Q3vBO6U2Syisk4JPno271cee77q7l2csbqtFukUCq29TZWG556lNaSew6ZB4fifXq0ha+uTKM1A8Fac73SYHTsTpIyvl2lifbTNJ0gYflR/vzhyOJexpdQ0jep7O25U2NqXuj6abL4dK+xYkVoxqIkloCkKY6OCAIOJR1LA7jpjLQPZOuQZRP1i2i0CtbbbS75QpJPOpW89qseW0rf2ubZp/O4bWWl9ZtWCc3dTaqYVOPvibBWBK7PQyURqxfl1HhGytDedELLiwv1aGwz5TzKpbX9LOF9jtRNV0jYcg61ah8n4dXZKRrnTDWtrwlY+p+QK8Z8u5WVNGrzBXB8Xy+uJTuTi0hWSjDVQ23fLODvoBylAEOMRewO4rB9+pojvWQzl+iUByGflqco2MRfP5mPs/rEDIMH+ZHMK1iIHi3suzLF6942kJCWb/3WI9gVlWfZI/A068xwx43lvcB8Tz0lkrXdXYQeLlgT4nbYjTzJsFYEuMQhxhjsU5dRoXDby0a9aUGkyxksAqLYy7OLcA40NUWl130otPOXy7wW57xSzmvYnnYlP1M7+ItnFVlK00bVRi/vXE8x9F//X5hzyzVSHWNst5wmk+Wp1KJiWT9MfWwKE+9Hi5OF3ZEUt8T0PSSMLdT0I+KHKoMMBE+frtIkYgyd7rZ42ixsEXB88V7dybmi3fyrlk6rTIH6azxy7fv8is1sSg55UT+TDJO2T9bvGW66kyzRC7Oimm/rHPiACKaLOWWmlknGEvaqe1arFcX0BS+oTKeLD6Mm9ABOZYZEc9Zuw6l6laDx20v+xnMF1aVqh1wu0CiCjtCn41Xg5ZZq+vHi/fI2epVFOlMJOuPqX3juj4/GtwnoYwvh3xQHNw7oYwgCIL1eWJnmL4sDl62x0hBcLfcftEybJDFDm6CXyFLF+HZxQWOkjapeRBsh40HXzrVvNGry+1y7rrkQseskKWJpDavWfNlpQfBysgJ2Gx5nRvHOySh7jDs3+2+LGN/DqGiSlDWq1NoJ4N5fRL5qJXozDb+oAE4aB8bWN7kIFgLrLTOPlVXQ/ON408sEPH+US35Ke5Y5wV0IuEcj0f9u90nbha3C54Q8ucQjKoQ8Rf1Kgo14CkeZGAICj99/p5SHyUMDUyVERH8CLBOsyTG5wcNhrMFAxhR12ZcehCsjduUfc+nuXH8Sm/x9nF2LSPUB126Y90MJlle7273iZvF1Uzqe2WFUSFVvbCxQpXmQQb8+h34aukIU3mtnIrQJxvLDTan3oIzvQNiqslBsCH5LlVxjG4cl9vkrobDM95cZc+ipHE38tLhoxre8v4jvWtJbl8qR61Jc0dj9xwYdSGdepW5VBlYlpZCN6lSei9R3/LaKjyZL7jXo7OuTfYGwYbYE21AHPmyrJkk1myzA+686LUbvqeGoTomLa97406STs8hGHUho3p1xr5nYFlWa8L1LQ8puRhg+8VX1iZ7g2BDsChTi9CxjSFpa0E3AwxAiXk1XL+xBWMzDP0RAwtPg7y1vO7N4kma1O6ykFG9UqEVLovfT/LcmpOKWW555gW2r5uwvH7pQbAGLzDe+JzhlY6p0Y3jMM10qGMDsxmG6RGDpZY3cbO4SpfPIRhNIW29UqFK50GGC1aY8ITkSpaHTXrQAM2+mj+SBzkKAU9RNNlyCIL10Tv1/TpAe+N4OpefzqmP7li3RwwsfOJud9B9AEGli+cQlPFt8U298nMNQvsgA+YT9enjP82d616q5Z0Kyw8aIPIdnfpepakmt9ctguBLJx1X6usSgiC4G3y1WaxggyAIgiAIgiAIgiAIgiAIgiAIgmDfeHxWvqby9iy/neqJXGIzz+psmAyV0Q8EbYN0A9nNoLrmail6++XZc97VU/BmZg7lABmJw++fqXh5NlwyB0XkyGGb5V0z3fbV2FTVK9IqAFhXse+UZ2en+R3ZzySq+zWzW3JyYzPXGHJgeXa9936vwIbJZhf24uxtsEY3PJp8lXjR23I7rJrJMBzLVgfBBW+kBbA8lLmQG0nrTuVNrMNiPudzxnITO0QkyWJx1Ga
5K6a6w9u+qfVsquoVaRVQ953AO5Vz5UUFH/iIdPUu7i1QVGWC7VqeOdZjw2RbZb1umGCseJIexBD8MQng2q6+jofQtjvGSqyz3AE3dceoRiuyaboV6VqeeBPDcFj0dza4cbevAT8m2KS3go+gKokcdG+sbkSZQ2YiTLFEJzU+i5p9fKadpIcCDH9Kjk+CPzd32g/lQu159aoARDL1WZmszC5LiEvrc6FfKrNa6PBo28bvLyjFiyma5igy/QE+mKR58ZMBgt4Vah46+RwF4OcDvRO0u8se2dDyqk+W2NsDSlawvNxXdS83iituNJeNfMEA6BsBzJMcnf2T5PPaonOtsg52pOqi+41GVcrNlnfJzzzmeaXY1RUaWxd+bARUgytbnlZAm50+8QS8FvYmhGYlf603Rg76WRkOVW4NiOqd/+gr2VaxuVCoAb9NAYi07wl7sja7LAGXjBhs5VXn5y5glqfptW2X+oTfm7ofDGuOYg/n4ZAAf1kZAFrlpuim8sM9KJAbP8os7thGb2dWtjz9KIrjE0hBytY+DtZmCRCpfTXq5UpxjeXppMNuYeLeqGixtlv6xK5V3f1ukmCqMsYKaPpOn0bNn0MqLK9eeayHziPaJCNbnnjtAbbTomPdmXYR1cyiozNPylXt0ttT4JIGVLG5UIRDp00BVRVsU2eXJdyVHrHXrQY3bbOHF/q6SsZGqscciuoAHdl1N6XBrgViFlZvgfU2Wc3ylOtSZbZowNgkPnq4KehYntW07uVWcbXlpe7U4dsbFS3e2Z7SuANVty12qoixApq+M1+Re7K8cS/fCsuuGZ04iqeVyoetvKnoJDsUlUfYZFd/xu+ew5MrVdUuL0fMWqvYSh3PRMbylwJ66qizG6sjGY5uNbhtG+udv/hlVM0xqtWUFUGxa75NRtWZu0k6Qac1LbCzKiyVt8Y+rwaFmCtlMh4Tyyyv7OVWcbXloSomutD3ZI1HRYt3dlOjnau66H6H/kJVwlgBdd8xjaHfiSlUUK08bo8V3I5OQdbV+PVv9+uSW/EKawfYBxZlN60OQTMAvsuvYnOhmrwuAN6xOursblSHBjdtc19F3Rwj2RTRvFwRpTpTtbwo/a0Ozoyx4sl6lofQtE7eyPLKXm4Vl5osXhyeqK+kHhUt3tlNyp2rWkilkJ6qOgqo+g5H5uZCtLz9IKtA18nbwwpOdZbReShHpVAGV1tYNelMgnlAts84G2ClzgbxFZQMwz4ENZQVEL0GchAvpDwL2RgoVNYeaK8s6zyxFABfq442uxvVocF12/zYvrSqUXMM9enBiOalUyHbCfGym3hc7lZjBWKnx3XhaWE+W7E8aYGsON9vYHlwqbD2MhSHfVBSHAY6LEouW9CLAvSg8gTDsD8qYJ3ljsY7W0wn12HXqi6736hU5Sy3PHu5npBeckcVzCG17Xfv2IIWR9jihSZYgJ6L0RUGtWGxelbwVHsaWhmuH818AuagxcF6UXeBwcOF66aOpZYwB6aC6gKudYFKcrIquyzhLqwZZWspNLhpmx9CVAvLUXMUGTB6TGJFSMg7WBVPNpTdxPRmTKlACZMxYaTDB546SJqsVrVMZC5E2EssRtipvOJlOUoen2WWoOjNWo2t4ljnwiv6Ga6ljd1RoackEqntPLGbJ7edq7rofqdSlTFWQNl31TdAsbDGr9QCceUJ0v2H6rhzLv0kWRojAdmwN3RBcCP3oupgivtRh57gBb2rCl8um+niaHyJo0tYXhAEQRAED4B8gmAFbjrAKM713x1yjpD1km19wH+vtBVb6fBsBSEX8fw3fjpkLd0HHTbowZRkado28qab4ndjeTc0z08pFucnJ1lBZGVuzKut2PTjFAUrCGUtaP7rPh2ymu6DFdigB1fr/anIB2Z5ui3vR5jiji1Ptxc7Wgls2pjVdP9ZMuz6HvbqJnI/r1jcwWAhTILet1Ilpnx8wJxFxlKx8U3xfn0GGekYa4qcqHgCLbZ35Kqf350Q6DFnUYuyjs/NXVaqLs6vHR2PKp5ukNBvJVWl5ltJtE1ouVwa07CqE3LUDRXzc43jxykKfZkQr2obEob/RE5qikjKnzUUA8oNrWvS9K86XPejXvi8Qbt3eg97eWMystYLo3YDupGSoCwp5WMVnZ6GqDP2sVPJSjOkMhhSTNYWOV1xo6mD9w5Ks5uJ6loY/oiDV8b3eXVx6bYLuZxcV9znJqlvU+rY8tp6u8Aoalwx9VvvjR+nKPWVhMSnPckSzMt1pYmk/LV/6oYqSY1N/ybd01H1wmdO0fTd3MOOrva7d9ob0I2m98WhW8HU2Gbc6tyw+6rooKabIpdU3KjrkG95MNc4geBDzcXN8pri7M4pw+MU7Fg42MRU21LHlqfejIeMosYVKy0P+zfxlRT6UiG/pUptOpWgTXSvh2r/1A1VWjWOvHSUvfC5UzR9V/ew854361BfpvncJjS9Lw7+1o8PtBm3OjfSDVdueVWRSypu1HXIzxvYVNTWonnEwcXN8priymVqljVkZKqltaVuYnmTFSstLxdUkvRlub0fhiu0RHf6qYTlllc0dEKNI684il743CmavsN72FVBKgF0F+CkJLVW3PjTZFln3OrcsFUyHVqdqsglFTfqOuRd/7U+NdLUoqmjJgJmeU1xrY2bw+CKVMPaUtNo15omb8ZDUtR0xbStKpgLatBo/T0ub2bMJSy3vNzQkRrL/sVvo/vcC587qSfg0gZv6R52o7yJnIfq9J3XMilJrRUcsXHZw1tWGdZmnHWOsVTcja6H8rwfl2OsLXK64kYzMpAPFz4oTJ/NamrR1NESJcuri2M049ORdF1xxtoOuy0VEVjpcb8zaXmaV4qarhhzSILY6XFxXjxOMX7mQpQOPolMKmGZ5bHC+JWGNjVp+rene+uFrT8c8MDY2T3sDtPZOATMND+6YHiSVBfLgEnz4wNNxumG5/qmeJnIeXT3wmfTpsjJiittHeTz624/oNe8XEeP+pRGTVmcDFRgB9RtxTEFpLHWlCrnCl/MXkubUssLLK8iaqpi2itJUNRTPk5R6EuFfLeJfoWVpoT64Id7PdS7r2xoVZO2f0e6T71QPTjwJeJ9Guya04fZ1alWd3Pe44H2wj0QHXFHTB503TOyhhCK89E746H2QhAEQRAEQRAEQRAEQRAEQRAEwb3zklds7J7HgnS30H5xZDd4j5iMqEj3Mq3EnnTR13Y/y/3TeTqr5r5G3cu7uGRp+A2t6cWhxpnd17RSH3gmQO488tc4KwgreEEZo3rFd4mXXvPqbOWOGaZG2WQESQV8jpZX3FmmKtB7Rv22TPuWTlKd+JbC+0gV8a6T0VT2a4267dMawU5xo/F7fB2vw0aWp7huFb0JV/C7ISbUAvoR69nDBhS37X5+llfcg/IINTbtHHvl9cEtf3zrY25T/2vRoNAoWCsjvxe0xYP3dKU1DaY97xFHjYbhRMPsfl2gffCWTt9VyC2u6Sk4ZUPLq/v3QGfJx2XphzZzMiN1ibt+WjCnzJjp6PsPyk/AaARLxlCpd4AiC6SAEz7lkJ9i0Pldnx
RPaFfwpeD0lXVlDkfS256DPKUm0Fd1IkR5X7U+REBygNTR76tuK23NBtLyqorjhhcPm79kDlbRZI/vGP8hCaXbtaeNoLa89TKqtaWsO+q8yfrwsz9rmJ7MsrdT1GqoeomlsA/FQ6yKnfF0S7QO9fhZYZ8nYxWHgaYpaUPuWYJM5sbG+zx/9FkeKvCIax/Gsi41c6KjtLwqpaGiVtn6eVSzvKpVRlGA5mmbTzoCYIzp05AAumaR2IhQW9cqB31WlPsa9ZWdSNHquCcFvNDq2O6kqfSpVupa72Ouq9htuK/gT+VhAM0kL0Fl1GX7T3q00dhBB/G1GtaaGZ33v4jlIk1TEd4fdRJ5oXowGaSUx9ww96hPktim7iWWUhmDVrE3nm4L61E8e0JWsDx16Acg0jM11cM1yCSziuV94mfN6mZDdf7ihly6MX6qsrS8KqWhovU+TdCIplVOUUBy4Oe5S9uTdYoryISc7hOgnoPOOU0njh5E6wSwJ0eqkElZ69RUsdtwe03GtTo0TwxteghdqdZwXeUjN2LhI/gMECx8zYx82DV46CqjLjXZhpbu1F8NZpBujksUmUpxNKA3nm7No7ML639nZctTAes9UtSu6MfVLO/13JYmJafMVXs2yWI+U+gpurFabZYpDRNFJzbfG9SIplVOUUCpMJ3cFQ0mudHmmKorf/Xh8ms1CBMj7MQk6uQADBWlsjytNIYzdIlVLA2wrWKv4d61uifUPLGqFZ8Nxqy9dPprNDxHXFNizYxWtbwlo+48ZWELTHnOEE6ZEs909lqqyFHTPGA8nnbA2pbXq9DaltexO8VW6yaLlb9sx/sRO1jNeXpKoxjNGITFc2Ua0bTKKQooFdY/5ZXnXHFM1xU/L4cPj86GD9Z3/VkiU6TVmnctD743559sCupVsddwbgvotan4krNCOZvYYms0PEeoCayXUXHipcRDVxl13uGy25UtNseck7lXT0FLFDlqWhlQj6cdMG15ejTb9AEmtOYol2zD8vQBZIwWbqwUexUOJ3Bu0zIfBz5UJ47lGVClNLTH9Rg57SGJRjStclIBtcLwK4cODVYrLrewaepa5wCZD4v53I7Zm05Mok6RVtqGanUtb1g8T2uHuoqdhhc6Ekw7mD+kLmnKk01htMVorLv4yI4iYUR0rZMRo8WW6hwZrB2z0qizIvLIej/I7pcx13ZEvlSRVY2IBvTG0w6Ysjw0VlrW9IGMMKFcymzD8rCvIJraS9fTXVfWb3qNiGpGrcAHDa9SGtrjGuFvXxA0YtQqwwtoFMZDcsVsR6Hlp1o0dR2pfPH1YgET0j6pOjGJOjlAX9P74qJnebDm+WKxwBpXRlNVRXOWDUdYcxrHtGNnSG1o23Ow3fMfxbEcsNONuZTVM0rrhTrHNUedSRenveCW1ShXixKQu9ICql5KpTga0BtPwZ6SVHxeXDy4HR+Ls4fmWE5enW1KWjjeGp/CtpdjEPTAvuvjfD7HTrK/D9gAXvF7PZ+/Qdbl0mIJp/5OqA25teUmLu2YcHs5BsEUTxaLl3ZktDVeLCZvMAmCIAg2x+/xxzGxPNKw9K7u8laoIAjWp7h7Xy4C+xWR/iUuJywvCG5He8a7vfjQJywvCG5g6N/MrddF1Cknz8QEzYvdXboUouJ+A+aJ+JBhWF4QLIWWYltuYDpyjtru/1aDKxy+zzPLa27iV4tDXFheECynvsif70W0W06WW14SfyKO9GmMsLwguIHa8jr3fy+zPFuZCvCma7pheUFwA7Xlje7/Xm55zRnO4uGdsLwgWEptefhVW/L7m9u795vjvPreb6w5eWKGd6WK5fkN40EQtDSWN7r/u7l7v7G89iZ+dT96qZbXe4tgEARBEARBEARBEARBEARBEARBEARBEARBECwn3dFyx9z2i4Hbe1dFk9Lv1VmRNcVXRZu3o8z3kM+gJ+zlpMa9Wd7yd0tMs/13VXjKDT+huKMhoc3rZ77K5zz7HwLdXz6bOSi/sfG+9nkb0lbX7ypdzipPUrhqH4blKf3MV1HZ5zJQnf2zPHnvQ/3lSlJZnt4DXb8bovrImHzFhuhLW/PN0vrpqa/ZK+ya6suQyz/VaD05/mIgJHf/ropSSFLqe9qB+uXN4qs3RF9ETl+K8e/mKW0W2kK0Qj5GpC1tP4NJ60qtKlAplSvyRR3klQJ8YrloTYJZSW9gpmIL+s0r9VF8CrOuG5cJBn1VBiNK9UklmJV4CJw6+nQkFQ+d8beozUgrDx1VcWtHbJl1oijSttz0v9hoOaD5VIR9VvGFJUEsVj+Ik8GC4y7J2gaUPXPEUqp3/LMn8wfpihd+U9K23OziXRWjjxnSB1nZbNAQeS4fPc2+ybNS8S2ccRbYMKF8oMCeu+p9stPrVlP0QZEvDABFnlqKUUIWxy+N0NZZHqzTOrTIo9SHRciretu6MRDl0VNn0NKoj5WoxiL7gN2JePWWllfWhilLrTx82LTmy5Uk179QI36alz0YOQNNpx2ELhehdzKQ0yDpfnlw9GVGFcdw1ckwM1Efzp3cFNHiqC1vpXdVNEJecQ/dqCFEHPaF5OoDqaMstAn+JSofwkr1SbCceYln0OSLzadUvVHClJWrXjunzqPUR2f3onXz/tOv1LSVqGjVN2pPUqd8CAze0vLK2jRa2QfGX64kU5ZnawNBg0l2FwNigSlU9hoam7pGbcHyIOy9VIrj4qMvBtb12cW7Khohr4ltNm0IoyhwwSGUP+pILDkpszAbY4BsZOVJNIxCOfMSz8DEiY5QOPwDWpYQQ59ASymrbHnsE40XJI9CH7Jesgmkqtuyb3LWtOobtSd1p3ZGs9osatNoZX+Zsrz+KUL7qJu45ANOF1gzUBILNrjlMLLpmmYGTKU4pQ6qLwbW9dnFuyoaIa+Jh27ckJTPG/yVK9KJLGrL85Zuss8zYGHJVEYJU1at5dV5kEIfsD7Ycl23pd/krGnVN2pP6s4LeaR6+vukjVb2lynLw2/5sgcDSyhZbqoeCJYqYo3ngx9YNF2DlVx5yJ1KcVS888XAUX00zhab+XDC5GrLQ+iyd1UYtZBX3Cu8fkNk1YVC7AgG2fvnipWJLGrLe6s+7mg0jEJWl6qDch/U+cKHA6JLi6uLBKlZteXVeZT68ONVTF513WBE09/kbKraqC9VwkG86MKMffr7pI1W9pdJy5NhKljPKlg2kNRonvcSB7Y6vY+6Rk/42STcG7D4xeAjZV+29YFmBT/hI6e7NnxXhVMJeUotaJOGWHa+m8NEbzkkulnUlifJOA7Vp0JWl5dNht4HZb5ujhi8jPHWJFKzasur61bqQ526xmnrNv1NzraqtfpSJRw0U0+U2tfGICAekStrk1Luu+UFu0MusmwVOw35QEjN632Tc82qjiayINiY0ULvtmzdkm/Hsm9yrlvVsLxgazx5LZfAPmu29k3OJ4vq4mgQBMG9sdbTABW3fQ4hCL5ANnsagPj98TdLBkHQ0h70rn4aNwwuCJYw3PJpgGsJkFvS5OGEYeC+Ti/5gEKyzFhCq4cZguDLgqPfttys+TQA7I5iB/S8U
El7dgEJZOOSnfvUKb8/d5sHwVYpTAs/7e3kN1hee4+CYLbkUSo5eZ/6/txtHgTbpLa8NZ8GsAepDLsVvmt5k/epx/0/wZdJbXnt7eQ3WN7rJA78Buqu5bUZh+UFXzi15eFXLcIWmzc/DaBnSK6e0C23kyOFWZ7eNpUkq4xby4uPXAZfGI3lrfs0gN5brk9762nOFxdqee3d/hP3qVuG8ZHLIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAiCIAh2ydMgCHaFWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MIkgCLaPWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MIkgCLaPWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MIkgCLaPWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MIkgCLaPWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MIkgCLaPWVkPkwiCYPuYlfUwiSAIto9ZWQ+TCIJg+5iV9TCJIAi2j1lZD5MIgmD7mJX1MInbMAx/MtcW+XEYzPXF8evwD3N9VnyRGjUr62ESt2EY/mKuLfLzl2t5w/CzuT4rvkiNmpX1MAnn1+Hf5lqd+7C8vw4//8ucDf8ahr+bc22GBzE4upb3w/CLuZy/Dv8z14r8fhj+Y86t8Juf0F/D8MMPf//WQpYSlldjEs4mY+8+LE90bu6KX6YiVgFJf2/Oe6RreeNW/bBuO/9zi57poYan/PobC5wmLK/GJBx0orlW594s7w/mKZEIc68Nkv7WnN986647Z4uW9225N8JqYOOe6cGeztw4Bta2vHtUwdYwK+thEg660Fyrcz+W993QW219j9XP5uPr99//n7k26ogtsT3L+3YYvjEn+e4fN++Z1iBX6Ztf4f6deaZY2/LuUQVbw6ysh0k4m7T2niwP60rzFOBQ5haWV3CPat+Z5W2ZskqY8W6qTVhejUk4m7T2nizvt8PwvfkSf0OqsLzE3VkePV+Zc4KwvBqTcDZp7T1ZXq+uw/BLWF7mDi0PB5E3XIUMy6sxCWeT1t6X5f11VPDveYYkLC9xh5b3584KpCYsr8YknKq1/4CPfGd+Ayon6apZYQA/uvsnufwEdZDKPn5nJ6N/rU9c/UlDf/bQQk+/Q7g5E1KpUfj/GNCMyD/z6B+U1yl/lJMzf5HwX4qzDnabhVVbkQjCU4Pg1/KGHTkXKt30q4VkOr03VS74DcoGrOX6lqcuHmsV4198xp8txOsu9f4KfYyyrB6q1eY0CVbvJGmlAhHmAtBfcYG11+mqUevb5q6nkXxfBfuHWVkPk3CKdupQUIpR4gMK2LWvbFpUZXLxsryRh+UfLISUarEgYKMuW943CNSRUyADGjWshsRvpCaV5eFgMJEvI0vmMFMlTwxWaNHElBcvhjm5N7hPYaOBhTjd3psq9+nTv1vYMPwRmW5ieWrPxDrLfIIuBHORrPd3FocSnz79o7lL85GeN36yoBIEmwugf1J7lnR6GhLlfQEd+Z4K9hGzsh4m4eR2cgb9J9cqohQNAxKM7R/YhzqkUp+jb/1EP5ywA9Ej92bJ9OimpmWKzUMPnl946YkdrjoRPRGqX1JUiOWhhGqM6i6rtDwm/h9H4jfIb/hRA1Pt/grnP7FNE70X+u23WKcN+P32W4tj8n+zuTJe07kEjGAU+p/ff/vvXKbQ772pciVX7hu5xX5mA8tjgSyPFq8z4u++/Rbr8T9LKyQAYoXlIcEPcLHHf8v9Obsd8sXeCGv34b+c2hh8U5UKz/JO5+DhLJFn4648Kg2PVP6myxUPGbOyHibhoLXm+mlIV2F/ySqjNsz5FTpGHB6LfkvdyT5MZohZnoONDP81h0x05qSAX0jD5C9bZCBbqqVzX4kO1CILorvR0vIg4K1AdVMrpHY2looWpUJBEQowTFPTOMbNyWrogG/p995UuRyRtmOUe3DWtzwYnO3qONGpa3ScV1oeEuh6ATL/g+Hp5MZOEgeB2xuBWqUrnYlSFhNJ2rsV6Uad7kMCTh8SU/JV/vuKWVkPk3C6rcXAKHrMXMD62joLPZtHDLs5zXb9XPP1OAyWvCj5LSfFZAQ0vN7IVstDwuKEGqZvbooRif1yttpiVLF26XgIZu/LQSuUVFXGLOEdADBotY4i1dkh1xS9N1UunKmRNOz1La/YDyPcD8GXWF4unO4khhr6Ah6Vymt5iJgrU4RxT2vOpZ2ehgTnGnNOyXfL3DfMynqYhNNvbQrFRD0+d6wKRcdmU6OvGD4wsvF+C4d8+YhEHQXIAL80vOpQzlHLq1PaQrUakeXhBA5FbJ6tavc7LPDMqYUKVc6ov49TkuPgynPGFFl8qtyqnhjFG1hecTEtiy6zvJSAS780e2COkY4FVaUgNFr0Id0PAhyF7Eqdzps+rWpT8szfXPuLWVkPk3D6rcX0pw6MCnWUiELRrzxscIoxDDACykgjKbl3AC8ZcAbsX5+1tNBTMmnsAMVI84j8dz3wcuPq2uUpowivOqLpFYwaq1W/uxpS702Vi511OTFtYnlp6Qay6BLLKxaPOJozF0g7ZVtBOPW5FwHlOL9kLa3Y6ZhTtaBJ+cK1v5iV9TAJp9/apORuNBUKk6wm/7qbp9Kp5WF8jB+tZQY0vHJfU+Bpi2Wgl5FHZFtq8rcW4L4ivEyLUeJ7J+GPyZ/H8hJyhSbKxXpLHcomlldWA0dttk5YYnlFgmpaTHugpsBx+Qxy8umSkWDyTwyJSflRzD5iVtbDJJx+awvLK/rYgRphePWUONHNFW49GCmyrUAG5Zq/xdPm/QnWSWq/27c8TP/VkhcHKbaX2I7lNfW8reX9X/JuZHlTlWrLT0E8ZsuRrWDyTwyJSflRzD5iVtbDJJymtd/x0FlQ/8SqsTzdp0x0s/AnrBEVtZ5iCGWQASgXURWeFgeetq9NJ2xKy6uHMFaJ6piwgDI8B3YqWOwYllle23tT5WYB4baWB3vTzrmt5TVoeCYHQaHF0mOlTk+pJ+WL/PcXs7IeJuGUreX56YQGdVb7DPwLRkG9M5zoZpBGI9AB0sgqCORcOnUtJw0uyLjDzv2UllfPE2nnOmEBZXgO7FSwKHTS8jq9N1VuFhBua3lYHO/C8kbH4ggzF5cefqS6Yqen1JPyZf57i1lZD5Nwitaip4Zfv5NrmbAWDStOimeoRhyqVB040c1yFWv4Xi/v3mR5ctvYDcd5XAnKIMK0K/7a8sqzZlu1PJtn6iFfgjSj3psqtywLrGh5VWXLaqCPt2N56piklMjuFTs9pZiUr/LfV8zKepiEk1tb7sbSYO7eRiRqRMJyaTjRzXSkM9g+QIrJOyMZ8OYJ8zekweVZ5xOUpeXVqVPMhAXUgzmLIF05gBl503Eekox7b6rcpp5d
y8uLMKdIVVfjL8l7d5aHtb4dCrfpJjo9yU3Kr1CDh49ZWQ+TcHJry3an3uh2hqixWRpOdDOm/3wa063nn71rdprBf4sBXJEtDzr/Ss4q+N5xWnXJP2EBZXiZFvtVu/Ko5Ite9ZAvKJNvxfKwT286qUhVVyP3591ZHrrE5uQ2XfJPDIlJ+VHMPmJW1sMknNRa7G7yJZ8VLE/WkXlpWHcz9KLnQar0Pn7z6qjAMsABRHGxKZOTqM6LQ/ztWx5WvdUiG0M7Xc/rW16/96bKXeWqAurgN84YxXWcuhr5fO9tLK+pVIeyi7KnCgXJXzce4Z0hAbK/jdlHzMp6mISTFm2YY7MdpbGDfdD4nIepkU+UiJ+gm4tL4NjV6QF41ZnFnmPcxa4nxFXX0ozCWJn6m+J1SHmgf19fKPxTeTuFOoRcfBHe1LSUL7xTltfvvalyv6v2qbCxjuW1dWARSRXV7UOFJCwv3T4K1rI8qHN8q2ZFVSOIa+cu6/RiSGAdoRqblK/z31PMynqYhJNOXuJAN7/aK/UBdjDFDYyGqxH2lSLRzcXYS+nLzsR4M+vBJF2spNTpg5Sr2PGdZ8goWR72L3/P03w50JnWnCT7JiygDK+SYkVbjA4cR+X7NvuW1++9yXLLwngVs2d5sLTyXoWqbXDnnTLGtPcNJqSyfmtZXuGaoBZw37JO7w2JSfkm//3ErKyHSRj5Iuwfi5Mp6ALvA7iSzv5ke6MUVCwN2c1pivspnfFHoHe+PxgD+CSNusCv6kyDlE/0ZYU5heVJ9QpvtjxaTN4ZIEcXmrKAIhzTSC4VE0721Tf1FiO4YKL3pspFx/n+hedzu5ZH8bz353jNC0mmcetpTLI8TM61reo9YXlY33ROZBeUBRXnWJZ0eh4SUJJfn5qSr1Wwp5iV9TCJp7pjKToTTh0N8oSZB3MMapf9BgcCKlAq1AeHdLOtG9CDnhzzsTnR37mTkZXpJC1Z8yDlQ6PmzJSmhiFSShSWxwr50MPoLvfI5iI5dRGOVhfjH/swbyRrmHbCU5bHPDu9N1UurUUzQmMh1LU8Hkj7C2XZj8Wt6xi73lDOC+VJrPJAr1RUUe8Jy6Mz2S0WgebKFKIAU0Y+xzLR6SAtSXPaCflGBfuJWVkPk2DzhTT9yJD573cYx8NPOF6wULmzHaqiNXknZTVyBOkqi2OMIj/+QyQljNAz6N3t+YKvBf8iwXbVoRik2GVmdRil5bHUYuyUlscoxspvGkaTFlCGM8V/U14cKshHjLxY/U5aXr/3psrlwEbu/0Q8Vg1TA658nr9sv1RD9PK9NLSIwRoG+f7PJss1LY9u1EW1VdVcacKSd0mnM+SnvzdT5YS8lp9VsJeYlfUwCe3lSudJ0X/mkboF2nJI6KgRgiopY0yHa7mYsH0d+IXp8hhx0VSBcpAWq1inGl9IZi5Sq0oMXyjOF0xZQBnOnQcxbx729diYsLx+702VW/QqJq7pqR5zkFE8HABYDa5GhHyACcxqxouTVSxPdvXGz1WBAkLNJUDYd7bTne56TmtxoSs/UsE+YlbWwySwmIGSvq8vGX2LfvpBTo59V/bIb//z8/Djv7Mm/lYYwTff6XLTxtg/fxp+8FNVznc/DT/JGYpvvyuX8bwD7Yc8Hr75rjjd99fv2jOqfyvTfvVdeXbsq6qyT5/+GRn/9z9Vy6rMn/4x5V6H/+mH4YfivOpX/2IN67zLprd0em+qXPI75i57099/V14KqPkjdmC//NA+jKuGhBx+/aFN+tU/f/j1eysn17aq91++K6yqrtTT33//MzLtt7JuDZpZiHU7XXLmkBjn15EnjQr2D7OyHiaxberZPdgt07ve4H4xK+thEtsmLO8uCct7qJiV9TCJbROWd5eE5T1UzMp6mMS2Ccu7S8LyHipmZT1MYtuE5d0lYXkPFbOyHiaxbcLy7pKwvIeKWVkPk9g2YXl3SVjeQ8WsrIdJbJtv9/syzJ7x7x/2/gbHzxSzsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSvrYRJBEGwfs7IeJhEEwfYxK+thEkEQbB+zsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSvrYRJBEGwfs7IeJhEEwfYxK+thEkEQbB+zsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSvrYRJBEGwfs7IeJhEEwfYxK+thEkEQbB+zsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSvrYRJBEGwfs7IeJhEEwfYxK+thEkEQbB+zsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSvrYRJBEGwfs7IeJhEEwfYxK+thEkEQbB+zsh4mEQTB9jEr62ESQRBsH7OyHiYRBMH2MSsLgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIgiAIguCuORwGc22J6+GtuT4rPg7D8MLcO+Iue+4WZa3ZE3syHj4NJ+baGYuSg9nRti1vkCY8WTnb1SXvE4y2Ybgwz47Qnrsb1tVSZt2emG7ViqW/Ozsw1+zrs5ey1cH7WNzg8dkbc23OMJyZa2dIzzkvd2V5J8NwpP6bWF3yHsEgQSV3XM+7t7wN+n7tnphu1YqlD8PcXLMLHazHNnqHYwllRuq4BXdgeTJdoAmy3dk+79Fwrd5JvNybJR8A77fdSwVpaN695XX6/qaGrt0T063y0m/IcWx5sDSM3Q8wvXMJ3hPLE85SXXdkeTez7XJ3yvlwZa7tc4+W1+EmtazdEze36oYiu5Yn3k/D8Kj034awvIfJxQ6P8fbL8tbuiR1aHqLkAO+BWt71MHxtzkxjeaj69RMLmL1Cims9kk0cQGI4tx74gEXCk6tU1cZrPX2tC4HZI54Ke63ul5dwf6Dr6Aw1ODs7o6hLshbD1SvzUO5dWS3jERIypZVWpWnqfkjRHDcfPprr7bDgphQ+HQ65kHr9LAkd+1EEWnh2PVxrkagWekxXSafI/lRchDWZz55J1ofDJw18501vy3qMDntPDzvi0vJmz12kcXbdrASLpl4moUvVY1mVsopK7vdErSXvqVItmaLgoifIe7OqA9fh6fCcm7K1LImt9fILrTBVXWTdpcpSyxPh5CeN1knVfjqqgcXmfY2svFHbAtnazFCC2pkLSjoUGRdCJ5FK7ax5DkTac/XLAfLIy070XSkX4wTOR+YcHs9mr82JcJd8YSHqQyj+k7rutGOFPVenaeqO7iXpnDaaKWsTVpGZVsIoDqNlODsohMxo6FSkWs/UNftagvwEu/Xhp5eiwLnXyPXZloUhRMSjoFTpOUiK0OyVO5SqqaiqBM7eiKOqSlFFpez3RKWl1FOFWhJVweb0+IW5jr3XtJCmtSdYFRI5RVloRUovi6y71FlqeWJipeW1Wm/bj0Lxn+jAcsVt3fLMZsyXqCxvwETF6olftwgtp57hmo1/S0H6hksuOZ5YkpG30ClGBqaYw0sa5UIiMLS9INm4JCpwdTibPbfotloGxhQs7rFpsUnT1F27+p3s3oTB9mdaYit8PlwevXgDh+6urFqGr7GkWi8eY2+FmjB
nzAQSgeCj2SGHXM/y2rKuWAi6U4rSDnMH8nVfuZeqm4qNBkv+dVVyFY2m35VKS2VPVVKg7eN6tYlpWze6A9f8VDS3Vi5CcDZQbyrLSkeQbNoudaYtDxO+bEvLG2m9ab90Tz3eC8VtEZYDzJeoLU8cOse7TrvXWq7TOJIpBQbAHc/IW+lUh49YrII2y9YLSJJpxHMea6rlXOoS8I3Op02auu6Yq8WXwXQo23OmbhqK4jQv3YtQ/zplK6XlyVhDAl2MXks65CbB2MV3LG9clgwMG2K15eFX52ubXYymqT5jS7F1VXIVa7zflVJLVU9VUqApeGR5UncUKcnOePpl3FofHjjqKcvS0nORdTsSXcubz+fY4dpxVGF5Y6071n52j/jTeC8Vt0XYJ15WQWV5qmEtOl0jKdqbsBb6vgMOamHkzTrt2q8V6FEqmaqB8Mpf9wgnKNBPU9c9jf6MpRaFNQ3NfWQNr4+OC8uT8Zdv/dFx5ksrODqWNypLO8zyaCwPY4AeX8gpbVNtCIpQU5VUxZaUBSm1VPVU6QZtwY3l6TWGo8HqI4uiidbCgZRlWVo6I2TTtCNhCiHZ8oRLq1pheR2tO9qS1CBXTaW4LYKxD/wcRqKyPHVol+p+SxzFUtmw0ZCGig6Srlez9b1MRRr/4jNJLL3Ua66mWo6td3Vma9M0dUe7paCMdgQmX3E3wtYIHKTwtLkKJQrLk20uW6fStGeesLx+WSpVeOnAsohDI41foW2qVW8Y3o2qkqrYUnVHqSV4clSTui240Yday+nw4tp2xfLTb60Pj1SW19Q2TTsSXcs74wG/r0oKyyvzb9CIZmA1itsmL1GVkeEts7yEdZjCk06E7hSlfdX1ara5GOGVZSJd41GtZJq51VtrGtXgRg8h2jSWO2EVMPx5k06BriTPew1NjTCruag7bWR5mOwT8KbRMWF5iaoslSq84tBjJs9BaZuKaC5Yxd1UJVUxU/W7omWZbNlTTepRwbU+II50iHlNM3gtIlKUUrVWh0dRltfUNk07El3Lwy8M1EyvtLyx1uv2NwOrUdzuyR06qslbubkF2K6cYMiiZmdpTWF9qfvArlez/eSZC8zkmieRO5aXJb8WV1OtBOvB6Y7uNs2o7tBItcSCxDP+sPxGODWCThlMxTDNtUjVQiss9YJqlozJhOX1y1KpwisO6ca0A1DapuoORIWaqqQqJtgLud+VUksg91STelRwo49PXCEwhn/XcvQ51VpbMeWyvHTbNO1IFKtnXY+4pT3zvWtpeV2tF+1vBlajuN2DaphrVJPxxT+uKeSAfrza5KTT9Wq2x6kY4KfCu5aXR5q6mmo55wNq7gf9bZpO3bnHNye5RmZ28rARTo3QvkmmY4wsr1Y2kttgUQWm41u3vH5ZKlV41cFT9M3BWttUrkkpSKGmKqmKTtPvSqklwXuqST0quLE8tvRrNpK5ew791urwIFaWl26bph0JO0dKzDxc0pXbpqy1Xre/GViN4nbPEsvDkUOLt6y1PD1z2fVqtjZPKr5j71pelryWea2pllPmN0rTq3vKR2CKCz/ZXQmnRuigvtSz5ImR5TWmmVb0qsAk5pbXL0ul0uUDD+e5vTr7UVMp/NaEmqrULQZNvyullhRzN6lHBTeWx1w/cdfzYThLJzOnWpsuk2hZXrpt2tnOSRcvKSmH+T4e/UTmyGarPqjbn6J8vFeK2z2TlmcHUjUeaKkwo8heHrtC7qpH3kKnfp6AbbZLmRjYHctLkojmRdmmWk67jKjSdOteh0ELlkUjnMYHwKCuy+lYXipb8VNJGAOiQBsiyIfeqbI0+Hy4FF8KRxVPzL4STVOlxLfJCMuq5CoaTb8rlZYUdY9T1wW3lnc1aOuQ1ZnWf9TaangoniR7ZFu2I4EDTDsLcmmSydJsdziyvJQlqdufmqwNaRW3cyYtD/VU/7NC9Wg7K807VejFhsf36Mq+t9ApDhPYa6ecdgYZKDzZapaHpsJlkjh4Y7/DJ+OwrZZhd0fY+qNJw5rIVupuN2d5PsrFkGbQSthqrcBe6lS5FqlaPB+gl9ROZFmoR/s8AjXL42UiFJ683LRlaSiUr2ZehJejVGibqkI6maOYoiq5ikbT70qppaqnXC1OW3BreVzZWcLcDq2Bt1aGBwZ+W5bX1ItsujQBlcmFCWwt3hL6UWhheR2t1+1P3ZPGe6W4nZOr1taENVQkVEGbiV1lhtbUrxchx1785l5V4ORdDORCRwCmRpAl3ajU11bLMRkbmXWauu7msl2E8RgBdrWkbmga9QRhcqk7M7Y8mYcEyc8eGcNQEAWyHHCxUO9EWZaZRKBPUjjHgboyTVM1lTmrqhRVVJp+V0otWbz2lKsl0RTcWh5TS7UpJwGj1tLoCIeHOaUsr2kqsu7SjIWmZWyyNOTMtYWPPwTbttR63f52YJnirvS2v92D6cVc2FWp46Nf8aTesVeplls843uN4345RmZXU8hU0Hj16f+UrZwr1oMmNJqd90ktlKfnuO9KknruV2e1TrUEv6fPzrLWaUBZdym5GSbUjLlq4eqdBZ575lN98VtQjfqNwTgg4qkFnzohyLq+9mObblmW2SNGYZvCMaDKRittU3GQk4TKqpRVVJp+FyotVT1lasnUBXtPJGBxki86ICWrW6t+y78oK9U0FVl3aUZ2rMMnH5L5tM+1HKZJZwtdrVftT4X6wBLFvUB3pqPQh0uam5XGu0uO0+mx1Pe7AGtSc63NdhYtzZ46CIT7s7xLv6NvPLFvEzs/sglbsbzmHEUQKPdneQs7T4b12Xg5tjX8pulN2IrlNeflg0C5P8vDut7YeDm4Ar3b7VZlG5aXr6EFQUkzJd/pDK3H2hey59sVtznKutrCgfpVunIQBEEQBEEQBEGwPtUV5KWsKsmrpp3X7q9e0KbsvoQg2Bqrn5i8QdLvRNczIOKs2P0Z0OkSnsTJwuChsUXLk9GNQX406712/z4t76S6WTgIHgBbs7wLfZTjfbN7Sanu0/JWfIF/ENwdW7M8o33t/oOwPCcsL3gwbNvy2odIwvKCILPAUdlHebaDw/XZtT/HQ/i0VH5KpJYE82sxLT7W9NGf2iCnPKHZvHa//WhAXRCfqsqfRKgyrOsAdvsC/1FjZqNX8R/5DWujqs3ajygwmm0bPesSfOn440x81hjD9ZP6NE6eVgJ6XaCRxC9GIMeZhVPGkGgLTuHm5d1YbUEv1OPe0lPXQbA3wNBG9M4ufcTfntr0o7gNX+BvHnEr8sQckfcIILk8OAZnp2r5/SF6zyak8Z/EPWBBDQbFs/LliDClt/Y+L4wwPo390QZTLQnTgilSTt++dmQjTlDDXLbarArC2C5f119m2NRBSQ+8605TbIk+dctOiCWgcBqJemk2K7zAv9OYzqv4ex9TUMaWJ/s7zAIbP2sUfJb4/mPG5RBGCceSDybdvfGqNO2jkZSdmtiN2UHJCpZXFyQZcZxyQVhmWNfBsHw4qrnd5gv8O40xMD1wQ1tSkW7VOpYnXuyQZRsEig96AQc4stVHTdMDpzqoG8mTFFC+lcC42fLKgj
A87bhKx2mRYVMHQ69W7OQF/p3GOFpLJNfDxX7VxpanTStrEAQcIMUqqLAM/KQXXOhs30omSzxNyRI3W163IHMVGTZ1MHTQ7+QF/p3GOJqFJ5+o2tjy1Av/1M40+CKpbuioDUJeH2NIWCV5bgtGwDN/59XZQMtoRcvL7z6z3UfOsKmDY+9cx0Hn9bZf4D9ujL/2p7G8ftWWWN7ondfBl0z14OfIIOzN9va1gFoSewc3xcd8Orw8u24ZrWh5n9Lw9AewU4ZNHZxdvsB/1BhIDp1X8fertsTympdmBl82S1abGKvqU0aSGKHmAzjk0jetCpbRhqtNQTNs6uDwNMoOX+BfN8bb2Vhev2rpfuyx5ZkjCEg1vdcG0ZwUGEtWL0DRkWZYRmPLs8vRdUG+p+Mg5zl9RTJs6pCAFezyBf5NY9QMG8vrVy1FN5b3rCsdfLmklZ6u2EqDwK+dLhTGko9LY/yUR3jKqLW80UcDmoIOyhWtZljXIbHjF/g3jem+ir+bkKGyNMDe2yxPZye/qiBZBQFHCoeE7jgag8CRjJ5i968FtJLHHNaHMjDt1drGhOX1PhrA3486iDFMaZlVhu/rOji7eoF/vzHMEe1uLK9fNYgdSlFueeL/YDmmix3BFw8mZ4EjojGI5tX2HUnO5BzcpDSyCctDOMD4bQuyG7/UV2c48Xp9BHgNzBJoGgp9sEfz8WS+OWWP6qaDnSSAoyyh1xjeByNhjeX1q1Z/RAHScmrHLsHA2/RI8OUi07IuzdIrFHx06aDzu5A7kpzZccADylOPHj167f4jSmI7Kqh6XX+dYV0HBxYn18dwjJhWvNh1gtu9wL/TGG336FX8oFs1CPBWTv2IApKqlMk036AMgmAn0PKCILhrwvKC4D4IywuC+yAsLwjug/J8TBAEQRAEQRAEQRAEQRAEQRAEQRAEQRAEQRAEQRAEQRAEQRAEW+VR8QrL27JeXvP6HbNP+DqVRBN5M6snWEGyrkvQYcNxs83htuektx6NebluLy3Ja0z7jFzlXfsButUTrCK5bun3xZG/yf/uWUvXmQ2T3SF8PY9iAbtiuive+ov7ViYsbxn+GYhO5nw9kzn9c54rvYawl9cdEZZ3W6a74o2VfVa8xHo5YXnL4Ms+FQtI8HVrKbB61eIN+At7l/LqbIsrvJTZUl2Pilwt2YOg+oTALlnSFfL+Pkro9mbC8pahb/Qdw1eB2icHgX2R5nn5tfpp7BNpS9nqYE+ZLc11FLlasgfBQ7A8Y/VhuFa3huUZtLEc5y/ote/2boGtDvbVTGgUuVqy+2Bu+5fEcsub15FrnwgsMpCuOCgzfDSvD+/KIfB43jvr56m1W6vKVTlXFBYg9a9GmkbWabs5PbFATfCkqnqdoJYsaEoZ1UWYbMYoalIZU5Z3jr8Ulz7I8qjzTU1vXbcyR43aEtVgr5K2miZV2HxefCJOaEyozO4gj42qSLIk2cSYuhuqdb5SWF7+mrh+gEfeCO1fBMHahC9LPsvfJtYvBwj+tZOkzmF4jd8mgxN5/bJ1zYEeZvAzBjJE02kBRmJ8gOLr0OS1BMo3UZCJvofdP3+gCYYX4ilqYW9YF5/Whl+ZFq/CSH1tvQVUdU7wm3rappTAD3yaBLUkeKJfOarFunWphDZXxqTlkRSn39Il6d3eBDmrljBIsTL1HlatsUXHDNS21/3McMLRXVW20HSiCrMX4tvXbgQNkcwaXR/a4SmLUZe6BfN3kk2MqbtCR3d9rFzu83TEiANT0Bt+HWCBhsoaH/3IxGdwmKKKD//oZ07EAvU15uJvM0Dbr9l++U4PtovX+h0gGaL+WQROzNicvuAsQUEHo+zkuX2HUvK64ncN9KN5TPieWcuXvCYsD9FnqGA+0iGIRDZnHGQyBdd1dhArNcfOAQkgTI9m0ySoJRGAX2p7LDauSy20sTJWs7z8zYviO2f0oJ8vOH3AOaCKmm+yvLLtdT/rlCNBbWWTphNlGGfUi3f8Vq9ECUVmUmDWNbZnC1ahllKmkzHdeEzdGaxUW3Zpeegr2er4tunCJkcmlFUJGsMNu1e2gslCRr7k4x9QbjPg4IbusME8x8AZQ3SIUkI2sF+ZHDBllR+jtNxsIKQ8uTmxaU+znrA8jCJGcjDRayBSzjBgore9aVlnAwpl0gOGMQEHzEsrs07QSEojZJqtxfp1qYU2V4aYxnBd7MkyiFOH9o26iq+sMCU2bKXZv32aySyvbHvTzyYkjqKypaadKsx6CJnXe72UmQrb5lp3cDbJJilnKll/TN0ZrI3VJGFhEupdKYPxk3eEdjMktM1YSci2avSVj1uNvGbcOAP3IlJtU5EhCrKAzt9Y2MhWKdcrnpettnyZCQevTvUtz3fHGO6yVRCpp/beSXBTZ6NUFxJou1X3o0ZWkgyQYTUS69WlFtpcGfb1ou4RfGF5fg2nWgUhlWwxhcha0fSk+Tdtn7K8urKlpp0yLH34EGt4dQipRV6jvLImZzpdVO0mU8n6Y+rO0I/t1EVrEFCP3KsgM3Lx/Rs68iXXNN2IT9Gjklf8ZJWOjF4G1ieX1DT2MWYtyTQ8x9TFvoZVygFS9C9+sk50XHUtz6fZVIziZVtwU2cFU7y5QEpwaZ8zKxOMJM3wGrGJutRC+N1QGQo/n5RXoE5heT5mbbemeDBSy9ZMW4Obtk9ZXl3ZUtNOGZYPM306EQoVJ4duhXTYYZHORLKJMXV3iInZQb5RzYw6B/v65uOZoG3IbdQPhNczFKKx+rkaDs84lerImspA03NdbzVxhdoG8ZoujRKBR+1eheQSCR8lPpd2LQ/HuBKWilHSYEIwVldNnRVb4ikpgR4o1QlayfT16Vpsoi5N4ZsrQ0GHYd9jx8++XkAG6jBrqVzAc7aRzQD9ZXDT9mnLqypbaDpRhOUBmCyWpCYWDt3OnswXPNKjM0U6E8kmxtQdcnz9oVxvg8rydIbTyVYUpnB+KtooVladEaMXayKkFqM7s91BosrAlCpfjpSudoXaBp2TkABDzrnpOqbuVj/Y8bHQtTwdoiQ5SBpMCEYtpVAlT9TVeZBm9JkwQYJWEodf2rsmQiA2URcTICx8c2UYWDHKbE98EVxYnh/d2W5T8ZxvZ3kJ6cSs6UwO80xyeiF5Cods0MeKhhVJyESyqTF1n9RHA3J51RtVnQcu2igd3bSAmjrgepDhmnQqgzTueKrVvtIsftvkcdlyyF6no+7WdKRgO5Ou5eXdQnKQNJgQjLVZU2cl5w+a0VcnGEnaOZ9GbKIubeEbK8PodWSyPD+Bw8SdXc3tLK/tRNd0iYfZGVBxFfdlFypODv2V6lr9UqQzkWx6TN0fteVx4fZWD6iaaapoI49m3pWH5QRhx+y56+G1NnYygwu5ciDogsoVapvqPF0D9nuqrbJb8+JN151pgJSWlw8Gk4MkWQ1u6qxU5wia0VcnGEumqaIUm6hLW/jmylDyWiCTLK/c7xbLIM95FctL3pHljTuxPpekaFj+jG62QVCoODnw47pe0/KWjan7ora8A9TQ1gjVKa+iJQCTmp04zqC7pXWvhus3uiYcZ
WDe8rviksQ1aBvUYXxmwPlk473sVqS3Q3ObWS0jbLPlwWZ1Jq72S4zU9utJr6bOChqbD/3TcHPLKxOMJdEYvUJZik3UpS18c2Uo1akTI1keatc/g6w5r2J5FsetWN6VnffpdmJViiFhNje1J0E9M8SXuvb6W/2SlDORbOmYuidqy4NXT6/LoXl5XFwq+3LAEbK5nfeDHFZQ8FqPKkYZ6EDQvtN5DkMQv65Ql++NGIxq+fXDnrJb8asZuFpwiK+bwvJcCDsSFVIQKR0A1fAsZFNnw4fSFebkZvSNGtlKYj563esLbpq6jArfTBlHNsY+Lj236XWwVYTjOa9oeWU/p/NLdWVLTTtl2As7SG0qkk5W1bq2hTqEZVud0iITyfDbG1P3CppgiBc9rr0q55/O38znuoLLLQGYZF0owb4wdXjj2wzQKye8s4EXkuCbH2GWK0yDKRdv4eRZzOP5fHFZFEmhV0dzZMmlUdOtrM/xnI+bqUqxK7lY4OekzB4aHhZoXj7AIYhE0IL3RIm/rrPBsxQfFhjJGFDt6KsTdCTVAGqxfl1GhW+mDKQ6k5N/6YqdcXp2doYS8cv9KO+CWbyEWHdvikxkaz2swW3b636mxX1anB82lS017VRhmLEvX7ziFS+JM1Jmqa0Sj2F2NX+EpbJKJylnIll/TN0vbIOS/H58jPlFsLVjUWUEtgfRDEtnsiRglIH55V4lDiogCnSFotcAHNgBKWUhFiSHAk23Wsp8Bo83DWHALyrD1gdBP3gaBZFWmqmvqrODwwTCa+7t6GsSdCThxW8t1q3LuHA411cGB5lQHDUJFgzkQPtrdVcNTTmvZHl1P3OpRx8cZWVLTTt1mOZSS+TMGl3zxAx3YFaVVKQzkaw/ph4Uj8rbu58v3i+sptV7LXoHrOlhgWUZLM7S6av54p0fnaS77ucLO9o+XJwuxI5LEGZZp8qkso6Qc9GnjxbvaIV6O3zKXkv3ZwkU3iT/+K1nTIo6Z56/fWu19dwO0r32dYKxpFWyFuvVBdRCmyrj1eLDohyOEzxfnNi9oYncs5Zea2DBo7ZX/UwteA3LdhSaTtRhL9++G18WscxGup6/W6A4r18u0phK1h9T+8ZoIRPcH6GMLwesZ3y+D+6bUMYXRHWrRnC/hDK+IPTUfvAgCGV8QZTnI4J7JpQRBEEQBMF+8GUuWuaxWAvul/r2no1Y+/MIYAcv7F8hSxd5yZsazjereRBshdtb3ttNLixh5Jtra6yQpYsMw+Xi5GyzmgfBVtjc8vxzCBs9A9V9omQFlnyDYYUsTcRrvHbNV/8CRBDcwOaWl4Ztccv46rS39a7IMltZIUsVSY/MrFvzdS01CCbZguXdJVsp1G+5X5uwvGAbPOb5hcLyxi+575Dfrt8dhh5b3my//MX2laRRv8K/rFen0HEGh01IK9GxvPLhgVb8hiYHwVrIU4bDoVseH7QE+tSWP81kDx/zkS4MQPzwSVI5VtLHpYCGg6H/Zn5Z2wF9/2TCHqN6LXHFo2JVIaSsV1GoU2RgWdqr+wHakCqllRER8YNUc3lIE8iTM1K4PwYw1eQg2Bg+C6w2IZbHEZxfcu/maF7dSWAYwqIkDfYR3IDzwvKeQ4BjmMk+4U/3VHB0XmyvZoIU6ZsJSlUIqOpVFGqUGZjlwUz42DcQz3yoXs4PEbNMGJNbHn0sEod9b1Dc8QJV8DePd5scBBujxoWhpUaWhxr3H5jn6ePLC7jlAFZZju2X9rofi/Pxy0GpPn0qmg8Qg4kX22tWti8qFrnjQsp60S8bp8xAE55qedf+TqRUOd1QJK02reaYLbiclI8ivBAB74BxbfgbBJtzaoPIVpvu5SUu/MJO6LkYruR9YerFMNRh2bxZ3MavDW6OY3mCGXsrbnzXZ2PZ0bzGz+g3hTT1YrBsnDIDz1IsXNbH9Kk89o3qlcwby2smBUFjppocBJviH/SAg5bXvuReH0GRBSS2+l5HG6d80Y29ekp8efzqIPX3+DySrQ15WaupQ3EzkR1UQVNIWy9sxOeUGXiWYotWnFfK0pm3tjzs0MRXocVNNTkINsVeW0SHWp6fz9PXleLQRtz872FpGNrAdW9jeemwTbYnyF/xYEWl+dIeMw2jKaStl2aaKTNQ17XunU6rsy6ezry15Y3eIDdf8ECxsrymyUGwKbYLoUMtT30+OGXILXCsJCsxjZ0ahhbug7y2PDk7YWiwYtJy4lGP4JSmkJzIEtS5VBmoxGM5R/nGdn1eKUtn3tryLqo8sZNWwvKCXZCObdzy/AYQe9SZO5vr4QALzQtfMW5keVMvtk8mkb6ZoIwsb1Qv8RU0H12Qj6oCXUyvYnnpWJK8GoZznm4Jywt2Q3qXgFuevXYPLhnrF8MnGWc8uXKhr2XfyPKmbo1MJiG7rfyS4ZHlNfXqZWcZaJYQeHP+yRpnYeLQ347l2akgxesSlhfshmsfRG555vVh+DUvsvOywMA3wkrQRpaHxMU1g0xhefbNBGVkeeb1erm/ovzoQi2wiuVhfWorb+Bvvg3LC3bDEz3FAsMQyxu/5B7rNdnfnA++GhsPQ33/6VLLw2/3xfYqrZcEynf5NIWM6lW9ur/OwC1vWDxP93+tYnkI1ROk/CiC7WQ/TFmelW6fkAiCtcEIPf0aRnUslsd9YP2Se7kZBcwpKCHjYaifQ1hueTz72HmxvUgjZf5mgtIW0tbLC1WqDLQCCFLkCvxKlseTKv5RBBw0vtH3n/ctT0vHRKA5BMHacHRhKfhSLW/0knvetiUObHUB1g5DvZsqhU9YnuxWhXzNG6i0xfhZFNAW0tbLCzXElw4BmeX7YeBVAViQrpVXsDwejBI5KcMZB+uBi67lpdLjHrJgc54t3nLtlG7Eb15yP5+rweV3/I/erm+fQ9DwiTfzg96L7V06fzPBGH++oK2Xf4NBaT66gIM29eml9OYt++bNDbDCio8izB4v3qOy9rjDVJNH30UIgi+bj7YL971cEAR3QbpVprpKFwTBbuGF9Nfz+RscruVLBUEQ7BqeixWqEzpBEOycF4t4jW0QBEEQBEEQBEEQBEEQBEEQBEEQBEEQBEEQBEEQBEEQBEEQ7Duf/IUnE/AdRu0bIVZgw2Rb5DCesL0b7l/Vd8Tx2bofCV+KvjxwEj4wl95itDobJtN3sGyH9BKkFRiGD+ZqKXr7lC9aWpQcPDeHwHfOqKP3otJHpx+Hs2PP7JkIkjZLE7g7ptu+GhuqelVaBdR9p7w9y2+f1O58sYunzfK3QrbCcsuDKRxhEJtvZTZMxlfKb+2B9PUsb0q26G2RkXHmvLStIZ8sU+QrtgVZ0r+B67RZSvRdgkLNtRGbqnpVWgXUfSfw5Y/JPO3TwcDfjLU1bra8tbpyueX5l73WZMNks0ejj4NtzjqWdzEyFadVvMyoF8MgW9vnDcOxbGXmwJbLr3o/wjcgns7nr0+xlYBhOJEkaZ+XspToXTDVHdb29JrFNdlU1avSs7zcdwLf95gGsapAvsfTfVPzLbhTyzvfrPobJtsq61jeNGPFg9GHzXzK98/AYCyUJgRTtI8e
zp6pwEiJTZY74IYCNrW8Xau6Z3lN3w1D8RGQ9CUedPqWa3anluevll2TDZNtlfu0PATlb71wQeaG53w+lrdrVd9seW+wUMovXU4q4NJ/aiWzCpg9mw8BWMGnWNo+vso6PeQO9hUGHD81eXZ2pv3I5Y2+4h28vIQvL4NQR+Q9sjwGX+k7aT+cXQ/XyEs85AMWg09QaA4pC0DeGO7XdbIiuywhLi5ULrhr4FuxreOu5Q3Qbdtm79ANH5GjfNvSaZqjPOJSTzrcLe+ANTh3ZXk3ER4SfJR906mfnMPKcfhYnsHa2PKKbwsiT3MlVrC83FeNGqVL59h7crV1qC/D5+eBbZi9Ql9d++Fif1RUSNsZfVlqeveqLrrfqVWljBXQ9h3PrF6n/k4qYMSo31cHifN3jxUrGOrm9+eAhL5T91tpm4BA+cwAkGHF7yQI+m6vQ/V8aizvhQZrrubM9YfTXtquY60qAB0t38Ark1XZZQm6sCcgj0y8tJambek18+YXmuYYHywQTsuL5kl0CKduAuq0AmVIWp+U+6ux4sEqllfsCZKpZ9rRM7a83Fd1LyfF6dv957lE1SOMhTTNLUdFg7TdYovPR0F0p6r2pOJWalUZq1ienMEyX2F5aLS51sfqYj4lWd4VpzvsUu0LIByB7+Sg0xOg6fRjL0DfQkYXRzF9EEKXHlJPpeVhPF9hyufZIvE3awlMjAxgT9JbF4AuhmIec0flyZrssoS4ntib/hYymkyAm1HbuCf8WH2itm2OgvU+JtjDS44Wz+ua3YWs5FRX0U1XchhwZAWK5SGzZ43CNra8bL29z7mvZHnWV3UvI2mpuNbyEIxfJJa9U29UtKS2yyaxa1WX3W/UqjJutLwTyQiFuTdlicFhgesjtc1ZCcnypCusLzDh0KO429V/7SsSgh7iBj0kiyrsJErLs7HO7pJ5emR5urt4rGuGugB2sS3UPFmTXZagi9u0+tBOQrB6yrbBy19sqqWJYs0xfPVJ1VlexrU0s+ym4kw03Bx26ZMuxal9VCVjYTdbHlYSua7FWEhYhnl671qeVrDu5UZxjeUhVnwwGPx2R0XLpOXtVtVl99eoqhykylQh1nd6OIaZSL1lb9sw2gQrw3xKsryP4sUcgt+kAGJujQGuDsW/nJUqXrQTvWQjxkodWZ4WCgfCmwKsz4kla7PLEnCpI1/gomYtuG4b1jni61+hKce4jjcjFSGoQspuKlfxOux88JRI/zsWdpPlHeFIqzj+dOkyE3NPZln0VdPLjeIay3MDgQOjpDcqRkxa3m5V3R5EZeqZSrvJqEPEZ1NvampteXWb1kAXyPV3p5LlWa5aEsRc/RbC3bZsR/sFCqbzPpXlpSTuGlmeFYoZd1RA6uKUrM0uSxTq0GprNSy4bpt/Zag/Sxbt5mGNuUBRHWD7s6KbTgu9qLNn2sXSJue3zPIU2V0Y0lfAjsDEXWSr9CxPHU0vN4obWZ6dv9AqoLjUO20BjnVD0RtC8u9I1WX316SlhzBWQN13/t1i37/VlrfxPk/ufWg++Na3PGgg3QBhZb9nkCEBr8zDDkjVrywv69/2k1OWJ93bFDBWR5vdjeqw4KZtcvTFIw16MmVzlGr4piJ4To7QXXYTD6LPdaLWAsusnLHiwU2WZ/sdIw9KYO569IBpy2t6uVHcyPISbFFnVIywzk59biT/jlRddn+iVJUxVkDddzo45GYF2daWt/FxXg8rOPWMlyRal5ZYCDpDL/UvFtL5jL/m6WN2gFcYjsLycHxiLlviTVmezEtNAWN1tNndqA4LbtoGtV+hrLTAUarmKLk8YHnJxy3Pzi48KnfT7DFWhX4mggX2vgw/VjxAoeYSKsszVwEWLrmKJlCPHtBkWfRQ08uN4kaW99Zk7W6Y0agYYZ2d+tzYtarL7jdaVQk3WJ7MLYY0s1DBaK6+JVOWpztIbi2kGQZop2y1A4qpsLC8ds0waXkfeLTRFDBWR5vdjeqw4LptflGhHqt1c5TjlD/QvHCAIscTxRLGu4lAy8xWC9zKatNcJeXFPRNoW7PE8po8G8WlQ1v1jq7+jkbFCOvs1OfGrlUtWPcrHVWBsQKqvrMlvMBTrWVNsVc113awguvRqVgjLSRNh4pX2C3PFrGV5fnJDA5sufVmyvLkFEJTwFgdbXY3qiM1oGzbcbOzU+rmKLk8oHm5Ikp1FhW1crXAegZWxooHa1oe9ge2n0qZVKOHTFveSI2V4oqO1IZUNxsIqVPFN8I6u70pYdeqVkp3T1WUaBVQ9V3uWltuZhXgkLRe9t8WK7genYa6PaQu2a45Yw/CDvCTEahobntOAilZeY0sT4/gMT2xwXUBY3W02d2oDguu23bRjAmlbo7h5TFI88KRg4RUI7twf5LMtcC0YirOum3D8iCglRWnbVa2PMiWvdwqznbU2NPQ682t0LAcUxdtnX1eXy5F8G5VrWj3K11VQbpVQNl3ONY3F8th/kkFqHiZzxboWp7uaa3O0jTUAsdHOoZOuNPAwRJ+ebZUOkDPTn+EN/eD+NmBaIXqYWx5PF3kjaoL6KijyS5LJFetDguu2/aYpYJP1YhpmqPAdnjm75RdpHnheJ/xvB0Dm6KbDsUYME44cK1AFUoLOLIVy2MLLjiOv85r5KoxSy2v7mUkrRSH7aEkT15uZs/YO91RkUQMazssWo8MDUjtUtVV9yu1qpzlllfeK6SrAVXB4SuewKkadHv6lmfIjkrPEMFht//oGWYcbwoX2gH2HNOV3oSUSOtm9Y4sDw0T9AR/VUBHHU12N6nDg+u2UeuKXckSmuYYFshklhcP44Heza5u6SYoXZCKWoHYcQh2JYtsxfJk/DoSYG4g3qWWV/eyKw6TjCjO5qWLhXo5FSnwmKsZFfqb8M6W6NyXO1Z11f1GpSqnZ3kO3b6Y8C70WpcH11vCVrbXqgevj4xOb8YHuOWQRYen3T+LLuBhwCfrSqzMedb1RbOW03P1vt76VI5ClnUis23qr7IADC8JAzlZlV2WSC4bFjga5vxswXXbfN2PCtulLKFpjiH9IFO9F8GQ60czNduim/SsmHaNFyh5+iUxYXyIBpCLuQRvQ3meoWWuI/fUJNM4dvkmy6o3azWq4vKt7hCkHl+7HqkfFCTzfdHcPCrqhWNqOw6LyjrsWtVl9zuVqoyxAoq+s2tfCgpCo6U7sPurc957fH68U9Kq4aK9tPlF45a3LtViepp7UXUwxT1ZXnJUk/UXzqaWV+xElhGW96C4J8vTFU97DuALZ0PLe7LiwiEs70FxL+qQ4xohn9MKNrW81daaYXkPjN6F2jtAT8vdS9EPl6vuZc6tEf0dBEEQBEEQBF8mxQXVLbDVzFaGV4v5FBfvBSmv4N47u66Y578h29V9sB75FqFVaO+Gb7kXVWL4yVULvevKAh8CUp+iYjf1nrCCkItItherXlcfs57ug+2yruUtl74PVWLgHaEdLDzdJvUgGFXspt4TVhAyEc//pLxXcx3C8m7NBj3oSZb3fht50wv8d6PK5bmm2yXzvZeTbPMK1415jSo2/emHghWETMTzX/t
LFl6vsLxbs0EPrtb7E5EPyvLSVwFWKPxOLW+dim3Cxp9D8PqE5d2aDXpwtd6fiHxQlpeecVmh8Du1vHUqtgnN41+r4/X5Ai0PR8hota4ReHNHfoMBH03axvv2qxfeM9P0snxSJJHeh8C1P1eTP3XQyXjiBf6z2fxKHuxJqmyK7FY8w/LeFXWYPWcl9CmZXvOKzzH4VwEemZis66ri8ncamoo/Sw/HHOt93EWpfCDHbnS7kvHN70RgeZdWgkVeRdTyik1/+qHQlws9QjKm8xKKj1SISPE5BP2SRdFQUH2xouzfVvejXvicQYvlSSk4/YZG7W17MOn279svQ+qX5QtFEtTFSpWnsuQJL/K4knJknrfQ8qjqk4aIKkFbZL/iCSTDf2JPhtnzlXLz07gWZR1TZZ7blgOvKk6eaSOjittLD4AmK0tlmL3oUUtGFRmvigAqKjmkqJsqZntJ6+9ynFuIOkWIRqTAXFBC+ZGKSgtmQCA3tK5J3b+17se98Fkj3bDL9+2XL7yHEqrPLBjuk7o8F1XRO/pyg2wSNixsk0DmqCFncpFvi9Rtr+JKUweaAF8wgH7Q9ww0tWjqOFrU1cWV32lAVFnx4m3m+G1KHVne+XB59MKmBpLySlE3V0xSIP6ZWIAEkvH3ITAwYHGP7dEO9Cer+tYqbPl4/lr9qqF1Tdr+9ZI1Yd0LnzfsCV1r+MS35ffte/ak6NhyhedJWBdxVDMelCTbccY2dmSTsJdpcTJXb1XkkoorTR0gpys+LJhkO05BvI7tAK+LQ6iuucycyor7Ows+cvfYljqyPGtUIuXVRk1VzFKk93LlR65rfVHoUgcGvysnYVoZm6Gt5MbyqoYqVpNWx14BTVj1wmeOjUgZqrK10ZIWSF3LS8sTedQmRZLSTdKyTcqyAxfvfsU9KV4LcdJkK76E6TwNOgXN0Dy0PW2RSyquNHVIr99AuGzHKQRL1Q7wujizP6OpuD20JG+4bEsdW574MimvUdRExSzFyIQZlPelLiSX6FJd9FjMxouVXFte3VBDa9L0Lzbiy7nnXvjMsRYX7/zQySntdiYsz05A6DES9J2un7qgc5pHWH6tSP2CEfekuqRxIljmbcaWb85ewMGqucTRFrmk4kpTh3xHVBoT4muxfMaWVxaHo0r1CU3FdZWHNR1+21JHllelBClkFDVRMRP0NXRBoa9ayBYIRVH6K97a8uqGGlqTVsfutfCyFz5zUk+8xyh04LWBRkfX8hLseMROv2+fJ7b0hfdnKc5XYYp7Wq3UnzpoMzadp5Gg5PeaiqMt0jIkbcWVpg7ZMm30p/ycqo7tALc4guJyZUhTce1hfTdMW+pGlre0YiaYC8pkfZkQbFE9dmicitJf8Xr+2n11Q6uatDp2r4WXvfCZk3oCvWVvut/2+/bTC+/Tq1+LPRNxz1grWPPyxLMMjzZj03kaCQoqYi5xtEUuqbjS1CGfNdWdpOZaUNexHeB1cbkypKk4/Oh3zactdRPLW14xE8wFFbTfh2BWZzy/SbeH0aG/4vX8tfvahhY1GetYfCm86IXPnNQTecgKxT6MppZW7m55671vX194v+ZqEwNAfKaHKglIw8JGgoL51VziGK82JyuuNHVIu3645Hi4lAVNHdsBXhdXfacBcVXFZUdhE1xb6gaWd0PFTLCz2hSq70OcD6haOiIsitJf8Xr+2n1VQ+uaNP1rueTwohc+c1JPNM21U4R00NSSmFve+Nl+E+n3miTLezr7zILhwY1WfPzdZHl6Gs145mIH4miLXFJxZVQHNhdMnGFt6tgO8Lq4dk9fVVwqfKkXqNtSbRfk2abBn0h5FXaxtGImmDJucTWrkAQZRQn6K17PX7uvamhdk6Z/LZccXvTCZ04edtXl1HSQjF2hjAKbHbHnoNeW/jUa1ovh+oNDw4tA95arHE8yGvVy7RXCyyxv/AJ/v/ps1amKXFJxpalDWjdd2rneJn1Tx3aAN8V5ZUS2rThi0buy8G1LPTd/apMN/kTKq7CLpRUzwVRQcW1QKL8PUa3GLUwc+itez9+6r2xoXZOmfy2XIjz3gp2c+lxJLeYpFu3/Lb1vX6leeI/cqBGUWQ06T9JoZZC9FNa5bnlVxvCLzjFFlAODZWCOZSrJrC1yuuJKUwdUXPwY+1pKU4umjp0BXhTHgc4zxvKdhlHFOauZdFsqdn0wBD7TKsFp8CdSXoVd3FQxEdQc07FEoy8VsttwbPdYlKC/4vX8rfvKhtY1afoXsY3uUy/4FdTPldRi0bMipxJu/b59hYOIqGJciXWfepJGK/r+byTVwdNk7DrX0kVCEf8w+GhqipyuuNKODCzCFb3q2daiqeNogFfF5QKzp6g4A+x4ri3VGmGVS4M/o9FF1E0VM0EZ58Dvl6z1ZUIWpCfdUglVXTx/7z5LQE9dk7Z/R7pnUukFCf6MSRfQgfbRtt63b+ggsjztBHO7trEkqS4fdSRAG1w7+qcOmownXuCvMwgOUj2wLXKy4kJbB5uQLtLuqalFXcf0VYBco7I4K9COYkYVvyi8TalMl9qUvhOR8bxy1A0Vm/z0Q6kvFfKri1q9VEJVF88/dV/R0Komo/5tdZ96oTnl9+XhlhfsnIsHec/UcarV3ZjCw+yFeyAs786YOsd/v1z69YS8W9opD7MX7oGwvLui/tDbg2Fhi1EsaG3FulMeaC/cA2F5d0W6hPrA4C0twp2sAh9qL9w9O37ffpDo3sr1EOBdPoN+H3r3PNheCIIgCIIgCIIgCIIgCIIgCIIgCIIgCIIgCIIgCIIgCIIgCDbg4KW99aPikb6z7Qtjbu/5XY096aKv/Y39982HG59H+KJGnTyGZe6C9Mab/eLllOomI0qO1no8c18eYXwwD+DMb3zU755G3ZG9gepO8BeRng7D8ULeTVezUh94JkDGrDzQ5cjnTzPpPWb+Yq1d4O9sHDEZUfFZWp6/jU1g/5vznSjDnzc/1Adh5R2ZNyCCxJ7cWz2jG5VwT5aH+prrDsivAK4M4dWZ2dLtLe9FevGeYN8yEKb6P5Vec1a89Gs5k+8UWP6yAS/gs7S860LD8rI9ddavRfRZcZXPaJmkW9caGb2feLHEeqNu+9zRc/dKtjzZOKnpG1meUL1BlhH+bkl7nxuMoC4zM1HopPwYfVFgh8kI4gV8lpZXtOkAVTYvbJB6eW7v6LOXIPrHEJZSvy10nYymXqa03qjbAXf5Xuv7szy+ZcDfw9lwe8vbDC/gc7S88gO5V8Nr14692JjvBcVv+haJB89mzyaPfWrLWyujOmni3i1vh8zbOX81y5snywKP5223bWZ5CGre8eLFlB1fFN3UsaSqYMmj+fQhxSiRF6CW96RK2i3hMQOz5bUyR2UOh1Vk04lHbe4ecNB2dlXGvDqMqqLahhcv1OERn2knfcjmkby+v/hkkPf1hJGAKma9jK76k9V6o47M5yucNarU0Cqpfxq71eRt4Qt9vSMMMRrMh4IfCJiXXYU+0Dd++1sQoTJQf+V3Y8srJV5LxpcSLEDe3jHPwy///iGSJC3bmE8pE7bTOtBjj2J4Wg
Qyx4oHFO92LAqglLz43Jf98g7n9E1BRw9YkZHUoqir5iABloPVBEhlyk6kKGPFQ3LAoaXSnmsrzeW6Ql9VxXHDkdo/RAvnY9cORoQEMfANf0yR74rw3ognVcx6GaVdYolUGUBspVHnx5L+xQxtrZ/jkQ8atWqoewlF8b3qeRIwyx+Pp9uiY6s+iJQuQgUF15SelpLeQcWG4YovG9dDZThOX9CCxWdsbHnFgTYOEE6eyydfc+kYWxfHiwvpcv9EwfnI8nLKBHscQHyBdVXegeYI5FG0ihQFQAqFirYlivVYLDCaq+MAvrhdRwRrUdZ1nAO6fYEqmheb3IkqanIkB+DnbMFMJLypNLI7fS5vUWcv1lVEYNNwBPnEf8xBZdpJH1xAPPaJqRb5Q1yVfVVUMetlhAFvroJ1Rx0N5OIdz5zTp1oQUdlKhZYrEhbImNbyOuPptrAYYD7FjWYUbFrzBLb5pIaLCak8ZzUMr+dGzsd064wtD81OBst4md9k8eClv9CNz5CeYWN5ZUoDPY5fzIniK9YaGoEkacuN4z5ISQ1wJCqVthLyvC5o6ZTltq5rk8Opdtu1ft2q7kSKVpN5DrjWLrMvkyG4rLRensMkTg98RRU7DS+uKYjLtJPHHcZkceYDdTBXZV8VrA/MTI9f1swohVasO+p8kuPuyTqeyzduEWu7lGlFspTq6EuFO+PptrAkYD5lBcuTLSYC9WqF6uWC5mtY2I2Wx7WC+JVybZRKN0zQ5UeWV66qBNU3JkX1ZmwgeNHWKsd9kNIK6Jj/5CuPavCcmrSvNg0NbXKwIWM7gLoTIVovQ0YBZ7o/qSvtg1vn77qKnYbnll5IjZLl+ZUaFprnzWw6VZsrqGxilVsro2bIG2uNOu9/HhfgF2ZJz8VwJed01LtckSilbptKd8bTbdHv2hSVBzdbXnLgJ+mvuiyLuAILW2p5Z2efKFr2fjncUqGGpfUMR5ZXD1SAHscvtJ6ObQyNyAW0ddStSfGjAlzJFZ+uLk7Gpo8CtZbHuo5yEGVqrzWdmESdUYCtfPqqUAXWVew0HOssddhOMlme9zS/c4ddg/ng5V9m6nY6rtfY/Wtm1M+waGGvqfWoyx+IcjviyVPssETondRquSKrox2ixXbG060pmu6sZ3liNEIl75mAHL7M8oT6xCaPfa2oXCiYL3iAtMzyypSG9TiPAermWkTdqoT7ksL06AUa8kYX5aTRnS0v17XJ4VrXi6cy5zadmESdKuAJ8mytxL2Vp6riuOG20KCk6Crl6XWnK++qpA7IJNF+eS2jn5hcM6NqDksULSyb2h91edBpZhqJH9syrFFD00upFEcDOuPp9hxff6jPNaOwdSwPVU5osLC25c07p4gxUefzVFYo5i9lmeWVKQ3vcblXo9SxRdStSrhvpLBEsStJDbJaVHVtcsDK+rXsd7jrazoxiTpFgJxpIfQ0ldYvv1/rBG1iRKo4arifMXhtE7ppx9aK4npXnPjIe5estynEyNfMqBgxBWuNuuxWaem2BY6kZWmvscsVmUpxLGA8nnbBepaXZs6atS3PXA2HHGh0eKGvhuGcM4Wl9XSt5RUpjTx4eT5XP9otWETdqoT7RgrrLfzT4b7Woq5rk4OcQwOyPm3an+tq5ACkoPV0V5vlqUAGtVVsGu7rP01jHPLqhQQzAkWlA4C0OEXQzZaHZd6aGfVzXWvU5TPN6Suzc3TKARaaF75AXa7IVIqTAtrxtAvWs7zcjxVbsjyZbThavFAfO5bW06X+TJaXUhrlaE6f4ScWUbcq4b6Rwsq9h2P6poO1qOs6ymH25vyTyTedWNZVSAEwHtl2Le/l8OHR2fDBatarYtVwP7vv9io8KXQhVx30++bgMh3r9G2kRMpeM6P+Pbtrjbry8rPY4MXwSeR5cuVCb51ZrshUilME1ONpF0xY3pUffNZ9gEb1TrZuzfIwWthDXrrPnI3lJUdheZbSqEZz6baIulUJ940U1jvkxhhWh1teWddRDuoTmk6s6kpSgPdg1/I+6blvo1vFMmM79ZewvFFWedoQ1ig+OGyagOsGy9Pb0tbLyK+FNKw16lIRfiYXk8uhrMBh1hY0UkPVS4WhKWVANZ52wYTlpUtXzRjliasx27A8XQeoorx0m9bgNcuzswZ2Ugp9zzFfpjS0x3VOxOwlW8FU0bTK8QIaheFYpT5dISCUBwtQslleWdeRyofF83RHUt2JSdRJAXq6XA47uK0r/X4YPizmfvtYXcVOw5GoaoNrx/K2KR6LNJEqdpeFwZxXI9XOuSClDOd1MkrGUue43qiDxUsVrESA1bwo4XzwKw7LFVkamqABnfG0CyYsD3X8tDjHLN70Ac/7HM/ni8uq0luwPPTRq6P5pa5CvHSstt9wKkuWNyzeMi069mKBnxOO+SqloT2OsTk/gpLzjtFV0bTK8QIahfH03fmb+RyDXUINSJ9+jVocSwF1XZsc4FXk2m/diUnUSQEYU1fzR+guDWgqzUMRQfq+qmKn4QirVk+uHR6BLl7Cp3sD7MjPXy7wmxT6KfcrRM1FhuFysThGuetnBLdK1TmuOeqQ+eWLV7xUlgPEOcc238QndBWZSnEkAGlG42kXTFgej5hlH9+OUU7xQnmsugXLQ7ygR81eOvsSu5YLTQvFALp4XxEEFra3Efx4m2iPY5VGpM8NU0XbKsMLaBWmN3GC+oSXhj16qSO8qmuTA1TNKw4wFpm8q05Mok4O4FkSJtGAptIY6F8vFthNaOeUVew0PO9PlKQdmSxyy6Tnm/2jg2FrLuKW7+a9RkbIyS8u1F+IXWvU2SjIzcRco+LY6tJ/uSJTKY4GmEg5nnbBE7OZ0c3ZR4sFm5neiJEEDheni+LkOvFMQJFPfRt5GeE2WIN8k4yVPnu8eI+y0m3+84UupGaPFu94ZtFuQi9TKnYH+nzxrjwaAhoxbpVhBfgN7AfpLvfni/dao5Jni7cUNKGqrlUOfjcXTVsdZSeO7pYvAubvFkivPVlXGoanvnT3SFnFccObQ6uD/NTK88VJ+YIW1KytkCFGUfBk8aEaCStnhJx8W+e41qgDL9++Ky80zq1Rj1KypYocve3FA8bjKdhPPqYdQz38b0PK6bxaVE4zvvCwJqdb+4S5X+LYXo5B0CMt7fzIfwskQ7IbfG/kfbNWXptt1l2XLtvLMQh68OzD6/n8DQ4F+wvtDUBeH+fz+bv24HOaW47z04ljtvXxc/7byzEI+vBsm3DLBV9Je4bjRj7d7jz5fGunHD7aadbt5RgEk7xYbP2o/cli8bI5QxEEQRAEQbDv9D+/UPNka+cRguALx79HIHdq8IqnevvE2eIg2A5v7TY23qbIzy+I+U0SlhcEt8O/NOCPSOVroNXzFg1heUFwO5IN2V1+yb/0UkxYXhDcjtaGVrOpsLwgWIX2xfP58uwKllff7i83h4flBcFy+ADTMU9X+mEb3z8L5F5AfTwMqFz5+QV/tInPLAK7/059J2F5QXADsKiLoXjxPB8uTO+q1wdF5elFsTz8CI+T5b2BdPHBAcTxFbbIUbxBEExAY6IN5
RfPl++qz+tGsTzgfrO8+j31MDquWrEJywuCpcCidNmoL55Pz4/Zy3dusLzRe+r1pWrFa2+CIOjhFjXzF8/7w/VpHyi+CctL7y8UR3rnQZIKgqBPsjx9RUx+lZEZlUdPWV75nnp/MWuWCoKgz8jy1OemlQKmLC/xon7NsDmCIOgysjy/N8VebeXRU5ZXPmPtx4ZZKgiCPiPL80eAzAY9esryyvfU21ckQHIEQdBlZHnm9ddvu3/K8sr7pg99F7jFN2wFwedJY3mjd9X77Sl9y3vSvqdeYt/b9mX3GwxBEIwsjw++Vu+qh6P60kAOp+U176nn3WX8aOqVelMmQRA0tJbXvqu+/dKAi5vl1e+pn70T9weTglNCgyAYMX7xfPuu+upLA/5gQn4lff3BgcUZT4nadxb8E1NBEARBEARBEARBEARBEARBEARBEARBEARBEARBEARBEARBEARBEARBEARBEATB54A/ob63HLz01xZO80TeW78a+dn7lVhTfD12mvle8fn1RHoryx1zJG/VvQUvTRXXfA/MTfkta2Sb0t83syJriq+KNm9Hme8hn0FPcKAmnt+b5aFwc23GW/18GF/0eby4uim/ZZFtygdheda8sDznc7O8F/doeeUbc1fn7FS3b6ze/ors5fkttzxJ+erMdqIPwvKsef3MvQ+WkFrzufC5zEFnaSjel+XN/GsOa5Kqe6ib5F+a39JG+sdvTbUPwvKsef3MV1DZ5zJQE59Lgx6A5W1IW93Vqr/OWF1TxzsdEv3M12nN58I+Nmhue4eS1vKepFdvEn/DZsVBN7SlEno8r04pHvUz6OV71NTnkbnGo64zCusM5dztWGpUalKtOlZuiIjf3DlNFv5+UuRVqGe6UhVla9p8jamB6v26pHlF1KNCDXXdDivfRCWMUn1kdDbd+6KHF9vRygPnAw/nzJ2pLU/eGu2HSY/oqT+eAJCAvKDbv8w890yG4TV/jg5FRj6jDjSJfINdSrmCTzyCDY3XImQfgCaUPGaY1+exSKjIK3UzH5kx+G55gkM9H2p19dV3glj1G5WQpNQA+Wou/AfiXrkh2m6WX35b6cxcpM3ihN6L2ewZt9ZSafUw6DGcNM/qVlP0QZVv/sz9J/wppT0k/SDwgtte80p9HLCd9sWaqm4eAySyyqClVJ9Ugk1OXcNm6jvLJadiRIlIUZuRVh46qqfRyQd0lrnQduhBOk/8OLS/WCzQtdVxE6PfU4hfVn9iorBA/aKX+IcB/XbFN8V/bYHD6QvaPT1aSioUpJF68nxRhrf1YedfvLtUr754fhjObWjiR4DabIw21UfcGaqJHMVr1EKSUq5NAPGj9HUaAvFrVviD+Oxl3cNQTPRtFiiaCY7R8tRSVup4gcxlCEvzmHBkeUUfVPmiK9kc+SJU0ZoExixKYAyToVOsgnDkupX6QNDi9ZlUoK4bYq75LQ1Lg03OoKVSH2Vlriktj9rmRz24d20sr65NrZUHD9sJzJdAK8zFwcvOfWkzpE0q6EluDEzREoleYgeZDFLKl4NE2VKShXLzSc0d8xR7iqXUk6KWajkVg3RcHx+JPqXLhgGydb+m8gyt+hhfrC+HGb1OLZRSyoYOD7c8bmqIFqJ9g6EhoRirshXGWfCrTUgg/YfJkelfaPHYu3BjzUuVqvC863xhVPi1kTlOyHpiw9LFRE15dR6FPvwT+OO6nWqSa/32VJ1BS6M+Fl4e/LA23JFib0xTayyvHB1MaVtuHj6scKeyleVpt+r+7FOxLlAHsWUmHR/xe5W7kdtrycFLwUTFjU+pqi2UIvrJaLH++b1MU5/0XT6sSWTrfsjJ1v2arKm+75RherJVRkJSoJcLhwqv3JCyBljHZo8zyoIt42jTrGSxntAMrHlVNolcYpkvvO9ml/YZtXFCrycMVFbZOl82eRT6sPgKTyI29nUvg4ZWfahEMaykme/F8U7iR5aXR0ejlYcP9+Ne6YLK8tRxKUaVPglbfhs2t1bT4fAKv68G2xX6jjCNXPykJLo0TaUkVHw0jsf18U+tmM482uXc7/mV1fdpO0sJTRut4kX9N2gIuJS2mFmVYlNZ+Cj00adoqAmlzCssgyZfjm5z9RJ6iH1+zeaIJo9CH5gYbLrNuL7FIDRFW4maVn06c2dyd4pjZHl5dNRa2QfE8NqTST3L8y/IfjwTStW5soqJEkuGq+HwjEsFG99118AoNR8taWrA8lRHLoeM6uNns8xePNrl3K/51NXHOk4jk5TQtNFqkCpSOPCzYkMAJNOv76GFqSyy5XGYkfmCx0+rWl6TL8PtU/c5oR7x6ZSgIXbcZLk0eZT64MFZMXBy3a51cjmVpcOoEhWt+tr25O6UJXBjeWVtkqtXzMPk+PoDd0k1uZtS232kJ/KM5wcvqWtkloZTjO5M57G6a1BAAt7JASun22zxJ4zqo76UwANczv0aLeUpqL5aAcnZABMgbGPKONU/OfCzakPSkJZZKc/1YCqLxvKw91BWtbwmXz1TKo6cUGMZ6iG15bV5lPrAUtGXB1XdoHUsj3Eky13fqBIVOUyLb9uTu1N00VheWZuUslfMHoH+Mldqu4/09sAL5I+g216E2jvgTEqvJam7Jg96YcmAPeSZBnWSUX30JCVdMp17tMu53zVbVj/veZKDNG20mhT136whJimDppKayqK2PHTtua7aV7W8Jl+G24jtJfSQ0T5PfYlSH6iTnEap6yanJ4GskccZlLTqa6uVu1MOF1vLK2qTUi4t7+GzxPLsGKgkr9p83YnNMbvyGrNfCiq7pjq3t3TAysymX2gno/r4I0CmRI92OfdrfnX10zFIkhKaNlpNivpv1pALOdHPk7Kzd8UBynQWteV5J6xueU2+NAIUruc7xgk9pLa8Ng9S6kPPTdV1Q9Cb80+2rO1lkGnV11Yrd6c4iq4xy0u1SSmXlvfwWWJ55ZgxIKNjBNGuAg4w2uT1m+5y4MCHgJJ72Cl18EmKVkb1Ma+fanO/y7lf86urD6Xp3i3vtEnTRqvJlR/8r90Qy20Y3toWyzE5HW5MZdFanp5uX93ymnzF6/2VWpPwrGrLa/JQSn2IVF03z0DoZpBo1de2B83UaxF6bjNlXVie1abWyv4yaXlPOqdj2FoVSsneD3r1CDHX1ndN19jlIqM3YCmuhpHOC4CmPi/siCnNw14/l3Nxza+pvtUbe+qy+JGQVDxdwVy7ISqQ+uZyuGyEJrJoLU92D6jG2PLOvUqK17/O95r7XPST9GV9PZZ4s2rLa/Io9KE7K+QnAWXd0OLF83QfV51BU9VWfalvDTRThg/i5bqfnftEKLukHB2NVvaWScvjGa3zN/N5Pp1JMIkPx3PMS754QE+pOLYm2HQNz0sdz+eLSwnuDViEI/jV0RxF5nNAbX1g4ZcvXvHiiISywMVbuF0uh0vxdfV5dxnPyF25lFILWUrY46fFuZwdWbMhGJAnvCPG7q1iZ0nVExNZ1JaHA5o3vEbWszwEcuPAK31Q5Wt7FdNsak3Cm9VYXpUHSk36QLvmR8i0WzdFjKVuXFvVRn2pbw3J6mpx
7PFYnFws8HPCYsva5JR1/nvHtOXxtIRQnnCkIoV0mwLccu6T578kYNQ1WIYonLlSKQkVNxE/DAej+vCGIeBjWWuS5Vzci6+rz8mCt3U1xVdCnhK2IyuntRti2fHmMcFEC/pZ1JbHQQpeXHQtTw4hHe+DKl8/nWqt8tYkvFmN5dV1Myf1ATmi/V7VDTMWLzHAHGVvV2WAHKqqNupLfWuwmSpgk4SKP1pIl4g7Hd/XWtlb8u3jfuf4Qbov/fni/aIZOuijxVkZmG7OT6uO9KKMFHK4OF34pYnR/ekuDhmXV8b1efn2XXGOfjZfyI7X5Ly0/J6OuvqoNn5Hd8MXQinl0UKCNmmIlmJ0z1p0snhm+ft9/48X7yFiPhWyuoyOpawPyny9todzncesNYncLMur27xSH/PFu3S4WtRNLiURTADqKDMYH/aV6kuVMGSCeXy8yFPvo8U7Tu+m/FybsVaCoCGtO7fFabMfuV8+6i4adGaYNas6XkIEwabgwMf31lviYQ3PdKRSnzFW1qxqWF6wPXD8Y64tcdo73Xx/8EL66/n8DY799Di1YN2qhuUF20PPgm8RO3B7MMz1xMfoPBJYt6phecH2aM4YfZa8WGyplW/SqZ0gCIIgCL5gVvlyQRAEW2KtLxeU3PqbCUHwBfPWrkudDqt8uaBkdckgCJzNvlwAii8T3CAZBMGItMOym1iT/6ZrNfl22Ad2ASoI9oF2qbjy0rG9ET0IghG3+3JB3qkd5Ffq9yyvfad+3GoefLHwjh0+ipgOxvgoJ5DHuvjOG8Hu7Ol8uUCfRpNHsw7lKTC9f09d4nbJMmOGNh9eCIIvCljUxZDf58+Hmdf5cgETnpk0nQsGwK0PT1qwSFYZM3QOg+Wjj+n52iD4gqAx0TI2+3LBIzOrd9xjXutKM71XWWKSo8kYxUpObolB8GUBS1AL2ejLBdiFqbfgTB/2byyvzdgtLr+LLwi+JNyiYER80ZO/v4Mu2weKb8Ly/Opeib3ao7G8NuMiWrdB8EWRLE/fMbT2lwvqZx+fzBc80qNzZHl1xmF5wZfNyPLUl0zDA6YsT33KBQ/eCD0jy1Nf9qdo3QbBF8XI8vzinD1M7dEr7PNgctybTa0264zD8oIvm5Hl+YFberOh+CYtL39m0b+3MGV5dcZhecGXzcjyzGsvKU6GMWV55gX+FiqzvObLBG3GYXnBl01jeWt+ueC53YPyAqF2uYB3pnDbfJlg8p36Kvayfh9/EHzuNJbXvvoehrH0ywWIvlyciDRs6mr+iG4RedJ8maDJuLE8Dw6CL4TW8tb9coHdJsZniPQuzysXab9MMPFOfZVGhPiC4Evhtl8uOFicLTzl/P/bO/uGxnGr7ZMXWJrdKaV06ZSmnZ2ytJTSPJ2hlHtZNpvv/6kenUvHtt5lhyQQuH5/QGzr5ejoSEeSZftSfjYPJIRfJki/U18TnMWvaySEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQ8n74hhBCCNk31IcNRCMTQggh+4P6sIFoZEIIIWR/UB82EI1MCCGE7A/qwwaikQkhhJD9QX3YQDQyIYQQsj+oDxuIRiaEEEL2B/VhA9HIhBBCyP6gPmwgGpkQQgjZH9SHDUQjE0IIIfuD+rCBaGRCCCFkf1AfNhCNTAghhOwP6sMGopEJIYSQ/UF92EA0MiGEELI/qA8biEYmhBBC9gf1YQPRyIQQQsj+oD5sIBqZEEII2R/Uhw1EIxNCCCH7g/qwgWhkQgghZH9QHzYQjUwIIYTsD+rDBqKRCSGEkP1BfdhANDIhhBCyP6gPG4hGJoQQQvYH9WED0ciEEELI/qA+bCAamRBCCNkf1IcNRCMTQggh+4P6sIFoZEIIIWR/UB82EI1MCCGE7A/qwwaikQkhhJD9QX3YQDQyIYQQsj+oDxuIRiaEEEL2B/VhA9HIhBBCyP6gPmwgGpkQQgjZH9SHDUQjE0IIIfuD+rCBaGRCCCFkf1AfNhCN/LL85vvvv/+d/n7l7JGohLxL2EbfCerDBqKRX5afVoa/6MHrZo9EJZviR6n0n/SAvHbYRt8J6sMGopFfFrq8V8hvf/zxR/YbYP9c3ruuPLq8d4L6sIFo5DRo6is92CLvxeX9UyIL/RP4/S8a5Qc9sStU1u/08F3T3+X9FkqrVO9XCfNVD7bCa6+8tiF4/Ocvv9Hrz4Mu752gPmwgGjkNXV7Es0SVuJb/6ZkaP2j4LXeRCf5js/29Hr5r9s7lvfbKs+Kl+OmPGuQZ0OW9E9SHDUQjp6HLi3i+y/u3/aunyvxJgv76X/m7a5f3e8l09f/0qOF3v5qTv2xmML4/vEqXV6qKdOW9HiDej3rwzTfff/+nH/8fzgnP7gi23p28z1bw+lAfNhCNnIYuL+L5Lu9H+LHV3/RcCdtz/W77s4LefA+JfqtH74VX6fL2uSogeufylG91cvpfPV6XrXcn77MVvD7Uhw1EI6ehy4vYgMtTtf5JT+b5TgaTqz/uYCGsN3R5ZejyegHRI5f3zTe/+R+u/EcP14Qu752gPmwgGjkNXV7EJlzeN3YRp9ZkfvOzhPphF/d+ekOXV4YurxcQPeHyvvmN3a1VHw6WoMt7J6gPG4hGTkOXF7ERl/fN/8mPnys3AzDi/Yf5QZf3wtDlbRiInnJ53/wNl/6lR+tBl/dOUB82EI2cpujyvvvLj1+//nf109ev/++HP+i5LN/++V9fv/7069ev//oh8V6EnI3+4W8/+kn/9q8//rWztN/+7Z9f/2fS/McPld3YLyFqComLlv4tlizL2zYxFcQOhGoX+f2P//z69edfTPF+zG94++6vP/6tK9Bv//bvr78Yhfzzez0RYcL/6JTo998Ldgfpn/EbpN9yMVigb3/4+nX1y9d//flbPVGmT4UOLW/Lb38wBvDr/77+46+NWe3C5Q0w7R5VEVQe8BXy579//b+fvv4zUT+/+/E//1399+t//lqvi+/+qvX8778N2h0K0ZMuz3qTrLH3Miy/jX73w7+//p9R5d/b6ixQS39QKyBbRn3YQDRympzL+xMarc+vSQsW/mQX6B1+/dFvTUk/8lucXf3qtiWcwq/fYRujw3/TBv1ioqZAKGRuN6aUBrN/lwDWKZZc3veRxCZWsluHjOhUQ939nGzdfkn/hZAJwjKsIdC3rjy1AX7vCh1aXvAHzL8d/iXVvwuXB3Hxq2bafarCrzxLp5A/YMm8xdHHd752fy1ssvqrn4jhlz5bsgCCJ1uh7XL+qUcuAw0Lhf9ON8S0/Cfv9vqk37sVkF2gPmwgGjlN0uX9ATMUy/++ymC7IbUp+jfd5mMzdoIxgl/dbifRQBsv4p/FSTdRk39nqVFP9pKipkA429L/gt/558v/Kpd16TPr8jRnYMrnlPa/8QgdYX/7zW+aXuD//t8/vrYdfEJ2v6R2m2mCP2sAsI5Av/W7zmLnMaRCh5bX0D4h/bNJW39K2B26vB6m3acq/Mq
zBAr59evf2wzUxP+hh//717/buvs7roT8vq00M3N2GstP3US1BMImXZ6toSiV4YZlCt86KFeVvyZXBXqm368VkB2hPmwgGjlN0uXBOL7+2bsR9Rd71zl+VMUuzXuDq29/tIEd02tstKXxIqETw+lvbW/0TydNOAjT4IL8X1LUFAipLd12L7n79H+Ui7+qKBmX9512O5iItPxRz0a73iDn79Df/fJDV/o/WPkzLiPsvIt3MdYR6Lfol35ptPft91G9uCBKzwodWt5v/owLq/90Yb/90fZ9iLILl9fftCtVgdQCMRyF/NzZ3R+s+v6hHuLXH9uMdMaXWH5vKtqb1f1gE+r1hAFCplye9Uzh3G0dw/rBSv8PR5Xawv8vUuXA9Hkv75WgPmwgGjlN0uX98P9S/bRd3g7bsbXg2Lb/aJpH3o+oF/klXie0Fww/hdesOTdOQnlJUVMgbJOFHWyn243tOZtE0y7PzhNTOduUfw2SbnX3P19Jqo/o2figpJZSY19boJ7aMwyp0KHltWOQ/wb94XftRHIXLk/oZ9rrujzhq9eza8HRvwc1gYFXbHnWC38NBfrmtxggRIImQAJRW/uNlSRKYG3DilSJuwXRvrGh6dPlvRLUhw1EI6dJurwMtiPxzQaLNL/W7+16DVS9SHKJRI35f36bBdYMe/VLOxE1BUK3LR2dTHLb5m/Qd7S9e7KLtB2hbOiMsXcKf/WTzusuPeH0StpQaOzrClRzED1IVejQ8lpbT6w0NwuBu3F5/U17XZcXOvVvvmm8etzp2zoNVu2stuPBoQHjxh4v04tT+N1fm6XHSHkbtHRdmfRnooPTp8t7JagPG4hGTjPE5X2HsH/VI2C7lh67mZwGql4kZa4GXPwlnaK9EZNsiQE7ETUFwrcS2ifNUx0EEu5631QXWX6Fi22pfssu6A69TTjtcUrakW/sawq0+rm/+rIkKnRoeW3B0uJ/h6R24fKGmPZ6Li+Vg1VfqD8Ab+hvJrEVnQrcjNXqb0BHEgl+CTaLCRu1dPXXbmrD06fLeyWoDxuIRk4zxOXZG89us7T238cHtQ30WzvUc26n+KRacoN9ivUXPSqyC1FTIEaXi2058X4NTCucUWeqi0Rhg5boYFdwvHsiBd1hnBt26cnw+ca+pkD93jVaI6pQw7DyQuc/60EI2sEuXF4mYtK013N5yRxwJRQIwNn6ozLYfTKwAUPH/9ODPBIqIrNYslFL14tuRQ9Pny7vlaA+bCAaOc0gl4eG7HY7NnaPlf3GRtWLFPbsFY3Z5tdnG8kuRE2BOE6+dsgZDorRzbj3zBNdpN1qkbqxZbFdr7dHo6A7ezNDDxqS4bONfbMCDSWqUMOg8tqJTpBCy0u7vKRpb9Dl4U3nSS8Wa8pmm9uhaM2g6gwQ6l/6UNuff9Sl42R72rRhWVV2d+fXSJ8u75WgPmwgGjmNNQ89yPE7tVw4AbfTQEPKD58cYKM/WC+S63ZA0Zht8yytquxS1BRRLLti5bc3TEG8gXKii4TAuUmJgLGr10sXdGebsB40JMNnG/tmBepLtkINg8prbSfX8b24y0uZdrHfTaWWzwF7NXu6PNsnyJMJKWzDCHdcRiCUV1/f6/bIqPlu2rBsibp12TXSp8t7JagPG4hGTlNweX/6N0whxDVjmH+8wzeBtVF7P604TSsa8x8QPfZDLyNqCkTyBLQ7B9zGg1sL/v3yRBcJgUsLSIjjVV1BdxtweZsVqEyPCjUMKq+19Fwn9uIuL2XaL+Ty2ofdCqRvozkgVFBfv7GP5IUr3Zs2rFCVa6RPl/dKUB82EI2cJuPy9OmhFK4ZY9zWa+lPbfRb/F/9lN/QUDRm2934DenlRE2BOJ6A9tXRzrZNe7vcb02JLhLRSlvjbM+kB6Cguw24vM0KlKdnhRoGlddKl6vN/i7vW6QTSBKCIoQDrKI2Uqb9Qi7P6l8ndUn+Xr9BgDQiLdl1/uCJyU0bVqjKNdKny3slqA8biEZOk3R5v9Uh9tdgGQhtwTVjzGCSDSmktVH7IFDyfUOgaMw2snv1JUVNgRh+S7ctsG1yzfeCPCCuL52dHepBCjx00Xe5ZwMub7MC5ehdoYZB5bWW7jx/6dHf5fUaPEGy8K0mRW3Epv1iLs96gMQzbENAGkF9Gex9Nf9xgU0bVqjKNdKny3slqA8biEZOk3J59jnUX+Pl+qjbwSp5rx2Ujo3au1u5JcOiMVtpnc7wRUVNgfBBS7d7pLWTtJO+cAN4wuVB4GwXbUBCfW/qb8DlbVagDP0r1DCovLZrzwkzwOVBknLQ9OJnURuRaRteyOVZUdLPKPQGacQuL7Wfa9OGFapyjfTp8l4J6sMGopHTWPPQA4tdeEsNY6Nux3ZQfTyCa6P2IajVT8nVkaIxh/vMX1bUFAgetnT7JKxt5rixED0Um3B5djyc36pjHannGwq624DL26xAaQZUqGFQee2OzfTjyINcnu20g0moD24ER+OrojZSj1C8kMuzDrsyj62BNBIuT+eQrpls2rAwaXNUuUb6dHmvBPVhA9HIaRJN3W7cSK1rRN2Ofditzz5I30atkSWf2CoZs+1sHAFeVtQUCBy1dDvOFH+L5hjcyzAkXJ7dRxa+FaLFzhb98hR0N9TlpbYnbFSgNAMq1DCsvHZ9K3Mzb4DLs9n+WrjHazOP+tiSNiLTFvJVYUills9hiMuzRtrnrWIFkGrUEATsYfFS36xhWa25WQ9Pv6x6sjPUhw1EI6eBdfv32e0jNKkWjU7DM2PbUtOvYvcIbdQ6gcRYGQETr0wy2CmAu/XqZUVNgaBxS8fcbvWd7dITd9JTLs8629xtd6QY9LuFjiDhAjLhf4eQqaeyNipQmiEVOrS81nyS3X7zWjk9rGATyr9T5ncYXcXjGptJT9MW8lVhSBU+r5BBLi//0qD+INW4IRjsdNst65qGlValfajfq8rh6ZdVT3aG+rCBaOQkqedz7VpnvA5gzweB4TKTc6Df/dcdrUeNUT8oEy0ZWmNOravYnsRbbHhZUVMgZNzS7RzzZ6xwpt65mXJ5WorkGyusSOHX+/I93gCXZ0VNLmttUqA0gyp0aHntFDLu9/WlA0E/WcA6qNxSmc0m9nhWXEMv0xYKVZEufF4hg1yeai/5cti+IIW4IQjWA7kNcT3DMsS60e1P/vxscPpl1ZOdoT5sIBq54TvXGuxij9+X29oO5kPdlxgDM9Y5UDAe+hYv8XGCJhqjXWIPfVBjzGYc6FmovvrXb4YvK2oKhEu0dO0lDckFo6TL074hfqH972wf/b+wS8r3eANcnt5sSd713KBAaYZV6NDyWmcUfATqT+gj46lBCX2AZfXPaKbXfJYhtSagkQw9TBsUqiJZ+LxChrm8xnH8JzWT/e6Hr6GDSID4iYYg2HpwRwxrGRb4yXduVmO/hOkMTb+serIz1IcNRCM3SE3+98/GUL79/h+2gwmb57d2VtN+MPg3f7GG8RPWBkMzVnew+vkfuifq9z9qfLdzTzZGO+/yfZANqKP6n1S07/5hm2CU+YuKmgKhUi29yTt5oyrj8rqPif/8jy
ba9/9sRI6bY77HG+Ly7Aa21U9WpX/6j9s3bU6gDIMqdGh52+/PflVRG7v637fYstHb5Znq1JRWv/z7L2o7v//hP83Jf8ddqMGK29e0QaEqUoXPK2Sgy2s3K69++scf28L87gcdfPSY/SBcskwGa0VaxWAdw2rm/j9rib9tWnNCvIHpGwqqJztDfdhANHIDrMUhNd23zcDHjPjQj8RmnAptRrJe355ujDqY/NkRoQn4Z+0JXFLvQn9BUVMgULKl25fXZm4JZlyeoel6Av7niaykJQdDXJ72SC1+D7IhgbIMqNCh5TW0rspB5pRDXZ6JYX1xzC/pKu7E7WvaQr4qUoXPK2Swy8u1FeHnHts6EDDn8uxkPthQMtiwzP/mG68uuedoh6QvFFsB2Q3qwwaikVv+bF/7Y/l7cjxqRkV/g1UpP/0dKxy4p5tsnH/07eNr1OyQWmKOZNvVr3pkcFrtn91E/5d9TujFRE2B3izdf2GimBkrYs019zq0vwSN739/S603GXKSG/ASplD2bPjf/M3pleMvSWxCoAK9K3RoecFv/ura/3//Ykv3rXTCpZdSJTGDf5+vf820JwHiDjJtQ7YqUoXPKwRricmFeaz5Zaz699F73/77D3duVqDQEAy/h8+Lyj3IsFTmP7tDj//7W0H/A9IHlVZAdoD6sIFo5D3B6RcIeUvQtAkZhvqwgWjkPYH9Anmj0LQJGYb6sIFo5D2B/QJ5o9C0CRmG+rCBaOQ9gf0CeaPQtAkZhvqwgWjkPYH9Anmj0LQJGYb6sIFo5D2B/QJ5o9C0CRmG+rCBaOQ9gf0CeaPQtAkZhvqwgWjkPYH9Anmj0LQJGYb6sIFoZEIIIWR/UB82EI1MCCGE7A/qwwaikQkhhJD9QX3YQDQyIYQQsj+oDxuIRiaEEEL2B/VhA9HIhBBCyP6gPmwgGpkQQgjZH9SHDUQjE0IIIfuD+rCBaGRCCCFkf1AfNhCNTAghhOwP6sMGopEJIYSQ/UF92EA0MiGEELI/qA8biEYmhBBC9gf1YQPRyIQQQsj+oD5sIBqZEEII2R/Uhw1EIxNCCCH7g/qwgWhkQgghZH9QHzYQjUwIIYTsD+rDBqKRCSGEkP1BfdhANDIhhBCyP6gPG4hGJoQQQvYH9WED0ciEEELI/qA+bCAamRBCCNkf1IcNRCMTQggh+4P6sIFoZEIIIWR/UB82EI1MCCGE7A/qwwaikQkhhJD9QX3YQDQyIYQQsj+oDxuIRiaEEEL2B/VhA9HIhBBCyP6gPmwgGpkQQgjZH9SHDUQjE0IIIfuD+rCBaGRCCCFkf1AfRgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEvFUms9mx/nydvHoBictodjY/n+nBvvJWbe6FyrVVm2D/sFGOZlP99VaZrgxXevAa8QWcns033nS2keY75WwptSXc65m95NU3ijXZfltKsGWbWK+u2OYziDYX+ntfeZRCpDkxl/fL5X2Sg9VIjzbDNtJ8p9yIJh8/z+dXT3vdbN6Fy9uR3W/bJtaqq42W/VLSih3ovZzW3wcHJ3LUcf/pSC8omUR2joix7y7vSQqR5tRc3i+XdyEHSz1Yj/PF4sYzrQ2kSQA0ea4He0RkEu/C5fWy+0g1Q9m6TaxVV0HZn1fKK0ksjv4gp/W3BvK5O9RrQiaRnSNi7LvLc1hIeYKV2v1yeQcH42B0NJhEeZ+dJgGwL/29T0Qm8S5cXi+7f7Yitm4Ta9aVV/b1kmjo7/KaQKMzXFxd6LGBLm8rvAWX92xee3n3GGnHD/p7n4hM4p24vB48WxFbt4lN1NXzkhju8gxHcoxbSxa6vCFMZ7Ox/ixTc3lHs9kss759aC712skzqgWUTPRngvhqD5M2kVLj1bQo9eSARJ5N9CCJUXtOWwHj2cxdw/Co5FPQ+6R3/knKNWoKN3wbXL57i5MrZS/XiqovUbPURLVFJvHsbrRuPWmeVSnVplcvV2RTvRWRK/FGXV5KPU6ZIAN+hVTaSraUNWMCa7k8e/PuWg9qLq9XW6/Wf77HMuVUyxIxXrfLG9+KjMInPVOg4PJG9oausOyGHmCCaMpdvs2NP3u3DW+cjJCJqZMuk9WXzgn0uNrYo3cgfGq3g61WTx/1ZE6Ucz1uwfkozUO3wKtHuePZgLBTkz4uCaG2fEYX8a3UtmiFfIS83kcfnVQfY296Kud9u4Xgt3qQTVmLd6zJL+xNdz+lSXzKgCwdcPcmSs5SMqgJmr7lanxwZv410WZybq4HQE64ktTKlai2mkn0VkBLrlY3Vykxhabn0ZUrPEjYVFo1CYoldgju6N3JOcd4Ub9f9ED4ICeaNl1Wz9XB2Ok93Macbitd2bOlLFmpzzNcXtdp51xera2DWtdb6rGCFif/8sb9CnArbJnRRwcqMTBcqGSB6nlaYJOR4YNeNOBG7+rhZn46v7V6zdwA0ICLz/O5TW51p1cavePscrH4Iv/N5WbA0eNqupnaAiGW7ku1s92cKEeOfxTspukgzY+4ZKJfXzapXuolDTvLaytgLpeXN/OT+TVCCk2jLuZT1Lvd/HU1P/k4FwW4DVxBbu0AQIBW7SC8kDKKd2xVZjCmj4bp3HLwUnKYem1ObTFOzlA0KK38xQL94epBGmrTAGsur1auZLVVTaKvApRCrW6uUkI0crLpeTjlCg4SNpVWTUS+xEmb6MDSnlMaW+VnemSQougEsaYeW6mmH1A13Ta9R6atdGXPlLJopQFrubxD5NupJJNIva0bSvWPkhZ6rLDF4UCvvUbGELMhY+cd8BApl2dolCnjaieQKKIbFowwp0x38afLKydpDCraLrzJ5EYXYMeeZntcjU3VMIHVXDXWPbps9mCVRIkXMbw0rendtdGtJ370/K8ho60A2Z+9bDpHSPtZD2r5lPR+KL+7kdqkjdMxQit1xILvtb1JqUab4t11C+VoJU5nhebl9EsuUgZ3ESuRXNmg0OwWWqApjjpDqLi8PuXKVJt7CbgmMUQBxVrdYKUEFO3dxS2Xd5C1KSdMmoodRzbhAEHbTHVO+KSH6kutE+ijnmvN0/YeKkGuXJ4iEqUsZRiR8VaQQ3+HgY7hadq+wZBOpEdbN/TpepviJfr3bIt7jdh7oA2dtWTIuzynvjGuaDQmOnh0R7TQZ5BEEsmrNXWbSbdwrYNndQA9rjbyeQcYU2YNscMTpeLy8BSRt8SIBZNmamZFzWkrAAOSTj6s0zRjxUo+Jb2jKwhGzCHBIhFajlVwsUZt8ZZu/WIZr9MeUvLU5xB2b4nkitlDKY43GaMJ9nN5PcqVrbbgom8SQxRQrtXNVUoZ395d3HJ5B1mbcsIkqdhx0eVhDPCoBwgoJ5q1vpEMEW3WPdQT9R42Zq5cniLiUhYzjOjv8jwevHljOpE+bT0i0fU6hfNMv9ziXiU6LbV4yy8psi7PrW54EjVZqMcb0cLJ5sa4LhjCNnnFmSAXnZQNuOoeYLzS54kfT5Ry/4YCB3p0tRaL6mgrB
Il19grVqVFX8inqHZ506XUzMeh4Wv8gMxU77i3XKIoXlAaj79Y7ICX9HZF0eV5yxewdH9AgZ3q5vB7lyldbdNUL318BlVrdYKUU8e3dxSuXe5C1KSdMilqJiy7Palb1IbdorpCcRhXHteyrHl/ErvfIlSuI5B9VMoxYx+VFd+XSifRq6yGVrtcx/UqLe53YZVxhWbrDaunn8hwjRT18WbhEwQOOZrO54OWVMEskbcc5A666B1htKBU6KUq5f8P9lsDwMNLKtq9Sk5ZxajcDlBFV01lW8inrfSJGK4elwmO4pu0KbcAGLqecqAmNoosr6LSzdzWSLs9Lrpg9bkz7zVvO9HJ5Q8vliRpd9cP3VkDNejZYKSnS9u7iJeUd5GwK5/R3gmqJi+3Ddkh2vVYmeGOcuMExeuM+rSGhHkSwlZQpVxDJP6pkGIHgkbdCyVMLm4fnWOHuVAQyifRq60rPrtepkUqLe61MPi+WD7cf06u8Pp4ulKJKECEmM+o4QSW7FFweWrxNaMBV98A3KZ+8KOX+DTdu7M8WdLV6PRbV0VYEblHrPtIL+d3ewa/kU9P7GAsSwqPfcDqwHmfvxWOIqr6inHKiJgxSxCU6pmO5nl9MCHURJ1fMHq3eN08508vlDS2XJ2p0NQjfVwE169lkpfgU7N3FSypIN21TclzIu1riYvuwImCWIQ3+szo63AsQnehcZLB6nN4jU64gkn9UyTACdhvfYEGV6O/ApY2gN+vblYzL69XWh3W9To1UWtwbAFUZtIW6SsIpeAa7efT+TH0vzC6vd5u0XS4fcNU9gCkkBz8lUcr9G9IMBvFITTunWNRSkz4WN+fQLSFU8umj9/G5mnm3JcYD6764MSLhmoWxcsqJmjDgLJqBDDkL7SHURZxcMXvMoPyuQ870cnlDy+WJGl0NwvdVQM16NlkpLkV7d/GSSqQb25QcFPKulzi0CR+Iajp6uXGHVUxZzpTgWFtUlQxWDyK4t+KicgWR/KNKhhGY1katMLgHjDQ7l4bpsevzgus+lbZeqv9YPU6NVFrcG2CwyyvuTvOB2p0aqbk8yUU3BA+46h7A0rxN30pRlHL/hqjB/UFYpzbrWFRHWyHSv90eTC4W96u727mnxko+ffVu7zan24FNzwwJxK7bvWHllBM1IaCjNnMb3F7J7huMdREnV8we3Zzn07wGiNS8p0+dq0PL5YkaXQ3D91RAzXoMm6uUjrK9u3hJZdL1bSodpqVe4kL7EOTynd0egprHphXTqOV8U9eD1SORtffo8MoVRPKPKhlGYCwW5Qeb6WoldGnoix1jL7o8Id/Wi/Ufq8epkUqLewMMdnkjuehupc2DtJ3eINb70h1OYFeV2nSPq42A7gEWQVJNuygKbhp4mwHcNPHbT9Mbw3nCAEdbIWKj3QZjj0o+/fUueWSyH8mU5NGuxbVrMuWU4+JZ7LAXW04zBQKhLuLkitmPMSl2R9eYRLQNUC67PQuk0qtDy+WJWjQJ0E8BlVoVNlgpLWV7d/GSyqbr2lSkGp96iQvtQ4BKz2SXhu43l/57ORKn024S6qGepesssL3SGxtZnHIFZQ9KWc4wBhUQPCIGb+IMj0KXZh9Z6Yyp6vLybb3e9Xq17NRIrcXtP9BN0BaKKrF3HJ563Dq1c+R2vKcvhfH0bhJq3JodsTRG0uNqsplaq+qs5vTJDi6LosS7Xrw07dNB3d2aMXTWtmAvLCg0aQx3V7fz+cnM4C+UVPIp6f3cSQk7uvxhWgf61UsR0JG4WKNx8RR01NJIu904CUJdJJIrZo+LbTcwsvrrGiC6si+6emOHRd3VgeXyRC2bBOingEqtCslKwTxhcKU0lO3dxUvKPcjaVKSagGqJC+0DSCUv5U9TALnNdSNdcTcGrtetudy4ixEk0N4jV65Ap2EpixkmQIe17LRwcFh/7s4ae1tNaZfXq63Xu17PetwasUI0Y6+oxe0/MIagLZRV0uwYuv98Id32/FoOvdAN9gnQp4vZweQDqk/w9I4RxdPlfK5vx2lXsntcbbL0pYXPM12R8SpX9nENWGlRFK3lu/n8070dGvlp2rcirB4/n87OL/UZkNacgrBCqfMyidwAAGGZSURBVEnb53Jd7lqjLudT0DtEuD2fTaezjwjUPEgag+GewZOvVKNx8RQ7oy53XrEuUskVDcp2oKub+Yc5hpvQX9cAcW71cD3Xq1K57dVh5fJErZiE0E8BtVoVNlcpDWV7d/GScg7wM2lTkWpCaiUObSLETjScMHD/BvdGV1U96OafrsLeI1suTxGJUhatNAGmtsLiqnkBSrD7P3ZpWI9sH9pvK67juFAvLvWu1xPdq5GwxT3JuPJ9uzzj+jGIcMi96O1cjVd4OkUygd6xRq7cdg2o19XEgQH7IFuum1gFUYx921YhYBQWpnmE7VQtD+6Kaxi23KQxaJrPPmDr8K2ENOK0gpTyMWT13vSZFv+B1gDtgoIKy9doXLwGtM8opYBQF+nkigalczdwE73xb+qo7HYCTThXh5TLF7VmEoZeCjBUatWwuUppKdq7g5eUe5C3qVA1MeUSl9oHUG/RbYe0A5vAwVbU41/veo9cuUKdxqUsWmkC+06vju5Vo5bY5dlTzStWW6fZcda7rde7XoegRtwWZ9qU3IJ8Qy7vwIxYIrudhK9nlxdyB4OJ8YkZWJvB9WmlxR/O5pfX81PEHjt5dXo/vPi8mM/9qitf9QWMpD0Yncw/LT7NTwKRc6KAqTzA0i4axGkeTM9Mopfzi6i8fbSlYGLgLURgMOmOXvP5WLJ6N/LfXM5PooWQECNu+t3ymZQTqlAwIAkmLCGRLrLJlQzqWOpTNSKZ+g1wci4l1271uK+l1qutahJ9FKBUanVzleJQtPcWL6kg3axN+apJki9xvn00SAjvUzA5/eTVY0t7aGSYz6OYyXLFOk2UsmSlKSYS/mY+P/HdteUo/syBFLSsGyFbLx7Z+o9KGtfI8XxuelANZeSsy0RqxEMNl/LVvUVG88F4ScbD+zmEwjYDz1vvBMn1dSjshRRACNlD3qXLwzKyNz7Ekk1py+OrBe/Eftr96E8U9ipc3kspgBCyh7xLl6cPon/5KOs0s4tLHD4m77G8enCjIlgJ2gWS7atweS+lAELIHvIuXd5BdEt7teh5U+C1gelp7lmIbQKl6e+X5MUUQAjZR44u5qepe7qW8tU95+h0bjjLbSjYC47mH19G/unH+dlrUNyLKYAQQgghhBBCCCGEpIkfT3k+G09zNDubn1ceS2nZRol2zVsoAyGEvDa2sXHkmWlOz/yHSc/aFwtE7ypP8Ra2wqxXhlBxhBBCPF6fy7Ov+emef8JWucfP8/nVU6+de+/W5YWKI4QQ4vP6XB5e67TUg/pbns4XixtvbvNuXV6guEgxhBDy3nl9Lu/gYOy+YhNvMNHfKaK83q3LCxT3BtRACCGb5TW6PI/grd8RUV7v2OV5vAE1EELI8ziazbzPFLidq7mWe4X2obmUfVFWKc0Uk3w+Mc9yeYUS4e3tuWeLsxIW9bAeqSSdMkBO/AqpqNFRg0+h
DP1qRkQqKGEa7TbdpNakSvUnIYTk+dTufFytnj7qSe1cnW9DOR8ZBxMsLSp3QWdWSlOxH6Kyn5Iefey+SpX7/FQXWz9e2BLe0dMPmHWgW62X6NAtUfBZq7yERT004OPITix87fKLHgh4HX+jqGySWoax88Uu13ulhewUl1aMIV+GPjVzcDDW728qN44fQ+7Tg2MN0Ow06qU11ZP3Oi850SSCtI2Mjjq+5EQkhBCDdj3LxcJ+7lc/wIjexH7F92nRfOnR+cQj9kSsHm7mp/Nb25s5E7pSmtpF28+C39kL9hvmV/OTj3OJ6vbiHV3sqde/rpZhL3fk+FvBPsNQK1HzJcaH68tGbOuPhayERT104Ht4zsZSuED50mODM2ktJGnLAOGNclGW1eq2mX9lhOwUl1ZMKcNeNdMksPg8bz78fKdXNPdjFbXRQU+t9XJ51pKMOvQDq3f16Sgh5J2CL5+srppeYnTZbO1DbyKX7CE+idvNCtBld5OkEb6237iPcppIcASnqA7PfjG1m3FN0n1WFxs4PiKJH9pQKRE+RHzXHtqOtPkYf1bCkh485Cv6XQo6T33SQ/W3dmGulGRThmsVwH7oX6XMCRkozj8yFDLsVzMHp8srZ16HorajhUbkprKF3lrr4/IMN5q2VYczsiCEEBdMZ1K9je1NnK4R4/2mI5M+69G94YVuTru9cpqSpHUvbR+I/r66IBX03Gu6vFyJ8JSft5SJ1TJde8xJWNSDx0gmM496AOHlxCc9HskowYpWTNKW4doeAHys38bMCRkozj8qZ9ivZkJkPNNWjRV56eqkv9Z6ubxIHZ/1gBBCPDDTST7cFvSTBvFk6gHgLJxVOV24s2eqaaIfdQf9Yzmx9PxNgkCi9VxesUQX9ncDZqK2I85IWNRDACZ22nnLPbUrRNZ+XnrqJaZQ5STjMqAQdhadU2MQyT8qZ9ivZkLmEqvxYMjdvWs5SGs9XF5OHYQQEoAlpeTegbg3cXwMZmlfFi5d8Eqa9zKheXIcnmGCiaFJILuNwRBItAGX5yRxLReDHX9wzdoRpyUs6iEEl2y5ZYI3xokbHGP90OZUTjLRxyOCvROWUWMQyT+qZNirZhqOZjP5GhLu+vkuzxN5iNYGuzxHHYQQ4iOdfvp57qKDQBcVY2cDtTTt1oXAvYyxsCg8Jkf7hkCizbo87AixP1vQ3bYRUhIW9RCC/LEIJ7Ogz+rosAAs6ehEqJxkXAY7pWpyTKoxiOQfVcvQo2YMJ6h1l7zLG6K1wS7PUwchhLigO8vPyLzexHEQGErnbvFU07SbDL5EWyHG59pvpu/FBBI54iTxQxuKJYLQweQAW/q97jaUsKiHCPTGxtXLjTusYspypuSPlT5Nppxkoo9HBPfGWKTGIJJ/1KsMlZrRhx/uz7RK44VNT+QhWltvlueqgxBCGrB21zwN5lF0EJio5cb8PdJEkORYfCxbG9I9ayCRI04SP7ShWCJ02sENSCx2RmtkroRFPcRIfnd2swZ6cWxaMZqS881GlnKSiT5eIkdfkvDUGETyj3qXoVAzUJ5zpeLyhmgNsRvdADlRcHlJdRBCiIC1tbZ3col7E8dBjOTiMjOW7pPmCAuJ96k979KzJl1ZIJEjThK5X+btmiiWCBd9oeGXU118J2FRDzF43PxMtoTo0wniK5YjcQHtXs5ykhBz6S4KY7Oj5xMsjhqDggeK6V+GbM1gpdK5O1txeYO0JqMC14VhFue5vKW7ORg7NhPqIIQQA9bUukekDk6fdF4T91Suj8F601NmS0OvNO1TcXYyeO6scmGPoLeS1RBIVHN50S6aconsk3Ldns0x+vHG4+UkxFbDnB5iZN64lD/NfFI8/4306l23XUwSZTCXG6enzzfag5yQQcFDxZQy7FczeJyjnSKPkUHB5Q3SGlx6uwgOjxa4PJNUoz07E20fg5cPKrbXCCFE/ZOZZdzO51f29RW2Kyo7iGYn3/3ni5lhfi2HbeheaY5wRR7Pwunb89l0OvuIZBep2V8oUc3loV9d3c3nn+7tDKRSokPxPEagz6ez80t9kUfTjeclLOohYmzz6DLF0MFgN25aSklaQeTP09V8rm/50shZIYOCR4rJZ5gvt4e9Oft0MTuYfMAsTCi4vEFas7tnHq7nc/x6kAz8WZ78eboM1aE+umgjhJD3x4X2w5ZrXaCKe6rAx7gvNgTuCxh7pWk9o5noYSGs5SGzxTyIXXN5+g5PgEW0aomOmreQWR6c+UFBwqIeQuxNTOcupu3Q3VVBQz5JlMG/ftvGzQkZFjxUTCHDfjVjZoNOfT+dQrEllzdIa1OnWm4nECm4l4ebg0qnDnvbjzf2CCEho5P5p8Wn+Yk7hp+E772X9+QHg/zxiRl7m/H3aaK/6pOm80GDmUnpcn4SPLvg4cdOiBMxlefE2sW5PiWanhmpL+cXiQIVJCzpwQffP3AdnHyjYECSJrh1U4dGzvk8ipkUMip4oBiQLUOPmhEOZ/PL6/kp9Dl2v0UR567019rkXETQIchxV2mdOz28+GzUEbpkY1/VtAkhhJA9ID2DJIQQQt4cdHmEEELeCXR5hBBC3gl0eYQQQt4LRxfzU3+3KyGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBDy2hgvV6vVJz14RWxRrkrSd/lP7GxfWdvIYUNpFvRCSE+234Rcdpvb2+R4HrHfn2NZ6yNqRxLpVg+2wxY/7lZJWq4u9HfA9r84t40cNpSmpJLRy2tjdKpts+VMr5T5cLNYXI70wGV8drswxX9cfDrUM89nB43oFbL9JuSy29zeJJ9FhQH7rdG1jOIKBZ/q0VbYorFWkpardHkxksqeuLxbkdVnppfynD3akJFZH93bC8ryVM8/kx00olfI9puQy25ze5PATK905Kjst9GuZRSTB9PyP+qB7Qw3bVdbNNZK0nKVLi9GUtkTlyczstWXhcPn1NzN4fxJooCgPY+NrRtr/2x85uSD9aVPG2nyQSN6DiLUVs1ybSLJnmGM9ZhRiO032DcPXF59wLhHbMIoNpBExBaNtZK0XKXLi5FU9snl9fdLR3LHZ/V0kYoH5S27cx9xPNGjV4LItFWzXJtIsmcYYz1mFGL7DfbNQ5eXYgNJRGzRWCtJy1W6vBhJ5W26vPlqdXcsP+J4I8z+3Nv153LiTg9eCSLSVs1ybSLJnmGM9ZhRiO032L3mAmO9xw96mCTj8maPq9WX4PQnc26uv01DuUHqq+WNfyP9ZrW6NQ1qeo2LHw+OH+KUPi9Xy8/6u8PEeHQrc2RyuHdvM8iJpxM9yErQGsXhlV3buQ3u9I8/oyMwYp13i0NtXlNZNZKrj1g/WjgS5Yock8qilev4Csksr4/sBYeTK7398nTjyGYY35peydPi5dPqoVlEitvBTMt+LZUvP2ouL6csQ/9iT+ZfEHL15WPbp/YpdimHYpqW2WK5eqhbptBXL5ZUSi/RMga7vJY4nvGG4ZY/KLhrVy4q88EMQq8eGqscf8Tq6OrGq0xT8K7
BNlEPjaUa/Hov2PNaDTDZphPkw2WbXkdasqqBj86apO9v2yIXSqmsmZuh1ADeNM4d7ws9lSDj8rAXduWtdlzIGU3pg+04Gpz736iS6bHVufQodlTppYRhZbzmj8boiGp31jgxcUI3mFUkuBrLDvSWq86IL/WUpWn7XSeKdR4XFSCfYUQpC18ur2JOG6U1XOoFw0yOu17VICea/rqTHvh53OFq0eVllTWk2NKLuWjPXS12KYdimvb3wRTx79QfFlPrrxdDJqWXaBmbdHmig6Xfp8PkHWPrsDIfaY8NHozAKHBDJ65fLXIwnVrPqHT1XrDnNRpgusHFZMOVml5HWjJb6pyBT1zdCUvbgeVK2bFOboZSA3jjeKpO1iDILWyeyPkveiBA17aDOJafjcFOMUa8sQcaTDaEPZ5OD0+vTW0gtLtugiCpvczSPJbtUH4swcx4XA81nu30qhIIDxjjHFvraBqmGMSTus3pZdvqEKlzGv6RoZRhSCkL4U4G1JNP9re9Zpig1T2d2PKfWMNth95DXB6qbrWQjm48bxpAyeUJSWUNKbbtBm+xnDb+KNZne9pKsYs5FNO05bWNv3F4xdSG6KWQ0ku0jMZ1Hc9mUddYJHZ5cuJBfyuHci65sgmBxDAXMpeYWf8lXcZybrzm2FZmOyt1qkUPRMvJei/b88AGmGlwEblw5aYXINc8yVBQIVlQM45fXmn/OrLXupTlyEsrIgpRzq3cAN44sKmWpZ6NgctzWeo6KEZEzvhBGqv1R2jzzuoL7pU3WtcqafogAR2XM6SSigjGmRbI3FZwI1nrj6X5PuFXHwm6DA9hw7ZYp/LTka0BkTrb8o8qGQYUs3DbOQyy8S54punO0coUY5ZzPRrg8iDrfSuATpTKLi+trCHFxhTd7a1PdFGlXOxiDuU0UV4UrjW2YmqD9FJK6QVahnVdHU8fk6FiIpcH9xYUGpIFbtASVp7OPJ7axTQ87dBMJLpqaQ6y9T7U5ZU0m2twIblwlaYXIJdcySoF9YFzzZcyJgpRzq1ogG+dkRS+Jf98aOTyWm1iSNe2FnQ/6Akx+3LUqopWz2SrxDcXdC6t5+pSisANA21OktCDSOd1WRgh9ZDgi9sloH3ZJo2f3kKABZE62/KPKhkGFLNw5UKqav+4+Oj1Yzi10oP+Lg+pev2mm09IJJSrrAHFtt2GHngUi13MoZKmKS9MqfMhxdRwsa9eiim9RMuQqYc8o4AODjRD0zKRy0O/73WiBjnXWJqHrTw9EJCeMwFCHTVFbqqlO8jVe8WehzXAXIMLyYSDoIWmFyBXPP2VCxqA7rYNKwdhXfhEIYq5lQ3wzQMHr9x7FeqBOkjqBOPBe/fALo9iLu3dM53IGR2sokqCasRNiyeVASl51dKB+yTawm7Mz1Oc0F5COhrrugdLgIZqe0ZJ1lz21noMQST/qJJhSL8sDFIiHVwjh2DZHee0avq7vA/y2x9jyplME0woC90wlDWg2Jj5OF1hR73YuRxqaZ7JX3fSVExtkF7KJX+BluFhF6yaoNNZR7TqGbk8KC1c65JzTXk8IplRRDc9OU6ZYbneh7q8smYzDS4iHQ5pF5pegFzxilUuaABGNa0AUcyIKEQxt7Ka3gFnWKMyxox7IRnyLs8uu6hZygKG6tXu3YrQaoirxAAD15tyUkPJ5iXg9r29q2R+mBG+rKTYoTnWVGyTHiwBCqmjYtlxB+7PnIFAEMk/qmQY0SsLg2OsUddkwAhN9d/f5aFR+RUqZ/q7PLQbKGtAsVPyW4rFLuZQSRNrap5OiqkN0kul5C/QMnzgv+24BD6sRVd+WyIdIsOg0JCsu2fuEMnsd9kGOU6ZYaK4Tr0PdXmV+kg3uJhkuFrTC5ArXrHKBRXOra9taTOTAy9mRBSimFtFTQQUXJ4d78NfohvUmoKJfNIXtbQ0NhRXiYCWguboppRANiPgfp3kImJJhcpwFvM9XZUYLAG2enaO/9Du7zXcNJYfRPKPKhmmqGdhcIwVmzCCMTr6NR2h9Xd5qFB/kCNn0l17SlmoKyQwoNgp+S3FYhdzqKaJnuSxM6ZiaoP0Uiv5C7QMH+gGyx/TG7uHHdw4c14AkdxEsfYV3NrHYmeyg49kfiGXV6uPZINLEoerNb0AueIVq1xQ3YJlWEoFyaVOf3LgxYyIQhRzq6uJlF3eSKxDtIl9QM1dCHfKFBNXCUBlGLuCqfv3Mzxwb8A0BbE5NAAYn7ERybUZAg+WQHIPd/CcwfqaogeR/KNKhlmKWRgcY0V/F1QDzqmuGr10yIlUX9NNlFvkTLprTylLunOrrAHFxqBi+MJmMYd6mnbHf7sWWExtkF5qJX+BluEDb9/j/hVS9/woenjf0jA56Fd5m3J5ZXu2R07knpboN7g8Xrha0wuQK16xigW1j7R0FeXrTw68mBFRiGJuPdX0zoGWcjbSDHZEq+36P9pm9qmHuEoA+og7++hEcgmlATdwpmiYdlFaeuCFbSKNmEMlQL8U7+BBQG1lQSQ5ciKUMyxQyMLgGCs07Xe/2N1ld6hqZO8WjBvBTRqF9dbHsJPJT7slFspR1oBiQ/7kduhisYs59EkT5quPOpVTG6SXaslfoGW42M1p3l2bNLHLi2/mQTNpPUQyb8rl4WrWntdvgEg3Y+suTrha0wuQ0F5fUixo2L3GLi/ul1yiEMXceqvpXVN0edYBzWTM09xjN2CEmbvNHleJBbXxSXJzUkqAG/o30i61QSDBE2m6Xap1CZx91AdjrOD7a1pA/KraUyC23KB51N9CMcMS+SwMbh+Awbebw0ik7l59iDtX3Ugcw9B0X4M6c+weNZzrBsrKGlDsSP6pDm3LxS7m0CfNQwyj1VaKqQ3SS7Xku28ZDvbp+0RaEbHLi7SK0WTmHZuRzJtyeWV7fkYDdBpcESdcZGZB0/MJJSsXFHLriMww8usjSitiWG791fSeKbs86BOdihvE3iV1llWOr56aVYC4ShRr1IaE73HBKFSybAyjiehuMK9KYFja7/6NcN9FW9fYjKQbA7Tmp60uEBsd2gIZWjFKGfr0zcLgGuvI24V3MEWX4GydxtYr+wang0Mka0j2NfZJnaWtsKNmV3vR5aWVZehfbNtxNFsij6TtWQnKxS7n0CtNrH8urVUVUxuil3rJd9kyLrrHmU0dXdhtab324SVcntVq81aOYxjaU6Z/j2TemMsr2vOgBphtcAHZcLWm5xNJViwo9sE036lovtnU6i9KK2JYbj3MluhY18dZ/sHGaYPfvoLvbQlNgLhKGrSTSd4md1H7b9Ow72UK7sPUJNC+u6HZyu4/rWhMpWnqodgYLlmsqRUy9OmdRWCsBzMtZ4dnq/YZ8YaHqXQ+aZdnnUDL05GMIkouL6MsoXexnZ5b0YdNasUu5tArTTvnsQUspjZAL/WS77JltFsgOpLffW3BzDGkLSpGlS7+LU6HSOaNubyiPRv6N8BsgwsohKs0PZ9QsnJB2xfMWS7kWqe/qJQRw3Krmy05GMcq8jSK3Vze06iW42aYbFjedJuCcL82qeSR1E0ipZAJ7O+x63ex2O
4v/AslCWT7RXvdvktL0ffjCu75SOxJ4wa6fHMZhvTNAl2dp4/zrjIeLqL0x026eHWvPLTRrOBESX9oEkJQqcPMYk9ZWaBvsU1z7Arevey2R7FLOfRKEx1x01+WUuutF1Au+U5bxsm10ys/pKcxDvYVWgHOou7kEuvXwsMnZ4QTEskM5+v6FEmnuQXpha7Xe96ehQENMNPgIkrhyk3PI5CsVtDWgpeXRnFmit29UzFZyoCBuRnKZkt2iDTs1/ZdLkJeHrYMQt4emKJXx6eEvDfYMgh5e2BJhBtoCQlgyyDk7YEbcj1fp0TI+4Etg5A3iNx9f+TtCkIC2DIIIYQQQgghhBBCCCGEEDIAeR2R93Q/easc6jOrzcunCCHkvSF9YPYtTDsCb7otvRdjU+wso2exGSnDVHBskfcV7YcqCCFko0jP99Iuz774M/PWuU2ys4yexWakDFKBi3uwh/J3P1RBCCEbRTq+Hbs8ydJbS508rFZLfYH+VtlZRkpU0l5sRsogFbz5z9kbv3VVrFd2QgjZJtIzvbTLe7O8opLKa3QzX8HcDu+nlgkh+4P0THR5W+IVldRM6pxvjuyA91PLhJBXzezKfpnk+oM5kB++yzu50i+NPN2cO1+iuF6tHt0ubHSztJ8BaZATT/ZtuTcr+zXIw0vktGy/AzNdGOTUo/xYLDRBL6kmsn7746ERYvxRuu3V6qb7pnfDuUlAWN6UPiZiCGQef9av9X1xS+qSLUlHWl+ZkoYkJQg1O7lc2PJ9sWkt9A2NZeHaVCa3JooksLSx7fJmmMnBZK47Or98dD53cjA6a8p3f+t+HrWQfc+yE0LIthnbT08rd/gEoePyTm3f2tG+/xYfKXO+v2s/wOncHcIJfIcYiU6n1kMpNqZ8JcsD8b3vINrIR+0HvgzydUfvs5nePagP1oE3lLbg+x9cxCeIW7yPUjXkS6Jk9ZUuaUBaAl9K/YizD5xLWbguFXyq2QHa8zM5GDef6VL0m9ETtxqEZfOh6VL2vcpOCCFbx3Z/C9mpN543vqJ1efZTk08ndpR/Yq83nzmRvq373OEY19rPRWoPaHtK/JTIdxJ3Yr91fYdLghx1na3B635xIHIspF+f2R5V9hcu52YeNLaJNV/xPzg4xrF+YXoK95D9/KKfkQj4pD349DL9kepKSYr6MsihV1KfjASelCjfvZ0/4cG6x3Y2WBbOSyVe2PQv2wHF7bH8Hn8UP2f3cppxzvJK53Yjm0FbvIpu5KhQdkII2TrwePed37Jj+8blYSP7nbPEN8UgX7+SP5PfbSdmN7mvVu1al6xk6f4IdIatHzLAE7VTMznwOkOv+w0j65ThqV20w/eMm6kcCtQ5wIMjcUJdvxvgZoSvXLeayFAuSVlfBjnKd/s5CTx1SIpd+cQ1LZsMy8J5qVRcHmbw7kN6J8kFYnj4dnxUzp4ujxDy0mBq1naZAs5oL4Y+rJtECLZb0wPcW1PPIxcexO017gW+R2cAiPXFScjNpZfLc790j7tCztQJjka9ABJ2PJ7K4d9z6nAzggf3FikTFEuCiwV91br9nASulPjUTPf0HC41uigK56ViKLk867rt7yIY5zQZlrOnyyOEvDToQNtxOJAz2k1haSq4E4Zz6kHGMshXZ3QjQXFC5zTSpTbzhKC3NXgdbnTVCx9FhgjuM9Ny7Irczv+EiZxJ3pczeGlLEcxR8WnsYklq+qp2+xkJ3EwP5Xfn772JYVG48GrJ5eGWojOoyILZYCNuOXu6PELIS4Muy58CyRn1H5hOBf0vJk3N2tW5HGDJS+4wmWmBLDraSSOWH5tdCkM7Qy98FNnrZwU5VpHtps6IXFfrpz1r9mbcn3lTNYdiSar6qnb7aQm8TOEWcY/NgBt77a3KonDh1ZLLSxWk49x65ha6PELIfoCFqab/tMgZ9R/YyxlsrsO8sJ00SQjcr5NOUlyndHKyroj5XrtGN7Qz9MJHkUsuD531p3lAfw92eCWCCzfJOMWS1PUlB17smIQEfqbWqz8tFrhV5t4pLAoXXvVqwOBeThVEaTfK4gEHSYQujxCyJ3TTtBY5464SBrfBcK7tZXHTx8xhpGNHpOZWk/jSewQRhnaGXvgocsnlwYfL44X9iAUTztCThyUHxZLU9SUHUXYJfAncTEeyWQeOHdy52RWFC696NWBwL+PxktTCJkYyzu1GLmwSQvYIu+NdD8BIzqj/gP/S3wr26DnvqcJdnymmBfYOmmz5XFhX2PXGPTpD7x3+XvgocsnlYQdI++hglVgwC877JbcUS1LXlwTo97UCVwI3U/FG3bzOpyhceLXk8lCQ1KMdGFE4Xnaoy+OXGgghLwlcluMi0Ke13TYWuNz9jyO517R0lrxG8hzWzZn8sSfQ753INMTp/CqdoUxcHvU38MJHkUsuz97q8rZslogFU6Tkqf65XJKqvqKS5nEkcDOV7G4z67Rl4YKrXg0YvMtRQaZ2Zgftdg+fj7ybfuXsk2U/6tIihJDtgxtCSztwP9K7Q63/0Bd9NH3fFE/A+bvwxdvhQfGm48NKnsF99KHSGcLvLhDepuKFjyIXXZ7e6nK2+h9fPeWmRW7a49XquumAbVeeWtkrl6Sqr6ikLlkJ3EwP7ePtLXc3fSfTwdWiy7M+r9nLeySeDtlgM86Tin6E4vV2eXHZsVfHffyEEEK2jH1PWMPTkfiw1n8czIIeNl5V0/tKbVeHeZ/BvUNY6Qx1cz5Ab+iFjyKXXV7bEzvknlFw08aKroO81CymVpKavsKSumQl8DONN6UudQNSWbjgatnltSOXBvVMx83eGsuFpNLT5aVr2ZOBEEK2zofGR+CtwvKkl7ekd965kIcLZ8aiTNDJP3YvDcGNIP9OEHY9eH5HpkPO+H7SvNHRRvPCR5GxadB1SDJb6F50ZjhupquG5U12v2aQtr62WnjIvY26WpKavoKS+mQkcDOFv3ee/fsgbkWnY2Xhgqsyj3PljiJPO2Hc11O3p5eXpg5MKu0756q6CcuOVfTsDJwQQsj7Rubk3tty7FTK24FECCGEvAFkQuu/CQxv9eq9X4cQQgjZE7Cu+dCtM06wKcR9sSUhhBDyNvggN8x8dL8tIYQQ8tY4c7Y+Plx679AmhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQsh2OZpN9dd6jGZn8/OdfV1nt7m9Tiaz2bH+JOQNwFZNdod8IGehv4dz1n5c7l7PbJPd5nYwPZu/xmY4FQVc6cEGeaXFJVvmpet9x62abB18Vfv19iUi3douD9+Ve/w8n189re82e7Pb3A4+SXar1/c59C25vHWKm7ZtOfmAX4/yM81J+aLhUA+U+/Mx0ixyfqehLXdneh6c6FmXuTlfE+Rt89JmvuNWvWFqLUDI2Z3DyNrgtR46hJHvPr3+cemVCPpMMc8Xi5stlVSkW9fULiTyuR5snd3mpvkt9eAVsSWXt05x07YtJ22Df5KfaU7LFw1I3Oey1DOPFxrK50IvJxNcfTLna4LsF0P7iqDet
9jVJNl1q94wtRYg5OzO4VpPe0M0kIq8+vK8e1HbJq2UYUgSW1jKEiTpdV0e+hj9vX12m5swfpUfRN/WwuYaxe3T4BXUXq6lJi96iU8u7PJXdto1tY7r6WM3Fxyd3copPerVEotS7gdSgmHW4dX78OjPY/eteqP0aQF1uztDBPkT2Z4feXSCaaXhNa8/9GhoVSSJV+jypJKivm1r7Da318vWXN5w+jR45Zkuz4B+ITEOBrYnSGjlpJuo9WiJ79LleTwz+mD2vFX3aQFVu0OLPsPf6HZmIjJOre5fai16OptVbjHkC2zizpJyT8ILkkTSDg9NyEwLHeUvIZ5u+pOkay5PkppN9MCht7kemfj6M6RQAoNRUbs5MZdbTroGN40OI5KnYqmMHjeLhGy9HYxns0P9WaFYOz5GUm/6lXJ5peQia7JkTgeEmfv0afDK813ewUc5casHPvaG0NotsWWIy4vtqmTLcq1gpGUKCffvK/oxIPo+t+pe1t+jkfZpAVW7uzfXjV2fS7hgxTMZeSQRVo96tEvGWDcRQjldfJnRX00Pxp/lP1i6U9TRR+fGwqN0n9CDS1MHEzRP5c6t/vFn7+7EjV9rEwhkuRrXXN6hm8vqsR0yn+qZhng1HiU1JbK3x8EX3x9kS6BKOtZiLLK55aQzRGl058adBh6gm2Npd8CtDIRuWn8TNVNvB6MLT+cg7f3KtePzqd3MJkt2etKTq5hcbE0gfdpP1pDM3CfdmOXkNlweJEwugqFW7vQgS1paj6KUQA3BsytDvjVGLU5mq020mZzztjLICbc95hMe0FeU6Oq9T3SE3udWnWsUPr0baZ8WULM7sd6l+F/4kw/2ZEM6MnZp9RyZbBDXQpYZ1Rl8mVE/M1TG0wLO2tCW0m7PuZqffJxLtUuZjpyOR9CZL+76rh5u5qfzW1s77WBcLy0+z+cLW+tuZ/AFZ1aLhd3bhgBuE/PAuNrwcH250H1tl/aK3jppSJTfWiLSXy4Wmu1dN7IqlABRO4tdZHLLS2eI0mjOnSCth4Xq9fPBHP9VH6tVO7JE6MascJCtN5vG8mZ+Mm9uRJuuQK/5lGvHQ1u+0Z6Wzo5XPblKySWsqXDaSzabuU+fBq8gvVzPkbwYJY6yprQK0ZMCetS6HkNRSoC8Arsq2XLc4qSHa1pczeUVEu7fV5Tp6r1PdITe51adsX6fUqvy6dMCKnb3QS7DKY9Eeji/jnTkIzlbNtQtMEauDfkhpi8z6sfQqNreoFDZsS27G5FM2sLLea9ypCa7scvIGx6cLq8cXWAA3NkMTGChCU/VILom5gPR79rErKU/dlUix1Hf1tCU9Ea7ojFit1mVStBEvfN6sTC3snSpNJpzn/UQMQxP2iwxiHnyUmi03kRN1hvW1ZZNq5qI4WoWMcXacUEyq6tGmtFls63Ok6uQXMaackbmJZvN3AdV8FGWoFzk5BZc3rEcJ7sOTDmyGm8JE0zQ0+UZXLsq2XK5xVVcXiHh/n1FBa/eq9Gb4u9rq86qzaN3I+3VAsp2Bz+nGof385fuM5ExGOg2I+8G62gbnvRsjC+zrR/HpjDqUH1igJOaL8p51w7Fsh7dhW7USrqpSitu1Y97Hs4OgLFtkHoUgMDeNm30Ld04OzRXD1tS90kTbDFQwyyWwEZdBuUJcqtIl0rDnuuMd4yOfdktvqACkguINmqm3jD6afs4a7jOML+EVzseMOou0Q5PrgA3uYw15YzMSzabuQ9sO0lUpqIzSV70G84hus/0UwOBVU91cgAeuu4iljZ4OqoiJbCG4NlV0ZbLLa7s8koJ9+4ragTmVIlui7+/rTqrthL5RtqrBZTtTuy6vS0HZXo3EfxW0ILTGTe8PXRWb8k7XF/mwMAM0j61RtF1LhOtOoiE7tbbugb/m97Mhhm+2gjGOP7zjnIm7fKQS1Auv0uouzy/8UhR7WyhXAJEjVaw/Nxq0qXSiETCKqTTfiZy3OjHCx2Xxqk3yNK1IxQlttIkbu14YBqZfF4pFqXDTS5jTTkjc5PNZ+7Tp8ErRWeSvIjE7xYGeAnBnyC0oCV21uTfIyp1PZFTL0oJoCbProq2XGlxRZdXTLhvX1ElMKdK9ITx7VWrzqqtRLaR9msBRbvDGmrXWYg6PJeMyHFngoFAaQvJdrArvoIzpIjwZY5Nxq3ziY5Pr4L0cE5/G5DkF+kMWoIQhqPZbC64FoM5vl/fcibt8mA5gbIxSGqNeqjLg9yY/pRLkIhq8HOrSZdKIzoX2bIcN9rwQsfJueLIuLIbcslItbadKlU7HpjUJM0qVbJkchlrypx2k81n7pNuj3IyMotsQYXkRSTucJsdnCNkNGMzyPlOkrS0HkUpQaz9oi1XWlzR5RUT7ttXVAkKVImeMD6IuTetOqe2FNVGmrMpOdnP7jAkclwXxghu48lExmlvdLEjJp8Xy4fbj+kFYcWXOa4zv87HmNkLj06B5NiJhCqIaVvWiaTpofUFWfy6kzNpl4dNGvq7AU20lcQXPSBhnbBFSFkuQZ/GUZPu+Y3DCx0n54pzKD5PdzXimenCdqZ87XggkP72CUQpJZe2pvRpN9l85j7p9ignI7ModBsll4fbJLloDegUU4bon09L61GUEsSGULTlSosrurxyI+nZV1QJClSJHhd/z1p1tlH49Guk/VpAye7EAS/hVxWsWDgrlunIuAFYNtQXxJc5rjO/zoXxueq7vSMvB04kJJntVDGwXN2fqSN2DQDTYX81R86kXR4sI7glhbTbJhqL7pCwTsiNpf5yCfo0jpp0z28cXug4OVecY1hgh7+S5VGoHQ+ULzkS9USpJxdbE4hOu8nmM/dJt0c5GZlF0ZkkL6YTT4LteonnKOR0J0mPBItSgtgQirZcaXH1WV5h7CRU+4oqQYEq0ePiWzH3plVbMo2ioW8jzdmUnOxld7iUoNtgk46Me6Q3evDq8GWO68yv8wZ7l7upkCASVlRzIxRUl1OTbn1hqdxrYQlzaEBCwQ0dLDy0JpkWXUlYp4S3O5+LJejVOGrSPb9xeKHj5Bxx5NbX7cHkYnG/urud54eOhlLteGDmkujFfVF6JudbU4t32k02n7lPnwavbNXlwa4T99PlbCdJjwTXcXlFW660OKTm3ZJxrpYbSUexr6gSFKgSPS7+nrXqjkyjMPRupP1aQN7ucNf5Vud3CgZJy/amdTIyBAyeZnhF+DLHdebXeYdUSHNeRrHOTduRJNJuig9Aq3Vu87v1ZTczueMwDKvSLg+i+jWNrrCzhZzoANGX7gAX25Fs+y6WIGnYYW416VJpROeKjcMLHSfniCNV1Y3LipRqxwOL/Mkrrii9k3OtycE57Sabz9ynT4NXis4keTHfU8RYn/cQ7m7BSf3dK8GilCA2hKIt11qcXHaff4OMerXcSFwKfUWVoECV6Ai9z63aJdMo+reqfi0ga3d4FCgSALm14iYijyBfL8N4GXyZ4zpz6vzcaRrYWtQMD8PtBLjJ+ZReecIooR0o6RtimvpCxPZuwAgDqLQ5GOzOt27sbF9V74x+6i7PSNk0Dzuoap5fLJUgadhR
bhXpnt84vNBxco44Vo1mtHYid56KS1HF2vFAL94943Rw+qRDXVeUUnIZa8oZmVfCbOY+fRq8UnQmyYvZniKFfQdT8JYYFK+TpEeCRSlBbAhlW7YtrtFk1OKwQvWlGa/DfXRXSwn37ivMZdGNL7JDUKA4ugdC73GrzqrNpX8j7dUCsnaH+3axtnC6UUEYefTR7r/J1ucrwJc5rrOuznHt9nw2nc5suZoHWLXd3JlZ770dfei+o/vPF9LJzq/lUFO1D4c+XcwOJh+Qt9DWl27gvpl/mGO4+SRtLuPy7K6M1erx8+ns/FKfyXBXHQJz9UFpkMDT5XyuL/Dplp8LJUgadpxbWbpUGtG5Dbm87vHXlrtM11quHQ87c1k9Gld6ZYtnW4crSiE5BIutKXM6LGEuc5+wPVrkZGQWW3Z5Rn7r9AyPslWwPeqmLK2CHILV27VcXtGWoxaHGutaHM6tHq7nelWEbK/mE85WY9xX6GsTek6+4ugeCL2/rRphk2pzGdBI+7SAnN1BzoTLxUMVTXtLRV6Vv6L14vhKievMqXPooOXBGVcfWdsS1HTdt9yB7nVx59ZqwNMpMnDqS0eS4HaC9pBzeSbftucAD/6N+NBcPbSkWN9Xbr02lC9ByrBTuZWkS6URnduUy8PYfT77gMX4W7liNJ/pOcu146EfzFGuVXu+KPnkMtaUM7KwhOnMffo0eGXrLs8wQzoe104CWCILCORcz+UVW6Pf4m6it9p2rhrtUarHuZpNeEBfAZeXfflYWKC4q3HR0HvbqvNq8+jdSPu0gIzd4W1CyWpBhWn8OPJ9j9u7L8yRt8t6Er4BXF7X7fjsmRnuXc5P4qY+lWdE3Hm5GY2cmMBmfHjqnRUOZ/PL6/kp0h1HbyQ/ns8Xn+YqhhGvPGSYns0/LS7nF1Emoeg+nSUeXnxezOdJ68qUIFKSkM4tK10qjejccaAbo6tO817oQr3hzpc3WsNQOb+hqlw7HqMTU7xP8xO34KEoheQy1pQ8HSsslbmPb9uKKDGOYk7mC5q8mEy8ytQOPOZnue6sSFFKEKtJybdGaXFGkWqjYhv+IHNyLtWhXbsxyUB52YR79hUyfSzc+YkLlOhqGva/VRfU5tGzkQ5oASEmalLHUgFrmT55WdKDureILL/4fRieLwpOEQKkXezSNsTjbezjau+nVRMykPfTOLAa5o3XcFum5x5O8s4Q29ihyxNb3NybqejyCMnwfhqHPoj+xb5Q/eISh49cmCBJxDp25/LE46UXYteCLo+QDO+pcdgP6TksMov0hMA+9PfWGRW/aj8cujxCchxdzE+T+/zeKEen2DVR2/5A3jnTj/Oz/bWR99aqCSGEEEIIIYQQQmKyzze9PEfv6aGY0exsfu4+oxSdID77rrGXFfcVt3tCtsg274NPz+bPadEi2Tt5qO6sfcuEvgYiOkF8ahp7pu05bCalMJWtidsT7n8h75Qtmr595dH6D99K7Pfh8vD84OPn+fzqyRY4OkF8ahp7ru11bCalMJWtidsXujzy0pwvFjfbH+hFuTzD9GsS4wWASz1Yo4ASfV86/GfVHhTlvic8OrEeuzGp4TxfrqrGAtt7BptJKUhle+JmiFROl0demt2YYJTLM0y/HnPsPos0OCOJsC8ub3DhXPDaGP0NohPrIamsLdQWeb5cPTQ23tgjmZ4Vr42XSkLcjT62FyH5eSqnyyMvzW5MMMplqy7PY3BGEuFduLzoLfbxa+3X4llCbZHny5XUWPO1uD1gQxXcn0jldHnviGm0Vemw8MJsubb2A6uFhCfmkrd239cEo4gN8i70qqhRLq7pH5WSjssRpVWmZ3CjM60eiRC5vEoph2g8i7xAPRMyl32/wqX12N/llQxVai+YKfQTKksvg8oSN7OWglxhrCEaqw+PYhXF9DKTcfkLxAUDsvRzeRlxMzrpSGQfqZwu782DKp4eHOtXqJrmMcESg3Lnt9GJ81nAq7H3RbuZnPO+UyMn3DaXT3j0sfsSlv2K1bketKQNOo7YcOhmtnpsvrbuk85FTd/5vJbzEW7TtvXrk8qNStZL4q5V9SxgoHD556q0UsohGo9R+xjj89jA04Mhl32vwmX02Hy4tOU8OqEBS4Z68Knd/rfSD5JXhMKXZ30XgQ9iNqcKqi6bfqaZtRSMMIj1PI11tqfEKopIm0mQ0ujCEwsgaNmAulSeI25OJ4Z89gWVI8eaLZA9BVV8LKMrYOsTd41XDzfz0/mtNSZnVKWfAl4s8I391YP73fKayyskbL9zfTU/+TiXfkWs7sixbyG9Lz0R0dJ8vfDh+nJhv4a8utRLLulcoJcF9PLUfsK6/RqklmPxeT63YXT1qJfEXavqV8BQ4TjQa9VSDtN4DIS17+hM6KGUfZ/C5fR4MPU70OVhdALBioaq7mm5aAQb14XC51Iv9ABA+3ZOV1R1D5cXNjOHghH6sZ6psc72QEpFIRkz8VPCp02XN/OTOT5HDOz33ssG1KXyDHGzOjHksy+o3OZYtAWyt6CKDXeOsUvX2g3FRrdyvTVS9LvN9+mnthdum3DF5RUSxndMu+HfpF2AkPOd0cdkI2JqdNcO+KYw+8fMuopc83Jp9NKcxJeA7UDQcLq8ckaSGEF2nZ8clSQO2nEteFHhtVKuo3GPsh6qSpbDQuGKeqwvbBZKdzBBj3bViDK67Hb+4YL+DkGv2MxkDOhO7Xeey2Xt4fIMbjOLkACeXIlYa2istRWDZ3sFFXXkzMRLSZ4taL/yimQ/60HFgLxUYvn7iVvSSTn7jMr1TMEWyP5iTWLpmAw6kkd3LAMj0hB4cMap9rHtkfWo3O5LCWME7ZhXi5z3jDIgFxGCeot8WKO0Q88IuRSbvnsKg12nd3GR0WfXUoOIEV47rgYvK7xSyrU07lHWQ1XJclTShY+vx6rLKxoq5gHtQM1DruSEwtpVlwc6fBu2UtY+Ls9rZjESxJOrHquPxrIur6CijpyZuCmN5XeX0Ac5bKbbZQPy5Km4vF7iGjydlLPPqFzP5G2B7DGoYs8PwCa8scxRdwbVfo2fDXKml8srJoxWs/R6FIucL9lZJiIy81YldF0k3X/IlazpW6TFZRwmlnXahKOYAUHS5eBlhVdKuZ7GPYp6qCtZDkq68PH1WHN5xdJhNO884eUil7JC4Z5S2yXKQP8Rv2pl7eHyMtbTImE8ueqx+mgs5/JKKurImYmbEnTTuUVUQ/O0W9GAwqsll9dPXIOnk3L2GZU3Z3K2QPaZ2CSwfvNl4dKFwT1f3/7lTC+XV0z4YIJBnDkMtiDgnP5Oko6IWwrBc70YsHp9ZItc8XKJ9RJ1J6Ztz2byQR7c4+jfzQdJl4OXFV4p5Xoa9yjqoa5kOSjpwpLWY7oD704US4dFzkzBulApkKyu5GEeZ+crtbL2cHk1RURh8rGGaCzn8koqcsiYiSebLDm2rgHz4dY3FA0ovBrKP0jcpE7K2ddUnrEFss/EJgGLibHdLmyga2KCnOnl8ooJG8ZYOhIeHb8kx558MamIWIfX3w2QLp1WdKXSVA5O5NBjOy6vrPBKKdfTuEd
RD3UlewcpCnqMOsDgRLF0SBa/YuRSQSiJusTds2MJqTO7WlmLpp9QY4IoTDLWYI3lXF5JRR5JM/FkOxSfpzsoL+S3bi8yFA0ovBrK31fcvE7K2VdVLoFjWyD7TGwS6GRz93gw1PHX0+VML5dXTFgZn6vxtre/5cCTL00YEa00GJNhxuRJ1yJXSqZvcJuK3eB8f6a30re4sFlWeKWU62nco6iHupLloKCLoh79zknwTxRLB9HWmuXZIkPBMr9pTLtW1h25vDU01sgguCmWVBQRmYkn2zF2lnQ4K/FxGVwhg6uh/P3ELemknH1V5TiMbIHsM7FJFPclYdXe9xpyxmvZn/QAOFf7bniyGzSapiW/PfnyuBHRDIKVf6xNpZcm5ErJ9A1OU0HijovYossrK7xSyvU07lHXQ1HJcpAvXFmPUQcYnCiWDmuO6efMqhWEu0ZmQC+71O0Q31Ara9H0E2pMEIWJY62jMbejdlMsqSiJZyZuSqKw24PJxeJ+dXc796skLoMrZHA1lL+XuEWdlLOvqzxpC2SfiU1iJGfaHccBYwzn3KE1Bl9tq5LL7rNOGInr1WLCHtK0GquUx3Vq9/1buogol9MZGNBmkv16IpdiU8GSmmP/fsdTkzhIuhy8rPBKKdfTuEdRD3UllwtX1mPUAQYniqXDth9ftJaqSdn5IzYettvzq2UtmX5CjQl6GOE6Gsu5vJKKMjhm4qYkp7vnGHziMrhCBldD+XuJW9RJOfseKk/ZAtlnYpOwG66eMiseuNhW/gjjXKdVYZv4l+bRGTzM2V0tJXzu9OrYJNbMbGp3rXMR7dscusX3MVpGxuPFuRSbClYb2xH/GJG7xliTOEi6EtwqvLn9Fiq8Usq1NO5R7jKqSi4XrqzHqAMMTxQNFdPj7qG9g9Ondu5ZqyC7ivUo3WG3J6Na1qLpJ5pZTCRXHGsdjeVcXklFHTkzcVOyRnk7n5/MDO74rGZAwdVQ/l7iFnVSzr6PymNbkG/6PfV4WoK8SuIqbvdo3X++EAueX8thG8S2/NXN/MMc8w2xIKdV4dzq4XquV2WQ1F7NJwwxbs9n0+nsIwI1z15rl383n3+6d4dyDfmIuKVurPXz6ez8Ut9gEixMdUS5FJvKGIV+upgdTD5gFCi0HU9F4jDpWvBQ4U/St7YqrZRyHY17FPVQV3K5cGU9Rh1gdKJoqOghjWimJ76ykjVdW03jzYwiyL1W1pLpx2pM0MMI19FY1uUVVNSSNRMvJftYvstd/iEFV8jgaih/L3GLOiln30flkS3A8ftqJntEXMWC82pJi/sGRh3AgpvolY9TbG2z3E6wyuBczSaM1YiWB2e0eSQjLEtqESsf8eDIEcXwUBqYhblUmsq57f7A0ymudR1PReIw6UpwX+FGpXJ7wVFppZRraNyloodq9pXCFfUY5JQ4UTZU7B1sue68W1Xj9v5QtDmmUtaC6aebWUjdCNfRWN7lFVTUkjMTLyVM8+azD3hI4FYyNcJZscoGFFwN5e8pbkEn5ex7qTyyBRyGr6oje8Mk92b38YkZrpoh62nQ7oXj+afFpwt7QerfbVUmyfP5zeVc+4Pj6O3l2YRn5vzl/CR4+MkwlQdu3BWWgGxEE/XMiHo5V1mL+LlEepEXtbslOZzNL6/npzg1Dl+tX5Q4VnmtgEbh88WnucY6ilRaKeVwjXdU9WAoZ18pXF6PUU6JrA0lQx2diKXOT6JYNY1jb0xqUaBc1qzpZ5tZQMUIwXM0FqeYVVFH0ky6lDAP8pbFMXm6sb/LBuRfXV/crE7K2QtVlUe2YBpg3nLIm0fswXd5hOw52K6gfTapIIuMQQ8g89230inQFoiPGARdHnlL4A3GT8FEgmTAPh5v2oObmW9kgyNtgQSIddPlkbcE7kaF61skgz6I/uXjTLi4xOFjd4dxv6EtkACxCLo88obAJCX5zAZJYj9I57B4M/e6aAskZPpxfubv3SBkrzmafwy2I5E6R6dzw9kb0xxtgRBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYSQV8KhfObYsDzVEyXG8nWsT3pA3jLbrOo7k/SV/ibPR/R5ob+JIF8B+qy/14R93Zvg8PNicTHWA+HIVKvyoKdKTCXgRjqrSBJwdrO4tt9XPpFvkyS5GJUvCkd6aLjY26+BoG5u9aAXw2Pk2VxVx0jS/OrixpD+/Vp/e5x9WiwW8+iD5eOzW/m2+ePiU+JDd8WL6zHCh4ZczvRKw+YlujcxPurv9dhmA3h9bLLreFVcScFmemBAQR/sJ437fNh4c2YQSmKRk9b1yq8MZ+WLhk960LJ8nvW/DFDRasj3pofHyEOXty9cGm3e6W+Xif1+eWAQR+IMOvzFneLFtbnV5Bzclr8ViVD4Z4n/vlzeJruOV0XoaK7lOD8Lkqtepe/O5TkTOXyE+bMeGLxZXnzRgMSvcHyl67aDRzASp3dJBwXuy8SUbZivHh6jJSoBXd6ecCbajNZLDg5mcv5R/jgd2RjtZfnZtLzJB+uKntrLxYvPQOZoqy9mwtny2S7GCNuS6EQCD5Begu+sAbw+ntF1vG5CRyNDqCf9nSCq9N25PAe0mJzxJi8GiX/AcPeua2Z9kCi9Szoo8KskKgFd3n4wytxy+mhOLydB80ClLrtjCWRC2YPixedQasDbk0jGul/0dw8k6Z01ALIzQkcjg6jCPbyo0jdnBrt0edb8B97Olhi9Szoo8KskKsE2W7wkTZe3GcTWl/FoTlY770dB8xg9yaE7IzyXE3ZVtHixZSznBrrBQgPeokSY5fbf1COhd9YAyEa4wFzm8YMeJnF8weR2sVhIlKVdawiMZirnkCCuLmzdt2ZwfIXcltdHOO9yfoNLq+VNeJPaZbcuzy7y6EHE+DNSMYPCc/Qd6bKbJnh2hYUi05nctqmnA49vTdP0inf5tHpwFg7CPFMYRd47tyN6RPFi3KxWt9JfHJqsDam6AtuuajC7ghCra7FP+eG4vKReQbUIk0vYcLdudqkXhJx4muoUy/rJ1ZxkiF2L2avGD+VyPMmTMd6N+Y/4bfOYx4ExGMQGl+LFFphFrjFm8IXw2KZEkm1hCatjJw3gpDGepxu3Mo2FPdr8LCOvCRvkxJMtcY8G/QyTH9zZvDR2oRsURjaOL8Bat0OgBqwhuMAlWjMYy6boFi+7D7Zja8jfbd6ty/sgJzLTWRkPd0gLS5Z9ohbbsrT7x9KKgo+dI4AiJ9puPs4zQdvmQJ8oXgzbFUxxN6QhaRrbrmrTfrxU7pB0o4ucXoVKEUboAAOabiAvnk312PYNyelmIsTuxexnJJ/lWtj1jURYRPCbh1RDMCNE5cMBFy+22LLqQU8KDXibEmFKWLDKlu03gNPGlBraIsCtO8miPq0AFpyArVXMbAMmP6izeWm8BumbhEvoC9Za2BTuZNwxsTsju4WGYxxrTz9FS5ehZordujwMbtILm2ImT9p9TS87NyUx3LIb21xeaZojW/ButBkGrri8XJ4+nhX2iuLFwIFES9dVgF
zcUlXr6GohtTSeN82y0UVJr+UiQIR7293j6dLHrnssiYdU5S724+n08PTa6zUsiRC7F7NXjSPUvf5uOJSuzfZ3fvOQg6B5YZYIWYsXW1DWXGPMoEIcz+LnheTKtiTCimfvXWsSeEsNwG6dfTqRGZppDbYFNNYjXfDSXjFAZveRE0hhe66ymW3C5Ad1Ni8MetiWpZ6N2ZDL63QAzTYTRHRtjmM5kqr27bMFkiSJ5Hm2yxthpOb0NA6ncqm1OAc575XdA1bcDZaiwEWXl83Tx7XCflE8uy3XVYhc21JVI+x916rtgkRipGnw9VqWQYZ5nQgX5qidEhTF01Tv8uqshtiBmP1qHHYWmClS1Ymf1zzgLwK9Q3ZpcsWLHTi1jsvrePrYtsTtSoR89XcVCbudBoAnwdzNc1NMUM7tgV+DTbfYdl5SBl2dfYZAGrVm8ipGz/7pZRmJkC35kc1GXN4Xp/YwKlGzxG9H6VoNTnYOO3N5U3uPM6cUWFxi1BOX3QdZtIqIAhddXjZPH9cK+0VxY5TrKkKubaeqEdZbnyoJ4um1KAMWq7uaR1gdOpfFQ8imx0lSD7F9MfvVOJ5Q8OwMTvWxmU55zQNdWWjTck78QvHiweFMQbBTPZjpTKCCzGvkJia6aLDUHQfblQhrO7mOI0TCbqUBIB1/vG3tSw8gpY5Q5MKD2Fbjq5CqmsszBLIZVk1ey9+zf3ph4NOVe0+9Hq4vENZyed4ZJwXMtL3bChM5k14JDiWxyMlNuTyX5aesTm4Q4CpKH2f1dwKswbeRosBFl5fN08dTdq8oXoxiXUVEgTdW1ejzm7GoRc40ugjw9FqUAZOAZnnIH5qWxYtTDamH2IGYvWoccng7FaQH7WYbXvOAfwxX3+ScrIzWL6bwvW0f7JKb9s7blQi3wVJeKIWE9epwUw0AQYP7fDinkuHFZvo0hdT5KU6oe5Ism8H6MwQaaPL9+qcX58wuEa/uj/VEiq26PAxXYtKa3p3L+/K5uIn1YNbcB70/c/2inAlFP7e20NLmKwde4LLLy+Xp4yu7TxQvRrGuIqLAG6tq9Mp+TcsZ1+Xl9FouAiI11o47GU3/WBYvTjUkE2K3YvaqcVi66/JkDd9xE17zgFEGQw3ILreOihexvRvAYXWPlUc353qAMZB1+9uVKB4PFJCwXh0Wa3VAA0j1UJiBNd0DttlATLEOM1yR7TR2WQQba5oSPUOgOGqIH6JX/7QfbNXloW4/4Z0nDhmdbd/lRYnnObTbkA03rbRy5JVUFowAnuqQcnf5yoEXuOLyDKk8fSJlV6N4MYp1FREF3lhVozL8cZicaXVR0GulCLaZP7UrZu3CTVm8ONWQVIidiylUaxy9urMGJVsUrnWNT8BdbFn1E1mx+BXc6MesUwy1eLEDZQ3b20AgFLSwXYkGdQMS1qvDYq0OaAAobTA0gNdvp4QSAvfrJFWRVzKSaTDme23tPkOgOGpIFKLeP+0HW3V5SLw8o+pIm6OcjORBbeZaWfLiIFtvOJOidNHkt1NSWJ/TuXgrW7GicMu66PJAkKdP2k5LUbwYxbqKiAJvrKq7QWyLnFFdFPValGEk3TuqH7hPQZbFS+vVJQ7xAmK2lGoc63vOchpCJpHo6H39dOCPse5avNiCsuYaY08w77W63KpExY4jRMJ6dVis1QENwF3EbMC5dujTdBbiCNEsmvu/kku3HfcZAsVRQ9Ihiv3TfgDFOAVwdJZCQnvbPopax8wm/4CETyiJRU5G8uzI5WnpGrckv52yh0nGLi9WlHcfQk5ELi/I0ydthaUoXow4eqm2JfB2qhpreV3DNWCzlRagqNeiDHKnpu01fMrixamGxCFeQEwHZJCs8fAF+BcyBXXA3Q5Z9cPN7PjuGCrHply82AJRco2xH3avnb3vtFWJsNdef1eRVLbSAOC/fJmx4deRDI/BTeHirVpkU/PCVm5ndM8QKI4akguB84HG94qw5To6S4EnOfQ3KGrdjt68XUN50l5JTkby7MzlweYasw/KjrJ128FGfr6RonDGGY5iWJc0HTdPn6ydZqN4Mcp1FbLFqkaDdlojaqfRRVGvRRnk521mzaUoXlavLXGIFxDTJW8kcqWwoTxoHgjtZIhOtXlpZfFiA8oatjfDcc/dmyYFuOFGXduTyPoaV7tFGbfXAMJS2DcFuMXAm9VuxMOrg0feJ1J5jgDPECiOGpINIdKnTW8/GOjy0Fkt0F6tTZW1rnctuuWfg+Orp8wI95W4vPFqdd20A9uPNW4qKDtuODdvcW++aNLmGynK7pey7wc6OETCBtvN5/P0cZTdM4pXPZW6CthiVdvnh5a2Po70hlbj8op6LcpwqPu1Gu5unBoviRenGhKH2LmYfY0EGxxyio+bB7rf5rUcxyjHU9v3Fi8qKKuToAVzjO6ZHY+L7hl+U5QLq45uLXZLEhnQC3SbGEsyGrbXAPTtO407mqIY/lMLmM/K1K8phx0im3NOsGcIFEcNcUL0Nb39YKDLs0MHCyqjovXw61ZCat+u4ZW4PP+BRpN516CCsrdv67FcSMG7fENFhV3dw1SWwmw3X8jTw1F2zyhe9dTqKmB7Vd28Ranh6UhaeDPjLem1IoNt5C7LdptMQbw41ZBEiF2L2ddIcJMx/Qi0EDUPdK4u7l3W4sUSNqIeBLT7fjou3f5+OxKpZpxPKZRkFLbYAGbBsCcepKCenOzsG7X94j5DoEEm39f09gMMHpzdczKKKn5gY9K8u9POuGFGXsXKCMZL4bgZxhuWN+lNfIZQEovM+CN50GXk1J68mE48zazrkR789hSUXd/JalhemgyN6rr3BMWBDeMmPF7XKsPxZoEgn6eLp+xeUbwYPerKY2tVLXxomiN0IfvuusWSvF6LMuB2mk4AhA/SAbgPAObEi1MNSYbYtZj9jMT25E76PlKnrpkaJpfNDvTVwyf/UuViHgiRXeY7uXa6/Id4yrANiXSc5Xj7soyGrTaA884fPVzEASfQ0GNXPqzKOp2J4RkCDTT5nqZHCNkd0qN573Sxg15vn8wrYPtiygShG+q/CLjHVuxRdw+mLc6e6dcoIyGE9ETG/v6C3lg6tdIw/iXYvpjY/veifTm8Sc9FlZ0hIwFH8a9SRkII6QkWDB+63QkT7D9wX0H4KtiBmOjNX3CHgeRfeMnhyyCza2eLyKuUkRBCevNBZjc+uiv0VbEDMbGd1N2jslNkBdHZJ/g6kLvFzmMAr1JGQggZxJmzy+7hsptKvTK2LubxYrHwX9z93rlaLG7CTW2EEEIIIYQQQgghhBBCCDk4ONRnIZuXBK2NPCRffF0AIYQQskOwPdp55yiOLfJwbHh5CBK1eScVIYQQ8tLgXZbd9mi4uAd7KH+Dy4OQmHR5hBBCXghxQ95i4+RhtVp2u6PxZjZna3BweRCSFF0eIYSQF0LcUOn+mrw5tfcnGSvQ5RFCCHlBai5P3pe7qRfc0uURQgh5Fjf61dLDS3ysYnkdvwri5Eo/1vF0c969lG66MMjZR/mxWKjnu1nar8EcTG7NSXm10tJet8ub7eWWc3NKWN7EH
6WYXdlvjFx/MAfygy6PEELIuuCDGtOpzMZavFfPnYYvBLzUC/j4sgt8WvddQbzmzwF38LrLlg/OZ7MM3oMMY3kqoeUOUenyCCGErAscifidO3nx+sR+bb77wgY+Q7J6OrEfKjyxDsp5RbscegubgU8LFzb9y8dy1HzBaorn97ovIFqXuZDdneN54xnp8gghhKwLfJD74UQ4nmZPJZ4xuHO+sDHFEmf3zXo5Wtvlwak5nwk7Ev/auFtcvG+/CTy23yemyyOEELIu8EHut7rwuUr1LLjofGLKYF2kHjzP5cUfxoSbs59RwUXv29GuYIQQQshgAhdlcLwUljmD94ThXPt1LzlY1+UhJW+zzETO2K85f5Cf/gN8coYujxBCyLoUXR42ZAavSsFMrF0HlYN1XR4eU4+xV/HpaP+7mXKGLo8QQsi6FF0etkwGn1XE9MvOxAxysK7Lg0P9NA84s4uZeDPZMX42yBm6PEIIIetSdHn+IqYF59r9K3KwrsuDV5MH7lKcy0X/QT05Q5dHCCFkXYouDzM638vgqYXuFWISwPswwgCXN5PfzVN+IXh+4V4PwEjO0OURQghZl6LLsyub7qbKkTyksOzWOuUdmo/6GwxwefLil2DLpsOlXHQcov0Gg+Pyjg71ByGEENKHsssb2c+7Nm5pKh7Oe2oBjmmBE/qFIDnR0+XpBhbnZS/HV0/tmimyXtpl1SP9zGzn8vB84Bc9IIQQQuqUXd7Bwcx/I5ihew5dwEzNAp83yOUdHMGHerQ7Yw4+6xnL09GZ+du4PKTjpUwIIYSUGcu9uc7NGGRC5U2fzju/9HDhPZcuTOxrUZpXhQXpycqom1ic3XEzgTMsb3S/pvKhyRlvoj41P9r7hljn9L0vIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQggh22G8XK1Wn/SAvACHX0wNGJanemJd7kwiV/p7k2zXRPbXANl0SAht4hVwMs9xMTKXp6aKttJRvk6OpLi3evAShALg2PLQHK4pn0Rd6O8h1PLcronslQF6qnpnTYf0gDbxCpA6yHBmLr+zOrpCwad69AIEAqAPfbCH8vc58knMdVxeLU+6vBZPVezeSAht4hXgzPIepD4+64HhHc7yJkYHy496sAMi7QYCXEuAiR4YniOfJLWOywvylGQ8kenyWjxVDZF8Q4XcUDJkW9DlvS4WUh/BaJ51tFVq2r0315/093ORvNZxeQGRyHR5aejySAi709cFXd7OqWnXzBpwD28TSF50ebuDLo+EsDvdAReyR2j1+EEPSxRd3vEVElpeH9kLDuc3uLRa3sjdvyInV48Iunq6OZd1U+XaSOgawsikeO/uUJQTTyf4ebNa3Y7N/8PLJ0loiEAadYrlwvT6YJDx+DOUslp9ccV16SFOutDThUHOPsqPxULL3wowuTUnpRxLe90ub4aKKWt/dgWhVtdS/fKj6PKyamvyTIv8HBOp10ib+qGW5bZN4dgMCL7M9ED5vFwtP+vviIoUBzOIsXpoamn8UYYcq9VNUKDRWVOj97d+/q2qDD27t4wdCCl5L4wWFm6mIuPtqJSMz9DCGvImlmkfmdN5vRkml7B2E8cWYHGpF4S8BFUyXY6hany3Jj/5f6Smt3AzH9+uVndeKUwP8OCkklJCn7ZCnoWploYLPZUHNZR0eWPZ4N7ipfTBWkNDaSv9qTXbjtao53LkJPsZl517WDhxKL8g0HRqm6fSUyAb9biRItX9+93UJcI1JPcWV8XJFvqjHreguJ0AJzjZgcbky1fWvl9pd4iad3kVtSHPkshrmQii9qkRP/Ur24WMkK5jJaZrlDPJsUxViiPtGcGDSfVCf4MuzYkbTljCLkGnqvAgT1qpWXlHyL6zRSjG9Ju5ZEKGFdZQMLFM+0ifLuntYKRP4ng03qAgQY18l9NoomB8CHAwlZsLLY9t/ziTw7keADnRppJWgrWJQlshz8SzMnfUlCTr8oQ7mWNNPtnf9prhGMda8VPYLcZFCSawracTGWGaDt2asZ242QW8pb1iGOPa6loPVQo7eMdPiTxcIEQVA348nR6eXqdMzZqkHkguT9oup5e+eTdUxCkW2iCHXq/oCRAvbPqXi9q3LnMh9Tme24zzLq+qtk4k/8iA68L2akR4wBD72PYWtlNG0l1OGjj5PEVVCqmohXSyMzt8kc2XS9nBNbYFaqeOZni2vNLR/cheaysUKTXK8Q4qREEL8prJSVuRsK+HdhIhVyo5DitsUW+Z9pE5XdIbcrm3Lg7PoT62RSpKUKHc+qCJkvEhgHSh1vSOrOk18csuL6MEJCmk2wp5LqiVlqWezVJweV3VwuaaYSC6VWcd6UhsLF2D2G5/11nywRT++NweQNK2sdq93qtVu2wgkukujmcIpFHvWtcagyAqx6n8LoQFZXHKhTbIkddHuQIYii6vWFhcvO9GEXa+n3F5dbV1IvlHhrIKNlIj7rTmED2XXajH3KS7hJyXjrpb+kjRFkAnTE/NLAObiFbpuQV61Varnqq8gwph0KK8mMo+is4QrBsZ9nZ5vQtbkiPTPno1m0Bv0iq6XKRW21osaqJMpfWpJvLGpwFC01NNFV1eTgmh9v22Qp7NCPptqD7BnHV5XxyzwQxMKxa/HXNUA42W6Q1Ixxm7GWz16wHW09WY5MKDuL3GspGqjq6eIZDNsPM3CRBEOw3YdG3VoSgOLhYK/TyXVywsLnq9vytYQA+1dSL5RwZc326NuKnbelGlwJG39ob18dRd6x5SfLG/ARpCOxvQvtOL3oHhWSOepyrvoEIQtCyv7fZNa8Fyv6s9Oa7kOKiwRTky7aNXs/H19kF+dx0PRFSJKpoogWQKrc8eFIzPqipnevidc3k5JURJonzpVknWwbYNy71X+SmyLs9rRU4vjHl5OzwUJnLGGXa3IGgwUMY5NV68iUdboqzcnOKE2qNkmX+nRW+B4qgRXhCsIK2uAo341MUpFPp5Lq9YWPQi/uhRzqQb1yC1+UeGugo2ViMW+Dk7hMbtvCe17EM5n/RMA6VAcLfa5TjTL8HNNmG9lHoUriUIWpbXFFs0jOXJpRdKzro5Tmcdem9vUGHLcmTaR59m4+sNFdd5XXeKVNNEAUQttL5q/SQCOKZXdHk5JcRJhi2cPJczu4C9uj/WEwUGuzy75ysiZUaptDFea4wGqzX2do35YeZ3suJipylYe2nuxj9DoKqNh0FmWAgx3J/lxgtFcaqFjvqoML2wQbiXi4VFj+KPheVMut8epDb/yFBUwYZrBKDf0skcOh5d2pNs7+3PgIFSeG5MkGNXdee2R2tpwnop9ShcSxC0LK+gIYJxrJxyAp0hTIPdbjiosBU5Mu0j32xyesPppofCzbvmdl1dE1lqra9aP4kAjumVXV5GCXGSYQsnu2Swy0OET/qulpake8Ct32ATGWYi7XhNQuB+naQqnbVkJGN2zPfaRYJnCFS18USQQ7uZ2HCTKlZZnHqh5cCL
HaQXNgj3crGwWDfyhzlyJu3yBqnNPzIUVbD5GtFOuikbDtCbozsKzFcZKEXZ5bWbG/H0iJS1Deul1KNwLUHQsrwCQhgRnF2PBjnlJDO9wW5/y42dnAwqbF2OTPtInS7oTT3b02KBe1vOcmNdgiy11letn0QAx/QqLs+QUEKcZNjCyS6BeQV9RrGO0K+m7p3EuCsKDTjXGjfuIRgTEquE4cA8jTiSSzd4f4ZAVRvPBTlD+wzFB0Vx6oWWAy92kF7YINzLxcJ2k+YWOZN2eYPU5h8ZiirYRo3I6lK3FwtWa3o2dEGZOzMDpSh5AQzAnNs0XlgvpR6FawmCluU9sBv+7zGP8OpYTlRyHFTYmhxKpn14p0t6G8mWGXXiBvdpt54SpKi1vmr9JAI4ptf0Vx1yImpgvm7iJMMWTnbJYJeHXqb67AOA//LtAZu2nLdp4UmWKQZndvFe7GthTauz3GcIVLXxfBCcT/mLojj1QksAb1tRkF7YINzLxcLabd96ALCXKVWEgWqTo5LIhu3WCIrWSTASjd7ZXX/O5kWPgVJUvYDTkW7O5TlKLctr7epGt/u5OcixVzcxgwpbkaMDqcbG5Zwu6U224WQGK70liKm1vmr9xAFc08NV73EJORHrwNNNnGTYwskuGezy7Bp8csNABJYZ3KB4pnbpLDxgK8KN3H9QQ0LeJyKVI8AzBIqjRmSDiPipzqQsTrXQeCxIf4MgvbBBeJeLhcUAwukr0OWkWqQwRG01kQ0brxFnD/3BGJMbZ80WveInKV+zjyVmmBQlL4CUuuXEkddovJTCZI9n/iqkS6jUoryYqmC+hC2bzv28qG5iBhW2LIdLpn10p0t6E3u5zdRdTYKCViutL9JECAK4G4TG2BjRmJ4o29l0g1pJNjBHN3GeYQsnu2S4y9M1eGc37vHVU3q8pu9XaCzQvtPA30KMu+0yDmtksFZkzjnBniFQ1ca9IGMzaWhak22gzk7ulrI41ULDMS1wwhY6SC9sEP7lovaR9dKOq4/0FknO5Q1RW01kw8ZrxLCc417UCHvm/KpozCS4d+kxSIqSF8AGiCe9eIQK7cJ6KfnJwjG3TzOEREotyIueXNf6sLmxu6EXJRMzqLAlOTLtI9dsSnqzz7t13N0408FSzZW1Wml9kSZCEECwpqePtramh62j9uVtB4coqcEqLtt3xHmGLZzsEtRN0FRqddQYr0NuA/EsMOx4MUPtps3OvlHKv1/xDIGqNu4F8R9qNFkEd8ItNXFqhcYg1gLNB+mFDSK4XNS+fWtbw9ORjChyLm+I2ioiGzZeI+roGsKnh9WhezdWQoZIUfQC7RuqLBdS2Casl5KfrN09qQcxoVKz8mJtrlsnsI8htQ0kSiZiWGELesu0j2yzKelN/ZrLsh2/FGquptVi64s0EWID2OWRBtf0fE/9MBVNWsVllRDnGbZwsktgd0HHjrvOrYUJ0r+4j7IeHDdTCMPypriZ6ryz3gd8dN1nAhvCeyUsWI73F8yfIVAcNcILoi/eFewrhxL0EKdc6EnTm9tSBunJiN5NLM6upP0PTc541bHMCFJrsw291VYW2bDpGpENA20Siaqwj6l5OaboLQX2FroNwb9PaN9DbFhemkCmitpX5Xkp+cmicy4s0AVKFVLyos90H02wZW8fP0sk4zO0sIac3jLtI9tssnqD13WeoPuAMjlPleYkqGq11PoiTYQ0/ilveuOmRGhg8jBV08AySojzlLSrlksIIQ7SRbq3hV8f2IRV6l/fMbIcEbwmTio0/Yilx3a1Wp0GEkLIC4DlPO/23msDfXPhVuO7RqaV/lsz8Rqu4uQNbFmrdHmEkFcIlubW28i+I6Rvrr/w772Cdc2HbmPkBLtwvJdbJtm2VunyCCGvD9zw7bEK9nJI3+lsOSQBH+QOl49uNS6xda3S5RFCXh+y5+nxVd/II1XOnL2mD5fee6RfDCyvvurVA0IIIYQQQgghhBBCCCFkUxzqw5zL4IuNhBBCyF6Dx2Wc93zg2CKv1gkvb4JtpEkIIYTUsG+la9+fB3f0YA/lb3B5I2wjTUIIISRAnI339MrkYbVadm/Iw9vmnE3kweV1qGVJCCGEbIPI/wTIe12dL8FuglqWhBBCyDao+R95ReyGP49Bl0cIIWQ9blb2S4eHl/h6z/I6fvvByRU+Rm3mazfn3WvtpguDnH2UH4uFuqGbpf2axsHk1pyUtwkt7XW7vNlebpjMdUfnl4/O56gORmdNpve37fuHalm2pCUWehSXEELI2wTvkptOZTbW4r287jR8B17zFh75QpQHfFr3bjp8CtkBd9uCV9fpx4Zb9AXqE3VYLfrt51qWlqzEhmpxCSGEvFngA2TCcycfYJl8kkPnax74DvPq6cROwE4wM3I/1SKHjreJ/E+4sOlfxmvwV7f4BMj4o/g5u+9ybpzclc7tRlakLk85KmRZlrhSXEIIIW8Y+IDVaq6H9ku97UeJ8YzBnbM0OMX8q/uivhyt7fLw4RD3gbqT+KPXBjgx+5F9wU0BeGlWJC4XlxBCyFsGPsD9PBVeIa4OBhcfvZth1mfowfNcnvVO9ncRPHjXSiEH2SxrEuMgV1xCCCFvGtdfWBwvhXW/4D1hONduKJGDrP8xlFwePgbpLJJmwWywfdRcDrJZ1iQuFpcQQsibpugDsDsyeK0JdqW0C4NykPU/hpLLS6Xece58REvo5/JqEtPlEULI+6XoA+7kYvABTnyH+pMePMvlpVJX7L4WAx5wkER6uryaxHR5hBDyfin6AH8R04Jz7f4VOcj6H0PJ5X2W36mFzTE2XXbPDgxd2CxITJdHCCHvl6IPwPzI39uB7ZPdK8QkgPcRgwEuD6nf2N8e2K7i+K3Y5WWzrElMl0cIIe+Xsg/AOqE+Hw5GsuV/2a0cyjs0H/U3GODy4tSndmaHu3j24XNh5N+gK2dZkbjq8o66fAkhhLwtyj5gZF8G1riQqbgb7xkA7Lpc4IT1SUNcnvVPzVNxR+LpMLfDfpMn9XFHyNNxeeUsKxJXXB6e4fuiB4QQQt4WFR9wMLOvL3HonkMXnH2VcECDXJ69zeag3uYYN/NaLiSV1uXVsixKXC4urnryEkIIeTtgq0i7AVOQaZI30TnXeZbh4cKZ4VkmzWsy7V25ID2Zx7mJRdlN8UU94L7huT29vJwglaXzzulyloa8xJXi4i6i79MJIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIYQQQgghhBBCCCGEEEIIIbvj4OD/A35T7+m0FtEfAAAAAElFTkSuQmCC) ###Code # Checking Uptime for Websites - Sequential """ Created on Fri Nov 16 09:36:09 2018 @author: 
DR.AYAZ """ import time import logging import requests class WebsiteDownException(Exception): pass def ping_website(address, timeout=20): """ Check if a website is down. A website is considered down if either the status_code >= 400 or if the timeout expires Throw a WebsiteDownException if any of the website down conditions are met """ try: response = requests.head(address, timeout=timeout) if response.status_code >= 400: logging.warning("Website %s returned status_code=%s" % (address, response.status_code)) raise WebsiteDownException() except requests.exceptions.RequestException: logging.warning("Timeout expired for website %s" % address) raise WebsiteDownException() def notify_owner(address): """ Send the owner of the address a notification that their website is down For now, we're just going to sleep for 0.5 seconds but this is where you would send an email, push notification or text-message """ logging.info("Notifying the owner of %s website" % address) time.sleep(0.5) def check_website(address): """ Utility function: check if a website is down, if so, notify the user """ try: ping_website(address) except WebsiteDownException: notify_owner(address) WEBSITE_LIST = [ 'http://envato.com', 'http://amazon.co.uk', 'http://amazon.com', 'http://facebook.com', 'http://google.com', 'http://google.fr', 'http://google.es', 'http://google.co.uk', 'http://internet.org', 'http://gmail.com', 'http://stackoverflow.com', 'http://github.com', 'http://heroku.com', 'http://really-cool-available-domain.com', 'http://djangoproject.com', 'http://rubyonrails.org', 'http://basecamp.com', 'http://trello.com', 'http://yiiframework.com', 'http://shopify.com', 'http://another-really-interesting-domain.co', 'http://airbnb.com', 'http://instagram.com', 'http://snapchat.com', 'http://youtube.com', 'http://baidu.com', 'http://yahoo.com', 'http://live.com', 'http://linkedin.com', 'http://yandex.ru', 'http://netflix.com', 'http://wordpress.com', 'http://bing.com' ] start_time = time.time() for address in WEBSITE_LIST: check_website(address) end_time = time.time() print("Time for SerialSquirrel: %ssecs" % (end_time - start_time)) # WARNING:root:Timeout expired for website http://really-cool-available-domain.com # WARNING:root:Timeout expired for website http://another-really-interesting-domain.co # WARNING:root:Website http://bing.com returned status_code=405 # Time for SerialSquirrel: 15.881232261657715secs # Checking Uptime for Websites - Multi Processing """ Created on Fri Nov 16 09:36:09 2018 @author: DR.AYAZ """ import time import logging import requests import socket import multiprocessing class WebsiteDownException(Exception): pass def ping_website(address, timeout=20): """ Check if a website is down. 
A website is considered down if either the status_code >= 400 or if the timeout expires Throw a WebsiteDownException if any of the website down conditions are met """ try: response = requests.head(address, timeout=timeout) if response.status_code >= 400: logging.warning("Website %s returned status_code=%s" % (address, response.status_code)) raise WebsiteDownException() except requests.exceptions.RequestException: logging.warning("Timeout expired for website %s" % address) raise WebsiteDownException() def notify_owner(address): """ Send the owner of the address a notification that their website is down For now, we're just going to sleep for 0.5 seconds but this is where you would send an email, push notification or text-message """ logging.info("Notifying the owner of %s website" % address) time.sleep(0.5) def check_website(address): """ Utility function: check if a website is down, if so, notify the user """ try: ping_website(address) except WebsiteDownException: notify_owner(address) WEBSITE_LIST = [ 'http://envato.com', 'http://amazon.co.uk', 'http://amazon.com', 'http://facebook.com', 'http://google.com', 'http://google.fr', 'http://google.es', 'http://google.co.uk', 'http://internet.org', 'http://gmail.com', 'http://stackoverflow.com', 'http://github.com', 'http://heroku.com', 'http://really-cool-available-domain.com', 'http://djangoproject.com', 'http://rubyonrails.org', 'http://basecamp.com', 'http://trello.com', 'http://yiiframework.com', 'http://shopify.com', 'http://another-really-interesting-domain.co', 'http://airbnb.com', 'http://instagram.com', 'http://snapchat.com', 'http://youtube.com', 'http://baidu.com', 'http://yahoo.com', 'http://live.com', 'http://linkedin.com', 'http://yandex.ru', 'http://netflix.com', 'http://wordpress.com', 'http://bing.com' ] NUM_WORKERS = 4 start_time = time.time() with multiprocessing.Pool(processes=NUM_WORKERS) as pool: results = pool.map_async(check_website, WEBSITE_LIST) results.wait() end_time = time.time() print("Time for MultiProcessingSquirrel: %ssecs" % (end_time - start_time)) # WARNING:root:Timeout expired for website http://really-cool-available-domain.com # WARNING:root:Timeout expired for website http://another-really-interesting-domain.co # WARNING:root:Website http://bing.com returned status_code=405 # Time for MultiProcessingSquirrel: 2.8224599361419678secs # Checking Uptime for Websites - Multi Threading """ Created on Fri Nov 16 09:36:09 2018 @author: DR.AYAZ """ import time import logging import requests from queue import Queue from threading import Thread import concurrent.futures class WebsiteDownException(Exception): pass def ping_website(address, timeout=20): """ Check if a website is down. 
A website is considered down if either the status_code >= 400 or if the timeout expires Throw a WebsiteDownException if any of the website down conditions are met """ try: response = requests.head(address, timeout=timeout) if response.status_code >= 400: logging.warning("Website %s returned status_code=%s" % (address, response.status_code)) raise WebsiteDownException() except requests.exceptions.RequestException: logging.warning("Timeout expired for website %s" % address) raise WebsiteDownException() def notify_owner(address): """ Send the owner of the address a notification that their website is down For now, we're just going to sleep for 0.5 seconds but this is where you would send an email, push notification or text-message """ logging.info("Notifying the owner of %s website" % address) time.sleep(0.5) def check_website(address): """ Utility function: check if a website is down, if so, notify the user """ try: ping_website(address) except WebsiteDownException: notify_owner(address) WEBSITE_LIST = [ 'http://envato.com', 'http://amazon.co.uk', 'http://amazon.com', 'http://facebook.com', 'http://google.com', 'http://google.fr', 'http://google.es', 'http://google.co.uk', 'http://internet.org', 'http://gmail.com', 'http://stackoverflow.com', 'http://github.com', 'http://heroku.com', 'http://really-cool-available-domain.com', 'http://djangoproject.com', 'http://rubyonrails.org', 'http://basecamp.com', 'http://trello.com', 'http://yiiframework.com', 'http://shopify.com', 'http://another-really-interesting-domain.co', 'http://airbnb.com', 'http://instagram.com', 'http://snapchat.com', 'http://youtube.com', 'http://baidu.com', 'http://yahoo.com', 'http://live.com', 'http://linkedin.com', 'http://yandex.ru', 'http://netflix.com', 'http://wordpress.com', 'http://bing.com' ] NUM_WORKERS = 4 task_queue = Queue() def worker(): # Constantly check the queue for addresses while True: address = task_queue.get() check_website(address) # Mark the processed task as done task_queue.task_done() start_time = time.time() # Create the worker threads threads = [Thread(target=worker) for _ in range(NUM_WORKERS)] # Add the websites to the task queue [task_queue.put(item) for item in WEBSITE_LIST] # Start all the workers [thread.start() for thread in threads] # Wait for all the tasks in the queue to be processed task_queue.join() end_time = time.time() print("Time for ThreadedSquirrel: %ssecs" % (end_time - start_time)) # WARNING:root:Timeout expired for website http://really-cool-available-domain.com # WARNING:root:Timeout expired for website http://another-really-interesting-domain.co # WARNING:root:Website http://bing.com returned status_code=405 # Time for ThreadedSquirrel: 3.110753059387207secs NUM_WORKERS = 4 start_time = time.time() with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_WORKERS) as executor: futures = {executor.submit(check_website, address) for address in WEBSITE_LIST} concurrent.futures.wait(futures) end_time = time.time() print("Time for FutureSquirrel: %ssecs" % (end_time - start_time)) # WARNING:root:Timeout expired for website http://really-cool-available-domain.com # WARNING:root:Timeout expired for website http://another-really-interesting-domain.co # WARNING:root:Website http://bing.com returned status_code=405 # Time for FutureSquirrel: 1.812899112701416secs ###Output _____no_output_____ ###Markdown **Python MPI** ###Code !pip install mpi4py ###Output Collecting mpi4py Downloading mpi4py-3.1.3.tar.gz (2.5 MB)  |████████████████████████████████| 2.5 MB 4.9 MB/s [?25h Installing 
build dependencies ... [?25l[?25hdone Getting requirements to build wheel ... [?25l[?25hdone Preparing wheel metadata ... [?25l[?25hdone Building wheels for collected packages: mpi4py Building wheel for mpi4py (PEP 517) ... [?25l[?25hdone Created wheel for mpi4py: filename=mpi4py-3.1.3-cp37-cp37m-linux_x86_64.whl size=2185319 sha256=f414154ad8fffb7b45cbb3b8ae73fee944de886686efeb2f1a38e649c73ffc2b Stored in directory: /root/.cache/pip/wheels/7a/07/14/6a0c63fa2c6e473c6edc40985b7d89f05c61ff25ee7f0ad9ac Successfully built mpi4py Installing collected packages: mpi4py Successfully installed mpi4py-3.1.3 ###Markdown ![MPIFunctions.png](data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABFYAAAFeCAIAAAAgyQz7AAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAElUSURBVHhe7b3/r2Rnda/pP2psWZHmh5HQTYIShHI1keaCZDOygjRSrhRMLByi64xkq5lgIriaWCMrYyWgey1zzQ0OCgIlGW7u2MYYkybpoU1ig8B8s4kNuG13u7+ZWbXrU3VWrf2+a9eps6v2l3oePbJWVdepevuz93r3Xn3ap2/5JQAAAAAAwNHACAQAAAAAAEcEIxAAAAAAABwRjEAAAAAAAHBEMAIBAAAAAMARwQgEAAAAAABHBCMQAAAAAAAcEYxAAAAAAABwRDACAQAAAADAEcEIBAAAAAAARwQjEAAAAAAAHBGMQAAAAAAAcEQwAgEAAAAAwBHBCAQAAAAAAEcEIxAAAAAAABwRjEAAAAAAAHBEMAIBAAAAAMARwQgEAAAAAABHBCMQAAAAAAAcEYxAAAAAAABwRDACAQAAAADAEcEIBAAAAAAARwQjEAAAAAAAHBGMQAAAAAAAcEQwAgEAAAAAwBHBCAQAAAAAAEcEIxAAAAAAABwRjEAAAAAAAHBEMAIBAAAAAMARwQgEAAAAAABHBCMQAAAAAAAcEYxAAAAAAABwRDACAQAAAADAEcEIBAAwHy6/feOtKzf0ACpYSm+SEgDAEcMIBAAwea7ffOfnb1z7q6d+8seP/ssHPvaNd33oSUx83/1fv++R5z/95Zd+duna1Ws3FSIAABwNjEAAANPmzSs3nvynV9/70WfCjT52+u57nv7L//fHl966rigBAOA4YAQCAJgw12++83ff+Gm4s8dT+ekvv3TlKn8vDgDgiGAEAgCYMD9/4xrf/zmj777n6R/+6xUFCgAARwAjEADAhPnsV34YbuhxBz/5+Itv8z8FAQAcDYxAAABT5c0rN+575PlwN487+MFPnH/zCv9HEADAscAIBAAwVS6/feN993893M3jDr77nqdv3HxHsQIAwNxhBAIAmDDhVh53VoECAMARwAgEADBhwn087qwCBQCAI4ARCABgwoT7eNxZBQoAAEcAIxAAwIQJ9/G4swoUAACOAEYgAIAJE+7jt/L3nvzNe7/6Gx/5anweN33//c/d98jzf/Gl77/0ymXFDQAAs4ARCABgwoS79txf+/2nfusPn/nU577zV0/9+Nvff+O116/iWgXqePm1t5+88KqNQB/42Dfsv3oWAACmDyMQAMCECUNO4nvufeYPHr740itvhVt/XKpAK9gIdO/D37p6nX8+FQBgDjACAQBMmDDn1PzVDz/1R3/+fLjpR68CrfPo3/7gz/76e3oAAABThhEIAGDChFGn5r/9D197+bUr4aYfvQo05Xc/9Y8Xv3dJDwAAYLIwAgEATJgw6hT9Nx9+6qG//G6448egAk358rOvfPzRF/QAAAAmCyMQAMCECdNO0ffc+8yXvvZyuOPHoAJNefGHb37wE+f1AAAAJgsjEADAhAnTTtFfv+dpfgpCpwq0C8tTFQAATBZGIACACeNHncRwu49tFWgXFqYqAACYLIxAAAATxs85ieF2H9sq0C4sTFUAADBZGIEAACaMn3MSw+0+tlWgXViYqgAAYLIwAgEATBg/5ySG231sq0C7sDBVAQDAZGEEAgCYMH7OSQy3+9hWgXZhYaoCAIDJwggEADBh/JyTGG73sa0C7cLCVAUAAJOFEQgAYML4OScx3O5jWwXahYWpCgAAJgsjEADAhPFzTmK43ce2CrQLC1MVAABMFkYgAIAJ4+ecxHC7j20VaBcWpioAAJgsjEAAABPGzzmJ4XYf2yrQLixMVQAAMFkYgQAAJoyfcxLD7T62VaBdWJiqAABgsjACAQBMGD/nJIbbfWyrQLuwMFUBAMBkYQQCAJgwfs5JDLf72FaBdmFhqgIAgMnCCAQAMGH8nJMYbvexrQLtwsJUBQAAk4URCABgwvg5JzHc7mNbBdqFhakKAAAmCyMQAMCE8XNOYrjdPy4fv+uWW2654/HW85sq0C4sTFUAADBZGIEAACaMn3MSw+3+hucfvNVGBOPOJ9ZPPnbn8qnbHzjvXnl412tb4lZ4ChmBAABgE0YgAIAJ4+ecxHC7v2EzZtx62+1u4HnijvjMQC7Xdu6i1c+ds/Wo3ocKtAsLUxUAAEwWRiAAgAnj55zEcLu/ocaMBxdjz3LAWHzb5K4HFiPHagRy341ZDSEXH7hNzzTfYAkPF0OUuO3B5zZfv2D5ZOFtN3UjkKuzNy8sZv1doMU73P7AucXDBfqe0ubCfuWTLyjaMhamKgAAmCyMQAAAE8bPOYknQ0Xb1Wix+MtvzUSxKO58ovmuSzMCNS9oRouTuvnVux5bvUl4+Nr5i81k4saPZqxavGA5h1TeVl++tnl+OQItvwu0eE3rzTsWszEC6ffYvKZZxmphv3zhk7/SOQAxAgEAzAJGIACACePnnEQNA0VXY8ZqKlh8j2U1V6yHhA0WM8nqSc0t4aH/Rs3yydWk0RT1t10uae1yaFmxekHxzVe1vaD0cFG7gerkyUXRjEBf/J3u7wExAgEAzAJGIACACePnnMRmcqi4HgxO5o3FSBBGII0TG+qvkK0mk5OHzU9TWA88J+PHkjiE6N1K+qFlZeHNF89XFxPWED/dLax7AGIEAgCYBYxAAAATxs85icvhoezJYLD6v2Ka/0nmZARaDgmbP43tuXN3LX5p+Q2Zxd+a23i4nlKaN1lMGotC/9/OytLbRv3QsrL05tli8hFovTAF2oWFqQoAACYLIxAAwITxc06iHyGibjBYDxWruhmB7DXNwLBi+eT6b6M135AJD5fjjXHb7VYs3nDjHfQRpbfdtDQC
Fd48X0w6AoWF/c4XFWwNC1MVAABMFkYgAIAJ4+ecxI0RYgCb7y/5n8AWviM0mCcL+2Xz8xD4iXAAAMcAIxAAwITxc07i5n3/EHZ+w2coNxbGj0MAADgKGIEAACaMn3MS430/tlSgXViYqgAAYLIwAgEATBg/5ySG231sq0C7sDBVAQDAZGEEAgCYMH7OSQy3+9hWgXZhYaoCAIDJwggEADBh/JyTGG73sa0C7cLCVAUAAJOFEQgAYML4OScx3O6PXf2IgoP+1AQF2oWFqQoAACYLIxAAwITxc05iuN3v0/U/sBOeX7n4l0xP9yOwh/mp2Qq0CwtTFQAATBZGIACACePnnMRwu9+n/Y9AzT9sqn9E6HAq0C4sTFUAADBZGIEAACaMn3MSw+1+n65HoPMP3nrL7Q+cW/0zO4sZpvl+zgqNSYuXiVvPXWzeZDHz3HHn4gtvPfeE+5K7HluOQ0tWc9Rz525fPqEvL7zhLirQLixMVQAAMFkYgQAAJoyfcxLD7X6fboxAGlSaKUX/J8/Gd4Ga1/hZqKmXc44NPM3z/rtA5y/qC9efcvJNJ5uv7nqs/Ia7qEC7sDBVAQDAZGEEAgCYMH7OSQy3+326OQLp+zAng8rmCNQ872lev5h53Ddw3AjkvwvUvOHi3U6Gpdobrn71NCrQLixMVQAAMFkYgQAAJoyfcxLD7X6fnn4EWj7vrI5AJwPP6guLI1DrDXdRgXZhYaoCAIDJwggEADBh/JyTGG73+3SbEWg9tDSvaf2og+4RaPn//9gbrovmV/UX4VpvuIsKtAsLUxUAAEwWRiAAgAnj55zEcLvfp10j0LJuP2xY/v9C1RFII5Nx2+1WrCaf5VMbn7Vi939KSIF2YWGqAgCAycIIBAAwYfyckxhu97GtAu3CwlQFAACThREIAGDC+DknMdzuY1sF2oWFqQoAACYLIxAAwITxc05iuN3Htgq0CwtTFQAATBZGIACACePnnMRwu49tFWgXFqYqAACYLIxAAAATxs85ieF2H9sq0C4sTFUAADBZGIEAAKbKtes33/vRZ/yoUzPc7mNbZdqFhakKAAAmCyMQAMBUeePy9bv/9IIfdWqG230M/vyNa++8o1RzLExVAAAwWRiBAACmyrXrNx/6/Hf9qFMz3PFj8NJb1xmBAACOB0YgAIAJ89Irl/2oUzPc8WPQhkkF2oWFqQoAACYLIxAAwIS5/PaN//O/fsdPO0XDHT96t/8WkGFhqgIAgMnCCAQAMG3eunLjTz77gh942oabflx7qvnHsDBVAQDAZGEEAgCYPG9euXHxe5fue+T5993/dT/5rA33/fizS9def+v6tes3TzX/GBamKgAAmCyMQAAAM8EGoctv39ADB3ftRU47/CwhTACAGcAIBAAwc4a9a7/89vV//flbejB9GIEAAGYAIxAAwMxhBOoRRiAAgBnACAQAMHMYgXqEEQgAYAYwAgEAzJx33/P01a3/3ZveYQQCAICxwQgEADBz7n7owjdf/IUeHJw5jUAvv/b2++9/Tg8AAGCyMAIBAMychz7/3c/9/Y/04ODMaQR68sKr9z3yvB4AAMBkYQQCAJg533zxF3c/dEEPDs6cRqCPP/rCl599RQ8AAGCyMAIBAMyfAe/dZzMCDTtJAgBAjzACAQDMn6vXb37gY98YZAqaxwhk888HP3H+tUtX9RgAAKYMIxAAwFHwxuXrH3/0hXsf/tazz//skLfykx6BLDQbfj75+It3P3SB+QcAYDYwAgEAHBE2/9gU9Nv3PfuuDz2Jnb73o8/Y8POFp36i+AAAYBYwAgEAwB75zBcu3Prv/m89AAAAGAGMQAAAsEcYgQAAYGwwAgEAwB5hBAIAgLHBCAQAAHuEEQgAAMYGIxAAAOwRRiAAABgbjEAAALBHGIEAAGBsMAIBAMAeYQQCAICxwQgEAAB7hBEIAADGBiMQAADsEUYgAAAYG4xAAACwRxiBAABgbDACwUH5dx/5y//hf/kzRDwqbQQKzyDi7P3PX7ygaz/A+GAEgoNie6IqADgOfvzTN97z7x/TAwA4Dr7w3/7lf/+//rseAIwPRiA4KIxAAMcGIxDAEcIIBCOHEQgOCiMQwLHBCARwhDACwchhBIKDwggEcGwwAgEcIYxAMHIYgeCgMAIBHBuMQABHCCMQjBxGIDgojEAAxwYjEMARwggEI4cRCA4KIxDAscEIBHCEMALByGEEgoPCCARwbDACARwhjEAwchiB4KAwAgEcG4xAAEcIIxCMHEagOfPWlRuX376hB1DBInrzCinBTKDrt4Guh5lB428DjQ8eRqAZcv3mOz9/49pnv/LDBz7z7ffd//V3fehJTLSI7nvk+U9/+aVXX796+SqbI0wSuv5U0vUwD2j8U0njg4cRaG68eeXGk//06ns/+kzofNzG//KVH11667qiBJgIdP1ZpOthotD4Z5HGB0agWXH95jt/942fhj7HU/npL790hT8cgulA159duh4mB41/dmn8I4cRaFb87NK1d9/zdGhyPK0vvXJZgQKMHrq+F+l6mBY0fi/S+McMI9B8uHHznf/0Nz8I7Y07+MeP/gv/XylMArq+L+l6mBA0fl/S+McMI9B8ePPKjfseeT60N+7gBz9x/s0r/BVhmAB0fV/S9TAhaPy+pPGPGUag+XDt+k3+t8i+VKYA44au71FlCjB6aPweVaZwfDACzYrQ2LizChRg9IRTF3dWgQJMgXD24s4qUDg+GIFmRWhs3FkFCjB6wqmLO6tAAaZAOHtxZxUoHB+MQLMiNDburAIFGD3h1MWdVaAAUyCcvbizChSOD0agWREaG3dWgQKMnnDq4s4qUIApEM5e3FkFCscHI9CsCI3d7e89+Zv3fvU3PvLV+Dxu+v77n7vvkef/4kvf598QgLERztWtpPG3k8aH0RLO1W7p+u2k648HRqBZETo58dd+/6nf+sNnPvW57/zVUz/+9vffeO31q+hVoCtefu3tJy+8anviBz72DfuvngUYAaG1c2n8RAXqoPFhtITWTqTrcxXoCrr+eGAEmhVh46v5nnuf+YOHL770ylthI8C1CrSE7Yn3Pvytq9dv6jHAoITuTqTxcxVoBRofRkXo7pp0facKtARdP28YgWZF2PuK/uqHn/qjP38+bAEYVKAVHv3bH/zZX39PDwAGJTR4TRq/UwVah8aH8RAavChdv40KtAJdP2MYgWZF2P6K/tv/8LWXX7sStgAMKtA6v/upf7z4vUt6ADAcocFr0vidKtAUGh9GQmjwonT9NirQOnT9XGEEmhVh+2v7bz781EN/+d3Q/9hWgdb58rOvfPzRF/QAYDhCjxel8bdRgabQ+DASQo+3peu3VIHWoevnCiPQrAg7YNv33PvMl772cuh/bKtA67z4wzc/+InzegAwHKHHi9L426hAU2h8GAmhx9vS9VuqQOvQ9XOFEWhWhB2w7a/f8zT/W+Q2KtAUy1MVwHD4Bq9J42+jAu3C8lQFMBy+wYvS9VuqQFMsT1UwIxiBZoXf/mqG5seiCjTFwlQFMBy+uxPDGY5tFWgXFqYqgOH
w3V0znOFYVIGmWJiqYEYwAs0Kv/fVDM2PRRVoioWpCmA4fHcnhjMc2yrQLixMVQDD4bu7ZjjDsagCTbEwVcGMYASaFX7vqxmaH4sq0BQLUxXAcPjuTgxnOLZVoF1YmKoAhsN3d81whmNRBZpiYaqCGcEINCv83lczND8WVaApFqYqgOHw3Z0YznBsq0C7sDBVAQyH7+6a4QzHogo0xcJUBTOCEWhW+L2vZmh+LKpAUyxMVQDD4bs7MZzh2FaBdmFhqgIYDt/dNcMZjkUVaIqFqQpmBCPQrPB7X83Q/FhUgaZYmKoAhsN3d2I4w7GtAu3CwlQFMBy+u2uGMxyLKtAUC1MVzAhGoFnh976aofmxqAJNsTBVAQyH7+7EcIZjWwXahYWpCmA4fHfXDGc4FlWgKRamKpgRjECzwu99NUPzY1EFmmJhqgIYDt/dieEMx7YKtAsLUxXAcPjurhnOcCyqQFMsTFUwIxiBZoXf+2qG5seiCjTFwlQFMBy+uxPDGY5tFWgXFqYqgOHw3V0znOFYVIGmWJiqYEYwAs0Kv/fVDM2PRRVoioWpCmA4fHcnhjMc2yrQLixMVQDD4bu7ZjjDsagCTbEwVcGMYASaFX7vqxmaH4sq0BQLUxXAcPjuTgxnOLZVoF1YmKoAhsN3d81whmNRBZpiYaqCGcEINCv83lczND8WVaApFqYqgOHw3Z0YznBsq0C7sDBVAQyH7+6a4QzHogo0xcJUBTOCEWhW+L2vZmh+LKpAUyxMVQDD4bs7MZzh2FaBdmFhqgIYDt/dNcMZjkUVaIqFqQpmBCPQrPB7X83Q/FhUgaZYmKoAhsN3d2I4w7GtAu3CwlQFMBy+u2uGMxyLKtAUC1MVzAhGoFnh976aofln5eN33XLLLXc83nr+9CrQFAtTFcBw+O5ODGd4b/bXdL2565IUaBcWpiqA4fDdXTOc4Yd23Yln2SX2v8Mo0BQLUxXMCEagWeH3vpqh+ffnY3faxrXizifCr+7F/vZKBZpiYaoCGA7f3YnhDO/w/IO3Ljt3SdK/XU232Adue/C51vNm8ksF978kBdqFhakKYDh8d9cMZ3iPnlzfkxZed+IpL80bHdrfZb2mAk2xMFXBjGAEmhV+76sZmn8/XnzgNtu1bn/gvJ6xHW2vW5jsb69UoCkWpiqA4fDdnRjO8A6beePWcxetfu7c7dZWy3oHkzkn+aWC+1+SAu3CwlQFMBy+u2uGM7wnN6/vj99V7cReRqD9q0BTLExVMCMYgWaF3/tqhubfi81+V94W3R/l6gWLZ25/4NziSxas/3C3/crXn7jDttE712++eCiW2yUjEBwfvrsTwxneoZs3NuqmxZb4ZxZNV2jk5a2S2GzM0i+139y79yUxAsGU8N1d05/evem7z5t04rqwJ5svX7J+k+UfajTPPBE7VF/bdK7mIleX3u20KtAUC1MVzAhGoFnh976aofn3YbOdLf+I6GRKWd2RbOyD6yeX29nJFxZfqXe767HlB52/qD8lKu6zZ1OBpliYqgCGw3d3YjjDO2yabnlLsbw7WbdqfHLddMVG3v67QMU3dy8+wJIUaBcWpiqA4fDdXTOc4b14ivZsX5qbl+lr1/XJhdtmm8X1faNDV7/qWnhxJ7D4rOK7Lb/qNCrQFAtTFcwIRqBZ4fe+mqH596G/21i4uQ961rvYcuvseOV649MHue8Cua/abRMMKtAUC1MVwHD47k4MZ3iHTVeuWTbdRl+v23bddMVG3rybWdRLmm/2+l8qvvnG6/ezJK8C7cLCVAUwHL67a4YzvBebpovX2Y5O3Cw8qzZf/clm40aHttt58UzzWaV3W7/J9irQFAtTFcwIRqBZ4fe+mqH59+J6zwoPw/NLi7cpxVdujkAn+2bHV+2iAk2xMFUBDIfv7sRwhnfou3Jlv/OG6X+p/ObuxQdYkgLtwsJUBTAcvrtrhjO8H5tWOl0ntgv3tSeXcv9MewRa/f23B9a/Wnq3HVSgKRamKpgRjECzwu99NUPz78flX7hfbYjrfarZGU/+b5+lxduU4isrI1Cz+TZf1dOGaCrQFAtTFcBw+O5ODGd4h74r17r+Ornj2ezZ2Mil+5u1G79UfHP34gMsSYF2YWGqAhgO3901wxnek8vr+6qDrMvsSp134mZLhiv7yRV80ZuLt013hoblO5TebQcVaIqFqQpmBCPQrPB7X83Q/Puz2cXWbGxnK5pdsnKbUnhl+Itwy+3PuO12Kzb22eULzqACTbEwVQEMh+/uxHCGd+i70nlyC7LZp4u6q5ELjbn5S4U39+5/SQq0CwtTFcBw+O6u6U/vXl1OQUt0cc86sdR9DfqTjvXdgu9WI37t6qKvh5V3O60KNMXCVAUzghFoVvi9r2ZofiyqQFMsTFUAw+G7OzGc4dhWgXZhYaoCGA7f3TXDGY5FFWiKhakKZgQj0Kzwe1/N0PxYVIGmWJiqAIbDd3diOMOxrQLtwsJUBTAcvrtrhjMciyrQFAtTFcwIRqBZ4fe+mqH5sagCTbEwVQEMh+/uxHCGY1sF2oWFqQpgOHx31wxnOBZVoCkWpiqYEYxAs8LvfTVD82NRBZpiYaoCGA7f3YnhDMe2CrQLC1MVwHD47q4ZznAsqkBTLExVMCMYgWaF3/tqhubHogo0xcJUBTAcvrsTwxmObRVoFxamKoDh8N1dM5zhWFSBpliYqmBGMALNCr/31QzNj0UVaIqFqQpgOHx3J4YzHNsq0C4sTFUAw+G7u2Y4w7GoAk2xMFXBjGAEmhV+76sZmn/s+h+IeUAVaIqFqQpgOHx3J4YzfALu3Pu7fqEC7cLCVAUwHL67a4YzfAIevOtNBZpiYaqCGcEINCv83lczNH8/rn5a/5L2v91xKhf/REDhX4Y+qAo0xcJUBTAcvrsTwxne9uQf8lp337Cepvd72TQUaBcWpiqA4fDdXTOc4W3pelOBpliYqmBGMALNCr/31QzN34+Vf7VwNzf2tYFUoCkWpiqA4fDdnRjO8E2X/9Dh6h8WfPyuvnr5YPayaSjQLixMVQDD4bu7ZjjDN6XrpQJNsTBVwYxgBJoVfu+rGZq/H9sjkP8jmUXd7LOLl93+wLnFLy2484nli9f/qvSt555w/+Z08+XxfYQ+q/yGJ/9w9W5/JmQq0BQLUxXAcPjuTgxn+Ia1P8Iod9wtd9y5ev62Bx87ad7tXrD9zuBe6bYIe5Mn7lg+MBY3QP4fqu9h0/idLyrYGhamKoDh8N1dszm3K9L1dP3Rwwg0K/zeV7Pp/L5tb6bVLW+5fy13t+bJk1fafnTXY7XvbruPWO6M6yfDGzbF4n20kp1UoCkWpiqA4fDdnRjOcO9JQ/nnt+o43Uwserbc460X7LAzhC3i/MW4OfS0aXTdBQkLUxXAcPjurrloh4onHeGf36plZtX1dqugQFMsTFUwIxiBZoXf+2ouNoveXe4vKzb3r2V9suUt96b1C5pdcmNiKe5rJ1ukPbl+n9IbLgvVyzc5vQo0xcJUBTAcvrsTwxnuPblRiE9u13H+xZ0vcE82df2rVkVri3B/Hrx6q742jc
4/DDYsTFUAw+G7u2bTL2Xp+mVh0PVHCyPQrPB7X81mN+lbv78sXe8yqqtbXmunO/O+tvhCfYN7Y0mnUYGmWJiqAIbDd3diOMM3bHonNMtob4ZOHrq36mXT+OSv2KNbfuWTLyjZChamKoDh8N1dszmxKzYnvxph5RF2/fJWga4/ThiBZoXf+2o2bd+3fn+Jzyy3mOqW12xY6x1tscdt7H3r3cptW/nO+9y5u5rtr/lDI/2V31OrQFMsTFUAw+G7OzGc4Zsum3TVdNZW1jhbd9yuL+jYGVxHb2wRd6z2h83n9WTxHSoL8B+x2DR++csv/o497vozYQtTFcBw+O6uuTjJq9L1ulWg648WRqBZ4fe+mostoHf9/rKy2Z4W3HHOfjXbKNev9L9klLYzsXym8obr75ivNsfTq0BTLExVAMPhuzsxnOEtl7cmS9Q4W3dcx63Gxgu23xncl29sEc2LF9x2uxVaWPNiI3zh6TeN7r8UY2GqAhgO3901Fyd5Jl2/hK4/UhiBZoXf+2outgDsUoGmWJiqAIbDd3diOMOxrQLtwsJUBTAcvrtrhjMciyrQFAtTFcwIRqBZ4fe+mqH5sagCTbEwVQEMh+/uxHCGY1sF2oWFqQpgOHx31wxnOBZVoCkWpiqYEYxAs8LvfTVD82NRBZpiYaoCGA7f3YnhDMe2CrQLC1MVwHD47q4ZznAsqkBTLExVMCMYgWaF3/tqhubHogo0xcJUBTAcvrsTwxmObRVoFxamKoDh8N1dM5zhWFSBpliYqmBGMALNCr/31QzNj0UVaIqFqQpgOHx3J4YzHNsq0C4sTFUAw+G7u2Y4w7GoAk2xMFXBjGAEmg83br7z7nue9ttf0dD8WFSZpliYqgAGYsuuN8MZjm2VaRcWpiqAgeBy36PKNMXCVAUzghFoPrxx+frdf3rBb39FQ/Nj25+/ce2dd5RqgoWpCmAgtux6M5zkGNyy6w0LUxXAQHC570su98cMI9B8uHb95kOf/67f/oqG/se2l966zp4Ik2DLrjfDSY7BLbvesDBVAQwEl/u+5HJ/zDACzYoXf/Sm3/6Khv7Htlev3VSgKRamKoDh2KbrzXCSY3DLrjcsTFUAw8Hlvhe53B8zjECz4s0rN/7ksy/4HbBt6H8Mvs4fBsOk2KbrzXCeo3f7rjcsTFUAw8Hl/uxyuT9yGIHmxltd22LYAtDLnRBMkc6uN8OpjmtP1fWGhakKYFC43J9FLvfACDRD3rxy4+L3Lt378Ld++75n/W64NOwCaP7s0jXbDd++dpM7IZgoedeb4ZzH3bresDBVAQwNl/vTyuUe1jACzZY3Li+aXA9W0MY1TnsbZBAmjI1i1xucq0V26HqDMGFscLk/FVzuYQkj0HExeBu/+osrr73+th5MHPZEmArDnqvXrt+0xteD6UPjwyTgct8jdP0sYQQ6LtgTe4Q9EabCsOcqIxDA4eFy3yN0/SxhBDou2BN7hD0RpsKw5yojEMDh4XLfI3T9LGEEOi7YE3uEPRGmwrDnKiMQwOHhct8jdP0sYQQ6Lt59z9NXr2/7LwDuA/ZEgMMzbOMzAgEcHi73PULXzxJGoOPi7ocufPPFX+jBEMxmT3z5tbfff/9zegAwboZt/DmNQDQ+TAUu931B188VRqDj4qHPf/dzf/8jPRiC2eyJT1549b5HntcDgHEzbOPPaQSi8WEqcLnvC7p+rjACHRfffPEXdz90QQ+GYDZ74scffeHLz76iBwDjZtjGn9MIROPDVOBy3xd0/VxhBDo6hm3meeyJg19aAE7LgI0/mxGIxodpweX+7ND1M4YR6Oi4ev3mBz72jaG2xRnsibYhfvAT51+7dFWPAabAgI0/jxGIxofJweX+jND184YR6Bh54/L1jz/6wr0Pf+vZ53924N6e7p5oodlu+MnHX7z7oQtsiDBFhmr8SY9AND5MGi73O0DXHwmMQMeLbYi2Lf72fc++60NPYqfv/egztht+4amfKD6AaULjn0oaH2YAXX8q6fojgREIDsr/+MEnfv33vqgHAHAE/MM//6s1vh4AwHHA5R5GDiMQHBT2RIBjgxEI4Ajhcg8jhxEIDgp7IsCxwQgEcIRwuYeRwwgEB4U9EeDYYAQCOEK43MPIYQSCg8KeCHBsMAIBHCFc7mHkMALBQWFPBDg2GIEAjhAu9zByGIHgoLAnAhwbjEAARwiXexg5jEBwUNgTAY4NRiCAI4TLPYwcRiA4KOyJAMcGIxDAEcLlHkbOnEeg//X+/2YdiOPz861ncGD/+/kfq21gNMxpB/uf/rcvhGem63/87P+nIwSjgcv9WOVyPzq53K+Z8whkR1oVjIafvPrW//wHf6MHMA4e+cK3TT2A0cAONkL+4Z//9d9/4ik9gNFAs4wQLvcjhMu9hxEIDgp74ghhTxwn7GAjhBFonNAsI4TL/Qjhcu9hBIKDwp44QtgTxwk72AhhBBonNMsI4XI/QrjcexiB4KCwJ44Q9sRxwg42QhiBxgnNMkK43I8QLvceRiA4KOyJI4Q9cZywg40QRqBxQrOMEC73I4TLvYcRCA4Ke+IIYU8cJ+xgI4QRaJzQLCOEy/0I4XLvYQSCg8KeOELYE8cJO9gIYQQaJzTLCOFyP0K43HsYgeCgsCeOEPbEccIONkIYgcYJzTJCuNyPEC73HkYgOCjsiSOEPXGcsIONEEagcUKzjBAu9yOEy72HEQgOCnviCGFPHCfsYCOEEWic0CwjhMv9COFy7+ltBHrryo3Lb9/QAyhh+bx55aARcVC24cDHhYOyDTTLOKFZRgjNMkI4KOPkwMeFg7INh2+WNWcdga7ffOfnb1z77Fd++MBnvv2++7/+rg89iTUtn/seef7TX37p1devXr66x+PNQTmVhzkuHJRTSbOMU5plhNIsI5SDMk4Pc1w4KKfyYM3S5kwjkM1tT/7Tq+/96DPh94Od/pev/OjSW9eVY69wUM7ino4LB+Us0izjlGYZoTTLCOWgjFN2sBG6v2YpsvsIZGPu333jp2H1uL028l7pe97loJzd3o8LB+Xs0izjlGYZoTTLCOWgjFN2sBG6j2apsfsI9LNL1959z9Nh6XgqX3rlstLsCQ5KL/Z7XDgovUizjFOaZYTSLCOUgzJO2cFGaO/NUmPHEejGzXf+09/8ICwaT+sfP/ovPf6vchyUvuzxuHBQ+pJmGac0ywilWUYoB2WcsoON0H6bJWHHEejNKzfue+T5sGg8rR/8xPk3r/T2tx45KH3Z43HhoPQlzTJOaZYRSrOMUA7KOGUHG6H9NkvCjiPQtes3+Z+9elGB9gEHpUeV6ZnhoPSoMu0DjkuPKtMzw0HpUWXaBxyXvlSgfcBB6VFlemY4KD2qTPfM7v8vUFgu7qbS7Inw5rizCrQPwjvjzirQnghvjjurQPsgvDPurALtifDmuJtKsyfCm+POKtA+CO+MO6tA9wwj0MAqzZ4Ib447q0D7ILwz7qwC7Ynw5rizCrQPwjvjzirQnghvjrupNHsivDnurALtg/DOuLMKdM8wAg2s0uyJ8Oa4swq0D8I7484q0J4Ib
447q0D7ILwz7qwC7Ynw5ribSrMnwpvjzirQPgjvjDurQPfMQUeg3/zIV3/z3q+GJzH4/vufu++R5//iS99/+bW3lfXWhLfaRg7KNh74oJgcl07PclCM8G7byEHZRnawEUqzjFOaZYRyuR+hZ9zBahxiBPq133/qt/7wmf/jP//Lf/37H134zuuvvX4V1ypNhx3gJy+8akf6Ax/7xqN/+wM9ux0h+UQOSq4CXXGYg2JyXBIV6IqzHBQjJJ/IQclVoCvYwcagAl1Bs4xBpemgWcagAl3B5X4MKtAVZ9zBaux9BHrPvc/8wcMXX3rlrfDbw6VKs8TV6zf/7K+/d+/D37JCT3URwq/JQelUgbbY30ExOS65CrTFDgfFCOHX5KB0qkBb7K9ZOCidKtAWNMuAKs0SNMuAKtAW+zsoJsclV4G22OGgJOx3BPrVDz/1R3/+fPiNoVdp1rF51463HnQR8i/KQdlGBVqh94Niclw6VaAVTnVQjJB/UQ7KNirQCuxgg6hAK9Asg6g069Asg6hAK3C5H0QFWuG0O1iN/Y5Av/WHzzDj5irNOjbp/u6n/vHFH76pxykh/6IclG1UoBV6Pygmx6VTBVrhVAfFCPkX5aBsowKtwA42iAq0As0yiEqzDs0yiAq0Apf7QVSgFU67g9XY5wj0e09+6nPfCb8rDCrNlC889ZNPPv6iHqTEQ9CWg7KdCrROnwfF5LhsoQKts/1BMeIhaMtB2U4FWocd7PAq0Do0y+FVmik0y+FVoHW43B9eBVrnVDtYjT2OQO+595kvfe3l8LvCoNJMsUn3g584rwcp4RC05aBsqQKt0+NBMTku26hA62x/UIxwCNpyULZUgdZhBzu8CrQOzXJ4lWYKzXJ4FWgdLveHV4HWOdUOVmOPI9BvfOSr3/7+G+F3hUGlmXL1+s133/O0HqSEQ9CWg7KlCrROjwfF5LhsowKts/1BMcIhaMtB2VIFWocd7PAq0Do0y+FVmik0y+FVoHW43B9eBVrnVDtYjf3+v0Dht4RtlWYXFqaqFB9+zbAALKpAUyxMVSk+/MSwAGyrQFMsSVVd+PBrhgVgUQWaYmGqSvHh1wyfjkUVaIqFqaoLn3/NsABsqzS7sDBVpfjwa4YFYFEFmmJhqkrx4SeGBWBbBZpiSaraFUaggVWaXViYqlJ8+DXDArCoAk2xMFWl+PATwwKwrQJNsSRVdeHDrxkWgEUVaIqFqSrFh18zfDoWVaApFqaqLnz+NcMCsK3S7MLCVJXiw68ZFoBFFWiKhakqxYefGBaAbRVoiiWpalcYgQZWaXZhYapK8eHXDAvAogo0xcJUleLDTwwLwLYKNMWSVNWFD79mWAAWVaApFqaqFB9+zfDpWFSBpliYqrrw+dcMC8C2SrMLC1NVig+/ZlgAFlWgKRamqhQffmJYALZVoCmWpKpdYQQaWKXZhYWpKsWHXzMsAIsq0BQLU1WKDz8xLADbKtAUS1JVFz78mmEBWFSBpliYqlJ8+DXDp2NRBZpiYarqwudfMywA2yrNLixMVSk+/JphAVhUgaZYmKpSfPiJYQHYVoGmWJKqdoURaGCVZhcWpqoUH37NsAAsqkBTLExVKT78xLAAbKtAUyxJVV348GuGBWBRBZpiYapK8eHXDJ+ORRVoioWpqguff82wAGyrNLuwMFWl+PBrhgVgUQWaYmGqSvHhJ4YFYFsFmmJJqtoVRqCBVZpdWJiqUnz4NcMCsKgCTbEwVaX48BPDArCtAk2xJFV14cOvGRaARRVoioWpKsWHXzN8OhZVoCkWpqoufP41wwKwrdLswsJUleLDrxkWgEUVaIqFqSrFh58YFoBtFWiKJalqVxiBBlZpdmFhqkrx4dcMC8CiCjTFwlSV4sNPDAvAtgo0xZJU1YUPv2ZYABZVoCkWpqoUH37N8OlYVIGmWJiquvD51wwLwLZKswsLU1WKD79mWAAWVaApFqaqFB9+YlgAtlWgKZakql1hBBpYpdmFhakqxYdfMywAiyrQFAtTVYoPPzEsANsq0BRLUlUXPvyaYQFYVIGmWJiqUnz4NcOnY1EFmmJhqurC518zLADbKs0uLExVKT78mmEBWFSBpliYqlJ8+IlhAdhWgaZYkqp2hRFoYJVmFxamqhQffs2wACyqQFMsTFUpPvzEsABsq0BTLElVXfjwa4YFYFEFmmJhqkrx4dcMn45FFWiKhamqC59/zbAAbKs0u7AwVaX48GuGBWBRBZpiYapK8eEnhgVgWwWaYkmq2hVGoIFVml1YmKpSfPg1wwKwqAJNsTBVpfjwE8MCsK0CTbEkVXXhw68ZFoBFFWiKhakqxYdfM3w6FlWgKRamqi58/jXDArCt0uzCwlSV4sOvGRaARRVoioWpKsWHnxgWgG0VaIolqWpXGIEGVml2YWGqSvHh1wwLwKIKNMXCVJXiw08MC8C2CjTFklTVhQ+/ZlgAFlWgKRamqhQffs3w6VhUgaZYmKq68PnXDAvAtkqzCwtTVYoPv2ZYABZVoCkWpqoUH35iWAC2VaAplqSqXWEEGlil2YWFqSrFh18zLACLKtAUC1NVig8/MSwA2yrQFEtSVRc+/JphAVhUgaZYmKpSfPg1w6djUQWaYmGq6sLnXzMsANsqzS4sTFUpPvyaYQFYVIGmWJiqUnz4iWEB2FaBpliSqnaFEWhglWYXFqaqFB9+zbAALKpAUyxMVSk+/MSwAGyrQFMsSVVd+PBrhgVgUQWaYmGqSvHh1wyfjkUVaIqFqaoLn3/NsABsqzS7sDBVpfjwa4YFYFEFmmJhqkrx4SeGBWBbBZpiSaraFUaggVWaXViYqlJ8+DXDArCoAk2xMFWl+PATwwKwrQJNsSRVdeHDrxkWgEUVaIqFqSrFh18zfDoWVaApFqaqLnz+NcMCsK3S7MLCVJXiw68ZFoBFFWiKhakqxYefGBaAbRVoiiWpalcYgQZWaXZhYapK8eHXDAvAogo0xcJUleLDTwwLwLYKNMWSVNWFD79mWAAWVaApFqaqFB9+zfDpWFSBpliYqrrw+dcMC8C2SrMLC1NVig+/ZlgAFlWgKRamqhQffmJYALZVoCmWpKpdYQQq+fhdt9xyyx2Pt57v9PRfqDS7sDBVpfjwa4YFYFEFmmJhqkrx4SeGBWBbBZpiSarqwodfMywAiyrQFAtTVYoPv2b4dCyqQFMsTFVd+PxrhgVgW6XZhYWpKsWHXzMsAIsq0BQLU1WKDz8xLADbKtAUS1LVroxiBHrsThscGm578LnWrw7gaSaZxeLXy2YEmosKNMXCVJXiw08MCziMJ61n3PlE+NWxqUBTLElVXfjwa4YFYFEFmmJhqkrx4dcMn45FFWiKhamqC59/zbAAbKs0u7AwVaX48GuGBWBRBZpiYapK8eEnhgVgWwWaYkmq2pXBR6CLD9xmN1+3P3C+efj4Xbeeu7j5grG7MQKdXqXZhYWpKsWHXzMsAIsq0BQLU1WKDz8xLGD/brZecybv8p3PA6pAUyxJVV348GuGBWBRBZpiYapK8eHXDJ+ORRVoioWpqguff82wAGyrNLuw
MFWl+PBrhgVgUQWaYmGqSvHhJ4YFYFsFmmJJqtqVoUeg8w/eessthbGn+XbKEv1q88o77lw9f9uDj527fVlu+wL/LZpF3dz8Lb7q9gfOrb5q+Qfh7pXPbbzJE3csHxiLsWd5EykWr48fIdwK42cpzS4sTFUpPvyai5Vglwo0xcJUleLDTwwL2LvN+akzc8PFSb7sI984S07dSr2qQFMsSVVd+PBrhgVgUQWaYmGqSvHh1wyfjkUVaIqFqaoLn3/NsABsqzS7sDBVpfjwa4YFYFEFmmJhqkrx4SeGBWBbBZpiSaralYFHoOWAoXuptc0ws7zZOnlB8+Ty+y0aS5p7rOZv8qzvwNIXVO/b/FdtvvLkS2zaueux8xfbf+Ft47tA6+e3+i0sPktpdmFhqkrx4ddcrBO7VKApFqaqFB9+YljAvj05291g35zSy4d3PbZ8ZfFMdud/UxdbafWC/lSgKZakqi58+DXDArCoAk2xMFWl+PBrhk/Hogo0xcJU1YXPv2ZYALZVml1YmKpSfPg1wwKwqAJNsTBVpfjwE8MCsK0CTbEkVe3KGEcgd3Pm7sDcrZi/Azt5cecL6vdt8atWRTM+re4FF7rvAq3eqjgCbf9bUJpdWJiqUnz4NZvfCHaoQFMsTFUpPvzEsIB9u3GKmuuTvznJdZbWmvHkxcsvbLWSr/tTgaZYkqq68OHXDAvAogo0xcJUleLDrxk+HYsq0BQLU1UXPv+aYQHYVml2YWGqSvHh1wwLwKIKNMXCVJXiw08MC8C2CjTFklS1K0P/RbjmXircKm0/P2y8uPMFnfdt6xesijACnTx0b8UINEsVaIqFqSrFh58YFrB33Tm8+ZAR6MSwACyqQFMsTFUpPvya4dOxqAJNsTBVdeHzrxkWgG2VZhcWpqoUH37NsAAsqkBTLExVKT78xLAAbKtAUyxJVbsykh+H4P7WzZ1P+BusU0w4p3iB+x/Bi1+1Kpqv1fs8duddd6xGoM3n9WTxHSoLOHmB0uzCwlSV4sOvufj0fdv87vS7Dr+0D13afalAUyxMVSk+/MSwgP27+eMQTjLcGIG6zuTYSvxFuNG5OIL77UQFmmJhqkrx4dcMnz4N97BH5SrQFAtTVRc+/5phAdhWaXZhYapK8eHXDAvAogo0xcJUleLDTwwLwLYKNMWSVLUrg49A5vIuaolmieUt1BJdNoqzSteAsfECjSsL7jhnL65/lfvy9ZcsXrO8yTNuu90KLax5sRG+cMvfgtLswsJUleLDr7n49DPqfpslmwO6/s7Yfix+882/4Iwq0BQLU1WKDz8xLOAwrk/vhmX3bY5AxTM5aaXbVi/mxyGc2ZPkzxLmojsYgU7verdfcsbzeQ97VK4CTbEwVXXh868ZFnAqNzaiM187Nq4Ona4PtDvEq/X03DhKswsLU1WKD79mWMC2HiqT0x2pxn5PlaUKNMXCVJXiw08MC8gc57HY/6oUaIolqWpXxjACHbVKswsLU1WKD79mWMAudlzOm/9j6ox3DF3usHWeSgWaYmGqSvHhJ4YFTM9mT/SzU+8q0BRLUlUXPvyaYQEH1XXZc+fu2v2iwgi0m+58Xs6ihxxgzq4CTbEwVXXh868ZFrC1m9+OXl4+1n+xYid3uJlr/uxmYw2bz/Sj0uzCwlSV4sOvGRawrYfK5JTX8f5PlaUKNMXCVJXiw08MC8gc57HY/6oUaIolqWpXGIEGVml2YWGqSvHh1wwL2MX1zdmiDcLPQW59T6958ZLV/fGiVU5+7HLTS9lPM9dO17BoUf8RzTL8SNb+uMIiV7+Rugo0xcJUleLDTwwLmJ7LPVGHbC8q0BRLUlUXPvyaYQGHtHzb3YS8JD+9199BajqrzytlWwWaYmGqSvHh1wyfvi/d+dyEucqwfQhc2u6gCD2jPcp/h9zVpfc8owo0xcJU1YXPv2ZYwLY2yfjf9cmZXzm307haVwd7snANcjbvduu5Bxc3cCcH664HTg56uAAVF1Y5spsqzS4sTFUpPvyaYQHbeohMTn+kej5VTlSgKRamqhQffmJYQOY4j8X+V6VAUyxJVbvCCDSwSrMLC1NVig+/ZljALja9sdp3dNa6u4Tm1F/uQeqTRYec7FbqjdUf3sQ30dc231Rt3nAPP4t88VWpCjTFwlSV4sNPDAvAtgo0xZJU1YUPv2ZYwGFdX0JWZ2xzJi/P/5O6eHqfNMLy2rbVOb+zCjTFwlSV4sOvGT59Xy6DFekhONmXLO27Hktf47agxfFdHKPi660+mwo0xcJU1YXPv2ZYwJaebNTrJ5us1skUz+08ro2rQ/Oa1jXIuXrB+qsWxZ1PuE9sXYBKCzt5/frIrj9ipdLswsJUleLDrxkWsK2HyuRUR6rw5JlPlaUKNMXCVJXiw08MC8hcxTKqY3GAVSnQFEtS1a4wAg2s0uzCwlSV4sOvGRawi5vntM7a9ZPNqbwcY9z57V+8ea6X3yT2xprlr2606+qryh9Xef9cBZpiYapK8eEnhgVgWwWaYkmq6sKHXzMs4PA2p/SCxQncnL2e2um90QiLJ1f1flSgKRamqhQffs3w6ftycwNZbmjFQ7DYi/zfxikeptWhOXnb9XEpvn79bruqQFMsTFVd+PxrhgVs6fIM39iTm0DWySuNdYCluJpD0LD+47PV1aF4Udh4/erJ1SsXl5vNJmpdgIoLW1/4Fs+UO05pdmFhqkrx4dcMC9jWvWVyliPVvGD5tiubD1p8aPHTm8KjF7RUoCkWpqoUH35iWEDmKI/F/la1+NVGBZpiSaraFUaggVWaXViYqlJ8+DXDAnZxffrm53SlkZYv0FdtPO/f5ORrm8Y7+Tt1y1/17bp+vvxxlffPVaApFqaqFB9+YlgAtlWgKZakqi58+DXDAgbSXyRaZ2/p9N5ohMWTG5eW3lWgKRamqhQffs3w6fvSBVvcgtae/OrS4mE6eVJ/3+OB9Q5WfP2ZVaApFqaqLnz+NcMCtrX57esEbmzO3iaQ4ta9RVz+6lC5Bp28+OTJpmhYHM3sAlRc2OpMWPzk2PW1aVOl2YWFqSrFh18zLGBbD5XJ6Y5U87b+ybOfKksVaIqFqSrFh58YFpA5zmOx/1Up0BRLUtWuMAINrNLswsJUleLDrxkWsIsd5/TJCORPdNdUO45AzTMbbbNoMP9VxY+rvH+uAk2xMFWl+PATwwKwrQJNsSRVdeHDrxkWcEjtBNaJuj6Bm0KdtbZ4ejdF8yR/EW5Xi8GWDsHmvqS/CBcPU9yaGpavKb7+zCrQFAtTVRc+/5phAVvbXC/Wm/ny4fJmaOtDECxfHfxFwb3YfcqyWfTm6xcXLkDFha1qQ7/UUml2YWGqSvHh1wwL2NZDZXK6I7WHU2WpAk2xMFWl+PATwwIyx3ks9r8qBZpiSaraFUaggVWaXViYqlJ8+DXDAnZxffqWz+lmY1rtO8sGWKIzvnnByeleaYyTxmtesKC/n0W+qFMVaIqFqSrFh58YFoBtFWiKJamqCx9+zbCAw7q83jesr+Kr077hpDvap3dzBVpw8lPL9bb9q0BTLExVKT78muHT96U
P1u9p7UPg0vYHYkXzGr/zrDa0k42o9J5nVIGmWJiquvD51wwLOI2ru6gl67O9tnV3xrV6wfL1pWuQ033Kye1afgGqLUy/keoRVJpdWJiqUnz4NcMCtvVgmZzqSC3s+1RpVKApFqaqFB9+YlhA5jiPxf5XpUBTLElVuzLDEejkyK3bo+TiolX6buCBVZpdWJiqUnz4NcMCsKgCTbEwVaX48BPDArZ1vdG4s311Q1bd8Udr3pUKNMWSVNWFD79mWAAWVaApFqaqFB9+zfDpWFSBpliYqrrw+dcMCzhKm5u5s+1ghoWpKsWHXzMsYAg7MhmDCjTFwlSV4sNPDAs4lOM8FuVVKdAUS1LVrsxuBHLDZf6PaTACYaICTbEwVaX48BPDArZ1+Scr+/9HAw4jI9AUVaApFqaqFB9+zfDpWFSBpliYqrrw+dcMCzhGm9sP/Tl3SaXZhYWpKsWHXzMsYAC7MhmDCjTFwlSV4sNPDAs4kOM8FpVVKdAUS1LVrsxtBPLfkjtx/SflCnr5fTexeLEbnJp6/Y280k+a71Wl2YWFqSrFh18zLACLKtAUC1NVig8/MSxgW/XN5TP+eP7SyzbbxH2t0Mc1z6T/rFPtq0r/RMCK2MKNCjTFklTVhQ+/ZlgAFlWgKRamqhQffs3w6VhUgaZYmKq68PnXDAvAtkqzCwtTVYoPv2ZYABZVoCkWpqoUH35iWAC2VaAplqSqXZnfX4Rb38mt/gh8eYu2vKNy9cafN1dHIL2mPFn1odLswsJUleLDrxkWgEUVaIqFqSrFh58YFrCtzVlqc8X6fF4Up//x/IWXNaPU4n+OXLz+pCPUBes6vptmqsUyTvFVzSv5LtA0VaApFqaqFB9+zfDpWFSBpliYqrrw+dcMC8C2SrMLC1NVig+/ZlgAFlWgKRamqhQffmJYALZVoCmWpKpdmeePQ9Ct2PKPnJtbOs/yz6G3HIHcH1qv6l5Vml1YmKpSfPg1wwKwqAJNsTBVpfjwE8MCtnV1Zq4GicUfAdiZ7OYK9+0dN37oZD4581svW49A645oXuxZvEn53dwCTvNVjEBTVIGmWJiqUnz4NcOnY1EFmmJhqurC518zLADbKs0uLExVKT78mmEBWFSBpliYqlJ8+IlhAdhWgaZYkqp2ZZ4jUGNzS3dn5R/TYATCVAWaYmGqSvHhJ4YFbOv6zGyKhvVPolycxs13Y9aTTHOSl8aP2suWtGeVE0vvZnUYgbb8KkagKapAUyxMVSk+/Jrh07GoAk2xMFV14fOvGRaAbZVmFxamqhQffs2wACyqQFMsTFUpPvzEsABsq0BTLElVuzLD/xdIN1vre6zlnVzr/+Q5ueczT27Ilv9DwskItLwha+7nWrdxfag0u7AwVaX48GuGBWBRBZpiYapK8eEnhgVsazx1daq3R6CTc/jkS07Gj/bLFkWYRoqtVHo3q9cLONVXbXRlSwWaYkmq6sKHXzMsAIsq0BQLU1WKD79m+HQsqkBTLExVXfj8a4YFYFul2YWFqSrFh18zLACLKtAUC1NVig8/MSwA2yrQFEtS1a7M+P8FcnddzT3Wiua2zD3p7r2ah+t/RmN5l7b4yVoNrSGqF5VmFxamqhQffs2wACyqQFMsTFUpPvzEsIBtdbPEyZDTnkCM/Mfzt1+20TV620IrFd/NL+A0X7V+pR5uqkBTLElVXfjwa4YFYFEFmmJhqkrx4dcMn45FFWiKhamqC59/zbAAbKs0u7AwVaX48GuGBWBRBZpiYapK8eEnhgVgWwWaYkmq2pUZ/0W4M+vv0vam0uzCwlSV4sOvGRaARRVoioWpKsWHnxgWMLTN95Q09jd1/e+nHUwFmmJJqurCh18zLACLKtAUC1NVig+/Zvh0LKpAUyxMVV34/GuGBWBbpdmFhakqxYdfMywAiyrQFAtTVYoPPzEsANsq0BRLUtWuMALVZQQ6YhVoioWpKsWHnxgWMLztb92EFxxcBZpiSarqwodfMywAiyrQFAtTVYoPv2b4dCyqQFMsTFVd+PxrhgVgW6XZhYWpKsWHXzMsAIsq0BQLU1WKDz8xLADbKtAUS1LVrjACDazS7MLCVJXiw68ZFoBFFWiKhakqxYefGBaAbRVoiiWpqgsffs2wACyqQFMsTFUpPvya4dOxqAJNsTBVdeHzrxkWgG2VZhcWpqoUH37NsAAsqkBTLExVKT78xLAAbKtAUyxJVbvCCDSwSrMLC1NVig+/ZlgAFlWgKRamqhQffmJYALZVoCmWpKoufPg1wwKwqAJNsTBVpfjwa4ZPx6IKNMXCVNWFz79mWAC2VZpdWJiqUnz4NcMCsKgCTbEwVaX48BPDArCtAk2xJFXtCiPQwCrNLixMVSk+/JphAVhUgaZYmKpSfPiJYQHYVoGmWJKquvDh1wwLwKIKNMXCVJXiw68ZPh2LKtAUC1NVFz7/mmEB2FZpdmFhqkrx4dcMC8CiCjTFwlSV4sNPDAvAtgo0xZJUtSs7jkA3br7z7nue9ke0aPgtYVsF2oWFqaoOB6VHlWmKhamqzpYHxQwLwLbKNMWSVJVCs/SoMk2xMFXV4aD0qDJNsTBVpXBc+lKBdmFhqqrDQelRZZpiYaqqw+W+R5VpiiWpald2HIHeuHz97j+94I9o0fBbwuDP37j2zjuKNMfCVFWHg9KXWx4XC1NVnS0PihnWgMEeD4pBs/Tl4ZslLADb0iwjlMv9OOVyP0L73cESdhyBrl2/+dDnv+uPaNHwu8Lgpbeu97gnclD6csvjYmGqqrPlQTHDGjDY40ExaJa+PHyzhAVgW5plhHK5H6dc7kdovztYwu7/L9CLP3rTH9Gi4XeFwavXbirNLixMVSkclF7c8rhYmKpStjkoZlgDBvs9KAbN0ouHb5awAGxLs4xQLvfjlMv9CO19B6ux+wj05pUbf/LZF/xBbRt+V+h9fes/EzIsTFUpHJSzu/1xsTBVpWxzUMywDPT2flAMmuXsDtIsYQ0YpFlGKJf7ccrlfoTuYwersfsIZLzVdbDDbwzXnmpDNCxMVV1wUM7iPi5URudBMcNKcO2eDopBs5zFoZolLAO9NMsI5XI/Trncj9D97WBFzjQCGTbyXvzepXsf/tZv3/esP8BLw+8Nf3bpmh3gt6/dPNWGaFiYqraAg3JadzsuFqaqLcgPihmWhAc4KAbNcloHb5awHjRplhHK5X6ccrkfoYfZwdqcdQRa8sblxdL1YMXZFzdLTrsbLtkhTA7KqdjhuPR1UAyOS5HDHBSDZjkVAzYLB6UGzTJCuNyPEy73I+RgO5innxGoCIe5R/oKk4PSIz2GyXHpCw7KOOkrTA5Kj9As44RmGSE0ywg5e5KMQNOgrzA5KD3SY5gcl77goIyTvsLkoPQIzTJOaJYRQrOMkLMnyQg0DfoKk4PSIz2GyXHpCw7KOOkrTA5Kj9As44RmGSE0ywg5e5KMQNOgrzA5KD3SY5gcl77goIyTvsLkoPQIzTJOaJYRQrOMkLMnuccR6L0ffeaNy9f1AM5GXz3DQemRHjcyjktfcFDGCTvYCKFZxgnNMkJolhFy9o
OyxxHo7ocufPPFX+gBnIGXX3v7/fc/pwdng4PSFz0eFIPj0gsclHHCDjZCaJZxQrOMEJplhPRyUPY4Aj30+e9+7u9/pAdwBp688Op9jzyvB2eDg9IXPR4Ug+PSCxyUccIONkJolnFCs4wQmmWE9HJQ9jgCXfzepd/91D/qAZyBBz7zz//PP/xUD84GB6UvejwoBselFzgo44QdbITQLOOEZhkhNMsI6eWg7HEEMj75+ItfeOonegA78ezzP7v34W/pQR9wUM5O7wfF4LicEQ7KOGEHGyE0yzihWUYIzTJC+joo+x2Brl6/+YGPfePLz76ix3BK7DB/8BPnX7t0VY/7gINyRvZxUAyOy1ngoIwTdrARQrOME5plhNAsI6THg7LfEch44/L1jz/6go1rT1549eXX3tazkGKhffPFXzzwmX+23HrvPYODsgP7PigGx+W0cFDGyb6PCwdlB2iWcUKzjBCaZYTs46DsfQRaYkPbfY88//77n3vXh57ETt/70WfufuhCj3/3tAgH5VQe5qAYHJft5aCMU3awEUqzjFOaZYTSLCN0HwflQCMQAAAAAADAGGAEAgAAAACAI4IRCAAAAAAAjghGIAAAAAAAOCIYgQAAAAAA4IhgBAIAAAAAgCOCEQgAAAAAAI4IRiAAAAAAADgiGIEAAAAAAOCIYAQCAAAAAIAjghEIAAAAAACOCEYgAAAAAAA4IhiBAAAAAADgiGAEAgAAAACAI4IRCAAAAAAAjghGIAAAAAAAOCIYgQAAAAAA4IhgBAIAAAAAgCOCEQgAAAAAAI4IRiAAAAAAADgiGIEAAAAAAOCIYAQCAAAAAIAjghEIAAAAAACOCEYgAAAAAAA4IhiBAAAAAADgiGAEAgAAAACAI4IRCAAAAAAAjghGIAAAAAAAOBp++cv/H2PHnlFezLRGAAAAAElFTkSuQmCC) ###Code # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:12:23 2018 @author: DR.AYAZ """ from mpi4py import MPI comm = MPI.COMM_WORLD rank = comm.Get_rank() print('My rank is ',rank) !mpiexec --allow-run-as-root -np 4 python mpi.py # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:15:19 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy comm = MPI.COMM_WORLD rank = comm.Get_rank() if rank == 0: data = {'a': 7, 'b': 3.14} comm.send(data, dest=1) elif rank == 1: data = comm.recv(source=0) print('On process 1, data is ',data) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:16:00 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy comm = MPI.COMM_WORLD rank = comm.Get_rank() if rank == 0: idata = 1 comm.send(idata, dest=1) elif rank == 1: idata = comm.recv(source=0) print('On process 1, data is ',idata) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:16:36 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy as np comm = MPI.COMM_WORLD rank = comm.Get_rank() if rank == 0: # in real code, this section might # read in data parameters from a file numData = 10 comm.send(numData, dest=1) data = np.linspace(0.0,3.14,numData) comm.Send(data, dest=1) elif rank == 1: numData = comm.recv(source=0) print('Number of data to receive: ',numData) data = np.empty(numData, dtype='d') # allocate space to receive the array comm.Recv(data, source=0) print('data received: ',data) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:17:01 2018 @author: DR.AYAZ """ from mpi4py import MPI comm = MPI.COMM_WORLD rank = comm.Get_rank() if rank == 0: data = {'key1' : [1,2, 3], 'key2' : ( 'abc', 'xyz')} else: data = None data = comm.bcast(data, root=0) print('Rank: ',rank,', data: ' ,data) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:17:29 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy as np comm = MPI.COMM_WORLD rank = comm.Get_rank() if rank == 0: # create a data array on process 0 # in real code, this section might # read in data parameters from a file numData = 10 data = np.linspace(0.0,3.14,numData) else: numData = None # broadcast numData and allocate array on other ranks: numData = comm.bcast(numData, root=0) if rank != 0: data = np.empty(numData, dtype='d') comm.Bcast(data, root=0) # broadcast the array from rank 0 to all others print('Rank: ',rank, ', data received: ',data) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:18:22 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy as np comm = MPI.COMM_WORLD size = comm.Get_size() # new: gives number of ranks in comm rank = comm.Get_rank() 
numDataPerRank = 10 data = None if rank == 0: data = np.linspace(1,size*numDataPerRank,numDataPerRank*size) # when size=4 (using -n 4), data = [1.0:40.0] recvbuf = np.empty(numDataPerRank, dtype='d') # allocate space for recvbuf comm.Scatter(data, recvbuf, root=0) print('Rank: ',rank, ', recvbuf received: ',recvbuf) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:18:47 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy as np comm = MPI.COMM_WORLD size = comm.Get_size() rank = comm.Get_rank() numDataPerRank = 10 sendbuf = np.linspace(rank*numDataPerRank+1,(rank+1)*numDataPerRank,numDataPerRank) print('Rank: ',rank, ', sendbuf: ',sendbuf) recvbuf = None if rank == 0: recvbuf = np.empty(numDataPerRank*size, dtype='d') comm.Gather(sendbuf, recvbuf, root=0) if rank == 0: print('Rank: ',rank, ', recvbuf received: ',recvbuf) # -*- coding: utf-8 -*- """ Created on Fri Nov 16 20:19:10 2018 @author: DR.AYAZ """ from mpi4py import MPI import numpy as np comm = MPI.COMM_WORLD rank = comm.Get_rank() # Create some np arrays on each process: # For this demo, the arrays have only one # entry that is assigned to be the rank of the processor value = np.array(rank,'d') print(' Rank: ',rank, ' value = ', value) # initialize the np arrays that will store the results: value_sum = np.array(0.0,'d') value_max = np.array(0.0,'d') # perform the reductions: comm.Reduce(value, value_sum, op=MPI.SUM, root=0) comm.Reduce(value, value_max, op=MPI.MAX, root=0) if rank == 0: print(' Rank 0: value_sum = ',value_sum) print(' Rank 0: value_max = ',value_max) ###Output _____no_output_____ ###Markdown **BONUS: MPI implementation for one of the above applications***Kernel Density Estimation***OR***Checking Uptime for Websites* ###Code # What MPI functions you need to implement it? # Data partition scheme? # Results collection? ###Output _____no_output_____
supdocs/A1-pygempick-appendix-a-immunogold-modeling-function.ipynb
###Markdown Appendix A : Using the pyGemPick 'mock' immunogold micrograph Drawing Function Here is the pygempick.modeling.draw(n, test_number, noise, images) Function to draw test micrograph sets that will be used in subsequent efficiency or separation tests. * n is the number of particles of the real dataset (for example our Anti-V30M (+) control has approximately 1550 particles) 1. Test number 1 is draw only circles, 2 is draw both circles and ellipses. 2. Noise if == 'yes' then, randomly distibuted gaussian noise will be drawn according to mu1, sig1. 3. images are the total number of images in the set - used with n which is number of particles detected in the actual set to calulate the particle density of model set.(for example there were 175 images in out Anti-V30M dataset)```pythondef draw(n, test_number, noise, images): row = 776 image height col = 1018 image width radrange = np.arange(4,8,1) mu = n/images mean particle number across your images sigma = np.sqrt(mu) standard deviation of the mean from your data creates a new normal distribution based on your data (particles,images) pick = np.random.normal(mu,sigma) height = np.arange(26,750) array of possible particle heights width = np.arange(26,992) array of possible particle widths height = 750 width = 990 count = 0 circles = 0 elipses = 0 mu1 = .05 sig1 = .02 image = 255*np.ones((row,col), np.float32) convert to BGR image = cv2.cvtColor(image, cv2.COLOR_GRAY2BGR) if noise == 'yes': mu1 = input('Input mean of Gaussian Distributed Noise') sig1 = input('Input std of Gaussian Distributed Noise') adding random gaussian distributed noise to image... for q in range(row): for w in range(col): image[q][w] = np.float32(np.int(255*np.random.normal(mu1,sig1))) change this value for high variability in background conditions.. if test_number == 1: for j in range(np.int(pick)): count+=1 picks a random particle radius between 4 and 8 pixels r = random.choice(radrange) chooses a random center position for the circle h = random.choice(height) w = random.choice(width) w = np.random.uniform(20,width) h = np.random.uniform(20,height) w = np.int(col*np.random.rand()) first method used to choose random width/height... ensure that no particles are drawn on the edges of the image figure out how to void borders... draw a black circle cv2.circle(image,(h,w), np.int(r), (0,0,0), -1) image = (image).astype('uint8') print('Complete') return image, count elif test_number == 2: q = np.int(pick) count = 0 while count <= q: picks a random particle radius between 4 and 8 pixels axis = random.choice(radrange) N = width * height / 4 chooses a random center position for the circle w = np.int(np.random.uniform(20,width)) h = np.int(np.random.uniform(20,height)) bernouli trial to draw either circle or elippse... flip = np.random.rand() if flip < 0.5: draw a circle cv2.circle(image,(h,w), np.int(axis), (0,0,0), -1) circles +=1 else: draw an elippse... elipses += 1 cv2.ellipse(image,(h,w),(int(axis)*2,int(axis)),0,0,360,(0,0,0),-1) count += 1 count = circles + elipses image = (image).astype('uint8') return image, int(circles), int(elipses)``` ###Code import numpy as np import pygempick.modeling as mod import cv2 #set the image and particle counts images = 175 n = 2250 detected = np.zeros(images) #empty array to plot the particles drawn per image. 
for i in range(images): image, circles = mod.draw(n, 1, 'no',images) cv2.imwrite('/home/joseph/Desktop/V30M-TEST/test_{}.jpg'.format(i), image) detected[i] = circles ###Output Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete Complete
1-Lessons/Lesson05/dev_src/Lab5/src/fp-src/Lab5-dev_Fp_Worksheet Only.ipynb
###Markdown Full name: Farhang Forghanparast R: 321654987 HEX: 0x132c10cb Title of the notebook Date: 9/3/2020 ExampleCreate the AVERAGE function for three values and test it for these values:- 3,4,5- 10,100,1000- -5,15,5 ###Code def AVERAGE3(x,y,z) : #define the function "AVERAGE3" Ave = (x+y+z)/3 #computes the average return Ave print(AVERAGE3(3,4,5)) print(AVERAGE3(10,100,1000)) print(AVERAGE3(-5,15,5)) ###Output 4.0 370.0 5.0 ###Markdown ExampleCreate the FC function to convert Fahrenhiet to Celsius and test it for these values:- 32- 15- 100*hint: Formula-(°F − 32) × 5/9 = °C ###Code def FC(x) : #define the function "AVERAGE3" C = (x - 32)*5/9 return C print(FC(32)) print(FC(15)) print(FC(100)) ###Output 0.0 -9.444444444444445 37.77777777777778 ###Markdown Exercise 1Create the function $$f(x) = e^x - 10 cos(x) - 100$$ as a function (i.e. use the `def` keyword) def name(parameters) : operations on parameters ... ... return (value, or null)Then apply your function to the value.Use your function to complete the table below:| x | f(x) ||---:|---:|| 0.0 | || 1.50 | || 2.00 | || 2.25 | || 3.0 | || 4.25 | | Example Use the plotting script and create a function that draws a straight line between two points. ###Code def Line(): from matplotlib import pyplot as plt # import the plotting library from matplotlibplt.show() x1 = input('Please enter x value for point 1') y1 = input('Please enter y value for point 1') x2 = input('Please enter x value for point 2') y2 = input('Please enter y value for point 2') xlist = [x1,x2] ylist = [y1,y2] plt.plot( xlist, ylist, color ='orange', marker ='*', linestyle ='solid') #plt.title(strtitle)# add a title plt.ylabel("Y-axis")# add a label to the x and y-axes plt.xlabel("X-axis") plt.show() # display the plot return #null return Line() ###Output Please enter x value for point 1 1 Please enter y value for point 1 1 Please enter x value for point 2 2 Please enter y value for point 2 2 ###Markdown Example- Lets have some fun! Copy the wrapper script for the `plotAline()` function, and modify the copy to create a plot of$$ x = 16sin^3(t) $$$$ y = 13cos(t) - 5cos(2t) - 2cos(3t) - cos(4t) $$for t raging from [0,2$\Pi$] (inclusive).Label the plot and the plot axes. ###Code from matplotlib import pyplot as plt # import the plotting library from matplotlibplt.show() import numpy as np # import NumPy: for large, multi-dimensional arrays and matrices, along with high-level mathematical functions to operate on these arrays. pi = np.pi #pi value from the np package t= np.linspace(0,2*pi,360)# the NumPy function np.linspace is similar to the range() x = 16*np.sin(t)**3 y = 13*np.cos(t) - 5*np.cos(2*t) - 2*np.cos(3*t) - np.cos(4*t) plt.plot( x, y, color ='purple', marker ='.', linestyle ='solid') plt.ylabel("Y-axis")# add a label to the x and y-axes plt.xlabel("X-axis") plt.axis('equal') #sets equal axis ratios plt.title("A Hopeless Romantic's Curve")# add a title plt.show() # display the plot ###Output _____no_output_____
notebook/fluent_ch2.ipynb
###Markdown Generator Expressions, Genexps ###Code tuple(ord(symbol) for symbol in symbols) import array array.array('I', (ord(symbol) for symbol in symbols)) #array.array: 'I' unsigned int for tshirt in ('%s %s' % (c, s) for c in colors for s in sizes): print(tshirt) ###Output black S black M black L black XL white S white M white L white XL ###Markdown Tuple uesd as records ###Code lax_coordinates = (33.9425, -118.408056) city, year, pop, chg, area = ('Tokyo', 2003, 32450, 0.66, 8014) #tuple unpacking traveler_ids = [('USA', '31195855'), ('BRA', 'CE342567'), ('ESP', 'XDA205856')] for passport in sorted(traveler_ids): print('%s/%s' % passport) #tuple unpacking for country, _ in traveler_ids: print(country) ###Output USA BRA ESP ###Markdown tuple unpacking ###Code a, b = 1, 2 #tuple unpacking, swapping the values of variables without using a temporary variable a, b = b, a print(a, b) divmod(20, 8) t = (20, 8) divmod(*t) #tuple unpack, prefixing an argument with a star when calling a function quotient, remainder = divmod(*t) quotient, remainder import os _, filename = os.path.split('/Users/linheng/.ssh/id_rsa.pub') #return a tuple (path, last_part), _ is used as dummy variable filename a, b, *rest = range(5) #using *args to grab excess items, which is a classic Python feature a, b, rest a, *body, b, c = range(5) #It can place any postition a, body, b, c *body, a, b, c = range(5) body, a, b, c ###Output _____no_output_____ ###Markdown Nested tuple unpacking ###Code metro_areas = [ ('Tokyo', 'JP', 36.933, (35.689722, 139.691667)), ('Delhi NCR', 'IN', 21.935, (28.613889, 77.208889)), ('Mexico City', 'MX', 20.142, (19.433333, -99.133333)), ('New York-Newark', 'US', 20.104, (40.808611, -74.020386)), ('San Paula', 'BR', 19.649, (-23.547778, -46.635833)) ] print('{:15} | {:^9} | {:9}'.format('city', 'lat.', 'long.')) fmt = '{:15} | {:9.4f} | {:9.4f}' for name, cc, pop, (latitude, longitude) in metro_areas: if longitude <= 0: print(fmt.format(name, latitude, longitude)) ###Output city | lat. | long. Mexico City | 19.4333 | -99.1333 New York-Newark | 40.8086 | -74.0204 San Paula | -23.5478 | -46.6358 ###Markdown Named tuples ###Code from collections import namedtuple City = namedtuple('City', 'name country population coordinates') tokyo = City('Tokyo', 'JP', 36.933, (35.689722, 139.691667)) tokyo tokyo.population tokyo.coordinates tokyo[0] tokyo._fields LatLong = namedtuple('LatLong', 'lat long') delhi_data = ('Delhi NCR', 'IN', 21.935, LatLong(28.613889, 77.208889)) delhi = City._make(delhi_data) delhi._asdict() for key, value in delhi._asdict().items(): print(key + ':', value) ###Output name: Delhi NCR country: IN population: 21.935 coordinates: LatLong(lat=28.613889, long=77.208889) ###Markdown Slice Object ###Code s = 'bicycle' s[::3] #s[a:b:c] step c s[::-1] s[::-2] invoice = """ 0.....6.................................40........52...55........ 1909 Pimoroni PiBrella $17.50 3 $52.50 1489 6mm Tactile Switch x20 $4.95 2 $9.90 1510 Panavise Jr. - PV-201 $28.00 1 $28.00 1601 PiTFT Mini Kit 320x240 $34.95 1 . 
$34.95 """ SKU = slice(0, 6) DESCRIPTION = slice(6, 40) UNIT_PRICE = slice(40, 52) QUANTITY = slice(52, 55) ITEM_TOTAL = slice(55, None) line_items = invoice.split('\n')[2:] for item in line_items: print(item[UNIT_PRICE], item[DESCRIPTION]) l = list(range(10)) l l[2:5] = [20, 30] l del l[5:7] l l[3::2] = [11, 22] l l[2:5] = [100] l l = [1, 2, 3] l * 5 5 * 'abcd' ###Output _____no_output_____ ###Markdown Building lists of lists ###Code #a Tic-tac-toe board board = [['_'] * 3 for i in range(3)] board board[1][2] = 'X' board #wrong shortcut, rows are referring to the same object. weird_board = [['_'] * 3] * 3 weird_board weird_board[1][2] = '0' weird_board t = (1, 2, [30, 40]) t[2] += [50, 60] t import dis #bytecode for s[a] += b dis.dis('s[a] += b') ###Output 1 0 LOAD_NAME 0 (s) 2 LOAD_NAME 1 (a) 4 DUP_TOP_TWO 6 BINARY_SUBSCR 8 LOAD_NAME 2 (b) 10 INPLACE_ADD 12 ROT_THREE 14 STORE_SUBSCR 16 LOAD_CONST 0 (None) 18 RETURN_VALUE ###Markdown list.sort and sorted built-in function ###Code #sorted create a new list and returns it. fruits = ['grape', 'raspberry', 'apple', 'banana'] sorted(fruits) fruits sorted(fruits, reverse=True) sorted(fruits, key=len) sorted(fruits, key=len, reverse=True) fruits.sort() fruits ###Output _____no_output_____ ###Markdown Searching with bisect ###Code import bisect import sys HAYSTACK = [1, 4, 5, 6, 8, 12, 15, 20, 21, 23, 23, 26, 29, 30] NEEDLES = [0, 1, 2, 5, 8, 10, 22, 23, 29, 30, 31] ROW_FMT = '{0:2d} @ {1:2d} {2}{0:<2d}' def demo(bisect_fn): for needle in reversed(NEEDLES): position = bisect_fn(HAYSTACK, needle) offset = position * ' |' print(ROW_FMT.format(needle, position, offset)) if __name__ == "__main__": if sys.argv[-1] == 'left': bisect_fn = bisect.bisect_left else: bisect_fn = bisect.bisect print('DEMO', bisect_fn.__name__) print('haystack ->', ' '.join('%2d' % n for n in HAYSTACK)) demo(bisect_fn) def grade(score, breakpoints=[60, 70, 80, 90], grades='FDCBA'): i = bisect.bisect(breakpoints, score) return grades[i] [grade(score) for score in [33, 99, 77, 70, 89, 90, 100]] ###Output _____no_output_____ ###Markdown inserting with bisect.insort ###Code import bisect import random SIZE = 7 my_list = [] for i in range(SIZE): new_item = random.randrange(SIZE*2) bisect.insort(my_list, new_item) print('%2d ->' % new_item, my_list) ###Output 12 -> [12] 5 -> [5, 12] 6 -> [5, 6, 12] 4 -> [4, 5, 6, 12] 10 -> [4, 5, 6, 10, 12] 3 -> [3, 4, 5, 6, 10, 12] 8 -> [3, 4, 5, 6, 8, 10, 12] ###Markdown Arrays ###Code from array import array from random import random floats = array('d', (random() for i in range(10**6))) floats[-1] len(floats) fp = open('floats.bin', 'wb') floats.tofile(fp) fp.close() floats2 = array('d') #typecode 'd', double-precision, create an empty array of doubles fp = open('floats.bin', 'rb') floats2.fromfile(fp, 10**6) #read 10 million numbers from the binary file fp.close() floats2[-1] #inspect the last number in the array floats2 == floats #verify that the contents of the arrays match ###Output _____no_output_____ ###Markdown Memory views ###Code numbers = array('h', [-2, -1, 0, 1, 2]) #typecode 'h', short signed integers memv = memoryview(numbers) len(memv) memv[0] memv_oct = memv.cast('B') #typecode 'B', unsigned char memv_oct.tolist() memv_oct[5] = 4 numbers ###Output _____no_output_____ ###Markdown Numpy and SciPy ###Code import numpy as np a = np.arange(12) a type(a) a.shape a.shape = 3, 4 #change the shape a a[2] #get row a[2, 1] #get element a[:, 1] #get column a[1, :] a.transpose() #create a new array by transposing(swapping columns with rows) 
###Output _____no_output_____ ###Markdown Deque and other queues> The deque(collections.deque) is a thread-safe double-ended queue designed for fast inserting and removing from both ends. ###Code from collections import deque dq = deque(range(10), maxlen=10) dq dq.rotate(-3) dq dq.appendleft(-1) dq dq.extend([11, 22, 33]) dq dq1 = deque(range(12)) dq1.appendleft(1) dq1 ###Output _____no_output_____
TFMG_Project_Tutorial.ipynb
###Markdown TensorFlow Model Garden TF-Vision Project ExampleIn this Colab Notebook, we will be using a simple version of the YOLOv3 model for detection to show **how to create a TensorFlow Model Garden project by following [TFMG components](https://github.com/tensorflow/models/tree/master/official/vision/beta/projects/example)**. Also, how to use TensorFlow API and [TensorFlow Datasets](https://www.tensorflow.org/datasets) to support your TFMG project. > Note: As the goal of this tutorial is to show how to create a TFMG project, we choice to write a simple version of YOLOv3. Once your project meets the basic TFMG components, you can freely added subfolders under specific component folders to support your project. > > For advanced developers and researchers who already understand TFMG components, we have more comprehensive established projects. Please look at [MovieNet](https://github.com/tensorflow/models/tree/master/official/vision/beta/projects/movinet) and [YOLO](https://github.com/tensorflow/models/tree/master/official/vision/beta/projects/yolo) as examples. ( TODO: May need to update the links.) Understand the TFMG componentsThe TensorFlow Model Garden (TFMG) has a modular structure, supporting component re-use between exemplar implementations. The folders of the components include:| Folders | Required | Description ||-------------|----------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|| dataloaders | yes | Decoders and parsers for your data pipeline. || modeling | yes | Model and the building blocks. || losses | yes | Loss function. || configs | yes | The config files for the task class to train and evaluate the model. || tasks | yes | Tasks for running the model. Tasks are essentially the main driver for training and evaluating the model. || common | yes | Registry imports. The tasks and configs need to be registered before execution. || ops | yes | Operations: utility functions used by the data pipeline, loss function and modeling. || utils | no | Utility functions for external resources, e.g. downloading weights, datasets from external sources, and the test cases for these functions. || demos | no | Files needed to create a Jupyter Notebook/Google Colab demo of the model. Modularity both simplifies implementation, andaccelerates innovation: model components can be recombined into a new model per-forming a different function. For example, the YOLO family is targeted towards object detection, but can be used for image classification by connecting an image classification head to the current backbone. Understand the your model[![YOLOv3](http://img.shields.io/badge/Paper-arXiv.1804.02767-B3181B?logo=arXiv)](https://arxiv.org/abs/1804.02767)In this tutorial, we implement the paper of YOLOv3 : [YOLOv3: An Incremental Improvement](https://arxiv.org/abs/1804.02767)In YOLOv2 (YOLO9000), they used bounding boxes in dimension clusters as anchor boxes then computed sum of squared loss.In YOLOv3, they predict an objectness score for each bounding box using logistic regression. This should be 1 if the bounding box prior overlaps a ground truth object by more than any other bounding box prior. 
If the bounding box prior is not the best but does overlap a ground truth object by more than some threshold we ignore the prediction.So, the functions defined below:- Bounding boxes- Non-Max Suppression- Interval overlap- Intersection over Union (IoU)- Sigmoid functionYOLOv3 uses a new network architecture (DarkNet-53) for performing feature extraction, it's a hybrid approach between the network used in YOLOv2 (DarkNet-19) and residual network stuff. For the crucial functions of YOLOv3 (e.g. bounding boxes, non-max suppression, and IoU), we will provide the functions in the Step 1 (Create Model) and will use them in Step 4 (Create Task).For architecutre: - Architecture: Darknet-53 ![darknet53.png](data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAZYAAAJTCAYAAAArNTAMAAABQmlDQ1BJQ0MgUHJvZmlsZQAAKJFjYGASSCwoyGFhYGDIzSspCnJ3UoiIjFJgf8rAziDHwMcgwMCZmFxc4BgQ4ANUwgCjUcG3awyMIPqyLsisPZn3WJ6e3aVtdnr+XfnAZ48x1aMArpTU4mQg/QeIk5ILikoYGBgTgGzl8pICELsFyBYpAjoKyJ4BYqdD2GtA7CQI+wBYTUiQM5B9BcgWSM5ITAGynwDZOklI4ulIbKi9IMAR7OgbHOrnTsCppIOS1IoSEO2cX1BZlJmeUaLgCAyhVAXPvGQ9HQUjAyNDBgZQeENUfxYDhyOj2CmEWL4VA4PFCQYG5qkIsaQXDAzbbzIwSHIjxFS2MDDwxzMwbOstSCxKhDuA8RtLcZqxEYTNU8TAwPrj///PsgwM7LsYGP4W/f//e+7//3+XAM0HmnegEAAGAmBJE234zQAAAZ1pVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IlhNUCBDb3JlIDUuNC4wIj4KICAgPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4KICAgICAgPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIKICAgICAgICAgICAgeG1sbnM6ZXhpZj0iaHR0cDovL25zLmFkb2JlLmNvbS9leGlmLzEuMC8iPgogICAgICAgICA8ZXhpZjpQaXhlbFhEaW1lbnNpb24+NDA2PC9leGlmOlBpeGVsWERpbWVuc2lvbj4KICAgICAgICAgPGV4aWY6UGl4ZWxZRGltZW5zaW9uPjU5NTwvZXhpZjpQaXhlbFlEaW1lbnNpb24+CiAgICAgIDwvcmRmOkRlc2NyaXB0aW9uPgogICA8L3JkZjpSREY+CjwveDp4bXBtZXRhPgr/eeWUAABAAElEQVR4AezdCfylVVk48Kvt+75oWdO+0KpNoBKOoICTEWCgTYlDKC45TMrowEimIQgylFMhpZDQMoaOiJDgAspSVIZZme0LlpRm+77//7/vyWc68/K+95773vfe372/Oefz+f3udt5znvOcc579POcB/2+tjGqpGKgYqBioGKgYGAgDDxyondpMxUDFQMVAxUDFQMJAZSx1IVQMVAxUDFQMDIqBylgGRWdtrGKgYqBioGKgMpa6BioGKgYqBioGBsVAZSyDorM2VjFQMVAxUDFQGUtdAxUDC8TAv/zLv4z+/d//fVSDMReH9IrzxeE6evrIeLPMrz/7sz87+tVf/dXRf/zHf4wF84gjjhg9+clPHn3yJ3/y2Hr1x+XAwF133TW68cYbR//8z//cCdBJJ5002rJly+hVr3rV6D//8z9H3/M93zP61E/91NFP/MRPjP7pn/5pdNZZZ6XPnQ0swQ//+I//OHrd6143uu2220Z/8Rd/kSD6jM/4jNGjH/3o0amnnjr67M/+7NEDH/i/Mt6P/uiPHjLOJQB/JUGA5wMHDox+8zd/c3TvvfeO/vu//3v0eZ/3eaMv+ZIvGX3nd35nev3Ij1wJ8reS+F8JzP75n//56Pd+7/dG//Zv/5aQ/Ld/+7ejP/uzPxt97ud+7sgG/YiP+Ij0/Sd90ieN/uu//mslJ+JwBPoP/uAPEsH9uI/7uCQMBHHNcfHN3/zNiSi84x3vSJI+ooCx3H777aO/+Zu/SUTCb69//etHL3nJS0Zf+IVfOHrAAx6QN7Gu7wlDP/ADPzB685vfPPriL/7i0Vd+5VemcfzRH/3R6Id/+IdHv/RLvzS65JJLRg9+8IMT3M1xrivwK9o5WvHiF7844fYLvuALRv7QiA9+8IMjwswdd9wx2r17dxJYPvZjP7Z4lG94wxvmvs4W0UfxgGeouBKM5YlPfOLo+OOPH/3P//xPGupb3vKW0cte9rLRiSeemCS+j//4j0/f01Qwl1pWAwOIbmgdxx577OhjPuZj7gc4KdPmf+ELX5jm/9M//dNTHeYNmg6T0h//8R+P3vrWtyZi4fMyMZZbb711dMMNN4we+9jHJu2KMGQd//3f//3ox3/8x9NvN99882jbtm2jT/iET7jfOO+HkPrFWAzQVDCNe+65Z/S0pz0t0YjP/MzPTBqh9YKx/NAP/VCqc/XVV4++4Ru+YVSquSxinS2ij7EIHOjHlWAsIXXEmEkkH/VRH5VU26/7uq87yEx+93d/d/TOd75z9NCHPnRECo5Ci3n3u989woC+7Mu+bPT7v//7aRN/zud8zojU/Du/8zujz/qszxodeeSR6fucMJEs/a4N0iZVWt+1DIeBL/qiL7rfnDVbN59tTOO3f/u3E2OhzSImH/3RH500g5ijSfP33ve+N3VlXTCZvP/97x8dddRRaa0gUtr8q7/6q2Susq6smTbNqglvfP6VX/mVEQ37O77jO0bf+I3feMjaOfvss0dve9vbRr/2a7+WBCSMJR8nX8xv/MZvJA0n2stfEcVP/MRPPMhIJ401f3ajvmf++oVf+IXRU57ylNH27dtHD3nIQw6ZL58JM7TEn/mZn0lrhfXDnre/rZ8o5sV8fOmXfmmiE4h+c52hReZtEi0xj337iLUccK3C60owllJE/uIv/uLoJ3/yJ0c/+IM/OPqWb/mWg4uEnfW8884bnXLKKUktZpogCf/DP/zD6A//8A/TQiNFftM3fdPoggsuGG3atCmZX1796lePXvOa1
yTCYFFYdFu3bh09+9nPHpGccwZUCmOt1w8DL3/5y5OP7aUvfWki8tGKOUK8aTAkUczge7/3exNjKJk/vhpz/2mf9mmjO++8c/SgBz1oxFeHoezduzcxGhtb+5//+Z+f1sfmzZsPrq2Ao+vVGsEQtUdwyYmEfhA3mnb4BfNx8gvQ1JiC24pntQH+krG2tbGRviP8vfa1r02EngZovppCwKd8yqeMzjjjjNH+/ftHNMXnPve5ySRJa7G2CJhRfCbUvuAFL0j4Zc5srrMSWgIGZs++fYB51cqGYiw22fve977RLbfckiTgkD5IhZgLFZm5hWaDiBxzzDGj7/u+70t+Gp+pxhYWJsSEwZFKI3rGM56Rvmf/xmhIKKTNIAarNunLBi+NEP6bpjBEnl8CMSaNkxY58PNCMuVz+emf/unRc57znNEjHvGIxFSYn0rmz3r59V//9dQGwv+t3/qtaX4xFZrKnj17ErOh2f7Ij/xI6ucrvuIrRswrJYWJD2zMXrQjfVhT1irhhHaUl3ycfjcmxCwK4emqq64afc3XfM0IwUG0rr/++qKxRhsb9fVDH/rQ6L777ktrhs8qfK/N8dIuWB6Y1AVWEC6tveba8r2CYVlnTGnNdVZCSwgttB3ruE8fTfhX4fOGYixf//VfP7Lp3/72t4927tyZpFCLAmNh6vBbSIykyGc+85mjo48+Oi0qG526SuL57u/+7hR1RGKknTBhWHjqMLVdd911oyc96UmVsQy0wn/qp34q+Rqa0iXzkXnEOLoKAmvezCuN88u//MsTA3rlK1+ZtM6S+UOQEHBMhSaLgTGxIU4Pf/jD03fMTgSKv/7rvz64hrpgyr8HE7OLyEb2fX8Ygr+HPexhqU/adZsTmRnmuOOOOxiajND92I/9WILr+7//+1Pwyr/+67+OphlrDttGe0+zQ7hpGbHPu8ZIaFGYKWl8k4p1xpKRr7NYr+NoCT/PuPWb9zuuj7zeKrzfUIzFRjzhhBOSSQRhIJl4JQVSf01wmK/YTdlUY0ML+RSBdPtatBHbKtupOiQNkmsUNm3M5e/+7u8SwYn24vf6Oj0GBGYgsk1ikM/PNK2ymSPCCMGk+dMuImPdkGIRCyaLr/qqr0rzjOHQbMHn1e/WQGnh18OwaCmiksBFc8FgaL+Enuc///nJB4NxNUs4lkXAXXzxxcnWf9FFFx0UiKzT0rFu9LXKQhGmR8R+XInAD8E+wSDG1R/32zhagtlZh4db2VCMxeR927d92+gVr3hF0lowCtFCoo8QjnzjCksNU1lMOpusjYwYcZwiUHw2sbnVI7FSbf1O0ulSt6PN+joZA7SCb//2bz8k4MJTGE1zjia3Nkp+kdL50x7zZ05g9HnhhReOhH4y0TGBYRC0GdosWCMScRw8NF4MgfBCY6ZNYVAiwpy3IsCcf/75ab0+5jGPOWR95u1av7Qe52AwOn6+6F+wQelYN/paNT/2KpywVHQV+/betUCNiCJtYywYU9Ns1dXeOFoiUKBLI5qmj66+l/X7DcdYSA9MVzQPpi4S4dd+7dem8w05g2BCaEo1sQhigTJjOJDX5ktBKNoW5LJO9DLDhfDSAmicQxRtmZvS+bMumtI8DYMm4wCjgIB3vetdo5/7uZ9LgSFf/dVfnXwckwg14sYUJxKJr4bPhHDjj5nNWuUfEcWU+1FyHFijV155Zepb2D0TLMEm4J12rHnbG+09iwStMw5FEhjyPR/jddj6T/7kTw5Ggfq+SQtYKdpoRLSRv7bVC1rCEhLrZJY+8v5W4f2GS+lCyiXRceIJPWQKe9zjHpds2vmEcBhz3MVke+UYJbliRDYsgsLuycEaf0wY7OWkxNjcebv1/fpjgOmBRN93/oSmP/WpT01mJyYxZ1Ce/vSnj77ru74rmbP+9E//dKxEHBiwFj/wgQ+kQ3XWY1NyBaN1hFE0zYDRhiAEkWu0b/Z6DCkXaGYda/SzEV7hhZPdXr7iiitSNF3s7xgfhuE3WmMEfmBIfC00zCi0Scy++Xz8nr+OoyWECYxl1j7y/lbh/YZjLJDu4CSpkKSnbFlLCRKmg/TF2j9mLiGAJFGEQl0nckmFtBGhyWzXnPl+RyCuueaalFqEqSSkkGivvq4fBhBmRIXkzwxiE88yf9oTwowA8WHQPISmv+c970mmOZpMmyTcxACYMCjBAPwoTGvaE4nmxL3oQ4EHBKG2g72I22WXXZZ8fAQc56+EvL7pTW9Kf5gVWGcZaxPmVf/8+Mc/PuEzAngEhvzyL/9yMjsSNEWB+k0qIBGE9jJfHl8I7fG3fuu3kvlc6DrGkjPx5joLRjSOlnhGmbWPVZuXDWcKMwERzWMhnXzyyfeT8tRhLrPJmSpoHhgHqZCT32JgRqPOCu0UBWaBcfg96lGP6jSPabeWxWPAvBEG9u3bl3waO3bsmGn+mEJpKKKtECDaKyJD2nVGRgh0qWDB58cvx1cnkktbnmU+wXCe8IQnjJ71rGfdT6OGRcwHM6FZO7NiHeZF9gnaVF2r/4cVQgU8Ww/XXnttEgbCXGj/Egh27dqV6IJ0UPY+vxnmI+JOOLHv0QfMJp/ntnWm53G0JMzos/SxiudYHrCm6o0Pn/i/OVuad6JrbDhOM0Qgn/wA0jkEDk8bmuM+PyMhbJgE6HxK5B0TQSZcmd2bWQJaaCoiePTFjMH5yr4unLFEYg1Y6ms7BkjcIvbgNU/E2FabecocmDdzyY7OueqzuWKO0Jb5M0fmsGT+8nbyNYIZ0CbMvXBkm1u4ujUyCdYm/Bz42nEWhnRL0mX/1x4mFT49z+XjJOxwMnc5oq1jsCCOJWNtwrWRP5szFgf4FoxDM5HhQUod2oP5DG3EOrJ+1Md81Nu0Zk6Ff/4/69PzBIHmOsNUJtESeJ6lD32vWllJxjIJyQiQZIV/+Zd/mU7MNhMT2pAIBAnXAiONWkAIS9NvYlPTXBAvkk+XLXwSTPX3+WOAvwLDzwWNWeYPA0BotGtzM6/2FSisH1qKtaZYa9oL4jYEdmYZ6xD9L2Mb5s7+hedJ+1f0nTlHC5p0IB9bvs6moSXa6NNH3veqvN9wpjAaSpyGldqFZDhukSAY4yQChIR6XcvyYyDXOALaWeYPgwpTRrTX99Ua5Odr+vr6ttf23CxjbWtvI3xnTbSti7axjaMDef2u9ibREm3M2kcOxzK/XwqNhQPdvRwki1kLxxzTg43sRHNkNs3b5XthR3V+gnRSS8VAxUDFQB8MrDotwej4nNDDIctSMBaMYNKhptJBM10wOzBZQVqbthLRHiSPtt9L+6r1KgYqBg5vDKw6LWEiFFLfpYX1nd2lYCx9ga/PVQxUDFQMVAwsHwY25DmW5UNzhahioGKgYuDwwUBlLIfPXNeRVgxUDFQMLAQDK81YHBwTW86nUkvFQMVAxUDFwHJgYOXCjeV/kqvL5UxOxipCip1Adn+HcynL7pCX+0kEnFPds4Yyu8zK4SvJMmdta5YlOeSYSuBwHsDBN4cOhQVLPhiXguXPEzocHnRAkSDigJwzTMJ+F7VOSmHN4R7i/Xr1OwTs2iiZO3Xk
AzTWZnHo2V/zXJO0LQ7TyuPlumkh5fNcCxtlHE38jvu8UoxFVtJLL700nU52GtafNBtOKwtZlgLDXRUW0zwXyjiElvx2+1rmZZFwDnFOwwzkmnr9618/eslLXpKyDhijWy0d2Jq2rRI4p6nTd0zT9BF13YUjrb359l5x+M29Lq6aRTAUBwal6XjjG9+YTs87/CZS0KVd8nRZPznRSQ8N/K8U1oG7TXgpwdHQ/Q7VXunc2f9uhg0hM+9/+9qd92eeeebBPGx33313yv9GGLFnEHwRUa4XlgVhHmtho4wjx2vJ+5VhLAixA49SrJxzzjmjLWuJJREQpjC/SSLpDmsLRZ6ntqR+JQhZRB0hihEWPU1/JHT3y9hINgXG4k50mQaCmE7T3pB1+46pDwy0NExWni1nlWSqpcXKDYVZ0FxpJBKIuhKYVGpNCEF3p4lnMRh3mpdeMdwHTs+Uwtq3/a7n1qvfLnim/b507mijrBeuN5BqKc9kIN2KeVbQCLnVMCL5AaXCIZTJwSYnnH00jz20UcYx7fytDGPZv39/ShSHQMhAzKSRayXSY7jZ8XWve12S3oOxkFJpNE7jI8aS01lwEbctY60Ty8wozCUSU5J+mUtoPhYq1RnxcklTfqASQacpuUcEQbOIJ/XXNkFyUhmLHEaxEdST3Vbb8l+RsjAWjPSee+5J9cDs92Ay0XYJDCXjjva0H3fDwxU4N63lUvJ+0YUEKLuv+1KY/8wnU6BL2iT6k+HYjY0Yyy233JJeCSK0FHO5efPmhFdaDO0mEhHOYxzTwDpk/+vV75BjKJ0769K8mnv7JKcJ9mrsp2vWMpNjQC9/+cuT2dxv9jhGow0azDzKRhnHtLhZPGWYFsIP10cIEFFSaiTey5uiqchEbIGE5MEMITW+VOPUXHZYzODRj3706HnPe15iHCQ7kixGxLTCDq8ewkmildDQZWEYFonnyCOPPJgvjG3XDYBuFJSPjNQ+qb984Qf8nkGk9cdfFMVnUtgLXvCClPPMOPQhpTdfk0y7Ngp41YWXkjGDoWTccCYZ44tf/OLEQDEYTMtGxlyYEJpSYsA+r1fj07e74CMZaDA7RMQ8gg9OMGN36/C/hEDgGWsFY407OObFIEth7cKV+4EQvubtkhiH1EW0NYy1Cf+s/XbBs6jvp5m7e9eSdNozhEBZya1R898sNH3JIgkWhFDF9Qf2Fn9n0Izmcz73nYdlG0fb2Ob13UowFhNP5SWVIp5ttlCbS4oWCyt+j5T327ZtS5tQnVtvvTWpvxajdOUWpgy3JFoXKenDQkJ4f/7nfz4xDMzFjXMWJ0IlQEB5y1vekp511wYNyBW2UpuP668t9xRNBHMjeedFtlUFIXEpEfOZtN6up3WXBKmcE5IWE8+WjBkMJeNWT8qKm266KV1ytWXN/Ih4wwvT0yMf+cikHcZGzWGf13uMAzFACMCC8MqS7P54eAAjvJhr82Gucj8WoitDrTraiLUyD3hLYe3qm/ZlPcnUy4cGz9YCweLVr351WnMEmiZjmbXfLngW9f00c0fbMId8r4Qta4Dw4FI2QqB9SdhEP9zVwvxF45f1GDMiFLoKIywYbWPsOw9Dj8PYZhlH29jm9d1KMBYbC1Lds4IAd5VcUuHMQ2wwCte5UpP9zknnEicXANmsGBHJwp0WJEMER8ZShJMJTb+IuFTamBJTHGJlg1NzSc/+wFjSXxtj6RpP/j1pixZl/K7cZXrLx6tu6ZgDhknjVk9/8MTshIjZxJiya3qZDbWxSMaCAJDSFWmAdu7cmZgkQsEExp+C4dDK3J+RF8KB6xSYRc3jvIM8SmHNYczf085pVrRShe9I9N2rXvWqdCsqQaqNIM7abw7Dery3rkvnDmMhXNmT9rY1QbN3v4ogF9Gi0t/T9DEIwiCaQKhgOidEnnvuuQf9cm3j7TsPQ4/DGp9lHG1jm9d3K8FYSGAIGjMMRlBS+EtIMG6TpOUEEfaejwVzkVafaYyEQ2MJAhmqNWaBgejfAuXks4gRJFoOjQJDIhHzh5T0Z1HPq5SOOWCYNG5wkvJoQ0xiNBVjtCGZkSKyZl7jmdQuX5i7dviorA9RP5ylwriDeWrDZnQ7IGHC/JkzNzvmdSb1NevvpbDm/RCGmBuZIjEXgQcIpjnhgGbiM+5xpU+/49pb9G+T5o5WQrA76aST0r02riawN/nVaDH8ogQua9g6ufjii0fHHHNM2tPW8tlnnz26/PLLk6ZrP7SVIeZhiHHQVmYZR9vY5vXdSjAWtlMSBkaA0HcVfhiLx2LB3dVlumpKdRgHidZ1s5iV9puaEKnXYghG5iZKdu3b10KFOf1IPphV2L8jieak/jCykqLfMG+V1FdnWhhKxk3KZ34TpMBPIYoK/iYRtFKYZ6mHMQjkeNzjHpfG7vZPJiKSfTANfjCmI/4zvhVX/R511FFJOFjkGEpgbcMF7WzPnj1pnBi7tYcpInYl8Pfttw2WRX9XMndMxPZybiKnWYsQvOuuu0YuBWSSUkewC60WQ/aZFYKfirnR3mERaZoVY8yzzMNQ40DHZh1HjGferyvBWGwgi8Kd9AgdjaG5ADABd5STpLevxa8zaZkEWgfmkZd7P3wrH8KqbfUmFX4WkUjS8iNgGAt1nfSvjWn6a+srGFj8xndA+mp+H7+3vU4LwyTChPEyJ9gYUmszx9HewITIrUcRSMHnwx+GUTCB+EMUmCwJF3DnVka+FEEaJFNaiqCN3JE/b/ingTU06iZMhCO4jjFZ/8w3xkEbaStD9NvW7iK/c6xg0tzxOcKPyL4cf4J87EtCnLUqghS92LRmSrZ+Y79b/w95yEPSZybvcXutzzzA15DjMKZZx7GoObx/+MSiep6yH5IJpDqXwGbaXASitmgriD2TFh8EDu9QJQIdBZNRzyTRgvIFGXXaXrVF2mFKcUiR2cl95iEZk2j69seUFhFK0TfTWq4xxffjXmeBoa1dqjezAls+U8PRRx+dTIYi55rMuu35eXynb4yFkIEYRAEPqdN8BoN1OBL8gis4cwkHER0Wz83zdRpY2+BAzGhbTLCnnXZaWvsCJozL+Yuu+4tm7bcNlkV/VzJ3gl5ERjqKgMlEsS4QdPub0EEQxUAIGnCW0w7rg7VCnS5Bq+88gGfIcYhim2UcgZ9FvK6ExgIRIjc40Rxuc/5ANAdnnU3E7myjYRbb17QViwlxoeZy2LlEbOvWrUmCwYA4by1IGgvpJV9o45COsbB1v+IVr0jPcqIHoWKaKOmvrX02Yj4ffgBhtGzCCArGkjM+8PpsTJgRqTwvs8CQtxPv4REzJyVjMoozPaKtMEJne6bVqqLtvq/MWPBw9dVXJ/OHOYAvUjoiYY4QFOuCKYS0Smt1eLZZ+M3CLNr8bYjPpbCGBN3sE1HiqLd2rVdaCuLC72IOrPE4DJo/O2u/eVvr8Z6mXDJ3cMH3J4KTMEnwEUzCtGWP87VZCwQ+gukFF1yQCD0a4XvCCX8hWoKx5HstH3ffeVi2ceRjmvf7lWEs1H4MBVERfSViy3ekCRNoUTGPMFchhv5Iqkw5zonwj2A
ZoRp27uhGdyUJezlllFv+qA2aZFWDAohUkEsYzilLn31c2/d8AOwdcORpMToJL+tE9a5ZiPtv3tHAVYET2hzRiw8dGIEDtEnTlFPeY42gmYzQdiG22VwlA6bkScxgJm+KO56pcmAUdg5WujQeRjCjxOfRoznBunMyDhp6DlwX8bcyDwwBO/HCIbdTBja8H67CqizIaa17ra8r351CfTKfNX4BD84MvXa7OdMXsp2pjSb7RRn0cfA6wv9vSdd96Z1uZcIy5mLNGhjU7qUmwSRHcdM4h3u55DGIs28EJwMMMguODITUbNfphPbDQEhkklCE2z3lyfx/bnPYwBM+nDKcKIofWNYywMXThgctRnDps5CDj64O1qc8j3Md8IMa2pb+xD2t1E3Viv9g18Wa+eS+NsW/1uAqe1j3kwsBRjGazvIyz+bbPYkIiLfyUF8/FvU2Vsf6Uw5hpU15jGwtDVHjNOs2xyHezCumuOv/Tz0PVa2u66etvqdx1c9fejj4FBUWFHHx11hBUDFQMVAxUDUzFQGctUDNb3KwYqBioGKgZuwkBlLDeho36oGKgYqBioGJiKgcpYpmKwvl8xUDFQMVAxcBMGKmO5CR31Q8VAxUDFQMXAVAwMjgqb2uHY951MdzCxmRJG2KYT2s5TSAjZd5J5bN9OfjujIXV834E2aWwcfpMEMw6Mju2z7T3XDxi/g4F9cLS9W7+rGDjqGBBeLZefM23OtrWFpDtQLXuEs2BohQPeDt4uHfoN97sO35zrY28YiwOOLp1yeM3hyzi3YrLi5kd5mZxwxmTi9zmQ9eMf/zgdqpPJuY+gOz3v8OTHP/7xRRjL66vMyg75rYNjjjHXNioG9g0DDsU6aC1bhCS5zgvlxVUFEtU6cOssGNohu7VrPqSHamNE+ftT/951+KaOL39/bxiLA38O6Ul/IsdSnKGwOByUlFpEqhl5ks6s8oUNTdeSI6X5t7QiTreTcPqKxeogH5iWKA7YOei5VPtLwFzbrBhYEgMOFMsOgWjT6CWdle6nuUcIZFJSycrgXiiMh8AoxyBLgz2+bn+PGceuwzdmTCXv7A1jicFI4yF9iRP0eXEgUKp+WoOFlTMWaq+FRw2WSoOqnKfSkKH55z//eVp0Fhz12GVeofU4iGmh5uoyRucGRelOwKTd5mKWRdk7fssPP8pcrE1pWQIO77otUZoU5j3vSOvi71oqBioG2jHgSgJMgUXD/skzmedvPPPMM2lvSSIbt45KRYTReK8vzU/eztC/dx2+oeMprX9kqJYFRavAUIIh8ItImilpJd8HIo/Ay7xMaiGhSKToalp5nBB5WgFGIXng8ePHU32LESP5yle+kiQdOZwkRmSairQm7qmRCy36NgESHWIM3pOwMorP8vPEfejybjHhuUwLgyHlaAdzoaarm7cb7dRnxcCtjgG54JiG7X/571gu2sqPfvSjJJDa0yGUyrpuD/Jb9mkr7pLCgOJSv2ifoCq5razm8iW2CYGbgC/g2aXn3jMWRJhGQqVF5N39EqlemMY43mU8/tSnPpWIO/UXo7G4ZCjGVCwaN15aaOyv0sLLMszuyp+jfW1jVIo7Yf75n//54MMf/nBabGDQv0SDtKEopCjMKt6L7yWbVCxMhe+If0jG5RMnTiRmRvuSVdh4bJzYDOmF+l/FQMVAwgCBzRUaBDJCpX3ZLLQRQp9rtu1/1gk3zLJKeNeVHH1pklxT4aoGVy3EXrR3XW1BcPWupLRtjGUT8DXHuwuf946xuAYZEQ5Hm0XDfoqIu2PFfSb8LzQPtlNEn3YiDb0FcPvtt6c72y2Uu+66K2WeFVUmHb6Mxy6yQsRpEWGmyidKhl3vqocxkVQUi9Rd8E1zWP5u19/6sWBFnVmgxmZBfuc730nmNtpSZSxd2Kvf38oYsFfCSW/f5+bqwAtTN3qAQbj5lUWChmK/0kYeeuih3jui3APFpMVyobhPij/HvVOumkY7uhjTJuCLce7Sc+8Yi0UkMiukAyqw+1ios0KSheJaXCQSmgE/BaZDK4lCsrGowufC4edujPe85z0H0qZ7Mj3lfpp4Vwp0d7pgKhZoMDiMBpPhXB9aSFK0GsyMpmIjgI/5DuMcw6yGwlDrVwwcVQyI1LS/+Dwff/zxtL8lVbXPHA144oknkqWAybmt8IUySTNXYy7/9m//lny59i2hlZUi6EDb++u+mwrfuva38fveMRb+EeprSCk0EszFJVn/8i//kqJCaA98JogyBsMOGowIkhFw50wQ7C996UtJA3IbZtxwGRcs5f3E5FCpLVJms1yj0T4tg8msr+izaRoTAMD0xsdC6mJ+09aUxdoHQ/2tYuBWwgBtgrDp/hsRpRiBz44l8I+wQKAXLBc5nchxRGhkLhcYRPjj+CdcYjpT92kpfF2ML4dzV/7eO8aC4FoQTdOQO9ldQezwk6iPcOK7DpaJKe6mzxFvsWjHYaqPfOQjSYJB3Jmg3HjpQisRaHlRnzYjkoSJKleBaU0CCPLS1DZoThhhfE/bcsslTejcuXOpP9KU3y3gWioGKgamYYADHcNgvbC3MBUFQ4jLBQmEsSfbemPdsB/tX4InYZAZDZNqs2y0tdH1XSl8Xe/v4vdHJqWLyaZBiNiixVhEtBrqLubgfEv8EwnGMS402XXGHPYOSmFIDldxojN30U7CwR6Tx6EvsoypKtdO9MN5n2sjTHbMWeCJItSYvTcWsT7czshOizm6YplZzZmd/L14vz4rBioGhmGAMIqBOHJA+Iu9pxV7D91g5ejSPMJRz2d73333peAdQTWsJIJ2tDmllMI3pY9Nv3tkGEtoKBgLrcEVyKK2+FlEiiDgrvp9ZhXPzumGOTA58cs4jetMigVEg+Co8ztzV1M1xlioz07xfv/730+mNozJnewOauWLVrgw+6l6NCkhjyJJMJYIH5ZJQB8kIDCqz8EonBlT+p//+Z+bNJxNL5DaX8XAvmPAXnZLLQaAGWAmTOSYAgGRyRtjiT3ZHK930AxmeAFChD8n/EWbMp/b3/b02FIK39j2t/He3pnCupAkyoPEQRvBQHz+67/+67SYnnrqqWRHtXA414UXMo+xudJQSCKc/piTBULdtYD83pRiaEFf/OIX06EsjjyMCiMTbcbslmsxsgTQiixMPhTMTnQa5hHtUoPBQIM6e/ZsYmjCpf/sz/4smdmMR2z+pz/96a6h1+8rBioG1mCAqRtdYOa2p2gpfK0CdT772c/2pmoKn+tHP/rRGz6V3//930/m8qtXrybrSFMAXQPO236eAt/bGtuBLwbfeb8EzCV33jM10T4Qe8Q4CHPAw2xE+2CKcnI+7J60gF/84hfJTIUBWBB8JxxhFoPFJVqEGUucOg2CVsI55xQ+ZuTglXeZ1PhU9KU+yUf4MTXbAUnf87toXz2wUL/BjaHxDTHRWeCc9GAhrdB0aCzaw7jApg2wOYnvPfD88pe/TG0GHDH2+qwYqBg4SAKh/dZFI+wzvszYR7HP7Mk8EKeJS1YQgUAsHE26g74QGO3b8N0034/PBNYl4Iv2xzyXuvN+bxjLGKTF
O0xcYVvFcJqLCENA+C0ehJ6DvkQCoZ1wxFtU3utaWJiNPrri7ANOfhX95vX04TsLuqv9eL8+KwYqBvoxQEC01z3t2yYt6H97+V83Dd9SjOXImML6phxh7stKjGi3RY31tek3DMC/dQXTKSkiVpqlpP3mO/VzxUDFQDsGWCDa9ll77c1/u+vwlWJkJxgLqZwpyLNK5aVTV+tVDFQMVAxMwwBTnvM7c5edYCxCbUVZlZif5kZAba9ioGKgYuBWxQBTPt/R3LR3J3wsnNnhA7lVJ7iOu2KgYqBiYNMYcDxCMFR+cHQOGHaCscwxkNpGxUDFQMVAxcBuYGCvD0iKouKXyQ8l7gZaKxQVAxUDFQO3LgZ2wscyBP3OszhM6HyHg4aKlAjSsUhn7RzKrgcASLnN9OdQZl+0Wgle3DfjvIwDn1PbKumvq86cY9IHYeG1115LuZicE2qeH1BH2KgzRoosBw7FiqrJi3Y4KJ07IohQ+50J6jt3IDz9xRdfTE5Np6vzfHB526V/CzeXYVuGBuOQX8pZi6VDXaf267CwDBASNzrY2wbvGPyW4m3uemCVAcNcKOYgUjTN3Vfe3rb6zWHY9N97xVikt7fQEROHC/1z6MhnySN/9rOfpdQq0jPsMnN5fXXzpDtk3MEyhBn84Ac/OPje976Xkla6t8UYXVzk/M3QtuZeaGPH1AWHg2TSZshK4KAqRpAXmQyk6pGaw8YlUEhhLpV51MUgZD2Q9drhV2eJhH7LxXbhwoW0ftoYltPZbv88efJkqjOFsThcZxzWpr8VZ6ncBfT5z3++9+bCfLxD/56jX3nt4E5qpCbDBs9Y/A4dyxz1CXJox6uvvprSJDkvYh7uvvvudB8LoWSJsq1+lxjLkDb3hrEgxDIOO0UvBcOJEyfSpmQK85sLwF566aWUTBIx2uVYdZIgaRtBHFJIWvKNnT9/Pr2Lsbjv2yZZamOUwjd2THn7CL/T0ZgKDYggIU15E08ECTmaaB8yQjvrI+2NuzYw6rgREONx26dMDNYEidtdGhg0BuOKaBExeZH9gBboKd0HmKYUbenPZVFyzMn9RuN+9tlnE9OiZQcjnNJP8905+n3llVdSJt+2E+f6G4PfJpyb+mx9+Pe+970vXc4lGoqQRkCR2SJuhpwbnm31O/c4hra3N4zlueeeS3m3EAjEBlHJtRKn5SWU++53v5sWSTAWhAEhksYBgZIqP26ThCwJJ4XaUYuZSySjJMkwl0RiOuozguASsPzAIoJOU5LbK9KzrOuvbYKklDEW5pz8MCWJUdvufGBGwVgwUlerqgdmvxtXjosSGErGHbBq3zUBzI9wBc5jqzQYc4coggmjNE792fxtRX4m83H58uWDd77znTfMSzQMd2tgJLSMl19+ORFtgggthdTtzvOQxGkM0nEE7rT51a9+NTGbKVpKwBwmNeY0pkprj9nSBXHyyLnRkJlpbsYyR7/gpA3TrGIvxbjiORS/8d42nm6dtXflD7S37V3rWLZiJleJKNGQucu2+p17HEPb2xvGQiVHREl+TCNBDGLA0t5LNsksFNI7cwCTxg9/+MNEfNicLShSyxe+8IXEOEh2JFmbh7mCHV49hJNEK2cXyQ3D+vrXv54IWdia5R26ePFiWpRMU6T2df014Qa/dxBp/fEXRfFZPqOHH3443a1tHPqQIZmvSaJMiTDBqy68lIwZDCXjZiaSr8zNeRgoBoNpIdA2pVv1wNdmJokxDHkSFkiOGAphgLTXLMZPgMDkzY01oSDeDnoxddAGZVLAjOVw49MIgQC81gomFlcawL2xSW+BobmqFrOfWswFPLkuW7/68c93BANrrg937vtg6v3ABz5wE9HDOFxeRwPCrLSZl6n9aivy4NHa2giuNTcEvzl82/jb/IMZbSA02APG5dmlkQWcY+fB+1P6jf738XnzitzREbBTcsCS9BDPNru4zeWgJQIRv0dW41OnTqVNqA7CI102Ai5j8FtvvZU2EYnW3Sz6sJAQXhf7YBgImMwAzFAIFXu+Ir29DSjbscXKPENi7uuvLXUMCR2zIiXmRfJKBSGRqBPBpLq7RvmOO+5Iku5//dd/JS0m3i0ZMxhKxq3etWvX0g2b7qhhfkQQ4YU5x50Uc5oQzAnJ0RzSGplamsVNf/BgTnKtwpwfWwkDGAbhQEZa82Gucj8WosvURktAZGKtmEvCAz+NtRDCQ7P/IZ8xDkKBfuANk5C09Nvf/naaM/js01ZoNNYT/1Dg2VogWHzrW99K47c+m4xlar/GyGQIn23t+12fQ/DrnW0W2cIJJIRDfjia//PPP5/2HMbfxjwD3rHz4P0p/Ub/+/jcC8ZiY1kIJNK+DZ9LfyLGbGCMItJd+10UiCzIpGGbFREjyVCRSYY2uhT4CCepWb+IODsspsQUh1jZ4EwBpE//wFjSXxtjKVk4Mhrb6MYvPT/TWz5ebZSOOWBYN2719AdPTDmIDEKMAXBwMxtqo29Tlowt6mg7CC1Jr027w+D12XZaGH60gXko73jHO6Lp9PTulStXklnUPEaQB6HF9zSAe+655214vamRAR8wPhqFgiE++OCDiaELOGACY7LDcLoK7RyjpJUq/DF8T+4G+eAHP5gEqZy5RjtT+6XVW+suxou1Em3H09orxW+8s82nS/QwS9F+0ubTvB3MxmRov0vMg/FO6Xeb+Jra9+9MbWAT75PAEAyLASMoKfwlzEVs8LScIML+5mPxG0exTcRMQUoN1RjhRHQwCwxE/8KZESYagu9oKjQKBIJEzCxQ0p8xLFVKxxwwrBs3OG08JjdaI02FM5xvgxmJ2bF0PuYas4vYzFkb02EWa/ue+YyWipkwJxIiHnjggUQ0MSlSrHL//ffftFbmglk7EYGEWGOAUhjRyGhXXYVvjbnR2sRcwA3/mArNionPvugrY/rlj6RFC4IoERr68NsH2yZ/Iygy57I4nD59Ot0SS0DzHSFpqXmY0u8m8TN3X3uhsfzu7/5uMidgBIh6V+GHYRtHNEiF6lpITakO40CAEClEVvtNTYgEY8ME4eQYZtd+fRUqzPnHdIJZhf2bRFrSH6JYUvQb5q2S+uoMhaFk3Jgp85sNSIugKcDfOoJWCvPQepg8vLetBcwPfmmUAR8/GNMR/xnfCqaIuBMc1OFcJcXSVDBm2guNx9wTIjAiJlZnaZompyGwk/wFnWAK5glxY86ihXRpBdqn8Vy6dCm9i7Fbe5/85CdvXDi1DoYx/XLa83dhbM190exvHX6b9bfx2TXjLBSYLNMkM6p1wuxKGxdRSjucex7m6Hcb+Jqjz71gLAiACKif/OQnSWtAFJqbHCFwxTBJ+syZM8mkgnnQOkJCD4TxL2ACCKu226TcqBtPi5GDWAQJooCxMAUwtWmDCae0v2gzfwYDi+/Y4zmxm9/H723PoTCAu69gvF/+8pfT5UjCepnjEHYwIXLbKMyh5p7g0GS8CDamh4CYC6HpgjRokqR9QRu5Ix/81gLNy7xiPt6zNjAXt3/6HTFCaJtrbt34BX1gXHx3mBqG558xMK8ShMwzf1Bo1M02wQL
XUQ+j5wM0DuNsK1P79b61Dda+vVGC3zb4Nv2dqE5rgJ+QoGnfK4Qk1wzzYwlSocF37Ykx8zBHv5vG1Vz97YUpzGA5r21spgC20SbB5XilrdgQ1Hc+CJqKsxB52Comox6GwKnataGbCNYWsxcpVvw76ZZjLqQckuXY/pjSIkIp+hUSm2tM8X3fcwoMbe2S3pn8SOxsxTYhswyzQZNZt72/xHekaL4e+MH4Yh2QDvkj2MvDfBP3mwuuQFQIB7SuvLgSlkmK74K5SYSeCDthyExBjz32WBp3nw0+by//G54wFgIRX10UuMMErb0QBuK3/ImY0bZcnS0s1toXMGFcTHs0tLYypV+MixZi7F2MK/oswW/U3eYz/HV8kHCaFwwHM7FmumjB2HmY2m8O5779vRcaC6Ra6NRVh82cP6DGcsTbRCRLGw2zOLPSVkhaNixnrIiO69evH9x7771J/cWA2JD5DUguJLIgTusmD2NBfISlepeNNggVglfSX1sfnP8CCr7//e+n0FQbAEHBWPLFrk+fjQkzIunmZQoMeTvxNzxi5ogNJqOQwkQDYYTOfQzVqqLtsU/z+ud//ufpFDU4Pv7xjyfCgOhaCz6DG9PhpKVp0Fodnm0WfjMmLpJqXmgr5tV6QsgJIGMKk5s5e/rpp5PvxnoxtzQCDNt60naXVoBwc9Rbu9YrLeW2225Lfhdjh4u2A5ZT+mXqhT9MuI+ZluI3zM5j8DfXO7RFQhcGb+/COxPfG2+8kQJ0rIP8PFOz37HzMLXfJhz79HlvGAvpCUOxUUVfidjyHWnCIidNMzkwVyGG/pFUmXJIofwjGAjJVpQYCXadRNacSISG9E5jYefONw3H8dj+HJbDHC1g/gyLXIABIpSr5rQxG8QBQdkGzp49exOIU2C4qaHffkCUObw5IPWF0OiDpkY7Q7jZroVtb7KYP7jBiBEHhBlTEXWFEYCTRoNBCE02/zkeA1ZaLcJurSxR4O+RRx5JGpCzQMyIhAVMmVlO2HgeCt2EgeBgbURUozGAWQYKa+DYKgigDfYp/QqpxwDtsy6GB05msG3jt4mvrs9w7OCtIA1+LWHGBDTmRZF5QVe63h87D1P77YJnH77fu7T5zGCisUjR/rYBSJ2kOSaS3NnI5KAu1V6UCwlPHb4Ci8VGxaAUEhpiGcUpc++rm38v+gtR0w5Gk2/skv60T1rlH4i2/e1sBVgRQqHNiIbxkZwRE8SSOUU95jjaCZhtcI7IaKsUhtJxc5LDNZjhj3SvXzZpOAIrhzItMB9T4HHs06Y3Vr41hLLJGJwHEe5sbpkx1IMD6wHRwHj9xn/SVZjNSOdNAkpYceaBDd748jXV1VbX9+YTHEynzF+BQ1oqmPvaRrjBD47m+GmQBBDtNeEHy9h+zSmBq7mXmuObgt9mW5v4DI/2l/3y1spvBmfohjUTJuQuOKbMw5R+u+DZh+/3jrFAKs0DkWWGQXBtrtxk1ES8kFIbzeZEhJqbtFl/6uex/XkPYwjbbBccFiuG1jeOsTB09Ukj0GcOmzkIONqIW1dbc32PoQZcXQR2rr6mtBPrlbYCX2D1XBpn2+p3Cq6WfteaNQ+KeSA0Lj0P+tpWv/reRtlLxrINRNU+KwYqBioGKgbKMLA3UWFlw6m1KgYqBioGKga2jYHKWLY9A7X/ioGKgYqBI4aBZcJhBiJJlAtnYJ/PYGCTtXrFQMVAxUDFwBoMOGj8qU99KkVT5kFKa15b+/NOMBaRVjICy+O1CUfaWqzUChUDFQMVA7cABiJPmkCYOctOMBYRXTIKC/+rjGXO6a1tVQxUDFQMdGPA+TmWIhGEc5a99rEINRXGNzdS5kRwbatioGKgYuBWw8BOaCxDkC63j5PgDnI5qKg4AS8tg/QWbQfehrS/ibryUsnz9IlPfKL35HUJLC4kczDRfSl9p7hL2ppSZ84x5XAQGhwwdC6nWSQjjSzF+W/ekVTSoVlacNN353cHDB1aJJw4gOmQ59JnYdbBlY9hzr+31e+cY6ht7RcG9oqxSCj5ta99LZ2WdxrdPye0nSKX1l12WkkDEZtdNqm9vsrHJFhByu4hzECeNOlkpKlxMtoYpTh3YHJoW3Mv07FjWgeH+T1//vwNISKvf2aVF84dKlKl5MWJ/UcffTSlo+G3wzCiOFUvdY7MwpH52iFbyTUvXLiQ1lSTEcW7U599cE1tu+/9bfXbB1P97WhjYG8YC0IsekxKhs9+9rMHJ06cSHmemML85k4FiQYlfpPfqklsdmkanfyVCYAkOaRISyOXE0LrXYxFDiSOt7GJEof031d37Jj62vQbzYJ2KguxdDJ5hgX51DAFRcYC6WcQUdoTIcT9J00cy2QsYaUcUdaJlCpuFsS0tSWzsRQqc5VSuObqL9rZVr/Rf33e2hjYG8by3HPPpUSNiAGCwXyRayVStcjvJHsx6T0Yiw1Go5HLCpGRFwhBitA6qdal15C3iWlE/im5kphGaD4ImYy+MvnKLSWlSRQEnaYkMWPk81rXX7ybP6XxNxb5o4JQ+l0iRW3LCyZyDmPBSH/+85+nemD2ezCZaLMEhpJxR3va/9WvfpUIPFyB81hHAsR4Z66nfs2BZIzwkM+5uQh8GQ8mC0feya9KyGFxnTQNhnBCS9H28ePHE65pMRIS9mW6zdsq+bsUrpK2htTZVr9DYKx1jy4G9oax2PSIqNT5bWHJNJWnnnoqmYVCepc8TmZbtwAyb7DTYwYyy7oACuPgoyC1YkRMaWzu6iGcpFcJF6U5x7BkR3XVcSQOZPu/ePFiSuHPNEVqX9dfThhjWXkHwdYff1EUn0npLpqSldU49CGlPl+TVOrS+INXXXgpGTMYSsYNZ5JNysyLgWIwmBZijLm4NrepRQTscz0lDIQTcyXBJBj03ywEDQIFhkKQkHW5WeAJg3bZE/9LCAnGYP0gxnEvjvmYo5TA1dWPC72YAuOW0qjHnCdbt2saCEptsE7pN/qpz4qBsRiYZ/eM7b3wPY5uJhHZhBHPNhu4zSWlPcITv2M0bodzRsYmVOfVV19Nd7cgVtK9I1wy95Jepd3Xhw2N8Lq5D8PAXGT4ZYZClAQIKG6R9K50+TQgd2Ss6y8uBsuHTsrGrDjh8yIbq4KQuOiM+UxafenW3UBI8pbpmBYT75aMGQwl41bv2rVr6bIq1wycWJkfaQjwIoBCinrEnLa4VKF9mE++NczUWDEC8GDyoXmaT3f0mH8aJ5NXs5h/c2T+ct8WZsyEBp+Eklg/zffHfC6Bq6td9+5YT3xBgWdrgWBB0DB267ONsUzptwue+n3FQCkG9oKx2FgIiitdQ1toG2AuyYoYc28LRhH3Wfjd5WAu1SLR2qwIEUnW1bUkQ8TFmRqEk+SrX0Rc+nRMiSkOYbLBmVVI7v6BsaS/NsbSNpbmd9J706KM330ZTG/5eNUvHXPAsG7c6ukPnkSdIWKILqLlYBWzoTaWZiyYJ5ybO6nnaW7i7wUxiAakeYDL3Ck+t2mG8OVOm7wQGK5cuZJMpeZ27sCPErhyePK/aee0KFqpIuqR/8jlXx/84AeTIB
WMNX/P31P6bbZVP1cMDMXAXjAWZiobhRkGIygp/CUk3HvuuSdpOUGEaTx8LJgLZy/TGAmYxhIEEuFEYDALDET/CJgrYhE5v9FUaBQYEumXP6SkP4xuqVI65oBh3bjB+aEPfShpQ0xiNBVj5MtiMhKNVjofY8dMK8G4XY3sDh2mLrjnI6HF8HsROIYWJkUXhREwzKl5dHlbMN2h7S1Rn0+JuZEpEnMRZIChmpPPfOYzyZxnX9RSMbBrGNgLxsK2zkSBESD0XYUfhiP8gQceOJADR12mq6ZUh3GQaN08iVlpv6kJMfkgPkE4T548mezar69ChTn2mcEwq7B/k6RL+sPISop+w7xVUl+doTCUjJtEz/zGx0ITEDEFf5siaEyA5io3gdKcnn322XSDpRtB+RNCcCjBFd8YcxKfGt/K5cuXD1znS2DY1LhK4FSHD+XSpUspYAVjt/YwQExn12AtHVOtd/QxsBeMxQYSAeXOaoQOAWjalTGBJ598MknSZ1bnG5hFECRaB+aRF/4FTABh1Xab2SSv729+FtceO3h3+vTpxFiYVUj/2hjSX7Ntn4OBxW+ctqTz5vfxe9tzKAzrCBPGy9yEEJ87dy7dtkd7AxMit3ThUzJPorRyxiGIA94x6SH4Aa9wdYEbNC9aikCO3JG/9JiGtm/8cG090Nisfz5AMA+9Wnto37V+xcBYDLw9vGZsSwu/R3LFTJxBcGVvk6CI2qKtIPZMWnwQNBXnGfLQU0xGPYSJFpQTrL4haOtP//RPk9nEIUVmJ3e/h+kkrjcd0x9TWkQjBQxMa7nGFN/3PafA0NaugAlmJ0ERTFHvfve7k8lQ5FyTWbe9P/U7QQ0i34SaYzJR+L0wCPPH91IiGMS7DkdGwIUAAAJDRIdFnV15Yio0KybY++67L619ARPG8I//+I8pe8OuwFrhqBjIMbAXGguA3/ve96ZQYwfZnDUQAcSZi8ixO9tomAVtBbEhvYsEE1lz/fr1g3vvvTdJuBiQMy0IFo0FUWoyqRxB+d8YC1v3N77xjfQuJ3oQJaaJkv7y9uJvPgQ+Hzb/97///ckJj6BgLDnjA6/PxoQZkWDzMgWGvJ34Gx4xc1IyJqM40yOyCiN0tmeoVhVtlzz5Tvh2ROgRFjA2wQIipcyhlDiYSyljoYH99Kc/TaYzmqwDtc3Clxam0uZvm/6MgXDUW7vWKy3ltttuS34Xc2CNc+h71lIxsEsY2BvGQu3HUBBX0VcitnxHqkMwEB3hwsxViKF/woCZcpwTEfePgbDJixIjrQ41JWBcpHcaCzt3ToCYZ8b25/Af5oiQ8Gcw/QgwEOWVm6toY7SSq1evpmwDZ8+evWktTYHhpoZ++4HvQqSUCDl98Tvpg6ZGg0OkOb+FbS9RzLVDj5z0UvXQDpm/+NocksVYQmMs6Z+WI7SY9mNN5LiN92m6mJX1s+1CcLA2IqoRvOCTgcIaOLahQ6rbxkPtf/8wsBN33jNzye1UkjafGUw0Fina34iP90hznLq5E565Rl0+AmYVkp06QndtWhsVg1KYRHInvzQi3lc3/94BOwRfOxhNToBK+tMXUwzHfLTtb+cowIroCW1GNIyPRoSYIOpMQOqJYqKdgBmxRGyjrVIYSseNiMM1mOFPJJl+aRJwBFYOZRpEPibjnKMYM5wb529+85ukjZpvWh78tzEH/gh44pfDHKOO1D/Wgmi2riLKbKh5raut5vddcDXrxWdMEKwCJmIM8RsNkgBiTtZpbEP7jT7q8+hjgPXFnr7zzjtn1Xz3jrGYapoHgsMMg+DaXLnJqLkcmE8QbJuTSaW5SZv1p34e25/3MAbMpI9YIDYY2v9v7z6gLCmqPoC3OSfMmDCLGJAjQcR1AV1gAYmiguQgIsGEsCAIgkjOiKJkA3EJK0gQCSKIrllURBTEjFkxh29+9Vl7anvfm+n35s3M65lb58y8VF3h39U3163R5tHvGLrNnclRn+XY3IM8jtHG263NXr7P95vgQGsabe69tBt1A4GZjMBEMZap1/f7uKuIGOLir0nBfPxNVum3v6ZjLDWobnPqdwzd2hMNVi/Zv1T/fiI+62sy+5uIOUSbgcBMQaA1UWEz5YbEPAOBQCAQaDsCwVjafgdj/IFAIBAIDBkCwViG7IbEcAKBQCAQaDsCwVjafgdj/IFAIBAIDBkCrWYsIpVECzXd4Dhk2MdwAoFAIBCYlgj0HRVmY+Jll12WMsvalNgkUmkQCMrxZMOePRT2kyg2KtoxbRfyRO1BGMTYcxtSnztjxga/8lyQ/Hsvr3al2z8irf142+ql33rdQc5J24QFednsT7JvpVN4sRDyvB/HvpZOKXq0Y8+HFDwEEfta7LsZbf/HoNe20G/7qOzHMQ9zssem3HNVx3MQn6eq30GMPdpoNwJ9Mxbncdi9LOtvzss10VDIw2UXNmKiT382f/ks+65stXZoS1I50fsqxjPX60YyJNus55yTXpiBdDZ2/csmYIOmOV577bVpE12vbY1n/J2u7XdOndrynQ2ONs3a+S+zMUZQFhkKHOYlizXmQaCQSl5K+VwXg5DNQNbrnIxUGLYjEvbee++0fjoxrEGubZsczcPa9F6R8WHOnDkpkwRmOBFlqvqdiLlEm+1DoC/GYhc2SdnrrFmzJiUhIUIslYUd2M7imD17dpJQmcL8dvLJJ6fcT9KgI0ad9l0My+2RA4y03asJj9TrFMu99torXYuxSHli5/1EEaimmPU7p7J9m0Pt9MdUaEAECalb6jgRJOTKon3Iumx/i9QyhxxySGLU8srRoDEeSUtXWGGFtCZoCM40waAxGMc529VelkGvbc+J/hzaJZec/Go0bmn/CUYTletrqvotsYz3MxeBnhmLB+PQQw9ND+Rkmb/cHhlu5dPCNBAbRKXUSuyodwCVJJOk98xYECuESMoRBEquLXm48tid0Gf3ONMEc4lTEUmUzCU0Hzv6JV40b+k+yk16CDpNyUZNRAKxGqu/TktNtmVzYc7RRi4yHGtb+hZmFIwFI124cGGqZ8x+N68SiyZjaDLvPA7tOyKY+RFWxrnMBOSpMiaM0jz1V2alzmPxKk+W++EcFQeBZfMS7VmCSozE/XXCJ+2FIEJLcS9XXHHFdCgbLUbuOWlRMnaDXtvZpMZUzFRp7TFbSkUjB5hkohKbZg2rnON43k9Vv+MZc1w7vRDoibEgMFIAeOjf9a53pfTzdTicFcE8lQ/Ayr9b7BJBkto8YAhULwUhQERJfkwjmRjkNmgqznuX7iRL78wBzHWOskV82JwxA2dwOJMD4yDZkWQxIuYKdnj1EE4SrbxYV199dWJYhx9+eCJk2TYu79S8efNSpmWmKVL7WP3Vx238roGH/viLcvFZHp999903nXFuHvqQ+ZivScZb2ZaNV124NJmzMTSZN8zkBHOCIQbq/mNaCDTm4nRD4xstnU6eS5NXwgKhAEMhDNBC6sX8CRCYvHtjTSiIt2zIjo+mDUpOiRk76I1PIwsExmutYGL5qALYN1nb9bGM9dm9gJOM1frVj78sQFhzo2HX77M03n7Hmlf8H
giMhUBP1N2piTQCtmwSYCawZSekMFIjmzYiQZPAVBDD008/PUmSiLAHrGnh6OaAJemVJwmW12tP5mEEItvNMRpj2XzzzRNDUwfhkWIfAZeV986RQ7+YP8xHdmR9eKARXgcsGSsCJhEjMxRCxZ6vwMO1shqTkJlnxuqvUzZeEjosSbNlkZRSgZ9EnQgm38I73vGOatVVV02SroSUtJh8bZM5G0OTeat3wQUXVAsWLEjZoJkfaVRwYc5xNki+x+W4+33vnjgOwT2kNTJl1YtTMuHgnmStUx33nDCAYRAOJBl1P9yr0o+F6DK10RIIIHmtNFnb9bGM9RnjIBToB24ELok0Zed2z+A5fbqrDQAAOuVJREFUmrbS77M03n7Hmlf8HgiMhUBj6o6wH3nkkYlAO0e+m6RFo/Bwk6QVNmT2cudKrLPOOon4lwRhrAH6HZPyIJJIOzGz3EY5JhFjHmCMIqcd97szXJx9QhpGFBExEr/TBGlZHvTlllsuEU5Ss34RcVl8MSWmOMQKsWdqIX36M8Ym/XViLHn8o73KXoxwmr9zYJjeyvm6tumc8xjGmrd6+oMTUw4mixBjABzczIbaIDwMomg7E1oaRiftDoPXJ99IXTiBjzYwD8UxA2VxrTXMLOo+5iCPpmu7bKvJe+ucdq5giHvssUdi6AIOmMCY7ErTZ73Nfp+l8fZbH0d8DgR6ReC+TS9gz1a23XbbpDXUiVpuhz+AiYQGgLkg2ByomApNh1kiS4n5mrFeSWCuYYbBCJoU/hLmIjZ4Wk4er/d8LH7jKGYaY6YwXgQSMUM4ER3MAgPRv3BmhImG4DuaCo0CgSARM7s06c8cJqo0nXMew1jzNk5RVkxutEaaintpLTAjMTs2vR+DmrOzd9yzTkyHWazT98xntFTMhDnRmtxpp52SuQyTYuJUxlrb45kDDWyttdaqVllllSQg3HTTTUkjo111K4N4lvrpt9t44vtAoCkCjRmL8+ZJX4iXk/eyvR+h9R7RRXAV9fbZZ5/0kCNGzBIOxvKg9MpUtOfMFeYEjCD34ft64YcROeYcE1KhukxXdQ0J40CAEClEVvt1TYgk6fpMODmGScjXjYQKM8UwnWBW2ZdEIm3SH6LYpOg3m7ea1Fen1zE0mbf7yn+DKBMURFXRJvq5j03nMVo9TB7undYC5gdfGmUeHz8Y57050MAwRYeWLfM/7cZath+r6doebWyj/aZvQSfCnAUfGCfTsICB0cp4n6V++x1tTPFbIDAWAo1NYQidDWsc3IgyIsrkIFKLvZ4tmfMV8fUbhsKmjKkgTvwWtBUSVK8FkRABhblpizZRN4OQSk866aQkSW8zcjwxk4px0jqyhJ77NV5jRFi13UnKzXXzK5s+BzEMttpqq8RYmFqY2rTRS3+5zfI1M7D8Hex6Pfa31zFk4pv7rL9ivPbMIM7CepnjEHZjdX+nojCHuvcEhzrjxVgJDdaYeyo0XZAGTZKWImijdOQbv7VA82q6tnuZs6AP/im+OwEDGJ4/c2BeJQjlZyRr1PX2+3mWBtFvfRzxORDoBYHGjIVTuAz/JB0yJ3DgIrTZXuxB4Kg/5ZRTqte//vXVxhtvnCLJbFTz8Aiz7Ie5cF5jTkwxy4xImxhWyRAEFQjb5dNh0uKDoKnYC8HHkvvEZNTDEGhBxlQn6p0A1BazF8nXJkWaG4JBIlRIlk37q7fPlIZQlgxQqDFm2WRsub3xjCG3Ub7yPTD5ieR73etelwgizG+44YbFxlpeM9Hvab18PfDB+HK4sCOn+fZEi2WfjzVn/LQUfkE4l2vGWK1PPivrNpduazv/3vSVmQtjsRb56Mo1iAlae1kY6NRmv8/SePvtNJb4LhDoBYHGjIWvoiy0FSYRBBpzQaQV5hKO+rlz5ybbPAnxyU9+cvK7iNLxIPWzKcymN85Mm83sPxA9xBHvIaI1saEbC22FVKgfBFFkzaWXXprGw/yAAXHe8hvQWBCapsQbYzE/IdeuRZByGCuC16S/EsP8nvNfQMH8+fNTaConPOaMsZSSrD59NidEEnMty3jGULaT38ORdkBLxGQUe3rcRz4WZpxetarcdr+v7utmm22WMjAYx5Zbbpk0RgKHteCzcWM6n//855MWTWtl8qoXfjPpYviaytJtbZd1mrznT3HPTj311OTns17cWxoFhmc9eW7qzC63jTH28yyNt9/cf7wGAv0i0JixNO0AsaOV5Egs5hYSG98H2zJto27GatI2aQ9D8aCKvhKx5TtSHSKy2mqrJQ2CuUr7/oQBM+XYJ2IPDQZCsjW2LbbYYpEE2aR/dTAuIc00Fj6j7KvxG8dxv/3BC3NESIQTk8IFGCBCpbmK6Y1WAkfZBvgKyjKeMZTt5PdMm3wrQov1xe+kj/XXXz9pZwi36Dph25NZ3D/YYMS0J4QZUxF1RcgxThoNBsEf5v6XOOaxWpcIez/rMbcx2iv83vve9yYt114gZkTCAqbMLCdsnIDQrfT7LI23327jie8DgaYI9H3mPYJuo5qwT2p+dn57mNmsfV9/mEm9iGZd/Wfmkk+J9NhNessT4pgXjUWK9h6jcR3NiIkkj0N9piV1+QjsFdGvOnwFHlrjw6CU+r4Iu8xdr27p/Bf9hahpB6MpiVKT/vRFWuUfyG17b2+FsSKE8MSAzY9GhAAilkKf1RMwQTsxZn4Emlhuq+kYms6bkxzWxgw/0r1+bZyEkbHKUkALLOdknuMpfA/myreGUNbXkv0gwp3dW6Yv9WCQtTqM12/WYrfCbEa7qa+5bmu7Wzujfe9+GgfTKfNXxpCWaszleq2308+zlNsYT7+5jXid/ghM1Jn3fTOWQULeC2PRL80DkWWGQXA9rKXJqD42IaUeNMQJEaoTqXr98X7utz/XYQyYSZ3YlWNCLDG00ebR7xjKfsr3NAJ9lmNzD/I4Rhtv2c4g32OoeVzWwFSMocl88nqlrcDLWL1O9Hinqt8mmESd4UBgohjLwE1hkwGXB5JJxl+Tgvn4m6zSb39Nx1hqUN3m1O8YurXHjFMv2b9U/36yPhMmaBzDXnpdr4Oaz1T1O6jxRzvtRaDxPpb2TjFGHggEAoFAIDCZCARjmUy0o69AIBAIBGYAAsFYZsBNjikGAoFAIDCZCARjmUy0o69AIBAIBGYAAsFYZsBNjikGAoFAIDCZCARjmUy0o69AIBAIBGYAAq0JN5byxI53eZzKYj+ApH426sk0PFqKjPK6Xt479MvmP2eSjLZTWn40u6rlULNRb9DFuTbmv/322486jkH3G+0FAtMBAft6pCSyWVqxQVVaqKZh/v1iMFX99jveQVzXGsZiMTjN0L4Fu/rzhkg3LR8pLOGf1BmYTP59ECBde+21aQe3A69GYyzSstiVL1/VRDCW60ZS9ttRPtY4BjHnaCMQmE4IEMgOO+ywdFifHHc210oJ5Ywcx6znXIeDnvNU9TvoefTaXmsYi53kdlnLqyV5X96ch7HYgS9n1cUXX5wOUpKIMmeS7RWQTvX333//tBDHWnx2xNuNbkwTUezclkFgotqfiDFHm4HAMCCAPviT
o82hg2gGC4jcfARRwlrOij3I8U5Vv4OcQz9ttYax5MnJzyUnVH0RUGedEUJrcKBSyVjk1pJ3Sg4oOZrkFStzNDmvY+HChelsDCdMOgLAmS9Z67HDHzG3kzkXjE6uKnm0jEm7dYIvPb9r/Faq2xIkalO+rzwO1951110p/xbznmuW6TNhZx5jvAYCgcD/I8CaQRh1Lo/cdjQWz5ijPZzFI1t6naYMArup6ncQYx9PG61jLN0mSwLJ6m1mCPwiTumTDZnvA5FH4KX0d0wyDUSGXuegSxCIyNMKMApZaVdcccVUX6p8jMRZLBiP5IDUaqapnC/LSZWSbOa+jVNWXUzCdTIh5+KzZI4ORzMGCR2Z8L7yla8k5iRfmHYsfMc8q1u2m9uJ10AgEGiGAKbiGfa8SYmEFmAkXjslzC1bdQ6UpKj5tNj8G0FV1nTHZcg67lmvl/H0W2+rTZ+XRKJNox8ZKyJMI3EeCyIvbXrOIcY0xvEulf7OO++ciDt/CUZjUUl9j6lYNI5Sdm6M7MXO+aAic+xZdNrXNkalOPvjvPPOqzbaaKO02IxB/zLY0oZy4RfCrPJ1+XtZjBULU+E7ItlI5T979uzEzGhf0tWbz0Sp6anz+BcIzAAEHPUgG/vhhx9erbfeeul5Rh88m2uuueao2orzj84999x0Gm1+Fj27zkwiuGJUsp13Yizj6bfNt6V1jOXkk09ORDhn9uXX4NBGxB3eZZGQEmgeorQQfdqJ800sgOWWWy4tMAtlzpw5KaW5qDLnrEilv/zyy6dFRovIZqryBkvX7lr1MCaSisJ0ZuHWzWHltd3e68eCFXVmgZobDeecc85J5jaS1kSo6d3GE98HAtMNASegXnPNNdVll12WrBTogmMpMBnHJ5Sm6vrcHTDodFKWC8VBhSI0HcLGX4N2dEsMO55+6+No0+fWMRYpx0VmZemACcxBX6IvhCQLxaXeOuqXZrDMiJ8C06GV5ML/gglkn4vjix26NGvWrLTIvDI9lX6afK2zNRwWhqkwmWUGh9FgMpzrvRaLm+SEmdFU+HyMj/kO4+yHWfU6hqgfCExnBGj/TM1M1k6jJaxdf/316TsCnC0C+ZjxOg58oUzSzNWYCwbFl+u5JbSyUmQ6UL92PP3W22rT59YxFv4RjjYMRiF5YC5OX7zkkkvSGfe0Bz4TRBmDYQfNjMg1CLhwYAT7gAMOSBqQkwjz0cn55L6yH9cp/CiYALNZqdFon5bBZDZa0WfdNCYAgOnNwqdtMb9pq9tiHa39+C0QCAQWR4AgKDqLoMiv6aA6/ljPN0sBKwitpBtj0RqhkblcYBDhTwAA4RLT6facDqLfxWfSnk+tYywIrvDAummIyulsexugnGVuEdE6nDPOxNRp0Vgs2nEC5cYbb5w0BcSdBOMoZQtQBFpZ1NeuCC5ST6kC05os2LLUtQ2aE0aYv6dtOT6ZJvTud7879efsE79bwFECgUBgfAigCawAfJg0lrzHjADnSHOmbcKmIJluTIJ1w/Po+SV4EgY59WkrnSwbRjyIfsc386m7etqkdHGzaRAitmgxTGC0GgsKc1hllVUW/YkEo6JSZ3fcccfksF922WUTQ3LGuwVI2qCdZAd7vkV5py5TVamd6IfzvtRGmOyYs4wnF6HG/D+ZsejDsb/stJijhc6sZs9OeV2+Pl4DgUCgNwRYAZjHbV6uP8+eW8wkC4ydWnYNRz2frfBkwTuCalhJBO0QKDuV8fbbqc22fNc6jaUbsFlDwVhoDY997GNT1JYbL4KLRsJBd8UVVySnm2gNEgu/DA3Cjn5ny9MgOOrUZe4qTWj6xliEF5JW5s+fn0KXMZjjjjuucj58edKicGHtqyeowMK2QDGWHD6sX32QgDAZhaTDLIcp2SVcajipQvwLBAKBxggQGlkn+FQcxWuDNSGU+ftrX/taEijRC8ynU8FAOOqZ4QUI0VLQBn6XbD7n0M/m+dzGePvN7bTxddowFvHpJA/aiGgPn22GIk187GMfS+ouYs65znnHPCZXEA2FJMLpjzkh+tRdC8jvddXY4nnPe95T2Y3PkXfGGWckRibajNmt1GJkCaAVWZh8KBav6DQMJrf7pCc9KY2BBrXbbrslhiZcGuNjZjMf9uFddtmljesrxhwITDkCLAeeV6HGwoOFGaMFnnObod/5zncuMo91Gmz2ub7xjW9c5FMhhDKXEyhZR+oCqHbG22+nsbTlu/uMmGQmJv9IDwhsvfXW1UEHHZR8Hd2kBiqrKC/EHjHOhDl3w2xEO2CKsliy3ZMWcOuttyYzFU3GguA7YU+1GNhW7ZBnxrrnnnuS5kIr4ZyzGdICJNW4lkkNsdeX+kxYwo/5aNhufc/von31jMWOf+PG0PiGLEKMj5psLDQjmg6NRXsYl7Fpw9i+/vWvp+uM53vf+15qM48jzz1eA4FAYHQEBPJ4DtGCO++8M2knnlvPEm2m9JXWW2IFcT0LR53uoC8ERs9tJ9o1nn7r45iIzzQ49GaNNdZYQuMaT3+tYSzjmSQbKc0FD8Vwymgu7WIICL9FgNCzt3aSQOpjoJ0wU1lUruu0sFyD2egj21zr7eTP/Cr6Levpw3cWdLf28/XxGggEAqMj4HlilVA8txjKZDxXU9Xv6GhUyTQ4EYxl2pjCRgMQYaaWdiuIdqeosW718/cYgL+xCqbTpJT+mVy/Sfu5brwGAoHA6Ag0fWZHb6X3X6eq395HOpgrhoKxkCCE8o3mQBvMdKOVQCAQCAQCgYyAAAaBB4MuQ8FYhPcyQ+VIqUFPMtoLBAKBQCAQWBKB2SO5CUWv1d0DS9bs7Zuh8LHwf2AsUQKBQCAQCAQmFwF+59F8xP2MZigYSz8Dj2sCgUAgEAgEhhOBabPzfjjhjVEFAksiIEJIKPoQRPovObj4JhAYAAJD4WPpdR6c/Q7asu/EJkg2wibhwb32M5X1peVmIrRxc7SItqkcY/TdHAHh7HLQ3XzzzSnPHH+iHdz2K9kMW2bK1mq/998BdRIjcsjmvVzNR7lkzX7HsWRL8c1MQqCVGouDdxzWZbe6jMYe2ulWrhs5nfKqq65aFHM/3eY3k+bj2Aa7uw899NC0Sc9ObsKC47Cl+JHlwSbdMo9Vv/dfyiLHQAzKZ9nvOGbS/Y25LolAKzWWiy66KM3EBicZjbfZZpu0Y37J6bX3G1oZhhnmkvbeQyPHLJxS+pnPfKbaaqut0hkeMjrYMGtH9+WXX57StqsjPYjs3Uq/99+GXUxlUOum33GkScS/GYtA6xiLPFs0FtlF7Zi1/0XSRg+ryAbvJW50Kly5uVBKFinx5eHKqVQ89FI8SMvALMGEIKuxh55pgnlNe3biO+PFscVStHj4V1555fR9uWvXddK/SL3iwZYqIp9cWa6wpvXKa+J9OxGgQWAqzkvfdtttU3qeMqzeiaEyZTs8SlqfnPOu02x7WTeyOFi71rYUSLJ724Bbrldr1PEP0gZZ65KmSjk03czKnbCM7yYWgdYxFuYhxF+KeTbkK6+8Mkl9K620Unoor7766ur8889PCecQ/xyf7byTefPmpcN
9PMy0gaOOOio90KQyqR3kGJN+BSORuVRuIK9MF7IeyzXE74FJSTjJnu1BRChIn8ccc0zKnmwnv3bUW3311as999wzMSYPddN6E3vbo/XJQoBGbc3QqqXOKJmKMRCIPvCBDyQGYF2WhL8cYy/rxiF3zvbBqKxBDEmuuf32229REkV56JyISNjCYNQxNszFmu801nI88T4QGA2BVjEWDwkzGEIv6SNJDGHHXN7+9renDKWcoR4oDEgdaekVdSR5lJkYE3Fq3Cc/+cnk5JTtmPYixb6dqBLT5XNVaB++c1yxPmQH8PnUU09Nmsvee++d+s0ZlDfffPOUVp/U99nPfja1ScORnVjamKb1Rrtp8Vt7EKA1WDOYRhZy6qN/yUteUsmOTSCpM55ct5d1QwNytEPO0C1gwHq3DjEXWtEFF1yQTk61OXn2yCY52j7tn9+SNcDJigSsKIFAPwi0irFgDB5UZ6t4WGkssnLK0CmzsTMSVl111WRuQNSlxMdYMA3mCNKYP9FkInRoKNLnM4EppEeOTwysLCQ60WcO4cKUEAHOVue8OCiMxiOYgPksp9ZGIGRJNi5p7z2oJNcm9frJW1aON94PBwK03HziYJkvjh+EVmAdlgUz2GyzzZaI5mL+7WXdYByCBZiDMQcCFkGLJk/AcYIiJmdNWv+YHqaG8XgubrvttkWaezm+eB8INEWgVVFhCxYsSIdfSZ0v0gajyQ8FaQvhZkd2NDGt5Y477khMRT1mLAf8iMbxXup6pzbSfmgX/pi3pLav25gxHgyJz4apAgNieuPL8dDzu0jrz/Tmtyx1es/H4jep8ZvWY5aI0n4EEOtsiiKclMVeFgzGn3Vk/XaL5nJEQ5P1ldeNNScEn+BlvTLp8rHwu2hHvfXWWy9pNJ4ZfTsV8YgjjkjP1yCd/+Wc4/3MQaA1GgufCK3Dw3HCCSckJ7yHJj+YNBS+F5rMhhtuWJ122mmVUEm2ZWYwxJ4DlQTnDBVaTN08QYrDeLRZFvVKidNv2bnvwfSwao9kSKMpCynQOEmvHKlN6tU1prK9eN8eBASKuP80E/c9F2uEWZUfTiEkORDO5zoD8jufSS/rRp+YWlkIY9a3dYixELwcPsfHQmDCfDpdV7YR7wOBpgi0hrE4SfHOkQN6hGzSLEoC7uFgG+b7oHHws7Bbf+5zn0v1MRYaRt6EhrlgNLSW8oH3UGMS9Y1lGE39gcdQMABaCaaHeSAgWWrMN8CY9cH88Nvf/rZRvTpRyG3Fa7sQsMZEBtII+Or4BAkovkfoc2F6Gk2YyIdIjbW+8rqhHdfXIaGG35DgxXR78MEHVwJaOPn5FGn61rixRgkExotAa0xhnPYePiGbHI5veMMbFv1xUiLcInBoNJgOsxdT2IUXXphOe7S7OfsuMB/SGydnqZ0IFc6mghJYJiztZubi1Zn3Hkb2bL4UfTJllO15uPliMLRe6iE8UaYHAoI5aATOTCfI5DWUZ8cMdsbI8dZeu5V8wuFY6yuvG35I7eW+vBK+CEH+aOzMw0zBoiv5Dpdffvm0xusMqduY4vtAYDQEWqGxMAUg5JyR9ptkBpEn5sHzYNjfQkOg1mMszqTn2Md0aDl5X4v6orzUP/fcc1MUjIdN5IwHkvZRlrvvvjuFEmNqHkxS3fXXX18hGtoWNMDxqj3Mbe7cuYkJcpaSVDPjsxemSb2y73jfbgTc7w022CBliHjXu95VbbLJJsn/gQnw9QkAsfaYzTJjqM+Yj6+XdWO92nC55ZZbJi2Jxk7A2WYk5JnGQkjiR2QOo8komNHxxx+ffCyegU5aen1c8TkQ6IZAKxiLB4P/BIGuMxUTo33MmTOnYi7ja2EKoyWQyGgsO+ywQ7IfZ4bhIRYmbN8JXwyJ0Xec75zx2emZQeMMZa7YddddE9MheTKtMcsZDxOEMGbmhdwmKdGYRYnRsLTZtF7uN17bjwCzq7VB+GCupSnk9YWR+N5+EuHEpXm3nLm12WR95WvWXHPNJGC94x3vSMyKCY2gtfXWWyct25hETBrPbrvtlsxz+qDVG4PnSCSjCLIogUA/CLQibb7dwTQRkS60kU6SHbsy4k8ryKYuO+UxCs737GzPIFH5tUtaJKEJS15mZE8MExvtRjgo6U5oMRv0zjvvnKQ50qA+BAWQJDE1RXskUHZrzlh2cf26VuRZtn83rYcAsYm7vhvByXOJ1+FHgJBhfVjH1pw1ZBMiDdg6oj1Y13nt1u9/03UjCEAAiqgzfTHtWvvW0dOe9rRFEY+eF33qx1o1FgKZjZN24huHNU7jjnU4/Otr2EbYCsYyaNA4L0mJGAltIjvzMQU2ZxE7zF4kS4zFAyaPE+bjWiY1xD5rQOX4RPZw5mMk2s0MpazjfdN69evic3sRoMXmMGNriJbQaQ2NNsNe1o26+sM46iH0uY9sFjOePBbXqG/t5u9y/XgNBJog0ApTWJOJ9FIHUyBB2hdDimQCo7UIEBByyZnpoa8XET31sON+6rimSVv1tuNzuxFApK2rTmur6cx6WTdN6gpAqZfsi6x/H58DgaYIzEjGQhKzM1laC+YzETMeMEyFg9VmyKxpMCEIGOgm8TUFOuoFAoFAIDBTEJiRpjA3V+gyOzObsk1jpDumLrZm73NhYyZh5h36+ft4DQQCgUAgEOiMwIxlLJ3hiG8DgUAgEAgExotA7MQbL4JxfSAQCAQCgcBiCARjWQyO+BAIBAKBQCAwXgRazViESgqNzKkrxgtGXB8IBAKBQCAwfgRaFxVmw5cdwzZx2fyoiOaSKn/TTTdNDviIve++ME455ZS00XOnnXZK6Wi614xfBo0AAUhiVFGGNivmyMOyH3ug5KxTRCfKMddpQ/Cg65VjiPeBwHgRaBVjkYTvsMMOSw+ec+v9OUjJgyhk2El5jnm1q3k6MBf7aqSkkSrGLv5BzMnObAxZHil5zqJMHgKSmR500EEpnYod9zYulkUae7nDZHfAhEQpSiPk7JSy7qDrlWOI94HAIBBoDWORcv79739/deutt1a77757NXv27CTNMYX5TQLJyy+/PKV9kQep08avQQA2mW1IDePkv7322isRmkEwFoc4hflw8u6iVCzC2jGVj3zkIykDtrRBdfMt4UgSSKn1pbK3SVG+rkMOOSSlaHF8to29g643eUhETzMJgdYwFufTk7YxDQ+mB7AktNKnSIMvo7AjVzNj8WB7GO1H8TDLbCypZM6/9a1vfSttfnQ+hpP6bJiUykUaF5pPNkM0rZcXj5T9iImzWJg0mD9yXrFcx28LFy5MfTLnyd7sVZ855xgmoI69NfmwJtc3aV9KD/ORE0q+KOOoE7Q8lnidGASsm/333z/lj5ObrjxWoexRyiDZH5ziKBMEM5k149A6Gbgdo23NDrpeOYZ4HwgMCoHWMJZLLrkkbVSUdpwZoWQqwJCgUoZYEjm7tPL73/8+ZRu+4oor0oOK0NoYufrqq1d77rlnYhxOo0TwMSKmNAEB6skj9sEPfjAl5vOQN60nYd/pp5+ezij/3e9+l8aJKUilz6xhbMZOy0JEZGM2JtcxfWy00U
YpG7M2jFtusqOPPjolE5TdmUmkSfvmzmx43cgpmpiTzAFOuJQmPTPLBFL8m1AECEAEHQyFgEMLqRf3mFBEsJAIMqd8cVjd0ksvndYIn4p7P8h6ErpGCQQmAoFWMBanNSKIpG5MpZPTE+GUJp9Enn/HaEh7zk1xnoU6CPnHP/7xpBlIC37nSAZYGV6d57LjjjumPpz9gpE4d4VvQ2r8pvUuvvjidC2iICMyDeTaa69NjIZWxWmuPWP71Kc+lca2xhprpMSVvnN+jPPJJcdETNjTpT9fddVVE2Fp2r4zzNnrMSpHMtPczJtWRhKOMjkIuP/OY7EuacLuSb04b8i9xlSyJq2OdUzAofXkCMhB1jOmuoBWH1t8DgT6QaAVjMV5EqRu0lvdnFROupTEOagRbszImShSk/vdaY+33HJLkhxJkh4uGspb3/rWRIBJhTIaizwjYeo3nwEzVj2MS9QVIk47ySY37ZE0MTljQSTOOeecZJpypgvzHK2F1rTvvvsms8nGG2+ciIr5SuMvUMFYmravL8zSWTTaV5j2jMOco0wOAphDdryXGYTL3pkqrS0ahDVUFoxFGzRQprJB1rNO6/2Vfcf7QKBfBBZfxf22MsHXIbgeLg9CU6JIMheavPbaayctJzMdGg+Cj7lwqiLo8oMhwjQKhZSJCGNo/CC5jFVPNI9syYgBx7uItVxIq4g6AqFdh4UxjzGVmJs/GpcgBGMsJdfcRtP2nfchezOmgrFqWzFHTIbUG2V4EJCrzjrspD0wi+XvB11veBCIkUw3BFrBWITF8k1gBCWhr98MfhhHsDI3IcLq8ivUiTTG4WH1oGJW2q9rQvwibN8lIxurHpMGH4++nUxZSoOSXbre78x6xsYkVSa8REQQ/26lafucxHw2Tics52U85k7ziTI8CBCcCD6d1jczMKbD/2a9DrJeFjiGB4kYyXRBoBWMxQMgIso588wGtImSaLsZmMBJJ52UNv9tM3K2N/MD5kE7wDzKwl+CsCP02s4SYVmn0/uxHkRaiQef6Wq77bZbZEIr26IxmIM+MZtybMbEnu7kSma/emnaPm3NODAYppOSsWZCVW87Pk8dAu619UwgIRCUhTBBGHDvaZ+DrNd03ZfjifeBQBMEWpPSxXndHipOaWakUpMwUWHGtBVn0TNp8UkgqDZVliGeCLl6HtJuu5qbANepDhMYhsYE5xwXTvj85xxxfhsai7Bf9vavfvWri2kPIricB2OzJyZTL720TxNieiu1E+PCdOrEq95PfJ5cBPj/BIlYD7SSvLaZMwkaosWs6UHXm9xZRm8zCYHWMBYbxIQa01oc0mVfi/0dIq6EBR9++OGJWdBWmA08hCLBbrzxxurSSy9NxJ7vQbSXPS2c6DSWQUptNA1RWPwson+YvDDBM844o/roRz+azF60HmMT8iy8WZQXf4xotWOOOSZFn0n3wYRlfDQPcyC5Osu8SfsCFMz9pptuqubPn58k4dtvvz1lJWBuyYRrJi30YZ4rYWSzzTZLa8UmSQxGyqL3ve99KdBDlgRretD1hhmTGFu7EWiFKQzETAEYCmIr2kvElu9I9qQ8xwkLFxbmS7Px9573vCelQ0Gw+TwQVFIgpiKc1/WDLPoUXcbclEOdMQbOcowxm8fy2JipbHg79dRTEwHxvV32tC0MiPbFdKaOfS+77bZbT+3bmHfssccmxsZOLzqNma7UYgY5/2irfwSsSZGMBIEbbrghCTyiB/fYY4/qla985SJf3KDr9T/iuDIQ6I5A6w76ogHQCPgpvMdoSPgc4cwJpbOa2UvdvIudxKcOM5UTIRFvDEqp7yEgMbpeXSa1pvUwL5qK1DPMTgg6RqF9UWWYh+J7u+eN784Rnw8mZw7qkU5pUhiA3fvq0XL8Zn5N2jd2/dujQ1ODkUAG32No9fmmQcW/CUVAlKD7yV+YowHLDt0nmRKsV6Yv9ay/rLnmuoOul9uN10BgUAi0jrGYOOKN6PKd8CVgGDSDbgUhpTVgJB5YrxNdaFI0F2PFNEqGV/atnnlgON32OfDL+L0cd9P2M04wgtUgTX/lPOL9YBAgcNBU3G/3rNv9GnS9wYw+WgkE/h+BVjKWuHmBQCAQCAQCw4tAdzF/eMccIwsEAoFAIBAYYgSCsQzxzYmhBQKBQCDQRgSCsbTxrsWYA4FAIBAYYgSCsQzxzYmhBQKBQCDQRgRay1ikcBEtJeoqSiAQCAQCgcDwINCaDZIgE4Ypdcs111yTdtL7zm53u9ilmZcVeLSwY/XLImT3zDPPTLv57SFx5LE9A1ECgUAgEAgE+kegNRqLvShSXBx55JHpXAr5tp75zGemJJN21jsRUi6sXjQYmxilqfdqr4l9IhdddFH15je/OW1a7KWt/m9BXBkIBAKBwPRCoDUai1xa8mq99rWvrbbffvu0c9kmMYcfSUzpt8svvzydyGgTZJNih7Nd7O9///urddddN22elLfrqquuSqlVMJZuG9SatB91AoFAIBCYiQi0hrE4mMsZ8ptuumk6qKvcye78lauvvjol72MSy4xF+hKpWCSdxCTk3cqnOmIg0mvQhBS75L/97W+nhJB2q0twaae6tBqux2CkVZF+xXW0pWWXXTale5Hg0bVSv7z0pS9dIl2HvqWwlybGjmralkzF3ivS09C2ZFzOqWZ8L3eU5JO+65QCRJ0ogUAgEAgMGwKtYSwIOwKN4K+88sqLpUiRXNF57o4QzscIO6mRieyKK65IJi4MhIbDH8NsdsZIxuELLrggpXo58cQTK8zBNdeNpK4XGHD00UcnYv+2t70tZUSWal4OLyntJYSUfmWdddZJ56ZoJ59GieEcdthhiekwrTlz5cADD0zXGT9mxw+EuTDtyR+mP1qT7MXG7IAuaV78jpH5DmOJEggEAoFAGxC43wEjpQ0DRYyZqJwxQjuQiI9mwTdCQ5Fk0YFIOZ8WZnH22WenVPvbbrttOgaYliOdvTpMahiNJI177713tf7666fMvzIlM5FhBnPmzEmHin3oQx9KGWflbnrDG96Qvnemi5T9GNJrXvOaSh8Yw4IFC1IQAc2FBmMMMiu/7nWvS3UwI9FsTHcSYmIwrpN80LWYl8SDjgWQat912h8tb1Qb7l+MMRAIBGYOAq3RWKR7P/TQQ9NhWQ7N8ocI+3MQEh+JM0hoEkxIUuszLUkzzoSFMTmnhEntrLPOqt70pjel75mjVlhhhUTMMRwmKgxIf7IS5ygzEWQ777xzNWvWrGQi45txDoyDvBxChkloJ6f0x/QcOastfUmZr44+MMBzzjknMROalEg0mpFDyU455ZR0ndMwHVO8+eabJ8YTvp6Z81DGTAOBtiPQGsZCYsc8nLfiTBXp5kVzYTCIOR+L81f4YKSL57NYe+21FwtBFo7Mx4K5ZNNV0xuISUk7n01tTFiZAWX/hyOTaU/MXxiRst5666UTG3336U9/Oo2L1sVfVO7DoXHRnN7ylrckhkWbOuKII5KPJ2thTcca9QKBQCAQmEoEWsFY+CWyX4MDniZBSxER5nQ9J+7NmzevYrJiNuLwRtgxA
uaostAWSP9MXtptWuyXwUhyyY53pris1WjX+7JdjnnmML4Z2tTjHve4xUx2uT3XOtCJVuPkR2Y4JjEBBFECgUAgEGgTAq1gLJjErrvumvwo++yzTzqrnmbgb+mll04mLSfvOcKXI/whD3lIYh733HPPYkTejeEM1x7zUy+aQGYk9ZubmUr9e58xr4MPPjj5bN797ncnRsE8xolPe6mXK6+8MkWP0Xz4cESRLbXUUkswx/p18TkQCAQCgWFCoBUbJGkKTou88MILk9OemagszGTMSpiFurQamgqfheiqXGgSCDazFoI9GlPI14znlR9GcMArXvGK5IR3fDK/iQwCpVajD5qNI4iZ1USVYZpMYYIUMKIogUAgEAi0BYFWMBYMYIcddkhRYPwodseLorI35Oabb06+CQxj7ty5yfGNsXDk02AuvfTS5NdAoE844YS0J4VDv1vqFt/rz7VManUG0MuNFVhA08E0MBlBBbQSjIOPhSkP4+PAP+qoo9LYnGvPL7PLLrukzzZ/CoOOEggEAoFAWxBohSkMmMKBOcCF7u63336LUrAgzCKwNtlkk0SMMzHHgJih7AFxDamf0x9T2WKLLdL1nW7SSiutlDQe2gO/DkLfb6F9cMZ/4hOfSO3wlzDTmQuNSuCBCDUBAUKNN9hgg7TPhrlMNNgNN9xQnXvuuSnyDNPko4kSCAQCgcCwI9Cqo4kRehFf9o7cfffdSZvgjBdGbIe8HerZF0LTEDlmT4pd9sxlHOMc4nl3Ow1CHXtOEHcOdEzKzvo77rgj+W5e+MIXpugzN9L7HAwgqozWpG8O+WxWY34zhlxXPRoLk5gxiCbzGybJh2JDpb6Z+vwmOizPQfui22hgOfJs2BdUjC8QCAQCgVYxFreL5kFL4aRXEHr+iEzY05fFP2ame++9Nznq1WvqsOezQeCb1i+67PiWX0V7tA4MTMHEch/5u44Xx5eBQCAQCLQIgdYxlhZhG0MNBAKBQGBGItAK5/2MvDMx6UAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4pAMJZhvTMxrkAgEAgEWopAMJaW3rgYdiAQCAQCw4rA/Yd1YDGu4UXgX//6V/WTn/ykespTnlI94AEP6DrQX/3qV9X97ne/6jGPeUx13/uOX4b573//W917773VQx/60J7a++lPf1r98Y9/rP7zn/8sGut97nOf6tGPfnT1hCc8obr//SfmMTD/3/72t9W///3v6sEPfnD19Kc/fVS8Fg1ukt/86U9/qn72s59V7mtZHve4x1X+3EPlz3/+c/XNb34z4f+CF7wgzamsH+8DgYzAxDxRufV4HRoEzjzzzOqSSy6pHvKQhyRC/9e//rX65S9/WT3qUY9Kn//yl7+kzyuttFK1++67p+86Df6qq66qtIUYH3HEEYkw1+t9//vfr84555zq+uuvr3bddddq7bXXTv3W6zX9jKF86Utfqj75yU9WL3/5y6uNNtqoevjDH9708gqBP/vss6tbbrml+sc//pGu3WCDDRIh/e53v5uI/cYbb1zNmjVroMQSU9lvv/2qu+66q3rOc55THXvssR3xajyRHiredttt1TOf+czqgQ984JhXLViwoPrYxz5WYTBled/73le95jWvqeB/wgknVO49BuTeL7300tVee+1VWS9N+ijbjffTH4FgLNP/HqcZfuc730kS87rrrpuYxtVXX50YDQK76aabJoJ63XXXVbfeemv197//vSsqL37xixOhpjkg0p3Kk570pOrZz352deKJJ1a/+c1vFtMUOtUf67uvfvWribFcdNFFSYKmAfRSjAVD+vjHP17dc8891corr1y98pWvTEx1ueWWq/bff//qC1/4QrXmmmtW++67b/X4xz++l+a71n3a055W/fOf/0xSPqy64dW1gT5/wEAxioMOOqhyL8YqV1xxRfXFL36xImzkQuDIGumll16aMNlyyy2ryy67LK0bmsuTn/zkatlll60e+9jH5sviNRBICARjmSEL4UEPelD15je/ORECEuftt9+ezEkIz/LLL5/MQqTqI488clQzE2KCkJREqA7hIx/5yOp5z3vewCRZkjczDVMW6bnXYjyYS5asfX7qU5+aCOMyyyxT/ehHP0rM5ayzzkrMd8cdd6we8YhH9NrNEvUf9rCHJeY1CDPgEo13+eLOO++sDj744Oruu+9OTK1LtUVf/+EPf0iCximnnLLYnJnunv/85yfzF+0ETnB7yUtekpgQzH7xi18sYT5b1HC8mdEIBGOZIbd/8803T6aR0fwJ7OnbbLNNIoZg4UdBeEjeCC2/hMLmnt///ve/r/gwRquTLvrfP8xBu67jc0CsxiK8Sy21VPXEJz5xXL6Qbn1gNkxrxx9/fMWEx+SzzjrrJKLqGhrXt771rerHP/5xIsC0HTj5jVkIgaWVwIRW9MMf/rBihvKexK9krLxH+M0/+zP4KpigEGmaGFMlrZCJTtsve9nLqmc84xmLzd2Yvva1ryXTJYapDsHhjjvuqPbZZ5+K5gmzm2++OQkAGHM3Xxht0Fhc734897nPTe/Nz5zMDTPBaMyDT8qrz8xkmGeUQKCOQDCWOiLT9DMNohtxLaesHlPKZz/72eRz4Ic5/PDDq7e//e2J4JWMid8C0UUAEbI999yzWnXVVRdpBmW73mvr/PPPT8ST38G1tIM11lhjoL6Ner9jfca0EG/zMC6M4VnPelZ12mmnJSy23377xFSOPvroRGyZzlZZZZX0/rzzzktmNER4u+22q3zGOMyJKapeEGT+J4yKWRJjcV9oDN/+9reT6YrfwlgQfYQbs1hrrbUSRl//+terQw89tHrRi15UrbbaaknDNBY4Yk7aZcr0Zx7mNlrQwOc///nkO/nc5z6X+sLU3va2t1UvfelL0/wwpMyUmD/POOOMFJDgdz4pjDBKIFBHYPyhOvUW4/NQItCEqRi4enwjCBst5FWvelWShPk3RAXlQvIW7fWWt7wlOejVP/DAA5MU38lcRVPhvGZ6mz17dkWD4nP44Ac/WP385z/vy8SV
xzLeV3M2LszB2BFk2sjpp59e8S+IQkPEaVe0gFNPPTVpcpgp7YWmgAm4Tj1aHqw64eC33/3ud9U2I5ohzYhZMfsyaEyCFJgkEXdtYyTnnntu0mr+9re/JSbPz8F8ucIKKyTt4iMf+Ujy49A2aDAYgeCG1772tUkYyCbAOk7Gh7Fgpt/73veqr3zlK4nxu6cXXnhhJaAjF+M+4IADqpNPPjlF2DGl0vLMJUogUEcgNJY6IvE5EX42df4UZg9mEoSkdD4jiJgO5y3pGFFEdBYuXNjRYcy0I6pIYTpDzPlpvvGNbySihrh2I4CTcUsQ42yyMjbMxPwVjmyhyaR/dWgW5uwa36vPnDR7hGEygTERCQDwW1mYyT784Q+nYIk5c+YsimzTDlOjdmgAK664YjLFwRaemBam8oMf/KC66aabUsg1puZ+6IOmguFhNjQcY9QmTHO9ejixuZmjYAW+N4ICJiOIgZmNrw2TVEfxusUWW6R7e9JJJyVmhPkwI5qLNRIlEMgIBGPJSMTrIgQ22WSTJHWLFkLsEDWEtNwHQiJGcDMxXH311auPfvSjSzCg3KhoM4yDSScT7Fe84hVJEheZVZrY8jWT+Zr3m+iTGYx2QpsSRYawC7H2yueAoHfSRmgL5uYvM6k8B34RJi3mv6y15d/KVwwBc3E9sxl8aQ6wF9nHHKXv+fPnp35oWcK53YtuhSOfKVPYdS4YBxMa5sH0pl3Rgddee23SijBPDMtc9GFcGBezGrPhHnvskTRNY3r1q18djCUDG68JgWAssRCWQIAWIRxZKDKpF5GtlzrhzBsNScyIYb0grMwmorEQ4Pr19fqT+Zm2QYvySivh9yCBMxEJ2/VeqC2tCyHtVjJz7DQ3mDBVIdjCnmklG264YU8+Cs79zNBe+MIXprBpBF9xj7J2UR8foeDOEY3EHHPJDMr1xu1aWpb7zaTHf0YQqM+FBjR37tw0B4KHzxhPlECgRCAYS4lGvE/S8VFHHZWc8AghwoH41AlMHSpmHnVoH53MIvwFdr/bNCl8NRPBX//614nocUaT0KeiCFbIfh6SOOZHszjkkEOqL3/5y9Vhhx2W8Lj44os7Ms0mY2bq2nnnnVOUGZMgBzwGJuKqKWEunfCwpDnkjaJ8IN3ukXr8QhhMLu4TTchYtGssGAnTGqaHAXqPAcGCFkd4wCDdO+8xIlqM+/bpT386mdDWX3/9tAYyw8v9xevMQiAYy8y634tmy7xF+kVsshTsR1Its5UQVeYXvhP2eY76fI16pb/F5xtvvDE5o/kFSMC53VxPtBhidMwxxyRGxT+DmPMhrLfeeuk77XQrCCmJvdPmSE7na665ptpqq60W+UHq7ZQpXRDUPGdmIiHG/D077bRT9aY3vSn5O2gn5m4e5kDi99l7ODCJacPn3Fbuo2QU5e8Y6rx58xK+NB/M5bjjjkumsdye11xoUNrOr0xWovZofwILYE1zYb5iXoNpNk9mhs0nI0jAtfXCRyLiD6a006233jq9F0Cwyy67pOAB2tUZI5Fg2hX1Z2OpOcFit912S1qYvpjWBABYP/bRECSizFwE7jcS6XHAzJ3+zJs5oiC0lM+AyQMxZEZhAmHbJ33ahX3DDTckos+05RpSPemVNOszhsD/wG5PCyHJcgLnsF2bDbWDMGJSIp1IuJzDCKHIJsRJWC6pfTTHPUb3iU98Ijmu9Y1As/1nU42oKf4dfh7f101xmA6nOSkc4RaxxSzH4X3dyJ4PhFXqGUyFtpJ9CldeeWWaI6e57wQsaMMenGzOowkIbNCGsZmnesZg/n4XZo0Z+Q3OTEhwY2qjZbjO/IQpq0c70B7mIXw4XysKzG/6w5gwAPcJw8ZUslnKd/rU3uyRgAJjzWa6csXrm8Yi8sxOeveYXwWO/mhD2hAZp457rm+M3H2TBsdctWOetFZ+M9dGGHKJ9Mx7f5+RB7/3rcwzD6dpM2NaCKJDUkdgEWfEDkFmwkIksrROSkWUSO0IKpMJkxVmJLzY0kFAMR9ESBsImA1/iCZJ1vfLjESYIXo0BcyM9O86/ZK61dFvt8LMY7zaQ7CEQWNS2dwi3NeYETX91NvK+ztyCLDfmXkwC3Ng2sFQstPcODAgxBQBV5dZ0DwRX8WmREQV08ntqgcvjNr8jNd8MQafMVh1MNR8DZ+OduCpnvHAC/aIOE3Kd66FP+1Fm7lf3+kTluajXaYrDMn9YKoqN7emwf/vnznCzXogNBgbXPUPEzhhovpzP7VtN7756derOsbHZKhP+1uMtRMjK/uO99MbgWAs0/v+9j07RAdxyESaVF33tZCqEbqsOTTtTFvaLdtvem23evXxdqvX6/eYofllQonQYxK9zrnXfseqj9nDv2SG+RqE3u9ZUMjfd3slbGhrtPuhPfeaRouB1Ys+3depxqU+rvg8NQgEY5ka3KPXQCAQCASmLQJLxoVO26nGxAKBQCAQCAQmA4FgLJOBcvQRCAQCgcAMQiAYywy62THVQCAQCAQmA4FgLJOBcvQRCAQCgcAMQiAYywy62THVQCAQCAQmA4FgLJOBcvQRCAQCgcAMQiAYywy62THVQCAQCAQmA4FgLJOBcvQRCAQCgcAMQiAYywy62THVQCAQCAQmA4FgLJOBcvQRCAQCgcAMQiAYywy62THVQCAQCAQmA4FgLJOBcvQRCAQCgcAMQuD/AC+Zpu9jCbPbAAAAAElFTkSuQmCC) Structure of the TFMG YOLOv3 project for detection task1. Dataloaders: - Input images for prediction of detection task: - `preprocess_input()`: process original images to image data. - `decoder()`: convert image data to tensor. - `parser()`: - Load pre-trained weights into YOLOv3 model: - (utils) `download_weights()`: - `load_weights_to_model()`:2. Modeling: - `nn_block()` or `layers`: - `yolo_model()`: 3. Ops: - `decode_netout()`: decode the prediction of detection model into boxes - `correct_boxes()`: - `nms()` and `bbox_iou()`: - `draw_boxes()`:4. Configs: - Image input: - `image_path`: - `num_classes`: - `input_size`: for modeling - Pretrain weights: - `pb_file`: pretrain weights 5. Tasks: - `build_input()`: - `build_model()`: - `build_loss()`: - metrics: 6. Common / Import Registry: - include components above7. 
7. Train:
  - `train.py`
8. Demo

Step 0: Setup

It is recommended to run the models using GPUs or TPUs. To select a GPU/TPU in Colab, use the `Runtime > Change runtime type > Hardware accelerator` dropdown in the top menu. If you upgraded to **[Colab PRO](https://colab.research.google.com/signup)**, check [![Colab Pro](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/notebooks/pro.ipynb) for priority access to our fastest GPUs.

0.1 Install the TFMG requirements

TensorFlow provides a comprehensive framework and API tools to support TFMG. A brief introduction:

1. Install **TensorFlow 2** packages:
  - `pip install tensorflow`: Latest stable release with CPU and GPU support (Ubuntu and Windows).
  - `pip install tf-nightly`: Preview build (unstable). Ubuntu and Windows include GPU support.
  - For more details or older versions, please check the official TF documentation [here](https://www.tensorflow.org/install/pip).
2. Install **TensorFlow Model Garden** packages: ( TODO: May change or add more info)
  - `pip install tf-models-official`: Latest stable Model Garden package. Note that it may not include the latest changes in the `tensorflow_models` github repo.
  - `pip install tf-models-nightly`: Preview build (unstable). To include the latest changes, you may install `tf-models-nightly`, the nightly Model Garden package created automatically every day.
3. Install **TensorFlow Datasets** packages:
  - `pip install tensorflow-datasets`: The stable version, released every few months.
  - `pip install tfds-nightly`: Released every day, contains the latest versions of the datasets.
  - For more details or older versions, please check the official TFDS documentation [here](https://www.tensorflow.org/datasets/overview).

For the above packages, `pip` will install all models and dependencies automatically. In this Colab, we use the `nightly` versions of TFMG and TFDS, and the stable versions of TF2 and TF Hub. Run the commands below:

```bash
$ !pip install -q tensorflow tf-models-nightly tfds-nightly
```

###Code
# install TFMG, TFDS if not already installed
!pip install -q tf-models-nightly tfds-nightly
###Output
_____no_output_____
###Markdown
0.2 Provide a requirements file

We recommend writing a `requirements.txt` file that pins the required packages with version numbers as part of your TFMG project. Then, when reusing the project, a single command restores the stable, known-good versions:

```shell
$ !pip install -r yolo/requirements.txt
```

Step 1: Create Dataloader

This section follows Sec 1.1 (build YOLO from scratch): we download the pre-trained weights from the author's official source and use a DataLoader to load and parse them into our DarkNet-53 architecture.

1.0 Example Dataloader

A dataloader reads, decodes and parses the input data. We have created various dataloaders to handle standard input formats for classification, detection and segmentation. If you have non-standard or complex data, you may want to create your own dataloader. It contains a `Decoder` and a `Parser`.

The `Decoder` decodes a TF Example record and returns a dictionary of decoded tensors:

```python
class Decoder(decoder.Decoder):
  """A tf.Example decoder for classification task."""

  def __init__(self):
    """Initializes the decoder.

    The constructor defines the mapping between the field name and the value
    from an input tf.Example. For example, we define two fields for image
    bytes and labels. There is no limit on the number of fields to decode.
    """
    self._keys_to_features = {
        'image/encoded': tf.io.FixedLenFeature((), tf.string, default_value=''),
        'image/class/label': tf.io.FixedLenFeature((), tf.int64, default_value=-1)
    }
```

The `Parser` parses the decoded tensors and performs pre-processing on the input data, such as image decoding, augmentation and resizing. It should have `_parse_train_data` and `_parse_eval_data` functions, in which the processed images and labels are returned.
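To make the pairing concrete, here is a minimal, illustrative `Parser` sketch that consumes the decoded dictionary above; the output size, normalization and augmentation are our own assumptions for illustration, not the Model Garden defaults:

```python
import tensorflow as tf


class Parser:
  """A toy parser pairing with the `Decoder` above (a sketch, not the TFMG API)."""

  def __init__(self, output_size=(224, 224)):
    self._output_size = output_size  # assumed target size for this sketch

  def _decode_and_resize(self, decoded_tensors):
    image = tf.io.decode_image(
        decoded_tensors['image/encoded'], channels=3, expand_animations=False)
    image = tf.image.resize(image, self._output_size) / 255.0
    label = tf.cast(decoded_tensors['image/class/label'], tf.int32)
    return image, label

  def _parse_train_data(self, decoded_tensors):
    image, label = self._decode_and_resize(decoded_tensors)
    image = tf.image.random_flip_left_right(image)  # a simple augmentation
    return image, label

  def _parse_eval_data(self, decoded_tensors):
    return self._decode_and_resize(decoded_tensors)
```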
""" self._keys_to_features = { 'image/encoded': tf.io.FixedLenFeature((), tf.string, default_value=''), 'image/class/label': tf.io.FixedLenFeature((), tf.int64, default_value=-1) }```The `Parser` parses the decoded tensors and performs pre-processing to the input data, such as image decoding, augmentation and resizing, etc. It should have `_parse_train_data` and `_parse_eval_data` functions, in which the processed images and labels are returned. 2.1 Dataloader for YOLOv3wer 2.1.1 Load Dataset from TFDS TensorFlow provide many popular datasets [COCO](https://www.tensorflow.org/datasets/catalog/coco), [Kitti](), [VOC]() etc. For know the datatset, you can explore by this tool **[Know Your Data](https://knowyourdata-tfds.withgoogle.com/dataset=kitti&tab=DATASETS&draw=kyd/kitti/objects_type,bbox,bbox&auto_draw=false)**, e.g. [COCO](https://knowyourdata-tfds.withgoogle.com/dataset=coco&tab=STATS&draw=kyd/coco/objects_label,bbox,bbox&auto_draw=false). *Keywords: TFDS, Know Your Data.* 2.1.2 Preprocess input 2.1.3 Decode and parse 2.3 Utils for YOLOv3The `utils` component is optional. In this case, we need to download and load weights into model. ###Code def preprocess_input(image, net_h, net_w): new_h, new_w, _ = image.shape # determine the new size of the image if (float(net_w)/new_w) < (float(net_h)/new_h): new_h = (new_h * net_w)/new_w new_w = net_w else: new_w = (new_w * net_h)/new_h new_h = net_h # resize the image to the new size resized = cv2.resize(image[:,:,::-1]/255., (int(new_w), int(new_h))) # embed the image into the standard letter box new_image = np.ones((net_h, net_w, 3)) * 0.5 new_image[int((net_h-new_h)//2):int((net_h+new_h)//2), int((net_w-new_w)//2):int((net_w+new_w)//2), :] = resized new_image = np.expand_dims(new_image, 0) return new_image def decode_netout(netout, anchors, obj_thresh, nms_thresh, net_h, net_w): grid_h, grid_w = netout.shape[:2] nb_box = 3 netout = netout.reshape((grid_h, grid_w, nb_box, -1)) nb_class = netout.shape[-1] - 5 boxes = [] netout[..., :2] = _sigmoid(netout[..., :2]) netout[..., 4:] = _sigmoid(netout[..., 4:]) netout[..., 5:] = netout[..., 4][..., np.newaxis] * netout[..., 5:] netout[..., 5:] *= netout[..., 5:] > obj_thresh for i in range(grid_h*grid_w): row = i / grid_w col = i % grid_w for b in range(nb_box): # 4th element is objectness score objectness = netout[int(row)][int(col)][b][4] #objectness = netout[..., :4] if(objectness.all() <= obj_thresh): continue # first 4 elements are x, y, w, and h x, y, w, h = netout[int(row)][int(col)][b][:4] x = (col + x) / grid_w # center position, unit: image width y = (row + y) / grid_h # center position, unit: image height w = anchors[2 * b + 0] * np.exp(w) / net_w # unit: image width h = anchors[2 * b + 1] * np.exp(h) / net_h # unit: image height # last elements are class probabilities classes = netout[int(row)][col][b][5:] box = BoundBox(x-w/2, y-h/2, x+w/2, y+h/2, objectness, classes) #box = BoundBox(x-w/2, y-h/2, x+w/2, y+h/2, None, classes) boxes.append(box) return boxes def correct_yolo_boxes(boxes, image_h, image_w, net_h, net_w): if (float(net_w)/image_w) < (float(net_h)/image_h): new_w = net_w new_h = (image_h*net_w)/image_w else: new_h = net_w new_w = (image_w*net_h)/image_h for i in range(len(boxes)): x_offset, x_scale = (net_w - new_w)/2./net_w, float(new_w)/net_w y_offset, y_scale = (net_h - new_h)/2./net_h, float(new_h)/net_h boxes[i].xmin = int((boxes[i].xmin - x_offset) / x_scale * image_w) boxes[i].xmax = int((boxes[i].xmax - x_offset) / x_scale * image_w) boxes[i].ymin = 
2.1.2 Preprocess input

2.1.3 Decode and parse

2.3 Utils for YOLOv3

The `utils` component is optional. For YOLOv3 we need two kinds of helpers: post-processing utilities for the raw network predictions (defined in the cell below), and routines to download the pre-trained weights and load them into the model (Sec 2.3.1 and 2.3.2).

###Code
def preprocess_input(image, net_h, net_w):
    new_h, new_w, _ = image.shape

    # determine the new size of the image
    if (float(net_w)/new_w) < (float(net_h)/new_h):
        new_h = (new_h * net_w)/new_w
        new_w = net_w
    else:
        new_w = (new_w * net_h)/new_h
        new_h = net_h

    # resize the image to the new size
    resized = cv2.resize(image[:,:,::-1]/255., (int(new_w), int(new_h)))

    # embed the image into the standard letterbox
    new_image = np.ones((net_h, net_w, 3)) * 0.5
    new_image[int((net_h-new_h)//2):int((net_h+new_h)//2),
              int((net_w-new_w)//2):int((net_w+new_w)//2), :] = resized
    new_image = np.expand_dims(new_image, 0)

    return new_image

def decode_netout(netout, anchors, obj_thresh, nms_thresh, net_h, net_w):
    grid_h, grid_w = netout.shape[:2]
    nb_box = 3
    netout = netout.reshape((grid_h, grid_w, nb_box, -1))
    nb_class = netout.shape[-1] - 5

    boxes = []

    netout[..., :2]  = _sigmoid(netout[..., :2])
    netout[..., 4:]  = _sigmoid(netout[..., 4:])
    netout[..., 5:]  = netout[..., 4][..., np.newaxis] * netout[..., 5:]
    netout[..., 5:] *= netout[..., 5:] > obj_thresh

    for i in range(grid_h*grid_w):
        row = i // grid_w
        col = i % grid_w

        for b in range(nb_box):
            # 4th element is the objectness score
            objectness = netout[row][col][b][4]

            if objectness <= obj_thresh:
                continue

            # first 4 elements are x, y, w, and h
            x, y, w, h = netout[row][col][b][:4]

            x = (col + x) / grid_w  # center position, unit: image width
            y = (row + y) / grid_h  # center position, unit: image height
            w = anchors[2 * b + 0] * np.exp(w) / net_w  # unit: image width
            h = anchors[2 * b + 1] * np.exp(h) / net_h  # unit: image height

            # last elements are class probabilities
            classes = netout[row][col][b][5:]

            boxes.append(BoundBox(x-w/2, y-h/2, x+w/2, y+h/2, objectness, classes))

    return boxes

def correct_yolo_boxes(boxes, image_h, image_w, net_h, net_w):
    # map letterboxed coordinates back into the original image frame
    if (float(net_w)/image_w) < (float(net_h)/image_h):
        new_w = net_w
        new_h = (image_h*net_w)/image_w
    else:
        new_h = net_h
        new_w = (image_w*net_h)/image_h

    for i in range(len(boxes)):
        x_offset, x_scale = (net_w - new_w)/2./net_w, float(new_w)/net_w
        y_offset, y_scale = (net_h - new_h)/2./net_h, float(new_h)/net_h

        boxes[i].xmin = int((boxes[i].xmin - x_offset) / x_scale * image_w)
        boxes[i].xmax = int((boxes[i].xmax - x_offset) / x_scale * image_w)
        boxes[i].ymin = int((boxes[i].ymin - y_offset) / y_scale * image_h)
        boxes[i].ymax = int((boxes[i].ymax - y_offset) / y_scale * image_h)

def do_nms(boxes, nms_thresh):
    if len(boxes) > 0:
        nb_class = len(boxes[0].classes)
    else:
        return

    for c in range(nb_class):
        sorted_indices = np.argsort([-box.classes[c] for box in boxes])

        for i in range(len(sorted_indices)):
            index_i = sorted_indices[i]

            if boxes[index_i].classes[c] == 0:
                continue

            for j in range(i+1, len(sorted_indices)):
                index_j = sorted_indices[j]

                # suppress the lower-scored box when the overlap is too large
                if bbox_iou(boxes[index_i], boxes[index_j]) >= nms_thresh:
                    boxes[index_j].classes[c] = 0

def draw_boxes(image, boxes, labels, obj_thresh):
    for box in boxes:
        label_str = ''
        label = -1

        for i in range(len(labels)):
            if box.classes[i] > obj_thresh:
                label_str += labels[i]
                label = i
                print(labels[i] + ': ' + str(box.classes[i]*100) + '%')

        if label >= 0:
            cv2.rectangle(image, (box.xmin, box.ymin), (box.xmax, box.ymax), (0,255,0), 3)
            cv2.putText(image,
                        label_str + ' ' + str(box.get_score()),
                        (box.xmin, box.ymin - 13),
                        cv2.FONT_HERSHEY_SIMPLEX,
                        1e-3 * image.shape[0],
                        (0,255,0), 2)

    return image
###Output
_____no_output_____
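###Markdown
The helpers above chain together into a complete post-processing pipeline. Below is an illustrative sketch (not executed here): it assumes `yolov3` is the weight-loaded model built later in Sec 2.3.2, uses the common COCO YOLOv3 anchors, and a hypothetical input path `data/sample.jpg`; adjust these to your own setup.

```python
import cv2
import numpy as np

net_h, net_w = 416, 416
obj_thresh, nms_thresh = 0.5, 0.45
# Common COCO YOLOv3 anchors, one list per output scale (large -> small objects).
anchors = [[116,90, 156,198, 373,326], [30,61, 62,45, 59,119], [10,13, 16,30, 33,23]]
labels = ['person', 'bicycle', 'car']  # abbreviated; use the full 80-name COCO list

image = cv2.imread('data/sample.jpg')             # hypothetical input image
image_h, image_w, _ = image.shape
new_image = preprocess_input(image, net_h, net_w)

yolos = yolov3.predict(new_image)                 # three output scales
boxes = []
for i in range(len(yolos)):
    boxes += decode_netout(yolos[i][0], anchors[i], obj_thresh, nms_thresh, net_h, net_w)

correct_yolo_boxes(boxes, image_h, image_w, net_h, net_w)
do_nms(boxes, nms_thresh)
image = draw_boxes(image, boxes, labels, obj_thresh)
cv2.imwrite('data/sample_detected.jpg', image)
```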
###Markdown
2.3.1 Download weights

In practice, we can use the pre-trained models to run the detection task directly. Also, if we build custom, improved models, the pre-trained weights can be crucial as a starting point for training towards SOTA results. The weights trained on COCO are provided and can be loaded from the official source.

###Code
# create the folder if it does not exist
!mkdir -p data

# download pre-trained DarkNet weights to `data/yolov[VERSION].weights`
!wget https://pjreddie.com/media/files/yolov1.weights -O data/yolov1.weights
!wget https://pjreddie.com/media/files/yolov2.weights -O data/yolov2.weights
!wget https://pjreddie.com/media/files/yolov3.weights -O data/yolov3.weights
###Output
--2021-07-23 20:46:47--  https://pjreddie.com/media/files/yolov1.weights
Resolving pjreddie.com (pjreddie.com)... 128.208.4.108
Connecting to pjreddie.com (pjreddie.com)|128.208.4.108|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 789312988 (753M) [application/octet-stream]
Saving to: ‘data/yolov1.weights’

data/yolov1.weights 100%[===================>] 752.75M  39.0MB/s    in 20s

2021-07-23 20:47:08 (38.1 MB/s) - ‘data/yolov1.weights’ saved [789312988/789312988]

--2021-07-23 20:47:08--  https://pjreddie.com/media/files/yolov2.weights
Resolving pjreddie.com (pjreddie.com)... 128.208.4.108
Connecting to pjreddie.com (pjreddie.com)|128.208.4.108|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 203934260 (194M) [application/octet-stream]
Saving to: ‘data/yolov2.weights’

data/yolov2.weights 100%[===================>] 194.49M  38.4MB/s    in 5.5s

2021-07-23 20:47:14 (35.6 MB/s) - ‘data/yolov2.weights’ saved [203934260/203934260]

--2021-07-23 20:47:14--  https://pjreddie.com/media/files/yolov3.weights
Resolving pjreddie.com (pjreddie.com)... 128.208.4.108
Connecting to pjreddie.com (pjreddie.com)|128.208.4.108|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 248007048 (237M) [application/octet-stream]
Saving to: ‘data/yolov3.weights’

data/yolov3.weights 100%[===================>] 236.52M  39.6MB/s    in 6.4s

2021-07-23 20:47:21 (36.8 MB/s) - ‘data/yolov3.weights’ saved [248007048/248007048]
###Markdown
2.3.2 Load pre-trained weights ( TODO: working on this part still)

###Code
# input
class WeightReader:  # dataloader?
    def __init__(self, weight_file):
        with open(weight_file, 'rb') as w_f:
            major,    = struct.unpack('i', w_f.read(4))
            minor,    = struct.unpack('i', w_f.read(4))
            revision, = struct.unpack('i', w_f.read(4))

            # newer weight files store a 64-bit "seen" counter, older ones 32-bit
            if (major*10 + minor) >= 2 and major < 1000 and minor < 1000:
                w_f.read(8)
            else:
                w_f.read(4)

            transpose = (major > 1000) or (minor > 1000)

            binary = w_f.read()

        self.offset = 0
        self.all_weights = np.frombuffer(binary, dtype='float32')

    def read_bytes(self, size):
        self.offset = self.offset + size
        return self.all_weights[self.offset-size:self.offset]

    def load_weights(self, model):
        for i in range(106):
            try:
                conv_layer = model.get_layer('conv_' + str(i))
                print("loading weights of convolution #" + str(i))

                # the three detection heads (81, 93, 105) have no batch norm
                if i not in [81, 93, 105]:
                    norm_layer = model.get_layer('bnorm_' + str(i))

                    size = np.prod(norm_layer.get_weights()[0].shape)

                    beta  = self.read_bytes(size)  # bias
                    gamma = self.read_bytes(size)  # scale
                    mean  = self.read_bytes(size)  # mean
                    var   = self.read_bytes(size)  # variance

                    norm_layer.set_weights([gamma, beta, mean, var])

                if len(conv_layer.get_weights()) > 1:
                    bias   = self.read_bytes(np.prod(conv_layer.get_weights()[1].shape))
                    kernel = self.read_bytes(np.prod(conv_layer.get_weights()[0].shape))
                    kernel = kernel.reshape(list(reversed(conv_layer.get_weights()[0].shape)))
                    kernel = kernel.transpose([2, 3, 1, 0])
                    conv_layer.set_weights([kernel, bias])
                else:
                    kernel = self.read_bytes(np.prod(conv_layer.get_weights()[0].shape))
                    kernel = kernel.reshape(list(reversed(conv_layer.get_weights()[0].shape)))
                    kernel = kernel.transpose([2, 3, 1, 0])
                    conv_layer.set_weights([kernel])
            except ValueError:
                print("no convolution #" + str(i))

    def reset(self):
        self.offset = 0

# TODO : config
# make the yolov3 model predict 80 classes on COCO
yolov3 = make_yolov3_model()

# load the weights trained on COCO into the model
# (`weights_path` should point at the downloaded file above)
weight_reader = WeightReader(weights_path)
weight_reader.load_weights(yolov3)
###Output
_____no_output_____
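###Markdown
A quick way to verify the load succeeded is a dummy forward pass (a sketch; the expected shapes assume the standard 416x416 COCO configuration with 3 anchors x (80 classes + 5) = 255 channels per head):

```python
import numpy as np

dummy = np.zeros((1, 416, 416, 3), dtype=np.float32)
for out in yolov3.predict(dummy):
    print(out.shape)
# Expected: (1, 13, 13, 255), (1, 26, 26, 255), (1, 52, 52, 255)
```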
###Markdown
Step 3: Create Model

In this section, we show how the YOLOv3 model can be created from scratch and how to load pre-trained models from TensorFlow Hub. In section 1.1, we provide an example of how to build YOLOv3 from scratch, including the DarkNet architecture and other principal functions in YOLOv3. In section 1.2, we will add the usage of the YOLO collections on TF Hub once the pre-trained models and checkpoints have been published.

3.0 Example Model

In `example_model.py`, we show how to create a new model. The ExampleModel is a subclass of `tf.keras.Model` that defines the necessary parameters. Here, you need `input_specs` to specify the input shape and dimensions, and you build the layers within the constructor:

```python
class ExampleModel(tf.keras.Model):

  def __init__(
      self,
      num_classes: int,
      input_specs: tf.keras.layers.InputSpec = tf.keras.layers.InputSpec(
          shape=[None, None, None, 3]),
      **kwargs):
    # Build layers.
```

Given the `ExampleModel`, you can define a function that takes a model config as input and returns an `ExampleModel` instance, similar to `build_example_model`. As a simple example, we define a single model. However, you can split the model implementation into individual components, such as backbones, decoders and heads, as we do here. Then, in the `build_example_model` function, you can hook these components together to obtain your full model.

( TODO: The `build_example_model` above should link to a code snippet or GitHub gist; the link may be claimed in an internal doc of TF. )

3.1 Modeling for YOLOv3

Implement the Darknet-53 backbone following the model example.

###Code
import argparse
import os
import numpy as np
# from keras.layers import Conv2D, Input, BatchNormalization, LeakyReLU, ZeroPadding2D, UpSampling2D
# from tf.keras.layers.merge import add, concatenate
# from tf.keras.models import Model
import struct
import cv2

## modified from example_model.py => yolo3_model.py
"""A sample model implementation.

This is only a dummy example to showcase how a model is composed. It is
usually not needed to implement a model from scratch. Most SoTA models can be
found and directly used from the `official/vision/beta/modeling` directory.
"""

from typing import Any, Mapping

# Import libraries
import tensorflow as tf

# from official.vision.beta.projects.yolo3 import yolo3_config


@tf.keras.utils.register_keras_serializable(package='Vision')
class Yolo3Model(tf.keras.Model):
  """An example model class.

  A model is a subclass of tf.keras.Model where layers are built in the
  constructor.
  """

  def __init__(
      self,
      num_classes: int,
      input_specs: tf.keras.layers.InputSpec = tf.keras.layers.InputSpec(
          shape=[None, None, None, 3]),
      **kwargs):
    """Initializes the example model.

    All layers are defined in the constructor, and the config is recorded in
    the `_config_dict` object for serialization.

    Args:
      num_classes: The number of classes in the detection task.
      input_specs: A `tf.keras.layers.InputSpec` spec of the input tensor.
      **kwargs: Additional keyword arguments to be passed.
    """
    inputs = tf.keras.Input(shape=input_specs.shape[1:], name=input_specs.name)

    # Layer 0 => 1
    outputs = self.conv_block(inputs, [
        {'filter': 32, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 0},
        {'filter': 64, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 1}
    ], skip=False)

    # Layer 2 => 4 : Layer 4 residual
    outputs = self.conv_block(outputs, [
        {'filter': 32, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 2},
        {'filter': 64, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 3}
    ], skip=True)

    # Layer 5
    outputs = self.conv_block(outputs, [
        {'filter': 128, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 5}
    ], skip=False)

    # Layer 6 => 11 : Layer 8/11 residual
    for i in range(2):
      outputs = self.conv_block(outputs, [
          {'filter': 64, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 6 + i * 3},
          {'filter': 128, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 7 + i * 3}
      ], skip=True)

    # Layer 12
    outputs = self.conv_block(outputs, [
        {'filter': 256, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 12}
    ], skip=False)

    # Layer 13 => 36
    for i in range(8):
      outputs = self.conv_block(outputs, [
          {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 13 + i * 3},
          {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 14 + i * 3}
      ], skip=True)
    skip_36 = outputs

    # Layer 37 => 40
    outputs = self.conv_block(outputs, [
        {'filter': 512, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 37}
    ], skip=False)

    # Layer 41 => 61
    for i in range(8):
      outputs = self.conv_block(outputs, [
          {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 38 + i * 3},
          {'filter': 512, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 39 + i * 3}
      ], skip=True)
    skip_61 = outputs

    # Layer 62
    outputs = self.conv_block(outputs, [
        {'filter': 1024, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 62}
    ], skip=False)

    # Layer 63 => 74
    for i in range(4):
      outputs = self.conv_block(outputs, [
          {'filter': 512, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 63 + i * 3},
          {'filter': 1024, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 64 + i * 3}
      ], skip=True)
    # Layer 75 => 80
    for i in range(3):
      outputs = self.conv_block(outputs, [
          {'filter': 512, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 75 + i * 2},
          {'filter': 1024, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 76 + i * 2}
      ], skip=False)

    ## TODO: check the Layer 81
    # Layer 82
    yolo_82 = self.conv_block(outputs, [
        {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 81}
    ], skip=False)

    ## TODO: check layer_idx
    # Layer 83 => 86
    outputs = self.conv_block(outputs, [
        {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 84},
    ], skip=False)
    outputs = tf.keras.layers.UpSampling2D(2)(outputs)
    outputs = tf.keras.layers.concatenate([outputs, skip_61])

    # Layer 87 => 92
    for i in range(3):
      outputs = self.conv_block(outputs, [
          {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 87 + i * 2},
          {'filter': 512, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 88 + i * 2}
      ], skip=False)

    # Layer 93 => 94
    yolo_94 = self.conv_block(outputs, [
        {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 93}
    ], skip=False)  # single conv, so no residual connection

    # Layer 95 => 98
    outputs = self.conv_block(outputs, [
        {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 96},
    ], skip=False)
    outputs = tf.keras.layers.UpSampling2D(2)(outputs)
    outputs = tf.keras.layers.concatenate([outputs, skip_36])

    # Layer 99 => 106
    yolo_106 = self.conv_block(outputs, [
        {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 99},
        {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 100},
        {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 101},
        {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 102},
        {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 103},
        {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 104},
        {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 105}
    ], skip=False)  # plain stack, no residual

    # Final model: the three detection heads are the model outputs.
    super().__init__(
        inputs=inputs, outputs=[yolo_82, yolo_94, yolo_106],
        name='yolo3_model', **kwargs)
    self._input_specs = input_specs
    self._config_dict = {
        'num_classes': num_classes,
        'input_specs': input_specs
    }

  def conv_block(self, input_img, convs, skip=True):
    x = input_img
    count = 0

    for conv in convs:
      if count == (len(convs) - 2) and skip:
        skip_connection = x
      count += 1

      if conv['stride'] > 1:
        # peculiar padding as darknet prefers left and top
        x = tf.keras.layers.ZeroPadding2D(((1, 0), (1, 0)))(x)
      x = tf.keras.layers.Conv2D(
          filters=conv['filter'],
          kernel_size=conv['kernel'],
          strides=conv['stride'],
          # peculiar padding as darknet prefers left and top
          padding='valid' if conv['stride'] > 1 else 'same',
          name='conv_' + str(conv['layer_idx']),
          use_bias=False if conv['bnorm'] else True)(x)
      if conv['bnorm']:
        x = tf.keras.layers.BatchNormalization(
            name='bnorm_' + str(conv['layer_idx']))(x)
      if conv['leaky']:
        x = tf.keras.layers.LeakyReLU(
            alpha=0.1, name='leaky_' + str(conv['layer_idx']))(x)

    return skip_connection + x if skip else x

  # def get_config(self) -> Mapping[str, Any]:
  #   """Gets the config of this model."""
  #   return self._config_dict

  # @classmethod
  # def from_config(cls, config, custom_objects=None):
  #   """Constructs an instance of this model from input config."""
  #   return cls(**config)
###Output
_____no_output_____
###Markdown
3.2 Ops for YOLOv3

Bounding boxes play a crucial role in YOLOv3; the ops below define the box container and the interval-overlap/IoU computations used by NMS.

###Code
import numpy as np

class BoundBox:
    def __init__(self, xmin, ymin, xmax, ymax, objness=None, classes=None):
        self.xmin = xmin
        self.ymin = ymin
        self.xmax = xmax
        self.ymax = ymax
        self.objness = objness
        self.classes = classes
        self.label = -1
        self.score = -1

    def get_label(self):
        if self.label == -1:
            self.label = np.argmax(self.classes)
        return self.label

    def get_score(self):
        if self.score == -1:
            self.score = self.classes[self.get_label()]
        return self.score

def _interval_overlap(interval_a, interval_b):
    x1, x2 = interval_a
    x3, x4 = interval_b

    if x3 < x1:
        if x4 < x1:
            return 0
        else:
            return min(x2, x4) - x1
    else:
        if x2 < x3:
            return 0
        else:
            return min(x2, x4) - x3

def _sigmoid(x):
    return 1. / (1. + np.exp(-x))

def bbox_iou(box1, box2):
    intersect_w = _interval_overlap([box1.xmin, box1.xmax], [box2.xmin, box2.xmax])
    intersect_h = _interval_overlap([box1.ymin, box1.ymax], [box2.ymin, box2.ymax])

    intersect = intersect_w * intersect_h

    w1, h1 = box1.xmax-box1.xmin, box1.ymax-box1.ymin
    w2, h2 = box2.xmax-box2.xmin, box2.ymax-box2.ymin

    union = w1*h1 + w2*h2 - intersect

    return float(intersect) / union
###Output
_____no_output_____
###Markdown
3.3 Loss for YOLOv3

Loss functions over the bounding boxes are required both for training a custom model and for the detection task.
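As a starting point, a heavily simplified per-scale loss sketch is shown below; the masking, weighting and target encoding are our own assumptions for illustration and do not reproduce the exact loss the released weights were trained with:

```python
import tensorflow as tf

def yolo3_scale_loss(y_true, y_pred, lambda_coord=5.0, lambda_noobj=0.5):
    """Simplified YOLO-style loss for one output scale (a sketch).

    Both tensors are assumed to have shape
    (batch, grid_h, grid_w, n_anchors, 5 + num_classes) with channels
    [x, y, w, h, objectness, class scores...] already in grid units.
    """
    obj_mask = y_true[..., 4]  # 1 where a ground-truth box is assigned

    # localization terms, only where an object is present
    xy_loss = tf.reduce_sum(obj_mask * tf.reduce_sum(
        tf.square(y_true[..., 0:2] - y_pred[..., 0:2]), axis=-1))
    wh_loss = tf.reduce_sum(obj_mask * tf.reduce_sum(
        tf.square(y_true[..., 2:4] - y_pred[..., 2:4]), axis=-1))

    # objectness: down-weight the abundant no-object cells via lambda_noobj
    obj_bce = tf.keras.losses.binary_crossentropy(y_true[..., 4:5], y_pred[..., 4:5])
    obj_loss = tf.reduce_sum(obj_mask * obj_bce)
    noobj_loss = tf.reduce_sum((1.0 - obj_mask) * obj_bce)

    # per-class binary cross-entropy, as in the YOLOv3 paper
    cls_bce = tf.keras.losses.binary_crossentropy(y_true[..., 5:], y_pred[..., 5:])
    cls_loss = tf.reduce_sum(obj_mask * cls_bce)

    return lambda_coord * (xy_loss + wh_loss) + obj_loss + lambda_noobj * noobj_loss + cls_loss
```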
model.""" # return self._config_dict # @classmethod # def from_config(cls, config, custom_objects=None): # """Constructs an instance of this model from input config.""" # return cls(**config) ###Output _____no_output_____ ###Markdown 3.2 Ops for YOLOv3Bounding box in YOLOv3, play a crucial role. ###Code import numpy as np class BoundBox: def __init__(self, xmin, ymin, xmax, ymax, objness = None, classes = None): self.xmin = xmin self.ymin = ymin self.xmax = xmax self.ymax = ymax self.objness = objness self.classes = classes self.label = -1 self.score = -1 def get_label(self): if self.label == -1: self.label = np.argmax(self.classes) return self.label def get_score(self): if self.score == -1: self.score = self.classes[self.get_label()] return self.score def _interval_overlap(interval_a, interval_b): x1, x2 = interval_a x3, x4 = interval_b if x3 < x1: if x4 < x1: return 0 else: return min(x2,x4) - x1 else: if x2 < x3: return 0 else: return min(x2,x4) - x3 def _sigmoid(x): return 1. / (1. + np.exp(-x)) def bbox_iou(box1, box2): intersect_w = _interval_overlap([box1.xmin, box1.xmax], [box2.xmin, box2.xmax]) intersect_h = _interval_overlap([box1.ymin, box1.ymax], [box2.ymin, box2.ymax]) intersect = intersect_w * intersect_h w1, h1 = box1.xmax-box1.xmin, box1.ymax-box1.ymin w2, h2 = box2.xmax-box2.xmin, box2.ymax-box2.ymin union = w1*h1 + w2*h2 - intersect return float(intersect) / union ###Output _____no_output_____ ###Markdown 3.3 Loss for YOLOv3Loss functions with bbx for training a custom model and detaction tasks. Step 1: Create Configcreate config 1.0 Example ConfigNext you will define configs for your project. All configs are defined as dataclass objects, and can have default parameter values.First, you will define your `ExampleDataConfig`. It inherits from `config_definitions.DataConfig` that already defines a few common fields, like `input_path`, `file_type`, `global_batch_size`, etc. You can add more fields in your own config as needed.```[email protected] ExampleDataConfig(cfg.DataConfig): """Input config for training. Add more fields as needed.""" input_path: str = '' global_batch_size: int = 0 is_training: bool = True dtype: str = 'float32' shuffle_buffer_size: int = 10000 cycle_length: int = 10 file_type: str = 'tfrecord'```You can then define you model config `ExampleModel` that inherits from `hyperparams.Config`. Expose your own model parameters here.```[email protected] ExampleModel(hyperparams.Config): """The model config. Used by build_example_model function.""" num_classes: int = 0 input_size: List[int] = dataclasses.field(default_factory=list)```You can then define your `Loss` and `Evaluation` configs.```[email protected] Losses(hyperparams.Config): l2_weight_decay: float = [email protected] Evaluation(hyperparams.Config): top_k: int = 5```Next, you will put all the above configs into an `ExampleTask` config. Here you list the configs for your data, model, loss, and evaluation, etc.Finally, you can define a `tf_vision_example_experiment`, which creates a template for your experiments and fills with default parameters. These default parameter values can be overridden by a `YAML` file, like `example_config_tpu.yaml`. 
Also, make sure you give a unique name to your experiment template by the decorator:```python@exp_factory.register_config_factory('tf_vision_example_experiment')def tf_vision_example_experiment() -> cfg.ExperimentConfig: """Definition of a full example experiment.""" Create and return experiment template.``` ###Code def build_yolo3_model(input_specs: tf.keras.layers.InputSpec, model_config: yolo3_cfg.Yolo3Model, ## TODO **kwargs) -> tf.keras.Model: """Builds and returns the example model. This function is the main entry point to build a model. Commonly, it build a model by building a backbone, decoder and head. An example of building a classification model is at third_party/tensorflow_models/official/vision/beta/modeling/backbones/resnet.py. However, it is not mandatory for all models to have these three pieces exactly. Depending on the task, model can be as simple as the example model here or more complex, such as multi-head architecture. Args: input_specs: The specs of the input layer that defines input size. model_config: The config containing parameters to build a model. **kwargs: Additional keyword arguments to be passed. Returns: A tf.keras.Model object. """ return Yolo3Model( num_classes=model_config.num_classes, input_specs=input_specs, **kwargs) ###Output _____no_output_____ ###Markdown Step 4: Create Tasktasks 4.0 Example TaskA task is a class that encapsules the logic of loading data, building models, performing one-step training and validation, etc. It connects all components together and is called by the base Trainer.You can create your own task by inheriting from base Task, or from one of the tasks we already defined, if most of the operations can be reused. An `ExampleTask` inheriting from `ImageClassificationTask` can be found here. We will go through each important components in the task in the following.- `build_model`: you can instantiate a model you have defined above. It is also good practice to run forward pass with a dummy input to ensure layers within the model are properly initialized, similar as this.- `build_inputs`: here you can instantiate a Decoder object and a Parser object. They are used to create an InputReader that will generate a tf.data.Dataset object.- `build_losses`: it takes groundtruth labels and model outputs as input, and computes the loss. It will be called in train_step and validation_step. You can also define different losses for training and validation, for example, build_train_losses and build_validation_losses. Just make sure they are called by the corresponding functions properly.- `build_metrics`: here you can define your own metrics. It should return a list of tf.keras.metrics.Metric objects. You can create your own metric class by subclassing tf.keras.metrics.Metric.- `train_step` and `validation_step`: they perform one-step training and validation. They take one batch of training/validation data, run forward pass, gather losses and update metrics. They assume the data format is consistency with that from the Parser output. train_step also contains backward pass to update model weights. 4.1 Build tasks for YOLOv3werwer 4.2 Fine-tuning Models Put every pieces together. 
The `example_task.py` includes:- `build_model()` :- `build_inputs()` :- `build_losses()` :- `build_metrics()` :- `train_step()` :- `validation_step()` :- `inference_step()` : ###Code ## modified from example_task.py => yolo3_task.py """An example task definition for image classification.""" from typing import Any, List, Optional, Tuple, Sequence, Mapping import tensorflow as tf from official.common import dataset_fn from official.core import base_task from official.core import task_factory from official.modeling import tf_utils from official.vision.beta.dataloaders import input_reader_factory from official.vision.beta.projects.yolo3 import yolo3_config as exp_cfg from official.vision.beta.projects.yolo3 import yolo3_input from official.vision.beta.projects.yolo3 import yolo3_model @task_factory.register_task_cls(exp_cfg.ExampleTask) class ExampleTask(base_task.Task): """Class of an example task. A task is a subclass of base_task.Task that defines model, input, loss, metric and one training and evaluation step, etc. """ def build_model(self) -> tf.keras.Model: """Builds a model.""" input_specs = tf.keras.layers.InputSpec(shape=[None] + self.task_config.model.input_size) model = yolo3_model.build_yolo3_model( input_specs=input_specs, model_config=self.task_config.model) return model def build_inputs( self, params: exp_cfg.ExampleDataConfig, input_context: Optional[tf.distribute.InputContext] = None ) -> tf.data.Dataset: """Builds input. The input from this function is a tf.data.Dataset that has gone through pre-processing steps, such as augmentation, batching, shuffuling, etc. Args: params: The experiment config. input_context: An optional InputContext used by input reader. Returns: A tf.data.Dataset object. """ num_classes = self.task_config.model.num_classes input_size = self.task_config.model.input_size decoder = example_input.Decoder() parser = example_input.Parser( output_size=input_size[:2], num_classes=num_classes) reader = input_reader_factory.input_reader_generator( params, dataset_fn=dataset_fn.pick_dataset_fn(params.file_type), decoder_fn=decoder.decode, parser_fn=parser.parse_fn(params.is_training)) dataset = reader.read(input_context=input_context) return dataset def build_losses(self, labels: tf.Tensor, model_outputs: tf.Tensor, aux_losses: Optional[Any] = None) -> tf.Tensor: """Builds losses for training and validation. Args: labels: Input groundtruth labels. model_outputs: Output of the model. aux_losses: The auxiliarly loss tensors, i.e. `losses` in tf.keras.Model. Returns: The total loss tensor. """ total_loss = tf.keras.losses.sparse_categorical_crossentropy( labels, model_outputs, from_logits=True) total_loss = tf_utils.safe_mean(total_loss) if aux_losses: total_loss += tf.add_n(aux_losses) return total_loss def build_metrics(self, training: bool = True) -> Sequence[tf.keras.metrics.Metric]: """Gets streaming metrics for training/validation. This function builds and returns a list of metrics to compute during training and validation. The list contains objects of subclasses of tf.keras.metrics.Metric. Training and validation can have different metrics. Args: training: Whether the metric is for training or not. Returns: A list of tf.keras.metrics.Metric objects. 
""" k = self.task_config.evaluation.top_k metrics = [ tf.keras.metrics.SparseCategoricalAccuracy(name='accuracy'), tf.keras.metrics.SparseTopKCategoricalAccuracy( k=k, name='top_{}_accuracy'.format(k)) ] return metrics def train_step(self, inputs: Tuple[Any, Any], model: tf.keras.Model, optimizer: tf.keras.optimizers.Optimizer, metrics: Optional[List[Any]] = None) -> Mapping[str, Any]: """Does forward and backward. This example assumes input is a tuple of (features, labels), which follows the output from data loader, i.e., Parser. The output from Parser is fed into train_step to perform one step forward and backward pass. Other data structure, such as dictionary, can also be used, as long as it is consistent between output from Parser and input used here. Args: inputs: A tuple of of input tensors of (features, labels). model: A tf.keras.Model instance. optimizer: The optimizer for this training step. metrics: A nested structure of metrics objects. Returns: A dictionary of logs. """ features, labels = inputs num_replicas = tf.distribute.get_strategy().num_replicas_in_sync with tf.GradientTape() as tape: outputs = model(features, training=True) # Casting output layer as float32 is necessary when mixed_precision is # mixed_float16 or mixed_bfloat16 to ensure output is casted as float32. outputs = tf.nest.map_structure(lambda x: tf.cast(x, tf.float32), outputs) # Computes per-replica loss. loss = self.build_losses( model_outputs=outputs, labels=labels, aux_losses=model.losses) # Scales loss as the default gradients allreduce performs sum inside the # optimizer. scaled_loss = loss / num_replicas # For mixed_precision policy, when LossScaleOptimizer is used, loss is # scaled for numerical stability. if isinstance(optimizer, tf.keras.mixed_precision.LossScaleOptimizer): scaled_loss = optimizer.get_scaled_loss(scaled_loss) tvars = model.trainable_variables grads = tape.gradient(scaled_loss, tvars) # Scales back gradient before apply_gradients when LossScaleOptimizer is # used. if isinstance(optimizer, tf.keras.mixed_precision.LossScaleOptimizer): grads = optimizer.get_unscaled_gradients(grads) optimizer.apply_gradients(list(zip(grads, tvars))) logs = {self.loss: loss} if metrics: self.process_metrics(metrics, labels, outputs) return logs def validation_step(self, inputs: Tuple[Any, Any], model: tf.keras.Model, metrics: Optional[List[Any]] = None) -> Mapping[str, Any]: """Runs validatation step. Args: inputs: A tuple of of input tensors of (features, labels). model: A tf.keras.Model instance. metrics: A nested structure of metrics objects. Returns: A dictionary of logs. """ features, labels = inputs outputs = self.inference_step(features, model) outputs = tf.nest.map_structure(lambda x: tf.cast(x, tf.float32), outputs) loss = self.build_losses( model_outputs=outputs, labels=labels, aux_losses=model.losses) logs = {self.loss: loss} if metrics: self.process_metrics(metrics, labels, outputs) return logs def inference_step(self, inputs: tf.Tensor, model: tf.keras.Model) -> Any: """Performs the forward step. It is used in validation_step.""" return model(inputs, training=False) ###Output _____no_output_____ ###Markdown Step 5: Common / Import Registrywerwe 5.0 Import Registry ExampleTo use your custom dataloaders, models, tasks, etc., you will need to register them properly. The recommended way is to have a single file with all relevant files imported, for example, `registry_imports.py`. 
You can see in this file we import all our custom components:

```python
"""All necessary imports for registration.

Custom models, tasks, configs, etc. need to be imported to the registry so they
can be picked up by the trainer. They can be included in this file so you do
not need to handle each file separately.
"""
# pylint: disable=unused-import
from tensorflow_models.official.common import registry_imports
from tensorflow_models.official.vision.beta.projects.example import example_config
from tensorflow_models.official.vision.beta.projects.example import example_input
from tensorflow_models.official.vision.beta.projects.example import example_model
from tensorflow_models.official.vision.beta.projects.example import example_task
```

5.1 Import Registry for YOLOv3

###Code
# pylint: disable=unused-import
from tensorflow_models.official.common import registry_imports
from tensorflow_models.official.vision.beta.projects.yolo3 import yolo3_config
from tensorflow_models.official.vision.beta.projects.yolo3 import yolo3_input
from tensorflow_models.official.vision.beta.projects.yolo3 import yolo3_model
from tensorflow_models.official.vision.beta.projects.yolo3 import yolo3_task

###Output _____no_output_____

###Markdown Step 6: Train ModelUse the Model Zoo as a baseline, then train the model. 6.0 Training ExampleYou can create your own trainer by branching from our core trainer. Just make sure you import the registry like this:

```python
"""TensorFlow Model Garden Vision trainer.

All custom registries are imported from registry_imports. Here we use the
default trainer, so we directly call train.main. If you need to customize the
trainer, branch from `official/vision/beta/train.py` and make changes.
"""
from absl import app

from official.common import flags as tfm_flags
from official.vision.beta import train
from official.vision.beta.projects.example import registry_imports  # pylint: disable=unused-import

if __name__ == '__main__':
  tfm_flags.define_flags()
  app.run(train.main)
```

You can run training locally for testing purposes:

```bash
blaze run -c opt \
//third_party/tensorflow_models/official/vision/beta/projects/example:train -- \
  --experiment=tf_vision_example_experiment \
  --config_file=$PWD/third_party/tensorflow_models/official/vision/beta/projects/example/example_config_local.yaml \
  --mode=train \
  --model_dir=/tmp/tfvision_test/ \
  --alsologtostderr
```

6.1 Training the YOLOv3 model

###Code
from tensorflow_models.official.vision.beta.projects.yolo3 import registry_imports

###Output _____no_output_____

###Markdown Step 7: Write DemoWe recommend writing a **Colab** demo or inference tutorial so that your TFMG project and its model can be easily reused. We also provide an example [here](). The remainder of this notebook preserves an earlier version of the tutorial, also available as a Colab gist: https://colab.research.google.com/gist/evawyf/8f38bf6722cdb6296c83c5f4e6812a3f/tfmg-project-tutorial.ipynb After installing TensorFlow, install the **TensorFlow Hub** package: - `pip install tensorflow-hub`: the stable version, which works for both TensorFlow 2 and TensorFlow 1. We recommend that new users start with TensorFlow 2 right away, and that current users upgrade to it. - `pip install tf-hub-nightly`: built automatically from the source code on GitHub, with no release testing. - For more details, please check the TF Hub official documentation [here](https://www.tensorflow.org/hub/installation)

```bash
$ pip install -q tensorflow tensorflow-hub tf-models-nightly tfds-nightly
```

The TensorFlow Model Garden (TFMG) has a modular structure, supporting component re-use between exemplar implementations.
Modularity both simplifies implementation, andaccelerates innovation: model components can be recombined into a new model per-forming a different function. For example, the YOLO family is targeted towards object detection, but can be used for image classification by connecting an image classification head to the current backbone.In this Colab Notebook, we will be using the YOLOv3 model to show how to create a TensorFlow Model Garden project by following [TFMG components](https://github.com/tensorflow/models/tree/master/official/vision/beta/projects/example). Also, TensorFlow provides compenhensive framework and API to support your TFMG project, including [TensorFlow Hub](https://www.tensorflow.org/hub) and [TensorFlow Datasets](https://www.tensorflow.org/datasets). In this tutorial, we provide a step-by-step example which works for both building a model from scratch and directly loading the pre-trained model from TensorFlow Hub for detection. 1.1 Build YOLOv3 from scratchYOLOv3 uses a new network architecture (DarkNet-53) for performing feature extraction, it's a hybrid approach between the network used in YOLOv2 (DarkNet-19) and residual network stuff. For the crucial functions of YOLOv3 (e.g. bounding boxes, non-max suppression, and IoU), we will provide the functions in the Step 1 (Create Model) and will use them in Step 4 (Create Task). 1.1.1 Build the DarkNet-53 ArchitectureThe network architecture of the model will be the **Model** component. Component: `example_model.py`( TODO: need to check 81-83 layers) ###Code ## modifed from exmaple_model.py => yolo3_model.py """A sample model implementation. This is only a dummy example to showcase how a model is composed. It is usually not needed to implement a modedl from scratch. Most SoTA models can be found and directly used from `official/vision/beta/modeling` directory. """ from typing import Any, Mapping # Import libraries import tensorflow as tf from official.vision.beta.projects.yolo3 import yolo3_config as yolo3_cfg ## TODO @tf.keras.utils.register_keras_serializable(package='Vision') class Yolo3Model(tf.keras.Model): """A example model class. A model is a subclass of tf.keras.Model where layers are built in the constructor. """ def __init__( self, num_classes: int, input_specs: tf.keras.layers.InputSpec = tf.keras.layers.InputSpec( shape=[None, None, None, 3]), **kwargs): """Initializes the example model. All layers are defined in the constructor, and config is recorded in the `_config_dict` object for serialization. Args: num_classes: The number of classes in classification task. input_specs: A `tf.keras.layers.InputSpec` spec of the input tensor. **kwargs: Additional keyword arguments to be passed. 
""" inputs = tf.keras.Input(shape=input_specs.shape[1:], name=input_specs.name) # Layer 0 => 1 : outputs = self.conv_block(inputs, [ {'filter': 32, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 0}, {'filter': 64, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 1} ], skip=False) # Layer 2 => 4 : Layer 4 Residual outputs = self.conv_block(outputs, [ {'filter': 32, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 2}, {'filter': 64, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 3} ], skip=True) # Layer 5 outputs = self.conv_block(outputs, [ {'filter': 128, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 5} ], skip=False) # Layer 6 => 11 : Layer 8/11 Residual for i in range(2): outputs = self.conv_block(outputs, [ {'filter': 64, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 6 + i * 3}, {'filter': 128, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 7 + i * 3} ], skip=True) # Layer 12 outputs = self.conv_block(outputs, [ {'filter': 256, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 12} ], skip=False) # Layer 13 => 36 for i in range(8): outputs = self.conv_block(outputs, [ {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 13 + i * 3}, {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 14 + i * 3} ], skip=True) skip_36 = outputs # Layer 37 => 40 outputs = self.conv_block(outputs, [ {'filter': 512, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 37} ], skip=False) # Layer 41 => 61 for i in range(8): outputs = self.conv_block(outputs, [ {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 38 + i * 3}, {'filter': 512, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 39 + i * 3} ]) skip_61 = outputs # Layer 62 outputs = self.conv_block(outputs, [ {'filter': 1024, 'kernel': 3, 'stride': 2, 'bnorm': True, 'leaky': True, 'layer_idx': 62} ], skip=False) # Layer 63 => 74 for i in range(4): outputs = self.conv_block(outputs, [ {'filter': 512, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 63 + i * 3}, {'filter': 1024, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 64 + i * 3} ], skip=True) # Layer 75 => 80 for i in range(3): outputs = self.conv_block(outputs, [ {'filter': 512, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 75 + i * 2}, {'filter': 1024, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 76 + i * 2} ], skip=False) ## TODO: check the Layer 81 # Layer 82 yolo_82 = self.conv_block(outputs, [ {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 81} ], skip=False) ## TODO: check layer_idx # Layer 83 => 86 outputs = self.conv_block(outputs, [ {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 84}, ], skip=False) outputs = tf.keras.layers.UpSampling2D(2)(outputs) outputs = tf.keras.layers.concatenate([outputs, skip_61]) # Layer 87 => 92 for i in range(3): outputs = self.conv_block(outputs, [ {'filter': 256, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 87 + i * 2}, {'filter': 512, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 88 + i * 2} ], skip=False) # Layer 93 => 94 yolo_94 = self.conv_block(outputs, [ {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 93} 
]) # Layer 95 => 98 outputs = self.conv_block(outputs, [ {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 96}, ]) outputs = tf.keras.layers.UpSampling2D(2)(outputs) outputs = tf.keras.layers.concatenate([outputs, skip_36]) # Layer 99 => 106 yolo_106 = self.conv_block(outputs, [ {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 99}, {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 100}, {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 101}, {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 102}, {'filter': 128, 'kernel': 1, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 103}, {'filter': 256, 'kernel': 3, 'stride': 1, 'bnorm': True, 'leaky': True, 'layer_idx': 104}, {'filter': 255, 'kernel': 1, 'stride': 1, 'bnorm': False, 'leaky': False, 'layer_idx': 105} ]) # final model self.model = tf.keras.Model(inputs, [yolo_82, yolo_94, yolo_106], name='yolo3_model') super().__init__(inputs=inputs, outputs=outputs, **kwargs) self._input_specs = input_specs self._config_dict = {'num_classes': num_classes, 'input_specs': input_specs} def conv_block(self, input_img, convs, skip=True): x = input_img count = 0 for conv in convs: if count == (len(convs) - 2) and skip: skip_connection = x count += 1 if conv['stride'] > 1: # peculiar padding as darknet prefer left and top x = tf.keras.layers.ZeroPadding2D(((1, 0), (1, 0)))(x) x = tf.keras.layers.Conv2D( filters=conv['filter'], kernel_size=conv['kernel'], strides=conv['stride'], # peculiar padding as darknet prefer left and top padding='valid' if conv['stride'] > 1 else 'same', name='conv_' + str(conv['layer_idx']), use_bias=False if conv['bnorm'] else True )(x) if conv['bnorm']: x = tf.keras.layers.BatchNormalization( name='bnorm_' + str(conv['layer_idx']))(x) if conv['leaky']: x = tf.keras.layers.LeakyReLU( alpha=0.1, name='leaky_' + str(conv['layer_idx']))(x) return skip_connection + x if skip else x def get_config(self) -> Mapping[str, Any]: """Gets the config of this model.""" return self._config_dict @classmethod def from_config(cls, config, custom_objects=None): """Constructs an instance of this model from input config.""" return cls(**config) def build_yolo3_model(input_specs: tf.keras.layers.InputSpec, model_config: yolo3_cfg.Yolo3Model, ## TODO **kwargs) -> tf.keras.Model: """Builds and returns the example model. This function is the main entry point to build a model. Commonly, it build a model by building a backbone, decoder and head. An example of building a classification model is at third_party/tensorflow_models/official/vision/beta/modeling/backbones/resnet.py. However, it is not mandatory for all models to have these three pieces exactly. Depending on the task, model can be as simple as the example model here or more complex, such as multi-head architecture. Args: input_specs: The specs of the input layer that defines input size. model_config: The config containing parameters to build a model. **kwargs: Additional keyword arguments to be passed. Returns: A tf.keras.Model object. 
""" return Yolo3Model( num_classes=model_config.num_classes, input_specs=input_specs, **kwargs) ###Output _____no_output_____ ###Markdown ( TODO: helping check architecture, it will be removed after)```bash$ ./darknet detect cfg/yolov3.cfg yolov3.weights data/dog.jpg [2:24:08]layer filters size input output 0 conv 32 3 x 3 / 1 608 x 608 x 3 -> 608 x 608 x 32 0.639 BFLOPs 1 conv 64 3 x 3 / 2 608 x 608 x 32 -> 304 x 304 x 64 3.407 BFLOPs 2 conv 32 1 x 1 / 1 304 x 304 x 64 -> 304 x 304 x 32 0.379 BFLOPs 3 conv 64 3 x 3 / 1 304 x 304 x 32 -> 304 x 304 x 64 3.407 BFLOPs 4 res 1 304 x 304 x 64 -> 304 x 304 x 64 5 conv 128 3 x 3 / 2 304 x 304 x 64 -> 152 x 152 x 128 3.407 BFLOPs 6 conv 64 1 x 1 / 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BFLOPs 7 conv 128 3 x 3 / 1 152 x 152 x 64 -> 152 x 152 x 128 3.407 BFLOPs 8 res 5 152 x 152 x 128 -> 152 x 152 x 128 9 conv 64 1 x 1 / 1 152 x 152 x 128 -> 152 x 152 x 64 0.379 BFLOPs 10 conv 128 3 x 3 / 1 152 x 152 x 64 -> 152 x 152 x 128 3.407 BFLOPs 11 res 8 152 x 152 x 128 -> 152 x 152 x 128 12 conv 256 3 x 3 / 2 152 x 152 x 128 -> 76 x 76 x 256 3.407 BFLOPs 13 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 14 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 15 res 12 76 x 76 x 256 -> 76 x 76 x 256 16 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 17 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 18 res 15 76 x 76 x 256 -> 76 x 76 x 256 19 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 20 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 21 res 18 76 x 76 x 256 -> 76 x 76 x 256 22 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 23 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 24 res 21 76 x 76 x 256 -> 76 x 76 x 256 25 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 26 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 27 res 24 76 x 76 x 256 -> 76 x 76 x 256 28 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 29 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 30 res 27 76 x 76 x 256 -> 76 x 76 x 256 31 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 32 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 33 res 30 76 x 76 x 256 -> 76 x 76 x 256 34 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 35 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 36 res 33 76 x 76 x 256 -> 76 x 76 x 256 37 conv 512 3 x 3 / 2 76 x 76 x 256 -> 38 x 38 x 512 3.407 BFLOPs 38 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 39 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 40 res 37 38 x 38 x 512 -> 38 x 38 x 512 41 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 42 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 43 res 40 38 x 38 x 512 -> 38 x 38 x 512 44 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 45 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 46 res 43 38 x 38 x 512 -> 38 x 38 x 512 47 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 48 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 49 res 46 38 x 38 x 512 -> 38 x 38 x 512 50 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 51 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 52 res 49 38 x 38 x 512 -> 38 x 38 x 512 53 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 54 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 
55 res 52 38 x 38 x 512 -> 38 x 38 x 512 56 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 57 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 58 res 55 38 x 38 x 512 -> 38 x 38 x 512 59 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 60 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 61 res 58 38 x 38 x 512 -> 38 x 38 x 512 62 conv 1024 3 x 3 / 2 38 x 38 x 512 -> 19 x 19 x1024 3.407 BFLOPs 63 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 64 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 65 res 62 19 x 19 x1024 -> 19 x 19 x1024 66 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 67 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 68 res 65 19 x 19 x1024 -> 19 x 19 x1024 69 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 70 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 71 res 68 19 x 19 x1024 -> 19 x 19 x1024 72 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 73 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 74 res 71 19 x 19 x1024 -> 19 x 19 x1024 75 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 76 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 77 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 78 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 79 conv 512 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 512 0.379 BFLOPs 80 conv 1024 3 x 3 / 1 19 x 19 x 512 -> 19 x 19 x1024 3.407 BFLOPs 81 conv 255 1 x 1 / 1 19 x 19 x1024 -> 19 x 19 x 255 0.189 BFLOPs 82 yolo 83 route 79 84 conv 256 1 x 1 / 1 19 x 19 x 512 -> 19 x 19 x 256 0.095 BFLOPs 85 upsample 2x 19 x 19 x 256 -> 38 x 38 x 256 86 route 85 61 87 conv 256 1 x 1 / 1 38 x 38 x 768 -> 38 x 38 x 256 0.568 BFLOPs 88 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 89 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 90 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 91 conv 256 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 256 0.379 BFLOPs 92 conv 512 3 x 3 / 1 38 x 38 x 256 -> 38 x 38 x 512 3.407 BFLOPs 93 conv 255 1 x 1 / 1 38 x 38 x 512 -> 38 x 38 x 255 0.377 BFLOPs 94 yolo 95 route 91 96 conv 128 1 x 1 / 1 38 x 38 x 256 -> 38 x 38 x 128 0.095 BFLOPs 97 upsample 2x 38 x 38 x 128 -> 76 x 76 x 128 98 route 97 36 99 conv 128 1 x 1 / 1 76 x 76 x 384 -> 76 x 76 x 128 0.568 BFLOPs 100 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 101 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 102 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 103 conv 128 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 128 0.379 BFLOPs 104 conv 256 3 x 3 / 1 76 x 76 x 128 -> 76 x 76 x 256 3.407 BFLOPs 105 conv 255 1 x 1 / 1 76 x 76 x 256 -> 76 x 76 x 255 0.754 BFLOPs 106 yoloLoading weights from yolov3.weights...Done!data/dog.jpg: Predicted in 24.269431 seconds.bicycle: 99%truck: 92%dog: 100%``` 1.2 Load from TensorFlow Hub ( TODO: This section will be added after YOLO collection published. )TensorFlow Hub is a repository of trained machine learning models ready for fine-tuning and deployable anywhere. Reuse trained models like BERT and Faster R-CNN with just a few lines of code.We will update the TF hub link here after the YOLO collections published. 
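Once the collection is live, loading should follow the standard TF Hub pattern. A sketch is shown below; the model handle is a hypothetical placeholder, not a published model, so this will not run until the real handle replaces it:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Hypothetical handle; substitute the real one once the YOLO collection is published.
detector = hub.KerasLayer("https://tfhub.dev/<publisher>/yolo_v3/1")
image = tf.random.uniform([1, 416, 416, 3])  # stand-in for a preprocessed image batch
outputs = detector(image)
```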
1.2.1 SavedModels in TensorFlow Hub (`tfhub.dev`)> (TODO) Publishing process: https://www.tensorflow.org/hub/publish```pythonimport tensorflow_hub as hubmodel = hub.KerasLayer("https://tfhub.dev/google/nnlm-en-dim128/2")embeddings = model(["The rain in Spain.", "falls", "mainly", "In the plain!"])print(embeddings.shape) (4,128)``` 1.2.2 Example Using SavedModels from TF Hub> (TODO) SavedModels from TF Hub in TensorFlow 2: https://www.tensorflow.org/hub/tf2_saved_modelLoad YOLO-Base from TensorFlow Hub, as part of the [YOLO collection](LINK).The following code will:- Load a YOLO KerasLayer from [tfhub.dev](https://tfhub.dev).- Wrap the layer in a [Keras Model](https://www.tensorflow.org/api_docs/python/tf/keras/Model).- Load an example image, and do object detection. Step 2: Create DataloaderThis section is following the Sec 1.1 (build YOLO from scratch), we can download pre-trained weigths from author's official source. We will have a `DataLoader` to load and parse weights into our DarkNet-53 architecture. TensorFlow Datasets Instruction View on TensorFlow.org Run in Google Colab View source on GitHub Download notebook ###Code %env JOBLIB_TEMP_FOLDER=/tmp # rm -r /tmp/ !pip install -q tfds-nightly tensorflow matplotlib # !pip install tensorflow-datasets import tensorflow_datasets as tfds import tensorflow as tf # Construct a tf.data.Dataset ds = tfds.load('voc', split='train', as_supervised=False, shuffle_files=True) # Build your input pipeline ds = ds.shuffle(1000).batch(128).prefetch(10).take(5) %matplotlib inline # tfds works in both Eager and Graph modes tf.compat.v1.enable_eager_execution() dataset = tfds.load("voc", split=tfds.Split.TRAIN, batch_size=16) dataset = dataset.shuffle(10) for feature in dataset.take(1): print(feature['image'].shape) plt.imshow(feature['image'][5]) plt.show() print(feature['objects']['bbox'][5]) ###Output Downloading and preparing dataset 868.85 MiB (download: 868.85 MiB, generated: Unknown size, total: 868.85 MiB) to /root/tensorflow_datasets/voc/2007/4.0.0...
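###Markdown The official `yolov3.weights` file is a flat binary: a short header followed by every layer's parameters as float32 values (this is why `struct` is imported at the top of this notebook). Below is a minimal reader sketch following the standard Darknet header layout; the logic that actually assigns these values into the Keras layers would build on `read_bytes`. ###Code
import struct
import numpy as np

class WeightReader:
    """Sketch: walks the yolov3.weights binary (header + flat float32 array)."""
    def __init__(self, weight_file):
        with open(weight_file, 'rb') as f:
            major, = struct.unpack('i', f.read(4))
            minor, = struct.unpack('i', f.read(4))
            revision, = struct.unpack('i', f.read(4))
            # The "images seen" counter is 8 bytes in newer files, 4 in older ones.
            if (major * 10 + minor) >= 2 and major < 1000 and minor < 1000:
                f.read(8)
            else:
                f.read(4)
            self.all_weights = np.frombuffer(f.read(), dtype='float32')
        self.offset = 0

    def read_bytes(self, size):
        # Returns the next `size` float32 values from the flat weight array.
        self.offset += size
        return self.all_weights[self.offset - size:self.offset]
###Output _____no_output_____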
demo/musicGenRNN_Demo.ipynb
###Markdown IntroIn this demo, we construct a RNN that will learn to output piano music after being trained on a number of piano compositions in the MIDI file format. ###Code !sudo apt install -y fluidsynth !pip install --upgrade pyfluidsynth !pip install pretty_midi import collections import datetime import fluidsynth import glob import numpy as np import pathlib import pandas as pd import pretty_midi import seaborn as sns import tensorflow as tf import tensorflow.keras as keras from IPython import display from matplotlib import pyplot as plt from typing import Dict, List, Optional, Sequence, Tuple seed = 42 tf.random.set_seed(seed) np.random.seed(seed) #Sampling Rate for Audio Playback _SAMPLING_RATE = 16000 ###Output _____no_output_____ ###Markdown DataThis RNN can be trained on any number of MIDI files but, for the sake of brevity, we have uploaded 10 to use as data for this demo. ###Code #Import Data data_dir = pathlib.Path("data/example_midi_data") !wget -O data.zip https://www.dropbox.com/s/vho3um36lpwmdri/example_midi_data.zip?dl=0 !unzip -d data data.zip #data_dir = pathlib.Path("data/maestro-v2.0.0") #if not data_dir.exists(): #tf.keras.utils.get_file('maestro-v2.0.0-midi.zip', origin='https://storage.googleapis.com/magentadata/datasets/maestro/v2.0.0/maestro-v2.0.0-midi.zip', extract=True, cache_dir='.', cache_subdir='data') filenames = glob.glob(str(data_dir/"*.mid")) print("Number of Files:", len(filenames)) sample_file = filenames[1] print(sample_file) pm = pretty_midi.PrettyMIDI(sample_file) ###Output _____no_output_____ ###Markdown Helper FunctionsThere are a number of helper functions that are necessary for this method of neural net music generation to operate. Some coming from python packages and some coming courtesy of TensorFlow, they allow us to do things like convert MIDI files to our input data and vice-versa, display the music in piano roll form, generate notes with the trained model, among others. ###Code #Display Audio def display_audio(pm: pretty_midi.PrettyMIDI, seconds = 30): waveform = pm.fluidsynth(fs=_SAMPLING_RATE) #Take a sample of the generated waveform to mitigate kernel resets waveform_short = waveform[:seconds*_SAMPLING_RATE] return display.Audio(waveform_short, rate=_SAMPLING_RATE) #display_audio(pm) print("Number of Instruments:", len(pm.instruments)) instrument = pm.instruments[0] instrument_name = pretty_midi.program_to_instrument_name(instrument.program) print("Instrument Name:", instrument_name) for i, note in enumerate(instrument.notes[:10]): note_name = pretty_midi.note_number_to_name(note.pitch) duration = note.end - note.start print(f"{i}: pitch={note.pitch}, note_name={note_name}," f"duration={duration:.4f}") #Midi To Notes def midi_to_notes(midi_file: str) -> pd.DataFrame: pm = pretty_midi.PrettyMIDI(midi_file) instrument = pm.instruments[0] notes = collections.defaultdict(list) #sort the notes by start time sorted_notes = sorted(instrument.notes, key=lambda note: note.start) prev_start = sorted_notes[0].start for note in sorted_notes: start = note.start end = note.end notes['pitch'].append(note.pitch) notes['start'].append(start) notes['end'].append(end) notes['step'].append(start - prev_start) notes['duration'].append(end - start) prev_start = start return pd.DataFrame({name: np.array(value) for name, value in notes.items()}) ###Output _____no_output_____ ###Markdown Data FormatThe MIDI-To-Notes converts a midi file into the data we are training the net on. It converts each note into three separate values; pitch, step, and duration. 
The pitch represents the frequency of the sound, the step is the amount of time that has passed since the last played note, and the duration is how long the note sounds out for. ###Code #Notes by Pitch raw_notes = midi_to_notes(sample_file) raw_notes.head() #Notes by Note get_note_names = np.vectorize(pretty_midi.note_number_to_name) sample_note_names = get_note_names(raw_notes["pitch"]) sample_note_names[:10] #Plotting def plot_piano_roll(notes: pd.DataFrame, count: Optional[int] = None): if count: title = f'First {count} notes' else: title = f'Whole track' count = len(notes['pitch']) plt.figure(figsize=(20, 4)) plot_pitch = np.stack([notes['pitch'], notes['pitch']], axis=0) plot_start_stop = np.stack([notes['start'], notes['end']], axis=0) plt.plot(plot_start_stop[:, :count], plot_pitch[:, :count], color="b", marker=".") plt.xlabel('Time [s]') plt.ylabel('Pitch') _ = plt.title(title) #plot_piano_roll(raw_notes, count=100) #plot_piano_roll(raw_notes) def plot_distributions(notes: pd.DataFrame, drop_percentile=2.5): plt.figure(figsize=[15, 5]) plt.subplot(1, 3, 1) sns.histplot(notes, x="pitch", bins=20) plt.subplot(1, 3, 2) max_step = np.percentile(notes['step'], 100 - drop_percentile) sns.histplot(notes, x="step", bins=np.linspace(0, max_step, 21)) plt.subplot(1, 3, 3) max_duration = np.percentile(notes['duration'], 100 - drop_percentile) sns.histplot(notes, x="duration", bins=np.linspace(0, max_duration, 21)) #plot_distributions(raw_notes) #Create MIDI File def notes_to_midi(notes: pd.DataFrame, out_file: str, instrument_name: str, velocity: int = 100) -> pretty_midi.PrettyMIDI: pm = pretty_midi.PrettyMIDI() instrument = pretty_midi.Instrument(program=pretty_midi.instrument_name_to_program(instrument_name)) prev_start = 0 for i, note in notes.iterrows(): start = float(prev_start + note['step']) end = float(start + note['duration']) note = pretty_midi.Note(velocity=velocity, pitch=int(note['pitch']), start=start, end=end) instrument.notes.append(note) prev_start = start #Multiple Instruments Add Below To Loop? pm.instruments.append(instrument) pm.write(out_file) return pm #example_file = 'example.midi' #example_pm = notes_to_midi(raw_notes, out_file=example_file, instrument_name=instrument_name) #display_audio(example_pm) #Create Training Dataset num_files = 10 all_notes = [] for f in filenames[0:num_files]: notes = midi_to_notes(f) all_notes.append(notes) all_notes = pd.concat(all_notes) n_notes = len(all_notes) print("Number of Parsed Notes:", n_notes) #Create Dataset from Notes key_order = ["pitch", "step", "duration"] train_notes = np.stack([all_notes[key] for key in key_order], axis=1) notes_ds = tf.data.Dataset.from_tensor_slices(train_notes) notes_ds.element_spec ###Output _____no_output_____ ###Markdown SequencesThe network is trained on sequences of notes, followed by a "next note" so that, given a sequence, it has a better idea of what the next pitch, step, and duration should be. 
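The `window` / `flat_map` pattern used in `create_sequences` below can be hard to visualize, so here is a tiny self-contained illustration:

```python
import tensorflow as tf

# window(3, shift=1) over [0..4] yields [0,1,2], [1,2,3], [2,3,4];
# in create_sequences the last element of each window becomes the label.
ds = tf.data.Dataset.range(5)
windows = ds.window(3, shift=1, stride=1, drop_remainder=True)
sequences = windows.flat_map(lambda w: w.batch(3, drop_remainder=True))
for seq in sequences:
    print(seq.numpy())
```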
###Code #Create Sequences so Net Will Train to Predict Next Note in Sequence def create_sequences(dataset: tf.data.Dataset, seq_length: int, vocab_size = 128) -> tf.data.Dataset: #Returns TF Dataset of sequence and label examples seq_length = seq_length+1 #Take 1 extra for the labels windows = dataset.window(seq_length, shift=1, stride=1, drop_remainder=True) # `flat_map` flattens the" dataset of datasets" into a dataset of tensors flatten = lambda x: x.batch(seq_length, drop_remainder=True) sequences = windows.flat_map(flatten) #Normalize note pitch def scale_pitch(x): x = x/[vocab_size,1.0,1.0] return x # Split the labels def split_labels(sequences): inputs = sequences[:-1] labels_dense = sequences[-1] labels = {key:labels_dense[i] for i,key in enumerate(key_order)} return scale_pitch(inputs), labels return sequences.map(split_labels, num_parallel_calls=tf.data.AUTOTUNE) seq_length = 25 vocab_size = 128 seq_ds = create_sequences(notes_ds, seq_length, vocab_size) seq_ds.element_spec for seq, target in seq_ds.take(1): print("Sequence Shape:", seq.shape) print("Sequence Elements (First 10):", seq[0:10]) print() print("Target:", target) #Batch Examples and Configure Dataset for Performance batch_size = 64 buffer_size = n_notes - seq_length # the number of items in the dataset train_ds = (seq_ds.shuffle(buffer_size).batch(batch_size, drop_remainder=True).cache().prefetch(tf.data.experimental.AUTOTUNE)) train_ds.element_spec #Create And Train Model #Encourage Model to Output Positive Values def mse_with_positive_pressure(y_true: tf.Tensor, y_pred: tf.Tensor): mse = (y_true - y_pred) ** 2 positive_pressure = 10 * tf.maximum(-y_pred, 0.0) return tf.reduce_mean(mse + positive_pressure) ###Output _____no_output_____ ###Markdown ArchitectureNumerous architectures were tested, including different amounts of stacked LSTM units, GRU units, an RNN with LSTM layers separated by residual blocks, and even a far more simple feed-forward net with dense layers. Our peak performance was achieved with just 2 stacked LSTM layers as you see below. 
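For reference, the GRU variant mentioned above is a near drop-in swap for the two LSTM layers. This is a sketch with the same illustrative sizes; the three output heads attach to `x` exactly as in the cell below:

```python
import tensorflow as tf

seq_length = 25  # matches the training sequences above
inputs = tf.keras.Input((seq_length, 3))
x = tf.keras.layers.GRU(256, return_sequences=True)(inputs)
x = tf.keras.layers.GRU(256)(x)
```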
###Code #Build & Compile Model input_shape = (seq_length, 3) learning_rate = 0.005 inputs = tf.keras.Input(input_shape) x = tf.keras.layers.LSTM(256, return_sequences=True)(inputs) x = tf.keras.layers.LSTM(256)(x) outputs = { 'pitch': tf.keras.layers.Dense(128, name='pitch')(x), 'step': tf.keras.layers.Dense(1, name='step')(x), 'duration': tf.keras.layers.Dense(1, name='duration')(x), } model = tf.keras.Model(inputs, outputs) loss = { 'pitch': tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True), 'step': mse_with_positive_pressure, 'duration': mse_with_positive_pressure, } optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate) model.compile(loss=loss, optimizer=optimizer) model.summary() tf.keras.utils.plot_model(model,show_shapes=True,expand_nested=True) losses = model.evaluate(train_ds, return_dict=True) losses model.compile(loss=loss, loss_weights={'pitch': 0.05, 'step': 1.0, 'duration':1.0}, optimizer=optimizer) model.evaluate(train_ds, return_dict=True) #Train Model callbacks = [tf.keras.callbacks.ModelCheckpoint(filepath='./training_checkpoints/ckpt_{epoch}', save_weights_only=True), tf.keras.callbacks.EarlyStopping(monitor='loss', patience=5, verbose=1, restore_best_weights=True)] %%time epochs = 75 #history = model.fit(train_ds, epochs=epochs, callbacks=callbacks) #Load Pre-Trained Weights !wget -O weightsRNN.zip https://www.dropbox.com/s/zhuzqmjicbkdunv/weightsRNN.zip?dl=0 !unzip weightsRNN.zip model.load_weights("weightsRNN/weights") #model.save_weights("./weightsRNN/weights") #Plot #plt.plot(history.epoch, history.history['loss'], label='total loss') #plt.show() #Generate Notes Definition def predict_next_note(notes: np.ndarray, keras_model: tf.keras.Model, temperature: float = 1.0) -> int: #Generates Note ID's Using a Trained Sequence Model assert temperature > 0 # Add batch dimension inputs = tf.expand_dims(notes, 0) predictions = model.predict(inputs) pitch_logits = predictions['pitch'] step = predictions['step'] duration = predictions['duration'] pitch_logits /= temperature pitch = tf.random.categorical(pitch_logits, num_samples=1) pitch = tf.squeeze(pitch, axis=-1) duration = tf.squeeze(duration, axis=-1) step = tf.squeeze(step, axis=-1) # `step` and `duration` values should be non-negative step = tf.maximum(0, step) duration = tf.maximum(0, duration) return int(pitch), float(step), float(duration) ###Output _____no_output_____ ###Markdown Note GenerationNext we allow the net to generate its own music. The network will produce what it thinks the next pitch, step, and duration should be, and the helper function will convert these values into a playable MIDI file. 
###Code #Generate Notes temperature = 2.0 num_predictions = 200 sample_notes = np.stack([raw_notes[key] for key in key_order], axis=1) # The initial sequence of notes; pitch is normalized similar to training # sequences input_notes = (sample_notes[:seq_length] / np.array([vocab_size, 1, 1])) generated_notes = [] prev_start = 0 for _ in range(num_predictions): pitch, step, duration = predict_next_note(input_notes, model, temperature) start = prev_start + step end = start + duration input_note = (pitch, step, duration) generated_notes.append((*input_note, start, end)) input_notes = np.delete(input_notes, 0, axis=0) input_notes = np.append(input_notes, np.expand_dims(input_note, 0), axis=0) prev_start = start generated_notes = pd.DataFrame(generated_notes, columns=(*key_order, 'start', 'end')) generated_notes.head(10) out_file = 'output.mid' out_pm = notes_to_midi(generated_notes, out_file=out_file, instrument_name=instrument_name) display_audio(out_pm) #Download from google.colab import files #files.download(out_file) #files.download("weightsRNN.zip") #Plots plot_piano_roll(generated_notes) plot_distributions(generated_notes) ###Output _____no_output_____
Model 3b2.ipynb
###Markdown --- Get all texts and split into sexist and non-sexist. For each set, make a list with all sentences.--- ###Code # Get all texts and combine sex_txts = pickle.load(open("pickles2/sexist_new.p", "rb")) nsex_txts = pickle.load(open("pickles2/nonsexist_new.p", "rb")) texts_diary = pickle.load(open("pickles/texts_diary.p", "rb")) texts_mydiary = pickle.load(open("pickles/texts_mydiary.p", "rb")) texts_everydaysexism = pickle.load(open("pickles/texts_everydaysexism.p", "rb")) # New texts come in tuples, with source info, not strings. Fix this sex_txts = [item[0] for item in sex_txts] nsex_txts = [item[0] for item in nsex_txts] # Remove /n from texts (as much as possible) sex_txts = [item.replace(" /n ", "") for item in sex_txts] nsex_txts = [item.replace(" /n ", "") for item in nsex_txts] # Add the old texts sex_txts.extend(texts_everydaysexism) nsex_txts.extend(texts_diary) nsex_txts.extend(texts_mydiary) # Split into sentences sex_sentences = discrimination.texts.sentences_split(sex_txts) nsex_sentences = discrimination.texts.sentences_split(nsex_txts) # Save pickle.dump(sex_txts, open("pickles3/sex_txts.p", "wb")) pickle.dump(nsex_txts, open("pickles3/nsex_txts.p", "wb")) pickle.dump(sex_sentences, open("pickles3/sex_sentences.p", "wb")) pickle.dump(nsex_sentences, open("pickles3/nsex_sentences.p", "wb")) ###Output _____no_output_____ ###Markdown --- Tokenize and check how the length distribution looks across the two groups.--- ###Code # Tokenize texts and remove stop-words sex_tkns = discrimination.texts.tokenize(sex_txts) nsex_tkns = discrimination.texts.tokenize(nsex_txts) # Remove stop-words a second time, in case some stopwords where misspelled. sex_tkns = discrimination.texts.remove_stopwords(sex_tkns) nsex_tkns = discrimination.texts.remove_stopwords(nsex_tkns) # Check the average token length (words per token) in each group sex_wd_cnt = 0 for tkn in sex_tkns: sex_wd_cnt += len(tkn) nsex_wd_cnt = 0 for tkn in nsex_tkns: nsex_wd_cnt += len(tkn) print("Av. number of words-per-token in non-sexist texts is", round(nsex_wd_cnt/len(nsex_tkns),1)) print("Av. number of words-per-token in sexist texts is", round(sex_wd_cnt/len(sex_tkns),1)) # Make some graphs of the number of words per token for more information list_of_tokens = [sex_tkns, nsex_tkns] legend = ["Sexist", "Non-Sexist"] discrimination.texts.tokens_plot(list_of_tokens, (0,0.07), 135, [10,5], 90, legend) ###Output _____no_output_____ ###Markdown First of all, the reason it appears that sexist texts have more words per token, when in reality they don't is due to the fatter right-tail of non-sexist tokens, for lengths larger than 100.However, since extremely few tokens have lengths larger than 128 words, will pad tokens at 128. At the same time I will randomly discard non-sexist tokens with very few words and with more than 100, in an effort to match the distribution. --- Match the distribution of the two groups by discarding and splitting non-sexist texts, that are anyway more than the sexist ones.--- ###Code # Load sex_txts = pickle.load(open("pickles3/sex_txts.p", "rb")) nsex_txts = pickle.load(open("pickles3/nsex_txts.p", "rb")) sex_sentences = pickle.load(open("pickles3/sex_sentences.p", "rb")) nsex_sentences = pickle.load(open("pickles3/nsex_sentences.p", "rb")) # Split non-sexist sentences into two groups. Those with 50 or more sentneces, and the rest. 
a = [] b = [] for item in nsex_sentences: if len(item) > 100: a.append(item) else: b.append(item) # For the group with 40 or more sentences, split all items into 2 sentences. temp = [] for item in a: for i in range(0, len(item), 2): temp.append(item[i: i+2]) # Join the group to form the "new" non-sexist sentences nsex_sentences = b nsex_sentences.extend(temp) # Make the sentences into texts again texts = [] for item in nsex_sentences: text = " ".join(item) texts.append(text) # These are the "new" non-sexist texts nsex_txts = texts # Tokenize texts and remove stop-words sex_tkns = discrimination.texts.tokenize(sex_txts) nsex_tkns = discrimination.texts.tokenize(nsex_txts) # Spell-check tokens. (Notificaiton every 20.000 tokens) sex_tkns = discrimination.texts.spellcheck_tokens(sex_tkns) nsex_tkns = discrimination.texts.spellcheck_tokens(nsex_tkns) # Remove stop-words a second time, in case some stopwords where misspelled. sex_tkns = discrimination.texts.remove_stopwords(sex_tkns) nsex_tkns = discrimination.texts.remove_stopwords(nsex_tkns) # Discard all tokens with length larger than 128 from both groups sex_tkns = [tkn for tkn in sex_tkns if len(tkn) <=128] nsex_tkns = [tkn for tkn in nsex_tkns if len(tkn) <=128] # With a probability of 3% keep non-sexist tokens with a length of 1,2,3, and 4 a = [tkn for tkn in nsex_tkns if len(tkn) <= 3 and random.random() < 0.03] # With a probability of 10% keep non-sexist tokens with a length of 4 b = [tkn for tkn in nsex_tkns if len(tkn) == 4 and random.random() < 0.1] # With a probability of 20% keep non-sexist tokens with a length of 5 c = [tkn for tkn in nsex_tkns if len(tkn) == 5 and random.random() < 0.2] # With a probability of 40% keep non-sexist tokens with a length of 6 d = [tkn for tkn in nsex_tkns if len(tkn) == 6 and random.random() < 0.4] # With a probability of 60% keep non-sexist tokens with a length of 7 e = [tkn for tkn in nsex_tkns if len(tkn) == 7 and random.random() < 0.6] # With a probability of 70% keep non-sexist tokens with a length of 8 f = [tkn for tkn in nsex_tkns if len(tkn) == 8 and random.random() < 0.7] # Keep all non-sexist tokens with a length larger than 8 nsex_tkns = [tkn for tkn in nsex_tkns if len(tkn) > 8] # Combine all nsex_tkns.extend(a) nsex_tkns.extend(b) nsex_tkns.extend(c) nsex_tkns.extend(d) nsex_tkns.extend(e) nsex_tkns.extend(f) # With a probability of 30% separate non-sexist tokens with a length between 15 and 25 sep1 = [] sep2 = [] for tkn in nsex_tkns: if 15 <= len(tkn) <= 25 and random.random() < 0.3: sep1.append(tkn) else: sep2.append(tkn) a = sep2 # Join each pair of the separated tokens into one larger token. b = [] for i in range(0, len(sep1) - 1, 2): tkn = sep1[i] tkn.extend(sep1[i+1]) b.append(tkn) # Form the new non-sexist tokens nsex_tkns = a nsex_tkns.extend(b) # With a probability of 20% separate non-sexist tokens with a length between 26 and 42 sep1 = [] sep2 = [] for tkn in nsex_tkns: if 26 <= len(tkn) <= 42 and random.random() < 0.2: sep1.append(tkn) else: sep2.append(tkn) a = sep2 # Join each pair of the separated tokens into one larger token. 
b = [] for i in range(0, len(sep1) - 1, 2): tkn = sep1[i] tkn.extend(sep1[i+1]) b.append(tkn) # Form the new non-sexist tokens nsex_tkns = a nsex_tkns.extend(b) # With a probability of 22% separate non-sexist tokens with a length of 7 sep1 = [] sep2 = [] for tkn in nsex_tkns: if len(tkn) == 7 and random.random() < 0.22: sep2.append(tkn) else: sep1.append(tkn) nsex_tkns = sep1 # With a probability of 18% separate non-sexist tokens with a length of 8 sep1 = [] sep3 = [] for tkn in nsex_tkns: if len(tkn) == 8 and random.random() < 0.18: sep3.append(tkn) else: sep1.append(tkn) nsex_tkns = sep1 # With a probability of 30% separate non-sexist tokens with a length of 9 sep1 = [] sep4 = [] for tkn in nsex_tkns: if len(tkn) == 9 and random.random() < 0.3: sep4.append(tkn) else: sep1.append(tkn) nsex_tkns = sep1 # With a probability of 14% separate non-sexist tokens with a length of 10 sep1 = [] sep5 = [] for tkn in nsex_tkns: if len(tkn) == 10 and random.random() < 0.14: sep5.append(tkn) else: sep1.append(tkn) nsex_tkns = sep1 # Join all the separated tokens in a new variable and shuffle it separated = sep2.copy() separated.extend(sep3) separated.extend(sep4) separated.extend(sep5) # Average token length is 8.5 so after randomizing join every 6 tokens together and add to the nsex_tkns random.shuffle(separated) new = [] for i in range(0, len(separated) - 5, 6): tkn = separated[i] tkn.extend(separated[i+1]) tkn.extend(separated[i+2]) tkn.extend(separated[i+3]) tkn.extend(separated[i+4]) tkn.extend(separated[i+5]) new.append(tkn) nsex_tkns.extend(new) # Make some graphs of the number of words per token for more information list_of_tokens = [sex_tkns, nsex_tkns] legend = ["Sexist", "Non-Sexist"] discrimination.texts.tokens_plot(list_of_tokens, (0,0.06), 128, [10,5], 90, legend) # Save pickle.dump(sex_tkns, open("pickles3/sex_tkns.p", "wb")) pickle.dump(nsex_tkns, open("pickles3/nsex_tkns.p", "wb")) ###Output _____no_output_____ ###Markdown ___ NN preparation___ ###Code # Load tokens sex_tkns = pickle.load(open("pickles3/sex_tkns.p", "rb")) nsex_tkns = pickle.load(open("pickles3/nsex_tkns.p", "rb")) # Match the length of sex_tkns by randomly dropping nsex_tkns nsex_tkns = random.sample(nsex_tkns, len(sex_tkns)) # Convert tokens back to text for Keras keras_txts = [] for tkn in itertools.chain(sex_tkns, nsex_tkns): txt = " ".join(tkn) keras_txts.append(txt) # Create labels keras_labels = np.zeros(len(keras_txts)) keras_labels[:len(sex_tkns)] = 1 # Tokenizing - Sequencing tokenizer = Tokenizer(lower = False) tokenizer.fit_on_texts(keras_txts) sequences = tokenizer.texts_to_sequences(keras_txts) word_index = tokenizer.word_index # Create and shuffle data and labels. 
MAXLEN = 128 data = pad_sequences(sequences, maxlen=128) labels = np.asarray(keras_labels) indices = np.arange(data.shape[0]) np.random.shuffle(indices) data = data[indices] labels = labels[indices] # Split 80-20 nb_validation_samples = int(0.2 * data.shape[0]) x_train = data[:-nb_validation_samples] y_train = labels[:-nb_validation_samples] x_val = data[-nb_validation_samples:] y_val = labels[-nb_validation_samples:] # Parse the GloVe word embeddings glove_dir = "glove/" embeddings_index = {} f = open(os.path.join(glove_dir, "glove.42B.300d.txt")) for line in f: values = line.split() word = values[0] coefs = np.asarray(values[1:], dtype="float32") embeddings_index[word] = coefs f.close() # Create the embedding matrix embedding_matrix = np.zeros((len(word_index) + 1, 300)) for word, i in word_index.items(): embedding_vector = embeddings_index.get(word) if embedding_vector is not None: # words not found in embedding index will be all-zeros. embedding_matrix[i] = embedding_vector # Delete the embeddings index as it's no longer needed. del embeddings_index # Create the embedding layer. INPUT LENGTH 128 embedding_layer = Embedding(len(word_index) + 1, 300, input_length=128, weights=[embedding_matrix], trainable=False) ###Output _____no_output_____ ###Markdown --- NN setup and compilation--- ###Code # Setup model = Sequential() model.add(embedding_layer) model.add(Flatten()) model.add(Dropout(0.5)) model.add(Dense(256, activation="relu", kernel_regularizer = regularizers.l2(0.001))) model.add(Dense(128, activation="relu", kernel_regularizer = regularizers.l2(0.001))) model.add(Dense(32, activation="relu", kernel_regularizer = regularizers.l2(0.001))) model.add(Dense(1, activation="sigmoid")) model.summary() # Compilation model.compile(optimizer = "Adam", loss = "binary_crossentropy", metrics = ["acc"]) history = model.fit(x_train, y_train, epochs = 3, batch_size = 256, validation_data = (x_val, y_val)) # Save model weights model.save_weights("pickles3/model3b2.h5") # Load weights model.load_weights("pickles3/model3b2.h5") # Predictions predictions = model.predict(data) # Save pickle.dump(predictions, open("pickles3/predictions2.p", "wb")) # Create a predicted labels list labels_predicted = [] for prediction in predictions: labels_predicted.append( round(prediction[0]) ) # Calculate the confusion matrix from sklearn.metrics import confusion_matrix, precision_score, recall_score, accuracy_score CF = confusion_matrix(labels, labels_predicted) #"Disentangle" the matrix TN = round((CF[0,0] / sum(CF[0,:])) * 100, 1) FN = round((CF[0,1] / sum(CF[0,:])) * 100, 1) TP = round((CF[1,1] / sum(CF[1,:])) * 100, 1) FP = round((CF[1,0] / sum(CF[1,:])) * 100, 1) GTN = round((CF[0,0] / (sum(CF[0,:]) + sum(CF[1,:]))) * 100, 1) GFN = round((CF[0,1] / (sum(CF[0,:]) + sum(CF[1,:]))) * 100, 1) GTP = round((CF[1,1] / (sum(CF[0,:]) + sum(CF[1,:]))) * 100, 1) GFP = round((CF[1,0] / (sum(CF[0,:]) + sum(CF[1,:]))) * 100, 1) # Print the results print("True positives account for "+str(TP)+"% or "+str(GTP)+"% of the total (sexist texts labelled as sexist).") print("False positives account for "+str(FP)+"% or "+str(GFP)+"% of the total (sexist texts labelled as non-sexist).") print("True negatives account for "+str(TN)+"% or "+str(GTN)+"% of the total (non-sexist texts labelled as non-sexist).") print("False negatives account for "+str(FN)+"% or "+str(GFN)+"% of the total (non-sexist texts labelled as sexist).") ###Output True positives account for 94.9% or 47.5% of the total (sexist texts labelled as sexist). 
False positives account for 5.1% or 2.5% of the total (sexist texts labelled as non-sexist). True negatives account for 95.8% or 47.9% of the total (non-sexist texts labelled as non-sexist). False negatives account for 4.2% or 2.1% of the total (non-sexist texts labelled as sexist). ###Markdown Test the model! ###Code # Test the network test = ['''Women'''] # Convert the test phrase to lowercase, tokenize, spellcheck, remove stopwords. test = discrimination.texts.lowercase(test) test = discrimination.texts.tokenize(test) test = discrimination.texts.spellcheck_tokens(test) test = discrimination.texts.remove_stopwords(test) # Convert the token back to text, sequence it, pad it, feed it into the model. text = "" for item in test: for word in item: text += word + " " test_sequence = tokenizer.texts_to_sequences([text]) x_test = pad_sequences(test_sequence, maxlen=128) model.load_weights("pickles3/model3b2.h5") # Make the output look pretty... because it deserves it. str(round(model.predict(x_test)[0,0]*100,0))[:-2] + "% sexist" ###Output _____no_output_____
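###Markdown A terminology note on the evaluation above: with label 1 = sexist, `CF[1,0]` (sexist texts labelled non-sexist) is conventionally a false negative and `CF[0,1]` a false positive, so the printed descriptions swap the two names even though the percentages themselves are computed consistently. As a cross-check, the sketch below derives the standard metrics with the sklearn helpers that were imported above but never called; it assumes `labels` and `labels_predicted` from the evaluation cell are still in scope. ###Code
# Cross-check the hand-computed confusion-matrix percentages with sklearn's own metrics.
# `labels` and `labels_predicted` come from the evaluation cell above.
precision = precision_score(labels, labels_predicted)
recall = recall_score(labels, labels_predicted)
accuracy = accuracy_score(labels, labels_predicted)
print("Precision: " + str(round(precision * 100, 1)) + "%")
print("Recall:    " + str(round(recall * 100, 1)) + "%")
print("Accuracy:  " + str(round(accuracy * 100, 1)) + "%")
###Output _____no_output_____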
experiments/tl_3v2.wisig+cores/cores_wisig-oracle.run1.framed/trials/8/trial.ipynb
###Markdown Transfer Learning Template ###Code %load_ext autoreload %autoreload 2 %matplotlib inline import os, json, sys, time, random import numpy as np import torch from torch.optim import Adam from easydict import EasyDict import matplotlib.pyplot as plt from steves_models.steves_ptn import Steves_Prototypical_Network from steves_utils.lazy_iterable_wrapper import Lazy_Iterable_Wrapper from steves_utils.iterable_aggregator import Iterable_Aggregator from steves_utils.ptn_train_eval_test_jig import PTN_Train_Eval_Test_Jig from steves_utils.torch_sequential_builder import build_sequential from steves_utils.torch_utils import get_dataset_metrics, ptn_confusion_by_domain_over_dataloader from steves_utils.utils_v2 import (per_domain_accuracy_from_confusion, get_datasets_base_path) from steves_utils.PTN.utils import independent_accuracy_assesment from torch.utils.data import DataLoader from steves_utils.stratified_dataset.episodic_accessor import Episodic_Accessor_Factory from steves_utils.ptn_do_report import ( get_loss_curve, get_results_table, get_parameters_table, get_domain_accuracies, ) from steves_utils.transforms import get_chained_transform ###Output _____no_output_____ ###Markdown Allowed ParametersThese are allowed parameters, not defaultsEach of these values need to be present in the injected parameters (the notebook will raise an exception if they are not present)Papermill uses the cell tag "parameters" to inject the real parameters below this cell.Enable tags to see what I mean ###Code required_parameters = { "experiment_name", "lr", "device", "seed", "dataset_seed", "n_shot", "n_query", "n_way", "train_k_factor", "val_k_factor", "test_k_factor", "n_epoch", "patience", "criteria_for_best", "x_net", "datasets", "torch_default_dtype", "NUM_LOGS_PER_EPOCH", "BEST_MODEL_PATH", "x_shape", } from steves_utils.CORES.utils import ( ALL_NODES, ALL_NODES_MINIMUM_1000_EXAMPLES, ALL_DAYS ) from steves_utils.ORACLE.utils_v2 import ( ALL_DISTANCES_FEET_NARROWED, ALL_RUNS, ALL_SERIAL_NUMBERS, ) standalone_parameters = {} standalone_parameters["experiment_name"] = "STANDALONE PTN" standalone_parameters["lr"] = 0.001 standalone_parameters["device"] = "cuda" standalone_parameters["seed"] = 1337 standalone_parameters["dataset_seed"] = 1337 standalone_parameters["n_way"] = 8 standalone_parameters["n_shot"] = 3 standalone_parameters["n_query"] = 2 standalone_parameters["train_k_factor"] = 1 standalone_parameters["val_k_factor"] = 2 standalone_parameters["test_k_factor"] = 2 standalone_parameters["n_epoch"] = 50 standalone_parameters["patience"] = 10 standalone_parameters["criteria_for_best"] = "source_loss" standalone_parameters["datasets"] = [ { "labels": ALL_SERIAL_NUMBERS, "domains": ALL_DISTANCES_FEET_NARROWED, "num_examples_per_domain_per_label": 100, "pickle_path": os.path.join(get_datasets_base_path(), "oracle.Run1_framed_2000Examples_stratified_ds.2022A.pkl"), "source_or_target_dataset": "source", "x_transforms": ["unit_mag", "minus_two"], "episode_transforms": [], "domain_prefix": "ORACLE_" }, { "labels": ALL_NODES, "domains": ALL_DAYS, "num_examples_per_domain_per_label": 100, "pickle_path": os.path.join(get_datasets_base_path(), "cores.stratified_ds.2022A.pkl"), "source_or_target_dataset": "target", "x_transforms": ["unit_power", "times_zero"], "episode_transforms": [], "domain_prefix": "CORES_" } ] standalone_parameters["torch_default_dtype"] = "torch.float32" standalone_parameters["x_net"] = [ {"class": "nnReshape", "kargs": {"shape":[-1, 1, 2, 256]}}, {"class": "Conv2d", "kargs": { 
"in_channels":1, "out_channels":256, "kernel_size":(1,7), "bias":False, "padding":(0,3), },}, {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm2d", "kargs": {"num_features":256}}, {"class": "Conv2d", "kargs": { "in_channels":256, "out_channels":80, "kernel_size":(2,7), "bias":True, "padding":(0,3), },}, {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm2d", "kargs": {"num_features":80}}, {"class": "Flatten", "kargs": {}}, {"class": "Linear", "kargs": {"in_features": 80*256, "out_features": 256}}, # 80 units per IQ pair {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm1d", "kargs": {"num_features":256}}, {"class": "Linear", "kargs": {"in_features": 256, "out_features": 256}}, ] # Parameters relevant to results # These parameters will basically never need to change standalone_parameters["NUM_LOGS_PER_EPOCH"] = 10 standalone_parameters["BEST_MODEL_PATH"] = "./best_model.pth" # Parameters parameters = { "experiment_name": "tl_3Av2:cores+wisig -> oracle.run1.framed", "device": "cuda", "lr": 0.0001, "x_shape": [2, 200], "n_shot": 3, "n_query": 2, "train_k_factor": 3, "val_k_factor": 2, "test_k_factor": 2, "torch_default_dtype": "torch.float32", "n_epoch": 50, "patience": 3, "criteria_for_best": "target_accuracy", "x_net": [ {"class": "nnReshape", "kargs": {"shape": [-1, 1, 2, 200]}}, { "class": "Conv2d", "kargs": { "in_channels": 1, "out_channels": 256, "kernel_size": [1, 7], "bias": False, "padding": [0, 3], }, }, {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm2d", "kargs": {"num_features": 256}}, { "class": "Conv2d", "kargs": { "in_channels": 256, "out_channels": 80, "kernel_size": [2, 7], "bias": True, "padding": [0, 3], }, }, {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm2d", "kargs": {"num_features": 80}}, {"class": "Flatten", "kargs": {}}, {"class": "Linear", "kargs": {"in_features": 16000, "out_features": 256}}, {"class": "ReLU", "kargs": {"inplace": True}}, {"class": "BatchNorm1d", "kargs": {"num_features": 256}}, {"class": "Linear", "kargs": {"in_features": 256, "out_features": 256}}, ], "NUM_LOGS_PER_EPOCH": 10, "BEST_MODEL_PATH": "./best_model.pth", "n_way": 16, "datasets": [ { "labels": [ "1-10.", "1-11.", "1-15.", "1-16.", "1-17.", "1-18.", "1-19.", "10-4.", "10-7.", "11-1.", "11-14.", "11-17.", "11-20.", "11-7.", "13-20.", "13-8.", "14-10.", "14-11.", "14-14.", "14-7.", "15-1.", "15-20.", "16-1.", "16-16.", "17-10.", "17-11.", "17-2.", "19-1.", "19-16.", "19-19.", "19-20.", "19-3.", "2-10.", "2-11.", "2-17.", "2-18.", "2-20.", "2-3.", "2-4.", "2-5.", "2-6.", "2-7.", "2-8.", "3-13.", "3-18.", "3-3.", "4-1.", "4-10.", "4-11.", "4-19.", "5-5.", "6-15.", "7-10.", "7-14.", "8-18.", "8-20.", "8-3.", "8-8.", ], "domains": [1, 2, 3, 4, 5], "num_examples_per_domain_per_label": -1, "pickle_path": "/mnt/wd500GB/CSC500/csc500-main/datasets/cores.stratified_ds.2022A.pkl", "source_or_target_dataset": "source", "x_transforms": ["take_200"], "episode_transforms": [], "domain_prefix": "C_", }, { "labels": [ "1-10", "1-12", "1-14", "1-16", "1-18", "1-19", "1-8", "10-11", "10-17", "10-4", "10-7", "11-1", "11-10", "11-19", "11-20", "11-4", "11-7", "12-19", "12-20", "12-7", "13-14", "13-18", "13-19", "13-20", "13-3", "13-7", "14-10", "14-11", "14-12", "14-13", "14-14", "14-19", "14-20", "14-7", "14-8", "14-9", "15-1", "15-19", "15-6", "16-1", "16-16", "16-19", "16-20", "17-10", "17-11", "18-1", "18-10", "18-11", "18-12", "18-13", "18-14", "18-15", "18-16", "18-17", "18-19", "18-2", "18-20", "18-4", "18-5", 
"18-7", "18-8", "18-9", "19-1", "19-10", "19-11", "19-12", "19-13", "19-14", "19-15", "19-19", "19-2", "19-20", "19-3", "19-4", "19-6", "19-7", "19-8", "19-9", "2-1", "2-13", "2-15", "2-3", "2-4", "2-5", "2-6", "2-7", "2-8", "20-1", "20-12", "20-14", "20-15", "20-16", "20-18", "20-19", "20-20", "20-3", "20-4", "20-5", "20-7", "20-8", "3-1", "3-13", "3-18", "3-2", "3-8", "4-1", "4-10", "4-11", "5-1", "5-5", "6-1", "6-15", "6-6", "7-10", "7-11", "7-12", "7-13", "7-14", "7-7", "7-8", "7-9", "8-1", "8-13", "8-14", "8-18", "8-20", "8-3", "8-8", "9-1", "9-7", ], "domains": [1, 2, 3, 4], "num_examples_per_domain_per_label": -1, "pickle_path": "/mnt/wd500GB/CSC500/csc500-main/datasets/wisig.node3-19.stratified_ds.2022A.pkl", "source_or_target_dataset": "source", "x_transforms": ["take_200"], "episode_transforms": [], "domain_prefix": "W_", }, { "labels": [ "3123D52", "3123D65", "3123D79", "3123D80", "3123D54", "3123D70", "3123D7B", "3123D89", "3123D58", "3123D76", "3123D7D", "3123EFE", "3123D64", "3123D78", "3123D7E", "3124E4A", ], "domains": [32, 38, 8, 44, 14, 50, 20, 26], "num_examples_per_domain_per_label": 2000, "pickle_path": "/mnt/wd500GB/CSC500/csc500-main/datasets/oracle.Run1_framed_2000Examples_stratified_ds.2022A.pkl", "source_or_target_dataset": "target", "x_transforms": ["take_200", "resample_20Msps_to_25Msps"], "episode_transforms": [], "domain_prefix": "O_", }, ], "seed": 154325, "dataset_seed": 154325, } # Set this to True if you want to run this template directly STANDALONE = False if STANDALONE: print("parameters not injected, running with standalone_parameters") parameters = standalone_parameters if not 'parameters' in locals() and not 'parameters' in globals(): raise Exception("Parameter injection failed") #Use an easy dict for all the parameters p = EasyDict(parameters) if "x_shape" not in p: p.x_shape = [2,256] # Default to this if we dont supply x_shape supplied_keys = set(p.keys()) if supplied_keys != required_parameters: print("Parameters are incorrect") if len(supplied_keys - required_parameters)>0: print("Shouldn't have:", str(supplied_keys - required_parameters)) if len(required_parameters - supplied_keys)>0: print("Need to have:", str(required_parameters - supplied_keys)) raise RuntimeError("Parameters are incorrect") ################################### # Set the RNGs and make it all deterministic ################################### np.random.seed(p.seed) random.seed(p.seed) torch.manual_seed(p.seed) torch.use_deterministic_algorithms(True) ########################################### # The stratified datasets honor this ########################################### torch.set_default_dtype(eval(p.torch_default_dtype)) ################################### # Build the network(s) # Note: It's critical to do this AFTER setting the RNG ################################### x_net = build_sequential(p.x_net) start_time_secs = time.time() p.domains_source = [] p.domains_target = [] train_original_source = [] val_original_source = [] test_original_source = [] train_original_target = [] val_original_target = [] test_original_target = [] # global_x_transform_func = lambda x: normalize(x.to(torch.get_default_dtype()), "unit_power") # unit_power, unit_mag # global_x_transform_func = lambda x: normalize(x, "unit_power") # unit_power, unit_mag def add_dataset( labels, domains, pickle_path, x_transforms, episode_transforms, domain_prefix, num_examples_per_domain_per_label, source_or_target_dataset:str, iterator_seed=p.seed, dataset_seed=p.dataset_seed, n_shot=p.n_shot, n_way=p.n_way, 
n_query=p.n_query, train_val_test_k_factors=(p.train_k_factor,p.val_k_factor,p.test_k_factor), ): if x_transforms == []: x_transform = None else: x_transform = get_chained_transform(x_transforms) if episode_transforms == []: episode_transform = None else: raise Exception("episode_transforms not implemented") episode_transform = lambda tup, _prefix=domain_prefix: (_prefix + str(tup[0]), tup[1]) eaf = Episodic_Accessor_Factory( labels=labels, domains=domains, num_examples_per_domain_per_label=num_examples_per_domain_per_label, iterator_seed=iterator_seed, dataset_seed=dataset_seed, n_shot=n_shot, n_way=n_way, n_query=n_query, train_val_test_k_factors=train_val_test_k_factors, pickle_path=pickle_path, x_transform_func=x_transform, ) train, val, test = eaf.get_train(), eaf.get_val(), eaf.get_test() train = Lazy_Iterable_Wrapper(train, episode_transform) val = Lazy_Iterable_Wrapper(val, episode_transform) test = Lazy_Iterable_Wrapper(test, episode_transform) if source_or_target_dataset=="source": train_original_source.append(train) val_original_source.append(val) test_original_source.append(test) p.domains_source.extend( [domain_prefix + str(u) for u in domains] ) elif source_or_target_dataset=="target": train_original_target.append(train) val_original_target.append(val) test_original_target.append(test) p.domains_target.extend( [domain_prefix + str(u) for u in domains] ) else: raise Exception(f"invalid source_or_target_dataset: {source_or_target_dataset}") for ds in p.datasets: add_dataset(**ds) # from steves_utils.CORES.utils import ( # ALL_NODES, # ALL_NODES_MINIMUM_1000_EXAMPLES, # ALL_DAYS # ) # add_dataset( # labels=ALL_NODES, # domains = ALL_DAYS, # num_examples_per_domain_per_label=100, # pickle_path=os.path.join(get_datasets_base_path(), "cores.stratified_ds.2022A.pkl"), # source_or_target_dataset="target", # x_transform_func=global_x_transform_func, # domain_modifier=lambda u: f"cores_{u}" # ) # from steves_utils.ORACLE.utils_v2 import ( # ALL_DISTANCES_FEET, # ALL_RUNS, # ALL_SERIAL_NUMBERS, # ) # add_dataset( # labels=ALL_SERIAL_NUMBERS, # domains = list(set(ALL_DISTANCES_FEET) - {2,62}), # num_examples_per_domain_per_label=100, # pickle_path=os.path.join(get_datasets_base_path(), "oracle.Run2_framed_2000Examples_stratified_ds.2022A.pkl"), # source_or_target_dataset="source", # x_transform_func=global_x_transform_func, # domain_modifier=lambda u: f"oracle1_{u}" # ) # from steves_utils.ORACLE.utils_v2 import ( # ALL_DISTANCES_FEET, # ALL_RUNS, # ALL_SERIAL_NUMBERS, # ) # add_dataset( # labels=ALL_SERIAL_NUMBERS, # domains = list(set(ALL_DISTANCES_FEET) - {2,62,56}), # num_examples_per_domain_per_label=100, # pickle_path=os.path.join(get_datasets_base_path(), "oracle.Run2_framed_2000Examples_stratified_ds.2022A.pkl"), # source_or_target_dataset="source", # x_transform_func=global_x_transform_func, # domain_modifier=lambda u: f"oracle2_{u}" # ) # add_dataset( # labels=list(range(19)), # domains = [0,1,2], # num_examples_per_domain_per_label=100, # pickle_path=os.path.join(get_datasets_base_path(), "metehan.stratified_ds.2022A.pkl"), # source_or_target_dataset="target", # x_transform_func=global_x_transform_func, # domain_modifier=lambda u: f"met_{u}" # ) # # from steves_utils.wisig.utils import ( # # ALL_NODES_MINIMUM_100_EXAMPLES, # # ALL_NODES_MINIMUM_500_EXAMPLES, # # ALL_NODES_MINIMUM_1000_EXAMPLES, # # ALL_DAYS # # ) # import steves_utils.wisig.utils as wisig # add_dataset( # labels=wisig.ALL_NODES_MINIMUM_100_EXAMPLES, # domains = wisig.ALL_DAYS, # 
num_examples_per_domain_per_label=100, # pickle_path=os.path.join(get_datasets_base_path(), "wisig.node3-19.stratified_ds.2022A.pkl"), # source_or_target_dataset="target", # x_transform_func=global_x_transform_func, # domain_modifier=lambda u: f"wisig_{u}" # ) ################################### # Build the dataset ################################### train_original_source = Iterable_Aggregator(train_original_source, p.seed) val_original_source = Iterable_Aggregator(val_original_source, p.seed) test_original_source = Iterable_Aggregator(test_original_source, p.seed) train_original_target = Iterable_Aggregator(train_original_target, p.seed) val_original_target = Iterable_Aggregator(val_original_target, p.seed) test_original_target = Iterable_Aggregator(test_original_target, p.seed) # For CNN We only use X and Y. And we only train on the source. # Properly form the data using a transform lambda and Lazy_Iterable_Wrapper. Finally wrap them in a dataloader transform_lambda = lambda ex: ex[1] # Original is (<domain>, <episode>) so we strip down to episode only train_processed_source = Lazy_Iterable_Wrapper(train_original_source, transform_lambda) val_processed_source = Lazy_Iterable_Wrapper(val_original_source, transform_lambda) test_processed_source = Lazy_Iterable_Wrapper(test_original_source, transform_lambda) train_processed_target = Lazy_Iterable_Wrapper(train_original_target, transform_lambda) val_processed_target = Lazy_Iterable_Wrapper(val_original_target, transform_lambda) test_processed_target = Lazy_Iterable_Wrapper(test_original_target, transform_lambda) datasets = EasyDict({ "source": { "original": {"train":train_original_source, "val":val_original_source, "test":test_original_source}, "processed": {"train":train_processed_source, "val":val_processed_source, "test":test_processed_source} }, "target": { "original": {"train":train_original_target, "val":val_original_target, "test":test_original_target}, "processed": {"train":train_processed_target, "val":val_processed_target, "test":test_processed_target} }, }) from steves_utils.transforms import get_average_magnitude, get_average_power print(set([u for u,_ in val_original_source])) print(set([u for u,_ in val_original_target])) s_x, s_y, q_x, q_y, _ = next(iter(train_processed_source)) print(s_x) # for ds in [ # train_processed_source, # val_processed_source, # test_processed_source, # train_processed_target, # val_processed_target, # test_processed_target # ]: # for s_x, s_y, q_x, q_y, _ in ds: # for X in (s_x, q_x): # for x in X: # assert np.isclose(get_average_magnitude(x.numpy()), 1.0) # assert np.isclose(get_average_power(x.numpy()), 1.0) ################################### # Build the model ################################### # easfsl only wants a tuple for the shape model = Steves_Prototypical_Network(x_net, device=p.device, x_shape=tuple(p.x_shape)) optimizer = Adam(params=model.parameters(), lr=p.lr) ################################### # train ################################### jig = PTN_Train_Eval_Test_Jig(model, p.BEST_MODEL_PATH, p.device) jig.train( train_iterable=datasets.source.processed.train, source_val_iterable=datasets.source.processed.val, target_val_iterable=datasets.target.processed.val, num_epochs=p.n_epoch, num_logs_per_epoch=p.NUM_LOGS_PER_EPOCH, patience=p.patience, optimizer=optimizer, criteria_for_best=p.criteria_for_best, ) total_experiment_time_secs = time.time() - start_time_secs ################################### # Evaluate the model ################################### source_test_label_accuracy, 
source_test_label_loss = jig.test(datasets.source.processed.test) target_test_label_accuracy, target_test_label_loss = jig.test(datasets.target.processed.test) source_val_label_accuracy, source_val_label_loss = jig.test(datasets.source.processed.val) target_val_label_accuracy, target_val_label_loss = jig.test(datasets.target.processed.val) history = jig.get_history() total_epochs_trained = len(history["epoch_indices"]) val_dl = Iterable_Aggregator((datasets.source.original.val,datasets.target.original.val)) confusion = ptn_confusion_by_domain_over_dataloader(model, p.device, val_dl) per_domain_accuracy = per_domain_accuracy_from_confusion(confusion) # Add a key to per_domain_accuracy for if it was a source domain for domain, accuracy in per_domain_accuracy.items(): per_domain_accuracy[domain] = { "accuracy": accuracy, "source?": domain in p.domains_source } # Do an independent accuracy assesment JUST TO BE SURE! # _source_test_label_accuracy = independent_accuracy_assesment(model, datasets.source.processed.test, p.device) # _target_test_label_accuracy = independent_accuracy_assesment(model, datasets.target.processed.test, p.device) # _source_val_label_accuracy = independent_accuracy_assesment(model, datasets.source.processed.val, p.device) # _target_val_label_accuracy = independent_accuracy_assesment(model, datasets.target.processed.val, p.device) # assert(_source_test_label_accuracy == source_test_label_accuracy) # assert(_target_test_label_accuracy == target_test_label_accuracy) # assert(_source_val_label_accuracy == source_val_label_accuracy) # assert(_target_val_label_accuracy == target_val_label_accuracy) experiment = { "experiment_name": p.experiment_name, "parameters": dict(p), "results": { "source_test_label_accuracy": source_test_label_accuracy, "source_test_label_loss": source_test_label_loss, "target_test_label_accuracy": target_test_label_accuracy, "target_test_label_loss": target_test_label_loss, "source_val_label_accuracy": source_val_label_accuracy, "source_val_label_loss": source_val_label_loss, "target_val_label_accuracy": target_val_label_accuracy, "target_val_label_loss": target_val_label_loss, "total_epochs_trained": total_epochs_trained, "total_experiment_time_secs": total_experiment_time_secs, "confusion": confusion, "per_domain_accuracy": per_domain_accuracy, }, "history": history, "dataset_metrics": get_dataset_metrics(datasets, "ptn"), } ax = get_loss_curve(experiment) plt.show() get_results_table(experiment) get_domain_accuracies(experiment) print("Source Test Label Accuracy:", experiment["results"]["source_test_label_accuracy"], "Target Test Label Accuracy:", experiment["results"]["target_test_label_accuracy"]) print("Source Val Label Accuracy:", experiment["results"]["source_val_label_accuracy"], "Target Val Label Accuracy:", experiment["results"]["target_val_label_accuracy"]) json.dumps(experiment) ###Output _____no_output_____
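###Markdown The final `json.dumps(experiment)` call can raise a TypeError if any of the results (for example the confusion counts or history values) are NumPy scalars or arrays rather than plain Python types; whether that happens depends on what the jig returns, so treat the following as a defensive sketch rather than a required step. ###Code
import json
import numpy as np

class NumpyEncoder(json.JSONEncoder):
    """Fallback encoder that converts NumPy scalars/arrays to JSON-friendly types."""
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        if isinstance(obj, np.floating):
            return float(obj)
        if isinstance(obj, np.ndarray):
            return obj.tolist()
        return super().default(obj)

# Persist the experiment record next to the notebook.
with open("experiment.json", "w") as f:
    json.dump(experiment, f, cls=NumpyEncoder, indent=2)
###Output _____no_output_____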
doc/source/ipynb/electric_dipole.ipynb
###Markdown Electric DipoleThis notebook plots an electric dipole (two point charges of opposite sign) using the Math and Physics Lab project (`mathlab`). To complete: add the governing equations. ###Code
%pylab inline
import mathlab
###Output _____no_output_____
###Markdown The `mathlab.electric_dipole` helper is called with a list of charge positions `[[x1, y1], [x2, y2]]` and produces the dipole plot. To do: write out the equations and add comments. ###Code
charge_location = [[-2.0, 1.0], [2.0, 0]]
mathlab.electric_dipole(charge_location)
###Output _____no_output_____
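###Markdown Since the `mathlab` helper is a black box here, the sketch below reproduces the underlying computation with plain NumPy/Matplotlib: the potential of a set of point charges is $V(\mathbf{r}) = k\sum_i q_i/|\mathbf{r}-\mathbf{r}_i|$, with $k = 1$ for simplicity, and a dipole is just two opposite charges. The charge magnitudes (+1/-1) are an assumption for illustration. ###Code
import numpy as np
import matplotlib.pyplot as plt

# Two opposite unit charges at the positions used above (hypothetical magnitudes +1 and -1).
charges = [(+1.0, (-2.0, 1.0)), (-1.0, (2.0, 0.0))]

x, y = np.meshgrid(np.linspace(-5, 5, 300), np.linspace(-5, 5, 300))
V = np.zeros_like(x)
for q, (cx, cy) in charges:
    r = np.hypot(x - cx, y - cy)
    V += q / np.maximum(r, 1e-9)  # clamp r to avoid division by zero at the charge locations

plt.contourf(x, y, V, levels=np.linspace(-2, 2, 41), cmap="RdBu_r", extend="both")
plt.colorbar(label="potential V (k=1)")
plt.title("Electric dipole potential")
plt.show()
###Output _____no_output_____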
ML 1.1 - Linear Regression - example.ipynb
###Markdown Internet Resources:[Python Programming.net - machine learning episodes 1-11](https://pythonprogramming.net/machine-learning-tutorial-python-introduction/)Linear regression finds the best possible linear equation that fits the training data. ###Code
%matplotlib inline
import os
import numpy as np
import pandas as pd
import quandl, math
from sklearn import preprocessing, svm, model_selection
from sklearn.linear_model import LinearRegression
import matplotlib.pyplot as plt
from matplotlib import style
import pickle

# getting data
if not os.path.isfile(os.path.abspath("data/wiki_googl.csv")):
    df = quandl.get("WIKI/GOOGL")
else:
    df = pd.read_csv("data/wiki_googl.csv")

# use date as index
df["Date"] = pd.to_datetime(df["Date"])
df.set_index("Date", inplace=True)

# only keep adjusted data
df = df[['Adj. Open', 'Adj. High', 'Adj. Low', 'Adj. Close', 'Adj. Volume']]
df['HL_PCT'] = (df['Adj. High'] - df['Adj. Low']) / df['Adj. Close'] * 100.0  # HL_PCT: high-low percentage
df['PCT_change'] = (df['Adj. Close'] - df['Adj. Open']) / df['Adj. Open'] * 100.0
df = df[['Adj. Close', 'HL_PCT', 'PCT_change', 'Adj. Volume']]
df.fillna(value=-99999, inplace=True)

predict_length = int(math.ceil(0.1 * len(df)))  # we use 90% of the data to "train" our model and 10% to make predictions
predict_col = 'Adj. Close'  # we want to predict the adjusted close price
df['label'] = df[predict_col].shift(-predict_length)
df

# split and prepare data
# x = features
# y = labels
x = np.array(df.drop(['label'], 1))  # 1 denotes which axis
# the mean value along the axis is subtracted from x and the resulting value
# is divided by the standard deviation along the axis.
# see https://scikit-learn.org/stable/modules/preprocessing.html
x = preprocessing.scale(x)
x_predict = x[-predict_length:]  # the last 10% of the data; we will use these to make predictions

# the predict data frame will hold our predicted values
predict_df = df.iloc[-predict_length:]  # iloc: purely integer-location based indexing for selection by position
predict_df = predict_df.rename(columns={"label": "Prediction"})

# getting our "training" data
x = x[:-predict_length]
df.dropna(inplace=True)
y = np.array(df["label"])
x_train, x_test, y_train, y_test = model_selection.train_test_split(x, y, test_size=0.2)
predict_df

# do linear regression
if os.path.exists('linearregression.pickle'):
    clf = pickle.load(open('linearregression.pickle', 'rb'))
else:
    clf = LinearRegression(n_jobs=-1)  # use as many jobs as possible
    clf.fit(x_train, y_train)  # fit/"train" to data
    with open('linearregression.pickle', 'wb') as f:  # save classifier
        pickle.dump(clf, f)

accuracy = clf.score(x_test, y_test)
forecast = clf.predict(x_predict)
#print(forecast, accuracy, predict_length)
print(f"accuracy: {accuracy}")

# plotting
style.use('ggplot')
predict_df["Predict"] = forecast
df['Adj. Close'].plot()
predict_df['Predict'].plot()
plt.legend(loc=4)
plt.xlabel('Date')
plt.ylabel('Price')
plt.show()
###Output _____no_output_____
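###Markdown One caveat in the cell above: `preprocessing.scale(x)` is fitted on the whole feature matrix before the train/test split, so the test rows leak into the scaling statistics. A leak-free variant (a sketch reusing the variables defined above) wraps the scaler and the regressor in a pipeline, so the scaler is fitted on the training split only: ###Code
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Rebuild the unscaled feature matrix from the frame above
# (dropna has already removed the unlabeled tail, so lengths match y).
x_raw = np.array(df.drop(['label'], 1))
x_tr, x_te, y_tr, y_te = model_selection.train_test_split(x_raw, y, test_size=0.2)

pipe = make_pipeline(StandardScaler(), LinearRegression(n_jobs=-1))
pipe.fit(x_tr, y_tr)  # scaling statistics come from the training split only
print(f"pipeline accuracy: {pipe.score(x_te, y_te)}")
###Output _____no_output_____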
DN_Demo_Notebook.ipynb
###Markdown Text Analytics SimplifiedIn this demo we will showcase the capabilities of the Data Ninja services by constructing a Text Analytics pipeline from scratch. By combining open-source tools and packages with Data Ninja we will show you how the semantic content of unstructured data can be easily obtained and leveraged in your analytics pipeline.We will walk through the following steps: 1. Fetch trending URLsFirst we will scrape a news aggregation website (in this case Google News, but the idea can be extended to other news sites as well) and obtain a list of URLs that point to valid news articles. 2. Extract article text from URLsWe will show how the Data Ninja text extraction service can be used to identify and extract the main text from an HTML page, removing all the boilerplate content (such as running headers/footers, menus, and ads). 3. Extract semantic content from article textOnce the text has been extracted from a webpage, we will then use another Data Ninja service to tag the article with entities and sentiment. Our content tagging system is capable of identifying the broader context of an article, as we will show. 4. Clustering to find topicsSemantic content extracted from the text can be utilized as features, and Machine Learning techniques can be used to derive insights from unstructured data. One simple example is to use a common text clustering technique like LDA to identify the topics in a collection of articles. 5. VisualizationThe insights obtained from the previous stages can be communicated using a graphical visualization library. We will show a new Data Ninja app called Newsbot Ninja that brings many of these ideas together: https://newsbot.dataninja.net/Let's get started! DemoWe will scrape a news aggregator site (Google News) to collect a list of URLs that point to trending news articles. We need to remove links that are unlikely to be news articles, such as links to popular non-news sites (Wikipedia or YouTube). Helper methods for harvesting the links ###Code
from bs4 import BeautifulSoup
import requests

# Sites to exclude from our trending news URL collection
exclusions = ['google.com', 'youtube.com', 'wikipedia.org', 'blogspot.com']
prefix = 'http://'

def include_url(url):
    for excl in exclusions:
        if url.find(excl) > 0:
            return False
    return True

# Fetch the page content and extract the links
def fetch_links(url):
    response = requests.get(prefix + url)
    page_content = response.text
    soup = BeautifulSoup(page_content, "lxml")
    links = soup.find_all('a')
    return links
###Output _____no_output_____
###Markdown Collect and print the set of links ###Code
linkset = set()
links = fetch_links('news.google.com')

# Collect the article links, applying the URL filters
for link in links:
    href = link.get('href')
    if str(href).startswith(prefix) and include_url(str(href)):
        linkset.add(link.get('href').strip())
        print str(href)

print 'Links harvested: ', str(len(linkset))

# Take 100 links for the demo
links100 = list(linkset)[:100]
###Output _____no_output_____
###Markdown Accessing the Data Ninja servicesPlease sign up at https://market.mashape.com/dataninja/smart-content and obtain your free Data Ninja API key. We will access the Smart Content service to analyze the semantic content of each article obtained in the previous step.
The Smart Content service is based on our pre-built knowledge graph database.Alternatively, you can use the Amazon Web Services API Gateway to access our services (using your AWS account): https://auth.dataninja.net/cart ###Code
import json

with open('mashape_key.txt', 'r') as keyfile:
    mashape_key = keyfile.read().rstrip()

# Please add your own Data Ninja API Mashape key here -->
# mashape_key = <your-own-mashape-key>

smartcontent_url = 'https://smartcontent.dataninja.net/smartcontent/tag'
headers = {'Content-Type': 'application/json',
           'Accept': 'application/json',
           'X-Mashape-User': 'Newsbot',
           'X-Mashape-Key': mashape_key}

# If you are using AWS API Gateway, please add the X-API-Key: <your-AWS-key>
# in place of 'X-Mashape-Key': mashape_key and use the following link to access
# the service: https://api.dataninja.net/smartcontent/tag

def fetch_smartcontent(link):
    payload = {'url': link, 'max_size': 10}
    response = requests.post(smartcontent_url, headers=headers, data=json.dumps(payload))
    return response.json()
###Output _____no_output_____
###Markdown Fetch Smart ContentThe Smart Content service analyzes the text to produce concepts, categories, keywords and sentiments in JSON output format. Here is an example:http://www.nba.com/warriors/news/warriors-announce-tv-radio-schedule-western-conference-finals ###Code
data = fetch_smartcontent('http://www.nba.com/warriors/news/warriors-announce-tv-radio-schedule-western-conference-finals')

# Display the JSON output from Smart Content
print json.dumps(data, indent=4)
###Output _____no_output_____
###Markdown Article Text ExtractionBuilt into our Smart Content service is the ability to extract the main text from a web page using machine learning techniques. Here is an example: ###Code
# Display the extracted text from Smart Content
print data['text']
###Output _____no_output_____
###Markdown Fetch Smart Content for a set of linksNow we will extract the Smart Content for the list of URLs we obtained from Google News earlier. We will specifically prepare a list of extracted text for topic clustering in the next step. ###Code
import json

# Call the Smart Content service and collect the article text into a list
documents = []
for link in links100:
    data = fetch_smartcontent(link)
    if 'text' in data and len(data['text']) > 100:
        documents.append(data['text'])

print 'Documents in collection: ', str(len(documents))
###Output _____no_output_____
###Markdown Clustering text to find topicsThe rich semantic content returned by the Data Ninja services can be used for solving many Text Analytics problems and for deriving additional insights. Here we will show you how topic clustering can be done with the extracted text using the standard LDA algorithm from the Gensim library.
###Code from gensim import corpora, models import re # Prepare a set of stopwords and other tokens to be removed from stream stoplist = set([ 'a', 'able', 'about', 'above', 'according', 'accordingly', 'across', 'actually', 'after', 'afterwards', 'again', 'against', 'all', 'allow', 'allows', 'almost', 'alone', 'along', 'already', 'also', 'although', 'always', 'am', 'among', 'amongst', 'an', 'and', 'another', 'any', 'anybody', 'anyhow', 'anyone', 'anything', 'anyway', 'anyways', 'anywhere', 'apart', 'appear', 'appreciate', 'appropriate', 'are', 'around', 'as', 'aside', 'ask', 'asking', 'associated', 'at', 'available', 'away', 'awfully', 'b', 'be', 'became', 'because', 'become', 'becomes', 'becoming', 'been', 'before', 'beforehand', 'behind', 'being', 'believe', 'below', 'beside', 'besides', 'best', 'better', 'between', 'beyond', 'both', 'brief', 'but', 'by', 'c', 'came', 'can', 'cannot', 'cant', 'cause', 'causes', 'certain', 'certainly', 'changes', 'clearly ', 'co', 'com', 'come', 'comes', 'concerning', 'consequently', 'consider', 'considering', 'contain', 'containing', 'contains', 'corresponding', 'could', 'course','currently', 'd', 'definitely', 'described', 'despite', 'did', 'different', 'do', 'does', 'doing', 'done', 'down', 'downwards', 'during', 'e', 'each', 'edu', 'eg', 'eight', 'either', 'else', 'elsewhere', 'enough', 'entirely', 'especially', 'et', 'etc', 'even', 'ever', 'every', 'everybody', 'everyone', 'everything', 'everywhere', 'ex', 'exactly', 'example', 'except', 'f', 'far', 'few', 'fifth', 'first', 'five', 'followed', 'following', 'follows', 'for', 'former', 'formerly', 'forth', 'four', 'from', 'further', 'furthermore', 'g', 'get', 'gets', 'getting', 'given', 'gives', 'go ', 'goes', 'going', 'gone', 'got', 'gotten', 'greetings', 'h', 'had', 'happens', 'hardly', 'has', 'have', 'having', 'he', 'hello', 'help', 'hence', 'her', 'here', 'hereafter', 'hereby', 'herein', 'hereupon', 'hers', 'herself', 'hi', 'him', 'himself', 'his', 'hither', 'hopefully', 'how', 'howbeit', 'however', 'i', 'ie', 'if','ignored', 'immediate', 'in', 'inasmuch', 'inc', 'indeed', 'indicate', 'indicated', 'indicates', 'inner', 'insofar', 'instead', 'into', 'inward', 'is', 'it', 'its', 'itself', 'j', 'just', 'k', 'keep', 'keeps', 'kept', 'know', 'knows', 'known', 'l', 'last', 'lately', 'later', 'latter', 'latterly', 'least', 'less', 'lest', 'let', 'like', 'liked', 'likely', 'little', 'look', 'looking', 'looks', 'ltd', 'm', 'mainly', 'many', 'may', 'maybe', 'me', 'mean', 'meanwhile', 'merely', 'might', 'more', 'moreover', 'most', 'mostly', 'much', 'must', 'my', 'myself', 'n', 'name', 'namely', 'nd', 'near', 'nearly', 'necessary', 'need', 'needs', 'neither', 'never', 'nevertheless', 'next', 'nine', 'no', 'nobody', 'non', 'none', 'noone', 'nor', 'normally', 'not', 'nothing', 'novel', 'now', 'nowhere', 'o', 'obviously', 'of', 'off', 'often', 'oh', 'ok', 'okay', 'old','on', 'once', 'one', 'ones', 'only', 'onto', 'or', 'other', 'others', 'otherwise', 'ought', 'our', 'ours', 'ourselves', 'out', 'outside', 'over', 'overall', 'own', 'p', 'particular', 'particularly', 'per', 'perhaps', 'placed', 'please', 'plus', 'possible', 'presumably', 'probably', 'provides', 'q', 'que', 'quite', 'qv', 'r', 'rather', 'rd', 're', 'really', 'reasonably', 'regarding', 'regardless', 'regards', 'relatively', 'respectively', 'right', 's', 'said', 'same', 'saw', 'say', 'saying', 'says', 'second', 'secondly', 'see', 'seeing', 'seem', 'seemed', 'seeming', 'seems', 'seen', 'self', 'selves', 'sensible', 'sent', 'serious', 'seriously', 
'seven', 'several', 'shall', 'she', 'should', 'since', 'six', 'so', 'some', 'somebody', 'somehow', 'someone', 'something', 'sometime', 'sometimes', 'somewhat', 'somewhere', 'soon', 'sorry', 'specified', 'specify', 'specifying', 'still', 'sub', 'such', 'sup', 'sure', 't','take', 'taken', 'tell', 'tends', 'th', 'than', 'thank', 'thanks', 'thanx', 'that', 'thats', 'the', 'their', 'theirs', 'them', 'themselves', 'then', 'thence', 'there', 'thereafter', 'thereby', 'therefore', 'therein', 'theres', 'thereupon', 'these', 'they', 'think', 'third', 'this', 'thorough', 'thoroughly', 'those', 'though', 'three', 'through', 'throughout', 'thru', 'thus', 'to', 'together', 'too', 'took', 'toward', 'towards', 'tried', 'tries', 'truly', 'try', 'trying', 'twice', 'two', 'u', 'un', 'under', 'unfortunately', 'unless', 'unlikely', 'until', 'unto', 'up', 'upon', 'us', 'use', 'used', 'useful', 'uses', 'using', 'usually', 'uucp', 'v', 'value', 'various', 'very', 'via', 'viz', 'vs', 'w', 'want', 'wants', 'was', 'way', 'we', 'welcome', 'well', 'went', 'were', 'what', 'whatever', 'when', 'whence', 'whenever', 'where', 'whereafter', 'whereas', 'whereby','wherein', 'whereupon', 'wherever', 'whether', 'which', 'while', 'whither', 'who', 'whoever', 'whole', 'whom', 'whose', 'why', 'will', 'willing', 'wish', 'with', 'within', 'without', 'wonder', 'would', 'would', 'x', 'y', 'yes', 'yet', 'you', 'your', 'yours', 'yourself', 'yourselves', '', 'am', 'pm', 'mr', 'hd', 'vr', 'top', 'new','z', 'zero', '-', '--', '|' ]) stoplist |= set(map(str, list(range(0, 3000)))) ###Output _____no_output_____ ###Markdown Run LDA on our news article collectionWe have developed our own community detection and topic clustering pipeline that takes full advantage of the rich sematic content provided by the Data Ninja services.You can take a look at our Newsbot Ninja App here: https://newsbot.dataninja.net/ ###Code def get_topics(num_topics, documents): texts = [[word for word in re.split(r'\W+', document.lower()) if word.strip() not in stoplist] for document in documents] dictionary = corpora.Dictionary(texts) corpus = [dictionary.doc2bow(text) for text in texts] # Run the LDA model lda = models.ldamodel.LdaModel(corpus=corpus, id2word=dictionary, num_topics=num_topics, update_every=1, passes=10) # Collect the LDA output for display result = {} for i in xrange(num_topics): result[i] = lda.show_topic(i, 5) # Get only the top five labels per topic return result # Run LDA for our news article collection and display the results topics = get_topics(num_topics=20, documents=documents) print "{:<10} {:<15} {:<10}".format('Topic Id','Label','Probability') print '-' * 45 for k, v in topics.iteritems(): for item in v: print "{:<10} {:<15} {:<10}".format(k, item[0], item[1]) print '-' * 45 ###Output _____no_output_____
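###Markdown The `get_topics` helper above discards the fitted model, so you cannot tell which topic each article landed in. The sketch below (a minimal add-on, not part of the original demo) rebuilds a small LDA model with the same preprocessing and prints the dominant topic per article: ###Code
# Rebuild the corpus with the same tokenization/stoplist as get_topics
texts = [[word for word in re.split(r'\W+', document.lower()) if word.strip() not in stoplist]
         for document in documents]
dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]
lda = models.ldamodel.LdaModel(corpus=corpus, id2word=dictionary,
                               num_topics=20, update_every=1, passes=10)

# lda[bow] returns (topic_id, probability) pairs; take the most probable topic
for i, bow in enumerate(corpus[:10]):
    topic_id, prob = max(lda[bow], key=lambda pair: pair[1])
    print 'document %d -> topic %d (p=%.2f)' % (i, topic_id, prob)
###Output _____no_output_____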
DS_intro.ipynb
###Markdown 1. LINEAR Difference equations 1.1. Motivation - easiest case - Malthusian growth (linear, 1-variable, 1st degree) ###Code
import matplotlib.pyplot as PLT
import numpy as NP
import math as M

def full_plot(seq):
    x = seq
    n = len(seq)  # use the sequence's own length rather than the global n_ticks
    PLT.figure(1)
    PLT.subplots_adjust(wspace=0.7, hspace=0.7)

    PLT.subplot(221)
    PLT.title('A: (linear) value vs time')
    PLT.xlabel('time')
    PLT.ylabel('population/capital')
    PLT.plot(x, 'r.', x, 'r')

    PLT.subplot(222)
    PLT.title('B: (log) value vs time')
    PLT.xlabel('time')
    PLT.ylabel('population (log scale)')
    PLT.semilogy(x, 'r.')

    PLT.subplot(223)
    x_old = x[0:n-1]
    x_new = x[1:n]
    PLT.title('C: lag plot')
    PLT.xlabel('x old')
    PLT.ylabel('x new')
    PLT.plot(x_old, x_new, 'b.', x_old, x_old, 'c')

    PLT.subplot(224)
    PLT.xlabel('x old (log scale)')
    PLT.ylabel('x new (log scale)')
    PLT.title('D: lagplot (log)')
    PLT.loglog(x_old, x_new, 'b.', x_old, x_old, 'c')

    PLT.show()

n_ticks = 15
x = [i for i in range(n_ticks)]
x[0] = 0.2
alpha = 1.3  # 'reproductive rate' of flies = interest rate of capital
for t in range(n_ticks-1):
    x[t+1] = alpha * x[t]
full_plot(x)
###Output _____no_output_____
###Markdown 1.2. Classification of problems * differential/difference * linear/non-linear * static/dynamic [driven (?)] * n-th degree * vector/scalar 1.3. More examples: pendulum. - pendulum (linear, 1-variable, 2nd degree) 1.4. Linear; vector case MOTIVATION: Pendulum STABILITY: - fixed point - attracting/repelling/neutral(?)/saddle - cycle - ('ditto') 1.5. Closed form solution - general solution (linear, 1-variable, n-th degree); for the 1st-degree case $x_{t+1} = \alpha x_t$ the closed form is $x_t = \alpha^t x_0$ 1.6. PARAMETER ESTIMATION - fit an AR model to estimate the parameters 2. NON-linear difference equations 2.1. Motivation: logistic parabola - logistic parabola (population of flies - Sir Robert May) - crash intro to chaos ###Code
import matplotlib.pyplot as PLT
import numpy as NP

n_ticks = 30
x = [i for i in range(n_ticks)]
x[0] = 0.2
alpha = 3.5  # 'reproductive rate' of flies
for t in range(n_ticks-1):
    x[t+1] = alpha * x[t] * (1 - x[t])
full_plot(x)
###Output _____no_output_____
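###Markdown 2.2. A one-cell "crash intro to chaos": sweeping the reproductive rate alpha and plotting the long-run values of the logistic map gives the classic bifurcation diagram: a stable fixed point, then period doubling, then chaos. This cell is an added illustration; the parameter ranges are arbitrary choices. ###Code
import numpy as NP
import matplotlib.pyplot as PLT

alphas = NP.linspace(2.5, 4.0, 1500)
x = NP.full_like(alphas, 0.2)
for _ in range(300):   # discard the transient
    x = alphas * x * (1 - x)
for _ in range(100):   # plot the attractor
    x = alphas * x * (1 - x)
    PLT.plot(alphas, x, 'k.', markersize=0.3)
PLT.xlabel('alpha')
PLT.ylabel('long-run x')
PLT.title('Bifurcation diagram of the logistic map')
PLT.show()
###Output _____no_output_____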
Analysis2.ipynb
###Markdown | Cut | Davies | Silhouette ||:-------------:|:-------------:|:-----:||[0.2, 0.3]| 1.63206334851 | 0.0170302136701 ||[0.2, 0.4]| 1.6320633406 | 0.0188368402443 ||[0.2, 0.5]| 1.63206333812 | 0.0181812861725 ||[0.2, 0.6]| 1.63206333505 | 0.0182496293806 ||[0.2, 0.7]| 1.63206333206 | 0.0182788602025 ||[0.2, 0.75]| 1.63206333127 | 0.0176717463632 ||[0.2, 0.8]| 1.63206332839 | 0.0171676859755 ||[0.2, 0.85]| 1.63206332668 | 0.0171347854363 ||[0.2, 0.9]| 1.63206332338 | 0.015277385074 ||**[0.2, 0.95]**| **1.63206331953** | 0.0146721613182 ||**[0.3, 0.4]**| 1.63206334908 | **0.0203214004136** ||[0.3, 0.5]| 1.63206334694 | 0.0180394197208 ||[0.3, 0.6]| 1.63206334387 | 0.0180920648375 ||[0.3, 0.7]| 1.63206334088 | 0.0181680222463 ||[0.3, 0.75]| 1.63206334009 | 0.0172711503904 ||[0.3, 0.8]| 1.63206333722 | 0.0167689770711 ||[0.3, 0.85]| 1.6320633355 | 0.0168353296156 ||[0.3, 0.9]| 1.6320633322 | 0.0151297600769 ||[0.3, 0.95]| 1.63206332835 | 0.0148090559579 ||[0.4, 0.5]| 1.63206335423 | 0.0116222130607 ||[0.4, 0.6]| 1.63206335116 | 0.0144558601793 ||[0.4, 0.7]| 1.63206334817 | 0.0157336578169 ||[0.4, 0.75]| 1.63206334738 | 0.0147616732479 ||[0.4, 0.8]| 1.6320633445 | 0.0148811517773 ||[0.4, 0.85]| 1.63206334278 | 0.0152615911554 ||[0.4, 0.9]| 1.63206333949 | 0.0142575280731 ||[0.4, 0.95]| 1.63206333564 | 0.0142983610145 ||[0.5, 0.6]| 1.63206335308 | 0.0168053212842 ||[0.5, 0.7]| 1.63206335009 | 0.0174941090121 ||[0.5, 0.75]| 1.6320633493 | 0.0154384141321 ||[0.5, 0.8]| 1.63206334643 | 0.0152361306211 ||[0.5, 0.85]| 1.63206334471 | 0.0156384893486 ||[0.5, 0.9]| 1.63206334141 | 0.0143047943129 ||[0.5, 0.95]| 1.63206333756 | 0.0143292896173 ||[0.6, 0.7]| 1.63206335325 | 0.0175607470043 ||[0.6, 0.75]| 1.63206335251 | 0.0139532264764 ||[0.6, 0.8]| 1.63206334977 | 0.0143772037858 ||[0.6, 0.85]| 1.63206334813 | 0.0150242258351 ||[0.6, 0.9]| 1.63206334524 | 0.0138256549357 ||[0.6, 0.95]| 1.63206334182 | 0.013986761675 ||[0.7, 0.75]| 1.63206335546 | 0.00901638693935 ||[0.7, 0.8]| 1.63206335283 | 0.012787243572 ||[0.7, 0.85]| 1.63206335126 | 0.0138615914924 ||[0.7, 0.9]| 1.63206334874 | 0.0131225257941 ||[0.7, 0.95]| 1.6320633457 | 0.0134793155216 ||[0.75, 0.8]| 1.63206335358 | 0.0142925246246 ||[0.75, 0.85]| 1.63206335203 | 0.0151228085039 ||[0.75, 0.9]| 1.63206334962 | 0.0131665006411 ||[0.75, 0.95]| 1.63206334671 | 0.0134358355033 ||[0.8, 0.85]| 1.63206335467 | 0.0161534462294 ||[0.8, 0.9]| 1.63206335259 | 0.0125688529289 ||[0.8, 0.95]| 1.63206335002 | 0.0130225829836 ||[0.85, 0.9]| 1.63206335423 | 0.0119223678382 ||[0.85, 0.95]| 1.63206335184 | 0.0126827147301 ||[0.9, 0.95]| 1.63206335413 | 0.012628788748 | ###Code rows_inds_v1 = rows_ind.copy() inds = np.where((norm(U)[:,0] >= 0.05) & (norm(U)[:,0] <= 0.95))[0] rows_inds_v1[inds] = 2 rows_inds_v1 best = 1e10 for _ in xrange(3): U_t, S_t, V_t, rows_ind_t, cols_ind_t = matrix_factorization_clustering(X_train_norm.toarray(), 3, 2, onmtf, num_iters=100) try: dav_sc = davies_bouldin_score(X_train_norm.toarray(), rows_ind_t, S_t.dot(V_t.T)) except: continue if dav_sc < best: best = dav_sc U_v2 = U_t S_v2 = S_t V_v2 = V_t rows_ind_v2 = rows_ind_t cols_ind_v2 = cols_ind_t print 'tf norm (3 clusters): %s' % rand_score(labels, rows_ind_t) print 'tf norm (3 clusters): %s' % sil_score(X_train_norm, rows_ind_t) print 'tf norm (3 clusters): %s' % dav_sc print '' pairplot(U_t) print 'Num elems in cluster 0: %s' % np.sum(rows_ind_t == 0) print 'Num elems in cluster 1: %s' % np.sum(rows_ind_t == 1) print 'Num elems in cluster 2: %s' % np.sum(rows_ind_t == 2) 
print rows_ind_v2 plt.hold() inds = np.where(rows_ind_t == 2)[0] plt.hist(norm(U)[inds, 0], bins=50) inds = np.where(rows_ind_t == 1)[0] plt.hist(norm(U)[inds, 0], bins=50) inds = np.where(rows_ind_t == 0)[0] plt.hist(norm(U)[inds, 0], bins=50) plt.show() def overlap(a, b, k, l): clust_a = a == k clust_b = b == l inds = [] sum_all = 0 sum_equals = 0 for i, elem in enumerate(clust_a): if (clust_a[i] == False and clust_b[i] == False): continue elif (clust_a[i] == True and clust_b[i] == False): sum_all += 1 elif (clust_a[i] == False and clust_b[i] == True): sum_all += 1 elif (clust_a[i] == True and clust_b[i] == True): sum_equals += 1 sum_all += 1 inds.append(i) return np.array(inds), float(sum_equals) / sum_all print 'Do they overlap on cluster 2?' inds, overlap_rate = overlap(rows_inds_v1, rows_ind_v2, 2, 0) print '%.2f' % overlap_rate print inds U, S, V, rows_ind, cols_ind = fnmtf(X, 3, 3, norm=True) ###Output _____no_output_____
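###Markdown Note that `overlap` computes the Jaccard index of the two cluster masks, |A intersect B| / |A union B|. A vectorized equivalent (a sketch, assuming the label arrays can be treated as NumPy arrays): ###Code
def overlap_fast(a, b, k, l):
    mask_a = (np.asarray(a) == k)
    mask_b = (np.asarray(b) == l)
    inter = np.logical_and(mask_a, mask_b)
    union = np.logical_or(mask_a, mask_b)
    inds = np.where(inter)[0]          # indices where both clusterings agree
    return inds, float(inter.sum()) / union.sum()

inds_fast, rate_fast = overlap_fast(rows_inds_v1, rows_ind_v2, 2, 0)
print '%.2f' % rate_fast
###Output _____no_output_____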
Natural Language Processing/Course 3 - Natural Language Processing with Sequence Models/Labs/Week 1/Trax.ipynb
###Markdown Trax : Ungraded Lecture NotebookIn this notebook you'll get to know the Trax framework and learn about some of its basic building blocks. Background Why Trax and not TensorFlow or PyTorch?TensorFlow and PyTorch are both extensive frameworks that can do almost anything in deep learning. They offer a lot of flexibility, but that often means verbosity of syntax and extra time to code.Trax is much more concise. It runs on a TensorFlow backend but allows you to train models with one-line commands. Trax also runs end to end, allowing you to get data, model and train all with single terse statements. This means you can focus on learning, instead of spending hours on the idiosyncrasies of big framework implementation. Why not Keras then?Keras is now part of TensorFlow itself from 2.0 onwards. Also, Trax is good for implementing new state-of-the-art algorithms like Transformers, Reformers and BERT because it is actively maintained by the Google Brain Team for advanced deep learning tasks. It runs smoothly on CPUs, GPUs and TPUs as well, with comparatively fewer code modifications. How to Code in TraxBuilding models in Trax relies on 2 key concepts: **layers** and **combinators**.Trax layers are simple objects that process data and perform computations. They can be chained together into composite layers using Trax combinators, allowing you to build layers and models of any complexity. Trax, JAX, TensorFlow and Tensor2TensorYou already know that Trax uses TensorFlow as a backend, but it also uses the JAX library to speed up computation. You can view JAX as an enhanced and optimized version of numpy. **Watch out for assignments which import `import trax.fastmath.numpy as np`. If you see this line, remember that when calling `np` you are really calling Trax's version of numpy that is compatible with JAX.**As a result of this, where you used to encounter the type `numpy.ndarray` you will now find the type `jax.interpreters.xla.DeviceArray`.Tensor2Tensor is another name you might have heard. It started as an end-to-end solution much like how Trax is designed, but it grew unwieldy and complicated. So you can view Trax as the new, improved version that operates much faster and more simply. Resources- Trax source code can be found on Github: [Trax](https://github.com/google/trax)- JAX library: [JAX](https://jax.readthedocs.io/en/latest/index.html) Installing TraxTrax depends on JAX and some related libraries that are not yet supported on [Windows](https://github.com/google/jax/blob/1bc5896ee4eab5d7bb4ec6f161d8b2abb30557be/README.mdinstallation) but work well on Ubuntu and MacOS. We would suggest that if you are working on Windows, you try to install Trax on WSL2. Officially maintained documentation: [trax-ml](https://trax-ml.readthedocs.io/en/latest/), not to be confused with this [TraX](https://trax.readthedocs.io/en/latest/index.html) ###Code
!pip install trax==1.3.1  # use this version for this notebook
###Output Requirement already satisfied: trax==1.3.1 in /opt/conda/lib/python3.7/site-packages (1.3.1)
WARNING: You are using pip version 20.1.1; however, version 20.2.3 is available.
You should consider upgrading via the '/opt/conda/bin/python -m pip install --upgrade pip' command.
###Markdown Imports ###Code import numpy as np # regular ol' numpy from trax import layers as tl # core building block from trax import shapes # data signatures: dimensionality and type from trax import fastmath # uses jax, offers numpy on steroids # Trax version 1.3.1 or better !pip list | grep trax ###Output trax 1.3.1 WARNING: You are using pip version 20.1.1; however, version 20.2.3 is available. You should consider upgrading via the '/opt/conda/bin/python -m pip install --upgrade pip' command. ###Markdown LayersLayers are the core building blocks in Trax or as mentioned in the lectures, they are the base classes.They take inputs, compute functions/custom calculations and return outputs.You can also inspect layer properties. Let me show you some examples. Relu LayerFirst I'll show you how to build a relu activation function as a layer. A layer like this is one of the simplest types. Notice there is no object initialization so it works just like a math function.**Note: Activation functions are also layers in Trax, which might look odd if you have been using other frameworks for a longer time.** ###Code # Layers # Create a relu trax layer relu = tl.Relu() # Inspect properties print("-- Properties --") print("name :", relu.name) print("expected inputs :", relu.n_in) print("promised outputs :", relu.n_out, "\n") # Inputs x = np.array([-2, -1, 0, 1, 2]) print("-- Inputs --") print("x :", x, "\n") # Outputs y = relu(x) print("-- Outputs --") print("y :", y) ###Output -- Properties -- name : Relu expected inputs : 1 promised outputs : 1 -- Inputs -- x : [-2 -1 0 1 2] -- Outputs -- y : [0 0 0 1 2] ###Markdown Concatenate LayerNow I'll show you how to build a layer that takes 2 inputs. Notice the change in the expected inputs property from 1 to 2. ###Code # Create a concatenate trax layer concat = tl.Concatenate() print("-- Properties --") print("name :", concat.name) print("expected inputs :", concat.n_in) print("promised outputs :", concat.n_out, "\n") # Inputs x1 = np.array([-10, -20, -30]) x2 = x1 / -10 print("-- Inputs --") print("x1 :", x1) print("x2 :", x2, "\n") # Outputs y = concat([x1, x2]) print("-- Outputs --") print("y :", y) ###Output -- Properties -- name : Concatenate expected inputs : 2 promised outputs : 1 -- Inputs -- x1 : [-10 -20 -30] x2 : [1. 2. 3.] -- Outputs -- y : [-10. -20. -30. 1. 2. 3.] ###Markdown Layers are ConfigurableYou can change the default settings of layers. For example, you can change the expected inputs for a concatenate layer from 2 to 3 using the optional parameter `n_items`. ###Code # Configure a concatenate layer concat_3 = tl.Concatenate(n_items=3) # configure the layer's expected inputs print("-- Properties --") print("name :", concat_3.name) print("expected inputs :", concat_3.n_in) print("promised outputs :", concat_3.n_out, "\n") # Inputs x1 = np.array([-10, -20, -30]) x2 = x1 / -10 x3 = x2 * 0.99 print("-- Inputs --") print("x1 :", x1) print("x2 :", x2) print("x3 :", x3, "\n") # Outputs y = concat_3([x1, x2, x3]) print("-- Outputs --") print("y :", y) ###Output -- Properties -- name : Concatenate expected inputs : 3 promised outputs : 1 -- Inputs -- x1 : [-10 -20 -30] x2 : [1. 2. 3.] x3 : [0.99 1.98 2.97] -- Outputs -- y : [-10. -20. -30. 1. 2. 3. 
0.99 1.98 2.97] ###Markdown **Note: At any point, if you want to refer to a function's help, look up the [documentation](https://trax-ml.readthedocs.io/en/latest/) or use the help function.** ###Code
#help(tl.Concatenate) #Uncomment this to see the function docstring with explanation
###Output _____no_output_____
###Markdown Layers can have WeightsSome layer types include mutable weights and biases that are used in computation and training. Layers of this type require initialization before use.For example, the `LayerNorm` layer calculates normalized data that is also scaled by weights and biases. During initialization you pass the data shape and data type of the inputs, so the layer can initialize compatible arrays of weights and biases. ###Code
# Uncomment any of them to see information regarding the function
help(tl.LayerNorm)
# help(shapes.signature)

# Layer initialization
norm = tl.LayerNorm()

# You first must know what the input data will look like
x = np.array([0, 1, 2, 3], dtype="float")

# Use the input data signature to get shape and type for initializing weights and biases;
# we need to convert the input datatype from the usual tuple to a trax ShapeDtype
norm.init(shapes.signature(x))

print("Normal shape:", x.shape, "Data Type:", type(x.shape))
print("Shapes Trax:", shapes.signature(x), "Data Type:", type(shapes.signature(x)))

# Inspect properties
print("-- Properties --")
print("name :", norm.name)
print("expected inputs :", norm.n_in)
print("promised outputs :", norm.n_out)
# Weights and biases
print("weights :", norm.weights[0])
print("biases :", norm.weights[1], "\n")

# Inputs
print("-- Inputs --")
print("x :", x)

# Outputs
y = norm(x)
print("-- Outputs --")
print("y :", y)
###Output Normal shape: (4,) Data Type: <class 'tuple'>
Shapes Trax: ShapeDtype{shape:(4,), dtype:float64} Data Type: <class 'trax.shapes.ShapeDtype'>
-- Properties --
name : LayerNorm
expected inputs : 1
promised outputs : 1
weights : [1. 1. 1. 1.]
biases : [0. 0. 0. 0.]
-- Inputs --
x : [0. 1. 2. 3.]
-- Outputs --
y : [-1.3416404 -0.44721344 0.44721344 1.3416404 ]
###Markdown Custom LayersThis is where things start getting more interesting!You can create your own custom layers too and define custom functions for computations by using `tl.Fn`. Let me show you how. ###Code
help(tl.Fn)

# Define a custom layer
# In this example you will create a layer to calculate the input times 2
def TimesTwo():
    layer_name = "TimesTwo"  # don't forget to give your custom layer a name to identify it

    # Custom function for the custom layer
    def func(x):
        return x * 2

    return tl.Fn(layer_name, func)

# Test it
times_two = TimesTwo()

# Inspect properties
print("-- Properties --")
print("name :", times_two.name)
print("expected inputs :", times_two.n_in)
print("promised outputs :", times_two.n_out, "\n")

# Inputs
x = np.array([1, 2, 3])
print("-- Inputs --")
print("x :", x, "\n")

# Outputs
y = times_two(x)
print("-- Outputs --")
print("y :", y)
###Output -- Properties --
name : TimesTwo
expected inputs : 1
promised outputs : 1
-- Inputs --
x : [1 2 3]
-- Outputs --
y : [2 4 6]
###Markdown CombinatorsYou can combine layers to build more complex layers. Trax provides a set of objects named combinator layers to make this happen. Combinators are themselves layers, so behavior commutes. Serial CombinatorThis is the most common and easiest to use. For example, you could build a simple neural network by combining layers into a single layer using the `Serial` combinator. This new layer then acts just like a single layer, so you can inspect inputs, outputs and weights.
Or even combine it into another layer! Combinators can then be used as trainable models. _Try adding more layers._ **Note: As you may have guessed, if there is a serial combinator, there must be a parallel combinator as well. Do explore combinators and other layers in the trax documentation and look at the repo to understand how these layers are written.** ###Code # help(tl.Serial) # help(tl.Parallel) # Serial combinator serial = tl.Serial( tl.LayerNorm(), # normalize input tl.Relu(), # convert negative values to zero times_two, # the custom layer you created above, multiplies the input received from above by 2 ### START CODE HERE # tl.Dense(n_units=2), # try adding more layers. eg uncomment these lines # tl.Dense(n_units=1), # Binary classification, maybe? uncomment at your own peril # tl.LogSoftmax() # Yes, LogSoftmax is also a layer ### END CODE HERE ) # Initialization x = np.array([-2, -1, 0, 1, 2]) #input serial.init(shapes.signature(x)) #initialising serial instance print("-- Serial Model --") print(serial,"\n") print("-- Properties --") print("name :", serial.name) print("sublayers :", serial.sublayers) print("expected inputs :", serial.n_in) print("promised outputs :", serial.n_out) print("weights & biases:", serial.weights, "\n") # Inputs print("-- Inputs --") print("x :", x, "\n") # Outputs y = serial(x) print("-- Outputs --") print("y :", y) ###Output -- Serial Model -- Serial[ LayerNorm Relu TimesTwo ] -- Properties -- name : Serial sublayers : [LayerNorm, Relu, TimesTwo] expected inputs : 1 promised outputs : 1 weights & biases: [(DeviceArray([1, 1, 1, 1, 1], dtype=int32), DeviceArray([0, 0, 0, 0, 0], dtype=int32)), (), ()] -- Inputs -- x : [-2 -1 0 1 2] -- Outputs -- y : [0. 0. 0. 1.4142132 2.8284264] ###Markdown JAXJust remember to look out for which numpy you are using, the regular ol' numpy or Trax's JAX compatible numpy. Both tend to use the alias np so watch those import blocks.**Note: There are certain things that are still not possible in fastmath.numpy but can be done in numpy, so you will see in the assignments that we switch between them to get our work done.** ###Code # numpy and fastmath.numpy have different data types # Regular ol' numpy x_numpy = np.array([1, 2, 3]) print("good old numpy : ", type(x_numpy), "\n") # Fastmath and jax numpy x_jax = fastmath.numpy.array([1, 2, 3]) print("jax trax numpy : ", type(x_jax)) ###Output good old numpy : <class 'numpy.ndarray'> jax trax numpy : <class 'jax.interpreters.xla.DeviceArray'>
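###Markdown Since the two flavors of numpy get used side by side, it helps to know how to convert between them. Below is a minimal sketch (my own addition, assuming the imports above; `np.asarray` accepts JAX arrays because they implement the array protocol, and `fastmath.numpy.asarray` mirrors `jax.numpy.asarray`): ###Code
# Convert a JAX DeviceArray back to a regular numpy array
x_back = np.asarray(x_jax)
print("converted back : ", type(x_back))

# Convert a regular numpy array to a trax/JAX array
x_forward = fastmath.numpy.asarray(x_numpy)
print("converted forward : ", type(x_forward))
###Output _____no_output_____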
docs/examples/Usage Example.ipynb
###Markdown Reference Data RepositoryThis notebook contains a few examples that demonstrate the main functionality of the **Reference Data Repository** package `refdata`. Local Data StoreThe local data store is responsible for maintaining information about downloaded datasets and providing access to the downloaded data. By default, all downloaded files are stored in a local folder under `$HOME/.refdata`. This behavior can be changed by either setting the environment variable *REFDATA_BASEDIR* to point to a different directory on the file system or by providing a reference to the directory using the `basedir` parameter when creating an instance of the `LocalStore`.The local data store is associated with a (remote) data repository index file that contains the list of datasets that are available for download. By default, the [index file in this repository is used](https://github.com/VIDA-NYU/reference-data-repository/blob/master/data/index.json). You can change this behavior by setting the environment variable *REFDATA_URL*. ###Code # Create an instance of the local data store with default settings. from refdata.store import RefStore refstore = RefStore() # Print the identifier, name, and description for all # datasets that are listed in the associated repository # index. for dataset in refstore.repository().find(): print('{} (id={})'.format(dataset.name, dataset.identifier)) desc = dataset.description if dataset.description is not None else 'no description available' print('{}\n'.format(desc)) ###Output Cities in the U.S. (id=encyclopaedia_britannica:us_cities) Names of cities in the U.S. from the Encyclopaedia Britannica. REST Countries (id=restcountries.eu) Information about countries in the world available from the restcountries.eu project. C1 Street Suffix Abbreviations (id=usps:street_abbrev) Mapping of common street type abbreviations to a standard format. C2 Secondary Unit Designators (id=usps:secondary_unit_designators) no description available ###Markdown Manage Downloaded DatasetsThe local data store provides basic functionality to download datasets, get a list of all downloaded datasets, access metadata for these datasets, and remove a dataset from the local file system. ###Code # Download the restcountries dataset dataset = refstore.download('restcountries.eu') print('downloaded dataset {} (size {} bytes).'.format(dataset.identifier, dataset.filesize)) # List identifiers and names for datasets that have # been downloaded to the local store. print('Downloaded datasets:\n') for dataset in refstore.list(): print('> {} (id={})'.format(dataset.name, dataset.identifier)) # List identifiers and names for columns (attributes) # in the restcountries dataset. print('Columns:\n') for col in refstore.load('restcountries.eu').columns: print(' {} (id={})'.format(col.name, col.identifier)) # The full dataset metadata is also available as a # dictionary. import json print(json.dumps(refstore.load('restcountries.eu').to_dict(), indent=4)) # Remove a downloaded dataset from the local file system. refstore.remove('restcountries.eu') for dataset in refstore.list(): print(dataset.identifier) ###Output encyclopaedia_britannica:us_cities ###Markdown Access Reference DataData from downloaded datasets can be accessed in three different ways:- Set of distinct values- Lookup table generated from dataset columns- Pandas data frame Set of Distinct ValuesGet the set of distinct values for one or more columns of a dataset.
If multiple columns are specified (as a list), the resulting set will contain tuples of distinct value combinations. ###Code # Get the list of distinct U.S. state names from the # Encyclopaedia Britannica dataset with U.S. city # names. # Instead of downloading and then opening the dataset # we can open it directly and set the auto_download flag # which will download the dataset if it is not in the local # store. dataset = refstore.load('encyclopaedia_britannica:us_cities', auto_download=True) # Alternative shortcut: # refstore.distinct(key='encyclopaedia_britannica:us_cities', columns='state') dataset.distinct('state') ###Output _____no_output_____ ###Markdown Lookup TablesIt is possible to directly generate a lookup table that maps values from one column (or multiple columns) to the values in one or more other columns. Lookup tables are represented as dictionaries. ###Code # Get a lookup table (dictionary) that maps the # ISO 3166-1 3-letter country code to the country's # capital city. Convert values from both attributes # to upper case before adding them to the mapping. dataset = refstore.load('restcountries.eu', auto_download=True) # Alternative shortcut: # refstore.mapping(key='restcountries.eu', lhs='alpha3Code', rhs='capital') mapping = dataset.mapping(lhs='alpha3Code', rhs='capital', transformer=str.upper) mapping['AUS'] ###Output _____no_output_____ ###Markdown Data FrameThe full dataset (or a subset of the columns) can also be loaded as a pandas data frame. ###Code # Get data frame with country name, 3-letter country code, # and capital city. dataset = refstore.load('restcountries.eu', auto_download=True) # Alternative shortcut: # refstore.load('restcountries.eu', ['name', 'alpha3Code', 'capital']) df = dataset.df(['name', 'alpha3Code', 'capital']) df.head() ###Output _____no_output_____
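###Markdown As noted above, `distinct` also accepts a list of columns, in which case the result contains tuples of value combinations. A quick sketch (my own addition; the `city` column identifier is an assumption for illustration, so check the dataset's column listing first): ###Code
# Distinct (city, state) combinations from the U.S. cities dataset.
# NOTE: 'city' is an assumed column identifier; inspect dataset.columns
# to confirm the actual identifiers before running.
dataset = refstore.load('encyclopaedia_britannica:us_cities', auto_download=True)
pairs = dataset.distinct(['city', 'state'])
print(len(pairs))
###Output _____no_output_____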
notebooks/04-estimating-proportions.ipynb
###Markdown Estimating Proportions Think Bayes, Second EditionCopyright 2020 Allen B. DowneyLicense: [Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/) ###Code # If we're running on Colab, install empiricaldist # https://pypi.org/project/empiricaldist/ import sys IN_COLAB = 'google.colab' in sys.modules if IN_COLAB: !pip install empiricaldist # Get utils.py and create directories import os if not os.path.exists('utils.py'): !wget https://github.com/AllenDowney/ThinkBayes2/raw/master/soln/utils.py from utils import set_pyplot_params set_pyplot_params() ###Output _____no_output_____ ###Markdown In the previous chapter we solved the 101 Bowls Problem, and I admitted that it is not really about guessing which bowl the cookies came from; it is about estimating proportions.In this chapter, we take another step toward Bayesian statistics by solving the Euro problem.We'll start with the same prior distribution, and we'll see that the update is the same, mathematically.But I will argue that it is a different problem, philosophically, and use it to introduce two defining elements of Bayesian statistics: choosing prior distributions, and using probability to represent the unknown. The Euro ProblemIn *Information Theory, Inference, and Learning Algorithms*, David MacKay poses this problem:"A statistical statement appeared in *The Guardian* on Friday January 4, 2002:> When spun on edge 250 times, a Belgian one-euro coin came up heads 140 times and tails 110. 'It looks very suspicious to me,' said Barry Blight, a statistics lecturer at the London School of Economics. 'If the coin were unbiased, the chance of getting a result as extreme as that would be less than 7%.'"But [MacKay asks] do these data give evidence that the coin is biased rather than fair?"To answer that question, we'll proceed in two steps.First we'll use the binomial distribution to see where that 7% came from; then we'll use Bayes's Theorem to estimate the probability that this coin comes up heads. The Binomial DistributionSuppose I tell you that a coin is "fair", that is, the probability of heads is 50%. If you spin it twice, there are four outcomes: `HH`, `HT`, `TH`, and `TT`. All four outcomes have the same probability, 25%.If we add up the total number of heads, there are three possible results: 0, 1, or 2. The probabilities of 0 and 2 are 25%, and the probability of 1 is 50%.More generally, suppose the probability of heads is $p$ and we spin the coin $n$ times.
The probability that we get a total of $k$ heads is given by the [binomial distribution](https://en.wikipedia.org/wiki/Binomial_distribution):$$\binom{n}{k} p^k (1-p)^{n-k}$$for any value of $k$ from 0 to $n$, including both.The term $\binom{n}{k}$ is the [binomial coefficient](https://en.wikipedia.org/wiki/Binomial_coefficient), usually pronounced "n choose k".> (personal notes) > $\binom{n}{k}$ tells us how many paths lead to $k$ heads in $n$ flips > $p^k$ is the joint probability for the positive matches > $(1-p)^{n-k}$ is the joint probability of the negative matches We could evaluate this expression ourselves, but we can also use the SciPy function `binom.pmf`.For example, if we flip a coin `n=2` times and the probability of heads is `p=0.5`, here's the probability of getting `k=1` heads: ###Code from scipy.stats import binom n = 2 p = 0.5 k = 1 binom.pmf(k, n, p) ###Output _____no_output_____ ###Markdown Instead of providing a single value for `k`, we can also call `binom.pmf` with an array of values. ###Code import numpy as np ks = np.arange(n+1) ps = binom.pmf(ks, n, p) ps ###Output _____no_output_____ ###Markdown The result is a NumPy array with the probability of 0, 1, or 2 heads.If we put these probabilities in a `Pmf`, the result is the distribution of `k` for the given values of `n` and `p`.Here's what it looks like: ###Code from empiricaldist import Pmf pmf_k = Pmf(ps, ks) pmf_k ###Output _____no_output_____ ###Markdown The following function computes the binomial distribution for given values of `n` and `p` and returns a `Pmf` that represents the result. ###Code def make_binomial(n, p): """Make a binomial Pmf.""" ks = np.arange(n+1) ps = binom.pmf(ks, n, p) return Pmf(ps, ks) ###Output _____no_output_____ ###Markdown Here's what it looks like with `n=250` and `p=0.5`: ###Code pmf_k = make_binomial(n=250, p=0.5) from utils import decorate pmf_k.plot(label='n=250, p=0.5') decorate(xlabel='Number of heads (k)', ylabel='PMF', title='Binomial distribution') ###Output _____no_output_____ ###Markdown The most likely quantity in this distribution is 125: ###Code pmf_k.max_prob() ###Output _____no_output_____ ###Markdown But even though it is the most likely quantity, the probability that we get exactly 125 heads is only about 5%. ###Code pmf_k[125] ###Output _____no_output_____ ###Markdown In MacKay's example, we got 140 heads, which is even less likely than 125: ###Code pmf_k[140] ###Output _____no_output_____ ###Markdown In the article MacKay quotes, the statistician says, "If the coin were unbiased the chance of getting a result as extreme as that would be less than 7%."We can use the binomial distribution to check his math. The following function takes a PMF and computes the total probability of quantities greater than or equal to `threshold`. ###Code def prob_ge(pmf, threshold): """Probability of quantities greater than or equal to threshold.""" ge = (pmf.qs >= threshold) total = pmf[ge].sum() return total ###Output _____no_output_____ ###Markdown Here's the probability of getting 140 heads or more: ###Code prob_ge(pmf_k, 140) ###Output _____no_output_____ ###Markdown `Pmf` provides a method that does the same computation. ###Code pmf_k.prob_ge(140) ###Output _____no_output_____ ###Markdown The result is about 3.3%, which is less than the quoted 7%. The reason for the difference is that the statistician includes all outcomes "as extreme as" 140, which includes outcomes less than or equal to 110.To see where that comes from, recall that the expected number of heads is 125.
If we get 140, we've exceeded that expectation by 15.And if we get 110, we have come up short by 15.7% is the sum of both of these "tails", as shown in the following figure. ###Code import matplotlib.pyplot as plt def fill_below(pmf): qs = pmf.index ps = pmf.values plt.fill_between(qs, ps, 0, color='C5', alpha=0.4) qs = pmf_k.index fill_below(pmf_k[qs>=140]) fill_below(pmf_k[qs<=110]) pmf_k.plot(label='n=250, p=0.5') decorate(xlabel='Number of heads (k)', ylabel='PMF', title='Binomial distribution') ###Output _____no_output_____ ###Markdown Here's how we compute the total probability of the left tail. ###Code pmf_k.prob_le(110) ###Output _____no_output_____ ###Markdown The probability of outcomes less than or equal to 110 is also 3.3%, so the total probability of outcomes "as extreme" as 140 is 6.6%.The point of this calculation is that these extreme outcomes are unlikely if the coin is fair.That's interesting, but it doesn't answer MacKay's question. Let's see if we can. Bayesian EstimationAny given coin has some probability of landing heads up when spun on edge; I'll call this probability `x`.It seems reasonable to believe that `x` depends on physical characteristics of the coin, like the distribution of weight.If a coin is perfectly balanced, we expect `x` to be close to 50%, but for a lopsided coin, `x` might be substantially different.We can use Bayes's theorem and the observed data to estimate `x`.For simplicity, I'll start with a uniform prior, which assumes that all values of `x` are equally likely.That might not be a reasonable assumption, so we'll come back and consider other priors later.We can make a uniform prior like this: ###Code hypos = np.linspace(0, 1, 101) prior = Pmf(1, hypos) ###Output _____no_output_____ ###Markdown `hypos` is an array of equally spaced values between 0 and 1.We can use the hypotheses to compute the likelihoods, like this: ###Code likelihood_heads = hypos likelihood_tails = 1 - hypos ###Output _____no_output_____ ###Markdown I'll put the likelihoods for heads and tails in a dictionary to make it easier to do the update. ###Code likelihood = { 'H': likelihood_heads, 'T': likelihood_tails } ###Output _____no_output_____ ###Markdown To represent the data, I'll construct a string with `H` repeated 140 times and `T` repeated 110 times. ###Code dataset = 'H' * 140 + 'T' * 110 ###Output _____no_output_____ ###Markdown The following function does the update. ###Code def update_euro(pmf, dataset): """Update pmf with a given sequence of H and T.""" for data in dataset: pmf *= likelihood[data] pmf.normalize() ###Output _____no_output_____ ###Markdown The first argument is a `Pmf` that represents the prior.The second argument is a sequence of strings.Each time through the loop, we multiply `pmf` by the likelihood of one outcome, `H` for heads or `T` for tails.Notice that `normalize` is outside the loop, so the posterior distribution only gets normalized once, at the end.That's more efficient than normalizing it after each spin (although we'll see later that it can also cause problems with floating-point arithmetic).Here's how we use `update_euro`. ###Code posterior = prior.copy() update_euro(posterior, dataset) ###Output _____no_output_____ ###Markdown And here's what the posterior looks like.
###Code def decorate_euro(title): decorate(xlabel='Proportion of heads (x)', ylabel='Probability', title=title) posterior.plot(label='140 heads out of 250', color='C4') decorate_euro(title='Posterior distribution of x') ###Output _____no_output_____ ###Markdown This figure shows the posterior distribution of `x`, which is the proportion of heads for the coin we observed.The posterior distribution represents our beliefs about `x` after seeing the data.It indicates that values less than 0.4 and greater than 0.7 are unlikely; values between 0.5 and 0.6 are the most likely.In fact, the most likely value for `x` is 0.56, which is the proportion of heads in the dataset, `140/250`. ###Code posterior.max_prob() ###Output _____no_output_____ ###Markdown Triangle PriorSo far we've been using a uniform prior: ###Code uniform = Pmf(1, hypos, name='uniform') uniform.normalize() ###Output _____no_output_____ ###Markdown But that might not be a reasonable choice based on what we know about coins.I can believe that if a coin is lopsided, `x` might deviate substantially from 0.5, but it seems unlikely that the Belgian Euro coin is so imbalanced that `x` is 0.1 or 0.9.It might be more reasonable to choose a prior that gives higher probability to values of `x` near 0.5 and lower probability to extreme values.As an example, let's try a triangle-shaped prior.Here's the code that constructs it: ###Code ramp_up = np.arange(50) ramp_down = np.arange(50, -1, -1) a = np.append(ramp_up, ramp_down) triangle = Pmf(a, hypos, name='triangle') triangle.normalize() ###Output _____no_output_____ ###Markdown `arange` returns a NumPy array, so we can use `np.append` to append `ramp_down` to the end of `ramp_up`.Then we use `a` and `hypos` to make a `Pmf`.The following figure shows the result, along with the uniform prior. ###Code uniform.plot() triangle.plot() decorate_euro(title='Uniform and triangle prior distributions') ###Output _____no_output_____ ###Markdown Now we can update both priors with the same data: ###Code update_euro(uniform, dataset) update_euro(triangle, dataset) ###Output _____no_output_____ ###Markdown Here are the posteriors. ###Code uniform.plot() triangle.plot() decorate_euro(title='Posterior distributions') ###Output _____no_output_____ ###Markdown The differences between the posterior distributions are barely visible, and so small they would hardly matter in practice.And that's good news.To see why, imagine two people who disagree angrily about which prior is better, uniform or triangle.Each of them has reasons for their preference, but neither of them can persuade the other to change their mind.But suppose they agree to use the data to update their beliefs.When they compare their posterior distributions, they find that there is almost nothing left to argue about.This is an example of **swamping the priors**: with enough data, people who start with different priors will tend to converge on the same posterior distribution.
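To put a number on "barely visible", we can compare the two posteriors directly (a quick sketch of my own, using the fact that `Pmf` objects behave like pandas Series): ###Code
# Largest pointwise difference between the two posterior distributions;
# a tiny value confirms that the data have swamped the priors.
np.max(np.abs(uniform - triangle))
###Output _____no_output_____ ###Markdown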
The Binomial Likelihood FunctionSo far we've been computing the updates one spin at a time, so for the Euro problem we have to do 250 updates.A more efficient alternative is to compute the likelihood of the entire dataset at once.For each hypothetical value of `x`, we have to compute the probability of getting 140 heads out of 250 spins.Well, we know how to do that; this is the question the binomial distribution answers.If the probability of heads is $p$, the probability of $k$ heads in $n$ spins is:$$\binom{n}{k} p^k (1-p)^{n-k}$$And we can use SciPy to compute it.The following function takes a `Pmf` that represents a prior distribution and a tuple of integers that represent the data: ###Code from scipy.stats import binom def update_binomial(pmf, data): """Update pmf using the binomial distribution.""" k, n = data xs = pmf.qs likelihood = binom.pmf( k, # the flips that turn into heads n, # the total flips xs # the hypothesis space ) pmf *= likelihood pmf.normalize() ###Output _____no_output_____ ###Markdown The data are represented with a tuple of values for `k` and `n`, rather than a long string of outcomes.Here's the update. ###Code uniform2 = Pmf(1, hypos, name='uniform2') data = 140, 250 update_binomial(uniform2, data) ###Output _____no_output_____ ###Markdown And here's what the posterior looks like. ###Code uniform.plot() uniform2.plot() decorate_euro(title='Posterior distributions computed two ways') ###Output _____no_output_____ ###Markdown We can use `allclose` to confirm that the result is the same as in the previous section except for a small floating-point round-off. ###Code np.allclose(uniform, uniform2) ###Output _____no_output_____ ###Markdown But this way of doing the computation is much more efficient. Bayesian StatisticsYou might have noticed similarities between the Euro problem and the 101 Bowls Problem in the previous chapter.The prior distributions are the same, the likelihoods are the same, and with the same data the results would be the same.But there are two differences.The first is the choice of the prior.With 101 bowls, the uniform prior is implied by the statement of the problem, which says that we choose one of the bowls at random with equal probability.In the Euro problem, the choice of the prior is subjective; that is, reasonable people could disagree, maybe because they have different information about coins or because they interpret the same information differently.Because the priors are subjective, the posteriors are subjective, too.And some people find that problematic.
The other difference is the nature of what we are estimating.In the 101 Bowls problem, we choose the bowl randomly, so it is uncontroversial to compute the probability of choosing each bowl.In the Euro problem, the proportion of heads is a physical property of a given coin.Under some interpretations of probability, that's a problem because physical properties are not considered random.As an example, consider the age of the universe.Currently, our best estimate is 13.80 billion years, but it might be off by 0.02 billion years in either direction (see [here](https://en.wikipedia.org/wiki/Age_of_the_universe)).Now suppose we would like to know the probability that the age of the universe is actually greater than 13.81 billion years.Under some interpretations of probability, we would not be able to answer that question.We would be required to say something like, "The age of the universe is not a random quantity, so it has no probability of exceeding a particular value."Under the Bayesian interpretation of probability, it is meaningful and useful to treat physical quantities as if they were random and compute probabilities about them.In the Euro problem, the prior distribution represents what we believe about coins in general and the posterior distribution represents what we believe about a particular coin after seeing the data.So we can use the posterior distribution to compute probabilities about the coin and its proportion of heads. The subjectivity of the prior and the interpretation of the posterior are key differences between using Bayes's Theorem and doing Bayesian statistics.Bayes's Theorem is a mathematical law of probability; no reasonable person objects to it.But Bayesian statistics is surprisingly controversial.Historically, many people have been bothered by its subjectivity and its use of probability for things that are not random.If you are interested in this history, I recommend Sharon Bertsch McGrayne's book, *[The Theory That Would Not Die](https://yalebooks.yale.edu/book/9780300188226/theory-would-not-die)*. SummaryIn this chapter I posed David MacKay's Euro problem and we started to solve it.Given the data, we computed the posterior distribution for `x`, the probability a Euro coin comes up heads.We tried two different priors, updated them with the same data, and found that the posteriors were nearly the same.This is good news, because it suggests that if two people start with different beliefs and see the same data, their beliefs tend to converge.This chapter introduces the binomial distribution, which we used to compute the posterior distribution more efficiently.And I discussed the differences between applying Bayes's Theorem, as in the 101 Bowls problem, and doing Bayesian statistics, as in the Euro problem.However, we still haven't answered MacKay's question: "Do these data give evidence that the coin is biased rather than fair?"I'm going to leave this question hanging a little longer; we'll come back to it in a later chapter.In the next chapter, we'll solve problems related to counting, including trains, tanks, and rabbits.But first you might want to work on these exercises. Exercises **Exercise:** In Major League Baseball, most players have a batting average between 200 and 330, which means that their probability of getting a hit is between 0.2 and 0.33.Suppose a player appearing in their first game gets 3 hits out of 3 attempts. What is the posterior distribution for their probability of getting a hit?
For this exercise, I'll construct the prior distribution by starting with a uniform distribution and updating it with imaginary data until it has a shape that reflects my background knowledge of batting averages.Here's the uniform prior: ###Code hypos = np.linspace(0.1, 0.4, 101) prior = Pmf(1, hypos) ###Output _____no_output_____ ###Markdown And here is a dictionary of likelihoods, with `Y` for getting a hit and `N` for not getting a hit. ###Code likelihood = { 'Y': hypos, 'N': 1-hypos } ###Output _____no_output_____ ###Markdown Here's a dataset that yields a reasonable prior distribution. ###Code dataset = 'Y' * 25 + 'N' * 75 ###Output _____no_output_____ ###Markdown And here's the update with the imaginary data. ###Code for data in dataset: prior *= likelihood[data] prior.normalize() ###Output _____no_output_____ ###Markdown Finally, here's what the prior looks like. ###Code prior.plot(label='prior') decorate(xlabel='Probability of getting a hit', ylabel='PMF') ###Output _____no_output_____ ###Markdown This distribution indicates that most players have a batting average near 250, with only a few players below 175 or above 350. I'm not sure how accurately this prior reflects the distribution of batting averages in Major League Baseball, but it is good enough for this exercise.Now update this distribution with the data and plot the posterior. What is the most likely quantity in the posterior distribution? ###Code # Solution goes here # Solution goes here # Solution goes here # Solution goes here ###Output _____no_output_____ ###Markdown **Exercise:** Whenever you survey people about sensitive issues, you have to deal with [social desirability bias](https://en.wikipedia.org/wiki/Social_desirability_bias), which is the tendency of people to adjust their answers to show themselves in the most positive light.One way to improve the accuracy of the results is [randomized response](https://en.wikipedia.org/wiki/Randomized_response).As an example, suppose we want to know how many people cheat on their taxes. If you ask them directly, it is likely that some of the cheaters will lie.You can get a more accurate estimate if you ask them indirectly, like this: Ask each person to flip a coin and, without revealing the outcome,* If they get heads, they report YES.* If they get tails, they honestly answer the question "Do you cheat on your taxes?"If someone says YES, we don't know whether they actually cheat on their taxes; they might have flipped heads.Knowing this, people might be more willing to answer honestly.Suppose you survey 100 people this way and get 80 YESes and 20 NOs. Based on this data, what is the posterior distribution for the fraction of people who cheat on their taxes? What is the most likely quantity in the posterior distribution? ###Code # Solution goes here # Solution goes here # Solution goes here # Solution goes here # Solution goes here ###Output _____no_output_____ ###Markdown **Exercise:** Suppose you want to test whether a coin is fair, but you don't want to spin it hundreds of times.So you make a machine that spins the coin automatically and uses computer vision to determine the outcome.However, you discover that the machine is not always accurate. Specifically, suppose the probability is `y=0.2` that an actual heads is reported as tails, or actual tails reported as heads.If we spin a coin 250 times and the machine reports 140 heads, what is the posterior distribution of `x`? What happens as you vary the value of `y`?
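One way to set up the likelihood for this exercise (a sketch of my own, not the book's official solution): if the true probability of heads is `x` and the machine flips each reading with probability `y`, then the probability that a single spin is *reported* as heads is `x * (1 - y) + (1 - x) * y`. ###Code
# Sketch: reuse the binomial update with the error-adjusted
# probability of a reported head.
y = 0.2
hypos = np.linspace(0, 1, 101)
prior = Pmf(1, hypos)
reported = hypos * (1 - y) + (1 - hypos) * y
posterior = prior * binom.pmf(140, 250, reported)
posterior.normalize()
posterior.max_prob()
###Output _____no_output_____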
###Code # Solution goes here # Solution goes here # Solution goes here # Solution goes here ###Output _____no_output_____ ###Markdown **Exercise:** In preparation for an alien invasion, the Earth Defense League (EDL) has been working on new missiles to shoot down space invaders. Of course, some missile designs are better than others; let's assume that each design has some probability of hitting an alien ship, `x`.Based on previous tests, the distribution of `x` in the population of designs is approximately uniform between 0.1 and 0.4.Now suppose the new ultra-secret Alien Blaster 9000 is being tested. In a press conference, an EDL general reports that the new design has been tested twice, taking two shots during each test. The results of the test are confidential, so the general won't say how many targets were hit, but they report: "The same number of targets were hit in the two tests, so we have reason to think this new design is consistent."Is this data good or bad; that is, does it increase or decrease your estimate of `x` for the Alien Blaster 9000? Hint: If the probability of hitting each target is $x$, the probability of hitting one target in both tests is $\left[2x(1-x)\right]^2$. ###Code # Solution goes here # Solution goes here # Solution goes here # Solution goes here # Solution goes here # Solution goes here # Solution goes here ###Output _____no_output_____
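###Markdown One way to encode the general's report (a sketch of my own, not the book's official solution): "the same number of targets were hit in the two tests" means 0, 1, or 2 hits in each test, so the likelihood is the sum of the squared per-test probabilities; the hint's $\left[2x(1-x)\right]^2$ is the middle term. ###Code
# Hypotheses: uniform between 0.1 and 0.4, per the problem statement.
hypos = np.linspace(0.1, 0.4, 101)
prior = Pmf(1, hypos)
prior.normalize()

# P(same number of hits in two 2-shot tests) for each hypothetical x:
# P(0 hits)^2 + P(1 hit)^2 + P(2 hits)^2
x = hypos
like = (1 - x)**4 + (2 * x * (1 - x))**2 + x**4

posterior = prior * like
posterior.normalize()
posterior.max_prob()
###Output _____no_output_____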
getting_started/notebooks/2021-06-20-amazon-personalize-workshop-part-4.ipynb
###Markdown Amazon Personalize Workshop Part 4 - Security best practices (optional)> We are using a pre-made dataset that hasn't been encrypted so there is no need to decrypt this dataset. However, it would be a good security practice to store your datasets encrypted.- toc: true- badges: true- comments: true- categories: [amazonpersonalize, movie, security, privacy]- image: Get the Personalize boto3 Client ###Code import boto3 import json import numpy as np import pandas as pd import time personalize = boto3.client('personalize') personalize_runtime = boto3.client('personalize-runtime') iam = boto3.client("iam") s3 = boto3.client("s3") ###Output _____no_output_____ ###Markdown Specify a Bucket and Data Output Location ###Code bucket = "personalize-demo" # replace with the name of your S3 bucket filename = "movie-lens-100k.csv" # replace with a name that you want to save the dataset under ###Output _____no_output_____ ###Markdown Download, Prepare, and Upload Training Data Download and Explore the Dataset ###Code !wget -N http://files.grouplens.org/datasets/movielens/ml-100k.zip !unzip -o ml-100k.zip data = pd.read_csv('./ml-100k/u.data', sep='\t', names=['USER_ID', 'ITEM_ID', 'RATING', 'TIMESTAMP']) pd.set_option('display.max_rows', 5) data ###Output --2020-05-05 09:38:29-- http://files.grouplens.org/datasets/movielens/ml-100k.zip Resolving files.grouplens.org (files.grouplens.org)... 128.101.65.152 Connecting to files.grouplens.org (files.grouplens.org)|128.101.65.152|:80... connected. HTTP request sent, awaiting response... 304 Not Modified File ‘ml-100k.zip’ not modified on server. Omitting download. Archive: ml-100k.zip inflating: ml-100k/allbut.pl inflating: ml-100k/mku.sh inflating: ml-100k/README inflating: ml-100k/u.data inflating: ml-100k/u.genre inflating: ml-100k/u.info inflating: ml-100k/u.item inflating: ml-100k/u.occupation inflating: ml-100k/u.user inflating: ml-100k/u1.base inflating: ml-100k/u1.test inflating: ml-100k/u2.base inflating: ml-100k/u2.test inflating: ml-100k/u3.base inflating: ml-100k/u3.test inflating: ml-100k/u4.base inflating: ml-100k/u4.test inflating: ml-100k/u5.base inflating: ml-100k/u5.test inflating: ml-100k/ua.base inflating: ml-100k/ua.test inflating: ml-100k/ub.base inflating: ml-100k/ub.test ###Markdown Optional security practice: Protect data at rest - Encrypt/decrypt your datasetWe are using a pre-made dataset that hasn't been encrypted so there is no need to decrypt this dataset. However, it would be a good security practice to store your datasets encrypted.For more information on encrypting your data when using S3, visit https://docs.aws.amazon.com/AmazonS3/latest/dev/KMSUsingRESTAPI.html Optional security practice: Protect data in transit - SSL access only for S3 bucket ###Code requires_ssl_access_policy = { "Version": "2012-10-17", "Id": "RequireSSLAccess", "Statement": [ { "Sid": "RequireSSLAccess", "Effect": "Deny", "Principal": "*", "Action": "*", "Resource": [ "arn:aws:s3:::{}".format(bucket), "arn:aws:s3:::{}/*".format(bucket) ], "Condition": { "Bool": { "aws:SecureTransport": "false" } } } ] } s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(requires_ssl_access_policy)) ###Output _____no_output_____ ###Markdown Additional security note:Some users prevent accidental information disclosure by limiting S3 access to only come from a VPC. Another common security practice is to validate this limited access. 
It should be noted that this security check will fail when performed against S3 buckets used for Personalize - as Personalize copies data from the user's S3 into the internal systems used by Personalize (during dataset import jobs). ###Markdown Optional security practice: validate bucket owner matches your account canonical id[More information about canonical ids here](https://docs.aws.amazon.com/general/latest/gr/acct-identifiers.html#FindingCanonicalId) ###Code bucket_owner_id = boto3.client('s3').get_bucket_acl(Bucket=bucket)['Owner']['ID'] print("This bucket belongs to: {} ".format(bucket_owner_id)) ###Output This bucket belongs to: 28398dc6b1acac01a4a73b246b5f9c9a688f50b9ce70240f74c0f90ebf5e2c61 ###Markdown Optional security practice: Protect data integrity - Enable S3 bucket versioning ###Code s3_resource = boto3.resource('s3') bucket_versioning = s3_resource.BucketVersioning(bucket) bucket_versioning.enable() ###Output _____no_output_____ ###Markdown Prepare and Upload Data ###Code data = data[data['RATING'] > 3.6] # keep only movies rated 3.6 and above data = data[['USER_ID', 'ITEM_ID', 'TIMESTAMP']] # select columns that match the columns in the schema below data.to_csv(filename, index=False) boto3.Session().resource('s3').Bucket(bucket).Object(filename).upload_file(filename) ###Output _____no_output_____ ###Markdown Create Schema ###Code schema = { "type": "record", "name": "Interactions", "namespace": "com.amazonaws.personalize.schema", "fields": [ { "name": "USER_ID", "type": "string" }, { "name": "ITEM_ID", "type": "string" }, { "name": "TIMESTAMP", "type": "long" } ], "version": "1.0" } create_schema_response = personalize.create_schema( name = "DEMO-schema", schema = json.dumps(schema) ) schema_arn = create_schema_response['schemaArn'] print(json.dumps(create_schema_response, indent=2)) ###Output { "schemaArn": "arn:aws:personalize:us-west-2:237539672711:schema/DEMO-schema", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "12eb7cba-2b64-4be9-9f6e-eeebff7629a5", "HTTPHeaders": { "date": "Tue, 04 Dec 2018 05:49:04 GMT", "x-amzn-requestid": "12eb7cba-2b64-4be9-9f6e-eeebff7629a5", "content-length": "79", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Optional security practice - Protect data at rest - Encrypt datasets under PersonalizeIf you skip this step, do not pass kmsKeyArn or roleArn when you create your dataset group.
###Code kmsKeyArn = boto3.client('kms').create_key(Description="personalize-data")['KeyMetadata']['Arn'] print(kmsKeyArn) key_accessor_policy_name = "AccessPersonalizeDatasetPolicy" key_accessor_policy = { "Version": "2012-10-17", "Statement": { "Effect": "Allow", "Action": [ "kms:*" ], "Resource": [ kmsKeyArn ] } } key_access_policy = iam.create_policy( PolicyName = key_accessor_policy_name, PolicyDocument = json.dumps(key_accessor_policy) ) key_access_role_name = "AccessPersonalizeDatasetRole" assume_role_policy_document = { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "personalize.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } create_role_response = iam.create_role( RoleName = key_access_role_name, AssumeRolePolicyDocument = json.dumps(assume_role_policy_document) ) iam.attach_role_policy( RoleName = create_role_response["Role"]["RoleName"], PolicyArn = key_access_policy["Policy"]["Arn"] ) keyAccessRoleArn = create_role_response["Role"]["Arn"] print(keyAccessRoleArn) ###Output arn:aws:kms:us-west-2:001513653716:key/f5be82af-a160-4c49-813e-e5448fa95693 arn:aws:iam::001513653716:role/AccessPersonalizeDatasetRole ###Markdown Create and Wait for Encrypted Dataset Group Create Encrypted Dataset GroupIf you did not create a KMS Key and IAM role from the last step, then do not pass in roleArn or kmsKeyArn. ###Code create_dataset_group_response = personalize.create_dataset_group( name = "DEMO-dataset-group", roleArn = keyAccessRoleArn, kmsKeyArn = kmsKeyArn ) dataset_group_arn = create_dataset_group_response['datasetGroupArn'] print(json.dumps(create_dataset_group_response, indent=2)) ###Output _____no_output_____ ###Markdown Wait for Dataset Group to Have ACTIVE Status ###Code max_time = time.time() + 3*60*60 # 3 hours while time.time() < max_time: describe_dataset_group_response = personalize.describe_dataset_group( datasetGroupArn = dataset_group_arn ) status = describe_dataset_group_response["datasetGroup"]["status"] print("DatasetGroup: {}".format(status)) if status == "ACTIVE" or status == "CREATE FAILED": break time.sleep(60) ###Output DatasetGroup: CREATE PENDING DatasetGroup: CREATE FAILED ###Markdown Create Dataset ###Code dataset_type = "INTERACTIONS" create_dataset_response = personalize.create_dataset( name = "DEMO-dataset", datasetType = dataset_type, datasetGroupArn = dataset_group_arn, schemaArn = schema_arn ) dataset_arn = create_dataset_response['datasetArn'] print(json.dumps(create_dataset_response, indent=2)) ###Output { "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "29ab75c8-df6e-4807-943f-1b48014181d1", "HTTPHeaders": { "date": "Tue, 04 Dec 2018 05:50:19 GMT", "x-amzn-requestid": "29ab75c8-df6e-4807-943f-1b48014181d1", "content-length": "101", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } }, "datasetArn": "arn:aws:personalize:us-west-2:237539672711:dataset/DEMO-dataset-group/INTERACTIONS" } ###Markdown Prepare, Create, and Wait for Dataset Import Job Attach Policy to S3 Bucket ###Code policy = { "Version": "2012-10-17", "Id": "PersonalizeS3BucketAccessPolicy", "Statement": [ { "Sid": "PersonalizeS3BucketAccessPolicy", "Effect": "Allow", "Principal": { "Service": "personalize.amazonaws.com" }, "Action": [ "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::{}".format(bucket), "arn:aws:s3:::{}/*".format(bucket) ] } ] } s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy)) ###Output _____no_output_____ ###Markdown Create Personalize Role ###Code 
role_name = "PersonalizeRole" assume_role_policy_document = { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "personalize.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } create_role_response = iam.create_role( RoleName = role_name, AssumeRolePolicyDocument = json.dumps(assume_role_policy_document) ) # AmazonPersonalizeFullAccess provides access to any S3 bucket with a name that includes "personalize" or "Personalize" # if you would like to use a bucket with a different name, please consider creating and attaching a new policy # that provides read access to your bucket or attaching the AmazonS3ReadOnlyAccess policy to the role policy_arn = "arn:aws:iam::aws:policy/service-role/AmazonPersonalizeFullAccess" iam.attach_role_policy( RoleName = role_name, PolicyArn = policy_arn ) time.sleep(60) # wait for a minute to allow IAM role policy attachment to propagate role_arn = create_role_response["Role"]["Arn"] print(role_arn) ###Output arn:aws:iam::660166145966:role/PersonalizeRole2 ###Markdown Create Dataset Import Job ###Code create_dataset_import_job_response = personalize.create_dataset_import_job( jobName = "DEMO-dataset-import-job", datasetArn = dataset_arn, dataSource = { "dataLocation": "s3://{}/{}".format(bucket, filename) }, roleArn = role_arn ) dataset_import_job_arn = create_dataset_import_job_response['datasetImportJobArn'] print(json.dumps(create_dataset_import_job_response, indent=2)) ###Output { "datasetImportJobArn": "arn:aws:personalize:us-west-2:237539672711:dataset-import-job/DEMO-dataset-import-job", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "3c77fe8d-d9fe-4ca5-ad03-b18e937acbb3", "HTTPHeaders": { "date": "Tue, 04 Dec 2018 05:50:55 GMT", "x-amzn-requestid": "3c77fe8d-d9fe-4ca5-ad03-b18e937acbb3", "content-length": "113", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Wait for Dataset Import Job to Have ACTIVE Status ###Code max_time = time.time() + 3*60*60 # 3 hours while time.time() < max_time: describe_dataset_import_job_response = personalize.describe_dataset_import_job( datasetImportJobArn = dataset_import_job_arn ) status = describe_dataset_import_job_response["datasetImportJob"]['status'] print("DatasetImportJob: {}".format(status)) if status == "ACTIVE" or status == "CREATE FAILED": break time.sleep(60) ###Output DatasetImportJob: CREATE PENDING DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: CREATE IN_PROGRESS DatasetImportJob: ACTIVE ###Markdown Select Recipe ###Code list_recipes_response = personalize.list_recipes() recipe_arn = "arn:aws:personalize:::recipe/aws-hrnn" # aws-hrnn selected for demo purposes list_recipes_response ###Output _____no_output_____ ###Markdown Create and Wait for Solution Create Solution ###Code create_solution_response = personalize.create_solution( name = "DEMO-solution", datasetGroupArn = dataset_group_arn, recipeArn = recipe_arn ) solution_arn = create_solution_response['solutionArn'] print(json.dumps(create_solution_response, indent=2)) 
###Output { "solutionArn": "arn:aws:personalize:us-west-2:237539672711:solution/DEMO-solution", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "2042832f-0775-43e2-86de-53a061be1f63", "HTTPHeaders": { "date": "Mon, 03 Dec 2018 23:55:17 GMT", "x-amzn-requestid": "2042832f-0775-43e2-86de-53a061be1f63", "content-length": "83", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Create Solution Version ###Code create_solution_version_response = personalize.create_solution_version( solutionArn = solution_arn ) solution_version_arn = create_solution_version_response['solutionVersionArn'] print(json.dumps(create_solution_version_response, indent=2)) ###Output { "solutionVersionArn": "arn:aws:personalize:us-west-2:237539672711:solution/DEMO-solution/702e0792", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "2042832f-0775-43e2-86de-53a061be1f65", "HTTPHeaders": { "date": "Mon, 03 Dec 2018 23:55:17 GMT", "x-amzn-requestid": "2042832f-0775-43e2-86de-53a061be1f65", "content-length": "90", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Wait for Solution Version to Have ACTIVE Status ###Code max_time = time.time() + 3*60*60 # 3 hours while time.time() < max_time: describe_solution_version_response = personalize.describe_solution_version( solutionVersionArn = solution_version_arn ) status = describe_solution_version_response["solutionVersion"]["status"] print("SolutionVersion: {}".format(status)) if status == "ACTIVE" or status == "CREATE FAILED": break time.sleep(60) ###Output SolutionVersion: CREATE PENDING SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: 
CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: CREATE IN_PROGRESS SolutionVersion: ACTIVE ###Markdown Get Metrics of Solution ###Code get_solution_metrics_response = personalize.get_solution_metrics( solutionVersionArn = solution_version_arn ) print(json.dumps(get_solution_metrics_response, indent=2)) ###Output { "metrics": { "coverage": 0.2603, "mean_reciprocal_rank_at_25": 0.0539, "normalized_discounted_cumulative_gain_at_5": 0.0486, "normalized_discounted_cumulative_gain_at_10": 0.0649, "normalized_discounted_cumulative_gain_at_25": 0.0918, "precision_at_5": 0.0109, "precision_at_10": 0.0098, "precision_at_25": 0.0083, }, "solutionVersionArn": "arn:aws:personalize:us-west-2:237539672711:solution/DEMO-solution/702e0792", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "5b5f4f4f-5249-4c0e-9f83-45e3fe22f09f", "HTTPHeaders": { "date": "Tue, 04 Dec 2018 00:53:54 GMT", "x-amzn-requestid": "5b5f4f4f-5249-4c0e-9f83-45e3fe22f09f", "content-length": "724", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Create and Wait for Campaign Create Campaign ###Code create_campaign_response = personalize.create_campaign( name = "DEMO-campaign", solutionVersionArn = solution_version_arn, minProvisionedTPS = 1 ) campaign_arn = create_campaign_response['campaignArn'] print(json.dumps(create_campaign_response, indent=2)) ###Output { "campaignArn": "arn:aws:personalize:us-west-2:237539672711:campaign/DEMO-campaign", "ResponseMetadata": { "RetryAttempts": 0, "HTTPStatusCode": 200, "RequestId": "527e97ba-683c-4dc7-8218-00716f22c904", "HTTPHeaders": { "date": "Tue, 04 Dec 2018 00:54:17 GMT", "x-amzn-requestid": "527e97ba-683c-4dc7-8218-00716f22c904", "content-length": "83", "content-type": "application/x-amz-json-1.1", "connection": "keep-alive" } } } ###Markdown Wait for Campaign to Have ACTIVE Status ###Code max_time = time.time() + 3*60*60 # 3 hours while time.time() < max_time: describe_campaign_response = personalize.describe_campaign( campaignArn = campaign_arn ) status = describe_campaign_response["campaign"]["status"] print("Campaign: {}".format(status)) if status == "ACTIVE" or status == "CREATE FAILED": break time.sleep(60) ###Output Campaign: CREATE PENDING Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: CREATE IN_PROGRESS Campaign: ACTIVE ###Markdown Get Recommendations Select a User and an Item ###Code items = pd.read_csv('./ml-100k/u.item', sep='|', usecols=[0,1], encoding='latin-1') items.columns = ['ITEM_ID', 'TITLE'] user_id, item_id, _ = data.sample().values[0] item_title = items.loc[items['ITEM_ID'] == item_id].values[0][-1] print("USER: {}".format(user_id)) print("ITEM: {}".format(item_title)) items ###Output USER: 711 ITEM: Silence of the Lambs, The (1991) ###Markdown Call GetRecommendations ###Code get_recommendations_response = personalize_runtime.get_recommendations( campaignArn = campaign_arn, userId = str(user_id), itemId = str(item_id) ) item_list = get_recommendations_response['itemList'] title_list = [items.loc[items['ITEM_ID'] == np.int(item['itemId'])].values[0][-1] for item in item_list] print("Recommendations: {}".format(json.dumps(title_list, indent=2))) ###Output Recommendations: [ "Godfather, The (1972)", "Contact (1997)", "Titanic (1997)", "Star Wars (1977)", "Fargo (1996)", "Liar Liar 
(1997)", "Evita (1996)", "Jerry Maguire (1996)", "Scream (1996)", "Devil's Advocate, The (1997)", "Full Monty, The (1997)", "Conspiracy Theory (1997)", "Edge, The (1997)", "Sense and Sensibility (1995)", "English Patient, The (1996)", "Twelve Monkeys (1995)", "L.A. Confidential (1997)", "As Good As It Gets (1997)", "In & Out (1997)", "Rock, The (1996)", "Return of the Jedi (1983)", "Amistad (1997)", "Men in Black (1997)", "Truth About Cats & Dogs, The (1996)", "Alien: Resurrection (1997)" ]
365 DS/3_6_numpy-fundamentals-solution.ipynb
###Markdown NumPy Fundamentals 1. Run the following cells: ###Code import numpy as np array_1D = np.array([10,11,12,13, 14]) array_1D array_2D = np.array([[20,30,40,50,60], [43,54,65,76,87], [11,22,33,44,55]]) array_2D array_3D = np.array([[[1,2,3,4,5], [11,21,31,41,51]], [[11,12,13,14,15], [51,52,53,54,5]]]) array_3D ###Output _____no_output_____ ###Markdown 2. Display the first element (not necessarily individual element) for each of the 3 arrays we defined above. ###Code array_1D[0] array_2D[0] array_3D[0] ###Output _____no_output_____ ###Markdown 3. Call the first individual element of each of the 3 arrays. ###Code array_1D[0] array_2D[0,0] #array_2D[0][0] array_3D[0,0,0] #array_3D[0][0][0] #array_3D[0,0][0] #array_3D[0][0,0] ###Output _____no_output_____ ###Markdown 4. Use negative indices to display the last element of each array. ###Code array_1D[-1] array_2D[-1] array_3D[-1] ###Output _____no_output_____ ###Markdown 5. Set the penultimate (second-to-last) individual element of each array equal to 10. Then display the variables to check your work. (Hint: If it's the penultimate individual element of the 2-D/3-D array, it needs to be in the last row of the 2-D array) ###Code array_1D[-2] = 10 array_1D array_2D[-1,-2] = 10 array_2D array_3D[-1,-1,-2] = 10 array_3D ###Output _____no_output_____ ###Markdown 6. Set the last column of the 2-D array to 100. Then display the variables to check your work. ###Code array_2D[:,-1] = 100 array_2D ###Output _____no_output_____ ###Markdown 7. Set all the values of the 3-D array to 1000. Then display the variables to check your work. ###Code array_3D[:] = 1000 array_3D ###Output _____no_output_____ ###Markdown 8. Run the next 3 cells to re-define the 3 arrays, since we altered their contents. ###Code array_1D = np.array([10,11,12,13,14]) array_1D array_2D = np.array([[20,30,40,50,60], [43,54,65,76,87], [11,22,33,44,55]]) array_2D array_3D = np.array([[[1,2,3,4,5], [11,21,31,41,51]], [[11,12,13,14,15], [51,52,53,54,5]]]) array_3D ###Output _____no_output_____ ###Markdown 9. Add 2 to every element of the 3 arrays without overwriting their values. ###Code array_1D + 2 array_2D + 2 array_3D + 2 ###Output _____no_output_____ ###Markdown 10. Multiply the values of each array by 100 without overwriting their values. ###Code array_1D * 100 array_2D * 100 array_3D * 100 ###Output _____no_output_____ ###Markdown 11. Add up array_1D and the first row of array_2D ###Code array_1D + array_2D[0] ###Output _____no_output_____ ###Markdown 12. Find the product of array_1D and the first row of array_2D ###Code array_1D * array_2D[0] ###Output _____no_output_____ ###Markdown 13. Find the product of the first row of array_2D and the first row of the first array of array_3D ###Code array_2D[0] * array_3D[0, 0] ###Output _____no_output_____ ###Markdown 14. Subtract array_1D from the first row of array_2D ###Code array_2D[0] - array_1D ###Output _____no_output_____ ###Markdown 15. Subtract the first row of array_2D from array_1D ###Code array_1D - array_2D[0] ###Output _____no_output_____ ###Markdown 16.
Alter the code in the next 3 cells to re-define the 3 arrays as the following datatypes: A) array_1D -> NumPy Strings B) array_2D -> Complex Numbers C) array_3D -> 64-bit Floats (Hint: The datatypes are the following: np.str, np.complex, np.float64. Note: in newer NumPy versions `np.str` and `np.complex` are deprecated aliases; plain `str` and `complex` work instead.) ###Code array_1D = np.array([10,11,12,13,14], dtype = np.str) array_1D array_2D = np.array([[20,30,40,50,60], [43,54,65,76,87], [11,22,33,44,55]], dtype = np.complex) array_2D array_3D = np.array([[[1,2,3,4,5], [11,21,31,41,51]], [[11,12,13,14,15], [51,52,53,54,5]]], dtype = np.float64) array_3D ###Output _____no_output_____ ###Markdown 17. Now run the next 3 cells to re-define the 3 arrays as 32-bit floats, since we want to run some more computations on them ###Code array_1D = np.array([10,11,12,13,14], dtype = np.float32) array_1D array_2D = np.array([[20,30,40,50,60], [43,54,65,76,87], [11,22,33,44,55]], dtype = np.float32) array_2D array_3D = np.array([[[1,2,3,4,5], [11,21,31,41,51]], [[11,12,13,14,15], [51,52,53,54,5]]], dtype = np.float32) array_3D ###Output _____no_output_____ ###Markdown 18. Use broadcasting to subtract array_1D from every row of array_2D (Hint: You **don't** need a function to do this.) ###Code array_2D - array_1D ###Output _____no_output_____ ###Markdown 19. Use broadcasting to divide all rows of array_3D by array_1D ###Code array_3D / array_1D ###Output _____no_output_____ ###Markdown 20. Use broadcasting to find the product of all rows of array_3D and the last row of array_2D ###Code array_3D * array_2D[-1] ###Output _____no_output_____ ###Markdown 21. Since these products are all integers, let's cast them as such (Hint: You can use the np.array() function here) ###Code np.array(array_3D * array_2D[-1], dtype = np.int32) ###Output _____no_output_____ ###Markdown 22. Let's use the axis argument of the np.mean() function to find the mean for every column of the 2-D array ###Code np.mean(array_2D, axis = 0) ###Output _____no_output_____ ###Markdown 23. Let's use the axis argument of the np.mean() function to find the mean for every column of the 3-D array (Hint: Make sure you define the correct axis.) ###Code np.mean(array_3D, axis = 1) ###Output _____no_output_____
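###Markdown As a quick illustration of the `axis` argument (my own addition, not part of the original exercises): `axis=0` aggregates down the rows, giving one result per column, while `axis=1` aggregates across the columns, giving one result per row. ###Code
# Compare column means vs row means on the 2-D array.
print(np.mean(array_2D, axis = 0)) # one mean per column
print(np.mean(array_2D, axis = 1)) # one mean per row
###Output _____no_output_____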
notebooks/01_Data_Fetching/01 - DEMs for Training Tiles.ipynb
###Markdown Fetching Elevation DataWe have prepared shapefiles containing the USGS quarter quadrangles that have good coverage of forest stand delineations that we want to grab other data for. We'll fetch elevation data (a Digital Elevation Model) from The National Map for each tile, create additional derivative products, and write our outputs as GeoTiffs to Google Drive. DEMs will be retrieved from a webservice hosted by The National Map using elevation data produced by the US Geological Survey's [3D Elevation Program](https://www.usgs.gov/core-science-systems/ngp/3dep/what-is-3dep?). In subsequent processing, we may generate other terrain-derived layers (e.g., slope, aspect, curvature) from these DEMs. For now, we'll just grab the raw DEM and generate a couple layers quantifying Topographic Position Index (TPI). We add TPI layers here because, as described below, the calculation of TPI involves a convolution that requires elevation data that extends beyond the footprint of the tile we will ultimately retain and export to Google Drive. Topographic Position Index (TPI)TPI characterizes the elevation of a point on the landscape relative to its surroundings. It is calculated as a convolution of a DEM where the size of the kernel used for the convolution can be adjusted to capture local-scale to regional-scale topographic features. Illustration of TPI (credit: Jeff Jenness)In this notebook, we follow the original description of TPI by [Weiss (2001)](http://www.jennessent.com/downloads/tpi-poster-tnc_18x22.pdf) by using an annular (donut-shaped) kernel that subtracts the average elevation of pixels in the donut from the elevation at the single pixel in the center of the donut hole. We implement TPI calculations at a range of 300m (annulus runs from 150-300m from the center pixel) and for 2000m (annulus runs 1850-2000m). Mount Google Drive So we can access our files showing tile locations, and save the rasters we will generate from the elevation data. ###Code from google.colab import drive drive.mount('/content/drive', force_remount=True) ! pip install geopandas rasterio -q ###Output _____no_output_____ ###Markdown The following functions will do the work to retrieve the DEM (or calculate a TPI raster from a DEM) from The National Map's web service. ###Code import io import numpy as np import geopandas as gpd # import richdem as rd import os import rasterio import requests from functools import partial from imageio import imread from matplotlib import pyplot as plt from multiprocessing.pool import ThreadPool from rasterio import transform from scipy.ndimage.filters import convolve from skimage import filters from skimage.morphology import disk from skimage.transform import resize from skimage.util import apply_parallel def dem_from_tnm(bbox, res, inSR=4326, **kwargs): """ Retrieves a Digital Elevation Model (DEM) image from The National Map (TNM) web service. Parameters ---------- bbox : list-like list of bounding box coordinates (minx, miny, maxx, maxy) res : numeric spatial resolution to use for returned DEM (grid cell size) inSR : int spatial reference for bounding box, such as an EPSG code (e.g., 4326) Returns ------- dem : numpy array DEM image as array """ width = int(abs(bbox[2] - bbox[0]) // res) height = int(abs(bbox[3] - bbox[1]) // res) BASE_URL = ''.join([ 'https://elevation.nationalmap.gov/arcgis/rest/', 'services/3DEPElevation/ImageServer/exportImage?' 
]) params = dict(bbox=','.join([str(x) for x in bbox]), bboxSR=inSR, size=f'{width},{height}', imageSR=inSR, time=None, format='tiff', pixelType='F32', noData=None, noDataInterpretation='esriNoDataMatchAny', interpolation='+RSP_BilinearInterpolation', compression=None, compressionQuality=None, bandIds=None, mosaicRule=None, renderingRule=None, f='image') for key, value in kwargs.items(): params.update({key: value}) r = requests.get(BASE_URL, params=params) dem = imread(io.BytesIO(r.content)) return dem def quad_fetch(fetcher, bbox, num_threads=4, qq=False, *args, **kwargs): """Breaks user-provided bounding box into quadrants and retrieves data using `fetcher` for each quadrant in parallel using a ThreadPool. Parameters ---------- fetcher : callable data-fetching function, expected to return an array-like object bbox : 4-tuple or list coordinates of x_min, y_min, x_max, and y_max for bounding box of tile num_threads : int number of threads to use for parallel executing of data requests qq : bool whether or not to execute request for quarter quads, which executes this function recursively for each quadrant *args additional positional arguments that will be passed to `fetcher` **kwargs additional keyword arguments that will be passed to `fetcher` Returns ------- quad_img : array image returned with quads stitched together into a single array """ bboxes = split_quad(bbox) if qq: nw = quad_fetch(fetcher, bbox=bboxes[0], *args, **kwargs) ne = quad_fetch(fetcher, bbox=bboxes[1], *args, **kwargs) sw = quad_fetch(fetcher, bbox=bboxes[2], *args, **kwargs) se = quad_fetch(fetcher, bbox=bboxes[3], *args, **kwargs) else: get_quads = partial(fetcher, *args, **kwargs) with ThreadPool(num_threads) as p: quads = p.map(get_quads, bboxes) nw, ne, sw, se = quads quad_img = np.vstack([np.hstack([nw, ne]), np.hstack([sw, se])]) return quad_img def split_quad(bbox): """Splits a bounding box into four quadrants and returns their bounds. Parmeters --------- bbox : 4-tuple or list coordinates of x_min, y_min, x_max, and y_max for bounding box of tile Returns ------- quads : list coordinates of x_min, y_min, x_max, and y_max for each quadrant, in order of nw, ne, sw, se """ xmin, ymin, xmax, ymax = bbox nw_bbox = [xmin, (ymin + ymax) / 2, (xmin + xmax) / 2, ymax] ne_bbox = [(xmin + xmax) / 2, (ymin + ymax) / 2, xmax, ymax] sw_bbox = [xmin, ymin, (xmin + xmax) / 2, (ymin + ymax) / 2] se_bbox = [(xmin + xmax) / 2, ymin, xmax, (ymin + ymax) / 2] quads = [nw_bbox, ne_bbox, sw_bbox, se_bbox] return quads def tpi_from_tnm(bbox, irad, orad, dem_resolution, smooth_highres_dem=True, tpi_resolution=30, parallel=True, norm=True, fixed_mean=None, fixed_std=None, **kwargs): """ Produces a raster of Topographic Position Index (TPI) by fetching a Digital Elevation Model (DEM) from The National Map (TNM) web service. TPI is the difference between the elevation at a location from the average elevation of its surroundings, calculated using an annulus (ring). This function permits the calculation of average surrounding elevation using a coarser grain, and return the TPI user a higher-resolution DEM. 
Parameters ---------- bbox : list-like list of bounding box coordinates (minx, miny, maxx, maxy) irad : numeric inner radius of annulus used to calculate TPI orad : numeric outer radius of annulus used to calculate TPI dem_resolution : numeric spatial resolution of Digital Elevation Model (DEM) tpi_resolution : numeric spatial resolution of DEM used to calculate TPI norm : bool whether to return a normalized version of TPI, with mean = 0 and SD = 1 fixed_mean : numeric mean value to use to normalize data, useful to to set as a constant when processing adjacent tiles to avoid stitching/edge effects fixed_std : numeric standard deviation value to use to normalize data, useful to to set as a constant when processing adjacent tiles to avoid stitching/edge effects Returns ------- tpi : array TPI image as array """ tpi_bbox = np.array(bbox) tpi_bbox[0:2] = tpi_bbox[0:2] - orad tpi_bbox[2:4] = tpi_bbox[2:4] + orad k_orad = orad // tpi_resolution k_irad = irad // tpi_resolution kernel = disk(k_orad) - np.pad(disk(k_irad), pad_width=(k_orad - k_irad)) weights = kernel / kernel.sum() if dem_resolution != tpi_resolution: dem = dem_from_tnm(bbox, dem_resolution, **kwargs) if dem_resolution < 3 and smooth_highres_dem: dem = filters.gaussian(dem, 3) dem = np.pad(dem, orad // dem_resolution) tpi_dem = dem_from_tnm(tpi_bbox, tpi_resolution, **kwargs) else: tpi_dem = dem_from_tnm(tpi_bbox, tpi_resolution, **kwargs) dem = tpi_dem if parallel: def conv(tpi_dem): return convolve(tpi_dem, weights) convolved = apply_parallel(conv, tpi_dem, compute=True, depth=k_orad) if tpi_resolution != dem_resolution: tpi = dem - resize(convolved, dem.shape) else: tpi = dem - convolved else: if tpi_resolution != dem_resolution: tpi = dem - resize(convolve(tpi_dem, weights), dem.shape) else: tpi = dem - convolve(tpi_dem, weights) # trim the padding around the dem used to calculate TPI tpi = tpi[orad // dem_resolution:-orad // dem_resolution, orad // dem_resolution:-orad // dem_resolution] if norm: if fixed_mean is not None and fixed_std is not None: tpi_mean = fixed_mean tpi_std = fixed_std else: tpi_mean = (tpi_dem - convolved).mean() tpi_std = (tpi_dem - convolved).std() tpi = (tpi - tpi_mean) / tpi_std return tpi ###Output _____no_output_____ ###Markdown Download Data for Training TilesHere is where are shapefiles of USGS Quarter Quads live: ###Code WORK_DIR = '/content/drive/Shared drives/stand_mapping/data/processed/training_tiles/' OR_QUADS = ['oregon_utm10n_training_quads_epsg6339.shp', 'oregon_utm11n_training_quads_epsg6340.shp'] WA_QUADS = ['washington_utm10n_training_quads_epsg6339.shp', 'washington_utm11n_training_quads_epsg6340.shp'] ###Output _____no_output_____ ###Markdown These functions will loop through a GeoDataFrame, fetch the relevant data, and write GeoTiffs to disk in the appropriate formats. 
###Code def fetch_dems(path_to_tiles, out_dir, overwrite=False): gdf = gpd.read_file(path_to_tiles) epsg = gdf.crs.to_epsg() print('Fetching DEMs for {:,d} tiles'.format(len(gdf))) PROFILE = { 'driver': 'GTiff', 'interleave': 'band', 'tiled': True, 'blockxsize': 256, 'blockysize': 256, 'compress': 'lzw', 'nodata': -9999, 'dtype': rasterio.float32, 'count': 1, } ## loop through all the geometries in the geodataframe and fetch the DEM for idx, row in gdf.iterrows(): xmin, ymin, xmax, ymax = row['geometry'].bounds xmin, ymin = np.floor((xmin, ymin)) xmax, ymax = np.ceil((xmax, ymax)) width, height = xmax-xmin, ymax-ymin trf = transform.from_bounds(xmin, ymin, xmax, ymax, width, height) ## don't bother fetching data if we already have processed this tile outname = f'{row.CELL_ID}_dem.tif' outfile = os.path.join(out_dir, outname) if os.path.exists(outfile) and not overwrite: if idx % 100 == 0: print() if idx % 10 == 0: print(idx, end='') else: print('.', end='') continue dem = quad_fetch(dem_from_tnm, bbox=[xmin, ymin, xmax, ymax], qq=True, res=1, inSR=epsg, noData=-9999) ## apply a smoothing filter to mitigate stitching/edge artifacts dem = filters.gaussian(dem, 3) ## write the data to disk PROFILE.update(width=width, height=height) with rasterio.open(outfile, 'w', **PROFILE, crs=epsg, transform=trf) as dst: dst.write(dem.astype(rasterio.float32), 1) dst.set_band_unit(1, 'meters') dst.set_band_description(1, 'DEM retrieved from The National Map') ## report progress if idx % 100 == 0: print() if idx % 10 == 0: print(idx, end='') else: print('.', end='') def fetch_tpis(path_to_tiles, out_dir, overwrite=False): gdf = gpd.read_file(path_to_tiles) epsg = gdf.crs.to_epsg() print('Fetching TPIs for {:,d} tiles'.format(len(gdf))) PROFILE = { 'driver': 'GTiff', 'interleave': 'band', 'tiled': True, 'blockxsize': 256, 'blockysize': 256, 'compress': 'lzw', 'nodata': -9999, 'dtype': rasterio.float32, 'count': 1, } ## loop through all the geometries in the geodataframe and fetch the DEM for idx, row in gdf.iterrows(): xmin, ymin, xmax, ymax = row['geometry'].bounds xmin, ymin = np.floor((xmin, ymin)) xmax, ymax = np.ceil((xmax, ymax)) width, height = xmax-xmin, ymax-ymin trf = transform.from_bounds(xmin, ymin, xmax, ymax, width, height) ## don't bother fetching data if we already have processed this tile outname300 = f'{row.CELL_ID}_tpi300.tif' outname2000 = f'{row.CELL_ID}_tpi2000.tif' outfile300 = os.path.join(out_dir, outname300) outfile2000 = os.path.join(out_dir, outname2000) if os.path.exists(outfile300) and os.path.exists(outfile2000) and not overwrite: if idx % 100 == 0: print() if idx % 10 == 0: print(idx, end='') else: print('.', end='') continue tpi300 = quad_fetch(tpi_from_tnm, bbox=[xmin, ymin, xmax, ymax], qq=True, irad=150, orad=300, dem_resolution=1, norm=False, inSR=epsg, noData=-9999) tpi2000 = quad_fetch(tpi_from_tnm, bbox=[xmin, ymin, xmax, ymax], qq=True, irad=1850, orad=2000, dem_resolution=1, norm=False, inSR=epsg, noData=-9999) ## write the data to disk PROFILE.update(width=width, height=height, crs=epsg, transform=trf) DESC300 = ''.join(['Topographic Position Index for 150-300 meter', 'annulus calculated from DEM retrieved from The', 'National Map']) DESC2000 = ''.join(['Topographic Position Index for 1850-2000 meter', 'annulus calculated from DEM retrieved from The', 'National Map']) with rasterio.open(outfile300, 'w', **PROFILE) as dst: dst.write(tpi300.astype(rasterio.float32), 1) dst.set_band_description(1, DESC300) with rasterio.open(outfile2000, 'w', **PROFILE) as dst: 
dst.write(tpi2000.astype(rasterio.float32), 1) dst.set_band_description(1, DESC2000) ## report progress if idx % 100 == 0: print() if idx % 10 == 0: print(idx, end='') else: print('.', end='') ###Output _____no_output_____ ###Markdown Fetch Digital Elevation Models for each tile ###Code fetch_dems(path_to_tiles=os.path.join(WORK_DIR, WA_QUADS[0]), out_dir=os.path.join(WORK_DIR, 'wa_training_tiles'), overwrite=False) fetch_dems(path_to_tiles=os.path.join(WORK_DIR, WA_QUADS[1]), out_dir=os.path.join(WORK_DIR, 'wa_training_tiles'), overwrite=False) fetch_dems(path_to_tiles=os.path.join(WORK_DIR, OR_QUADS[0]), out_dir=os.path.join(WORK_DIR, 'or_training_tiles'), overwrite=False) fetch_dems(path_to_tiles=os.path.join(WORK_DIR, OR_QUADS[1]), out_dir=os.path.join(WORK_DIR, 'or_training_tiles'), overwrite=False) ###Output Fetching DEMs for 524 tiles 0.........10.........20.........30.........40.........50.........60.........70.........80.........90......... 100.........110.........120.........130.........140.........150.........160.........170.........180.........190......... 200.........210.........220.........230.........240.........250.........260.........270.........280.........290......... 300.........310.........320.........330.........340.........350.........360.........370.........380.........390......... 400.........410.........420.........430.........440.........450.........460.........470.........480.........490......... 500.........510.........520... ###Markdown Fetch Topographic Position Index for each tile ###Code fetch_tpis(path_to_tiles=os.path.join(WORK_DIR, WA_QUADS[0]), out_dir=os.path.join(WORK_DIR, 'wa_training_tiles'), overwrite=False) fetch_tpis(path_to_tiles=os.path.join(WORK_DIR, WA_QUADS[1]), out_dir=os.path.join(WORK_DIR, 'wa_training_tiles'), overwrite=False) fetch_tpis(path_to_tiles=os.path.join(WORK_DIR, OR_QUADS[0]), out_dir=os.path.join(WORK_DIR, 'or_training_tiles'), overwrite=False) fetch_tpis(path_to_tiles=os.path.join(WORK_DIR, OR_QUADS[1]), out_dir=os.path.join(WORK_DIR, 'or_training_tiles'), overwrite=False) ###Output Fetching TPIs for 524 tiles 0.........10.........20.........30.........40.........50.........60.........70.........80.........90......... 100.........110.........120.........130.........140.........150.........160.........170.........180.........190......... 200.........210.........220.........230.........240.........250.........260.........270.........280.........290......... 300.........310.........320.........330.........340.........350.........360.........370.........380.........390......... 400.........410.........420.........430.........440.........450.........460.........470.........480.........490......... 500.........510.........520...
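###Markdown To make the annular TPI kernel concrete, here is a minimal standalone sketch of the same construction used inside `tpi_from_tnm` (the outer disk minus a zero-padded inner disk, normalized so the convolution returns a mean), shown for the 150-300 m annulus at the 30 m TPI resolution: ###Code
import numpy as np
from skimage.morphology import disk

tpi_resolution = 30               # grid cell size used for the TPI convolution
k_irad = 150 // tpi_resolution    # inner annulus radius, in pixels
k_orad = 300 // tpi_resolution    # outer annulus radius, in pixels

# annulus = big disk minus small disk padded out to the same shape
kernel = disk(k_orad) - np.pad(disk(k_irad), pad_width=(k_orad - k_irad))
weights = kernel / kernel.sum()   # normalize so convolving yields an average
print(kernel.shape)               # (2*k_orad + 1, 2*k_orad + 1)
###Output _____no_output_____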
general_assembly/07_evaluating_model_fit/starter-code-7.ipynb
###Markdown Guided Practice/DemoThe following code samples are provided directly from the lesson and should serve as a jumping off point for students to run the code on their own. ###Code import numpy as np import pandas as pd from sklearn import linear_model, metrics import math from matplotlib import pyplot as plt %matplotlib inline df = pd.DataFrame({'x': range(100), 'y': range(100)}) biased_df = df.copy() biased_df.loc[:20, 'x'] = 1 biased_df.loc[:20, 'y'] = 1 def return_jitter(): return np.random.random_sample(size=100) Random = pd.DataFrame({'x':return_jitter(),'y':return_jitter()}) df['x'] = df.x + Random.x df['y'] = df.y + Random.y biased_df['x'] = biased_df.x + Random.x biased_df['y'] = biased_df.y + Random.y ## fit lm1 = linear_model.LinearRegression().fit(df[['x']], df['y']) print('unbiased: ',metrics.mean_squared_error(df['y'], lm1.predict(df[['x']]))) #MSE of actual Y and predicted Y ## biased fit lm2 = linear_model.LinearRegression().fit(biased_df[['x']], biased_df['y']) print('biased: ',metrics.mean_squared_error(df['y'], lm2.predict(df[['x']]))) biased_df['predicted_y'] = lm2.predict(df[['x']]) df['predicted_y'] = lm1.predict(df[['x']]) biased_df[['y','predicted_y']].plot(title='biased') df[['y','predicted_y']].plot(title='unbiased') from sklearn import cross_validation bikeshare = pd.read_csv('bikeshare.csv') weather = pd.get_dummies(bikeshare.weathersit, prefix='weather') modeldata = bikeshare[['temp', 'hum']].join(weather[['weather_1', 'weather_2', 'weather_3']]) #these join on index y = bikeshare.casual kf = cross_validation.KFold(len(modeldata), n_folds=5, shuffle=True) #len(modeldata) is number of observations #n_folds is number of subsets you want scores = [] for train_index, test_index in kf: lm = linear_model.LinearRegression().fit(modeldata.iloc[train_index], y.iloc[train_index]) scores.append(metrics.mean_squared_error(y.iloc[test_index], lm.predict(modeldata.iloc[test_index]))) print(np.mean(scores)) # this score will be lower, but we're trading off bias error for generalized error lm = linear_model.LinearRegression().fit(modeldata, y) print(metrics.mean_squared_error(y, lm.predict(modeldata))) ###Output 1673.22054346 1672.58110765 ###Markdown Cross-validation is a technique to evaluate predictive models by partitioning the original sample into a training set to train the model, and a test set to evaluate it. In k-fold cross-validation, the original sample is randomly partitioned into k equal size subsamples. 
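###Markdown Before sweeping over k below, here is a minimal sketch of what one split looks like. Note that this notebook uses the old `sklearn.cross_validation` module; in modern scikit-learn the same functionality lives in `sklearn.model_selection`, which is what this sketch assumes: ###Code
import numpy as np
from sklearn.model_selection import KFold

toy = np.arange(10)
kf_demo = KFold(n_splits=5, shuffle=True, random_state=0)
for train_index, test_index in kf_demo.split(toy):
    # each of the 5 folds holds out 2 of the 10 samples for testing
    print('train:', train_index, 'test:', test_index)
###Output _____no_output_____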
###Code k = range(2,51,2) final_scores = [] for i in k: kf = cross_validation.KFold(len(modeldata), n_folds=i) scores = [] for train_index, test_index in kf: lm = linear_model.LinearRegression().fit(modeldata.iloc[train_index], y.iloc[train_index]) scores.append(metrics.mean_squared_error(y.iloc[test_index], lm.predict(modeldata.iloc[test_index]))) print(np.mean(scores)) final_scores.append(np.mean(scores)) df_plot = pd.DataFrame({'k':k,'final_scores':final_scores}) plt.plot(df_plot['k'],df_plot['final_scores']) df_plot.plot() kf = cross_validation.KFold(len(modeldata), n_folds=5) scores = [] for train_index, test_index in kf: lm = linear_model.LinearRegression().fit(modeldata.iloc[train_index], y.iloc[train_index]) scores.append(metrics.mean_squared_error(y.iloc[test_index], lm.predict(modeldata.iloc[test_index]))) print (np.mean(scores)) lm = linear_model.LinearRegression().fit(modeldata, y) print (metrics.mean_squared_error(y, lm.predict(modeldata))) lm = linear_model.Lasso().fit(modeldata, y) print (metrics.mean_squared_error(y, lm.predict(modeldata))) lm = linear_model.Ridge().fit(modeldata, y) print (metrics.mean_squared_error(y, lm.predict(modeldata))) # Lasso and Ridge are trying to generalise the model so the error is higher alphas = np.logspace(-10, 10, 21) # 21 numbers from 10^-10 to 10^10 print(alphas) results=[] for a in alphas: print(format(a,'.65f')) #change from scientific notation to normal for a in alphas: print('\nAlpha:', a) lm = linear_model.Ridge(alpha=a) lm.fit(modeldata, y) print('coef: ',lm.coef_) print('MSE: ',metrics.mean_squared_error(y, lm.predict(modeldata))) results.append(metrics.mean_squared_error(y, lm.predict(modeldata))) plt.scatter(x=alphas,y=results) plt.xlim(10**(-10),10**10) plt.xscale('log') plt.xlabel('alpha') plt.ylabel('MSE') plt.title('Alpha vs MSE') from sklearn import grid_search #grid_search goes through all the parameters and chooses the best one for the model alphas = np.logspace(-10, 10, 21) gs = grid_search.GridSearchCV( estimator=linear_model.Ridge(fit_intercept=True), #fit_intercept=False param_grid={'alpha': alphas}, scoring='neg_mean_squared_error', cv=10) # by default, uses 3-fold cross validation gs.fit(modeldata, y) print ('\n',-gs.best_score_) # mean squared error here comes in negative, so let's make it positive. print ('\n',gs.best_estimator_) # explains which grid_search setup worked best for i in np.arange(len(gs.grid_scores_)): print ('\n',gs.grid_scores_[i]) # shows all the grid pairings and their performances. # without fitting intercept alphas = np.logspace(-10, 10, 21) gs = grid_search.GridSearchCV( estimator=linear_model.Ridge(fit_intercept=False), param_grid={'alpha': alphas}, scoring='neg_mean_squared_error') # by default, uses 3-fold cross validation gs.fit(modeldata, y) print ('\n',-gs.best_score_) # mean squared error here comes in negative, so let's make it positive. print ('\n',gs.best_estimator_) # explains which grid_search setup worked best for i in np.arange(len(gs.grid_scores_)): print ('\n',gs.grid_scores_[i]) # shows all the grid pairings and their performances. num_to_approach, start, steps, optimized = 6.2, 0., [-1, 1], False while not optimized: current_distance = num_to_approach - start got_better = False next_steps = [start + i for i in steps] for n in next_steps: distance = np.abs(num_to_approach - n) if distance < current_distance: got_better = True print (distance, 'is better than', current_distance) current_distance = distance start = n if got_better: print ('found better solution! 
using', current_distance) else: optimized = True print (start, 'is closest to', num_to_approach) lm = linear_model.SGDRegressor() # Stochastic Gradient Descent lm.fit(modeldata, y) print (lm.score(modeldata, y)) # lm.score is the R2 value print (metrics.mean_squared_error(y, lm.predict(modeldata))) ###Output 0.308471246783 1681.00000101 ###Markdown Independent Practice Use the following code to work through the problems given. 1. With a set of alpha values between 10^-10 and 10^-1, how does the mean squared error change? 2. We know when to properly use l1 vs l2 regularization based on the data. By using a grid search with l1_ratios between 0 and 1 (increasing every 0.05), does that statement hold true? * (if it didn't look like it, did gradient descent have enough iterations?) 3. How do results change when you alter the learning rate (power_t)? (The learning rate controls how large a step each update takes toward the minimum.) ###Code alphas = np.logspace(-10, -1, 10) print(alphas) params = {'alpha':alphas, } # put your gradient descent parameters here gs = grid_search.GridSearchCV( estimator=linear_model.SGDRegressor(), cv=cross_validation.KFold(len(modeldata), n_folds=5, shuffle=True), param_grid=params, scoring='neg_mean_squared_error', ) gs.fit(modeldata, y) grid = pd.DataFrame(gs.grid_scores_) grid['mean_validation_score'] = grid['mean_validation_score'].apply(lambda x: -x) grid.columns = ['alpha', 'mean_squared_error', 'cv'] grid grid.plot('alpha','mean_squared_error',logx=True) #reduce variance and bias as much as possible so that the model is generalised for new data and not just the #training set i.e. make sure it is not overfitted l1_2_ratios = [float(i) / 100 for i in range(0, 101, 5)] print(l1_2_ratios) params = {'l1_ratio':l1_2_ratios, 'penalty': ['elasticnet'], 'alpha': [.1], 'n_iter': [50]} gs = grid_search.GridSearchCV( estimator=linear_model.SGDRegressor(), cv=cross_validation.KFold(len(modeldata), n_folds=5, shuffle=True), param_grid=params, scoring='neg_mean_squared_error', ) gs.fit(modeldata, y) grid = pd.DataFrame(gs.grid_scores_) grid['mean_validation_score'] = grid['mean_validation_score'].apply(lambda x: -x) grid.columns = ['l1_ratio', 'neg_mean_squared_error', 'cv'] grid gs.best_estimator_ grid.plot('l1_ratio', 'neg_mean_squared_error',rot=30) learning = list(range(1,50)) print(learning) params = {'eta0':list(range(1,50)), 'n_iter': [50]} gs = grid_search.GridSearchCV( estimator=linear_model.SGDRegressor(), cv=cross_validation.KFold(len(modeldata), n_folds=5, shuffle=True), param_grid=params, scoring='neg_mean_squared_error', ) gs.fit(modeldata, y) grid = pd.DataFrame(gs.grid_scores_) grid.columns = ['eta0', 'neg_mean_squared_error', 'cv'] grid['neg_mean_squared_error'] = grid['neg_mean_squared_error'].apply(lambda x: -x) grid grid.plot('eta0', 'neg_mean_squared_error', logy=True, rot=30) ###Output _____no_output_____
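###Markdown For reference on question 2: with `penalty='elasticnet'`, `l1_ratio` mixes the two penalties, so l1_ratio=0 reduces to Ridge (L2) and l1_ratio=1 to Lasso (L1). A rough sketch of the penalty term (the exact scaling constants vary slightly between scikit-learn estimators, so treat this as illustrative): ###Code
import numpy as np

def elasticnet_penalty(w, alpha, l1_ratio):
    # convex combination of an L1 term and an L2 term
    l1 = np.sum(np.abs(w))
    l2 = 0.5 * np.sum(w ** 2)
    return alpha * (l1_ratio * l1 + (1 - l1_ratio) * l2)

w_demo = np.array([0.5, -2.0, 0.0])  # illustrative weight vector
for r in [0.0, 0.5, 1.0]:
    print(r, elasticnet_penalty(w_demo, alpha=0.1, l1_ratio=r))
###Output _____no_output_____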
sgd_classifier_with_regularization.ipynb
###Markdown SGD Classifier with Regularizaion Regularisation There are two elementary regularisers, the L1 and L2 regularisation, also known as Lasso and Ridge penalty, respectively. ###Code import numpy as np import struct import gzip import pandas as pd import sklearn import sklearn.ensemble import sklearn.neural_network import sklearn.decomposition import sklearn.pipeline# adjust settings to plot nice figures inline %matplotlib inline import matplotlib import matplotlib.pyplot as plt from sklearn.linear_model import SGDClassifier import warnings warnings.filterwarnings('ignore') plt.rcParams['axes.labelsize'] = 14 plt.rcParams['xtick.labelsize'] = 12 plt.rcParams['ytick.labelsize'] = 12 # data directory data_dir = 'mnist' import torchvision.datasets as dset # train data train_set = dset.MNIST(root=data_dir, train=True, download=True) train_x_mnist = np.array(train_set.train_data) train_y_mnist = np.array(train_set.train_labels) # test data test_set = dset.MNIST(root=data_dir, train=False, download=True) test_x_mnist = np.array(test_set.test_data) test_y_mnist = np.array(test_set.test_labels) ############################################################################ # Extract sample digits ############################################################################ def sample_data_digits(data, labels, labels_to_select): # convert input 3d arrays to 2d arrays nsamples, nx, ny = data.shape data_vec = np.reshape(data,(nsamples,nx*ny)) selected_indexes = np.isin(labels, labels_to_select) selected_data = data_vec[selected_indexes] selected_labels = labels[selected_indexes] # Convert images from gray to binary by thresholding intensity values selected_data = 1.0 * (selected_data >= 128) # convert labels to binary: digit_1=False, digit_2=True selected_labels = selected_labels==labels_to_select[1] # shuffle data shuffle_index = np.random.permutation(len(selected_labels)) selected_data, selected_labels = selected_data[shuffle_index], selected_labels[shuffle_index] return selected_data, selected_labels from sklearn.manifold.t_sne import TSNE from sklearn.neighbors.classification import KNeighborsClassifier def plot_decision_boundary(model,X,y): Y_pred=model.predict(X) X_Train_embedded = TSNE(n_components=2).fit_transform(X) # create meshgrid resolution = 100 # 100x100 background pixels X2d_xmin, X2d_xmax = np.min(X_Train_embedded[:,0]), np.max(X_Train_embedded[:,0]) X2d_ymin, X2d_ymax = np.min(X_Train_embedded[:,1]), np.max(X_Train_embedded[:,1]) xx, yy = np.meshgrid(np.linspace(X2d_xmin, X2d_xmax, resolution), np.linspace(X2d_ymin, X2d_ymax, resolution)) # approximate Voronoi tesselation on resolution x resolution grid using 1-NN background_model = KNeighborsClassifier(n_neighbors=1).fit(X_Train_embedded, Y_pred) voronoiBackground = background_model.predict(np.c_[xx.ravel(), yy.ravel()]) voronoiBackground = voronoiBackground.reshape((resolution, resolution)) #plot plt.contourf(xx, yy, voronoiBackground) plt.scatter(X_Train_embedded[:,0], X_Train_embedded[:,1], c=y) plt.show() ###Output _____no_output_____ ###Markdown Using Regularisers An example of regularisation. The dataset to be used here is the *'make moons'* one which can be directly used from sklearn. ###Code from sklearn.datasets import make_moons from sklearn.model_selection import train_test_split from sklearn.linear_model import SGDClassifier np.random.seed(42) X, Y = make_moons(1000, noise=0.1) ##################################################################### # Perform a train test split , use 20% of the dataset for testing. 
##################################################################### x_train, x_test, y_train, y_test = train_test_split(X, Y, test_size = 0.2, random_state = 42) ##################################################################### # Perform a polynomial feature transform use degree 10 ##################################################################### from sklearn.preprocessing import * poly = PolynomialFeatures(10) x_train_poly = poly.fit_transform(x_train) model = SGDClassifier(tol = None, loss="log", alpha = 0.002, penalty = "l2") model.fit(x_train_poly, y_train) x_test_poly = poly.fit_transform(x_test) ##################################################################### # Plot the decision boundary ##################################################################### plot_decision_boundary(model, x_test_poly, y_test) ###Output _____no_output_____ ###Markdown MNISTExtract from the dataset digits 0 and 8. ###Code ########################################################################### # Extract ones and eights digits from both training and testing data ############################################################################ labels_to_select = [0,8] selected_train_data, selected_train_labels = sample_data_digits(train_x_mnist, train_y_mnist, labels_to_select) selected_test_data, selected_test_labels = sample_data_digits(test_x_mnist, test_y_mnist, labels_to_select) from sklearn.base import BaseEstimator, ClassifierMixin class LogisticRegression(BaseEstimator, ClassifierMixin): def __init__(self, lr=0.05, num_iter=1000, add_bias=True, verbose=True,lambda_val=0.5,penalty='L1'): self.lr = lr self.lambda_val = lambda_val self.verbose = verbose self.num_iter = num_iter self.add_bias = add_bias self.penalty = penalty def __add_bias(self, X): bias = np.ones((X.shape[0], 1)) return np.concatenate((bias, X), axis=1) ############################################################################ # compute the loss + ADD PENALTY HERE ############################################################################ def __loss(self, h, y): ''' computes loss values ''' y = np.array(y,dtype=float) # ADD YOUR CODE HERE if self.penalty == 'L2': reg = 0.5*self.lambda_val * np.matmul(self.theta, np.transpose(self.theta)) #L2 norm elif self.penalty == 'L1': reg = self.lambda_val * np.linalg.norm(self.theta,1) #L1 norm else: reg = 0 return (-y * np.log(h) - (1 - y) * np.log(1 - h)+ reg ).mean() def fit(self, X, y): ''' Optimise the model using gradient descent Arguments: X input features y labels from training data ''' if self.add_bias: X = self.__add_bias(X) ############################################################################ #initialise weights randomly with normal distribution N(0,1) ############################################################################ self.theta = np.random.normal(0.0,0.01,X.shape[1]) for i in range(self.num_iter): ############################################################################ # forward propagation ############################################################################ z = X.dot(self.theta) h = 1.0 / (1.0 + np.exp(-z)) ############################################################################ # backward propagation + ADD PENALTY HERE ############################################################################ # update parameters reg = np.array([0 for k in range(len(self.theta))]) if self.penalty == 'L2': reg = self.lambda_val * self.theta #L2 norm elif self.penalty == 'L1': #L1 norm reg = np.array([1 for k in range(len(self.theta))]) for k in 
range(len(self.theta)): if self.theta[k]<0: reg[k]=-1 else: reg = np.array([0 for k in range(len(self.theta))]) self.theta -= self.lr * ( np.dot(X.T, (h - y))+reg) / y.size ############################################################################ # print loss ############################################################################ if(self.verbose == True and i % 50 == 0): h = 1.0 / (1.0 + np.exp(-X.dot(self.theta))) print('loss: {} \t'.format(self.__loss(h, y))) def predict_probs(self,X): ''' returns output probabilities ''' ############################################################################ # forward propagation ############################################################################ if self.add_bias: X = self.__add_bias(X) z = X.dot(self.theta) return 1.0 / (1.0 + np.exp(-z)) def predict(self, X, threshold=0.5): ''' returns output classes ''' return self.predict_probs(X) >= threshold def score(self, X,Y): ''' Returns accuracy of model ''' preds = self.predict(X) accuracy = (preds == Y).mean() return accuracy ###Output _____no_output_____ ###Markdown Deploying Regressor with regularisation ###Code ######################################################################### # train the model using raw pixels , train for all penalties and for various lambda values ######################################################################### for pen in ['None','L1','L2']: for lam_val in [0.7]: ######################################################################### # Evaluate the trained model - compute train and test accuracies ######################################################################### model = LogisticRegression(lr=1e-3, num_iter=10000, add_bias=True, verbose=True,lambda_val=lam_val,penalty=pen) model.fit(selected_train_data[:75],selected_train_labels[:75]) selected_train_results = model.predict(selected_train_data) selected_test_results = model.predict(selected_test_data) count1 = 0 count2 = 0 for i in range(75): if selected_train_results[i] and selected_train_labels[i] == 1: count1 += 1 if selected_train_results[i]==False and selected_train_labels[i] == 0: count1 += 1 for i in range(len(selected_test_results)): if selected_test_results[i] and selected_test_labels[i] == 1: count2 += 1 if selected_test_results[i]==False and selected_test_labels[i] == 0: count2 += 1 print("penalty =", pen) print("training accuracy is", count1/75) print("testing accuracy is", count2/selected_test_labels.size) ######################################################################### # draw trained model params (weights) as an image of size (28x28) ######################################################################### plt.imshow(model.theta[:-1].reshape(28,28)) plt.colorbar() plt.show() ###Output loss: 0.6774344414956851 loss: 0.5490156697838957 loss: 0.4676366686955057 loss: 0.40801834478931287 loss: 0.3618210462720561 loss: 0.3250134598769958 loss: 0.2951125457853307 loss: 0.27042598711205307 loss: 0.24975114527583353 loss: 0.23221340252385014 loss: 0.21716542702106853 loss: 0.20412055078102018 loss: 0.1927075099757217 loss: 0.18263916821532528 loss: 0.17369058728002632 loss: 0.1656834379866788 loss: 0.15847476797346005 loss: 0.15194880227735208 loss: 0.14601088214726254 loss: 0.14058293047418727 loss: 0.13560002057655812 loss: 0.13100775187486394 loss: 0.12676022232810516 loss: 0.12281844697715504 loss: 0.11914911337759908 loss: 0.11572359390085916 loss: 0.11251715567779684 loss: 0.10950832392620259 loss: 0.10667836528443236 loss: 0.10401086575970814 loss: 0.1014913838151198 
loss: 0.09910716353940041 loss: 0.09684689617370819 loss: 0.09470052079858277 loss: 0.09265905691924896 loss: 0.09071446317866479 loss: 0.08885951758481868 loss: 0.08708771554251094 loss: 0.08539318269009243 loss: 0.08377060010313617 loss: 0.08221513987345618 loss: 0.08072240942877551 loss: 0.07928840324512562 loss: 0.07790946083565664 loss: 0.07658223008745815 loss: 0.07530363517117203 loss: 0.07407084837358788 loss: 0.07288126530651821 loss: 0.07173248303036167 loss: 0.0706222807012942 loss: 0.06954860240969267 loss: 0.06850954192636977 loss: 0.06750332911422117 loss: 0.06652831779736337 loss: 0.06558297490891195 loss: 0.0646658707631369 loss: 0.06377567031858819 loss: 0.06291112531653129 loss: 0.062071067194169334 loss: 0.061254400685080125 loss: 0.06046009803040293 loss: 0.05968719373386023 loss: 0.0589347798019322 loss: 0.05820200141761311 loss: 0.05748805300233883 loss: 0.056792174626019085 loss: 0.05611364872975821 loss: 0.05545179712989969 loss: 0.05480597827556957 loss: 0.05417558473498962 loss: 0.05356004088854813 loss: 0.0529588008089994 loss: 0.05237134631126331 loss: 0.05179718515614548 loss: 0.051235849393932956 loss: 0.05068689383526497 loss: 0.05014989463795868 loss: 0.04962444799960601 loss: 0.049110168946766765 loss: 0.048606690212482304 loss: 0.04811366119463429 loss: 0.04763074698838898 loss: 0.047157627486604924 loss: 0.04669399654265555 loss: 0.04623956119062942 loss: 0.04579404091833177 loss: 0.04535716698892376 loss: 0.04492868180740777 loss: 0.044508338328502525 loss: 0.04409589950275286 loss: 0.043691137757992875 loss: 0.043293834513526426 loss: 0.042903779724612845 loss: 0.04252077145504737 loss: 0.04214461547580898 loss: 0.04177512488791473 loss: 0.04141211976777049 loss: 0.041055426833445895 loss: 0.0407048791304257 loss: 0.040360315735504436 loss: 0.04002158147759504 loss: 0.03968852667431669 loss: 0.039361006883314224 loss: 0.03903888266734133 loss: 0.03872201937221082 loss: 0.03841028691678376 loss: 0.038103559594228995 loss: 0.03780171588384071 loss: 0.03750463827275384 loss: 0.03721221308694353 loss: 0.036924330330939115 loss: 0.03664088353572241 loss: 0.03636176961431768 loss: 0.03608688872461423 loss: 0.03581614413899397 loss: 0.03554944212036568 loss: 0.03528669180423373 loss: 0.03502780508645479 loss: 0.034772696516357865 loss: 0.03452128319492514 loss: 0.03427348467775029 loss: 0.03402922288250893 loss: 0.033788422000693534 loss: 0.03355100841338006 loss: 0.03331691061080848 loss: 0.03308605911557305 loss: 0.03285838640923034 loss: 0.03263382686214521 loss: 0.032412316666405684 loss: 0.032193793771647675 loss: 0.03197819782364027 loss: 0.03176547010549082 loss: 0.03155555348133785 loss: 0.03134839234240674 loss: 0.031143932555311043 loss: 0.030942121412488824 loss: 0.030742907584669313 loss: 0.030546241075271704 loss: 0.03035207317664283 loss: 0.030160356428046296 loss: 0.029971044575319527 loss: 0.029784092532121022 loss: 0.029599456342692825 loss: 0.029417093146068864 loss: 0.029236961141662045 loss: 0.02905901955616769 loss: 0.028883228611723533 loss: 0.028709549495269656 loss: 0.028537944329055397 loss: 0.028368376142241705 loss: 0.02820080884355145 loss: 0.028035207194921506 loss: 0.02787153678611342 loss: 0.027709764010241448 loss: 0.027549856040178556 loss: 0.027391780805803425 loss: 0.027235506972052872 loss: 0.027081003917745886 loss: 0.02692824171514742 loss: 0.026777191110241492 loss: 0.02662782350368402 loss: 0.02648011093240839 loss: 0.026334026051856828 loss: 0.02618954211881261 loss: 0.02604663297480924 loss: 0.025905273030093632 loss: 
0.025765437248121234 loss: 0.025627101130562813 loss: 0.02549024070280232 loss: 0.025354832499907526 loss: 0.02522085355305462 loss: 0.02508828137638998 loss: 0.02495709395431238 loss: 0.02482726972915934 loss: 0.024698787589283296 loss: 0.0245716268575023 loss: 0.024445767279911844 loss: 0.024321189015044358 loss: 0.024197872623363634 loss: 0.024075799057082094 loss: 0.023954949650289128 loss: 0.023835306109379416 loss: 0.023716850503770346 loss: 0.023599565256898616 loss: 0.023483433137485587 loss: 0.02336843725106245 loss: 0.023254561031745998 loss: 0.02314178823425612 loss: 0.02303010292616677 loss: 0.02291948948038256 loss: 0.022809932567832897 loss: 0.022701417150376782 loss: 0.022593928473910476 loss: 0.022487452061671867 loss: 0.022381973707734603 loss: 0.022277479470685764 loss: 0.022173955667480892 loss: 0.02207138886747072 loss: 0.021969765886593925 loss: 0.021869073781730357 penalty = None training accuracy is 1.0 testing accuracy is 0.9800409416581372 ###Markdown Regularisation using sklearn ###Code ############################################################################ ############################################################################ # reshape to (N,728) this is number of samples N and the nuber of features 28x28=728 x_train=np.reshape(train_x_mnist,(train_x_mnist.shape[0],train_x_mnist.shape[1]*train_x_mnist.shape[2])) x_test=np.reshape(test_x_mnist,(test_x_mnist.shape[0],test_x_mnist.shape[1]*test_x_mnist.shape[2])) ############################################################################ ############################################################################ # No Regularisation ############################################################################ mymodel1 = SGDClassifier(max_iter = 20, tol = 1e-3, alpha = 0.1, loss="log", penalty = "none") mymodel1.fit(x_train, train_y_mnist) print("accuracy of train set", mymodel1.score(x_train, train_y_mnist)) print("accuracy of test set", mymodel1.score(x_test, test_y_mnist)) ############################################################################ # L1 Reguralization ############################################################################ mymodel2 = SGDClassifier(max_iter = 20, tol = 1e-3, alpha = 0.1, loss="log", penalty = "l1") mymodel2.fit(x_train, train_y_mnist) print("accuracy of train set", mymodel2.score(x_train, train_y_mnist)) print("accuracy of test set", mymodel2.score(x_test, test_y_mnist)) ############################################################################ # L2 Reguralization ############################################################################ mymodel3 = SGDClassifier(max_iter = 20, tol = 1e-3, alpha = 0.1, loss="log", penalty = "l2") mymodel3.fit(x_train, train_y_mnist) print("accuracy of train set", mymodel3.score(x_train, train_y_mnist)) print("accuracy of test set", mymodel3.score(x_test, test_y_mnist)) ###Output accuracy of train set 0.8383666666666667 accuracy of test set 0.8471 accuracy of train set 0.8140166666666667 accuracy of test set 0.8223 accuracy of train set 0.88995 accuracy of test set 0.8838 ###Markdown Validation ###Code from sklearn.model_selection import train_test_split ############################################################################ # Split into train and val test_size=0.10, random_state=42 ############################################################################ x_train, x_val, y_train, y_val = train_test_split(train_x_mnist, train_y_mnist, test_size = 0.1, random_state = 42) 
x_train=np.reshape(x_train,(x_train.shape[0],x_train.shape[1]*x_train.shape[2])) x_val=np.reshape(x_val,(x_val.shape[0],x_val.shape[1]*x_val.shape[2])) #Initialize best values best = [0,0,0] ############################################################################ # Iterate over selected values ############################################################################ for lr in [1e-2,1e-3,1e-6]: for al in [0.5,0.1,0.01]: mymodel4 = SGDClassifier(max_iter = 20, learning_rate = 'constant', eta0 = lr, alpha = al, tol = 1e-3, loss="log", penalty = "l2") mymodel4.fit(x_train, y_train) t = mymodel4.score(x_val, y_val) if t > best[2]: best=[lr, al, t] # print best hyperparameters and accuracy print(best) ###Output [1e-06, 0.01, 0.8896666666666667]
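###Markdown Having picked the best learning rate and alpha on the validation split, a natural follow-up (a sketch, assuming the variables above are still in scope) is to refit with those values and score on the held-out test set: ###Code
best_lr, best_alpha = best[0], best[1]
final_model = SGDClassifier(max_iter=20, learning_rate='constant', eta0=best_lr,
                            alpha=best_alpha, tol=1e-3, loss="log", penalty="l2")
final_model.fit(x_train, y_train)

# flatten the test images the same way as the training data
x_test_flat = np.reshape(test_x_mnist, (test_x_mnist.shape[0], -1))
print("accuracy on test set", final_model.score(x_test_flat, test_y_mnist))
###Output _____no_output_____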
python/Module-06-Modules.ipynb
###Markdown Module 6 - Modules Scripts and ProgramsIf you quit from the Python interpreter and enter it again, the definitions you have made (functions and variables) are lost. Therefore, if you want to write a somewhat longer program, you are better off using a text editor to prepare the input for the interpreter and running it with that file as input instead. This is known as creating a script. As your program gets longer, you may want to split it into several files for easier maintenance. You may also want to use a handy function that you’ve written in several programs without copying its definition into each program. ModulesTo support this, Python has a way to put definitions in a file and use them in a script or in an interactive instance of the interpreter. Such a file is called a module; definitions from a module can be imported into other modules or into the main module (the collection of variables that you have access to in a script executed at the top level and in calculator mode).A module is a file containing Python definitions and statements. The file name is the module name with the suffix .py appended. Within a module, the module’s name (as a string) is available as the value of the global variable `__name__`. `fib.py`For instance, use your favorite text editor to create a file called fibo.py in the current directory with the following contents: ###Code # Fibonacci numbers module from __future__ import print_function def fib(n): # write Fibonacci series up to n a, b = 0, 1 while b < n: print(b,) a, b = b, a+b def fib2(n): # return Fibonacci series up to n result = [] a, b = 0, 1 while b < n: result.append(b) a, b = b, a+b return result ###Output _____no_output_____ ###Markdown Now enter the Python interpreter and import this module with the following command: ###Code import fibo ###Output _____no_output_____ ###Markdown This does not enter the names of the functions defined in fibo directly in the current symbol table; it only enters the module name `fibo` there. Using the module name you can access the functions: ###Code fibo.fib(1000) fibo.fib2(100) fibo.__name__ ###Output _____no_output_____ ###Markdown If you intend to use a function often you can assign it to a local name: ###Code fib = fibo.fib fib(500) ###Output 1 1 2 3 5 8 13 21 34 55 89 144 233 377 ###Markdown More on ModulesA module can contain executable statements as well as function definitions. These statements are intended to initialize the module. They are executed only the first time the module name is encountered in an import statement.Each module has its own private symbol table, which is used as the global symbol table by all functions defined in the module. Thus, the author of a module can use global variables in the module without worrying about accidental clashes with a user’s global variables. On the other hand, if you know what you are doing you can touch a module’s global variables with the same notation used to refer to its functions, `modname.itemname`. Modules can import other modules. It is customary but not required to place all import statements at the beginning of a module (or script, for that matter). The imported module names are placed in the importing module’s global symbol table.There is a variant of the `import` statement that imports names from a module directly into the importing module’s symbol table. 
For example: ###Code from fibo import fib, fib2 fib(500) ###Output 1 1 2 3 5 8 13 21 34 55 89 144 233 377 ###Markdown This does not introduce the module name from which the imports are taken in the local symbol table (so in the example, fibo is not defined). `import *`There is even a variant to import all names that a module defines: ###Code from fibo import * fib(500) ###Output 1 1 2 3 5 8 13 21 34 55 89 144 233 377 ###Markdown This imports all names except those beginning with an underscore (_).Note that in general the practice of importing * from a module or package is frowned upon, since it often causes poorly readable code. However, it is okay to use it to save typing in interactive sessions. The `dir()` FunctionThe built-in function dir() is used to find out which names a module defines. It returns a sorted list of strings: ###Code import fibo, sys print(dir(fibo)) print(dir(sys)) ###Output ['__displayhook__', '__doc__', '__excepthook__', '__interactivehook__', '__loader__', '__name__', '__package__', '__spec__', '__stderr__', '__stdin__', '__stdout__', '_clear_type_cache', '_current_frames', '_debugmallocstats', '_getframe', '_home', '_mercurial', '_xoptions', 'abiflags', 'api_version', 'argv', 'base_exec_prefix', 'base_prefix', 'builtin_module_names', 'byteorder', 'call_tracing', 'callstats', 'copyright', 'displayhook', 'dont_write_bytecode', 'exc_info', 'excepthook', 'exec_prefix', 'executable', 'exit', 'flags', 'float_info', 'float_repr_style', 'get_asyncgen_hooks', 'get_coroutine_wrapper', 'getallocatedblocks', 'getcheckinterval', 'getdefaultencoding', 'getdlopenflags', 'getfilesystemencodeerrors', 'getfilesystemencoding', 'getprofile', 'getrecursionlimit', 'getrefcount', 'getsizeof', 'getswitchinterval', 'gettrace', 'hash_info', 'hexversion', 'implementation', 'int_info', 'intern', 'is_finalizing', 'last_traceback', 'last_type', 'last_value', 'maxsize', 'maxunicode', 'meta_path', 'modules', 'path', 'path_hooks', 'path_importer_cache', 'platform', 'prefix', 'ps1', 'ps2', 'ps3', 'set_asyncgen_hooks', 'set_coroutine_wrapper', 'setcheckinterval', 'setdlopenflags', 'setprofile', 'setrecursionlimit', 'setswitchinterval', 'settrace', 'stderr', 'stdin', 'stdout', 'thread_info', 'version', 'version_info', 'warnoptions'] ###Markdown Without arguments, dir() lists the names you have defined currently: ###Code a = [1, 2, 3, 4, 5] import fibo fib = fibo.fib print(dir()) ###Output ['In', 'Out', '_', '_7', '_8', '__', '___', '__builtin__', '__builtins__', '__doc__', '__loader__', '__name__', '__package__', '__spec__', '_dh', '_i', '_i1', '_i10', '_i11', '_i12', '_i13', '_i14', '_i2', '_i3', '_i4', '_i5', '_i6', '_i7', '_i8', '_i9', '_ih', '_ii', '_iii', '_oh', '_sh', 'a', 'exit', 'fib', 'fib2', 'fibo', 'get_ipython', 'print_function', 'quit', 'sys'] ###Markdown Note that it lists all types of names: variables, modules, functions, etc. `dir()` does not list the names of built-in functions and variables. 
If you want a list of those, they are defined in the standard module `builtins`: ###Code import builtins print(dir(__builtin__)) ###Output ['ArithmeticError', 'AssertionError', 'AttributeError', 'BaseException', 'BlockingIOError', 'BrokenPipeError', 'BufferError', 'BytesWarning', 'ChildProcessError', 'ConnectionAbortedError', 'ConnectionError', 'ConnectionRefusedError', 'ConnectionResetError', 'DeprecationWarning', 'EOFError', 'Ellipsis', 'EnvironmentError', 'Exception', 'False', 'FileExistsError', 'FileNotFoundError', 'FloatingPointError', 'FutureWarning', 'GeneratorExit', 'IOError', 'ImportError', 'ImportWarning', 'IndentationError', 'IndexError', 'InterruptedError', 'IsADirectoryError', 'KeyError', 'KeyboardInterrupt', 'LookupError', 'MemoryError', 'ModuleNotFoundError', 'NameError', 'None', 'NotADirectoryError', 'NotImplemented', 'NotImplementedError', 'OSError', 'OverflowError', 'PendingDeprecationWarning', 'PermissionError', 'ProcessLookupError', 'RecursionError', 'ReferenceError', 'ResourceWarning', 'RuntimeError', 'RuntimeWarning', 'StopAsyncIteration', 'StopIteration', 'SyntaxError', 'SyntaxWarning', 'SystemError', 'SystemExit', 'TabError', 'TimeoutError', 'True', 'TypeError', 'UnboundLocalError', 'UnicodeDecodeError', 'UnicodeEncodeError', 'UnicodeError', 'UnicodeTranslateError', 'UnicodeWarning', 'UserWarning', 'ValueError', 'Warning', 'ZeroDivisionError', '__IPYTHON__', '__build_class__', '__debug__', '__doc__', '__import__', '__loader__', '__name__', '__package__', '__spec__', 'abs', 'all', 'any', 'ascii', 'bin', 'bool', 'bytearray', 'bytes', 'callable', 'chr', 'classmethod', 'compile', 'complex', 'copyright', 'credits', 'delattr', 'dict', 'dir', 'divmod', 'dreload', 'enumerate', 'eval', 'exec', 'filter', 'float', 'format', 'frozenset', 'get_ipython', 'getattr', 'globals', 'hasattr', 'hash', 'help', 'hex', 'id', 'input', 'int', 'isinstance', 'issubclass', 'iter', 'len', 'license', 'list', 'locals', 'map', 'max', 'memoryview', 'min', 'next', 'object', 'oct', 'open', 'ord', 'pow', 'print', 'property', 'range', 'repr', 'reversed', 'round', 'set', 'setattr', 'slice', 'sorted', 'staticmethod', 'str', 'sum', 'super', 'tuple', 'type', 'vars', 'zip']
DAY 001 ~ 100/DAY089_[BaekJoon] 네 수 (Python).ipynb
###Markdown Tuesday, May 5, 2020 BaekJoon - Problem 10824: Four Numbers Problem: https://www.acmicpc.net/problem/10824 Blog: https://somjang.tistory.com/entry/BaekJoon-10824%EB%B2%88-%EB%84%A4-%EC%88%98-Python First attempt ###Code
A, B, C, D = map(str, input().split())
intAB = int(A + B)  # concatenate A and B, then convert to an integer
intCD = int(C + D)  # concatenate C and D, then convert to an integer
print(intAB + intCD)
###Output _____no_output_____
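###Markdown For example, with the input `10 20 30 40`, concatenating gives 1020 and 3040, which sum to 4060: ###Code
A, B, C, D = "10", "20", "30", "40"  # illustrative input values
print(int(A + B) + int(C + D))       # 1020 + 3040 = 4060
###Output _____no_output_____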
_solutions/sheet3_sols.ipynb
###Markdown Solutions to Sheet 3 by Rasmus Kornbeck ###Code import matplotlib.pyplot as plt import numpy as np import itertools # problem 1 def f(x): return 1/(x-1) x1 = np.linspace(-2, 0.99) x2 = np.linspace(1.01, 6) y1 = f(x1) y2 = f(x2) plt.plot(x1, y1, lw=4, ls='--', c="m") plt.plot(x2, y2, lw=4, ls='--', c="m") plt.xlim(-2, 6) plt.ylim(-6, 6) plt.show() # problem 2 x = np.linspace(0, np.pi*2) styles = [ {"c": "g", "ls": "-"}, {"c": "r", "ls": "--"}, {"c": "b", "ls": "--"}, {"c": "m", "ls": ":"} ] fig = plt.figure() ax1 = fig.add_subplot(221) ax1.set_title("sin(x)") ax1.plot(x, np.sin(x), **styles[0]) plt.axis([0, np.pi*2, -2, 2]) ax2 = fig.add_subplot(222) ax2.set_title("sin(2x)") ax2.plot(x, np.sin(2*x), **styles[1]) plt.axis([0, np.pi*2, -2, 2]) ax3 = fig.add_subplot(223) ax3.set_title("2sin(x)") ax3.plot(x, 2*np.sin(x), **styles[2]) plt.axis([0, np.pi*2, -2, 2]) ax4 = fig.add_subplot(224) ax4.set_title("2sin(2x)") ax4.plot(x, 2*np.sin(2*x), **styles[3]) plt.axis([0, np.pi*2, -2, 2]) plt.suptitle("Bunch of plots") plt.show() # problem 2 - alternative x = np.linspace(0, np.pi*2) y = [np.sin(x), 2*np.sin(x), np.sin(2*x), 2*np.sin(2*x)] titles = ["sin(x)", "2sin(x)", "sin(2x)", "2sin(2)"] styles = [ {"c": "g", "ls": "-"}, {"c": "b", "ls": "--"}, {"c": "r", "ls": "--"}, {"c": "m", "ls": ":"} ] fig, axs = plt.subplots(2, 2) fig.suptitle("Bunch of plots") q = itertools.chain.from_iterable(zip(*axs)) # goes by column rather than row for ax, f, titl, style in zip(q, y, titles, styles): ax.plot(x, f, **style) ax.axis([0, np.pi*2, -2, 2]) ax.set_title(titl) # problem 3 y = np.loadtxt("results.txt", delimiter=",", skiprows=1) x = np.arange(1, 22) fig = plt.figure(figsize=(7,7)) ax1 = fig.add_subplot(221) ax1.plot(x, y[:,0]) ax1.set_title("asg1 impl") ax2 = fig.add_subplot(222) ax2.loglog(x, y[:,0]) ax2.set_title("asg1 log") # polynomial ax3 = fig.add_subplot(223) ax3.plot(x, y[:,1]) ax3.set_title("numpy impl") ax4 = fig.add_subplot(224) ax4.semilogy(x, y[:,1]) ax4.set_title("numpy log-linear") # exponential ###Output _____no_output_____
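###Markdown The log scalings in problem 3 are the diagnostic: polynomial growth n^k plots as a straight line on log-log axes (with slope k), while exponential growth plots as a straight line on log-linear (semilogy) axes. A minimal sketch estimating the polynomial degree of the first column, assuming `x` and `y` from problem 3 are still in scope and the values are positive: ###Code
slope, intercept = np.polyfit(np.log(x), np.log(y[:, 0]), 1)
print("estimated polynomial degree:", slope)
###Output _____no_output_____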
Udemy-Python-DS-and-ML-Bootcamp/Machine_Learning/K Means Clustering Project .ipynb
###Markdown ______ K Means Clustering Project For this project we will attempt to use KMeans Clustering to cluster Universities into to two groups, Private and Public.___It is **very important to note, we actually have the labels for this data set, but we will NOT use them for the KMeans clustering algorithm, since that is an unsupervised learning algorithm.** When using the Kmeans algorithm under normal circumstances, it is because you don't have labels. In this case we will use the labels to try to get an idea of how well the algorithm performed, but you won't usually do this for Kmeans, so the classification report and confusion matrix at the end of this project, don't truly make sense in a real world setting!.___ The DataWe will use a data frame with 777 observations on the following 18 variables.* Private A factor with levels No and Yes indicating private or public university* Apps Number of applications received* Accept Number of applications accepted* Enroll Number of new students enrolled* Top10perc Pct. new students from top 10% of H.S. class* Top25perc Pct. new students from top 25% of H.S. class* F.Undergrad Number of fulltime undergraduates* P.Undergrad Number of parttime undergraduates* Outstate Out-of-state tuition* Room.Board Room and board costs* Books Estimated book costs* Personal Estimated personal spending* PhD Pct. of faculty with Ph.D.’s* Terminal Pct. of faculty with terminal degree* S.F.Ratio Student/faculty ratio* perc.alumni Pct. alumni who donate* Expend Instructional expenditure per student* Grad.Rate Graduation rate Import Libraries** Import the libraries you usually use for data analysis.** ###Code import pandas as pd import numpy as np import matplotlib.pyplot as plt %matplotlib inline import seaborn as sns ###Output _____no_output_____ ###Markdown Get the Data ** Read in the College_Data file using read_csv. Figure out how to set the first column as the index.** ###Code df = pd.read_csv('College_Data',index_col=0) #College_Data has no .csv on it ###Output _____no_output_____ ###Markdown **Check the head of the data** ###Code df.head() ###Output _____no_output_____ ###Markdown ** Check the info() and describe() methods on the data.** ###Code df.info() df.describe() ###Output _____no_output_____ ###Markdown EDAIt's time to create some data visualizations!** Create a scatterplot of Grad.Rate versus Room.Board where the points are colored by the Private column. ** ###Code sns.set_style("whitegrid") sns.lmplot('Room.Board','Grad.Rate',data=df,hue='Private',fit_reg=False,scatter_kws={"s": 10,'alpha':0.3}) ###Output _____no_output_____ ###Markdown **Create a scatterplot of F.Undergrad versus Outstate where the points are colored by the Private column.** ###Code sns.lmplot('Outstate','F.Undergrad',data=df,hue='Private',fit_reg=False,scatter_kws={"s": 10,'alpha':0.3}) ###Output _____no_output_____ ###Markdown ** Create a stacked histogram showing Out of State Tuition based on the Private column. Try doing this using [sns.FacetGrid](https://stanford.edu/~mwaskom/software/seaborn/generated/seaborn.FacetGrid.html). If that is too tricky, see if you can do it just by using two instances of pandas.plot(kind='hist'). 
** ###Code sns.set_style("darkgrid")
fig = sns.FacetGrid(df[['Outstate','Private']],hue='Private')
fig = fig.map(plt.hist,'Outstate',bins=20,alpha=0.3,edgecolor='b') ###Output _____no_output_____ ###Markdown **Create a similar histogram for the Grad.Rate column.** ###Code fig = sns.FacetGrid(df[['Grad.Rate','Private']],hue='Private')
fig = fig.map(plt.hist,'Grad.Rate',bins=20,alpha=0.3,edgecolor='b') ###Output _____no_output_____ ###Markdown ** Notice how there seems to be a private school with a graduation rate of higher than 100%. What is the name of that school?** ###Code df[df['Grad.Rate']>100] ###Output _____no_output_____ ###Markdown ** Set that school's graduation rate to 100 so it makes sense. You may get a warning (not an error) when doing this operation, so use dataframe operations or just re-do the histogram visualization to make sure it actually went through.** ###Code df.loc['Cazenovia College','Grad.Rate']
df.loc['Cazenovia College','Grad.Rate'] = 100
fig = sns.FacetGrid(df[['Grad.Rate','Private']],hue='Private')
fig = fig.map(plt.hist,'Grad.Rate',bins=20,alpha=0.3,edgecolor='b') ###Output _____no_output_____ ###Markdown K Means Cluster CreationNow it is time to create the Cluster labels!** Import KMeans from SciKit Learn.** ###Code from sklearn.cluster import KMeans
df2 = df.copy() ###Output _____no_output_____ ###Markdown ** Create an instance of a K Means model with 2 clusters.** ###Code km = KMeans(n_clusters=2) ###Output _____no_output_____ ###Markdown **Fit the model to all the data except for the Private label.** ###Code km.fit(df2.drop('Private',axis=1)) ###Output _____no_output_____ ###Markdown ** What are the cluster center vectors?** ###Code km.cluster_centers_ ###Output _____no_output_____ ###Markdown EvaluationThere is no perfect way to evaluate clustering if you don't have the labels; however, since this is just an exercise, we do have the labels, so we take advantage of this to evaluate our clusters. Keep in mind, you usually won't have this luxury in the real world.** Create a new column for df called 'Cluster', which is a 1 for a Private school, and a 0 for a public school.** ###Code IsPrivate = pd.get_dummies(df2.Private)
IsPrivate.head()
Cluster = km.predict(df2.drop('Private',axis=1))
Cluster
#Cluster appears to be the opposite of IsPrivate
Cluster2 = -1 * (Cluster - 1)
Cluster2
df2['IsPrivate'] = IsPrivate.Yes
df2['Cluster'] = Cluster2
df2.head() ###Output _____no_output_____ ###Markdown ** Create a confusion matrix and classification report to see how well the Kmeans clustering worked without being given any labels.** ###Code from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report
# the Cluster column lives on df2, not df
print(confusion_matrix(df2.IsPrivate,df2.Cluster))
print(classification_report(df2.IsPrivate,df2.Cluster)) ###Output [[ 74 138] [ 34 531]] precision recall f1-score support 0 0.69 0.35 0.46 212 1 0.79 0.94 0.86 565 avg / total 0.76 0.78 0.75 777
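###Markdown A small robustness sketch (an addition, not part of the original exercise): k-means assigns arbitrary cluster ids, so rather than hard-coding the label flip above, pick whichever 0/1 mapping agrees best with the known labels. This assumes the `Cluster` array and `df2['IsPrivate']` column defined above. ###Code import numpy as np

truth = df2['IsPrivate'].values
acc_direct = np.mean(Cluster == truth)          # agreement with labels as predicted
acc_flipped = np.mean((1 - Cluster) == truth)   # agreement with labels swapped
aligned = Cluster if acc_direct >= acc_flipped else 1 - Cluster
print('direct agreement:', acc_direct, 'flipped agreement:', acc_flipped) ###Output _____no_output_____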
Spotify Data Extract.ipynb
###Markdown Sign up for Spotify developer account* Connect developer account to regular spotify account.* Accept terms.* Create a non-commercial app ###Code import sys import spotipy import spotipy.util as util import os from pprint import pprint with open('config.txt') as f: content = f.readlines() os.environ['SPOTIPY_CLIENT_ID'] = content[0].strip() os.environ['SPOTIPY_CLIENT_SECRET'] = content[1].strip() os.environ['SPOTIPY_REDIRECT_URI'] = content[2].strip() username = content[3].strip() scope = 'user-top-read user-follow-modify' #separated by spaces token = util.prompt_for_user_token(username, scope) sp = spotipy.Spotify(auth=token) ###Output _____no_output_____ ###Markdown Example of Spotify Music Data Analysishttps://github.com/AsTimeGoesBy111/Spotify-Music-Data-Analysis Get playlist as dataframe from ID ###Code import pandas as pd import datetime uris = {'37i9dQZF1DX10zKzsJ2jva': 'Viva_Latina', '37i9dQZF1DWY7IeIP1cdjF': 'Baila_Reggaeton', '37i9dQZF1DXcBWIGoYBM5M': 'Todays_Top_Hits', #'37i9dQZF1DX5nwnRMcdReF': 'Top_Tracks_2017', '37i9dQZF1DX1HUbZS4LEyL': 'Top_Tracks_2018'} date = str(datetime.datetime.now()) df = pd.DataFrame(columns=['song_uri', 'playlist', 'song_name', 'popularity']) for uri, playlist in uris.items(): response = sp.user_playlist_tracks(username, playlist_id=uri, fields=None, limit=100, offset=0, market=None) # pprint(response) for i in range(len(response['items'])): song_uri = response['items'][i]['track']['uri'] song_name = response['items'][i]['track']['name'] popularity = response['items'][i]['track']['popularity'] explicit = response['items'][i]['track']['explicit'] duration = response['items'][i]['track']['duration_ms'] df = df.append({'song_uri': song_uri, 'playlist': playlist, 'song_name': song_name, 'popularity': popularity, 'explicit': explicit, 'duration': duration}, ignore_index=True) ###Output _____no_output_____ ###Markdown Get audio features ###Code response = [] l = len(df['song_uri']) for a in range(0,l,50): b = min(a + 50, l) song_uris = list(df['song_uri'][a:b]) response.extend(sp.audio_features(song_uris)) #maximum: 50 ids df2 = pd.DataFrame(response) df3 = pd.concat([df, df2], axis=1, sort=False) df3 = df3.drop(['analysis_url', 'id', 'track_href', 'uri', 'type'], axis=1) df3.to_csv("Playlist Audio Features_2019.csv", index=False) df3.head(25) import matplotlib.pyplot as plt %matplotlib inline x='energy' y='playlist' s='popularity' #target = shading df3 = df3.sort_values(s, ascending=False) fig, ax = plt.subplots() plt.gray() ax.scatter(x=df3[x], y=df3[y], s=200, c=-df3[s]**2, vmin=-10000, vmax=0, edgecolors = 'black', alpha=.2) plt.xlabel(x) plt.ylabel(y) print(df.columns) #https://spotifycharts.com/regional #playlists = sp.featured_playlists(locale=None, country='US', timestamp=None, limit=20, offset=0) #pprint(playlists) global_50 = sp.user_playlist_tracks(user='[email protected]',playlist_id='37i9dQZF1DXcBWIGoYBM5M', fields=None, limit=100, offset=0, market=None) pprint(global_50) ###Output _____no_output_____ ###Markdown top latin tracks of 2017 https://open.spotify.com/playlist/37i9dQZF1DX1CvQj0vxP1Ytop tracks of 2017 https://open.spotify.com/playlist/37i9dQZF1DX7Axsg3uaDZb ###Code response = sp.user_playlist_tracks(user='[email protected]', playlist_id='37i9dQZF1DXcBWIGoYBM5M', fields=None, limit=100, offset=0, market=None) pprint(response) ###Output _____no_output_____ ###Markdown Get search results as dataframe ###Code search_terms = {'year:2018 AND genre:Latin': '2018_Latin', 'year:2017 AND genre:Latin': '2017_Latin', 'year:2016 AND 
genre:Latin': '2016_Latin', 'year:2015 AND genre:Latin': '2015_Latin', 'year:2014 AND genre:Latin': '2014_Latin', 'year:2018 AND genre:Pop': '2018_Pop', 'year:2017 AND genre:Pop': '2017_Pop', 'year:2016 AND genre:Pop': '2016_Pop', 'year:2015 AND genre:Pop': '2015_Pop', 'year:2014 AND genre:Pop': '2014_Pop', } max_search_length = 10000 date = str(datetime.datetime.now()) length_of_response = max_search_length df = pd.DataFrame(columns=['song_uri', 'search_term', 'search_title', 'song_name', 'popularity']) for search_term, search_title in search_terms.items(): l = max_search_length for a in range(0,l,50): response = sp.search(q=search_term, limit=50, offset=a, type='track') length_of_response = response['tracks']['total'] #pprint(response) for i in range(len(response['tracks']['items'])): song_uri = response['tracks']['items'][i]['uri'] song_name = response['tracks']['items'][i]['name'] popularity = response['tracks']['items'][i]['popularity'] explicit = response['tracks']['items'][i]['explicit'] duration = response['tracks']['items'][i]['duration_ms'] df = df.append({'song_uri': song_uri, 'search_term': search_term, 'search_title': search_title, 'song_name': song_name, 'popularity': popularity, 'explicit': explicit, 'duration': duration}, ignore_index=True) l = min(max_search_length, length_of_response) response = [] l = len(df['song_uri']) for a in range(0,l,50): b = min(a + 50, l) song_uris = list(df['song_uri'][a:b]) response.extend(sp.audio_features(song_uris)) #maximum: 50 ids df2 = pd.DataFrame(response) df3 = pd.concat([df, df2], axis=1, sort=False) df3 = df3.drop(['analysis_url', 'id', 'track_href', 'uri', 'type'], axis=1) df3.head(5) df3.to_csv("Search Audio Features.csv", index=False) ## NOTE : genre:Latin AND genre:Pop AND year:2018 ###Output _____no_output_____
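###Markdown The playlist extraction above fetches at most 100 tracks per call. A hedged pagination sketch (an addition, assuming the `sp` client and `username` defined above): spotipy's `sp.next()` follows the paging object's `next` URL until the playlist is exhausted. ###Code def all_playlist_tracks(sp, playlist_id):
    # First page: the API returns at most 100 tracks per request.
    results = sp.user_playlist_tracks(username, playlist_id=playlist_id, limit=100)
    items = results['items']
    # Follow the paging object's `next` URL until there are no more pages.
    while results['next']:
        results = sp.next(results)
        items.extend(results['items'])
    return items

# Example: fetch every track of Viva Latina, not just the first 100.
tracks = all_playlist_tracks(sp, '37i9dQZF1DX10zKzsJ2jva')
len(tracks) ###Output _____no_output_____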
inequality/Inequality_Empirical_Notebook.ipynb
###Markdown SS164 Inequality Class Activity This notebook examines inequality in a country, based on data [from the World Inequality Database](https://wid.world/bulk_download/wid_all_data.zip) and from the [World Bank](https://data.worldbank.org/indicator/SI.POV.GINI?view=chart), and allows you to compare the income distribution of the chosen country with the global one.To prepare for class, everyone must read the notebook corresponding to the United Kingdom (our example) and reach a basic understanding of it. When the class starts, you will be sorted into 4 groups - each assigned to a different country - and given the notebook of that country (as a pdf, so you don't have to worry about running code during class). The only change between what you will receive in class and the example you previously read will be the cell below (a country from this [list of options](https://github.com/dianagold/SS164/blob/master/inequality/Countries_for_Inequality.csv)). In breakouts, you will complete a series of questions about inequality.Disclaimer: the class activity will consist of **demonstrating a solid understanding of the concepts** through the discussion and interpretation of the results of this notebook. You are not required to perform any analytical work nor expected to code nor fully understand the code (but are welcome to play with it on your own after the class if you so wish). ###Code #Change here for the country you chose (must be spelled exactly the same as in the list of options)
chosencountry = 'United Kingdom' ###Output _____no_output_____ ###Markdown I. SetupYou may read only the markdown cells from this section. It is just setting the stage for the relevant outputs in the following sections.No need to worry about the math, but in case you need a refresher on the Lorenz curve and the Gini coefficient, rewatch this 7min video explaining the Gini coefficient that was assigned for class 2.1: https://www.youtube.com/watch?time_continue=447&v=Yu8ehXSNobI Import libraries ###Code import os
import numpy as np
import pandas as pd
import wbgapi as wb
import urllib.request
from zipfile import ZipFile
from matplotlib import pyplot as plt
%matplotlib inline ###Output _____no_output_____ ###Markdown Find the chosen country in the table with the list of options and store the corresponding country codes (how data from this country is labelled in the World Inequality Database, WID, and in the World Bank Group data portal, WBG). For most countries, the most recent year with WID data is 2019, and the usual variable is 'tptinc992j' which is pre-tax income for a certain segment of the population, for example, the poorest decile or 10% (p0p10) and the poorest percentile or 1% (p0p1). You will need this data for all deciles (p0p10… p90p100) and percentiles (p0p1… p99p100). The rest of the variable name means age = 992 (adults); pop = j (equal split). This is here to allow some flexibility in terms of variables, as some countries in WID have post-tax income data too.
###Code countrytable = pd.read_csv('Countries_for_Inequality.csv')
chosenline = countrytable[countrytable['Country']==chosencountry]
widcode = chosenline['WID_Code'].values[0]
wbgcode = chosenline['WBG_Code'].values[0]
lastyear = chosenline['Latest Year'].values[0]
incomevar = chosenline['Variable'].values[0]
chosenline ###Output _____no_output_____ ###Markdown Download data file from the [World Inequality Database (WID)](https://wid.world/data/), which is ~100Mb ###Code if os.path.isfile('./wid_all_data.zip') :
    print('File already downloaded')
else :
    url = 'https://wid.world/bulk_download/wid_all_data.zip'
    filename = 'wid_all_data.zip'
    urllib.request.urlretrieve(url, filename) ###Output File already downloaded ###Markdown Extract the contents of the chosen country and the whole world ###Code with ZipFile('wid_all_data.zip', 'r') as zip_ref:
    for filename in ['WID_data_'+widcode+'.csv', 'WID_data_WO.csv']:
        zip_ref.extract(filename) ###Output _____no_output_____ ###Markdown Function to plot a histogram of income distribution. This is just a simple plot, no calculation done! A good way to visualize the 'input' data that is being used ###Code def inequality_plot(data):
    fig, ax = plt.subplots(figsize=[8,8])
    plt.bar([x for x in range(len(data))],data,width=1,color='grey')
    #labels for plot
    ax.set_xlabel('Population')
    ax.set_ylabel('Income')
    ax.set_xticks([])
    ax.set_xticklabels([])
    plt.show() ###Output _____no_output_____ ###Markdown Function to calculate the Gini index based on the sorted array (which was plotted in the histogram) ###Code def gini(arr):
    #sort array
    sorted_arr = arr.copy()
    sorted_arr.sort()
    n = arr.size
    coef_ = 2. / n
    const_ = (n + 1.) / n
    weighted_sum = sum([(i+1)*yi for i, yi in enumerate(sorted_arr)])
    return coef_*weighted_sum/(sorted_arr.sum()) - const_ ###Output _____no_output_____ ###Markdown Function to plot the Lorenz Curve from grouped data (we'll use this function for both the percentiles and the deciles data). As an added bonus, it reports the Gini value as a legend. ###Code def lorenz_curve(X):
    #Process data for Lorenz Curve
    X_lorenz = X.cumsum() / X.sum()
    X_lorenz = np.insert(X_lorenz, 0, 0)
    X_lorenz[0], X_lorenz[-1]
    fig, ax = plt.subplots(figsize=[8,8])
    ##Plot of Lorenz curve
    ax.plot(np.arange(X_lorenz.size)/(X_lorenz.size-1), X_lorenz,marker='.',markevery=2, color='black')
    ## shade area
    ax.fill(np.arange(X_lorenz.size)/(X_lorenz.size-1), X_lorenz,color='grey')
    ax.fill_between(np.arange(X_lorenz.size)/(X_lorenz.size-1), X_lorenz,color='pink')
    ## line plot of equality
    ax.plot([0,1], [0,1], color='k')
    #plot gini index as a legend
    ax.text(0.1,0.9,'Gini Index = %f' %(gini(X)))
    #labels for plot
    ax.set_xlabel('Fraction of the population')
    ax.set_ylabel('Cumulative Income share')
    ax.set_yticklabels([])
    ax.set_xticklabels([])
    plt.show() ###Output _____no_output_____ ###Markdown Function that plots everything for a chosen Country, Year, and Variable. It plots the histogram, the Lorenz curve, and prints the Gini value on it. This just bundles the 3 previous functions into a single one to make our life easier when calling it later. ###Code def inequality(country,variable,year,division='percentile'):
    """
    country = Country code, e.g. BR is Brazil. Data is extracted for the respective country
    variable = One of the variables from tptinc992j, tfiinc992j, tdiinc992j or tcainc992j. Income inequality data on the variable is extracted
    year = Year for which you extract the income inequality data for
    division = Whether you want data in percentiles or deciles.
The default is percentiles so if you want deciles, just pass the word 'decile' to the function """ #Read csv file data = pd.read_csv('WID_data_'+country+'.csv', delimiter=';') #If you want data in percentiles if division == 'percentile' or division == 'percentiles': percentiles = [] for i in range(100): percentiles.append('p'+str(i)+'p'+str(i+1)) #Filter data according to student defined constraints inequality_data = [] temp = data.loc[(data['variable'] == variable) & (data['year'] == year)] for i in percentiles: inequality_data.append(temp.loc[temp['percentile']==i]['value'].values[0]) return inequality_plot(np.array(inequality_data)), lorenz_curve(np.array(inequality_data)) elif division == 'decile' or division == 'deciles' : deciles = [] for i in range(0,100,10): deciles.append('p'+str(i)+'p'+str(i+10)) #Filter data according to student defined constraints inequality_data = [] temp = data.loc[(data['variable'] == variable) & (data['year'] == year)] for i in deciles: inequality_data.append(temp.loc[temp['percentile']==i]['value'].values[0]) return inequality_plot(np.array(inequality_data)), lorenz_curve(np.array(inequality_data)) ###Output _____no_output_____ ###Markdown II. Obtain four Gini valuesThis section will give you 4 Gini values:- Calculated using the income deciles data from the WID (10 points)- Calculated using the income percentiles data from the WID (100 points)- Estimated by WID and retrieved from their public data- Estimated by the World Bank and retrieved from their World Development IndicatorsRead through this section carefully and take note of those 4 values and corresponding years. Your first task is to add them to a template table and discuss why they differ. See the template table below for the **United Kingdom** as an example:|Country|Year WID|WID deciles|WID percentiles|WID reported|WBG reported|Year WBG|| --- | --- | --- |---| --- | --- | --- ||United Kingdom|2019|0.380|0.416|0.466|0.351|2017| ###Code #Calculate Gini from WID income deciles inequality(widcode,incomevar,lastyear,'deciles') #Calculate Gini from WID income percentiles inequality(widcode,incomevar,lastyear) ###Output _____no_output_____ ###Markdown Retrieve the Gini stored in the WID dataset. This does not require any calculation, only knowing the name of the variable that corresponds to the reported Gini value of pre-tax income for all adults (gptinc992j). There are many more variables in the WID dataset, and you can look at their codebook for a complete list if you are curious. ###Code #Gini is coded as variable gptinc992j cty = pd.read_csv('WID_data_'+widcode+'.csv', delimiter=';') gini = cty[(cty['variable']=='gptinc992j') & (cty['year']==lastyear)] gini ###Output _____no_output_____ ###Markdown Retrieve the Gini estimated by the World Bank in their World Development Indicators, available in [this website](https://data.worldbank.org/indicator/SI.POV.GINI?locations=BR&view=chart) or queried via API. Recent years are queried because there are many missing values (NaN). Choose the value from the year that is the closest to the one you analyzed in WID to include in your table. ###Code wdi = wb.data.DataFrame('SI.POV.GINI', [wbgcode], time=range(2010, 2020), labels=True) wdi ###Output _____no_output_____ ###Markdown III. 
Compare a poor/rich person in this country with the global distribution of income This section will help you answer: How does a poor person (between 10-20% of the distribution or p10p20), a median person (close to 50%) and a rich person (at the top decile, 90%-100% or p90p100) in this country compare to the world?To answer this, you will compare the WID file with global data and the WID file from your country, both of which were extracted in the Setup. We must be mindful of the units: the global data is expressed in Euros, while countries are reported in the local currency unit (LCU). Thus, we will also need the variable 'xlceux999i' (market exchange rate : LCU per EUR).Using the United Kingdom as an example, we have xlceux999i = 0.846 in 2019. Thus, in 2019:- a poor person (p10p20) in the United Kingdom makes 4,598 in local currency unit, or 5,435 in euros- a median person (p50p51) in the United Kingdom makes 24,213 in local currency unit, or 28,618 in euros- a rich person (p90p100) in the United Kingdom makes 57,557 in local currency unit, or 68,027 in eurosManually browsing through the global distribution, we find that these values for the UK are closest to the global numbers for the percentiles p43p44 (5,377 EUR), p86p87 (29,140 EUR) and p96p97 (64,880 EUR). ###Code #Exchange rate between Local Currency Unit and Euros
xrate = cty[(cty['variable']=='xlceux999i') & (cty['year']==lastyear)]['value'].values[0]
xrate
#Poor, median and rich person in this country
poor = cty[(cty['variable']==incomevar) & (cty['year']==lastyear) & (cty['percentile']=='p10p20')]['value'].values[0]
middle = cty[(cty['variable']==incomevar) & (cty['year']==lastyear) & (cty['percentile']=='p50p51')]['value'].values[0]
rich = cty[(cty['variable']==incomevar) & (cty['year']==lastyear) & (cty['percentile']=='p90p100')]['value'].values[0]
print('In the year of '+np.array2string(lastyear))
print('Poor person (p10p20) in '+chosencountry+' makes '+np.array2string(poor, precision=0)+' in local currency unit, or '
      +np.array2string(poor/xrate, precision=0)+' in euros')
print('Median person (p50p51) in '+chosencountry+' makes '+np.array2string(middle, precision=0)+' in local currency unit, or '
      +np.array2string(middle/xrate, precision=0)+' in euros')
print('Rich person (p90p100) in '+chosencountry+' makes '+np.array2string(rich, precision=0)+' in local currency unit, or '
      +np.array2string(rich/xrate, precision=0)+' in euros') ###Output In the year of 2019 Poor person (p10p20) in United Kingdom makes 4598. in local currency unit, or 5435. in euros Median person (p50p51) in United Kingdom makes 24213. in local currency unit, or 28618. in euros Rich person (p90p100) in United Kingdom makes 57557. in local currency unit, or 68027. in euros ###Markdown What percentile of the world income distribution approximately matches those values? You will do this manually by scrolling the list below to find the best match. ###Code #Display world income distribution to find equivalence
world = pd.read_csv('WID_data_WO.csv', delimiter=';')
world = world[(world['variable'] == 'tptinc992j') & (world['year'] == lastyear)]
percentiles = []
for i in range(100):
    percentiles.append('p'+str(i)+'p'+str(i+1))
pd.set_option('display.max_rows', None)
showme = world[world['percentile'].isin(percentiles)]
#Note: sorted by value rather than by percentile label
showme[['percentile','value']].sort_values(['value', 'percentile']) ###Output _____no_output_____
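###Markdown A hedged convenience helper (an addition, not part of the assignment): instead of scrolling the table, compute the nearest world percentile for any euro amount, using the `showme` table, `poor`, and `xrate` defined above. The function name is illustrative. ###Code def closest_world_percentile(euros):
    # World percentile whose income is nearest the given euro amount.
    diffs = (showme['value'] - euros).abs()
    return showme.loc[diffs.idxmin(), ['percentile', 'value']]

closest_world_percentile(poor / xrate) ###Output _____no_output_____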
1 - basics/4 - percentiles.ipynb
###Markdown PercentilesA percentile is a measure used in statistics indicating the value below which a given percentage of observations in a group of observations falls. In a distribution of data observations, the 70th percentile is the value below which 70% of the observations lie. ###Code %matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

vals = np.random.normal(100, 10, 10000)
plt.hist(vals, 50)
plt.show()
np.percentile(vals, 50) ###Output _____no_output_____ ###Markdown The above value gives the value below which 50% of the values exist. As we generated a normal distribution, this value should be close to the mean, i.e. 100. ###Code np.percentile(vals, 90) ###Output _____no_output_____ ###Markdown The above value gives the value below which 90% of the values exist. ###Code np.percentile(vals, 20) ###Output _____no_output_____
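###Markdown A small extension of the cells above: np.percentile also accepts a sequence of cut points, so several percentiles can be computed in one call. ###Code # Quartiles and two tail percentiles in a single call.
np.percentile(vals, [25, 50, 75, 90, 99]) ###Output _____no_output_____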
Big-Data-Clusters/CU9/public/content/install/sop040-upgrade-pip.ipynb
###Markdown SOP040 - Upgrade pip in ADS Python sandbox Steps Common functionsDefine helper functions used in this notebook. ###Code # Define `run` function for transient fault handling, hyperlinked suggestions, and scrolling updates on Windows import sys import os import re import platform import shlex import shutil import datetime from subprocess import Popen, PIPE from IPython.display import Markdown retry_hints = {} # Output in stderr known to be transient, therefore automatically retry error_hints = {} # Output in stderr where a known SOP/TSG exists which will be HINTed for further help install_hint = {} # The SOP to help install the executable if it cannot be found def run(cmd, return_output=False, no_output=False, retry_count=0, base64_decode=False, return_as_json=False): """Run shell command, stream stdout, print stderr and optionally return output NOTES: 1. Commands that need this kind of ' quoting on Windows e.g.: kubectl get nodes -o jsonpath={.items[?(@.metadata.annotations.pv-candidate=='data-pool')].metadata.name} Need to actually pass in as '"': kubectl get nodes -o jsonpath={.items[?(@.metadata.annotations.pv-candidate=='"'data-pool'"')].metadata.name} The ' quote approach, although correct when pasting into Windows cmd, will hang at the line: `iter(p.stdout.readline, b'')` The shlex.split call does the right thing for each platform, just use the '"' pattern for a ' """ MAX_RETRIES = 5 output = "" retry = False # When running `azdata sql query` on Windows, replace any \n in """ strings, with " ", otherwise we see: # # ('HY090', '[HY090] [Microsoft][ODBC Driver Manager] Invalid string or buffer length (0) (SQLExecDirectW)') # if platform.system() == "Windows" and cmd.startswith("azdata sql query"): cmd = cmd.replace("\n", " ") # shlex.split is required on bash and for Windows paths with spaces # cmd_actual = shlex.split(cmd) # Store this (i.e. kubectl, python etc.) to support binary context aware error_hints and retries # user_provided_exe_name = cmd_actual[0].lower() # When running python, use the python in the ADS sandbox ({sys.executable}) # if cmd.startswith("python "): cmd_actual[0] = cmd_actual[0].replace("python", sys.executable) # On Mac, when ADS is not launched from terminal, LC_ALL may not be set, which causes pip installs to fail # with: # # UnicodeDecodeError: 'ascii' codec can't decode byte 0xc5 in position 4969: ordinal not in range(128) # # Setting it to a default value of "en_US.UTF-8" enables pip install to complete # if platform.system() == "Darwin" and "LC_ALL" not in os.environ: os.environ["LC_ALL"] = "en_US.UTF-8" # When running `kubectl`, if AZDATA_OPENSHIFT is set, use `oc` # if cmd.startswith("kubectl ") and "AZDATA_OPENSHIFT" in os.environ: cmd_actual[0] = cmd_actual[0].replace("kubectl", "oc") # To aid supportability, determine which binary file will actually be executed on the machine # which_binary = None # Special case for CURL on Windows. The version of CURL in Windows System32 does not work to # get JWT tokens, it returns "(56) Failure when receiving data from the peer". If another instance # of CURL exists on the machine use that one. 
(Unfortunately the curl.exe in System32 is almost # always the first curl.exe in the path, and it can't be uninstalled from System32, so here we # look for the 2nd installation of CURL in the path) if platform.system() == "Windows" and cmd.startswith("curl "): path = os.getenv('PATH') for p in path.split(os.path.pathsep): p = os.path.join(p, "curl.exe") if os.path.exists(p) and os.access(p, os.X_OK): if p.lower().find("system32") == -1: cmd_actual[0] = p which_binary = p break # Find the path based location (shutil.which) of the executable that will be run (and display it to aid supportability), this # seems to be required for .msi installs of azdata.cmd/az.cmd. (otherwise Popen returns FileNotFound) # # NOTE: Bash needs cmd to be the list of the space separated values hence shlex.split. # if which_binary == None: which_binary = shutil.which(cmd_actual[0]) # Display an install HINT, so the user can click on a SOP to install the missing binary # if which_binary == None: print(f"The path used to search for '{cmd_actual[0]}' was:") print(sys.path) if user_provided_exe_name in install_hint and install_hint[user_provided_exe_name] is not None: display(Markdown(f'HINT: Use [{install_hint[user_provided_exe_name][0]}]({install_hint[user_provided_exe_name][1]}) to resolve this issue.')) raise FileNotFoundError(f"Executable '{cmd_actual[0]}' not found in path (where/which)") else: cmd_actual[0] = which_binary start_time = datetime.datetime.now().replace(microsecond=0) print(f"START: {cmd} @ {start_time} ({datetime.datetime.utcnow().replace(microsecond=0)} UTC)") print(f" using: {which_binary} ({platform.system()} {platform.release()} on {platform.machine()})") print(f" cwd: {os.getcwd()}") # Command-line tools such as CURL and AZDATA HDFS commands output # scrolling progress bars, which causes Jupyter to hang forever, to # workaround this, use no_output=True # # Work around a infinite hang when a notebook generates a non-zero return code, break out, and do not wait # wait = True try: if no_output: p = Popen(cmd_actual) else: p = Popen(cmd_actual, stdout=PIPE, stderr=PIPE, bufsize=1) with p.stdout: for line in iter(p.stdout.readline, b''): line = line.decode() if return_output: output = output + line else: if cmd.startswith("azdata notebook run"): # Hyperlink the .ipynb file regex = re.compile(' "(.*)"\: "(.*)"') match = regex.match(line) if match: if match.group(1).find("HTML") != -1: display(Markdown(f' - "{match.group(1)}": "{match.group(2)}"')) else: display(Markdown(f' - "{match.group(1)}": "[{match.group(2)}]({match.group(2)})"')) wait = False break # otherwise infinite hang, have not worked out why yet. else: print(line, end='') if wait: p.wait() except FileNotFoundError as e: if install_hint is not None: display(Markdown(f'HINT: Use {install_hint} to resolve this issue.')) raise FileNotFoundError(f"Executable '{cmd_actual[0]}' not found in path (where/which)") from e exit_code_workaround = 0 # WORKAROUND: azdata hangs on exception from notebook on p.wait() if not no_output: for line in iter(p.stderr.readline, b''): try: line_decoded = line.decode() except UnicodeDecodeError: # NOTE: Sometimes we get characters back that cannot be decoded(), e.g. 
# # \xa0 # # For example see this in the response from `az group create`: # # ERROR: Get Token request returned http error: 400 and server # response: {"error":"invalid_grant",# "error_description":"AADSTS700082: # The refresh token has expired due to inactivity.\xa0The token was # issued on 2018-10-25T23:35:11.9832872Z # # which generates the exception: # # UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa0 in position 179: invalid start byte # print("WARNING: Unable to decode stderr line, printing raw bytes:") print(line) line_decoded = "" pass else: # azdata emits a single empty line to stderr when doing an hdfs cp, don't # print this empty "ERR:" as it confuses. # if line_decoded == "": continue print(f"STDERR: {line_decoded}", end='') if line_decoded.startswith("An exception has occurred") or line_decoded.startswith("ERROR: An error occurred while executing the following cell"): exit_code_workaround = 1 # inject HINTs to next TSG/SOP based on output in stderr # if user_provided_exe_name in error_hints: for error_hint in error_hints[user_provided_exe_name]: if line_decoded.find(error_hint[0]) != -1: display(Markdown(f'HINT: Use [{error_hint[1]}]({error_hint[2]}) to resolve this issue.')) # Verify if a transient error, if so automatically retry (recursive) # if user_provided_exe_name in retry_hints: for retry_hint in retry_hints[user_provided_exe_name]: if line_decoded.find(retry_hint) != -1: if retry_count < MAX_RETRIES: print(f"RETRY: {retry_count} (due to: {retry_hint})") retry_count = retry_count + 1 output = run(cmd, return_output=return_output, retry_count=retry_count) if return_output: if base64_decode: import base64 return base64.b64decode(output).decode('utf-8') else: return output elapsed = datetime.datetime.now().replace(microsecond=0) - start_time # WORKAROUND: We avoid infinite hang above in the `azdata notebook run` failure case, by inferring success (from stdout output), so # don't wait here, if success known above # if wait: if p.returncode != 0: raise SystemExit(f'Shell command:\n\n\t{cmd} ({elapsed}s elapsed)\n\nreturned non-zero exit code: {str(p.returncode)}.\n') else: if exit_code_workaround !=0 : raise SystemExit(f'Shell command:\n\n\t{cmd} ({elapsed}s elapsed)\n\nreturned non-zero exit code: {str(exit_code_workaround)}.\n') print(f'\nSUCCESS: {elapsed}s elapsed.\n') if return_output: if base64_decode: import base64 return base64.b64decode(output).decode('utf-8') else: return output # Hints for tool retry (on transient fault), known errors and install guide # retry_hints = {'kubectl': ['A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', ], 'python': [ ], 'azdata': ['Endpoint sql-server-master does not exist', 'Endpoint livy does not exist', 'Failed to get state for cluster', 'Endpoint webhdfs does not exist', 'Adaptive Server is unavailable or does not exist', 'Error: Address already in use', 'Login timeout expired (0) (SQLDriverConnect)', 'SSPI Provider: No Kerberos credentials available', ], } error_hints = {'kubectl': [['no such host', 'TSG010 - Get configuration contexts', '../monitor-k8s/tsg010-get-kubernetes-contexts.ipynb'], ['No connection could be made because the target machine actively refused it', 'TSG056 - Kubectl fails with No connection could be made because the target machine actively refused it', '../repair/tsg056-kubectl-no-connection-could-be-made.ipynb'], ], 'python': [['Library not loaded: 
/usr/local/opt/unixodbc', 'SOP012 - Install unixodbc for Mac', '../install/sop012-brew-install-odbc-for-sql-server.ipynb'], ['WARNING: You are using pip version', 'SOP040 - Upgrade pip in ADS Python sandbox', '../install/sop040-upgrade-pip.ipynb'], ], 'azdata': [['Please run \'azdata login\' to first authenticate', 'SOP028 - azdata login', '../common/sop028-azdata-login.ipynb'], ['The token is expired', 'SOP028 - azdata login', '../common/sop028-azdata-login.ipynb'], ['Reason: Unauthorized', 'SOP028 - azdata login', '../common/sop028-azdata-login.ipynb'], ['Max retries exceeded with url: /api/v1/bdc/endpoints', 'SOP028 - azdata login', '../common/sop028-azdata-login.ipynb'], ['Look at the controller logs for more details', 'TSG027 - Observe cluster deployment', '../diagnose/tsg027-observe-bdc-create.ipynb'], ['provided port is already allocated', 'TSG062 - Get tail of all previous container logs for pods in BDC namespace', '../log-files/tsg062-tail-bdc-previous-container-logs.ipynb'], ['Create cluster failed since the existing namespace', 'SOP061 - Delete a big data cluster', '../install/sop061-delete-bdc.ipynb'], ['Failed to complete kube config setup', 'TSG067 - Failed to complete kube config setup', '../repair/tsg067-failed-to-complete-kube-config-setup.ipynb'], ['Data source name not found and no default driver specified', 'SOP069 - Install ODBC for SQL Server', '../install/sop069-install-odbc-driver-for-sql-server.ipynb'], ['Can\'t open lib \'ODBC Driver 17 for SQL Server', 'SOP069 - Install ODBC for SQL Server', '../install/sop069-install-odbc-driver-for-sql-server.ipynb'], ['Control plane upgrade failed. Failed to upgrade controller.', 'TSG108 - View the controller upgrade config map', '../diagnose/tsg108-controller-failed-to-upgrade.ipynb'], ['NameError: name \'azdata_login_secret_name\' is not defined', 'SOP013 - Create secret for azdata login (inside cluster)', '../common/sop013-create-secret-for-azdata-login.ipynb'], ['ERROR: No credentials were supplied, or the credentials were unavailable or inaccessible.', 'TSG124 - \'No credentials were supplied\' error from azdata login', '../repair/tsg124-no-credentials-were-supplied.ipynb'], ['Please accept the license terms to use this product through', 'TSG126 - azdata fails with \'accept the license terms to use this product\'', '../repair/tsg126-accept-license-terms.ipynb'], ], } install_hint = {'kubectl': [ 'SOP036 - Install kubectl command line interface', '../install/sop036-install-kubectl.ipynb' ], 'azdata': [ 'SOP063 - Install azdata CLI (using package manager)', '../install/sop063-packman-install-azdata.ipynb' ], } print('Common functions defined successfully.') ###Output _____no_output_____ ###Markdown Upgrade pip ###Code import sys run(f'python -m pip install --upgrade pip==20.2.4') print("Notebook execution is complete.") ###Output _____no_output_____
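###Markdown An optional verification step (an addition, reusing the `run` helper defined above, which substitutes the ADS sandbox Python for the `python` prefix) to confirm the sandbox now reports the pinned pip version. ###Code # Check the pip version now visible to the ADS Python sandbox.
run('python -m pip --version') ###Output _____no_output_____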
i_clustering/K Means Clustering in Python.ipynb
###Markdown K Means Clustering Step 1. Import libraries ###Code import pandas as pd import matplotlib.pyplot as plt import seaborn as sns sns.set_style('whitegrid') ###Output _____no_output_____ ###Markdown Step 2. Load data ###Code contents = pd.read_csv('HackneyData.csv') contents.head() contents.info() ###Output <class 'pandas.core.frame.DataFrame'> RangeIndex: 19 entries, 0 to 18 Data columns (total 3 columns): ward 19 non-null object age 19 non-null float64 income 19 non-null float64 dtypes: float64(2), object(1) memory usage: 536.0+ bytes ###Markdown Step 3. Transform the data for python ###Code contents.columns = ['ward','age','income'] contents.head() contents['income'] = contents['income'].replace('[\£,]','',regex=True).astype(float) contents.head() ###Output _____no_output_____ ###Markdown Step 4. Visualise the data ###Code sns.scatterplot(x='age',y='income',data=contents) ###Output _____no_output_____ ###Markdown Step 5. Apply K-means clustering ###Code data = [] for index, row in contents.iterrows(): age = row['age'] income = row['income'] data.append([float(age),float(income)]) data from sklearn.cluster import KMeans km = KMeans(n_clusters=4,verbose=0) X = contents.drop('ward',axis=1).astype(float) y = contents['ward'] km.fit(X) contents['cluster'] = km.labels_.astype(int) ###Output _____no_output_____ ###Markdown Step 6. Visualise the clusters ###Code sns.scatterplot(x='age',y='income',hue='cluster',data=contents) groups = contents.groupby(by='cluster') # plot the clusters # cmd forward slash to comment uncomment # fig, ax = plt.subplots() for name, group in groups: print('---') print(group) groups = contents.groupby(by='cluster') # plot the clusters # cmd forward slash to comment uncomment fig, ax = plt.subplots() for name, group in groups: ax.plot(group['age'],group['income'],marker='o',ls='',label=name) plt.xlabel('Age') plt.ylabel('Income') ax.legend() ###Output _____no_output_____ ###Markdown Step 7. Doesn't seem quite right... the data isn't scaled properly! ###Code from sklearn.preprocessing import scale km2 = KMeans(n_clusters=4,verbose=0) km2.fit(scale(X)) contents['cluster2'] = km2.labels_.astype(int) groups = contents.groupby(by='cluster2') # plot the clusters # cmd forward slash to comment uncomment fig, ax = plt.subplots() for name, group in groups: ax.plot(group['age'],group['income'],marker='o',ls='',label=name) plt.xlabel('Age') plt.ylabel('Income') ax.legend() sns.scatterplot(x='age',y='income',data=contents,hue='cluster2') ###Output _____no_output_____ ###Markdown Step 8. Use Elbow test to decide number of clusters ###Code num_clusters = list(range(1,10)) kmeans = [KMeans(n_clusters=i) for i in num_clusters] score = [kmeans[i-1].fit(scale(X)).score(scale(X)) for i in num_clusters] plt.plot(num_clusters,score) plt.xlabel('Number of clusters') plt.ylabel('Score') plt.title('Elbow Curve Analysis') plt.show() contents.head() for index, row in contents.sort_values(by=['cluster2','ward']).iterrows(): print('---') print(row['ward'],row['cluster2']) dir() dir(X) import antigravity ###Output _____no_output_____
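###Markdown A hedged alternative to calling `scale()` by hand (an addition): a scikit-learn Pipeline keeps the scaling and clustering steps together, so any future data is transformed consistently before prediction. The `cluster3` column name is illustrative. ###Code from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Bundle standardization and clustering into one estimator.
pipe = make_pipeline(StandardScaler(), KMeans(n_clusters=4))
pipe.fit(X)
contents['cluster3'] = pipe.named_steps['kmeans'].labels_
contents.head() ###Output _____no_output_____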
Descriptive-statistics-length-PS.ipynb
###Markdown Copyright 2021 Andrew M. Olney and made available under [CC BY-SA](https://creativecommons.org/licenses/by-sa/4.0) for text and [Apache-2.0](http://www.apache.org/licenses/LICENSE-2.0) for code. Descriptive statistics length-based metrics: Problem solvingIn this session, we'll work with medication pamphlet data from the [Patient Information Leaflet (PIL) Corpus](http://mcs.open.ac.uk/nlg/old_projects/pills/corpus/PIL/)Our question of interest is whether medication instructions could have an effect on medication overdoses (suppose a colleague has the overdose data).We will calculate [Flesch Kincaid Grade Level (FKGL)](https://en.wikipedia.org/wiki/Flesch%E2%80%93Kincaid_readability_tests) as a metric of text difficulty, under the hypothesis that if the medication instructions are too difficult, patients will be less likely to understand them and overdose.In theory, this corpus is available from NLTK, but [it has been broken for some time](https://github.com/nltk/nltk/issues/1851).Therefore we have included the first 50 files of the corpus in the `datasets/pil` folder. Load the dataStart by importing `nltk` and importing `reader` from `nltk.corpus`. ###Code import nltk as nltk from nltk.corpus import reader #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="]HJ?tF]9lV9p9(|h{fmK">nltk</variable><variable id="c$0G$;i#PH-Rh{N@jV}`">reader</variable></variables><block type="importAs" id="aN-,nCP!gVNy`aijN.:*" x="3" y="126"><field name="libraryName">nltk</field><field name="libraryAlias" id="]HJ?tF]9lV9p9(|h{fmK">nltk</field><next><block type="importFrom" id="`p5.6(Uv@:/Rqfl0Ndp!"><field name="libraryName">nltk.corpus</field><field name="libraryAlias" id="c$0G$;i#PH-Rh{N@jV}`">reader</field></block></next></block></xml> ###Output _____no_output_____ ###Markdown With `reader` create `PlaintextCorpusReader` using a list containing `"datasets/pil"` and `".*"`, then store the result in a new variable `pil`.*Note: We haven't loaded our own corpus before, but it's pretty easy with NLTK - just give it the name of the folder with the text files and a regular expression pattern (in our case, a wildcard, `.*`). This will use the default word/sentence tokenizers, but you can override these, use a different `CorpusReader`, or implement your own.* ###Code pil = reader.PlaintextCorpusReader('datasets/pil', '.*') #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="jS$TX:+[N5`f|}~n(My(">pil</variable><variable id="c$0G$;i#PH-Rh{N@jV}`">reader</variable></variables><block type="variables_set" id="EeAkbuz6YwB^hQdQhcUw" x="54" y="256"><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><value name="VALUE"><block type="varCreateObject" id="B@hsB@Upeucpk|sNF0^n"><field name="VAR" id="c$0G$;i#PH-Rh{N@jV}`">reader</field><field name="MEMBER">PlaintextCorpusReader</field><data>reader:PlaintextCorpusReader</data><value name="INPUT"><block type="lists_create_with" id="IPJ8Dvf)InkJrOSj!;xc"><mutation items="2"></mutation><value name="ADD0"><block type="text" id="J8H~r:b,elkI,?{v(WZy"><field name="TEXT">datasets/pil</field></block></value><value name="ADD1"><block type="text" id="7}`avk29E@uM$xO18M?P"><field name="TEXT">.*</field></block></value></block></value></block></value></block></xml> ###Output _____no_output_____ ###Markdown You can now use `pil` like `gutenberg` in the worked example notebook for this module. Calculate length-based metrics Word lengthsGet the word lengths for each text in the corpus and store in `wordLengths`. 
###Code wordLengths = [(len(pil.words(i))) for i in (pil.fileids())] wordLengths #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="nb}L;_W{,H)*Jc!qq]@S">wordLengths</variable><variable id="LZ#}.J~9XYczA[nu4?|Q">i</variable><variable id="jS$TX:+[N5`f|}~n(My(">pil</variable></variables><block type="variables_set" id="SKZpRLP{wcl/g*{^W3WV" x="4" y="319"><field name="VAR" id="nb}L;_W{,H)*Jc!qq]@S">wordLengths</field><value name="VALUE"><block type="lists_create_with" id="8eO0[B%0~mtEmLX:+=dw"><mutation items="1"></mutation><value name="ADD0"><block type="comprehensionForEach" id="qHXxT3|WBM:kMw~muDTN"><field name="VAR" id="LZ#}.J~9XYczA[nu4?|Q">i</field><value name="LIST"><block type="varDoMethod" id="tLR@!_zful,@toy1e3E("><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><field name="MEMBER">fileids</field><data>pil:fileids</data></block></value><value name="YIELD"><block type="lists_length" id="b5(0SiwR87=9]SU8IBvy"><value name="VALUE"><block type="varDoMethod" id="#IalxaHkKH5=q1de@8Ar"><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><field name="MEMBER">words</field><data>pil:words</data><value name="INPUT"><block type="variables_get" id="u[Lug07Y.B({Q]G-6du|"><field name="VAR" id="LZ#}.J~9XYczA[nu4?|Q">i</field></block></value></block></value></block></value></block></value></block></value></block><block type="variables_get" id="aJWMl/VaGmo=-c|`$nxp" x="8" y="436"><field name="VAR" id="nb}L;_W{,H)*Jc!qq]@S">wordLengths</field></block></xml> ###Output _____no_output_____ ###Markdown Sentence lengthsGet the sentence lengths for each text in the corpus and store in `sentenceLengths`. ###Code sentenceLengths = [(len(pil.sents(i))) for i in (pil.fileids())] sentenceLengths #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="$OD[:+1843Cn0O3j8JiE">sentenceLengths</variable><variable id="LZ#}.J~9XYczA[nu4?|Q">i</variable><variable id="jS$TX:+[N5`f|}~n(My(">pil</variable></variables><block type="variables_set" id="SKZpRLP{wcl/g*{^W3WV" x="4" y="319"><field name="VAR" id="$OD[:+1843Cn0O3j8JiE">sentenceLengths</field><value name="VALUE"><block type="lists_create_with" id="8eO0[B%0~mtEmLX:+=dw"><mutation items="1"></mutation><value name="ADD0"><block type="comprehensionForEach" id="qHXxT3|WBM:kMw~muDTN"><field name="VAR" id="LZ#}.J~9XYczA[nu4?|Q">i</field><value name="LIST"><block type="varDoMethod" id="tLR@!_zful,@toy1e3E("><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><field name="MEMBER">fileids</field><data>pil:fileids</data></block></value><value name="YIELD"><block type="lists_length" id="b5(0SiwR87=9]SU8IBvy"><value name="VALUE"><block type="varDoMethod" id="#IalxaHkKH5=q1de@8Ar"><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><field name="MEMBER">sents</field><data>pil:sents</data><value name="INPUT"><block type="variables_get" id="u[Lug07Y.B({Q]G-6du|"><field name="VAR" id="LZ#}.J~9XYczA[nu4?|Q">i</field></block></value></block></value></block></value></block></value></block></value></block><block type="variables_get" id="aJWMl/VaGmo=-c|`$nxp" x="8" y="436"><field name="VAR" id="$OD[:+1843Cn0O3j8JiE">sentenceLengths</field></block></xml> ###Output _____no_output_____ ###Markdown Collect in dataframeMake a `dataframe` with `fileids`, `wordLengths`, and `sentenceLengths`Start by importing `pandas`. 
###Code import pandas as pd #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="_V`RIwppcpbRKT:m6^qH">pd</variable></variables><block type="importAs" id="Gy5)p-`[BHUUWE}k1DeL" x="16" y="10"><field name="libraryName">pandas</field><field name="libraryAlias" id="_V`RIwppcpbRKT:m6^qH">pd</field></block></xml> ###Output _____no_output_____ ###Markdown Now make the `dataframe`. ###Code dataframe = pd.DataFrame(zip(pil.fileids(), wordLengths, sentenceLengths), columns=['corpus','words','sentences']) dataframe #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="d*P53^Ni!VyA[RubgfYr">dataframe</variable><variable id="_V`RIwppcpbRKT:m6^qH">pd</variable><variable id="jS$TX:+[N5`f|}~n(My(">pil</variable><variable id="nb}L;_W{,H)*Jc!qq]@S">wordLengths</variable><variable id="$OD[:+1843Cn0O3j8JiE">sentenceLengths</variable></variables><block type="variables_set" id="Nnj3jwtJa6+~cV1=RDn_" x="-149" y="237"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field><value name="VALUE"><block type="varCreateObject" id="*C0V~q-fS]kDUMRq`O-N"><field name="VAR" id="_V`RIwppcpbRKT:m6^qH">pd</field><field name="MEMBER">DataFrame</field><data>pd:DataFrame</data><value name="INPUT"><block type="lists_create_with" id="DoM~-@qI)6TgbDc;vBMb"><mutation items="2"></mutation><value name="ADD0"><block type="zipBlock" id="nv7/65]-+;=,M.B)yU%U"><value name="x"><block type="lists_create_with" id="@PN$;KCRy[Jv;QJ+d#($"><mutation items="3"></mutation><value name="ADD0"><block type="varDoMethod" id="FJ#DraHg!(/g)_[-^Dn]"><field name="VAR" id="jS$TX:+[N5`f|}~n(My(">pil</field><field name="MEMBER">fileids</field><data>pil:fileids</data></block></value><value name="ADD1"><block type="variables_get" id="1UbO_xg6I)qqO#p=]Trh"><field name="VAR" id="nb}L;_W{,H)*Jc!qq]@S">wordLengths</field></block></value><value name="ADD2"><block type="variables_get" id="%~eQRKbkLX{by+raZ$LQ"><field name="VAR" id="$OD[:+1843Cn0O3j8JiE">sentenceLengths</field></block></value></block></value></block></value><value name="ADD1"><block type="dummyOutputCodeBlock" id="o35uqta54?^UVk|[,.|("><field name="CODE">columns=['corpus','words','sentences']</field></block></value></block></value></block></value></block><block type="variables_get" id="(gZ^x=Q!@}~:|xjc+ZXy" x="-133" y="397"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field></block></xml> ###Output _____no_output_____ ###Markdown ----------------------**QUESTION:**Do you see any problems? **ANSWER: (click here to edit)***Action_Asthma.txt has one word and one sentence. 
It should be disregarded in the analysis.*---------------------- ReadabilityCalculate FKGL for each text using this formula and add a new FKGL column to the `dataframe`:\begin{equation*}0.39 \left( \frac{\mbox{total words}}{\mbox{total sentences}} \right) +11.8 \left( \frac{\mbox{total syllables}}{\mbox{total words}} \right) - 15.59\end{equation*}*Note assume 1.5 syllables per word.* ###Code dataframe = dataframe.assign(fkgl= ((0.39 * (dataframe['words'] / dataframe['sentences']) + 11.8 * 1.5) - 15.59)) dataframe #<xml xmlns="https://developers.google.com/blockly/xml"><variables><variable id="d*P53^Ni!VyA[RubgfYr">dataframe</variable></variables><block type="variables_set" id="d3ELqhcW@UA^R%PLy3cV" x="-83" y="308"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field><value name="VALUE"><block type="varDoMethod" id="!?g`gMP):imN8F)T,@V|"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field><field name="MEMBER">assign</field><data>dataframe:assign</data><value name="INPUT"><block type="valueOutputCodeBlock" id="k^_3ol/yS5*#Dha3rPVq"><field name="CODE">fkgl=</field><value name="INPUT"><block type="math_arithmetic" id="Md5u*{OHNtri0AjDj],6"><field name="OP">MINUS</field><value name="A"><shadow type="math_number" id="oOOHvHpa1xT0^Jls*1X+"><field name="NUM">0.39</field></shadow><block type="math_arithmetic" id="QGJQ@m3mx5!l=,g9-R+d"><field name="OP">ADD</field><value name="A"><shadow type="math_number" id="X]w:+`6NY)?+=r#3v?Aj"><field name="NUM">0.39</field></shadow><block type="math_arithmetic" id=",gsasO{}sRYc]60t@#ud"><field name="OP">MULTIPLY</field><value name="A"><shadow type="math_number" id="^KWa~qEgh3Q5R?==H)+9"><field name="NUM">0.39</field></shadow></value><value name="B"><shadow type="math_number" id="?4DQu[d7x;oy^4iY|Y3("><field name="NUM">1</field></shadow><block type="math_arithmetic" id="b1}M?uM#pV~iqEz`%Dh~"><field name="OP">DIVIDE</field><value name="A"><shadow type="math_number"><field name="NUM">1</field></shadow><block type="indexer" id="/t1A;EYsDfw+BYK-f{P8"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field><value name="INDEX"><block type="text" id="/`cSF)tsx%3]HefEnKlV"><field name="TEXT">words</field></block></value></block></value><value name="B"><shadow type="math_number"><field name="NUM">1</field></shadow><block type="indexer" id="9_HN7=?;UGbK`ZFpBD)n"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field><value name="INDEX"><block type="text" id="SDk,ItV,c;N.~+Z%vOH]"><field name="TEXT">sentences</field></block></value></block></value></block></value></block></value><value name="B"><shadow type="math_number" id="osy!WzPZ^Ac.Li?,M]Sq"><field name="NUM">1</field></shadow><block type="math_arithmetic" id="pns2jEFkEFV2potV{wzJ"><field name="OP">MULTIPLY</field><value name="A"><shadow type="math_number" id="~z95el]}Uct`QV~*3@LC"><field name="NUM">11.8</field></shadow></value><value name="B"><shadow type="math_number" id="L,wO%x|x/NAv$%-5KtZc"><field name="NUM">1.5</field></shadow></value></block></value></block></value><value name="B"><shadow type="math_number" id="N6x=21,Sww{nHR%@JF|@"><field name="NUM">15.59</field></shadow></value></block></value></block></value></block></value></block><block type="variables_get" id="F^RD,W:_|bYpb/e8Wq3b" x="-90" y="410"><field name="VAR" id="d*P53^Ni!VyA[RubgfYr">dataframe</field></block></xml> ###Output _____no_output_____
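###Markdown The 1.5 syllables-per-word figure above is only an assumption. A rough refinement (an addition, not part of the worksheet) estimates syllables with a crude vowel-group heuristic; the `count_syllables` helper and `fkgl2` column are illustrative names. ###Code import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels (including y), minimum 1.
    return max(1, len(re.findall(r'[aeiouy]+', word.lower())))

# Total estimated syllables per document, in fileid order (same order as dataframe rows).
syllables = [sum(count_syllables(w) for w in pil.words(f)) for f in pil.fileids()]
dataframe = dataframe.assign(fkgl2=0.39 * (dataframe['words'] / dataframe['sentences'])
                             + 11.8 * (pd.Series(syllables) / dataframe['words']) - 15.59)
dataframe ###Output _____no_output_____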
Analise_de_dados/data-wrangling.ipynb
###Markdown Data Analysis with Python Data Wrangling Welcome!By the end of this notebook, you will have learned the basics of Data Wrangling! Table of content Identify and handle missing values Identify missing values Deal with missing values Correct data format Data standardization Data Normalization (centering/scaling) Binning Indicator variable Estimated Time Needed: 30 min What is the purpose of Data Wrangling? Data Wrangling is the process of converting data from the initial format to a format that may be better for analysis. What is the fuel consumption (L/100km) rate for the diesel car? Import dataYou can find the "Automobile Data Set" from the following link: https://archive.ics.uci.edu/ml/machine-learning-databases/autos/imports-85.data. We will be using this data set throughout this course. Import pandas ###Code import pandas as pd
import matplotlib.pylab as plt ###Output _____no_output_____ ###Markdown Reading the data set from the URL and adding the related headers. URL of the dataset This dataset was hosted on IBM Cloud Object Storage. ###Code filename = "https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/DA0101EN/auto.csv" ###Output _____no_output_____ ###Markdown Python list headers containing name of headers ###Code headers = ["symboling","normalized-losses","make","fuel-type","aspiration", "num-of-doors","body-style",
         "drive-wheels","engine-location","wheel-base", "length","width","height","curb-weight","engine-type",
         "num-of-cylinders", "engine-size","fuel-system","bore","stroke","compression-ratio","horsepower",
         "peak-rpm","city-mpg","highway-mpg","price"] ###Output _____no_output_____ ###Markdown Use the Pandas method read_csv() to load the data from the web address. Set the parameter "names" equal to the Python list "headers". ###Code df = pd.read_csv(filename, names = headers) ###Output _____no_output_____ ###Markdown Use the method head() to display the first five rows of the dataframe. ###Code # To see what the data set looks like, we'll use the head() method.
df.head() ###Output _____no_output_____ ###Markdown As we can see, several question marks appeared in the dataframe; those are missing values which may hinder our further analysis. So, how do we identify all those missing values and deal with them? How to work with missing data? Steps for working with missing data: identify missing data, deal with missing data, correct data format Identify and handle missing valuesIdentify missing valuesConvert "?" to NaNIn the car dataset, missing data comes with the question mark "?". We replace "?" with NaN (Not a Number), which is Python's default missing value marker, for reasons of computational speed and convenience. Here we use the function: .replace(A, B, inplace = True) to replace A by B ###Code import numpy as np

# replace "?" with NaN
df.replace("?", np.nan, inplace = True)
df.head(5) ###Output _____no_output_____ ###Markdown Evaluating for Missing DataThe missing values are converted to Python's default. We use Python's built-in functions to identify these missing values. There are two methods to detect missing data: .isnull() .notnull()The output is a boolean value indicating whether the value that is passed into the argument is in fact missing data. ###Code missing_data = df.isnull()
missing_data.head(5) ###Output _____no_output_____
Count missing values in each columnUsing a for loop in Python, we can quickly figure out the number of missing values in each column. As mentioned above, "True" represents a missing value, "False" means the value is present in the dataset. In the body of the for loop the method ".value_counts()" counts the number of "True" values. ###Code for column in missing_data.columns.values.tolist(): print(column) print (missing_data[column].value_counts()) print("") ###Output symboling False 205 Name: symboling, dtype: int64 normalized-losses False 164 True 41 Name: normalized-losses, dtype: int64 make False 205 Name: make, dtype: int64 fuel-type False 205 Name: fuel-type, dtype: int64 aspiration False 205 Name: aspiration, dtype: int64 num-of-doors False 203 True 2 Name: num-of-doors, dtype: int64 body-style False 205 Name: body-style, dtype: int64 drive-wheels False 205 Name: drive-wheels, dtype: int64 engine-location False 205 Name: engine-location, dtype: int64 wheel-base False 205 Name: wheel-base, dtype: int64 length False 205 Name: length, dtype: int64 width False 205 Name: width, dtype: int64 height False 205 Name: height, dtype: int64 curb-weight False 205 Name: curb-weight, dtype: int64 engine-type False 205 Name: engine-type, dtype: int64 num-of-cylinders False 205 Name: num-of-cylinders, dtype: int64 engine-size False 205 Name: engine-size, dtype: int64 fuel-system False 205 Name: fuel-system, dtype: int64 bore False 201 True 4 Name: bore, dtype: int64 stroke False 201 True 4 Name: stroke, dtype: int64 compression-ratio False 205 Name: compression-ratio, dtype: int64 horsepower False 203 True 2 Name: horsepower, dtype: int64 peak-rpm False 203 True 2 Name: peak-rpm, dtype: int64 city-mpg False 205 Name: city-mpg, dtype: int64 highway-mpg False 205 Name: highway-mpg, dtype: int64 price False 201 True 4 Name: price, dtype: int64 ###Markdown Based on the summary above, each column has 205 rows of data, seven columns containing missing data: "normalized-losses": 41 missing data "num-of-doors": 2 missing data "bore": 4 missing data "stroke" : 4 missing data "horsepower": 2 missing data "peak-rpm": 2 missing data "price": 4 missing data Deal with missing dataHow to deal with missing data? drop data a. drop the whole row b. drop the whole column replace data a. replace it by mean b. replace it by frequency c. replace it based on other functions Whole columns should be dropped only if most entries in the column are empty. In our dataset, none of the columns are empty enough to drop entirely.We have some freedom in choosing which method to replace data; however, some methods may seem more reasonable than others. We will apply each method to many different columns:Replace by mean: "normalized-losses": 41 missing data, replace them with mean "stroke": 4 missing data, replace them with mean "bore": 4 missing data, replace them with mean "horsepower": 2 missing data, replace them with mean "peak-rpm": 2 missing data, replace them with meanReplace by frequency: "num-of-doors": 2 missing data, replace them with "four". Reason: 84% sedans is four doors. Since four doors is most frequent, it is most likely to occur Drop the whole row: "price": 4 missing data, simply delete the whole row Reason: price is what we want to predict. 
Any data entry without price data cannot be used for prediction; therefore any row now without price data is not useful to us Calculate the average of the column ###Code avg_norm_loss = df["normalized-losses"].astype("float").mean(axis=0) print("Average of normalized-losses:", avg_norm_loss) ###Output Average of normalized-losses: 122.0 ###Markdown Replace "NaN" by mean value in "normalized-losses" column ###Code df["normalized-losses"].replace(np.nan, avg_norm_loss, inplace=True) ###Output _____no_output_____ ###Markdown Calculate the mean value for 'bore' column ###Code avg_bore=df['bore'].astype('float').mean(axis=0) print("Average of bore:", avg_bore) ###Output Average of bore: 3.3297512437810943 ###Markdown Replace NaN by mean value ###Code df["bore"].replace(np.nan, avg_bore, inplace=True) ###Output _____no_output_____ ###Markdown Question 1: According to the example above, replace NaN in "stroke" column by mean. ###Code # Write your code below and press Shift+Enter to execute avg_stroke=df["stroke"].astype('float').mean(axis=0) print("Average Stroke:",avg_stroke) ###Output Average Stroke: 3.255422885572139 ###Markdown Double-click here for the solution.<!-- The answer is below: calculate the mean vaule for "stroke" columnavg_stroke = df["stroke"].astype("float").mean(axis = 0)print("Average of stroke:", avg_stroke) replace NaN by mean value in "stroke" columndf["stroke"].replace(np.nan, avg_stroke, inplace = True)--> Calculate the mean value for the 'horsepower' column: ###Code avg_horsepower = df['horsepower'].astype('float').mean(axis=0) print("Average horsepower:", avg_horsepower) ###Output Average horsepower: 104.25615763546799 ###Markdown Replace "NaN" by mean value: ###Code df['horsepower'].replace(np.nan, avg_horsepower, inplace=True) ###Output _____no_output_____ ###Markdown Calculate the mean value for 'peak-rpm' column: ###Code avg_peakrpm=df['peak-rpm'].astype('float').mean(axis=0) print("Average peak rpm:", avg_peakrpm) ###Output Average peak rpm: 5125.369458128079 ###Markdown Replace NaN by mean value: ###Code df['peak-rpm'].replace(np.nan, avg_peakrpm, inplace=True) ###Output _____no_output_____ ###Markdown To see which values are present in a particular column, we can use the ".value_counts()" method: ###Code df['num-of-doors'].value_counts() ###Output _____no_output_____ ###Markdown We can see that four doors are the most common type. We can also use the ".idxmax()" method to calculate for us the most common type automatically: ###Code df['num-of-doors'].value_counts().idxmax() ###Output _____no_output_____ ###Markdown The replacement procedure is very similar to what we have seen previously ###Code #replace the missing 'num-of-doors' values by the most frequent df["num-of-doors"].replace(np.nan, "four", inplace=True) ###Output _____no_output_____ ###Markdown Finally, let's drop all rows that do not have price data: ###Code # simply drop whole row with NaN in "price" column df.dropna(subset=["price"], axis=0, inplace=True) # reset index, because we droped two rows df.reset_index(drop=True, inplace=True) df.head() ###Output _____no_output_____ ###Markdown Good! Now, we obtain the dataset with no missing values. 
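As a quick sanity check (a sketch using only what is already defined), the total number of remaining NaNs should now be zero: ###Code # the double sum collapses per-column NaN counts into one grand total
print(df.isnull().sum().sum()) ###Output _____no_output_____ ###Markdown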
Correct data format We are almost there! The last step in data cleaning is checking and making sure that all data is in the correct format (int, float, text or other). In Pandas, we use .dtypes to check the data types and .astype() to change a data type. Let's list the data types for each column ###Code df.dtypes ###Output _____no_output_____ ###Markdown As we can see above, some columns are not of the correct data type. Numerical variables should have type 'float' or 'int', and variables with strings such as categories should have type 'object'. For example, 'bore' and 'stroke' variables are numerical values that describe the engines, so we should expect them to be of the type 'float' or 'int'; however, they are shown as type 'object'. We have to convert data types into a proper format for each column using the "astype()" method. Convert data types to the proper format ###Code df[["bore", "stroke"]] = df[["bore", "stroke"]].astype("float")
df[["normalized-losses"]] = df[["normalized-losses"]].astype("int")
df[["price"]] = df[["price"]].astype("float")
df[["peak-rpm"]] = df[["peak-rpm"]].astype("float") ###Output _____no_output_____ ###Markdown Let us list the columns after the conversion ###Code df.dtypes ###Output _____no_output_____ ###Markdown Wonderful! Now, we finally obtain the cleaned dataset with no missing values and all data in its proper format. Data Standardization Data is usually collected from different agencies in different formats. (Data standardization is also a term for a particular type of data normalization, where we subtract the mean and divide by the standard deviation.) What is standardization? Standardization is the process of transforming data into a common format which allows the researcher to make meaningful comparisons. Example: transform mpg to L/100km. In our dataset, the fuel consumption columns "city-mpg" and "highway-mpg" are represented in mpg (miles per gallon) units. Assume we are developing an application in a country that accepts fuel consumption in the L/100km standard. We will need to apply a data transformation to convert mpg into L/100km. The formula for unit conversion is L/100km = 235 / mpg. We can do many mathematical operations directly in Pandas. ###Code df.head()

# Convert mpg to L/100km by mathematical operation (235 divided by mpg)
df['city-L/100km'] = 235/df["city-mpg"]

# check your transformed data
df.head() ###Output _____no_output_____ ###Markdown Question 2: According to the example above, transform mpg to L/100km in the column "highway-mpg", and change the name of the column to "highway-L/100km". ###Code # Write your code below and press Shift+Enter to execute
# transform mpg to L/100km by mathematical operation (235 divided by mpg)
df["highway-mpg"] = 235/df["highway-mpg"]

# rename the column from "highway-mpg" to "highway-L/100km"
# (the dictionary key must be the bare column name, without extra quotes)
df.rename(columns={'highway-mpg':'highway-L/100km'}, inplace=True)

# check your transformed data
df.head() ###Output _____no_output_____ ###Markdown Double-click here for the solution.<!-- The answer is below:

# transform mpg to L/100km by mathematical operation (235 divided by mpg)
df["highway-mpg"] = 235/df["highway-mpg"]

# rename the column from "highway-mpg" to "highway-L/100km"
df.rename(columns={'highway-mpg':'highway-L/100km'}, inplace=True)

# check your transformed data
df.head()

--> Data Normalization Why normalization? Normalization is the process of transforming values of several variables into a similar range.
Typical normalizations include scaling the variable so its average is 0, scaling the variable so its variance is 1, or scaling the variable so its values range from 0 to 1. Example: to demonstrate normalization, let's say we want to scale the columns "length", "width" and "height". Target: we would like to normalize those variables so their values range from 0 to 1. Approach: replace the original value by (original value)/(maximum value) ###Code # replace (original value) by (original value)/(maximum value)
df['length'] = df['length']/df['length'].max()
df['width'] = df['width']/df['width'].max() ###Output _____no_output_____ ###Markdown Question 3: According to the example above, normalize the column "height". ###Code # Write your code below and press Shift+Enter to execute
df["height"] = df["height"]/df["height"].max()
df[["length","width","height"]].head() ###Output _____no_output_____ ###Markdown Double-click here for the solution.<!-- The answer is below:

df['height'] = df['height']/df['height'].max()

# show the scaled columns
df[["length","width","height"]].head()

--> Here we can see that we've normalized "length", "width" and "height" into the range [0,1]. Binning Why binning? Binning is a process of transforming continuous numerical variables into discrete categorical 'bins' for grouped analysis. Example: in our dataset, "horsepower" is a real-valued variable ranging from 48 to 288; it has 57 unique values. What if we only care about the price difference between cars with high horsepower, medium horsepower, and little horsepower (3 types)? Can we rearrange them into three 'bins' to simplify analysis? We will use the Pandas method 'cut' to segment the 'horsepower' column into 3 bins. Example of Binning Data In Pandas Convert data to the correct format ###Code df["horsepower"]=df["horsepower"].astype(int, copy=True) ###Output _____no_output_____ ###Markdown Let's plot the histogram of horsepower to see what its distribution looks like. ###Code %matplotlib inline
import matplotlib as plt
from matplotlib import pyplot
plt.pyplot.hist(df["horsepower"])

# set x/y labels and plot title
plt.pyplot.xlabel("horsepower")
plt.pyplot.ylabel("count")
plt.pyplot.title("horsepower bins") ###Output _____no_output_____ ###Markdown We would like 3 bins of equal bandwidth, so we use numpy's linspace(start_value, end_value, numbers_generated) function. Since we want to include the minimum value of horsepower, we set start_value = min(df["horsepower"]). Since we want to include the maximum value of horsepower, we set end_value = max(df["horsepower"]). Since we are building 3 bins of equal length, there should be 4 dividers, so numbers_generated = 4. We build a bin array from the minimum value to the maximum value, with the bandwidth calculated above. The bins will be the values used to determine when one bin ends and another begins. ###Code bins = np.linspace(min(df["horsepower"]), max(df["horsepower"]), 4)
bins ###Output _____no_output_____ ###Markdown We set group names: ###Code group_names = ['Low', 'Medium', 'High'] ###Output _____no_output_____ ###Markdown We apply the function "cut" to determine which bin each value of df['horsepower'] belongs to. ###Code df['horsepower-binned'] = pd.cut(df['horsepower'], bins, labels=group_names, include_lowest=True )
df[['horsepower','horsepower-binned']].head(20) ###Output _____no_output_____ ###Markdown Let's see the number of vehicles in each bin.
###Code df["horsepower-binned"].value_counts() ###Output _____no_output_____ ###Markdown Lets plot the distribution of each bin. ###Code %matplotlib inline import matplotlib as plt from matplotlib import pyplot pyplot.bar(group_names, df["horsepower-binned"].value_counts()) # set x/y labels and plot title plt.pyplot.xlabel("horsepower") plt.pyplot.ylabel("count") plt.pyplot.title("horsepower bins") ###Output _____no_output_____ ###Markdown Check the dataframe above carefully, you will find the last column provides the bins for "horsepower" with 3 categories ("Low","Medium" and "High"). We successfully narrow the intervals from 57 to 3! Bins visualizationNormally, a histogram is used to visualize the distribution of bins we created above. ###Code %matplotlib inline import matplotlib as plt from matplotlib import pyplot a = (0,1,2) # draw historgram of attribute "horsepower" with bins = 3 plt.pyplot.hist(df["horsepower"], bins = 3) # set x/y labels and plot title plt.pyplot.xlabel("horsepower") plt.pyplot.ylabel("count") plt.pyplot.title("horsepower bins") ###Output _____no_output_____ ###Markdown The plot above shows the binning result for attribute "horsepower". Indicator variable (or dummy variable)What is an indicator variable? An indicator variable (or dummy variable) is a numerical variable used to label categories. They are called 'dummies' because the numbers themselves don't have inherent meaning. Why we use indicator variables? So we can use categorical variables for regression analysis in the later modules.Example We see the column "fuel-type" has two unique values, "gas" or "diesel". Regression doesn't understand words, only numbers. To use this attribute in regression analysis, we convert "fuel-type" into indicator variables. We will use the panda's method 'get_dummies' to assign numerical values to different categories of fuel type. ###Code df.columns ###Output _____no_output_____ ###Markdown get indicator variables and assign it to data frame "dummy_variable_1" ###Code dummy_variable_1 = pd.get_dummies(df["fuel-type"]) dummy_variable_1.head() ###Output _____no_output_____ ###Markdown change column names for clarity ###Code dummy_variable_1.rename(columns={'fuel-type-diesel':'gas', 'fuel-type-diesel':'diesel'}, inplace=True) dummy_variable_1.head() ###Output _____no_output_____ ###Markdown We now have the value 0 to represent "gas" and 1 to represent "diesel" in the column "fuel-type". We will now insert this column back into our original dataset. ###Code # merge data frame "df" and "dummy_variable_1" df = pd.concat([df, dummy_variable_1], axis=1) # drop original column "fuel-type" from "df" df.drop("fuel-type", axis = 1, inplace=True) df.head() ###Output _____no_output_____ ###Markdown The last two columns are now the indicator variable representation of the fuel-type variable. It's all 0s and 1s now. Question 4: As above, create indicator variable to the column of "aspiration": "std" to 0, while "turbo" to 1. 
###Code # Write your code below and press Shift+Enter to execute dummy_variable_2 = pd.get_dummies(df["aspiration"]) dummy_variable_2.head() dummy_variable_2.rename(columns={'std':"aspiration-std",'turbo':'aspiration-turbo'},inplace=True) dummy_variable_2.head() ###Output _____no_output_____ ###Markdown Double-click here for the solution.<!-- The answer is below: get indicator variables of aspiration and assign it to data frame "dummy_variable_2"dummy_variable_2 = pd.get_dummies(df['aspiration']) change column names for claritydummy_variable_2.rename(columns={'std':'aspiration-std', 'turbo': 'aspiration-turbo'}, inplace=True) show first 5 instances of data frame "dummy_variable_1"dummy_variable_2.head()--> Question 5: Merge the new dataframe to the original dataframe then drop the column 'aspiration' ###Code # Write your code below and press Shift+Enter to execute # merge data frame "df" and "dummy_variable_2" df = pd.concat([df, dummy_variable_2], axis=1) # drop original column "aspiration" from "df" df.drop("aspiration", axis = 1, inplace=True) ###Output _____no_output_____ ###Markdown Double-click here for the solution.<!-- The answer is below:merge the new dataframe to the original dataframdf = pd.concat([df, dummy_variable_2], axis=1) drop original column "aspiration" from "df"df.drop('aspiration', axis = 1, inplace=True)--> save the new csv ###Code df.to_csv('clean_df.csv') ###Output _____no_output_____
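###Markdown A quick round-trip check (a sketch; it assumes the file landed in the current working directory) confirms the cleaned data reloads intact. Note index_col=0, because to_csv also wrote the index: ###Code # reload the cleaned dataset; index_col=0 restores the saved index
pd.read_csv('clean_df.csv', index_col=0).head() ###Output _____no_output_____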
LAB04-Genetic-algorithms/GenetskiAlgoritmi.ipynb
###Markdown Genetski algoritmi ###Code #+ stvori populaciju; #+ evaluiraj populaciju; // svaka jedinka ima pridruženu dobrotu # ponavljaj # odaberi slucajno 3 jedinke; # izbaci najlosiju od 3 odabrane iz populacije; # nova_jedinka = krizanje(preostale dvije); # obavi mutaciju na novoj jedinki uz vjerojatnost p_M; # evaluiraj novu jedinku; # dodaj novu jedinku u populaciju; # dok (nije zadovoljen uvjet zaustavljanja); import numpy as np import pandas as pd import matplotlib.pyplot as plt from math import log, ceil, sin, sqrt from enum import Enum class GenetskiAlgoritam: def __init__(self, prikaz, velicina, dg, gg, dimenzije, p_mutacije, p_krizanja=None, preciznost=0, br_evaluacija_fje_cilja=None, epsilon=1e-6, k=3, ispis=True): self.prikaz = prikaz self.velicina = velicina self.dg = dg self.gg = gg self.dimenzije = dimenzije self.k = k self.p_mutacije = p_mutacije self.p_krizanja = p_krizanja self.preciznost = preciznost self.br_evaluacija_fje_cilja = br_evaluacija_fje_cilja self.epsilon = epsilon self.ispis = ispis self.najbolje = [] def stvori_populaciju(self, func): populacija = [] for i in range(self.velicina): kromosom = Kromosom(self.prikaz, self.dg, self.gg, self.dimenzije, self.preciznost) kromosom.izracunaj_dobrotu(func) populacija.append(kromosom) populacija.sort(key=lambda jedinka: jedinka.dobrota) return populacija def pokreni(self, func, vrsta_krizanja, vrsta_mutacije): populacija = self.stvori_populaciju(func) count = 0 while True: count += 1 populacija.sort(key=lambda kromosom: kromosom.dobrota) if abs(populacija[-1].dobrota) <= self.epsilon: print("Dosegnut je minimum fje cilja."); break if func.br_poziva > self.br_evaluacija_fje_cilja: print("Dosegnut je maksimalan broj evaluacija fje cilja."); break if self.ispis == True and count % 3000 == 0: print("Trenutna najbolja jedinka je: ", populacija[-1]) print("Vrijednost funkcije cilja za nju je: ", populacija[-1].dobrota, "\n") turnir = Turnir(self.k, vrsta_krizanja, vrsta_mutacije) self.najbolje.append(-populacija[-1].dobrota) populacija = turnir.novaPopulacija(func, populacija, self.p_mutacije, self.p_krizanja) populacija.sort(key=lambda jedinka: jedinka.dobrota) print("Najbolja jedinka je: ", populacija[-1]) print("Vrijednost funkcije dobrote za nju je: ", populacija[-1].dobrota, "\n") rez = [] if self.prikaz == Prikaz.binarni: rez.append(populacija[-1].bin_u_dek()) else: rez.append(populacija[-1]) rez.append(populacija[-1].dobrota) return rez class Funkcija: def __init__(self, f): self.f = f self.br_poziva = 0 self.vrijednosti = {} def vrijednost(self, x): self.br_poziva += 1 try: return self.vrijednosti[str(x)] except: self.vrijednosti[str(x)] = self.f(x) return self.vrijednosti[str(x)] def reset(self): self.vrijednosti = {} self.br_poziva = 0 # funkcije def f1(x): return 100 * (x[1] - (x[0])**2)**2 + (1 - x[0])**2 def f3(x): return sum([(x[i] - (i + 1))**2 for i in range(len(x))]) def f6(x): suma = sum([x[i]**2 for i in range(len(x))]) return 0.5 + ((sin(sqrt(suma))**2) - 0.5) / ((1 + 0.001 * suma)**2) def f7(x): suma = sum([x[i]**2 for i in range(len(x))]) return (suma**0.25) * (1 + (sin(50*(suma**0.1))**2)) # - KROMOSOM - class Prikaz(Enum): binarni = 1 realni = 2 class Kromosom(): def __init__(self, prikaz, dg, gg, dimenzije, preciznost=None, vrijednost=None): self.prikaz = prikaz self.dg = dg self.gg = gg self.dimenzije = dimenzije self.preciznost = preciznost if vrijednost is not None: self.vrijednost = vrijednost else: self.vrijednost = self.stvori_kromosom() self.dobrota = None def stvori_kromosom(self): if 
self.prikaz == Prikaz.binarni: kromosom = [] for i in range(self.dimenzije): duljina = ceil(log((self.gg-self.dg)*(10**self.preciznost)+1)/(log(2))) kromosom.append(np.array([np.random.randint(0, 2) for i in range(duljina)])) return np.array(kromosom) else: d = np.full((self.dimenzije, ), self.dg) g = np.full((self.dimenzije, ), self.gg) rand = np.random.rand(self.dimenzije,) return d + rand * (g - d) def izracunaj_dobrotu(self, func): if self.prikaz == Prikaz.realni: self.dobrota = - func.vrijednost(self.vrijednost) else: duljina = ceil(log((self.gg-self.dg)*(10**self.preciznost)+1)/(log(2))) brojevi = [] for i in range(len(self.vrijednost)): b = int(''.join(map(str, self.vrijednost[i])), 2) brojevi.append(self.dg + (b / (2**duljina))*(self.gg - self.dg)) self.dobrota = -func.vrijednost(np.array(brojevi)) def bin_u_dek(self): duljina = ceil(log((self.gg-self.dg)*(10**self.preciznost)+1)/(log(2))) brojevi = [] for i in range(len(self.vrijednost)): b = int(''.join(map(str, self.vrijednost[i])), 2) brojevi.append(self.dg + (b / (2**duljina))*(self.gg - self.dg)) return brojevi def __repr__(self): return str(self.vrijednost) def __str__(self): if self.prikaz == Prikaz.realni: return str(self.vrijednost) else: return str(self.bin_u_dek()) # - MUTACIJA - class VrstaMutacije(Enum): uniformna_b = 1 uniformna_r = 2 gauss = 3 def mutacija(vrsta, func, kromosom, p_mutacije): mutirana_vrijednost = kromosom.vrijednost.copy() for i in range(kromosom.dimenzije): if np.random.rand() <= p_mutacije: # uniformno - binarno if vrsta == VrstaMutacije.uniformna_b: for bit in range(len(mutirana_vrijednost[i])): mutirana_vrijednost[i][bit] = 1 - mutirana_vrijednost[i][bit] # uniformno - realno elif vrsta == VrstaMutacije.uniformna_r: mutirana_vrijednost[i] = kromosom.dg + (np.random.rand() * (kromosom.gg - kromosom.dg)) mutirani_kromosom = Kromosom(kromosom.prikaz, kromosom.dg, kromosom.gg, kromosom.dimenzije, kromosom.preciznost, mutirana_vrijednost) mutirani_kromosom.izracunaj_dobrotu(func) return mutirani_kromosom # - KRIZANJE - def krizanje_uniformno(func, roditelji, p_krizanja): dijete = Kromosom(roditelji[0].prikaz ,roditelji[0].dg, roditelji[0].gg, roditelji[0].dimenzije, roditelji[0].preciznost) prvi = roditelji[0].vrijednost.copy() drugi = roditelji[1].vrijednost.copy() for i in range(roditelji[0].dimenzije): for j in range(len(prvi[i])): if prvi[i][j] == drugi[i][j] or np.random.rand() < 0.5: dijete.vrijednost[i][j] = prvi[i][j] else: dijete.vrijednost[i][j] = drugi[i][j] dijete.izracunaj_dobrotu(func) return dijete def krizanje_jednom_tockom(func, roditelji, p_krizanja): dijete = Kromosom(roditelji[0].prikaz ,roditelji[0].dg, roditelji[0].gg, roditelji[0].dimenzije, roditelji[0].preciznost) prvi = roditelji[0].vrijednost.copy() drugi = roditelji[1].vrijednost.copy() for i in range(roditelji[0].dimenzije): tocka_krizanja = np.random.randint(len(prvi[i])) for j in range(tocka_krizanja): dijete.vrijednost[i][j] = prvi[i][j] for j in range(tocka_krizanja, len(prvi[i])): dijete.vrijednost[i][j] = drugi[i][j] dijete.izracunaj_dobrotu(func) return dijete def krizanje_aritmeticko(func, roditelji, p_krizanja): r = np.random.rand() dijete_vrijednost = r*roditelji[0].vrijednost + (1-r)*roditelji[1].vrijednost dijete = Kromosom(roditelji[0].prikaz, roditelji[0].dg, roditelji[0].gg, roditelji[0].dimenzije, roditelji[0].preciznost, dijete_vrijednost) dijete.izracunaj_dobrotu(func) return dijete def krizanje_heuristicko(func, roditelji, p_krizanja): r = np.random.rand() roditelji.sort(key=lambda roditelj: 
roditelj.dobrota)
    dijete_vrijednost = r * (roditelji[1].vrijednost - roditelji[0].vrijednost) + roditelji[1].vrijednost
    for i in range(len(dijete_vrijednost)):
        if dijete_vrijednost[i] < roditelji[0].dg:
            dijete_vrijednost[i] = roditelji[0].dg
        elif dijete_vrijednost[i] > roditelji[0].gg:
            dijete_vrijednost[i] = roditelji[0].gg
    dijete = Kromosom(roditelji[0].prikaz, roditelji[0].dg, roditelji[0].gg, roditelji[0].dimenzije, roditelji[0].preciznost, dijete_vrijednost)
    dijete.izracunaj_dobrotu(func)
    return dijete

class Turnir:
    def __init__(self, k, krizanje, vrsta_mutacije):
        self.k = k
        self.krizanje = krizanje
        self.vrsta_mutacije = vrsta_mutacije

    def novaPopulacija(self, func, populacija, p_mutacije, p_krizanja=None):
        indeksi_kandidata = self.izaberi_indekse(populacija)
        indeksi_kandidata.sort(key=lambda indeks: populacija[indeks].dobrota, reverse=True)
        # cross the two best tournament candidates; the worst (last index)
        # is replaced by their child below, as the pseudocode at the top requires
        roditelji = [populacija[indeksi_kandidata[0]], populacija[indeksi_kandidata[1]]]
        dijete = self.krizanje(func, roditelji, p_krizanja)
        dijete = mutacija(self.vrsta_mutacije, func, dijete, p_mutacije)
        nova_populacija = populacija
        nova_populacija[indeksi_kandidata[-1]] = dijete
        return nova_populacija

    def izaberi_indekse(self, populacija):
        indeksi = set(np.random.randint(len(populacija), size=self.k))
        while len(indeksi) < self.k:
            indeksi.add(np.random.randint(len(populacija)))
        return list(indeksi)
###Output _____no_output_____ ###Markdown Tasks 1. Try out your GA implementation on all the functions with bounds [-50, 150] for every variable and for both solution representations (binary, floating point). For the binary representation, use a precision of at least three decimal places. For function f3 choose at least 5 variables, and for f6 and f7 two variables. For all functions, you may consider the solution found once the final value of the objective function is below 10^-6. For some functions the algorithm will need to be run several times. What conclusions can you draw about the GA's success on each function? Which functions turned out to be hard, and why?
###Code gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.049, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f1), krizanje_heuristicko ,VrstaMutacije.uniformna_r) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=5, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f3), krizanje_aritmeticko ,VrstaMutacije.uniformna_r) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f6), krizanje_aritmeticko ,VrstaMutacije.uniformna_r) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f7), krizanje_heuristicko ,VrstaMutacije.uniformna_r) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.049, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f1), krizanje_uniformno ,VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=5, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f3), krizanje_uniformno ,VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=5, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f3), krizanje_jednom_tockom ,VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f6), krizanje_uniformno, VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=100000, epsilon=1e-6) rez = gen_alg.pokreni(Funkcija(f7), krizanje_uniformno,VrstaMutacije.uniformna_b) ###Output Dosegnut je minimum fje cilja. Najbolja jedinka je: [0.0, 0.0] Vrijednost funkcije dobrote za nju je: -0.0 ###Markdown 2. Provedite GA na funkcijama f6 i f7 mijenjajući dimenzionalnost funkcije (1, 3, 6, 10). Kakopovećanje dimenzionalnosti funkcije utječe na ponašanje algoritma? ###Code rez_f6 = [] rez_f7 = [] for dim in [1, 3, 6, 10]: gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, br_evaluacija_fje_cilja=10000, epsilon=1e-6, ispis=False) rez_f6.append(gen_alg.pokreni(Funkcija(f6), krizanje_uniformno, VrstaMutacije.uniformna_b)[1]) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, br_evaluacija_fje_cilja=10000, epsilon=1e-6, ispis=False) rez_f7.append(gen_alg.pokreni(Funkcija(f7), krizanje_uniformno, VrstaMutacije.uniformna_b)[1]) pd.DataFrame(np.column_stack((rez_f6, rez_f7)), [1, 3, 6, 10], columns=['dobrota f6', 'dobrota f7']) ###Output Dosegnut je minimum fje cilja. Najbolja jedinka je: [0.0] Vrijednost funkcije dobrote za nju je: -0.0 Dosegnut je minimum fje cilja. Najbolja jedinka je: [0.0] Vrijednost funkcije dobrote za nju je: -0.0 Dosegnut je maksimalan broj evaluacija fje cilja. 
Najbolja jedinka je: [5.46875, -7.03125, -3.125] Vrijednost funkcije dobrote za nju je: -0.07866519909900771 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-0.78125, -0.78125, -1.5625] Vrijednost funkcije dobrote za nju je: -1.575192770722792 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [4.6875, 10.15625, 12.5, 7.8125, -4.6875, -16.40625] Vrijednost funkcije dobrote za nju je: -0.31315554793301337 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-3.90625, -14.84375, -7.8125, -2.34375, -9.375, -2.34375] Vrijednost funkcije dobrote za nju je: -4.603604464214859 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [17.1875, 6.25, -16.40625, -15.625, -0.78125, -3.125, 1.5625, -7.03125, 44.53125, -17.96875] Vrijednost funkcije dobrote za nju je: -0.4733081818285945 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-10.15625, -11.71875, -15.625, -47.65625, 10.15625, 26.5625, 29.6875, 8.59375, -7.8125, -7.8125] Vrijednost funkcije dobrote za nju je: -8.302517913277764 ###Markdown 3. Za funkcije f6 i f7 usporedite učinkovitost GA koji koristi binarni prikaz uz preciznost na 4 decimale(tj. 10-4) i GA koji koristi prikaz s pomičnom točkom (ostali parametri neka budu jednaki), zadimenzije 3 i 6. Rad algoritma ograničite zadanim brojem evaluacija (oko 105). Inačice algoritmausporedite po uputama u sljedećem odjeljku. Što možete zaključiti o različitim prikazima rješenja zarazličite funkcije? ###Code for dim in [3, 6]: gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, preciznost=4, br_evaluacija_fje_cilja=105, epsilon=1e-6, ispis=False) rez = gen_alg.pokreni(Funkcija(f6), krizanje_uniformno, VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.binarni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, preciznost=4, br_evaluacija_fje_cilja=105, epsilon=1e-6, ispis=False) rez = gen_alg.pokreni(Funkcija(f7), krizanje_uniformno,VrstaMutacije.uniformna_b) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, br_evaluacija_fje_cilja=105, epsilon=1e-6, ispis=False) rez = gen_alg.pokreni(Funkcija(f6), krizanje_aritmeticko ,VrstaMutacije.uniformna_r) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=dim, p_mutacije=0.099, br_evaluacija_fje_cilja=105, epsilon=1e-6, ispis=False) rez = gen_alg.pokreni(Funkcija(f6), krizanje_aritmeticko ,VrstaMutacije.uniformna_r) ###Output Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-28.17068099975586, 0.4757881164550781, 56.296348571777344] Vrijednost funkcije dobrote za nju je: -0.4802953076436903 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-6.8248748779296875, -2.747058868408203, -5.973243713378906] Vrijednost funkcije dobrote za nju je: -3.1406750053932373 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-8.65304805 -4.66870698 0.11121428] Vrijednost funkcije dobrote za nju je: -0.21519778239361792 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-49.3743552 25.64778793 -20.75679567] Vrijednost funkcije dobrote za nju je: -0.48002820111417427 Dosegnut je maksimalan broj evaluacija fje cilja. 
Najbolja jedinka je: [26.21440887451172, 5.206775665283203, 33.33616256713867, -37.110233306884766, -13.97085189819336, -11.311054229736328] Vrijednost funkcije dobrote za nju je: -0.4801612661995017 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [-27.865886688232422, 104.7755241394043, -38.951873779296875, 61.28673553466797, -32.89651870727539, 53.51972579956055] Vrijednost funkcije dobrote za nju je: -12.347241336509226 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [ 20.85970251 47.55450014 -24.53567003 47.69427526 -47.59987134 -33.94462356] Vrijednost funkcije dobrote za nju je: -0.49795044848879283 Dosegnut je maksimalan broj evaluacija fje cilja. Najbolja jedinka je: [24.4320079 96.35793529 19.00281309 1.69924012 60.5717098 65.09008657] Vrijednost funkcije dobrote za nju je: -0.49897741699883663 ###Markdown 4. Za funkciju f6 pokušajte pronaći 'idealne' parametre genetskog algoritma. 'Idealne' parametrepotrebno je odrediti barem za veličinu populacije (npr. 30, 50, 100, 200) i vjerojatnost mutacijejedinke (npr. 0.1, 0.3, 0.6, 0.9) a po želji možete i za još neke druge parametre koje je vaš algoritamkoristio. Jedan postupak traženja parametara opisan je u nastavku. Koristite medijan kao mjeruusporedbe i prikažite kretanje učinkovitosti za barem jedan parametar uz pomoć box-plot prikaza(opisano u nastavku). ###Code razl_vel = [] tocke_vel = [] velicine = [30, 50, 100, 200] for vel in [30, 50, 100, 200]: print("velicina populacije: ", vel) gen_alg_vel = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=vel, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, preciznost=4, br_evaluacija_fje_cilja=10000, epsilon=1e-6, ispis=False) razl_vel.append([vel, gen_alg_vel.pokreni(Funkcija(f6), krizanje_heuristicko, VrstaMutacije.uniformna_r)[1]]) tocke_vel.append(gen_alg_vel.najbolje) i = np.argmax(np.array(razl_vel)[:, 1]) razl_p = [] tocke_mut = [] vjer = [0.1, 0.3, 0.6, 0.9] for p_m in [0.1, 0.3, 0.6, 0.9]: print("vjerojatnost mutacije: ", p_m) gen_alg_mut = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=velicine[i], dg=-50, gg=150, dimenzije=2, p_mutacije=p_m, preciznost=4, br_evaluacija_fje_cilja=10000, epsilon=1e-6, ispis=False) razl_p.append([p_m, gen_alg_mut.pokreni(Funkcija(f6), krizanje_heuristicko, VrstaMutacije.uniformna_r)[1]]) tocke_mut.append(gen_alg_mut.najbolje) p = np.argmax(np.array(razl_p)[:, 1]) print("najbolja velicina: ", velicine[i], " najbolje vjerojatnost mutacije: ", vjer[p]) data_1 = np.array(tocke_vel[0]) data_2 = np.array(tocke_vel[1]) data_3 = np.array(tocke_vel[2]) data_4 = np.array(tocke_vel[3]) data = [data_1, data_2, data_3, data_4] fig = plt.figure(figsize =(10, 7)) ax = fig.add_axes([0, 0, 1, 1]) bp = ax.boxplot(data) plt.show() data_1 = np.array(tocke_mut[0]) data_2 = np.array(tocke_mut[1]) data_3 = np.array(tocke_mut[2]) data_4 = np.array(tocke_mut[3]) data = [data_1, data_2, data_3, data_4] fig = plt.figure(figsize =(10, 7)) ax = fig.add_axes([0, 0, 1, 1]) bp = ax.boxplot(data) plt.show() ###Output _____no_output_____ ###Markdown 5. Ako ste implementirali turnirsku selekciju, probajte nad nekom težom funkcijom (f6 ili f7) izvestialgoritam koristeći različite veličine turnira. Pomaže li veći turnir algoritmu da pronađe boljarješenja? 
Ako ste implementirali selekciju roulette wheel, isprobajte više vrijednosti omjera odabiranajbolje i najlošije jedinke (skaliranje funkcije cilja) te komentirajte dobivene rezultate (takođermožete isprobati generacijsku i eliminacijsku varijantu) ###Code rez_f6 = [] rez_f7 = [] for k in [3, 10, 25]: gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=10, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=10000, epsilon=1e-6, k=k, ispis=False) rez_f6.append(gen_alg.pokreni(Funkcija(f6), krizanje_aritmeticko ,VrstaMutacije.uniformna_r)) gen_alg = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150, dimenzije=2, p_mutacije=0.099, br_evaluacija_fje_cilja=10000, epsilon=1e-6, k=k, ispis=False) rez_f7.append(gen_alg.pokreni(Funkcija(f7), krizanje_heuristicko ,VrstaMutacije.uniformna_r)) rez_k_f6 = np.array(rez_f6) rez_k_f7 = np.array(rez_f7) pd.DataFrame(np.column_stack((rez_k_f6, rez_k_f7)), [3, 10, 25], ['najbolja jedinka f6', 'dobrota f6', 'najbolja jedinka f7', 'dobrota f7']) ###Output _____no_output_____
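###Markdown Because single GA runs are noisy, comparisons like the tournament-size experiment above are more reliable when aggregated over repeated runs. A minimal sketch (reusing the classes defined earlier, with the median as the comparison measure suggested in task 4): ###Code best_values = []
for run in range(5):
    ga = GenetskiAlgoritam(prikaz=Prikaz.realni, velicina=25, dg=-50, gg=150,
                           dimenzije=2, p_mutacije=0.099,
                           br_evaluacija_fje_cilja=10000, epsilon=1e-6,
                           k=10, ispis=False)
    # pokreni returns [best individual, best fitness]
    best_values.append(ga.pokreni(Funkcija(f6), krizanje_aritmeticko,
                                  VrstaMutacije.uniformna_r)[1])
print(np.median(best_values)) ###Output _____no_output_____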
content/c1/concept.ipynb
###Markdown Concept==============$$\newcommand{\sumN}{\sum_{n = 1}^N} \newcommand{\sumn}{\sum_n} \newcommand{\bx}{\mathbf{x}} \newcommand{\bbeta}{\boldsymbol{\beta}} \newcommand{\btheta}{\boldsymbol{\theta}} \newcommand{\bbetahat}{\boldsymbol{\hat{\beta}}} \newcommand{\bthetahat}{\boldsymbol{\hat{\theta}}} \newcommand{\dadb}[2]{\frac{\partial 1}{\partial 2}} \newcommand{\by}{\mathbf{y}} \newcommand{\bX}{\mathbf{X}}\newcommand{\bphi}{\boldsymbol{\phi}}\newcommand{\bPhi}{\boldsymbol{\Phi}}$$ Model Structure*Linear regression* is a relatively simple method that is extremely widely-used. It is also a great stepping stone for more sophisticated methods, making it a natural algorithm to study first.In linear regression, the target variable $y$ is assumed to follow a linear function of one or more predictor variables, $x_1, \dots, x_D$, plus some random error. Specifically, we assume the model for the $n^\text{th}$ observation in our sample is of the form $$y_n = \beta_0 + \beta_1 x_{n1} + \dots + \beta_Dx_{nD} + \epsilon_n. $$Here $\beta_0$ is the intercept term, $\beta_1$ through $\beta_D$ are the coefficients on our feature variables, and $\epsilon$ is an error term that represents the difference between the true $y$ value and the linear function of the predictors. Note that the terms with an $n$ in the subscript differ between observations while the terms without (namely the $\beta\text{s}$) do not. The math behind linear regression often becomes easier when we use vectors to represent our predictors and coefficients. Let's define $\bx_n$ and $\bbeta$ as follows:$$\begin{align}\bx_n &= \begin{pmatrix} 1 & x_{n1} & \dots & x_{nD} \end{pmatrix}^\top \\\bbeta &= \begin{pmatrix} \beta_0 & \beta_1 & \dots & \beta_D \end{pmatrix}^\top.\end{align}$$Note that $\bx_n$ includes a leading 1, corresponding to the intercept term $\beta_0$. Using these definitions, we can equivalently express $y_n$ as $$y_n = \bbeta^\top \bx_n + \epsilon_n.$$Below is an example of a dataset designed for linear regression. The input variable is generated randomly and the target variable is generated as a linear combination of that input variable plus an error term. ###Code import numpy as np import matplotlib.pyplot as plt import seaborn as sns # generate data np.random.seed(123) N = 20 beta0 = -4 beta1 = 2 x = np.random.randn(N) e = np.random.randn(N) y = beta0 + beta1*x + e true_x = np.linspace(min(x), max(x), 100) true_y = beta0 + beta1*true_x # plot fig, ax = plt.subplots() sns.scatterplot(x, y, s = 40, label = 'Data') sns.lineplot(true_x, true_y, color = 'red', label = 'True Model') ax.set_xlabel('x', fontsize = 14) ax.set_title(fr"$y = {beta0} + ${beta1}$x + \epsilon$", fontsize = 16) ax.set_ylabel('y', fontsize=14, rotation=0, labelpad=10) ax.legend(loc = 4) sns.despine() ###Output _____no_output_____ ###Markdown Parameter EstimationThe previous section covers the entire structure we assume our data follows in linear regression. The machine learning task is then to estimate the parameters in $\bbeta$. These estimates are represented by $\hat{\beta}_0, \dots, \hat{\beta}_D$ or $\bbetahat$. The estimates give us *fitted values* for our target variable, represented by $\hat{y}_n$. This task can be accomplished in two ways which, though slightly different conceptually, are identical mathematically. The first approach is through the lens of *minimizing loss*. A common practice in machine learning is to choose a loss function that defines how well a model with a given set of parameter estimates the observed data. 
The most common loss function for linear regression is squared error loss. This says the *loss* of our model is proportional to the sum of squared differences between the true $y_n$ values and the fitted values, $\hat{y}_n$. We then *fit* the model by finding the estimates $\bbetahat$ that minimize this loss function. This approach is covered in the subsection {doc}`s1/loss_minimization`.The second approach is through the lens of *maximizing likelihood*. Another common practice in machine learning is to model the target as a random variable whose distribution depends on one or more parameters, and then find the parameters that maximize its likelihood. Under this approach, we will represent the target with $Y_n$ since we are treating it as a random variable. The most common model for $Y_n$ in linear regression is a Normal random variable with mean $E(Y_n) = \bbeta^\top \bx_n$. That is, we assume $$Y_n|\bx_n \sim \mathcal{N}(\bbeta^\top \bx_n, \sigma^2),$$and we find the values of $\bbetahat$ to maximize the likelihood. This approach is covered in subsection {doc}`s1/likelihood_maximization`. Once we've estimated $\bbeta$, our model is *fit* and we can make predictions. The below graph is the same as the one above but includes our estimated line-of-best-fit, obtained by calculating $\hat{\beta}_0$ and $\hat{\beta}_1$. ###Code # generate data np.random.seed(123) N = 20 beta0 = -4 beta1 = 2 x = np.random.randn(N) e = np.random.randn(N) y = beta0 + beta1*x + e true_x = np.linspace(min(x), max(x), 100) true_y = beta0 + beta1*true_x # estimate model beta1_hat = sum((x - np.mean(x))*(y - np.mean(y)))/sum((x - np.mean(x))**2) beta0_hat = np.mean(y) - beta1_hat*np.mean(x) fit_y = beta0_hat + beta1_hat*true_x # plot fig, ax = plt.subplots() sns.scatterplot(x, y, s = 40, label = 'Data') sns.lineplot(true_x, true_y, color = 'red', label = 'True Model') sns.lineplot(true_x, fit_y, color = 'purple', label = 'Estimated Model') ax.set_xlabel('x', fontsize = 14) ax.set_title(fr"Linear Regression for $y = {beta0} + ${beta1}$x + \epsilon$", fontsize = 16) ax.set_ylabel('y', fontsize=14, rotation=0, labelpad=10) ax.legend(loc = 4) sns.despine() ###Output _____no_output_____
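###Markdown For reference, the estimates computed in the code above are the closed-form ordinary least squares solutions for simple linear regression, which follow from the loss minimization approach described earlier:

$$\hat{\beta}_1 = \frac{\sumN (x_n - \bar{x})(y_n - \bar{y})}{\sumN (x_n - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x},$$

where $\bar{x}$ and $\bar{y}$ are the sample means of the predictor and the target.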
cs231n_assignments/assignment2/PyTorch.ipynb
###Markdown What's this PyTorch business?You've written a lot of code in this assignment to provide a whole host of neural network functionality. Dropout, Batch Norm, and 2D convolutions are some of the workhorses of deep learning in computer vision. You've also worked hard to make your code efficient and vectorized.For the last part of this assignment, though, we're going to leave behind your beautiful codebase and instead migrate to one of two popular deep learning frameworks: in this instance, PyTorch (or TensorFlow, if you switch over to that notebook). What is PyTorch?PyTorch is a system for executing dynamic computational graphs over Tensor objects that behave similarly as numpy ndarray. It comes with a powerful automatic differentiation engine that removes the need for manual back-propagation. Why?* Our code will now run on GPUs! Much faster training. When using a framework like PyTorch or TensorFlow you can harness the power of the GPU for your own custom neural network architectures without having to write CUDA code directly (which is beyond the scope of this class).* We want you to be ready to use one of these frameworks for your project so you can experiment more efficiently than if you were writing every feature you want to use by hand. * We want you to stand on the shoulders of giants! TensorFlow and PyTorch are both excellent frameworks that will make your lives a lot easier, and now that you understand their guts, you are free to use them :) * We want you to be exposed to the sort of deep learning code you might run into in academia or industry. PyTorch versionsThis notebook assumes that you are using **PyTorch version 0.4**. Prior to this version, Tensors had to be wrapped in Variable objects to be used in autograd; however Variables have now been deprecated. In addition 0.4 also separates a Tensor's datatype from its device, and uses numpy-style factories for constructing Tensors rather than directly invoking Tensor constructors. How will I learn PyTorch?Justin Johnson has made an excellent [tutorial](https://github.com/jcjohnson/pytorch-examples) for PyTorch. You can also find the detailed [API doc](http://pytorch.org/docs/stable/index.html) here. If you have other questions that are not addressed by the API docs, the [PyTorch forum](https://discuss.pytorch.org/) is a much better place to ask than StackOverflow. Table of ContentsThis assignment has 5 parts. You will learn PyTorch on different levels of abstractions, which will help you understand it better and prepare you for the final project. 1. Preparation: we will use CIFAR-10 dataset.2. Barebones PyTorch: we will work directly with the lowest-level PyTorch Tensors. 3. PyTorch Module API: we will use `nn.Module` to define arbitrary neural network architecture. 4. PyTorch Sequential API: we will use `nn.Sequential` to define a linear feed-forward network very conveniently. 5. CIFAR-10 open-ended challenge: please implement your own network to get as high accuracy as possible on CIFAR-10. You can experiment with any layer, optimizer, hyperparameters or other advanced features. Here is a table of comparison:| API | Flexibility | Convenience ||---------------|-------------|-------------|| Barebone | High | Low || `nn.Module` | High | Medium || `nn.Sequential` | Low | High | Part I. PreparationFirst, we load the CIFAR-10 dataset. 
This might take a couple minutes the first time you do it, but the files should stay cached after that.In previous parts of the assignment we had to write our own code to download the CIFAR-10 dataset, preprocess it, and iterate through it in minibatches; PyTorch provides convenient tools to automate this process for us. ###Code import torch import torch.nn as nn import torch.optim as optim from torch.utils.data import DataLoader from torch.utils.data import sampler import torchvision.datasets as dset import torchvision.transforms as T import numpy as np NUM_TRAIN = 49000 # The torchvision.transforms package provides tools for preprocessing data # and for performing data augmentation; here we set up a transform to # preprocess the data by subtracting the mean RGB value and dividing by the # standard deviation of each RGB value; we've hardcoded the mean and std. transform = T.Compose([ T.ToTensor(), T.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)) ]) # We set up a Dataset object for each split (train / val / test); Datasets load # training examples one at a time, so we wrap each Dataset in a DataLoader which # iterates through the Dataset and forms minibatches. We divide the CIFAR-10 # training set into train and val sets by passing a Sampler object to the # DataLoader telling how it should sample from the underlying Dataset. cifar10_train = dset.CIFAR10('./cs231n/datasets', train=True, download=True, transform=transform) loader_train = DataLoader(cifar10_train, batch_size=64, sampler=sampler.SubsetRandomSampler(range(NUM_TRAIN))) cifar10_val = dset.CIFAR10('./cs231n/datasets', train=True, download=True, transform=transform) loader_val = DataLoader(cifar10_val, batch_size=64, sampler=sampler.SubsetRandomSampler(range(NUM_TRAIN, 50000))) cifar10_test = dset.CIFAR10('./cs231n/datasets', train=False, download=True, transform=transform) loader_test = DataLoader(cifar10_test, batch_size=64) ###Output Files already downloaded and verified Files already downloaded and verified Files already downloaded and verified ###Markdown You have an option to **use GPU by setting the flag to True below**. It is not necessary to use GPU for this assignment. Note that if your computer does not have CUDA enabled, `torch.cuda.is_available()` will return False and this notebook will fallback to CPU mode.The global variables `dtype` and `device` will control the data types throughout this assignment. ###Code USE_GPU = True dtype = torch.float32 # we will be using float throughout this tutorial if USE_GPU and torch.cuda.is_available(): device = torch.device('cuda') else: device = torch.device('cpu') # Constant to control how frequently we print train loss print_every = 100 print('using device:', device) ###Output using device: cuda ###Markdown Part II. Barebones PyTorchPyTorch ships with high-level APIs to help us define model architectures conveniently, which we will cover in Part II of this tutorial. In this section, we will start with the barebone PyTorch elements to understand the autograd engine better. After this exercise, you will come to appreciate the high-level model API more.We will start with a simple fully-connected ReLU network with two hidden layers and no biases for CIFAR classification. This implementation computes the forward pass using operations on PyTorch Tensors, and uses PyTorch autograd to compute gradients. 
It is important that you understand every line, because you will write a harder version after the example.When we create a PyTorch Tensor with `requires_grad=True`, then operations involving that Tensor will not just compute values; they will also build up a computational graph in the background, allowing us to easily backpropagate through the graph to compute gradients of some Tensors with respect to a downstream loss. Concretely if x is a Tensor with `x.requires_grad == True` then after backpropagation `x.grad` will be another Tensor holding the gradient of x with respect to the scalar loss at the end. PyTorch Tensors: Flatten FunctionA PyTorch Tensor is conceptually similar to a numpy array: it is an n-dimensional grid of numbers, and like numpy PyTorch provides many functions to efficiently operate on Tensors. As a simple example, we provide a `flatten` function below which reshapes image data for use in a fully-connected neural network.Recall that image data is typically stored in a Tensor of shape N x C x H x W, where:* N is the number of datapoints* C is the number of channels* H is the height of the intermediate feature map in pixels* W is the width of the intermediate feature map in pixelsThis is the right way to represent the data when we are doing something like a 2D convolution, which needs spatial understanding of where the intermediate features are relative to each other. When we use fully connected affine layers to process the image, however, we want each datapoint to be represented by a single vector -- it's no longer useful to segregate the different channels, rows, and columns of the data. So, we use a "flatten" operation to collapse the `C x H x W` values per representation into a single long vector. The flatten function below first reads in the N, C, H, and W values from a given batch of data, and then returns a "view" of that data. "View" is analogous to numpy's "reshape" method: it reshapes x's dimensions to be N x ??, where ?? is allowed to be anything (in this case, it will be C x H x W, but we don't need to specify that explicitly). ###Code def flatten(x):
    N = x.shape[0] # read in N, C, H, W
    return x.view(N, -1)  # "flatten" the C * H * W values into a single vector per image

def test_flatten():
    x = torch.arange(12).view(2, 1, 3, 2)
    print('Before flattening: ', x)
    print('After flattening: ', flatten(x))

test_flatten() ###Output Before flattening:  tensor([[[[ 0,  1],
          [ 2,  3],
          [ 4,  5]]],


        [[[ 6,  7],
          [ 8,  9],
          [10, 11]]]])
After flattening:  tensor([[ 0,  1,  2,  3,  4,  5],
        [ 6,  7,  8,  9, 10, 11]]) ###Markdown Barebones PyTorch: Two-Layer NetworkHere we define a function `two_layer_fc` which performs the forward pass of a two-layer fully-connected ReLU network on a batch of image data. After defining the forward pass we check that it doesn't crash and that it produces outputs of the right shape by running zeros through the network.You don't have to write any code here, but it's important that you read and understand the implementation. ###Code import torch.nn.functional as F  # useful stateless functions

def two_layer_fc(x, params):
    """
    A fully-connected neural network; the architecture is:
    NN is fully connected -> ReLU -> fully connected layer.
    Note that this function only defines the forward pass; 
    PyTorch will take care of the backward pass for us.
    
    The input to the network will be a minibatch of data, of shape
    (N, d1, ..., dM) where d1 * ... * dM = D. The hidden layer will have H units,
    and the output layer will produce scores for C classes.
Inputs: - x: A PyTorch Tensor of shape (N, d1, ..., dM) giving a minibatch of input data. - params: A list [w1, w2] of PyTorch Tensors giving weights for the network; w1 has shape (D, H) and w2 has shape (H, C). Returns: - scores: A PyTorch Tensor of shape (N, C) giving classification scores for the input data x. """ # first we flatten the image x = flatten(x) # shape: [batch_size, C x H x W] w1, w2 = params # Forward pass: compute predicted y using operations on Tensors. Since w1 and # w2 have requires_grad=True, operations involving these Tensors will cause # PyTorch to build a computational graph, allowing automatic computation of # gradients. Since we are no longer implementing the backward pass by hand we # don't need to keep references to intermediate values. # you can also use `.clamp(min=0)`, equivalent to F.relu() x = F.relu(x.mm(w1)) x = x.mm(w2) return x def two_layer_fc_test(): hidden_layer_size = 42 x = torch.zeros((64, 50), dtype=dtype) # minibatch size 64, feature dimension 50 w1 = torch.zeros((50, hidden_layer_size), dtype=dtype) w2 = torch.zeros((hidden_layer_size, 10), dtype=dtype) scores = two_layer_fc(x, [w1, w2]) print(scores.size()) # you should see [64, 10] two_layer_fc_test() ###Output torch.Size([64, 10]) ###Markdown Barebones PyTorch: Three-Layer ConvNetHere you will complete the implementation of the function `three_layer_convnet`, which will perform the forward pass of a three-layer convolutional network. Like above, we can immediately test our implementation by passing zeros through the network. The network should have the following architecture:1. A convolutional layer (with bias) with `channel_1` filters, each with shape `KW1 x KH1`, and zero-padding of two2. ReLU nonlinearity3. A convolutional layer (with bias) with `channel_2` filters, each with shape `KW2 x KH2`, and zero-padding of one4. ReLU nonlinearity5. Fully-connected layer with bias, producing scores for C classes.**HINT**: For convolutions: http://pytorch.org/docs/stable/nn.htmltorch.nn.functional.conv2d; pay attention to the shapes of convolutional filters! ###Code def three_layer_convnet(x, params): """ Performs the forward pass of a three-layer convolutional network with the architecture defined above. Inputs: - x: A PyTorch Tensor of shape (N, 3, H, W) giving a minibatch of images - params: A list of PyTorch Tensors giving the weights and biases for the network; should contain the following: - conv_w1: PyTorch Tensor of shape (channel_1, 3, KH1, KW1) giving weights for the first convolutional layer - conv_b1: PyTorch Tensor of shape (channel_1,) giving biases for the first convolutional layer - conv_w2: PyTorch Tensor of shape (channel_2, channel_1, KH2, KW2) giving weights for the second convolutional layer - conv_b2: PyTorch Tensor of shape (channel_2,) giving biases for the second convolutional layer - fc_w: PyTorch Tensor giving weights for the fully-connected layer. Can you figure out what the shape should be? - fc_b: PyTorch Tensor giving biases for the fully-connected layer. Can you figure out what the shape should be? Returns: - scores: PyTorch Tensor of shape (N, C) giving classification scores for x """ conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b = params scores = None ################################################################################ # TODO: Implement the forward pass for the three-layer ConvNet. 
# ################################################################################
    layer1 = F.conv2d(x, conv_w1, conv_b1, stride=1, padding=(2, 2))
    relu1 = F.relu(layer1)
    # note: the second convolution must consume the ReLU output, not the
    # pre-activation layer1, or the first nonlinearity is silently skipped
    layer2 = F.conv2d(relu1, conv_w2, conv_b2, stride=1, padding=(1, 1))
    relu2 = F.relu(layer2)
    scores = flatten(relu2).mm(fc_w) + fc_b
    ################################################################################
    #                                 END OF YOUR CODE                             #
    ################################################################################
    return scores ###Output _____no_output_____ ###Markdown After defining the forward pass of the ConvNet above, run the following cell to test your implementation.When you run this function, scores should have shape (64, 10). ###Code def three_layer_convnet_test():
    x = torch.zeros((64, 3, 32, 32), dtype=dtype)  # minibatch size 64, image size [3, 32, 32]

    conv_w1 = torch.zeros((6, 3, 5, 5), dtype=dtype)  # [out_channel, in_channel, kernel_H, kernel_W]
    conv_b1 = torch.zeros((6,))  # out_channel
    conv_w2 = torch.zeros((9, 6, 3, 3), dtype=dtype)  # [out_channel, in_channel, kernel_H, kernel_W]
    conv_b2 = torch.zeros((9,))  # out_channel

    # you must calculate the shape of the tensor after two conv layers, before the fully-connected layer
    fc_w = torch.zeros((9 * 32 * 32, 10))
    fc_b = torch.zeros(10)

    scores = three_layer_convnet(x, [conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b])
    print(scores.size())  # you should see [64, 10]
three_layer_convnet_test() ###Output torch.Size([64, 10]) ###Markdown Barebones PyTorch: InitializationLet's write a couple utility methods to initialize the weight matrices for our models.- `random_weight(shape)` initializes a weight tensor with the Kaiming normalization method.- `zero_weight(shape)` initializes a weight tensor with all zeros. Useful for instantiating bias parameters.The `random_weight` function uses the Kaiming normal initialization method, described in:He et al, *Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification*, ICCV 2015, https://arxiv.org/abs/1502.01852 ###Code def random_weight(shape):
    """
    Create random Tensors for weights; setting requires_grad=True means that we
    want to compute gradients for these Tensors during the backward pass.
    We use Kaiming normalization: sqrt(2 / fan_in)
    """
    if len(shape) == 2:  # FC weight
        fan_in = shape[0]
    else:
        fan_in = np.prod(shape[1:]) # conv weight [out_channel, in_channel, kH, kW]
    # randn is standard normal distribution generator. 
    w = torch.randn(shape, device=device, dtype=dtype) * np.sqrt(2. / fan_in)
    w.requires_grad = True
    return w

def zero_weight(shape):
    return torch.zeros(shape, device=device, dtype=dtype, requires_grad=True)

# create a weight of shape [3 x 5]
# you should see the type `torch.cuda.FloatTensor` if you use GPU. 
# Otherwise it should be `torch.FloatTensor`
random_weight((3, 5)) ###Output _____no_output_____ ###Markdown Barebones PyTorch: Check AccuracyWhen training the model we will use the following function to check the accuracy of our model on the training or validation sets.When checking accuracy we don't need to compute any gradients; as a result we don't need PyTorch to build a computational graph for us when we compute scores. To prevent a graph from being built we scope our computation under a `torch.no_grad()` context manager. ###Code def check_accuracy_part2(loader, model_fn, params):
    """
    Check the accuracy of a classification model.
    Inputs:
    - loader: A DataLoader for the data split we want to check
    - model_fn: A function that performs the forward pass of the model,
      with the signature scores = model_fn(x, params)
    - params: List of PyTorch Tensors giving parameters of the model

    Returns: Nothing, but prints the accuracy of the model
    """
    split = 'val' if loader.dataset.train else 'test'
    print('Checking accuracy on the %s set' % split)
    num_correct, num_samples = 0, 0
    with torch.no_grad():
        for x, y in loader:
            x = x.to(device=device, dtype=dtype)  # move to device, e.g. GPU
            y = y.to(device=device, dtype=torch.int64)
            scores = model_fn(x, params)
            _, preds = scores.max(1)
            num_correct += (preds == y).sum()
            num_samples += preds.size(0)
        acc = float(num_correct) / num_samples
        print('Got %d / %d correct (%.2f%%)' % (num_correct, num_samples, 100 * acc))
###Output
_____no_output_____
###Markdown
BareBones PyTorch: Training Loop
We can now set up a basic training loop to train our network. We will train the model using stochastic gradient descent without momentum. We will use `torch.nn.functional.cross_entropy` to compute the loss; you can [read about it here](http://pytorch.org/docs/stable/nn.html#cross-entropy).
The training loop takes as input the neural network function, a list of initialized parameters (`[w1, w2]` in our example), and a learning rate.
###Code
def train_part2(model_fn, params, learning_rate):
    """
    Train a model on CIFAR-10.

    Inputs:
    - model_fn: A Python function that performs the forward pass of the model.
      It should have the signature scores = model_fn(x, params) where x is a
      PyTorch Tensor of image data, params is a list of PyTorch Tensors giving
      model weights, and scores is a PyTorch Tensor of shape (N, C) giving
      scores for the elements in x.
    - params: List of PyTorch Tensors giving weights for the model
    - learning_rate: Python scalar giving the learning rate to use for SGD

    Returns: Nothing
    """
    for t, (x, y) in enumerate(loader_train):
        # Move the data to the proper device (GPU or CPU)
        x = x.to(device=device, dtype=dtype)
        y = y.to(device=device, dtype=torch.long)

        # Forward pass: compute scores and loss
        scores = model_fn(x, params)
        loss = F.cross_entropy(scores, y)

        # Backward pass: PyTorch figures out which Tensors in the computational
        # graph have requires_grad=True and uses backpropagation to compute the
        # gradient of the loss with respect to these Tensors, and stores the
        # gradients in the .grad attribute of each Tensor.
        loss.backward()

        # Update parameters. We don't want to backpropagate through the
        # parameter updates, so we scope the updates under a torch.no_grad()
        # context manager to prevent a computational graph from being built.
        with torch.no_grad():
            for w in params:
                w -= learning_rate * w.grad

                # Manually zero the gradients after running the backward pass
                w.grad.zero_()

        if t % print_every == 0:
            print('Iteration %d, loss = %.4f' % (t, loss.item()))
            check_accuracy_part2(loader_val, model_fn, params)
            print()
###Output
_____no_output_____
###Markdown
BareBones PyTorch: Train a Two-Layer Network
Now we are ready to run the training loop. We need to explicitly allocate tensors for the fully connected weights, `w1` and `w2`.
Each minibatch of CIFAR has 64 examples, so the tensor shape is `[64, 3, 32, 32]`.
After flattening, `x` shape should be `[64, 3 * 32 * 32]`. This will be the size of the first dimension of `w1`. The second dimension of `w1` is the hidden layer size, which will also be the first dimension of `w2`.
Finally, the output of the network is a 10-dimensional vector that represents the probability distribution over 10 classes. You don't need to tune any hyperparameters but you should see accuracies above 40% after training for one epoch. ###Code hidden_layer_size = 4000 learning_rate = 1e-2 w1 = random_weight((3 * 32 * 32, hidden_layer_size)) w2 = random_weight((hidden_layer_size, 10)) train_part2(two_layer_fc, [w1, w2], learning_rate) ###Output Iteration 0, loss = 3.5479 Checking accuracy on the val set Got 120 / 1000 correct (12.00%) Iteration 100, loss = 2.1995 Checking accuracy on the val set Got 359 / 1000 correct (35.90%) Iteration 200, loss = 1.7434 Checking accuracy on the val set Got 405 / 1000 correct (40.50%) Iteration 300, loss = 2.0229 Checking accuracy on the val set Got 368 / 1000 correct (36.80%) Iteration 400, loss = 2.1376 Checking accuracy on the val set Got 425 / 1000 correct (42.50%) Iteration 500, loss = 2.1530 Checking accuracy on the val set Got 434 / 1000 correct (43.40%) Iteration 600, loss = 1.7866 Checking accuracy on the val set Got 431 / 1000 correct (43.10%) Iteration 700, loss = 1.7147 Checking accuracy on the val set Got 452 / 1000 correct (45.20%) ###Markdown BareBones PyTorch: Training a ConvNetIn the below you should use the functions defined above to train a three-layer convolutional network on CIFAR. The network should have the following architecture:1. Convolutional layer (with bias) with 32 5x5 filters, with zero-padding of 22. ReLU3. Convolutional layer (with bias) with 16 3x3 filters, with zero-padding of 14. ReLU5. Fully-connected layer (with bias) to compute scores for 10 classesYou should initialize your weight matrices using the `random_weight` function defined above, and you should initialize your bias vectors using the `zero_weight` function above.You don't need to tune any hyperparameters, but if everything works correctly you should achieve an accuracy above 42% after one epoch. ###Code learning_rate = 3e-3 channel_1 = 32 channel_2 = 16 conv_w1 = None conv_b1 = None conv_w2 = None conv_b2 = None fc_w = None fc_b = None ################################################################################ # TODO: Initialize the parameters of a three-layer ConvNet. 
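# (Added note) random_weight, defined above, computes fan_in as
# np.prod(shape[1:]) for 4-D shapes, i.e. in_channels * kH * kW, so
# conv_w1 below is drawn with std sqrt(2 / (3 * 5 * 5)); the biases
# start at zero via zero_weight.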
# ################################################################################ conv_w1 = random_weight((channel_1,3,5,5)) conv_b1 = zero_weight((channel_1,)) conv_w2 = random_weight((channel_2,channel_1, 3, 3)) conv_b2 = zero_weight((channel_2,)) fc_w = random_weight((32*32*channel_2, 10)) fc_b = zero_weight((10,)) ################################################################################ # END OF YOUR CODE # ################################################################################ params = [conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b] train_part2(three_layer_convnet, params, learning_rate) ###Output Iteration 0, loss = 3.7731 Checking accuracy on the val set Got 106 / 1000 correct (10.60%) Iteration 100, loss = 2.0191 Checking accuracy on the val set Got 377 / 1000 correct (37.70%) Iteration 200, loss = 1.8031 Checking accuracy on the val set Got 401 / 1000 correct (40.10%) Iteration 300, loss = 1.7237 Checking accuracy on the val set Got 453 / 1000 correct (45.30%) Iteration 400, loss = 1.4040 Checking accuracy on the val set Got 459 / 1000 correct (45.90%) Iteration 500, loss = 1.4098 Checking accuracy on the val set Got 475 / 1000 correct (47.50%) Iteration 600, loss = 1.3452 Checking accuracy on the val set Got 491 / 1000 correct (49.10%) Iteration 700, loss = 1.5515 Checking accuracy on the val set Got 499 / 1000 correct (49.90%) ###Markdown Part III. PyTorch Module APIBarebone PyTorch requires that we track all the parameter tensors by hand. This is fine for small networks with a few tensors, but it would be extremely inconvenient and error-prone to track tens or hundreds of tensors in larger networks.PyTorch provides the `nn.Module` API for you to define arbitrary network architectures, while tracking every learnable parameters for you. In Part II, we implemented SGD ourselves. PyTorch also provides the `torch.optim` package that implements all the common optimizers, such as RMSProp, Adagrad, and Adam. It even supports approximate second-order methods like L-BFGS! You can refer to the [doc](http://pytorch.org/docs/master/optim.html) for the exact specifications of each optimizer.To use the Module API, follow the steps below:1. Subclass `nn.Module`. Give your network class an intuitive name like `TwoLayerFC`. 2. In the constructor `__init__()`, define all the layers you need as class attributes. Layer objects like `nn.Linear` and `nn.Conv2d` are themselves `nn.Module` subclasses and contain learnable parameters, so that you don't have to instantiate the raw tensors yourself. `nn.Module` will track these internal parameters for you. Refer to the [doc](http://pytorch.org/docs/master/nn.html) to learn more about the dozens of builtin layers. **Warning**: don't forget to call the `super().__init__()` first!3. In the `forward()` method, define the *connectivity* of your network. You should use the attributes defined in `__init__` as function calls that take tensor as input and output the "transformed" tensor. Do *not* create any new layers with learnable parameters in `forward()`! All of them must be declared upfront in `__init__`. After you define your Module subclass, you can instantiate it as an object and call it just like the NN forward function in part II. 
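As a side illustration of the `torch.optim` interface described above (an added sketch with a placeholder module and random data, not part of the assignment): most optimizers share the same `zero_grad` / `backward` / `step` cycle, while `LBFGS.step` is the odd one out in requiring a closure that re-evaluates the loss.
###Code
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

toy_model = nn.Linear(10, 2)                          # placeholder module
tx, ty = torch.randn(8, 10), torch.randint(0, 2, (8,))

# First-order optimizers: construct once, then loop zero_grad/backward/step.
adam = optim.Adam(toy_model.parameters(), lr=1e-3)
adam.zero_grad()
F.cross_entropy(toy_model(tx), ty).backward()
adam.step()

# L-BFGS instead takes a closure, which it may call several times per step.
lbfgs = optim.LBFGS(toy_model.parameters())

def closure():
    lbfgs.zero_grad()
    loss = F.cross_entropy(toy_model(tx), ty)
    loss.backward()
    return loss

lbfgs.step(closure)
###Output
_____no_output_____
###Markdown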
Module API: Two-Layer NetworkHere is a concrete example of a 2-layer fully connected network: ###Code class TwoLayerFC(nn.Module): def __init__(self, input_size, hidden_size, num_classes): super().__init__() # assign layer objects to class attributes self.fc1 = nn.Linear(input_size, hidden_size) # nn.init package contains convenient initialization methods # http://pytorch.org/docs/master/nn.html#torch-nn-init nn.init.kaiming_normal_(self.fc1.weight) self.fc2 = nn.Linear(hidden_size, num_classes) nn.init.kaiming_normal_(self.fc2.weight) def forward(self, x): # forward always defines connectivity x = flatten(x) scores = self.fc2(F.relu(self.fc1(x))) return scores def test_TwoLayerFC(): input_size = 50 x = torch.zeros((64, input_size), dtype=dtype) # minibatch size 64, feature dimension 50 model = TwoLayerFC(input_size, 42, 10) scores = model(x) print(scores.size()) # you should see [64, 10] test_TwoLayerFC() ###Output torch.Size([64, 10]) ###Markdown Module API: Three-Layer ConvNetIt's your turn to implement a 3-layer ConvNet followed by a fully connected layer. The network architecture should be the same as in Part II:1. Convolutional layer with `channel_1` 5x5 filters with zero-padding of 22. ReLU3. Convolutional layer with `channel_2` 3x3 filters with zero-padding of 14. ReLU5. Fully-connected layer to `num_classes` classesYou should initialize the weight matrices of the model using the Kaiming normal initialization method.**HINT**: http://pytorch.org/docs/stable/nn.htmlconv2dAfter you implement the three-layer ConvNet, the `test_ThreeLayerConvNet` function will run your implementation; it should print `(64, 10)` for the shape of the output scores. ###Code class ThreeLayerConvNet(nn.Module): def __init__(self, in_channel, channel_1, channel_2, num_classes): super().__init__() ######################################################################## # TODO: Set up the layers you need for a three-layer ConvNet with the # # architecture defined above. # ######################################################################## self.conv1 = nn.Conv2d(in_channel, channel_1, kernel_size=(5, 5), stride=1, padding=(2, 2)) nn.init.kaiming_normal_(self.conv1.weight) self.conv2 = nn.Conv2d(channel_1, channel_2, kernel_size=(3, 3), stride=1, padding=(1, 1)) nn.init.kaiming_normal_(self.conv2.weight) self.fc = nn.Linear(channel_2*32*32, num_classes) nn.init.kaiming_normal_(self.fc.weight) ######################################################################## # END OF YOUR CODE # ######################################################################## def forward(self, x): scores = None ######################################################################## # TODO: Implement the forward function for a 3-layer ConvNet. 
you # # should use the layers you defined in __init__ and specify the # # connectivity of those layers in forward() # ######################################################################## out1 = F.relu(self.conv1(x)) out2 = F.relu(self.conv2(out1)) scores = self.fc(flatten(out2)) ######################################################################## # END OF YOUR CODE # ######################################################################## return scores def test_ThreeLayerConvNet(): x = torch.zeros((64, 3, 32, 32), dtype=dtype) # minibatch size 64, image size [3, 32, 32] model = ThreeLayerConvNet(in_channel=3, channel_1=12, channel_2=8, num_classes=10) scores = model(x) print(scores.size()) # you should see [64, 10] test_ThreeLayerConvNet() ###Output torch.Size([64, 10]) ###Markdown Module API: Check AccuracyGiven the validation or test set, we can check the classification accuracy of a neural network. This version is slightly different from the one in part II. You don't manually pass in the parameters anymore. ###Code def check_accuracy_part34(loader, model): if loader.dataset.train: print('Checking accuracy on validation set') else: print('Checking accuracy on test set') num_correct = 0 num_samples = 0 model.eval() # set model to evaluation mode with torch.no_grad(): for x, y in loader: x = x.to(device=device, dtype=dtype) # move to device, e.g. GPU y = y.to(device=device, dtype=torch.long) scores = model(x) _, preds = scores.max(1) num_correct += (preds == y).sum() num_samples += preds.size(0) acc = float(num_correct) / num_samples print('Got %d / %d correct (%.2f)' % (num_correct, num_samples, 100 * acc)) ###Output _____no_output_____ ###Markdown Module API: Training LoopWe also use a slightly different training loop. Rather than updating the values of the weights ourselves, we use an Optimizer object from the `torch.optim` package, which abstract the notion of an optimization algorithm and provides implementations of most of the algorithms commonly used to optimize neural networks. ###Code def train_part34(model, optimizer, epochs=1): """ Train a model on CIFAR-10 using the PyTorch Module API. Inputs: - model: A PyTorch Module giving the model to train. - optimizer: An Optimizer object we will use to train the model - epochs: (Optional) A Python integer giving the number of epochs to train for Returns: Nothing, but prints model accuracies during training. """ model = model.to(device=device) # move the model parameters to CPU/GPU for e in range(epochs): for t, (x, y) in enumerate(loader_train): model.train() # put model to training mode x = x.to(device=device, dtype=dtype) # move to device, e.g. GPU y = y.to(device=device, dtype=torch.long) scores = model(x) loss = F.cross_entropy(scores, y) # Zero out all of the gradients for the variables which the optimizer # will update. optimizer.zero_grad() # This is the backwards pass: compute the gradient of the loss with # respect to each parameter of the model. loss.backward() # Actually update the parameters of the model using the gradients # computed by the backwards pass. optimizer.step() if t % print_every == 0: print('Iteration %d, loss = %.4f' % (t, loss.item())) check_accuracy_part34(loader_val, model) print() ###Output _____no_output_____ ###Markdown Module API: Train a Two-Layer NetworkNow we are ready to run the training loop. In contrast to part II, we don't explicitly allocate parameter tensors anymore.Simply pass the input size, hidden layer size, and number of classes (i.e. output size) to the constructor of `TwoLayerFC`. 
You also need to define an optimizer that tracks all the learnable parameters inside `TwoLayerFC`.You don't need to tune any hyperparameters, but you should see model accuracies above 40% after training for one epoch. ###Code hidden_layer_size = 4000 learning_rate = 1e-2 model = TwoLayerFC(3 * 32 * 32, hidden_layer_size, 10) optimizer = optim.SGD(model.parameters(), lr=learning_rate) train_part34(model, optimizer) ###Output Iteration 0, loss = 3.2455 Checking accuracy on validation set Got 160 / 1000 correct (16.00) Iteration 100, loss = 3.1439 Checking accuracy on validation set Got 328 / 1000 correct (32.80) Iteration 200, loss = 1.8028 Checking accuracy on validation set Got 408 / 1000 correct (40.80) Iteration 300, loss = 1.9349 Checking accuracy on validation set Got 371 / 1000 correct (37.10) Iteration 400, loss = 1.7843 Checking accuracy on validation set Got 405 / 1000 correct (40.50) Iteration 500, loss = 1.8628 Checking accuracy on validation set Got 435 / 1000 correct (43.50) Iteration 600, loss = 1.8643 Checking accuracy on validation set Got 416 / 1000 correct (41.60) Iteration 700, loss = 1.9308 Checking accuracy on validation set Got 461 / 1000 correct (46.10) ###Markdown Module API: Train a Three-Layer ConvNetYou should now use the Module API to train a three-layer ConvNet on CIFAR. This should look very similar to training the two-layer network! You don't need to tune any hyperparameters, but you should achieve above above 45% after training for one epoch.You should train the model using stochastic gradient descent without momentum. ###Code learning_rate = 3e-3 channel_1 = 32 channel_2 = 16 model = None optimizer = None ################################################################################ # TODO: Instantiate your ThreeLayerConvNet model and a corresponding optimizer # ################################################################################ model = ThreeLayerConvNet(3, channel_1, channel_2, 10) optimizer = optim.SGD(model.parameters(), lr=learning_rate) ################################################################################ # END OF YOUR CODE ################################################################################ train_part34(model, optimizer) ###Output Iteration 0, loss = 3.1835 Checking accuracy on validation set Got 79 / 1000 correct (7.90) Iteration 100, loss = 1.8005 Checking accuracy on validation set Got 313 / 1000 correct (31.30) Iteration 200, loss = 1.9470 Checking accuracy on validation set Got 364 / 1000 correct (36.40) Iteration 300, loss = 1.7294 Checking accuracy on validation set Got 394 / 1000 correct (39.40) Iteration 400, loss = 1.7160 Checking accuracy on validation set Got 418 / 1000 correct (41.80) Iteration 500, loss = 1.4666 Checking accuracy on validation set Got 432 / 1000 correct (43.20) Iteration 600, loss = 1.6232 Checking accuracy on validation set Got 447 / 1000 correct (44.70) Iteration 700, loss = 1.5544 Checking accuracy on validation set Got 458 / 1000 correct (45.80) ###Markdown Part IV. PyTorch Sequential APIPart III introduced the PyTorch Module API, which allows you to define arbitrary learnable layers and their connectivity. For simple models like a stack of feed forward layers, you still need to go through 3 steps: subclass `nn.Module`, assign layers to class attributes in `__init__`, and call each layer one by one in `forward()`. Is there a more convenient way? Fortunately, PyTorch provides a container Module called `nn.Sequential`, which merges the above steps into one. 
It is not as flexible as `nn.Module`, because you cannot specify more complex topology than a feed-forward stack, but it's good enough for many use cases.

Sequential API: Two-Layer Network
Let's see how to rewrite our two-layer fully connected network example with `nn.Sequential`, and train it using the training loop defined above.
Again, you don't need to tune any hyperparameters here, but you should achieve above 40% accuracy after one epoch of training.
###Code
# We need to wrap `flatten` function in a module in order to stack it
# in nn.Sequential
class Flatten(nn.Module):
    def forward(self, x):
        return flatten(x)

hidden_layer_size = 4000
learning_rate = 1e-2

model = nn.Sequential(
    Flatten(),
    nn.Linear(3 * 32 * 32, hidden_layer_size),
    nn.ReLU(),
    nn.Linear(hidden_layer_size, 10),
)

# you can use Nesterov momentum in optim.SGD
optimizer = optim.SGD(model.parameters(), lr=learning_rate,
                      momentum=0.9, nesterov=True)

train_part34(model, optimizer)
###Output
Iteration 0, loss = 2.3181
Checking accuracy on validation set
Got 136 / 1000 correct (13.60)

Iteration 100, loss = 1.7247
Checking accuracy on validation set
Got 366 / 1000 correct (36.60)

Iteration 200, loss = 1.7866
Checking accuracy on validation set
Got 421 / 1000 correct (42.10)

Iteration 300, loss = 1.5300
Checking accuracy on validation set
Got 423 / 1000 correct (42.30)

Iteration 400, loss = 1.6290
Checking accuracy on validation set
Got 448 / 1000 correct (44.80)

Iteration 500, loss = 1.8494
Checking accuracy on validation set
Got 414 / 1000 correct (41.40)

Iteration 600, loss = 1.5936
Checking accuracy on validation set
Got 421 / 1000 correct (42.10)

Iteration 700, loss = 1.8066
Checking accuracy on validation set
Got 460 / 1000 correct (46.00)
###Markdown
Sequential API: Three-Layer ConvNet
Here you should use `nn.Sequential` to define and train a three-layer ConvNet with the same architecture we used in Part III:
1. Convolutional layer (with bias) with 32 5x5 filters, with zero-padding of 2
2. ReLU
3. Convolutional layer (with bias) with 16 3x3 filters, with zero-padding of 1
4. ReLU
5. Fully-connected layer (with bias) to compute scores for 10 classes

You should initialize your weight matrices using the `random_weight` function defined above, and you should initialize your bias vectors using the `zero_weight` function above.
You should optimize your model using stochastic gradient descent with Nesterov momentum 0.9.
Again, you don't need to tune any hyperparameters but you should see accuracy above 55% after one epoch of training.
###Code
channel_1 = 32
channel_2 = 16
learning_rate = 1e-2

model = None
optimizer = None

################################################################################
# TODO: Rewrite the three-layer ConvNet with bias from Part III with the      #
# Sequential API.                                                              #
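# (Added note) nn.Sequential applies its children strictly in order, so the
# custom Flatten() module defined earlier must sit between the last conv/ReLU
# pair and nn.Linear, which expects 2-D input of shape (N, channel_2 * 32 * 32),
# the paddings having preserved the 32x32 spatial size.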
# ################################################################################ model = nn.Sequential( nn.Conv2d(3, channel_1, kernel_size=(5, 5), stride=1, padding=(2, 2)), nn.ReLU(), nn.Conv2d(channel_1, channel_2, kernel_size=(3, 3), stride=1, padding=(1, 1)), nn.ReLU(), Flatten(), nn.Linear(32*32*channel_2, 10) ) # you can use Nesterov momentum in optim.SGD optimizer = optim.SGD(model.parameters(), lr=learning_rate, momentum=0.9, nesterov=True) ################################################################################ # END OF YOUR CODE ################################################################################ train_part34(model, optimizer) ###Output Iteration 0, loss = 2.3163 Checking accuracy on validation set Got 134 / 1000 correct (13.40) Iteration 100, loss = 1.4038 Checking accuracy on validation set Got 432 / 1000 correct (43.20) Iteration 200, loss = 1.2995 Checking accuracy on validation set Got 509 / 1000 correct (50.90) Iteration 300, loss = 1.1850 Checking accuracy on validation set Got 503 / 1000 correct (50.30) Iteration 400, loss = 1.3431 Checking accuracy on validation set Got 530 / 1000 correct (53.00) Iteration 500, loss = 1.3092 Checking accuracy on validation set Got 552 / 1000 correct (55.20) Iteration 600, loss = 1.3287 Checking accuracy on validation set Got 569 / 1000 correct (56.90) Iteration 700, loss = 1.1478 Checking accuracy on validation set Got 566 / 1000 correct (56.60) ###Markdown Part V. CIFAR-10 open-ended challengeIn this section, you can experiment with whatever ConvNet architecture you'd like on CIFAR-10. Now it's your job to experiment with architectures, hyperparameters, loss functions, and optimizers to train a model that achieves **at least 70%** accuracy on the CIFAR-10 **validation** set within 10 epochs. You can use the check_accuracy and train functions from above. You can use either `nn.Module` or `nn.Sequential` API. Describe what you did at the end of this notebook.Here are the official API documentation for each component. One note: what we call in the class "spatial batch norm" is called "BatchNorm2D" in PyTorch.* Layers in torch.nn package: http://pytorch.org/docs/stable/nn.html* Activations: http://pytorch.org/docs/stable/nn.htmlnon-linear-activations* Loss functions: http://pytorch.org/docs/stable/nn.htmlloss-functions* Optimizers: http://pytorch.org/docs/stable/optim.html Things you might try:- **Filter size**: Above we used 5x5; would smaller filters be more efficient?- **Number of filters**: Above we used 32 filters. Do more or fewer do better?- **Pooling vs Strided Convolution**: Do you use max pooling or just stride convolutions?- **Batch normalization**: Try adding spatial batch normalization after convolution layers and vanilla batch normalization after affine layers. Do your networks train faster?- **Network architecture**: The network above has two layers of trainable parameters. Can you do better with a deep network? Good architectures to try include: - [conv-relu-pool]xN -> [affine]xM -> [softmax or SVM] - [conv-relu-conv-relu-pool]xN -> [affine]xM -> [softmax or SVM] - [batchnorm-relu-conv]xN -> [affine]xM -> [softmax or SVM]- **Global Average Pooling**: Instead of flattening and then having multiple affine layers, perform convolutions until your image gets small (7x7 or so) and then perform an average pooling operation to get to a 1x1 image picture (1, 1 , Filter), which is then reshaped into a (Filter) vector. 
This is used in [Google's Inception Network](https://arxiv.org/abs/1512.00567) (See Table 1 for their architecture).- **Regularization**: Add l2 weight regularization, or perhaps use Dropout. Tips for trainingFor each network architecture that you try, you should tune the learning rate and other hyperparameters. When doing this there are a couple important things to keep in mind:- If the parameters are working well, you should see improvement within a few hundred iterations- Remember the coarse-to-fine approach for hyperparameter tuning: start by testing a large range of hyperparameters for just a few training iterations to find the combinations of parameters that are working at all.- Once you have found some sets of parameters that seem to work, search more finely around these parameters. You may need to train for more epochs.- You should use the validation set for hyperparameter search, and save your test set for evaluating your architecture on the best parameters as selected by the validation set. Going above and beyondIf you are feeling adventurous there are many other features you can implement to try and improve your performance. You are **not required** to implement any of these, but don't miss the fun if you have time!- Alternative optimizers: you can try Adam, Adagrad, RMSprop, etc.- Alternative activation functions such as leaky ReLU, parametric ReLU, ELU, or MaxOut.- Model ensembles- Data augmentation- New Architectures - [ResNets](https://arxiv.org/abs/1512.03385) where the input from the previous layer is added to the output. - [DenseNets](https://arxiv.org/abs/1608.06993) where inputs into previous layers are concatenated together. - [This blog has an in-depth overview](https://chatbotslife.com/resnets-highwaynets-and-densenets-oh-my-9bb15918ee32) Have fun and happy training! ###Code ################################################################################ # TODO: # # Experiment with any architectures, optimizers, and hyperparameters. # # Achieve AT LEAST 70% accuracy on the *validation set* within 10 epochs. # # # # Note that you can use the check_accuracy function to evaluate on either # # the test set or the validation set, by passing either loader_test or # # loader_val as the second argument to check_accuracy. You should not touch # # the test set until you have finished your architecture and hyperparameter # # tuning, and only run the test set once at the end to report a final value. 
# ################################################################################ model = None optimizer = None # channel_1 = 48 # channel_2 = 24 # pad_1 = 1 # pad_2 = 1 # kernel_1 = 2 * pad_1 + 1 # kernel_2 = 2 * pad_2 + 1 model = nn.Sequential( nn.Conv2d(in_channels=3,out_channels=32,kernel_size=3,stride=1, padding=1), nn.Conv2d(in_channels=32,out_channels=64,kernel_size=3,stride=1, padding=1), nn.ReLU(inplace=True), nn.BatchNorm2d(num_features=64), nn.MaxPool2d(kernel_size=2,stride=2), nn.Conv2d(in_channels=64,out_channels=128,kernel_size=3,stride=1,padding=1), nn.Conv2d(in_channels=128,out_channels=256,kernel_size=3,stride=1,padding=1), nn.ReLU(inplace=True), nn.BatchNorm2d(num_features=256), nn.MaxPool2d(kernel_size=2,stride=2), Flatten(), nn.Linear(16384,1024), # 16384=64*32*32 input size nn.ReLU(inplace=True), nn.BatchNorm1d(num_features=1024), nn.Linear(1024,10), ) loss_fn = nn.CrossEntropyLoss() # optimizer = optim.Adam(params=model_gpu.parameters(), lr=1e-3) optimizer = optim.Adadelta(model.parameters(), lr=1.0, rho=0.9, eps=1e-06, weight_decay=0) train_part34(model, optimizer, epochs=10) # model = None # optimizer = None # channel_1 = 48 # channel_2 = 24 # # learning_rate = 1e-5 # # learning_momentum = 0.9 # pad_1 = 1 # pad_2 = 1 # kernel_1 = 2 * pad_1 + 1 # kernel_2 = 2 * pad_2 + 1 # model = nn.Sequential( # nn.Conv2d(3, channel_1, kernel_size=(kernel_1, kernel_1), # stride=1, padding=(pad_1, pad_1)), # # nn.BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True), # # nn.Softmax(dim=0), # nn.ReLU(), # nn.MaxPool2d(kernel_size=(2, 2)), # nn.Conv2d(channel_1, channel_2, kernel_size=(kernel_2, kernel_2), # stride=1, padding=(pad_2, pad_2)), # # nn.BatchNorm2d(16, eps=1e-05, momentum=0.1, affine=True), # # nn.Softmax(dim=0), # nn.ReLU(), # nn.MaxPool2d(kernel_size=(2, 2)), # Flatten(), # nn.Linear(8 * 8 * channel_2, 10) # ) # # you can use Nesterov momentum in optim.SGD # # optimizer = optim.SGD(model.parameters(), lr=learning_rate,momentum=learning_momentum, nesterov=True) # optimizer = optim.Adadelta(model.parameters(), lr=1.0, rho=0.9, eps=1e-06, weight_decay=0) # ################################################################################ # # END OF YOUR CODE # ################################################################################ # # accuracy: # # 66% # # 67.9% # # 68.9% # # 72% # # You should get at least 70% accuracy # train_part34(model, optimizer, epochs=10) ###Output Iteration 0, loss = 2.3201 Checking accuracy on validation set Got 114 / 1000 correct (11.40) Iteration 100, loss = 1.5856 Checking accuracy on validation set Got 495 / 1000 correct (49.50) Iteration 200, loss = 1.1109 Checking accuracy on validation set Got 555 / 1000 correct (55.50) Iteration 300, loss = 0.9352 Checking accuracy on validation set Got 641 / 1000 correct (64.10) Iteration 400, loss = 1.1815 Checking accuracy on validation set Got 613 / 1000 correct (61.30) Iteration 500, loss = 1.0509 Checking accuracy on validation set Got 669 / 1000 correct (66.90) Iteration 600, loss = 0.8350 Checking accuracy on validation set Got 684 / 1000 correct (68.40) Iteration 700, loss = 0.8219 Checking accuracy on validation set Got 667 / 1000 correct (66.70) Iteration 0, loss = 0.7875 Checking accuracy on validation set Got 710 / 1000 correct (71.00) Iteration 100, loss = 0.7809 Checking accuracy on validation set Got 737 / 1000 correct (73.70) Iteration 200, loss = 0.7297 Checking accuracy on validation set Got 714 / 1000 correct (71.40) Iteration 300, loss = 0.8799 Checking accuracy on validation 
set Got 737 / 1000 correct (73.70) Iteration 400, loss = 0.8326 Checking accuracy on validation set Got 746 / 1000 correct (74.60) Iteration 500, loss = 0.6341 Checking accuracy on validation set Got 720 / 1000 correct (72.00) Iteration 600, loss = 0.5888 Checking accuracy on validation set Got 751 / 1000 correct (75.10) Iteration 700, loss = 0.6924 Checking accuracy on validation set Got 766 / 1000 correct (76.60) Iteration 0, loss = 0.5622 Checking accuracy on validation set Got 758 / 1000 correct (75.80) Iteration 100, loss = 0.3687 Checking accuracy on validation set Got 773 / 1000 correct (77.30) Iteration 200, loss = 0.5916 Checking accuracy on validation set Got 782 / 1000 correct (78.20) Iteration 300, loss = 0.3237 Checking accuracy on validation set Got 774 / 1000 correct (77.40) Iteration 400, loss = 0.5440 Checking accuracy on validation set Got 769 / 1000 correct (76.90) Iteration 500, loss = 0.9410 Checking accuracy on validation set Got 773 / 1000 correct (77.30) Iteration 600, loss = 0.5636 Checking accuracy on validation set Got 778 / 1000 correct (77.80) Iteration 700, loss = 0.4803 Checking accuracy on validation set Got 769 / 1000 correct (76.90) Iteration 0, loss = 0.3897 Checking accuracy on validation set Got 787 / 1000 correct (78.70) Iteration 100, loss = 0.2614 Checking accuracy on validation set Got 794 / 1000 correct (79.40) Iteration 200, loss = 0.1775 Checking accuracy on validation set Got 774 / 1000 correct (77.40) Iteration 300, loss = 0.1869 Checking accuracy on validation set Got 779 / 1000 correct (77.90) Iteration 400, loss = 0.3393 Checking accuracy on validation set Got 792 / 1000 correct (79.20) Iteration 500, loss = 0.1709 Checking accuracy on validation set Got 761 / 1000 correct (76.10) Iteration 600, loss = 0.0797 Checking accuracy on validation set Got 793 / 1000 correct (79.30) Iteration 700, loss = 0.3197 Checking accuracy on validation set Got 784 / 1000 correct (78.40) Iteration 0, loss = 0.1266 Checking accuracy on validation set Got 781 / 1000 correct (78.10) Iteration 100, loss = 0.1061 Checking accuracy on validation set Got 799 / 1000 correct (79.90) Iteration 200, loss = 0.1033 Checking accuracy on validation set Got 793 / 1000 correct (79.30) Iteration 300, loss = 0.0898 Checking accuracy on validation set Got 804 / 1000 correct (80.40) Iteration 400, loss = 0.1138 Checking accuracy on validation set Got 793 / 1000 correct (79.30) Iteration 500, loss = 0.0557 Checking accuracy on validation set Got 789 / 1000 correct (78.90) Iteration 600, loss = 0.2430 Checking accuracy on validation set Got 787 / 1000 correct (78.70) Iteration 700, loss = 0.1706 Checking accuracy on validation set Got 793 / 1000 correct (79.30) Iteration 0, loss = 0.1478 Checking accuracy on validation set Got 742 / 1000 correct (74.20) Iteration 100, loss = 0.0706 Checking accuracy on validation set Got 792 / 1000 correct (79.20) Iteration 200, loss = 0.0265 Checking accuracy on validation set Got 802 / 1000 correct (80.20) Iteration 300, loss = 0.1135 Checking accuracy on validation set Got 788 / 1000 correct (78.80) Iteration 400, loss = 0.1728 Checking accuracy on validation set Got 786 / 1000 correct (78.60) Iteration 500, loss = 0.1027 Checking accuracy on validation set Got 794 / 1000 correct (79.40) Iteration 600, loss = 0.0768 Checking accuracy on validation set Got 789 / 1000 correct (78.90) Iteration 700, loss = 0.1079 Checking accuracy on validation set Got 787 / 1000 correct (78.70) Iteration 0, loss = 0.0443 Checking accuracy on validation set Got 791 
/ 1000 correct (79.10) Iteration 100, loss = 0.0441 Checking accuracy on validation set Got 806 / 1000 correct (80.60) Iteration 200, loss = 0.0099 Checking accuracy on validation set Got 802 / 1000 correct (80.20) Iteration 300, loss = 0.0388 Checking accuracy on validation set Got 796 / 1000 correct (79.60) Iteration 400, loss = 0.0141 Checking accuracy on validation set Got 814 / 1000 correct (81.40) Iteration 500, loss = 0.0675 Checking accuracy on validation set Got 778 / 1000 correct (77.80) Iteration 600, loss = 0.0577 Checking accuracy on validation set Got 801 / 1000 correct (80.10) Iteration 700, loss = 0.0349 Checking accuracy on validation set Got 788 / 1000 correct (78.80) Iteration 0, loss = 0.0318 Checking accuracy on validation set Got 803 / 1000 correct (80.30) Iteration 100, loss = 0.0056 Checking accuracy on validation set Got 800 / 1000 correct (80.00) Iteration 200, loss = 0.0115 Checking accuracy on validation set Got 806 / 1000 correct (80.60) Iteration 300, loss = 0.0260 Checking accuracy on validation set Got 805 / 1000 correct (80.50) Iteration 400, loss = 0.1340 Checking accuracy on validation set Got 804 / 1000 correct (80.40) Iteration 500, loss = 0.0139 Checking accuracy on validation set Got 809 / 1000 correct (80.90) Iteration 600, loss = 0.1500 Checking accuracy on validation set Got 805 / 1000 correct (80.50) Iteration 700, loss = 0.0668 Checking accuracy on validation set Got 813 / 1000 correct (81.30) Iteration 0, loss = 0.0103 Checking accuracy on validation set Got 792 / 1000 correct (79.20) Iteration 100, loss = 0.0510 Checking accuracy on validation set Got 799 / 1000 correct (79.90) Iteration 200, loss = 0.0424 Checking accuracy on validation set Got 809 / 1000 correct (80.90) Iteration 300, loss = 0.0154 Checking accuracy on validation set Got 817 / 1000 correct (81.70) Iteration 400, loss = 0.0100 Checking accuracy on validation set Got 800 / 1000 correct (80.00) Iteration 500, loss = 0.1010 Checking accuracy on validation set Got 797 / 1000 correct (79.70) Iteration 600, loss = 0.0081 Checking accuracy on validation set Got 805 / 1000 correct (80.50) Iteration 700, loss = 0.0543 Checking accuracy on validation set Got 807 / 1000 correct (80.70) Iteration 0, loss = 0.0212 Checking accuracy on validation set Got 801 / 1000 correct (80.10) Iteration 100, loss = 0.0073 Checking accuracy on validation set Got 809 / 1000 correct (80.90) Iteration 200, loss = 0.0026 Checking accuracy on validation set Got 807 / 1000 correct (80.70) Iteration 300, loss = 0.0032 Checking accuracy on validation set Got 809 / 1000 correct (80.90) Iteration 400, loss = 0.0043 Checking accuracy on validation set Got 797 / 1000 correct (79.70) Iteration 500, loss = 0.0031 Checking accuracy on validation set Got 804 / 1000 correct (80.40) Iteration 600, loss = 0.0564 Checking accuracy on validation set Got 806 / 1000 correct (80.60) Iteration 700, loss = 0.0093 Checking accuracy on validation set Got 810 / 1000 correct (81.00) ###Markdown Describe what you did In the cell below you should write an explanation of what you did, any additional features that you implemented, and/or any graphs that you made in the process of training and evaluating your network. TODO: Describe what you did Test set -- run this only onceNow that we've gotten a result we're happy with, we test our final model on the test set (which you should store in best_model). Think about how this compares to your validation set accuracy. 
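Before that single test run, it can be worth checkpointing the chosen model; the cell below is an added sketch (the file name is illustrative), using the standard state-dict idiom.
###Code
# Persist only the learned parameters; to restore, rebuild the same
# architecture and call model.load_state_dict(torch.load('best_model.pt')).
torch.save(model.state_dict(), 'best_model.pt')  # illustrative file name
###Output
_____no_output_____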
###Code best_model = model check_accuracy_part34(loader_test, best_model) ###Output Checking accuracy on test set Got 7895 / 10000 correct (78.95)
svm.scikit/svm_poly_pca.scikit_benchmark.ipynb
###Markdown MNIST digit recognition using SVC and PCA with Polynomial kernel > Using optimal parameters, fit to BOTH original and deskewed data--- ###Code from __future__ import division import os, time, math import cPickle as pickle import matplotlib.pyplot as plt import numpy as np import scipy import csv from operator import itemgetter from tabulate import tabulate from print_imgs import print_imgs # my own function to print a grid of square images from sklearn.preprocessing import StandardScaler from sklearn.utils import shuffle from sklearn.decomposition import PCA from sklearn.svm import SVC from sklearn.cross_validation import StratifiedKFold from sklearn.cross_validation import train_test_split from sklearn.grid_search import RandomizedSearchCV from sklearn.metrics import classification_report, confusion_matrix np.random.seed(seed=1009) %matplotlib inline #%qtconsole ###Output _____no_output_____ ###Markdown Where's the data? ###Code file_path = '../data/' train_img_deskewed_filename = 'train-images_deskewed.csv' train_img_original_filename = 'train-images.csv' test_img_deskewed_filename = 't10k-images_deskewed.csv' test_img_original_filename = 't10k-images.csv' train_label_filename = 'train-labels.csv' test_label_filename = 't10k-labels.csv' ###Output _____no_output_____ ###Markdown How much of the data will we use? ###Code portion = 1.0 # set to less than 1.0 for testing; set to 1.0 to use the entire dataset ###Output _____no_output_____ ###Markdown Read the training images and labels, both original and deskewed ###Code # read both trainX files with open(file_path + train_img_original_filename,'r') as f: data_iter = csv.reader(f, delimiter = ',') data = [data for data in data_iter] trainXo = np.ascontiguousarray(data, dtype = np.float64) with open(file_path + train_img_deskewed_filename,'r') as f: data_iter = csv.reader(f, delimiter = ',') data = [data for data in data_iter] trainXd = np.ascontiguousarray(data, dtype = np.float64) # vertically concatenate the two files trainX = np.vstack((trainXo, trainXd)) trainXo = None trainXd = None # read trainY twice and vertically concatenate with open(file_path + train_label_filename,'r') as f: data_iter = csv.reader(f, delimiter = ',') data = [data for data in data_iter] trainYo = np.ascontiguousarray(data, dtype = np.int8) trainYd = np.ascontiguousarray(data, dtype = np.int8) trainY = np.vstack((trainYo, trainYd)).ravel() trainYo = None trainYd = None data = None # shuffle trainX & trainY trainX, trainY = shuffle(trainX, trainY, random_state=0) # use less data if specified if portion < 1.0: trainX = trainX[:portion*trainX.shape[0]] trainY = trainY[:portion*trainY.shape[0]] print("trainX shape: {0}".format(trainX.shape)) print("trainY shape: {0}\n".format(trainY.shape)) print(trainX.flags) ###Output trainX shape: (120000, 784) trainY shape: (120000,) C_CONTIGUOUS : True F_CONTIGUOUS : False OWNDATA : True WRITEABLE : True ALIGNED : True UPDATEIFCOPY : False ###Markdown Read the DESKEWED test images and labels ###Code # read testX with open(file_path + test_img_deskewed_filename,'r') as f: data_iter = csv.reader(f, delimiter = ',') data = [data for data in data_iter] testX = np.ascontiguousarray(data, dtype = np.float64) # read testY with open(file_path + test_label_filename,'r') as f: data_iter = csv.reader(f, delimiter = ',') data = [data for data in data_iter] testY = np.ascontiguousarray(data, dtype = np.int8) # shuffle testX, testY testX, testY = shuffle(testX, testY, random_state=0) # use a smaller dataset if specified if portion < 
1.0: testX = testX[:portion*testX.shape[0]] testY = testY[:portion*testY.shape[0]] print("testX shape: {0}".format(testX.shape)) print("testY shape: {0}".format(testY.shape)) ###Output testX shape: (10000, 784) testY shape: (10000, 1) ###Markdown Use the smaller, fewer images for testing ###Code from sklearn.datasets import load_digits digits = load_digits() trainX, testX, trainY, testY = train_test_split(digits.data, digits.target, test_size=0.25, random_state=1009) ###Output _____no_output_____ ###Markdown Print a sample ###Code print_imgs(images = trainX, actual_labels = trainY, predicted_labels = trainY, starting_index = np.random.randint(0, high=trainY.shape[0]-36, size=1)[0], size = 6) ###Output _____no_output_____ ###Markdown PCA dimensionality reduction ###Code t0 = time.time() pca = PCA(n_components=0.85, whiten=True) trainX = pca.fit_transform(trainX) testX = pca.transform(testX) print("trainX shape: {0}".format(trainX.shape)) print("trainY shape: {0}\n".format(trainY.shape)) print("testX shape: {0}".format(testX.shape)) print("testY shape: {0}".format(testY.shape)) print("\ntime in minutes {0:.2f}".format((time.time()-t0)/60)) ###Output trainX shape: (120000, 54) trainY shape: (120000,) testX shape: (10000, 54) testY shape: (10000, 1) time in minutes 0.23 ###Markdown SVC Parameter Settings ###Code # default parameters for SVC # ========================== default_svc_params = {} default_svc_params['C'] = 1.0 # penalty default_svc_params['class_weight'] = None # Set the parameter C of class i to class_weight[i]*C # set to 'auto' for unbalanced classes default_svc_params['gamma'] = 0.0 # Kernel coefficient for 'rbf', 'poly' and 'sigmoid' default_svc_params['kernel'] = 'rbf' # 'linear', 'poly', 'rbf', 'sigmoid', 'precomputed' or a callable # use of 'sigmoid' is discouraged default_svc_params['shrinking'] = True # Whether to use the shrinking heuristic. default_svc_params['probability'] = False # Whether to enable probability estimates. default_svc_params['tol'] = 0.001 # Tolerance for stopping criterion. default_svc_params['cache_size'] = 200 # size of the kernel cache (in MB). default_svc_params['max_iter'] = -1 # limit on iterations within solver, or -1 for no limit. 
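# (added note) with kernel='poly', scikit-learn evaluates
# K(x, y) = (gamma * <x, y> + coef0) ** degree,
# so the gamma, degree and coef0 values set below interact with one another.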
default_svc_params['verbose'] = False default_svc_params['degree'] = 3 # 'poly' only default_svc_params['coef0'] = 0.0 # 'poly' and 'sigmoid' only # set the parameters for the classifier # ===================================== svc_params = dict(default_svc_params) svc_params['cache_size'] = 2000 svc_params['probability'] = True svc_params['kernel'] = 'poly' svc_params['C'] = 1.0 svc_params['gamma'] = 0.1112 svc_params['degree'] = 3 svc_params['coef0'] = 1 # create the classifier itself # ============================ svc_clf = SVC(**svc_params) ###Output _____no_output_____ ###Markdown Fit the training data ###Code t0 = time.time() svc_clf.fit(trainX, trainY) print("\ntime in minutes {0:.2f}".format((time.time()-t0)/60)) ###Output time in minutes 13.36 ###Markdown Predict the test set and analyze the result ###Code target_names = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"] predicted_values = svc_clf.predict(testX) y_true, y_pred = testY, predicted_values print(classification_report(y_true, y_pred, target_names=target_names)) def plot_confusion_matrix(cm, target_names, title='Proportional Confusion matrix', cmap=plt.cm.Paired): """ given a confusion matrix (cm), make a nice plot see the skikit-learn documentation for the original done for the iris dataset """ plt.figure(figsize=(8, 6)) plt.imshow((cm/cm.sum(axis=1)), interpolation='nearest', cmap=cmap) plt.title(title) plt.colorbar() tick_marks = np.arange(len(target_names)) plt.xticks(tick_marks, target_names, rotation=45) plt.yticks(tick_marks, target_names) plt.tight_layout() plt.ylabel('True label') plt.xlabel('Predicted label') cm = confusion_matrix(y_true, y_pred) print(cm) model_accuracy = sum(cm.diagonal())/len(testY) model_misclass = 1 - model_accuracy print("\nModel accuracy: {0}, model misclass rate: {1}".format(model_accuracy, model_misclass)) plot_confusion_matrix(cm, target_names) ###Output [[ 977 0 0 1 0 0 2 0 0 0] [ 0 1127 4 1 0 0 2 0 1 0] [ 1 1 1016 1 2 0 1 5 5 0] [ 0 0 4 996 0 3 0 2 4 1] [ 1 0 1 0 971 0 3 1 0 5] [ 1 0 0 3 0 885 2 0 1 0] [ 4 3 1 0 3 1 946 0 0 0] [ 1 2 4 0 3 0 0 1015 0 3] [ 3 0 1 2 1 3 0 1 960 3] [ 4 2 1 3 8 1 1 4 1 984]] Model accuracy: 0.9877, model misclass rate: 0.0123 ###Markdown Learning Curves see http://scikit-learn.org/stable/auto_examples/model_selection/plot_learning_curve.html * The score is the model accuracy* The red line shows how well the model fits the data it was trained on: * a high score indicates low bias ... the model does fit the training data * it's not unusual for the red line to start at 1.00 and decline slightly * a low score indicates the model does not fit the training data ... more predictor variables are ususally indicated, or a different model * The green line shows how well the model predicts the test data: if it's rising then it means more data to train on will produce better predictions ###Code t0 = time.time() from sklearn.learning_curve import learning_curve from sklearn.cross_validation import ShuffleSplit def plot_learning_curve(estimator, title, X, y, ylim=None, cv=None, n_jobs=1, train_sizes=np.linspace(.1, 1.0, 5)): """ Generate a simple plot of the test and training learning curve. Parameters ---------- estimator : object type that implements the "fit" and "predict" methods An object of that type which is cloned for each validation. title : string Title for the chart. X : array-like, shape (n_samples, n_features) Training vector, where n_samples is the number of samples and n_features is the number of features. 
y : array-like, shape (n_samples) or (n_samples, n_features), optional Target relative to X for classification or regression; None for unsupervised learning. ylim : tuple, shape (ymin, ymax), optional Defines minimum and maximum yvalues plotted. cv : integer, cross-validation generator, optional If an integer is passed, it is the number of folds (defaults to 3). Specific cross-validation objects can be passed, see sklearn.cross_validation module for the list of possible objects n_jobs : integer, optional Number of jobs to run in parallel (default 1). """ plt.figure(figsize=(8, 6)) plt.title(title) if ylim is not None: plt.ylim(*ylim) plt.xlabel("Training examples") plt.ylabel("Score") train_sizes, train_scores, test_scores = learning_curve( estimator, X, y, cv=cv, n_jobs=n_jobs, train_sizes=train_sizes) train_scores_mean = np.mean(train_scores, axis=1) train_scores_std = np.std(train_scores, axis=1) test_scores_mean = np.mean(test_scores, axis=1) test_scores_std = np.std(test_scores, axis=1) plt.grid() plt.fill_between(train_sizes, train_scores_mean - train_scores_std, train_scores_mean + train_scores_std, alpha=0.1, color="r") plt.fill_between(train_sizes, test_scores_mean - test_scores_std, test_scores_mean + test_scores_std, alpha=0.1, color="g") plt.plot(train_sizes, train_scores_mean, 'o-', color="r", label="Training score") plt.plot(train_sizes, test_scores_mean, 'o-', color="g", label="Cross-validation score") plt.tight_layout() plt.legend(loc="best") return plt C_gamma = "C="+str(np.round(svc_params['C'],4))+", gamma="+str(np.round(svc_params['gamma'],6)) title = "Learning Curves (SVM, Poly, " + C_gamma + ")" plot_learning_curve(estimator = svc_clf, title = title, X = trainX, y = trainY, ylim = (0.85, 1.01), cv = ShuffleSplit(n = trainX.shape[0], n_iter = 5, test_size = 0.2, random_state=0), n_jobs = 8) plt.show() print("\ntime in minutes {0:.2f}".format((time.time()-t0)/60)) ###Output _____no_output_____
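###Markdown
Persisting the fitted pipeline
`cPickle` is imported at the top of this notebook but never used; below is a minimal added sketch for saving the fitted PCA transform together with the classifier (the file name is illustrative), assuming `pca` and `svc_clf` from the cells above.
###Code
# Store both objects so test-time data can be reduced with the same components.
with open('svc_poly_pca.pkl', 'wb') as f:
    pickle.dump({'pca': pca, 'svc': svc_clf}, f, protocol=pickle.HIGHEST_PROTOCOL)
###Output
_____no_output_____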
src/3) Bagging_Guassian.ipynb
###Markdown
Ensemble Method: Bagging
###Code
from sklearn.ensemble import BaggingClassifier
from sklearn.naive_bayes import GaussianNB  # needed for the base estimator below

bagging = BaggingClassifier(base_estimator = GaussianNB(),
                            n_jobs = -1,
                            random_state = 0,
                            n_estimators = 100,
                            bootstrap = False,
                            max_samples = 0.6,
                            max_features = 0.6,
                            verbose = 2)
bagging.fit(x_train[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]], y_train)

y_pred = bagging.predict(x_test[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]])

from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
cm
###Output
[Parallel(n_jobs=12)]: Using backend LokyBackend with 12 concurrent workers.
[Parallel(n_jobs=12)]: Done 3 out of 12 | elapsed: 4.6s remaining: 14.1s
[Parallel(n_jobs=12)]: Done 10 out of 12 | elapsed: 4.9s remaining: 0.9s
[Parallel(n_jobs=12)]: Done 12 out of 12 | elapsed: 4.9s finished
[Parallel(n_jobs=12)]: Using backend LokyBackend with 12 concurrent workers.
[Parallel(n_jobs=12)]: Done 3 out of 12 | elapsed: 0.9s remaining: 3.0s
[Parallel(n_jobs=12)]: Done 10 out of 12 | elapsed: 1.0s remaining: 0.1s
[Parallel(n_jobs=12)]: Done 12 out of 12 | elapsed: 1.1s finished
###Markdown
Ensemble Method: AdaBoost
###Code
from sklearn.ensemble import AdaBoostClassifier

ada_boost = AdaBoostClassifier(base_estimator = GaussianNB(),
                               n_estimators = 100,
                               algorithm = 'SAMME.R')
ada_boost.fit(x_train[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]], y_train)

from sklearn.metrics import confusion_matrix
y_pred = ada_boost.predict(x_test[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]])
cm = confusion_matrix(y_test, y_pred)
cm
###Output
_____no_output_____
###Markdown
Applying AdaBoost on the Bagging Classifier
###Code
from sklearn.ensemble import BaggingClassifier
bagging = BaggingClassifier(base_estimator = GaussianNB(),
                            n_jobs = -1,
                            random_state = 0,
                            n_estimators = 100,
                            bootstrap = False,
                            max_samples = 0.6,
                            max_features = 0.6,
                            verbose = 2)

from sklearn.ensemble import AdaBoostClassifier
ada_boost = AdaBoostClassifier(base_estimator = bagging,
                               n_estimators = 100,
                               algorithm = 'SAMME.R')
ada_boost.fit(x_train[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]], y_train)

from sklearn.metrics import confusion_matrix
y_pred = ada_boost.predict(x_test[:, [2, 3, 5, 8, 10, 11, 12, 13, 14, 15, 16, 17, 18, 21, 23, 24, 25, 28]])
cm = confusion_matrix(y_test, y_pred)
cm
###Output
_____no_output_____
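###Markdown
The cells above stop at raw confusion matrices; a short added sketch for turning the last predictions into summary scores (assuming `y_test` and `y_pred` from the cell above):
###Code
from sklearn.metrics import accuracy_score, classification_report

# Overall accuracy plus per-class precision/recall/F1 for the boosted bagging model.
print('accuracy:', accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))
###Output
_____no_output_____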
tareas/tarea_02/tarea_02.ipynb
###Markdown
Assignment N°02

Instructions

1.- Fill in your personal details (name and USM student ID) in the following cell.

**Name**: Felipe Pérez

**Student ID (Rol)**: 201610530-1

2.- You must push this file with your changes to your personal course repository, including data, images, scripts, etc.

3.- The following will be graded:
- Solutions
- Code
- That Binder is properly configured.
- On pressing `Kernel -> Restart Kernel and Run All Cells`, every cell must run without errors.

I.- Digit classification

In this lab we will work on recognizing a digit from an image.

![rgb](https://www.wolfram.com/language/11/neural-networks/assets.en/digit-classification/smallthumb_1.png)

The goal is to produce the best possible prediction for each image from the data. To do so, the classic steps of a _Machine Learning_ project are needed, such as descriptive statistics, visualization and preprocessing.

* You are asked to fit at least three classification models:
    * Logistic regression
    * K-Nearest Neighbours
    * One or more algorithms of your choice [link](https://scikit-learn.org/stable/supervised_learning.html#supervised-learning) (it is mandatory to pick an _estimator_ that has at least one hyperparameter).
* For models that have hyperparameters, it is mandatory to search for the best value(s) with some technique available in `scikit-learn` ([see more](https://scikit-learn.org/stable/modules/grid_search.html#tuning-the-hyper-parameters-of-an-estimator)).
* For each model, run _Cross Validation_ with 10 _folds_ on the training data in order to derive a confidence interval for the model's _score_.
* Make a prediction with each of the three models on the _test_ data and obtain the _score_.
* Analyze the error metrics (**accuracy**, **precision**, **recall**, **f-score**)

Data exploration

The dataset to be used is loaded below, through the `datasets` sub-module of `sklearn`.
###Code
import numpy as np
import pandas as pd
from sklearn import datasets
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix, accuracy_score, recall_score, precision_score, f1_score

%matplotlib inline
#%qtconsole

digits_dict = datasets.load_digits()
print(digits_dict["DESCR"])

digits_dict.keys()

digits_dict["target"]
###Output
_____no_output_____
###Markdown
Next, a dataframe named `digits` is built from the `digits_dict` data so that it has 65 columns: the first 64 hold the grayscale image representation (0 = white, 16 = black) and the last one holds the digit (`target`) under the name _target_.
###Code
digits = (
    pd.DataFrame(
        digits_dict["data"],
    )
    .rename(columns=lambda x: f"c{x:02d}")
    .assign(target=digits_dict["target"])
    .astype(int)
)

digits.head()
###Output
_____no_output_____
###Markdown
Exercise 1
**Exploratory analysis:** Carry out your exploratory analysis, and don't forget anything! Remember, each analysis should answer a question.

Some suggestions:
* How is the data distributed?
* How much memory am I using?
* What data types are these?
* How many records are there per class?
* Are there records inconsistent with your prior knowledge of the data? (see the range-check sketch right below)
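A quick added sketch for the last question, checking value ranges against prior knowledge of the dataset (`load_digits` pixels are integers in 0-16 and targets are the digits 0-9):
###Code
# Sanity-check: every pixel should lie in [0, 16] and every target in [0, 9].
pixels = digits.drop(columns="target")
print("pixel range:", pixels.values.min(), "-", pixels.values.max())
print("target range:", digits["target"].min(), "-", digits["target"].max())
print("out-of-range pixels:", int(((pixels < 0) | (pixels > 16)).sum().sum()))
###Output
_____no_output_____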
###Code
## FIX ME PLEASE
print('The possible targets are:', digits['target'].unique())

print('The data is distributed as follows')
sns.countplot(x='target', data=digits)

digits.describe()

digits.info(memory_usage='deep')
print('We can see that the data is integer-typed, 456.4 kB of memory is in use, and there are no null values')

nx, ny = 8, 8
columnas = digits.columns
i = 0
fig, axs = plt.subplots(nx, ny, figsize=(25, 25))
for y in range(ny):
    for x in range(nx):
        sns.distplot(digits[columnas[i]], hist=True, rug=False, ax=axs[x, y], label=columnas[i])
        i += 1
print("We can see that at least 10 features are not relevant")
###Output
We can see that at least 10 features are not relevant
###Markdown
Exercise 2
**Visualization:** To visualize the data we will use the `imshow` method from `matplotlib`. The array has to be converted from shape (1, 64) to (8, 8) so that the image is square and the digit can be made out. We will also superimpose the label corresponding to each digit with the `text` method, which lets us compare the generated image against the label associated with the values. We will do this for the first 25 records of the file.
###Code
digits_dict["images"][0]
###Output
_____no_output_____
###Markdown
Visualize images of the digits using the `images` key of `digits_dict`. Hint: use `plt.subplots` and the `imshow` method. You can build a grid of several images at once!
###Code
nx, ny = 5, 5
fig, axs = plt.subplots(nx, ny, figsize=(12, 12))
n = 0
for i in range(5):
    for j in range(5):
        axs[i, j].imshow(digits_dict["images"][n])
        # Draw the label on the corresponding subplot (the original used
        # plt.text, which targets the current axes instead of axs[i, j]).
        axs[i, j].text(0.5, 0.5, digits_dict["target"][n],
                       horizontalalignment='center', verticalalignment='center',
                       fontsize=14, color='r', transform=axs[i, j].transAxes)
        n = n + 1
###Output
_____no_output_____
###Markdown
Exercise 3
**Machine Learning**: In this part you must train the different models chosen from the `sklearn` library. For each model, carry out the following steps:
* **train-test**
    * Create training and test sets (you decide the appropriate proportions).
    * Print the sizes of the training and test sets.
* **model**:
    * Instantiate the target model from the sklearn library.
    * *Hyper-parameters*: Use `sklearn.model_selection.GridSearchCV` to obtain the best estimate of the target model's parameters (see the sketch right after this prompt).
* **Metrics**:
    * Plot the confusion matrix.
    * Analyze the error metrics.

__Questions to answer:__
* Which model is best based on its metrics?
* Which model takes the least time to fit?
* Which model do you choose?
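The solution below fits fixed hyperparameter values; the prompt also asks for a `GridSearchCV` pass, so here is a minimal added sketch of what that search could look like for the KNN model (the grid values are illustrative, and `X_train`/`y_train` are only created in the next cell, hence the commented calls):
###Code
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Illustrative grid over the main KNN hyperparameters.
param_grid = {"n_neighbors": [3, 5, 7, 9], "weights": ["uniform", "distance"]}
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5, scoring="accuracy")
# search.fit(X_train, y_train)
# print(search.best_params_, search.best_score_)
###Output
_____no_output_____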
###Code from sklearn.model_selection import train_test_split from sklearn.model_selection import GridSearchCV from sklearn.linear_model import LogisticRegression from sklearn.neighbors import KNeighborsClassifier from sklearn.tree import DecisionTreeClassifier from time import time X = digits.drop(columns="target").values y = digits["target"].values # split dataset X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=27) print('Training set size', y_train.shape[0]) print('Test set size', y_test.shape[0]) n_neighbors = 7 max_depth = 5 lr = LogisticRegression(solver='liblinear').fit(X_train, y_train) knn = KNeighborsClassifier(n_neighbors).fit(X_train, y_train) dtc = DecisionTreeClassifier(max_depth=max_depth).fit(X_train, y_train) y_true = list(y_test) lr_pred = list(lr.predict(X_test)) knn_pred = list(knn.predict(X_test)) dtc_pred = list(dtc.predict(X_test)) modelos = ['lr', 'knn', 'dtc'] accuracy = [accuracy_score(y_true, lr_pred), accuracy_score(y_true, knn_pred), accuracy_score(y_true, dtc_pred)] recall = [recall_score(y_true, lr_pred, average='weighted'), recall_score(y_true, knn_pred, average='weighted'), recall_score(y_true, dtc_pred, average='weighted')] precision = [precision_score(y_true, lr_pred, average='weighted'), precision_score(y_true, knn_pred, average='weighted'), precision_score(y_true, dtc_pred, average='weighted')] fscore = [f1_score(y_true, lr_pred, average='weighted'), f1_score(y_true, knn_pred, average='weighted'), f1_score(y_true, dtc_pred, average='weighted')] Comparacion = pd.DataFrame({'Modelo': modelos, 'accuracy': accuracy, 'recall': recall, 'precision': precision, 'f-score': fscore}, columns=['Modelo', 'accuracy', 'recall', 'precision', 'f-score']) Comparacion # lr fit time tiempo_inicial = time() LogisticRegression(solver='liblinear').fit(X_train, y_train) tiempo_final = time() t_lr = tiempo_final - tiempo_inicial # knn fit time tiempo_inicial = time() KNeighborsClassifier(n_neighbors).fit(X_train, y_train) tiempo_final = time() t_knn = tiempo_final - tiempo_inicial # dtc fit time tiempo_inicial = time() DecisionTreeClassifier(max_depth=max_depth).fit(X_train, y_train) tiempo_final = time() t_dtc = tiempo_final - tiempo_inicial # add fit times to the dataframe Comparacion['tiempo'] = [t_lr, t_knn, t_dtc] Comparacion # lr confusion matrix print('\nConfusion matrix lr:\n ') print(confusion_matrix(y_true, lr_pred)) # knn confusion matrix print('\nConfusion matrix knn:\n ') print(confusion_matrix(y_true, knn_pred)) # dtc confusion matrix print('\nConfusion matrix dtc:\n ') print(confusion_matrix(y_true, dtc_pred)) ###Output Confusion matrix dtc: [[35 0 0 0 3 0 1 0 0 0] [ 0 10 0 0 4 6 3 0 14 0] [ 0 3 35 2 0 1 0 0 2 0] [ 0 5 2 33 0 2 0 0 2 2] [ 0 1 0 0 32 1 8 2 3 3] [ 0 0 0 0 3 40 2 0 2 2] [ 0 0 1 0 2 2 47 0 0 0] [ 0 9 0 1 0 0 0 26 2 0] [ 0 9 7 1 0 0 0 0 28 0] [ 0 2 0 0 1 1 0 0 32 15]] ###Markdown Based on its metrics the best model is knn, while the fastest to fit is dtc. Finally, the model I would choose is knn, since it has the best metrics and its fit time is not that large compared with the rest. 
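The statement also asks to *plot* the confusion matrix, while the cells above only print it; a minimal seaborn heatmap sketch, assuming `y_true` and `knn_pred` from the cells above are still in memory. ###Code plt.figure(figsize=(8, 6)) # an annotated heatmap makes the off-diagonal (misclassified) cells easy to spot sns.heatmap(confusion_matrix(y_true, knn_pred), annot=True, fmt='d', cmap='Blues') plt.xlabel('Predicted label') plt.ylabel('True label') plt.show() ###Markdown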
Exercise 4 __Understanding the model:__ Taking the best model found in `Exercise 3`, you must thoroughly understand and interpret the results and plots associated with the model under study. To do so, work through the following points: * **Cross validation**: using **cv** (with n_fold = 10), derive a kind of "confidence interval" for one of the metrics studied in class: * $\mu \pm \sigma$ = mean $\pm$ standard deviation * **Validation curve**: replicate the example from the following [link](https://scikit-learn.org/stable/auto_examples/model_selection/plot_validation_curve.html#sphx-glr-auto-examples-model-selection-plot-validation-curve-py) but with the appropriate model, parameters and metric. Draw conclusions from the plot. * **AUC-ROC curve**: replicate the example from the following [link](https://scikit-learn.org/stable/auto_examples/model_selection/plot_roc.html#sphx-glr-auto-examples-model-selection-plot-roc-py) but with the appropriate model, parameters and metric. Draw conclusions from the plot. ###Code from sklearn.model_selection import cross_val_score from sklearn.model_selection import validation_curve from sklearn.preprocessing import label_binarize from sklearn.multiclass import OneVsRestClassifier from sklearn.metrics import roc_curve, auc from sklearn.metrics import roc_auc_score from scipy import interp # confidence interval scores = cross_val_score(knn, X, y, cv=10) print("Accuracy:", scores.mean()) print("Standard deviation:", scores.std()) # validation curve param_range = np.arange(1, 10, 1) train_scores, test_scores = validation_curve( KNeighborsClassifier(n_neighbors=5), X, y, param_name="n_neighbors", param_range=param_range, scoring="accuracy", n_jobs=1) train_scores_mean = np.mean(train_scores, axis=1) train_scores_std = np.std(train_scores, axis=1) test_scores_mean = np.mean(test_scores, axis=1) test_scores_std = np.std(test_scores, axis=1) plt.figure(figsize=(15,5)) plt.title("Validation Curve with knn") plt.xlabel("n_neighbors") plt.ylabel("Score") plt.ylim(0.0, 1.1) lw = 2 plt.plot(param_range, train_scores_mean, label="Training score", color="darkorange", lw=lw) plt.fill_between(param_range, train_scores_mean - train_scores_std, train_scores_mean + train_scores_std, alpha=0.2, color="darkorange", lw=lw) plt.plot(param_range, test_scores_mean, label="Cross-validation score", color="navy", lw=lw) plt.fill_between(param_range, test_scores_mean - test_scores_std, test_scores_mean + test_scores_std, alpha=0.2, color="navy", lw=lw) plt.legend(loc="best") plt.show() # AUC-ROC curve # Binarize the output y = label_binarize(y, classes=[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]) n_classes = y.shape[1] # shuffle and split training and test sets X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.5, random_state=0) # Learn to predict each class against the other classifier = OneVsRestClassifier(KNeighborsClassifier(n_neighbors)) y_score = classifier.fit(X_train, y_train).predict(X_test) # Compute ROC curve and ROC area for each class fpr = dict() tpr = dict() roc_auc = dict() for i in range(n_classes): fpr[i], tpr[i], _ = roc_curve(y_test[:, i], y_score[:, i]) roc_auc[i] = auc(fpr[i], tpr[i]) # Compute micro-average ROC curve and ROC area fpr["micro"], tpr["micro"], _ = roc_curve(y_test.ravel(), y_score.ravel()) roc_auc["micro"] = auc(fpr["micro"], tpr["micro"]) # First aggregate all false positive rates all_fpr = np.unique(np.concatenate([fpr[i] for i in range(n_classes)])) # Then interpolate 
all ROC curves at these points mean_tpr = np.zeros_like(all_fpr) for i in range(n_classes): mean_tpr += interp(all_fpr, fpr[i], tpr[i]) # Finally average it and compute AUC mean_tpr /= n_classes fpr["macro"] = all_fpr tpr["macro"] = mean_tpr roc_auc["macro"] = auc(fpr["macro"], tpr["macro"]) # Plot all ROC curves plt.figure() plt.plot(fpr["micro"], tpr["micro"], label='micro-average ROC curve (area = {0:0.2f})' ''.format(roc_auc["micro"]), color='deeppink', linestyle=':', linewidth=4) plt.plot(fpr["macro"], tpr["macro"], label='macro-average ROC curve (area = {0:0.2f})' ''.format(roc_auc["macro"]), color='navy', linestyle=':', linewidth=4) for i in range(n_classes): plt.plot(fpr[i], tpr[i], lw=lw, label='ROC curve of class {0} (area = {1:0.2f})' ''.format(i, roc_auc[i])) #plt.figure(figsize=(15,5)) plt.plot([0, 1], [0, 1], 'k--', lw=lw) plt.xlim([0.0, 1.0]) plt.ylim([0.0, 1.05]) plt.xlabel('False Positive Rate') plt.ylabel('True Positive Rate') plt.title('Some extension of Receiver operating characteristic to multi-class') plt.legend(loc="lower right") plt.show() ###Output <ipython-input-52-9a867e3afff6>:7: DeprecationWarning: scipy.interp is deprecated and will be removed in SciPy 2.0.0, use numpy.interp instead mean_tpr += interp(all_fpr, fpr[i], tpr[i]) ###Markdown Exercise 5__Dimensionality reduction:__ Taking the best model found in `Exercise 3`, you must perform a dimensionality reduction of the dataset. Tackle the problem using the two approaches seen in class: * **Feature selection*** **Feature extraction**__Questions to answer:__Once the dimensionality reduction is done, produce some comparative statistics and plots between the original dataset and the new dataset (dataset size, model fit time, etc.) From the data analysis we saw that there are 10 features that are not relevant ###Code from sklearn.feature_selection import SelectKBest from sklearn.feature_selection import f_classif from sklearn.preprocessing import StandardScaler from sklearn.decomposition import PCA # Separate the target column x_training = digits.drop(['target',], axis=1) y_training = digits['target'] # Applying the univariate F-test algorithm. 
k = 64-10 # number of features to select columnas = list(x_training.columns.values) seleccionadas = SelectKBest(f_classif, k=k).fit(x_training, y_training) catrib = seleccionadas.get_support() atributos = [columnas[i] for i in list(catrib.nonzero()[0])] atributos X = digits[atributos] y = y_training X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=0.2, random_state=2) model = KNeighborsClassifier(n_neighbors) model.fit(X_train, Y_train) x = StandardScaler().fit_transform(x_training.values) pca = PCA(n_components=k) principalComponents = pca.fit_transform(x) # plot variance per component percent_variance = np.round(pca.explained_variance_ratio_ * 100, decimals=2) columns = list(range(0, k)) plt.figure(figsize=(18, 4)) plt.bar(x=range(0, k), height=percent_variance, tick_label=columns) plt.ylabel('Percentage of Variance Explained') plt.xlabel('Principal Component') plt.title('PCA Scree Plot') plt.show() # plot the cumulative variance over components percent_variance_cum = np.cumsum(percent_variance) plt.figure(figsize=(18, 4)) plt.bar(x=range(0, k), height=percent_variance_cum, tick_label=columns) plt.ylabel('Percentage of Variance Explained') plt.xlabel('Principal Component Cumsum') plt.title('PCA Scree Plot') plt.show() ###Output _____no_output_____ ###Markdown We can see that from component 21 onward we capture more than 80% ###Code pca = PCA(n_components=21) principalComponents = pca.fit_transform(x) principalDataframe = pd.DataFrame(data=principalComponents, columns=range(0, 21)) targetDataframe = digits[['target']] newDataframe = pd.concat([principalDataframe, targetDataframe], axis=1) newDataframe.head() X_train, X_test, Y_train, Y_test = train_test_split(principalDataframe, targetDataframe, test_size=0.2, random_state=2) model = KNeighborsClassifier(n_neighbors) model.fit(X_train, Y_train) y_pred = model.predict(X_test) # metrics for the knn model fitted on the PCA components modelos = ['knn + PCA'] accuracy = [accuracy_score(Y_test, y_pred)] recall = [recall_score(Y_test, y_pred, average='weighted')] precision = [precision_score(Y_test, y_pred, average='weighted')] fscore = [f1_score(Y_test, y_pred, average='weighted')] metricas = pd.DataFrame({'Modelo': modelos, 'accuracy': accuracy, 'recall': recall, 'precision': precision, 'f-score': fscore}, columns=['Modelo', 'accuracy', 'recall', 'precision', 'f-score']) metricas ###Output _____no_output_____ ###Markdown From this we can see that the metrics obtained are good. Exercise 6__Visualizing results:__ Below, code is provided to compare the predicted labels against the true labels of the _test_ set. ###Code def mostar_resultados(digits,model,nx=5, ny=5,label = "correctos"): """ Shows the prediction results of a given classification model. Result values are taken at random. - label == 'correctos': returns the cases where the model is right. - label == 'incorrectos': returns the cases where the model is wrong. Note: the model passed as an argument must NOT be already 'trained'. 
:param digits: 'digits' dataset :param model: sklearn model :param nx: number of rows (subplots) :param ny: number of columns (subplots) :param label: correct or incorrect data :return: matplotlib plots """ X = digits.drop(columns="target").values Y = digits["target"].values X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state = 42) model.fit(X_train, Y_train) # fitting the model y_pred = list(model.predict(X_test)) # Show the correctly classified data if label=="correctos": mask = (y_pred == Y_test) color = "green" # Show the misclassified data elif label=="incorrectos": mask = (y_pred != Y_test) color = "red" else: raise ValueError("Invalid value") X_aux = X_test[mask] y_aux_true = Y_test[mask] y_aux_pred = np.array(y_pred)[mask] # We'll plot the first 100 examples, randomly chosen fig, ax = plt.subplots(nx, ny, figsize=(12,12)) for i in range(nx): for j in range(ny): index = j + ny * i data = X_aux[index, :].reshape(8,8) label_pred = str(int(y_aux_pred[index])) label_true = str(int(y_aux_true[index])) ax[i][j].imshow(data, interpolation='nearest', cmap='gray_r') ax[i][j].text(0, 0, label_pred, horizontalalignment='center', verticalalignment='center', fontsize=10, color=color) ax[i][j].text(7, 0, label_true, horizontalalignment='center', verticalalignment='center', fontsize=10, color='blue') ax[i][j].get_xaxis().set_visible(False) ax[i][j].get_yaxis().set_visible(False) plt.show() ###Output _____no_output_____ ###Markdown **Question*** Taking the best model found in `Exercise 3`, plot the results when: * the predicted and original values are equal * the predicted and original values differ * When the predicted and original values differ, why do these failures happen? ###Code mostar_resultados(digits,KNeighborsClassifier(n_neighbors),nx=5, ny=5,label = "correctos") mostar_resultados(digits,KNeighborsClassifier(n_neighbors),nx=2, ny=2,label = "incorrectos") ###Output _____no_output_____
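###Markdown To dig into why these failures happen, a small sketch that ranks the most-confused digit pairs from the knn predictions; it assumes `y_true` and `knn_pred` from Exercise 3 are still in memory. ###Code cm = confusion_matrix(y_true, knn_pred) errors = [(cm[i, j], i, j) for i in range(10) for j in range(10) if i != j and cm[i, j] > 0] # most frequent true/predicted mix-ups first; visually similar digits tend to dominate for count, true_digit, pred_digit in sorted(errors, reverse=True)[:5]: print('true', true_digit, 'predicted as', pred_digit, ':', count, 'times') ###Output _____no_output_____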
examples/best-available-pixel/best-available-pixel-composite.ipynb
###Markdown BEST AVAILABLE PIXEL COMPOSITE (BAP) *in Google Earth Engine Python API*based on *Pixel-Based Image Compositing for Large-Area Dense Time Series Applications and Science (White, 2014)*https://goo.gl/Fi8fCY* **Author**: Rodrigo E. Principe* **email**: fitoprincipe82@gmail.com* **GitHub**: github.com/fitoprincipe* **Repository GeeBap**: github.com/fitoprincipe/geebap Install Packages ###Code import sys !{sys.executable} -m pip install --upgrade geebap ###Output _____no_output_____ ###Markdown Make imports ###Code # Import Earth Engine and initialize import ee ee.Initialize() # Import packages import geebap from geebap import bap, season, filters, masks, \ scores, satcol, functions from geetools import tools import pprint print('version', geebap.__version__) ###Output ('version', '0.0.9') ###Markdown SeasonThis object holds information of the growing season (start, end and doy *best day of year*). You can make your own, or use 2 pre-made: `Growing_South` and `Growing_North`. This object does not hold any year, just day and month. For example, the pre-made `Growing_South` starts on November 15 (11-15) and ends on March 15 (03-15). But it has a method to add a year, see the example in the code: ###Code a_season = season.Season.Growing_South() ini, end = a_season.add_year(2000) print ini, end ###Output _____no_output_____ ###Markdown Note that when the season covers two years, the start date is in the previous year. CollectionsThe main method to create a BAP uses a `ColGroup` object, which is basically a group of `Collection` objects. You can see all groups: ###Code satcol.ColGroup.options() ###Output _____no_output_____ ###Markdown You could also make your own `ColGroup`, but you have to keep in mind that it is composed of `satcol.Collection` objects.As the process is made to combine pixels from all collections, the `Collection` object renames the bands of each collection to match across all, resulting in the following names: BLUE, GREEN, RED, NIR, SWIR, SWIR2, ATM_OP. Also, each collection has methods ready to map a vegetation index: `ndvi`, `evi` and `nbr`. MasksThere is *(by now)* one mask you can include in the process: clouds ###Code cld_mask = masks.Clouds() ###Output _____no_output_____ ###Markdown FiltersThere are *(by now)* two filters you can use in the process:**cloud percentage**: `filters.CloudsPercent`**masked pixel percentage**: `filters.MaskPercent`. This filter can be used **only** if the maskpercent score is included in the process. ###Code filt_cld = filters.CloudsPercent() filt_mask = filters.MaskPercent() ###Output _____no_output_____ ###Markdown ScoresThis is what makes the difference. Every score has its own parameters, but all share two main params:**range_out**: the range of values the score will be in; by default it is (0, 1)**sleep**: as the creation of a BAP composite is a 'concatenated' process, it can make more requests than are allowed, so if you set this parameter, the process will wait that many seconds until the next computation. White's scores DOY (best day of the year)Basically, pixels from images closer to that date will have a higher score.It takes 2 params:**season**: You can use the same as the one for the process, or not. Each `Season` object has its own *doy*. By default it is `Season.Growing_South`**formula**: distribution equation. There are two (by now) options: `Normal` (https://en.wikipedia.org/wiki/Normal_distribution) or `Exponential` (https://en.wikipedia.org/wiki/Exponential_distribution). 
Default is `Normal` ###Code doy = scores.Doy() ###Output _____no_output_____ ###Markdown SatelliteIt uses a list of available satellites for each year that you can check: ###Code # List of satellites for 2000 season.SeasonPriority.relation[2000] ###Output _____no_output_____ ###Markdown the score has one main param:**rate**: how much the score will decrease for each satellite. For example, for 2000, if rate is 0.05 (default value):* 'LANDSAT/LE07/C01/T1_SR' --> 1* 'LANDSAT/LE7_L1T_TOA_FMASK' --> 0.95* 'LANDSAT/LT05/C01/T1_SR' --> 0.90* 'LANDSAT/LT5_L1T_TOA_FMASK' --> 0.85*NOTE: maybe the correct name would be 'ratio', so I may change it in the future* ###Code sat = scores.Satellite() ###Output _____no_output_____ ###Markdown Atmospheric OpacityIt uses the atmospheric opacity band computed by Surface Reflectance, so only SR collections will have this score. If the process uses a non-SR collection, like TOA, this score will be zero. ###Code atm_op = scores.AtmosOpacity() ###Output _____no_output_____ ###Markdown Distance to maskThis assigns a score regarding the distance of the pixel to the closest masked pixel. As the only mask is for clouds, it could be considered 'distance to cloud'. It has 3 main params:**unit**: unit to measure distance. Defaults to 'meters'**dmax**: max distance. Pixels further than this distance will have score 1. Defaults to 600**dmin**: min distance. Defaults to 0 (the pixel next to the mask will have score 0). ###Code dist = scores.CloudDist() ###Output _____no_output_____ ###Markdown Custom Scores *(not White's)* OutliersIt computes a statistic over the whole collection (in the season) and assigns a score based on the *distance* of each pixel value to that statistic. It has 3 main parameters:**bands**: a list of bands to include in the process. The process splits the score across the number of given bands. For example, if 4 bands are given, the max score for each band will be 0.25**process**: one of 'mean' or 'median'. Defaults to 'mean'**dist**: distance from 'mean' or 'median'. Defaults to 0.7*NOTE: bands must be in the image, so if you use a vegetation index, be sure to include it in the process* ###Code out = scores.Outliers(("ndvi",)) ###Output _____no_output_____ ###Markdown Mask percentageIt computes the percentage of masked pixels in the image (not the whole scene). It has one main parameter:**band**: the band that holds the mask. ###Code maskper = scores.MaskPercent() ###Output _____no_output_____ ###Markdown Vegetation IndexThis score is based on the absolute value of the given index, parametrized to `range_out`. The only parameter is **index**: the name of it (*ndvi*, *evi* or *nbr*). Defaults to *ndvi*. ###Code ind = scores.Index("ndvi") ###Output _____no_output_____ ###Markdown Multiple years (seasons)If you want to use images from a range of seasons, and not just one, this score prioritizes the main season. Take into account that a season may hold 2 years, but the main one is the lesser (see `Season`). It has 3 main params:**main_year**: this is the central year. Images from this year (season) will have score 1**season**: a `Season` object.**ratio**: amount the score will decrease as it goes further from the main year. Defaults to 0.05. It is similar to the *rate* parameter in `Satellite`. ###Code # Will not use it in the test # multi = scores.MultiYear() ###Output _____no_output_____ ###Markdown Making the composite (BAP)Next step is to create a `Bap` object. It has the following parameters:**year**: The main year. 
Remember that a season can have 2 years.**range**: for multiyear composites you can specify a 'range' as a tuple: (years before, years after). Defaults to (0, 0).For example, if `range=(1, 1)` and `year=2000`, it will compute 1999, 2000 and 2001.**colgroup**: `ColGroup` object. If `None` it will use the list computed by `SeasonPriority`. Defaults to `None`.**season**: a `Season` object**scores**: a list of `Score` objects**masks**: a list of `Mask` objects**filters**: a list of `Filter` objects**fmap**: custom function to apply before computing scores. At this point bands have new names (NIR, SWIR, etc) and the vegetation index is computed. ###Code bap = bap.Bap(year=2010, range=(0, 0), season=a_season, masks=(cld_mask,), scores=(doy, sat, atm_op, dist, out, maskper, ind), filters=(filt_cld, filt_mask)) ###Output _____no_output_____ ###Markdown Define a site ###Code site = ee.Geometry.Polygon([[-71,-42], [-71,-43], [-72,-43], [-72,-42]]) ###Output _____no_output_____ ###Markdown Finally, compute the composite`Bap` object has a method named `bestpixel` that creates one image out of the pixels with the highest score in the collection (which includes all collections given). It has the following parameters:**site**: The site where the composite will be computed**indices**: a list of vegetation indices. Defaults to `None`**name**: name for the band that holds the final score. Defaults to *score***bands**: a list of bands that will be on the image (in case you don't need all). Defaults to `None`, which means *include all bands***normalize**: normalize the final score to be between 0 and 1. Defaults to `True`**bbox**: distance of the buffer around the site. Defaults to 0.**force**: if there are no images for the specified parameters, and this is set to `True`, an empty image will be created. Defaults to `True`.This method returns a `namedtuple`, so `.image` will be the composite and `.col` will be the collection ###Code composite = bap.bestpixel(site=site, indices=("ndvi",)) ###Output _____no_output_____ ###Markdown Get the composite from the results ###Code image = composite.image ###Output _____no_output_____ ###Markdown Watch the resulting composite*it may take a while..* ###Code from IPython.display import Image url = image.getThumbUrl({'min':0, 'max':0.7, 'region':site.getInfo()['coordinates']}) Image(url=url, format='png', embed=True) ###Output _____no_output_____
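###Markdown A common next step is to export the composite rather than just previewing a thumbnail; a minimal sketch using the standard Earth Engine batch export (the description, folder and scale values here are illustrative assumptions). ###Code task = ee.batch.Export.image.toDrive(image=image, description='bap_composite_2010', folder='gee_exports', region=site.getInfo()['coordinates'], scale=30) task.start() print(task.status()) ###Output _____no_output_____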
ch07_Feature_learning.ipynb
###Markdown TOC __Chapter 07 - Feature learning__1. [Import](#Import)1. [Parametric assumptions of data](#Parametric-assumptions-of-data)1. [Restricted Boltzmann Machines](#Restricted-Boltzmann-Machines) 1. [BernoulliRBM](#BernoulliRBM) 1. [Extracting PCA components from MNIST](#Extracting-PCA-components-from-MNIST) 1. [Extracting RBM components from MNIST](#Extracting-RBM-components-from-MNIST) 1. [Using RBMs in a machine learning pipeline](#Using-RBMs-in-a-machine-learning-pipeline) 1. [Using a linear model on raw pixel values](#Using-a-linear-model-on-raw-pixel-values) 1. [Using a linear model on extracted PCA components](#Using-a-linear-model-on-extracted-PCA-components) 1. [Using a linear model on extracted RBM components](#Using-a-linear-model-on-extracted-RBM-components) Import ###Code # standard library and settings import os import sys import importlib import itertools import warnings warnings.simplefilter("ignore") from IPython.core.display import display, HTML display(HTML("<style>.container { width:95% !important; }</style>")) # data extensions and settings import numpy as np np.set_printoptions(threshold=np.inf, suppress=True) import pandas as pd pd.set_option("display.max_rows", 500) pd.options.display.float_format = "{:,.6f}".format # modeling extensions from sklearn.neural_network import BernoulliRBM from sklearn.preprocessing import StandardScaler, RobustScaler, PolynomialFeatures, OrdinalEncoder, LabelEncoder, OneHotEncoder, KBinsDiscretizer, QuantileTransformer, PowerTransformer, MinMaxScaler from sklearn.pipeline import make_pipeline, Pipeline, FeatureUnion from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor from sklearn.model_selection import KFold, train_test_split, GridSearchCV, StratifiedKFold, cross_val_score, RandomizedSearchCV from sklearn.decomposition import PCA, LatentDirichletAllocation from sklearn.linear_model import Lasso, Ridge, ElasticNet, LinearRegression, LogisticRegression, SGDRegressor # visualization extensions and settings import seaborn as sns import matplotlib.pyplot as plt import mlmachine as mlm from prettierplot.plotter import PrettierPlot import prettierplot.style as style # magic functions %matplotlib inline ###Output _____no_output_____ ###Markdown Parametric assumptions of dataFeature learning algorithms remove the parametric assumption of data that PCA and LDA make. Rather than making assumptions about the shape of the data, these feature learning techniques rely on stochastic learning - that is, an iterative process is performed on the data rather than a well-defined linear transformation.That being said, non-parametric techniques that will be discussed here still make assumptions about the data, just not the shape. Here are some examples of non-parametric feature extraction techniques:- Restricted Boltzmann Machine (RBM)- Word embeddings Restricted Boltzmann MachinesRBMs are a family of unsupervised feature learning algorithms that use probabilistic models to learn new features. Just as with PCA and LDA, RBMs extract new features from raw data. These new features tend to work best when followed by linear models.Conceptually, RBMs can be thought of as shallow neural networks, having only two layers. It is an unsupervised method, so there is only a visible layer and a hidden layer. The (first) visible layer contains as many nodes as there are input features. The number of nodes in the hidden layer is a parameter chosen by the user, and represents the number of features we wish to learn. 
There is theoretically no limit to the number of features that RBMs can learn - this number can be fewer than the number of features we start with, and can also be more than the number of features we start with.Each node in the visible layer is connected to each node in the hidden layer, and these connections have a weight. The dot product of the input values and the weights (plus bias) generates a value that is passed into an activation function.The 'Restriction' aspect of RBMs means there aren't any connections between nodes in the same layer. This allows the nodes to independently create weights and end up being independent features.After one forward pass is completed, the direction of information flow reverses. The previously activated values are the input values and are multiplied by the same weights, but new biases, that connect the hidden layer nodes to all nodes in the visible layer. This backwards pass is effectively an approximation of the original input. After one full forward/backward cycle, the weights are adjusted with the intention of minimizing the distance between the original input and the approximations. This process is repeated for many iterations, and in this process the network is trying to obtain an approximation of the original input.As an example, suppose we are trying to pass in a grid of pixels representing a digit between 0 and 9. We pass in the pixels with the question, "What digit is this?". On the reverse pass, we are in a way passing in a digit and asking "What pixels should I expect?" More formally, this is a joint probability: the simultaneous probability of y given x and of x given y ###Code # # load data # dataPath = "..\\PythonMachineLearning2ndEd" # XData, yData = load_mnist(path=os.path.join(dataPath, "MNISTDigits"), kind="train") # print("Rows: {}, columns: {}".format(XData.shape[0], XData.shape[1])) # X_test, y_test = load_mnist(path=os.path.join(dataPath, "MNISTDigits"), kind="t10k") # print("Rows: {}, columns: {}".format(X_test.shape[0], X_test.shape[1])) # Load data and print dimensions df_train = pd.read_csv("../kaggle-mnist/train.csv", sep=",") df_test = pd.read_csv("../kaggle-mnist/test.csv", sep=",") # separate labels from features df_train_label = df_train["label"] df_train = df_train.drop(labels="label", axis=1).values # train/test split X_train, X_valid, y_train, y_valid = train_test_split( df_train, df_train_label, test_size=0.2 ) print('X_train shape: {}'.format(X_train.shape)) print('y_train shape: {}'.format(y_train.shape)) print('') print('X_valid shape: {}'.format(X_valid.shape)) print('y_valid shape: {}'.format(y_valid.shape)) print("Training data dimensions: {}".format(df_train.shape)) print("Test data dimensions: {}".format(df_test.shape)) # sample plt.imshow(X_train[0].reshape(28, 28), cmap=plt.cm.gray_r) ###Output _____no_output_____ ###Markdown BernoulliRBMThe only scikit-learn implemented RBM is the BernoulliRBM module. It allows for values between zero and one because the node values represent a probability that the node is activated or not, which allows for quicker learning of feature sets. To preprocess the data, we need to:- Scale the values of pixels to values between zero and one- Change the pixel values in place to be true if the value is over 0.5, and false otherwise. 
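Before preprocessing, a toy NumPy sketch of the forward/backward cycle just described; the weights here are random and purely illustrative, not a trained model. ###Code rng = np.random.RandomState(0) sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z)) v = (rng.rand(1, 784) > 0.5).astype(float) # one fake binarized image W = rng.randn(784, 100) * 0.01 # visible-to-hidden weights b_h = np.zeros(100) b_v = np.zeros(784) h = sigmoid(v @ W + b_h) # forward pass: hidden activation probabilities v_recon = sigmoid(h @ W.T + b_v) # backward pass: approximation of the input print(h.shape, v_recon.shape) ###Output _____no_output_____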
###Code # scale to 0 and 1 X_train = X_train / 255.0 X_train = (X_train > 0.5).astype(float) np.min(X_train), np.max(X_train) # visualize sample plt.imshow(X_train[0].reshape(28, 28), cmap=plt.cm.gray_r) ###Output _____no_output_____ ###Markdown > Remarks - Note that this image is less fuzzy than the output above. Extracting PCA components from MNISTBefore applying the RBM approach, we will first apply PCA to the image dataset. The approach is to take the features (784 on/off pixels) and apply an eigenvalue decomposition to the matrix to extract the eigendigits.We will plot the first 100 components from the possible 784 ###Code # use PCA to extract top 100 components pca = PCA(n_components=100) pca.fit(X_train) plt.figure(figsize=(10, 10)) # visualize components for i, comp in enumerate(pca.components_): plt.subplot(10, 10, i + 1) plt.imshow(comp.reshape((28, 28)), cmap=plt.cm.gray_r) plt.xticks(()) plt.yticks(()) plt.suptitle("100 components extracted by PCA") plt.show() ###Output _____no_output_____ ###Markdown > Remarks - This gallery of images displays what the eigenvectors of the covariance matrix look like when reshaped to the same dimensions as the original images. This is what extracted components of this dataset look like as PCA attempts to identify important features, as the algorithm attempts to understand a certain "aspect" of the images that will translate into interpretable knowledge.> Looking at the first component, it's pretty clear that PCA puts high value on searching for a 0-like shape in an observation. In the second and third components, there is a faint appearance of a 9 and 8, respectively. Beyond that, the components look progressively less like digits and more like random assortments of black and white pixels. ###Code # variance captured by first 30 components pca.explained_variance_ratio_[:30].sum() # visualize cumulative explained variance full_pca = PCA(n_components=784) full_pca.fit(X_train) plt.plot(np.cumsum(full_pca.explained_variance_ratio_)) ###Output _____no_output_____ ###Markdown > Remarks - The first 30 components capture nearly 2/3's of the variance, and the variance captured by subsequent components is quite small. Extracting RBM components from MNIST ###Code # extract top 100 components using BernoulliRBM from sklearn.neural_network import BernoulliRBM rbm = BernoulliRBM( random_state=0, verbose=True, n_iter=20, n_components=100 ) rbm.fit(X_train) # visualize RBM components plt.figure(figsize=(10, 10)) for i, comp in enumerate(rbm.components_): plt.subplot(10, 10, i + 1) plt.imshow(comp.reshape((28, 28)), cmap=plt.cm.gray_r) plt.xticks(()) plt.yticks(()) plt.suptitle("100 components extracted by RBM") plt.show() ###Output _____no_output_____ ###Markdown > Remarks - While the PCA components became distorted very quickly, the RBM components appear to have extracted various shapes, and even apparent pen strokes, in each component. ###Code # some components look to be repeated, but are likely similar but still unique print(np.unique(rbm.components_.mean(axis=1)).shape) # capture new features for first observation, a number 5 image_new_features = rbm.transform(X_train[:1]).reshape(100) image_new_features # print top 20 features for the first observation top_features = image_new_features.argsort()[-20:][::-1] print(top_features) print(image_new_features[top_features]) ###Output [93 96 1 49 24 73 12 95 40 13 33 98 68 65 37 58 9 4 66 32] [1. 1. 1. 1. 1. 1. 1. 1. 
0.99999999 0.99999999 0.99999999 0.99999999 0.99999998 0.99999985 0.99999978 0.99999937 0.99999813 0.99999344 0.99998844 0.99996676] ###Markdown > Remarks - There are 8 features for which the RBM outputs a full 100%. That means that passing the 784 pixels of the first observation into the visible layer of the RBM lights up nodes 93, 96, 1, 49, 24, 73, 12, and 95 at full capacity. This speaks to the nodes' perception that these pixels represent a 5. ###Code # plot top 20 RBM components for the first observation, a number 5 plt.figure(figsize=(25, 25)) for i, comp in enumerate(top_features): plt.subplot(5, 4, i + 1) plt.imshow(rbm.components_[comp].reshape((28, 28)), cmap=plt.cm.gray_r) plt.title( "Component {}, feature value: {}".format( comp, round(image_new_features[comp], 2) ), fontsize=20, ) plt.suptitle("Top 20 components extracted by RBM for first digit", fontsize=30) plt.show() ###Output _____no_output_____ ###Markdown > Remarks - In several of the components, a faint outline of key characteristics of the number 5 is visible. ###Code # plot bottom 20 RBM components for the first observation, a number 5 bottom_features = image_new_features.argsort()[:20] plt.figure(figsize=(25, 25)) for i, comp in enumerate(bottom_features): plt.subplot(5, 4, i + 1) plt.imshow(rbm.components_[comp].reshape((28, 28)), cmap=plt.cm.gray_r) plt.title( "Component {}, feature value: {}".format( comp, round(image_new_features[comp], 2) ), fontsize=20, ) plt.suptitle("Bottom 20 components extracted by RBM for first digit", fontsize=30) plt.show() ###Output _____no_output_____ ###Markdown > Remarks - For these bottom 20 components (those that are least associated with the number 5), we can see distinct outlines of several other digits, such as 1, 8 and 6. Using RBMs in a machine learning pipeline Using a linear model on raw pixel values ###Code # basic logistic regression grid search lr = LogisticRegression() params = {"C": [1e-2, 1e-1, 1e0, 1e1, 1e2]} grid = GridSearchCV(lr, params) grid.fit(X_train, y_train) print(grid.best_params_, grid.best_score_) ###Output {'C': 0.1} 0.9046130952380952 ###Markdown > Remarks - Logistic regression achieves a cross-validated accuracy of 90.46% on the raw pixel values. We will regard this as our baseline performance. Using a linear model on extracted PCA components ###Code # use PCA to extract new features lr = LogisticRegression() pca = PCA() # set the params for the pipeline params = {"clf__C": [1e-1, 1e0, 1e1], "pca__n_components": [10, 100, 200]} # create pipeline pipe = Pipeline([("pca", pca), ("clf", lr)]) # instantiate a gridsearch class grid = GridSearchCV(pipe, params) # fit to our data grid.fit(X_train, y_train) # check the best params print(grid.best_params_, grid.best_score_) ###Output {'clf__C': 1.0, 'pca__n_components': 200} 0.903779761904762 ###Markdown > Remarks - Inserting PCA into the pipeline did not increase the cross-validated accuracy score compared to the baseline. Using a linear model on extracted RBM components ###Code # use the RBM to learn new features lr = LogisticRegression() rbm = BernoulliRBM(random_state=0) # set up the params for our pipeline. params = {"clf__C": [1e-1, 1e0, 1e1], "rbm__n_components": [10, 100, 200]} # create our pipeline pipe = Pipeline([("rbm", rbm), ("clf", lr)]) # instantiate a gridsearch class grid = GridSearchCV(pipe, params) # fit to our data grid.fit(X_train, y_train) # check the best params print(grid.best_params_, grid.best_score_) ###Output {'clf__C': 10.0, 'rbm__n_components': 200} 0.9388095238095238
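###Markdown As a sanity check, the winning RBM pipeline can also be scored on the held-out validation split created earlier; a minimal sketch assuming `grid` is still the fitted RBM search from the last cell, and that `X_valid` gets the same scale-and-binarize treatment as `X_train`. ###Code # apply the identical preprocessing used for the training data X_valid_bin = (X_valid / 255.0 > 0.5).astype(float) # with the default refit=True, best_estimator_ was refit on the full training data print('validation accuracy: {:.4f}'.format(grid.best_estimator_.score(X_valid_bin, y_valid))) ###Output _____no_output_____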
deep-learning-with-keras-notebooks-master/1.3-use-pretrained-model.ipynb
###Markdown How to use pre-trained models to classify objects in photos Convolutional neural networks can now beat the human eye on some computer vision tasks, such as image classification. That is, given a photo of an object, we can have the computer answer which of 1000 specific object classes the photo belongs to. In this tutorial we will use several pre-trained models built into Keras to perform image classification, including:* VGG16* VGG19* ResNet50* InceptionV3* InceptionResNetV2* Xception* MobileNet![imagenet](https://kaggle2.blob.core.windows.net/datasets-images/2798/4640/d21dc5df65889e105877476d81240e11/data-original.png) ###Code # environment of this Jupyter Notebook import platform import tensorflow import keras print("Platform: {}".format(platform.platform())) print("Tensorflow version: {}".format(tensorflow.__version__)) print("Keras version: {}".format(keras.__version__)) %matplotlib inline import matplotlib.pyplot as plt import matplotlib.image as mpimg import numpy as np from IPython.display import Image ###Output Using TensorFlow backend. ###Markdown Building a simple photo classifier VGG16 1. Get a sample imageFirst, we need an image that we can classify. You can search Google for some animal photos and download one into the directory where this jupyter notebook lives. For example, I downloaded: https://www.elephantvoices.org/images/slider/evimg16tf.jpg![evimg16tf.jpg](https://www.elephantvoices.org/images/slider/evimg16tf.jpg) 2. Load the VGG modelLoad the weights of the VGG-16 model pre-trained in Keras. ###Code from keras.applications.vgg16 import VGG16 # load the weights model_vgg16 = VGG16() ###Output _____no_output_____ ###Markdown 3. Load and prepare the imageNext, we can load the image and convert it into the tensor specification required by the pre-trained network. Keras provides some tools to help with this step. First, we can use the load_img() function to load the image and resize it to the required size of 224x224 pixels. ###Code from keras.preprocessing.image import load_img # load the image file img_file = 'evimg16tf.jpg' image = load_img(img_file, target_size=(224, 224)) # because the VGG16 model's input is 224x224 ###Output _____no_output_____ ###Markdown Next, we can convert the pixels to a NumPy array so that we can use it in Keras. We can use the img_to_array() function for this. ###Code from keras.preprocessing.image import img_to_array # convert the image data to a numpy array image = img_to_array(image) # RGB print("image.shape:", image.shape) ###Output image.shape: (224, 224, 3) ###Markdown The VGG16 network expects single-channel (grey) or multi-channel (rgb) images as input; this means the input array needs to be reshaped into four dimensions: (batch of images, image height, image width, number of channels) -> (batch_size, img_height, img_width, img_channels)We only have one sample (one image). We can reshape the array by calling reshape() and adding the extra dimension. ###Code # reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) ###Output image.shape: (1, 224, 224, 3) ###Markdown Next, we need to preprocess the image the same way the VGG ImageNet training data was processed. Specifically, from the paper:> The only preprocessing we do is subtracting the mean RGB value, computed on the training set, from each pixel.>> __Very Deep Convolutional Networks for Large-Scale Image Recognition, 2014.__Keras provides a function called preprocess_input() to prepare new image inputs for the VGG network. ###Code from keras.applications.vgg16 import preprocess_input # prepare the image for the VGG model image = preprocess_input(image) ###Output _____no_output_____ ###Markdown 4. Make a predictionWe can call the model's predict() function to predict the probability that the image belongs to each of the 1000 known object classes. ###Code # predict the probability for every output class y_pred = model_vgg16.predict(image) ###Output _____no_output_____ ###Markdown 5. Interpret the predictionKeras provides a function called decode_predictions() to interpret the probabilities. It returns a list of classes and the probability of each class; for simplicity, we will only show the top class with the highest probability. ###Code from keras.applications.vgg16 import decode_predictions # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output African_elephant (30.40%) ###Markdown VGG19 ###Code from keras.applications.vgg19 import VGG19 from keras.preprocessing import image from keras.applications.vgg19 import preprocess_input from keras.applications.vgg19 import decode_predictions # load the weights model_vgg19 = VGG19(weights='imagenet') # load the image file img_file = 'evimg16tf.jpg' image = load_img(img_file, target_size=(224, 224)) # because the VGG19 model's input is 224x224 # convert the image data to a numpy array image = img_to_array(image) # RGB print("image.shape:", image.shape) # 
reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) # apply the image preprocessing the model requires image = preprocess_input(image) # predict the probability for every output class y_pred = model_vgg19.predict(image) # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output image.shape: (224, 224, 3) image.shape: (1, 224, 224, 3) tusker (53.61%) ###Markdown ResNet50 ###Code from keras.applications.resnet50 import ResNet50 from keras.preprocessing import image from keras.applications.resnet50 import preprocess_input from keras.applications.resnet50 import decode_predictions # load the weights model_resnet50 = ResNet50(weights='imagenet') # load the image file img_file = 'evimg16tf.jpg' # because the ResNet model's input is 224x224 image = load_img(img_file, target_size=(224, 224)) # convert the image data to a numpy array image = img_to_array(image) # RGB print("image.shape:", image.shape) # reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) # apply the image preprocessing the model requires image = preprocess_input(image) # predict the probability for every output class y_pred = model_resnet50.predict(image) # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output image.shape: (224, 224, 3) image.shape: (1, 224, 224, 3) African_elephant (47.35%) ###Markdown InceptionV3 ###Code from keras.applications.inception_v3 import InceptionV3 from keras.preprocessing import image from keras.preprocessing.image import load_img from keras.applications.inception_v3 import preprocess_input from keras.applications.inception_v3 import decode_predictions # load the weights model_inception_v3 = InceptionV3(weights='imagenet') # load the image file img_file = 'image.jpg' # the InceptionV3 model's input is 299x299 img = load_img(img_file, target_size=(299, 299)) # convert the image data to a numpy array image = image.img_to_array(img) # RGB print("image.shape:", image.shape) # reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) # apply the image preprocessing the model requires image = preprocess_input(image) # predict the probability for every output class y_pred = model_inception_v3.predict(image) # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output image.shape: (299, 299, 3) image.shape: (1, 299, 299, 3) giant_panda (94.42%) ###Markdown InceptionResNetV2 ###Code from keras.applications.inception_resnet_v2 import InceptionResNetV2 from keras.preprocessing import image from keras.applications.inception_resnet_v2 import preprocess_input from keras.applications.inception_resnet_v2 import decode_predictions # load the weights model_inception_resnet_v2 = InceptionResNetV2(weights='imagenet') # load the image file img_file = 'evimg16tf.jpg' # the InceptionResNetV2 model's input is 299x299 image = load_img(img_file, target_size=(299, 299)) # convert the image data to a numpy array image = img_to_array(image) # RGB print("image.shape:", image.shape) # reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) # apply the image preprocessing the model requires image = preprocess_input(image) # predict the probability for every output class y_pred = model_inception_resnet_v2.predict(image) # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output image.shape: (299, 299, 3) image.shape: (1, 299, 299, 3) African_elephant (62.94%) ###Markdown MobileNet ###Code from keras.applications.mobilenet import MobileNet from keras.preprocessing import 
image from keras.applications.mobilenet import preprocess_input from keras.applications.mobilenet import decode_predictions # load the weights model_mobilenet = MobileNet(weights='imagenet') # load the image file img_file = 'evimg16tf.jpg' # the MobileNet model's input is 224x224 image = load_img(img_file, target_size=(224, 224)) # convert the image data to a numpy array image = img_to_array(image) # RGB print("image.shape:", image.shape) # reshape the tensor dimensions image = image.reshape((1, image.shape[0], image.shape[1], image.shape[2])) print("image.shape:", image.shape) # apply the image preprocessing the model requires image = preprocess_input(image) # predict the probability for every output class y_pred = model_mobilenet.predict(image) # convert the probabilities to class labels label = decode_predictions(y_pred) # retrieve the most likely result, i.e. the highest probability label = label[0][0] # print the prediction print('%s (%.2f%%)' % (label[1], label[2]*100)) ###Output image.shape: (224, 224, 3) image.shape: (1, 224, 224, 3) African_elephant (90.81%)
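###Markdown The per-model cells above repeat the same load/reshape/preprocess/predict steps; a minimal refactoring sketch that bundles them into one helper (the helper itself is an illustrative assumption, not part of the Keras API). ###Code from keras.preprocessing.image import load_img, img_to_array # load and resize the image, add the batch dimension, apply the model-specific preprocessing, then predict def classify(model, preprocess_fn, decode_fn, img_file, size=224): x = img_to_array(load_img(img_file, target_size=(size, size))) x = preprocess_fn(x.reshape((1,) + x.shape)) label = decode_fn(model.predict(x))[0][0] print('%s (%.2f%%)' % (label[1], label[2] * 100)) # e.g., with the MobileNet objects from the previous cell still in memory: classify(model_mobilenet, preprocess_input, decode_predictions, 'evimg16tf.jpg') ###Output _____no_output_____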
Model_fastText/fastText-train.ipynb
###Markdown Create Model with specified optimizer and loss function ###Code model = fastText(config, len(dataset.vocab), dataset.word_embeddings) if torch.cuda.is_available(): model.cuda() model.train() optimizer = optim.SGD(model.parameters(), lr=config.lr) NLLLoss = nn.NLLLoss() model.add_optimizer(optimizer) model.add_loss_op(NLLLoss) train_losses = [] val_accuracies = [] def train(): for i in range(config.max_epochs): print ("Epoch: {}".format(i)) train_loss,val_accuracy = model.run_epoch(dataset.train_iterator, dataset.val_iterator, i) train_losses.append(train_loss) val_accuracies.append(val_accuracy) %time train() def calc_true_and_pred(model, iterator): all_preds = [] all_y = [] for idx,batch in enumerate(iterator): if torch.cuda.is_available(): x = batch.text.cuda() else: x = batch.text y_pred = model(x) predicted = torch.max(y_pred.cpu().data, 1)[1] + 1 all_preds.extend(predicted.numpy()) all_y.extend(batch.label.numpy()) return all_y, all_preds all_y, all_preds = calc_true_and_pred(model, dataset.test_iterator) ###Output _____no_output_____ ###Markdown accuracy score ###Code test_acc = accuracy_score(all_y, np.array(all_preds).flatten()) print ('Final Test Accuracy: {:.4f}'.format(test_acc)) ###Output _____no_output_____ ###Markdown classification report ###Code report = classification_report(all_y, np.array(all_preds).flatten()) print(report) ###Output _____no_output_____
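###Markdown A common follow-up is to checkpoint the trained weights so evaluation can be rerun without retraining; a minimal sketch using PyTorch's standard state-dict serialization (the file name is an illustrative assumption). ###Code # save only the learned parameters, not the whole object torch.save(model.state_dict(), 'fasttext_model.pt') # later: rebuild the model with the same config, then restore the weights # model.load_state_dict(torch.load('fasttext_model.pt')) # model.eval() ###Output _____no_output_____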
notebooks/morphology_tutorial/morphology_tutorial.ipynb
###Markdown Morphology Tutorial Morphology sub-package functions can be used with a clean mask of a plant (see the [VIS tutorial](../vis_tutorial/vis_tutorial.ipynb) for examples of masking background). This tutorial will start with a binary mask (after object segmentation has been completed) but in a complete workflow users will need to use other functions to achieve plant isolation. Skeletonizing is very sensitive to any pepper noise remaining within a binary mask. Morphology functions are intended to be one type of object analysis. These functions can potentially return information about leaf length, leaf angle, and leaf curvature. ###Code # Import libraries from plantcv import plantcv as pcv class options: def __init__(self): self.image = "./img/mask.png" self.debug = "plot" self.writeimg = False self.result = "morphology_tutorial_results.txt" self.outdir = "." # Get options args = options() # Set debug to the global parameter pcv.params.debug = args.debug # Read image (sometimes you need to run this line twice to see the image) # Inputs: # filename - Image file to be read in # mode - How to read in the image; either 'native' (default), 'rgb', 'gray', or 'csv' img, path, filename = pcv.readimage(filename=args.image) # Crop the mask cropped_mask = img[1150:1750, 900:1550] # Skeletonize the mask #%matplotlib notebook # To enable the zoom feature to better see fine lines, uncomment the line above ^^ # Inputs: # mask = Binary image data skeleton = pcv.morphology.skeletonize(mask=cropped_mask) # Prune the skeleton # Generally, skeletonized images will have barbs (this image is particularly ideal, # that's why it's the example image in the tutorial!), # representing the width, that need to get pruned off. # Inputs: # skel_img = Skeletonized image # size = Size to get pruned off each branch # mask = (Optional) binary mask for debugging. If provided, debug image will be overlaid on the mask. img1, seg_img, edge_objects = pcv.morphology.prune(skel_img=skeleton, size=10, mask=cropped_mask) # Fill in segments (also stores out area data) # Inputs: # mask = Binary image, single channel, object = 1 and background = 0 # objects = List of contours # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) filled_img = pcv.morphology.fill_segments(mask=cropped_mask, objects=edge_objects, label="default") # Identify branch points # Inputs: # skel_img = Skeletonized image # mask = (Optional) binary mask for debugging. If provided, debug image will be overlaid on the mask. # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) branch_pts_mask = pcv.morphology.find_branch_pts(skel_img=skeleton, mask=cropped_mask, label="default") # Identify tip points # Inputs: # skel_img = Skeletonized image # mask = (Optional) binary mask for debugging. If provided, debug # image will be overlaid on the mask # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) tip_pts_mask = pcv.morphology.find_tips(skel_img=skeleton, mask=None, label="default") # Adjust line thickness with the global line thickness parameter (default = 5), # and provide binary mask of the plant for debugging. 
NOTE: the objects and # hierarchies returned will be exactly the same but the debugging image (segmented_img) # will look different. pcv.params.line_thickness = 3 # Sort segments into primary (stem) objects and secondary (leaf) objects. # Downstream steps can be performed on just one class of objects at a time, # or all objects (output from segment_skeleton) # Inputs: # skel_img = Skeletonized image # objects = List of contours # mask = (Optional) binary mask for debugging. If provided, debug image will be overlaid on the mask. leaf_obj, stem_obj = pcv.morphology.segment_sort(skel_img=skeleton, objects=edge_objects, mask=cropped_mask) # Identify segments # Inputs: # skel_img = Skeletonized image # objects = List of contours # mask = (Optional) binary mask for debugging. If provided, # debug image will be overlaid on the mask. segmented_img, labeled_img = pcv.morphology.segment_id(skel_img=skeleton, objects=leaf_obj, mask=cropped_mask) # Similar to line thickness, there are optional text size and text thickness parameters # that can be adjusted to better suit images of varying sizes. pcv.params.text_size = .8 # (default text_size=.55) pcv.params.text_thickness = 3 # (default text_thickness=2) segmented_img, labeled_img = pcv.morphology.segment_id(skel_img=skeleton, objects=leaf_obj, mask=cropped_mask) # Measure path lengths of segments # Inputs: # segmented_img = Segmented image to plot lengths on # objects = List of contours # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) labeled_img = pcv.morphology.segment_path_length(segmented_img=segmented_img, objects=leaf_obj, label="default") # Measure euclidean distance of segments # Inputs: # segmented_img = Segmented image to plot lengths on # objects = List of contours # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) labeled_img = pcv.morphology.segment_euclidean_length(segmented_img=segmented_img, objects=leaf_obj, label="default") # Measure curvature of segments # Inputs: # segmented_img = Segmented image to plot curvature on # objects = List of contours # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) labeled_img = pcv.morphology.segment_curvature(segmented_img=segmented_img, objects=leaf_obj, label="default") # Measure the angle of segments # Inputs: # segmented_img = Segmented image to plot angles on # objects = List of contours # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) labeled_img = pcv.morphology.segment_angle(segmented_img=segmented_img, objects=leaf_obj, label="default") # Measure the tangent angles of segments # Inputs: # segmented_img = Segmented image to plot tangent angles on # objects = List of contours # size = Size of ends used to calculate "tangent" lines # label = Optional label parameter, modifies the variable name of observations recorded. 
(default `label="default"`) labeled_img = pcv.morphology.segment_tangent_angle(segmented_img=segmented_img, objects=leaf_obj, size=15, label="default") # Measure the leaf insertion angles # Inputs: # skel_img = Skeletonized image # segmented_img = Segmented image to plot insertion angles on # leaf_objects = List of leaf contours # stem_objects = List of stem objects # size = Size of the inner portion of each leaf to find a linear regression line # label = Optional label parameter, modifies the variable name of observations recorded. (default `label="default"`) labeled_img = pcv.morphology.segment_insertion_angle(skel_img=skeleton, segmented_img=segmented_img, leaf_objects=leaf_obj, stem_objects=stem_obj, size=20, label="default") # Write morphological data to results file # The save results function will take the measurements stored when running any PlantCV analysis functions, format, # and print an output text file for data analysis. The Outputs class stores data whenever any of the following functions # are run: analyze_bound_horizontal, analyze_bound_vertical, analyze_color, analyze_nir_intensity, analyze_object, # fluor_fvfm, report_size_marker_area, watershed. If no functions have been run, it will print an empty text file pcv.outputs.save_results(filename=args.result) ###Output _____no_output_____
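###Markdown To reuse the saved measurements downstream, the results file can be read back; a minimal sketch assuming `save_results` wrote its default JSON structure (key names can differ between PlantCV versions). ###Code import json with open(args.result) as f: results = json.load(f) # inspect the top-level structure holding the recorded observations print(type(results), list(results)[:5]) ###Output _____no_output_____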
measures/regression-plot-generation.ipynb
###Markdown Imports*** ###Code import pandas as pd import matplotlib.pyplot as plt import numpy as np import seaborn as sns import warnings from pylab import rcParams %matplotlib inline warnings.filterwarnings("ignore") rcParams['figure.figsize'] = 20,10 rcParams['font.size'] = 30 sns.set() np.random.seed(8) import scipy ###Output _____no_output_____ ###Markdown Loading and visualizing data*** ###Code header_list = ["spec_name", "impl_name", "set_size", "edge_prob", "policy_size", "spec_len", "rego_lines_of_code", "error_rate", "edge_number", "comp_cum_time", "result"] df = pd.read_csv("equivalence.dat", sep=';', usecols=[0,1,2,3,4,5,6,7,8,9,10], names=header_list) chunks = pd.read_csv("equivalence.dat", sep=';', usecols=[0,1,2,3,4,5,6,7,8,9,10], names=header_list, chunksize=50000) slim_data = [] for chunk in chunks: chunk["comp_cum_time"] *= 1000 slim_data.append(chunk) df = pd.concat(slim_data) # Seconds to Milliseconds #df["comp_cum_time"] *= 1000 df ###Output _____no_output_____ ###Markdown Removing outliers and harmonizing sample size*** ###Code # All values of predictors set_sizes = sorted(df.set_size.unique()) policy_sizes = sorted(df.policy_size.unique()) error_rates = sorted(df.error_rate.unique()) # Removing 0.7 error rates values from data #error_rates = error_rates[:-1] print("Used values:") print("Set sizes: {}".format(set_sizes)) print("Policy sizes: {}".format(policy_sizes)) print("Error rates: {}".format(error_rates)) print("") # Making list of DFs by predictors preds_df_list = [] labels = [] for set_size in set_sizes: for policy_size in policy_sizes: for error_rate in error_rates: print("DF parameters: {} {} {}".format(set_size, policy_size, error_rate)) labels.append("{} {} {}".format(set_size, policy_size, error_rate)) preds_df_list.append(df[(df.set_size == set_size) & (df.policy_size == policy_size) & (df.error_rate == error_rate)]) print("\n") print("Unmodified DF shapes") for pred_df in preds_df_list: print(pred_df.shape) print("\n") # Removing outliers in DFs new_preds_df_list = [] for pred_df in preds_df_list: # Remove all values with Z-score > 3 new_preds_df_list.append( pred_df[np.abs(pred_df.comp_cum_time-pred_df.comp_cum_time.mean()) <= (3*pred_df.comp_cum_time.std())] ) preds_df_list = new_preds_df_list # Print DF shapes to check sample sizes, put them in list preds_df_list_sample_sizes = [] print("No outliers DF shapes") for pred_df in preds_df_list: print(pred_df.shape) preds_df_list_sample_sizes.append(pred_df.shape[0]) print("\n") minimum_sample_size = min(preds_df_list_sample_sizes) print("Minimum common sample size: {}".format(minimum_sample_size)) # Make sample sizes equal new_preds_df_list = [] for pred_df in preds_df_list: new_preds_df_list.append(pred_df.head(minimum_sample_size)) preds_df_list = new_preds_df_list # Check new DF shapes print("Modified DF shapes") for pred_df in preds_df_list: print(pred_df.shape) print("\n") ###Output Used values: Set sizes: [10, 20, 30, 50, 100] Policy sizes: [1, 2] Error rates: [0.0, 0.2, 0.4, 0.7] DF parameters: 10 1 0.0 DF parameters: 10 1 0.2 DF parameters: 10 1 0.4 DF parameters: 10 1 0.7 DF parameters: 10 2 0.0 DF parameters: 10 2 0.2 DF parameters: 10 2 0.4 DF parameters: 10 2 0.7 DF parameters: 20 1 0.0 DF parameters: 20 1 0.2 DF parameters: 20 1 0.4 DF parameters: 20 1 0.7 DF parameters: 20 2 0.0 DF parameters: 20 2 0.2 DF parameters: 20 2 0.4 DF parameters: 20 2 0.7 DF parameters: 30 1 0.0 DF parameters: 30 1 0.2 DF parameters: 30 1 0.4 DF parameters: 30 1 0.7 DF parameters: 30 2 0.0 DF parameters: 30 2 
0.2 DF parameters: 30 2 0.4 DF parameters: 30 2 0.7 DF parameters: 50 1 0.0 DF parameters: 50 1 0.2 DF parameters: 50 1 0.4 DF parameters: 50 1 0.7 DF parameters: 50 2 0.0 DF parameters: 50 2 0.2 DF parameters: 50 2 0.4 DF parameters: 50 2 0.7 DF parameters: 100 1 0.0 DF parameters: 100 1 0.2 DF parameters: 100 1 0.4 DF parameters: 100 1 0.7 DF parameters: 100 2 0.0 DF parameters: 100 2 0.2 DF parameters: 100 2 0.4 DF parameters: 100 2 0.7 Unmodified DF shapes (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) (27000, 11) No outliers DF shapes (26587, 11) (26601, 11) (26558, 11) (26596, 11) (26987, 11) (26982, 11) (26986, 11) (26987, 11) (26995, 11) (26993, 11) (26992, 11) (26992, 11) (26808, 11) (26900, 11) (26884, 11) (26848, 11) (26691, 11) (26704, 11) (26716, 11) (26718, 11) (26695, 11) (26688, 11) (26669, 11) (26689, 11) (26825, 11) (26870, 11) (26862, 11) (26857, 11) (26705, 11) (26712, 11) (26717, 11) (26712, 11) (26409, 11) (26084, 11) (26198, 11) (26280, 11) (26257, 11) (26304, 11) (26254, 11) (26202, 11) Minimum common sample size: 26084 Modified DF shapes (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) (26084, 11) ###Markdown Merge all data back together*** ###Code merged_df = pd.DataFrame() for pred_df in preds_df_list: merged_df = merged_df.append(pred_df) merged_df ###Output _____no_output_____ ###Markdown Fit m*log(m) curve to data*** ###Code x = np.array(merged_df["edge_number"], dtype=float) y = np.array(merged_df["comp_cum_time"], dtype=float) def log_func(x, a, b): return a*x*np.log(x) + b popt, pcov = scipy.optimize.curve_fit(log_func, x, y) print("a = {}, b = {}".format(popt[0], popt[1])) a_log = popt[0] b_log = popt[1] log_formula_str = merged_df.columns[-2] + ' ~ ' + 'log_func(edge_number, a_log, b_log)' log_formula_str ###Output a = 0.0024863014152139476, b = -0.1976008531846339 ###Markdown Group data by edge number and calculate mean time*** ###Code df_mod = merged_df.groupby('edge_number', as_index=False)['comp_cum_time'].mean() df_mod ###Output _____no_output_____ ###Markdown Scatter plot and regression*** All points*** ###Code # Scatter plot sns.scatterplot(data=merged_df, x="edge_number", y="comp_cum_time") # Curve with fitted coefficients x = np.linspace(1, merged_df["edge_number"].max() + 10, merged_df["edge_number"].max() + 10) y = a_log * x * np.log(x) + b_log # Plotting fitted curve on top of data plt.plot(x, y, label="Fitted Curve", color="red") plt.xticks(fontsize=22) plt.yticks(fontsize=30) plt.ylim(top=120) plt.xlabel("Number of edges", fontsize=35) plt.ylabel("Algo duration (ms)", fontsize=35) plt.tight_layout() plt.savefig("figures/scatterplot-regression.pdf") ###Output _____no_output_____ ###Markdown Only means*** ###Code # 
Scatter plot sns.scatterplot(data=df_mod, x="edge_number", y="comp_cum_time") # Curve with fitted coefficients x = np.linspace(1, merged_df["edge_number"].max() + 10, merged_df["edge_number"].max() + 10) y = a_log * x * np.log(x) + b_log # Plotting fitted curve on top of data plt.plot(x, y, label="Fitted Curve", color="red") plt.xticks(fontsize=22) plt.yticks(fontsize=30) plt.xlabel("Number of edges", fontsize=35) plt.ylabel("Algo duration (ms)", fontsize=35) plt.tight_layout() plt.savefig("figures/scatterplot-means-regression.pdf") ###Output _____no_output_____
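###Markdown Goodness-of-fit check*** A minimal sketch quantifying how well the fitted a*m*log(m) curve matches the data via the coefficient of determination (R^2); it only reuses `log_func`, `a_log`, `b_log`, and `merged_df` from the cells above. ###Code # Minimal sketch: R^2 of the fitted m*log(m) curve (assumes log_func,
# a_log, b_log, and merged_df from the cells above).
y_true = np.array(merged_df["comp_cum_time"], dtype=float)
y_pred = log_func(np.array(merged_df["edge_number"], dtype=float), a_log, b_log)
ss_res = np.sum((y_true - y_pred) ** 2)         # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot ###Output _____no_output_____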
L11-optim/code/HW03-experiments/02_hw3-tanh.ipynb
###Markdown STAT 453: Deep Learning (Spring 2020) Instructor: Sebastian Raschka ([email protected]) Course website: http://pages.stat.wisc.edu/~sraschka/teaching/stat453-ss2020/ GitHub repository: https://github.com/rasbt/stat453-deep-learning-ss20 --- ###Code %load_ext watermark
%watermark -a 'Sebastian Raschka' -v -p matplotlib,torch,pandas,numpy,PIL ###Output Sebastian Raschka

CPython 3.7.1
IPython 7.13.0

matplotlib 3.1.0
torch 1.4.0
pandas 1.0.1
numpy 1.18.1
PIL 7.0.0 ###Markdown Note that Python's Imaging Library (PIL) can be installed via `conda install pillow` **If you have any installation issues, please don't hesitate to ask via Piazza!** HW 3: Graduate Student Descent! Training and Tuning a Multilayer Perceptron (40 pts) **Your task in this homework is to take this existing Multilayer Perceptron implementation and change it to achieve better performance on [Fashion-MNIST](https://github.com/zalandoresearch/fashion-mnist).**

---

For a successful outcome of this homework:

- Your Validation and Test set accuracies should be
  - greater than 85% for 10 pts
  - greater than 86% for 20 pts
  - greater than 87% for 30 pts
  - greater than 88% for 40 pts
- Answer the questions at the bottom of this notebook
- Submit this Jupyter Notebook with your modifications as .ipynb and .html files to Canvas (similar to previous homeworks)

---

**And as promised in class, the student with the highest performance will receive a little gift!**

---

Please read and execute this notebook first to make sure everything works correctly. Then, feel free to make any changes to the architecture, i.e., you can change

- the number of hidden layers
- the activation function(s) (logistic sigmoid, tanh, relu, leaky relu, ...; a short optional sketch of these appears just before the Dataset Loader section)
- the learning rate
- the number of units in the hidden layer(s)
- the number of epochs
- the minibatch size

However,

- don't change the weight initialization
- don't change the random seed
- don't change the optimization algorithm

The cells where you can/should make changes are clearly highlighted. For instance, I added comments as shown below:

```
############################################################
# THIS CELL CAN BE MODIFIED
############################################################
``` ###Code import torch
import os
import pandas as pd
import numpy as np
import time
import random

from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms
from torch.utils.data import DataLoader
import torch.nn.functional as F
import matplotlib.pyplot as plt
%matplotlib inline

# No need to change anything here!
# If there is a GPU available, it will use it,
# otherwise, it will use the CPU
RANDOM_SEED = 123
DEVICE = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu') ###Output _____no_output_____ ###Markdown Loading the Dataset The dataset consists of 10 classes similar to the original MNIST dataset. Also, it shares the same overall structure with MNIST, i.e., there are 60k training images and 10k test images, and all images are black & white images of size 28x28. Below is an example of what the images look like:![](figures/fashion-mnist-sprite.png)(Image Source: https://github.com/zalandoresearch/fashion-mnist) The 10 classes in this dataset are

| Label | Description |
| --- | --- |
| 0 | T-shirt/top |
| 1 | Trouser |
| 2 | Pullover |
| 3 | Dress |
| 4 | Coat |
| 5 | Sandal |
| 6 | Shirt |
| 7 | Sneaker |
| 8 | Bag |
| 9 | Ankle boot |

Before you continue, please execute the companion notebook "Notebook for Preparing the Dataset for HW3" ([`hw3-dataprep.ipynb`](hw3-dataprep.ipynb)) for downloading and preparing the dataset.
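###Markdown A minimal, optional sketch of the activation functions mentioned in the instructions above; `h` is a hypothetical pre-activation tensor, and nothing here is used by the training code below. ###Code # Illustration only: three interchangeable activations applied to a
# hypothetical pre-activation tensor h; none of this feeds into training.
import torch
import torch.nn.functional as F

h = torch.randn(4, 10)                            # hypothetical pre-activations
out_tanh = torch.tanh(h)                          # squashes values into (-1, 1)
out_relu = F.relu(h)                              # zeroes out negative values
out_leaky = F.leaky_relu(h, negative_slope=0.01)  # keeps a small slope for negatives ###Output _____no_output_____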
--- Dataset Loader ###Code import torch
from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms
from torch.utils.data import DataLoader
import pandas as pd
import os


class FashionMNISTDataset(Dataset):
    """Custom Dataset for loading FashionMNIST images"""

    def __init__(self, csv_path, img_dir, transform=None):
        df = pd.read_csv(csv_path)
        self.img_dir = img_dir
        self.img_names = df['image_name'].values
        self.y = df['class_label'].values
        self.transform = transform

    def __getitem__(self, index):
        img = Image.open(os.path.join(self.img_dir, self.img_names[index]))
        if self.transform is not None:
            img = self.transform(img)
        label = self.y[index]
        return img, label

    def __len__(self):
        return self.y.shape[0] ############################################################
# THIS CELL CAN BE MODIFIED
############################################################

custom_train_transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5,), std=(0.5,))
])

####################################################################
# THIS CELL CAN BE MODIFIED BUT THERE SHOULD NOT BE ANY RANDOMNESS
####################################################################

custom_test_transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5,), std=(0.5,))
])

############################################################
# THIS CELL CAN BE MODIFIED
############################################################

BATCH_SIZE = 64

train_dataset = FashionMNISTDataset(csv_path='train.csv',
                                    img_dir='png-files/',
                                    transform=custom_train_transform)

train_loader = DataLoader(dataset=train_dataset,
                          batch_size=BATCH_SIZE,
                          shuffle=True,
                          drop_last=True,
                          num_workers=4)

valid_dataset = FashionMNISTDataset(csv_path='valid.csv',
                                    img_dir='png-files/',
                                    transform=custom_test_transform)

valid_loader = DataLoader(dataset=valid_dataset,
                          batch_size=BATCH_SIZE,
                          shuffle=False,
                          num_workers=4)

test_dataset = FashionMNISTDataset(csv_path='test.csv',
                                   img_dir='png-files/',
                                   transform=custom_test_transform)

test_loader = DataLoader(dataset=test_dataset,
                         batch_size=BATCH_SIZE,
                         shuffle=False,
                         num_workers=4) ###Output _____no_output_____ ###Markdown The cell below just checks if the dataset can be loaded correctly. ###Code torch.manual_seed(0)
num_epochs = 2
for epoch in range(num_epochs):

    for batch_idx, (x, y) in enumerate(train_loader):
        print('Epoch:', epoch+1, end='')
        print(' | Batch index:', batch_idx, end='')
        print(' | Batch size:', y.size()[0])
        x = x.to(DEVICE)
        y = y.to(DEVICE)
        print('break minibatch for-loop')
        break ###Output Epoch: 1 | Batch index: 0 | Batch size: 64
break minibatch for-loop
Epoch: 2 | Batch index: 0 | Batch size: 64
break minibatch for-loop ###Markdown If you get an error, make sure the `png-files` folder is unzipped and in the same directory as this notebook! Multilayer Perceptron Model The cell below contains the multi-layer perceptron model. This is the main section where you want to make changes to the architecture.
###Code ############################################################
# THIS CELL CAN BE MODIFIED
############################################################

class MLP(torch.nn.Module):

    def __init__(self, num_features, num_hidden_1, num_hidden_2,
                 num_hidden_3, num_classes):
        super(MLP, self).__init__()

        self.num_classes = num_classes

        ### ADD ADDITIONAL LAYERS BELOW IF YOU LIKE
        self.linear_1 = torch.nn.Linear(num_features, num_hidden_1)
        self.linear_2 = torch.nn.Linear(num_hidden_1, num_hidden_2)
        self.linear_3 = torch.nn.Linear(num_hidden_2, num_hidden_3)
        self.linear_out = torch.nn.Linear(num_hidden_3, num_classes)

    def forward(self, x):
        ### MAKE SURE YOU CONNECT THE LAYERS PROPERLY IF YOU CHANGED
        ### ANYTHING IN THE __init__ METHOD ABOVE
        out = self.linear_1(x)
        out = torch.tanh(out)
        out = self.linear_2(out)
        out = torch.tanh(out)
        out = self.linear_3(out)
        out = torch.tanh(out)
        logits = self.linear_out(out)
        probas = F.softmax(logits, dim=1)
        return logits, probas

#################################
### Model Initialization
#################################

# the random seed makes sure that the random weight initialization
# in the model is always the same.
# In practice, some weights don't work well, and we may also want
# to try different random seeds. In this homework, this is not
# necessary.
random.seed(RANDOM_SEED)
torch.manual_seed(RANDOM_SEED)

### IF YOU CHANGED THE ARCHITECTURE ABOVE, MAKE SURE YOU
### ACCOUNT FOR IT VIA THE PARAMETERS BELOW. I.e., if you
### added a second hidden layer, you may want to add a
### hidden_2 parameter here. Also you may want to play
### with the number of hidden units.

model = MLP(num_features=28*28,
            num_hidden_1=75,
            num_hidden_2=50,
            num_hidden_3=25,
            num_classes=10)

model = model.to(DEVICE) ############################################################
# THIS CELL CAN BE MODIFIED
############################################################

### For this homework, do not change the optimizer. However, you
### likely want to experiment with the learning rate!

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

############################################################

############################################################
# THIS CELL CAN BE MODIFIED
############################################################

NUM_EPOCHS = 50 # Please feel free to change

############################################################

def compute_accuracy_and_loss(model, data_loader, device):
    correct_pred, num_examples = 0, 0
    cross_entropy = 0.
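    # Iterate over all minibatches: accumulate the per-minibatch mean
    # cross-entropy and the number of correctly predicted labels. (Note:
    # summing batch means and dividing by num_examples below rescales the
    # loss, but it is consistent across epochs, which is all the logged
    # curves need.)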
for i, (features, targets) in enumerate(data_loader): features = features.view(-1, 28*28).to(device) targets = targets.to(device) logits, probas = model(features) cross_entropy += F.cross_entropy(logits, targets).item() _, predicted_labels = torch.max(probas, 1) num_examples += targets.size(0) correct_pred += (predicted_labels == targets).sum() return correct_pred.float()/num_examples * 100, cross_entropy/num_examples start_time = time.time() train_acc_lst, valid_acc_lst = [], [] train_loss_lst, valid_loss_lst = [], [] for epoch in range(NUM_EPOCHS): model.train() for batch_idx, (features, targets) in enumerate(train_loader): ### PREPARE MINIBATCH features = features.view(-1, 28*28).to(DEVICE) targets = targets.to(DEVICE) ### FORWARD AND BACK PROP logits, probas = model(features) cost = F.cross_entropy(logits, targets) optimizer.zero_grad() cost.backward() ### UPDATE MODEL PARAMETERS optimizer.step() ### LOGGING if not batch_idx % 20: print (f'Epoch: {epoch+1:03d}/{NUM_EPOCHS:03d} | ' f'Batch {batch_idx:03d}/{len(train_loader):03d} |' f' Cost: {cost:.4f}') # no need to build the computation graph for backprop when computing accuracy model.eval() with torch.set_grad_enabled(False): train_acc, train_loss = compute_accuracy_and_loss(model, train_loader, device=DEVICE) valid_acc, valid_loss = compute_accuracy_and_loss(model, valid_loader, device=DEVICE) train_acc_lst.append(train_acc) valid_acc_lst.append(valid_acc) train_loss_lst.append(train_loss) valid_loss_lst.append(valid_loss) print(f'Epoch: {epoch+1:03d}/{NUM_EPOCHS:03d} Train Acc.: {train_acc:.2f}%' f' | Validation Acc.: {valid_acc:.2f}%') elapsed = (time.time() - start_time)/60 print(f'Time elapsed: {elapsed:.2f} min') elapsed = (time.time() - start_time)/60 print(f'Total Training Time: {elapsed:.2f} min') ###Output Epoch: 001/050 | Batch 000/859 | Cost: 2.3331 Epoch: 001/050 | Batch 020/859 | Cost: 1.7824 Epoch: 001/050 | Batch 040/859 | Cost: 1.2641 Epoch: 001/050 | Batch 060/859 | Cost: 0.9086 Epoch: 001/050 | Batch 080/859 | Cost: 0.9438 Epoch: 001/050 | Batch 100/859 | Cost: 0.7324 Epoch: 001/050 | Batch 120/859 | Cost: 0.7326 Epoch: 001/050 | Batch 140/859 | Cost: 0.6527 Epoch: 001/050 | Batch 160/859 | Cost: 0.6766 Epoch: 001/050 | Batch 180/859 | Cost: 0.6882 Epoch: 001/050 | Batch 200/859 | Cost: 0.6073 Epoch: 001/050 | Batch 220/859 | Cost: 0.7083 Epoch: 001/050 | Batch 240/859 | Cost: 0.4937 Epoch: 001/050 | Batch 260/859 | Cost: 0.6272 Epoch: 001/050 | Batch 280/859 | Cost: 0.8784 Epoch: 001/050 | Batch 300/859 | Cost: 0.7410 Epoch: 001/050 | Batch 320/859 | Cost: 0.6813 Epoch: 001/050 | Batch 340/859 | Cost: 0.6547 Epoch: 001/050 | Batch 360/859 | Cost: 0.5783 Epoch: 001/050 | Batch 380/859 | Cost: 0.4853 Epoch: 001/050 | Batch 400/859 | Cost: 0.5365 Epoch: 001/050 | Batch 420/859 | Cost: 0.4763 Epoch: 001/050 | Batch 440/859 | Cost: 0.5350 Epoch: 001/050 | Batch 460/859 | Cost: 0.4630 Epoch: 001/050 | Batch 480/859 | Cost: 0.4425 Epoch: 001/050 | Batch 500/859 | Cost: 0.4995 Epoch: 001/050 | Batch 520/859 | Cost: 0.5956 Epoch: 001/050 | Batch 540/859 | Cost: 0.7021 Epoch: 001/050 | Batch 560/859 | Cost: 0.4333 Epoch: 001/050 | Batch 580/859 | Cost: 0.5645 Epoch: 001/050 | Batch 600/859 | Cost: 0.4433 Epoch: 001/050 | Batch 620/859 | Cost: 0.5796 Epoch: 001/050 | Batch 640/859 | Cost: 0.6918 Epoch: 001/050 | Batch 660/859 | Cost: 0.5170 Epoch: 001/050 | Batch 680/859 | Cost: 0.6431 Epoch: 001/050 | Batch 700/859 | Cost: 0.5894 Epoch: 001/050 | Batch 720/859 | Cost: 0.5996 Epoch: 001/050 | Batch 740/859 | Cost: 0.5186 
Epoch: 001/050 | Batch 760/859 | Cost: 0.3527 Epoch: 001/050 | Batch 780/859 | Cost: 0.5078 Epoch: 001/050 | Batch 800/859 | Cost: 0.5046 Epoch: 001/050 | Batch 820/859 | Cost: 0.4890 Epoch: 001/050 | Batch 840/859 | Cost: 0.5818 Epoch: 001/050 Train Acc.: 83.33% | Validation Acc.: 83.12% Time elapsed: 0.23 min Epoch: 002/050 | Batch 000/859 | Cost: 0.3389 Epoch: 002/050 | Batch 020/859 | Cost: 0.4080 Epoch: 002/050 | Batch 040/859 | Cost: 0.4553 Epoch: 002/050 | Batch 060/859 | Cost: 0.4079 Epoch: 002/050 | Batch 080/859 | Cost: 0.4345 Epoch: 002/050 | Batch 100/859 | Cost: 0.5626 Epoch: 002/050 | Batch 120/859 | Cost: 0.5876 Epoch: 002/050 | Batch 140/859 | Cost: 0.4401 Epoch: 002/050 | Batch 160/859 | Cost: 0.3515 Epoch: 002/050 | Batch 180/859 | Cost: 0.4840 Epoch: 002/050 | Batch 200/859 | Cost: 0.5122 Epoch: 002/050 | Batch 220/859 | Cost: 0.3826 Epoch: 002/050 | Batch 240/859 | Cost: 0.5452 Epoch: 002/050 | Batch 260/859 | Cost: 0.3723 Epoch: 002/050 | Batch 280/859 | Cost: 0.3934 Epoch: 002/050 | Batch 300/859 | Cost: 0.4100 Epoch: 002/050 | Batch 320/859 | Cost: 0.7461 Epoch: 002/050 | Batch 340/859 | Cost: 0.3071 Epoch: 002/050 | Batch 360/859 | Cost: 0.3877 Epoch: 002/050 | Batch 380/859 | Cost: 0.4044 Epoch: 002/050 | Batch 400/859 | Cost: 0.4109 Epoch: 002/050 | Batch 420/859 | Cost: 0.2861 Epoch: 002/050 | Batch 440/859 | Cost: 0.4138 Epoch: 002/050 | Batch 460/859 | Cost: 0.4675 Epoch: 002/050 | Batch 480/859 | Cost: 0.3843 Epoch: 002/050 | Batch 500/859 | Cost: 0.3635 Epoch: 002/050 | Batch 520/859 | Cost: 0.3590 Epoch: 002/050 | Batch 540/859 | Cost: 0.3113 Epoch: 002/050 | Batch 560/859 | Cost: 0.4519 Epoch: 002/050 | Batch 580/859 | Cost: 0.2864 Epoch: 002/050 | Batch 600/859 | Cost: 0.5296 Epoch: 002/050 | Batch 620/859 | Cost: 0.5765 Epoch: 002/050 | Batch 640/859 | Cost: 0.4660 Epoch: 002/050 | Batch 660/859 | Cost: 0.3820 Epoch: 002/050 | Batch 680/859 | Cost: 0.2834 Epoch: 002/050 | Batch 700/859 | Cost: 0.5233 Epoch: 002/050 | Batch 720/859 | Cost: 0.1960 Epoch: 002/050 | Batch 740/859 | Cost: 0.6929 Epoch: 002/050 | Batch 760/859 | Cost: 0.4009 Epoch: 002/050 | Batch 780/859 | Cost: 0.5954 Epoch: 002/050 | Batch 800/859 | Cost: 0.3472 Epoch: 002/050 | Batch 820/859 | Cost: 0.5055 Epoch: 002/050 | Batch 840/859 | Cost: 0.4708 Epoch: 002/050 Train Acc.: 86.06% | Validation Acc.: 85.28% Time elapsed: 0.41 min Epoch: 003/050 | Batch 000/859 | Cost: 0.3331 Epoch: 003/050 | Batch 020/859 | Cost: 0.3673 Epoch: 003/050 | Batch 040/859 | Cost: 0.3466 Epoch: 003/050 | Batch 060/859 | Cost: 0.2561 Epoch: 003/050 | Batch 080/859 | Cost: 0.3712 Epoch: 003/050 | Batch 100/859 | Cost: 0.3453 Epoch: 003/050 | Batch 120/859 | Cost: 0.2176 Epoch: 003/050 | Batch 140/859 | Cost: 0.4024 Epoch: 003/050 | Batch 160/859 | Cost: 0.4357 Epoch: 003/050 | Batch 180/859 | Cost: 0.3169 Epoch: 003/050 | Batch 200/859 | Cost: 0.4033 Epoch: 003/050 | Batch 220/859 | Cost: 0.5877 Epoch: 003/050 | Batch 240/859 | Cost: 0.5005 Epoch: 003/050 | Batch 260/859 | Cost: 0.1907 Epoch: 003/050 | Batch 280/859 | Cost: 0.2782 Epoch: 003/050 | Batch 300/859 | Cost: 0.4119 Epoch: 003/050 | Batch 320/859 | Cost: 0.5707 Epoch: 003/050 | Batch 340/859 | Cost: 0.3368 Epoch: 003/050 | Batch 360/859 | Cost: 0.3573 Epoch: 003/050 | Batch 380/859 | Cost: 0.4647 Epoch: 003/050 | Batch 400/859 | Cost: 0.5224 Epoch: 003/050 | Batch 420/859 | Cost: 0.3494 Epoch: 003/050 | Batch 440/859 | Cost: 0.2722 Epoch: 003/050 | Batch 460/859 | Cost: 0.3975 Epoch: 003/050 | Batch 480/859 | Cost: 0.5259 Epoch: 003/050 | Batch 500/859 
| Cost: 0.5090 Epoch: 003/050 | Batch 520/859 | Cost: 0.4408 Epoch: 003/050 | Batch 540/859 | Cost: 0.3433 Epoch: 003/050 | Batch 560/859 | Cost: 0.3428 Epoch: 003/050 | Batch 580/859 | Cost: 0.4281 Epoch: 003/050 | Batch 600/859 | Cost: 0.2686 Epoch: 003/050 | Batch 620/859 | Cost: 0.2409 Epoch: 003/050 | Batch 640/859 | Cost: 0.2900 Epoch: 003/050 | Batch 660/859 | Cost: 0.2466 Epoch: 003/050 | Batch 680/859 | Cost: 0.4581 Epoch: 003/050 | Batch 700/859 | Cost: 0.3990 Epoch: 003/050 | Batch 720/859 | Cost: 0.3755 Epoch: 003/050 | Batch 740/859 | Cost: 0.2649 Epoch: 003/050 | Batch 760/859 | Cost: 0.3866 Epoch: 003/050 | Batch 780/859 | Cost: 0.3665 Epoch: 003/050 | Batch 800/859 | Cost: 0.4415 Epoch: 003/050 | Batch 820/859 | Cost: 0.4122 Epoch: 003/050 | Batch 840/859 | Cost: 0.5753 Epoch: 003/050 Train Acc.: 87.30% | Validation Acc.: 86.82% Time elapsed: 0.60 min Epoch: 004/050 | Batch 000/859 | Cost: 0.5313 Epoch: 004/050 | Batch 020/859 | Cost: 0.4560 Epoch: 004/050 | Batch 040/859 | Cost: 0.2909 Epoch: 004/050 | Batch 060/859 | Cost: 0.3304 Epoch: 004/050 | Batch 080/859 | Cost: 0.2387 Epoch: 004/050 | Batch 100/859 | Cost: 0.3177 Epoch: 004/050 | Batch 120/859 | Cost: 0.2246 Epoch: 004/050 | Batch 140/859 | Cost: 0.3130 Epoch: 004/050 | Batch 160/859 | Cost: 0.3131 Epoch: 004/050 | Batch 180/859 | Cost: 0.3086 Epoch: 004/050 | Batch 200/859 | Cost: 0.4397 Epoch: 004/050 | Batch 220/859 | Cost: 0.3520 Epoch: 004/050 | Batch 240/859 | Cost: 0.3877 Epoch: 004/050 | Batch 260/859 | Cost: 0.3473 Epoch: 004/050 | Batch 280/859 | Cost: 0.4055 Epoch: 004/050 | Batch 300/859 | Cost: 0.6045 Epoch: 004/050 | Batch 320/859 | Cost: 0.4639 Epoch: 004/050 | Batch 340/859 | Cost: 0.5630 Epoch: 004/050 | Batch 360/859 | Cost: 0.4264 Epoch: 004/050 | Batch 380/859 | Cost: 0.2146 Epoch: 004/050 | Batch 400/859 | Cost: 0.4397 Epoch: 004/050 | Batch 420/859 | Cost: 0.4090 Epoch: 004/050 | Batch 440/859 | Cost: 0.4345 Epoch: 004/050 | Batch 460/859 | Cost: 0.3314 Epoch: 004/050 | Batch 480/859 | Cost: 0.3971 Epoch: 004/050 | Batch 500/859 | Cost: 0.5167 Epoch: 004/050 | Batch 520/859 | Cost: 0.3729 Epoch: 004/050 | Batch 540/859 | Cost: 0.3072 Epoch: 004/050 | Batch 560/859 | Cost: 0.2928 Epoch: 004/050 | Batch 580/859 | Cost: 0.3561 Epoch: 004/050 | Batch 600/859 | Cost: 0.3459 Epoch: 004/050 | Batch 620/859 | Cost: 0.3599 Epoch: 004/050 | Batch 640/859 | Cost: 0.3487 Epoch: 004/050 | Batch 660/859 | Cost: 0.4872 Epoch: 004/050 | Batch 680/859 | Cost: 0.2947 Epoch: 004/050 | Batch 700/859 | Cost: 0.4998 Epoch: 004/050 | Batch 720/859 | Cost: 0.4008 Epoch: 004/050 | Batch 740/859 | Cost: 0.4094 Epoch: 004/050 | Batch 760/859 | Cost: 0.4165 Epoch: 004/050 | Batch 780/859 | Cost: 0.2591 Epoch: 004/050 | Batch 800/859 | Cost: 0.3185 Epoch: 004/050 | Batch 820/859 | Cost: 0.3990 Epoch: 004/050 | Batch 840/859 | Cost: 0.2990 Epoch: 004/050 Train Acc.: 86.35% | Validation Acc.: 85.08% Time elapsed: 0.79 min Epoch: 005/050 | Batch 000/859 | Cost: 0.2553 Epoch: 005/050 | Batch 020/859 | Cost: 0.2315 Epoch: 005/050 | Batch 040/859 | Cost: 0.1914 Epoch: 005/050 | Batch 060/859 | Cost: 0.3990 Epoch: 005/050 | Batch 080/859 | Cost: 0.2803 Epoch: 005/050 | Batch 100/859 | Cost: 0.3526 Epoch: 005/050 | Batch 120/859 | Cost: 0.3757 Epoch: 005/050 | Batch 140/859 | Cost: 0.2654 Epoch: 005/050 | Batch 160/859 | Cost: 0.3605 Epoch: 005/050 | Batch 180/859 | Cost: 0.1589 Epoch: 005/050 | Batch 200/859 | Cost: 0.3810 Epoch: 005/050 | Batch 220/859 | Cost: 0.4012 Epoch: 005/050 | Batch 240/859 | Cost: 0.4412 Epoch: 005/050 
| Batch 260/859 | Cost: 0.2730 Epoch: 005/050 | Batch 280/859 | Cost: 0.2763 Epoch: 005/050 | Batch 300/859 | Cost: 0.3903 Epoch: 005/050 | Batch 320/859 | Cost: 0.3830 Epoch: 005/050 | Batch 340/859 | Cost: 0.2809 Epoch: 005/050 | Batch 360/859 | Cost: 0.3089 Epoch: 005/050 | Batch 380/859 | Cost: 0.6638 Epoch: 005/050 | Batch 400/859 | Cost: 0.2614 Epoch: 005/050 | Batch 420/859 | Cost: 0.2688 Epoch: 005/050 | Batch 440/859 | Cost: 0.3339 Epoch: 005/050 | Batch 460/859 | Cost: 0.4134 Epoch: 005/050 | Batch 480/859 | Cost: 0.2281 Epoch: 005/050 | Batch 500/859 | Cost: 0.3271 Epoch: 005/050 | Batch 520/859 | Cost: 0.2991 Epoch: 005/050 | Batch 540/859 | Cost: 0.3672 Epoch: 005/050 | Batch 560/859 | Cost: 0.4080 Epoch: 005/050 | Batch 580/859 | Cost: 0.3861 Epoch: 005/050 | Batch 600/859 | Cost: 0.3801 Epoch: 005/050 | Batch 620/859 | Cost: 0.2395 Epoch: 005/050 | Batch 640/859 | Cost: 0.4302 Epoch: 005/050 | Batch 660/859 | Cost: 0.3942 Epoch: 005/050 | Batch 680/859 | Cost: 0.1401 Epoch: 005/050 | Batch 700/859 | Cost: 0.1523 Epoch: 005/050 | Batch 720/859 | Cost: 0.3942 Epoch: 005/050 | Batch 740/859 | Cost: 0.3172 Epoch: 005/050 | Batch 760/859 | Cost: 0.3524 Epoch: 005/050 | Batch 780/859 | Cost: 0.2885 Epoch: 005/050 | Batch 800/859 | Cost: 0.2448 Epoch: 005/050 | Batch 820/859 | Cost: 0.5253 Epoch: 005/050 | Batch 840/859 | Cost: 0.3623 Epoch: 005/050 Train Acc.: 87.87% | Validation Acc.: 86.04% Time elapsed: 0.97 min Epoch: 006/050 | Batch 000/859 | Cost: 0.2554 Epoch: 006/050 | Batch 020/859 | Cost: 0.1249 Epoch: 006/050 | Batch 040/859 | Cost: 0.2728 Epoch: 006/050 | Batch 060/859 | Cost: 0.3389 Epoch: 006/050 | Batch 080/859 | Cost: 0.4331 Epoch: 006/050 | Batch 100/859 | Cost: 0.2781 Epoch: 006/050 | Batch 120/859 | Cost: 0.4042 Epoch: 006/050 | Batch 140/859 | Cost: 0.4019 Epoch: 006/050 | Batch 160/859 | Cost: 0.1936 Epoch: 006/050 | Batch 180/859 | Cost: 0.2200 Epoch: 006/050 | Batch 200/859 | Cost: 0.4020 Epoch: 006/050 | Batch 220/859 | Cost: 0.3184 Epoch: 006/050 | Batch 240/859 | Cost: 0.2063 Epoch: 006/050 | Batch 260/859 | Cost: 0.3412 Epoch: 006/050 | Batch 280/859 | Cost: 0.3359 Epoch: 006/050 | Batch 300/859 | Cost: 0.2898 Epoch: 006/050 | Batch 320/859 | Cost: 0.3107 Epoch: 006/050 | Batch 340/859 | Cost: 0.2349 Epoch: 006/050 | Batch 360/859 | Cost: 0.1610 Epoch: 006/050 | Batch 380/859 | Cost: 0.3005 Epoch: 006/050 | Batch 400/859 | Cost: 0.2570 Epoch: 006/050 | Batch 420/859 | Cost: 0.4363 Epoch: 006/050 | Batch 440/859 | Cost: 0.4041 Epoch: 006/050 | Batch 460/859 | Cost: 0.3524 Epoch: 006/050 | Batch 480/859 | Cost: 0.3128 Epoch: 006/050 | Batch 500/859 | Cost: 0.2975 Epoch: 006/050 | Batch 520/859 | Cost: 0.4188 Epoch: 006/050 | Batch 540/859 | Cost: 0.4050 Epoch: 006/050 | Batch 560/859 | Cost: 0.3582 Epoch: 006/050 | Batch 580/859 | Cost: 0.4396 Epoch: 006/050 | Batch 600/859 | Cost: 0.3204 Epoch: 006/050 | Batch 620/859 | Cost: 0.4528 Epoch: 006/050 | Batch 640/859 | Cost: 0.3957 Epoch: 006/050 | Batch 660/859 | Cost: 0.3482 Epoch: 006/050 | Batch 680/859 | Cost: 0.2115 Epoch: 006/050 | Batch 700/859 | Cost: 0.1922 Epoch: 006/050 | Batch 720/859 | Cost: 0.3216 Epoch: 006/050 | Batch 740/859 | Cost: 0.3146 Epoch: 006/050 | Batch 760/859 | Cost: 0.2937 Epoch: 006/050 | Batch 780/859 | Cost: 0.3163 Epoch: 006/050 | Batch 800/859 | Cost: 0.1265 Epoch: 006/050 | Batch 820/859 | Cost: 0.3836 Epoch: 006/050 | Batch 840/859 | Cost: 0.4815 Epoch: 006/050 Train Acc.: 87.88% | Validation Acc.: 86.14% Time elapsed: 1.15 min Epoch: 007/050 | Batch 000/859 | Cost: 0.4356 
Epoch: 007/050 | Batch 020/859 | Cost: 0.2837 Epoch: 007/050 | Batch 040/859 | Cost: 0.3344 Epoch: 007/050 | Batch 060/859 | Cost: 0.3866 Epoch: 007/050 | Batch 080/859 | Cost: 0.2489 Epoch: 007/050 | Batch 100/859 | Cost: 0.2787 Epoch: 007/050 | Batch 120/859 | Cost: 0.3574 Epoch: 007/050 | Batch 140/859 | Cost: 0.2228 Epoch: 007/050 | Batch 160/859 | Cost: 0.3602 Epoch: 007/050 | Batch 180/859 | Cost: 0.4288 Epoch: 007/050 | Batch 200/859 | Cost: 0.1776 Epoch: 007/050 | Batch 220/859 | Cost: 0.3528 Epoch: 007/050 | Batch 240/859 | Cost: 0.2067 Epoch: 007/050 | Batch 260/859 | Cost: 0.2862 Epoch: 007/050 | Batch 280/859 | Cost: 0.3970 Epoch: 007/050 | Batch 300/859 | Cost: 0.4812 Epoch: 007/050 | Batch 320/859 | Cost: 0.3589 Epoch: 007/050 | Batch 340/859 | Cost: 0.2179 Epoch: 007/050 | Batch 360/859 | Cost: 0.2824 Epoch: 007/050 | Batch 380/859 | Cost: 0.5118 Epoch: 007/050 | Batch 400/859 | Cost: 0.3307 Epoch: 007/050 | Batch 420/859 | Cost: 0.2118 Epoch: 007/050 | Batch 440/859 | Cost: 0.4295 Epoch: 007/050 | Batch 460/859 | Cost: 0.3024 Epoch: 007/050 | Batch 480/859 | Cost: 0.2941 Epoch: 007/050 | Batch 500/859 | Cost: 0.2546 Epoch: 007/050 | Batch 520/859 | Cost: 0.3987 Epoch: 007/050 | Batch 540/859 | Cost: 0.2524 Epoch: 007/050 | Batch 560/859 | Cost: 0.2942 Epoch: 007/050 | Batch 580/859 | Cost: 0.1970 Epoch: 007/050 | Batch 600/859 | Cost: 0.4449 Epoch: 007/050 | Batch 620/859 | Cost: 0.2650 Epoch: 007/050 | Batch 640/859 | Cost: 0.2638 Epoch: 007/050 | Batch 660/859 | Cost: 0.2464 Epoch: 007/050 | Batch 680/859 | Cost: 0.2721 Epoch: 007/050 | Batch 700/859 | Cost: 0.2952 Epoch: 007/050 | Batch 720/859 | Cost: 0.4453 Epoch: 007/050 | Batch 740/859 | Cost: 0.2119 Epoch: 007/050 | Batch 760/859 | Cost: 0.2456 Epoch: 007/050 | Batch 780/859 | Cost: 0.2266 Epoch: 007/050 | Batch 800/859 | Cost: 0.4083 Epoch: 007/050 | Batch 820/859 | Cost: 0.3716 Epoch: 007/050 | Batch 840/859 | Cost: 0.5091 Epoch: 007/050 Train Acc.: 90.08% | Validation Acc.: 88.08% Time elapsed: 1.33 min Epoch: 008/050 | Batch 000/859 | Cost: 0.2284 Epoch: 008/050 | Batch 020/859 | Cost: 0.2097 Epoch: 008/050 | Batch 040/859 | Cost: 0.3235 Epoch: 008/050 | Batch 060/859 | Cost: 0.4316 Epoch: 008/050 | Batch 080/859 | Cost: 0.1889 Epoch: 008/050 | Batch 100/859 | Cost: 0.1879 Epoch: 008/050 | Batch 120/859 | Cost: 0.2792 Epoch: 008/050 | Batch 140/859 | Cost: 0.3921 Epoch: 008/050 | Batch 160/859 | Cost: 0.3628 Epoch: 008/050 | Batch 180/859 | Cost: 0.5153 Epoch: 008/050 | Batch 200/859 | Cost: 0.4004 Epoch: 008/050 | Batch 220/859 | Cost: 0.3409 Epoch: 008/050 | Batch 240/859 | Cost: 0.3912 Epoch: 008/050 | Batch 260/859 | Cost: 0.2919 Epoch: 008/050 | Batch 280/859 | Cost: 0.2746 Epoch: 008/050 | Batch 300/859 | Cost: 0.3713 Epoch: 008/050 | Batch 320/859 | Cost: 0.2319 Epoch: 008/050 | Batch 340/859 | Cost: 0.2443 Epoch: 008/050 | Batch 360/859 | Cost: 0.2224 Epoch: 008/050 | Batch 380/859 | Cost: 0.3154 Epoch: 008/050 | Batch 400/859 | Cost: 0.1623 Epoch: 008/050 | Batch 420/859 | Cost: 0.1112 Epoch: 008/050 | Batch 440/859 | Cost: 0.3821 Epoch: 008/050 | Batch 460/859 | Cost: 0.2217 Epoch: 008/050 | Batch 480/859 | Cost: 0.2450 Epoch: 008/050 | Batch 500/859 | Cost: 0.4132 Epoch: 008/050 | Batch 520/859 | Cost: 0.3903 Epoch: 008/050 | Batch 540/859 | Cost: 0.2072 Epoch: 008/050 | Batch 560/859 | Cost: 0.3298 Epoch: 008/050 | Batch 580/859 | Cost: 0.2866 Epoch: 008/050 | Batch 600/859 | Cost: 0.3194 Epoch: 008/050 | Batch 620/859 | Cost: 0.2770 Epoch: 008/050 | Batch 640/859 | Cost: 0.3012 Epoch: 008/050 | 
Batch 660/859 | Cost: 0.2512 Epoch: 008/050 | Batch 680/859 | Cost: 0.2856 Epoch: 008/050 | Batch 700/859 | Cost: 0.2780 Epoch: 008/050 | Batch 720/859 | Cost: 0.2646 Epoch: 008/050 | Batch 740/859 | Cost: 0.2156 Epoch: 008/050 | Batch 760/859 | Cost: 0.3774 Epoch: 008/050 | Batch 780/859 | Cost: 0.2387 Epoch: 008/050 | Batch 800/859 | Cost: 0.2580 Epoch: 008/050 | Batch 820/859 | Cost: 0.4187 Epoch: 008/050 | Batch 840/859 | Cost: 0.3592 Epoch: 008/050 Train Acc.: 90.12% | Validation Acc.: 87.58% Time elapsed: 1.51 min Epoch: 009/050 | Batch 000/859 | Cost: 0.3774 Epoch: 009/050 | Batch 020/859 | Cost: 0.2402 Epoch: 009/050 | Batch 040/859 | Cost: 0.2120 Epoch: 009/050 | Batch 060/859 | Cost: 0.2721 Epoch: 009/050 | Batch 080/859 | Cost: 0.2177 Epoch: 009/050 | Batch 100/859 | Cost: 0.3417 Epoch: 009/050 | Batch 120/859 | Cost: 0.4400 Epoch: 009/050 | Batch 140/859 | Cost: 0.2543 Epoch: 009/050 | Batch 160/859 | Cost: 0.3030 Epoch: 009/050 | Batch 180/859 | Cost: 0.1745 Epoch: 009/050 | Batch 200/859 | Cost: 0.3727 Epoch: 009/050 | Batch 220/859 | Cost: 0.3322 Epoch: 009/050 | Batch 240/859 | Cost: 0.2398 Epoch: 009/050 | Batch 260/859 | Cost: 0.2739 Epoch: 009/050 | Batch 280/859 | Cost: 0.2162 Epoch: 009/050 | Batch 300/859 | Cost: 0.3931 Epoch: 009/050 | Batch 320/859 | Cost: 0.3162 Epoch: 009/050 | Batch 340/859 | Cost: 0.3754 Epoch: 009/050 | Batch 360/859 | Cost: 0.2472 Epoch: 009/050 | Batch 380/859 | Cost: 0.2545 Epoch: 009/050 | Batch 400/859 | Cost: 0.2943 Epoch: 009/050 | Batch 420/859 | Cost: 0.1834 Epoch: 009/050 | Batch 440/859 | Cost: 0.2989 Epoch: 009/050 | Batch 460/859 | Cost: 0.2700 Epoch: 009/050 | Batch 480/859 | Cost: 0.1988 Epoch: 009/050 | Batch 500/859 | Cost: 0.2524 Epoch: 009/050 | Batch 520/859 | Cost: 0.4299 Epoch: 009/050 | Batch 540/859 | Cost: 0.3056 Epoch: 009/050 | Batch 560/859 | Cost: 0.2865 Epoch: 009/050 | Batch 580/859 | Cost: 0.2686 Epoch: 009/050 | Batch 600/859 | Cost: 0.2770 Epoch: 009/050 | Batch 620/859 | Cost: 0.2832 Epoch: 009/050 | Batch 640/859 | Cost: 0.2671 Epoch: 009/050 | Batch 660/859 | Cost: 0.2377 Epoch: 009/050 | Batch 680/859 | Cost: 0.3145 Epoch: 009/050 | Batch 700/859 | Cost: 0.2265 Epoch: 009/050 | Batch 720/859 | Cost: 0.4582 Epoch: 009/050 | Batch 740/859 | Cost: 0.3061 Epoch: 009/050 | Batch 760/859 | Cost: 0.3051 Epoch: 009/050 | Batch 780/859 | Cost: 0.3773 Epoch: 009/050 | Batch 800/859 | Cost: 0.3711 Epoch: 009/050 | Batch 820/859 | Cost: 0.1982 Epoch: 009/050 | Batch 840/859 | Cost: 0.2362 Epoch: 009/050 Train Acc.: 89.68% | Validation Acc.: 86.64% Time elapsed: 1.68 min Epoch: 010/050 | Batch 000/859 | Cost: 0.3063 Epoch: 010/050 | Batch 020/859 | Cost: 0.3262 Epoch: 010/050 | Batch 040/859 | Cost: 0.3996 Epoch: 010/050 | Batch 060/859 | Cost: 0.2621 Epoch: 010/050 | Batch 080/859 | Cost: 0.4335 Epoch: 010/050 | Batch 100/859 | Cost: 0.1522 Epoch: 010/050 | Batch 120/859 | Cost: 0.3314 Epoch: 010/050 | Batch 140/859 | Cost: 0.2629 Epoch: 010/050 | Batch 160/859 | Cost: 0.3189 Epoch: 010/050 | Batch 180/859 | Cost: 0.2859 Epoch: 010/050 | Batch 200/859 | Cost: 0.2742 Epoch: 010/050 | Batch 220/859 | Cost: 0.1915 Epoch: 010/050 | Batch 240/859 | Cost: 0.3715 Epoch: 010/050 | Batch 260/859 | Cost: 0.4274 Epoch: 010/050 | Batch 280/859 | Cost: 0.2568 Epoch: 010/050 | Batch 300/859 | Cost: 0.1602 Epoch: 010/050 | Batch 320/859 | Cost: 0.3749 Epoch: 010/050 | Batch 340/859 | Cost: 0.3588 Epoch: 010/050 | Batch 360/859 | Cost: 0.1234 Epoch: 010/050 | Batch 380/859 | Cost: 0.3455 Epoch: 010/050 | Batch 400/859 | Cost: 0.2429 
Epoch: 010/050 | Batch 420/859 | Cost: 0.2908 Epoch: 010/050 | Batch 440/859 | Cost: 0.3181 Epoch: 010/050 | Batch 460/859 | Cost: 0.2999 Epoch: 010/050 | Batch 480/859 | Cost: 0.2537 Epoch: 010/050 | Batch 500/859 | Cost: 0.3686 Epoch: 010/050 | Batch 520/859 | Cost: 0.2637 Epoch: 010/050 | Batch 540/859 | Cost: 0.2403 Epoch: 010/050 | Batch 560/859 | Cost: 0.4509 Epoch: 010/050 | Batch 580/859 | Cost: 0.2899 Epoch: 010/050 | Batch 600/859 | Cost: 0.2686 Epoch: 010/050 | Batch 620/859 | Cost: 0.5482 Epoch: 010/050 | Batch 640/859 | Cost: 0.2560 Epoch: 010/050 | Batch 660/859 | Cost: 0.2505 Epoch: 010/050 | Batch 680/859 | Cost: 0.3446 Epoch: 010/050 | Batch 700/859 | Cost: 0.2687 Epoch: 010/050 | Batch 720/859 | Cost: 0.3494 Epoch: 010/050 | Batch 740/859 | Cost: 0.2210 Epoch: 010/050 | Batch 760/859 | Cost: 0.2671 Epoch: 010/050 | Batch 780/859 | Cost: 0.4579 Epoch: 010/050 | Batch 800/859 | Cost: 0.3551 Epoch: 010/050 | Batch 820/859 | Cost: 0.1705 Epoch: 010/050 | Batch 840/859 | Cost: 0.2069 Epoch: 010/050 Train Acc.: 89.00% | Validation Acc.: 86.88% Time elapsed: 1.86 min Epoch: 011/050 | Batch 000/859 | Cost: 0.3855 Epoch: 011/050 | Batch 020/859 | Cost: 0.4305 Epoch: 011/050 | Batch 040/859 | Cost: 0.2666 Epoch: 011/050 | Batch 060/859 | Cost: 0.3370 Epoch: 011/050 | Batch 080/859 | Cost: 0.2358 Epoch: 011/050 | Batch 100/859 | Cost: 0.1143 Epoch: 011/050 | Batch 120/859 | Cost: 0.2166 Epoch: 011/050 | Batch 140/859 | Cost: 0.3173 Epoch: 011/050 | Batch 160/859 | Cost: 0.2504 Epoch: 011/050 | Batch 180/859 | Cost: 0.2767 Epoch: 011/050 | Batch 200/859 | Cost: 0.4446 Epoch: 011/050 | Batch 220/859 | Cost: 0.2932 Epoch: 011/050 | Batch 240/859 | Cost: 0.1739 Epoch: 011/050 | Batch 260/859 | Cost: 0.3671 Epoch: 011/050 | Batch 280/859 | Cost: 0.2714 Epoch: 011/050 | Batch 300/859 | Cost: 0.2002 Epoch: 011/050 | Batch 320/859 | Cost: 0.3269 Epoch: 011/050 | Batch 340/859 | Cost: 0.4000 Epoch: 011/050 | Batch 360/859 | Cost: 0.3058 Epoch: 011/050 | Batch 380/859 | Cost: 0.1295 Epoch: 011/050 | Batch 400/859 | Cost: 0.2745 Epoch: 011/050 | Batch 420/859 | Cost: 0.3640 Epoch: 011/050 | Batch 440/859 | Cost: 0.3198 Epoch: 011/050 | Batch 460/859 | Cost: 0.2054 Epoch: 011/050 | Batch 480/859 | Cost: 0.2316 Epoch: 011/050 | Batch 500/859 | Cost: 0.2545 Epoch: 011/050 | Batch 520/859 | Cost: 0.1963 Epoch: 011/050 | Batch 540/859 | Cost: 0.3226 Epoch: 011/050 | Batch 560/859 | Cost: 0.2609 Epoch: 011/050 | Batch 580/859 | Cost: 0.3662 Epoch: 011/050 | Batch 600/859 | Cost: 0.2645 Epoch: 011/050 | Batch 620/859 | Cost: 0.3609 Epoch: 011/050 | Batch 640/859 | Cost: 0.4474 Epoch: 011/050 | Batch 660/859 | Cost: 0.2493 Epoch: 011/050 | Batch 680/859 | Cost: 0.2621 Epoch: 011/050 | Batch 700/859 | Cost: 0.2169 Epoch: 011/050 | Batch 720/859 | Cost: 0.2390 Epoch: 011/050 | Batch 740/859 | Cost: 0.2666 Epoch: 011/050 | Batch 760/859 | Cost: 0.4000 Epoch: 011/050 | Batch 780/859 | Cost: 0.2724 Epoch: 011/050 | Batch 800/859 | Cost: 0.2429 Epoch: 011/050 | Batch 820/859 | Cost: 0.3950 Epoch: 011/050 | Batch 840/859 | Cost: 0.2034 Epoch: 011/050 Train Acc.: 88.37% | Validation Acc.: 85.78% Time elapsed: 2.03 min Epoch: 012/050 | Batch 000/859 | Cost: 0.5046 Epoch: 012/050 | Batch 020/859 | Cost: 0.2162 Epoch: 012/050 | Batch 040/859 | Cost: 0.2352 Epoch: 012/050 | Batch 060/859 | Cost: 0.1599 Epoch: 012/050 | Batch 080/859 | Cost: 0.2115 Epoch: 012/050 | Batch 100/859 | Cost: 0.1326 Epoch: 012/050 | Batch 120/859 | Cost: 0.1166 Epoch: 012/050 | Batch 140/859 | Cost: 0.1782 Epoch: 012/050 | Batch 160/859 
| Cost: 0.1794 Epoch: 012/050 | Batch 180/859 | Cost: 0.2012 Epoch: 012/050 | Batch 200/859 | Cost: 0.1677 Epoch: 012/050 | Batch 220/859 | Cost: 0.2472 Epoch: 012/050 | Batch 240/859 | Cost: 0.3529 Epoch: 012/050 | Batch 260/859 | Cost: 0.4770 Epoch: 012/050 | Batch 280/859 | Cost: 0.2132 Epoch: 012/050 | Batch 300/859 | Cost: 0.2273 Epoch: 012/050 | Batch 320/859 | Cost: 0.2459 Epoch: 012/050 | Batch 340/859 | Cost: 0.2726 Epoch: 012/050 | Batch 360/859 | Cost: 0.2983 Epoch: 012/050 | Batch 380/859 | Cost: 0.2519 Epoch: 012/050 | Batch 400/859 | Cost: 0.2725 Epoch: 012/050 | Batch 420/859 | Cost: 0.3442 Epoch: 012/050 | Batch 440/859 | Cost: 0.2459 Epoch: 012/050 | Batch 460/859 | Cost: 0.1485 Epoch: 012/050 | Batch 480/859 | Cost: 0.4274 Epoch: 012/050 | Batch 500/859 | Cost: 0.2438 Epoch: 012/050 | Batch 520/859 | Cost: 0.2708 Epoch: 012/050 | Batch 540/859 | Cost: 0.4881 Epoch: 012/050 | Batch 560/859 | Cost: 0.2065 Epoch: 012/050 | Batch 580/859 | Cost: 0.3144 Epoch: 012/050 | Batch 600/859 | Cost: 0.2712 Epoch: 012/050 | Batch 620/859 | Cost: 0.1396 Epoch: 012/050 | Batch 640/859 | Cost: 0.1605 Epoch: 012/050 | Batch 660/859 | Cost: 0.3421 Epoch: 012/050 | Batch 680/859 | Cost: 0.1675 Epoch: 012/050 | Batch 700/859 | Cost: 0.4446 Epoch: 012/050 | Batch 720/859 | Cost: 0.2616 Epoch: 012/050 | Batch 740/859 | Cost: 0.3251 Epoch: 012/050 | Batch 760/859 | Cost: 0.4214 Epoch: 012/050 | Batch 780/859 | Cost: 0.2033 Epoch: 012/050 | Batch 800/859 | Cost: 0.2578 Epoch: 012/050 | Batch 820/859 | Cost: 0.2151 Epoch: 012/050 | Batch 840/859 | Cost: 0.2236 Epoch: 012/050 Train Acc.: 90.74% | Validation Acc.: 87.90% Time elapsed: 2.20 min Epoch: 013/050 | Batch 000/859 | Cost: 0.2968 Epoch: 013/050 | Batch 020/859 | Cost: 0.2325 Epoch: 013/050 | Batch 040/859 | Cost: 0.2834 Epoch: 013/050 | Batch 060/859 | Cost: 0.2961 Epoch: 013/050 | Batch 080/859 | Cost: 0.3329 Epoch: 013/050 | Batch 100/859 | Cost: 0.3895 Epoch: 013/050 | Batch 120/859 | Cost: 0.1256 Epoch: 013/050 | Batch 140/859 | Cost: 0.2394 Epoch: 013/050 | Batch 160/859 | Cost: 0.2540 Epoch: 013/050 | Batch 180/859 | Cost: 0.2576 Epoch: 013/050 | Batch 200/859 | Cost: 0.2324 Epoch: 013/050 | Batch 220/859 | Cost: 0.3261 Epoch: 013/050 | Batch 240/859 | Cost: 0.1729 Epoch: 013/050 | Batch 260/859 | Cost: 0.2487 Epoch: 013/050 | Batch 280/859 | Cost: 0.2167 Epoch: 013/050 | Batch 300/859 | Cost: 0.2190 Epoch: 013/050 | Batch 320/859 | Cost: 0.2697 Epoch: 013/050 | Batch 340/859 | Cost: 0.1866 Epoch: 013/050 | Batch 360/859 | Cost: 0.2858 Epoch: 013/050 | Batch 380/859 | Cost: 0.1310 Epoch: 013/050 | Batch 400/859 | Cost: 0.2919 Epoch: 013/050 | Batch 420/859 | Cost: 0.2359 Epoch: 013/050 | Batch 440/859 | Cost: 0.2422 Epoch: 013/050 | Batch 460/859 | Cost: 0.1740 Epoch: 013/050 | Batch 480/859 | Cost: 0.3332 Epoch: 013/050 | Batch 500/859 | Cost: 0.1935 Epoch: 013/050 | Batch 520/859 | Cost: 0.2015 Epoch: 013/050 | Batch 540/859 | Cost: 0.3299 Epoch: 013/050 | Batch 560/859 | Cost: 0.2693 Epoch: 013/050 | Batch 580/859 | Cost: 0.1852 Epoch: 013/050 | Batch 600/859 | Cost: 0.1970 Epoch: 013/050 | Batch 620/859 | Cost: 0.2324 Epoch: 013/050 | Batch 640/859 | Cost: 0.2552 Epoch: 013/050 | Batch 660/859 | Cost: 0.2914 Epoch: 013/050 | Batch 680/859 | Cost: 0.3874 Epoch: 013/050 | Batch 700/859 | Cost: 0.1944 Epoch: 013/050 | Batch 720/859 | Cost: 0.2011 Epoch: 013/050 | Batch 740/859 | Cost: 0.2176 Epoch: 013/050 | Batch 760/859 | Cost: 0.2196 Epoch: 013/050 | Batch 780/859 | Cost: 0.1875 Epoch: 013/050 | Batch 800/859 | Cost: 0.2509 Epoch: 
013/050 | Batch 820/859 | Cost: 0.1501 Epoch: 013/050 | Batch 840/859 | Cost: 0.3301 Epoch: 013/050 Train Acc.: 90.98% | Validation Acc.: 87.56% Time elapsed: 2.38 min Epoch: 014/050 | Batch 000/859 | Cost: 0.3904 Epoch: 014/050 | Batch 020/859 | Cost: 0.2171 Epoch: 014/050 | Batch 040/859 | Cost: 0.2101 Epoch: 014/050 | Batch 060/859 | Cost: 0.1762 Epoch: 014/050 | Batch 080/859 | Cost: 0.1510 Epoch: 014/050 | Batch 100/859 | Cost: 0.3130 Epoch: 014/050 | Batch 120/859 | Cost: 0.2656 Epoch: 014/050 | Batch 140/859 | Cost: 0.3299 Epoch: 014/050 | Batch 160/859 | Cost: 0.1473 Epoch: 014/050 | Batch 180/859 | Cost: 0.2145 Epoch: 014/050 | Batch 200/859 | Cost: 0.2168 Epoch: 014/050 | Batch 220/859 | Cost: 0.1747 Epoch: 014/050 | Batch 240/859 | Cost: 0.2867 Epoch: 014/050 | Batch 260/859 | Cost: 0.4023 Epoch: 014/050 | Batch 280/859 | Cost: 0.1599 Epoch: 014/050 | Batch 300/859 | Cost: 0.2580 Epoch: 014/050 | Batch 320/859 | Cost: 0.2027 Epoch: 014/050 | Batch 340/859 | Cost: 0.3845 Epoch: 014/050 | Batch 360/859 | Cost: 0.2749 Epoch: 014/050 | Batch 380/859 | Cost: 0.2960 Epoch: 014/050 | Batch 400/859 | Cost: 0.1555 Epoch: 014/050 | Batch 420/859 | Cost: 0.2515 Epoch: 014/050 | Batch 440/859 | Cost: 0.2152 Epoch: 014/050 | Batch 460/859 | Cost: 0.1245 Epoch: 014/050 | Batch 480/859 | Cost: 0.3035 Epoch: 014/050 | Batch 500/859 | Cost: 0.2258 Epoch: 014/050 | Batch 520/859 | Cost: 0.2425 Epoch: 014/050 | Batch 540/859 | Cost: 0.1888 Epoch: 014/050 | Batch 560/859 | Cost: 0.2301 Epoch: 014/050 | Batch 580/859 | Cost: 0.2281 Epoch: 014/050 | Batch 600/859 | Cost: 0.2417 Epoch: 014/050 | Batch 620/859 | Cost: 0.2429 Epoch: 014/050 | Batch 640/859 | Cost: 0.2469 Epoch: 014/050 | Batch 660/859 | Cost: 0.1591 Epoch: 014/050 | Batch 680/859 | Cost: 0.3132 Epoch: 014/050 | Batch 700/859 | Cost: 0.2085 Epoch: 014/050 | Batch 720/859 | Cost: 0.2338 Epoch: 014/050 | Batch 740/859 | Cost: 0.2778 Epoch: 014/050 | Batch 760/859 | Cost: 0.3075 Epoch: 014/050 | Batch 780/859 | Cost: 0.2722 Epoch: 014/050 | Batch 800/859 | Cost: 0.4057 Epoch: 014/050 | Batch 820/859 | Cost: 0.2320 Epoch: 014/050 | Batch 840/859 | Cost: 0.3540 Epoch: 014/050 Train Acc.: 90.27% | Validation Acc.: 86.68% Time elapsed: 2.55 min Epoch: 015/050 | Batch 000/859 | Cost: 0.1474 Epoch: 015/050 | Batch 020/859 | Cost: 0.1883 Epoch: 015/050 | Batch 040/859 | Cost: 0.3168 Epoch: 015/050 | Batch 060/859 | Cost: 0.2552 Epoch: 015/050 | Batch 080/859 | Cost: 0.2602 Epoch: 015/050 | Batch 100/859 | Cost: 0.3158 Epoch: 015/050 | Batch 120/859 | Cost: 0.1998 Epoch: 015/050 | Batch 140/859 | Cost: 0.2294 Epoch: 015/050 | Batch 160/859 | Cost: 0.2217 Epoch: 015/050 | Batch 180/859 | Cost: 0.1726 Epoch: 015/050 | Batch 200/859 | Cost: 0.3084 Epoch: 015/050 | Batch 220/859 | Cost: 0.1465 Epoch: 015/050 | Batch 240/859 | Cost: 0.2315 Epoch: 015/050 | Batch 260/859 | Cost: 0.1898 Epoch: 015/050 | Batch 280/859 | Cost: 0.1226 Epoch: 015/050 | Batch 300/859 | Cost: 0.2684 Epoch: 015/050 | Batch 320/859 | Cost: 0.5611 Epoch: 015/050 | Batch 340/859 | Cost: 0.1822 Epoch: 015/050 | Batch 360/859 | Cost: 0.2104 Epoch: 015/050 | Batch 380/859 | Cost: 0.2849 Epoch: 015/050 | Batch 400/859 | Cost: 0.2029 Epoch: 015/050 | Batch 420/859 | Cost: 0.1707 Epoch: 015/050 | Batch 440/859 | Cost: 0.3588 Epoch: 015/050 | Batch 460/859 | Cost: 0.2234 Epoch: 015/050 | Batch 480/859 | Cost: 0.1594 Epoch: 015/050 | Batch 500/859 | Cost: 0.2049 Epoch: 015/050 | Batch 520/859 | Cost: 0.2633 Epoch: 015/050 | Batch 540/859 | Cost: 0.1711 Epoch: 015/050 | Batch 560/859 | 
Cost: 0.1695 Epoch: 015/050 | Batch 580/859 | Cost: 0.1270 Epoch: 015/050 | Batch 600/859 | Cost: 0.3360 Epoch: 015/050 | Batch 620/859 | Cost: 0.3546 Epoch: 015/050 | Batch 640/859 | Cost: 0.1837 Epoch: 015/050 | Batch 660/859 | Cost: 0.1976 Epoch: 015/050 | Batch 680/859 | Cost: 0.2269 Epoch: 015/050 | Batch 700/859 | Cost: 0.2941 Epoch: 015/050 | Batch 720/859 | Cost: 0.2812 Epoch: 015/050 | Batch 740/859 | Cost: 0.3755 Epoch: 015/050 | Batch 760/859 | Cost: 0.2386 Epoch: 015/050 | Batch 780/859 | Cost: 0.2227 Epoch: 015/050 | Batch 800/859 | Cost: 0.2345 Epoch: 015/050 | Batch 820/859 | Cost: 0.1914 Epoch: 015/050 | Batch 840/859 | Cost: 0.1133 Epoch: 015/050 Train Acc.: 91.51% | Validation Acc.: 88.28% Time elapsed: 2.73 min Epoch: 016/050 | Batch 000/859 | Cost: 0.3705 Epoch: 016/050 | Batch 020/859 | Cost: 0.1768 Epoch: 016/050 | Batch 040/859 | Cost: 0.1995 Epoch: 016/050 | Batch 060/859 | Cost: 0.1291 Epoch: 016/050 | Batch 080/859 | Cost: 0.2208 Epoch: 016/050 | Batch 100/859 | Cost: 0.3217 Epoch: 016/050 | Batch 120/859 | Cost: 0.3195 Epoch: 016/050 | Batch 140/859 | Cost: 0.4541 Epoch: 016/050 | Batch 160/859 | Cost: 0.2485 Epoch: 016/050 | Batch 180/859 | Cost: 0.1780 Epoch: 016/050 | Batch 200/859 | Cost: 0.1817 Epoch: 016/050 | Batch 220/859 | Cost: 0.1495 Epoch: 016/050 | Batch 240/859 | Cost: 0.1743 Epoch: 016/050 | Batch 260/859 | Cost: 0.2879 Epoch: 016/050 | Batch 280/859 | Cost: 0.1905 Epoch: 016/050 | Batch 300/859 | Cost: 0.1118 Epoch: 016/050 | Batch 320/859 | Cost: 0.1483 Epoch: 016/050 | Batch 340/859 | Cost: 0.2706 Epoch: 016/050 | Batch 360/859 | Cost: 0.1307 Epoch: 016/050 | Batch 380/859 | Cost: 0.1512 Epoch: 016/050 | Batch 400/859 | Cost: 0.2918 Epoch: 016/050 | Batch 420/859 | Cost: 0.1589 Epoch: 016/050 | Batch 440/859 | Cost: 0.1207 Epoch: 016/050 | Batch 460/859 | Cost: 0.2981 Epoch: 016/050 | Batch 480/859 | Cost: 0.1669 Epoch: 016/050 | Batch 500/859 | Cost: 0.3287 Epoch: 016/050 | Batch 520/859 | Cost: 0.2353 Epoch: 016/050 | Batch 540/859 | Cost: 0.2736 Epoch: 016/050 | Batch 560/859 | Cost: 0.2683 Epoch: 016/050 | Batch 580/859 | Cost: 0.1573 Epoch: 016/050 | Batch 600/859 | Cost: 0.1608 Epoch: 016/050 | Batch 620/859 | Cost: 0.2590 Epoch: 016/050 | Batch 640/859 | Cost: 0.3828 Epoch: 016/050 | Batch 660/859 | Cost: 0.5349 Epoch: 016/050 | Batch 680/859 | Cost: 0.1393 Epoch: 016/050 | Batch 700/859 | Cost: 0.3159 Epoch: 016/050 | Batch 720/859 | Cost: 0.0806 Epoch: 016/050 | Batch 740/859 | Cost: 0.3981 Epoch: 016/050 | Batch 760/859 | Cost: 0.2579 Epoch: 016/050 | Batch 780/859 | Cost: 0.2224 Epoch: 016/050 | Batch 800/859 | Cost: 0.3861 Epoch: 016/050 | Batch 820/859 | Cost: 0.1884 Epoch: 016/050 | Batch 840/859 | Cost: 0.0956 Epoch: 016/050 Train Acc.: 92.21% | Validation Acc.: 88.12% Time elapsed: 2.91 min Epoch: 017/050 | Batch 000/859 | Cost: 0.1082 Epoch: 017/050 | Batch 020/859 | Cost: 0.2564 Epoch: 017/050 | Batch 040/859 | Cost: 0.2367 Epoch: 017/050 | Batch 060/859 | Cost: 0.1555 Epoch: 017/050 | Batch 080/859 | Cost: 0.2190 Epoch: 017/050 | Batch 100/859 | Cost: 0.2488 Epoch: 017/050 | Batch 120/859 | Cost: 0.1770 Epoch: 017/050 | Batch 140/859 | Cost: 0.2812 Epoch: 017/050 | Batch 160/859 | Cost: 0.3338 Epoch: 017/050 | Batch 180/859 | Cost: 0.1866 Epoch: 017/050 | Batch 200/859 | Cost: 0.1634 Epoch: 017/050 | Batch 220/859 | Cost: 0.1725 Epoch: 017/050 | Batch 240/859 | Cost: 0.1933 Epoch: 017/050 | Batch 260/859 | Cost: 0.2180 Epoch: 017/050 | Batch 280/859 | Cost: 0.2040 Epoch: 017/050 | Batch 300/859 | Cost: 0.2439 Epoch: 017/050 | 
Batch 320/859 | Cost: 0.1740 Epoch: 017/050 | Batch 340/859 | Cost: 0.3704 Epoch: 017/050 | Batch 360/859 | Cost: 0.2154 Epoch: 017/050 | Batch 380/859 | Cost: 0.2009 Epoch: 017/050 | Batch 400/859 | Cost: 0.2951 Epoch: 017/050 | Batch 420/859 | Cost: 0.2683 Epoch: 017/050 | Batch 440/859 | Cost: 0.3063 Epoch: 017/050 | Batch 460/859 | Cost: 0.3990 Epoch: 017/050 | Batch 480/859 | Cost: 0.1974 Epoch: 017/050 | Batch 500/859 | Cost: 0.2009 Epoch: 017/050 | Batch 520/859 | Cost: 0.1633 Epoch: 017/050 | Batch 540/859 | Cost: 0.1595 Epoch: 017/050 | Batch 560/859 | Cost: 0.2039 Epoch: 017/050 | Batch 580/859 | Cost: 0.1644 Epoch: 017/050 | Batch 600/859 | Cost: 0.0764 Epoch: 017/050 | Batch 620/859 | Cost: 0.3383 Epoch: 017/050 | Batch 640/859 | Cost: 0.1779 Epoch: 017/050 | Batch 660/859 | Cost: 0.1667 Epoch: 017/050 | Batch 680/859 | Cost: 0.3002 Epoch: 017/050 | Batch 700/859 | Cost: 0.2072 Epoch: 017/050 | Batch 720/859 | Cost: 0.1208 Epoch: 017/050 | Batch 740/859 | Cost: 0.2084 Epoch: 017/050 | Batch 760/859 | Cost: 0.1952 Epoch: 017/050 | Batch 780/859 | Cost: 0.1908 Epoch: 017/050 | Batch 800/859 | Cost: 0.2059 Epoch: 017/050 | Batch 820/859 | Cost: 0.2299 Epoch: 017/050 | Batch 840/859 | Cost: 0.3367 Epoch: 017/050 Train Acc.: 91.43% | Validation Acc.: 87.64% Time elapsed: 3.09 min Epoch: 018/050 | Batch 000/859 | Cost: 0.1022 Epoch: 018/050 | Batch 020/859 | Cost: 0.2095 Epoch: 018/050 | Batch 040/859 | Cost: 0.1974 Epoch: 018/050 | Batch 060/859 | Cost: 0.1820 Epoch: 018/050 | Batch 080/859 | Cost: 0.2626 Epoch: 018/050 | Batch 100/859 | Cost: 0.2241 Epoch: 018/050 | Batch 120/859 | Cost: 0.2876 Epoch: 018/050 | Batch 140/859 | Cost: 0.1160 Epoch: 018/050 | Batch 160/859 | Cost: 0.1946 Epoch: 018/050 | Batch 180/859 | Cost: 0.2698 Epoch: 018/050 | Batch 200/859 | Cost: 0.2164 Epoch: 018/050 | Batch 220/859 | Cost: 0.3277 Epoch: 018/050 | Batch 240/859 | Cost: 0.1644 Epoch: 018/050 | Batch 260/859 | Cost: 0.3755 Epoch: 018/050 | Batch 280/859 | Cost: 0.1989 Epoch: 018/050 | Batch 300/859 | Cost: 0.1532 Epoch: 018/050 | Batch 320/859 | Cost: 0.1736 Epoch: 018/050 | Batch 340/859 | Cost: 0.2813 Epoch: 018/050 | Batch 360/859 | Cost: 0.1636 Epoch: 018/050 | Batch 380/859 | Cost: 0.3248 Epoch: 018/050 | Batch 400/859 | Cost: 0.1178 Epoch: 018/050 | Batch 420/859 | Cost: 0.0981 Epoch: 018/050 | Batch 440/859 | Cost: 0.2316 Epoch: 018/050 | Batch 460/859 | Cost: 0.1119 Epoch: 018/050 | Batch 480/859 | Cost: 0.3150 Epoch: 018/050 | Batch 500/859 | Cost: 0.3086 Epoch: 018/050 | Batch 520/859 | Cost: 0.2000 Epoch: 018/050 | Batch 540/859 | Cost: 0.2596 Epoch: 018/050 | Batch 560/859 | Cost: 0.2016 Epoch: 018/050 | Batch 580/859 | Cost: 0.2014 Epoch: 018/050 | Batch 600/859 | Cost: 0.1967 Epoch: 018/050 | Batch 620/859 | Cost: 0.2489 Epoch: 018/050 | Batch 640/859 | Cost: 0.3054 Epoch: 018/050 | Batch 660/859 | Cost: 0.2608 Epoch: 018/050 | Batch 680/859 | Cost: 0.3598 Epoch: 018/050 | Batch 700/859 | Cost: 0.1367 Epoch: 018/050 | Batch 720/859 | Cost: 0.1607 Epoch: 018/050 | Batch 740/859 | Cost: 0.3574 Epoch: 018/050 | Batch 760/859 | Cost: 0.1990 Epoch: 018/050 | Batch 780/859 | Cost: 0.2026 Epoch: 018/050 | Batch 800/859 | Cost: 0.1628 Epoch: 018/050 | Batch 820/859 | Cost: 0.1423 Epoch: 018/050 | Batch 840/859 | Cost: 0.0988 Epoch: 018/050 Train Acc.: 91.87% | Validation Acc.: 87.44% Time elapsed: 3.27 min Epoch: 019/050 | Batch 000/859 | Cost: 0.3737 Epoch: 019/050 | Batch 020/859 | Cost: 0.2727 Epoch: 019/050 | Batch 040/859 | Cost: 0.3580 Epoch: 019/050 | Batch 060/859 | Cost: 0.1375 
[per-batch Cost lines omitted; per-epoch summaries retained]
Epoch: 019/050 Train Acc.: 92.92% | Validation Acc.: 88.60% Time elapsed: 3.44 min
Epoch: 020/050 Train Acc.: 92.51% | Validation Acc.: 88.14% Time elapsed: 3.63 min
Epoch: 021/050 Train Acc.: 92.65% | Validation Acc.: 88.10% Time elapsed: 3.81 min
Epoch: 022/050 Train Acc.: 93.12% | Validation Acc.: 88.24% Time elapsed: 3.99 min
Epoch: 023/050 Train Acc.: 93.03% | Validation Acc.: 88.30% Time elapsed: 4.17 min
Epoch: 024/050 Train Acc.: 92.69% | Validation Acc.: 88.38% Time elapsed: 4.35 min
Epoch: 025/050 Train Acc.: 93.12% | Validation Acc.: 87.22% Time elapsed: 4.54 min
Epoch: 026/050 Train Acc.: 93.59% | Validation Acc.: 88.16% Time elapsed: 4.72 min
Epoch: 027/050 Train Acc.: 93.81% | Validation Acc.: 88.66% Time elapsed: 4.90 min
Epoch: 028/050 Train Acc.: 94.16% | Validation Acc.: 88.70% Time elapsed: 5.07 min
Epoch: 029/050 Train Acc.: 94.04% | Validation Acc.: 88.40% Time elapsed: 5.25 min
Epoch: 030/050 Train Acc.: 93.96% | Validation Acc.: 87.58% Time elapsed: 5.42 min
Epoch: 031/050 Train Acc.: 93.73% | Validation Acc.: 88.12% Time elapsed: 5.60 min
Epoch: 032/050 Train Acc.: 94.57% | Validation Acc.: 88.14% Time elapsed: 5.77 min
Epoch: 033/050 Train Acc.: 94.54% | Validation Acc.: 88.30% Time elapsed: 5.95 min
Epoch: 034/050 Train Acc.: 94.33% | Validation Acc.: 87.66% Time elapsed: 6.12 min
Epoch: 035/050 Train Acc.: 93.90% | Validation Acc.: 87.40% Time elapsed: 6.30 min
Epoch: 036/050 Train Acc.: 95.39% | Validation Acc.: 88.50% Time elapsed: 6.48 min
Epoch: 037/050 Train Acc.: 95.07% | Validation Acc.: 88.02% Time elapsed: 6.65 min
Epoch: 038/050 Train Acc.: 94.17% | Validation Acc.: 87.46% Time elapsed: 6.83 min
Epoch: 039/050 Train Acc.: 94.94% | Validation Acc.: 87.68% Time elapsed: 7.02 min
Epoch: 040/050 Train Acc.: 95.70% | Validation Acc.: 88.30% Time elapsed: 7.20 min
Epoch: 041/050 Train Acc.: 95.00% | Validation Acc.: 87.64% Time elapsed: 7.38 min
Epoch: 042/050 Train Acc.: 95.51% | Validation Acc.: 87.94% Time elapsed: 7.57 min
Epoch: 043/050 Train Acc.: 95.24% | Validation Acc.: 87.94% Time elapsed: 7.75 min
Epoch: 044/050 Train Acc.: 95.89% | Validation Acc.: 88.20% Time elapsed: 7.93 min
Epoch: 045/050 Train Acc.: 95.57% | Validation Acc.: 87.76% Time elapsed: 8.11 min
Epoch: 046/050 Train Acc.: 96.12% | Validation Acc.: 88.40% Time elapsed: 8.29 min
Epoch: 047/050 Train Acc.: 93.91% | Validation Acc.: 86.34% Time elapsed: 8.47 min
Epoch: 048/050 Train Acc.: 94.81% | Validation Acc.: 87.26% Time elapsed: 8.65 min
Epoch: 049/050 Train Acc.: 96.02% | Validation Acc.: 87.76% Time elapsed: 8.83 min
Epoch: 050/050 Train Acc.: 96.31% | Validation Acc.: 87.90% Time elapsed: 9.01 min
Total Training Time: 9.01 min
###Markdown
Evaluation (No Need To Change Any Code in This Section!)
###Code
plt.plot(range(1, NUM_EPOCHS+1), train_loss_lst, label='Training loss')
plt.plot(range(1, NUM_EPOCHS+1), valid_loss_lst, label='Validation loss')
plt.legend(loc='upper right')
plt.ylabel('Cross entropy')
plt.xlabel('Epoch')
plt.show()

plt.plot(range(1, NUM_EPOCHS+1), train_acc_lst, label='Training accuracy')
plt.plot(range(1, NUM_EPOCHS+1), valid_acc_lst, label='Validation accuracy')
plt.legend(loc='upper left')
plt.ylabel('Accuracy')
plt.xlabel('Epoch')
plt.ylim([80, 100])
plt.show()

model.eval()
with torch.set_grad_enabled(False):  # save memory during inference
    test_acc, test_loss = compute_accuracy_and_loss(model, test_loader, DEVICE)
    print(f'Test accuracy: {test_acc:.2f}%')
###Output
Test accuracy: 87.88%
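###Markdown
The cell above relies on a `compute_accuracy_and_loss(model, data_loader, device)` helper defined earlier in the notebook (outside this excerpt). As a point of reference only, the sketch below shows one plausible shape for such a helper, inferred purely from how it is called here: the argument list and the returned `(accuracy, loss)` pair are all the call site guarantees, while everything else (the model returning raw logits, cross-entropy as the loss) is an assumption, not the notebook's actual definition.
###Code
import torch
import torch.nn.functional as F

def compute_accuracy_and_loss(model, data_loader, device):
    # Assumed shape of the helper: accumulate correct predictions and summed
    # cross-entropy over the loader, then average over all examples.
    correct, num_examples, total_loss = 0, 0, 0.0
    for features, targets in data_loader:
        features, targets = features.to(device), targets.to(device)
        logits = model(features)  # assumption: the model returns raw logits
        total_loss += F.cross_entropy(logits, targets, reduction='sum').item()
        correct += (torch.argmax(logits, dim=1) == targets).sum().item()
        num_examples += targets.size(0)
    return correct / num_examples * 100.0, total_loss / num_examples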
PySpark/Spark Determine Column DataTypes.ipynb
###Markdown Functions ###Code import re import datetime import json import pandas as pd from pyspark.sql import functions as F from pyspark.sql.functions import col, lit, udf, pandas_udf from pyspark.sql.types import StringType def _renameDF_Columns(df): ORIGINAL_COLUMN_NAMES = df.columns NEW_COLUMN_NAMES = [str(x).upper().strip() for x in ORIGINAL_COLUMN_NAMES] for orig, new in zip(ORIGINAL_COLUMN_NAMES, NEW_COLUMN_NAMES): df = df.withColumnRenamed(orig, new) return df def intTryParse(value): value = str(value) res = re.search('e', str(value), re.IGNORECASE) if not res: try: int(value) return True except Exception: return isFloatWholeNumber(value) else: return False def floatTryParse(value): value = str(value) res = re.search('e', str(value), re.IGNORECASE) if not res: try: float(value) return True except ValueError: return False else: return False def isFloatWholeNumber(x): value = str(x) try: fval = float(value) return fval.is_integer() except Exception: return False def dateTryParse(s): s = str(s) isDate = False formats = ["%m/%d/%Y", "%Y-%m-%d", "%Y-%m-%d %H:%M:%S"] for f in formats: try: d = datetime.datetime.strptime(s, f) return True, d except ValueError: pass return isDate, None def isNull(val): if val == None: return True if pd.isnull(val): return True if str(val) == "nan": return True return False # returns the column datatype: float, int, date or string def getType(x): isContainsPeriod = False if str(x).find(".") != -1: isContainsPeriod = True if intTryParse(x) and not (str(x).startswith('0') and not isContainsPeriod and not (str(x) == '0')): return "int" if floatTryParse(x) and not (str(x).startswith('0') and not isContainsPeriod): return "float" isDate, dtConverted = dateTryParse(str(x)) if isDate: return "date" return "string" # The UDF emits one JSON string per input row, so the declared return type is StringType; # each input row holds the array of distinct values collected for one column. @pandas_udf(returnType=StringType()) def getType_pudf(ser): results = [] for values in ser: converted = [getType(v) for v in values] ltypes = list(set(converted)) # Determine length: max string length for string columns, max value for numeric ones if "string" in ltypes: maxLength = max(len(str(v)) for v in values) elif "date" not in ltypes: maxLength = max(values) else: maxLength = 0 # Determine nullability nullTypes = list(set(isNull(v) for v in values)) d = {} d["types"] = ltypes d["length"] = maxLength d["nullTypes"] = nullTypes results.append(json.dumps(d)) return pd.Series(results) def array_to_string_original(my_list): return '[' + ','.join([str(elem) for elem in my_list]) + ']' def array_to_string(my_list): #return ','.join([str(elem) for elem in my_list]) return str(my_list) array_to_string_udf = udf(array_to_string, returnType=StringType()) #Read DataSet (assumes a SQLContext named sqlContext is already available in the session) df = sqlContext.read.csv("1500000SalesRecords.csv",header=True, sep=",", inferSchema=False) #Strip Column Names and UpperCase all Columns df = _renameDF_Columns(df) #Print Columns df.head() #GET UNIQUE VALUES TO ARRAY IN A DATAFRAME..
COLUMN_NAMES = df.columns dfU = df.withColumn("Temp",lit("a")) exprs = [F.collect_set(colName) for colName in COLUMN_NAMES] dfU = dfU.groupby('Temp').agg(*exprs) dfU = dfU.drop("Temp") dfU = dfU.toDF(*COLUMN_NAMES) dfU.show(10) #apply the UDF to the grouped data dfTypes = dfU.select( *[getType_pudf(col(col_name)).name(col_name) for col_name in dfU.columns] ) #convert to string to allow printing dfTypes_cln = dfTypes.select( *[array_to_string_udf(col(col_name)).name(col_name) for col_name in dfTypes.columns] ) dfPandas = dfTypes_cln.toPandas() dfPandas d = dfPandas.to_dict() d v = d['TOTAL REVENUE'][0] print(v) jo = json.loads(v) jo['types'] # loop variable named col_name to avoid shadowing pyspark.sql.functions.col for col_name in d: jo = json.loads(d[col_name][0]) print(col_name + " " + str(jo["types"]) + " " + str(jo["length"])) dfTypes_cln.write.csv('type_df.csv',header=True) #Convert to Dictionary new_dict = dfTypes.toPandas().to_dict(orient='list') new_dict ###Output _____no_output_____
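###Markdown The per-column JSON produced above can be turned into a proper Spark schema. Below is a minimal sketch (not part of the original notebook): `pick_spark_type` is a hypothetical helper that maps the `types` strings emitted by `getType_pudf` to Spark types, defaulting to string whenever a column holds mixed types, and assuming the `d` dictionary built above. ###Code from pyspark.sql.types import StructType, StructField, StringType, IntegerType, DoubleType, DateType

def pick_spark_type(type_list):
    # widest type wins: any string forces StringType, then date, then float, then int
    if "string" in type_list:
        return StringType()
    if "date" in type_list:
        return DateType()
    if "float" in type_list:
        return DoubleType()
    return IntegerType()

schema = StructType([
    StructField(col_name, pick_spark_type(json.loads(d[col_name][0])["types"]), True)
    for col_name in d
])
print(schema) ###Output _____no_output_____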
Connect_Python_to_DB.ipynb
###Markdown ###Code #pip install mysql.connector import mysql.connector mydb = mysql.connector.connect( host="localhost", user="root", password="" ) mycursor = mydb.cursor() mycursor.execute("CREATE DATABASE CMG") import mysql.connector mydb = mysql.connector.connect(host="localhost", user="root", password="", database="morrisobiri" ) #print(mydb) mycursor = mydb.cursor() #mycursor.execute("CREATE TABLE Patient (ID varchar(50), P_Name varchar(100), DateofBirth varchar(50), Phonenumber varchar(50))") sql = "INSERT INTO Patient(ID, P_Name, DateofBirth, Phonenumber) values (%s, %s, %s, %s)" values = [(" ","Jane", "02-27-1980", "817-500-5587"), (" ","Peter", "01-23-1998", "214-365-3766"), (" ","Rose", "05-23-2008", "682-365-3766"), (" ","Lucas", "01-23-1988", "214-099-3766"), (" ","Jaque", "09-11-1986", "903-365-3766"), (" ","Amir", "07-08-2016", " "), (" ","James", "01-01-1983", "214-574-8458"), (" pppp","Judy", "05-01-1984", "473-365-3766"), ("tttt","Wesley", "03-01-2009", "142-859-4589")] mycursor.executemany(sql, values) mydb.commit() print(mycursor.rowcount, "records inserted") #for morris in mycursor: # print(morris) mycursor = mydb.cursor() mycursor.execute("SELECT * FROM Patient") myresults = mycursor.fetchall() for morris in myresults: print(morris) ###Output _____no_output_____
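###Markdown A minimal follow-up sketch (assuming the connection above is still open): a parameterized lookup on the same table. Passing values as a parameter tuple instead of formatting them into the SQL string avoids SQL injection. ###Code sql = "SELECT P_Name, DateofBirth FROM Patient WHERE P_Name = %s"
mycursor.execute(sql, ("Jane",))  # parameters are passed separately, never concatenated
for row in mycursor.fetchall():
    print(row) ###Output _____no_output_____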
RBM/reversemap_testing.ipynb
###Markdown Nov 16: Reverse map testing ###Code import matplotlib as mpl import matplotlib.pyplot as plt import numpy as np import os from os import sep from scipy.linalg import qr from data_process import image_data_collapse # Note: bigruns, NOTEBOOK_OUTDIR and DIR_MODELS are assumed to be defined elsewhere in the project (e.g. its settings module). def rebuild_R_from_xi_image(xi_image): xi_collapsed = image_data_collapse(xi_image) Q, R = qr(xi_collapsed, mode='economic') return Q, R def plot_basis_candidate(xcol, idx, outdir, label=''): cmap='seismic_r' norm = mpl.colors.DivergingNorm(vcenter=0.) # renamed TwoSlopeNorm in matplotlib >= 3.2 plt.figure() plt.imshow(xcol.reshape((28, 28)), cmap=cmap, norm=norm) # turn off labels ax = plt.gca() ax.grid(False) ax.set_xticklabels([]) ax.set_yticklabels([]) ax.tick_params(axis='both', which='both', length=0) # colorbar plt.colorbar() #plt.title('Basis example: %d %s' % (idx, label)) plt.savefig(outdir + os.sep + 'basis_example_%d%s.jpg' % (idx, label)) plt.close() def plot_basis_candidate_fancy(xcol, idx, outdir, label=''): # generate masked xcol for discrete cmap # ref: https://stackoverflow.com/questions/53360879/create-a-discrete-colorbar-in-matplotlib # v <= -1.5 = orange # -1.5 < v < -0.5 = light orange # -0.5 < v < 0.5 = grey # 0.5 < v < 1.5 = light blue # v > 1.5 = blue cmap = mpl.colors.ListedColormap(["firebrick", "salmon", "lightgrey", "deepskyblue", "mediumblue"]) norm = mpl.colors.BoundaryNorm(np.arange(-2.5, 3), cmap.N) # clip the extreme values (copy so the caller's array is not mutated) xcol_clipped = xcol.copy() xcol_clipped[xcol_clipped > 1.5] = 2 xcol_clipped[xcol_clipped < -1.5] = -2 img = xcol_clipped.reshape((28, 28)) # plot prepped image plt.figure() ims = plt.imshow(img, cmap=cmap, norm=norm) # turn off labels ax = plt.gca() ax.grid(False) ax.set_xticklabels([]) ax.set_yticklabels([]) ax.tick_params(axis='both', which='both', length=0) # colorbar plt.colorbar(ims, ticks=np.linspace(-2, 2, 5)) #plt.title(r'$Basis example: %d %s$' % (idx, label)) plt.savefig(outdir + os.sep + 'basis_example_%d%s.jpg' % (idx, label)) plt.close() def plot_error_timeseries(error_timeseries, outdir, label='', ylim=None): plt.plot(error_timeseries) plt.xlabel('iteration') #plt.ylabel(r'$||Wx - tanh(\beta Wx)||^2$') plt.title('Error over gradient updates %s' % (label)) print('error_timeseries min/max', np.min(error_timeseries), np.max(error_timeseries)) if ylim is None: plt.savefig(outdir + os.sep + 'error_%s.jpg' % (label)) else: plt.ylim(ylim) plt.savefig(outdir + os.sep + 'error_%s_ylim.jpg' % (label)) plt.close() def binarize_search_as_matrix_NB(weights, outdir, num_steps=200, beta=100, noise=0, init=None): # search for (p x p) X such that W*X is approximately binary (N x p matrix) # condition for binary: W*X = sgn(W*X) # soften the problem as W*X = tanh(beta W*X) # define error E = W*X - tanh(beta W*X) # perform gradient descent on ||W*X - tanh(beta W*X)||^2 = tr(E * E^T) # speedups and aliases N, p = weights.shape WTW = np.dot(weights.T, weights) def get_err(err_matrix): err = np.trace( np.dot(err_matrix, err_matrix.T) ) return err def build_overlaps(X): overlaps = np.zeros((p, p)) for i in range(p): for j in range(p): overlaps[i,j] = np.dot(X[:,i], np.dot(WTW, X[:,j])) return overlaps def gradient_search(X, num_steps=num_steps, eta=2*1e-2, noise=noise, plot_all=True): # note eta may need to be prop.
to beta; 0.1 worked with beta 200 # performs gradient descent for single basis vector # TODO idea for gradient feedback: add terms as basis formed corresponding to 'dot product with basis elements is small' err_timeseries = np.zeros(num_steps + 1) ALPHA = 1e-3 # lagrange mult for encouraging basis vector separation # large local output dir for gradient traj outdir_local = outdir + os.sep + 'num_details' if not os.path.exists(outdir_local): os.makedirs(outdir_local) def gradient_iterate(X, col_by_col=True): # gather terms WX = np.dot(weights, X) tanhu = np.tanh(beta * WX) err_matrix = WX - tanhu if col_by_col: for col in range(p): delta = 1 - tanhu[:, col] ** 2 factor_2 = err_matrix[:, col] - beta * err_matrix[:, col] * delta gradient = np.dot(2 * weights.T, factor_2) print('grad:', np.mean(gradient), np.linalg.norm(gradient), np.min(gradient), np.max(gradient)) # encourage separation of the near binary vectors (columns of W*X) # TODO look into this lagrange mult problem further """ #overlaps = build_overlaps(X) colsum = 0 for c in range(p): if c != col: colsum += X[:, c] # TODO weight them by their magnitude? alternate_obj = np.dot( WTW, colsum) """ # compute overall update (binarization gradient + separation gradient) noise_vec = np.random.normal(loc=0, scale=noise, size=p) # saw print(np.min(gradient * eta), np.max(gradient * eta)) in -1.5, 1.5 new_xcol = X[:, col] - gradient * eta + noise_vec # - ALPHA * alternate_obj #magA = np.linalg.norm(gradient) #print('A', magA, eta * magA) #magB = np.linalg.norm(alternate_obj) #print('B', magB, ALPHA * magB) # update X X[:, col] = new_xcol else: # compute gradient delta = 1 - tanhu ** 2 factor_2 = err_matrix - beta * err_matrix * delta gradient = np.dot(2 * weights.T, factor_2) X = X - gradient * eta return X, WX, err_matrix for idx in range(num_steps): X, WX, err_matrix = gradient_iterate(X, col_by_col=True) err_timeseries[idx] = get_err(err_matrix) if plot_all and idx % 10 == 0: for col in range(p): candidate = WX[:, col] plot_basis_candidate_fancy(candidate, col, outdir_local, '(iterate_%s_discrete)' % idx) plot_basis_candidate(candidate, col, outdir_local, '(iterate_%s)' % idx) # compute last element of error (not done in loop) WX = np.dot(weights, X) tanhu = np.tanh(beta * WX) err_matrix = WX - tanhu err_timeseries[num_steps] = get_err(err_matrix) print('ZEROTH err_timeseries') print(err_timeseries[0], err_timeseries[1], err_timeseries[-1]) return X, err_timeseries # initial guesses for candidate columns of R matrix if init is None: X = np.random.rand(p, p)*2 - 1 # draw from U(-1,1) else: assert init.shape == (p, p) X = init # perform num random searches for basis vector candidates x0 = X X_final, err_timeseries = gradient_search(x0) plot_error_timeseries(err_timeseries, outdir, 'traj') plot_error_timeseries(err_timeseries, outdir, 'traj', ylim=(-10, np.min(err_timeseries)*2)) WX_final = np.dot(weights, X_final) for idx in range(p): candidate = WX_final[:, idx] plot_basis_candidate_fancy(candidate, idx, outdir, 'final_fancy') plot_basis_candidate(candidate, idx, outdir, 'final') return X_final ############################################## # MAIN (load weights) ############################################## # THINK PATH TOO LONG FOR PYTHON - CHANGE HKEY run_num = 0 rundir = 'NOVEMBER_fig4_comparisons_alt_inits_p10_1000batch_earlysteps' subdir = 'hopfield_10hidden_0fields_2.00beta_1000batch_3epochs_20cdk_1.00E-04eta_1000ais_10ppEpoch' weights_fname = 'weights_10hidden_0fields_20cdk_1000stepsAIS_2.00beta.npz' objective_fname = 
'objective_10hidden_0fields_20cdk_1000stepsAIS_2.00beta.npz' iteration_idx = 0 """ run_num = 0 rundir = 'NOVEMBER_fig4_comparisons_alt_inits_p10_1000batch' subdir = 'hopfield_10hidden_0fields_2.00beta_1000batch_70epochs_20cdk_1.00E-04eta_0ais_1ppEpoch' weights_fname = 'weights_10hidden_0fields_20cdk_0stepsAIS_2.00beta.npz' objective_fname = 'objective_10hidden_0fields_20cdk_0stepsAIS_2.00beta.npz' iteration_idx = 0""" weights_path = bigruns + sep + 'rbm' + sep + rundir + sep + subdir + sep + 'run%d' % run_num + sep + weights_fname objective_path = bigruns + sep + 'rbm' + sep + rundir + sep + subdir + sep + 'run%d' % run_num + sep + objective_fname weights_obj = np.load(weights_path) weights_timeseries = weights_obj['weights'] objective_obj = np.load(objective_path) epochs = objective_obj['epochs'] iterations = objective_obj['iterations'] print('weights_timeseries.shape', weights_timeseries.shape) print('epochs', epochs) print('iterations', iterations) HIDDEN_UNITS = 10 from RBM_train import load_rbm_hopfield fname = 'hopfield_mnist_%d.npz' % HIDDEN_UNITS rbm = load_rbm_hopfield(npzpath=DIR_MODELS + os.sep + 'saved' + os.sep + fname) Q, R_star = rebuild_R_from_xi_image(rbm.xi_image) X_star = R_star print(X_star) ###Output weights_timeseries.shape (784, 10, 31) epochs [0 1 2 3] iterations [ 0. 6. 12. 18. 24. 30. 36. 42. 48. 54. 60. 66. 72. 78. 84. 90. 96. 102. 108. 114. 120. 126. 132. 138. 144. 150. 156. 162. 168. 174. 180.] LOADING: models\saved\hopfield_mnist_10.npz [[ 28. 13.57142857 16.28571429 17.42857143 15.42857143 19.28571429 16.92857143 16.42857143 17.57142857 16.71428571] [ 0. 24.49114792 12.86095665 13.04427987 13.0092818 11.68852138 11.7697669 14.57836184 12.63846894 13.9300643 ] [ 0. 0. 18.79817289 5.97937619 6.45929183 1.89341808 7.81648628 3.24272472 6.98439866 4.92824307] [ 0. 0. 0. 16.56317364 1.13400927 7.73328534 0.88710888 2.1807274 3.8116505 3.47266788] [ 0. 0. 0. 0. 18.26771312 3.14160582 6.36123174 7.30567223 3.45110735 11.88607013] [ 0. 0. 0. 0. 0. 14.21904607 2.83455098 1.0414347 5.05339441 3.18611413] [ 0. 0. 0. 0. 0. 0. 15.76425349 -2.30917977 1.36617823 0.30791071] [ 0. 0. 0. 0. 0. 0. 0. 15.05034309 2.97647807 4.07525107] [ 0. 0. 0. 0. 0. 0. 0. 0. 14.28393136 0.34463028] [ 0. 0. 0. 0. 0. 0. 0. 0. 0. -10.29503391]] ###Markdown Spectral dynamics of the weights ###Code # SHORT VERION: note epoch label is not correct if ppEpoch not 1 from weights_analysis import plot_weights_timeseries plot_weights_timeseries(weights_timeseries, NOTEBOOK_OUTDIR, mode='eval', extra=False) # use eval or minmax ###Output No handles with labels found to put in legend. 
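###Markdown As a quick cross-check (not part of the original run), the same spectral dynamics can be computed inline with plain NumPy, assuming `weights_timeseries` and `iterations` as loaded above: the singular values of the N x p weight matrix are tracked across the saved training iterations. ###Code # Singular values of the weight matrix at each saved iteration (one curve per singular value)
svals = np.array([np.linalg.svd(weights_timeseries[:, :, k], compute_uv=False)
                  for k in range(weights_timeseries.shape[2])])
plt.plot(iterations, svals)
plt.xlabel('iteration')
plt.ylabel('singular value')
plt.show() ###Output _____no_output_____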
###Markdown Reverse map analysis ###Code iteration_idx_pick = 0 noise = 0.0 X_guess = True num_steps = 20 beta = 2000 alt_names = False # some weights had to be run separately with different naming convention lowdin_approx = False ais_val = 1000 # load misc data to get initial transformation guess (R array if hopfield from QR) if X_guess: X0_guess = X_star else: X0_guess = None # choose weights to study weights = weights_timeseries[:, :, iteration_idx_pick] if lowdin_approx: print('Taking Lowdin approx of the weights') u, s, vh = np.linalg.svd(weights, full_matrices=False) print('Original singular values:\n', s) weights = u # analysis outdir = NOTEBOOK_OUTDIR + sep + 'hopfield_earlysteps_iter%d_star%d_num%d_beta%.2f_noise%.2f' % (iteration_idx_pick, X_guess, num_steps, beta, noise) if not os.path.exists(outdir): os.makedirs(outdir) # binarize_search(weights, outdir, num=10, beta=2000, init=X0_guess) # OLD WAY -- search vector by vector binarize_search_as_matrix_NB(weights, outdir, num_steps=num_steps, beta=beta, init=X0_guess, noise=noise) # NEW WAY - do gradient descent to search for p x p matrix at once ###Output grad: 2.503395119960236e-07 2.503395180042384e-06 -5.120903698767429e-14 2.5033951800423838e-06 grad: 6.173429143105484e-08 1.2046487956489045e-06 -4.852499802681153e-07 1.1025929337285582e-06 grad: 1.5864987128111008e-07 8.198666355553653e-07 -1.580645812955964e-08 6.288142418329561e-07 grad: 1.152446668752592e-07 1.3788839145461346e-06 -2.4983562682561136e-07 1.3515640342779379e-06 grad: 1.262624018979236e-07 1.2113095852772352e-06 -1.986461068047724e-07 1.137080182391824e-06 grad: 1.4425061616990488e-07 1.724069317517944e-06 -1.2954629498264326e-07 1.710013969159923e-06 grad: 1.460279224490208e-07 1.2461748686837059e-06 -1.9715061009585316e-07 1.1516242776373292e-06 grad: 1.2300223116520011e-07 1.2938417928947734e-06 -1.5234946400761276e-07 1.2289415455733655e-06 grad: 1.2948895678951552e-07 1.238336501466303e-06 -2.253165235210337e-07 1.186512730555029e-06 grad: 1.1243961164364211e-07 1.348688776764692e-06 -1.6015717518825902e-07 1.3145391697329249e-06 ###Markdown Troubleshooting ###Code print(weights_timeseries[:,:,25]) plt.imshow(weights_timeseries[:,9,0].reshape(28,28)) plt.colorbar() plt.show() plt.imshow(weights_timeseries[:,9,20].reshape(28,28)) plt.colorbar() plt.show() ###Output [[-0.04560887 -0.02603476 -0.01038936 ... -0.00289868 -0.00027312 -0.00337843] [-0.0455037 -0.02598116 -0.0103141 ... -0.00284622 -0.00022916 -0.0033882 ] [-0.04528877 -0.02576656 -0.0102866 ... -0.00279537 -0.00024024 -0.00337373] ... [-0.0452201 -0.02570138 -0.01025062 ... -0.00284492 -0.00028502 -0.00334642] [-0.04528904 -0.02583857 -0.01024055 ... -0.00288428 -0.0003354 -0.00334747] [-0.04534468 -0.02586195 -0.01030687 ... -0.0028759 -0.00027973 -0.00338763]]
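###Markdown For reference, the gradient used in `binarize_search_as_matrix_NB` follows directly from its objective. With $E = WX - \tanh(\beta WX)$ and $f(X) = \mathrm{tr}(EE^T)$, the chain rule gives

$$\nabla_X f = 2\,W^T\left(E - \beta\, E \odot \left(1 - \tanh^2(\beta WX)\right)\right),$$

which is exactly the `2 * weights.T` applied to `err_matrix - beta * err_matrix * delta` computed in `gradient_iterate`, with `delta = 1 - tanhu**2`.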
sistemas-de-recomendacion-steam-dataset.ipynb
###Markdown Recommender Systems Dataset: STEAM**To download the dataset, click [here](https://github.com/kang205/SASRec). There are two files: one with the ratings and one with information about the games.**As a starting point, we can assume that each entry is a link between a user and an item, regardless of whether the review is good or bad. Part A - Data Exploration 1. CONVERTING THE JSON FILES TO CSV ###Code import gzip import pandas as pd def parse(path): g = gzip.open(path, 'r') for l in g: yield eval(l) ###Output _____no_output_____ ###Markdown 1.1 REVIEWS ###Code contador = 0 data_reviews = [] # Keep one out of every 10 reviews so we don't fill up the RAM. n = 10 for l in parse('steam_reviews.json.gz'): if contador%n == 0: data_reviews.append(l) else: pass contador += 1 data_reviews = pd.DataFrame(data_reviews) data_reviews.head() data_reviews.to_csv('new_data_reviews.csv') ###Output _____no_output_____ ###Markdown 1.2 GAMES ###Code data_games = [] for l in parse('steam_games.json.gz'): data_games.append(l) data_games = pd.DataFrame(data_games) data_games.head() data_games.to_csv('new_data_games.csv') ###Output _____no_output_____ ###Markdown 2. BRIEF DESCRIPTION OF STEAM __Steam__ is an online multi-platform game distribution system with around 75 million active users and roughly 172 million accounts in total, hosting more than 3,000 games, which makes it an ideal platform for the kind of work presented here. The dataset contains records for more than 3,200 games and applications. Steam is a digital video game distribution service from Valve. It launched as a standalone software client in September 2003 as a way for Valve to provide automatic updates for its games, and later expanded to include games from third-party publishers. Steam has also grown into a web-based and mobile online storefront. Based on **a game's popularity, the similarity of game descriptions, game quality, and the player's preference for the game**, the corresponding games are recommended to each player so that Steam achieves a higher degree of customer satisfaction. 3. EXPLORATORY DATA ANALYSIS 1. __Import the libraries__ needed for the assignment. ###Code import numpy as np import matplotlib.pyplot as plt import seaborn as sns sns.set() import pandas as pd import gc # garbage collector from surprise import Dataset from surprise import Reader from surprise.model_selection import train_test_split ###Output _____no_output_____ ###Markdown 2. __Load the dataset__ using the functionality of Pandas. __DATA REVIEW__ ###Code new_data_reviews = pd.read_csv('new_data_reviews.csv') new_data_reviews.shape # Rows and columns ###Output _____no_output_____ ###Markdown * *The dataset has **779,307 rows** and **13 columns**.* ###Code new_data_reviews.head(3) # First 3 instances (rows) ###Output _____no_output_____ ###Markdown __DATA GAMES__ ###Code new_data_games = pd.read_csv('new_data_games.csv') new_data_games.shape # Rows and columns ###Output _____no_output_____ ###Markdown * *The dataset has **32,135 rows** and **17 columns**.* ###Code new_data_games.head(3) # First 3 instances (rows) ###Output _____no_output_____ ###Markdown 3. __Missing values:__ print the column names and how many missing values each column has. For now this is purely descriptive, since at this point we will not discard any of them or impute data. __DATA REVIEW__ ###Code new_data_reviews.isnull().sum() # Column names and their number of missing values ###Output _____no_output_____ ###Markdown * *Variables with missing values:* *1. `compensation` **98%** (764,719);* *2. `found_funny` **85%** (659,143);* *3. `user_id` 59% (461,967);* *4. `hours` 0.3% (2,637);* *5. `text` 0.2% (1,839);* *6. `product` 0.2% (1,566).* __DATA GAMES__ ###Code new_data_games.isnull().sum() # Column names and their number of missing values ###Output _____no_output_____ ###Markdown * *Almost every variable has missing values. The main ones are:* *1. `discount_price` **99%** (31,910);* *2. `metascore` **92%** (29,528);* *3. `publisher` 25% (8,062);* *4. `sentiment` 22% (7,182);* *5. `developer` 10% (3,299);* *6. `genres` 10% (3,283).* * *Note that the `id` column has only 2 missing values.** *`metascore` refers to the mean of all the reviews received for a given game.* 4. Reviews based on the __sentiment__ rating. ###Code pd.unique(new_data_games['sentiment']) print(new_data_games['sentiment'].value_counts()) print(new_data_games['sentiment'].value_counts().sum()) ###Output 24953 ###Markdown * *In total we have 24,953 reviews based on `sentiment`.* ###Code sns.countplot(data = new_data_games, y = 'sentiment', order = new_data_games['sentiment'].value_counts().index, palette='pastel') plt.title('Number of Ratings by Type') ###Output _____no_output_____ ###Markdown * *We can see that the `sentiment` rating splits the games into different review labels such as Very Positive, Positive, Negative, Very Negative, etc.** *However, it also contains labels ranging from 1 user reviews to 9 user reviews, which are hard to rank since there is no natural order for them.* 5. Reviews based on the __metascore__ rating. ###Code pd.unique(new_data_games['metascore']) print(new_data_games['metascore'].value_counts()) print(new_data_games['metascore'].value_counts().sum()) ###Output 2607 ###Markdown * *In total we have 2,607 reviews based on the `metascore` rating.* ###Code plt.figure(figsize = (15,13)) sns.countplot(data = new_data_games, y = 'metascore', order = new_data_games['metascore'].value_counts().index, palette='pastel') plt.title('Number of Ratings by Type') ###Output _____no_output_____ ###Markdown * *Although `metascore` looks like a good way to score the games, the number of available ratings is really low relative to the full `reviews` dataset.** *In total we have 2,607 `metascore` ratings, which represents about 0.3% of the reviews dataset.* 4. DATA PREPARATION AND TRANSFORMATION FOR COLLABORATIVE RECOMMENDATION * **Collaborative filtering** methods build a model from users' past behavior (items previously purchased, movies watched and rated, etc.) and the decisions made by the current user and others. This model is then used to predict items (or item ratings) the user may be interested in. * Advantages: we do not need any information about the products. * Disadvantages: we need the utility matrix (which is very sparse), and filling it is costly in time and money. * To implement it, we need a dataset where each row represents a `user`, a `game`, and the `user's rating` for that game, i.e. triples of three components. Other information can be useful, but with those three pieces of data we can already implement a collaborative filter. THE PARTICULAR CASE OF STEAM * The Steam website keeps no record of continuous ratings from its users. On the platform, users only give a "Recommended" or "Not Recommended", i.e. binary positive/negative reviews; there is still no mechanism for continuous ratings such as one star to five stars. * To obtain continuous ratings for the interaction between users and games, we must assume a rating mechanism for how users score games. Since users' engagement with a game can be gauged by their `playtime`, we can assume playtime is fairly persuasive information about users' interests. * Therefore, __here we assume that `playtime` is a very important signal of interest.__ 4.1 DATA REVIEWS 1. We select the **features that will be useful** for making predictions. * *As explained above, in this case we keep the `username`, `product_id`, and `hours` features, since they are the ones we need to build our collaborative filter.* ###Code df = pd.read_csv('new_data_reviews.csv', dtype={'hours': float, 'product_id': int}) print(df.shape) df1 = df[['username','hours','product_id']] df1 ###Output _____no_output_____ ###Markdown * *We keep the 3 features indicated above.* 2. __Missing values:__ inspection and treatment. ###Code df1.isnull().sum() ###Output _____no_output_____ ###Markdown * *Missing values represent less than 0.3% of all instances, so we drop them, since they should not introduce any major distortion in the dataset.* ###Code df2 = df1.dropna() df2.isnull().sum() print(df2.shape) ###Output (776652, 3) ###Markdown 3. __Outliers:__ inspection and treatment of atypical values in `hours`, which we will later use to build the ratings. ###Code plt.figure(figsize = (6,4)) sns.boxplot(data = df2, y = 'hours', palette= 'pastel') plt.title('Hours Played per User per Game') plt.ticklabel_format(axis = 'y', style = 'plain') ###Output _____no_output_____ ###Markdown * *We discard the outliers in `hours`, in this case the values above 21,000 hours of play.* ###Code mask_hours = (df2['hours'] <= 21000) df3 = df2[mask_hours] df3['hours'].describe() print(df3.shape) ###Output (776650, 3) ###Markdown * *The **final Reviews dataset we will work with** represents approximately **99.7% of the originally downloaded dataset**.* 4. __Encoders:__ applying a LabelEncoder to `username`.
* *Each user name is **unique**, which is reflected in the `username` feature of our dataset.** *Although Surpr!se can work with features as they are, **we decide to assign an Id to each unique user** to make future comparisons easier.* ###Code from sklearn.preprocessing import LabelEncoder le = LabelEncoder() df3['username'] = le.fit_transform(df3['username']) print(df3['username']) df3 ###Output _____no_output_____ ###Markdown * *As seen in the `username` column, each user is now represented by an Id number.* 5. __Building the ratings:__ constructing a `rating` from the hours played. * *For this study, the **number of hours each user played each game** determines the ratings.** *We define 5 scores **ranging from 1 to 5**, based on the distribution of the data (quintiles).** *This choice is made so that each score contains a similar number of values.* * *As a demonstration, here is the resulting concentration of data if we had simply taken the maximum number of hours, i.e. 20,573, and split it into 5 bins.* ###Code demo = df2.copy() bins = [0, 4114.6, 8229.2, 12343.8, 16458.4, 20573] labels =[1,2,3,4,5] demo['rating'] = pd.cut(demo['hours'], bins,labels=labels) demo['rating'].value_counts(normalize=True) fig = plt.figure() fig, ax = plt.subplots(figsize = (6,4)) plt.title('Number of Ratings per Rating Value') sns.countplot(data=demo, x ='rating') ###Output _____no_output_____ ###Markdown * *We see that virtually all the ratings would end up in **rating = 1**, because it concentrates **99.85%** of the data.* * *Now we **apply the chosen approach** of splitting the data into **quintiles**.* ###Code df3['rating'] = pd.qcut(df3.hours, 5, labels=['1', '2', '3', '4', '5']) print (df3) df3['rating'].value_counts(normalize=True) ###Output _____no_output_____ ###Markdown * *The data is distributed symmetrically across the scores.** *Next we plot it.* ###Code fig = plt.figure() fig, ax = plt.subplots(figsize = (6,4)) plt.title('Number of Ratings per Rating Value') sns.countplot(data=df3, x ='rating') pd.unique(df3['rating']) ###Output _____no_output_____ ###Markdown * *The values are categorical, ordered from lowest to highest.** *We convert them to integers so we can keep exploring the data.* ###Code df3['rating'] = df3['rating'].astype(int) ###Output <ipython-input-45-90c6f3196ed2>:1: SettingWithCopyWarning: A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy df3['rating'] = df3['rating'].astype(int) ###Markdown * *We drop the `hours` column, replacing it with the newly built `rating`.* ###Code final_reviews = df3[['username','product_id','rating']] final_reviews final_reviews.dtypes ###Output _____no_output_____ ###Markdown * *All features are int32, which allows more memory-efficient data processing.* * *We finally obtain the **Reviews dataset** with users, product Ids, and ratings, which we will use to keep exploring the data and its relationship with Data Games; it will also be the basis for training the chosen model and making the proposed game recommendations.* ###Code if True: final_reviews.to_csv('final_reviews.csv', index= False) # Save the modified dataset to a new file ###Output _____no_output_____ ###Markdown 4.2 DATA GAMES 1. We select the **features that will be useful** for making predictions. * *In this case the useful columns are `title`, to identify the names of the games, and the games' `id`, to join the data with Data Reviews, since the latter is the feature both datasets have in common.* ###Code df_titulo = pd.read_csv('new_data_games.csv', encoding = "ISO-8859-1", usecols = [4,13]) print(df_titulo.shape) df_titulo.head() ###Output (32135, 2) ###Markdown 2. We **swap** the columns and **rename** `id` so that it matches the Reviews dataset. ###Code df_titulo = df_titulo[['id','title']] df_titulo.head() df_new = df_titulo.rename(columns={'id':'product_id'}) df_new ###Output _____no_output_____ ###Markdown 3. __Missing values:__ inspection and treatment. ###Code df_new.isnull().sum() ###Output _____no_output_____ ###Markdown * *Missing values represent 6.4% of all instances.** *We drop all of them, since without `product_id` we cannot join the data with the Reviews dataframe, and without `title` we cannot make recommendations.* ###Code df_new_2 = df_new.dropna() df_new_2 df_new_2.isnull().sum() ###Output _____no_output_____ ###Markdown * *The **final Games dataset we will work with** represents approximately **93.6% of the original downloaded dataset**.* 4. Processing the `product_id` feature so it can be **set as the index** for joining with the Reviews dataset in Surpr!se. ###Code df_new_2.dtypes ###Output _____no_output_____ ###Markdown * *We need `product_id` to be an integer in order to index it.* ###Code df_new_2['product_id'] = df_new_2['product_id'].astype(int) ###Output <ipython-input-57-f03820844022>:1: SettingWithCopyWarning: A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy df_new_2[('product_id')] = df_new_2['product_id'].astype(int) ###Markdown * *Now we set the `product_id` column as the index.* ###Code df_title = df_new_2.set_index('product_id', drop=True) df_title df_title.dtypes ###Output _____no_output_____ ###Markdown * *The dtypes of `product_id` and `title` are now **object**, which is suitable for the recommendations we will make later.* 5. __Removing duplicate__ games. * *We drop the values duplicated in `product_id`, since there can only be one game with a given Id.* ###Code print(df_title.loc[612880]) df_title = df_title[~df_title.index.duplicated(keep='first')] df_title ###Output _____no_output_____ ###Markdown * *There was only 1 duplicated game.* * *We obtained the final **Games dataset**, with product Ids as the index, ready to be joined with Reviews and used for the same purposes as before: exploring the data and serving as the basis for training the chosen model and making the proposed game recommendations.* 5. EXPLORING THE BEHAVIOR OF THE DATA AND THE RELATIONSHIP BETWEEN BOTH DATASETS * *We ask several questions in order to perform a deeper exploratory analysis focused on our goal of recommending games.* 1. How many unique users are there? ###Code print(len(final_reviews['username'].unique())) ###Output 539030 ###Markdown * *539,030 users rated games.* 2. How many unique games are there? ###Code print(len(df_new_2['product_id'].unique())) # We use df_new_2, where product_id is not yet the index ###Output 30083 ###Markdown * *In total we will work with 30,083 games.* 3. How many ratings were given? ###Code final_reviews['rating'].shape ###Output _____no_output_____ ###Markdown * *There are a total of 776,650 ratings.* 4. We can get the name of a game from its `Id`. ###Code product_id = 4574 print(df_new_2.loc[product_id]) product_id = 27432 print(df_new_2.loc[product_id]) ###Output product_id 331930 title TerraTech R&amp;D Pack Name: 27432, dtype: object ###Markdown 5. How many games did each user rate? ###Code calificaciones_por_usuario = final_reviews.username.value_counts() calificaciones_por_usuario ###Output _____no_output_____ ###Markdown * *The first column is the user ID and the second is the number of ratings they gave.** *User 5442 gave the most ratings (played the most games), with a total of 213 ratings.* 6. What does the distribution of the number of ratings per user look like? ###Code calificaciones_por_usuario.hist(log = True) plt.xlabel('Number of Ratings') plt.ylabel('Number of Users') plt.title('Number of Ratings per User') plt.show() ###Output _____no_output_____ ###Markdown * *We see how many game ratings each user gave.** *Around 10% of the users account for most of the ratings (or games played).* 7. Which are the most popular games? How many ratings do they have? And the least popular ones? ###Code juegos_por_jugados = final_reviews.product_id.value_counts() juegos_por_jugados.index = df_title.loc[juegos_por_jugados.index].title juegos_por_jugados ###Output _____no_output_____ ###Markdown * *We count, for each game, how many ratings it has (for each game Id, how many entries there are).** *The game played by the largest number of users is Team Fortress 2, with 18,372 ratings.** *Five of the games played only once (least rated) are The Perks of Being a Wallflower, CitiesCorp Concept - Build Everything on Your Own, Island Racer, The Frost, and DP Animation Maker.* 8. What is the average rating of each game? * *First, we merge the reviews and games datasets.* ###Code game_data = pd.merge(final_reviews, df_title, on='product_id') game_data ###Output _____no_output_____ ###Markdown * *We look at the top 5 best-rated games, in descending order.* ###Code game_data.groupby('title')['rating'].mean().sort_values(ascending=False).head() ###Output _____no_output_____ ###Markdown * *There is a problem, though. A game can reach the top of the list above even if only a single user gave it five stars, so the statistics above can be misleading. Normally, a game that is really good gets a high rating from a large number of users.** *Now let's look again at the total number of ratings per game:* ###Code game_data.groupby('title')['rating'].count().sort_values(ascending=False).head() ###Output _____no_output_____ ###Markdown * *Now that we know that both the average rating per game and the number of ratings per game are important attributes, we will create a new dataframe containing both.** *We create a new dataframe called `ratings_mean_count` and first add the average rating of each game:* ###Code ratings_mean_count = pd.DataFrame(game_data.groupby('title')['rating'].mean()) ###Output _____no_output_____ ###Markdown * *Next, we add each game's number of ratings to the dataframe of mean rating counts.* ###Code ratings_mean_count['rating_counts'] = pd.DataFrame(game_data.groupby('title')['rating'].count()) ratings_mean_count.head() ###Output _____no_output_____ ###Markdown * *We can see the title of each game, along with its average rating and number of ratings.* ###Code plt.figure(figsize=(7,5)) plt.rcParams['patch.force_edgecolor'] = True ratings_mean_count['rating'].hist(bins=50) plt.xlabel('Average Rating') plt.ylabel('Number of Games') plt.title('Number of Games per Average Rating') ###Output _____no_output_____ ###Markdown * *In our case, games with a larger number of ratings also tend to have a low average rating.* Part B - Machine Learning Model Recommender systems are among the most popular applications of data science today. They are used to predict the "rating" or "preference" a user would give to an item. Almost every major tech company applies them in some form: Amazon uses them to suggest products to customers, YouTube to decide which video to autoplay next, and Facebook to recommend pages to like and people to follow.
__Collaborative filtering engines:__ these widely used systems try to predict the rating or preference a user would give to an item based on the past ratings and preferences of other users. Collaborative filters do not require item metadata, unlike their content-based counterparts. There are __3 approaches:__ * User-user collaborative filtering; * Item-item collaborative filtering; and * Matrix factorization. *We will work with __Surpr!se__, both for the benchmark and for the chosen SVD model, to build and analyze our recommender systems, using explicit rating data.* 1. BENCHMARK MODEL: ITEM-ITEM COLLABORATIVE FILTER + KNN BASIC 1.1 FINDING SIMILARITIES BETWEEN GAMES __ITEM-ITEM (GAME-GAME) COLLABORATIVE FILTERING__ Advantages: * The recommender does not need frequent retraining even when user preferences change. * It is computationally cheaper since, in many cases, there are many more users than items; item-based filtering makes sense in that situation. A famous example of item-based filtering is Amazon's recommendation engine. * *In this analysis we will use the **correlation between a game's ratings** as the **similarity metric**.* * *We will use the dataset merged in the previous section **(game_data)**, since it combines the selected features of the reviews and games datasets in a single frame.** *We will discard users in order to shrink the database, in an **ad-hoc** way.** *We discard users who rate very little (fewer than 5 ratings) or very much (more than 5,000 ratings).* ###Code mask_usuarios_descartables = np.logical_or(game_data.username.value_counts() <= 5, game_data.username.value_counts() > 5000) usuarios_descartables = mask_usuarios_descartables[mask_usuarios_descartables].index.values print(len(usuarios_descartables)) mascara_descartables = game_data.username.isin(usuarios_descartables) print(mascara_descartables.sum()) print(game_data.shape) game_data = game_data[~mascara_descartables] print(game_data.shape) ###Output (776650, 4) (98787, 4) ###Markdown * *We will also discard games with few ratings (fewer than 100), to shrink the utility matrix even further.* ###Code mask_items_descartables = game_data.product_id.value_counts() <= 100 # mask_items_descartables items_descartables = mask_items_descartables[mask_items_descartables].index.values # items_descartables print(len(items_descartables)) mascara_descartables = game_data.product_id.isin(items_descartables) print(mascara_descartables.sum()) print(game_data.shape) game_data = game_data[~mascara_descartables] print(game_data.shape) ###Output (98787, 4) (98787, 4) ###Markdown * *To find the **correlation between game ratings**, we need to build a matrix where each column is a game title and each row contains the rating a specific user gave to that game.** *This matrix will have **many null values**, since not every game is rated by every user.** *We build the matrix of game titles and the corresponding user ratings.* ###Code user_game_rating = game_data.pivot_table(index='username', columns='title', values='rating') user_game_rating ###Output _____no_output_____ ###Markdown * *Our matrix has 8,898 rows and 8,611 columns, which is much easier to work with.** *Each column contains all the user ratings for a particular game.* * *First, let's check which are the most popular games after the filters.* ###Code game_data.groupby('title')['rating'].count().sort_values(ascending=False).head() ###Output _____no_output_____ ###Markdown * *`Team Fortress 2` still has the most ratings, and `Rust` keeps second place.* * *Next, we take all the user ratings for `Team Fortress 2` and find the games similar to it.** *We chose this game because, as noted above, it has the largest number of ratings, and we want to find correlations between games that have many ratings.* ###Code team_fortress_2_ratings = user_game_rating['Team Fortress 2'] team_fortress_2_ratings.head() ###Output _____no_output_____ ###Markdown * *We now retrieve all the games that are similar to `Team Fortress 2`.** *We can compute the correlation between the user ratings of Team Fortress 2 and every other game using the corrwith() function, as shown below:* ###Code games_like_team_fortress_2 = user_game_rating.corrwith(team_fortress_2_ratings) corr_team_fortress_2 = pd.DataFrame(games_like_team_fortress_2, columns=['Correlation']) corr_team_fortress_2.dropna(inplace=True) corr_team_fortress_2.head() ###Output C:\Users\Ale\anaconda3\envs\ds\lib\site-packages\numpy\lib\function_base.py:2551: RuntimeWarning: Degrees of freedom <= 0 for slice c = cov(x, y, rowvar) C:\Users\Ale\anaconda3\envs\ds\lib\site-packages\numpy\lib\function_base.py:2480: RuntimeWarning: divide by zero encountered in true_divide c *= np.true_divide(1, fact) ###Markdown * *Now we sort the games in descending order of correlation, so that the most highly correlated games appear at the top.* ###Code corr_team_fortress_2.sort_values('Correlation', ascending=False).head() ###Output _____no_output_____ ###Markdown * *We can see that some games highly correlated with Team Fortress 2 are not well known.** *This shows that correlation alone is not a good similarity measure, because there may be a user who played Team Fortress 2 and just a few other games and rated them all 5.* * *One solution to this problem is to keep only the correlated games that have at least 50 ratings.* * *To do that, we add the rating_counts column of the ratings_mean_count_ dataframe to our corr_team_fortress_2 dataframe.* * *First, let's look at the average rating of each game.* ###Code ratings_mean_count_ = pd.DataFrame(game_data.groupby('title')['rating'].mean()) ###Output _____no_output_____ ###Markdown * *Then we add each game's number of ratings to the mean-rating-count dataframe.* ###Code ratings_mean_count_['rating_counts'] = pd.DataFrame(game_data.groupby('title')['rating'].count()) ratings_mean_count_.head() ###Output _____no_output_____ ###Markdown ###Code corr_team_fortress_2 = corr_team_fortress_2.join(ratings_mean_count_['rating_counts']) corr_team_fortress_2.head() ###Output _____no_output_____ ###Markdown * *Now we filter the games correlated with Team Fortress 2 that have more than 50 ratings.* ###Code corr_team_fortress_2[corr_team_fortress_2 ['rating_counts']>50].sort_values('Correlation', ascending=False).head() ###Output _____no_output_____ ###Markdown * *We can see the games that are highly correlated with Team Fortress 2.** *The games on the list are all first-person shooters, and since Team Fortress 2 is a very famous game in that genre, it is quite likely for these games to be highly correlated.** *We have thus created **a simple recommender system**.* 1.2. BENCHMARK MODEL: SURPRISE - KNN BASIC * *As the benchmark we choose a simple memory-based collaborative filtering algorithm, derived directly from a basic nearest-neighbors approach.* ###Code reader = Reader() from surprise import KNNBasic from surprise import Dataset from surprise import accuracy data = Dataset.load_from_df(game_data[['username','product_id','rating']], reader) trainset, testset = train_test_split(data, test_size=.25) # Build the algorithm and train it. algo_KNN = KNNBasic() algo_KNN.fit(trainset) predictions_KNN = algo_KNN.test(testset) accuracy.rmse(predictions_KNN) ###Output Computing the msd similarity matrix... Done computing similarity matrix. RMSE: 1.2378 ###Markdown * *Although the RMSE obtained is reasonable, we expect a better result from the chosen SVD model, developed in the next section.* 2. CHOSEN PREDICTIVE MODEL: COLLABORATIVE RECOMMENDATION + SVD MODEL 2.1 CHOSEN PREDICTIVE MODEL: MATRIX FACTORIZATION.
SVD MODEL WITH SURPR!SE * *The **chosen ML model is Singular Value Decomposition (SVD)**, based on matrix factorization.** *The **dataset used** for the model is **final_reviews**, which will later be joined with df_title to display the titles of the recommended games once the predictions are made.* * *We follow these steps:* * *Load the dataset.* * *The Reader was already applied earlier so that Surpr!se can read the dataset.* * *Create the Surpr!se dataset using `Dataset.load_from_df`.* * *Perform a train_test_split.* * *Train an SVD algorithm.* * *Fit on the `trainset`.* * *Predict on the `testset`.* * *For the `testset`, evaluate the RMSE between the predictions and the true ratings given to the games.* ###Code df_svd = pd.read_csv('final_reviews.csv') print(df_svd.shape) df_svd.head() N_filas = 100000 # Limit the dataset to N_filas rows data_svd = Dataset.load_from_df(df_svd[['username','product_id','rating']][:N_filas], reader) from surprise import SVD from surprise import Dataset from surprise import accuracy from surprise.model_selection import KFold # Define a cross-validation iterator kf = KFold(n_splits=3) algo = SVD() for trainset, testset in kf.split(data_svd): # Train and test the algorithm algo.fit(trainset) predictions = algo.test(testset) # Compute and print the RMSE accuracy.rmse(predictions, verbose=True) ###Output RMSE: 1.0874 RMSE: 1.0873 RMSE: 1.0945 ###Markdown * *We use RMSE to evaluate the results of our recommendations. As the dataset grows, the RMSE evaluations will improve.* * *In this case, the result is indeed better than KNN Basic's.* 2.2 HYPERPARAMETER OPTIMIZATION * *We run a hyperparameter optimization with GridSearch, a brute-force search over the hyperparameters of the SVD algorithm.* ###Code from surprise import SVD from surprise import Dataset from surprise.model_selection import GridSearchCV param_grid = {'n_factors':[5,25,50], 'n_epochs': [5,10,20], 'lr_all': [0.001,0.002,0.005], 'reg_all': [0.002, 0.02, 0.2]} gs = GridSearchCV(SVD, param_grid, measures=['rmse'], cv=3) gs.fit(data_svd) # Best RMSE score print(gs.best_score['rmse']) # Parameter combination that gives the best RMSE score print(gs.best_params['rmse']) ###Output 1.0807308991988 {'n_factors': 5, 'n_epochs': 20, 'lr_all': 0.005, 'reg_all': 0.02} ###Markdown * *Here we are evaluating the average RMSE in a 3-fold cross-validation procedure.* * *We can now use the algorithm that yields the best RMSE.** *We take the algorithm instance with the optimal parameter set and use it to make the predictions:* ###Code algo_gs = gs.best_estimator['rmse'] algo_gs.fit(data_svd.build_full_trainset()) predictions_gs = algo_gs.test(testset) accuracy.rmse(predictions_gs, verbose=True) ###Output RMSE: 0.9578 ###Markdown * *The RMSE obtained with the SVD model after hyperparameter optimization is clearly better than the benchmark's.* 2.3 MODEL COMPARISON * Performance comparison of the ML models used with Surpr!se. | Surprise model | RMSE | Hyperparameters used ||:--------------------:|:----:|:--------------------------------------------------------------:|| KNN Basic | 1.24 | max neighbors = 40 || SVD | 0.96 | n_factors = 5, n_epochs = 20, lr_all = 0.005, reg_all = 0.02 | * *__Matrix factorization with SVD__ performed best, and better still with optimized hyperparameters.** *This result was expected, since it is an efficient, easy-to-use algorithm that offers high performance and accuracy compared with other Surpr!se algorithms.** *While item-based (our benchmark) or user-based collaborative filtering methods are simple and intuitive, matrix factorization techniques are usually more effective because they let us discover the latent features underlying the interactions between users and items.** *To do this, SVD uses gradient descent to minimize the squared error between the predicted rating and the actual rating, ultimately yielding the best model (see the short NumPy sketch appended at the end of this notebook).* 2.4 MAKING PREDICTIONS - COLLABORATIVE RECOMMENDATION *We explore the characteristics of `predictions` and some of its elements.* 1. We make **a prediction for a particular user**. ###Code predictions_gs[1] ###Output _____no_output_____ ###Markdown * *uid = user Id.** *iid = game Id.** *r_ui = the (known) rating the user gave that particular game.** *est = the estimated rating (obtained from SVD).** *was_impossible: True = the rating could not be estimated // False = it could.* 2. We make **a prediction for a given user and game** (via the `predict` function). ###Code algo_gs.predict(50262, 35143) ###Output _____no_output_____ ###Markdown * *r_ui=None: the user did not rate that game.* 3. We look at the **best and worst predictions made**. ###Code # Copyright (c) Microsoft Corporation. All rights reserved. # Licensed under the MIT License. def invert_dictionary(dictionary): """Invert a dictionary Args: dictionary (dict): A dictionary Returns: dict: inverted dictionary """ return {v: k for k, v in dictionary.items()} def surprise_trainset_to_df(trainset, col_user="uid", col_item="iid", col_rating="rating"): df = pd.DataFrame(trainset.all_ratings(), columns=[col_user, col_item, col_rating]) map_user = trainset._inner2raw_id_users if trainset._inner2raw_id_users is not None else invert_dictionary(trainset._raw2inner_id_users) map_item = trainset._inner2raw_id_items if trainset._inner2raw_id_items is not None else invert_dictionary(trainset._raw2inner_id_items) df[col_user] = df[col_user].map(map_user) df[col_item] = df[col_item].map(map_item) return df trainset_df = surprise_trainset_to_df(trainset) ###Output _____no_output_____ ###Markdown * *Using Surpr!se, we build a dataframe with the users, games, and ratings.* ###Code trainset_df.head() def get_Iu(uid): """ return the number of items rated by given user args: uid: the id of the user returns: the number of items rated by the user """ try: return len(trainset.ur[trainset.to_inner_uid(uid)]) except ValueError: # user was not part of the trainset return 0 def get_Ui(iid): """ return number of users that have rated given item args: iid: the raw id of the item returns: the number of users that have rated the item.
""" try: return len(trainset.ir[trainset.to_inner_iid(iid)]) except ValueError: return 0 df = pd.DataFrame(predictions, columns=['uid', 'iid', 'rui', 'est', 'details']) df['Iu'] = df.uid.apply(get_Iu) df['Ui'] = df.iid.apply(get_Ui) df['err'] = abs(df.est - df.rui) best_predictions = df.sort_values(by='err')[:10] worst_predictions = df.sort_values(by='err')[-10:] ###Output _____no_output_____ ###Markdown * *Luego, podemos determinar el **error resultante** de las predicciones para combinación de usuario/juego, a partir de la diferencia existente entre la calificación real y la predicción realizada.* ###Code df.head() ###Output _____no_output_____ ###Markdown * *Las ordenamos de mayor a menor y de menor a mayor.* __MEJORES PREDICCIONES__ ###Code best_predictions.head() ###Output _____no_output_____ ###Markdown * *Se observan las mejores predicciones, donde el error es igual a 0.* __PEORES PREDICCIONES__ ###Code worst_predictions.head() ###Output _____no_output_____ ###Markdown * *Obtenemos las peores predicciones, donde el error es cercano a 4, es decir que se calificó 1 y se predijo cerca de 5 o viceversa.* 4. Finalmente, visualiamos el **Top-N de recomendaciones para cada usuario de un conjunto de predicciones.** ###Code def get_top_n(predictions, n=5): # Primero mapea las predicciones a cada usuario. top_n = defaultdict(list) for uid, iid, true_r, est, _ in predictions: top_n[uid].append((iid, est)) # Luego ordena las predicciones para cada usuario y trae las k highest ones. for uid, user_ratings in top_n.items(): user_ratings.sort(key=lambda x: x[1], reverse=True) top_n[uid] = user_ratings[:n] return top_n ###Output _____no_output_____ ###Markdown * *Imprimimos los artículos recomendados para cada usuario.* ###Code from collections import defaultdict top_n = get_top_n(predictions, n=5) for uid, user_ratings in top_n.items(): print(uid, [iid for (iid, _) in user_ratings]) ###Output 99515 [252490, 252490, 230410, 9900, 98800] 56654 [352220] 211719 [48700, 206420, 221100, 221100, 245170] 355569 [107410, 415200, 275670, 3730] 517422 [212680, 42960, 253110, 369000, 46490] 20626 [48700, 230410, 252490, 252490, 105450] 421379 [272270, 251170, 595960, 448670, 311730] 262238 [200710, 405900, 490080] 407584 [291650, 231430, 453090] 454345 [437880, 301050, 314250, 268220, 228960] 376649 [438740, 241600, 313120, 333300, 294810] 78364 [220, 345860, 259080, 423230, 261110] 301625 [4000, 440, 219990, 55230, 438740] 306059 [204880, 394510, 372000, 204180] 196307 [200710, 620, 630, 233740, 214190] 110334 [291550, 319630, 9050] 341590 [364470, 500, 220, 241600] 178986 [440, 440, 346900, 304050, 241560] 55545 [49520, 330830, 391420, 448440, 35720] 78955 [394360, 597150, 577940, 253410, 380] 288140 [403640, 213670] 72805 [301520, 298240, 417860] 161310 [363970, 246620, 266430] 222924 [343780, 286080, 286100, 359050] 152954 [268500, 268500, 304390, 363970, 262280] 505945 [4920, 15620, 32800, 329490] 125188 [230410, 374320, 252490, 440, 271590] 211311 [8930, 438790, 431240, 224920, 303210] 252189 [48220, 332310] 387080 [240, 3910, 63380, 286690, 354380] 65446 [281990, 346110, 271590, 377160, 230410] 360589 [460930, 252610, 7760, 422970] 218301 [336040, 12360, 375220] 351209 [4000, 271590, 440, 236430, 252490] 287265 [107100, 107100, 253920] 82755 [252490, 287450, 202170, 233130, 243120] 152015 [227300, 20, 220860, 1520] 191131 [365450, 67370] 269235 [219990, 115800, 521630] 424761 [4000, 252490, 200510, 620, 239070] 361956 [377160, 200210] 441751 [206420, 268650] 292812 [298110, 25850, 350310, 498050, 
251910] 485941 [105100, 40700] 453252 [61500, 12500, 260270, 284830, 392450] 265131 [239140, 252490, 240, 238320] 167604 [289070, 220200, 221100, 221100, 252490] 152978 [203140, 440900, 360430, 245170, 208090] 302600 [212680, 305620, 263280, 65740, 238530] 96384 [221100, 391540, 207610, 444640, 248710] 305233 [586620, 463060, 463220, 613210, 438020] 310517 [221100, 433850, 200710, 113200, 248390] 417457 [472830, 507050, 446040, 465130, 363940] 61026 [328680, 384190] 409627 [40390, 248860, 65600, 282440, 247730] 135003 [240760, 35140, 220, 295790, 22000] 21352 [382140, 372210] 342217 [15970, 32120, 301750, 20530, 215790] 172645 [252490, 252490, 252490, 440, 8930] 256477 [241600, 380] 369813 [301520, 201810, 269030] 538371 [453480, 365590, 550900, 40390, 50620] 40486 [355070, 487370, 343630, 284950, 485220] 524574 [440, 440, 730, 107410, 438740] 361743 [303290, 400170, 609990, 403510, 558340] 173696 [311210, 200210, 105450, 4500, 313120] 323272 [211820, 435150, 289070, 444090, 393420] 347055 [214490, 319630, 351100, 348160, 361930] 240234 [108710, 12520, 33730, 57300, 95700] 408164 [35140, 246620, 8190, 92800, 228440] 3962 [252490, 4000, 268500, 374320, 310950] 58610 [35450, 235600, 221040, 234330, 19900] 126482 [359550, 304930, 588430] 470503 [25900, 359920, 23450] 263476 [349250, 496810, 346730] 522678 [247080, 243780, 200170, 363600] 184422 [234650, 290340, 18820, 270210] 165282 [8930, 232090, 282800, 17470, 623940] 289265 [8930, 200710, 582160, 403640, 228300] 45783 [99900, 15100, 22120, 434570, 3483] 200353 [440] 18181 [242050, 232090, 205100, 201810, 425220] 273307 [346110, 4500] 321645 [227940, 287020, 280, 70, 7510] 296182 [201810, 620, 7670, 310370] 103202 [252490, 107410, 49520, 304390, 24980] 337508 [49520, 32370, 65930, 224540, 332310] 92926 [221100, 307780, 330840, 220260, 521150] 227908 [232090] 415213 [242680, 279540, 375820] 305154 [4000, 448370] 110139 [225540, 21640, 351900] 119997 [440, 281990, 394230] 153294 [236870, 279260, 45730, 415420] 365735 [284790, 533070, 486810, 321880, 657050] 83398 [8930, 21690, 365300] 108638 [203770, 245170, 489520] 411682 [107410, 297120, 502940] 228437 [262060, 306020, 220, 238320, 108800] 69940 [32370, 360430, 392110, 299720, 387290] 259305 [252490, 344760, 620] 385802 [108710, 375120, 224820, 366970] 251016 [24780, 266390] 26977 [257510, 263280, 70400, 110800, 449960] 507537 [344760, 39140, 7940, 204360, 350280] 469394 [620, 236090, 230330] 123418 [377160, 377160, 4000, 291650, 24960] 18667 [252490, 620, 22300, 212480] 258882 [230410, 10500, 71340, 498330, 214420] 476714 [361190, 253650] 332398 [394360, 211420, 386360, 359550, 440] 41518 [211820, 219890] 4386 [24980, 211600, 9050, 21980, 50300] 76846 [270210, 270450] 376846 [39140, 285840, 296910] 79439 [339800, 423120, 296050, 238910, 285050] 340109 [271590, 240, 306020] 265322 [346900, 238320] 296660 [534600, 221640, 387860] 169711 [271590, 49520, 49520, 372000, 385760] 72325 [22330, 411330, 9800, 206440, 404620] 276880 [268500, 41500, 225540, 252030, 232790] 441618 [233130, 225260, 285580] 188735 [440, 440, 255710, 110800, 265550] 216662 [34030, 301970, 270550] 203490 [346110, 298110, 259680, 235620, 409510] 52023 [230230, 39500, 12120, 57900] 277374 [239140, 488790, 60] 265061 [250760, 211260] 217252 [230410, 212070, 102700, 402020] 229246 [50130, 236090, 225280, 282100, 72200] 74307 [339610, 319630, 319630, 347710, 418340] 388054 [588430, 22370, 529770] 152132 [359550, 298110, 221040, 582160, 242920] 120607 [8930, 435150, 433850, 8980, 212680] 255397 [221380, 391540, 211260, 225080] 
147027 [262060] 78655 [363970, 271640, 438000, 462220, 321950] 31805 [50130, 253710] 93572 [220240, 22330, 6850, 212800] 129860 [299740, 230270] 363696 [252490, 431960, 476600, 22370] 145100 [289070, 4870, 206480, 17300] 164431 [49520, 252490, 252490, 394690, 301520] 4866 [202170, 356800, 375130, 481110, 277430] 95423 [440, 440, 394360, 294100, 374320] 320714 [107410, 464700] 350106 [209540] 32077 [107410, 22330] 63038 [233450, 365670, 388900] 407976 [377160, 273110, 424280] 447591 [316010, 65980, 17480, 277430] 364786 [262830, 274350, 285840, 281370] 81105 [203770, 236870, 330830, 227940, 366090] 50871 [268500, 247080, 324800, 22370, 581490] 177862 [240, 248820, 462770, 214770, 26800] 85045 [9420, 556180, 344480, 361850, 353640] 298295 [200510, 620, 236870, 110800, 283640] 324267 [289130, 240, 283640, 260210, 302830] 433862 [221100] 291470 [57690, 369000, 446780, 528300, 464760] 216450 [220240, 107200, 290080, 220, 41800] 520298 [32400, 207230] 150070 [40400, 21090, 417860] 516322 [105450, 23310, 284730, 431220, 657790] 265966 [460930, 418460, 333930, 252450, 431960] 34769 [252490, 433850, 307780, 242860, 395900] 247623 [467320, 206440, 429570] 208254 [316010, 240760, 233450, 319630, 466560] 145420 [225840, 208090, 253980] 74879 [12220, 71250, 205690, 249590, 219890] 409724 [209270, 224260, 361280, 273500, 427730] 195499 [301520, 8980, 361800] 19123 [377160, 377160, 730, 427520, 427520] 444819 [49520, 246420, 98200, 72200] 11618 [48700, 417860] 269253 [386940, 203290, 221640] 158873 [440, 234140, 57690, 12170, 200390] 161071 [385080, 399430, 405610, 355150, 360870] 27948 [346110, 4920, 209000, 32800] 381239 [200710] 280393 [377160, 45760, 304650, 372000, 113400] 43571 [8930, 242920, 241240] 145500 [274190, 241260, 477730, 441380, 303790] 164362 [300570, 314450] 432320 [271590, 221100, 289070, 310950, 504370] 233807 [268500, 276810, 282560, 268910, 268910] 169939 [242920] 188637 [6060, 70000] 169328 [377160, 6060, 227940, 239030] 436757 [49520, 63200, 20820] 258869 [211820, 8190] 313033 [271590, 256290, 391540, 218060, 313340] 432712 [424840, 313120, 63380] 161748 [282070, 345390] 72174 [385760, 325630, 404640, 485980] 489315 [201790, 206610, 355920, 339160, 253980] 402391 [4920, 32430] 26940 [107410, 447040, 653950, 376760, 286690] 63475 [359550, 262060, 232090, 447040, 344760] 255679 [208730, 24780, 6910, 209850, 72200] 44832 [271590, 48700, 291550, 234140, 319630] 224858 [252490, 211420, 233450, 489520] 269574 [271590, 40300, 528200, 12200, 357190] 351156 [346110, 335300, 219640, 214490, 346010] 407023 [386360, 305620, 6860, 200210] 148059 [440, 310950, 344760, 372000, 306130] 315808 [364470, 316600, 208200, 448910, 293260] 37013 [227300] 39807 [427520, 47890, 359550, 323470, 233450] 538186 [720670, 368710, 283230, 327410] 166127 [389730, 212680, 394690, 282800, 219150] 297313 [304390, 524220, 361420, 414950, 204450] 164953 [440, 730, 377160, 346110, 8930] 398787 [206420, 23310, 265300, 206440] 103972 [415200, 108600, 70, 307290] 288591 [19900, 316430, 4570, 322300] 275387 [440, 297920, 214560] 387970 [13520, 2620, 377470, 63380, 377980] 333516 [319630, 17080, 332310] 141833 [307780, 219640, 588430, 70] 247824 [98800, 242570, 40930, 209830, 321480] 385220 [247080, 391540] 321268 [237990] 256191 [221380, 248610, 214560, 262830, 448510] 282481 [252490, 6060, 416330] 377623 [230410, 201810, 38600] 530351 [492280, 447170, 344770, 629280, 575650] 47325 [220200, 33930, 224260, 657200] 314586 [213670, 493180, 268910, 318220, 206210] 66054 [211820, 391540, 42700, 238460, 381020] 490774 [317510, 
229600, 281370] 510964 [528610] 251402 [107410, 48700, 200510, 363490] 2554 [453480, 385770, 407230] 377804 [247370, 346180, 220780] 338785 [440, 9900, 356190, 359550, 289650] 323127 [377160, 219990, 346110, 433850, 316010] 222262 [359550, 433850] 132239 [242920, 459040, 423620] 82006 [203770, 4000, 374320, 377160, 377160] 57620 [107410, 107410, 220200, 203770, 49520] 351856 [252490, 304390, 203770, 427520, 227300] 20486 [8500, 204300, 273350] 115264 [372000, 224540, 235250, 262790, 70100] 373641 [242050, 32370, 39140, 12100, 25900] 293691 [252490, 221100, 524220, 48700, 262060] 172368 [440, 221100, 221100, 294100, 200710] 17877 [440, 440, 440, 220200, 230410] 80882 [271590, 105600, 240, 291650, 248820] 98350 [48700, 379720, 349700] 295129 [213670, 214730, 614570, 657280, 221810] 509175 [205100, 241930, 12200, 47780, 7670] 294225 [206420] 292373 [484900, 17410, 22000] 319707 [359550, 365590, 248820, 427270, 220460] 16859 [231430, 35450, 283980] 226807 [367520, 352400, 212480, 363440, 535840] 289668 [35450, 80, 320, 24780] 228702 [241540, 304050, 104900] 250466 [215160, 40700] 26575 [107410, 594570, 200210, 291550, 235600] 13683 [64000, 241600, 251730] 397070 [296490, 304650, 8850, 282680, 212700] 256295 [242920, 322190, 360550, 637850] 370130 [48700, 206420, 219640, 391540, 383980] 147749 [219640, 620, 10680, 3730, 213650] 378094 [240, 316390] 282347 [304650, 620, 65980, 394230, 241600] 391449 [434420, 581200, 204180, 423880] 9899 [242860, 268910, 7670] 38505 [440, 212680, 17390, 285920, 204360] 279781 [211420, 524220, 524220] 279369 [359550, 270790, 63380, 72200, 40] 208690 [200510, 433850, 96000] 202599 [284830, 319470] 336681 [724280, 657770, 679860, 496500] 354691 [252490, 451020, 390880, 303210] 142948 [281990, 291650, 230410, 230410, 349040] 186284 [427520, 200510, 342200, 363890, 237430] 239907 [55150, 300080, 566990, 448070, 290930] 54792 [7670, 58550, 39190, 214340] 371605 [107410, 440, 203770] 345974 [33650, 344230, 351820, 316480, 346630] 496492 [535400, 509220, 356670] 102645 [50620, 266490, 250320] 260506 [425670, 239030, 322210] 221131 [252490, 252490, 440, 291650, 290080] 145639 [231430, 6060, 9480, 391540, 2620] 163818 [440, 304050, 107410, 289070, 271590] 14454 [363970, 500, 275470, 1900, 406080] 102823 [301520, 240, 227940, 2630, 243950] 43215 [386360, 240, 494720, 405500] 216986 [48700, 394230, 273350, 9200, 400170] 235459 [50130, 291550, 204360] 256519 [304050, 263280, 385250, 324080, 451020] 164845 [8930, 208090, 210770] 478716 [497800, 277870] 349914 [4000, 386360, 271590, 257510, 204300] 196540 [281990, 205100, 529180, 326460, 200210] 177312 [221100, 337950] 170839 [377160, 4000, 440, 440, 440] 276794 [440, 304390, 248610] 495959 [252490, 403640, 488790] 262150 [289070, 220200, 113400] 26248 [267920, 382490] 517999 [204530, 319560, 218510] 322137 [208580, 237870] 152261 [245620, 221640] 147792 [354400, 252290] 257185 [252490, 111800, 238320] 341772 [440, 440, 282900, 460120, 372350] 212068 [208140, 213670, 314220, 209830] 245522 [440, 252490, 334230, 208580, 391540] 440239 [206500, 365300, 461560] 245453 [386360, 40400, 407510, 229480, 427270] 364451 [301520, 353360] 334616 [440, 380360, 91600] 123072 [231200, 232430] 282478 [440, 433850, 221100, 291550, 224260] 46187 [48700, 48700, 377160, 377160, 230410] 50863 [252490, 271590, 298110, 311310, 372000] 226535 [253310, 245170, 372930, 360580, 17470] 366022 [230410, 49520, 231720] 240242 [227300, 40800] 187700 [250900, 242050, 413150, 344760, 304930] 467849 [427490] 240385 [300550, 219680, 232430] 174935 [382900, 
364270, 222480, 63380, 39160] 144158 [107410, 200510, 312530, 273110, 208090] 481860 [440, 569480, 322500] 447049 [49520, 310700, 403440] 522950 [410320, 17500, 221810] 147082 [271590, 211820, 428690, 10680, 2700] 64809 [240, 55230, 391540, 406150] 135784 [211820, 252490, 220, 12200, 592580] 320182 [221380, 221640, 70300] 529718 [252490, 319730, 459820, 339610, 102700] 115458 [221100, 298050, 113400, 12110, 214420] 338562 [6020, 272060] 405705 [252030, 250760, 318440, 256460, 207040] 188573 [252010, 209520, 289090] 99938 [281990, 212680, 363970, 214490, 39120] 77474 [108600, 327510, 7670, 311980, 32430] 63965 [6880, 7760, 49900] 150018 [416000, 341720] 232554 [221810, 207530] 408976 [383150, 217920] 355881 [208140, 299360, 458560, 377150, 387450] 435771 [487250, 327220, 331980] 22789 [440, 304050, 252490, 252490, 227300] 121022 [377160, 440, 305620, 440900, 334120] 310963 [253110, 409710, 269790] 311695 [427520, 241540, 4500, 238320] 277500 [369060, 227580, 346250] 209910 [337000, 241540, 211360] 224944 [367520, 274170, 290340, 104600, 1530] 479141 [449710, 282880, 315930] 209768 [214560, 654880, 290770] 46165 [432330, 313120, 271570] 282724 [230230, 367500, 204880, 42910, 42910] 238422 [440, 289070, 236870, 208650, 4700] 52119 [239140, 282070, 242860, 318430] 324589 [305050, 240720, 495890] 461393 [227940, 262830, 236130, 431450, 284790] 89307 [250900, 306130, 200710, 208090, 237890] 322669 [22300, 238430] 220997 [440, 9900, 219640, 9480, 418190] 151340 [294860, 263400, 348540, 388120] 47547 [35450, 65980, 619910, 283230] 48408 [377160, 386360, 10090, 240970, 293220] 375882 [350080, 360740, 405640] 76432 [345370, 431240, 738060] 250658 [49520, 239350, 349040, 209650, 319510] 376612 [440, 442080, 282140, 223330, 233290] 316784 [265550, 362930] 505150 [48220, 274170, 620, 204360, 102700] 120282 [261030] 500077 [108710, 21980, 8850, 55100, 32390] 181234 [222750, 440, 440, 214490, 386940] 70778 [238010, 17390, 6860, 70] 512933 [280220] 61482 [107100, 369070] 325663 [333930, 319630, 238260, 317360] 82227 [263140, 221260, 405640] 95242 [440, 234080, 274190, 332200, 474030] 273176 [294650, 253110, 234490, 55040, 275490] 202411 [440, 282800, 204360] 226329 [10090, 17080, 360870] 65594 [8930, 8930] 189984 [239140, 22330, 221380, 221380, 291550] 170359 [252490, 252490, 427520, 221100, 221100] 79092 [220240, 6020, 460920] 221891 [385760, 268650, 206420, 237310, 227860] 112065 [254700, 311190, 366960, 328500] 144829 [507490, 273350, 208200, 6510, 286040] 20081 [249050, 307780, 339610, 277520, 253390] 70981 [294100, 393380, 301520, 220240, 55150] 282502 [4000, 377160, 220200, 334230, 233450] 196675 [440, 440, 8930, 310950, 35450] 171502 [227300, 232090, 247240, 328760, 390660] 198759 [360430, 244450, 269210] 495163 [242860, 244930, 205950] 196893 [440, 233450, 209000, 218230, 301520] 69337 [252490, 323370, 227940, 391540, 337340] 278421 [299740, 224760, 327860, 248170, 238090] 411443 [245620, 244070, 275200] 486297 [209000, 339340, 236130] 220929 [283020, 253980, 214550] 283112 [220200, 25000] 137368 [235540, 273350, 407510] 490892 [352430, 267530, 396930, 530020, 569010] 189881 [252490, 386360, 377160, 65800, 49520] 79161 [241560, 55150, 261180, 311730, 356670] 239954 [427520, 377160, 4000, 212680, 330830] 525182 [632730] 245893 [271590, 438740, 234140, 620, 17080] 37265 [32430] 310912 [301520, 41500, 235600, 256290, 330830] 116669 [440, 48220, 107200, 247080, 237930] 323491 [33230, 246110, 237870, 10150, 225080] 220507 [323370, 234670, 241600, 310380, 376870] 98323 [4500, 57300] 185456 [236430, 
444090] 72657 [99900, 454150, 238530, 214360] 454307 [289070, 221100, 226560] 62432 [214490, 214870, 303430] 302322 [201810, 225260, 355180] 215408 [377160, 271590, 440, 386360, 221100] 84078 [200710, 238460, 364060, 264080, 219890] 28141 [261760, 19980] 41685 [211820, 242050, 363970, 304030, 24960] 310779 [107410, 252490, 252490, 440, 301520] 472239 [269210, 227580] 241567 [252490, 359550, 346110, 49520, 236430] 316505 [221380, 8190, 238460, 324260, 264140] 533851 [242920, 417860] 453548 [55230, 261030, 8850, 40800] 247830 [8980, 335670, 350910, 204630] 240777 [427520, 4000, 238090, 17480] 435421 [242680, 22320, 35720] 97326 [49520, 356190, 589290, 273110, 389040] 136838 [447040, 22370, 282440] 7575 [22320, 310370] 381468 [91600, 457760, 264140] 223723 [219990, 17460, 48220] 138622 [221100, 219640, 333930, 305620, 10680] 249144 [201790, 21090, 2300, 225260, 250180] 24498 [433850, 39120] 236576 [677650, 238240] 67546 [248820, 349100, 293780, 70300] 2645 [218680, 293680, 70300, 95700] 368469 [252490, 55230, 403640, 237930] 135662 [330350] 204503 [230410, 374320, 301520, 271590, 4000] 463820 [45760, 285160, 108710, 57900, 111100] 85999 [48190, 209000, 12110, 13540] 83559 [220200, 211420, 304050, 107100, 387970] 145476 [524220, 335670, 242680, 38410, 48720] 339160 [252490, 427270, 238320] 415582 [8930, 2600, 500, 20550, 17410] 109641 [337000, 2870, 296570, 236090, 365450] 475701 [230410, 374320, 440, 200510, 330070] 37364 [417860] 155235 [226860, 298050, 445220, 208090] 332694 [322520, 348400, 324570, 372330, 379980] 43337 [8930, 387290, 290730, 227080] 387987 [304390, 233150, 232430] 287887 [115110] 383857 [219990, 227300, 24960, 78000, 220] 30166 [282900, 204300, 241600] 19673 [78000, 8190, 237110, 108800] 82919 [232090, 220, 499910, 523210] 243205 [252490, 107410, 306130, 306130, 457140] 104347 [291550, 407120] 218310 [242050, 236090] 59859 [20920, 302670] 70832 [230410, 252490, 211820, 113400, 204300] 345684 [35450, 209000, 219640, 305620, 278460] 253896 [252490, 48700, 239140, 393420] 182834 [235600, 262280, 24780, 35720, 102700] 117721 [49520] 245947 [4000, 246070, 313240, 679990, 418070] 375450 [107410, 48700, 374320, 220200, 255710] 65317 [377160, 311310, 520440] 320107 [239140, 420290, 319630, 313120, 319510] 303916 [246420, 291480] 66380 [252490, 115800] 373049 [440, 49520, 227940, 20920, 225540] 10137 [440, 440, 620, 596650, 341870] 76299 [67370, 207400, 288060, 249990, 273110] 432436 [33230, 398930, 464060, 10130, 324810] 134695 [240, 620] 67230 [440, 335890, 227940] 65918 [315330, 555850, 225360, 454930] 472189 [433850, 224260, 99900, 316390] 196107 [294860, 517910, 418070, 405640] 200580 [391540, 319630, 268750, 417880] 277241 [346900, 391540, 330390, 300280, 45400] 93147 [440, 646570, 312990] 82741 [481510, 12100, 208090] 260918 [242050, 254480, 266010] 122756 [24980, 238010, 108710, 339400, 480480] 533785 [440, 304930, 286690, 529780, 277430] 268144 [235620] 429717 [257510, 260570, 291710] 275420 [227300, 236090, 24810, 37800, 33990] 70379 [10500, 438680] 102396 [215160, 205650, 464150] 350516 [107410, 227300, 227300, 227300, 24010] 232713 [287390, 320, 239430] 149998 [287450, 248390, 241930, 345080, 227860] 457849 [91600, 275670] 5442 [346110, 346110, 346110, 346110, 346110] 232161 [440, 301520, 204300, 319630, 362870] 538475 [416370] 215271 [359550, 363970, 240, 219640, 227940] 296655 [8930] 205280 [252490, 252490, 440, 438740, 287450] 438658 [49520, 363440] 296468 [236430, 393380, 304390, 620] 37211 [220200, 242050, 433850, 276810, 45710] 212986 [231430, 212480] 263550 
[504370, 10180, 476600, 40700] 170888 [438740, 241540, 237930, 40] 368369 [330840, 33320, 321480, 324810, 262410] 220809 [214490, 372540, 314000, 225080] 117169 [242680, 371570, 245490, 352640, 339590] 91095 [48700, 211420, 233130, 63380, 206210] 294678 [301520, 249050, 243970, 318600] 340124 [4760, 248570, 219950] 194202 [304930, 17520] 242606 [48700, 8930, 440900, 361420, 431960] 71303 [377160, 289130, 232090, 200510, 520440] 321704 [10180, 245280, 218680] 474030 [219640, 234080, 512250, 318430] 343563 [92800, 323380, 277680, 425580, 220780] 293034 [227300, 282900, 7860] 17607 [363970, 227940, 446600] 139053 [374320, 34030, 211420, 220] 145057 [227300, 219640, 35720, 211600] 214778 [374320, 252490, 440, 8930, 377160] 355577 [248610, 381120, 284240] 314488 [219990, 252490, 49520, 289070, 239140] 240312 [222750, 270150, 302270, 235540, 252410] 464530 [359550, 252490, 252490, 304930, 310380] 284977 [213610, 232750, 289400, 47920, 274900] 99093 [200710, 312750, 222980] 3968 [48700, 440, 440, 301520, 301520] 228126 [49520, 55230, 314410, 203160, 269210] 505898 [220, 249630] 354268 [17740, 92800, 1510, 380, 6920] 381426 [204100, 247020, 522570, 545280, 391210] 68997 [208140, 351080, 4770, 1520, 40930] 54455 [95900, 363600, 295790, 433950] 35506 [221100, 35450, 391540, 252490, 319630] 315465 [281990, 601430, 431960, 298260, 417860] 308895 [427190, 384110] 280420 [247310] 142911 [49520, 220, 310060] 72992 [440, 287290, 50620, 21090] 61681 [230410, 337000, 433850, 214190] 296929 [242860, 299740] 362902 [363970, 212680, 212680, 351970, 201790] 25858 [358200, 525040, 358340, 24810, 9000] 403224 [254700, 10100, 224760, 231160, 292500] 367660 [374320, 49520, 359550, 346110, 4000] 81163 [10680, 463110, 385710] 182634 [32370, 108800] 14126 [250900, 113200, 9900, 221100, 301520] 369546 [252490, 227300, 246940, 16030, 47920] 45555 [342200, 228300, 694500, 716630, 587650] 280613 [212200, 410320] 11225 [271920, 214190, 233470] 165349 [281990, 208140, 237430, 568770, 243950] 340526 [306130, 203140, 362400, 80, 238460] 296981 [356040, 274270, 387860] 321085 [252490, 301520] 308945 [594570, 291550, 245170, 491770] 307306 [41700, 237870] 118925 [224260, 379980, 107800] 486912 [221100, 433340, 12120, 360170, 2620] 119739 [239840, 307670, 289690] 138272 [241560, 206420, 582660, 403640, 346900] 496211 [65980, 10180] 42677 [220200, 339350] 282321 [730, 268500, 200510, 49520, 204880] 266804 [292120, 242920, 204100] 276334 [24400, 33660, 20700, 99700, 215670] 17295 [433850, 35450, 38600, 227940, 45700] 48186 [24980] 87759 [304030, 289300, 466240] 204143 [107410, 236430, 386360, 8930, 200510] 307424 [219990, 24980, 200710] 168471 [235400, 310110, 431240, 273500, 506610] 170101 [9420, 55230, 224600, 361670] 202761 [234140, 311120, 29800] 249731 [49520, 385760, 433340, 612880, 439650] 123370 [365590, 329430, 422940, 70, 302830] 116350 [386880] 80719 [433340, 356370, 370310, 252850, 211360] 173509 [440, 200510, 722940, 285900, 321980] 179949 [252490, 305620, 242680, 350990] 389928 [8930, 233450, 221100, 381120, 209160] 251983 [252490, 384190] 185376 [345090, 215530, 473560, 296870] 258733 [374320, 433850, 247730, 211500] 82450 [220200, 372000, 3590, 620, 35720] 235617 [20920, 4760, 57300, 22370, 35720] 44386 [4000, 223510, 232970, 209670] 151031 [440, 49520, 550650] 24056 [208480, 20920, 3590, 207420, 13560] 335219 [236430, 394230, 233470] 409560 [209000, 286690, 264280, 242880, 201480] 129510 [70420] 168643 [105450, 55230, 227300, 257510, 263280] 330138 [291650] 50794 [377160, 522030, 286690] 149722 [233290, 40800, 
336510, 270570, 377430] 136128 [359870, 7660, 214420, 327410] 213013 [65800, 2450, 301970, 232430] 476329 [440, 252490, 4000, 291650, 393380] 352215 [39210, 49520, 33930, 55230, 252490] 109443 [444430, 1670, 70650, 220900, 32390] 182363 [65980, 32370, 246110] 52511 [440, 346110, 206420, 35720, 236090] 60608 [377160, 230410, 359550, 294100, 620] 157719 [406920, 371670, 107300, 274560] 314445 [377160, 227300, 224500, 201790, 448510] 306256 [482730, 211820, 221100, 447040, 291480] 400480 [377160, 340050, 449540] 383519 [40100, 298110, 403640, 39150, 410890] 332311 [385830, 208670, 449680] 184057 [319630, 322080, 258180, 249590, 278640] 53422 [400800, 716380] 90930 [10680, 413710, 274190, 12100] 143167 [373720, 509420, 307880, 402020] 325291 [268500, 50620, 12100] 533809 [202170, 292630, 424280] 108032 [504370, 504370, 417860] 106553 [236430, 211820, 206440] 243140 [321950, 341860, 105300] 106254 [428750, 247950, 384190] 200659 [440, 252490, 203140, 364240, 306660] 122845 [107410, 107410, 107410, 440, 440] 110303 [200370, 299780, 388390, 334840, 521630] 216491 [105450, 201810, 20540, 342370] 301680 [220240, 6060, 203140, 57900, 233130] 251356 [300570, 345180, 211260, 22180, 398850] 473188 [252490, 363970, 239030] 57168 [49520, 49520, 620, 620, 10180] 71179 [242920, 220440, 273110, 612880, 221260] 318866 [232090, 363970, 214190] 260220 [231430, 238010, 299360] 348572 [208650, 620, 431960, 418370, 65300] 204163 [204360, 380600] 140437 [504370] 177069 [255710, 214490, 363890, 335000] 220139 [289070, 200510, 234140, 290300, 366220] 374809 [92800, 108710, 287390, 237930, 365300] 116939 [304390, 365360, 204630] 487696 [334230, 238010] 3646 [46250, 32150, 98400, 39670, 46510] 255835 [10680, 49600, 33130] 381336 [308060, 45740] 236162 [115320, 17410] 284831 [230410, 230410, 394230, 333930, 239160] 264360 [239840, 428750, 293460, 306660] 274527 [435150, 220200, 107410, 48700, 230230] 524880 [220, 203510, 317470, 410890, 329490] 260096 [47890, 291550, 447040, 242860, 376870] 448684 [263280, 463680, 413110] 178237 [107410, 440, 524220, 220240, 282070] 504394 [344760, 440900] 259607 [206420, 228280, 238240, 206210, 200210] 323868 [270150, 233740, 224260] 524789 [319630, 10180, 247910, 209650] 348753 [306130, 291550] 239113 [248820] 409366 [55230, 238010, 237430, 115320, 336510] 189336 [252490, 304930, 429660, 253230, 282800] 398457 [200510, 234490, 263740, 274310, 434570] 115060 [252490, 20510, 433340] 243796 [49520, 211820, 294100, 289130, 200210] 122362 [271590, 391540, 247080] 368271 [337000, 241930] 240907 [230410, 332800] 286696 [393380, 344760, 50300, 318220] 105369 [271590, 8930, 242680, 386360, 242920] 77897 [107410, 231430, 264140, 50, 355840] 112737 [4000, 394360, 4760, 331470, 237930] 144455 [49520, 359550, 230410, 356040, 387290] 477460 [206420] 464080 [311210, 373420, 339800, 403560, 202970] 167687 [200510, 203140, 28000, 104900] 241569 [35450, 207610] 197298 [250900, 440, 211420, 356190, 344760] 513955 [586140, 252410, 207040, 499440] 386112 [212500, 329490] 195846 [359550, 377160, 377160, 8930, 301520] 440685 [353090, 233370, 395860, 331750, 630830] 538903 [301520, 8190, 319630, 319630, 220] 197016 [405900, 288120, 568570, 298050, 227020] 265303 [346110, 346110, 394360, 271590, 301520] 298468 [373420, 304390, 239840, 261030, 297750] 42554 [221380, 227940, 620, 307780, 319630] 374100 [252490, 239140, 212680, 20920, 312530] 271736 [70, 17470] 65688 [48700, 204300, 215080, 224820, 218130] 487540 [35140, 21690, 344030] 190774 [248820, 204100, 314410] 220499 [208650, 246420, 310380, 265550, 
423880] 273600 [252490, 301520, 220, 246620, 390670] 49944 [240] 32210 [113200, 504370, 337320, 448510, 219150] 199017 [212680, 360430] 177936 [359550, 294100, 214950, 268050, 250340] 298958 [8930, 322190] 350484 [377160, 241540, 227940, 287600] 412063 [214560, 293660, 266010, 340280, 252410] 326317 [242680, 23310, 232790, 43000, 9480] 323315 [49520, 322190] 261526 [220200, 238210] 207696 [212050, 264320, 448010, 359090, 6200] 337745 [548370, 655020, 653760, 491560, 354430] 82394 [201790] 180903 [440, 386360] 383846 [7940, 310380, 285580, 204060] 244163 [363970, 234140, 220440, 280520, 431450] 249211 [219640, 10230, 107100, 429570, 272600] 171002 [220200, 213670, 361420, 34800, 386090] 1441 [289650, 48190, 265590] 69517 [289650, 361420, 584400] 38568 [49520, 346110, 306130, 55150, 409720] 68883 [301520, 312840, 331190, 238320, 369580] 357135 [440, 8930, 240760, 38410, 313120] 327435 [239160, 285900, 509980, 217920] 174941 [240, 200710, 218230, 242860, 270150] 198689 [236090, 243970, 247730, 270070, 257050] 196601 [239820, 205190, 250760, 17300, 63940] 294241 [406310, 385360] 306991 [24980, 51100] 515271 [333930, 363970, 221380, 339570, 238090] 530079 [654050, 378930, 226560, 446380, 448370] 51008 [220240, 283450, 226720, 267360] 42472 [304930, 205100, 319630] 355284 [99900, 19900, 267530] 26138 [230410, 262060, 20920, 206420, 444220] 279554 [332200, 287390, 220160, 40380] 318071 [113200, 205230, 362930] 284767 [205100, 424370] 443685 [262550, 38700] 223171 [440, 230410, 99900, 386070, 92100] 110425 [312450, 569480, 466560, 51100] 355819 [363970, 55230, 212680, 356040, 3480] 374232 [49520, 24960, 620, 504370, 376870] 136853 [271590, 232090, 35450, 265550, 597170] 420045 [207710, 338170, 317360] 101764 [49520, 381120, 332310] 198112 [289650, 113200, 330830, 415850, 435030] 383724 [234650, 219890, 268540] 260989 [377160, 440, 33930, 24980, 339800] 49641 [6060, 433950] 134462 [219640, 12120, 47000, 280520, 263880] 60403 [440, 252490, 433850, 113400, 270150] 57442 [391540] 487791 [377160, 359550, 440, 582160, 311310] 254266 [20510, 237930, 50300, 2320, 499440] 377076 [500870, 319320] 61376 [48190, 109200, 249330, 215670, 340] 298655 [55150, 44350, 387340, 202970, 391720] 394774 [741670, 772540, 327490, 282560] 9074 [211820, 48700, 391540, 319630, 220] 462682 [23450, 290790, 410110] 58484 [208650, 35140, 35720, 42140, 277590] 139150 [107410, 233450, 518790, 319510] 312872 [202170, 348200, 411560, 250500] 214140 [107100, 262390, 282760] 198650 [49520, 730, 440, 24740, 263760] 29235 [219990, 359550, 8980, 287390] 358626 [230410, 231430, 238320, 232010, 353560] 40644 [359870, 220240, 250520, 22370, 326670] 512589 [49520, 393380, 359370, 436520] 198257 [271590, 49520, 377160, 236430, 582660] 198460 [8930, 227300, 232090, 41500, 620] 370577 [239140, 327070, 391720] 287694 [331470, 416450, 215870, 458560, 493370] 311801 [620] 1788 [241600, 254320, 357120, 356110, 433590] 62638 [48700, 242920, 388320, 288930, 435400] 223061 [558620, 597240, 532030, 461910] 319305 [301520, 312150, 312280] 25355 [257350, 17430, 368360, 274310, 72200] 118377 [360740] 397877 [333930, 247910, 282440, 700330, 2320] 164529 [449680, 369400, 361630, 232430] 319133 [333930, 227940] 500459 [489470, 550390, 449170, 589690, 440870] 247653 [17450, 397950, 57300] 249872 [346110, 201790, 102600, 115320] 87208 [214490, 300570, 290770] 147044 [3590, 680120, 391340, 678670, 731230] 370811 [24420, 48000] 40132 [39140, 237740, 206190] 325007 [24960, 259550] 66273 [440, 107410, 431960, 355840, 302510] 14484 [440, 440, 200710, 363970, 49520] 
37678 [235540, 35450, 302670] 484327 [8980, 42700, 613730] 40151 [230410, 372000, 4500, 299740, 206210] 56799 [252490, 252490, 214950, 220240, 292120] 45704 [252490, 418460, 17520, 265630, 302610] 489058 [221100, 337000, 333930, 393420] 533705 [209330, 342740, 257750, 249630] 431966 [47890, 50130, 12120, 7670, 70000] 6955 [107410, 374320, 335300, 6020, 447530] 263481 [230410, 8930, 35450, 360430, 244930] 101357 [17460, 8980, 214490, 368360] 230756 [214490, 232790, 288690, 260250, 327260] 277327 [271590, 8980, 394690, 214490, 404410] 322528 [440, 231430, 225540, 219830, 241240] 138320 [61100, 17410, 50300] 251833 [48700, 504370, 235540, 249050, 346900] 94896 [230410, 57900, 2210] 278882 [211820, 329440, 298180, 6200] 1815 [202170, 108710, 271900] 228611 [209000, 327690] 321639 [49520, 291550, 35720] 252951 [391540, 431180, 273350] 68595 [262060, 4580, 223810] 106578 [42700, 417860] 278876 [356190, 240, 427950, 55100, 674940] 368104 [107410, 231430, 391720] 534050 [245620, 38600, 462110, 491620, 316970] 450071 [102500, 241560, 65980, 9480, 314520] 257288 [433850, 242920, 259080, 253390] 445149 [291650, 313120, 417860] 226928 [294860, 214150] 192836 [440, 262060] 316376 [221680, 220200, 620, 447020, 222480] 251101 [250900, 377160, 8930, 271590, 252490] 283361 [377160, 620, 219640, 256190, 233720] 324840 [4000, 346110, 433850, 410820, 410150] 282279 [70400, 7940, 42700, 99300] 241180 [261030, 282210, 266130] 441394 [391540, 249130, 456670] 489672 [299740, 232890, 12100, 739260, 306350] 136189 [113200, 620, 47780, 17100, 18000] 294674 [291550, 237850, 554620, 307130, 407530] 200075 [4000, 230410, 377160, 9900, 385760] 478737 [341440, 275390, 279160, 299460, 303390] 172915 [219990, 219990, 211820, 252410] 74870 [374320, 107410, 48720] 356162 [248820, 202970] 44986 [239140, 239140, 214490, 378770, 394140] 183257 [239070, 249650, 6910, 218060, 294370] 47975 [80, 50, 264240, 351640] 36407 [271590, 440, 440, 8930, 239140] 223295 [2500, 48000, 278360, 303210] 277712 [395860, 465170, 409600, 369420, 340770] 45539 [215120, 214770] 296120 [391340, 286500, 373860, 252890] 185107 [440, 227300, 433850, 433850, 243470] 70195 [4500, 239160, 108710, 99900, 261030] 228510 [377160, 271590, 330830, 209000, 246420] 205860 [215710, 108710, 4470] 152545 [271590, 374320, 221100, 206420, 206420] 166349 [98800, 280740] 340125 [440, 221910] 314824 [305620, 344760, 251830, 245150, 21680] 203957 [440, 526490] 12369 [252490, 554620, 203140, 220160] 76131 [211820, 271590, 291650, 49520, 255710] 109218 [374320, 204880, 341940] 520788 [3590, 3310, 3350, 35800, 33320] 271321 [49520, 252490, 200710, 20920, 32370] 166732 [221100, 39210, 377160, 444090, 107410] 455621 [249050, 214490, 630] 279943 [342580, 410210, 353640] 37369 [219640, 319570, 344040] 72235 [230410, 242920, 225540, 248820, 312610] 83009 [440, 252490, 230410, 438740, 221100] 319244 [211820, 330830, 271240, 10150, 17570] 59760 [221100] 88887 [313340, 223710] 343857 [543050, 697010, 535690, 223100] 19401 [504370, 365360, 333930, 393380, 20540] 109616 [201310, 210970] 230591 [449060, 243160, 29800] 277016 [282900, 269250, 315810] 256361 [113400, 311080, 431960, 429570, 243120] 111876 [272470, 16720, 8080] 337060 [340] 307027 [240, 48700, 377160, 242550, 65980] 207921 [233130, 326190, 545370] 349040 [49520, 20920, 479110, 321370] 23205 [49520, 290080, 6060, 434410, 39800] 131845 [70, 17470, 104700, 17410, 201480] 157900 [377160, 65980, 441550, 290790] 67312 [65800, 35450, 49470] 61360 [2870, 73010, 49600] 358663 [433340, 237930] 483946 [227300, 224600, 339350, 
449780, 1510] 25400 [440, 301520, 200210, 623940, 397900] 81998 [221040, 292120, 18000, 322500] 355856 [356670] 525551 [113200, 291550, 210550, 415150, 264340] 370395 [32470, 359650, 261570, 645090, 575640] 91197 [373420, 247080, 8190, 323850, 17080] 511688 [236870, 219640, 47780, 268910, 18700] 166667 [346110, 241540, 204300, 212480, 288140] 495021 [390520, 259080, 41000] 278143 [48700, 49520, 239140, 219640, 1700] 180137 [417150, 494600, 649670, 296730, 308420] 2135 [292910, 237990, 316180, 348550, 65930] 329739 [221100, 311210, 313160] 370193 [271590, 252490, 325610, 70000, 214420] 187963 [701460, 414760, 647410, 343710, 272890] 79346 [219990, 35450, 206420] 45021 [374320, 218230, 304930, 223410, 104900] 22192 [363970, 242860, 397950, 401190] 261589 [214490, 330840, 15100, 32800, 302510] 537782 [304050, 433850, 234140, 488790, 265550] 241827 [252610, 263880, 232430] 365431 [439910, 484950, 385690, 271900, 201420] 317392 [291550, 204360] 318388 [49520, 214950, 466730, 588430, 466940] 355470 [236430, 50130, 238090, 233130, 216910] 412756 [427520, 262060, 280220, 2820, 274230] 22979 [304050, 360430, 239160] 61248 [351970, 257970, 292400, 353330] 132416 [49520, 49520, 304910] 44815 [387490, 357470, 279140, 449050] 205617 [252490, 312530, 444200, 518660, 211160] 374501 [377160, 377160, 8930, 262060, 65980] 350734 [49520, 225540, 6060, 200170] 181309 [377160, 233450, 270880, 113200, 218230] 85927 [271590, 220200, 227300, 8980, 221380] 93279 [208650, 32370, 67370, 292730, 291070] 267976 [306130, 403640, 601430] 327984 [337000, 203140, 256290, 245280] 135326 [289070, 231430, 271590, 440, 8930] 260977 [374320, 294100, 393410, 202970, 202970] 241311 [108710, 338540] 67859 [220, 248820, 253230, 383230, 725020] 218746 [211820, 304050, 304390, 391540, 337320] 533277 [49520, 620, 15950, 57300, 50300] 474105 [208650, 9050, 30, 497640, 306040] 189795 [278080, 527340, 428550] 323005 [252490, 8930, 48700, 48700, 211820] 358030 [500, 107100, 48000, 38700] 218192 [246620, 282140, 444640] 142784 [220200, 287580] 426196 [214490, 287980, 242640, 468650, 517790] 488921 [231430, 19900, 12220, 4520] 36317 [298110, 216150, 623940, 221910, 224460] 441123 [50300, 266130, 232430] 273842 [230410, 246420, 237990, 527760] 29619 [271590, 221100, 24010, 238240, 356670] 344449 [322190, 412520, 3390, 307210] 367372 [252490, 227300, 289650, 323470, 273350] 17529 [337850, 370480, 249630] 446612 [233450] 274688 [444090, 304650, 319630, 392110, 240] 123746 [222750, 331340, 347440, 338170, 321260] 19717 [8930, 220, 212480, 355840, 320] 336797 [242050, 208580, 108600, 63200, 18470] 249586 [271590, 219640, 33950, 637850, 432020] 171036 [252490, 242760, 348020, 433530] 57937 [386940, 319630, 434260] 277824 [227300, 356190, 397950] 509655 [239160, 431240, 355180] 111636 [310950, 534820] 240628 [107410, 301520, 200510, 433850, 22600] 279153 [208400, 246760, 234160] 166359 [440, 48700, 211820, 211820, 323470] 39616 [411960, 107100] 122416 [4000, 273350, 210770] 323389 [49520, 518790, 237930, 313120, 278360] 95418 [509980, 274940] 17985 [504370, 344770, 115100, 507010, 70300] 202692 [436150, 301640, 273350, 206190] 453451 [355270, 353560] 262389 [440, 240, 220200, 239140, 242920] 37773 [214510, 63380, 249630, 247240, 384190] 46477 [4700, 99300, 70600, 104900] 439128 [40100, 219830, 200170, 104900, 274900] 348488 [281990, 212680, 207610, 249630] 233582 [730, 248390, 247080] 270015 [271590, 252490, 289650, 253250, 331600] 354273 [236430, 212680, 360430, 302830, 356670] 41705 [730, 8980, 346110, 225540, 391540] 306031 [363970, 70400, 
233350, 234490, 49600] 270638 [271590, 271590, 291480, 414740, 491260] 140780 [200510, 236870] 170095 [200510, 220240, 206190] 141091 [213670, 248860, 337000, 17450, 360430] 143038 [45760, 256290, 47780, 250760, 269670] 513674 [8930, 221380, 19900, 391540, 383150] 351537 [391540, 392820] 199556 [386360, 236090, 516510, 306660, 420930] 273048 [374040, 642560, 410110, 349700] 367273 [281990, 67370, 382490] 288279 [242680, 394310, 644930, 489760, 586240] 295343 [409100, 314180, 353700] 358581 [314200, 317510, 214340, 298260] 529991 [230410, 200510, 220440, 364050, 270210] 32929 [331670, 237990, 238370] 130068 [306630, 498240] 423748 [377160, 440, 41500, 35720, 261490] 212061 [55230, 393380, 446150, 417860] 316099 [440, 301520] 123065 [8500, 391540, 292330, 288260] 507211 [701730, 26000, 259740] 253579 [209650] 56604 [440, 252490, 386360, 304390, 346940] 384296 [440, 212680, 321400, 104900] 425856 [360590, 463150, 423900, 262770] 296154 [250900, 223470] 174667 [206420, 108710, 233270, 55100] 22510 [268500, 98800, 282140] 416462 [280160] 82527 [374320, 271590, 8930, 252490, 35450] 261290 [219990, 242920, 245170, 221640, 259600] 43783 [236870, 270150, 274620] 152742 [234140, 394510, 213670, 324800, 209270] 293467 [294100, 730, 203770, 413150, 4920] 409941 [265550, 12120, 253750] 152762 [460810] 67171 [39140, 2200] 127081 [282070, 42170, 243970, 65930, 252550] 22616 [214560, 40970, 224260, 12100, 204860] 374438 [391540, 22330, 22370, 42120, 250260] 243076 [440, 440, 220200, 377160, 230410] 64097 [304050, 99900, 42700, 376870, 210870] 476754 [288790, 266940, 348290, 96000] 104789 [418340, 703950] 312879 [293840, 269310, 217100] 61961 [48700, 8930, 433850, 252490, 241930] 178567 [289070, 248610, 468250] 128549 [200510, 620, 201810, 325630, 250760] 405281 [80360, 246090, 80340] 163688 [50130, 206420, 298160, 237930, 391720] 62580 [254320, 466560] 136250 [201810, 6860, 362890, 274900, 220780] 76541 [239160, 12120] 529274 [212680, 343270, 243120] 4596 [249650, 33980, 225600] 524617 [230410, 242050, 394230, 304050, 291550] 370276 [48700, 107410, 212680, 413150, 204300] 216717 [227300, 391540, 208090, 242880] 285524 [107200, 236150, 35700, 42990] 2070 [220240, 386070, 402710, 242640, 251530] 361899 [371330, 249680, 260230, 427820, 371420] 382947 [504850, 431260, 415350, 550990, 391140] 60316 [252490, 236870] 289570 [240, 200710, 11360, 6980, 204240] 129403 [239140, 221100, 241930] 84388 [210550] 55391 [231430, 333870, 205070, 263820, 281370] 513230 [15750, 311340, 22230] 31911 [252490, 302510, 238320] 178428 [364470, 534780] 44367 [233290] 371035 [201810, 246420, 322500] 7759 [435150, 241930, 219780, 388880] 328341 [620, 8190, 357290, 107300, 60] 176955 [440, 440, 245620, 356190, 312530] 507627 [296570, 51060, 238910] 227771 [50130] 374389 [393380, 428750] 77040 [289070, 7670, 33130, 207430, 214870] 56591 [310380, 57900, 209160] 372197 [266940, 391260, 335660, 284810, 259810] 180471 [460920, 326410, 416190, 384590] 7351 [359550, 316390] 26369 [220240, 502140] 135342 [440, 230410, 261030, 47920, 398850] 340818 [433340, 580200] 406465 [281280, 575490, 11280] 516587 [394360, 355840, 20] 258522 [34030, 242920, 327030, 268910, 414340] 219329 [230410, 304050, 200710, 113200, 419020] 335502 [230190, 344240, 243040, 344840, 358460] 275311 [393380, 291480, 204360, 451340, 409590] 99607 [49520, 241930, 385250, 92900, 249330] 323668 [242050, 433850, 304050, 24010, 384490] 43331 [500, 9350, 3483] 510232 [33930] 23964 [372540, 429940] 228385 [615390, 622530, 635000, 672050, 697490] 153310 [202130, 65730] 179941 
[227300, 220700, 340770] 207494 [200510, 578930, 404580, 223450, 203650] 215540 [692090, 734540, 671760] 348446 [240, 240, 230410, 225540, 314160] 309897 [248820, 92800, 26500, 236370] 36871 [384190, 272060] 55909 [364420, 562220] 515719 [325420] 150581 [221040] 83654 [10180, 55100, 290930] 422824 [200710, 391540, 520440, 350910] 194799 [200710, 108600, 299740, 251170, 232790] 509720 [391540, 330830, 211340, 269150, 7510] 105441 [211820, 20510, 221100, 236090, 232910] 62275 [282070, 55150, 18020, 3170, 55110] 159237 [311560, 233130, 498240] 220114 [332200, 113200, 251470, 420530, 409710] 89042 [588430, 228180] 414536 [264000, 353130] 6560 [4000, 1930, 22200] 241889 [49520, 252490, 32370, 299740, 6060] 280270 [212680, 239030, 253150, 219680] 160472 [108600, 324800, 207610, 307780, 359040] 92731 [221100, 319630, 227940, 223450, 32430] 537706 [65930, 341940, 277430] 217254 [239140, 335670, 294440, 298260] 47493 [9930, 224540] 78658 [240, 304050, 500, 212630, 405900] 40584 [301520, 307670, 635260, 246620] 438596 [275390, 577940, 460810] 417515 [113200, 50300, 41000, 362930] 344960 [250900, 211820, 440, 294100, 306130] 177903 [433850, 363970] 168334 [8930, 252490, 359550, 244450, 333930] 361102 [236870, 324800, 612880, 610860, 113420] 163375 [240, 234140, 80, 667360, 533690] 109387 [435150, 440, 440, 466560] 134107 [4000, 291550] 431416 [239140, 261820, 417860] 238534 [252490, 219640, 278080, 292120, 386180] 183410 [65980, 219640, 219640] 200549 [241600, 282140, 15130] 199750 [252490, 524220, 242050, 346010, 266110] 352483 [221100, 113200, 504370, 329110, 257510] 521823 [271590, 365450, 317510] 152990 [440, 440, 440, 427520, 433850] 116915 [232890, 246090, 467850, 417860] 187750 [203770, 344240, 252530, 351700] 26294 [236430, 214950, 230410, 230410, 386360] 240344 [4000, 271590, 427520, 730, 211820] 261860 [252490, 268500, 429300, 40380, 214420] 228377 [377160, 252490, 7670, 239030] 181679 [526160] 177367 [393420, 7670, 212070, 219890] 165700 [20500, 39500, 466500] 201285 [250900, 304650, 332410, 262960, 311080] 425770 [40100, 324160, 207750, 316720] 276797 [440, 374320, 230410, 200510, 212680] 68833 [107410, 252490, 344760, 393380, 313340] 49874 [49520, 377160, 377160, 289130, 212680] 281317 [24980, 208580, 235600, 213610, 310380] 72678 [203770, 8800, 482300, 17570, 409510] 238585 [389730, 322290, 201420] 283380 [302340] 137153 [237110, 209160, 258180] 90552 [525450, 245170, 450670, 586090, 444420] 366585 [335300, 252490, 440, 444090, 33340] 526465 [311210, 235360, 542720, 532600, 520090] 292377 [233350, 16810, 325090, 264380, 380] 149200 [9880, 326190, 337420, 107100] 159998 [107410, 252490, 433850, 8500, 219150] 369976 [271590, 433850, 233450, 48720, 15100] 408660 [230410, 24200, 376870, 206190] 216423 [200710, 220240, 31290, 212800] 290898 [240, 12200, 286690, 434570] 26232 [377160, 113200, 227940, 380560, 244930] 367357 [24980, 620, 22330, 38420, 40800] 377002 [241540, 20920, 249050, 396480, 420440] 504971 [391540, 104900] 91792 [359550, 233130] 214655 [269210, 213650, 234190] 183040 [239120, 345750, 380150, 237930, 406150] 282455 [241540, 219640, 99910] 213450 [386360, 8980, 233130] 303154 [105700, 258890, 209790] 177965 [230410, 305620, 489830, 204360] 319058 [9450, 301640, 286690, 253710] 272514 [262280, 489180, 286690, 55100] 73270 [406130, 399640, 7520, 262960] 126370 [227940, 269110, 40800, 404700, 448470] 299555 [440, 289070, 304050, 391540, 224820] 76692 [531640, 293260, 545820] 318379 [221100, 424780, 221910] 63256 [211820, 214950, 212500, 98800, 444220] 202802 [213670, 108200, 
385770] 195611 [440, 393380, 12120, 29800] 105565 [22370, 473560] 88090 [8500, 102500, 304030, 250760, 209160] 361487 [200510, 4500, 266010] 193700 [22100, 227940, 431240] 459544 [271590, 298110, 220, 310460, 288930] 277866 [346940, 62100, 212480, 200940] 48685 [504370, 554620, 210970, 388880, 400] 47089 [230410, 4000, 271590, 359550, 359550] 287168 [440, 211820, 367500, 418340, 418340] 74272 [48700, 344740] 25289 [221100, 283640, 200510, 248820, 240] 447254 [208580, 447020, 207610, 200210, 223100] 266849 [206420, 620, 108600, 395170] 139708 [251710, 16450, 209830] 119767 [271590, 211820, 359550, 294860, 12100] 333140 [70400, 213610, 251990, 31280, 204340] 268894 [107410, 209000, 204030, 323850, 209650] 130423 [211420, 433850, 503820] 421794 [233450, 282070] 385273 [367500, 215470, 264140, 259170] 267453 [220, 368360, 584400, 400, 319510] 176858 [212480, 331470] 134281 [371660, 113200, 361420, 448750, 289760] 38590 [440, 374320, 281990, 359550, 359550] 278030 [49520, 49520, 49520, 4000, 391540] 62241 [252490, 249050, 6060, 107300] 208851 [359810, 611790] 195021 [367520, 453480, 316010, 282070, 323370] 321114 [620, 455120, 369400] 267984 [49520, 335300, 524220, 337000, 4920] 321050 [48700, 301520, 248820, 282070, 269790] 354577 [391540, 514470, 200170, 70110] 427755 [285130, 234940, 443420] 196638 [294100, 212680, 227300, 239350, 253840] 279441 [48700, 107410, 107410, 301520, 344760] 236793 [6800, 386880, 303210] 212215 [32470, 249650, 24740] 466839 [204100, 204100, 352400, 303940] 297306 [48700, 289130, 403190, 704510, 322300] 60501 [49520, 221100, 301520] 491032 [43500, 451020] 72166 [206420, 6020, 319630, 63380, 698300] 10529 [360, 40] 83588 [107410, 237990, 402710, 201510] 370274 [252490, 252490] 259819 [252490, 8930, 344760, 230410, 211820] 473963 [259130, 233510, 286100] 1099 [8930, 8930, 440, 440, 310950] 29845 [45740, 209100] 117355 [227300, 209870, 340150, 381120, 444250] 378660 [211820, 219640, 391280, 249050, 280180] 386923 [310950, 33730, 269490] 173164 [248610, 8850, 252410, 239030, 322500] 288726 [386360, 230410, 211820, 233720] 389098 [7670, 107100, 17330, 98400] 260485 [221100, 268500, 8930, 301520, 242920] 143945 [391540, 313120, 495890] 176745 [366280, 263300, 428550, 237760] 313597 [252490, 299740, 21000, 360640, 223850] 296711 [201810, 57300, 385710, 203650] 125037 [203770, 440, 440, 8930, 230410] 521433 [326470, 489370] 295489 [440, 440, 377160, 9900, 22330] 317558 [107200, 20920, 50620, 284950, 224460] 100736 [48700, 376210, 622170] 341541 [49520, 220200, 252490, 311560, 249130] 13634 [102600, 336230, 350510, 340200, 105300] 248703 [33930, 282590, 219890] 315424 [208400, 13520, 21000, 315320, 225160] 204106 [377160, 346940, 464350, 285920, 286040] 337689 [107410, 359870, 3480, 233290] 268948 [250900, 500, 35460, 417860] 371198 [524220, 380600, 506610] 280916 [297350] 329839 [262060, 291550, 247020, 70, 34270] 364422 [200710, 32470, 204100, 511800] 312210 [228300, 225080] 142893 [271590, 319630, 207140] 412310 [3590, 24790, 39660, 38420, 50300] 35921 [107410, 233450, 492340] 485884 [219640, 40800, 249990, 252410, 22200] 309995 [252490, 377160, 49520, 367520, 387990] 213950 [271590, 8190, 240, 403640, 439340] 206761 [339340, 48000] 406297 [524220, 367500, 267490] 515093 [4500, 407900] 294482 [335300, 35450, 620, 221380, 333930] 30141 [271590, 219640, 202970] 16687 [271590, 233130] 76394 [377160, 334230, 489520, 507010] 193751 [582550, 408440, 253390, 266110, 420880] 60147 [205100, 204100, 115320, 208200, 233270] 321852 [435150, 212500, 221020, 203630, 253920] 199931 
[214950, 8190, 292140, 257120, 32500] 215533 [268500, 113200, 201810, 67370, 346250] 153434 [225540, 212480, 208200, 303210] 99855 [24980] 293240 [212680, 234330, 283450, 556280, 257970] 442162 [403640, 249050, 71000, 248510, 202970] 182881 [304390, 366040, 441870] 136160 [274170, 329430, 285330] 74020 [221100, 233130, 58550] 17404 [252490, 80, 515040, 332800] 359822 [4500, 274170, 219640, 235460, 17410] 381516 [459820, 42220, 255300] 474883 [282070, 6000, 318430] 50723 [298110, 250760, 645090] 170078 [212680, 273350] 358198 [221040, 233450, 271820, 204360, 391730] 340183 [438640, 3830] 300227 [24960, 491330, 394280, 55020, 284950] 367462 [252490, 524220, 403640, 57690, 282070] 212408 [4700, 227300, 105450, 612880, 300080] 111601 [49520, 344760, 204300, 105450] 141296 [227600, 387860] 477478 [588430, 464530, 581820, 621530, 562590] 252219 [48220, 201810, 208090] 351878 [383980, 304240] 284090 [367520, 304390, 301520, 305620, 221100] 141401 [232090, 115100] 314388 [4000, 394230, 282800, 431240, 17080] 468588 [220200, 274920, 242820] 13068 [385730, 320400, 388050, 249630, 294140] 50497 [346110, 107410, 301520, 41500, 8190] 239160 [374320, 221380, 221380, 306130, 213850] 303947 [212200, 326840] 41369 [305620, 391540, 413150, 50130, 221040] 70665 [268910, 245170] 29697 [385360, 3130] 145691 [8850, 249680, 243120, 221910] 86927 [274500, 252430, 295690] 98357 [239350, 204360] 249003 [220240, 17500, 227940, 107100, 107100] 59821 [22330, 337000, 238320] 145073 [212680] 196617 [289130, 393380, 248310, 314660] 96250 [8930, 208580, 403640, 404410, 210770] 277130 [4000, 21690, 462770, 6060, 10220] 315208 [207570, 11240, 24840, 234190] 403868 [252490, 319630, 533950, 718090] 68717 [245620, 227940, 360430, 336010, 504810] 503745 [230230, 333430, 627690, 391270] 64985 [239140, 224600, 554620, 554620, 209080] 37168 [427520, 240760, 242680, 209080, 6000] 227527 [200510, 200510, 55230, 214790] 144969 [45760, 203140, 55140, 205230, 48000] 364567 [293900] 299807 [24010, 8190, 333950, 2590] 251569 [435150, 236430, 50620, 12220, 104900] 518426 [426690, 290730, 337070, 266490] 337434 [204100, 319630, 255390, 45740, 405640] 44598 [4000, 219990, 221380, 233450, 641990] 22720 [211820, 45760, 221380, 431960, 50620] 228031 [440, 4000, 252490, 344760, 239030] 123120 [240, 206410, 731770, 657050, 410890] 423414 [19500, 230150] 45745 [248860, 282140] 318320 [221100, 220, 434000, 216910] 460586 [20540, 238090] 44724 [374320, 204100, 229890, 391720] 105166 [107410, 271590, 241540, 233450, 550] 178635 [440, 236430, 49520, 8980, 290300] 7217 [253840, 340000, 295250, 374030] 349767 [373420, 381780, 251990, 6030] 221953 [242920, 3720] 491369 [356190, 403640, 235540, 368500, 486490] 176248 [440, 240, 298240] 268091 [339800, 219780, 7670, 12110, 203290] 335629 [657630, 513780, 553640] 184774 [113200, 363970, 620, 62100, 625370] 255187 [459100, 327680] 435495 [434620, 331570, 388050, 491330] 369671 [8930, 211820, 301520, 201810, 6060] 383160 [220200, 270880, 292120, 582660, 204360] 536338 [35450, 41500, 20500] 233654 [311210, 232090, 4720, 399120] 409876 [17390, 409510] 112735 [241600, 333640, 40800, 428750] 80857 [266110, 226100, 342860, 221180, 280720] 98004 [49520, 40800, 72200] 285785 [48700, 367520, 418460, 372000, 302270] 336400 [440, 49520, 614570, 357340, 383870] 41496 [240, 362930] 472641 [113200, 253710] 355657 [38600, 582660, 220, 414740, 41060] 183558 [241600, 204560, 487350, 12710, 233370] 478879 [304930, 281610, 40990, 256190, 398710] 276490 [435150, 346110, 304050, 242680, 326460] 527709 [263380, 237740] 110436 
[252490, 346900, 376570] 156926 [362890, 427270] 14777 [6420, 219890] 80117 [730, 6220, 329490] 58712 [33930, 241600, 110800, 273350, 204360] 258479 [221100, 262060, 34010, 92000] 159578 [261760, 645790, 297100, 365260] 493330 [233130, 277470, 35310, 339460, 599100] 458453 [341000, 351990, 274190, 275200, 455400] 460966 [332710, 284180, 284180, 374830, 269590] 320965 [230410] 446314 [253410, 248490] 102754 [244160, 413410, 249130, 247910, 317410] 419156 [250900, 374320, 333930, 242860, 355840] 178609 [242050, 282140, 363440, 63380, 308420] 486062 [241600, 377360, 274940, 397550, 266130] 281277 [250900, 377160, 107410, 268500, 241600] 263380 [57700, 215670, 111800] 283068 [346110, 304050, 35450, 9930, 235540] 125788 [329430, 355520, 205840, 343710] 326626 [362490, 219640, 225640] 185641 [323470, 17460, 48720, 233130, 2200] 252446 [304650] 33000 [270450, 299360, 15100, 431240, 8600] 209294 [221100, 3260] 333166 [71340] 310959 [319630] 71862 [113200, 268910, 461560] 186257 [241540, 7600, 203290] 148158 [418040, 551080, 470450, 399430, 412600] 396390 [234650, 214560, 425210, 355760] 318362 [107410, 282070, 35140, 322500, 349700] 458944 [2820, 215160, 236090, 242780, 236930] 489693 [620, 26800] 116885 [374320, 402570, 50620, 376570] 25669 [207610] 508518 [304050, 304050, 17390, 248570] 495782 [200710, 213670, 65930, 410850, 455910] 265202 [346110, 339350, 247020] 438214 [571870, 333260, 320630, 252130, 461840] 322641 [252490, 301520, 212500, 305620, 428690] 177725 [208580, 208580, 214490, 290340, 200210] 309164 [4000, 306130, 232090, 394510, 204300] 10612 [281990, 337000, 219640, 215790, 261980] 526044 [287450, 339800, 241600, 557400, 304050] 363004 [8930, 294100, 49520, 278080, 260210] 138310 [271590, 49520, 304050, 363970, 335300] 311676 [377160, 246620, 12200, 361380] 347280 [367500] 483503 [632070] 162848 [252490, 304930, 113200, 113200, 9480] 518357 [113400, 201790] 328112 [240] 258261 [268500, 334230, 333930, 238460, 218680] 110879 [220200, 252490, 17390, 620, 242860] 258866 [289070, 200210, 212680] 97474 [220200, 211820, 252490, 289130, 553260] 136695 [48700, 8930, 49520, 49520, 211820] 279912 [306130, 200710, 504370, 253230, 12100] 258647 [252490, 346110, 365590, 448510, 495280] 184611 [232090, 242680, 215830, 552100, 402040] 315312 [262940, 40800, 287390] 144064 [287220, 265750, 416130, 356670] 41016 [216150, 261030, 322500, 416130] 277032 [201810, 50130, 253710] 366440 [377160, 394360, 440, 543260, 313120] 429984 [211820, 386940, 23490, 270550, 205950] 324395 [304050, 204300, 298180] 288002 [252490, 252490, 220200, 294100, 304050] 315162 [440, 214950, 234710, 209160] 19009 [330840, 386070] 387592 [335670, 274190, 252550, 288020, 9010] 31242 [224600, 469920, 264240] 52240 [220200, 345350, 238010, 253250, 252490] 288914 [209000, 620] 465642 [6060, 233270, 314230, 226740, 288160] 305819 [367520, 359870, 246760, 271240, 619280] 76120 [304390, 489830, 300, 253710, 335330] 212370 [298110, 313340] 378311 [282070, 757330, 513590, 269310, 515690] 303692 [281990, 91200, 569480] 233281 [435150, 274190, 479020, 237930, 464830] 99531 [400790, 581650, 520860, 368590, 37400] 182617 [252490, 6020, 227940, 588430] 408845 [319630, 238090, 301640, 63380] 402372 [248860, 291050] 95905 [252490, 323370, 207140, 427270] 414227 [257510, 434210, 342520, 339690, 419080] 256575 [8930, 597220, 326160] 47158 [374320, 374320, 377160, 252490, 8930] 275589 [107410, 35450, 447020, 8850] 339612 [344760, 230270, 63380, 312210] 107922 [55230, 376210, 331500, 208090] 221156 [377160, 239350, 284460, 266430] 16775 [356190, 
[... output truncated: a very long printed dump of repeated `<number> [<number>, <number>, ...]` pairs, cut off mid-list ...]
57300] 201871 [245490, 385770, 321840, 418110, 265120] 76020 [367500, 17080, 224540, 512900, 300380] 56921 [49520, 244090, 317360] 504949 [221260, 250260] 501497 [213670, 384190, 469820] 379796 [47890, 339800, 385730, 402840, 324760] 173067 [541670, 259980, 400370, 8260, 271860] 374313 [219640, 221040, 235540, 236150, 238210] 449488 [20510, 224920] 21745 [391540, 221540, 304050, 7670, 320] 223055 [211820, 3590, 248610] 348091 [231430, 251060, 394380, 250580] 42954 [49520, 278360] 75571 [489830, 225640, 107100, 95400] 222214 [265610, 395470, 115110, 347560, 40700] 78480 [58610, 400170, 400170] 323019 [104900] 355988 [265610, 329970, 381260] 167926 [524220, 485490, 232430] 73014 [277430] 221709 [359870, 285900, 304460, 310080, 8400] 268394 [233720, 248650, 423880] 204578 [241930] 375167 [219640, 588430, 214490, 249050, 314410] 476345 [15320] 247908 [49520, 214560, 643270] 468693 [48700, 38700] 63199 [252490, 582660, 48720, 207610, 246920] 35009 [24980, 239820, 70, 250640, 2290] 295841 [730, 4000, 227940, 304030, 444090] 388620 [385760, 620, 237110, 10180] 430638 [431730, 400160, 243160] 102452 [440, 221100, 301520, 228280, 591370] 175580 [341440, 98500] 279240 [290140, 278360, 346250] 156541 [433850, 433850, 268910, 232910] 20976 [22330, 241930, 241930, 108800, 99700] 164147 [4000, 301520, 291410, 220, 201810] 146171 [620, 209230] 59627 [60700, 489900, 385150, 216290] 90650 [346110, 239140, 274190, 261030, 107100] 385048 [48700, 218230, 35450, 402570, 19900] 222096 [313160, 70, 253710] 288793 [208480, 40300, 249230] 322797 [730, 440, 221100, 620, 360430] 339991 [620, 218680, 220780] 533061 [228300] 39161 [440, 278360, 250260] 434455 [393380, 242760, 65930] 313344 [294100, 386180] 82679 [307690, 6850, 274940, 203290, 6000] 260579 [350810, 443460, 325520, 544190, 46000] 319817 [250900, 208480, 6060] 473716 [377160, 65800, 331470, 242860, 200210] 52618 [440, 386940, 403640, 418340, 28000] 104980 [201810, 247910, 440730, 286040] 57102 [346110, 393380, 462770, 644930, 2300] 533363 [243470, 354640, 312280] 466029 [221910, 295690, 425580] 370080 [274170, 266510, 221910, 359050] 68362 [440, 377160, 22330, 47800, 360740] 173785 [204880, 23400, 410210, 277490] 24337 [49520, 311310, 387990, 32440, 376870] 293838 [250340, 404770] 49923 [4000, 234140, 257730, 224260, 468250] 516385 [200510, 367520] 212702 [113200, 257730, 427810, 476600, 286100] 73876 [39120, 258970, 31280] 418768 [300, 35130] 294636 [304390] 9334 [108710, 10680, 107100, 344910, 207320] 324723 [373420, 257510, 292120, 108600, 35720] 234292 [366890, 31280] 220029 [208650, 521330] 75133 [211820, 227300, 330840, 464350] 2630 [386940, 368800, 429950, 342580, 24740] 499879 [65800, 207170, 27330, 274900] 236268 [211580, 35460, 233720, 407840, 337950] 276248 [346630, 399660, 378270, 36000, 562220] 363206 [2870, 310380, 340170, 657200] 474110 [202170, 32800, 57300] 225041 [418180] 235196 [304650, 371660, 417860] 203991 [214950, 440, 440, 204300, 436150] 261803 [440, 211820, 216150, 270880, 362890] 293717 [440, 220240, 219640] 90850 [9450, 287390, 264000, 63200] 314144 [10090, 207610, 300380, 104900] 256644 [270880, 368070, 3800] 485524 [271590, 339800] 515051 [588430, 501590, 391780, 380840, 317360] 482492 [290340, 58570] 308281 [204100, 290490, 514920, 50300] 347731 [108710, 212480, 6980] 248291 [394230, 526250, 268910] 433944 [8930] 282065 [391540, 250340] 375527 [107410, 237110, 452220, 402880, 405640] 306752 [204360, 31280, 102700] 248076 [410430, 340270, 353360] 321244 [219990, 291650, 200710, 237990, 202200] 286051 [346110, 262060, 22330, 
208200, 341940] 113738 [356040, 327390] 213791 [230410, 221100, 262280, 211740, 306660] 369618 [282900, 418340, 7510, 204450, 251990] 44554 [47790, 427730] 284740 [433850, 232090, 238870, 289340, 296970] 137336 [221380, 35140] 119880 [445190, 419480, 339580] 349422 [233470, 232430] 145760 [271590, 233450, 433850, 230230, 235540] 162653 [115110, 247910, 104900] 115601 [360640, 253900] 357864 [252490, 49520, 7670] 161994 [310950] 427581 [288840, 288930, 434260] 68635 [374320, 221100, 294860, 213670, 257970] 82553 [35450] 169248 [290080, 400240] 21897 [49520, 104900] 384493 [218820] 218022 [296470, 286100] 57133 [367450, 70, 268910, 286570] 392299 [24780, 273500, 19200, 269490, 267360] 86756 [289650, 367580] 73512 [224540, 247910] 463256 [389730, 208650, 412880, 441870, 292410] 431281 [200710, 3590, 115800] 76084 [18500, 330180, 45500, 375820, 367580] 13761 [237930] 54063 [221100, 242920, 17410] 390725 [366440, 241720] 159743 [214950, 242920, 237990, 440, 313340] 128943 [4000, 268650, 12810, 218820, 436150] 316757 [228400] 178154 [568570, 361420, 245370, 427730] 2379 [407510, 462770, 286260, 351340] 192738 [200210, 218680, 261180] 494187 [203770, 4700, 341000, 351080] 277455 [32370, 592340, 11330, 217140] 127807 [306130, 20920, 554620, 206210] 398660 [291650, 94400] 209278 [202170, 227940, 336670] 144402 [207080] 145338 [40800, 262410, 314020] 20216 [252490, 24960, 57300] 365342 [386360, 268910, 383870, 225080] 169338 [264140] 335624 [108600, 403640, 42700, 207610, 444000] 315481 [219910] 86957 [359550, 227940, 272510, 220160, 302830] 197267 [242050, 105000, 200370] 167947 [212680, 430960, 332310] 353760 [238750, 307640, 48720, 4850, 217790] 38837 [214950, 252490, 8980, 572830, 247660] 459239 [212680, 15400, 35720, 26500] 145028 [212200, 432010, 322210] 525500 [203160, 218640, 378100, 263060] 373404 [55230, 4580] 439077 [24200, 291550, 241930] 429631 [238750, 239820, 241600, 421020, 422970] 223637 [8930, 346110, 391540, 9930, 242860] 135445 [334230, 393420, 221910] 238357 [290340, 364190, 345640, 337420] 16267 [268500, 268500, 292120, 391540, 379720] 378601 [4000, 227940, 280, 221910] 416400 [294100, 268500, 319630, 22370, 67370] 281249 [211420, 250760] 28244 [252490, 263280, 298050, 418370, 232450] 296231 [335300, 460790, 238210] 278339 [220, 220860] 237855 [391540, 316750] 433582 [310510, 381120] 526307 [296300, 533300] 405440 [46500, 57000, 335820] 57787 [298110, 50130, 57900] 126443 [314520, 91600] 71150 [9450, 200710, 41700, 228280, 50] 215986 [333930, 612880, 208090, 248350, 253710] 49574 [220240, 239070, 245170] 338214 [49520, 271590, 373180] 487194 [538990] 380198 [203770, 204880, 245170, 282140] 182648 [440, 685310, 219890, 302380] 2455 [241720, 206190] 372843 [382420, 248570] 173854 [298110, 220, 9480, 253370] 172880 [238090] 500059 [221380, 78000, 28000] 538952 [221100, 17510] 15002 [475550] 196996 [223830, 241540, 18500, 202750] 364933 [230410, 220200, 49520, 620, 282800] 376214 [55230, 105450] 178817 [391540, 238460, 354200] 156934 [24810, 429470] 174276 [238010, 385770, 453340, 440880, 50300] 161474 [49520, 231430, 312520, 239660, 204030] 475201 [271240, 237930, 499520] 127881 [15620, 564710] 256242 [252490, 19800, 209160] 2671 [339610, 279480, 10130] 201114 [239350, 313120, 314660] 450248 [34010, 256460] 29538 [203770, 271590, 24200, 444090, 4760] 209804 [427520, 245170, 322300] 225761 [390680] 227509 [344760, 55230, 304390] 382391 [207140, 21690, 18020] 371566 [50300, 70660, 263060] 138082 [55230, 319050, 282140, 475550, 303210] 255988 [8930, 3730, 225260] 5032 [10180, 636480, 
391720, 398710] 216062 [440900] 257548 [220240, 313160, 256330, 34190, 286040] 18140 [440, 374320, 409720, 320760] 470821 [374320, 387990, 261570, 370170, 315060] 237270 [271590, 360830, 227260, 423230] 458940 [257510, 234650, 215160, 233370, 492630] 69268 [385760, 230230, 291550, 339800, 360430] 320631 [8190, 242960, 202530] 281118 [284160, 108600, 206420, 219150, 293520] 130839 [113200] 252322 [252490, 433850, 297130] 220192 [211420, 302830] 377782 [252490, 386360, 620, 15620, 204360] 236074 [373420, 289070] 253562 [304390] 470088 [504370, 310380] 311983 [252490, 340] 96652 [107410, 321360, 236690, 333930, 249870] 10877 [400250] 435682 [393240, 384940, 315080, 354050, 363940] 238621 [500, 222480, 17470] 107029 [213610, 297120, 71250, 232770] 406646 [377160, 291650, 4000, 249650, 282800] 532849 [225540, 220, 270570] 205318 [243470, 248820, 107100] 85094 [240, 33930, 438740, 9900, 42700] 221958 [245620, 554620, 251990] 198735 [214770] 222897 [346110, 250760, 17300] 261846 [271590, 231430, 265300, 224540, 438270] 126101 [210770, 243780] 26390 [240, 330830, 340] 452388 [48720] 212075 [209000, 237850, 115800] 10191 [219990, 350640, 509220] 329462 [261640, 49600, 277430] 414052 [300550, 272270, 257850] 252744 [433850, 206420, 339800, 322500] 331930 [301520, 254700, 24670] 329826 [268500, 8190, 233720, 214420] 28788 [433850, 255710, 275850] 275586 [10090, 257510, 39120, 235540, 345440] 451828 [377160, 252490, 232090, 48190] 224869 [223000, 45450, 281370, 22000] 279526 [248570, 263360, 234190] 2887 [301520, 219830, 35720, 252110, 257750] 365064 [230410, 230410, 233740] 222568 [38410, 244710] 197412 [200510, 240, 518790, 237930, 399120] 473770 [4000, 440, 440, 245170, 236090] 209568 [107410, 391540, 307290] 144250 [236870] 91946 [9900, 403640, 220440, 7670, 269490] 168735 [8980, 393380, 273350, 238460, 444640] 127037 [211820, 4000, 215160, 532110, 454950] 218248 [212680, 31500, 45100] 264663 [413150, 493340, 485310] 452952 [206420, 80310, 335330] 184437 [42990, 204630, 220780] 39866 [21660, 26800, 15700, 32400] 88382 [241540, 110800, 214340, 312990] 17414 [25980] 502126 [248820, 371970, 345860, 375820, 6000] 102099 [221040, 247660, 238320] 226141 [394690, 644930, 48000] 386120 [359550, 221100, 105450, 32470, 620] 286254 [371660, 233270, 41050] 248122 [230410, 304930, 301520, 204360, 235540] 241334 [290300, 262280, 248570] 52555 [282070, 220700, 313120] 70515 [212500, 235540, 364470, 431960] 538987 [273350, 244630] 455744 [376300] 470760 [257690, 425580] 234703 [289650, 242920, 620] 492515 [233270] 373270 [228300, 115320, 371670, 11140, 275490] 397596 [211420, 20920, 247730, 385330, 104900] 211064 [216890, 370240, 241600, 287290, 495480] 243301 [208650, 247660, 323720, 233130] 70134 [20820] 327034 [231430, 276810] 137949 [391540, 219640, 204100, 620, 386070] 89 [4920, 287290, 434570] 282971 [262060, 287450, 248820] 72170 [551080, 378810, 284890] 492473 [364640, 294370, 305050, 252410, 239430] 270361 [259680, 10680] 223772 [473770, 219150, 357900] 136654 [214560, 518790] 247674 [516840, 22180, 345650] 272051 [24980, 42700, 7670, 17330, 108800] 449190 [440, 377160, 201810, 201810] 113591 [391540, 95700] 79505 [440, 440, 227300, 239160, 7670] 151088 [227300, 70400] 516285 [264000] 346736 [304050, 249050, 298610, 608110] 280352 [440, 273350, 286040, 233270] 190394 [250900, 204360] 268696 [236690] 479333 [357720, 238930, 290260] 278153 [220240, 280160, 224580] 244800 [353270, 524580, 626680, 474750] 172582 [221100, 582160, 248820, 50130, 304930] 334669 [22320, 300, 12200, 320, 70] 143615 [371200, 
233230, 233270] 93023 [57900, 451020, 214590, 243120] 203427 [464760, 438030, 453980, 92800, 370090] 24651 [49520, 49520, 220, 99300, 220780] 165648 [294100, 440, 241540, 612880, 246960] 4797 [24200, 356540, 283390, 264240, 298520] 120665 [227300, 293220, 322110] 334944 [220440, 348020, 17330, 464060, 261110] 355140 [495780, 338170, 402020] 141182 [12110, 91600, 225160, 380] 236432 [394360, 48700, 48700, 373420, 306350] 347164 [4700, 220, 2870] 259143 [285900, 222520, 282560] 143890 [252490, 8930, 17460, 272510] 224656 [243970, 63380] 95688 [49520, 201790, 201810, 346420, 413170] 111337 [427520, 252490, 4700, 206480, 383120] 192774 [262060, 232050, 363960, 239030] 375068 [391040, 242720, 209160] 366419 [230230, 306020, 440, 249230, 240340] 394190 [212680, 620, 391540, 65300, 1520] 469266 [97330, 63960, 264540, 236450] 254192 [282070, 245170, 258050, 210770, 221640] 371642 [614570, 268540, 367580] 247812 [252490, 225080] 344210 [428440, 330620, 404100, 528610, 299460] 342203 [314160, 214490, 641860] 347997 [279800, 398850, 499440] 144166 [281990, 223830, 391540, 337320, 35450] 141672 [271590, 239140, 317400] 419084 [266170, 267220, 259720, 269230, 259830] 66196 [227180] 322928 [50300, 312990] 517538 [298110, 316390] 171399 [307690, 63380] 30796 [233110, 390890, 355530] 95973 [4000, 281990, 268500, 24980, 221100] 358910 [548650, 562090] 271594 [214910, 384190] 21010 [440, 8980, 290530] 35219 [346110, 291550, 252610, 345820] 12967 [293180, 212800, 65790] 493768 [412830, 386700] 3987 [15270, 202970] 320853 [208140, 620, 504370, 291550] 386275 [205100, 404410, 475550] 111651 [304390, 278080, 268420] 90960 [460810, 355760] 25562 [271590, 107410, 212480, 243970, 381780] 455793 [298110, 243470, 204450, 302510] 261577 [18400, 24420, 301750] 4971 [33930, 239140, 243470, 12220, 97100] 193010 [113200, 311480] 212161 [281990, 291550, 409100, 439700, 207140] 287491 [289130, 274310] 214541 [386360, 24420, 317820, 422970] 96848 [271590, 202170, 9050, 233270, 10180] 184368 [230410, 8190, 350810, 209080, 285480] 21375 [263340, 284460] 97023 [345350, 15320, 17080] 449963 [340270, 272060] 511133 [353270, 449300, 25850, 382640, 286790] 226237 [289130, 263060, 297130] 63919 [435150, 261030, 342970] 86299 [433850, 442080, 273350, 319510] 150233 [6860, 33320, 67370, 9070, 340] 164731 [289130, 223830, 437890] 250866 [237110, 12100, 335000] 534577 [302610] 175966 [48700, 675010] 287071 [220820] 147488 [440, 241930, 15620] 434952 [298260] 208721 [281990, 253230, 246620, 15130, 284240] 282260 [48700, 4760] 41597 [64000, 9420, 242760, 287980, 342620] 395637 [306130, 306020, 243470, 202530] 516495 [252490, 440] 316046 [107100, 304460, 332310] 25233 [240760, 318020, 9460, 15300] 470876 [405310, 380840, 248550] 160983 [4500, 316160, 33610] 413633 [374320, 236430, 252490, 113200, 22300] 64644 [386360, 48220, 239820] 106922 [237990, 315810, 356520] 221361 [282070, 346010, 213330, 21100] 502566 [45760, 238010, 220740, 372360] 497850 [3483] 331844 [410820, 203750] 191094 [310380, 422130, 204530, 393530, 275670] 274655 [49520, 261640, 211070] 271132 [225020, 369580] 443738 [239350, 335240, 10180] 189689 [211420, 200710, 453480, 282070, 50620] 139525 [291550, 313120, 265300, 227940] 427672 [221380, 242920, 78000, 231140] 125324 [202970, 12140] 295152 [356190, 393380, 333930, 47410, 55000] 434969 [102500, 201790, 214510, 22200] 150254 [251730, 540100, 268910, 356520] 331320 [206420, 210770, 250260] 223255 [262060, 214790] 72803 [225540, 274190, 299360, 221180] 40229 [271590, 330830, 301120, 6060, 327890] 484355 [113200, 251630] 
204399 [268500, 291550, 443810, 689090] 173235 [377160, 344760] 325375 [245620, 295790] 208234 [213670, 265550, 245730, 207610, 115800] 229479 [304930, 113200, 440, 440] 383513 [17740, 348700, 91200, 45500] 75050 [403640, 20920, 238240] 468999 [588430, 453750, 423890] 220132 [287390, 316610, 489140, 353560] 254673 [440, 49520, 304390, 251170, 45500] 187868 [252490, 50130, 255220, 234490, 637100] 163011 [235540, 228300, 273110] 415319 [305620, 201810, 420290, 224580] 126682 [6860, 13570, 327060] 17371 [248820, 221260] 73988 [344500, 342230, 70660] 351609 [57900, 342870, 281840] 259388 [218680, 357380] 92637 [346010, 42910] 221207 [4000, 339800, 221260, 253710] 517038 [200210, 219890] 198510 [269210, 416110, 435790, 275670, 91100] 274285 [349680, 351140, 327640, 329460] 417590 [391540, 346940, 274190] 254808 [440, 363970, 47790] 383145 [370240, 220700, 572810, 584400] 314060 [4760, 205730] 422271 [49520, 17330] 284801 [314410, 274190] 249692 [252490, 237110] 169152 [213610, 65070] 268719 [377160, 313160, 312530, 545650, 70300] 67644 [245450, 409720] 237983 [301520, 239820, 385350] 312514 [9200, 323460, 224860, 108800] 270067 [255420, 391720, 356670] 18077 [444090, 269210, 353330, 39190, 220] 29591 [407300, 466240] 283637 [346110, 389730, 331600, 428690, 357190] 26743 [310080, 489520, 405640] 89751 [20510, 346900, 12130, 70300] 417259 [248820, 207140] 240853 [224540] 89861 [281990, 241930, 345610, 402020] 80000 [23310, 214610, 418070] 336540 [304390, 496620, 248170, 340170, 17410] 191157 [440, 240, 113200] 163902 [407020, 588970] 97584 [203770, 230410, 227940, 403640, 209000] 392300 [205100, 201210] 88684 [389730, 402570, 393420] 261410 [252490, 227080, 302590] 288637 [253840, 21100, 288160] 437403 [46550, 70300, 354650] 436501 [587290, 430960] 140197 [252490, 63380] 29597 [413150, 35140, 319630] 130942 [230410, 220, 275530, 2320, 298520] 305008 [200510, 520440, 538680] 116300 [208400, 19030, 207320] 233430 [113200, 207170] 117109 [107410, 582660, 235540, 268910, 356670] 297648 [440, 221100, 107410, 271590, 49520] 441241 [362890, 282030, 215470, 243120] 52711 [620, 236150] 28214 [446390, 233270, 312990] 252598 [105450, 304930, 360430, 330840, 550240] 97624 [251430] 445964 [3260, 33600, 266510, 211400] 485926 [18500, 288160] 394693 [212680] 383000 [231740, 269810] 232806 [113200, 352050, 248290, 369580] 145046 [440, 271590, 210770, 225080] 497343 [282590, 265000, 262690, 424280, 270550] 248084 [391260, 370510, 301200, 204180] 181850 [316010, 355050] 70566 [47810, 35700] 140043 [4760, 537800, 211400] 165695 [367500, 2400, 207230] 356885 [35140, 399120, 409710, 250580, 214700] 312113 [252490, 377160, 294100, 289070, 17390] 157130 [200510, 500, 224260, 238210] 226018 [6060, 211400] 199924 [55230, 213670, 364050, 264340, 17080] 182232 [440, 271590, 361420, 339230, 206210] 116280 [667820, 258220, 293240, 380140, 366890] 314623 [384680, 276870, 415420] 203599 [49520, 435150, 42990] 203655 [730, 377160, 33930, 234140, 319630] 346897 [50130, 267600] 19004 [40800] 349556 [20920, 237990, 35140, 381780] 150634 [248820, 445190, 542310] 501334 [358250, 328760, 242720, 348550, 17330] 478011 [374320, 218410, 227080] 173436 [211820, 444090, 249050, 375950] 222571 [440, 300, 532110, 354200, 253710] 30286 [213670, 65930] 388124 [374320, 289070, 433850, 107100] 10316 [289130, 15620] 186780 [377160, 48700, 240760, 3590, 431960] 527486 [504490, 285070, 291010] 488123 [209080, 712770, 406210, 2290] 472212 [209950, 17470, 204240, 201480] 18937 [21690, 224540, 214830, 252330, 48000] 232164 [271590, 4000, 291480, 
206190] 129161 [230070, 362290, 252630] 39271 [447530, 327670, 306700, 353700] 119895 [4000, 230410, 470260, 206500, 72200] 524632 [433850, 232090, 219830, 99900] 320198 [220, 252670, 398710] 75162 [581640, 284830, 259550, 314020] 402779 [105450, 280930] 70868 [427810, 345390] 353754 [208200, 265300, 6000] 316027 [291550, 42910, 292630] 94723 [200510, 268050, 207610, 261030] 407496 [294860, 221380, 7670, 50620] 214953 [251170, 283230] 172971 [8930, 206440] 323685 [221380, 334230, 8190, 235360, 130] 195335 [620, 233150, 205910] 501076 [346110, 456670, 583890, 266510] 375709 [444090, 344440, 414740, 268750, 360840] 326097 [205190, 580950, 341730, 340520, 244710] 115086 [35450, 241560, 339800, 218060, 265670] 406456 [9200, 3730] 39978 [255710, 557040, 393150, 290790, 417860] 240360 [440, 438740, 32470] 383595 [107100, 407840, 547680] 469808 [22350] 391098 [326190, 204340, 301910, 303390] 44451 [207610, 3320] 67544 [444090, 261030] 239877 [8500, 41500, 287700, 673610, 394760] 382229 [248820, 331470, 349700] 464293 [236110, 259080] 129145 [252490, 363970, 464620, 356310] 92803 [440, 391540, 65980, 110800, 245170] 178235 [227100, 265300] 136679 [6060, 346010, 17410, 511800, 322500] 19593 [221100, 236870, 20920, 559910, 394310] 64504 [50130, 529180, 230410, 407840] 341046 [290080, 301520, 239840, 20920, 238010] 169703 [240, 233230] 204630 [4760, 282900, 291550, 206420] 355178 [433850, 22370, 319630, 208090] 416670 [271590, 284160, 318430] 505061 [564710, 651280] 296349 [230410, 270880, 248820, 420290, 208200] 198691 [374320, 8930, 337000, 219640] 52286 [359960] 364549 [230190, 209650] 374911 [241910, 7760] 255490 [341950] 36412 [19900, 282140, 224260, 206440] 150600 [260160] 450127 [208650, 234650, 40800, 395160, 323580] 86136 [282070, 11610, 37600] 291902 [300, 17480] 57176 [45760, 378540, 352520] 7390 [257510, 220] 144281 [356190, 440, 42910, 327690, 46200] 225772 [107410, 211820, 322500] 278365 [359550, 333930, 224260, 422970] 168151 [312990] 412215 [433850, 247730] 387726 [281990, 311260] 294975 [271590, 323370, 257510, 266510, 40700] 77485 [377160] 276173 [4000, 4000, 730, 227300, 221100] 175513 [306130, 268050, 21780] 214295 [384980] 413297 [252490, 213670] 123834 [227300, 22330, 219640, 317360, 391460] 228705 [65800, 480650, 6060] 405394 [51100] 359999 [48220, 342200, 10680, 23400] 44237 [620, 444770, 34920, 209520, 461840] 98300 [331470, 377710, 429940] 360366 [259680] 36734 [346110, 652980] 57929 [47780, 495890] 125992 [8930, 220200] 356050 [433950] 342677 [488790, 201810, 237850, 287390, 435530] 48635 [440, 8190, 201810, 246620, 358270] 257570 [209000, 50620, 202750] 357013 [440, 49520, 60] 262317 [435150, 211820] 45447 [292480, 361560, 99120, 455700, 323240] 67552 [280220, 351480] 235572 [227940, 268130, 580150] 274254 [360830, 582550, 268910] 273718 [377160, 238430, 224580] 177140 [620, 3590, 33550] 362552 [391490, 299360] 220425 [203630] 215704 [211820, 41700, 41700, 8190, 589290] 218351 [359550, 239820, 620] 176587 [239140, 377430] 124843 [231430, 391540, 318530, 418370, 285330] 258738 [240, 261030, 22300, 356520] 105790 [4700, 588430, 303390] 357308 [333930, 12120, 200210, 246090, 391460] 180611 [211340, 205830, 209390, 209690] 349759 [317400] 321873 [287450, 293500, 70000, 6920] 445905 [238010, 206420, 204450] 24051 [233130, 19030, 26900] 39061 [236430, 342200, 202170, 620, 247730] 444961 [49520, 2400, 360640, 225280, 233720] 241800 [47890, 295790] 340164 [427520, 24960] 224984 [63200, 505170, 380, 41300] 358504 [440, 440, 10090, 393380, 94590] 248767 [628490, 434420, 104900, 
449540] 131522 [49520, 311310, 242720] 299952 [236690, 322500] 203229 [227600, 10180, 398850, 283920, 224460] 135239 [239140, 289890, 225260] 522444 [213850, 7670, 340460] 76619 [25850, 208090, 91200, 49600] 464872 [216150, 339350, 452440, 402620, 388880] 14440 [20920, 242720, 31280] 124026 [206420, 576010] 169827 [8310, 18700, 367580] 183070 [346110, 304390, 330830, 319630, 286690] 341990 [4500, 219640, 70] 318061 [255710, 322500, 211500, 51100] 251073 [294100, 251870] 17852 [440, 239140, 65980, 293780] 255186 [233740, 405640] 189205 [440, 22180, 257750] 486672 [57690, 233130, 313120, 332800] 323136 [332070, 310080, 356670] 23628 [259080, 286040] 417169 [627670] 68423 [291650, 221100, 24980, 20920, 305620] 233806 [289650, 17740, 35420] 384253 [221380, 255710, 480650] 532382 [31280, 262410, 303390, 70000, 263980] 378341 [255710, 220240, 282070] 329429 [283880] 492548 [440, 7510, 301860] 121703 [384180, 630] 218609 [337070, 655300, 674310, 394890] 169532 [107410, 8980, 202170, 403640, 20920] 416924 [550, 212160] 283063 [55230, 35140] 116593 [234650, 350070, 506510, 215100] 69130 [4000, 346900, 261180, 405820] 49117 [282070, 242550, 307570, 22340, 214250] 49545 [589290, 326480] 377648 [433910, 719610, 675720, 409420, 33110] 128136 [20] 297835 [377160, 440, 252490, 8980] 110535 [620, 304930, 57690, 95400] 389748 [9450, 8850, 3480, 41800] 235242 [33610, 3730] 262866 [6860, 391540, 113200, 362890, 225080] 109741 [211420, 268910, 214790] 292024 [319630, 237310, 274620, 3700] 98527 [377160, 301520, 620, 383870] 196521 [211820, 113200, 39150, 92600, 47400] 121367 [241540, 367270, 254060, 355180] 212312 [220200, 337000, 527290] 412641 [204880, 48220, 402840, 418910, 302750] 324024 [328730, 284390] 439051 [220, 245170, 329460] 196387 [536930] 6362 [323720, 287860] 136357 [221100, 316010, 227300, 552990, 237930] 116585 [323370, 265550] 185710 [377160, 620, 24240] 458388 [203290] 172643 [78000, 351640] 176198 [346940, 249650, 387290, 308420] 252722 [45400, 115320, 202970, 349700] 35989 [108600, 247000, 235540] 40638 [427520, 252490, 261640, 339800, 268130] 224759 [220240, 466560] 37840 [351250, 442210, 523780] 157924 [290080, 350310, 8170] 38464 [238750, 528820] 303062 [394360, 8930, 241930, 372360, 206210] 78602 [346110, 442080, 277630, 274940, 354500] 145814 [48700, 252490, 200510, 55230, 210970] 240598 [232090, 57300, 319510] 77227 [35450, 408740, 70, 70, 444640] 455741 [435100, 492630, 254460] 314459 [281060, 348020, 237740, 209160, 215510] 88054 [35450, 601430, 375480, 341940, 429570] 204051 [296490, 376300, 298360, 400180, 328940] 237377 [518790, 254060, 11330, 48000] 119108 [373420, 65980, 364420, 236150] 112291 [635060] 160967 [227300, 227300, 260190, 223100] 212034 [211420, 355840] 75884 [394360, 352400, 324800, 376870, 431240] 38956 [283270, 203210, 27940, 4720, 219890] 105064 [219640] 279853 [252490, 335300, 305620, 372000, 422970] 286875 [48700, 48720, 237990, 6040] 174252 [49520, 208580, 335670] 237086 [459820, 215120] 42351 [211440, 322500] 288802 [206420, 282800, 35700, 264080, 509880] 159215 [209000, 289650, 329050, 543460, 21100] 315469 [374320, 363680, 368230] 256155 [394360, 377160, 4700, 10500, 242860] 335620 [484900] 249536 [211820, 268050, 459820, 434570] 453275 [209000, 465720, 35000, 432770, 208630] 224172 [239140, 248820, 248820, 268050] 100323 [240, 306130, 30] 225122 [45760, 201790, 44350, 238430] 263741 [264060, 231720] 180851 [306130, 12220, 226560, 225260, 384980] 119972 [585560, 259320, 214700] 239300 [240, 200010] 362750 [360430, 460810, 13540] 445192 [370220, 300340, 
323580, 365300] 378887 [440, 220, 208580, 35450, 326410] 144752 [346110, 204880, 310950, 206420, 204530] 136598 [268500, 249130, 24240, 3730, 323850] 408617 [463680, 694980, 487120] 16202 [394360, 15380, 42980, 9310, 11330] 342152 [49520, 396640] 166318 [55230, 201790, 107100, 404770, 401190] 251548 [241540, 50300, 237930] 100231 [205990, 228260, 420740, 209160] 209927 [294100, 344240, 299360, 333760, 411560] 159331 [251060, 204450] 429699 [440, 55230, 35450, 10180] 292757 [564710] 446009 [239160, 356500, 313340, 57300] 57259 [335300, 317840, 313130] 98076 [271590, 237990] 289212 [329110, 61100, 485890] 323090 [49520, 208650, 253530] 107473 [202170, 433910] 250446 [259680, 219640, 219640] 365052 [241540, 6060] 203614 [49520, 49520] 37823 [262940, 368370] 94126 [49520, 55230, 55100, 273110, 289090] 6545 [107100, 239430] 180836 [220240, 234080, 219640, 232430] 152106 [232050] 111025 [252490, 306130, 393380] 44504 [413710, 505120, 221260, 357070] 211451 [433850, 344760, 39140, 62100, 12100] 311293 [319630, 282800, 266310, 362890, 313340] 154507 [242050, 293220, 65930] 348284 [113400, 391540] 129415 [401690, 393610, 319830] 226395 [226860, 496300] 188220 [271590, 107410, 236870, 359400, 2810] 437148 [362490, 287840] 2826 [550580, 363600, 226960, 224820, 401190] 255955 [203160, 338000, 317360] 271035 [334230, 212630, 22370] 267117 [233450, 296910] 408029 [237990, 224540, 221910] 129925 [18820, 45500, 298520, 234390] 318772 [47890, 200510, 24200, 208580, 529180] 334244 [281990, 645790, 418250, 284830] 37020 [433850, 251130, 285920, 206210] 148398 [440, 304930, 35450, 620, 620] 243772 [302080] 55074 [300, 49600] 386365 [12110, 214970] 369789 [372800, 270550] 333277 [346940, 396790] 97012 [252490, 221100] 269825 [4570, 204630] 182955 [65800, 300570] 257478 [427520, 252490, 230410, 240760, 113400] 37363 [8190, 241600, 214870, 388490, 9180] 325000 [252490, 49520, 249950, 301910, 33900] 377709 [287290, 220, 371570, 461490, 225840] 69890 [48700, 203140, 113420] 193547 [237990, 355000, 401840, 261180, 302010] 454252 [433850, 394510, 323220, 637100, 738060] 162529 [257510, 268050, 723820, 232430, 365300] 132335 [2610, 407530, 233720] 372323 [252490, 233700, 333420, 285920, 345370] 241191 [268500, 249130, 7940, 12110, 57300] 99843 [557810, 239160, 350990] 138153 [257510] 427846 [307780, 2320, 71250] 374762 [230410, 203750, 382850, 416130] 453791 [234920] 426477 [496680, 205890] 124938 [214950, 202970] 35339 [245170, 260270, 335890, 274900, 264380] 190045 [394360, 284810, 233550, 204060] 383338 [221100, 433850, 433340, 253710] 94316 [221380, 289130] 207717 [221040, 261030, 220, 13200] 365718 [48700, 500, 251110, 40800, 587650] 513457 [437720, 687590, 441060] 481706 [21690, 440760, 508290, 398680, 356670] 65483 [252490, 319630, 219780] 245992 [233130, 280640, 207420, 428550, 211260] 319868 [39140, 430440, 296910] 294315 [252490, 102810, 202970] 210800 [305620, 57690, 460790, 310380, 280160] 303417 [268420, 415860] 478752 [319630, 116120, 280930, 453340, 251850] 241442 [221100, 227300, 200710, 256290, 216910] 231181 [392690, 403460, 398710] 232021 [8930, 302120, 21680] 138000 [367520, 351970, 359390, 674940, 417860] 79003 [23310, 10680, 385560] 31459 [220200, 23530] 455217 [206420, 427270, 10180] 297380 [271590, 377160, 252490, 208140, 24960] 442898 [221380, 299660, 253570, 330350, 206440] 477800 [240, 237110, 291480, 238320] 364427 [49520, 50130, 282070, 65930] 363818 [300550, 481510] 225101 [252490, 234140, 206420, 319630, 219640] 213890 [440, 221100, 4000, 311210, 582160] 55716 [231430, 304730, 212630, 
225420] 35294 [233450, 403190, 425070] 432835 [433850, 524220] 310336 [440, 230410, 17390, 24740, 388090] 338105 [220200, 233450, 220] 138224 [41700, 206420, 17460, 394230, 246620] 284370 [219990, 239140, 363970, 8980, 65980] 53324 [8930, 211820, 285160, 380, 63710] 190268 [226740, 384190] 185321 [335840, 17080, 436180, 399670, 209650] 328913 [363970, 391540, 221640] 53818 [359550, 220240, 12360] 97698 [322110] 114059 [540840] 510497 [427520, 65980, 237990, 207170] 212248 [113400, 80300, 387290, 434730, 220780] 202992 [39210, 384100, 386340, 383930] 195728 [284810, 102700, 16720, 281370] 302124 [12480, 204450, 22340] 170666 [8930, 440, 394690, 370240, 597700] 528067 [394230, 435530] 203665 [219640] 50170 [35450, 104900, 220780] 131136 [316010, 286690] 221188 [231430, 70300, 340] 263574 [48700, 233450, 238430] 394824 [281990, 294100, 233270] 67771 [359550] 195979 [620, 91200, 257630] 266726 [688260, 754620, 420740] 274851 [392970, 314020] 293379 [463850, 584400] 412860 [440, 346110, 232090, 200710, 253710] 55151 [98200, 286080] 322191 [236430, 346110, 8980, 9480, 313340] 486525 [213670, 21690] 77283 [228400, 423880] 192686 [440, 366090, 70600, 652980] 90192 [377160, 440, 24960, 409720] 232739 [113200, 17300, 22120, 24740, 400] 71166 [377160, 346110, 227300, 107410, 24960] 318904 [241540, 301520, 391540, 235600, 22350] 228263 [211820, 407810, 319510] 319085 [268050, 346490, 514900, 431650] 334033 [270880, 105450, 92000] 306627 [211820, 359550, 316010, 323470, 239140] 326517 [16720, 48000] 432568 [200910, 211340, 92800, 33650, 253750] 455256 [247020, 284870] 164912 [40800, 63380, 238090] 270975 [55230, 17520, 2400] 226304 [236430, 435150] 425698 [620, 319630, 22320, 498580, 202970] 274238 [65800, 42960, 282070, 265300] 149205 [440, 274170] 239554 [214490, 307670, 342310, 464060] 88162 [252490, 377160, 620, 237930, 245280] 22311 [218230, 431240, 498240] 357716 [228300, 94620, 689400, 286260] 181186 [220200, 440, 204360, 269490] 116839 [319630, 237990, 392820, 301670, 359250] 276241 [327690, 317360] 171598 [108710, 200210, 464470, 245170, 665360] 366388 [377160, 319630, 537110, 57300] 296905 [225540, 219640] 472354 [291090, 489580, 331440, 2310, 298260] 373907 [356570, 550310] 146344 [214950, 212200, 347830, 520440, 448510] 355525 [49520, 394690, 460920, 246620, 107100] 385021 [500, 45700] 15502 [24980, 500, 22320, 57640, 403510] 424068 [223100] 266348 [554620] 470686 [113200, 107100, 39640] 195804 [8800, 361420, 259080, 282440, 302510] 247442 [271240, 286690, 276440, 33460, 417860] 292569 [440, 252490, 304390, 113200, 273350] 463550 [241540, 339800, 256330] 463226 [394230, 327510, 443530, 364710, 33120] 239861 [346110, 379720, 22300, 301640, 31280] 132286 [440, 227300, 225540, 268050, 233210] 426619 [206420, 13210, 98400, 201510, 380] 279325 [227300, 305620, 410710, 235360, 562160] 369516 [200710, 243780] 267957 [440, 10500, 235600, 404410] 30877 [17080, 203750, 417860] 271907 [440, 4000, 417860] 161047 [204560, 250030, 244590, 277490, 237760] 293369 [252490, 440, 377160, 236430, 311210] 328529 [252490, 252490, 323470, 35450, 236110] 167416 [319630, 361280, 261030] 314976 [48700, 255710, 262060, 39120, 6060] 313557 [209670, 283230] 48570 [236430, 222880] 394358 [4920, 200710, 353360] 127429 [271640, 373810, 304170, 409360] 169560 [294100, 208650, 273350] 278494 [230410, 433850, 6060] 337858 [219150, 244930] 372962 [433850, 265590, 210770] 377008 [220200, 428750] 241398 [337000, 403640, 612880, 612880] 383308 [348020] 237564 [78000, 310080] 152787 [4000, 49520, 239140, 282070] 339660 [304030, 
293780] 462054 [200510, 32470, 268400, 369990, 234650] 400895 [304930, 35720, 422810] 364465 [209000, 247730, 512250] 14336 [374320, 15620, 35720, 95700] 288895 [359550, 473740] 295418 [33930, 234140, 500, 12100, 107100] 523453 [33930, 438740, 22330, 12480] 95186 [305620, 329430, 31280] 154124 [263620] 200612 [49520, 460120, 227940, 7520, 260230] 527460 [275850, 332200, 514290] 310406 [330840] 228065 [261640, 200710, 262960] 112526 [610080, 282400, 352220] 421862 [203770, 49520, 214190] 508334 [257510, 3310, 27020, 243450, 72200] 15428 [39140, 42910, 233290] 277247 [113200, 201810, 237740] 145862 [281990, 319630] 200281 [377160, 240, 248820, 238260, 219890] 362898 [406120, 336090, 292200, 391170, 288160] 235574 [346330, 282440] 245550 [299030, 41050] 169831 [305620, 268650, 232090, 516510, 514660] 114086 [200510, 269210, 98300] 381573 [440, 440, 359550] 425166 [228280, 205950] 123781 [22200] 418017 [65800, 17470, 225080] 477957 [329110, 627690, 299910, 265890] 225720 [205100, 418460, 235540, 239030, 393980] 212111 [24010, 242820] 322890 [108600, 4760, 233290, 2270] 201078 [234140, 368070, 203750, 7860] 24373 [20510, 253230, 454770, 214770] 228761 [301520, 50300] 88987 [255710, 230230, 440, 232790, 224260] 207041 [440, 209000, 444090, 63380, 57300] 156979 [271590, 385070] 351289 [620, 339120, 302380] 137284 [233130, 92000] 306445 [410380, 537180, 346250] 516022 [314410, 255390, 206190, 206190] 228681 [206210, 57300, 221910, 236450] 17651 [317400, 225840, 630] 233163 [271590, 220, 209670, 35480] 69007 [110800, 220440, 20500, 35700, 8200] 325085 [248310, 17410] 526835 [24010, 477160, 2420, 231140, 230050] 242740 [252490, 206420, 401890, 233720, 563560] 283406 [224260] 464313 [7670, 249870, 224540] 60263 [248410, 249650, 234290, 521890, 243450] 244232 [240, 394380, 224540] 1067 [215530, 256290, 231020, 327410] 509458 [426560, 412730, 207080, 207040, 440740] 64065 [273350] 8082 [17710, 310370] 395501 [371330, 341980, 448070, 390570, 318230] 421444 [263500, 512900] 231951 [221100, 15620, 8190, 202750, 229580] 91726 [248610] 131617 [8930, 620, 32800, 102700] 341717 [304240, 15520] 115466 [440, 427520, 204880, 42960, 306130] 67076 [348380, 425150] 3176 [298110, 261640, 345200, 425670] 204077 [220240, 521200, 6060, 292730, 211500] 295390 [558420] 421298 [48700, 440, 291480, 113420] 385251 [231430, 4920, 261880, 7510] 219978 [98800, 256330, 55150, 248310] 295724 [504370, 282140, 47790] 168348 [4000, 344760, 204300, 220, 219640] 45184 [442080] 526879 [444090] 258129 [33930] 307807 [221100, 39120] 196903 [257350, 38430, 577940, 50, 91700] 128590 [49520, 204880, 208650, 220240, 315660] 98879 [282140, 388420, 80340] 22687 [252490, 200510, 232090, 107100, 2450] 493796 [22330, 349650] 198728 [304390, 234140, 225540] 408223 [220240, 240760] 406809 [358130, 287980, 396090, 314240, 334420] 331541 [99900, 365670, 40800] 375483 [3590, 271670] 345903 [241930, 620] 294006 [35700, 26800] 69990 [344760, 289650, 8190] 372595 [25010, 55110] 170588 [33650, 202530] 175932 [375870, 346470, 331570] 536084 [49520, 45740, 302830] 84713 [281990, 219640, 330830, 432250, 257750] 353831 [548720, 486660, 356570, 497100] 246281 [365670, 438030, 433950] 87424 [98800, 241600, 340520, 203630, 21680] 150653 [247730, 432020] 68682 [376870, 203210, 57200] 51002 [433850, 42700, 227940] 58926 [273350] 92332 [8980, 212050] 135174 [64000, 690560, 286690, 47400] 204160 [265550, 12110, 340] 128779 [301520, 283290] 450857 [433850, 449140] 27395 [227300, 285960, 417860] 353488 [268500, 252950, 211400] 81873 [242050, 412880, 245280] 222622 
[227300, 391540, 299250, 320] 237770 [294860, 410850, 270050] 303265 [113200, 447890] 116746 [377160, 298110, 238010] 510654 [17410, 104900] 243985 [440, 440, 233700, 324810] 245565 [221100, 404700, 677980, 338190, 241720] 409933 [440, 261570] 76746 [435150, 362620, 208520, 339580] 16784 [203140, 500, 219640, 39160] 281561 [377160, 316010, 4920, 232090, 265610] 317675 [48700, 47410, 57300, 374570] 58623 [745880, 417860] 49317 [477770] 48675 [384180] 64014 [312280, 266510] 352826 [200510, 50130, 237430, 219150, 227080] 76926 [9930, 213610, 245390, 201570] 159547 [107410, 39120, 6860, 220460, 219890] 42568 [440, 440, 482730, 275850, 250760] 363938 [440, 65980, 32900, 48000] 302993 [463150, 280, 356670] 406699 [251130, 412170, 410380] 62854 [262830, 258180, 278360] 86257 [214950, 438740, 201810, 429790, 246090] 243547 [296490, 341150, 492290, 433340] 407748 [695020, 751690, 674110, 2350] 45496 [220, 273350, 107410, 226620, 376210] 32817 [50130] 62668 [212680, 200510, 272270, 334420] 59028 [601640, 408650] 263426 [203140, 50, 13240] 161493 [289520, 228960, 221640] 181678 [250900, 440, 32470, 50300] 335041 [236430, 39140, 351970, 55230] 416189 [228280, 23400, 289520] 29069 [240, 507490] 339753 [541920, 446780, 246300, 275060, 226720] 1073 [16450] 50296 [291630] 339435 [374320, 367500, 200210, 238320] 147865 [265610] 158965 [403190] 156620 [393080, 485370, 219190] 342485 [341530, 208670, 71340] 127893 [268750, 328760, 209540, 219340] 293151 [391540, 365670] 535835 [221680, 443810, 209060, 211400] 161598 [319630, 469820] 238036 [319630, 338550] 435991 [227300, 214830, 269790] 317607 [275850, 268130, 246620, 377840] 460961 [4920, 42700, 317360] 378631 [440, 227940, 3730] 30250 [261030, 380600, 431960] 204967 [242920, 227940, 449680] 390246 [620, 25980, 105450, 363600, 409710] 106668 [346110, 4000, 364470, 302510, 303210] 360123 [251060, 242920, 355180] 262906 [324800, 238240, 250110, 104900, 96100] 468305 [40720] 46619 [465520] 351615 [221100, 252490, 287290, 224260, 319630] 302770 [377900] 335342 [596620, 318530, 260790, 262390, 239430] 301117 [394230, 491770, 294140] 246476 [41300, 259600, 96100] 404432 [365450] 11516 [12690, 367240, 217920] 62743 [304930, 323470, 55230, 224260, 65980] 292408 [346110, 241540] 10903 [205100, 207140, 431260, 22180] 226310 [440, 301520, 9420, 393380, 330820] 334113 [48700, 441550, 407530, 337950] 8247 [330830, 307090, 342350, 207080, 261490] 119170 [4000, 440, 289650, 291550, 105450] 280172 [8980, 469920, 104900] 10741 [496300, 274230] 392995 [374320, 233470] 374563 [419650, 420530] 330164 [234650, 201810, 340170, 2100] 186356 [39210, 98800, 238460, 107100, 421670] 76297 [219990, 306130] 134818 [240] 109530 [107100, 220780, 221910, 246580] 225705 [374320, 230410, 232090, 620, 413420] 57650 [418340] 266008 [301520, 8190, 241600, 225540, 293240] 40392 [221380, 227940, 346900, 361420, 285900] 164559 [261640, 225260, 365300] 425535 [203630, 63380, 234390, 299460, 299460] 217613 [322190, 456670, 293520, 205910] 139656 [15120, 219640] 224890 [730, 339800, 310370] 75206 [300580] 86351 [289130, 312530, 427730] 288766 [730, 431960] 271382 [208140, 355800, 292140, 301910] 18335 [219990, 451340, 200210, 273350, 98400] 137766 [8930, 40800, 238210] 270744 [233270, 233720] 404928 [241540, 50620] 274547 [40800, 209870] 215313 [252490, 242050, 247730] 474901 [620, 256290] 325386 [319630, 24740] 503949 [275850, 318220, 298240, 221910] 35699 [240, 240440] 537309 [344760, 211420, 8850] 132045 [239140, 218230, 242860, 543360, 235780] 358270 [48700] 105530 [339350, 588050] 483175 
[402570, 287390] 37114 [206480, 282760, 312990] 173564 [33900] 352033 [234140, 220780] 243352 [440, 220200, 206420, 92900, 568770] 512570 [243780, 245170, 331440, 311480, 415350] 224391 [252490, 252490, 4000, 383180, 281920] 57735 [13560, 21010] 525785 [55230, 252490, 22370, 7670] 315420 [241540, 234650] 42322 [227300, 431960, 282140, 291550, 274190] 2766 [300970, 260430, 310370] 527083 [40800, 267980, 219890, 219890, 312990] 65516 [252490, 43190] 443521 [274290] 136334 [377160, 235540, 9480, 207140] 366109 [346110, 298110, 274170, 321410, 310370] 165641 [426790, 297130] 114954 [35450, 238320, 409590] 533433 [371390, 344040, 406150] 375929 [582160, 335430] 368807 [386050, 384490] 24139 [346110, 359550] 383924 [239160, 35140, 233290] 457958 [110800] 47865 [620, 234650, 484900, 207420] 68428 [45300] 327376 [377160, 271590, 440, 252490, 221380] 31822 [8980, 261760] 23987 [261570, 355840, 464700, 416130] 304481 [20500, 380] 447079 [252490, 440, 433340, 328070, 207610] 66574 [413410, 329130, 332200, 545330] 440492 [391540, 10680, 237310, 459820, 340] 115000 [10680, 231200] 35134 [234650, 307780, 348020, 301910] 378719 [214490, 429050, 336670] 60937 [359550, 305620, 341860, 232430] 140246 [386360, 227940, 516510, 369030, 380840] 274226 [386360, 233980, 384490] 25016 [377160, 262060, 394510, 268910] 509235 [252490, 337000, 12120, 239030, 57300] 145272 [289130, 346900, 239030] 184992 [336130, 313780, 258950, 465520, 346250] 25909 [98800, 99400, 429570] 297215 [113200, 420290, 224580] 317515 [214490, 39500] 72489 [261640, 211440, 16450] 98401 [380, 232010] 195237 [252490, 226860, 489830, 627270, 235540] 486024 [435840, 107100, 204060] 128704 [427270, 362070, 226560] 317094 [230410, 2870] 252485 [531640] 179451 [70] 446446 [335300, 40100, 64000, 551730, 9200] 203228 [449710, 498240] 371731 [620, 403640, 35140, 204360, 349220] 287842 [244030, 251730] 85061 [200710, 342200, 214490, 220780] 237423 [440] 372887 [48700, 236870, 283640, 361420, 212110] 99788 [248820, 71250] 68488 [333930, 501930, 451020, 389310, 337480] 342452 [304050, 301640] 408314 [287980, 8850, 467360] 138347 [242860, 333950] 180995 [377160, 440, 440, 440, 260430] 463669 [377160, 15120] 30099 [4700, 110800, 249230] 237643 [113200, 355840, 268910, 449140] 248088 [346900, 204100, 473690, 330840] 538147 [353270, 358410] 430099 [304650, 425410, 499520, 365300] 171966 [242920, 282070, 363970] 11744 [231740, 26800] 8788 [265630] 319812 [49520, 212680, 242820] 226009 [274190] 60047 [377160, 239140, 418460, 331500] 213504 [248310, 379380, 303210] 375051 [108600, 40700] 255429 [440, 104700, 293500] 45036 [440, 282900, 13230, 2200, 3320] 309094 [306130, 239160] 372475 [301520, 320, 251170, 35720] 215227 [271590, 209000, 285310] 291789 [220, 245170, 332800] 138113 [661000, 603370] 377208 [582160, 268650, 282900, 25000] 373987 [460930, 368070] 117678 [232430] 236036 [48700, 219830, 286690] 219071 [57300, 230050] 76152 [200210, 365450] 536015 [403640, 351710, 593200] 143465 [234140, 35140, 55150, 275180, 107100] 34094 [512470, 263540] 22292 [415200, 506610] 339025 [282140] 137750 [268500, 8930, 227300, 269110, 248190] 305800 [271590, 334230, 289650, 39120, 220700] 15552 [730, 334420] 491947 [50, 6120] 348917 [221380, 217200, 295750, 377970, 361630] 534118 [228280, 263300, 311730, 345650] 225565 [48700, 271590, 440, 248820, 213670] 527735 [251150, 300970, 292230, 16130] 249508 [49520, 220240, 382850, 302080] 32977 [219640, 301520, 369990, 15100] 168451 [221680, 227940, 215280, 217200, 234710] 4292 [24400, 285580, 241410] 363952 [434620, 378610] 
331500 [440, 429570] 219418 [559650, 9480, 675010] 366606 [312530, 245410, 252410, 263360, 354240] 370778 [374320, 284160] 360135 [204100, 233130] 330360 [28000, 265890] 484148 [395520, 417860] 31417 [107410, 49520, 268500, 108710, 235980] 177061 [343100] 478293 [97000, 221640, 205910] 16629 [212200, 253110, 391460] 482316 [300570, 410820, 291390, 111100, 348270] 201303 [444090, 219640, 374570] 314998 [208580, 331920] 327155 [434380, 509220] 140571 [397900, 224540] 187216 [427820] 488240 [45760, 49600, 15950] 280375 [346110, 339610, 242920, 554620, 425410] 482859 [380580, 238320, 291010] 174045 [8980, 20920, 435300, 238320, 303390] 57062 [241600, 239160, 11590, 401190] 92580 [319630, 286690, 405820] 81414 [221100, 433850, 255340, 268910, 281920] 358600 [200510, 219890, 225080] 315199 [20500, 207230] 260807 [359310, 491950] 140869 [107410, 252490, 291550, 227940, 224260] 272976 [316010, 312530, 597170] 9332 [557790, 359600, 387290] 418122 [289070, 203140] 449158 [301520, 236870, 514490, 476920] 98875 [57900, 232770] 84565 [361420, 237550, 443580, 245550] 224756 [220, 94590] 66537 [337000, 208650, 343780] 116696 [49520] 92968 [370360, 493370, 8400] 265312 [540900, 306410, 365070] 176758 [231430, 221380, 365590, 450590] 172631 [377160, 22320, 341310] 403121 [49520, 220, 391730, 33460, 367580] 67814 [55150, 221910] 312052 [55230, 227780, 291480, 225840, 65790] 110604 [282900, 385730, 236090, 436500, 262390] 325685 [17710] 43198 [221100, 301520, 295850, 400, 94000] 347851 [440] 663 [440, 363970, 431960, 209650] 246473 [202170, 73210, 246580, 33900, 2300] 519838 [236870, 274190, 10680, 308420, 9050] 401070 [370910, 252710, 50300, 303210] 304361 [433340, 330820, 347670, 447530] 199478 [236110, 248310] 79604 [277590, 402020, 298240] 358409 [353130, 223810, 455400] 311174 [221100, 339800, 213670] 201338 [282140, 291370, 391720] 383809 [203140, 8190, 34830] 99947 [7940, 238210] 97355 [241930, 292260] 536374 [436280, 385950, 492340, 313140, 224760] 492951 [229810, 271900, 367000, 402330, 338170] 25541 [393530, 397040] 287664 [292140, 444000, 253860, 302790, 332800] 402179 [359870, 379720] 494201 [204360, 22100, 414500, 433210, 416080] 101848 [273350, 104020] 221655 [330840, 221640] 311998 [440, 318230, 467370] 267678 [553340, 513560] 26084 [394360, 435150, 230230, 21090, 290790] 18268 [208090, 355980, 258520, 550650] 218633 [360430, 57900, 203290] 28575 [208650, 236930, 245280] 362454 [374320, 212500, 257750] 242903 [274190, 449530, 422970, 48000] 269347 [335300, 236430, 394510, 213670, 237550] 186348 [286690] 195580 [440, 440, 294100, 271590, 620] 376165 [10500, 48000] 502296 [391540, 334190, 71250] 478138 [351970, 234490, 221910] 137265 [399270, 233840, 217200] 288269 [4000, 359550, 107410, 49520, 49520] 369329 [113200, 448510, 247950] 262514 [367520, 17570, 327690, 317470] 126517 [282070, 361700, 317400] 129936 [346110, 212680, 272510, 91600] 17622 [282800] 156899 [367710] 71296 [377160, 224540] 81183 [203160] 71253 [282800, 386360, 613450, 418070] 227002 [319630, 251470, 394310] 358327 [238430, 50300] 95479 [107100, 113420, 382490] 199654 [298110, 206190, 7760, 221640] 384864 [24010, 249050, 297470] 209971 [392000, 341720] 326083 [4000, 433850, 35450, 330830] 126804 [250900, 372360, 335000, 365300] 292879 [24980, 22330, 55150, 19900, 307780] 144278 [361280, 35700, 240440] 388588 [24980, 282900, 228960, 443570, 332620] 274282 [298110, 391540, 311310] 474083 [305620, 203140, 10180] 105492 [15620, 295790] 107670 [440, 240] 516200 [211820, 282070] 116133 [239350, 38600, 18300] 374437 [203140, 108710] 
74595 [65540] 331826 [261180] 258009 [202170, 92800, 17410] 86291 [337000, 582660, 24240] 40102 [434570] 252235 [397340] 259323 [201790] 233702 [45760, 620] 67458 [440] 529723 [447040, 452060] 70816 [48700, 35450, 34820] 52457 [221100, 612880, 221260] 54349 [335670, 300540] 215713 [107410, 221910, 221910] 202986 [319630, 12200] 187662 [374320, 434460] 176206 [235380] 356584 [227300, 115120] 329980 [308460] 474193 [440, 416260, 96100] 240173 [294100, 231430, 393420, 34410] 79203 [521890, 346370] 133739 [346110, 441870] 77255 [65980] 322962 [274170, 219640, 379980, 340] 131597 [318220, 322500] 456970 [212680] 384017 [447040] 277261 [301860] 52730 [220200, 218230] 44136 [102700] 270870 [344040] 265833 [367580, 417860] 61567 [511750, 249330, 250260, 257710] 204785 [201810, 40420] 508925 [55230, 6880, 26800] 52110 [603360, 418180, 602890] 286653 [219150, 429790, 238320] 485737 [620, 202970] 115330 [200510, 319630, 2600, 343370, 417860] 77803 [236690, 274190] 322035 [488790] 40443 [305620, 7670, 71340] 286896 [239160] 472439 [241260, 205650, 261030] 110298 [501990] 378806 [409910, 41000] 288856 [239140] 448714 [499520, 429570] 362917 [20920] 142427 [282900, 283640, 21690, 251270] 163817 [574760] 407539 [262060] 376445 [252490, 346110, 221910, 398710] 411505 [227940] 445377 [47890, 288470] 131897 [221100, 370090] 498646 [250900, 234330, 501860] 213477 [253710] 379793 [40800, 12120, 2290] 294548 [271590, 201810] 147539 [390290] 125605 [252490, 346110] 16876 [252490, 374320, 205100] 257235 [322540] 305452 [459820, 444260] 179591 [48700, 6860] 446596 [582270] 260192 [419220, 388880, 317510] 512453 [201790, 108710] 405511 [201790, 207610] 365561 [346110, 332500] 206398 [652600, 465930, 19200, 343270] 336140 [391540] 232448 [252490, 620] 54217 [220440, 327410] 404528 [440, 304050] 126399 [304650, 536890] 279590 [252490] 364961 [254480] 38900 [274170] 130411 [226100] 314761 [250900, 294860, 290080] 109462 [311340, 249630] 105750 [427980] 103159 [261530] 274263 [113200, 233130, 267490] 51941 [49520] 166813 [48190, 220240, 243470] 301380 [300260, 394380] 305631 [33930, 636480, 209160] 282048 [352460] 455375 [253650] 163584 [41700, 612880] 327270 [433340, 433340] 317090 [239250, 104900] 241299 [365210] 444099 [35450, 563400] 124145 [12140] 36486 [32800, 9030] 327983 [553260, 233290] 219608 [35450, 45740] 346514 [323370, 299360] 112824 [252490] 204333 [211820, 239140] 84663 [19900, 481510, 6060, 278460] 332182 [250500] 372395 [31290] 29223 [374320, 346110, 391540, 70] 285476 [12120] 75677 [350010] 457799 [294860, 12150] 313138 [385070, 300380] 374386 [416240, 311800, 485580, 411560] 238668 [440, 335660] 336427 [248610] 465442 [400910] 293953 [220240, 266310] 203 [434420, 220780] 526870 [594960, 466830] 327216 [304030, 22370, 3483] 223664 [356280, 274500] 73414 [213670, 249650, 108700] 90565 [444090] 408099 [202970] 84577 [346420, 236150] 303835 [12110] 366450 [453480, 620] 424125 [620, 287390] 270139 [204300, 388880, 352520] 427894 [234140] 359752 [230230] 267621 [214950, 274170, 32370, 17500, 260430] 403169 [415480, 50300] 47147 [12140] 270472 [434620] 445482 [10090, 225540] 492060 [325120, 243200] 267844 [240, 393380, 7670, 218680] 303531 [230230, 346940, 299740, 10180] 464875 [239030] 86694 [415880, 98600] 50585 [236430, 227940, 34830, 317470] 409243 [19830] 351530 [271590, 225080] 100366 [550650, 346250] 248140 [346900, 282640] 50154 [698720] 99973 [310380, 27000] 389406 [273350] 230212 [39120, 104900] 449777 [453390, 415840, 318310, 361850, 286360] 166933 [40800] 251439 [337820] 124274 [386360, 21090] 
203466 [220200, 341000, 58570, 16020] 27137 [629000] 414065 [48000, 252350] 95943 [367520, 400800] 46218 [219640, 203350] 320036 [218230, 42960] 297074 [290300, 280220] 210436 [15620, 233130] 424789 [107410, 232090, 261760, 384630, 70000] 32287 [363970, 391460] 222850 [512180] 394784 [273110] 176 [8980] 145782 [230410, 213610] 53016 [2400, 266510] 235330 [65980] 88700 [200510, 391540] 128774 [41500, 268050] 125323 [372210] 226718 [12110, 390520] 249290 [377160, 455980] 78973 [107410, 420520] 440845 [23310] 17639 [256290, 437060] 522756 [289650] 108323 [328880, 362030, 403510] 238948 [460120, 239030, 206190] 159219 [301640] 375704 [242050, 311560, 580200] 273898 [113200, 268910] 218345 [12200, 260410] 357913 [340050] 282093 [406150] 310444 [242860, 313340] 195355 [252490, 107410, 351970] 63934 [263280, 225080] 463958 [504380] 274534 [370240, 355840] 153261 [246070, 204450] 69163 [248820] 187147 [65980, 219640] 269830 [217140] 249948 [268500, 233450] 445241 [307190, 305480] 192113 [712730] 341214 [234140, 287290] 427477 [374570] 311166 [227940] 151623 [233450, 235320] 516535 [434520] 86879 [233470] 201251 [390030, 448370] 315459 [220240, 644930, 200210] 196256 [211340] 145107 [209160, 60] 534813 [219990, 40300, 241930] 503581 [200710, 91600] 437650 [202170, 42170] 186434 [340470] 418093 [108200, 201790] 486163 [262260] 320214 [418250, 234160, 282210] 236242 [242860, 99900] 376141 [239140, 203160, 20500, 22000] 55334 [620, 386180] 91423 [55110] 445474 [495890] 142816 [339610, 214250, 225080] 530786 [311210, 334230, 379720] 173552 [359550] 123277 [200510, 22350, 48000] 230679 [50300] 86633 [290300] 67889 [453270] 25420 [206420, 41700, 32800, 206190, 31280] 278776 [403640] 9499 [327690, 356130, 461560] 349075 [22320] 470875 [554620] 410965 [237930, 207650] 157308 [496500] 485403 [234940] 210417 [382900] 176616 [273350] 66020 [294140] 287776 [262060, 374040] 182988 [214770] 138665 [304930, 272890] 45107 [45760, 300] 152122 [301520] 272224 [4720, 234390, 70000, 253330] 522528 [299360, 429300] 121100 [278360] 11959 [42910] 456440 [22100] 128677 [227300, 17410] 60583 [536890, 326730, 242880, 314020] 105587 [233130, 248310] 63072 [227940, 219150] 275352 [289650, 242720] 245164 [440] 435397 [207150, 277490] 403380 [240760] 484171 [214490] 282081 [206480, 433320] 83999 [400170, 461430] 278513 [236430, 389730] 520579 [202170] 430693 [249330] 3977 [4000, 433850, 104200] 13687 [219910, 313020] 313908 [233450, 252910] 203029 [306130, 240] 522150 [307780] 457068 [214950] 155826 [8190, 489260] 6487 [319630, 219150] 275025 [208140] 471703 [221100, 246620, 312210, 307640, 302610] 353053 [253690] 480092 [409450] 519964 [238430, 360870] 191124 [258180] 143990 [420290] 369902 [225080] 286428 [201810, 208200] 295073 [272470] 311629 [237870] 434086 [4000, 359550] 375959 [220200] 226589 [301520, 418250, 262960] 453005 [527280] 537964 [500] 314161 [209000, 205100, 24840] 356421 [558100, 431240] 51855 [282070, 6060] 320746 [298110, 449140] 48400 [351800] 208561 [262940, 63380, 12640] 59935 [32430] 118670 [448510, 230050] 331958 [444640, 353360] 421010 [98400] 103633 [18070] 214062 [377160, 359870] 25337 [107410, 275850] 213274 [240, 268420] 246984 [431540, 392450] 150402 [435150, 7670, 219890] 206660 [234140, 330840] 360902 [220200, 252490, 266310] 332733 [239140, 219890] 376204 [440, 220240, 242680, 102700] 475267 [211440] 354964 [400910] 151446 [32400] 398399 [340340, 22340] 155934 [12110] 523131 [307090] 28183 [4760] 59110 [386360, 221100] 238277 [331340] 228443 [550650, 317470] 416108 [433850, 380600, 361630] 
357638 [239120, 6060, 2840] 215342 [35700] 56773 [676820, 224460] 44917 [440, 220200, 243160] 47866 [261030, 250620] 431793 [294860, 307690] 518146 [287390] 280530 [238010] 398059 [208200] 35744 [239030] 162605 [221100] 291717 [55150] 453421 [23380] 429882 [730, 201810] 330976 [200170] 122451 [214510, 294390, 29180] 71746 [440900, 504370, 22350] 198866 [274170] 393320 [233150] 330592 [12360] 230157 [374900, 42670, 305380] 136236 [244710] 94683 [359550, 287450, 104900] 150640 [233450] 139733 [214830] 203062 [65800] 47366 [252490, 249050, 495910] 113374 [319630, 242550] 302324 [20920, 241600] 196382 [253290] 89525 [50] 49578 [221100, 242860, 2400] 37701 [41500, 304650] 334385 [2900, 485580] 226421 [202170, 227080, 31280] 203537 [554620] 257826 [203140, 635260, 228280] 485351 [7860, 422970] 140148 [433850, 17080] 57204 [313140] 450375 [273110] 419052 [364270] 386952 [2300, 18700] 262612 [239350, 291410] 413026 [227300, 92800] 426120 [2620, 50300] 213176 [41500, 225540, 319630, 359580] 121460 [444090, 35140, 32400] 162339 [19900] 63331 [63710] 61391 [427250, 441870] 78917 [274170, 12200] 14546 [245280] 279219 [435150] 295570 [409450] 302271 [48000] 339779 [239140] 338949 [400250, 291010] 62478 [339610, 16450, 738060] 253180 [233290, 207170] 181527 [345390] 341735 [104000, 501300] 30085 [48700] 516662 [466840, 352460] 488993 [265300, 238320] 57718 [440, 466560, 417860] 478521 [322330] 523817 [214770] 254649 [227300, 32440] 156701 [220200, 212680] 14987 [444770] 356614 [42700] 312218 [240] 15198 [32740] 394573 [301520, 407530] 407803 [33100] 257104 [204240, 233290] 268828 [268910] 347751 [220] 19038 [70] 246463 [221910] 238153 [440, 382490] 219702 [219830] 465545 [98200] 208753 [377160, 3900, 348670] 144965 [402130] 195768 [209000, 339800, 220] 501961 [238090] 385728 [281990, 234670, 272270] 339760 [49520, 374320, 8980, 42960] 462956 [50130, 3590, 296770] 29078 [496300, 373480] 154411 [312530] 470582 [282800, 466170, 227100] 181314 [337000, 208650, 6920] 281495 [333420, 349220] 405171 [49520] 200183 [271240, 338980] 108252 [322290, 283820] 75600 [221040, 98400] 437664 [255710] 482629 [210970] 353634 [4000, 306130] 305956 [394360] 100505 [239160, 370600] 422979 [332800] 53863 [3480] 281936 [204080] 431412 [313120] 501580 [228280, 23450, 49300] 86813 [7510] 337183 [102700, 501320] 169245 [355180] 166890 [405640] 437068 [466240] 331207 [584400] 335075 [232430] 189708 [21130] 133749 [70420] 479980 [219640, 253530] 231263 [259390] 34031 [730] 23318 [238320, 224260] 174199 [212680] 489585 [31170] 243636 [282800] 11396 [360430, 61730, 221910] 151147 [304930, 206210, 6830] 349455 [252730] 342686 [247730, 249680] 339084 [306130] 398896 [45100] 458921 [24980] 526749 [8850, 352010] 455434 [424280] 67863 [200510, 254440] 31564 [49520, 24960, 200210, 220090] 64911 [49520, 238090] 48141 [620, 345390, 384190] 52207 [26800] 310846 [8190] 304837 [616090] 426651 [98100] 185981 [293780, 200010] 169901 [383120] 393167 [363970] 151077 [204100] 57670 [281990] 261506 [7670] 512492 [213670, 49600, 233270, 314450] 212552 [204300, 248820, 57300, 57300] 45851 [393380] 394918 [377160] 247885 [237870] 530167 [440, 2630] 245924 [643620] 53468 [324800] 44926 [12100] 356378 [39210, 251060] 248427 [4000] 209635 [102700] 226778 [350810, 270570, 628750] 285422 [49520, 389900] 196661 [384180, 201510] 75360 [227480] 352492 [440, 377160] 298243 [221100] 55224 [329440] 519732 [220, 17300] 144714 [203140, 22230] 57403 [460930, 214360] 110849 [268500, 107100] 337950 [264340, 341890, 359810] 294176 [259660] 417447 [620] 290193 [201810, 
366250] 178197 [377160, 313020] 452606 [280140] 51187 [224260, 333950, 356670] 44684 [301640] 80938 [211820, 4000] 20364 [385330] 504426 [242920] 477714 [267750] 370934 [287290] 374075 [314980] 521912 [211820] 504517 [388680] 476073 [214510] 199496 [48190, 211500] 79307 [98400] 515649 [229810, 492270, 330820] 259432 [345860] 288224 [271590] 348191 [2600] 422216 [420530] 475991 [208650, 597220, 232010] 473095 [285900] 75367 [200710, 261640, 418190, 370280, 226740] 312726 [252490, 394510, 620] 186009 [297130] 281363 [102500] 202658 [377160, 3590] 55844 [463350, 374570] 43878 [282900] 158318 [252490, 249050, 285920] 185713 [206420, 312210] 237009 [251130, 368610] 521844 [353360] 387422 [221100, 524220] 211541 [6060] 2842 [440, 371310] 303470 [256290] 200263 [252610] 356384 [7940] 75036 [205100, 17450, 370360] 406025 [304730, 249050] 454822 [224540] 64866 [252490, 304930, 393380] 138417 [322910] 98745 [440, 394510, 250340] 137055 [17430] 411276 [8140] 455755 [20920] 258674 [383980, 207610] 29348 [455400] 509490 [440, 225540] 303735 [200210] 111704 [310070] 3692 [32370, 349700] 22279 [230700] 118634 [252490] 524833 [377160, 225020, 239660] 173257 [201810] 213633 [234140] 24386 [360430, 430440] 285851 [377160, 40800] 386784 [304390, 278360] 157453 [454910] 281672 [212500] 218015 [420290] 306079 [57740, 10180] 388912 [209160] 297266 [502150, 200940] 366769 [252550] 388718 [257830, 319630, 346560] 232284 [233450, 316370] 60429 [265610] 524072 [238320] 329207 [400240] 218753 [20920] 136981 [449680] 89713 [4760, 102840] 507489 [343390] 96375 [70400] 429628 [64000, 236370] 346320 [204100, 341940] 517413 [239140] 71901 [218230, 6060] 62797 [368370] 362012 [221680, 48000] 205654 [232090, 49330] 276983 [257510, 266010, 209370] 365437 [261570] 68346 [377160, 300380] 34148 [200510, 251060] 337463 [271590, 208090] 395185 [249990] 77210 [238320] 174557 [584400] 25103 [440, 221100, 314160] 451658 [252490] 183140 [473690, 57300] 142607 [314710] 129127 [334230] 53255 [620] 289481 [274190] 366997 [374320, 211820] 287403 [219990, 221100] 361441 [9450] 489402 [236090] 134814 [569480] 221458 [319630, 48000] 79204 [8930, 6060] 280325 [440, 65980, 359050] 7228 [252530] 84174 [231430, 70000] 115584 [49520, 304930] 5355 [356050] 26715 [365360, 237430] 352162 [24780] 224010 [535760] 351613 [107100, 17430] 100272 [215080] 417136 [563150] 420401 [243120] 212105 [252630] 229117 [8980] 493393 [612880, 370360] 56660 [391540] 204558 [524220] 322077 [39140, 17430] 366002 [3540] 63648 [211820, 487120] 65851 [211820, 352460] 95463 [250760] 22734 [444220] 223232 [27940, 42120] 29860 [271590] 371234 [480650] 116055 [245170] 336932 [284710] 295658 [244160] 153028 [242920] 292704 [268910] 129183 [305260] 182602 [15300] 147679 [292140] 109433 [213670] 255837 [339800] 10300 [202170] 175567 [286120] 323554 [213670] 212284 [50130] 181202 [287340] 248118 [265120] 373842 [2420] 184097 [415200, 9350] 174077 [364420] 215419 [337000] 51610 [220240, 42910] 168790 [393380] 524409 [535760] 125904 [262960] 93627 [394510] 143386 [94200] 227723 [226100, 356040] 3974 [216150, 509560] 344768 [221100] 158732 [232910] 18896 [61730, 46400] 388562 [427520] 294691 [220200] 226065 [49520, 363680] 254250 [233740] 230532 [238090, 49600] 305191 [363680] 305913 [391540, 234650] 339986 [427730] 465143 [70, 35700] 114096 [202970] 72148 [447040] 64428 [245280, 220780] 376106 [266010] 222665 [485980] 205511 [254060, 474820] 67733 [200510] 319675 [212680] 24543 [238090] 152009 [300550] 502161 [246580, 233720] 106718 [307670] 477838 [293180] 236597 [261180] 306241 
###Markdown * *And the recommendations for one particular user, with their respective ratings.*
###Code print(top_n[538371]) ###Output [(453480, 2.700361270257303), (365590, 2.1755992772558863), (550900, 1.9665633655362522), (40390, 1.798607238208222), (50620, 1.7855464771223508)]
###Markdown 5. Finally, we look at which games a given user liked and **which ones the system recommends**.
###Code usuario = 398896 rating = 4 # We ask for the games with a rating of 4 or higher df_user = df_svd[(df_svd['username'] == usuario) & (df_svd['rating'] >= rating)] df_user = df_user.reset_index(drop=True) df_user['title'] = df_title['title'].loc[df_user.product_id].values df_user ###Output _____no_output_____
###Markdown * *The game the user liked most was Stranded Deep, followed by Saints Row: The Third and Windward.* * *Now we create the DataFrame where we will store the recommendations for this particular user.*
###Code recomendaciones_usuario = df_title.iloc[:398896].copy() print(recomendaciones_usuario.shape) recomendaciones_usuario.head() ###Output (30083, 1)
###Markdown * *Next, we remove from the DataFrame all the games we already know the user has played.*
###Code usuario_vistas = df_svd[df_svd['username'] == usuario] print(usuario_vistas.shape) usuario_vistas ###Output (6, 3)
###Markdown * *Let's see the recommendations we can already make for this user!*
###Code recomendaciones_usuario.drop(usuario_vistas.product_id, inplace = True) recomendaciones_usuario = recomendaciones_usuario.reset_index() recomendaciones_usuario.head() ###Output _____no_output_____
###Markdown * *__And we make the recommendations!!!__ with the specific game Id, sorted from highest to lowest. Below, each recommendation with its estimated score.*
###Code recomendaciones_usuario['Estimate_Score'] = recomendaciones_usuario['product_id'].apply(lambda x: algo.predict(usuario, x).est) recomendaciones_usuario = recomendaciones_usuario.sort_values('Estimate_Score', ascending=False) print(recomendaciones_usuario.head(10)) # Recommendations with their estimated ratings below ###Output product_id title Estimate_Score 29819 440 Team Fortress 2 4.865367 20948 394360 Hearts of Iron IV 4.825660 27104 236430 DARK SOULS™ II 4.806146 22236 268500 XCOM® 2 4.772525 27993 214950 Total War™: ROME II - Emperor Edition 4.765711 377 48700 Mount &amp; Blade: Warband 4.737455 48 4000 Garry's Mod 4.709083 2789 250900 The Binding of Isaac: Rebirth 4.662892 27457 39210 FINAL FANTASY XIV Online 4.603413 27663 252490 Rust 4.590698
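###Markdown * *(Where `Estimate_Score` comes from: assuming `algo` is the Surprise `SVD` model fitted earlier in the notebook — which its `predict(uid, iid).est` interface suggests — each estimate is $\hat{r}_{ui} = \mu + b_u + b_i + q_i^\top p_u$: the global mean plus the user and item biases plus the dot product of the latent factors.)*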
W0D6_2P_Analysis.ipynb
###Markdown ###Code import numpy as np ###Output _____no_output_____
M1890-Hidrologia/Precipitacion/docs/.ipynb_checkpoints/Ej4_Precipitacion-checkpoint.ipynb
###Markdown Precipitation exercises Exercise 4 - Hypsometric method Given the hypsometric curve of a basin (the area-elevation relationship) and the records of several rain gauges in that basin (table *Exercise_004* of the file *RainfallData.xlsx*), compute the mean annual precipitation over the basin using the hypsometric method.

| **Elevation range (m)** | **Fraction of the basin area** |
|-------------------------|--------------------------------|
| 311-400 | 0.028 |
| 400-600 | 0.159 |
| 600-800 | 0.341 |
| 800-1000 | 0.271 |
| 1000-1200 | 0.151 |
| 1200-1400 | 0.042 |
| 1400-1600 | 0.008 |

###Code import numpy as np import pandas as pd from matplotlib import pyplot as plt %matplotlib inline plt.style.use('dark_background') #plt.style.use('seaborn-whitegrid') #from scipy.stats import genextreme #from scipy.optimize import curve_fit ###Output _____no_output_____
###Markdown __Hypsometric curve__ The hypsometric curve gives the percentage of the basin area that lies below a given elevation. In this exercise we use the hypsometric curve to assign the fraction of the basin (as a proportion of one) corresponding to each elevation band.
###Code # Elevation ranges Zs = np.array([311, 400, 600, 800, 1000, 1200, 1400, 1600]) Zs = np.mean([Zs[:-1], Zs[1:]], axis=0) # Associated area As = np.array([0.028, 0.159, 0.341, 0.271, 0.151, 0.042, 0.008]) # build the data frame hipso = pd.DataFrame(data=[Zs, As]).transpose() hipso.columns = ['Z', 'A'] hipso['Aac'] = hipso.A.cumsum() hipso # Plot of the hypsometric curve plt.plot(hipso.Z, hipso.Aac * 100) plt.title('Curva hipsométrica', fontsize=16, weight='bold') plt.xlabel('altitud (msnm)', fontsize=13) plt.xlim(Zs[0], Zs[-1]) plt.ylabel('área (%)', fontsize=13) plt.ylim((0, 100)); # save the figure plt.savefig('../output/Ej4_curva hipsométrica.png', dpi=300) ###Output _____no_output_____
###Markdown __Precipitation-elevation regression__ We use the annual precipitation data at the basin's gauges to fit a linear regression of precipitation against elevation.
###Code # Import precipitation data data4 = pd.read_excel('../data/RainfallData.xlsx', sheet_name='Exercise_004', index_col='Gage') # Simplify the variable names data4.columns = ['Z', 'P'] data4 ###Output _____no_output_____
###Markdown We fit the linear regression of mean annual precipitation against elevation: $$P = a \cdot Z + b$$ where $P$ is the mean annual precipitation (mm) at a point of elevation $Z$ (m a.s.l.).
###Code # fit the regression line (a, b) = np.polyfit(data4.Z, data4.P, deg=1) print('P = {0:.3f} Z + {1:.3f}'.format(a,b)) # Plot of elevation vs annual precipitation plt.scatter(data4.Z, data4.P) # regression line xlim = np.array([0, Zs[-1]]) plt.plot(xlim, a * xlim + b, 'r--') # layout plt.title('', fontsize=16, weight='bold') plt.xlabel('altitud (msnm)', fontsize=13) plt.xlim(xlim) plt.ylabel('Panual (mm)', fontsize=13) plt.ylim(0, 2200); # save the figure plt.savefig('../output/Ej4_regresión lienal Z-Panual.png', dpi=300) ###Output _____no_output_____
###Markdown __Areal precipitation__ With the regression known, we compute the mean annual precipitation at the midpoint of each of the elevation ranges that define the hypsometric curve.
###Code hipso['P'] = a * hipso.Z + b hipso ###Output _____no_output_____
###Markdown The areal precipitation is the sum, over the elevation bands, of the product of area fraction and precipitation.
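In symbols, with $A_i$ the area fraction and $P_i$ the regression-interpolated precipitation of band $i$: $$P_{areal} = \sum_{i} A_i \, P_i$$ This is exactly what `np.sum(hipso.A * hipso.P)` computes in the next cell.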
###Code Pareal = np.sum(hipso.A * hipso.P) print('La precipitación media anual sobre la cuenca es {0:.1f} mm'.format(Pareal)) ###Output La precipitación media anual sobre la cuenca es 1613.0 mm
###Markdown Doing the same in a more compact way:
###Code p = np.polyfit(data4.Z, data4.P, deg=1) # fit the regression Ps = np.polyval(p, Zs) # interpolate precipitation Pareal = np.sum(Ps * As) # areal precipitation print('La precipitación media anual sobre la cuenca es {0:.1f} mm'.format(Pareal)) ###Output La precipitación media anual sobre la cuenca es 1613.0 mm
###Markdown Had we computed the areal precipitation with the **station-average method**, we would have underestimated it: precipitation increases with elevation, and the plain average of the gauges ignores how the basin area is distributed across elevations.
###Code Pareal2 = data4.P.mean() print('La precipitación media anual sobre la cuenca {0:.1f} mm'.format(Pareal2)) ###Output La precipitación media anual sobre la cuenca 1550.0 mm
practice_notes_copies/Mission_to_Mars_ChallengeXXX.ipynb
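###Markdown This copy of the notebook starts at the image-scraping section, so the `browser` and `soup` names used below are never set up here. A minimal setup sketch, assuming Splinter with Chrome and `webdriver_manager` (this cell is not from the original notebook):
###Code # Hypothetical setup cell for the scraping code below from splinter import Browser from bs4 import BeautifulSoup as soup from webdriver_manager.chrome import ChromeDriverManager # Let webdriver_manager fetch a matching chromedriver executable_path = {'executable_path': ChromeDriverManager().install()} browser = Browser('chrome', **executable_path, headless=False) ###Output _____no_output_____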
###Markdown Featured Images ###Code # Visit URL url = 'https://spaceimages-mars.com' browser.visit(url) # Find and click the full image button full_image_elem = browser.find_by_tag('button')[1] full_image_elem.click() # Parse the resulting html with soup html = browser.html img_soup = soup(html, 'html.parser') # Find the relative image url img_url_rel = img_soup.find('img', class_='fancybox-image').get('src') img_url_rel # Use the base URL to create an absolute URL img_url = f'https://spaceimages-mars.com/{img_url_rel}' img_url import pandas as pd df = pd.read_html('https://galaxyfacts-mars.com')[0] df.columns=['description', 'Mars', 'Earth'] df.set_index('description', inplace=True) df df.to_html() browser.quit() ###Output _____no_output_____
SMS_spam_Detection.ipynb
###Markdown ###Code import tensorflow as tf import tensorflow_hub as hub !pip install tensorflow_text import tensorflow_text as text ###Output Requirement already satisfied: tensorflow_text in /usr/local/lib/python3.7/dist-packages (2.8.1) ...
###Markdown Dataset Available at: https://www.kaggle.com/uciml/sms-spam-collection-dataset
###Code import pandas as pd df = pd.read_csv('/content/drive/MyDrive/TKM/references for review analysis/datasets/spam2.csv',encoding='cp1252') df.head() df.groupby('Category').describe() ###Output _____no_output_____
###Markdown ###Code df['Category'].value_counts() import matplotlib.pyplot as plt %matplotlib inline sms = df['Category'].value_counts() sms.plot(kind="pie", labels=["ham", "spam"], autopct="%1.0f%%") plt.title("Message Distribution") plt.ylabel("") plt.show() df_spam = df[df['Category']=='spam'] df_spam.shape df_ham = df[df['Category']=='ham'] df_ham.shape df_ham_downsampled = df_ham.sample(df_spam.shape[0]) df_ham_downsampled.shape df_balanced = pd.concat([df_ham_downsampled, df_spam]) df_balanced.shape df_balanced['Category'].value_counts() df_balanced['spam']=df_balanced['Category'].apply(lambda x: 1 if x=='spam' else 0) df_balanced.sample(5) from sklearn.model_selection import train_test_split X_train, X_test, y_train, y_test = train_test_split(df_balanced['Message'],df_balanced['spam'], stratify=df_balanced['spam']) X_train.head(4) import numpy as np # linear algebra from sklearn.feature_extraction.text import CountVectorizer,TfidfVectorizer tv=TfidfVectorizer(ngram_range=(2,2)) print(tv) #cv = CountVectorizer(ngram_range=(2,2)) #print(cv) """ As we increase the ngram count in TF-IDF, the accuracy of the LR classifier falls sharply while SVM shows a comparatively small decrease; with CountVectorizer the accuracy also drops. TFIDF: ngram(1,1) lr=87%, svm=88%; ngram(2,2) lr=86.75%, svm=86.5%; ngram(1,10) lr=82%, svm=86%; ngram(2,10) lr=81.5%, svm=76.25%. CV: ngram(1,1) lr=88%, svm=83%; ngram(2,2) lr=83%, svm=81%; ngram(1,10) lr=83.5%, svm=74.5%; ngram(2,10) lr=81%, svm=70%. """ x_train=tv.fit_transform(X_train) #print(x_train) from sklearn.linear_model import LogisticRegression lr=LogisticRegression(max_iter=100000) lr.fit(x_train,y_train) pred_1=lr.predict(tv.transform(X_test)) #print(pred_1) from sklearn.metrics import confusion_matrix conf=confusion_matrix(y_test, pred_1) import pandas as pd import seaborn as sn import matplotlib.pyplot as plt data = {'y_Actual':y_test,'y_Predicted':pred_1} dataset = pd.DataFrame(data, columns=['y_Actual','y_Predicted']) conf_matrix = pd.crosstab(dataset['y_Actual'], dataset['y_Predicted'], rownames=['Actual'], colnames=['Predicted'], margins=True) print(conf_matrix) data = {'y_Actual':y_test,'y_Predicted':pred_1} dataset = pd.DataFrame(data, columns=['y_Actual','y_Predicted']) conf_matrix_table = pd.crosstab(dataset['y_Actual'], dataset['y_Predicted'], rownames=['Actual'], colnames=['Predicted']) sn.heatmap(conf_matrix_table, annot=True,fmt=".1f") plt.show() from sklearn.metrics import accuracy_score score_1=accuracy_score(y_test,pred_1) print("\nUsing Logistic Regression, Accuracy= ",score_1) from sklearn.svm import SVC svm=SVC() svm.fit(x_train,y_train) pred_2=svm.predict(tv.transform(X_test)) score_2=accuracy_score(y_test,pred_2)
data = {'y_Actual':y_test,'y_Predicted':pred_2} dataset = pd.DataFrame(data, columns=['y_Actual','y_Predicted']) conf_matrix_table = pd.crosstab(dataset['y_Actual'], dataset['y_Predicted'], rownames=['Actual'], colnames=['Predicted']) sn.heatmap(conf_matrix_table, annot=True,fmt=".1f") plt.show() print("\nUsing SVM, Accuracy= ",score_2) !pip install transformers bert_preprocess = hub.KerasLayer("https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3") bert_encoder = hub.KerasLayer("https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4") # Return BERT's pooled sentence embedding for a batch of raw sentences def get_sentence_embedding(sentences): preprocessed_text = bert_preprocess(sentences) return bert_encoder(preprocessed_text)['pooled_output'] # Bert layers text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name='text') preprocessed_text = bert_preprocess(text_input) outputs = bert_encoder(preprocessed_text) # Neural network layers l = tf.keras.layers.Dropout(0.1, name="dropout")(outputs['pooled_output']) l = tf.keras.layers.Dense(1, activation='sigmoid', name="output")(l) # Use inputs and outputs to construct a final model model = tf.keras.Model(inputs=[text_input], outputs = [l]) #model = ClassificationModel('bert', 'bert-base-multilingual-cased', num_labels=5, args={'preprocess_input_data': True, 'overwrite_output_dir': True}) # You can set class weights by using the optional weight argument model.summary() METRICS = [ tf.keras.metrics.BinaryAccuracy(name='accuracy'), tf.keras.metrics.Precision(name='precision'), tf.keras.metrics.Recall(name='recall') ] model.compile(optimizer='adam', loss='binary_crossentropy', metrics=METRICS) model.fit(X_train, y_train, epochs=10) model.evaluate(X_test, y_test) y_predicted = model.predict(X_test) y_predicted = y_predicted.flatten() import numpy as np # Threshold the sigmoid outputs at 0.5 to get hard labels y_predicted = np.where(y_predicted > 0.5, 1, 0) y_predicted from sklearn.metrics import confusion_matrix, classification_report cm = confusion_matrix(y_test, y_predicted) cm from matplotlib import pyplot as plt import seaborn as sn sn.heatmap(cm, annot=True, fmt='d') plt.xlabel('Predicted') plt.ylabel('Truth') print(classification_report(y_test, y_predicted)) from sklearn.linear_model import LogisticRegression lr=LogisticRegression(max_iter=100000) lr.fit(x_train,y_train) reviews = [ 'Enter a chance to win $5000, hurry up, offer valid until march 31, 2021', 'You are awarded a SiPix Digital Camera! call 09061221061 from landline. Delivery within 28days. T Cs Box177. M221BP. 2yr warranty. 150ppm. 16 . p p£3.99', 'it to 80488. Your 500 free text messages are valid until 31 December 2005.', 'Hey Sam, Are you coming for a cricket game tomorrow', "Why don't you wait 'til at least wednesday to see if you get your ." ] #x_test=["hi how are you","I had originally chosen the Conrad Chicago Hotel for its location near shopping centers that were within walking distance. ","nice food","bad room","what r u speaking"] pred_1=lr.predict(tv.transform(reviews)) r_predicted = np.where(pred_1> 0.5, 'spam', 'ham') r_predicted reviews = [ 'Enter a chance to win $5000, hurry up, offer valid until march 31, 2021', 'You are awarded a SiPix Digital Camera! call 09061221061 from landline. Delivery within 28days. T Cs Box177. M221BP. 2yr warranty. 150ppm. 16 . p p£3.99', 'it to 80488. Your 500 free text messages are valid until 31 December 2005.', 'Hey Sam, Are you coming for a cricket game tomorrow', "Why don't you wait 'til at least wednesday to see if you get your ."
] m=model.predict(reviews) r_predicted = np.where(m > 0.5, 'spam', 'ham') r_predicted ###Output _____no_output_____
arules.ipynb
###Markdown Import Modules ###Code import numpy as np import pandas as pd import seaborn as sns from wordcloud import WordCloud import matplotlib.pyplot as plt from mlxtend.preprocessing import TransactionEncoder from mlxtend.frequent_patterns import fpgrowth, apriori, association_rules from typing import List, Union from IPython.display import display ###Output _____no_output_____
###Markdown Read Dataset ###Code data = pd.read_csv('datasets/groceries.csv') data ###Output _____no_output_____
###Markdown Preprocessing Rename Columns ###Code data.rename( columns={ 'Member_number': 'id', 'Date': 'date', 'itemDescription': 'item' }, inplace=True ) ###Output _____no_output_____
###Markdown Date Information ###Code data.date = pd.to_datetime(data.date) data['day'] = data.date.dt.day data['year'] = data.date.dt.year data['weekday'] = data.date.dt.weekday data ###Output _____no_output_____
###Markdown EDA The Most Popular Items ###Code items = data.item.value_counts().reset_index() items.rename( columns={ 'item': 'count', 'index': 'item', }, inplace=True ) fig = plt.figure(figsize=(6, 8)) sns.barplot( x='count', y='item', data=items[:25], orient='h', palette='rocket' ) plt.title('The Most Popular Items') plt.show() ###Output _____no_output_____
###Markdown Word Cloud ###Code fig = plt.figure(figsize=(12, 12)) cloud = WordCloud( max_words=len(items), width=1200, height=1200, ).generate(text=', '.join(data.item)) plt.imshow(cloud) plt.axis('off') ###Output _____no_output_____
###Markdown Transactions By Day, Weekday ###Code by_weekday = data.groupby(['id', 'date']).agg({'weekday': lambda x: x.unique()}) plt.figure(figsize=(8, 8)) sns.countplot(x='weekday', data=by_weekday, palette='magma') plt.title('Transactions By Weekday') by_day = data.groupby('day').item.count().reset_index() plt.figure(figsize=(8, 6)) sns.lineplot(x='day', y='item', data=by_day) plt.title('Cumulative day transactions') ###Output _____no_output_____
###Markdown Customers Activity ###Code plt.figure(figsize=(8, 12)) sns.countplot( y='id', hue='year', data=data, order=data.id.value_counts().index[:25], palette='viridis' ) plt.title('Customers') ###Output _____no_output_____
###Markdown Association Rule Mining Create Transactions ###Code transactions = [ transaction[1].item.tolist() for transaction in data.groupby(['id', 'date']) ] transactions[:10] ###Output _____no_output_____
###Markdown Encode Items ###Code encoder = TransactionEncoder() encoded_trans = encoder.fit_transform(transactions) trans_matrix = pd.DataFrame( data=encoded_trans, columns=encoder.columns_ ) trans_matrix ###Output _____no_output_____
###Markdown Choose Min Support And Min Confidence ###Code class FindMetricThreshold(): def __init__(self, data: pd.DataFrame, min_support: Union[List[float], np.ndarray], metric: str = 'confidence', min_threshold: Union[List[float], np.ndarray, None] = None, algorithm: str = 'fpgrowth'): self.data = data self.min_support = min_support self.metric = metric self.min_threshold = min_threshold self.algorithm = apriori if algorithm == 'apriori' else fpgrowth def grid_search(self) -> np.ndarray: # Count how many rules survive each (min_support, min_threshold) pair rules_matrix = np.zeros((len(self.min_support), len(self.min_threshold))) for i, support in enumerate(self.min_support): for j, threshold in enumerate(self.min_threshold): frequent_itemsets = self.algorithm( df=self.data, min_support=support ) rules = association_rules( df=frequent_itemsets, metric=self.metric, min_threshold=threshold ) rules_matrix[i, j] = len(rules) self.rules_matrix = rules_matrix return rules_matrix
def plot_metrics(self) -> None: fig = plt.figure(figsize=(8, 8)) for row in self.rules_matrix: plt.plot(self.min_threshold, row, 'o-') plt.legend([ 'Support {:.2%}'.format(level) for level in self.min_support ]) plt.grid(True) plt.xlabel(self.metric.title()) plt.ylabel('Number of Rules') plt.savefig('Min Threshold Search.png') min_support = [4e-3, 2e-3, 1.2e-3, 8e-4] metric = 'lift' # 'confidence' # min_threshold = [1, 1.25, 1.5, 1.75, 2] # [0.15, 0.125, 0.1] # # %%timeit metrics = FindMetricThreshold( data=trans_matrix, min_support=min_support, metric=metric, min_threshold=min_threshold ) metrics.grid_search() metrics.plot_metrics() min_support = 0.0012 metric = 'confidence' min_threshold = 0.125 ###Output _____no_output_____
###Markdown Mine Rules ###Code frequent_itemsets = fpgrowth( df=trans_matrix, min_support=min_support, use_colnames=True, ) frequent_itemsets['length'] = frequent_itemsets.itemsets.apply(len) frequent_itemsets rules = association_rules( df=frequent_itemsets, metric=metric, min_threshold=min_threshold ) rules.sort_values( by='lift', ascending=False ).reset_index(drop=True).head(15) ###Output _____no_output_____
###Markdown Analyze Results of Mining Find correlation between different metrics ###Code metrics = ['support', 'confidence', 'lift', 'leverage', 'conviction'] sns_plot = sns.pairplot( data=rules[metrics], kind='reg', corner=True ) plt.show() ###Output _____no_output_____
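###Markdown For reference, the metrics compared above follow the standard association-rule definitions used by `mlxtend`: $$\text{support}(A \rightarrow B) = P(A \cup B), \qquad \text{confidence}(A \rightarrow B) = \frac{\text{support}(A \rightarrow B)}{\text{support}(A)}, \qquad \text{lift}(A \rightarrow B) = \frac{\text{confidence}(A \rightarrow B)}{\text{support}(B)}$$ $$\text{leverage}(A \rightarrow B) = \text{support}(A \rightarrow B) - \text{support}(A)\,\text{support}(B), \qquad \text{conviction}(A \rightarrow B) = \frac{1 - \text{support}(B)}{1 - \text{confidence}(A \rightarrow B)}$$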